
Orca-AI • Tel Aviv, Tel-Aviv District, Israel
Role & seniority
QA Data Automation Engineer (4+ years of QA experience)
Stack / tools
Python (automation frameworks), SQL-based testing (complex queries)
ETL/ELT pipelines, data warehouses (Snowflake preferred), data lakes
Jira (issue tracking), QA methodology (STP, QA cycles)
Optional: pytest, AWS (nice to have)
Top 3 responsibilities
Develop and maintain automated QA tests for data pipelines, transformations, and data products
Validate end-to-end data flow: ingestion, transformations, reports/dashboards; perform data quality testing (completeness, consistency, accuracy, timeliness, schema)
Investigate data failures, provide RCA and actionable bug reports; define test coverage with stakeholders; document plans and automation coverage
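The data-quality checks named above (completeness, consistency, and so on) can be sketched as simple automated validations. This is an illustrative example only, not Orca's actual framework: `sqlite3` stands in for a warehouse such as Snowflake, and the table and column names are assumptions.

```python
# Minimal sketch of automated data-quality checks (illustrative; sqlite3
# stands in for a real warehouse such as Snowflake).
import sqlite3

def check_completeness(conn, table, column):
    """Return the fraction of non-NULL values in `column`."""
    total = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    # COUNT(column) skips NULLs, so filled/total measures completeness.
    filled = conn.execute(f"SELECT COUNT({column}) FROM {table}").fetchone()[0]
    return filled / total if total else 1.0

def check_uniqueness(conn, table, column):
    """True if `column` contains no duplicate values."""
    dupes = conn.execute(
        f"SELECT {column} FROM {table} "
        f"GROUP BY {column} HAVING COUNT(*) > 1"
    ).fetchall()
    return len(dupes) == 0

# Hypothetical sample data for the sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, 9.5), (2, None), (3, 4.0)])

print(check_completeness(conn, "orders", "amount"))  # 2 of 3 rows filled
print(check_uniqueness(conn, "orders", "order_id"))
```

In practice, checks like these would be wrapped as pytest test functions and run against the warehouse on a schedule or per pipeline deploy.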
Must-have skills
4+ years QA experience with automation or data validation flows
Strong SQL skills for validation, troubleshooting, joins, aggregations
Strong Python skills for automated validations (pytest a plus)
Experience testing data systems (ETL/ELT, DWH, analytics platforms, BI reports)
Snowflake or similar data warehouse experience; Jira for bug tracking
Detail-oriented, problem-solving, clear cross-functional communication
Nice-to-have
Knowledge of AWS or other cloud platforms
Experience with large-scale datasets, partitions, performance tuning
Location & work type
Location: Tel Aviv, Israel (per the listing header)
Work type: not specified
We are looking for a QA Data Automation Engineer to join our Data Team in a dynamic and challenging role, providing critical test coverage for Orca's data pipelines, reporting layers, and analytics solutions. In this role, you will be responsible for validating data integrity end-to-end, from raw ingestion and transformation layers to dashboards and downstream consumers. You will design and maintain automated tests to ensure accurate, reliable, and scalable data systems.

Key Responsibilities
Develop and maintain automated QA tests for data pipelines, transformations, and data products.
Validate data flow across the system, including ingestion, transformations, and reports/dashboards.
Perform data quality testing (completeness, consistency, accuracy, timeliness, schema validation).
Write and execute SQL-based tests to validate logic, joins, aggregations, metrics, and anomalies.
Build automation frameworks and validation scripts using Python.
Work closely with Data Engineers and Analytics/BI stakeholders to define test coverage and acceptance criteria.
Investigate failures and data issues, providing clear RCA and actionable bug reports.
Document test plans, test scenarios, expected results, and automation coverage.
Track issues in Jira, including reproducible steps and supporting evidence.
Continuously improve QA processes for better monitoring, reliability, and faster releases.
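The SQL-based validation of transformations described above (joins, aggregations, source-vs-target reconciliation) might look like the following sketch. All names here are assumptions for illustration, and `sqlite3` again stands in for the warehouse.

```python
# Illustrative reconciliation of a raw ingestion table against its
# transformed target (hypothetical schema; sqlite3 as warehouse stand-in).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_events (user_id INTEGER, amount REAL);
    INSERT INTO raw_events VALUES (1, 10.0), (1, 5.0), (2, 7.5);
    -- transformed layer: one row per user with the summed amount
    CREATE TABLE user_totals (user_id INTEGER PRIMARY KEY, total REAL);
    INSERT INTO user_totals VALUES (1, 15.0), (2, 7.5);
""")

def test_row_counts_match():
    """Every distinct source user appears exactly once in the target."""
    src = conn.execute(
        "SELECT COUNT(DISTINCT user_id) FROM raw_events").fetchone()[0]
    tgt = conn.execute("SELECT COUNT(*) FROM user_totals").fetchone()[0]
    assert src == tgt

def test_aggregates_match():
    """Per-user sums in the target equal the source aggregation."""
    mismatches = conn.execute("""
        SELECT r.user_id
        FROM (SELECT user_id, SUM(amount) AS total
              FROM raw_events GROUP BY user_id) r
        JOIN user_totals t ON t.user_id = r.user_id
        WHERE ABS(r.total - t.total) > 1e-9
    """).fetchall()
    assert mismatches == []

test_row_counts_match()
test_aggregates_match()
print("all reconciliation checks passed")
```

Failures from checks like these are what feed the RCA and Jira bug reports mentioned in the responsibilities: the failing assertion pinpoints which table and which users diverged.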
Requirements
4+ years of QA experience, including experience with automation or data validation flows.
QA methodology knowledge (STP, QA cycles).
Proven experience testing data systems (ETL/ELT pipelines, DWH, analytics platforms, BI reports).
Strong SQL skills: ability to write complex queries for validation and troubleshooting.
Strong Python skills: writing scripts/tests for automated validations (pytest is a plus).
Hands-on experience working with data lakes / data warehouses such as Snowflake (preferred).
Strong understanding of bug lifecycle management using Jira.
High attention to detail, critical thinking, and a problem-solving mindset.
Excellent communication skills and the ability to work cross-functionally in a fast-paced environment.

Nice to Have
Knowledge of cloud platforms (AWS).
Experience working with large-scale datasets, partitions, and performance tuning.