Storable Careers - One Posting • Hyderabad, Telangana, India
Role & seniority: Quality Engineer (Data Engineering focus), 5+ years of experience
Stack/tools: SQL; Python/PHP/JavaScript for test automation; cloud data warehousing/lake concepts (AWS S3/Glue, Azure Data Lake/Synapse, GCP BigQuery); ETL/ELT tools and orchestration (Airflow, dbt, Talend); BI tools (Tableau, Looker, Power BI)
Design and execute a data quality strategy for ingestion, transformation, storage, and consumption across the data lake and reporting ecosystem
Create and maintain automated/manual data quality tests (unit/integration/E2E); validate ETL/ELT pipelines; verify BI reports against source data
Implement quality gates in CI/CD and establish data quality monitoring dashboards; collaborate with data engineers/scientists/product managers; document plans and metrics
5+ years in QA/QA Engineering with data-centric testing or data warehousing
Proficient in SQL for data validation
Experience building test automation for data pipelines (Python/PHP/JavaScript)
Knowledge of cloud data warehousing/lake concepts and ETL/ELT tools; familiarity with Airflow, dbt, Talend
Experience with BI tools for report validation
Experience with big data technologies; data governance, metadata, MDM
Performance testing/optimization for large-scale data systems
Agile/Scrum environment experience; technical leadership for testing efforts
Location & work type: Hyderabad, India; on-site or hybrid
We are seeking a highly motivated and skilled Quality Engineer to join our Data Engineering team in Hyderabad, India. This individual will play a critical role in ensuring the accuracy, reliability, and integrity of our central data lake and reporting products. The ideal candidate will be passionate about data quality, possess a strong understanding of data engineering principles, and be adept at developing and implementing robust quality controls and testing strategies for data pipelines and reporting solutions.
Key Responsibilities
Design and Execute Quality Strategy: Develop and implement comprehensive test strategies and quality control processes specifically for data ingestion, transformation, storage, and consumption within the data lake and reporting ecosystem.
Data Quality Testing: Create, execute, and maintain automated and manual data quality tests (including unit, integration, and end-to-end tests) to validate data accuracy, completeness, consistency, and conformity with business rules.
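The automated checks described above can be sketched in miniature. This is a hypothetical example, not the team's actual framework: it uses Python's built-in sqlite3 as a stand-in for the warehouse, and the `orders` table and its columns are invented for illustration.

```python
import sqlite3

# Stand-in warehouse: in practice this would be a connection to
# BigQuery/Synapse/etc. Table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL);
    INSERT INTO orders VALUES (1, 10, 99.5), (2, 11, 15.0), (3, NULL, 42.0);
""")

def check_not_null(conn, table, column):
    """Completeness check: a required column contains no NULLs."""
    (nulls,) = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()
    return nulls == 0

def check_unique(conn, table, column):
    """Uniqueness check: a key column contains no duplicate values."""
    (dupes,) = conn.execute(
        f"SELECT COUNT(*) FROM (SELECT {column} FROM {table} "
        f"GROUP BY {column} HAVING COUNT(*) > 1)"
    ).fetchone()
    return dupes == 0

print(check_unique(conn, "orders", "order_id"))      # True: ids are distinct
print(check_not_null(conn, "orders", "customer_id")) # False: row 3 has a NULL
```

In practice such checks would run under a test runner (or a tool like dbt's generic tests) against each layer of the pipeline, covering the accuracy, completeness, consistency, and conformity dimensions the role calls out.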
Pipeline Validation: Validate ETL/ELT data pipelines to ensure data is processed correctly and efficiently, focusing on error handling, performance, and scalability.
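One common form of pipeline validation is source-to-target reconciliation. The sketch below, again using sqlite3 and invented table names as a stand-in, smoke-tests that a load step neither dropped nor duplicated rows:

```python
import sqlite3

# Hypothetical source and target tables on either side of an ETL step.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_events (id INTEGER, payload TEXT);
    CREATE TABLE tgt_events (id INTEGER, payload TEXT);
    INSERT INTO src_events VALUES (1, 'a'), (2, 'b'), (3, 'c');
    INSERT INTO tgt_events SELECT * FROM src_events;  -- the "pipeline"
""")

def reconcile_counts(conn, source, target):
    """Check the load step dropped or duplicated no rows."""
    (src,) = conn.execute(f"SELECT COUNT(*) FROM {source}").fetchone()
    (tgt,) = conn.execute(f"SELECT COUNT(*) FROM {target}").fetchone()
    return src == tgt

print(reconcile_counts(conn, "src_events", "tgt_events"))  # True
```

Real validations would extend this to checksums or row-level diffs, plus negative tests for the error-handling and scalability concerns mentioned above.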
Reporting and Dashboard Verification: Thoroughly test and validate the data displayed in business intelligence (BI) reports and dashboards against source data to guarantee accuracy and reliability for business decision-making.
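Report verification usually means recomputing a dashboard metric directly from source data and comparing it to the displayed figure. A minimal sketch, with made-up numbers and a tolerance chosen only for illustration:

```python
# Hypothetical reconciliation: recompute a dashboard total from source
# rows and compare it to the figure the BI tool displays.
source_rows = [99.5, 15.0, 42.0]   # amounts pulled from the source table
reported_total = 156.5             # value shown on the dashboard

recomputed = sum(source_rows)
tolerance = 0.01                   # allow for rounding in the BI layer
assert abs(recomputed - reported_total) <= tolerance, (
    f"Report drift: recomputed {recomputed}, dashboard shows {reported_total}"
)
print("report matches source")
```

The same pattern scales to any aggregate a report exposes; the key design choice is pulling the "reported" value through the BI tool's API rather than trusting the transformed tables it reads from.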
Quality Gates & Monitoring: Implement quality gates within the CI/CD pipeline and establish data quality monitoring dashboards to proactively identify and alert on data anomalies and pipeline failures.
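A CI/CD quality gate is typically just a script that runs the checks and fails the build on any failure. A minimal sketch, with hypothetical check names and results:

```python
def run_quality_gate(results):
    """Return a process exit code: non-zero fails the CI/CD stage."""
    failures = [name for name, passed in results if not passed]
    for name in failures:
        print(f"FAILED: {name}")
    return 1 if failures else 0

# Hypothetical results, e.g. collected from a data quality test suite.
results = [
    ("orders.order_id unique", True),
    ("orders.customer_id not null", False),
]
exit_code = run_quality_gate(results)
print(f"exit code: {exit_code}")  # a CI runner would sys.exit(exit_code)
```

Wiring the same check results into a monitoring dashboard (rather than only the build log) is what turns the gate into the proactive anomaly alerting the posting describes.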
Collaboration: Work closely with Data Engineers, Data Scientists, and Product Managers to understand data requirements, define quality metrics, and integrate quality assurance throughout the data development lifecycle.
Documentation: Document test plans, test cases, and quality control processes, and track and report on data quality metrics and issues.
Required Qualifications
5+ years of experience in Quality Assurance or Quality Engineering, with a focus on data-centric testing or data warehousing environments.
Strong proficiency in SQL for data querying and validation.
Experience developing test automation frameworks and scripts for data pipelines using tools/languages such as Python, PHP, or JavaScript.
Solid understanding of cloud-based data warehousing and data lake concepts (e.g., AWS S3/Glue, Azure Data Lake/Synapse, Google Cloud Storage/BigQuery).
Familiarity with ETL/ELT tools and data pipeline orchestration (e.g., Apache Airflow, dbt, Talend).
Experience with BI tools (e.g., Tableau, Looker, Power BI) for report validation and testing.
Preferred Qualifications
Experience with big data technologies.
Knowledge of data governance, metadata management, and master data management principles.
Familiarity with performance testing and optimization techniques for large-scale data systems.
Experience working in an Agile/Scrum development environment.
Technical leadership for steering testing efforts.