
Test Automation Data - Senior Engineer
Iris Software Inc. • Noida, Uttar Pradesh, India
Role & seniority: QA Automation Engineer - ETL/Data Testing (mid-to-senior level implied by scope)
Stack/tools: Python, PySpark, SQL, Databricks, DBT, Snowflake, SQL Server; API testing (Postman/Newman, Python requests), CI/CD (GitHub Actions, Jenkins, AWS CodePipeline), AWS data services (S3, Glue, Lambda, Step Functions, Athena), Jira/Zephyr
Top 3 responsibilities
- Design and develop automation scripts for API testing, ETL pipeline validation, and data transformation checks (Databricks, DBT)
- Automate regression suites (DBT models, schema validations, data layer validations) and data reconciliation/aggregation validations
- Integrate automation with CI/CD pipelines, participate in Agile ceremonies, log defects with evidence, and improve test data and standards
Must-have skills
- API & automation testing (strong)
- Data validation for Snowflake, Databricks (SQL/PySpark), SQL Server; DBT knowledge
- Advanced SQL (joins, CTEs, window functions), Python (pytest/unittest) or Java (TestNG/JUnit)
- AWS data services (S3, Glue, Lambda, Step Functions, Athena)
- CI/CD exposure (GitHub Actions, Jenkins, CodePipeline); Jira/Zephyr
Nice-to-haves
- Databricks API usage and notebook/workflow validation
- Data engineering collaboration experience; data quality/utilities tooling
- Additional cloud or data lake tooling beyond listed items
Location & work type
- Location: Noida, Uttar Pradesh, India (organization noted as a top IT workplace; global delivery mentioned)
Full Description
Why Join Iris?
Are you ready to do the best work of your career at one of India’s Top 25 Best Workplaces in the IT industry? Do you want to grow in an award-winning culture that truly values your talent and ambitions?
Join Iris Software — one of the fastest-growing IT services companies — where you own and shape your success story.
About Us
At Iris Software, our vision is to be our client’s most trusted technology partner, and the first choice for the industry’s top professionals to realize their full potential.
With over 4,300 associates across India, the U.S.A., and Canada, we help our enterprise clients thrive with technology-enabled transformation across financial services, healthcare, transportation & logistics, and professional services.
Our work covers complex, mission-critical applications built with the latest technologies, spanning high-value Application & Product Engineering, Data & Analytics, Cloud, DevOps, Data & MLOps, Quality Engineering, and Business Automation.
Working with Us
At Iris, every role is more than a job — it’s a launchpad for growth.
Our Employee Value Proposition, “Build Your Future. Own Your Journey.” reflects our belief that people thrive when they have ownership of their career and the right opportunities to shape it.
We foster a culture where your potential is valued, your voice matters, and your work creates real impact. With cutting-edge projects, personalized career development, continuous learning and mentorship, we support you to grow and become your best — both personally and professionally.
Curious what it’s like to work at Iris? Watch this video for an inside look at the people, the passion, and the possibilities.
Job Description
Design and develop automation scripts for:
- API testing (REST/JSON)
- ETL pipeline validation
- Data transformation checks in Databricks
Build and maintain reusable automation components using Python, PySpark
Automate regression suites covering DBT model outputs, schema validations, and upstream/downstream data layers.
Implement automated data validation across:
- Databricks Delta Lake
- SQL Server sources
- AWS S3 raw layers
Data Engineering Test Automation
Automate validation of DBT transformations (tests, snapshots, seed data checks).
Build SQL-based and script-based automation for:
- Data reconciliation
- Aggregation validation
- Schema evolution testing
- Data freshness checks
Use Databricks APIs or automation tools to validate notebook runs and workflows.
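To make the reconciliation responsibility concrete, here is a minimal sketch (not part of the posting) of a source-vs-target reconciliation check. It uses SQLite so it is self-contained; in practice the same queries would run against SQL Server, Snowflake, or Databricks, and the table and column names here are hypothetical.

```python
import sqlite3


def reconcile(conn, source_table: str, target_table: str, measure: str) -> dict:
    """Compare row counts and a summed measure column between two tables.

    Returns a dict of boolean check results; both True means the tables agree.
    """
    cur = conn.cursor()
    totals = {}
    for name, table in (("source", source_table), ("target", target_table)):
        # COALESCE guards against NULL sums on empty tables.
        cur.execute(f"SELECT COUNT(*), COALESCE(SUM({measure}), 0) FROM {table}")
        totals[name] = cur.fetchone()
    return {
        "row_count_match": totals["source"][0] == totals["target"][0],
        "sum_match": totals["source"][1] == totals["target"][1],
    }
```

In a real suite each check would typically be wrapped in a pytest test that asserts both flags are True and attaches the mismatching totals as defect evidence.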
API & Integration Automation
Develop API automation scripts for:
- Data ingestion
- Data consumption
- Metadata services
Use tools like Postman/Newman or Python requests for automated Web/API testing.
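As an illustrative sketch of the API-validation side (the endpoint, field names, and expected payload shape below are assumptions, not part of the posting), response-body checks can be separated from the HTTP call so they are unit-testable without a live service:

```python
import json


def check_ingestion_response(body: str, expected_status: str = "ok") -> list:
    """Validate a JSON response from a hypothetical ingestion endpoint.

    Returns a list of failure messages; an empty list means all checks passed.
    """
    failures = []
    try:
        payload = json.loads(body)
    except json.JSONDecodeError:
        return ["response body is not valid JSON"]
    if payload.get("status") != expected_status:
        failures.append(f"unexpected status: {payload.get('status')!r}")
    records = payload.get("records")
    if not isinstance(records, list):
        failures.append("'records' missing or not a list")
    elif len(records) != payload.get("record_count"):
        failures.append("record_count does not match len(records)")
    return failures
```

In a live test the `body` argument would come from something like `requests.get(url).text` (or a Newman run), with the test asserting that the returned failure list is empty.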
CI/CD Integration
Integrate automation suites with:
- AWS CodePipeline / GitHub Actions / Jenkins / GitLab CI
Configure pipelines to run tests automatically for every code push, DBT model change, or Databricks workflow update.
Agile Collaboration
Participate actively in Agile ceremonies: stand-ups, sprint planning, backlog grooming, and retrospectives. Work closely with Data Engineers, DBT Developers, Cloud Engineers, and Product Owners. Provide automation insights, effort estimates, and feasibility judgments.
Defect Management
Log defects with clear data evidence in Jira. Collaborate with teams to identify root cause (pipeline logic, DBT model, AWS service failure, etc.). Maintain traceability between requirements → test cases → automated scripts.
Quality Improvement & Standards
Enhance test coverage and reliability by contributing to:
- Automation strategy
- Data testing best practices
- Test data generation utilities
- Error handling and logging improvements
Advocate for quality-first development in a data engineering environment.
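A test data generation utility, as mentioned above, might be as simple as a deterministic synthetic-row generator; the entity shape (`order_id`, `amount`, `status`) is purely illustrative. Seeding the generator makes failing runs reproducible:

```python
import random


def make_orders(n: int, seed: int = 42) -> list:
    """Generate deterministic synthetic order rows for repeatable test runs."""
    rng = random.Random(seed)  # fixed seed => identical data on every run
    statuses = ("NEW", "SHIPPED", "CANCELLED")
    return [
        {
            "order_id": i,
            "amount": round(rng.uniform(1.0, 500.0), 2),
            "status": rng.choice(statuses),
        }
        for i in range(1, n + 1)
    ]
```

Because the output is deterministic, regression suites can assert exact expected aggregates instead of loose ranges.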
Technical Skills
- Strong experience in API and automation testing.
- Hands-on experience automating data validations for Snowflake, Databricks (SQL/PySpark), and SQL Server.
- Understanding of DBT (models, tests, documentation, lineage).
- Strong SQL skills (joins, CTEs, window functions, reconciliations).
- Experience with Python automation frameworks (pytest, unittest) or Java (TestNG, JUnit).
- Exposure to AWS data services: S3, Glue, Lambda, Step Functions, Athena, EMR (optional).
- CI/CD exposure (GitHub Actions, Jenkins, AWS CodePipeline).
- Experience with test management tools (Zephyr, Jira).
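To ground the schema-validation skill listed above, a minimal sketch of a schema diff (assuming column-to-type mappings such as those returned by an information-schema query; the column names are hypothetical) could look like:

```python
def diff_schema(actual: dict, expected: dict) -> dict:
    """Compare an actual table schema (column -> type) against an expected one.

    Returns sorted lists of missing columns, unexpected columns, and columns
    whose declared types disagree.
    """
    return {
        "missing": sorted(set(expected) - set(actual)),
        "unexpected": sorted(set(actual) - set(expected)),
        "type_mismatch": sorted(
            col for col in set(actual) & set(expected) if actual[col] != expected[col]
        ),
    }
```

A pytest wrapper would assert that all three lists are empty, turning schema drift into a clearly reported failure rather than a silent downstream break.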
Mandatory Competencies
- QA/QE - QA Automation - ETL Testing
- Behavioral - Communication
- Big Data - PySpark
- Cloud - AWS (S3, S3 Glacier, EBS)
- Development Tools and Management - Postman
- QA/QE - QA Manual - API Testing
- DevOps/Configuration Mgmt - GitLab, GitHub, Bitbucket
- QA/QE - QA Automation - Python
Perks And Benefits For Irisians
Iris provides world-class benefits for a personalized employee experience. These benefits are designed to support the financial, health, and well-being needs of Irisians for holistic professional and personal growth.