
Test Automation Data - Senior Engineer

Iris Software Inc. Noida, Uttar Pradesh, India

On-site · Full-time

Posted Feb 12, 2026 · Apply by Mar 14, 2026

Role & seniority: QA Automation Engineer / Data Engineering Test Automation (mid to senior level)

Stack/tools

  • API & automation testing (Postman/Newman, Python requests, pytest/unittest; Java TestNG/JUnit)

  • Data engineering validation: Snowflake, Databricks (SQL/PySpark), DBT, SQL Server

  • Cloud & data services: AWS S3, Glue, Lambda, Step Functions, Athena, EMR (optional)

  • CI/CD: GitHub Actions, Jenkins, AWS CodePipeline; Jira/Zephyr

  • Data validation concepts: data reconciliation, schema validation, data freshness, notebook validation (Databricks)

  • Other: ETL testing, DBT tests, documentation and lineage

Top 3 responsibilities

  1. Design and develop automated tests for API, ETL pipelines, and data transformations (Databricks, DBT, SQL)

  2. Build and maintain reusable automation components; automate regression suites across data layers and notebook runs

  3. Integrate automation with CI/CD, participate in Agile ceremonies, log defects with evidence, and drive quality improvements

Must-have skills

  • Strong API/Automation testing experience

  • Data validation automation for Snowflake, Databricks (SQL/PySpark), SQL Server

  • DBT knowledge (models, tests, lineage) and strong SQL

  • Python automation (pytest/unittest) or Java (TestNG/JUnit)

  • AWS data services exposure (S3, Glue, Lambda, Step Functions, Athena)

  • CI/CD familiarity (GitHub Actions, Jenkins, CodePipeline) and Jira/Zephyr

Nice-to-haves

  • Databricks notebook and workflow automation

Full Description

Why Join Iris?

Are you ready to do the best work of your career at one of India’s Top 25 Best Workplaces in the IT industry? Do you want to grow in an award-winning culture that truly values your talent and ambitions?

Join Iris Software — one of the fastest-growing IT services companies — where you own and shape your success story.

About Us

At Iris Software, our vision is to be our clients’ most trusted technology partner, and the first choice for the industry’s top professionals to realize their full potential.

With over 4,300 associates across India, the U.S.A., and Canada, we help our enterprise clients thrive with technology-enabled transformation across financial services, healthcare, transportation & logistics, and professional services.

Our work spans complex, mission-critical applications built with the latest technologies, including Application & Product Engineering, Data & Analytics, Cloud, DevOps, Data & MLOps, Quality Engineering, and Business Automation.

Working with Us

At Iris, every role is more than a job — it’s a launchpad for growth.

Our Employee Value Proposition, “Build Your Future. Own Your Journey.” reflects our belief that people thrive when they have ownership of their career and the right opportunities to shape it.

We foster a culture where your potential is valued, your voice matters, and your work creates real impact. With cutting-edge projects, personalized career development, continuous learning and mentorship, we support you to grow and become your best — both personally and professionally.

Curious what it’s like to work at Iris? Watch our video for an inside look at the people, the passion, and the possibilities.

Job Description

Design and develop automation scripts for:

  • API testing (REST/JSON)
  • ETL pipeline validation
  • Data transformation checks in Databricks

Build and maintain reusable automation components using Python and PySpark.

Automate regression suites covering DBT model outputs, schema validations, and upstream/downstream data layers.
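A schema validation of this kind can be sketched as a simple expected-vs-observed comparison; in a real suite the observed map would come from the warehouse's information schema, and the table and column names below are purely illustrative:

```python
# Hypothetical expected schema for one DBT model output (illustrative names/types)
EXPECTED_SCHEMA = {"order_id": "bigint", "amount": "decimal(10,2)", "order_date": "date"}

def schema_diff(expected: dict, actual: dict) -> dict:
    """Compare expected vs observed column/type maps and report the differences."""
    return {
        "missing": sorted(set(expected) - set(actual)),        # columns absent from target
        "unexpected": sorted(set(actual) - set(expected)),     # columns nobody declared
        "type_mismatch": sorted(
            col for col in set(expected) & set(actual) if expected[col] != actual[col]
        ),
    }

# In practice `actual` would be queried from INFORMATION_SCHEMA.COLUMNS or DESCRIBE TABLE
actual = {"order_id": "bigint", "amount": "decimal(10,2)", "order_date": "timestamp"}
diff = schema_diff(EXPECTED_SCHEMA, actual)
assert diff["type_mismatch"] == ["order_date"], f"schema drift detected: {diff}"
```

A check like this fails fast on schema drift while leaving the failure message rich enough to log as defect evidence.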

Implement automated data validation across:

  • Databricks Delta Lake
  • SQL Server sources
  • AWS S3 raw layers

Data Engineering Test Automation

Automate validation of DBT transformations (tests, snapshots, seed data checks).

Build SQL-based and script-based automation for:

  • Data reconciliation
  • Aggregation validation
  • Schema evolution testing
  • Data freshness checks

Use Databricks APIs or automation tools to validate notebook runs and workflows.
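A minimal row-count reconciliation can be expressed as a parameterized check; the sketch below uses an in-memory SQLite database as a stand-in for a real source/target pair (e.g. SQL Server to Databricks), and the table names are hypothetical:

```python
import sqlite3

def reconcile_counts(conn, source_table, target_table):
    """Return (source_count, target_count, match) for a row-count reconciliation.

    In a real suite the table identifiers would come from vetted config,
    not free-form strings, and each side might use its own connection.
    """
    cur = conn.cursor()
    src = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    return src, tgt, src == tgt

# Stand-in data: a source layer and its loaded target layer
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER);
    CREATE TABLE tgt_orders (id INTEGER);
    INSERT INTO src_orders VALUES (1), (2), (3);
    INSERT INTO tgt_orders VALUES (1), (2), (3);
""")
src, tgt, ok = reconcile_counts(conn, "src_orders", "tgt_orders")
assert ok, f"Row counts diverge: source={src}, target={tgt}"
```

The same shape extends to aggregation validation (compare SUMs or checksums instead of COUNTs) and to freshness checks (compare MAX(loaded_at) against a tolerance).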

API & Integration Automation

Develop API automation scripts for:

  • Data ingestion
  • Data consumption
  • Metadata services

Use tools like Postman/Newman or Python requests for automated Web/API testing.
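One common pattern here is validating the shape of an API response before asserting on its values; the sketch below checks a hypothetical ingestion-status payload (the field names and types are assumptions, not a real Iris API):

```python
# Required fields and types for a hypothetical ingestion-status response
REQUIRED_FIELDS = {"dataset": str, "row_count": int, "loaded_at": str}

def validate_ingestion_response(payload: dict) -> list[str]:
    """Return a list of schema violations found in an ingestion-status payload."""
    errors = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in payload:
            errors.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected_type):
            errors.append(f"wrong type for {field}: {type(payload[field]).__name__}")
    return errors

# In a real pytest suite the payload would come from requests.get(...).json()
sample = {"dataset": "orders", "row_count": 1250, "loaded_at": "2026-02-12T09:00:00Z"}
assert validate_ingestion_response(sample) == []
```

Returning a list of violations (rather than failing on the first one) gives the defect report complete evidence in a single run.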

CI/CD Integration

Integrate automation suites with:

  • AWS CodePipeline / GitHub Actions / Jenkins / GitLab CI

Configure pipelines to run tests automatically for every code push, DBT model change, or Databricks workflow update.

Agile Collaboration

Participate actively in Agile ceremonies—stand-ups, sprint planning, grooming, retros. Work closely with Data Engineers, DBT Developers, Cloud Engineers, and Product Owners. Provide automation insights, effort estimates, and feasibility judgments.

Defect Management

Log defects with clear data evidence in Jira. Collaborate with teams to identify root cause (pipeline logic, DBT model, AWS service failure, etc.). Maintain traceability between requirements → test cases → automated scripts.

Quality Improvement & Standards

Enhance test coverage and reliability by contributing to:

  • Automation strategy
  • Data testing best practices
  • Test data generation utilities
  • Error handling and logging improvements

Advocate for quality-first development in a data engineering environment.

Technical Skills

Strong experience in API and Automation Testing.

Hands-on experience automating data validations for:

  • Snowflake
  • Databricks (SQL/PySpark)
  • SQL Server

Understanding of DBT (models, tests, documentation, lineage).

Strong SQL skills (joins, CTEs, window functions, reconciliations).

Experience with Python automation frameworks (pytest, unittest) or Java (TestNG, JUnit).

Exposure to AWS data services:

  • S3, Glue, Lambda, Step Functions, Athena, EMR (optional)

CI/CD exposure (GitHub Actions, Jenkins, AWS CodePipeline).

Experience with test management tools (Zephyr, Jira).

Mandatory Competencies

QA/QE - QA Automation - ETL Testing

Behavioral - Communication

Big Data - PySpark

Cloud - AWS (S3, S3 Glacier, EBS)

Development Tools and Management - Postman

QA/QE - QA Manual - API Testing

DevOps/Configuration Mgmt - GitLab, GitHub, Bitbucket

QA/QE - QA Automation - Python

Perks And Benefits For Irisians

Iris provides world-class benefits for a personalized employee experience. These benefits are designed to support the financial, health, and well-being needs of Irisians for holistic professional and personal growth.

