Gallagher • Colombo, Western Province, Sri Lanka
Role & seniority: QE Lead – Data Operations; leadership/subject-matter-expert level within data platforms; 6–8+ years data-centric testing with at least 3 years in QE leadership or SME role.
Stack/tools: SQL (joins, window functions, etc.); data validation tools (Datagaps, QuerySurge, AccelQ, Selenium, Pytest, Python); ETL/ELT, DBT, Dataiku, Airflow; CI/CD integration; BI tools (Power BI, Tableau); cloud DWH (Snowflake, Azure Synapse, BigQuery); data governance/MDM; performance testing (JMeter) a plus.
Act as QE contact for Data Ops; mentor QA engineers; lead QA for new data pipelines and changes.
Define and enforce data quality monitoring, validation, testing strategies, and regression testing; manage defect triage and RCA.
Validate end-to-end data flows, data reconciliation, critical reports, and align with cross-functional teams (Data Engineering, DevOps, BI, business SMEs); integrate automated tests into CI/CD.
Strong SQL expertise; experience with data validation/testing of ETL/ELT, data warehouses/data lakes.
Proficiency with automation tools and Python-based testing; experience in Agile/Scrum.
Familiarity with data quality frameworks, data governance/MDM, and BI report validation.
Ability to lead, communicate risks/status to leadership, and operate in cross-functional teams.
Nice to have: QA/testing certifications (ISTQB, CSTE, CSQE, etc.); AI/ML-based testing tools.
About Arthur J. Gallagher
Arthur J. Gallagher is one of the world’s largest insurance brokerage and risk management companies, providing tailored insurance solutions and strategic organizational wellbeing initiatives. With over 50,000 employees globally, Gallagher is committed to innovation, client success, and delivering high-quality solutions across all aspects of the business.

Role Summary
The QE Lead – Data Operations is a dual-role position focused on ensuring production data reliability and embedding quality engineering into enterprise data platform development. Responsibilities include end-to-end data quality validation, operational testing, defect prevention, and test automation across maintenance, incident resolution, enhancements, and new development. The role collaborates with Data Engineering, Data Ops, BI, and business SMEs to integrate quality controls into daily operations and promote scalable, shift-left testing practices.
Key Responsibilities
Act as the QE contact for Data Ops and support teams.
Mentor QA engineers to ensure quality deliverables and best practices.
Collaborate with Data Engineering, DevOps, BI teams, and business SMEs for DataOps activities.
Manage defect triage, root cause analysis, and data issue resolution.
Establish processes for data quality monitoring, validation, and improvement.
Communicate risks, quality status, and release readiness to leadership.
Support L2/L3 production incidents, including data defects, report mismatches, and pipeline failures.
Perform root cause analysis (RCA) and collaborate on permanent fixes.
Validate hotfixes, break-fix changes, and data.
Ensure regression testing during support releases.
Lead QA for new data pipelines, transformations, schema changes, and reporting enhancements.
Review requirements, mappings, and designs for testability.
Define test strategies for incremental and large-scale data initiatives.
Validate end-to-end data flow from source systems to BI consumption.
Promote shift-left testing in development cycles.
Define and enforce data quality rules (accuracy, completeness, timeliness, consistency).
Validate ETL/ELT pipelines, transformations, and business rules.
Ensure validation of critical financial and regulatory reports.
Establish data reconciliation and variance analysis processes.
Write and review complex SQL scripts for validation and reconciliation.
Support pipeline automation using tools like DBT, Dataiku, and Airflow.
Create reusable frameworks for data quality checks and automated testing.
Integrate automated tests into CI/CD pipelines.
Use Python and data testing tools to reduce manual effort.
Define quality gates for deployments and data releases.
Set SLAs, KPIs, and quality metrics for Data Ops.
Ensure compliance with data governance, audit, and regulatory standards.
Maintain test documentation, runbooks, and operational dashboards.
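To illustrate the reconciliation and variance-analysis work described above, the sketch below compares row counts and an amount checksum between a source and target table using SQLite. The table and column names (`src_premiums`, `tgt_premiums`, `premium`) are invented for the example and do not come from the posting; a real implementation would target the warehouse platform in use.

```python
import sqlite3

def reconcile(conn, source, target, amount_col):
    """Compare row counts and an amount checksum between two tables.
    Non-zero variances indicate a data defect to triage."""
    cur = conn.cursor()
    counts, sums = {}, {}
    for table in (source, target):
        cur.execute(f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}")
        counts[table], sums[table] = cur.fetchone()
    return {
        "row_count_variance": counts[target] - counts[source],
        "amount_variance": round(sums[target] - sums[source], 2),
    }

# Hypothetical staging data: the target load dropped one premium row.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_premiums (policy_id TEXT, premium REAL);
    CREATE TABLE tgt_premiums (policy_id TEXT, premium REAL);
    INSERT INTO src_premiums VALUES ('P1', 100.0), ('P2', 250.5), ('P3', 75.0);
    INSERT INTO tgt_premiums VALUES ('P1', 100.0), ('P2', 250.5);
""")
result = reconcile(conn, "src_premiums", "tgt_premiums", "premium")
print(result)  # {'row_count_variance': -1, 'amount_variance': -75.0}
```

In practice a check like this would run as an automated gate in the CI/CD pipeline, failing the deployment when any variance is non-zero.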
Qualifications

Education
A bachelor’s degree in Computer Science, Software Engineering, Information Systems, or a related field is required, or equivalent professional experience.
Experience
The ideal candidate will have 6–8+ years of experience in data-centric testing, including at least 3 years in a Quality Engineering (QE) leadership or Subject Matter Expert (SME) role within data platforms. The candidate should demonstrate a proven track record in testing ETL processes, data warehouses, data lakes, and complex data transformations, with the ability to balance support urgency against development quality. Proficiency with automation tools such as Selenium, QuerySurge, Datagaps, or AccelQ, coupled with substantial experience in Agile/Scrum environments, is required.
Technical Skills
Proficient in SQL (joins, window functions, aggregations, subqueries, stored procedures).
Experience with data validation tools (Datagaps, QuerySurge, AccelQ, Selenium, Pytest, Python).
Skilled in data migration testing (legacy to modern/cloud systems).
Strong knowledge of insurance datasets (policy, commissions, premiums, claims, billing).
Familiar with data quality frameworks, metadata management, and MDM principles.
Experienced in testing pipelines in DevOps/CI/CD environments.
Proficient in validating BI reports (Power BI, Tableau).
Knowledge of cloud DWH platforms (Snowflake, Azure Synapse, BigQuery).
Experience with performance testing tools (e.g., JMeter) is a plus.
Understanding of ETL/ELT processes and workflow automation tools (DBT, Dataiku, Astronomer).
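As a small sketch of the window-function skill the posting asks for, the example below uses `ROW_NUMBER()` to flag duplicate keys introduced by a bad load. The `claims` table and its columns are hypothetical; SQLite (bundled with Python) supports window functions from version 3.25 onward.

```python
import sqlite3

# Hypothetical claims feed where each claim_id should appear exactly once;
# ROW_NUMBER() ranks rows per claim so any rn > 1 marks a duplicate.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims (claim_id TEXT, loaded_at TEXT, status TEXT);
    INSERT INTO claims VALUES
        ('C1', '2024-01-01', 'OPEN'),
        ('C1', '2024-01-02', 'CLOSED'),  -- duplicate key from a re-run load
        ('C2', '2024-01-01', 'OPEN');
""")
dupes = conn.execute("""
    SELECT claim_id FROM (
        SELECT claim_id,
               ROW_NUMBER() OVER (PARTITION BY claim_id
                                  ORDER BY loaded_at DESC) AS rn
        FROM claims
    ) WHERE rn > 1
""").fetchall()
print(dupes)  # [('C1',)]
```

The same query pattern ports directly to Snowflake, Synapse, or BigQuery, which is why window-function fluency is listed as a core skill.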
Soft Skills
Strong leadership and collaboration skills.
Excellent problem-solving and analytical abilities.
Clear written and verbal communication.
Effective in cross-functional team environments.
Preferred Qualifications
QA or testing certifications (ISTQB, CSTE, CSQE, etc.).
Experience with AI/ML-based testing tools.
Understanding of compliance standards.