EPAM Systems • Brazil
Role & seniority: Senior Data Quality Engineer (mid-senior / pleno-sênior), leading data quality initiatives and shaping the data landscape.
Languages/automation: Python; SQL (PostgreSQL, MSSQL, MySQL, Oracle)
Big data: Hadoop (HDFS, Hive, Spark); Kafka/Flume/Kinesis
NoSQL: Cassandra, MongoDB, HBase
Data viz: Tableau, Power BI, Tibco Spotfire
Cloud: AWS, Azure, GCP; multi-cloud
ETL/MDM/testing: Talend, Informatica; MDM tools; JMeter
CI/CD/version control: Jenkins, Git/GitLab, SVN; GitOps
Testing frameworks: TDD, DDT, BDT; automated validation pipelines
Define and oversee data quality strategies; lead quality improvement initiatives; ensure enterprise data standards
Design, build, and scale automated validation pipelines; establish governance and testing frameworks
Collaborate cross-functionally, mentor engineers, allocate resources, and maintain documentation
3+ years in Data Quality Engineering or similar
Python proficiency for automation and validation
Expertise with Hadoop/Spark, Kafka or equivalent streaming, and NoSQL (Cassandra/MongoDB/HBase)
SQL mastery; experience with ETL (Talend/Informatica); CI/CD (Jenkins/GH Actions)
Data visualization, cloud platforms (AWS/Azure/GCP), TDD/DDT/BDT, Git-based workflows
Strong analytical, problem-solving, and English communication skills
Java/Scala or advanced Bash scripting
XPath for data validation and transformation workflows
Custom data generation tools and synthetic data techniques for testing scenarios
We are seeking an experienced Senior Data Quality Engineer to join our team and ensure the reliability, accuracy, and efficiency of our data systems and processes. In this role, you will lead key initiatives in data quality, leveraging advanced technologies to drive impactful results. If you are passionate about improving data workflows and enjoy working with innovative tools, this role offers the opportunity to shape the future of our data landscape.

Responsibilities
Develop and oversee data quality strategies to ensure consistent accuracy across data products and processes
Lead initiatives to improve data quality, embedding best practices across teams and projects
Create and deploy advanced testing frameworks and methodologies to uphold enterprise-level data quality standards
Manage complex data quality tasks, ensuring efficiency and prioritization within tight deadlines
Design robust testing strategies tailored to evolving system architectures and data pipelines
Provide strategic direction on resource allocation, aligning testing priorities with business and compliance requirements
Establish and refine governance frameworks to ensure adherence to industry data standards
Build and scale automated validation pipelines to support production systems
Collaborate with cross-functional teams to resolve infrastructure issues and optimize system performance
Mentor junior engineers and maintain comprehensive documentation for testing strategies and plans

Requirements
Minimum of 3 years of professional experience in Data Quality Engineering or related roles
Advanced proficiency in Python for automation and data validation tasks
Expertise in Big Data platforms, including Hadoop tools such as HDFS, Hive, and Spark, as well as modern streaming technologies such as Kafka, Flume, or Kinesis
Hands-on experience with NoSQL databases such as Cassandra, MongoDB, or HBase for managing large-scale datasets
Proficiency in data visualization tools such as Tableau, Power BI, or Tibco Spotfire to support analytics and decision-making
Extensive experience with cloud platforms such as AWS, Azure, or GCP, with a solid understanding of multi-cloud architectures
Advanced knowledge of relational databases and SQL (PostgreSQL, MSSQL, MySQL, Oracle) in high-volume environments
Proven expertise in implementing and scaling ETL processes using tools such as Talend, Informatica, or equivalent platforms
Familiarity with MDM tools and performance testing solutions such as JMeter
Advanced experience with version control systems such as Git, GitLab, or SVN, and automation for large-scale systems
Comprehensive knowledge of testing frameworks such as TDD, DDT, and BDT for data-centric environments
Experience implementing CI/CD pipelines using tools such as Jenkins or GitHub Actions
Strong analytical and problem-solving skills, with the ability to translate complex datasets into actionable insights
Excellent English communication skills (B2 level or higher), with experience engaging stakeholders and leading discussions

Nice to have
Experience with additional programming languages such as Java or Scala, or advanced Bash scripting for production-level solutions
Advanced knowledge of XPath for data validation and transformation workflows
Expertise in designing custom data generation tools and synthetic data techniques for testing scenarios

We offer
International projects with top brands
Work with global teams of highly skilled, diverse peers
Healthcare benefits
Employee financial programs
Paid time off and sick leave
Upskilling, reskilling, and certification courses
Unlimited access to the LinkedIn Learning library and 22,000+ courses
Global career opportunities
Volunteer and community involvement opportunities
EPAM Employee Groups
Award-winning culture recognized by Glassdoor, Newsweek, and LinkedIn
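For illustration, the "Python for automation and data validation" requirement typically covers checks like the following minimal sketch; the function, field names, and rules here are hypothetical, not part of EPAM's actual stack:

```python
# Minimal sketch of an automated data quality check (completeness and
# uniqueness rules). All names and rules are hypothetical examples.

def validate_rows(rows, key_field, required_fields):
    """Return a list of data quality issues found in `rows`."""
    issues = []
    seen_keys = set()
    for i, row in enumerate(rows):
        # Completeness check: required fields must be present and non-empty
        for field in required_fields:
            if row.get(field) in (None, ""):
                issues.append(f"row {i}: missing value for '{field}'")
        # Uniqueness check: the key field must not repeat
        key = row.get(key_field)
        if key in seen_keys:
            issues.append(f"row {i}: duplicate key {key!r}")
        seen_keys.add(key)
    return issues

rows = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": ""},               # fails completeness
    {"id": 1, "email": "c@example.com"},  # fails uniqueness
]
print(validate_rows(rows, "id", ["id", "email"]))
# → ["row 1: missing value for 'email'", "row 2: duplicate key 1"]
```

In production, checks like these are usually expressed in a dedicated framework (e.g. Great Expectations or Spark-based jobs) and wired into the CI/CD pipelines the posting mentions.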
Experience level: Mid-senior. Employment type: Full-time. Job function: Information Technology, Engineering, and Quality Assurance. Industries: Software Development, IT Services, and Technology, Information and Internet.