
W2 - Lead SDET / Data Quality Lead

New York Technology Partners Charlotte, North Carolina, United States

On-site · Contract

Salary: USD 60–65 per hour

Posted Oct 1, 2025 · Apply by Oct 31, 2025

Role & seniority: Data Quality Lead / Lead SDET, senior (10+ years)

Stack/tools: Python (automation), Java; SQL; data lake validation; Big Data cloud platforms (AWS/Azure/GCP); CI/CD; test management (Xray); data governance tools (Collibra, Informatica, Alation)

Top 3 responsibilities

  1. Design, build, and execute data validation frameworks for Basel III, BCBS 239, and Dodd-Frank regulatory programs

  2. Ensure end-to-end data integrity across data lake ingestion, transformation, and consumption, with governance alignment

  3. Develop/expand automated testing frameworks and integrate automation into CI/CD; manage test artifacts and stakeholder collaboration

Must-have skills

  • 10+ years in application development, automation framework development, or senior QA/SDET

  • Deep domain experience in Capital Markets Back Office testing (Fixed Income, Derivatives) and regulatory programs

  • Data modeling, lineage, metadata management, and data quality frameworks

  • Python/Java coding for test automation (5–7+ years), advanced SQL for data validation

  • Experience with data lake validation and cloud platforms; familiarity with data governance tools is a plus

  • Strong communication and proven long-term project leadership

Nice-to-haves

  • Experience with Collibra, Informatica, or Alation; prior experience in regulatory data programs

  • Strong stakeholder management and cross-team collaboration

Location & work type: Charlotte, NC; contract position

Full Description

Job Title: Data Quality Lead / Lead SDET

Location: Charlotte, NC

Position Type: Contract position

Job Description

We are seeking a seasoned Data Quality Lead with over 10 years of experience to join our Compliance Back Office Testing team. This senior role is critical for leading data validation strategy and execution to ensure regulatory adherence and data integrity across our post-trade, Fixed Income capital markets applications. The ideal candidate is a hands-on technical leader with deep expertise in Python automation and data lake validation, and a strong command of financial regulations such as Basel III, BCBS 239, and Dodd-Frank.

Key Responsibilities

Lead Data Validation Strategy: Design, build, and execute comprehensive data validation frameworks for regulatory programs (Basel III, BCBS 239, Dodd-Frank).

End-to-End Data Integrity: Validate data across the entire pipeline—from data lake ingestion and transformation to consumption layers—ensuring alignment with data governance and reporting requirements.

Automation & Framework Development: Develop and expand automated testing frameworks using Python to create scripts for data checks, lineage validation, and dashboard-driven monitoring.
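To make the kind of automated data checks described above concrete, here is a minimal, illustrative sketch in Python. The column names, sample trades, and check logic are invented for the example and do not reflect the actual framework this role would own.

```python
# Illustrative sketch only: minimal data-quality checks of the kind such a
# framework might automate. Field names and sample data are hypothetical.

def check_not_null(rows, column):
    """Return indices of rows where a mandatory column is missing or empty."""
    return [i for i, row in enumerate(rows) if row.get(column) in (None, "")]

def check_row_count_reconciles(source_rows, target_rows):
    """Ingestion-to-consumption reconciliation: row counts must match exactly."""
    return len(source_rows) == len(target_rows)

trades = [
    {"trade_id": "T1", "cusip": "912828XG0", "notional": 1_000_000},
    {"trade_id": "T2", "cusip": "", "notional": 250_000},  # fails the null check
]

print(check_not_null(trades, "cusip"))            # indices of failing rows
print(check_row_count_reconciles(trades, trades))
```

In practice, checks like these would be parameterized per dataset and wired into the CI/CD pipeline so every ingestion run is validated automatically.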

Stakeholder Collaboration: Engage continually with development teams to review user stories, assess impacts, vet business requirements, and integrate automation into the CI/CD pipeline.

Test Artifact Management: Maintain all test artifacts, including test plans, cases, and data within tools like Xray, converting manual scenarios into robust automation.

Risk Management: Proactively identify, report, and mitigate risks and issues related to data quality and regulatory compliance.

Mandatory Qualifications & Skills

Experience: 10+ years in application development, automation framework development, or a senior QA/SDET role.

Domain Knowledge

  • Extensive experience in Capital Markets Back Office testing, specifically in Fixed Income and Derivatives.
  • Proven track record leading data validation for regulatory programs (Basel III, BCBS 239, Dodd-Frank).
  • Deep understanding of data modeling, lineage, metadata management, and data quality frameworks.

Technical Proficiency

Python/Java Programming: Strong coding skills to automate test cases for post-trade back-office and trade-flow processing (5–7+ years).


SQL: Advanced ability to write complex queries for data validation.
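As a hedged illustration of the reconciliation queries this calls for, the sketch below uses Python's built-in sqlite3 as a stand-in for the actual warehouse engine; the table and column names are invented for the example.

```python
# Hypothetical reconciliation query: find trades present at source but missing
# from, or mismatched in, a downstream regulatory report. sqlite3 is used here
# only so the example is self-contained and runnable.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_trades   (trade_id TEXT PRIMARY KEY, notional REAL);
    CREATE TABLE reported_trades (trade_id TEXT PRIMARY KEY, notional REAL);
    INSERT INTO source_trades   VALUES ('T1', 1000000), ('T2', 250000), ('T3', 500000);
    INSERT INTO reported_trades VALUES ('T1', 1000000), ('T2', 999999);
""")

# Anti-join plus value check: missing rows OR notional values that don't tie out.
missing_or_mismatched = conn.execute("""
    SELECT s.trade_id
    FROM source_trades s
    LEFT JOIN reported_trades r ON r.trade_id = s.trade_id
    WHERE r.trade_id IS NULL OR r.notional <> s.notional
    ORDER BY s.trade_id
""").fetchall()

print(missing_or_mismatched)  # [('T2',), ('T3',)]
```

Real validation queries would target the data lake's query engine and cover full column-level comparisons, but the anti-join pattern is the same.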

Data Platforms: Hands-on experience validating data lake ingestion, transformation, and consumption; familiarity with Big Data ecosystems and cloud platforms (AWS, Azure, GCP).

Data Governance Tools: Familiarity with tools such as Collibra, Informatica, or Alation is a plus.

Soft Skills: Excellent communication skills, demonstrated ability to lead initiatives, and a proven history of stable tenure on long-term projects.


