Yameo • Poland
Salary: PLN 135–160 / hour
Role & seniority
Data Quality Tester (experienced)
Stack/tools
Python, PySpark, SQL; Databricks
Azure services for storage/processing; Azure DevOps; CI/CD
AMQP for JSON message validation; SCADA data interfaces
Top 3 responsibilities
Build and maintain automated data tests in Databricks (Python, PySpark, SQL)
Validate incoming JSON messages over AMQP and ensure data quality (structure, completeness, business rules)
Test end-to-end data flows (SCADA → messaging → Databricks), run quality checks, and store/present results; integrate tests into CI/CD and document results
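As a sketch of the message-validation responsibility above: the snippet below checks an incoming JSON payload for structure, completeness, and a business rule. This is a minimal illustration, not the project's actual schema — the field names (`sensor_id`, `timestamp`, `value`) and the non-negativity rule are assumptions, and the AMQP transport itself is out of scope here.

```python
import json

# Illustrative rules only -- the real message schema is project-specific.
REQUIRED_FIELDS = {"sensor_id": str, "timestamp": str, "value": float}

def validate_message(raw: str) -> list[str]:
    """Return a list of validation errors; an empty list means the message passed."""
    try:
        msg = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    if not isinstance(msg, dict):
        return ["payload is not a JSON object"]

    errors = []
    # Structure and completeness: every required field present, with the right type.
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in msg:
            errors.append(f"missing field: {field}")
        elif not isinstance(msg[field], expected_type):
            errors.append(f"wrong type for {field}: expected {expected_type.__name__}")
    # Example business rule (assumed): sensor readings must be non-negative.
    if isinstance(msg.get("value"), float) and msg["value"] < 0:
        errors.append("business rule violated: value must be non-negative")
    return errors
```

In practice a test like this would run inside the Databricks pipeline on each message pulled from the queue, with failures stored for reporting rather than returned to a caller.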
Must-have skills
Solid experience with Python and PySpark/SQL
Hands-on Databricks practice
Experience using Azure services for data storage/processing
Familiarity with Azure DevOps and CI/CD pipelines
Nice-to-haves
Experience with SCADA or industrial data
Location & work type
Poland-based team; B2B cooperation (135–160 PLN/hour)
Remote-friendly; long-term, real-project focus with international clients
Growth opportunities (training, certifications) and supportive team culture
We’re a Polish–Dutch team of engineers who have been building real-world tech since 2005. Our products are used globally in finance, insurance, and healthcare - and we care about doing things properly: solid quality, long-term thinking, and solutions we can genuinely be proud of. No quick hacks, no throwaway projects.
We are seeking experienced Data Quality Testers to support a large-scale data integration initiative for a leading company in the energy sector. You will ensure the quality, consistency, and reliability of data flowing from SCADA systems through messaging layers into the Databricks environment: building automated checks, validating incoming messages, and helping the team trust its data.
What you’ll be doing
Building and maintaining automated data tests in Databricks (Python, PySpark, SQL)
Validating incoming JSON messages over AMQP (structure, completeness, business rules)
Testing end-to-end data flows: SCADA → messaging → Databricks
Running data quality checks (schemas, ranges, anomalies, transformation logic)
Storing and presenting test results using Microsoft Azure services
Plugging tests into CI/CD pipelines with Azure DevOps
Working closely with Data Engineers, SCADA experts, and Cloud teams
Keeping test cases and results well documented (so others can actually use them)
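To make the quality checks concrete — schema completeness, value ranges, and simple anomaly detection might look roughly like the sketch below. It is written in plain Python over a list of records for portability; on the project these checks would be PySpark queries over Databricks tables, and the column name and thresholds here are illustrative assumptions.

```python
import statistics

def run_quality_checks(rows: list[dict]) -> dict[str, int]:
    """Count rule violations per check over a batch of records.

    Sketch only: the 'value' column, the 0..1000 range, and the 3-sigma
    anomaly rule are assumed for illustration.
    """
    report = {"missing_value": 0, "out_of_range": 0, "anomaly": 0}
    values = [r["value"] for r in rows if r.get("value") is not None]
    mean = sum(values) / len(values) if values else 0.0
    std = statistics.pstdev(values) if len(values) > 1 else 0.0

    for row in rows:
        v = row.get("value")
        if v is None:                          # completeness check
            report["missing_value"] += 1
        elif not (0.0 <= v <= 1000.0):         # range check (assumed bounds)
            report["out_of_range"] += 1
        elif std > 0 and abs(v - mean) > 3 * std:  # crude 3-sigma anomaly check
            report["anomaly"] += 1
    return report
```

The resulting counts would then be persisted (e.g. to an Azure-hosted results table) and surfaced in reporting, with the pipeline run failing when any count exceeds an agreed threshold.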
Must have
Solid experience with Python and PySpark/SQL
Hands-on work with Databricks
Experience using Azure services for storing and processing data
Familiarity with Azure DevOps and CI/CD pipelines
Nice to have
Experience with SCADA or industrial data
What we offer
B2B cooperation: 135–160 PLN/hour
Growth: training budget, certifications, conferences, clear development path
Real projects: long-term product work for large international clients
Health: private medical care (also for family), sports events support, wellness initiatives
People: team meetups, celebrations, and a friendly, no-drama atmosphere