
Acunor • Cleveland, Ohio, United States
Role & seniority: Kafka Tester, Experienced/Mid-senior level
Stack/tools: Apache Kafka (topics/partitions/producers/consumers/offsets/security), JSON/Avro payloads, schema validation, Oracle SQL, MongoDB, Postman, Git, CI/CD, Jira, Confluence, X-ray
Design, develop, and execute test plans to validate Kafka-based data pipelines and message flows
Verify production/consumption across Kafka topics, partitions, and consumer groups; ensure data integrity across downstream systems; perform schema validation and serialization/deserialization checks
Troubleshoot Kafka client configurations, connectivity, and message processing; perform API testing; support CI/CD automated testing; collaborate in Agile environments
Strong knowledge of Kafka architecture (topics, partitions, producers/consumers, offsets, security)
Backend/message testing in event-driven systems; integration testing and automation
Proficiency in SQL (complex queries) and NoSQL (MongoDB); familiarity with Oracle SQL
Git and modern DevOps workflows; experience with CI/CD processes
Practical experience with Avro/JSON schema validation; schema evolution
Experience with API testing tools beyond Postman; familiarity with test tracking/documentation tools
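The schema-evolution familiarity asked for above usually comes down to compatibility rules. A minimal sketch, assuming Avro-style record schemas written as plain dicts — the `Order` schemas and the simplified backward-compatibility rule are illustrative, not from the posting; a real check would go through the schema registry's compatibility API:

```python
# Simplified backward-compatibility rule for Avro-style record schemas:
# a newer schema may add fields only if those fields carry defaults.
# Schemas are plain dicts mirroring Avro's JSON form (hypothetical example).
def backward_compatible(old_schema: dict, new_schema: dict) -> bool:
    old_fields = {f["name"] for f in old_schema["fields"]}
    for field in new_schema["fields"]:
        # A field unknown to the old schema needs a default to stay readable.
        if field["name"] not in old_fields and "default" not in field:
            return False
    return True

v1 = {"type": "record", "name": "Order",
      "fields": [{"name": "id", "type": "long"}]}
v2_ok = {"type": "record", "name": "Order",
         "fields": [{"name": "id", "type": "long"},
                    {"name": "note", "type": "string", "default": ""}]}
v2_bad = {"type": "record", "name": "Order",
          "fields": [{"name": "id", "type": "long"},
                     {"name": "note", "type": "string"}]}

print(backward_compatible(v1, v2_ok))   # True
print(backward_compatible(v1, v2_bad))  # False
```

This captures only the "added field without default" rule; full Avro resolution also covers type promotions and aliases.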
Location & work type: Cleveland, OH; Hybrid (3 days onsite, 2 days remote); Full-Time
Job Title: Kafka Tester
Location: Cleveland, OH (Hybrid – 3 days onsite, 2 days remote)
Employment Type: Full-Time
Role Overview
We are seeking a highly skilled Kafka Tester with strong backend and message validation expertise to join our team. The ideal candidate will have hands-on experience in testing event-driven architectures, validating Kafka message flows, and ensuring data integrity across distributed systems.
Key Responsibilities
Design, develop, and execute test plans to validate Kafka-based data pipelines and message flows.
Verify data production and consumption across Kafka topics, partitions, and consumer groups.
Validate JSON and Avro message payloads, including schema validation and serialization/deserialization checks.
Write and execute SQL queries in Oracle and perform data validation in MongoDB based on mapping logic.
Ensure integrity and accuracy of data flowing through Kafka to downstream systems.
Troubleshoot Kafka client configurations, connectivity issues, and message processing errors.
Perform API testing using tools such as Postman.
Support continuous integration and automated testing within CI/CD pipelines.
Collaborate with development and DevOps teams in Agile environments to ensure quality delivery.
Actively participate in Agile ceremonies and use tools such as Jira, Confluence, and X-ray for tracking and documentation.
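The payload-validation work described above can be sketched without a live broker: the check below deserializes a message value and verifies field presence and types. A real pipeline would validate against a registered Avro/JSON schema; the field names and types here are hypothetical:

```python
import json

# Hypothetical expected shape of a message value; stands in for a
# registered schema. Field names are illustrative, not from the posting.
EXPECTED_FIELDS = {"order_id": int, "customer_id": str, "amount": float}

def validate_message(raw: bytes) -> list[str]:
    """Deserialize a Kafka message value and return a list of problems."""
    try:
        payload = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"not valid JSON: {exc}"]
    problems = []
    for field, expected_type in EXPECTED_FIELDS.items():
        if field not in payload:
            problems.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected_type):
            problems.append(f"wrong type for {field}: "
                            f"{type(payload[field]).__name__}")
    return problems

if __name__ == "__main__":
    good = b'{"order_id": 1, "customer_id": "C42", "amount": 9.99}'
    bad = b'{"order_id": "1", "customer_id": "C42"}'
    print(validate_message(good))  # []
    print(validate_message(bad))
```

In practice the same function would run inside a consumer loop over the topic under test, with each message value passed in as `raw`.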
Required Skills & Qualifications
Strong understanding of Kafka architecture (topics, partitions, producers, consumers, offsets, security).
Hands-on experience in backend/message testing within event-driven systems.
Proficiency in writing complex SQL queries and working with NoSQL databases.
Experience in integration testing and automation practices.
Familiarity with Git and modern DevOps workflows.
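The SQL-driven data validation called for here often takes the form of source-vs-target reconciliation. On this job the endpoints would be Oracle and MongoDB; sqlite3 stands in for both sides below so the sketch stays self-contained, and the table and column names are made up:

```python
import sqlite3

def row_count(conn: sqlite3.Connection, table: str) -> int:
    """Count rows in a table (table name is trusted test fixture input)."""
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

# In-memory stand-in for the source (Oracle) and target (MongoDB) sides.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE source_orders (id INTEGER, amount REAL)")
conn.execute("CREATE TABLE target_orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO source_orders VALUES (?, ?)",
                 [(1, 9.99), (2, 5.00)])
conn.executemany("INSERT INTO target_orders VALUES (?, ?)",
                 [(1, 9.99), (2, 5.00)])

# Typical pipeline checks: counts match, and no source id is missing
# from the target after the Kafka flow has landed the data.
assert row_count(conn, "source_orders") == row_count(conn, "target_orders")
diff = conn.execute(
    "SELECT id FROM source_orders EXCEPT SELECT id FROM target_orders"
).fetchall()
print("unmatched ids:", diff)  # unmatched ids: []
```

Against real systems, the mapping logic from the responsibilities section would translate each source column to its target field before comparing.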