symplr • Karnataka, India
Role & seniority
Senior performance testing role: 6–9 years' experience, leads test strategy, and mentors junior QA engineers
Stack/tools
Performance testing: JMeter, BlazeMeter, Taurus, NeoLoad, LoadRunner
Monitoring: Perfmon, Linux performance tools; DataDog, AppDynamics, Honeycomb
CI/CD: Jenkins, Azure DevOps
Cloud/Orchestration: AWS, Docker, Kubernetes
Languages: BeanShell, Java, Python
Other: QA/Dev collaboration, test planning & environment setup
Top 3 responsibilities
Design and lead end-to-end performance testing strategies; define KPIs, benchmarks, dashboards, and real-time insights
Lead troubleshooting of complex performance issues across QA, staging, pre-prod, and production; mentor junior QA engineers
Integrate performance testing into CI/CD pipelines; produce test plans, scripts, reports, and performance analysis for stakeholders
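As an illustration of the KPI work above, here is a minimal sketch (the data and field names are hypothetical, not from the posting) that computes two common load-test KPIs, throughput and p95 latency, from raw response-time samples:

```python
import math

def percentile(samples, pct):
    """Nearest-rank percentile of a list of numbers."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(pct / 100 * len(ordered)))
    return ordered[rank - 1]

def kpis(response_times_ms, duration_s):
    """Summarize a load-test run: request count, throughput, and latency."""
    return {
        "requests": len(response_times_ms),
        "throughput_rps": len(response_times_ms) / duration_s,
        "p95_ms": percentile(response_times_ms, 95),
        "max_ms": max(response_times_ms),
    }

# Hypothetical sample: 100 requests over a 10-second window
samples = [120 + i for i in range(100)]  # 120..219 ms
print(kpis(samples, duration_s=10.0))
```

In practice these numbers would come from a JMeter `.jtl` results file or an APM export; the same summary then feeds the benchmarks and dashboards the role calls for.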
Must-have skills
6–9 years’ experience with industry-standard performance test automation tools (JMeter, BlazeMeter, Taurus, NeoLoad, LoadRunner)
Strong scripting in BeanShell/Java/Python
Expertise in test planning, workload modeling, environment/data setup, defect management
Experience with system monitoring (Perfmon, Linux metrics) and APM tools
AWS/cloud, containerization (Docker/Kubernetes); CI/CD integration (Jenkins, Azure DevOps)
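The CI/CD integration requirement usually reduces to a pass/fail gate in the pipeline. A minimal sketch (the thresholds and metric names are assumptions, not from the posting) of a script a Jenkins or Azure DevOps stage could run after a load test:

```python
import json
import sys

def check_gate(metrics, max_p95_ms=500.0, max_error_rate=0.01):
    """Return a list of violation messages; an empty list means the gate passes."""
    violations = []
    if metrics["p95_ms"] > max_p95_ms:
        violations.append(f"p95 latency {metrics['p95_ms']}ms exceeds {max_p95_ms}ms")
    if metrics["error_rate"] > max_error_rate:
        violations.append(f"error rate {metrics['error_rate']:.2%} exceeds {max_error_rate:.2%}")
    return violations

if __name__ == "__main__" and len(sys.argv) > 1:
    # A pipeline stage would invoke this with the summary the test run produced,
    # e.g.  python perf_gate.py results.json
    with open(sys.argv[1]) as f:
        problems = check_gate(json.load(f))
    for p in problems:
        print("FAIL:", p)
    sys.exit(1 if problems else 0)  # non-zero exit fails the pipeline stage
```

Failing the build on a breached latency or error budget is what turns performance testing into continuous testing rather than a one-off report.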
Nice-to-haves
Frontend performance testing (web/mobile)
Experience with AI/ML frameworks (TensorFlow, PyTorch)
Ability to communicate complex concepts to non-technical stakeholders
Overview
Design comprehensive performance testing strategies and collaborate with cross-functional teams to ensure system reliability, scalability, and responsiveness across applications.
Work closely with development and operations teams to identify key performance indicators (KPIs) and establish benchmarks, monitoring solutions, and dashboards that provide real-time insight into system performance.
Lead the troubleshooting and resolution of complex performance issues in QA, staging, pre-production, and production environments.
Provide guidance and mentorship to junior QA engineers, fostering a culture of quality and continuous learning.
Use industry-standard performance testing tools (e.g., JMeter, LoadRunner, Gatling) to simulate real-world scenarios and measure system performance, staying current with emerging tools and technologies in the performance testing space.
Collaborate with development, QA, and operations teams to integrate performance testing into continuous integration and continuous deployment (CI/CD) processes, advising team members on performance testing best practices.
Analyze CPU utilization, memory usage, network usage, and garbage collection to verify application performance.
Generate performance graphs, session reports, and related documentation for validation and analysis.
Create comprehensive performance test documentation, including test plans, test scripts, and performance analysis reports, and communicate results and recommendations to technical and non-technical stakeholders.
Qualifications
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
6–9 years of experience with industry-standard performance test automation tools such as JMeter, BlazeMeter, Taurus, NeoLoad, or LoadRunner (optional).
Strong scripting knowledge in BeanShell, Java, or Python.
Expertise in test planning, test strategy, workload model design, test case design, test environment setup, test data setup, and defect management.
Skills required
Good experience gathering non-functional requirements (NFRs) from scratch for performance testing projects.
Understanding of hardware and software architecture to effectively design and execute performance tests.
Experience with system monitoring techniques and tools such as Perfmon and Linux performance observability tooling.
Experience with one or more application performance management (APM) tools such as Datadog, AppDynamics, or Honeycomb.
A dynamic, hard-working, and ambitious individual with excellent oral and written communication skills.
Knowledge of Unix/Windows hardware, software, and application environments (Java, .NET, and open source) and a solid understanding of their capacity and performance indicators.
Experience investigating complex application and infrastructure performance issues.
Extensive experience with AWS cloud technologies and containerization (Docker/Kubernetes).
Ability to integrate automated tests into CI/CD pipelines using Jenkins or Azure DevOps for continuous testing and deployment.
Good to have: knowledge of frontend performance testing for web and mobile.
Experience with AI/ML frameworks (e.g., TensorFlow, PyTorch) is a plus.
Strong verbal and written communication skills, with the ability to convey complex technical concepts to non-technical stakeholders.
Agile development experience.
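Workload model design, listed above, often starts from Little's Law: the number of concurrent virtual users needed is roughly target throughput times the sum of response time and think time (N = X × (R + Z)). A small sketch with made-up targets:

```python
import math

def required_vusers(target_rps, resp_time_s, think_time_s=0.0):
    """Little's Law (N = X * (R + Z)): virtual users needed to sustain
    target_rps given mean response time R and per-user think time Z."""
    return math.ceil(target_rps * (resp_time_s + think_time_s))

# Hypothetical targets: 200 req/s, 0.4 s mean response time, 3 s think time
print(required_vusers(200, 0.4, 3.0))  # -> 680 virtual users
```

The result translates directly into the thread-group or concurrency settings of tools like JMeter or Taurus when sizing a load test against an agreed NFR.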