
Shrive Technologies • Miramar, Florida, United States
Role & seniority: QA Performance Test Engineer, 2–5 years experience in performance QA.
Performance testing: JMeter, LoadRunner, NeoLoad, LoadComplete
Monitoring: Dynatrace, AppDynamics, Splunk, Datadog, JProfiler
Data analytics: Google Analytics, Adobe Analytics, Open Web Analytics
Version control / repos: GitHub, GitLab, Bitbucket, TFS, AWS CodeCommit
Project/issue tracking: JIRA; collaboration: Confluence
Other: automated testing tools; experience with large-scale platforms (SAP, JDA) is a plus
Design, develop, and execute performance, load, spike, stress, endurance, and stability tests; create testing strategies and scripts
Analyze results, track performance metrics (response times, throughput, error rates, CPU/memory, latency), and report to stakeholders
Coordinate testing programs, manage test data requirements, promote process improvements, and assist tool adoption/maintenance
Bachelor’s degree or equivalent experience
2–5 years as a QA Performance Engineer
Scripting with JMeter, LoadRunner, NeoLoad, or LoadComplete
Experience with monitoring and analytics tools (Dynatrace, AppDynamics, Splunk, Datadog; GA/Adobe/Open Web Analytics)
Version control (GitHub/GitLab/Bitbucket/TFS/AWS CodeCommit)
Project management/issue-tracking (JIRA); collaboration (Confluence)
Strong time management; solid knowledge of performance metrics and QA methodologies
Job Description: The QA Performance Test Engineer is responsible for the overall performance quality of systems and technology released into ETP's production environment. The role covers all aspects of performance quality: performance testing strategy and planning, design and development of automated scripts, test execution, results analysis, and metrics reporting. The QA Performance Test Engineer ensures that systems meet business performance requirements (SLAs) and overall business objectives by tracking and reporting performance test results and metrics (response times, error rates, throughput, CPU and memory utilization, latency, and more), logging identified defects, documenting observations, and communicating issues to development staff to assist in their resolution.
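To make the metrics named here concrete, the following is a minimal, hypothetical sketch of how raw test samples might be rolled up into response-time, throughput, and error-rate figures; the function and field names are illustrative, not part of any tool mentioned in this posting.

```python
# Hypothetical sketch: summarize performance-test samples into the metrics
# described above. Each sample is (elapsed_ms, ok); duration_s is the
# wall-clock length of the test run.
from statistics import mean, quantiles

def summarize(samples, duration_s):
    elapsed = [ms for ms, _ in samples]
    errors = sum(1 for _, ok in samples if not ok)
    return {
        "avg_ms": mean(elapsed),                     # mean response time
        "p95_ms": quantiles(elapsed, n=20)[-1],      # 95th-percentile latency
        "throughput_rps": len(samples) / duration_s, # requests per second
        "error_rate": errors / len(samples),         # fraction of failures
    }
```

In practice a tool such as JMeter or LoadRunner produces these aggregates directly; the point is only what each metric measures.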
Duties And Responsibilities
Write testing strategies, test scripts and conditions to ensure effective performance test coverage
Execute performance, load, spike, stress, endurance, and stability testing
Utilize data analytics to determine client usage characteristics and user flows
Identify test data requirements and work with other teams as needed to create and maintain test data
Coordinate and schedule testing programs with project and program teams
Report testing outcomes to project stakeholders
Assist in the implementation and maintenance of software testing tools
Identify, recommend, and implement process improvement opportunities for SGWS quality assurance programs
Stay up-to-date on software application testing tools and QA best practices
Recommend opportunities to lower testing costs or improve quality through alternative staffing models (e.g., gig, offshore, 3rd party)
Motivate team members to work collaboratively and effectively
Plan and document succession planning within scope of responsibility
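As a rough illustration of the load-generation duties above, here is a hypothetical, minimal closed-loop load driver; a real engagement would use JMeter, LoadRunner, NeoLoad, or LoadComplete, and the names below are assumptions for the sketch only.

```python
# Hypothetical sketch: drive `target()` from `users` concurrent workers and
# collect (elapsed_ms, ok) samples for later analysis.
import time
from concurrent.futures import ThreadPoolExecutor

def run_load(target, users=10, iterations=50):
    def one_user(_):
        samples = []
        for _ in range(iterations):
            start = time.perf_counter()
            try:
                target()
                ok = True
            except Exception:
                ok = False
            samples.append(((time.perf_counter() - start) * 1000, ok))
        return samples

    with ThreadPoolExecutor(max_workers=users) as pool:
        # Flatten per-user sample lists into one result set.
        return [s for user in pool.map(one_user, range(users)) for s in user]
```

Varying `users` over time is what distinguishes the test types listed: a steady level is a load test, a sudden jump a spike test, a long steady run an endurance test.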
Minimum Qualifications
Bachelor's Degree (computer science, information systems, business administration, or other industry-related curriculum) or a combination of education and equivalent experience
Two to five (2-5) years serving as a QA Performance Engineer
Demonstrated command of scripting using JMeter, LoadRunner, NeoLoad, and/or LoadComplete
Experience with monitoring tools (Dynatrace, Performance Center, Splunk, AppDynamics, JProfiler, Datadog, NinjaOne)
Experience with data analytics tools (Google Analytics, Adobe Analytics, Open Web Analytics)
Experience with repository/version control tools such as GitHub, GitLab, Bitbucket, TFS, AWS CodeCommit
Experience with project management and issue-tracking software such as JIRA
Experience with team collaboration and knowledge management software such as Confluence
Good time management skills (i.e., works efficiently)
Experience executing testing with large-scale platforms (SAP, JDA, etc.) is a PLUS
Experience working with automated testing tools
Solid working knowledge of metrics and models in product quality engineering
Comprehensive understanding of test methodologies and techniques