Inherent Technologies • Florida, United States
Role & seniority: QA Performance Analyst (mid-level to senior) focused on performance quality for production systems.
Performance testing: JMeter, LoadRunner, NeoLoad, LoadComplete
Monitoring/analytics: Dynatrace, AppDynamics, Splunk, Datadog, Google/Adobe/Open Web Analytics
Version control: GitHub, GitLab, Bitbucket, TFS, AWS CodeCommit
Project/collaboration: JIRA, Confluence
Automated testing tools experience a plus
Platforms: large-scale environments (e.g., SAP, JDA) a plus
Define performance testing strategy, design/execute tests (load, spike, stress, endurance, stability) and report results against SLAs
Analyze performance metrics (response times, throughput, error rates, resource utilization) and communicate defects/observations to development
Coordinate testing programs, maintain test data, support tool implementation/maintenance, and drive process improvements
2–5 years in QA performance engineering
Scripting with JMeter, LoadRunner, NeoLoad, or LoadComplete
Experience with monitoring/analytics tools and data analytics for user behavior
Proficiency with version control (Git-based) and issue-tracking (JIRA)
Knowledge of QA methodologies, metrics, and performance testing lifecycle
Ability to manage time and coordinate with teams
Experience with automated testing tools
Familiarity with large-scale platforms (SAP, JDA)
Position: QA Performance Analyst
Location: Dallas, TX / Miramar, FL (initially remote OK; local candidates preferred)
Duration: 1 year
Job Description: The QA Performance Test Engineer is responsible for the overall performance quality of systems and technology released into ETP's production environment. The role spans the full quality lifecycle: performance testing strategy and planning, design and development of automated scripts, test execution, results analysis, and metrics reporting. The QA Performance Test Engineer ensures systems meet business performance requirements (SLAs) and overall business objectives by tracking and reporting performance test results and metrics (response times, error rates, throughput, CPU and memory utilization, latency, and more), logging identified defects, and documenting observations and communicating issues to development staff to assist in their resolution.
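As an illustration of the metrics-reporting work the role describes, here is a minimal sketch in plain Python (the sample data and test window are hypothetical, not from any real results log) that derives throughput, error rate, and response-time percentiles from raw request samples of the kind a load tool like JMeter records:

```python
from statistics import quantiles

# Hypothetical raw samples: (elapsed_ms, http_status) per request.
samples = [
    (120, 200), (95, 200), (210, 200), (980, 500),
    (130, 200), (88, 200), (450, 200), (1500, 503),
    (110, 200), (140, 200),
]
test_duration_s = 5.0  # assumed wall-clock length of the test window

latencies = [ms for ms, _ in samples]
errors = [status for _, status in samples if status >= 400]

throughput = len(samples) / test_duration_s  # requests per second
error_rate = len(errors) / len(samples)      # fraction of failed requests
deciles = quantiles(latencies, n=10)         # 9 cut points: p10..p90
p50, p90 = deciles[4], deciles[8]

print(f"throughput: {throughput:.1f} req/s")
print(f"error rate: {error_rate:.0%}")
print(f"p50: {p50:.0f} ms, p90: {p90:.0f} ms")
```

Reporting percentiles rather than averages is the usual practice here, since a mean response time can hide the long tail that breaches an SLA.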
Duties and Responsibilities
Write testing strategies, test scripts, and test conditions to ensure effective performance test coverage
Execute load, spike, stress, endurance, and stability performance testing
Utilize data analytics to determine client usage characteristics and user flows
Identify test data requirements and work with other teams as needed to create and maintain test data
Coordinate and schedule testing programs with project and program teams
Report testing outcomes to project stakeholders
Assist in the implementation and maintenance of software testing tools
Identify, recommend, and implement process improvement opportunities for SGWS quality assurance programs
Stay up-to-date on software application testing tools and QA best practices
Recommend opportunities to lower testing costs or improve quality through alternative staffing models (e.g., gig, offshore, 3rd party)
Motivate team members to work collaboratively and effectively
Plan and document succession within the scope of responsibility
Minimum Qualifications
Bachelor's Degree (computer science, information systems, business administration or other industry related curriculum) or combination of education and equivalent experience
Two to five (2-5) years serving as a QA Performance Engineer
Demonstrated command of scripting using JMeter, LoadRunner, NeoLoad, and/or LoadComplete
Experience with monitoring tools (Dynatrace, Performance Center, Splunk, AppDynamics, JProfiler, Datadog, NinjaOne)
Experience with data analytics tools (Google Analytics, Adobe Analytics, Open Web Analytics)
Experience with repository/version control tools such as GitHub, GitLab, Bitbucket, TFS, AWS CodeCommit
Experience with project management and issue-tracking software such as JIRA
Experience with team collaboration and knowledge management software such as Confluence
Good time management skills (i.e. works efficiently)
Experience executing testing with large-scale platforms (SAP, JDA, etc.) is a plus
Experience working with automated testing tools
Solid working knowledge of metrics and models in product quality engineering
Comprehensive understanding of test methodologies and techniques