
Pronto • San Francisco, California, United States
Role & seniority
Stack / tools
Python (main automation language)
Test frameworks: pytest, unittest, or similar
Simulation-based testing, automation infrastructure, log data analysis
CI/CD: GitHub Actions, Jenkins; Docker; cloud compute for simulation
Robotics/autonomy domain knowledge (preferred)
Tools: Gazebo/CARLA (familiarity), Notion/Slack for docs/communication
Top 3 responsibilities
Release & regression testing: define relevant test cases per release, coordinate field tests with Ops, analyze results, reproduce and diagnose issues, maintain sign-off docs
Test automation & simulation: build/maintain simulation-based test infra, develop automated scripts, compare expected vs. actual vehicle behavior from logs, establish test coverage metrics and dashboards
Test request management & process improvement: own intake of test requests, identify patterns to steer simulation tests, continuously improve testing processes and train Ops on procedures
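To make the "compare expected vs. actual vehicle behavior from logs" responsibility concrete, here is a minimal sketch of such a check. The log record format, field names, and tolerance are invented for illustration; Pronto's actual log schema is not described in the posting.

```python
# Hypothetical sketch: flag log records that deviate from expected behavior.
# The record format ({"t": ..., "speed": ...}) and tolerance are assumptions.

def check_behavior(log_records, expected, tolerance=0.1):
    """Return (timestamp, field, expected, actual) tuples for each mismatch."""
    mismatches = []
    for record in log_records:
        t = record["t"]
        if t not in expected:
            continue  # no expectation defined for this timestamp
        for field, want in expected[t].items():
            got = record.get(field)
            if got is None or abs(got - want) > tolerance:
                mismatches.append((t, field, want, got))
    return mismatches

logs = [{"t": 0.0, "speed": 5.0}, {"t": 1.0, "speed": 5.3}]
expected = {0.0: {"speed": 5.0}, 1.0: {"speed": 5.0}}
print(check_behavior(logs, expected, tolerance=0.1))  # → [(1.0, 'speed', 5.0, 5.3)]
```

A production version would read replayed log data rather than inline dictionaries, but the core idea, a tolerance-based diff between expected and logged values, is the same.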
Must-have skills
3+ years in software testing, QA, or test automation
Strong Python coding for automation
Experience building test frameworks or automation infrastructure
Excellent cross-team communication (engineering, operations)
Systematic test-case design, bug tracking, log data debugging
Nice-to-haves
Robotics/autonomous vehicles or safety-critical systems
CI/CD experience (GitHub Actions, Jenkins)
Proficiency with pytest/unittest, simulation environments (Gazebo, CARLA), and data analysis for querying test results and logs
What You'll Own
Release Testing Pipeline: Design and execute regression test suites for each release, coordinating with Ops to schedule field tests and ensuring smooth hand-offs
Test Case Development: Create, maintain, and curate test cases that cover critical autonomous behaviors, edge cases, and regression scenarios
Test Request Management: Triage incoming test requests from developers and iterate on process improvements
Test Automation Infrastructure: Build and maintain automated testing frameworks, with a focus on simulation-based testing that can catch issues before they reach the field

Responsibilities
Release & Regression Testing
Define which test cases are relevant for each release based on code changes and risk assessment
Coordinate with the Ops team to schedule and execute field tests
Analyze test results, identify failures, and work with developers to reproduce and diagnose issues
Maintain release testing documentation and sign-off processes

Test Automation & Simulation
Build simulation-based test infrastructure to validate autonomy software before field deployment
Develop automated test scripts that can run against simulation and log replay
Create tooling to compare expected vs. actual vehicle behavior from logged data
Establish metrics and dashboards for test coverage and pass rates

Test Request Management
Own the test request intake process from developers
Identify patterns in test requests to inform simulation-based testing priorities

Process Improvement
Continuously improve the testing process to reduce cycle time and increase coverage
Document test procedures and maintain test case repositories
Train Ops team members on test execution procedures
Identify gaps in test coverage and propose new test cases
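Selecting the test cases relevant to a release based on code changes can be sketched as a simple coverage map from source files to test cases. The file paths and test names below are invented for illustration; in practice the map would be generated from coverage data or dependency analysis.

```python
# Hypothetical sketch of change-based test selection. The mapping from source
# files to test cases is invented; a real map would come from coverage tooling.
COVERAGE_MAP = {
    "autonomy/steering.py": {"test_steering_regression", "test_lane_keeping"},
    "autonomy/planner.py": {"test_route_planning"},
    "common/logging.py": {"test_steering_regression", "test_route_planning"},
}

def select_tests(changed_files):
    """Return the union of test cases mapped to any changed file."""
    selected = set()
    for path in changed_files:
        selected |= COVERAGE_MAP.get(path, set())
    return selected

print(sorted(select_tests(["autonomy/steering.py"])))
# → ['test_lane_keeping', 'test_steering_regression']
```

Risk assessment would then rank the selected cases, e.g. by how recently they failed or how safety-critical the touched module is.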
Travel note: This role requires periodic travel to customer sites (up to 10%)
Schedule note: Some schedule flexibility may be required during deployments

Required Qualifications
3+ years of experience in software testing, QA, or test automation
Strong Python programming skills (you'll write significant automation code)
Experience building test frameworks or automation infrastructure
Excellent communication skills for coordinating across engineering and operations teams
Systematic approach to test case design and bug/issue tracking
Comfortable working with log data and debugging complex system behavior

Preferred Qualifications
Experience with robotics, autonomous vehicles, or safety-critical systems
Background in CI/CD pipelines and test integration (GitHub Actions, Jenkins)
Experience with pytest, unittest, or similar Python test frameworks
Understanding of control systems, localization, or perception (enough to design meaningful tests)
Data analysis skills for querying test results and logs
Familiarity with simulation environments (Gazebo, CARLA, or similar)

Technical Environment
Languages: Python
Test Frameworks: pytest; simulation-based testing
Infrastructure: Docker, GitHub Actions, cloud compute for simulation
Communication/Documentation: Slack, Notion

Example Projects
Create an automated regression suite that validates steering controller performance across a library of recorded scenarios
Design a test case prioritization system that maps code changes to relevant test cases
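The first example project, a regression suite validating steering controller performance across recorded scenarios, might look like the minimal pytest-style sketch below. The scenario data, error metric, and tolerance are all invented for illustration.

```python
# Hypothetical sketch of a scenario-driven steering regression check.
# Scenario recordings and the 0.05 rad tolerance are assumptions.
SCENARIOS = {
    "gentle_curve": {"commanded": [0.0, 0.1, 0.2], "actual": [0.0, 0.09, 0.21]},
    "sharp_turn": {"commanded": [0.0, 0.4, 0.8], "actual": [0.0, 0.38, 0.79]},
}

def max_steering_error(scenario):
    """Largest absolute deviation between commanded and logged steering angles."""
    return max(abs(c - a) for c, a in zip(scenario["commanded"], scenario["actual"]))

def test_steering_within_tolerance():
    # In a real suite, @pytest.mark.parametrize over scenario names would give
    # one reported result per scenario instead of a single pass/fail.
    for name, scenario in SCENARIOS.items():
        assert max_steering_error(scenario) < 0.05, name

test_steering_within_tolerance()
print("all scenarios within tolerance")
```

A real suite would load the scenario library from recorded logs and report per-scenario results to a coverage dashboard, as described under Test Automation & Simulation above.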
Pronto is an equal opportunity employer and does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status.