Simulation & Test Engineer
San Francisco, San Francisco County, California, 94199, USA
Listed on 2026-01-27
Listing for: Andromeda
Full Time position
Job specializations:
- IT/Tech: AI Engineer, Robotics
Job Description
Overview
At Andromeda Robotics, we're not just imagining the future of human-robot relationships; we're building it. Abi is the first emotionally intelligent humanoid companion robot, designed to bring care, conversation, and joy to the people who need it most. Backed by tier-1 investors and with customers already deploying Abi across aged care and healthcare, we're scaling fast, and we're doing it with an engineering-first culture that's obsessed with pushing the limits of what's possible.
The Role
We are looking for a creative and driven Simulation and Test Engineer to own our simulation and test infrastructure and to develop the acceptance criteria that underpin Abi's autonomous navigation and her conversational AI & embodied behaviours. Experience across both areas is ideal, but a deep understanding of at least one is essential. Your work will lay the foundations for extending those simulation environments to generate synthetic data for our Machine Learning (ML) models.
The Team
You'll work at the intersection of:
• Conversational AI
• Robotics & controls
• Perception & audio engineering
• Autonomy
• Platform engineering
You'll collaborate closely with product owners and technical leads to define requirements and their test cases, and you'll be the custodian of quality across our autonomy and AI/ML stack. Your work will directly impact the speed and quality of our development, ensuring that every software build is robust, reliable, and safe before it ever touches physical hardware or real users.
What You'll Do
Architect & Build Simulation and Test Platforms
• Design, develop, and maintain a scalable, high-fidelity simulation platform for Abi that supports both navigation and embodied interaction use cases
• Own sim-to-real and test-to-deployment:
Develop robust CI/CD pipelines for automated testing in simulation and synthetic test environments, enabling rapid iteration and guaranteeing software quality before deployment onto our physical robots
• Model with fidelity:
Implement accurate models of Abi's hardware, including sensors (cameras, microphones, LiDAR, etc.), actuators, kinematics, and upper-body motion, as needed for both navigation and interaction scenarios
Develop Worlds, Scenarios and Test Suites
• Develop virtual worlds and test scenarios:
- Navigation-focused environments (indoor facilities, dynamic human traffic, obstacles, edge cases)
- Conversational & social interaction scenarios (multi-speaker audio scenes, social group configurations, gesture contexts)
• Conversational AI & memory testing:
Build synthetic test environments for:
- Voice-to-voice conversational quality and response appropriateness
- Tool-calling / action selection behaviour
- Memory systems: context retention, recall accuracy, conversation coherence
• Perception & audio testing:
Create test suites and synthetic scenes for:
- Social awareness (face detection, gaze tracking, person tracking)
- Audio modelling (multi-speaker, room acoustics, noise conditions, VAD)
• Gesture / embodiment testing:
Working with Controls/ML, create infrastructure to validate that Abi's body gestures and animations are appropriate, synchronised, and safe in real and simulated interactions
Own Quality, Metrics and Regression
• Custodian of quality metrics:
Where quality metrics don't yet exist, work with stakeholders to elicit use cases, derive requirements, and define measurable metrics for navigation, conversational AI, audio, perception, and gesture
• Formalise requirements and traceability:
Capture requirements and trace them through to test cases and automated regression suites
• Analyse and improve:
Build dashboards, tools and analysis pipelines to mine test and simulation data, identify bugs, track performance over time, and feed actionable insights back to engineering teams
Scale to Synthetic Data & ML Training
• Extend test environments into training data generation pipelines, working closely with character and autonomy teams
• Investigate and stand up simulation tools (e.g. Unity, Unreal Engine, NVIDIA Isaac) to generate high-fidelity synthetic interaction data at scale for:
- Character animation and gesture models
- Perception models (vision, audio, social…