
Senior Software Engineer, Perception

Job in Austin, Travis County, Texas, 78716, USA
Listing for: Bot Crew
Full Time position
Listed on 2026-01-12
Job specializations:
  • Engineering
    Robotics, AI Engineer
Salary/Wage Range: USD 125,000 - 150,000 per year
Job Description & How to Apply Below

Join to apply for the Senior Software Engineer, Perception role at Bot Crew.

Who We Are

Founded in 2022, Bot Crew has emerged as a leader in the solar robotics space, solving real-world problems that deliver value to our end customers. Our robotics platform, Gravion, is trusted by 80% of the top Engineering, Procurement, and Construction companies in North America, and we aim to expand worldwide in the near future. For additional information about Bot Crew and Gravion, please visit our website.

About the Role

We are seeking a skilled perception engineer to lead the design, implementation, and deployment of the perception stack powering our autonomous robotic systems. In this role, you will own the end-to-end perception pipeline, from sensors and calibration through real-time inference and tracking, delivering reliable scene understanding that enables safe, robust robot behavior in unstructured environments. You will work closely with autonomy, robotics software, and hardware teams to integrate and optimize computer vision and sensor‑fusion capabilities that operate on embedded compute at the edge.

Responsibilities
  • Architect, implement, and maintain Bot Crew’s on‑robot perception stack, including detection, segmentation, depth/3D understanding, tracking, and state estimation inputs needed by autonomy.
  • Develop and deploy computer vision and machine learning models for real‑time operation on embedded or edge compute (e.g., NVIDIA Jetson/Orin‑class platforms), including optimization and profiling.
  • Build robust sensor pipelines (e.g., RGB cameras, stereo/depth, LiDAR, IMU), including time synchronization, calibration, and data validation.
  • Implement and productionize sensor fusion approaches (e.g., camera + depth/LiDAR + IMU) to improve reliability across lighting, weather, motion, and environmental variability.
  • Design evaluation methodologies and metrics; create tooling for offline analysis, dataset curation, model regression testing, and performance monitoring.
  • Partner with hardware and systems engineering to select sensors, define compute requirements, and ensure thermal/power/performance constraints are met.
  • Improve runtime resilience: fault detection, graceful degradation, and recovery behaviors when sensors or models underperform.
  • Lead technical decision‑making across perception; contribute to roadmap planning, technical reviews, and mentoring other engineers.
  • Document system architecture, interfaces, and operational playbooks to support testing, deployment, and field operations.
Qualifications
  • 3+ years of professional software engineering experience, with significant ownership of production systems.
  • Strong proficiency in modern C++, including performance‑aware design for real‑time systems.
  • Demonstrated experience shipping perception or robotics capabilities to production (on‑robot, on‑vehicle, or edge deployment).
  • Solid understanding of computer vision fundamentals (multi‑view geometry, tracking, camera models) and practical ML deployment.
  • Experience with common perception tooling and frameworks (e.g., OpenCV, PyTorch/TensorFlow, ROS/ROS2 or equivalent middleware).
  • Experience integrating and validating sensors, including calibration, synchronization, and handling noisy/partial data.
  • Ability to debug complex systems using logs, traces, profiling tools, and structured experimentation.
  • Strong communication skills and ability to collaborate across autonomy, hardware, and operations teams.
  • Proven ability to leverage AI‑assisted tools (for coding, debugging, and technical research) as part of the development workflow.
Preferred/Bonus Qualifications
  • Experience deploying optimized inference (TensorRT, ONNX Runtime, CUDA) and accelerating models on NVIDIA GPUs/edge platforms.
  • Prior work with 3D perception: point clouds, voxel/BEV representations, LiDAR‑camera fusion, SLAM inputs, or depth estimation.
  • Experience with dataset and training pipelines: labeling strategies, active learning, data versioning, and ML experimentation platforms.
  • Familiarity with real‑time constraints and systems engineering (latency budgets, throughput, determinism, resource scheduling).
  • Experience designing safety‑and‑reliability‑oriented systems: monitoring, redundancy, fallback modes, and field diagnostics.
  • Exposure to simulation and synthetic data generation workflows for robotics validation.
  • Leadership experience mentoring engineers and driving cross‑functional technical initiatives from concept through deployment.
Seniority Level

Mid‑Senior level

Employment Type

Full‑time

Job Function

Engineering and Information Technology

Industries

Robot Manufacturing
