AI/Autonomous Systems Engineer
Lusail, Al Daayen, Qatar
Listing for: Snoonu
Full Time position, listed on 2026-03-12
Job specializations:
- Software Development
- Robotics, AI Engineer
Job Description & How to Apply Below
What You’ll Get Your Hands On
- Develop and implement autonomous perception, localisation, and navigation algorithms for ground robots and aerial drones within Snoonu’s AI‑driven autonomous logistics platform.
- Design and optimise multi‑sensor perception pipelines using LiDAR, radar, RGB/depth cameras, IMU, and GPS to ensure robust operation in Qatar’s challenging conditions (heat, dust, glare, reflective surfaces).
- Develop and improve object detection, free‑space segmentation, tracking, and obstacle prediction modules to support safe real‑world navigation.
- Implement localisation and mapping capabilities, including Visual‑Inertial SLAM, sensor‑fusion‑based localisation, and GPS‑assisted fallback mechanisms.
- Build simulation‑based testing workflows to validate autonomy performance under realistic Qatari environments, traffic patterns, and operational constraints.
- Collaborate with robotics, embedded, and cloud teams to ensure seamless integration of perception outputs into motion planning, control, fleet management, and RaaS orchestration layers.
- Evaluate model performance using defined validation metrics (accuracy, latency, robustness), and contribute to iterative improvements to meet Technology Readiness Level (TRL) advancement targets.
- Support dataset collection, labelling strategies, and pipeline improvements to enable continuous model training and validation from real‑world deployments.
- Participate in on‑site experiments, field testing, and pilot demonstrations, providing debugging and performance tuning of autonomy modules.
- Document model designs, experimental results, and validation procedures to support R&D reporting, regulatory compliance, and knowledge‑transfer activities.
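The sensor-fusion and localisation responsibilities above can be sketched in miniature. The snippet below is a complementary filter that fuses gyro-rate integration (smooth but drifting) with an accelerometer tilt estimate (noisy but drift-free) — a toy stand-in for the full Visual-Inertial SLAM stack described here, not actual Snoonu code; all function names and constants are illustrative assumptions.

```python
import math

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse gyro integration (smooth, but drifts over time) with an
    accelerometer angle (noisy, but drift-free).
    alpha weights the gyro path; (1 - alpha) weights the correction."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

def accel_pitch(ax, az):
    """Pitch estimate from the gravity direction (valid when the
    vehicle is not accelerating)."""
    return math.atan2(ax, az)

# Toy scenario: robot held at a constant 0.1 rad pitch, gyro reads zero.
angle = 0.0
true_pitch = 0.1
for _ in range(500):
    angle = complementary_filter(angle, gyro_rate=0.0,
                                 accel_angle=true_pitch, dt=0.01)
```

With `alpha = 0.98` the gyro dominates the short-term response while the accelerometer slowly corrects drift, so the estimate converges to the true pitch over a few hundred steps — the same trade-off, at much larger scale, that a production sensor-fusion pipeline manages.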
Requirements
- Bachelor’s or Master’s degree in Artificial Intelligence, Robotics, Computer Vision, Computer Engineering, or a related field. (A PhD is a plus for research‑heavy autonomy work.)
- 2–4 years of backend development experience, with strong hands‑on Python/C++ expertise.
- Strong analytical and problem‑solving skills with the ability to work on complex real‑world robotics challenges.
- Research‑oriented mindset and ability to translate experimentation into production‑ready autonomy improvements.
- Ability to work effectively in cross‑functional teams (robotics, embedded, platform/software, operations).
- Clear communication skills and ability to document technical work, trade‑offs, and validation outcomes.
- High ownership and accountability for results, timelines, and engineering quality.
- Adaptability to fast‑paced R&D environments involving prototyping, testing, and iterative development.
- Strong business context understanding, able to translate operational needs into technical solutions.
- Open to feedback and proactive in applying improvements suggested by senior engineers or tech leads.
- Strong foundation in Python, including OOP principles, design patterns, and writing clean, maintainable code.
- Experience building backend services using frameworks such as FastAPI, Flask, or Django.
- Ability to design, develop, and maintain RESTful APIs with proper error handling and logging.
- Experience using AWS services such as Lambda, SQS/SNS, API Gateway, Step Functions, DynamoDB, RDS, S3, and CloudWatch.
- Experience designing event‑driven and serverless architectures.
- Familiarity with IAM, environment configuration, and cloud security best practices.
- Strong proficiency in Python and experience using deep-learning frameworks.
- Experience in one or more autonomy domains:
- Computer vision and perception pipelines
- Sensor fusion
- Localisation and mapping (SLAM / Visual‑Inertial Odometry)
- Object detection / tracking / segmentation
- Motion planning support systems
- Familiarity with robotics development environments such as ROS / ROS2 and real‑time robotics data pipelines.
- Experience working with LiDAR point clouds, camera streams, and IMU‑based motion data.
- Ability to optimise ML/AI systems for real‑time performance and edge deployment constraints.
- Experience with simulation tools or autonomy test environments is a strong plus (e.g., Gazebo, CARLA, AirSim, Isaac Sim).
- Knowledge of safe autonomy principles, anomaly detection, or human‑in‑the‑loop safety mechanisms is a plus.
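For the object detection and tracking skills listed above, the standard accuracy building block is intersection-over-union (IoU) between a predicted and a ground-truth bounding box. A minimal sketch follows; the `(x1, y1, x2, y2)` corner format is an assumption for illustration, not a spec from this role.

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes,
    each given as (x1, y1, x2, y2) with x1 < x2 and y1 < y2."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Overlap rectangle (empty if the boxes are disjoint).
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

Detection validation metrics such as mean average precision are built on top of exactly this kind of IoU threshold check.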
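On the backend side, the event-driven requirements above rest on one recurring pattern: queues such as SQS deliver messages at least once, so consumers must be idempotent. A minimal, hypothetical sketch — an in-memory set stands in for the DynamoDB table a real deployment would use:

```python
import json

# In production this state would live in a durable store
# (e.g. DynamoDB, per the stack above), not process memory.
processed_ids = set()

def handle_message(raw_message):
    """Process one queue message at most once.
    At-least-once delivery means duplicates can arrive, so we
    deduplicate by message id before doing any real work."""
    msg = json.loads(raw_message)
    if msg["id"] in processed_ids:
        return "skipped"          # duplicate delivery: ignore safely
    processed_ids.add(msg["id"])
    # ... real work would go here (e.g. update an order's state) ...
    return "processed"
```

The same idempotency discipline applies whether the consumer is a Lambda function behind SQS or a Step Functions task: re-delivery must never double-apply a side effect.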
Apply now to join a team where your contributions spark change and your voice is heard. Let’s make some magic together.