
Sr. Computer Vision Engineer - Visual Target Tracking

Job in Orem, Utah County, Utah, 84058, USA
Listing for: FLIR
Full Time position
Listed on 2026-03-10
Job specializations:
  • Engineering
    Robotics, AI Engineer
Salary/Wage Range or Industry Benchmark: 60,000 - 80,000 USD per year
Job Description & How to Apply Below

Be visionary

Teledyne Technologies Incorporated provides enabling technologies for industrial growth markets that require advanced technology and high reliability. These markets include aerospace and defense, factory automation, air and water quality environmental monitoring, electronics design and development, oceanographic research, deepwater oil and gas exploration and production, medical imaging and pharmaceutical research.

We are looking for individuals who thrive on making an impact and want the excitement of being on a team that wins.

Job Description
Job Summary

We are a fast-growing, mission-driven company developing advanced military Unmanned Aerial Vehicle (UAV) systems for the U.S. Department of Defense. Our platforms operate in challenging real-world environments and support capabilities such as autonomous navigation, precision guidance, visual target tracking, GPS-denied operation, path planning, and obstacle avoidance. We are seeking a Sr. Computer Vision Engineer to develop and improve our multi-camera visual target tracking algorithms for UAV systems.

The primary focus of this role is building robust computer vision algorithms that track moving targets across multiple cameras and challenging environments. This role involves integrating modern deep learning object detectors with classical computer vision, evaluating performance using flight data, and deploying production‑quality algorithms on embedded systems. You will work across the full development pipeline—from algorithm design and simulation to flight testing on operational platforms.

Job Duties & Responsibilities:

  • Develop and improve visual target tracking algorithms for UAV systems using camera and sensor data.
  • Integrate modern deep learning object detectors, classifiers, and re‑identification models (e.g., YOLO‑style detectors) to improve tracking robustness.
  • Maintain target identity as objects move across multiple camera fields of view without relying on geolocation.
  • Evaluate and refine algorithms using flight data, simulation, and controlled testing to ensure robustness to edge cases.
  • Implement high‑performance algorithms in C++ for deployment on embedded Linux platforms.
  • Work closely with flight test engineers and system developers to validate performance through simulation, analysis, and live flight tests.
  • Support development of robotics algorithms for state estimation, navigation, and guidance.
  • Fuse data from sensors including cameras, IMUs, GPS, magnetometers, laser range finders, and barometers.
  • Contribute to integration with PX4‑based flight control systems.
  • Support integration and testing of new sensors and payloads.
  • Contribute to embedded software components such as sensor interfaces, timing, and synchronization.
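The tracking and state-estimation work described above can be sketched, in highly simplified form, as a constant-velocity Kalman filter over a target's pixel position. This is an illustrative sketch only, not the company's implementation: the class name, state layout, and noise values are all assumptions, and a fielded system would tune them against real flight data.

```python
# Illustrative sketch (assumed names and noise values): a constant-velocity
# Kalman filter tracking a target's (x, y) pixel position across frames.
import numpy as np

class PixelTracker:
    def __init__(self, x, y, dt=1.0):
        # State vector: [x, y, vx, vy]; velocity starts at zero.
        self.s = np.array([x, y, 0.0, 0.0])
        self.P = np.eye(4) * 100.0            # initial state uncertainty
        self.F = np.eye(4)                    # constant-velocity motion model
        self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.eye(2, 4)                 # we observe (x, y) only
        self.Q = np.eye(4) * 0.01             # process noise (platform-dependent)
        self.R = np.eye(2) * 4.0              # measurement noise (detector jitter)

    def predict(self):
        """Propagate the state one frame; also covers detector dropouts."""
        self.s = self.F @ self.s
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.s[:2]

    def update(self, zx, zy):
        """Fuse one detector measurement (zx, zy) into the state."""
        z = np.array([zx, zy])
        y = z - self.H @ self.s                    # innovation
        S = self.H @ self.P @ self.H.T + self.R    # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)   # Kalman gain
        self.s = self.s + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.s[:2]
```

In use, a detector's box centroid would feed `update()` each frame, while `predict()` alone bridges frames where the detector misses the target.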
Job Qualifications
  • Master’s degree or higher in Computer Science, Electrical Engineering, Robotics, Mechanical Engineering, or a related field strongly preferred.
  • Strong background in computer vision and camera models.
  • 4+ years of experience developing computer vision or perception algorithms.
  • 4+ years of experience working on robotics, autonomous systems, or aerospace platforms.
  • Experience implementing robotics algorithms such as visual tracking, state estimation (e.g., Kalman filtering), perception pipelines, or vision‑based deep learning integration.
  • Strong software development skills in C++, Python, and C.
  • Experience developing software for Linux‑based embedded systems.
  • Experience working with sensors such as IMUs, cameras, magnetometers, GPS, laser range finders, and barometers.
  • Understanding of feedback control systems and practical tuning methods.
  • Experience integrating deep learning object detectors (e.g., YOLO‑style detectors) into perception pipelines preferred.
  • Experience with visual SLAM or visual odometry systems preferred.
  • Experience working with multi‑camera systems or multi‑object tracking preferred.
  • Experience with UAV flight testing or aerial robotics platforms preferred.
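The multi-object tracking experience listed above typically involves associating new detector boxes with existing tracks. A minimal sketch of one common building block, greedy IoU-based association, is shown below; all names are assumptions, and production trackers often use Hungarian assignment and appearance (re-identification) cues instead of greedy IoU alone.

```python
# Illustrative sketch (assumed names): greedy IoU matching of detector boxes
# to existing tracks. Boxes are (x1, y1, x2, y2) tuples in pixels.

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter > 0 else 0.0

def associate(tracks, detections, min_iou=0.3):
    """Greedily match each track box to its best unclaimed detection.

    Returns (matches, unmatched_track_ids, unmatched_detection_indices);
    unmatched tracks may coast on prediction, unmatched detections may
    spawn new tracks.
    """
    matches, used = {}, set()
    for tid, tbox in tracks.items():
        best_j, best_score = None, min_iou
        for j, dbox in enumerate(detections):
            if j in used:
                continue
            score = iou(tbox, dbox)
            if score > best_score:
                best_j, best_score = j, score
        if best_j is not None:
            matches[tid] = best_j
            used.add(best_j)
    unmatched_tracks = [t for t in tracks if t not in matches]
    unmatched_dets = [j for j in range(len(detections)) if j not in used]
    return matches, unmatched_tracks, unmatched_dets
```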

Applicants must be a U.S. citizen, U.S. national, lawful permanent resident, asylee, refugee, or otherwise eligible to obtain the required U.S. export control authorization from the Departments of State or Commerce.

About Teledyne FLIR Defense

Jo…
