Senior Staff Engineer, Software - Perception (R3771)
Shield AI
What you'll do:
- Develop advanced perception algorithms — Design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.
- Implement sensor fusion frameworks — Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.
- Develop state estimation capabilities — Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.
- Analyze and utilize sensor ICDs — Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.
- Optimize perception performance — Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.
- Support autonomy integration — Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.
- Validate in simulated and operational settings — Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.
- Collaborate with hardware and sensor teams — Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.
- Drive innovation in airborne sensing — Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.
- Travel Requirement — Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).
Required Qualifications:
- BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, or a similar degree, or equivalent practical experience
- Typically requires a minimum of 10 years of related experience with a Bachelor’s degree; or 9 years with a Master’s degree; or 7 years with a PhD; or equivalent work experience.
- Background in implementing algorithms such as Kalman filters, multi-target tracking, or deep learning-based detection models.
- Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches.
- Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications.
- Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs.
- Proficiency with version control, debugging, and test-driven development in cross-functional teams.
- Ability to obtain a SECRET clearance.
Preferred Qualifications:
- Hands-on experience with integration or algorithm development for airborne sensing systems.
- Experience with ML frameworks such as PyTorch or TensorFlow, particularly for vision-based object detection or classification tasks.
- Experience deploying perception software on SWaP-constrained (size, weight, and power) platforms.
- Familiarity with validating perception systems during flight test events or in operational environments.
- Understanding of sensing challenges in denied or degraded conditions.
- Exposure to perception applications across air, maritime, and ground platforms.