Staff Perception Software Engineer, Sensor Fusion (R2657)
Shield AI
What You'll Do:
- Write production quality software in C++
- Produce an Assured Position, Navigation, and Timing (A-PNT) system to enable reliable autonomy in GNSS-degraded or GNSS-denied environments
- Extend and specialize Shield AI’s state-of-the-art state estimation framework for new sensors, platforms, and missions
- Write test code to validate your software with simulated and real-world data
- Collaborate with hardware and test teams to validate algorithms/code on aerial platforms
- Write analyzers to ingest data and produce statistics to validate code quality
- Enhance sensor models within a high-fidelity simulation environment
- Work in a fast-paced, collaborative, continuous development environment, enhancing analysis and benchmarking capabilities
Required Qualifications:
- BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, or a similar degree, or equivalent practical experience
- Ability to develop, benchmark, debug, and deploy software-based algorithms
- Demonstrated problem-solving skills using a scientific approach
- Typically requires a minimum of 7 years of related experience with a Bachelor’s degree; or 5 years with a Master’s degree; or 4 years with a PhD; or equivalent work experience
- Demonstrated experience integrating and working with sensor payloads in the DoD space
- Proficient with sensor fusion for noisy, high-bandwidth exteroceptive sensors on compute-constrained systems
- A solid foundation in theory related to state estimation, object detection, data association, probabilistic robotics, and signal processing
- Experience working on projects with 10+ contributors
- Brings fast, efficient, and effective approaches to problem solving
- Exceptional collaborator and communicator
- Comfortable working in Unix environments
- Hard-working, trustworthy teammate
- Holds themselves and others to high standards
- Kind to others
- Ability to obtain a SECRET clearance
Preferred Qualifications:
- MS or greater in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, or a similar degree, or equivalent practical experience
- Understanding of robotics technologies related to autonomous behavior development, e.g., task allocation or planning
- Understanding of or experience with unmanned system technologies and accompanying algorithms (specifically in the air domain)
- Familiarity with high-fidelity simulation and sensor modeling
- Working knowledge of Kalman filters, factor graphs, and other modern estimator fundamentals
- Strong working knowledge of computer vision, with hands-on experience using OpenCV or similar CV libraries
- Experience developing sensor effects (control) algorithms
- Hands-on experience developing or implementing state-of-the-art object detection/recognition pipelines
- Active SECRET clearance