Rivet Industries Careers

Augmented Reality Software Engineer, Sensor Fusion

About Rivet  

Rivet is an American company building integrated task systems — fusing hardened hardware with software, sensors, AI, and networking — for industrial workforces and defense personnel. We create capabilities that multiply the effectiveness of every individual and withstand the world’s toughest environments. 


We serve the people who build, operate, maintain, and defend our way of life. From technicians and engineers to first responders and service members, they embody the hard work, ingenuity, and meritocratic values that drive Western prosperity. Yet too often they are forced to rely on outdated tools that fail under modern pressures. Rivet exists to change that. 


At Rivet, you’ll join a mission-driven team that fuses disciplines to deliver decisive outcomes where they matter most. Whether shaping our technology, strengthening our partnerships, or building our culture, every role here contributes to equipping the front lines with the modern systems they deserve. 


Work Authorization Requirement: Due to the nature of our business and compliance with federal regulations, all candidates must be "U.S. Persons". Upon hire, you will be required to provide documentation verifying your status as a U.S. citizen, a lawful permanent resident, or a protected individual under 8 U.S.C. § 1324b(a)(3).


Role: Augmented Reality Software Engineer, Sensor Fusion

Location: San Jose, CA preferred; open to Bellevue, WA

Compensation*: $180,000–$240,000 + benefits


Role Description 

Rivet is looking for an Augmented Reality Software Engineer focused on sensor fusion for real-time AR systems on Android and embedded Linux devices. This role sits at the intersection of perception, tracking, and real-time XR experiences — integrating IMUs, cameras, depth sensors, and other device inputs into robust, low-latency spatial computing systems. You will develop the software pipelines that enable accurate 3DoF/6DoF tracking, spatial awareness, and stable AR experiences in demanding operational environments. The role spans real-time systems, state estimation, runtime integration, and performance optimization on constrained hardware platforms. You will work closely with hardware, firmware, computer vision, and XR application teams to deliver performant and reliable sensor-driven AR capabilities. 


Responsibilities 

  • Develop sensor fusion systems for AR/XR devices using IMU, camera, depth, and other sensor inputs.  
  • Implement tracking, calibration, synchronization, filtering, and state estimation pipelines for spatial computing applications.  
  • Build performant C++ and C# systems for real-time XR applications and device integration.  
  • Integrate with OpenXR and AR runtimes for cross-platform compatibility.  
  • Optimize latency, timing, motion stability, and tracking performance on Android and embedded Linux platforms.  
  • Develop tooling, visualization, and debugging workflows for sensor validation and performance analysis.  
  • Build companion applications and device services in Java/Kotlin for Android-based systems.  
  • Use Python for simulation, automation, testing, and build/CI workflows.  
  • Collaborate across hardware, firmware, perception, graphics, and backend teams to deliver end-to-end AR capabilities. 


Role Requirements 

  • Deep experience building real-time sensor fusion, robotics, AR/VR, perception, or spatial computing systems.  
  • MS + at least 3 years (or PhD + at least 1 year) of relevant professional experience.  
  • Strong fundamentals in sensor fusion, state estimation, filtering, coordinate systems, motion tracking, and real-time systems.  
  • Proficient in C++ and/or C# for performance-critical applications.  
  • Experience developing on Android or embedded Linux systems.  
  • Familiarity with XR and perception frameworks such as OpenXR, ARCore, SLAM/VIO pipelines, or related technologies.  
  • Strong understanding of latency optimization, synchronization, profiling, and CPU/GPU/memory tradeoffs.  
  • Experience with Java/Kotlin for Android services and peripheral integration.  
  • Python experience for scripting, simulation, automation, or test infrastructure.  
  • Ability to collaborate across hardware, firmware, computer vision, and cloud/backend systems.  
  • Track record of shipping complex real-time systems or XR capabilities.  


Preferred Qualifications 

  • PhD in a relevant discipline with 5+ years of industry experience, or MS with 10+ years of industry experience.  
  • Experience with visual-inertial odometry (VIO), SLAM, Kalman filtering, or probabilistic estimation techniques.  
  • Experience with Unity, Unreal Engine, StereoKit, or comparable 3D/XR engines.  
  • Experience working on wearable, robotics, aerospace, defense, or embedded sensing platforms. 


*Total compensation may vary within this range and is determined by years and level of relevant experience, job-related skills, education, and other factors. In addition to base salary, this role may be eligible for equity grants and other forms of compensation. Eligible employees also receive a competitive benefits package, including unlimited PTO.


EOE

Engineering, Product, & Design

San Jose, CA

Bellevue, WA
