Dusty construction sites. Fog-covered fields. Crowded warehouses. Heavy rain. Uneven terrain. What does it take for an autonomous machine to perceive and navigate challenging real-world environments like these – reliably, in real time?
Together with Au-Zone Technologies, we took on this challenge not only to build a perception system that performs under operational stress, but to make that system fast to integrate and easy to scale.
This collaboration led to Au-Zone’s Raivin module: a 3D perception system that fuses radar sensing, vision processing and edge AI inference into a single, production-ready unit. Built for operational complexity, the Raivin enables machines to process and act on rich environmental data in real time, delivering perception that performs under pressure.
With pre-trained AI perception models and a unified hardware-software stack, the Raivin simplifies the deployment of intelligent perception, marking a step forward in bringing scalable autonomy to the edge.
The next step in autonomous systems demands more robust, accurate and cost-effective real-time 3D spatial perception. Working together with NXP, we engineered the Raivin to meet that demand head-on.
Brad Scott, CEO, Au-Zone Technologies
A Shared Vision for Real-Time Perception
The push toward autonomy and physical AI is outpacing the readiness of traditional perception solutions. Many still rely on single-sensor stacks offering partial solutions that falter in complex, unpredictable environments. Camera-only systems degrade in low visibility and poor lighting. LiDAR is precise but costly and power-hungry. Radar, while reliable in challenging weather conditions, lacks the resolution necessary for precise object classification.
Together with Au-Zone, we set out to solve the problem, co-developing an Edge AI sensor fusion system designed to deliver high-confidence, low-latency perception.
Vision delivers rich semantic understanding – object detection, classification and segmentation. Meanwhile, radar adds continuous depth and motion tracking, even through obscured environments. By fusing these signals with AI inference, the Raivin builds a synchronized, context-aware 3D model of the world, enabling real-time decision-making with a high degree of confidence.
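The Raivin's actual fusion pipeline is proprietary, but the idea of combining radar depth and velocity with vision-based classification can be illustrated with a minimal late-fusion sketch. Everything here is hypothetical for illustration: the names (`RadarTrack`, `Detection`, `fuse`) and the simple box-containment association are assumptions, not the product's API or algorithm.

```python
from dataclasses import dataclass

@dataclass
class RadarTrack:
    # Radar return already projected into the camera's pixel frame
    x: float
    y: float
    range_m: float        # distance to the object
    velocity_mps: float   # radial velocity (negative = approaching)

@dataclass
class Detection:
    label: str
    box: tuple  # (x_min, y_min, x_max, y_max) in pixels

def fuse(detections, radar_tracks):
    """Associate each projected radar track with the vision detection
    whose bounding box contains it, yielding labeled objects with depth
    and motion attached."""
    fused = []
    for det in detections:
        x0, y0, x1, y1 = det.box
        hits = [t for t in radar_tracks
                if x0 <= t.x <= x1 and y0 <= t.y <= y1]
        if hits:
            nearest = min(hits, key=lambda t: t.range_m)
            fused.append({"label": det.label,
                          "range_m": nearest.range_m,
                          "velocity_mps": nearest.velocity_mps})
    return fused

dets = [Detection("person", (100, 50, 200, 300))]
tracks = [RadarTrack(150, 120, 7.5, -0.4),   # falls inside the person's box
          RadarTrack(400, 90, 21.0, 3.2)]    # background clutter, no match
print(fuse(dets, tracks))  # one fused object: a person at 7.5 m, approaching
```

A production pipeline would of course operate on time-synchronized sensor streams and use a calibrated projection plus a proper association and tracking stage; this sketch only shows why fusing the two modalities yields something neither provides alone, a labeled object with range and velocity.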
But even with the benefits of multi-sensor systems, autonomous applications are still limited by how quickly they can respond to real-world stimuli. Latency matters, and these workloads can’t tolerate delays caused by cloud processing or slow sensor refresh rates.
Only by combining radar and vision with edge AI processing in one unit could we deliver a system fast enough, reliable enough and robust enough to meet the demands of next-generation autonomy.
The Raivin reflects system-level thinking where radar, vision and edge AI are engineered to operate as a single, deterministic pipeline. This approach is what makes real-time perception possible in complex environments.
Altaf Hussain, Director, Industrial Segment Marketing for Transportation and Mobility, NXP
A Unified Hardware-Software Stack
From the start, the Raivin was a co-development effort: a full-stack design process in which every layer, from silicon to software, was developed collaboratively to deliver unified performance at the edge.
The Raivin module is a commercially available AI perception solution that combines low-level radar cube and vision data processing with edge AI in a single, deployable unit.
NXP Provided the High-Performance, Scalable Compute and Sensing Foundation
- i.MX 8M Plus applications processor: its quad-core Arm Cortex-A53 and neural processing unit, delivering up to 2.3 TOPS for AI inference, handle vision-based classification, segmentation and scene understanding
- S32R294 radar microcontroller, compliant with ISO 26262 ASIL D, supports real-time radar signal processing and sensor fusion via its dual e200z7 application cores and lockstep e200z4 safety cores
- RFCMOS radar transceiver, operating in the 76–81 GHz band, unlocks reliable spatial awareness under dynamic and degraded conditions
Au-Zone Delivered the Edge AI Software Stack and Development Tools
The Raivin module was developed with Au-Zone’s EdgeFirst Studio™, which facilitates the multimodal data collection, AI-assisted labeling, training, validation and deployment of sensor fusion models without requiring extensive ML expertise. Within the studio, the EdgeFirst Perception Stack enables engineers to accelerate development through pre-trained models and workflow-optimized software. Developers can label datasets, fine-tune models and validate performance, all within an integrated environment. This end-to-end approach significantly reduces development effort and lowers the barrier to entry for designers implementing complex, multimodal AI perception systems.
The result is a tightly integrated 3D perception system, optimized for low latency and low power, and ready for deployment in edge environments.
Raivin operates in dynamic and uncertain environments.
Watch our CES demo to see how it's showcasing trusted spatial perception at the edge.
Breakthrough Performance in Real-World Conditions
At CES 2025, the Raivin was put to the test in a live demo replicating the types of environmental stressors autonomous machines face every day, from weather and motion to visual obstructions:
- In fog, radar maintained object detection, tracking and spatial awareness
- In glare, the fusion engine maintained accurate object tracking
- In simulated rainfall, radar and AI worked together to retain accurate perception
- In cluttered scenes, radar tracked velocity, while AI and vision segmented and classified people, equipment and obstacles in real time
Simplifying Sensor Fusion at Scale
Historically, sensor fusion has been complex, requiring fragmented tools, custom pipelines and deep domain expertise. The Raivin changes that.
With pre-trained AI models integrated into Au-Zone’s EdgeFirst Studio, engineers can implement radar and vision integration without starting from scratch. The software supports dataset management, training and validation, enabling fast iteration with minimal coding or ML infrastructure. It can also be used as a data collection platform to explore custom solutions for different objects and working environments.
At the same time, the ready-built hardware solution is optimized for edge AI processing, eliminating concerns about custom implementations and hardware tradeoffs. And the Raivin is already commercially available, giving OEMs a validated 3D perception system that can scale.
Whether deployed in mobile robots, precision agriculture or fleet vehicles, the Raivin module enables fast integration of AI-powered perception through a single platform.
Through efficient edge processing, pre-trained AI models and integrated hardware-software design, the Raivin simplifies sensor fusion deployment across applications like robotics, agriculture and fleet vehicles.
Brad Scott, CEO, Au-Zone Technologies
Step Forward with the Raivin
The Raivin represents a step forward in both technology and collaboration. By integrating radar, vision and edge AI into a single platform, solutions like the Raivin module make intelligent perception faster to deploy, easier to scale and ready for the real world.
To begin building or integrating similar systems, explore Au-Zone’s EdgeFirst Studio and learn how it simplifies edge AI deployment.