Next-Generation Perception Fusion for Safer, Smarter Mobility
Fusionride develops high-performance millimeter-wave radar and radar-camera fusion systems. By seamlessly integrating physics-based modeling, advanced system engineering, and AI algorithms, we are pushing the boundaries of mmWave and vision perception for automotive and industrial applications.
We deliver all-weather, highly robust perception solutions for autonomous driving and intelligent mobility—powering vehicles, two-wheelers, commercial low-altitude platforms, and robotics.
We don't just build technology; we build production-ready, verifiable, and scalable solutions. By optimizing sensing coverage, system costs, compute budgets, and data strategies, Fusionride provides a rock-solid perception foundation for the future of intelligent mobility.
Mission & Vision
Mission
To give machines precise perception, enabling intelligence to navigate extreme conditions with confidence.
Vision
To become a globally trusted full-stack perception technology partner. Focused on mobility, robotics and intelligent infrastructure, Fusionride helps build a safer foundation for the connected world through reliable, verifiable and production-ready perception capabilities.
Values
Excellence: Pursue the highest standard with resilient focus, refining extreme-scenario capability with discipline.
Integrity: Respect life, uphold safety and pursue scientific rigor, truth and pragmatism in the real world.
Courage: Break technical and cognitive boundaries, opening new paths for reliable perception in unknown scenarios.
Strategic Integration
Fusionride × Zadar Labs
In January 2026, Fusionride completed a strategic integration with Zadar Labs, combining product roadmaps, global markets, technical capabilities and customer ecosystems into a unified perception platform.
Product Line Integration
From 3D radar to high-definition 4D imaging radar, the combined roadmap expands sensing products across performance levels and application domains.
Market Integration
The integrated market footprint covers China, Japan, the United States, Europe, Southeast Asia and other countries and regions, supporting global customer program delivery.
Capability Integration
AI, software, hardware and radar algorithms are brought together to create a complete sensing stack from signal to perception output.
By combining global R&D capabilities, advanced engineering expertise and scalable manufacturing strength, the integrated team operates with a clear international division of labor to serve automotive OEMs, Tier-1 suppliers, AMR platforms and intelligent infrastructure customers worldwide.
Global Footprint
Fusionride + Zadar Global R&D Layout
A coordinated R&D and industrial footprint connects China, Europe and North America, supporting radar hardware, DeepFusion software, manufacturing and customer delivery.
2021 → 2026
Fusionride Development History
A six-year product journey from Fusionride's foundation to Columbus 4D radar, centralized-compute imaging radar, Super Sensor, First International Customer Programs and software-defined imaging radar.
Full-Stack Capabilities from Software-Defined Radar to DeepFusion Intelligence
Fusionride is building a production-oriented full-stack perception ecosystem. We deeply integrate Zadar's leading software-defined 4D imaging radar hardware and SDIR core capabilities with Fusionride's in-house DeepFusion early-fusion algorithms, embedded systems and strong manufacturing resources. The platform covers the full chain from configurable sensing to Occupancy Grid understanding. Through feature-level radar-camera alignment and BEV representation, it enables reliable, continuous active-safety outputs even in highly complex and extreme environments.
Software-defined imaging radar
High-resolution 4D imaging radar, software-defined waveform control, calibration, interference mitigation and multi-radar fusion provide a measurable physical sensing foundation across range, velocity, angle and elevation.
Software-defined waveforms allow the same radar platform to be configured for range, resolution and scene requirements across vehicles, robots, commercial low-altitude platforms, and infrastructure.
Radar-camera DeepFusion
DeepFusion aligns radar and camera features earlier in the perception chain, before object-level decisions are finalized. The fused 3D / BEV representation supports obstacle detection, freespace estimation and dense scene understanding.
Radar velocity and range evidence is aligned with visual semantics in BEV space, helping perception remain stable when light, weather or target shape changes.
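As a rough illustration of feature-level alignment in BEV space, the sketch below rasterizes radar detections into a grid and concatenates the result with camera BEV channels. This is a hypothetical, simplified example; the function names, channel layout and grid parameters are invented here and do not describe Fusionride's actual DeepFusion implementation, which uses learned encoders rather than fixed rasterization.

```python
import numpy as np

def radar_to_bev(points, grid=(128, 128), extent=50.0):
    """Rasterize radar detections (x, y, doppler, rcs) into a BEV grid.

    Illustrative only: channel 0 keeps the strongest absolute Doppler per
    cell, channel 1 accumulates RCS. `extent` is the half-width in meters.
    """
    bev = np.zeros((2, *grid), dtype=np.float32)
    cell = 2 * extent / grid[0]
    for x, y, doppler, rcs in points:
        i = int((x + extent) / cell)
        j = int((y + extent) / cell)
        if 0 <= i < grid[0] and 0 <= j < grid[1]:
            bev[0, i, j] = max(bev[0, i, j], abs(doppler))
            bev[1, i, j] += rcs
    return bev

def fuse_bev(radar_bev, camera_bev):
    """Feature-level fusion: stack radar and camera BEV channels."""
    return np.concatenate([radar_bev, camera_bev], axis=0)

# Two synthetic detections; camera features are a zero placeholder.
radar = radar_to_bev([(10.0, -3.0, 12.5, 8.0), (25.0, 1.0, -4.0, 15.0)])
camera = np.zeros((8, 128, 128), dtype=np.float32)
fused = fuse_bev(radar, camera)
```

Downstream detection heads then see radar range/velocity evidence and visual semantics in the same spatial frame, which is the essence of fusing before object-level decisions.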
Robust scene understanding
The perception stack is designed for glare, low light, rain, fog, unstructured roads and unknown obstacles. Occupancy grids, temporal memory and radar physical evidence help maintain stable understanding when individual sensors degrade.
Occupancy Grid reasoning turns sparse and dense sensing evidence into a continuous understanding of free space, obstacles and safety-critical events.
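The evidence-accumulation idea behind an occupancy grid can be sketched with the standard log-odds update from the robotics literature (a generic textbook formulation, not Fusionride's implementation; the increment and clamp values below are illustrative):

```python
import numpy as np

# Each observation nudges a cell toward occupied or free in log-odds space;
# clamping bounds how certain the grid can become, so it can recover when
# the scene changes.
L_OCC, L_FREE, L_MIN, L_MAX = 0.85, -0.4, -4.0, 4.0

def update_cell(log_odds, observed_occupied):
    delta = L_OCC if observed_occupied else L_FREE
    return float(np.clip(log_odds + delta, L_MIN, L_MAX))

def occupancy_prob(log_odds):
    return 1.0 / (1.0 + np.exp(-log_odds))

# A cell repeatedly observed as occupied converges toward probability 1.
l = 0.0
for _ in range(5):
    l = update_cell(l, True)
p = occupancy_prob(l)
```

Because the update is per-cell and incremental, sparse radar returns and dense visual evidence can feed the same grid, and temporal accumulation keeps the estimate stable when a single sensor momentarily degrades.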
Embedded and production ready
Lightweight neural networks, automotive interfaces, time synchronization, diagnostics, validation workflows and EMS manufacturing resources support deployment on mainstream edge AI platforms and volume production programs.
The stack connects algorithm design with embedded deployment, validation, manufacturing traceability and scalable delivery for production programs.
Signal Layer
From raw radar echoes to high-confidence physical evidence
The signal layer converts raw radar echoes into structured physical evidence, including range, velocity, angle, elevation, micro-motion and temporal consistency. Software-defined configuration, calibration and interference mitigation allow radar behavior to be tuned for automotive, robot, commercial low-altitude platform, and infrastructure domains.
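To make the signal-layer idea concrete, here is a minimal FMCW range-Doppler sketch in NumPy: a 2-D FFT over fast time (range) and slow time (Doppler) turns a chirp data cube into a range-velocity map. All waveform parameters here are assumed for illustration and are unrelated to any Fusionride product.

```python
import numpy as np

C = 3e8
N_SAMPLES, N_CHIRPS = 256, 64
BANDWIDTH = 1e9        # 1 GHz sweep -> 0.15 m range resolution (c / 2B)
CHIRP_TIME = 50e-6

fs = N_SAMPLES / CHIRP_TIME
range_res = C / (2 * BANDWIDTH)

# Simulate one static target at 10 m: beat frequency f_b = 2*R*B / (c*T).
r_target = 10.0
f_beat = 2 * r_target * BANDWIDTH / (C * CHIRP_TIME)
t = np.arange(N_SAMPLES) / fs
cube = np.tile(np.cos(2 * np.pi * f_beat * t), (N_CHIRPS, 1))

# Range FFT over fast time, then Doppler FFT over slow time.
rd_map = np.fft.fft(np.fft.fft(cube, axis=1), axis=0)

# Static target: peak sits in the zero-Doppler row; keep the positive
# half of the range spectrum only.
range_bin = int(np.argmax(np.abs(rd_map[0, :N_SAMPLES // 2])))
estimated_range = range_bin * range_res
```

Software-defined configuration amounts to changing parameters like BANDWIDTH and CHIRP_TIME at runtime, trading range resolution, maximum range and velocity span against each other per deployment domain.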
Fusion Layer
DeepFusion before object-level decisions
The fusion layer brings radar and camera information together before final object reasoning. By aligning complementary features in 3D / BEV space, DeepFusion preserves radar's all-weather ranging and velocity advantages while using visual semantics for structure, class and context.
Production Layer
A deployable stack, not a laboratory demo
The deployment layer connects algorithms with embedded software, interface adaptation, validation, data iteration and manufacturability. Fusionride delivers perception outputs that downstream systems can consume directly, including point cloud, tracking, freespace, occupancy, obstacle class, road structure and safety events.
Production Capability
Scalable production capability for radar delivery
Fusionride connects product engineering, validation testing and manufacturing resources to support radar programs from early engineering builds to scalable production. The production system is designed around consistency, traceability and delivery efficiency, providing stable manufacturing support for automotive, robotics and mobility customers.
2,400 sqm highly automated production line
300,000 units/year, scalable to 800,000 units/year
Product Center
Product Portfolio by Perception Scenario
Designed for automotive and ground-based mobility applications.
76-81GHz | FPGA imaging radar
zPRIME FPGA
High-performance imaging radar platform with point cloud output and optional tracking, classification, odometry and freespace outputs.
Specification (three variants by field of view: 120° x 24° | 120° x 50° | 120° x 90°)
Field of View
120° x 24° | 120° x 50° | 120° x 90°
Static Angular Resolution
0.35° x 0.4° | 0.35° x 0.9° | 0.35° x 1.6°
Angular Accuracy
±0.05°
Truck Detection
> 1,000 m | > 400 m | > 250 m
Car Detection
> 800 m | > 400 m | > 250 m
Human Detection
> 250 m | > 150 m | > 90 m
Hardware Dimensions
140 x 103 x 36 mm
Weight
500 g (17.6 oz)
Sealing Protection
IP68 (2-meter)
Operating Temperature
-40° to 85° C
Storage Temperature
-40° to 95° C
Compliance
IEC 60068-2-27, IEC 60068-2-64, ISO 16750
Frame Time
50 ms / 20 Hz
76-81GHz | Compact FPGA radar
zPULSE FPGA
Compact software-defined radar platform for high-resolution point cloud and optional perception outputs in production programs.
zPULSE Specifications
Field of View
120° x 90°
Static Angular Resolution
1.5°
Angular Accuracy
±0.25°
Truck Detection
> 400 m
Car Detection
> 400 m
Human Detection
> 200 m
Hardware Dimensions
80 x 70 x 40 mm
Weight
200 g
Weather Proofing
IP68 (2-meter)
4T4R | 250 m front radar
Columbus Front Radar – TI 2944
Front automotive radar for detections, objects and freespace with CAN-FD or optional Ethernet output.
Size
72 x 62.5 x 18.5 mm
Weight
<100 g
Operating Temperature
-40° to +85° C
Typical power
4.5 W nominal
Voltage
9-16 VDC
Frequency
76-77 GHz
Interface
CAN-FD / Ethernet optional
IP Level
IP6K7; IP6K9 with special connector
Channels
4T4R
Detection range
250 m @ 10 dBsm
Range accuracy
6 cm
Range resolution
0.25 m
Speed range
±330 km/h
Speed accuracy
0.023 m/s
Speed resolution
0.14 m/s
Azimuth FoV
120°
Azimuth accuracy
±0.15°
Azimuth separation
2.5°
Elevation FoV
±15°
Elevation accuracy
±1.0°
FuSa / Cybersecurity
Designed for ASIL B (ISO 26262) / HSM / AUTOSAR
4T4R | 180 m corner radar
Columbus Corner Radar – TI 2944
Corner radar for detections, objects, freespace and ADAS functions including LCDAS, RCTA/B, FCTA/B, DOW and RCW.
Size
72 x 63 x 16.4 mm
Weight
<100 g
Operating Temperature
-40° to +85° C
Typical power
4.5 W nominal
Voltage
9-16 VDC
Frequency
76-77 GHz
Interface
CAN-FD / Ethernet optional
IP Level
IP6K7; IP6K9 with special connector
Channels
4T4R
Detection range
180 m @ 10 dBsm
Range accuracy
6 cm
Range resolution
0.24 m
Speed range
±330 km/h
Speed accuracy
0.023 m/s
Speed resolution
0.14 m/s
Azimuth FoV
150°
Azimuth accuracy
±0.3°
Azimuth separation
4.8°
Elevation FoV
±15°
Elevation accuracy
±1.0°
FuSa / Cybersecurity
Designed for ASIL B (ISO 26262) / HSM / AUTOSAR
4T4R | Calterah 344
Columbus Corner Radar – Calterah 344
CAN-FD corner radar for compact ADAS functions, supporting detections, objects, freespace and surrounding alerts.
Size
72 x 63 x 16.4 mm
Weight
<100 g
Operating Temperature
-40° to +85° C
Typical power
4.5 W nominal
Voltage
9-16 VDC
Frequency
76-77 GHz
Interface
CAN-FD
IP Level
IP6K7; IP6K9 with special connector
Channels
4T4R
Detection range
160 m @ 10 dBsm
Range accuracy
6 cm
Range resolution
0.24 m
Speed range
±300 km/h
Speed accuracy
0.023 m/s
Speed resolution
0.14 m/s
Azimuth FoV
150°
Azimuth accuracy
±0.3°
Azimuth separation
4.8°
Elevation FoV
±15°
Elevation accuracy
±1.0°
FuSa / Cybersecurity
Designed for ASIL B (ISO 26262) / HSM / AUTOSAR
4T4R | Calterah 244 | 76-79 GHz
Door Radar – Calterah 244
Ultra-compact CAN-FD door radar for door anti-pinch and opening-angle sensing, designed for ASIL B (ISO 26262) functional safety with AUTOSAR support.
Size
77 × 22 × 10.3 mm
Weight
<35 g
Operating Temperature
-40° to +85° C
Typical power
<2 W
Voltage
9-16 VDC
Frequency
76-79 GHz
Interface
CAN-FD
IP Level
IP6K7
Channels
4T4R
Detection range
120 cm
Range accuracy
2 cm
Range resolution
0.06 m
Speed range
±13 m/s
Speed accuracy
0.02 m/s
Speed resolution
0.14 m/s
Azimuth FoV
140°
Elevation FoV
120°
FuSa / Cybersecurity
Designed for ASIL B (ISO 26262) / HSM / AUTOSAR
77GHz 6T8R | 350 m
Picasso 4D Imaging Radar 6×8 NXP
6T8R front imaging radar with long-range, short-range and configurable modes, providing a detection list plus optional tracking, object and freespace outputs.
Parameter (Long Range Mode | Short Range Mode | Configurable Mode)
Channels
6T8R | 6T8R | Configurable
Detection range
350 m @ 10 dBsm | 125 m @ 10 dBsm
Range accuracy
0.08 m | 0.05 m
Range resolution
0.4 m | 0.25 m
Doppler range
±300 km/h | ±100 km/h
Doppler accuracy
0.023 m/s | 0.012 m/s
Doppler resolution
0.1 m/s | 0.06 m/s
Horizontal FoV
±60° | ±60°
Horizontal angle accuracy
±0.15° | ±0.15°
Horizontal angle resolution
2.0° | 2.0°
Vertical FoV
±15° | ±15°
Vertical angle accuracy
±0.5° | ±0.5°
Vertical angle resolution
4.0° | 4.0°
Size
101 x 85.5 x 17.5 mm excluding connector
Weight
<200 g
Operating Temperature
-40° to +85° C
Typical power
<10 W
Voltage
9-16 VDC
Frequency
76-77 GHz
Interface
CAN-FD / Ethernet
IP Level
IP6K7; IP6K9 with special connector
FuSa / Cybersecurity
Designed for ASIL B (ISO 26262) / HSM / AUTOSAR
24GHz | 0.65-70 m
24GHz Two-Wheeler BSD Radar
Low-power two-wheeler blind spot radar with detections, objects and LCA, RCW, BSD functions.
Size
61 x 45.5 x 13 mm
Weight
<100 g
Operating Temperature
-40° to +85° C
Voltage
9-16 V; 12 V typical
Frequency
24 GHz
Detection range
0.65-70 m
Range resolution
0.65 m
Speed range
-150 to 150 km/h
Speed resolution
0.76 km/h
Azimuth FoV
±60°
Elevation FoV
±30°
Cycle time
75 ms
Interface
CAN; 500 kbit/s
Wireless interface
BLE 5.0
Typical power
<0.8 W
77GHz | 70 m
77GHz Two-Wheeler Rear Radar
Small rear radar with UART output, lighting control and wide temperature operation for motorcycles and e-bikes.
Size
35 x 25 x 22 mm
Weight
<100 g
Operating Temperature
-40° to +85° C
Voltage
9-16 V; 12 V typical
Frequency
77 GHz
Detection range
70 m
Range resolution
0.50 m
Speed range
-150 to 150 km/h
Speed resolution
0.76 km/h
Azimuth FoV
±60°
Elevation FoV
±30°
Cycle time
75 ms
Typical power
1 W
Interface
UART; 921600 baud
Lighting control
200 mA
77GHz | 100 m
77GHz High-Performance Two-Wheeler BSD Radar
High-performance rear perception for lane-change assistance, rear collision warning and blind spot monitoring.
24GHz | 1.5-60 m
Lightweight sensing module for commercial low-altitude platforms, supporting short-range obstacle awareness, target output and OTA updates. The specifications below apply to this 24GHz module.
Size
50 x 60 x 5.5 mm
Weight
<200 g
Operating Temperature
-40° to +85° C
Voltage
5 V
Frequency
24 GHz
Detection range
1.5-60 m
Range resolution
0.65 m
Speed range
-50 to 50 m/s
Speed resolution
0.4 m/s
Azimuth FoV
±45°
Elevation FoV
30°
Cycle time
50 ms
Interface
UART; 921600 baud
Typical power
<0.8 W
60GHz | Indoor safety
Fall Detection Radar
Privacy-friendly human detection, body keypoint detection, activity and posture recognition, and fall detection.
Size
60 x 60 x 27 mm
Weight
<200 g
Operating Temperature
-20° to +50° C
Voltage
5 V / 2 A
Frequency
60 GHz
Modulation
FMCW
Bandwidth
3.2 GHz
Detection range
5 m
Azimuth FoV
±50°
Cycle time
50 ms
Operating humidity
5%-95%
Interface
UART; 921600 baud
Wireless interface
WiFi IEEE 802.11 b/g/n/ac
60GHz | 2M+ points/s
Super Sensor
Dense depth sensing module for robots, supporting obstacle classification, multi-target tracking, freespace estimation and ground surface classification.
Size
76.5 x 47 x 25 mm
Operating Temperature
-30° to +65° C
Typical power
4 W nominal; 17 W with heating
Voltage
20 VDC
Frequency
60 GHz
Detection range
20 m
Relative depth accuracy
5% up to 10 m; 8% up to 20 m
Horizontal FoV
102°
Vertical FoV
67°
Depth output
640 x 360 @ 10 fps
Point cloud
Over 2,000,000 points/s
Data interface
CSI / UART over GMSL
Power interface
POC
IP Level
IP67
Time synchronization
PPS + GPRMC
Detection range figures are reference values measured under controlled test conditions. Actual performance may vary based on target reflectivity, environmental conditions and system configuration. 'Designed for ASIL B (ISO 26262)' indicates architectural design intent; final functional safety qualification is subject to customer system-level validation per ISO 26262.
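The sensitivity of detection range to target reflectivity follows from the standard radar range equation (textbook physics, not a Fusionride specification):

```latex
R_{\max} = \left( \frac{P_t \, G^2 \, \lambda^2 \, \sigma}{(4\pi)^3 \, P_{\min}} \right)^{1/4}
```

where \(P_t\) is transmit power, \(G\) antenna gain, \(\lambda\) wavelength, \(\sigma\) the target's radar cross-section and \(P_{\min}\) the minimum detectable power. Since range scales as \(\sigma^{1/4}\), a target with one quarter of the quoted 10 dBsm cross-section is detected at roughly 71% of the listed range, all else being equal.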
Applications
One perception stack, many markets
Automotive
4D imaging radar and DeepFusion perception for urban roads, highways, bridges and rural roads, supporting ADAS and higher-level automated driving with all-weather, full-target awareness.
Two-wheelers
Compact 24GHz and 77GHz radars provide blind spot detection, lane-change assistance and rear collision warning for motorcycles and e-bikes, balancing low power, small size and wide temperature operation.
Robots
Super Sensor enables dense depth perception for indoor and outdoor mobile robots, covering general obstacle detection, obstacle classification, multi-target tracking, freespace estimation and ground surface classification.
Commercial Low-Altitude Platforms
Lightweight 24GHz sensing modules support commercial low-altitude platforms for inspection, campus operations and industrial service scenarios, with OTA capability, UART output, 50ms cycle time and a compact payload-friendly form factor.
Healthcare & Safety
60GHz FMCW radar provides privacy-friendly, non-contact human sensing for homes, hospitals and elderly care, including body keypoint detection, activity and posture recognition, occupancy counting and fall detection.
Careers
Build the future of intelligent perception with Fusionride
Open roles are updated according to project needs. Candidates with radar, AI perception, embedded software and automotive delivery experience are welcome to contact the HR team.
Lead breakthroughs
An innovation-oriented culture that encourages people to break conventions and move fast.
Embrace unique minds
A diverse, inclusive and flat organization with space for global talent to grow.
Competitive rewards
Competitive compensation, incentives, insurance, team activities, food, communication and transportation benefits.
Employee Benefits
Care for focused, sustainable work
Fusionride provides competitive rewards and practical support so every team member can stay focused on meaningful engineering work.
Social insurance and housing fund
Commercial insurance
Team activities and birthday events
Tea break, meals and communication allowance
Holiday benefits and employee purchase program
Talent settlement, housing support and shuttle service
Leave a message for HR
For recruitment inquiries, referrals or open applications, please contact the HR team.