Core Philosophy

Next-Generation Perception Fusion for Safer, Smarter Mobility

Fusionride develops high-performance millimeter-wave radar and radar-camera fusion systems. By seamlessly integrating physics-based modeling, advanced system engineering, and AI algorithms, we are pushing the boundaries of mmWave and vision perception for automotive and industrial applications.

We deliver all-weather, highly robust perception solutions for autonomous driving and intelligent mobility—powering vehicles, two-wheelers, commercial low-altitude platforms, and robotics.

We don't just build technology; we build production-ready, verifiable, and scalable solutions. By optimizing sensing coverage, system costs, compute budgets, and data strategies, Fusionride provides a rock-solid perception foundation for the future of intelligent mobility.

Mission & Vision

Mission

To give machines precise perception, enabling intelligence to navigate extreme conditions with confidence.

Vision

To become a globally trusted full-stack perception technology partner. Focused on mobility, robotics and intelligent infrastructure, Fusionride helps build a safer foundation for the connected world through reliable, verifiable and production-ready perception capabilities.

Values

Excellence: Pursue the highest standards with resilient focus, refining extreme-scenario capability with discipline.

Integrity: Respect life, uphold safety, and pursue scientific rigor, truth and pragmatism in the real world.

Courage: Break technical and cognitive boundaries, opening new paths for reliable perception in unknown scenarios.

Strategic Integration

Fusionride × Zadar Labs

In January 2026, Fusionride completed a strategic integration with Zadar Labs, combining product roadmaps, global markets, technical capabilities and customer ecosystems into a unified perception platform.

Product Line Integration

From 3D radar to high-definition 4D imaging radar, the combined roadmap expands sensing products across performance levels and application domains.

Market Integration

The integrated market footprint spans China, Japan, the United States, Europe, Southeast Asia and other regions, supporting global customer program delivery.

Capability Integration

AI, software, hardware and radar algorithms are brought together to create a complete sensing stack from signal to perception output.

By combining global R&D capabilities, advanced engineering expertise and scalable manufacturing strength, the integrated team operates with a clear international division of labor to serve automotive OEMs, Tier-1 suppliers, AMR platforms and intelligent infrastructure customers worldwide.

Global Footprint

Fusionride + Zadar Global R&D Layout

A coordinated R&D and industrial footprint connects China, Europe and North America, supporting radar hardware, DeepFusion software, manufacturing and customer delivery.

Fusionride and Zadar global layout


2021 → 2026

Fusionride Development History

A six-year product journey from Fusionride's founding to the Columbus 4D radar, centralized-compute imaging radar, Super Sensor, first international customer programs and software-defined imaging radar.

2021

Established

2022

Launched 4D Radar Columbus

2023

Launched 4D Centralized-Compute Imaging Radar Picasso

2024

Launched Camera Radar AI Super Sensor

2025

Launched Commercial Low-Altitude and Robotics Sensors

2026

Software-Defined 4D HD-Imaging Radar — Coming 2026

Core Capabilities

Full-Stack Capabilities from Software-Defined Radar to DeepFusion Intelligence

Fusionride is building a production-oriented full-stack perception ecosystem. We deeply integrate Zadar's leading software-defined 4D imaging radar hardware and SDIR core capabilities with Fusionride's in-house DeepFusion early-fusion algorithms, embedded systems and strong manufacturing resources. The platform covers the full chain from configurable sensing to Occupancy Grid understanding. Through feature-level radar-camera alignment and BEV representation, it enables reliable, continuous active-safety outputs even in highly complex and extreme environments.

Software-defined radar visualization

Software-defined imaging radar

High-resolution 4D imaging radar, software-defined waveform control, calibration, interference mitigation and multi-radar fusion provide a measurable physical sensing foundation across range, velocity, angle and elevation.

Software-defined waveforms allow the same radar platform to be configured for range, resolution and scene requirements across vehicles, robots, commercial low-altitude platforms, and infrastructure.
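
As an illustration of what "software-defined" means in practice (the class and parameter names below are hypothetical, not Fusionride's actual API), two FMCW chirp configurations on the same hardware trade range resolution against link budget via the textbook relations ΔR = c / 2B and v_max = λ / 4Tc:

```python
from dataclasses import dataclass

C = 299_792_458.0  # speed of light, m/s

@dataclass
class ChirpConfig:
    """Hypothetical FMCW chirp parameters for one software-defined radar mode."""
    carrier_hz: float        # carrier frequency, e.g. 76e9
    bandwidth_hz: float      # swept bandwidth B
    chirp_interval_s: float  # chirp repetition interval Tc

    @property
    def range_resolution_m(self) -> float:
        # Classic FMCW relation: delta_R = c / (2 * B)
        return C / (2 * self.bandwidth_hz)

    @property
    def max_unambiguous_velocity_ms(self) -> float:
        # v_max = lambda / (4 * Tc)
        return (C / self.carrier_hz) / (4 * self.chirp_interval_s)

# Two modes on the same hardware: long-range (narrow band) vs. high-resolution
long_range = ChirpConfig(carrier_hz=76e9, bandwidth_hz=300e6, chirp_interval_s=50e-6)
high_res = ChirpConfig(carrier_hz=77e9, bandwidth_hz=1.2e9, chirp_interval_s=50e-6)
print(round(long_range.range_resolution_m, 2))  # 0.5
print(round(high_res.range_resolution_m, 3))    # 0.125
```

Reconfiguring only the waveform, not the hardware, is what lets one radar platform serve both a highway long-range mode and a dense near-field mode.
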
DeepFusion radar-camera visualization

Radar-camera DeepFusion

DeepFusion aligns radar and camera features earlier in the perception chain, before object-level decisions are finalized. The fused 3D / BEV representation supports obstacle detection, freespace estimation and dense scene understanding.

Radar velocity and range evidence is aligned with visual semantics in BEV space, helping perception remain stable when light, weather or target shape changes.
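
A minimal sketch of this idea, assuming a simple grid layout and hypothetical helper names (not the DeepFusion implementation), scatters radar detections into a BEV grid and concatenates them with camera features before any object-level head:

```python
import numpy as np

def radar_to_bev(points_xyv: np.ndarray, grid=(200, 200), cell=0.5):
    """Scatter radar detections (x, y, radial velocity) into a 2-channel
    BEV grid: channel 0 = hit count, channel 1 = mean Doppler velocity."""
    H, W = grid
    bev = np.zeros((2, H, W), dtype=np.float32)
    counts = np.zeros((H, W), dtype=np.float32)
    for x, y, v in points_xyv:
        # ego at the bottom-center of the grid, x forward, y left
        row = H - 1 - int(x / cell)
        col = W // 2 - int(y / cell)
        if 0 <= row < H and 0 <= col < W:
            counts[row, col] += 1
            bev[1, row, col] += v
    bev[0] = counts
    np.divide(bev[1], counts, out=bev[1], where=counts > 0)  # mean velocity
    return bev

def fuse_bev(radar_bev: np.ndarray, camera_bev: np.ndarray) -> np.ndarray:
    """Feature-level fusion: concatenate channels before object reasoning."""
    return np.concatenate([radar_bev, camera_bev], axis=0)

radar = radar_to_bev(np.array([[10.0, 2.0, -5.0], [40.0, -1.0, 12.0]]))
camera = np.random.rand(8, 200, 200).astype(np.float32)  # stand-in features
fused = fuse_bev(radar, camera)
print(fused.shape)  # (10, 200, 200)
```

Because the radar channels carry metric range and velocity evidence, a downstream network can lean on them when the camera channels degrade.
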
Robust scene understanding visualization

Robust scene understanding

The perception stack is designed for glare, low light, rain, fog, unstructured roads and unknown obstacles. Occupancy grids, temporal memory and radar physical evidence help maintain stable understanding when individual sensors degrade.

Occupancy Grid reasoning turns sparse and dense sensing evidence into a continuous understanding of free space, obstacles and safety-critical events.
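
The standard machinery behind this kind of reasoning is a Bayesian log-odds occupancy update; the sketch below uses illustrative constants, not Fusionride's tuning:

```python
import numpy as np

L_OCC, L_FREE = 0.85, -0.4  # log-odds increments (illustrative values)
L_MIN, L_MAX = -4.0, 4.0    # clamping keeps the grid responsive to change

def update_cell(logodds: float, hit: bool) -> float:
    """Bayesian log-odds update for a single occupancy cell."""
    logodds += L_OCC if hit else L_FREE
    return float(np.clip(logodds, L_MIN, L_MAX))

def probability(logodds: float) -> float:
    """Convert log-odds back to an occupancy probability."""
    return 1.0 / (1.0 + np.exp(-logodds))

l = 0.0  # unknown cell: p = 0.5
for _ in range(3):  # three consecutive radar hits on the same cell
    l = update_cell(l, hit=True)
print(probability(l) > 0.9)  # True: confidence accumulates over time
```

Accumulating evidence this way is what lets the grid stay stable when any single frame, or single sensor, is noisy.
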
Embedded AI production-ready visualization

Embedded and production ready

Lightweight neural networks, automotive interfaces, time synchronization, diagnostics, validation workflows and EMS manufacturing resources support deployment on mainstream edge AI platforms and volume production programs.

The stack connects algorithm design with embedded deployment, validation, manufacturing traceability and scalable delivery for production programs.
Radar signal evidence visualization Signal Layer

From raw radar echoes to high-confidence physical evidence

The signal layer converts raw radar echoes into structured physical evidence, including range, velocity, angle, elevation, micro-motion and temporal consistency. Software-defined configuration, calibration and interference mitigation allow radar behavior to be tuned for automotive, robot, commercial low-altitude platform, and infrastructure domains.
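
The core of this conversion in an FMCW radar is the range-Doppler transform: an FFT over fast-time samples yields range bins, and an FFT across chirps yields Doppler bins. A minimal NumPy sketch with one synthetic target (not production signal-processing code):

```python
import numpy as np

def range_doppler_map(adc_block: np.ndarray) -> np.ndarray:
    """Turn a (chirps, samples) block of beat-signal ADC data into a
    range-Doppler magnitude map: FFT over fast time gives range bins,
    FFT over slow time (across chirps) gives Doppler bins."""
    range_fft = np.fft.fft(adc_block, axis=1)                    # fast time -> range
    rd = np.fft.fftshift(np.fft.fft(range_fft, axis=0), axes=0)  # slow time -> Doppler
    return np.abs(rd)

# Synthetic single target: a beat tone at range bin 32 with a phase ramp
# across chirps that lands in Doppler bin 4 (index 36 after fftshift)
chirps, samples = 64, 256
n = np.arange(samples)
block = np.array([np.exp(2j * np.pi * (0.125 * n + 0.0625 * k)) for k in range(chirps)])
rd = range_doppler_map(block)
doppler_bin, range_bin = np.unravel_index(np.argmax(rd), rd.shape)
print(range_bin, doppler_bin)  # 32 36
```

Angle and elevation estimation add further FFTs (or super-resolution methods) across antenna channels on top of this same map.
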

DeepFusion feature layer visualization Fusion Layer

DeepFusion before object-level decisions

The fusion layer brings radar and camera information together before final object reasoning. By aligning complementary features in 3D / BEV space, DeepFusion preserves radar's all-weather ranging and velocity advantages while using visual semantics for structure, class and context.

Deployable perception stack visualization Production Layer

A deployable stack, not a laboratory demo

The deployment layer connects algorithms with embedded software, interface adaptation, validation, data iteration and manufacturability. Fusionride delivers perception outputs that downstream systems can consume directly, including point cloud, tracking, freespace, occupancy, obstacle class, road structure and safety events.
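
As a hedged sketch of what such directly consumable outputs might look like (field names are hypothetical, not Fusionride's actual message schema):

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Track:
    """One tracked object in the ego frame (hypothetical message layout)."""
    track_id: int
    x_m: float
    y_m: float
    vx_ms: float
    vy_ms: float
    obstacle_class: str  # e.g. "car", "truck", "pedestrian"

@dataclass
class PerceptionFrame:
    """Bundle of per-cycle outputs a downstream planner might consume."""
    timestamp_us: int
    point_cloud: List[Tuple[float, float, float, float]] = field(default_factory=list)  # (x, y, z, doppler)
    tracks: List[Track] = field(default_factory=list)
    freespace_polygon: List[Tuple[float, float]] = field(default_factory=list)  # (x, y) vertices
    safety_events: List[str] = field(default_factory=list)  # e.g. "RCW"

frame = PerceptionFrame(
    timestamp_us=1_700_000_000_000,
    tracks=[Track(7, 25.0, -1.2, -3.5, 0.0, "car")],
    safety_events=["RCW"],
)
print(len(frame.tracks), frame.tracks[0].obstacle_class)
```

The point of a typed, per-cycle bundle like this is that downstream planning and actuation code can consume it without re-running any perception logic.
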

Fusionride radar manufacturing base

Production Capability

Scalable production capability for radar delivery

Fusionride connects product engineering, validation testing and manufacturing resources to support radar programs from early engineering builds to scalable production. The production system is designed around consistency, traceability and delivery efficiency, providing stable manufacturing support for automotive, robotics and mobility customers.

2,400 sqm highly automated production line
300,000 units/year capacity, scalable to 800,000 units/year

Product Center

Product Portfolio by Perception Scenario

Designed for automotive and ground-based mobility applications.

zPRIME FPGA
76-81GHz | FPGA imaging radar

zPRIME FPGA

High-performance imaging radar platform with point cloud output and optional tracking, classification, odometry and freespace outputs.

Specification 120° x 24° 120° x 50° 120° x 90°
Field of View 120° x 24° 120° x 50° 120° x 90°
Static Angular Resolution 0.35° x 0.4° 0.35° x 0.9° 0.35° x 1.6°
Angular Accuracy ±0.05°
Truck Detection > 1,000 m > 400 m > 250 m
Car Detection > 800 m > 400 m > 250 m
Human Detection > 250 m > 150 m > 90 m
Hardware Dimensions 140 x 103 x 36 mm
Weight 500 g (17.6 oz)
Sealing Protection IP68 (2-meter)
Operating Temperature -40° to +85° C
Storage Temperature -40° to +95° C
Compliance IEC 60068-2-27, IEC 60068-2-64, ISO 16750
Frame Time 50 ms / 20 Hz
zPULSE FPGA
76-81GHz | Compact FPGA radar

zPULSE FPGA

Compact software-defined radar platform for high-resolution point cloud and optional perception outputs in production programs.

zPULSE Specifications
Field of View 120° x 90°
Static Angular Resolution 1.5°
Angular Accuracy ±0.25°
Truck Detection > 400 m
Car Detection > 400 m
Human Detection > 200 m
Hardware Dimensions 80 x 70 x 40 mm
Weight 200 g
Weather Proofing IP68 (2-meter)
Columbus Front Radar TI 2944
4T4R | 250 m front radar

Columbus Front Radar – TI 2944

Front automotive radar for detections, objects and freespace with CAN-FD or optional Ethernet output.

Size 72 x 62.5 x 18.5 mm
Weight <100 g
Operating Temperature -40° to +85° C
Typical power 4.5 W nominal
Voltage 9-16 VDC
Frequency 76-77 GHz
Interface CAN-FD / Ethernet optional
IP Level IP6K7; IP6K9 with special connector
Channels 4T4R
Detection range 250 m @ 10 dBsm
Range accuracy 6 cm
Range resolution 0.25 m
Speed range ±330 km/h
Speed accuracy 0.023 m/s
Speed resolution 0.14 m/s
Azimuth FoV 120°
Azimuth accuracy ±0.15°
Azimuth separation 2.5°
Elevation FoV ±15°
Elevation accuracy ±1.0°
FuSa / Cybersecurity Designed for ASIL B (ISO 26262) / HSM / AUTOSAR
Columbus Corner Radar TI 2944
4T4R | 180 m corner radar

Columbus Corner Radar – TI 2944

Corner radar for detections, objects, freespace and ADAS functions including LCDAS, RCTA/B, FCTA/B, DOW and RCW.

Size 72 x 63 x 16.4 mm
Weight <100 g
Operating Temperature -40° to +85° C
Typical power 4.5 W nominal
Voltage 9-16 VDC
Frequency 76-77 GHz
Interface CAN-FD / Ethernet optional
IP Level IP6K7; IP6K9 with special connector
Channels 4T4R
Detection range 180 m @ 10 dBsm
Range accuracy 6 cm
Range resolution 0.24 m
Speed range ±330 km/h
Speed accuracy 0.023 m/s
Speed resolution 0.14 m/s
Azimuth FoV 150°
Azimuth accuracy ±0.3°
Azimuth separation 4.8°
Elevation FoV ±15°
Elevation accuracy ±1.0°
FuSa / Cybersecurity Designed for ASIL B (ISO 26262) / HSM / AUTOSAR
Columbus Corner Radar Calterah 344
4T4R | Calterah 344

Columbus Corner Radar – Calterah 344

CAN-FD corner radar for compact ADAS functions, supporting detections, objects, freespace and surrounding alerts.

Size 72 x 63 x 16.4 mm
Weight <100 g
Operating Temperature -40° to +85° C
Typical power 4.5 W nominal
Voltage 9-16 VDC
Frequency 76-77 GHz
Interface CAN-FD
IP Level IP6K7; IP6K9 with special connector
Channels 4T4R
Detection range 160 m @ 10 dBsm
Range accuracy 6 cm
Range resolution 0.24 m
Speed range ±300 km/h
Speed accuracy 0.023 m/s
Speed resolution 0.14 m/s
Azimuth FoV 150°
Azimuth accuracy ±0.3°
Azimuth separation 4.8°
Elevation FoV ±15°
Elevation accuracy ±1.0°
FuSa / Cybersecurity Designed for ASIL B (ISO 26262) / HSM / AUTOSAR
Door Radar Calterah 244
4T4R | Calterah 244 | 76-79 GHz

Door Radar – Calterah 244

Ultra-compact CAN-FD door radar for door anti-pinch and opening-angle sensing, designed for ASIL B (ISO 26262) functional safety with AUTOSAR support.

Size 77 × 22 × 10.3 mm
Weight <35 g
Operating Temperature -40° to +85° C
Typical power <2 W
Voltage 9-16 VDC
Frequency 76-79 GHz
Interface CAN-FD
IP Level IP6K7
Channels 4T4R
Detection range 120 cm
Range accuracy 2 cm
Range resolution 0.06 m
Speed range ±13 m/s
Speed accuracy 0.02 m/s
Speed resolution 0.14 m/s
Azimuth FoV 140°
Elevation FoV 120°
FuSa / Cybersecurity Designed for ASIL B (ISO 26262) / HSM / AUTOSAR
Picasso 4D Imaging Radar 6×8 NXP
77GHz 6T8R | 350 m

Picasso 4D Imaging Radar 6×8 NXP

6T8R front imaging radar with long-range, short-range and configurable modes for detection list, optional tracking object and freespace output.

Parameter Long Range Mode Short Range Mode Configurable Mode
Channels 6T8R 6T8R Configurable
Detection range 350 m @ 10 dBsm 125 m @ 10 dBsm
Range accuracy 0.08 m 0.05 m
Range resolution 0.4 m 0.25 m
Doppler range ±300 km/h ±100 km/h
Doppler accuracy 0.023 m/s 0.012 m/s
Doppler resolution 0.1 m/s 0.06 m/s
Horizontal FoV ±60° ±60°
Horizontal angle accuracy ±0.15° ±0.15°
Horizontal angle resolution 2.0° 2.0°
Vertical FoV ±15° ±15°
Vertical angle accuracy ±0.5° ±0.5°
Vertical angle resolution 4.0° 4.0°
Size 101 x 85.5 x 17.5 mm excluding connector
Weight <200 g
Operating Temperature -40° to +85° C
Typical power <10 W
Voltage 9-16 VDC
Frequency 76-77 GHz
Interface CAN-FD / Ethernet
IP Level IP6K7; IP6K9 with specific connector
FuSa / Cybersecurity Designed for ASIL B (ISO 26262) / HSM / AUTOSAR
24GHz Two-Wheeler BSD Radar
24GHz | 0.65-70 m

24GHz Two-Wheeler BSD Radar

Low-power two-wheeler blind spot radar with detections, objects and LCA, RCW, BSD functions.

Size 61 x 45.5 x 13 mm
Weight <100 g
Operating Temperature -40° to +85° C
Voltage 9-16 V; 12 V typical
Frequency 24 GHz
Detection range 0.65-70 m
Range resolution 0.65 m
Speed range -150 to 150 km/h
Speed resolution 0.76 km/h
Azimuth FoV ±60°
Elevation FoV ±30°
Cycle time 75 ms
Interface CAN; 500 kbaud
Wireless interface BLE 5.0
Typical power <0.8 W
77GHz Two-Wheeler Rear Radar
77GHz | 70 m

77GHz Two-Wheeler Rear Radar

Small rear radar with UART output, lighting control and wide temperature operation for motorcycles and e-bikes.

Size 35 x 25 x 22 mm
Weight <100 g
Operating Temperature -40° to +85° C
Voltage 9-16 V; 12 V typical
Frequency 77 GHz
Detection range 70 m
Range resolution 0.50 m
Speed range -150 to 150 km/h
Speed resolution 0.76 km/h
Azimuth FoV ±60°
Elevation FoV ±30°
Cycle time 75 ms
Typical power 1 W
Interface UART; 921600 baud
Lighting control 200 mA
77GHz High-Performance Two-Wheeler BSD Radar
77GHz | 100 m

77GHz High-Performance Two-Wheeler BSD Radar

High-performance rear perception for lane-change assistance, rear collision warning and blind spot monitoring.

Size 52 x 51 x 15.5 mm
Weight <80 g
Operating Temperature -40° to +85° C
Voltage 9-16 V; 12 V typical
Frequency 77 GHz
Detection range 100 m
Range resolution 0.55 m
Speed range -270 to 270 km/h
Speed resolution 0.76 km/h
Azimuth FoV ±75°
Elevation FoV ±15°
Cycle time 75 ms
Typical power 2.5 W
Interface CAN
Lighting control 200 mA
Commercial Low-Altitude Lightweight Sensing Module
24GHz | 1.5-60 m

Commercial Low-Altitude Lightweight Sensing Module

Lightweight sensing module for commercial low-altitude platforms, supporting short-range obstacle awareness, target output and OTA updates.

Size 50 x 60 x 5.5 mm
Weight <200 g
Operating Temperature -40° to +85° C
Voltage 5 V
Frequency 24 GHz
Detection range 1.5-60 m
Range resolution 0.65 m
Speed range -50 to 50 m/s
Speed resolution 0.4 m/s
Azimuth FoV ±45°
Elevation FoV 30°
Cycle time 50 ms
Interface UART; 921600 baud
Typical power <0.8 W
Fall Detection Radar
60GHz | Indoor safety

Fall Detection Radar

Privacy-friendly human detection, body keypoint detection, activity and posture recognition, and fall detection.

Size 60 x 60 x 27 mm
Weight <200 g
Operating Temperature -20° to +50° C
Voltage 5 V / 2 A
Frequency 60 GHz
Modulation FMCW
Bandwidth 3.2 GHz
Detection range 5 m
Azimuth FoV ±50°
Cycle time 50 ms
Operating humidity 5%-95%
Interface UART; 921600 baud
Wireless interface WiFi IEEE 802.11 b/g/n/ac
Super Sensor
60GHz | 2M+ points/s

Super Sensor

Dense depth sensing module for robots, supporting obstacle classification, multi-target tracking, freespace estimation and ground surface classification.

Size 76.5 x 47 x 25 mm
Operating Temperature -30° to +65° C
Typical power 4 W nominal; 17 W with heating
Voltage 20 VDC
Frequency 60 GHz
Detection range 20 m
Relative depth accuracy 5% up to 10 m; 8% up to 20 m
Horizontal FoV 102°
Vertical FoV 67°
Depth output 640 x 360 @ 10 fps
Point cloud Over 2,000,000 points/s
Data interface CSI / UART over GMSL
Power interface POC
IP Level IP67
Time synchronization PPS + GPRMC

Detection range figures are reference values measured under controlled test conditions. Actual performance may vary based on target reflectivity, environmental conditions and system configuration. 'Designed for ASIL B (ISO 26262)' indicates architectural design intent; final functional safety qualification is subject to customer system-level validation per ISO 26262.

Applications

One perception stack, many markets

Automotive perception scene

Automotive

4D imaging radar and DeepFusion perception for urban roads, highways, bridges and rural roads, supporting ADAS and higher-level automated driving with all-weather, full-target awareness.

Two-wheeler radar scene

Two-wheelers

Compact 24GHz and 77GHz radars provide blind spot detection, lane-change assistance and rear collision warning for motorcycles and e-bikes, balancing low power, small size and wide temperature operation.

Robot perception scene

Robots

Super Sensor enables dense depth perception for indoor and outdoor mobile robots, covering general obstacle detection, obstacle classification, multi-target tracking, freespace estimation and ground surface classification.

Commercial low-altitude platform sensing scene

Commercial Low-Altitude Platforms

Lightweight 24GHz sensing modules support commercial low-altitude platforms for inspection, campus operations and industrial service scenarios, with OTA capability, UART output, 50ms cycle time and a compact payload-friendly form factor.

Healthcare and safety radar scene

Healthcare & Safety

60GHz FMCW radar provides privacy-friendly, non-contact human sensing for homes, hospitals and elderly care, including body keypoint detection, activity and posture recognition, occupancy counting and fall detection.

Careers

Build the future of intelligent perception with Fusionride

Open roles are updated according to project needs. Candidates with radar, AI perception, embedded software and automotive delivery experience are welcome to contact the HR team.

Lead breakthroughs

An innovation-oriented culture that encourages people to break conventions and move fast.

Embrace unique minds

A diverse, inclusive and flat organization with space for global talent to grow.

Competitive rewards

Competitive compensation, incentives, insurance, team activities, food, communication and transportation benefits.

Employee Benefits

Care for focused, sustainable work

Fusionride provides competitive rewards and practical support so every team member can stay focused on meaningful engineering work.

Social insurance and housing fund
Commercial insurance
Team activities and birthday events
Tea break, meals and communication allowance
Holiday benefits and employee purchase program
Talent settlement, housing support and shuttle service

Leave a message for HR

For recruitment inquiries, referrals or open applications, please contact the HR team.

hr@fusionride.com

Press

Company News and Media Coverage

Contact Fusionride

R&D centers and contact information

Shanghai R&D Center

7F, Building 1, No. 2377 Shenkun Road, Minhang District, Shanghai, China

London Engineering Office

Alpha House, 100 Borough High Street, London SE1 1LB, United Kingdom

Munich Engineering Office

Konrad-Zuse-Platz 8, 81829 Munich, Germany

Fusionride Production Line

Liaobang Road & Fushi Road, Wujiang District, Suzhou, Jiangsu, China

Business Inquiries

For product information, project discussion or partnership requests, please contact the Fusionride sales team.

business@fusionride.com

Affiliated Company Zadar Labs

Visit Zadar Labs' official website for high-performance 4D imaging radar technology, SDIR capabilities and product information.

zadarlabs.com