Arnold NextG Blogspot: Sensor fusion in action – How autonomous systems are already making safe decisions today
A particularly practical example is public transport: autonomous shuttles move in highly dynamic, urban traffic areas – with pedestrians, cyclists, traffic signs, light signals, and unpredictable obstacles. To navigate safely here, modern systems rely on the fusion of camera and LiDAR data. Cameras provide contextual information such as traffic light phases or signage, while LiDAR uses centimeter-accurate 3D point cloud mapping to detect object contours and distances even at night or in strong glare.
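To make the principle tangible, here is a minimal sketch of how a camera detection (class label plus bounding box) can be paired with the LiDAR points that project into the same image region, so every classified object also carries a measured distance. The data structure and function below are purely illustrative assumptions, not part of any Arnold NextG API.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class CameraDetection:
    label: str    # e.g. "pedestrian" or "traffic_light_red" (illustrative labels)
    box: tuple    # (u_min, v_min, u_max, v_max) in pixels

def fuse_camera_lidar(detection, lidar_points, K, T_cam_lidar):
    """Attach a LiDAR-measured distance to a camera detection (sketch).

    lidar_points: (N, 3) array in the LiDAR frame.
    K:            3x3 camera intrinsic matrix.
    T_cam_lidar:  4x4 transform from the LiDAR frame to the camera frame.
    """
    # Transform LiDAR points into the camera frame.
    pts_h = np.hstack([lidar_points, np.ones((len(lidar_points), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]
    # Keep only points in front of the camera.
    pts_cam = pts_cam[pts_cam[:, 2] > 0.1]
    # Project into the image plane.
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]
    # Select the points that fall inside the detection's bounding box.
    u_min, v_min, u_max, v_max = detection.box
    mask = ((uv[:, 0] >= u_min) & (uv[:, 0] <= u_max) &
            (uv[:, 1] >= v_min) & (uv[:, 1] <= v_max))
    if not mask.any():
        return None  # no LiDAR support for this detection
    # Use the median range of the matching points as a robust distance estimate.
    distance = float(np.median(np.linalg.norm(pts_cam[mask], axis=1)))
    return {"label": detection.label, "distance_m": distance}
```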
This fusion not only provides a complete picture of the environment, but also enables robust classification of road users through real-time evaluation. As emphasized in the current handbook of the Federal Ministry of Digital and Transport (BMDV, 2024), sensor-based multiple redundancy is crucial for the approval and operation of autonomous vehicles in public spaces.
Implementation can only succeed if perception and action are closely linked. Platforms such as NX NextMotion enable this connection through their integrated control architecture – drive, steer, and brake-by-wire – with real-time responsiveness, certified to ASIL D and SIL3.
Precision in the port: Sensor fusion for logistics processes at the limit
In container handling or intermodal logistics centers, precision down to the centimeter is essential. Freight vehicles must dock reliably at night, in fog, or in heavy rain – often in mixed operations involving people, machines, and other vehicles. Sensor architectures combining radar, LiDAR, GPS, and IMUs have become established here.
Radar provides reliable object and distance data even in poor visibility conditions. LiDAR refines contour detection and cross-checks the measurements. This is supplemented by inertial measurement units and high-resolution GPS for localization accurate to within a few centimeters. Sensor fusion synchronizes the data streams from these different sources at the system level to detect and correct inconsistencies – as described in the Intel Mobileye architecture for self-driving systems (Mobileye & Intel, 2024).
NX NextMotion supports precisely these requirements: the system enables a continuous control chain with multiple redundancy in sensor technology, communication, and actuators. Integrated edge logic eliminates the need to send raw data off-board, which reduces latency and increases reliability.
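A simplified way to picture this cross-validation is a plausibility check over time-synchronized position estimates from the independent sources: if one source drifts away from the others, it is flagged and excluded from the fused result. The thresholds and source names below are assumptions chosen for illustration only.

```python
import numpy as np

def check_consistency(estimates, max_deviation_m=0.15):
    """Cross-validate position estimates from independent sources (sketch).

    estimates: dict mapping a source name ("gps", "lidar", "radar", "imu")
               to a time-synchronized 2D position estimate in metres.
    Returns the fused position and the list of sources flagged as outliers.
    """
    names = list(estimates)
    positions = np.array([estimates[n] for n in names])
    # Use the element-wise median as a robust reference.
    reference = np.median(positions, axis=0)
    deviations = np.linalg.norm(positions - reference, axis=1)
    outliers = [n for n, d in zip(names, deviations) if d > max_deviation_m]
    # Fuse only the sources that agree with the reference.
    consistent = positions[deviations <= max_deviation_m]
    fused = consistent.mean(axis=0) if len(consistent) else reference
    return fused, outliers
```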
Mining & Construction: Autonomous Vehicles in Extreme Environments
In mines, tunnels, and on unsecured construction sites, conventional sensor solutions quickly reach their limits. Dense dust clouds, uneven terrain, and changing light conditions make reliable environmental perception particularly challenging. Robust sensor architectures are required—such as radar-based systems, supplemented by thermal imaging technology and selective LiDAR fusion.
Sensor fusion becomes a safety issue in such environments: it enables not only the detection of people and obstacles but also the validation of sensor data across multiple channels. False positives and blind spots must be avoided, especially in teleoperated or semi-autonomous machines.
As expert analyses show, multisensor fusion plays a decisive role in minimizing precisely these sources of error – by deliberately combining the strengths of different sensor types and compensating for inaccurate or missing individual measurements. This increased reliability is considered a key success factor for the perception performance of modern autonomous systems, especially under difficult environmental conditions.
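One common way to express this compensation is variance-weighted fusion: each available measurement contributes according to its confidence, and a sensor that currently delivers no valid reading (for example a LiDAR blinded by dust) is simply left out. The numbers below are made up for illustration; this is a sketch of the general technique, not of a specific product algorithm.

```python
def fuse_measurements(measurements):
    """Variance-weighted fusion of redundant range measurements (sketch).

    measurements: list of (value, variance) tuples; a sensor without a valid
    reading is omitted from the list instead of contributing a bad value.
    Returns the fused value and its variance (smaller than any single input).
    """
    if not measurements:
        raise ValueError("no valid sensor data available")
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    fused_value = sum(w * v for w, (v, _) in zip(weights, measurements)) / total
    fused_variance = 1.0 / total
    return fused_value, fused_variance

# Example: radar (noisier but dust-proof) and a thermal camera agree,
# the LiDAR reading is missing and therefore left out.
value, variance = fuse_measurements([(12.4, 0.25), (12.1, 0.09)])
```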
NX NextMotion is designed for this application. The platform separates safety-critical from secondary functions, offers real-time fusion, and is tested according to requirements such as VW80000 (Class 5). Thanks to its modular architecture, legacy platforms can also be retrofitted safely – an important prerequisite for many operators in the construction and raw materials industries.
Defense & remote mobility: Control under threat
In military applications or crisis areas, vehicle control is subject to maximum risk. Communication failures, interference signals, or rough terrain require systems that not only operate independently but also remain fail-operational in the event of a fault. Here, emergency services rely on redundant sensor arrays, often with physically separate computing units and multiple safety logics, to protect against failures, interference, and attacks.
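The classic pattern behind such redundant arrays is majority voting, for example 2-out-of-3 across physically separate computing units: as long as two channels agree, the vehicle keeps operating; if they do not, it falls back to a defined safe state. The sketch below shows only the voting principle under these assumptions, not a certified fail-operational design.

```python
from collections import Counter

def vote_2oo3(commands):
    """2-out-of-3 voter over commands from physically separate compute units.

    commands: list of three command values (e.g. quantized steering angles).
    Returns the majority command, or None if no two units agree, in which
    case the caller should transition to a defined safe state.
    """
    if len(commands) != 3:
        raise ValueError("expected exactly three redundant inputs")
    value, count = Counter(commands).most_common(1)[0]
    return value if count >= 2 else None

# Example: one channel is disturbed, the other two still agree.
assert vote_2oo3([5, 5, 7]) == 5
assert vote_2oo3([5, 6, 7]) is None   # no majority -> safe state
```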
As a technical report by the NATO Science and Technology Organization (STO) on the safety of autonomous unmanned systems shows, the systematic risk and safety assessment of unmanned vehicles is at the core of reliable mission assurance — including the identification of threats, risk aggregation across system levels, and the definition of safety capabilities that must be implemented in practice (STO Report, 2024).
International standards such as ISO/SAE 21434 (cybersecurity in vehicle engineering) and the newly published ISO/PAS 8800 (safety of artificial intelligence in vehicles) also specify binding requirements for the development, validation, and risk analysis of autonomous systems. These standards address functional safety, cyber resilience, and the interface between perception, control, and protection mechanisms—key dimensions when autonomous or teleoperated vehicles are to be used in safety-critical environments (ISO/SAE 21434; ISO/PAS 8800).
NX NextMotion enables the implementation of such architectures through its quadruple redundant control logic, dual power supply, separate communication paths, and dedicated watchdog logic. The ability to secure autonomous or remote-controlled vehicles even in NATO-compatible platforms makes the technology particularly interesting for defense suppliers. For military UGVs (unmanned ground vehicles), reconnaissance units, or convoy-capable transporters, the platform already delivers scalable security today – prepared for teleoperation, AI control, and cyber threats in accordance with the relevant ISO and defense-related security frameworks.
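Watchdog logic of this kind can be illustrated with a minimal heartbeat monitor: an independent supervisor expects a sign of life from the control channel within a fixed deadline and otherwise triggers the safe state. Class name, timeout, and callback are assumptions for the sketch and do not describe the NX NextMotion implementation.

```python
import time

class Watchdog:
    """Minimal heartbeat watchdog: if the monitored channel does not
    report within `timeout_s`, a safe-state callback is triggered."""

    def __init__(self, timeout_s, on_timeout):
        self.timeout_s = timeout_s
        self.on_timeout = on_timeout
        self.last_kick = time.monotonic()

    def kick(self):
        """Called by the monitored control channel on every healthy cycle."""
        self.last_kick = time.monotonic()

    def poll(self):
        """Called periodically by an independent supervisor."""
        if time.monotonic() - self.last_kick > self.timeout_s:
            self.on_timeout()

# Example: engage a defined safe stop if the primary channel stalls.
wd = Watchdog(timeout_s=0.05, on_timeout=lambda: print("safe stop engaged"))
```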
Action instead of just perception: Real time is decisive
All of the above-mentioned fields of application show that sensor fusion alone is not enough. Safety can only be achieved through integration into a robust control architecture. Systems must not only recognize, but also react immediately – even under time pressure, in the event of malfunctions, or in exceptional situations.
NX NextMotion combines these levels into a closed-loop response system: from sensor input and validated fusion to real-time control of vehicle movement – drive, steer, brake. The direct connection of actuators, safety modules, and diagnostic tools turns perception into action. And technology into trust.
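Conceptually, such a closed loop can be reduced to a fixed-rate sense, fuse, and act cycle with a hard deadline, as in the sketch below. The callables, the cycle time, and the safe-stop fallback are placeholders chosen for illustration; they are not NX NextMotion interfaces.

```python
import time

CYCLE_TIME_S = 0.01   # 100 Hz control loop (illustrative value)

def control_loop(sense, fuse, act, safe_stop, running):
    """Closed loop: read sensors, fuse, command actuators, every cycle (sketch).

    sense, fuse, act, safe_stop, running are supplied by the platform
    integration; they are placeholders here, not a vendor API.
    """
    while running():
        start = time.monotonic()
        raw = sense()                 # camera / LiDAR / radar / IMU frames
        state = fuse(raw)             # validated environment and ego state
        act(state)                    # drive, steer, brake commands
        elapsed = time.monotonic() - start
        if elapsed > CYCLE_TIME_S:
            safe_stop()               # missed deadline -> degrade safely
            break
        time.sleep(CYCLE_TIME_S - elapsed)
```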
We control what moves!
Arnold NextG realizes the safety-by-wire® technology of tomorrow: the multi-redundant central control unit NX NextMotion enables fail-safe, customized implementations independent of the vehicle platform – unique worldwide. The system can be used to safely implement autonomous vehicle concepts in accordance with the latest hardware, software, and safety standards, as well as remote control, teleoperation, or platooning solutions. As an independent pre-developer, incubator, and system supplier, Arnold NextG takes care of planning and implementation – from vision to road approval. With the road approval of NX NextMotion, we are setting the global drive-by-wire standard. www.arnoldnextg.com
Arnold NextG GmbH
Breite 3
72539 Pfronstetten-Aichelau
Phone: +49 171 5340377
http://www.arnoldnextg.de