Create New Human-Centric
Mobility Experiences

Automotive Intelligent Cockpit

Next-generation mobility requires AI, from self-driving cars to new ways to engage customers. Build and deploy robust AI-powered interior monitoring systems. The NVISO Human Behaviour SDK includes building blocks and tools that accelerate development of in-cabin monitoring systems requiring AI-enabled perception and interaction features, including gaze and eye-state tracking, body tracking, and activity and gesture recognition.

Driver Monitoring Systems

Driver monitoring is expected to become a standard feature in new cars as a result of regulatory and rating-agency requirements such as the Euro NCAP 2025 Roadmap. Advanced driver monitoring systems (DMS) can detect distracted and drowsy drivers by accurately measuring eye and head position, attention, and fatigue. The DMS alerts the driver and integrated safety systems upon detection of a risk such as drowsiness or distraction. This feedback enables the driver and vehicle to take action before safety is compromised.

Occupant Monitoring Systems

It is not only the driver who is the focus of attention. The camera is positioned so that all seats are in its field of view. The system can detect the presence of any other occupant, including front passengers, and can thus deactivate the airbag if, for instance, a child safety seat is present. Occupant monitoring systems (OMS) use machine learning to enable in-vehicle systems to sense their occupants' emotional states and gestures and provide personalized experiences in the transition to automated driving.



Performance That Scales
Any Sensor, Any Placement

The interior of a vehicle is an unpredictable environment. Typical constraints range from environmental unpredictability while driving to drastic changes in ambient temperature. These factors drive the need for algorithms capable of handling tough environmental conditions, and the choice of camera placement is critical for robust operation of AI systems. Another factor that adds to system complexity is accommodating the cosmetic design of the vehicle. Automotive designers constantly introduce new design concepts while also maximizing driver comfort, and these constraints often require the camera's position to change. NVISO addresses these challenges by supporting flexible camera positioning anywhere between the A-pillar and the center stack, which is critical to large-scale adoption.


Multi-party AI Systems Development
Integration Ready

EMBEDDING TOOLS

Supports platforms ranging from prototyping (Intel) to production, from centralized computing (NVIDIA) and multimedia computing (Qualcomm) to close-to-sensor computing (Arm A5x, A7x + NPU accelerators).

SIMULATION TOOLS

Camera sensor type and location play a critical role in system performance. 3D simulation tools provide a fast and effective design and verification platform.

MACHINE LEARNING TOOLS

Data-driven software development requires automated data tools to enable cost-effective, full-cycle development of AI systems.

NVISO Neuro SDK
for Automotive OEMs

Automotive manufacturers can add robust real-time human behaviour features using off-the-shelf automotive-grade camera devices. Featuring NVISO Neuro Models™, which are interoperable and optimised for neuromorphic computing, the NVISO Neuro SDK is designed for high-volume applications where cost and power are critical to market success. Flexible sensor integration options and placements deliver faster development cycles and time-to-value for software developers and integrators. The SDK enables solutions that can sense, comprehend, and act upon human behaviour; designed for real-world environments using edge computing, it uniquely targets deep learning for embedded systems.
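As an illustration, integrating an in-cabin monitoring SDK of this kind typically follows a capture–infer–act loop: run a model on each camera frame, then map the result to an alert. The sketch below is a minimal, self-contained mock of that pattern; the class names, result fields, and the 30° distraction threshold are illustrative assumptions, not the actual NVISO Neuro SDK API.

```python
from dataclasses import dataclass

# Hypothetical sketch: all names below are illustrative,
# not the real NVISO Neuro SDK interface.

@dataclass
class GazeResult:
    yaw_deg: float      # horizontal gaze angle
    pitch_deg: float    # vertical gaze angle
    eyes_closed: bool   # eye-state flag (e.g. for drowsiness)

class InCabinPipeline:
    """Capture -> infer -> act loop typical of in-cabin monitoring."""

    DISTRACTION_YAW_DEG = 30.0  # assumed threshold for "eyes off road"

    def infer(self, frame) -> GazeResult:
        # A real SDK would run a neural network here; we fake a result
        # from the frame's mean intensity, for illustration only.
        mean = sum(frame) / len(frame)
        return GazeResult(yaw_deg=mean, pitch_deg=0.0, eyes_closed=mean > 200)

    def act(self, result: GazeResult) -> str:
        # Map the inference result to a safety-system action.
        if result.eyes_closed:
            return "drowsiness_alert"
        if abs(result.yaw_deg) > self.DISTRACTION_YAW_DEG:
            return "distraction_alert"
        return "ok"

pipeline = InCabinPipeline()
print(pipeline.act(pipeline.infer([10, 20, 30])))  # -> ok
print(pipeline.act(pipeline.infer([90, 40, 50])))  # -> distraction_alert
```

In a real deployment the `infer` step would invoke the vendor's optimised model on the embedded target, but the loop structure (per-frame inference feeding an alerting policy) is the same.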


Accurate and Robust

CNNs scale to learn from billions of examples, resulting in an extraordinary capacity to learn highly complex behaviours across thousands of categories. NVISO can train highly accurate and robust models for use in the toughest environments thanks to its proprietary datasets captured in real-world conditions.


Easy to Integrate

Where edge AI is fragmented and difficult to navigate, NVISO AI Apps are simple to use, develop, and deploy, with easy software portability across a variety of hardware architectures. They reduce the high barriers to entry into the edge AI space through cost-effective, standardized AI Apps that are future-proof and work optimally at the extreme edge.


Ethical and Trustworthy

AI systems need to be resilient and secure. They need to be safe, ensuring a fallback plan in case something goes wrong, as well as being accurate, reliable, and reproducible. Additionally, unfair bias must be avoided, as it could have multiple negative implications. NVISO adopts Trustworthy AI frameworks and state-of-the-art policies and practices to ensure its AI Apps are "fit-for-purpose".



Run on Any Device
Enterprise Grade Performance

Microcontroller Unit (MCU)

AI functionality is implemented on low-cost MCUs via inference engines that specifically target MCU design constraints. Configured for low-power operation, they continuously monitor for trigger events in sound, images, vibration, and other signals. In addition, the availability of AI-dedicated co-processors is allowing MCU suppliers to accelerate the deployment of machine-learning functions.

Central Processing Unit (CPU)

Once a trigger event is detected, a high-performance subsystem such as an Arm Cortex-A class CPU is engaged to examine and classify the event and determine the correct action. With its broad adoption, the Arm Cortex-A processor family powers some of the largest edge device categories in the world.

Graphic Processing Unit (GPU)

In systems where high AI workloads must run in real time and MCUs and CPUs lack sufficient processing power, embedded low-power GPUs can be used. GPUs contain hundreds or thousands of parallel cores designed for high-speed graphics rendering. They deliver high-performance processing, but typically have a larger footprint and higher power consumption than CPUs.
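The MCU-to-CPU handoff described above is essentially a two-stage cascade: a cheap, always-on detector gates a heavier classifier so that the expensive stage runs only on triggered frames. A minimal sketch of that pattern follows; the energy threshold and the stand-in classifier are illustrative assumptions, not any particular vendor's implementation.

```python
def low_power_trigger(samples, threshold=0.5):
    """Stage 1 (MCU-class): cheap energy check that runs continuously.
    Fires only when mean signal energy exceeds the threshold."""
    energy = sum(s * s for s in samples) / len(samples)
    return energy > threshold

def heavy_classifier(samples):
    """Stage 2 (CPU/GPU-class): expensive model, engaged only on a trigger.
    Stands in for a neural network; classifies by peak amplitude here."""
    peak = max(abs(s) for s in samples)
    return "event_a" if peak > 1.0 else "event_b"

def cascade(samples):
    # Most frames stop at stage 1, which keeps average power low.
    if not low_power_trigger(samples):
        return None  # no trigger: the heavy stage never runs
    return heavy_classifier(samples)

print(cascade([0.1, 0.0, -0.1]))  # -> None (quiet input, stage 2 skipped)
print(cascade([1.5, -1.2, 0.9]))  # -> event_a
```

The design choice is the one the section describes: average power is dominated by the always-on stage, so the heavy model's cost is paid only on the rare frames that actually contain an event.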

Want to learn more about our SDK?

SDK for HPC Devices




Automotive Partners

NVIDIA
BrainChip
Tobii
BCA
BON
Arm
CEVA
ZF