HUMAN BEHAVIOUR AI

INTERNET OF THINGS

NVISO empowers consumer electronics manufacturers to build innovative solutions that transform our lives using AI-powered Internet of Things (IoT) devices. Understand people and their behaviour to make consumer devices safe, secure, and personalized for humans.

AI-ENABLED

HUMAN MACHINE INTERFACES

NVISO’s IoT SDK provides a robust real-time human behaviour AI API and NVISO Neuro Models™ that are interoperable with and optimised for neuromorphic computing, together with flexible sensor integration and placement, delivering faster development cycles and time-to-value for software developers and integrators. It enables solutions that can sense, comprehend, and act upon human behaviour, including emotion recognition, gaze detection, distraction detection, drowsiness detection, gesture recognition, 3D face tracking, face analysis, facial recognition, object detection, and human pose estimation. Designed for real-world environments using edge computing, it uniquely targets deep learning for embedded systems.

NVISO delivers real-time perception and observation of people and objects in contextual situations, combined with reasoning about the semantics of human behaviour based on trusted scientific research. The NVISO IoT SDK is supported through a long-term maintenance agreement for multi-party implementation of tools for AI systems development and can be used with large-scale neuromorphic computing systems. When paired with neuromorphic chips, the NVISO IoT SDK can be used to build gaze detection systems, distraction and drowsiness detection systems, facial emotion recognition software, and a range of other neuromorphic computing applications where understanding human behaviour in real time is mission critical.
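
For illustration, the per-frame output such a pipeline exposes to application code could be modelled along the lines of the sketch below; the class and field names here are assumptions made for the example, not the actual NVISO IoT SDK schema.

```python
# Illustrative sketch only: names and fields are assumptions, not the NVISO IoT SDK API.
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple

@dataclass
class EmotionResult:
    dominant: str                               # e.g. "happy"
    probabilities: Dict[str, float]             # class -> probability, sums to ~1.0

@dataclass
class FaceResult:
    bbox: Tuple[int, int, int, int]             # x, y, width, height in pixels
    head_pose: Tuple[float, float, float]       # yaw, pitch, roll in degrees
    gaze_direction: Tuple[float, float, float]  # unit vector in camera coordinates
    drowsiness_score: float                     # 0.0 (alert) .. 1.0 (drowsy)
    emotion: Optional[EmotionResult] = None

@dataclass
class FrameResult:
    timestamp_ms: int
    faces: List[FaceResult] = field(default_factory=list)

def needs_alert(result: FrameResult, threshold: float = 0.8) -> bool:
    """Example application logic: alert when any face shows a high drowsiness score."""
    return any(f.drowsiness_score > threshold for f in result.faces)
```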

ACCURATE AND ROBUST

CNNs scale to learn from billions of examples, resulting in an extraordinary capacity to learn highly complex behaviours and thousands of categories. NVISO can train powerful, highly accurate, and robust models for use in the toughest environments thanks to its proprietary datasets captured in real-world environments.

EASY TO INTEGRATE

Where edge AI is fragmented and difficult to navigate, NVISO AI Apps are simple to use, develop, and deploy, with easy software portability across a variety of hardware and architectures. They reduce the high barriers to entry into the edge AI space through cost-effective, standardized AI Apps that are future proof and work optimally at the extreme edge.

ETHICAL AND TRUSTWORTHY

AI systems need to be resilient and secure. They need to be safe, ensuring a fallback plan in case something goes wrong, as well as accurate, reliable, and reproducible. Additionally, unfair bias must be avoided, as it could have multiple negative implications. NVISO adopts Trustworthy AI frameworks and state-of-the-art policies and practices to ensure its AI Apps are "fit-for-purpose".

NVISO AI APP CATALOGUE

APPLICATION BY TECHNOLOGY

PRESENCE AND IDENTITY

Supporting the interaction of consumer devices with their owners in their daily lives. Detect presence and identity through facial recognition software. Anticipate and react to owner needs by combining with observations from gaze detection software. Pay attention to and appropriately adjust to mood through observation by facial emotion recognition.

OBJECT DETECTION

With the rise of autonomous vehicles, smart video surveillance, facial detection, and various people-counting applications, demand for fast and accurate object detection systems is rising. With the advent of deep learning, object detection and recognition systems can not only recognise and classify every object in an image but also localise each one.
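
As a small, generic illustration of what localisation means in practice (not NVISO-specific code), detection quality is commonly measured by the intersection-over-union (IoU) between a predicted and a reference bounding box:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Overlap rectangle (zero area if the boxes do not intersect).
    inter_w = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    inter_h = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = inter_w * inter_h
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union > 0 else 0.0

# Example: a prediction that overlaps the reference box reasonably well.
print(iou((10, 10, 50, 50), (20, 20, 60, 60)))  # ~0.39
```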

FACIAL EXPRESSIONS

Facial emotion recognition software decodes facial behavior into seven primary emotional states, along with their respective intensity and probability of occurrence. Reading facial micro expressions allows emotion analytics to infer subtle changes in emotional expression over time and can be used to detect changes in mood and understand instantaneous reactions.
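
As a generic illustration of this decoding step (the class list below is a common convention in the field and the code is a sketch, not a description of NVISO's internal model), raw per-class scores can be turned into a probability for each of the seven states and a dominant expression:

```python
import math

# Seven primary emotional states commonly used in facial expression analysis.
EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

def softmax(scores):
    """Convert raw per-class scores into probabilities that sum to 1."""
    exps = [math.exp(s - max(scores)) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def decode_expression(raw_scores):
    """Return (dominant emotion, its probability) from seven raw model outputs."""
    probs = softmax(raw_scores)
    idx = max(range(len(probs)), key=probs.__getitem__)
    return EMOTIONS[idx], probs[idx]

# Example: scores that favour "happiness".
print(decode_expression([0.1, 0.0, 0.2, 2.5, 0.3, 0.8, 1.0]))
```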

REMOTE VITAL SIGNS

Remote vital sign monitoring using sensors such as time-of-flight cameras, mm-wave radar, and RGB cameras allows heart rate, breathing rate, fatigue levels, and advanced emotional states (anxiety, stress, and pain) to be measured remotely, with no physical contact with the sensor required. This information can be gathered to assist in decision making.
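
For heart rate from an ordinary RGB camera, one widely used approach is remote photoplethysmography (rPPG): the subtle periodic colour change of facial skin is extracted per frame and its dominant frequency gives the pulse. The sketch below illustrates only that last step and is an assumption for illustration, not NVISO's method:

```python
import numpy as np

def estimate_heart_rate(signal, fps):
    """Estimate heart rate (beats per minute) from an rPPG signal, e.g. the mean
    green-channel intensity of the face region sampled once per frame."""
    signal = np.asarray(signal, dtype=float)
    signal = signal - signal.mean()                  # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    # Restrict to the plausible human heart-rate band (0.7-4 Hz, i.e. 42-240 bpm).
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0

# Example: a synthetic 1.2 Hz (72 bpm) pulse sampled at 30 frames per second.
t = np.arange(0, 10, 1 / 30)
print(estimate_heart_rate(np.sin(2 * np.pi * 1.2 * t), fps=30))  # ~72
```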

GAZE DETECTION

Gaze detection software using deep learning performs real-time eye movement tracking, providing gaze direction as well as 3D and 2D coordinates of the eyes (pupils). Gaze detection systems are calibration-free and provide the basis of more complex eye tracking systems which analyse human processing of visual information, measuring attention, interest, and arousal.
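
A sketch of how downstream application code might consume gaze angles, for example to decide whether a user is attending to a device; the coordinate convention and threshold are assumptions for illustration:

```python
import math

def gaze_vector(pitch_deg, yaw_deg):
    """Convert gaze pitch/yaw angles (degrees) into a 3D unit direction vector.
    Assumed convention: x right, y down, z along the camera axis."""
    pitch = math.radians(pitch_deg)
    yaw = math.radians(yaw_deg)
    x = math.cos(pitch) * math.sin(yaw)
    y = math.sin(pitch)
    z = math.cos(pitch) * math.cos(yaw)
    return (x, y, z)

def is_attentive(pitch_deg, yaw_deg, tolerance_deg=10.0):
    """Treat the gaze as directed at the device when both angles are near zero."""
    return abs(pitch_deg) < tolerance_deg and abs(yaw_deg) < tolerance_deg

print(gaze_vector(0.0, 0.0))     # (0.0, 0.0, 1.0): looking straight ahead
print(is_attentive(3.0, -5.0))   # True
```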

EMOTION RECOGNITION

Artificial intelligence and human emotion recognition software allows non-verbal human communication to be detected and analysed by a computer. By reading emotional “expressions” from the face, tone of voice, and body and hand gestures more complex and advanced emotions can be detected. Custom emotion recognition development services and software solutions allow tailoring to specific use cases.

DISTRACTION AND DROWSINESS

Driver attention detection systems are designed to warn drivers when they are fatigued or at risk of becoming drowsy. Cars with drowsiness detection and occupant monitoring systems can monitor eye state, blink rates, head gestures, body movements, and signs of fatigue such as yawning to provide driver distraction and drowsiness detection.
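
Two widely used indicators behind such systems are PERCLOS (the fraction of time the eyes are mostly closed) and blink rate. The sketch below computes both from a per-frame eye-openness signal; the thresholds and toy data are illustrative assumptions:

```python
def perclos(eye_openness, closed_threshold=0.2):
    """PERCLOS: fraction of frames in a window where the eyes are mostly closed.
    `eye_openness` holds per-frame values in [0, 1] (1 = fully open)."""
    closed = sum(1 for v in eye_openness if v < closed_threshold)
    return closed / len(eye_openness) if eye_openness else 0.0

def blink_count(eye_openness, closed_threshold=0.2):
    """Count blinks as transitions from open to closed."""
    blinks, was_closed = 0, False
    for v in eye_openness:
        is_closed = v < closed_threshold
        if is_closed and not was_closed:
            blinks += 1
        was_closed = is_closed
    return blinks

window = [0.9, 0.85, 0.1, 0.05, 0.8, 0.9, 0.1, 0.9]  # toy per-frame eye openness
print(perclos(window), blink_count(window))          # 0.375, 2
```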

HUMAN POSE ESTIMATION

Human pose estimation provides multi-person 2D pose estimation for human body pose and shape estimation. Correspondingly, 3D pose estimation can be performed using reference 3D human body models, and by combining detection and tracking for human pose estimation in videos, advanced interactive human machine interfaces can be enabled.
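
As a small, generic example of how application code turns 2D keypoints into interpretable quantities (the keypoint format is an assumption), a joint angle can be computed directly from three body landmarks:

```python
import math

def joint_angle(a, b, c):
    """Angle (degrees) at joint `b` formed by 2D keypoints a-b-c,
    e.g. the elbow angle from shoulder, elbow, and wrist coordinates."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    cos_angle = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_angle))

# Example: shoulder, elbow, wrist keypoints forming a right angle at the elbow.
print(joint_angle((0, 0), (1, 0), (1, 1)))  # 90.0
```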

HAND GESTURE RECOGNITION

Deep learning for hand gesture recognition on skeletal data provides a fast, robust, and accurate method to detect hand gestures from a variety of camera sensors. Hand gesture recognition software will then classify both static and dynamic hand poses for interaction with autonomous systems for control and search tasks as well as emotional interactions.
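
A toy sketch of static-pose classification on skeletal landmarks; the joint naming and the open-palm/fist rule are illustrative assumptions, not NVISO's classifier:

```python
import math

def classify_static_pose(landmarks):
    """Toy static-pose classifier on 2D hand skeleton landmarks.
    `landmarks` maps joint names to (x, y) coordinates."""
    wrist = landmarks["wrist"]
    tips = [landmarks[k] for k in ("thumb_tip", "index_tip", "middle_tip",
                                   "ring_tip", "pinky_tip")]
    palm_size = math.dist(wrist, landmarks["middle_base"])
    mean_reach = sum(math.dist(wrist, t) for t in tips) / len(tips)
    # Extended fingers reach well beyond the palm; curled fingers do not.
    return "open_palm" if mean_reach > 1.5 * palm_size else "fist"

landmarks = {"wrist": (0, 0), "middle_base": (0, 1),
             "thumb_tip": (1, 2), "index_tip": (0.5, 2.5), "middle_tip": (0, 2.6),
             "ring_tip": (-0.5, 2.5), "pinky_tip": (-1, 2)}
print(classify_static_pose(landmarks))  # open_palm
```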

NVISO NEURO MODELS™

EMBEDDED DEEP LEARNING

NVISO Neuro Models™ are purpose-built for a new class of ultra-efficient AI processors designed for ultra-low-power deep learning on edge devices. Supporting a wide range of heterogeneous computing platforms spanning CPU, GPU, VPU, NPU, and neuromorphic computing, they reduce the high barriers to entry into the embedded AI space through cost-effective standardized AI Apps which work optimally on edge devices for a range of common human behaviour use cases (low power, on-device, without requiring an internet connection). NVISO Neuro Models™ use low- and mixed-precision activation and weight data types (1 to 8-bit) combined with state-of-the-art unstructured sparsity to reduce memory bandwidth and power consumption. Using proprietary compact network architectures, they can be fully sequential, suitable for ultra-low-power mixed-signal inference engines, and fully interoperable with neuromorphic processors as well as existing digital accelerators.
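
As a generic illustration of why low-precision weights reduce memory footprint and bandwidth (this simple symmetric scheme is an assumption for illustration, not NVISO's proprietary quantization), an 8-bit quantization of a float32 weight tensor looks like this:

```python
import numpy as np

def quantize_symmetric(weights, bits=8):
    """Symmetric linear quantization of a float weight tensor to signed integers."""
    qmax = 2 ** (bits - 1) - 1                          # e.g. 127 for 8-bit
    max_abs = float(np.abs(weights).max())
    scale = max_abs / qmax if max_abs > 0 else 1.0
    q = np.clip(np.round(weights / scale), -qmax, qmax).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

w = np.random.randn(256, 256).astype(np.float32)         # toy weight matrix
q, scale = quantize_symmetric(w, bits=8)
print("storage bytes:", w.nbytes, "->", q.nbytes)        # 4x smaller at 8-bit
print("max abs error:", float(np.abs(w - dequantize(q, scale)).max()))  # about scale/2
```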

PROPRIETARY DATA

NVISO Neuro Models™ use proprietary datasets and modern machine learning to learn from billions of examples resulting in an extraordinary capacity to learn highly complex behaviors and thousands of categories. Thanks to high quality datasets and low-cost access to powerful computing resources, NVISO can train powerful and highly accurate deep learning models.

RUN FASTER

NVISO Neuro Models™ store their knowledge in a single network, making them easy to deploy in any environment and able to adapt to the available hardware resources. There is no need to store any additional data when new data is analysed. This means that NVISO Human Behaviour AI can run on inexpensive devices with no internet connectivity, providing response times in milliseconds, not seconds.

RUN ANYWHERE

NVISO Neuro Models™ are scalable across heterogeneous AI hardware processors, being interoperable with and optimised for CPUs, GPUs, DSPs, NPUs, and the latest neuromorphic processors using in-memory computing, analog processing, and spiking neural networks. NVISO Neuro Models™ maximise hardware performance while providing seamless cross-platform support on any device.

RUN SMARTER

OPTIMISED FOR COST/POWER/PERFORMANCE

MICROCONTROLLER UNIT (MCU)

AI functionality is implemented in low-cost MCUs via inference engines specifically targeting MCU embedded design requirements, configured for low-power, continuous monitoring that discovers trigger events in sound, images, vibration, and more. In addition, the availability of AI-dedicated co-processors is allowing MCU suppliers to accelerate the deployment of machine learning functions.

CENTRAL PROCESSING UNIT (CPU)

Once a trigger event is detected, a high-performance subsystem such as an ARM Cortex A-Class CPU processor is engaged to examine and classify the event and determine the correct action. With its broad adoption, the ARM A-class processor powers some of the largest edge device categories in the world.
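
A minimal sketch of this trigger-then-classify pattern, where a cheap always-on check (MCU-class work) gates a heavier classification stage (CPU-class work); the energy threshold and the placeholder classifier are assumptions for illustration:

```python
import numpy as np

def low_power_trigger(window, energy_threshold=0.5):
    """Always-on stage: cheap energy check on a short signal window (MCU-class work)."""
    return float(np.mean(np.square(window))) > energy_threshold

def classify_event(window):
    """High-performance stage: placeholder for a full model inference (CPU-class work)."""
    return "event_of_interest" if float(window.max()) > 1.0 else "background"

def process_stream(windows):
    """Run the cheap trigger on every window; invoke the heavy stage only on hits."""
    for window in windows:
        if low_power_trigger(window):
            yield classify_event(window)

rng = np.random.default_rng(0)
stream = [rng.normal(0.0, scale, 160) for scale in (0.1, 0.1, 1.5, 0.1)]
print(list(process_stream(stream)))  # only the high-energy window reaches the classifier
```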

GRAPHIC PROCESSING UNIT (GPU)

In systems where high AI workloads must run in real time and MCUs and CPUs do not have enough processing power, embedded low-power GPUs can be used. GPUs contain hundreds or thousands of highly parallel cores designed for high-speed graphics rendering. They deliver high-performance processing, and typically have a larger footprint and higher power consumption than CPUs.

PANASONIC NICOBO

CASE STUDY

Companion robots are designed to interact naturally with humans, with the ability to perceive and respond to a user’s mental state, behaviours, and commands. With these capabilities they can assist in combating loneliness and detecting depression, along with helping to keep people healthy at home through the remote monitoring of vital signs. This is achieved through visual comprehension, and NVISO’s human behavioural analytics AI systems have the capabilities to deliver it. NVISO’s solutions do this through a range of AI Apps providing visual observation, perception, and semantic reasoning capabilities, the results of which can be used in identifying issues, in decision-making processes, and in supporting autonomous “human like” interactions.

CREATE NEW USER EXPERIENCES

SAFE AND SECURE

SMART LIVING

Smart Living aims to create environments of the future that improve people’s quality of life. Smart devices enabled by multimedia, artificial intelligence (AI), and the Internet of Things (IoT) are constantly emerging and becoming more popular. These devices will continue to interconnect and converge, making people's lives smarter and more convenient.

COMPANION ROBOTS

Supporting the interaction of companion robots with their owners in their daily lives. Detect presence and identity through facial recognition. Anticipate and react to owner needs by observing head and eye movements, by tracking head pose and gaze. Pay attention to and appropriately adjust to mood through observation of owner’s emotional state.

METAVERSE

The metaverse is a concept of a 3D digital world. To offer an immersive metaverse virtual experience, tech companies are incorporating cutting-edge technologies to power the 3D world’s development. Such technologies include blockchain, augmented reality (AR) and virtual reality (VR), 3D reconstruction, artificial intelligence (AI), and the Internet of Things (IoT).

EASY TO INTEGRATE

FASTER TIME-TO-MARKET

VERIFY USE CASE

Process captured data from a camera in real time on-device with our EVK. Quickly verify your use case using our 30-day Trial EVK License, and understand whether existing NVISO AI Apps meet the desired end-application performance.

BUILD PROTOTYPE

Fast-track your development with our x86 development platforms with APIs for software in-the-loop testing, evaluation, and creating demonstrators. Out-of-the-box software using our Developer SDK License allows you to get up and running in minutes not weeks.

DEPLOY TO PRODUCTION

Access the provided signals on-device or transmit them to other devices and then act on them to deliver innovative product features. Deploy AI-driven human machine interfaces by using our Production SDK License on production hardware.

REQUEST TRIAL EVALUATION KIT (EVK)

Talk with an NVISO AI expert to learn more about the trial Evaluation Kit (EVK) for neuromorphic computing devices.

HUMAN CENTRIC

AI SOLUTIONS

CONSUMER ROBOTS

Human–robot interaction plays a crucial role in the burgeoning market for intelligent personal-service and entertainment robots.

AUTOMOTIVE INTERIOR SENSING

Next generation mobility requires AI, from self-driving cars to new ways to engage customers. Build and deploy robust AI-powered interior monitoring systems.

GAMING AND AVATARS

The gaming industry (computer, console or mobile) is about to make extensive use of the camera input to deliver entertainment value.

PERFORMANCE THAT SCALES

ANY SENSOR, ANY PLACEMENT

The interior of a vehicle is an unpredictable environment. Typical constraints range from environmental unpredictability while driving to drastic changes in ambient temperature. These factors drive the need for systems to include algorithms capable of handling tough environmental conditions, and the choice of camera placement is critical to the robust operation of AI systems. Another factor that adds to the system complexity is accommodating the cosmetic design of the vehicle. Automotive designers constantly try to introduce new design concepts while also maximizing driver comfort features. These constraints often require the position and location of the camera to change. NVISO addresses these challenges by supporting flexible camera positioning anywhere between the A-pillar and the center stack, which is critical to large-scale adoption.

NVISO MOBILE EVK

Please complete this form to receive purchasing instructions via email for the NVISO Mobile EVK.

CONTACT US

REQUEST SDK

DOWNLOAD WHITEPAPER