HUMAN BEHAVIOUR AI

MOBILE PHONES

NVISO’s Human Behaviour AI SDK allows application developers to build innovative solutions to transform our lives using AI on mobile phones. Understand people and their behavior to make autonomous devices safe, secure, and personalized for humans.

AI-ENABLED

HUMAN MACHINE INTERFACES

NVISO’s Mobile SDK provides a robust real-time human behaviour AI API, NVISO Neuro Models™ interoperable with and optimised for neuromorphic computing, and flexible sensor integration and placement, delivering faster development cycles and time-to-value for software developers and integrators. It enables solutions that can sense, comprehend, and act upon human behavior, including emotion recognition, gaze detection, distraction detection, drowsiness detection, gesture recognition, 3D face tracking, face analysis, facial recognition, object detection, and human pose estimation. Designed for real-world environments using edge computing, it uniquely targets deep learning for embedded systems.
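
As an illustration of this sense, comprehend, and act pattern, the sketch below (in Kotlin) wires several AI Apps into a single per-frame pipeline. All names here (Frame, AiApp, Pipeline) are hypothetical stand-ins for the purpose of the sketch, not the actual NVISO Mobile SDK API.

```kotlin
// Hypothetical sketch of a per-frame Human Behaviour AI pipeline.
// None of these names come from the real NVISO Mobile SDK; they only
// illustrate the sense-comprehend-act pattern described above.

data class Frame(val width: Int, val height: Int, val rgba: ByteArray)

data class Detection(val label: String, val confidence: Float)

// Each "AI App" consumes a camera frame and emits labelled signals.
fun interface AiApp {
    fun process(frame: Frame): List<Detection>
}

class Pipeline(private val apps: List<AiApp>) {
    // Run every configured app on the same frame and merge the signals.
    fun onFrame(frame: Frame): List<Detection> = apps.flatMap { it.process(frame) }
}

fun main() {
    // Stub apps standing in for emotion, gaze, and drowsiness models.
    val emotion = AiApp { listOf(Detection("emotion:happy", 0.82f)) }
    val gaze = AiApp { listOf(Detection("gaze:on_screen", 0.95f)) }
    val drowsiness = AiApp { listOf(Detection("drowsy", 0.12f)) }

    val pipeline = Pipeline(listOf(emotion, gaze, drowsiness))
    val frame = Frame(640, 480, ByteArray(640 * 480 * 4))

    // Act on the fused signals, e.g. log them or raise an alert downstream.
    pipeline.onFrame(frame).forEach { println("${it.label} -> ${it.confidence}") }
}
```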

NVISO delivers real-time perception and observation of people and objects in contextual situations, combined with the reasoning and semantics of human behavior based on trusted scientific research. The NVISO Mobile SDK is supported through a long-term maintenance agreement for multi-party implementation of tools for AI systems development and can be used with large-scale neuromorphic computing systems. When used with neuromorphic chips, the NVISO Mobile SDK can be used to build gaze detection systems, distraction and drowsiness detection systems, facial emotion recognition software, and a range of other applications of neuromorphic computing where understanding human behaviour in real-time is mission-critical.

ACCURATE AND ROBUST

CNNs scale to learn from billions of examples, resulting in an extraordinary capacity to learn highly complex behaviors and thousands of categories. NVISO can train powerful, highly accurate, and robust models for use in the toughest environments thanks to its proprietary datasets captured in real-world environments.

EASY TO INTEGRATE

Where AI is fragmented and difficult to navigate at the edge, NVISO AI Apps are simple to use, develop, and deploy, with easy software portability across a variety of hardware and architectures. They reduce the high barriers to entry into the edge AI space through cost-effective, standardized AI Apps that are future-proof and work optimally at the extreme edge.

ETHICAL AND TRUSTWORTHY

AI systems need to be resilient and secure. They need to be safe, ensuring a fallback plan in case something goes wrong, as well as being accurate, reliable, and reproducible. Additionally, unfair bias must be avoided, as it could have multiple negative implications. NVISO adopts Trustworthy AI frameworks and state-of-the-art policies and practices to ensure its AI Apps are "fit-for-purpose".

NVISO AI APP CATALOGUE

DETECT AND ANALYSE

PRESENCE AND IDENTITY

Supporting the interaction of consumer devices with their owners in their daily lives. Detect presence and identity through facial recognition software. Anticipate and react to owner needs by combining with observations from gaze detection software. Pay attention to and appropriately adjust to mood through observation by facial emotion recognition.

OBJECT DETECTION

With the rise of autonomous vehicles, smart video surveillance, facial detection, and various people counting applications, fast and accurate object detection systems are rising in demand. With the advent of deep learning, object detection and recognition systems can not only recognise and classify every object in an image but also localise each one.
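
As a minimal sketch of what such output looks like in practice (illustrative only, not the SDK's actual schema), each detection pairs a class label and confidence with a bounding box that localises the object in the image:

```kotlin
// Illustrative only: a generic detector output, not the SDK's real schema.
data class BoundingBox(val x: Float, val y: Float, val width: Float, val height: Float)
data class DetectedObject(val label: String, val score: Float, val box: BoundingBox)

// Keep only confident detections and report where each object sits in the image.
fun report(detections: List<DetectedObject>, threshold: Float = 0.5f) {
    detections.filter { it.score >= threshold }
        .forEach { println("${it.label} (${it.score}) at ${it.box}") }
}

fun main() {
    report(
        listOf(
            DetectedObject("person", 0.91f, BoundingBox(120f, 80f, 60f, 150f)),
            DetectedObject("car", 0.34f, BoundingBox(300f, 200f, 120f, 80f)) // filtered out
        )
    )
}
```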

FACIAL EXPRESSIONS

Facial emotion recognition software decodes facial behavior into seven primary emotional states, along with their respective intensity and probability of occurrence. Reading facial micro expressions allows emotion analytics to infer subtle changes in emotional expression over time and can be used to detect changes in mood and understand instantaneous reactions.
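
A hedged sketch of how such a per-frame reading might be represented and tracked over time is shown below; the specific label set (the six basic emotions plus neutral) and field names are assumptions for illustration, not the SDK's actual output format.

```kotlin
// Sketch of a per-frame emotion reading: seven states, each with a probability.
// The exact label set and field names are assumptions, not the SDK's schema.
enum class Emotion { HAPPINESS, SADNESS, ANGER, FEAR, SURPRISE, DISGUST, NEUTRAL }

data class EmotionReading(val probabilities: Map<Emotion, Float>) {
    // The dominant expression is simply the most probable state.
    val dominant: Emotion get() = probabilities.maxByOrNull { it.value }!!.key
}

// Track changes of the dominant expression over time to spot shifts in mood.
fun moodShift(previous: EmotionReading, current: EmotionReading): Boolean =
    previous.dominant != current.dominant

fun main() {
    val before = EmotionReading(mapOf(Emotion.NEUTRAL to 0.7f, Emotion.HAPPINESS to 0.2f, Emotion.SADNESS to 0.1f))
    val after = EmotionReading(mapOf(Emotion.HAPPINESS to 0.6f, Emotion.NEUTRAL to 0.3f, Emotion.SURPRISE to 0.1f))
    println("Mood shifted: ${moodShift(before, after)}") // true
}
```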

REMOTE VITAL SIGNS

Remote vital sign monitoring using sensors such as time-of-flight cameras, mm-wave radar, and RGB cameras allows heart rate, breathing rate, fatigue levels, and advanced emotional states (anxiety, stress, and pain) to be measured remotely, with no physical contact with the sensor required. This information can be gathered to assist in decision making.
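
As a rough, hedged illustration of the general principle (not NVISO's actual method), the toy estimator below counts peaks in a camera-derived pulse signal over a time window to approximate heart rate:

```kotlin
import kotlin.math.PI
import kotlin.math.sin

// Toy heart-rate estimator: take a photoplethysmography-like intensity signal
// extracted from the camera and count its peaks over a time window. This is an
// illustration of the idea only, not NVISO's method.
fun estimateHeartRateBpm(signal: DoubleArray, sampleRateHz: Double): Double {
    var peaks = 0
    for (i in 1 until signal.size - 1) {
        if (signal[i] > signal[i - 1] && signal[i] > signal[i + 1]) peaks++
    }
    val durationSeconds = signal.size / sampleRateHz
    return peaks * 60.0 / durationSeconds
}

fun main() {
    // Synthetic 1.2 Hz (72 bpm) pulse sampled at 30 fps for 10 seconds.
    val fps = 30.0
    val signal = DoubleArray(300) { i -> sin(2 * PI * 1.2 * i / fps) }
    println("Estimated heart rate: ${estimateHeartRateBpm(signal, fps)} bpm")
}
```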

GAZE DETECTION

Gaze detection software using deep learning performs real-time eye movement tracking, providing gaze direction as well as 3D and 2D coordinates of the eyes (pupils). Gaze detection systems are calibration-free and provide the basis of more complex eye tracking systems which analyse human processing of visual information, measuring attention, interest, and arousal.
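
A minimal sketch of what a calibration-free gaze estimate can look like, and how it might be acted upon, is shown below; the structure, field names, and camera-axis convention are assumptions for illustration only.

```kotlin
import kotlin.math.acos
import kotlin.math.sqrt

// Illustrative output of a gaze estimator: a 3D gaze direction plus eye positions.
// Field names and conventions here are assumptions for the sketch only.
data class Vec3(val x: Float, val y: Float, val z: Float)

data class GazeEstimate(
    val direction: Vec3,      // gaze direction vector in camera space
    val leftPupil3d: Vec3,    // pupil centres in 3D camera coordinates (metres)
    val rightPupil3d: Vec3
)

// Angle (radians) between the gaze ray and the camera's forward axis (0, 0, -1):
// a small angle can be treated as "looking at the device".
fun gazeAngleToCamera(gaze: GazeEstimate): Float {
    val d = gaze.direction
    val norm = sqrt(d.x * d.x + d.y * d.y + d.z * d.z)
    return acos(-d.z / norm)
}

fun main() {
    val estimate = GazeEstimate(Vec3(0.05f, -0.02f, -0.99f), Vec3(-0.03f, 0f, 0.4f), Vec3(0.03f, 0f, 0.4f))
    println("Gaze angle to camera: ${Math.toDegrees(gazeAngleToCamera(estimate).toDouble())} degrees")
}
```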

EMOTION RECOGNITION

Artificial intelligence and human emotion recognition software allows non-verbal human communication to be detected and analysed by a computer. By reading emotional “expressions” from the face, tone of voice, and body and hand gestures more complex and advanced emotions can be detected. Custom emotion recognition development services and software solutions allow tailoring to specific use cases.

DISTRACTION AND DROWSINESS

Driver attention detection systems are designed to warn you when you are fatigued or at risk of becoming drowsy. Cars with drowsiness detection and occupant monitoring systems can monitor eye state, blink rates, head gestures, body movements, and signs of fatigue such as yawning to provide driver distraction and drowsiness detection.
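
The sketch below shows one common way such signals can be combined, a PERCLOS-style rolling measure of eye closure; the window size and threshold are illustrative assumptions, not the behaviour of NVISO's software.

```kotlin
// Toy drowsiness heuristic in the spirit of the description above (PERCLOS-style:
// the fraction of time the eyes are closed over a rolling window). Thresholds and
// signal names are assumptions, not the SDK's actual logic.
class DrowsinessMonitor(private val windowSize: Int = 90, private val threshold: Double = 0.3) {
    private val eyeClosedHistory = ArrayDeque<Boolean>()

    // Feed one per-frame observation: whether both eyes are currently closed.
    // Returns true when the closed-eye ratio over a full window exceeds the threshold.
    fun onFrame(eyesClosed: Boolean): Boolean {
        eyeClosedHistory.addLast(eyesClosed)
        if (eyeClosedHistory.size > windowSize) eyeClosedHistory.removeFirst()
        val perclos = eyeClosedHistory.count { it }.toDouble() / eyeClosedHistory.size
        return eyeClosedHistory.size == windowSize && perclos > threshold
    }
}

fun main() {
    val monitor = DrowsinessMonitor()
    var alert = false
    // Simulate 90 frames where the eyes are closed 40% of the time.
    repeat(90) { i -> alert = monitor.onFrame(i % 10 < 4) }
    println("Drowsiness alert: $alert") // true, since 40% > 30% over the window
}
```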

HUMAN POSE ESTIMATION

Human pose estimation provides multi-person 2D pose estimation for human body pose and shape estimation. Correspondingly, 3D pose estimation can be performed using reference 3D human body models, and by combining detection and tracking for human pose estimation in videos, advanced interactive human machine interfaces can be enabled.

HAND GESTURE RECOGNITION

Deep learning for hand gesture recognition on skeletal data provides a fast, robust, and accurate method to detect hand gestures from a variety of camera sensors. Hand gesture recognition software will then classify both static and dynamic hand poses for interaction with autonomous systems for control and search tasks as well as emotional interactions.

NVISO NEURO MODELS™

ULTRA-EFFICIENT DEEP LEARNING AT THE EDGE

NVISO Neuro Models™ are purpose-built for a new class of ultra-efficient machine learning processors designed for ultra-low power edge devices. Supporting a wide range of heterogeneous computing platforms, ranging from CPUs, GPUs, DSPs, and NPUs to neuromorphic computing, they reduce the high barriers to entry into the AI space through cost-effective standardized AI Apps which work optimally at the extreme edge (low power, on-device, without requiring an internet connection). NVISO uses low and mixed precision activation and weight data types (1 to 8-bit) combined with state-of-the-art unstructured sparsity to reduce memory bandwidth and power consumption. Proprietary compact network architectures can be fully sequential, suitable for ultra-low power mixed signal inference engines, and fully interoperable with both GPUs and neuromorphic processors.
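
As a hedged illustration of the general low-precision idea (not NVISO's proprietary quantisation scheme), the sketch below shows symmetric 8-bit weight quantisation, which trades a small rounding error for a 4x reduction in weight storage compared to 32-bit floats:

```kotlin
import kotlin.math.abs
import kotlin.math.roundToInt

// Minimal sketch of symmetric 8-bit weight quantisation, one of the general
// techniques behind low-precision inference described above. It illustrates the
// memory/bandwidth saving idea only.
fun quantiseInt8(weights: FloatArray): Pair<ByteArray, Float> {
    val maxAbs = weights.maxOf { abs(it) }.takeIf { it > 0f } ?: 1f
    val scale = maxAbs / 127f
    val q = ByteArray(weights.size) { i ->
        (weights[i] / scale).roundToInt().coerceIn(-127, 127).toByte()
    }
    return q to scale
}

fun dequantise(q: ByteArray, scale: Float): FloatArray =
    FloatArray(q.size) { i -> q[i] * scale }

fun main() {
    val weights = floatArrayOf(0.12f, -0.5f, 0.0f, 0.33f)
    val (quantised, scale) = quantiseInt8(weights)
    // 4 bytes instead of 16, at the cost of a small rounding error.
    println(dequantise(quantised, scale).joinToString())
}
```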

PROPRIETARY DATA

NVISO Neuro Models™ use proprietary datasets and modern machine learning to learn from billions of examples resulting in an extraordinary capacity to learn highly complex behaviors and thousands of categories. Thanks to high quality datasets and low-cost access to powerful computing resources, NVISO can train powerful and highly accurate deep learning models.

RUN FASTER

NVISO Neuro Models™ store their knowledge in a single network, making them easy to deploy in any environment and able to adapt to the available hardware resources. There is no need to store any additional data when new data is analysed. This means that NVISO Human Behaviour AI can run on inexpensive devices with no internet connectivity, providing response times in milliseconds not seconds.

RUN ANYWHERE

NVISO Neuro Models™ are scalable across heterogeneous AI hardware processors being interoperable and optimised for CPUs, GPUs, DSPs, NPUs, and the latest neuromorphic processors using in-memory computing, analog processing, and spiking neural networks. NVISO Neuro Models™ maximise hardware performance while providing seamless cross-platform support on any device.

RUN SMARTER

OPTIMISED FOR COST/POWER/PERFORMANCE

MICROCONTROLLER UNIT (MCU)

AI functionality is implemented in low-cost MCUs via inference engines specifically targeting MCU embedded design requirements, configured for low-power operation and continuous monitoring to discover trigger events in sound, images, vibration, and more. In addition, the availability of AI-dedicated co-processors is allowing MCU suppliers to accelerate the deployment of machine learning functions.

CENTRAL PROCESSING UNIT (CPU)

Once a trigger event is detected, a high-performance subsystem such as an ARM Cortex A-Class CPU processor is engaged to examine and classify the event and determine the correct action. With its broad adoption, the ARM A-class processor powers some of the largest edge device categories in the world.
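
One way to picture the division of labour between the always-on MCU stage and the heavier CPU stage is the control-flow sketch below; the interfaces and the energy-threshold trigger are purely illustrative assumptions.

```kotlin
// Sketch of the two-stage pattern described above: a low-power "always on"
// detector watches for trigger events, and only then is the heavier classifier
// invoked. The interfaces are hypothetical, purely to show the control flow.
fun interface TriggerDetector { fun isTriggered(sample: FloatArray): Boolean }
fun interface HeavyClassifier { fun classify(sample: FloatArray): String }

class TwoStagePipeline(
    private val alwaysOn: TriggerDetector,   // cheap model, e.g. on an MCU
    private val onDemand: HeavyClassifier    // larger model, e.g. on a CPU/GPU
) {
    fun process(sample: FloatArray): String? =
        if (alwaysOn.isTriggered(sample)) onDemand.classify(sample) else null
}

fun main() {
    // Toy detector: trigger when the signal energy crosses a threshold.
    val detector = TriggerDetector { s -> s.fold(0f) { acc, v -> acc + v * v } > 1.0f }
    val classifier = HeavyClassifier { "person_detected" }

    val pipeline = TwoStagePipeline(detector, classifier)
    println(pipeline.process(floatArrayOf(0.1f, 0.1f)))   // null: no trigger, classifier stays idle
    println(pipeline.process(floatArrayOf(1.2f, 0.9f)))   // person_detected
}
```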

GRAPHIC PROCESSING UNIT (GPU)

In systems where high AI workloads must run in real-time and MCUs and CPUs do not have enough processing power, embedded low-power GPUs can be used. GPUs consist of hundreds or thousands of highly parallel cores designed for high-speed graphics rendering. They deliver high-performance processing, and typically have a larger footprint and higher power consumption than CPUs.

CASE STUDY

UNDERSTANDING NON-COMMUNICATIVE PATIENTS

NVISO’s business partner PainChek is using NVISO’s advanced emotion recognition technology to develop a medical device that will use a smartphone to visually analyze facial muscle movements, assessing levels of pain from non-verbal cues and caregiver observations, and updating medical records in the cloud. Today, accurate pain assessment requires expert support from pain specialists. PainChek wanted to augment caregivers’ ability to manage pain with a simpler, more objective assessment method. NVISO’s Human Behaviour SDK provides the perfect platform of choice for innovators like PainChek to build next-generation Smart Health solutions. For more information, see the case study with IBM | Transforms pain management for vulnerable patients with innovative facial muscle movement analytics.

CREATE NEW PATIENT EXPERIENCES

SAFE AND SECURE

SMART HEALTH

Modern AI can transform the healthcare industry by analyzing vast amounts of data with incredible accuracy. Build and deploy secure and robust AI-powered medical devices. The NVISO Human Behaviour SDK includes building blocks and tools that accelerate sensor fusion developments requiring the increased perception and interaction features enabled by AI, including vital sign detection, eye tracking, and advanced emotions. Monitoring of both patient and staff identities and activities throughout the patient journey can lead to significantly improved outcomes and efficiencies as well as enhanced security.

ADVANCED EMOTIONS

Using AI-powered visual observation for the measurement of vital signs, the assessment of advanced emotional states such as anxiety, stress, and pain in non-communicative and pre-verbal patients, and the assessment of mood and fatigue levels, information can be gathered to assist in decision making, leading to improved patient outcomes whilst delivering increased efficiencies. It thereby lets you look at the cognitive and emotive aspects of communication and patient state, providing you with actionable insights to make smarter decisions.

PATIENT ASSESSMENT

Monitoring of patients throughout their care experience, integrated with hospital information systems, can lead to a safer, more secure, smoother, and more patient-centric experience with faster and improved outcomes. Highly accurate biometric analysis helps reliably identify patients to ensure patient security throughout the treatment journey, along with the observation of vital signs, management of stress when under treatment, and observation of overall mood. Further, observations of body movement can help identify medical conditions along with emergencies such as collapse.

EASY TO INTEGRATE

FASTER TIME-TO-MARKET

VERIFY USE CASE

Process data captured from a camera sensor in real-time on-device with our EVK. Quickly verify your use case using our 30-day Trial EVK License and understand whether existing NVISO AI Apps are suitable for the desired end-application performance.

BUILD PROTOTYPE

Fast-track your development with our x86 development platforms, which provide APIs for software-in-the-loop testing, evaluation, and creating demonstrators. Out-of-the-box software using our Developer SDK License allows you to get up and running in minutes, not weeks.

DEPLOY TO PRODUCTION

Access the provided signals on-device or transmit them to other devices and then act on them to deliver innovative product features. Deploy AI-driven human machine interfaces by using our Production SDK License on production hardware.

REQUEST TRIAL EVALUATION KIT (EVK)

Talk with an NVISO AI expert to learn more about the trial Evaluation Kit (EVK) for neuromorphic computing devices.

HUMAN CENTRIC

AI SOLUTIONS

CONSUMER ROBOTS

Human–robot interaction plays a crucial role in the burgeoning market for intelligent personal-service and entertainment robots.

AUTOMOTIVE INTERIOR SENSING

Next generation mobility requires AI, from self-driving cars to new ways to engage customers. Build and deploy robust AI-powered interior monitoring systems.

GAMING AND AVATARS

The gaming industry (computer, console or mobile) is about to make extensive use of the camera input to deliver entertainment value.

NVISO MOBILE EVK

Please complete this form to receive purchasing instructions via email for the NVISO Mobile EVK.

REQUEST SDK

DOWNLOAD WHITEPAPER