Get ready for in-cabin monitoring Euro NCAP requirements

Anyverse™ · Nov 17, 2022

If you are an OEM, or are involved in any stage of automotive interior sensing system development, you are surely familiar with the Euro NCAP* requirements for assessing the suitability and safety of new in-cabin monitoring systems. And if you are not, do not worry: we will break them down throughout this article.

*This is an informative article. If you want to know the regulation or the assessment and scoring criteria in detail, visit the Euro NCAP official website.


Euro NCAP has taken the evaluation of Driver State Monitoring systems (DMS) very seriously and will require OEMs to provide a detailed technical assessment of the following:

  • Sensing: providing evidence that the sensing system can sense a wide variety of drivers and operate in a wide range of circumstances.
  • Driver state: demonstrating which elements of distraction, drowsiness, and driver unresponsiveness the system can identify.
  • Vehicle response: detailing the vehicle’s response to a given driver state.

To move forward in the assessment process, the OEM must demonstrate that its interior monitoring system meets the general requirements, e.g. “The system needs to be default ON at the start of every journey” or “Deactivation of the system should not be possible with a momentary single push on a button”. Then come the requirements that will cause OEMs more than one headache: the noise variable requirements.

Build a robust in-cabin monitoring system to meet Euro NCAP requirements

Craft pixel-accurate synthetic data to train and validate your DMS & OMS with Anyverse Synthetic Data platform for in-cabin monitoring AI.


Noise variable requirements

1. Drivers

OEMs must provide a sensing system that is robust and covers a wide variety of the driver population. The sensing system must be verified using a diverse population covering at least the following ranges and elements*:

  • Age range: Youthful (16–18) to aged (≥80)
  • Sex: all
  • Stature: AF05 to AM95
  • Skin Complexion: Fitzpatrick Skin Type 1–6
  • Eyelid aperture: From 6.0 mm up to 14.0 mm

The OEM must demonstrate that system performance does not deviate strongly across different noise variables (e.g. gender, age, ethnicity). The OEM’s supporting evidence may be generated by sampling different noise variable combinations, as in the sketch below.
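To give a concrete picture of what sampling noise variable combinations can look like in practice, here is a minimal Python sketch (our own illustration, not part of the protocol) that enumerates a hypothetical test matrix over the ranges listed above. The variable names and the discretization of each range are assumptions for the example.

```python
# Hypothetical sketch (not part of the protocol): enumerate driver noise-variable
# combinations for a validation plan. The variables mirror the Euro NCAP list above;
# the discretization of each range and all names are assumptions for illustration.
from itertools import product

NOISE_VARIABLES = {
    "age_band": ["16-18", "19-39", "40-59", "60-79", ">=80"],
    "sex": ["female", "male"],
    "stature": ["AF05", "AM50", "AM95"],      # 5th-percentile female to 95th-percentile male
    "fitzpatrick_skin_type": [1, 2, 3, 4, 5, 6],
    "eyelid_aperture_mm": [6.0, 10.0, 14.0],  # sampled within the 6.0-14.0 mm range
}

def validation_matrix(variables=NOISE_VARIABLES):
    """Yield every combination of noise variables as one test-case dict."""
    keys = list(variables)
    for combo in product(*(variables[k] for k in keys)):
        yield dict(zip(keys, combo))

if __name__ == "__main__":
    cases = list(validation_matrix())
    print(f"{len(cases)} driver combinations to cover, e.g. {cases[0]}")
```

In practice you would sample or stratify this matrix rather than record every cell, but it makes the coverage requirement explicit.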

2. Occlusion

This area covers the variables seen in real-world driving that may occlude the driver’s facial features from the in-cabin monitoring system. A robust system must not be degraded by the most common occlusion variables. Again, OEMs must demonstrate that system performance is not degraded across the following ranges and elements*:

  • Lighting: Daytime (100,000 lux) to nighttime (1 lux), measured outside the vehicle
  • Eyewear: Clear glasses and sunglasses with >70% transmittance, including those with thick rims
  • Facial hair: Short facial hair (<20 mm in length)

The system’s ability to judge its own performance is evaluated as well. Since several variables may occlude a driver’s face and prevent even a well-designed system from maintaining a consistent level of performance, a robust system should be able to recognize when its performance is degraded.

Euro NCAP will assess that, when the driver monitoring system is faced with the following ranges and elements, its performance is either not degraded or, if it is degraded, the driver is informed within 10 s of the occlusion being present with visual and/or audible information (see the sketch after this list).

  • Hand on wheel: One hand on wheel at 12 o’clock position
  • Facial occlusion: Face mask, hats, long head hair fringe obscuring eyes
  • Eyewear: Sunglasses with <15% transmittance
  • Eyelash makeup: Thick eyelash makeup
  • Facial hair: Long facial hair (>150mm in length)
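As a concrete illustration of that 10-second rule, here is a minimal Python sketch (our own illustration, not taken from the protocol) of a monitor that confirms a persistent occlusion and flags when the visual and/or audible warning should be issued. The 2-second confirmation window and all names are assumptions for the example.

```python
# Minimal sketch, not from the protocol: once an occlusion degrades tracking,
# confirm it over a short window (2 s here, an assumed debounce) and warn the
# driver, which must happen within 10 s of the occlusion first appearing.
CONFIRMATION_WINDOW_S = 2.0   # assumed debounce to avoid false alarms
WARNING_DEADLINE_S = 10.0     # Euro NCAP: inform the driver within 10 s

class OcclusionMonitor:
    def __init__(self):
        self.degraded_since = None  # time when degradation started, or None
        self.warned = False

    def update(self, t_s, tracking_degraded):
        """Call once per frame with the current time in seconds; returns True
        when the visual/audible warning should be issued."""
        if not tracking_degraded:
            self.degraded_since, self.warned = None, False
            return False
        if self.degraded_since is None:
            self.degraded_since = t_s
        elapsed = t_s - self.degraded_since
        if not self.warned and elapsed >= CONFIRMATION_WINDOW_S:
            assert elapsed <= WARNING_DEADLINE_S, "warning issued too late"
            self.warned = True
            return True  # trigger the visual and/or audible driver information here
        return False
```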

3. Driver behaviors

Not surprisingly, driver behaviors have the potential to critically affect the system’s performance. Whether and how the in-cabin monitoring system’s performance is affected by the following behaviors will be assessed as well:

  • Eating
  • Talking
  • Laughing
  • Singing
  • Smoking / Vaping
  • Eye scratching/rubbing
  • Sneezing

4. Detection of driver state

Euro NCAP will evaluate the following driver states: Distraction, Fatigue, and Unresponsive Driver.

Distraction

Several types of distraction have been considered: long distraction, short multiple distractions, and phone usage. To detect them, the system will have to be able to identify head movement, eye movement, and body-lean looking behaviors.

Long distraction: a long distraction is considered to be a single long-duration driver gaze away from the forward road to one consistent location for ≥3 seconds.

Short multiple distractions: A short distraction is considered to be repeated glances away from the forward road view either repeated towards one location, or to multiple different locations for a period long enough for the driver to fully interpret the road situation.

Phone use: Phone use is considered to be a specific type of short distraction where the driver’s repeated gaze is towards their mobile phone.
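To make these timing definitions tangible, here is a rough Python sketch of how the three distraction types could be told apart from recent off-road glance durations. The 3-second long-distraction threshold comes from the definition above; the accumulation rule for short multiple distractions and all names are assumptions for illustration.

```python
# Illustrative sketch of the distraction definitions above; not the protocol's algorithm.
LONG_DISTRACTION_S = 3.0  # single gaze away from the forward road, one location, >= 3 s

def classify_distraction(off_road_glances_s, looking_at_phone=False):
    """off_road_glances_s: durations (s) of recent consecutive glances away from
    the forward road view. Returns a coarse distraction label."""
    if any(d >= LONG_DISTRACTION_S for d in off_road_glances_s):
        return "long_distraction"
    # Repeated short glances: cumulative off-road time so large that the driver can
    # no longer fully interpret the road situation (the threshold here is assumed).
    if len(off_road_glances_s) >= 2 and sum(off_road_glances_s) >= LONG_DISTRACTION_S:
        return "phone_use" if looking_at_phone else "short_multiple_distraction"
    return "attentive"
```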

Fatigue

Fatigue typically builds up over long periods of driving. Euro NCAP has defined three types of fatigue:

  • Drowsiness
  • Microsleep: eye closure of <3 seconds
  • Sleep: continued eye closure of >3 seconds

Unresponsive driver

According to Euro NCAP, an unresponsive driver is one who either does not return their gaze to the forward road view within 3 seconds of an inattention warning being issued, or whose gaze has been away from the forward road view, or whose eyes have been closed, for ≥6 seconds.
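The fatigue and unresponsive-driver criteria above boil down to timing thresholds on eye closure and off-road gaze. The minimal Python sketch below (an illustration, not the protocol’s algorithm) applies the 3-second and 6-second figures quoted above; drowsiness scoring is left out, and the function signature is an assumption.

```python
# Timing thresholds quoted in the fatigue and unresponsive-driver definitions above.
SLEEP_EYE_CLOSURE_S = 3.0           # continued eye closure > 3 s -> sleep
UNRESPONSIVE_AFTER_WARNING_S = 3.0  # no gaze return within 3 s of an inattention warning
UNRESPONSIVE_TOTAL_S = 6.0          # gaze away or eyes closed for >= 6 s

def classify_driver_state(eye_closure_s, off_road_gaze_s, s_since_warning=None):
    """eye_closure_s: current continuous eye-closure duration (s).
    off_road_gaze_s: current continuous off-road gaze duration (s).
    s_since_warning: seconds since an inattention warning with the gaze still
    off the road, or None if no warning is active."""
    if (eye_closure_s >= UNRESPONSIVE_TOTAL_S
            or off_road_gaze_s >= UNRESPONSIVE_TOTAL_S
            or (s_since_warning is not None
                and s_since_warning >= UNRESPONSIVE_AFTER_WARNING_S)):
        return "unresponsive"
    if eye_closure_s > SLEEP_EYE_CLOSURE_S:
        return "sleep"
    if eye_closure_s > 0:
        return "microsleep"  # closure shorter than ~3 s; blink filtering assumed upstream
    return "alert"
```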

Wondering how you are going to prepare your in-cabin monitoring system to meet the Euro NCAP requirements?

Meet Euro NCAP requirements, avoid privacy issues, and customize variability with Anyverse Synthetic Data platform for in-cabin monitoring AI.


Source

ASSESSMENT PROTOCOL — SAFETY ASSIST — SAFE DRIVING — Implementation 2023. (Version 10.1 — July 2022)

About Anyverse™

Anyverse™ helps you continuously improve your deep learning perception models to reduce your system’s time to market by applying new software 2.0 processes. Our synthetic data production platform allows us to provide high-fidelity, accurate, and balanced datasets. Along with a data-driven iterative process, we can help you reach the required model performance.

With Anyverse™, you can accurately simulate any camera sensor and decide which one will perform better with your perception system. No more complex and expensive experiments with real devices, thanks to our state-of-the-art photometric pipeline.

Need to know more?

Visit our website, anyverse.ai, anytime, or our LinkedIn, Instagram, and Twitter profiles.
