FRANKFURT, Germany – It’s the future.

Your vehicle is in automated-driving mode on the highway and suddenly you pass out.

But the highway is ending and that mode will soon be canceled as your car approaches more complex city streets.

What will happen?

Continental thinks it has found a solution to this scenario with its driver-analyzer technology. Using a driver-facing camera and image-processing software, the supplier can determine if someone using an automated-driving mode falls unconscious (or is merely not paying attention). Infrared sensors also are included so facial movements can be detected during nighttime driving.

“(A driver passing out is) one use case which will probably come very soon on the automated driving road map – if something goes wrong, how to safely park the car, slow it down, bring it to a safe position?” Frank Rabe, executive vice president-instrumentation and driver HMI business unit for Continental, tells WardsAuto in an interview here.

Rabe says Continental sees the technology, which he says will reach the market in a few months in unnamed vehicles, as a way to help re-engage a driver who may be switching between an automated highway mode (SAE Levels 2 and 3) and non-automated city driving.

“There will be a phase where not everything is automated. So when you switch from the one mode to the other, from automated to non-automated, well that’s when you better wake up the driver so he takes over.

“So you need to always understand, ‘What’s the driver doing?’” he continues. “Is he sleeping? OK, so it may take longer to wake him up. And this is why we need to understand the situation of the driver.”

The camera monitors a driver’s eyes, mouth and even ears and, in conjunction with the image-processing software, judges attentiveness and time spent looking away from the road, Rabe says.

“Is the driver having a smartphone close to the ear? Not using hands-free (calling)?” he says of distractions the software is trying to identify, noting the camera will have full view of not just the driver’s face but likely part of his or her shoulders as well.

Conti sees multiple other use cases, including parents preventing their children from driving their cars, as well as pairing autonomy with another emerging trend, new mobility.

“Think about car(-sharing) fleets,” Rabe says. “Wouldn’t it be nice to enter these cars, no matter which one or where, and it would recognize you and it would immediately know your (song or seat-setting) preferences?” In this scenario, a cloud-stored profile of a driver and his or her picture could be accessed after the driver-analyzer tech scanned the person’s face.

Rabe notes that by focusing on a driver’s lips, the driver-analyzer camera and software could be used not only for safety reasons, but also to bolster the accuracy of spoken commands for navigation, audio and climate-control systems.

“If you would add the movement of the lips, it would immediately improve the voice-recognition quality,” he says, noting that by combining different inputs a better in-vehicle user experience can be provided.

How and when to rouse a driver not paying attention, as well as intervene to bring a vehicle to a safe stop in a scenario where he or she can’t be roused, will be left up to the OEMs purchasing the driver-analyzer technology, Rabe says.

OEMs also will select where to position the camera – above the steering column, in the instrument cluster or in the roof module.

There are some limitations to the technology, such as when a driver wears sunglasses with polarized lenses. In this situation, Rabe says today’s implementation would call for an alert if the pupils cannot be viewed, followed by a warning to the driver, which he summarizes as: “I cannot let you go fully automated, you’re not cooperating.”

Conti expects image-data analysis to improve over time and, combined with the infrared sensors, sees a day when the driver-analyzer can determine a driver’s position in a fully automated vehicle, where he or she may be in a seat swiveled away from the road.

Driver position will become critical in the future when it comes to the safe deployment of airbags.

“You have (airbags) right where you need them (today): They are in front of you, (on your) side and so on,” Rabe says. “In the future, the driver’s position might be random, so how will (OEMs determine when to) deploy the airbags?”

He says the camera being used for the driver-analyzer system is relatively standard, but Continental has worked to make it suitable for the extreme temperatures and conditions encountered inside an automobile.

“Your phone (camera) will very likely fail at temperatures of below minus 20, and above 80…The automotive environment is pretty harsh, and that’s what we are focusing on: using automotive components and making it automotive compliant.”

The technology theoretically could be used to analyze the faces of passengers as well, but Rabe questions whether OEMs will see any business cases for that.

It also would be possible to combine the camera, software and sensors with a breath-detection system to prevent those who have been drinking from getting behind the wheel of a semi-automated vehicle. But here too Rabe is uncertain.

“The question is: Does it make sense to integrate? Can it be robust? Because today’s systems you have to blow air into them. That’s pretty safe. If you have it open, somewhat integrated and you can’t even see it, can we make it robust in the sense that it will always detect the issue? I’m doubtful.”

cschweinsberg@wardsauto.com