
Owners of self-driving cars may need illegal distractions to stop them nodding off

A study in the US found people were less likely to fall asleep behind the wheel if they were watching a screen with moving images.

NEW CARS THAT can steer and brake themselves risk lulling people in the driver’s seat into a false sense of security — and even to sleep. One way to keep people alert may be providing distractions that are now illegal.

That was one surprising finding when researchers put Stanford University students in a simulated self-driving car to study how they reacted when their robo-chauffeur needed help.

The experiment was one in a growing number that assesses how cars can safely hand control back to a person when their self-driving software and sensors are overwhelmed or overmatched. With some models already able to stay in their lane or keep a safe distance from other traffic, and automakers pushing for more automation, the car-to-driver handoff is a big open question.


The elimination of distracted driving is a major selling point for the technology. But in the Stanford experiment, reading or watching a movie helped keep participants awake.

Among the 48 students, 13 who were instructed to monitor the car and road from the driver’s seat began to nod off. Only three did so when told to focus on a screen full of words or moving images.

Alertness was particularly helpful when students needed to grab the wheel because a car or pedestrian got in the way.

There’s no consensus on the right car-to-driver handoff approach: the Stanford research suggests engaging people with media could help, while some automakers are marketing vehicles with limited self-driving features that will slow down if they detect a person has stopped paying attention to the road.

Self-driving car experts at Google, which is pursuing the technology more aggressively than any automaker, concluded that involving humans would make its cars less safe. Google’s solution is a prototype with no steering wheel or pedals — human control would be limited to go and stop buttons.

Meanwhile, traditional automakers are phasing in the technology. Mercedes and Toyota sell cars that can hit the brakes and stay in their lane. By adding new features each year, they might produce a truly self-driving car in about a decade.

One potential hazard of this gradualist approach became clear this fall, when Tesla Motors had to explain that its “Autopilot” feature did not mean drivers could stop paying attention. Several videos posted online showed people recording the novelty, then seizing the wheel when the car made a startling move.

Not foolproof

General Motors’ Super Cruise system, which will allow semi-autonomous highway driving in the Cadillac CTS starting late next year, monitors drivers. If their eyes are off the road and they don’t respond to repeated prodding, the car will slow itself.

“We are in no way selling this as a technology where the driver can check out,” General Motors spokesman Dan Flores said. “You can relax, glance away, but you still have to be aware because you know the technology’s not foolproof.”

Though research is ongoing, it appears that people need at least five seconds to take over, and that is only if they’re not totally checked out.

One riddle automakers must solve: how to get owners to trust the technology enough to use it, but not trust it so much that they’re lulled into a false sense of security and slow to react when the car needs them.

Trust was on the mind of researchers who in August published an extensive report on self-driving cars funded by the National Highway Traffic Safety Administration. “Although this trust is essential for widespread adoption, participants were also observed prioritising non-driving activities over the operation of the vehicle,” the authors wrote.

Another wide-open question: How to alert the person in the driver’s seat of the need to begin driving.

It appears that the car should appeal to several senses. Visual warnings alone may not suffice. Combine a light with spoken instructions or physical stimulation such as a vibrating seat, and people are quicker to re-assume control.

“If it is done courteously and subtle and not annoying, it could be missed by someone that is distracted,” said Greg Fitch, a research scientist at the Virginia Tech Transportation Institute. Then again, the way the car interacts with people will be one way automakers differentiate their product — and overbearing warnings may sour potential buyers.


Other issues Fitch cites include “mode confusion” (making sure the car clearly informs the person whether or not it is driving itself) and clear explanations to drivers of what the car can — and cannot — handle.

Cars with the right sensors are becoming very good at monitoring the outside world and can react faster than humans. People, however, remain much better at making decisions under uncertain circumstances.

One lesson from the Stanford study may be that master and machine are better viewed as collaborators.

“There’s really a relationship between drivers and cars,” said David Sirkin, who helped run the experiment at Stanford’s Center for Design Research, “and that relationship is becoming more a peer relationship.”

Read: The huge, unexpected ethical question self-driving cars will have to tackle

Associated Foreign Press