09-17-2021, 02:24 AM
The automobile, in American life, has long been a hallmark of freedom. A teenager’s first driver’s license offers freedom from Mom and Dad. A new car and the open road bring the freedom to chase the American dream. But as more technology creeps in to help drivers, so, too, will systems that eavesdrop on and monitor them, necessitated not by convenience but by new safety concerns.
Cameras that recognize facial expressions, sensors that detect heart rates and software that assesses a driver’s state of awareness may seem like superfluous flights of fancy, but they are increasingly viewed as part of an inevitable driving future.
At upstarts like the electric car company Byton and mainstream mainstays like Volvo, car designers are working on facial recognition, drowsy-driver alert systems and other features for keeping track of the people behind the wheel.
The most immediate impetus: concerns about the safe use of driver-assistance options like automatic lane-keeping that still require drivers to pay attention. And when truly autonomous vehicles finally arrive, the consensus among automakers and their suppliers is that new ways will be needed to check on drivers and passengers to make sure they are safe inside.
“It’s really taken off from no monitoring to tactile monitoring to taking a look at your eyes,” said Grant Courville, a vice president at BlackBerry QNX, which creates in-dash software systems. “I definitely see more of that coming as you get to Level 3 cars,” he added, referring to vehicles that can perform some self-driving functions in limited situations.
One such feature is the driver-facing camera that is part of General Motors’ Super Cruise system, the first hands-free driving tool to operate on select United States highways. The camera tracks a driver’s head position and eye movements to ensure that the person is attentive and able to retake control of the car when needed.
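For illustration only, here is a minimal Python sketch of the kind of head-pose check such a camera could drive. The angle limits, the eyes-off-road timeout and the function names are assumptions made for this sketch, not details disclosed by G.M., BMW or any supplier.

```python
# Minimal sketch of a camera-based attention check, not any automaker's actual
# implementation. Head-pose limits and the eyes-off-road timeout are
# hypothetical values chosen only for illustration.
from dataclasses import dataclass

MAX_YAW_DEG = 30.0           # assumed limit before the driver counts as looking away
MAX_PITCH_DEG = 20.0         # assumed limit for looking down (e.g. at a phone)
EYES_OFF_ROAD_LIMIT_S = 2.0  # assumed time allowed before a warning fires

@dataclass
class GazeSample:
    yaw_deg: float    # head rotation left/right, from the in-cabin camera
    pitch_deg: float  # head rotation up/down
    dt_s: float       # time since the previous sample

def is_attentive(sample: GazeSample) -> bool:
    """Treat the driver as attentive if the head is roughly facing the road."""
    return abs(sample.yaw_deg) <= MAX_YAW_DEG and abs(sample.pitch_deg) <= MAX_PITCH_DEG

def monitor(samples):
    """Accumulate eyes-off-road time and emit a warning once it exceeds the limit."""
    off_road_s = 0.0
    for sample in samples:
        off_road_s = 0.0 if is_attentive(sample) else off_road_s + sample.dt_s
        yield "WARN" if off_road_s > EYES_OFF_ROAD_LIMIT_S else "OK"

# Example: the driver glances away from the road for three seconds.
samples = [GazeSample(0, 0, 1.0), GazeSample(45, 0, 1.0),
           GazeSample(45, 0, 1.0), GazeSample(45, 0, 1.0)]
print(list(monitor(samples)))  # ['OK', 'OK', 'OK', 'WARN']
```

Production systems track gaze direction as well as head pose and fuse many more signals, but the basic pattern is the same: measure attention continuously and escalate only when inattention persists.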
Similar concerns about BMW’s semi-autonomous systems prompted the German carmaker to add a driver-monitoring camera to its 2019 X5 sport utility vehicle. The video camera is mounted in the instrument cluster as part of BMW’s Extended Traffic Jam Assistant system, a $1,700 package that allows the car to drive autonomously — with driver monitoring — in stop-and-go traffic under 37 miles per hour.
“It looks at the head pose and the eyes of the driver,” said Dirk Wisselmann of BMW’s automated driving program. “We have to, because by doing so it empowers us to add more functionality.”
Automakers understand that tracking technology raises privacy issues, so BMW does not record or store the driver-monitoring information, Mr. Wisselmann said.
Perhaps still smarting from lessons learned in the past, G.M. also does not record what transpires inside the car’s cabin, the company said. In 2011, G.M. tried to change the user agreement in its OnStar service to allow it to share driver information with third-party companies. The backlash from owners was so swift and severe that the Supreme Court cited the episode as proof that people had an expectation of privacy in their cars.
“But it’s not just about distraction management,” said Jada Smith, a vice president in the advanced engineering department at the auto supplier Aptiv. During an autonomous driving demonstration, she pointed out that such driver monitoring systems can assess a driver’s cognitive load levels — how many tasks the person is trying to juggle — and then adjust other car functions.
“If the driver is not fully aware,” Ms. Smith said, “we might brake faster.” Other ideas include putting radar inside the car for interior sensing like detecting that a child has been left behind. (Every nine days a child left in a car dies from vehicular heatstroke in the United States, according to KidsAndCars.org, an advocacy group.)
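As a rough illustration of that idea, the sketch below scales how early automatic braking begins based on an estimated cognitive-load score. The 0-to-1 load scale, the lead times and the function name are invented for the example; they are not how Aptiv describes its system.

```python
# Hypothetical sketch of the idea Ms. Smith describes: intervene earlier with
# automatic braking when the driver appears cognitively loaded. The load score,
# thresholds and reaction-time budgets are all assumptions for illustration.
def brake_lead_time_s(cognitive_load: float) -> float:
    """Return how many seconds before a predicted collision braking should begin.

    cognitive_load is assumed to range from 0.0 (fully attentive) to 1.0
    (overloaded), as might be estimated from gaze, speech and secondary tasks.
    """
    base_lead_s = 1.2   # assumed lead time for an attentive driver
    max_extra_s = 1.0   # assumed extra margin for a fully loaded driver
    load = min(max(cognitive_load, 0.0), 1.0)
    return base_lead_s + max_extra_s * load

# A distracted driver gets the brakes applied roughly a second earlier.
print(brake_lead_time_s(0.1))  # ~1.3 s
print(brake_lead_time_s(0.9))  # ~2.1 s
```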
It was infants’ being left in cars that first prompted Guardian Optical Technologies, based in Tel Aviv, to develop in-cabin monitoring technology, said Tal Recanati, the company’s chief business officer. The company has now expanded its 3-D vision and “micro vibration” sensing system to recognize faces, check seatbelt use, even adjust elements like airbag deployment velocity based on a passenger’s approximate weight. Eventually Guardian’s technology could be able to judge the emotional state of people in the car.
Affectiva, a Boston company developing technology for measuring emotions, has been conducting such research for several years to assess driver behavior. On a closed test track peppered with distractions — people dressed as construction workers, a security vehicle with flashing lights, pedestrians, fake storefronts — Affectiva demonstrated how the company’s program works in tandem with a “collaborative driving” system made by the Swedish auto supplier Veoneer. Veoneer’s technology can control steering and braking on its own, with the occasional intervention of a human driver.
Affectiva collected a variety of driver information during the test, measuring elements like the amount of grip on the wheel, throttle action, and facial and head movements. It then compared that information with what was happening around the car to determine how much trust the driver had in the semi-autonomous system and the perceived level of cognitive load.
“We want them to trust the car — but not too much,” said Ola Bostrom, a vice president of research at Veoneer. “The driver still has to be engaged” in order to take over the controls when a car encounters a situation it can’t handle.
To deliver other advanced services, like augmented reality information about nearby businesses and locations, it will also be necessary to monitor what drivers are paying attention to, said Andrew Poliak, a vice president at Panasonic Automotive Systems. And companies as diverse as Mercedes-Benz and the voice-recognition company Nuance want to add Alexa-like services, meaning that your sedan or S.U.V. may always be listening.
“So these systems are going to become standard in all cars,” said Nakul Duggal, who leads the automotive products group at Qualcomm.
Will privacy concerns then recede in the rearview mirror of advancing technologies?
When fully autonomous vehicles begin circulating on public roads, designers note, they will have to be able to detect when people enter or exit a vehicle, who the person is, whether they have left anything behind in the car, and especially whether a person has become incapacitated (because of intoxication or a medical emergency). And that information will inevitably be shared online, although there may be ways that some people can still preserve their sense of independence in the car.
“In the future, it may be different for people who own their own cars, where there’s more privacy,” said Mr. Wisselmann at BMW, “and for people who use robo taxis, where there will be less.”
It’s 2025 and you’re cruising down the highway late at night. It’s been a long day and your eyelids feel heavy. All of a sudden, you hear three beeps, lights flash, your car slows down, and it pulls itself safely to the side of the road.
This scenario is closer to becoming reality than you may think, and although self-driving cars get all the headlines, most drivers will experience something like it long before they can buy a car that drives itself.
Full self-driving cars are taking longer to arrive than techno-optimists predicted a few years ago. In fact, in a financial filing Wednesday, Tesla acknowledged it may never be able to deliver a full self-driving car at all.
But with features such as automated cruise control, steering assist and automatic highway lane changing, new cars come loaded with driver-assist options. As they proliferate, the task of a human driver is beginning to shift from operating the vehicle to supervising the systems that do so.
That development carries promise and peril. Decades of research make clear that humans aren’t good at paying attention in that way. The auto industry’s answer: systems that monitor us to make sure we’re monitoring the car.
Such systems, usually relying on a driver-facing camera that tracks eye and head movements, already have been deployed in tens of thousands of long-haul trucks, mining trucks and heavy construction vehicles, mainly to recognize drowsiness, alcohol or drug use, and general distraction.
Some new automobile models can already be purchased with option packages that include monitoring systems, usually as part of driver-assist features such as lane keeping and automated cruise control. They include cars from General Motors, Ford, Toyota, Tesla, Subaru, Nissan and Volvo.
One reason for the sudden rush: European regulators plan to require such systems be installed on every new car sold there by mid-decade.
The top U.S. car industry lobby, recently renamed the Alliance for Automotive Innovation, told a Senate panel Tuesday that it welcomes regulation that would require driver-monitoring systems in all new cars sold with driver-assist technologies. The National Transportation Safety Board, after several fatal Tesla Autopilot crashes, has recommended that safety regulators require more robust systems than the one Tesla uses to keep drivers engaged.
So-called advanced driver-assist systems serve as a bridge as companies work to develop safe, fully self-driving cars, which are beginning to appear in very limited locations. Most driverless car developers put tight restrictions on how they can be used and where they can go.
“We’re in an in-between phase at the moment,” said Colin Barnden, a market analyst at Semicast Research.
On the plus side, such technologies can reduce driving stress and, if deployed responsibly, improve safety. At the same time, the less input a car needs from a human driver, the harder it is for that driver to remain vigilant. Humans aren’t good at “monitoring things, waiting for something to go wrong. We just aren’t wired to do that,” Barnden said.
Driver-monitoring systems come in two basic types: eye trackers and steering wheel sensors. In either case, if a driver is detected not paying attention, warnings are issued through lights, sounds or both; if the driver doesn’t reengage, the car pulls itself to the roadside and stops.
Tesla uses the steering sensor. Practically everybody else uses eye trackers.
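In rough terms, that escalation behaves like a small state machine regardless of which sensor feeds it. The sketch below uses assumed timeouts and state names purely to illustrate the warn-then-pull-over sequence described above; no vendor has published logic in this form.

```python
# Rough sketch of the escalation logic described above: warn an inattentive
# driver, then pull over if they do not reengage. The states and timeouts are
# assumptions for illustration, not any automaker's actual tuning.
from enum import Enum, auto

class State(Enum):
    NORMAL = auto()
    WARNING = auto()       # lights and/or chimes active
    PULLING_OVER = auto()  # car steers to the roadside and stops

WARNING_AFTER_S = 3.0     # assumed inattention time before warnings start
PULL_OVER_AFTER_S = 10.0  # assumed inattention time before the car pulls over

def next_state(inattentive_s: float) -> State:
    """Map continuous inattention time (from an eye tracker or a steering-wheel
    sensor) to an escalation state."""
    if inattentive_s >= PULL_OVER_AFTER_S:
        return State.PULLING_OVER
    if inattentive_s >= WARNING_AFTER_S:
        return State.WARNING
    return State.NORMAL

for t in (1.0, 5.0, 12.0):
    print(t, next_state(t).name)  # 1.0 NORMAL / 5.0 WARNING / 12.0 PULLING_OVER
```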