AI is driving fourth industrial revolution, will transform industries like steam and electricity did in earlier times, says NVIDIA’s Sanford Russell. (Rebecca Cook)

Artificial Intelligence Next Big Thing for Cars

Key takeaways from a diverse panel of experts at the recent WardsAuto User Experience Conference discussing how AI will recast our time behind the wheel.

Newer cars already incorporate some artificial intelligence, but soon it will transform every aspect of driving, from navigation and voice commands to autonomy.

The goal is to create a perfect experience for everyone in the vehicle, from making it simpler to use voice commands to find an Italian restaurant with a good wine list, to enabling a self-driving car to know when it is okay to drive off the road to avoid an obstacle.

There is nothing to fear from this rapid expansion of machine intelligence, but a few spectacular failures outside the automotive world do show humans need to play a supervisory role over AI programs for the foreseeable future.

These are key takeaways from a diverse panel of experts at the recent WardsAuto User Experience Conference discussing how AI and cognitive learning can recast our time behind the wheel.

AI and neural networks already have become part of driving for those who use smartphone personal assistants through Apple CarPlay and Android Auto. Adding machine intelligence to vehicles also promises to become big business for companies such as IBM, speech recognition supplier Nuance and software and chip supplier NVIDIA.

AI is driving the fourth industrial revolution and will transform many industries like steam and electricity did in earlier times, says Sanford Russell, head of autonomous driving at NVIDIA. Neural networks and what NVIDIA calls deep learning emerged in 2012 and “blew away 25 years of programming expertise,” he says.

IBM is making a big play in the medical field with AI and now is looking to make a splash in automotive. Watson, the company’s cloud-based computer system, uses what IBM calls cognitive computing to comb through vast amounts of published research and data to analyze information and diagnose problems.

IBM is rebranding itself as a cognitive computing company and has gotten out of most of the older hardware businesses with which it has been associated. 

At the WardsAuto UX Conference, Dan Ricci, global automotive leader at IBM Cognitive Solutions, tells attendees IBM is an established automotive supplier with about $5 billion in annual auto-related revenue.

Now it is looking to expand further by making Watson a Siri-like assistant inside the vehicle that provides more personalized services than smartphones by incorporating vehicle information and the owner’s manual to perfect the driving experience.

Ricci outlines a scenario where Watson enhances the user experience by monitoring miles traveled and letting drivers know when preventative maintenance is required; explains the function of components such as the timing belt and why it needs to be replaced; and even schedules service appointments and arranges loaner vehicles.
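The monitoring half of that scenario amounts to straightforward rules over vehicle telemetry. Below is a minimal Python sketch of a mileage-based reminder; the service intervals, field names and data are illustrative assumptions, not IBM’s actual Watson services.

    # Hypothetical mileage-based maintenance reminder.
    # Intervals and fields are illustrative, not IBM Watson's API.
    MAINTENANCE_INTERVALS_MILES = {
        "oil_change": 7_500,
        "tire_rotation": 10_000,
        "timing_belt": 90_000,
    }

    def due_services(odometer_miles, last_service_miles):
        """Return services whose mileage interval has elapsed."""
        due = []
        for service, interval in MAINTENANCE_INTERVALS_MILES.items():
            miles_since = odometer_miles - last_service_miles.get(service, 0)
            if miles_since >= interval:
                due.append((service, miles_since))
        return due

    # Example: odometer at 61,200 miles with an invented service history.
    history = {"oil_change": 52_000, "tire_rotation": 50_000, "timing_belt": 0}
    for service, miles in due_services(61_200, history):
        print(f"{service}: {miles:,} miles since last service, schedule an appointment")

A production assistant would layer natural-language explanations (why the timing belt matters) and dealer scheduling on top of a simple trigger like this.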

Watson also can act as a co-pilot, advising the best routes and when and where the adaptive cruise control should be activated to maximize fuel economy.

Tayler Blake, a machine learning expert at Pillar Technology, says interfaces that use algorithms to converse in natural language are becoming important because they simplify the user interface and make it more personal at the same time. “No matter how cool your user interface is, it would be even cooler if there is less of it,” she says.

“In 2016, we’re living faster than ever. I want to perform tasks with as little interaction as possible, but I want it to seem relevant and personal. Algorithms simplify bots like Siri. How to integrate algorithms with current (user experience) design practices, that’s the big question,” she says.

Natural Language Recognition Key

Nuance says it has the answer. Already a major provider of speech-recognition software for the auto industry, it is using AI to take the next step to natural-language recognition, which makes voice commands more conversational.

Natural language recognition is reinventing the relationship between people and technology, says Charlie Ortiz, senior principal manager-Artificial Intelligence and Reasoning Group, Nuance Natural Language and AI Laboratory.

The personal assistant apps offered by smartphones are good examples of natural language recognition, but when used in the car, they can be distracting and pull eyes from the road and hands from the steering wheel, Ortiz says.

Therefore, it is important to bring natural language speech recognition into the vehicle itself, so it can perform a multitude of vehicle-related tasks beyond navigation and search, he says.

However, to be more conversational than conventional menu-based systems, natural-language systems need a variety of AI technologies that apply reasoning and extensive knowledge to fill in blanks users imply but do not state explicitly, Ortiz says.

“You want to have a conversation with the car, and you want an utterance to introduce a task.”

That is not as easy as it sounds, he says. Speaking conversationally, a person might say: “I’m looking for an Italian restaurant with a good wine list.”

Humans can infer the query likely is about finding Italian cuisine at a restaurant nearby that is open for dinner and has a well-stocked wine cellar.

But for computers with typical menu-based programming, getting the desired information can turn into a “dart-throwing session” rather than a conversation. The computer struggles to pick the right answer from what it sees as a long list of possibilities, which could include someone looking to buy a restaurant or make a reservation weeks in advance.

The solution is using a variety of AI technologies that work together and can use reasoning to fill in the informational blanks, and create a natural-sounding dialog with the driver, Ortiz says.
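Ortiz’s restaurant example is, at bottom, a slot-filling and inference problem: the system has to supply the slots the speaker leaves implicit (location, mealtime, timing) instead of walking the user through menus. Here is a minimal Python sketch of that idea; the keyword matching and default values are illustrative assumptions standing in for a real natural-language stack such as Nuance’s.

    # Toy slot-filling sketch for a conversational request.
    # Keyword matching stands in for real language understanding;
    # the defaults stand in for reasoning over context.
    from datetime import datetime

    def interpret_restaurant_query(utterance, now=None):
        now = now or datetime.now()
        slots = {"intent": "find_restaurant"}

        # Slots stated explicitly in the utterance.
        text = utterance.lower()
        if "italian" in text:
            slots["cuisine"] = "Italian"
        if "wine" in text:
            slots["amenity"] = "good wine list"

        # Slots the speaker implied but never said, filled by inference.
        slots.setdefault("location", "near current GPS position")
        slots.setdefault("meal", "dinner" if now.hour >= 16 else "lunch")
        slots.setdefault("when", "now")
        return slots

    print(interpret_restaurant_query(
        "I'm looking for an Italian restaurant with a good wine list"))

The second block is the point: without inferred defaults, the system falls back to asking the user to disambiguate every possibility, which is exactly the “dart-throwing” Ortiz describes.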

“Deep learning didn’t exist 10 years ago. Now it’s coming to the automotive space,” says NVIDIA’s Russell. “Neural networks break (camera) views into tiny fragments and then identify what they are in the real world, such as people walking and car doors opening; things that have to be solved in real time.”
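In code, “breaking camera views into tiny fragments and identifying what they are” is what a convolutional object detector does. The sketch below uses an off-the-shelf torchvision model purely for illustration; it is not NVIDIA’s production pipeline, and the frame filename is a placeholder.

    # Generic object detection on a camera frame: a convolutional network
    # proposes regions and labels them (pedestrians, vehicles, etc.).
    # Off-the-shelf torchvision model for illustration, not NVIDIA's stack.
    import torch
    import torchvision
    from torchvision.transforms.functional import to_tensor
    from PIL import Image

    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(pretrained=True)
    model.eval()

    frame = to_tensor(Image.open("dashcam_frame.jpg"))  # placeholder image path
    with torch.no_grad():
        detections = model([frame])[0]

    for box, label, score in zip(detections["boxes"],
                                 detections["labels"],
                                 detections["scores"]):
        if score > 0.6:
            print(f"class {int(label)} at {box.tolist()} (confidence {score:.2f})")

Real-time operation in a vehicle means running this kind of inference on every frame from every camera within a few tens of milliseconds, which is where the GPU horsepower comes in.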

NVIDIA started out in the late 1990s making powerful data processors for the gaming world. Gaming back then largely was simulating car-racing games, pixel by pixel, Russell says. Now, NVIDIA is using the prodigious power of its processors to take simultaneous input from cameras, lidar (laser-based 3-D scanning) and radar (a combination called sensor fusion) to build 3-D worlds in real time that can serve as guidance systems for autonomous vehicles.
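Sensor fusion can be illustrated with a very small example: each sensor gives a noisy estimate of the same quantity, and the fused estimate weights each by how much it is trusted. The variances below are invented for illustration; production fusion (NVIDIA’s included) involves far more, such as time alignment, object tracking and 3-D reconstruction.

    # Toy sensor fusion: inverse-variance weighting of range estimates
    # for one object seen by camera, radar and lidar.
    def fuse_ranges(measurements):
        """measurements: list of (range_m, variance) tuples, one per sensor."""
        weights = [1.0 / var for _, var in measurements]
        fused = sum(r * w for (r, _), w in zip(measurements, weights)) / sum(weights)
        return fused

    camera = (42.0, 4.0)   # good at classification, coarser depth
    radar = (40.5, 0.5)    # precise range, poor shape detail
    lidar = (40.9, 0.2)    # dense, precise 3-D points

    print(f"fused range: {fuse_ranges([camera, radar, lidar]):.1f} m")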

NVIDIA also is using deep learning technology to train vehicles to do specific driving tasks and keep teaching them better, more efficient ways of doing those tasks. Autonomous vehicles powered by NVIDIA chips can beat professional drivers around closed-course race tracks, and now deep learning is being used to train vehicles to cope with the uncertainties of the open road.

By combining sensor fusion with deep learning, the latest processors can solve some of the biggest hurdles for self-driving cars: guiding vehicles through bad atmospheric conditions such as snow and rain, driving off road, and enabling autonomous cars to know when it is okay to drive off the road to avoid hazards or obstacles.

“This is 2016 and we believe we have a lot of the pieces ready. We’re driving at night on dirt roads; we’re doing it today. The deep learning innovation is very, very real,” Russell says.

Microsoft, Facebook Disasters

But AI’s breathtaking progress also has had some stumbles. Pillar Technology’s Blake points out Microsoft created an AI “chatbot” named Tay this year designed to imitate a 19-year-old American girl on Twitter as an experiment in “conversational understanding.” But in less than 24 hours, pranksters turned the supposedly lovable Tay into a Hitler-loving racist spewing out the worst tweets imaginable.

Facebook had a similar disaster when it chose to fire the journalists running its news trending feature and turned over the job of delivering news to AI and a technical team.

That resulted in high-traffic fake stories and pornographic videos becoming top trending topics. When dealing with algorithms, garbage will start coming out if you put garbage in, Blake warns.


That’s why successful shopping sites such as stitchfix.com that use AI to pick personalized clothing ensembles still have a stylist inspect every box of items before shipping, she says.

Despite a few horror stories, the adoption of AI throughout vehicles promises to be fast and furious, not just because automakers and suppliers want it, but because every industry is working on it, from tech giants such as Google, Apple and Facebook to thousands of smaller companies in every industry all over the world, says NVIDIA’s Russell.

“They are all trying to solve fundamentally the same problem: How do I service you, how do I answer your questions? The beauty of this from our perspective is you are not on your own. This isn’t something the auto industry has to figure out for itself like antilock brakes. Other than aerospace, who else had to solve that? Nobody.”

