Audi’s new A8 will drive through traffic jams for you
Semi-autonomous driving systems like Tesla’s Autopilot are all the rage these days. In advance of fully autonomous vehicles taking over our roads, automakers are pushing systems that can do the driving for you in specific circumstances.
The latest is Audi’s new A8, which debuted at the Audi Summit in Barcelona this week. The car’s onboard computing is powered by Nvidia, and it features Audi’s “AI Traffic Jam Pilot,” which will handle gridlock for you.
Much like other autonomous systems, the Traffic Jam Pilot uses cameras and laser sensors to map out the road and other vehicles. Then, at speeds up to 40 mph, it can stop, start, and follow vehicles for you, enabling you to relinquish control throughout a traffic jam.
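To make that behavior concrete, here’s a minimal sketch of the stop-and-go logic such a system implies. The function, fields, and thresholds below are illustrative assumptions, not Audi’s actual implementation.

```python
# Illustrative sketch of the stop-start logic a traffic-jam pilot implies:
# engage only below the speed cap with a car to follow, then hold a safe
# time gap. Names and thresholds are assumptions, not Audi's actual code.
from typing import Optional

SPEED_CAP_MPH = 40.0   # the pilot only operates at traffic-jam speeds
MIN_TIME_GAP_S = 1.5   # assumed minimum following gap
MPH_TO_MPS = 0.44704

def pilot_step(driver_engaged: bool, ego_speed_mph: float, lead: Optional[dict]):
    """Return the pilot's decision for one control cycle."""
    if not driver_engaged or ego_speed_mph > SPEED_CAP_MPH or lead is None:
        # Outside the operating envelope: prompt the human to take over.
        return {"engaged": False, "alert_driver": driver_engaged}

    # Time gap to the car ahead, in seconds (guard against divide-by-zero).
    ego_mps = max(ego_speed_mph * MPH_TO_MPS, 0.1)
    time_gap_s = lead["distance_m"] / ego_mps

    # Follow the lead car, easing off when the gap gets too tight. A real
    # controller would command smooth accelerations, not speed steps.
    target = lead["speed_mph"] if time_gap_s >= MIN_TIME_GAP_S else lead["speed_mph"] * 0.8
    return {"engaged": True, "target_speed_mph": min(target, SPEED_CAP_MPH)}
```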
It’s the same technology as Tesla’s Autopilot, applied to a different problem: Autopilot is designed for highway cruising, while traffic jams happen at lower speeds and with fewer lane changes, which makes autonomous driving easier for the computer and far less dangerous.
The system is activated by a single button on the dash, and it has audio and visual prompts for the driver when it’s time for the human to take back control.
In addition to the autonomous driving features, Nvidia also powers the in-car infotainment system and screens, which are meant to create a “virtual cockpit.” Using screens to replace traditional buttons and dials is a trend that’s increasingly popular on expensive, tech-focused cars, thanks to the success of Tesla’s dial-free cars.
This startup is using Uber and Lyft drivers to bring self-driving cars to market faster
Roads aren’t the static strips of concrete and asphalt they appear to be. They’re constantly changing. Traffic lights are added and removed, stop-and-go intersections turn into roundabouts, a typically quiet street turns into a construction zone. It’s happening everywhere, all the time.
Human drivers might be able to adjust to this dynamic environment, but autonomous vehicles need extra help. They need maps. But not just any old map will do, say the three founders behind Lvl5, a new mapping and localization startup that launched publicly today.
Lvl5 was founded in December by Andrew Kouri and Erik Reed, who both worked on Tesla’s Autopilot team, and George Tall, a computer vision engineer from iRobot. The company has developed a way to take enormous amounts of video collected from a camera and turn it into high-definition 3D maps that are constantly refreshed. These maps will always reflect the latest road conditions, providing self-driving cars with the information they need to detect changes and plan their routes safely.
“The thing that everyone is kind of ignoring silently is that self-driving cars won’t ship unless we have really good HD maps that update every single day,” Kouri said in an interview with The Verge. “And nobody has a system to do this yet. This is what we’re building.”
The company, which graduated from the Y Combinator accelerator in March, contends that its solution is inexpensive and beats the costly sensors that tech companies and automakers are betting on. It also believes it has a better system than much bigger competitors like Mobileye, TomTom, and HERE.
Kouri says self-driving cars don’t need LIDAR, the light detection and ranging sensor many of them use to see the world around them. That’s a departure from what many automakers and tech companies, like Google’s Waymo, say is needed for the safe deployment of autonomous vehicles.
Lvl5’s philosophy, in many ways, mirrors Tesla’s approach, which contends it can deploy fully autonomous vehicle technology without relying on LIDAR.
“We don’t really care if LIDAR wins out or computer vision wins out,” Kouri said. “Right now we know that if we want to make self-driving cars en masse, cameras are ready and LIDAR is not.”
The company’s system uses consumer-grade cameras and a computer vision algorithm to turn the video it captures into usable 3D maps. But it needed a way to collect that video at scale.
So the company turned to Uber and Lyft drivers, who crowdsource the video data via Payver, a dashcam app created by Lvl5.
Drivers are paid to mount smartphones on their dashboards and run the app, which automatically collects video, accelerometer, and GPS data. Huge amounts of data are captured: video is taken every meter along a vehicle’s route. The footage is compressed, sent to the cloud, and routed to Lvl5’s central hub, where the company’s computer vision algorithm translates it into high-definition 3D maps.
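Based on that description, a Payver-style capture loop might look something like the sketch below. The camera, GPS, IMU, and uploader objects (and their methods) are stand-in assumptions; Lvl5 hasn’t published its implementation.

```python
# Sketch of a dashcam capture loop as described above: record a frame
# roughly every meter of travel, tag it with GPS and accelerometer data,
# and queue the compressed sample for upload. The camera/gps/imu/uploader
# objects and their methods are illustrative stand-ins, not Lvl5's API.
import math

CAPTURE_INTERVAL_M = 1.0  # "video is taken every meter along a vehicle's route"

def ground_distance_m(fix_a, fix_b):
    """Haversine distance in meters between two (lat, lon) GPS fixes."""
    lat1, lon1 = map(math.radians, fix_a)
    lat2, lon2 = map(math.radians, fix_b)
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6_371_000 * math.asin(math.sqrt(a))

def capture_loop(camera, gps, imu, uploader):
    last_fix = gps.read()  # (lat, lon)
    while True:
        fix = gps.read()
        if ground_distance_m(last_fix, fix) >= CAPTURE_INTERVAL_M:
            sample = {
                "frame": camera.grab_jpeg(),  # compressed before it leaves the phone
                "gps": fix,
                "accel": imu.read(),
            }
            uploader.enqueue(sample)  # batched to the cloud, then to Lvl5's hub
            last_fix = fix
```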
“That’s something that even Tesla doesn’t do right now,” Kouri said. (Although it should be noted that earlier this year Tesla started collecting “short video clips” using the car’s external cameras to learn how to spot lane lines, traffic lights, street signs, and other visual cues. The aim is to use this data to improve autonomous safety features, Tesla has said.)
The company was able to get 2,500 drivers to download and use the app. In three months, those drivers have produced a pipeline of map data that covers 500,000 miles of US roads, and it’s refreshed constantly.
The startup is currently piloting its system with an unnamed major automaker. The aim is to work with multiple automakers, each paying an initial fee to install the system in its vehicles. Because the HD maps change multiple times a day and require constant maintenance, Lvl5 will charge a monthly subscription fee per vehicle to maintain them.
“If Tesla solves this problem that’s great, but they only have 250,000 cars on the road,” Kouri said. “On the other hand, if we partner with three or four OEMs, we’re going to prevent a lot of Josh Brown incidents from happening.”
Last year, Joshua Brown was killed while driving a Tesla Model S with the semi-autonomous Autopilot feature engaged. A white tractor trailer drove across the divided highway Brown was traveling on. In that moment, neither the driver nor the camera used by Autopilot noticed the white side of the truck set against the brightly lit sky. Meanwhile, the radar’s resolution tricked Autopilot into thinking there was space between the road and the underside of the truck for the car to pass under.
Autopilot thought the truck was a bridge because its radar signature looked similar to that of a bridge, Kouri explained. “Had there been a map, the car would have known there’s no bridge here,” he said, adding that the car would have slammed on the brakes because it was “seeing” an anomaly.
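As a rough illustration of the cross-check Kouri is describing, a planner could refuse to trust a bridge-like radar return unless the HD map corroborates it. The names below are hypothetical, not Tesla’s or Lvl5’s actual interfaces.

```python
# Sketch of the map cross-check Kouri describes: only trust a bridge-like
# radar return if the HD map confirms an overpass at that location;
# otherwise treat it as an obstacle and brake. All names are hypothetical.

def overhead_return_is_safe(radar_return, hd_map, position) -> bool:
    """True only when the map corroborates a 'bridge' radar signature."""
    if radar_return["looks_like_overpass"] and hd_map.has_overpass_at(position):
        return True   # known bridge: safe to pass underneath
    return False      # anomaly: no mapped bridge here, so brake
```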
The fatality didn’t lead directly to Lvl5. But the accident did underscore the significance of what the company was doing, Kouri said.
“At that time I knew that others were close to shipping Level 2 autonomy, but likely didn’t have any maps available to them—a dangerous combination,” Kouri explained in an email. “This was our call to action.”