DEARBORN, Mich., March 10, 2016
– Driving in snow can be a slippery challenge, with the potential for
one blizzardy gust to white out your field of view – a situation faced
by the majority of people in the United States. So if self-driving cars
are to become a reality – and they almost certainly will – they must be
able to navigate snow-covered roads. In its quest to bring self-driving
vehicles to millions of people around the world, Ford reveals six facts
about its technology that allows a car to drive itself in snow.
- Mapping the way: Ford first creates high-resolution 3D maps, using LiDAR technology to scan the area its autonomous vehicle will later drive through in the snow.
To operate in snow, Ford Fusion Hybrid autonomous vehicles first need to scan the environment to create high-resolution 3D digital maps. By driving the test route in ideal weather, the Ford autonomous vehicle creates highly accurate digital models of the road and surrounding infrastructure using four LiDAR scanners that generate a total of 2.8 million laser points a second. The resulting map then serves as a baseline that’s used to identify the car’s position when driving in autonomous mode. Using the LiDAR sensors to scan the environment in real time, the car can locate itself within the mapped area later, when the road is covered in snow.
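Ford doesn't detail its mapping software here, but the general idea – merging many clear-weather LiDAR scans into one compact reference point cloud – can be sketched roughly as follows. The pose inputs, the NumPy-based approach and the 10 cm voxel size are illustrative assumptions, not details from Ford.

```python
# Rough sketch of building a prior 3D map from LiDAR scans collected in clear
# weather. The voxel size and the pose-transform inputs are assumptions for
# illustration; Ford's actual mapping pipeline is not described in the release.
import numpy as np

def build_reference_map(scans, poses, voxel_size=0.10):
    """Merge per-scan point clouds (sensor frame) into one map (world frame).

    scans : list of (N_i, 3) arrays of LiDAR points
    poses : list of (4, 4) homogeneous transforms, sensor frame -> world frame
    """
    world_points = []
    for points, pose in zip(scans, poses):
        homogeneous = np.hstack([points, np.ones((points.shape[0], 1))])
        world_points.append((homogeneous @ pose.T)[:, :3])
    cloud = np.vstack(world_points)

    # Keep one point per occupied voxel so the stored map stays compact.
    voxel_keys = np.floor(cloud / voxel_size).astype(np.int64)
    _, unique_idx = np.unique(voxel_keys, axis=0, return_index=True)
    return cloud[unique_idx]
```

In a pipeline like this, the downsampled cloud is the baseline a later snowy drive would be matched against.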
- Better have an unlimited data plan: Ford’s autonomous vehicles
collect and process significantly more mapping data in an hour than the
average person uses in mobile-phone data in 10 years.
While mapping their environment, Ford autonomous vehicles collect and process a diverse set of data about the road and surrounding landmarks – signs, buildings, trees and other features. All told, the car collects up to 600 gigabytes per hour, which it uses to create a high-resolution 3D map of the landscape. In the United States, the average subscriber of a cellular data plan uses about 21.6 gigabytes per year, for a 10-year total of 216 gigabytes.
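Taking those figures at face value, the comparison works out like this:

```python
# Back-of-the-envelope arithmetic using only the figures quoted above.
mapping_gb_per_hour = 600                      # data collected while mapping
phone_gb_per_year = 21.6                       # average U.S. mobile data use
phone_gb_per_decade = phone_gb_per_year * 10   # 216 GB over 10 years

print(mapping_gb_per_hour / phone_gb_per_decade)   # roughly 2.8x a decade of phone use, per hour
```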
- Super smart sensors: Ford uses LiDAR sensors that are so powerful, they can even identify falling snowflakes and raindrops.
Ford’s autonomous vehicles generate so many laser points from the LiDAR sensors that some can even bounce off falling snowflakes or raindrops, returning the false impression that there’s an object in the way. Of course, there’s no need to steer around precipitation, so Ford – working with University of Michigan researchers – created an algorithm that recognizes snow and rain, filtering them out of the car’s vision so it can continue along its path.
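The release doesn't describe the Ford/University of Michigan algorithm itself, so the sketch below only shows one generic way such a filter can work: dropping returns with too few nearby neighbors, since hits on snowflakes and raindrops tend to be isolated while returns from real obstacles cluster. The function name, radius and neighbor threshold are assumptions for illustration.

```python
# Generic precipitation filter for a single LiDAR scan -- illustrative only, not
# the algorithm Ford developed with University of Michigan researchers.
import numpy as np
from scipy.spatial import cKDTree

def filter_precipitation(points, radius=0.3, min_neighbors=3):
    """Keep only returns that have enough nearby neighbors.

    points : (N, 3) array of x, y, z LiDAR returns
    radius, min_neighbors : tuning values assumed for the example
    """
    points = np.asarray(points)
    tree = cKDTree(points)
    # Count returns within `radius` of each point (the count includes the point itself).
    counts = tree.query_ball_point(points, r=radius, return_length=True)
    keep = counts > min_neighbors
    return points[keep]
```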
- Not your average navigation: The way Ford’s autonomous vehicles identify their location is more accurate than GPS.
When you think about vehicle navigation, GPS usually comes to mind. But where current GPS is accurate only to within a little more than 10 yards, autonomous operation requires a far more precise fix on the vehicle's position. By scanning their environment for landmarks, then comparing that information to the 3D digital maps stored in their databanks, Ford’s autonomous vehicles can locate themselves to within a centimeter.
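One way to picture that matching step: if landmarks detected in the live scan can be paired with their known positions in the stored map, a least-squares (Kabsch-style) fit recovers the vehicle's position and heading. The 2D simplification and the assumption of already-matched landmarks are made for this sketch; it is not Ford's published method.

```python
# Hedged sketch of map-based localization: align landmarks seen by the vehicle
# with the same landmarks in the stored 3D map. Assumes the correspondence
# (which scan landmark matches which map landmark) is already known.
import numpy as np

def estimate_pose(scan_landmarks, map_landmarks):
    """Return (R, t) such that map_landmarks ~= R @ scan_landmarks + t.

    scan_landmarks, map_landmarks : (N, 2) arrays of matched x, y positions
    """
    scan_center = scan_landmarks.mean(axis=0)
    map_center = map_landmarks.mean(axis=0)

    # Cross-covariance of the centered point sets, then an SVD-based rigid fit.
    H = (scan_landmarks - scan_center).T @ (map_landmarks - map_center)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T

    t = map_center - R @ scan_center  # vehicle translation in map coordinates
    return R, t
```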
- No need for glasses: Sensor fusion – the combination of data from
multiple sensors – plus smart monitoring of sensor health help keep
Ford’s autonomous vehicles out of the blind.
In addition to LiDAR sensors, Ford uses cameras and radar to monitor the environment around the vehicle, with the data generated from all of those sensors fused together in a process known as sensor fusion. This process results in robust 360-degree situational awareness. Sensor fusion means that one inactive sensor – perhaps knocked out by ice, snow, grime or debris buildup on its lens – does not necessarily hinder autonomous driving. Still, Ford autonomous vehicles monitor all LiDAR, camera and radar systems to identify any deterioration in sensor performance, which helps keep the sensors in ideal working order. Eventually, the cars might be able to handle ice and grime buildup themselves through self-cleaning or defogging measures.
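As a rough illustration of that idea – and only an illustration, not Ford's architecture – the sketch below merges object detections from several feeds while skipping any sensor whose data has gone stale, the kind of dropout an iced-over lens could cause. The class names, time thresholds and health check are assumptions for the example.

```python
# Toy sensor-fusion loop with a health check: detections from every healthy feed
# are merged; a feed that has stopped reporting (ice, grime, hardware fault) is
# skipped and flagged. Names and thresholds are assumptions, not Ford's design.
import time
from dataclasses import dataclass, field

@dataclass
class SensorFeed:
    name: str
    max_age_s: float = 0.5            # how stale a reading may be before it is ignored
    last_update: float = 0.0
    detections: list = field(default_factory=list)

    def update(self, detections):
        self.detections = detections
        self.last_update = time.monotonic()

    def healthy(self):
        return (time.monotonic() - self.last_update) < self.max_age_s

def fuse(feeds):
    """Merge detections from all healthy sensors and report any that were skipped."""
    fused, degraded = [], []
    for feed in feeds:
        if feed.healthy():
            fused.extend(feed.detections)
        else:
            degraded.append(feed.name)   # candidate for cleaning, defogging or service
    return fused, degraded

# Example: the camera feed has gone quiet, but LiDAR and radar still cover the scene.
lidar, camera, radar = SensorFeed("lidar"), SensorFeed("camera"), SensorFeed("radar")
lidar.update([("pedestrian", 12.0)])
radar.update([("vehicle", 30.0)])
objects, offline = fuse([lidar, camera, radar])    # offline == ["camera"]
```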
- Look Mom, no hands: The first person behind the wheel for a demonstrated autonomous driving test in snow is an astrophysics major who never dreamed he'd be in a self-driving car.
Before Wayne Williams joined Ford’s autonomy team, he worked on remote sensing technology on behalf of the federal government. A self-described “geek,” Williams was intrigued by autonomous vehicles. But he never envisioned one day being part of a team working to bring them to reality – let alone being behind the wheel of the auto industry’s first publicly demonstrated autonomous snow test. The mood in the car that day was all business, he recalls, with a coworker monitoring the computing system from the back seat. “Because of the extensive development work, we were confident the car would do exactly what we asked of it,” says Williams. “But it wasn’t until after the test that the achievement began to sink in.”