Introduction
Self-driving cars have long captured the imagination of motorists and engineers across the globe, and they have become an even bigger conversation topic since the advent of modern artificial intelligence algorithms. As the world moves closer to truly autonomous cars, questions and concerns remain about their abilities and limitations.
One of the most common questions relates to the performance of self-driving cars in inclement weather conditions. Here is a look at the latest developments.
How Weather Impacts Self-Driving Cars
Any driver knows that weather has a huge impact on road safety. Dry, clean roads are certainly easier for human drivers to navigate than rain-covered or even flooded streets. Anyone who has ever had to drive in snowy conditions knows the chore of scraping car windows and the need to watch for ice patches.
But what does the weather do to self-driving vehicles? To answer this question, we need to take a closer look at how self-driving cars work. Self-driving cars navigate with the help of sensors that position the car in relation to the objects around it, including road signs, road markings, other cars, and pedestrians. In effect, the sensors are the car's eyes.
This is where the limitations start. In snowy conditions, sensors may be covered by snow and ice, rendering them effectively blind. The principle is similar to the reversing sensors in most modern cars: if they are covered by mud, dirt, or snow, they simply cannot detect anything. Without a reliable picture of its surroundings, an autonomous car can be unsafe.
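In software terms, a perception stack needs some way to notice that a sensor is no longer delivering usable data. The sketch below is purely hypothetical, with made-up field names and thresholds, but it captures the idea of flagging a covered sensor:

```python
# Hypothetical sketch: flag a snow- or mud-covered sensor by the share of
# valid returns it produces. Field names and the threshold are illustrative,
# not taken from any production stack.
from dataclasses import dataclass

@dataclass
class SensorFrame:
    total_beams: int    # beams the sensor emitted this frame
    valid_returns: int  # beams that came back as a usable echo

def is_blocked(frame: SensorFrame, min_valid_ratio: float = 0.3) -> bool:
    """Treat the sensor as blocked when too few beams produce usable returns."""
    return frame.valid_returns / frame.total_beams < min_valid_ratio

# A snow-covered sensor produces almost no usable returns:
print(is_blocked(SensorFrame(total_beams=1000, valid_returns=120)))  # True
```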
What is Road Risk?
According to the American Meteorological Society, more than 200 million cars and trucks use America’s national highway system. Their safety and the safety of their passengers are strongly affected by adverse weather conditions. The organization estimates that roughly 7,000 highway deaths and approximately 800,000 injuries are in some way caused by driving in poor weather conditions.
Inclement weather includes fog, rain, sleet, and snow. Beyond compromising safety, these conditions also reduce roadway capacity and the highway system's efficiency.
The U.S. Department of Transportation’s Federal Highway Administration (FHWA) divides the impact of adverse weather into three areas: the road environment itself, the flow of traffic, and road operations. In addition to rain, snow-covered roads, and fog, the FHWA also takes wind speed and pavement temperature into account when assessing road safety. Excessively high or low pavement temperatures can both lead to pavement damage.
Road risks affect driver behavior and vehicle performance. Strong winds can destabilize a car and render roads inaccessible due to falling trees, for example. Rain, sleet, snow, and fog all limit visibility and can greatly increase the distance a car needs to stop. Mitigating these risks often means lowering speed limits, although restricted visibility can make it harder for drivers to read road signs and adhere to the new limits. In short, inclement conditions increase road risk.
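The effect on stopping distance alone is dramatic. A simple physics model puts braking distance at d = v² / (2μg), where μ is the tire-road friction coefficient; the sketch below plugs in rough textbook ballparks for μ, not measured values:

```python
# Back-of-the-envelope braking distances from the friction model
# d = v^2 / (2 * mu * g). Friction coefficients are rough textbook
# ballparks, not measurements of any specific surface.
G = 9.81  # gravitational acceleration, m/s^2

def braking_distance_m(speed_kmh: float, mu: float) -> float:
    """Distance needed to brake from speed_kmh to a stop on a surface with friction mu."""
    v = speed_kmh / 3.6  # km/h -> m/s
    return v ** 2 / (2 * mu * G)

for surface, mu in [("dry asphalt", 0.7), ("wet asphalt", 0.4), ("packed snow", 0.15)]:
    print(f"{surface}: {braking_distance_m(100, mu):.0f} m from 100 km/h")
# dry asphalt: 56 m, wet asphalt: 98 m, packed snow: 262 m
```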
Also Read: How is AI Improving Weather Forecasting?
Optimize for Weather Along Any Route
Traffic management aims to optimize roads for the weather. Adjusting speed limits and opening or closing lanes in response to conditions all help optimize driving for traditional and self-driving cars alike.
The goal is to increase safety for all vehicles, autonomous or not, even in the worst conditions.
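As a toy illustration, a variable-speed-limit rule can be as simple as a lookup from weather condition to a reduction factor; the conditions and multipliers below are made up:

```python
# Illustrative only: a toy variable-speed-limit rule of the kind traffic
# management systems apply. The conditions and multipliers are made up.
BASE_LIMIT_KMH = 110

ADJUSTMENTS = {   # weather condition -> multiplier on the base limit
    "clear": 1.0,
    "rain": 0.8,
    "fog": 0.6,
    "snow": 0.5,
}

def posted_limit_kmh(condition: str) -> int:
    """Speed limit to post for the current weather; unknown conditions get the most cautious cut."""
    return round(BASE_LIMIT_KMH * ADJUSTMENTS.get(condition, 0.5))

print(posted_limit_kmh("fog"))  # 66
```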
Better Sensors
Like human drivers, self-driving cars struggle in difficult weather. Their limitations stem from sensors that struggle to recognize close-range objects, lane lines, and road signs in torrential downpours. Snow cover can obscure lane markings entirely, adding to the difficulties of self-driving cars.
To move autonomous technology from imagination to reality, sensors need to get better at handling difficult weather. They need to recognize lane markers even in fog and apply 3D geometry to deliver a full 360-degree picture of the surroundings.
LiDAR
Most autonomous cars today use a combination of radar imaging and a laser-based system called LiDAR. Both are intended to aid visibility and navigation, which they do, though not without limitations. Ultrasonic sensors are another option, mainly at close range.
LiDAR stands for light detection and ranging. Its applications range far beyond driving: the technology is used in surveying, geology, seismology, and laser guidance, among other fields. NASA's Mars helicopter Ingenuity used a LiDAR altimeter to track the terrain beneath it.
LiDAR works by bouncing ultraviolet, visible, or near-infrared light off objects to create images of them. In self-driving cars, the lasers target the objects surrounding the car and measure the time it takes each pulse to return to its source; because light travels at a known speed, that round-trip time translates directly into distance.
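That time-of-flight arithmetic is simple enough to show directly; the 200-nanosecond example value below is ours, not from any specific sensor:

```python
# The arithmetic behind LiDAR's time-of-flight ranging: a pulse travels to
# the target and back, so range = (speed of light * round-trip time) / 2.
C = 299_792_458.0  # speed of light, m/s

def lidar_range_m(round_trip_s: float) -> float:
    """Distance to the object that reflected the pulse."""
    return C * round_trip_s / 2

# A return after 200 ns corresponds to an object roughly 30 m away:
print(f"{lidar_range_m(200e-9):.1f} m")  # 30.0 m
```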
The concept dates back to the 1930s and has been updated and developed ever since. NASA considers LiDAR a key technology for supporting future lunar landings. In autonomous vehicles, LiDAR helps detect and avoid obstacles and navigate safely through a variety of environments.
Also Read: What is LiDar? How is it Used in Robotic Vision?
3D Imaging
A LiDAR sensor creates high-resolution 3D images of a driverless vehicle's surroundings. On clear days, the technology works extremely well. Its biggest downfall is its inability to see through fog, dust, rain, or snow.
Using LiDAR alone, a self-driving car cannot see well in bad weather. Scientists at the University of California San Diego believe LiDAR by itself is not the solution. According to the researchers, LiDAR is little more than a rotating laser looking for objects to bounce off. In rain or thick fog, the beam reflects off water droplets in the air, creating a faulty image of the car's surroundings.
On its own, the technology cannot provide the perception a car needs to drive safely. Helping self-driving cars navigate in inclement weather, when they cannot sense the road beneath them or the objects around them, remains challenging.
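One common class of mitigation is filtering out the weak, close-range returns that fog backscatter produces before the point cloud is interpreted. The sketch below is hypothetical, with illustrative thresholds:

```python
# Hypothetical sketch of filtering fog backscatter from a LiDAR point cloud:
# fog tends to produce weak, close-range returns, so drop points that are
# both near the sensor and weakly reflected. Thresholds are illustrative.
from typing import NamedTuple

class Point(NamedTuple):
    x: float
    y: float
    z: float
    intensity: float  # return strength, normalized to 0..1

def drop_fog_returns(cloud: list[Point],
                     max_fog_range_m: float = 8.0,
                     min_intensity: float = 0.15) -> list[Point]:
    """Keep a point unless it is both close to the sensor and weakly reflected."""
    def keep(p: Point) -> bool:
        dist = (p.x ** 2 + p.y ** 2 + p.z ** 2) ** 0.5
        return dist > max_fog_range_m or p.intensity >= min_intensity
    return [p for p in cloud if keep(p)]

cloud = [Point(2.0, 1.0, 0.5, 0.05),   # near and weak: likely fog, dropped
         Point(20.0, 3.0, 0.8, 0.60)]  # far and strong: a real object, kept
print(len(drop_fog_returns(cloud)))    # 1
```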
Radar Imaging Through Fog
The UC San Diego team of electrical engineers, however, believes it is getting closer to a solution. Its goal is to improve the imaging capability of existing radar sensors, effectively creating a LiDAR-like radar that delivers accurate estimates of the shape and size of objects in the car's “field of vision.”
Radar transmits radio waves and can see in any weather, including through thick fog. Compared with laser-based technologies, however, it has one disadvantage: imaging radar captures only a partial picture of the road surrounding a vehicle. The goal of the UC San Diego researchers is to improve the way radar sees.
They are attempting to combine the strengths of LiDAR with the benefits of radar. Radar has another advantage, too: as a mature, mass-produced technology, it is far cheaper than LiDAR. If the team succeeds in combining the two, prospective self-driving car buyers may see the benefit in reduced prices and increased safety.
To improve radar's vision, the scientists place two radar sensors on the hood of the car instead of relying on one. Doubling up allows the system to observe a larger space and register more detail than a single sensor could. In some ways, it is as simple as saying two eyes see more than one.
The UC San Diego scientists want these sensors to deliver rich data. It is not enough for a radar frame to notice a car approaching the autonomous vehicle; the sensors should also detect the approaching vehicle's speed, its dimensions, and its position relative to the sensors. By recording a wide variety of data points, the scientists hope to create a solution that is both safe and cost-effective.
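Speed, in particular, falls out of the Doppler effect: an object moving toward the sensor shifts the frequency of the returned wave. The relation is simple; 77 GHz is a typical automotive radar band, and the shift value below is just an example:

```python
# The Doppler relation behind radar speed measurement: a target moving toward
# the sensor shifts the returned frequency, and v = (delta_f * c) / (2 * f0).
C = 299_792_458.0  # speed of light, m/s

def radial_speed_ms(freq_shift_hz: float, carrier_hz: float = 77e9) -> float:
    """Relative speed toward the sensor implied by a Doppler frequency shift."""
    return freq_shift_hz * C / (2 * carrier_hz)

# A shift of about 10.3 kHz at 77 GHz corresponds to roughly 20 m/s (72 km/h):
print(f"{radial_speed_ms(10_300):.1f} m/s")  # 20.1 m/s
```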
Using a single radar sensor limits the amount of data it receives: only a fraction of the radio waves it transmits bounce back from the object it is “seeing”, too few to estimate exact dimensions. A second sensor captures additional reflections and fills in the picture, as the sketch below illustrates.
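As a hypothetical sketch of the "two eyes" idea, consider two sensors at known mounting positions, each reporting detections in its own frame; translating both into the car's frame and merging them gives more points on the same target (all numbers made up):

```python
# Hypothetical sketch: two radar sensors at known mounting positions each
# report detections in their own frame; translating both into the car frame
# and merging yields more points on the same object than one sensor alone.
Point2D = tuple[float, float]

def to_car_frame(detections: list[Point2D], mount: Point2D) -> list[Point2D]:
    """Shift sensor-relative detections by the sensor's mounting position."""
    mx, my = mount
    return [(x + mx, y + my) for x, y in detections]

left_mount = (-0.5, 2.0)   # half a meter left of center, 2 m forward
right_mount = (0.5, 2.0)

left_hits = [(3.2, 14.9), (3.4, 15.1)]   # each sensor catches reflections
right_hits = [(2.3, 15.0), (2.1, 15.2)]  # from a different part of the car ahead

merged = (to_car_frame(left_hits, left_mount)
          + to_car_frame(right_hits, right_mount))
print(merged)  # four points spanning the target instead of a single edge
```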
Promising Initial Tests
Test drives on clear days and nights showed that the radar-based approach performed just as well as its LiDAR equivalent at determining another car's dimensions. The real challenge, though, was assessing the radar sensors in bad conditions.
To simulate challenging conditions, the scientists added foggy weather to their test scenario. Their car performed just as well as it had in perfect weather and delivered a 3D image of the approaching vehicle. The LiDAR car, on the other hand, effectively failed the test, being unable to make out the second car through the fog.
The team at UC San Diego believes its solution could not only work but also surpass human vision. Radar sensors let the car see through fog and snow, which would put autonomous vehicles not just on par with human drivers but ahead of them.
Also Read: Autonomous Cars: How do Self-Driving Cars Actually Work?
Conclusion
Self-driving cars are no longer a figment of the imagination of artificial intelligence enthusiasts. As sensor technology develops, including radar- and laser-based approaches, autonomous driving is becoming more reliable, no matter the weather.