Can ADAS Aid Us on a Dark and Stormy Night?
--
It’s one thing to ridicule Elon Musk for realizing, belatedly, that real-world AI is easier preached than done. Last week, in a tweet, Musk wrote: “Generalized self-driving is a hard problem, as it requires solving a large part of real-world AI. Didn’t expect it to be so hard, but the difficulty is obvious in retrospect. Nothing has more degrees of freedom than reality.”
Shading the truth about highly automated vehicles’ capabilities isn’t Tesla’s hallmark alone. Most carmakers have yet to admit that, despite the many sensors installed in their advanced driver assistance systems (ADAS) and autonomous vehicles (AVs), those vehicles remain vulnerable when driving in the dark, into the sun, or in fog, heavy rain or snow.
“When the sun is up and birds are singing, yes, autonomous vehicles can mostly drive themselves,” said Eyal Levi, co-founder, executive vice president, business development & product at Brightway Vision Ltd. (Haifa, Israel). But a lot of driving routinely occurs after sunset or in bad weather.
AV companies claim to have already nailed highly automated driving. However, “we know that those same companies are all still looking for solutions” to cope with stormy weather, said Levi.
Whether a car is driven by a human or a machine, many challenges trip up drivers, among them the element of surprise when an emergency crops up suddenly. Bad weather and the dark of night, however, are hardly a surprise.
Demonstrations and presentations offered by technology suppliers at the “Destination ACM” event last month in Detroit made clear that the next frontier in the race to autonomy is all about technologies that can improve perception and expand the operational design domain (ODD), said Phil Magney, founder and president of VSI Labs. In his opinion, advancements in perception hold the key to steadily improving ADAS, “which will last over the next 20 to 25 years.”
Here are a few companies that strutted their stuff.
Brightway Vision
Brightway Vision (Haifa, Israel) offers a gated vision camera system with a near-infrared (NIR) CMOS image sensor and a pulse-based illuminator.
The ten-year-old startup, spun out of Elbit Systems — one of the largest defense electronics companies in Israel — has developed an “all weather automotive camera system,” poised to be automotive-qualified by mid-2022.
The system, called Visdom, is no ordinary camera. It comes with a “smart flash” and a CMOS image sensor with a built-in “smart accumulator.” The setup supports integration of reflected light from multiple illumination pulses, according to Brightway Vision’s Levi. What makes the flash IR sensor smart is “its ability to place a specific amount of light at each range,” he said, “without affecting other ranges.”
This brings two major benefits, noted Levi. First, “it creates an eerily uniform lighting across all ranges — looking more like a daytime image than a standard active-illumination night image.” Second, “the use of sophisticated control of the pulse illuminator makes it possible to effectively eliminate backscatter.” Backscatter is the reason a vehicle’s main headlights can’t penetrate fog.
Brightway Vision uses a vertical-cavity surface-emitting laser (VCSEL) as its illuminator. Unlike typical lidars, which use VCSELs to measure distance, Brightway Vision deploys the VCSEL for “collecting light,” Levi said.
By pulsing the shutter thousands of times per frame, the camera captures “slices” of an image and accumulates the multiple exposures — uniformly lit across all ranges — into a single image. The camera “combines these slices of images in the analog domain,” noted Levi. One interesting use of slices is that they allow “background removal” at any desired range. The result is significantly improved image contrast, enabling AI and humans alike to detect objects much better, Levi claimed.
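Conceptually, the gate timing works like this: light reflected from a chosen range window takes a known round trip, so the shutter opens and closes at delays derived from the slice boundaries. The sketch below is a toy illustration of that idea, not Brightway Vision code; every name and number in it is an assumption.

```python
# Toy illustration of gated ("sliced") imaging; not Brightway Vision code.
# Assumption: the camera opens its shutter at a fixed delay after each
# illumination pulse, so only light reflected from a chosen range window
# ("slice") is collected, and many pulses are accumulated into one frame.

C = 3.0e8  # speed of light, m/s

def gate_delay(range_m: float) -> float:
    """Round-trip time (s) for light reflected from an object at range_m."""
    return 2.0 * range_m / C

def slice_schedule(pulses_per_frame: int, slices: list[tuple[float, float]]):
    """Shutter open/close timings per slice; pulses are split among slices.

    Light scattered back from fog closer than a slice's near boundary
    arrives before the gate opens, so it is never integrated.
    """
    per_slice = pulses_per_frame // len(slices)
    return [{"slice_m": (near, far),
             "open_ns": gate_delay(near) * 1e9,
             "close_ns": gate_delay(far) * 1e9,
             "pulses": per_slice}
            for near, far in slices]

# Example: thousands of pulses per frame split across three range slices.
for gate in slice_schedule(3000, [(10, 50), (50, 120), (120, 250)]):
    print(gate)
```

Because backscatter from fog or rain arrives before the gate opens, it is never integrated into the frame, which is the effect Levi describes.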
Brightway Vision, nonetheless, believes its camera is the “ultimate” complementary automotive sensor to the color cameras in ADAS cars or AVs. Many experts agree that cameras are the key sensor for autonomous driving, but regular cameras fall short in low light and bad weather, Levi noted.
Visdom will change that equation, claimed Levi, because it can offer “the same level of performance at low visibility as in clear weather.” The main functional shortcoming of Brightway Vision’s Visdom camera is that it cannot provide color.
“There will be no single-sensor solution for AVs,” said Levi. The company’s CMOS-based NIR gated sensors can also be used “as redundancy or verification” of objects detected but not classified by other types of sensors. Assume, for example, that a radar picks up a blip on the road. Brightway Vision’s cameras can shed more light on that blip — captured on a slice. Given that the API for its camera is open, communication with other sensors is possible.
VSI Labs’ Magney, whose team has been testing Brightway technology, noted that the startup’s Visdom “shows just how good a camera can be in challenging conditions.” His team has had on-the-road experience in a vehicle equipped with Visdom when the car was suddenly engulfed in torrential rain. Thus far, he said, “nothing is better” under such circumstances than this camera. However, he added, Visdom is “more complex than a regular camera since it requires emitters.”
The Visdom camera system requires carmakers to deploy both a camera and illuminators, which adds integration complexity. Some OEMs and Tier Ones consider it “challenging,” acknowledged Levi.
However, the system offers some integration advantages. The Visdom camera can be placed behind the windscreen, just like any other CMOS image sensor. That means carmakers do not need to worry about cleaning and cooling the camera, or about minor fender benders damaging it. Moreover, the illuminators can be located in the car’s headlights, which both protects and conceals them.
That said, the uniqueness of Brightway Vision’s technology could make Visdom a hard sell to some risk-averse OEMs and Tier Ones. They hesitate when they find out that Brightway Vision is the only company combining a custom-made CMOS image sensor and a pulse-based illuminator.
Nira Dynamics
Meanwhile, Nira Dynamics (Linköping, Sweden), also at Destination ACM to tout its technology, unveiled software that can presumably better prepare drivers (machine or human) of AVs and ADAS-equipped vehicles for sudden changes in road conditions in adverse weather.
Nira is a world leader in indirect tire-pressure monitoring systems. Nira’s software — installed in many vehicles — dynamically monitors tire grip and tire pressure and detects loose wheels. Fusing information from anti-lock braking systems (ABS), wheel microslip, accelerometers and steering angles, Nira’s software can estimate friction with the road, collect data on road moisture and send it to car OEMs’ clouds. From there, Nira anonymizes the data and sends alerts about upcoming road conditions back to the driver.
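As a rough illustration of the underlying idea (not Nira’s algorithm; the signal names and thresholds here are invented), low-friction surfaces reveal themselves as elevated wheel slip for a given traction force:

```python
# Rough sketch of slip-based grip estimation; not Nira's algorithm.
# Assumptions: per-wheel speeds come from the ABS wheel-speed sensors,
# vehicle speed from a fused reference, and the thresholds are invented.

def wheel_slip(wheel_speed_mps: float, vehicle_speed_mps: float) -> float:
    """Longitudinal slip ratio: ~0 when rolling freely, ~1 when spinning."""
    if vehicle_speed_mps < 1.0:   # avoid dividing by near-zero at standstill
        return 0.0
    return (wheel_speed_mps - vehicle_speed_mps) / vehicle_speed_mps

def classify_grip(slip: float, normalized_force: float) -> str:
    """Crude grip label from the force-to-slip ratio ("slip stiffness")."""
    if normalized_force < 0.05:
        return "unknown"          # too little excitation to judge grip
    stiffness = normalized_force / max(slip, 1e-6)
    return "high grip" if stiffness > 20.0 else "low grip: possible ice/water"

# Example: 3% slip while applying only 10% of available traction force
# suggests the tire is slipping too easily for the force involved.
print(classify_grip(wheel_slip(20.6, 20.0), 0.10))  # low grip: possible ice/water
```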
Magney noted, “Nira’s dynamic algorithm will enable AVs to operate in slippery conditions. It also works well for ADAS.”
An advantage is that all vehicles — ADAS, AV and otherwise — can use data collected by Nira without adding yet another sensor devoted to safety, explained Gregory Weber, business development manager at Nira.
While knowledge about road conditions enables safer driving, Nira also sees a growing opportunity in offering its data to municipalities. States or cities can in turn use it to make road maintenance more environmentally friendly and to manage such tasks as snow plowing and pothole mending more efficiently. Once Nira’s road information is overlaid atop weather information, it could also help trucking companies plan routes, Weber added.
AEye
AEye believes that its lidar is in a class of its own because it has a “bistatic lidar architecture.” Under the patented architecture, the transmission and reception channels for the lasers are kept separate. AEye’s chief marketing officer, Stephen Lambright, claimed that his company’s iDAR can bring “more resolution where it is needed.”
As each laser pulse is transmitted, the receiver is told where and when to look for its return. This makes it possible to introduce “deterministic artificial intelligence into the sensing process at the point of acquisition.” Describing the process as “biomimicry,” AEye explained that it allows iDAR to focus on what matters most in a vehicle’s surroundings.
One known issue with lidar sensors, although not broadly acknowledged, is that their performance degrades in rain. If a lidar beam intersects a raindrop close to the transmitter, the drop can reflect enough of the beam back to the receiver to register the rain as an object. The droplets can also absorb some of the emitted light, degrading the sensor’s effective range.
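A common class of mitigation is to reject returns that arrive too soon after the pulse to be a real obstacle. The sketch below illustrates that idea generically; it is not AEye’s implementation, and the blanking distance is an assumption:

```python
# Illustrative near-range rejection for lidar returns; not AEye's code.
# Assumption: each return is (time_of_flight_s, intensity), and raindrops
# near the aperture produce returns at very short times of flight.

C = 3.0e8           # speed of light, m/s
MIN_RANGE_M = 2.0   # assumed "blanking" distance in front of the sensor

def tof_to_range(tof_s: float) -> float:
    """Convert a round-trip time of flight to a one-way range in meters."""
    return C * tof_s / 2.0

def filter_near_returns(returns: list[tuple[float, float]]):
    """Drop returns closer than MIN_RANGE_M (likely spray or raindrops)."""
    return [(tof, amp) for tof, amp in returns
            if tof_to_range(tof) >= MIN_RANGE_M]

# Example: a raindrop return at ~0.5 m and a car return at ~75 m.
returns = [(3.3e-9, 0.9), (5.0e-7, 0.4)]
print([round(tof_to_range(t), 1) for t, _ in filter_near_returns(returns)])  # [75.0]
```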
Demonstrating iDAR at the American Center for Mobility (ACM), AEye even prepared a rainmaking machine to show off the sensor’s object-detection capability in wet weather. As things turned out, it actually rained on AEye’s parade, helping iDAR prove its point.
AEye’s customers can create a library of deterministic, software-configurable scan patterns at design time, each addressing a specific use case, according to the company. In addition to creating different scan patterns for highway, urban and suburban driving, or an “exit ramp” pattern, AEye explained that customers can create scan patterns for those same driving environments optimized for bad weather — a “highway rain” scan pattern versus a “highway sunlight” scan pattern, for example.
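In spirit, such a library might look like the sketch below. It is purely illustrative: the ScanPattern fields, the (environment, weather) keys and all of the numbers are assumptions, not AEye’s actual API.

```python
# Hypothetical scan-pattern library; illustrative only, not AEye's API.
# Assumption: a pattern trades field of view, frame rate and point density,
# and one is selected at runtime from an (environment, weather) key.

from dataclasses import dataclass

@dataclass(frozen=True)
class ScanPattern:
    name: str
    h_fov_deg: float       # horizontal field of view
    frame_rate_hz: float
    points_per_deg2: int   # angular point density in the region of interest

SCAN_LIBRARY = {
    ("highway", "clear"): ScanPattern("highway_sunlight", 30.0, 200.0, 1600),
    ("highway", "rain"):  ScanPattern("highway_rain",     30.0, 100.0, 3200),
    ("urban",   "clear"): ScanPattern("urban_clear",      90.0, 100.0,  800),
    ("urban",   "rain"):  ScanPattern("urban_rain",       90.0,  50.0, 1600),
}

def select_pattern(environment: str, weather: str) -> ScanPattern:
    """Pick a pre-designed pattern; fall back to a wide default if missing."""
    return SCAN_LIBRARY.get((environment, weather),
                            ScanPattern("default_wide", 120.0, 30.0, 400))

print(select_pattern("highway", "rain").name)  # highway_rain
```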
Earlier this year, VSI Labs oversaw the testing of AEye’s iDAR system.
VSI Labs confirmed that:
1) The AEye sensor could detect targets with a substantial number of points from farther than 1,000 meters without compromising frame rate.
2) AEye’s sensor produced more than 175 points on an 18% reflective target from 200 meters, with 1,600 points per square degree.
3) iDAR scanned at better than 200 Hz for the full frame and field of view (FOV).
4) iDAR senses effectively behind windshield glass at various angles.
Asked about iDAR, Magney summed up, “It is pretty cool because it can see up to a kilometer, and the iDAR can operate behind the windscreen.” He added, “Long-range lidar is going to be important for trucking especially.”
— Junko Yoshida, the former bureau chief and editor-in-chief of EE Times and, most recently, global editor-in-chief of AspenCore, is now an independent technology and business journalist.