A set of sensors detects small changes in shadows on the ground, effectively allowing a vehicle to see what's coming around corners

MIT engineers have developed a system for autonomous vehicles that senses tiny changes in shadows on the ground to determine if there’s a moving object coming around the corner (Credit: MIT)


Autonomous cars could soon outperform human drivers when it comes to hazard perception by “seeing around corners”, thanks to new sensor technology developed by MIT engineers.

The ShadowCam system interprets changes in shadows to provide an early warning identifying unseen people or cars moving out from behind a building, corner or stationary vehicle.

Researchers claim experiments, using a self-driving car in a car park and an autonomous wheelchair moving through hallways, showed the technology outperforming LiDAR sensors by more than half a second.

MIT Computer Science and Artificial Intelligence Laboratory director Daniela Rus said: “For applications where robots are moving around environments with other moving objects or people, our method can give the robot an early warning that somebody is coming around the corner, so the vehicle can slow down, adapt its path, and prepare in advance to avoid a collision.

“The big dream is to provide ‘X-ray vision’ of sorts to vehicles moving fast on the streets.”


How does ShadowCam help cars see around corners?

The new technology builds on previous computer-vision techniques developed by MIT professors William Freeman and Antonio Torralba.

Live video is analysed frame by frame, allowing the computer-vision system to detect changes in light intensity from one image to the next that would be invisible to the naked eye.

This helps the autonomous vehicle determine whether an object hidden behind the corner is approaching or moving away, and how fast it is travelling.

The autonomous vehicle can then make a suitable adjustment to its own speed and trajectory in preparation for the object emerging from around the corner.
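As a rough illustration of the idea, the sketch below (not MIT's actual ShadowCam code) amplifies tiny frame-to-frame intensity differences in a crude ground-level region of the image and raises an early warning when they exceed a threshold; the video source, gain and threshold values are all hypothetical.

```python
# A rough sketch, not MIT's ShadowCam implementation: amplify tiny
# frame-to-frame intensity changes on the ground and raise an early
# warning when they exceed a threshold. The video source, gain and
# threshold are illustrative assumptions.
import cv2
import numpy as np

GAIN = 30.0        # amplification for barely-visible intensity changes (assumed)
THRESHOLD = 4.0    # amplified mean difference treated as "something moving" (assumed)

def shadow_motion_score(prev_gray, curr_gray):
    """Mean amplified intensity change in the lower (ground-level) half of the frame."""
    diff = cv2.absdiff(curr_gray, prev_gray).astype(np.float32)
    ground = diff[diff.shape[0] // 2 :, :]      # crude stand-in for the ground region
    return float(ground.mean() * GAIN)

cap = cv2.VideoCapture("corner_camera.mp4")     # hypothetical forward-facing camera feed
ok, frame = cap.read()
prev_gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (5, 5), 0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (5, 5), 0)
    if shadow_motion_score(prev_gray, gray) > THRESHOLD:
        print("Possible moving shadow ahead - slow down")   # early-warning hook
    prev_gray = gray

cap.release()
```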

By actively predicting what is coming, ShadowCam achieves something that traditional autonomous-vehicle sensors, such as video cameras and LiDAR, cannot, since those only register objects directly in their line of sight.

The research, funded by the Toyota Research Institute and presented in a paper at the International Conference on Intelligent Robots and Systems, suggests the technology could eventually help robots negotiate busy hallways and autonomous delivery vans avoid pedestrians.

The paper’s lead author Felix Naser added: “By detecting that signal, you can then be careful. It may be a shadow of some person running from behind the corner or a parked car, so the autonomous car can slow down or stop completely.”

Previously, corners had to be marked with augmented-reality “AprilTags” — which are described as resembling simple QR codes — in order for the computer-vision system to know where to look.

As these identifiers would be impractical to implement in real-world situations, the researchers developed a new visual-odometry technique, similar to those used by the Mars rovers, that allows the computer-vision system to recognise corners on its own.
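The sketch below illustrates the general principle of visual odometry with OpenCV, estimating the camera's own motion from matched features in consecutive frames; it is not the specific method from the paper, and the intrinsic matrix K is an assumed placeholder for a real camera calibration.

```python
# A minimal visual-odometry sketch with OpenCV, showing the general idea of
# estimating the camera's own motion from matched features in consecutive
# frames; this is not the technique from the ShadowCam paper. The intrinsic
# matrix K below is an assumed placeholder, not a real calibration.
import cv2
import numpy as np

K = np.array([[700.0,   0.0, 320.0],
              [  0.0, 700.0, 240.0],
              [  0.0,   0.0,   1.0]])   # assumed pinhole camera intrinsics

orb = cv2.ORB_create(nfeatures=1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def relative_motion(img_prev, img_curr):
    """Estimate rotation R and translation direction t between two greyscale frames."""
    kp1, des1 = orb.detectAndCompute(img_prev, None)
    kp2, des2 = orb.detectAndCompute(img_curr, None)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:300]
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
    return R, t   # t has unit scale: absolute distance is not recoverable from one pair
```

Chaining such frame-to-frame motion estimates lets a system keep track of how its own camera has moved, so the image regions worth watching for shadows near a corner can be located without physical tags on the walls.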

Experiments on an autonomous wheelchair, which was tasked with avoiding people while turning hallway corners, achieved a 70% classification accuracy.

In a separate test, a driverless car navigating through a multi-storey car park was able to spot hazards with 86% accuracy, once ShadowCam had been tuned to the low-light conditions.

The next challenge for the researchers will be to make the system adapt to different indoor and outdoor lighting conditions and speed up the detection process.