AVs contain numerous dedicated devices known as sensors to assist with self-driving navigation. A sensor monitors the physical environment, responding to different stimuli by producing different electrical signals.
AVs cannot rely on any single sensor to visualise the surrounding environment. Sensors operate and deliver data in different ways, and each has its own advantages and disadvantages. Combining them, however, lets a vehicle overcome automotive challenges safely and reliably.
Inside a LiDAR (light detection and ranging) device are multiple stacked semiconductors that fire lasers (tightly focused beams of photons). These semiconductors can be rotor mounted to provide a 360° field of view or, more commonly, solid-state and mounted to fire arrays over a specific field of view.
Once a laser pulse hits an object, its reflection returns and is picked up by an onboard infrared detector. The time from the emission of a pulse to its return (time-of-flight) determines how far away the object is. Poor weather can impede the capability of LiDAR, with fog or rain scattering the light and reducing the integrity of the data LiDAR can produce.
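The time-of-flight calculation itself is simple: the pulse travels to the object and back at the speed of light, so the one-way distance is half the round trip. A minimal sketch (the 200 ns example value is illustrative, not from a specific device):

```python
C = 299_792_458  # speed of light in a vacuum, m/s


def tof_distance(round_trip_s: float) -> float:
    """One-way distance in metres from a laser pulse's round-trip time."""
    return C * round_trip_s / 2


# A return after ~200 ns corresponds to an object roughly 30 m away.
print(round(tof_distance(200e-9), 2))  # → 29.98
```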
Also operating on the time-of-flight principle but less weather-dependent is radar (radio detection and ranging), which emits radio waves rather than lasers. Radar sensors are typically fixed, focusing radio waves in one direction. Although radar usually results in a lower resolution picture than LiDAR, radio waves have a longer wavelength than light, meaning they can travel further and thus detect objects at a greater distance.
Shifts in the frequency of the reflected radio waves (the Doppler effect) also reveal whether a detected object is moving and its relative speed. As an object approaches the radar, the reflected frequency increases as the radio waves become compressed; as an object moves away, the frequency decreases as the radio waves stretch out.
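Because the wave travels out and back, the shift is doubled, giving the standard two-way radar relation Δf = 2·v·f₀/c. The sketch below inverts that to recover relative speed; the 77 GHz carrier is typical of automotive radar, but the shift value is a made-up illustration:

```python
C = 299_792_458  # speed of light, m/s


def radial_speed(f_emitted_hz: float, f_shift_hz: float) -> float:
    """Relative (radial) speed from the two-way Doppler shift.

    A positive shift means the object is closing; negative means receding.
    """
    return C * f_shift_hz / (2 * f_emitted_hz)


# A 77 GHz radar observing a +5.13 kHz shift implies a closing speed
# of roughly 10 m/s (about 36 km/h).
v = radial_speed(77e9, 5130)
print(round(v, 1))  # → 10.0
```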
Radar sensors are often mounted on the front, sides and rear of a vehicle, but the ability to detect objects at a distance and determine relative movement makes front-facing radar a particularly compelling choice for forward-collision warning, collision avoidance and adaptive cruise control.
We have explored how an AV can map surroundings with LiDAR and radar alone to determine objects' location and relative movement. However, identifying the surrounding objects and determining where the AV is remains restricted without additional input. Cameras can provide an AV with the means to recognise and track objects in the environment and basic positioning.
When objects are identified and used as control points, changes across successive captured images can provide relative positioning as the AV travels through 3D space. By identifying known surveyed targets such as landmarks, cameras can also yield an absolute position in an environment. A camera's usefulness depends on the availability of visible objects: featureless surroundings, or environments severely impeded by poor lighting or local weather, can hamper its effectiveness.
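One way to see how successive images yield relative positioning is the pinhole camera model: an object of known size appears larger as the vehicle approaches, and the change in apparent size gives the distance travelled. This is a simplified sketch, not a full visual-odometry pipeline; the focal length, sign height and pixel measurements are hypothetical values chosen for illustration:

```python
def pinhole_distance(focal_px: float, real_height_m: float,
                     pixel_height: float) -> float:
    """Distance to an object of known real-world height, pinhole model:
    distance = focal_length * real_height / apparent_height."""
    return focal_px * real_height_m / pixel_height


# Hypothetical control point: a 2 m road sign imaged by a camera with a
# 1000 px focal length. It spans 100 px in one frame and 200 px in the next.
d1 = pinhole_distance(1000, 2.0, 100)  # 20.0 m away in frame 1
d2 = pinhole_distance(1000, 2.0, 200)  # 10.0 m away in frame 2
print(d1 - d2)  # the AV moved ~10.0 m towards the sign between frames
```

Real systems track many such control points at once and solve for the full 6-degree-of-freedom camera motion, but the principle is the same: known geometry plus observed image change yields relative position.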