360° sensors give autonomous cars vision.



The concept of the autonomous vehicle has been around for decades: in 1977, the Tsukuba Mechanical Engineering Laboratory in Japan built a prototype that drove a special course, following white markings, at up to 30 km/h, while in the 1980s Mercedes-Benz unveiled a vision-guided van that achieved speeds of up to 100 km/h on streets with no traffic.

However, the common vision of a fully autonomous vehicle is one capable of sensing its environment and navigating without the need for driver intervention. Such a vehicle needs to “see” its surroundings, which can be achieved using sensors such as radar, LiDAR, GPS and computer vision systems.
First-tier Advanced Driver Assistance Systems (ADAS) supplier Continental predicts that fully autonomous passenger vehicles will be driving on public roads by 2025.

Continental speculates that autonomy will initially be implemented through the partial automation of construction vehicles using lateral collision avoidance, before being used on highly automated and, finally, fully autonomous passenger vehicles.

Key to the fully autonomous vehicle are the various sensors that gather information about the environment, to produce a 360° image of the immediate surroundings.

Currently, 360° sensors such as Long Range and Short Range Radar (LRR and SRR) and Light Detection and Ranging (LiDAR) are used in ADAS functions such as adaptive cruise control and autonomous emergency braking. These sensors will be pivotal to fully autonomous driving in the future.


Higher frequency bands to improve radar imaging.

In order to provide these essential car safety functions, systems must be able to clearly distinguish between objects on the road and their surroundings. When using radar, the 24 GHz and 76 GHz narrowband frequency ranges commonly used at present do not provide enough definition.

A wider bandwidth ensures enhanced resolution and better object discrimination, both essential for functions such as autonomous emergency braking and pedestrian detection in busy urban environments.

In Europe the frequency band 77-81 GHz (79 GHz) has been allocated to vehicle radar. To speed up worldwide harmonized frequency allocation of this bandwidth, the Coordination and Support Activity (CSA) 79 GHz, coordinated by Robert Bosch, began canvassing authorities across the globe in 2011, hoping to achieve a single standard by June 2014.

The performance gains 79 GHz sensors hold over lower-frequency systems include:

•    Better target discrimination is possible with the 4 GHz (77-81 GHz) bandwidth: multiple objects can still be distinguished even if they appear in the same range gate, where a narrower bandwidth would merge them into a single virtual object.
•    The higher the bandwidth, the better the spatial resolution. Spatial resolution, which allows an exact location to be determined, is directly linked to the available bandwidth and is essential for safety-critical applications, as it reduces unjustified system responses and false alarms (see the sketch after this list).
•    Another interesting benefit is that one technology can be used across applications. Currently, automotive radar systems use different frequency ranges for different applications. By using a common 77 GHz to 81 GHz technology platform, a holistic and flexible system can be engineered, saving on system development and facilitating multi-functional use of individual sensors.
•    79 GHz radar sensors are also much smaller than 24 GHz units. HF circuit structures and antenna sizes are directly dependent on the wavelength used: the higher the operational frequency, the smaller the total size of the radar device. The relationship is linear, and comparing 24 GHz with 79 GHz gives a size reduction of roughly a factor of three.
•    Finally, the risk of mutual interference is low due to the lower emission power. Compared with 77 GHz long range radar (and also the 24 GHz narrowband devices), the average emission power at 79 GHz is three orders of magnitude lower and the interference distance is more than one order of magnitude shorter, resulting in more robustness and a higher interference margin.
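To make the bandwidth and wavelength arguments above concrete, here is a minimal back-of-the-envelope sketch in Python using the standard relations for range resolution (ΔR = c/2B) and free-space wavelength (λ = c/f). The ~200 MHz usable bandwidth assumed for a 24 GHz narrowband sensor is illustrative rather than taken from any specific product.

```python
# Relations behind the bullet points above: range resolution
# scales with 1/bandwidth, antenna size with wavelength.
C = 299_792_458.0  # speed of light in m/s

def range_resolution_m(bandwidth_hz: float) -> float:
    """Theoretical radar range resolution: delta_R = c / (2 * B)."""
    return C / (2.0 * bandwidth_hz)

def wavelength_m(frequency_hz: float) -> float:
    """Free-space wavelength: lambda = c / f."""
    return C / frequency_hz

# Assumed ~200 MHz usable bandwidth for a 24 GHz narrowband sensor
print(range_resolution_m(200e6))  # ~0.75 m
# The full 77-81 GHz band gives 4 GHz of bandwidth
print(range_resolution_m(4e9))    # ~0.04 m, roughly 20x finer
# Structures scale with wavelength: the ~3x size reduction above
print(wavelength_m(24e9) / wavelength_m(79e9))  # ~3.3
```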

The use of even higher frequency bands than the 79 GHz range (e.g. 122 GHz) is currently being investigated by the European Commission.

Since the European decision in 2004 to open the 79 GHz band for automotive short range radar, a number of projects have been undertaken. One such initiative, the Radar on Chips for Cars (RoCC) project, involves Daimler, BMW, Bosch, Infineon and Continental and seeks to reduce costs by focusing on the main cost drivers: the high frequency laminate, millimetre wave chips and the microprocessor itself. By advancing silicon-based radar technology, the project partners believe significant cost savings are possible.

Existing sensors are good enough to keep a vehicle perfectly centred in its lane, follow the curvature of a bend or maintain a constant distance from the car in front. But this is not enough for it to make complex ‘what if?’ decisions: even the best sensor technology and data fusion algorithms won’t suffice – vehicles need to learn situational awareness.

To achieve this, in January 2014 Ford initiated a project with Stanford University and the Massachusetts Institute of Technology to develop predictive functions for autonomous vehicles.

Ford’s global manager for driver assistance and active safety, Greg Stevens, commented: “Our goal is to provide the vehicle with common sense. Drivers are good at using the clues around them to predict what will happen next, and they know that what you can’t see is often as important as what you can. Our goal in working with Stanford and MIT is to bring a similar type of intuition to the vehicle.”

LiDAR for pinpoint mapping accuracy.

In the Ford research, a Fusion midsize sedan will use series-production radar, camera and ultrasonic sensors, but will also be equipped with four LiDAR sensors that scan the road 2.5 million times per second. The reflected light is used to generate a real-time 3D map of the surrounding environment up to 70 m around the vehicle.

The sensors can track anything dense enough to redirect light, whether stationary or moving – vehicles, pedestrians and cyclists alike. The sensors are so sensitive they can distinguish between a paper bag and a small animal from nearly a football field away.
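As a rough illustration of how such a scanner’s raw returns become a 3D map, the sketch below converts individual (azimuth, elevation, range) returns into Cartesian points, discarding anything beyond the 70 m radius quoted above. The sample returns and the frame convention are hypothetical, not details of Ford’s system.

```python
import math

MAX_RANGE_M = 70.0  # mapping radius quoted in the article

def return_to_point(azimuth_rad, elevation_rad, range_m):
    """Convert one LiDAR return (spherical coordinates) into a
    Cartesian point in the sensor frame: x forward, y left, z up."""
    if not 0.0 < range_m <= MAX_RANGE_M:
        return None  # discard returns outside the usable range gate
    horiz = range_m * math.cos(elevation_rad)
    return (horiz * math.cos(azimuth_rad),
            horiz * math.sin(azimuth_rad),
            range_m * math.sin(elevation_rad))

# Two hypothetical returns folded into the rolling 3D map
scan = [(0.0, 0.02, 35.0), (math.radians(90.0), 0.0, 12.5)]
cloud = [p for p in (return_to_point(*r) for r in scan) if p]
print(cloud)
```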

While most LiDAR sensors tend to be bulky and awkward to fit to a vehicle, tier-one systems supplier Continental has installed a combined LiDAR and CMOS camera in the mounting of the rear-view mirror. The sensor module is able to categorize objects in front of the vehicle and detect an imminent collision. In addition to the sensors, the module houses the entire analysis unit.

CMOS cameras are already used for categorizing objects in front of a vehicle. However, by itself, a CMOS camera cannot always provide sufficiently reliable information for initiating automatic emergency functions. For this reason, Continental combines the passive sensor technology with LiDAR in the SRL-CAM400.

The sensor monitors a distance of more than ten meters in front of the vehicle, which classifies it as a short-range LiDAR system. The LiDAR sensor in this installation transmits three pulsed infrared beams at a 905 nm wavelength and measures the time-of-flight until the reflected beams reach the receiving optics.

From the speed of light and the measured time-of-flight, the sensor computes the distance to the object to an accuracy of up to 10 centimeters. In conjunction with the CMOS camera, the analysis unit in the sensor module thus has access both to a robust means of object categorization and to an accurate distance measurement. Before an automatic emergency action is initiated, the two signal paths are compared with each other, further enhancing the probability that a correct decision will be made.
The new sensor module is expected to enter series production in 2015.
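The distance calculation itself is straightforward, and the dual-path comparison can be thought of as a simple plausibility gate. The sketch below assumes a hypothetical agreement tolerance of 0.5 m between the LiDAR and camera estimates; the actual SRL-CAM400 decision logic is not public.

```python
C = 299_792_458.0  # speed of light in m/s

def lidar_distance_m(time_of_flight_s: float) -> float:
    """Pulsed time-of-flight ranging: the pulse travels out and
    back, so distance d = c * t / 2."""
    return C * time_of_flight_s / 2.0

def confirm_braking(lidar_d_m: float, camera_d_m: float,
                    tolerance_m: float = 0.5) -> bool:
    """Trigger only when both independent signal paths agree."""
    return abs(lidar_d_m - camera_d_m) <= tolerance_m

tof = 66.7e-9                      # ~66.7 ns round trip
d = lidar_distance_m(tof)          # ~10.0 m
print(d, confirm_braking(d, 9.8))  # True: paths agree within 0.5 m
```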

Military-class LiDAR could find its way into cars in the future.

A new breed of LiDAR technology is being developed and tested by the US Air Force at a base in Massachusetts. This system is capable of precisely mapping over 300 square kilometers from the belly of an airplane in about half an hour.

LiDAR has been used for aerial mapping of disaster areas and remote archaeological sites, but the process has always taken time. After the Haiti earthquake in 2010, a system similar to the one being developed by the Air Force was able to capture a 600 square-kilometer section of Port-au-Prince at 30 cm pixel resolution in a single pass. The chip at the heart of the next-generation system is about ten times more powerful.

The key to the next-generation LiDAR’s incredible speed and resolution is semiconductor technology based on indium gallium arsenide (InGaAs). These III-V semiconductors (so-called because they are made from elements in groups III and V of the periodic table) are seen as a potential replacement for silicon in numerous applications. In this case, InGaAs devices operate in the infrared spectrum, which allows the use of longer wavelengths of light that can travel farther and scan wider areas.


While these new systems are still secret, the results from chips of the type used to map Port-au-Prince are beginning to make it into industrial applications.

Princeton Lightwave and a division of Boeing have both been working with single-photon InGaAs LiDAR that could one day be incorporated into self-driving cars. Princeton Lightwave is already in talks with car manufacturers to build a prototype LiDAR system that could point the way to the future.

Of course, InGaAs-based LiDAR has to come down in price first — Princeton Lightwave’s current industrial model costs $150,000, and it’s the size of a shoebox.

Several 360° sensors are required to ensure fail-safe autonomous action. The correct selection of the sensor for the application is vital to the proper functioning of the system.

Integrating 360° sensor technology.


At the 2013 CES, Lexus demonstrated a research vehicle equipped with GPS, stereo cameras, radar and a 360° LiDAR system. The latter, installed on the roof of the vehicle, is used to detect objects at distances of up to 70 meters.

The LiDAR system is supported in its task of generating a complete image of the vehicle environment by three high-resolution color cameras with a range of about 150 meters. One of the three faces forward and recognizes, amongst other objects, traffic lights. The other two cameras scan the area behind the car.

Frontal and lateral radar sensors identify the position and speed of cars around the demo vehicle, and together the sensors generate a complete panorama, which is vital for safely navigating crossroads and intersections. Other sensors located at the rear wheels and on the roof measure speed as well as acceleration in the longitudinal and lateral directions.
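Building that panorama requires expressing every sensor’s detections in one common vehicle coordinate frame. The sketch below shows the basic 2D rigid transform involved; the mounting positions and detections are hypothetical, not Lexus’s actual configuration.

```python
import math

def to_vehicle_frame(mount_x_m, mount_y_m, mount_yaw_rad,
                     det_x_m, det_y_m):
    """Rotate and translate a detection from a sensor's local frame
    (x along the sensor boresight) into the common vehicle frame."""
    c, s = math.cos(mount_yaw_rad), math.sin(mount_yaw_rad)
    return (mount_x_m + c * det_x_m - s * det_y_m,
            mount_y_m + s * det_x_m + c * det_y_m)

# Hypothetical mounts: a forward radar at the front bumper and a
# lateral radar on the right side, each reporting one target.
front_target = to_vehicle_frame(3.8, 0.0, 0.0, 25.0, -1.5)
side_target = to_vehicle_frame(2.0, -0.9, math.radians(-90.0), 8.0, 0.0)
print(front_target, side_target)  # one shared coordinate system
```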

One of the challenges faced by 360° technology is packaging, in particular for the LiDAR: because it uses light for sensing, it cannot be concealed behind bodywork or bumpers.

Following success in the DARPA Urban Challenge, GM has renewed its development partnership with Carnegie Mellon University (CMU) to work on sensor technology and data fusion. The original Chevy Tahoe “Boss” prototype used a number of bulky radar and LiDAR sensors, together with GPS, to successfully drive itself for almost 100 km through a test route that included complex traffic scenarios such as crossing busy intersections.


The focus of the latest five-year collaboration will be the sensor technologies that will enable autonomous vehicles. The challenge for the engineers is to make the sensors smaller and cheaper, and to improve the algorithms and controls, to enable future series production of autonomous vehicles.

The research mule, a Cadillac, has integrated radar and LiDAR sensors, and the processors that handle all of the data are hidden beneath the cargo floor rather than filling the interior.

The way forward for 360° sensors.
There’s no doubt that the current 360° sensors and related systems are capable of guiding an autonomous vehicle through many real-world environments. However, for self-driving vehicles to be accepted, the technology needs to be affordable and fail-safe.

With the expected global harmonization of the 79 GHz frequency band for radar, these systems will probably see a significant reduction in cost and overall size.

LiDAR, on the other hand, faces a more complex development path that may see it overlooked in the short term. Cost and packaging are challenges that need to be addressed before this technology is likely to find its way into segment B and C vehicles, which constitute a sizeable share of the market.

