ALEAD: Artificial Learning Environments for Autonomous Driving

CGA Simulation has created ALEAD, a system designed to teach autonomous vehicles to drive in a safe, virtual environment, saving time and money for companies developing autonomous driving technologies. The simulated environment is built with Unity 3D graphics and integrates Baidu's open-source Apollo driving platform and the Robot Operating System (ROS). ALEAD will provide a range of locations comprising HD roads and assets that pose hyper-realistic scenarios for autonomous vehicles. All environments and assets created are compatible with LG Sim and other open-source driving simulators.
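
The sketch below illustrates, in broad strokes, how a Unity-based simulator might feed simulated sensor data into a ROS-connected driving stack such as Apollo. It is a minimal, hypothetical example: the topic name, update rate and coordinates are assumptions for illustration, not ALEAD's actual interface.

```python
#!/usr/bin/env python
# Minimal sketch of a simulator-side bridge publishing simulated GNSS fixes
# into ROS for a driving stack to consume. Topic name, rate and coordinates
# are illustrative assumptions, not ALEAD's API.
import rospy
from sensor_msgs.msg import NavSatFix

def publish_simulated_gnss():
    rospy.init_node('sim_gnss_bridge')
    pub = rospy.Publisher('/simulation/gnss/fix', NavSatFix, queue_size=10)
    rate = rospy.Rate(10)  # 10 Hz update rate (assumed)
    while not rospy.is_shutdown():
        fix = NavSatFix()
        fix.header.stamp = rospy.Time.now()
        fix.header.frame_id = 'gnss_link'
        fix.latitude = 53.1905    # Chester city centre (approximate)
        fix.longitude = -2.8919
        fix.altitude = 25.0
        pub.publish(fix)
        rate.sleep()

if __name__ == '__main__':
    try:
        publish_simulated_gnss()
    except rospy.ROSInterruptException:
        pass
```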

We have mapped Chester, a city in the North West of England.

Realistic Environments

For autonomous vehicle developers, the benefit is that simulated environments can replace live trials in initial product development and testing. ALEAD allows autonomous vehicles to navigate complex junctions featuring realistic road markings and street furniture. It operates a million times faster than running live trials and can replicate extreme events and environmental conditions such as fog, debris in the road and unpredictable (non-autonomous) vehicles. Its physically realistic sensor models and environmental factors create a faithful training environment.
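
As a rough illustration of how such extreme events and conditions might be expressed, the hypothetical configuration below parameterises weather, lighting and hazards so that a scenario can be replayed deterministically across many runs. The field names and value ranges are assumptions, not ALEAD's configuration format.

```python
# Illustrative scenario definition (an assumption, not ALEAD's actual schema):
# environmental conditions and hazards are captured as plain parameters so
# that rare or extreme situations can be reproduced on demand.
from dataclasses import dataclass, field

@dataclass
class ScenarioConfig:
    fog_density: float = 0.0          # 0.0 (clear) .. 1.0 (dense fog)
    rain_intensity: float = 0.0       # 0.0 .. 1.0
    time_of_day_hours: float = 12.0   # 0.0 .. 24.0, drives sun elevation
    road_debris: list = field(default_factory=list)  # (x, y) positions, metres
    erratic_vehicle_count: int = 0    # scripted non-autonomous traffic

# Example: dense fog at dusk with debris on the carriageway and two
# unpredictable vehicles in the traffic mix.
foggy_dusk = ScenarioConfig(
    fog_density=0.8,
    time_of_day_hours=17.5,
    road_debris=[(120.0, 3.5)],
    erratic_vehicle_count=2,
)
```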

We have realistically modelled complex junctions to create a high-quality environment.

Accurate Sensor Models

The ALEAD system uses representative models of the key sensors likely to be present in future autonomous vehicles, including short-range radar, IR/TV cameras, lidar scanners, and GPS/satellite navigation systems. The aim is to identify factors that determine or limit sensor performance and could therefore have an adverse effect on the safety of autonomous vehicles, such as high humidity, fog, bright light (from a low sun or building reflections), erratic behaviour from other road users, deteriorated street signs or markings, and deliberate jamming of sensor data. This sensor modelling makes the training physically realistic for computer vision, which differs from human perception. We can also create environmental models to represent the effect of active sources on the sensor models, including thermal illumination, light sources and radio emissions.
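
To make the idea of environment-dependent sensor performance concrete, here is a toy model (our assumption, not ALEAD's implementation) of how fog could limit a lidar's effective range through exponential two-way attenuation of the return signal.

```python
# Toy illustration only: a lidar return decays exponentially with range
# (Beer-Lambert style), and a detection threshold then caps the effective
# maximum range in fog. Parameter values are assumptions for illustration.
import math

def effective_lidar_range(max_range_m: float,
                          fog_extinction_per_m: float,
                          detection_threshold: float = 0.05) -> float:
    """Range at which two-way attenuation drops the return below threshold."""
    if fog_extinction_per_m <= 0.0:
        return max_range_m
    # exp(-2 * sigma * r) = threshold  ->  r = -ln(threshold) / (2 * sigma)
    limited = -math.log(detection_threshold) / (2.0 * fog_extinction_per_m)
    return min(max_range_m, limited)

print(effective_lidar_range(120.0, 0.0))    # clear air: full 120 m
print(effective_lidar_range(120.0, 0.05))   # moderate fog: roughly 30 m
```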

Autonomous Driving Test

Each autonomous vehicle will take a simulated driving test before taking to the road, to ensure it is truly safe and roadworthy. HD environments will pose a range of realistic challenges for autonomous vehicles.
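
One simple way such a test could be scored is sketched below: each simulated run is reduced to a list of recorded events, and the vehicle passes only if no critical event occurred and minor faults stay within a limit. The event names and thresholds are illustrative assumptions, not ALEAD's marking scheme.

```python
# Hypothetical pass/fail check for a simulated driving test run.
CRITICAL_EVENTS = {"collision", "red_light_violation", "left_drivable_area"}
MINOR_FAULT_LIMIT = 3

def passes_driving_test(events: list) -> bool:
    """Pass only if no critical event occurred and minor faults are few."""
    criticals = [e for e in events if e in CRITICAL_EVENTS]
    minors = [e for e in events if e not in CRITICAL_EVENTS]
    return not criticals and len(minors) <= MINOR_FAULT_LIMIT

print(passes_driving_test(["harsh_braking"]))  # True: one minor fault
print(passes_driving_test(["collision"]))      # False: critical event
```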

HD junctions and signs help to create realistic challenges for autonomous vehicles.

VR DRIVE

VR Drive is a virtual driving simulator that aims to improve road safety by engaging novice drivers with a virtual experience, improving hazard perception and raising awareness of risk-taking in a digital world. It is similar to ALEAD, except that it is designed for human drivers rather than autonomous vehicles.

A participant at Liverpool John Moores University trialling VR Drive.