ALEAD: Artificial Learning Environments for Autonomous Driving

CGA Simulation has created ALEAD, a system designed to teach autonomous vehicles to drive in a safe, virtual environment, saving time and money for companies developing autonomous driving technologies. The simulated environment is built with the Unity 3D engine, the Baidu Apollo open driving platform, and the Robot Operating System (ROS). ALEAD will provide various locations comprising HD roads and assets that pose hyper-realistic scenarios for autonomous vehicles. All environments and assets created are compatible with LG Sim and other open-source driving simulators.
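As a rough illustration of how a Unity-based simulator can talk to an Apollo/ROS stack, the sketch below publishes simulated GPS fixes on a ROS topic using rospy. The node name, topic name and publish rate are assumptions made for the example, not ALEAD's actual interface, and it presumes a working ROS environment.

    # Illustrative only: publishes simulated GPS fixes over ROS.
    # Node name, topic and rate are assumptions, not the actual ALEAD interface.
    import rospy
    from sensor_msgs.msg import NavSatFix

    def publish_sim_gps():
        rospy.init_node("alead_sim_gps_bridge")            # hypothetical node name
        pub = rospy.Publisher("/sim/gps/fix", NavSatFix, queue_size=10)
        rate = rospy.Rate(10)                              # 10 Hz, illustrative
        while not rospy.is_shutdown():
            msg = NavSatFix()
            msg.header.stamp = rospy.Time.now()
            msg.header.frame_id = "gps"
            msg.latitude, msg.longitude, msg.altitude = 53.36, -2.27, 70.0
            pub.publish(msg)
            rate.sleep()

    if __name__ == "__main__":
        publish_sim_gps()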

HD Environments

The benefit for autonomous vehicle developers is that simulated environments can replace live trials in initial product development and testing. ALEAD allows autonomous vehicles to navigate complex junctions featuring realistic road markings and street furniture. ALEAD operates a million times faster than running live trials and can replicate extreme events and environmental conditions such as fog, debris on the road and unpredictable (non-autonomous) vehicles. Its physically realistic sensor models and environmental factors create a more realistic teaching environment.
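Faster-than-real-time operation comes from decoupling simulated time from wall-clock time: physics and sensor updates advance in fixed steps as fast as the hardware allows. A minimal sketch of that idea follows; the step size and toy vehicle update are illustrative, not ALEAD internals.

    # Minimal fixed-step simulation loop, unthrottled by wall-clock time.
    # Step size and the toy vehicle update are illustrative, not ALEAD internals.
    import time

    def run_scenario(duration_s=60.0, dt=0.01):
        sim_t, x, v = 0.0, 0.0, 13.9                 # position (m), speed (m/s, ~50 km/h)
        wall_start = time.perf_counter()
        while sim_t < duration_s:
            x += v * dt                              # trivial stand-in for the physics update
            sim_t += dt
        wall = time.perf_counter() - wall_start
        print(f"simulated {duration_s:.0f} s in {wall:.3f} s of wall time "
              f"(~{duration_s / wall:.0f}x real time)")

    if __name__ == "__main__":
        run_scenario()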

We have realistically modelled complex junctions to create a high-quality environment.
HD junctions and signs help to create realistic challenges for autonomous vehicles.

Weather

Weather has been added to the simulation, including varying degrees of rain.
Snow and ice have been integrated into the sim for scenario creation.
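One way to expose weather to scenario authors is as a small set of continuous parameters that the rendering, vehicle physics and sensor models all read from. The preset fields and values below are an illustrative assumption, not ALEAD's actual configuration schema.

    # Illustrative weather presets; field names and values are assumptions,
    # not ALEAD's actual configuration schema.
    from dataclasses import dataclass

    @dataclass
    class Weather:
        rain: float           # 0.0 (dry) to 1.0 (downpour)
        snow: float           # 0.0 to 1.0
        fog_density: float    # 0.0 (clear) to 1.0 (dense fog)
        road_friction: float  # tyre-road friction coefficient

    PRESETS = {
        "clear":      Weather(rain=0.0, snow=0.0, fog_density=0.0, road_friction=0.9),
        "heavy_rain": Weather(rain=0.9, snow=0.0, fog_density=0.2, road_friction=0.6),
        "snow_ice":   Weather(rain=0.0, snow=0.8, fog_density=0.1, road_friction=0.3),
    }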

HD Maps

Environment data in HD map format is necessary for autonomous vehicle functionality. We have used a variety of mapping technologies to get as accurate a representation of the world as possible in order to create large-scale HD maps for various locations. Our expertise in this field has gone on to help Project Synergy successfully trial its new V2X (vehicle-to-everything) Signal Phase and Timing (SPaT) software module and roadside unit (RSU) at the A555 Manchester Airport Relief Road / Styal Road intersection in South Manchester.
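HD map formats vary between stacks, but as a rough illustration of how such data can be consumed programmatically, the sketch below lists roads and lane counts from an OpenDRIVE (.xodr) file, a widely used interchange format for HD road networks. The file name is a placeholder, and real maps carry far more detail (geometry, signals, road marks) than is read here.

    # Sketch: summarise roads and lanes from an OpenDRIVE (.xodr) HD map.
    # The file name is a placeholder for illustration.
    import xml.etree.ElementTree as ET

    def summarise_opendrive(path="example_map.xodr"):
        root = ET.parse(path).getroot()
        for road in root.findall("road"):
            lanes = list(road.iter("lane"))
            print(f"road {road.get('id')} ({road.get('length')} m): {len(lanes)} lanes")

    if __name__ == "__main__":
        summarise_opendrive()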

Project Synergy was established in 2019 to test autonomous vehicles on roads in the Greater Manchester area. The project aims to advance connected and autonomous vehicles through work on autonomous cars and the use of autonomous pods in an airside setting at Manchester Airport.
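SPaT messages are commonly defined by the SAE J2735 standard. Purely to illustrate the kind of data an RSU broadcasts, a simplified SPaT-like message might carry an intersection ID plus, for each signal group, the current phase and the earliest time it can end; the structure below is an illustrative simplification, not the encoding used in the Project Synergy trial.

    # Simplified, illustrative SPaT-like message; not the SAE J2735 encoding
    # used by the Project Synergy RSU.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class MovementState:
        signal_group: int       # which approach / turning movement this describes
        event_state: str        # e.g. "protected-Movement-Allowed" (green), "stop-And-Remain" (red)
        min_end_time_s: float   # earliest time the current state can end

    @dataclass
    class SpatMessage:
        intersection_id: int    # placeholder ID, e.g. for the A555 / Styal Road junction
        timestamp_ms: int
        movements: List[MovementState]

    msg = SpatMessage(
        intersection_id=1001,
        timestamp_ms=0,
        movements=[MovementState(1, "protected-Movement-Allowed", 12.4),
                   MovementState(2, "stop-And-Remain", 12.4)],
    )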

Accurate Sensor Models

The ALEAD system uses representative sensor models for key sensors that are likely to be present in future autonomous vehicles, including short-range radar, IR/TV cameras, lidar scanners, and GPS/satellite navigation systems. The aim is to identify factors that determine or limit sensor performance and could therefore have an adverse effect on the safety of autonomous vehicles, such as high humidity, fog, bright light (the sun low in the sky, reflections from buildings), erratic behaviour from other road users, deteriorated street signs or markings, and deliberate jamming of sensor data. The sensor modelling makes the training physically realistic for computer vision, which differs from human perception. We can also create environmental models to represent the effect of active sources on the sensor models, including thermal illumination, light sources and radio emissions.
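As one concrete example of physically motivated sensor degradation, atmospheric attenuation of a lidar return can be approximated with a Beer-Lambert term on top of the usual inverse-square fall-off, so that targets beyond a fog-dependent range drop below the detection threshold. The coefficients below are placeholders, not ALEAD's calibrated sensor models.

    # Illustrative lidar detection model with Beer-Lambert fog attenuation.
    # All coefficients are placeholders, not ALEAD's calibrated values.
    import math

    def detected(target_range_m, reflectivity=0.5,
                 fog_extinction_per_m=0.02, threshold=1e-4):
        # Received power falls with 1/R^2 and is attenuated twice (out and back) by fog.
        power = (reflectivity / max(target_range_m, 1e-6) ** 2) \
                * math.exp(-2.0 * fog_extinction_per_m * target_range_m)
        return power >= threshold

    # A 60 m target is detected in clear air but lost in moderately dense fog.
    print(detected(60.0, fog_extinction_per_m=0.0))   # True
    print(detected(60.0, fog_extinction_per_m=0.05))  # False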

Radar Sensor
IR Camera
Lidar Sensor
Multiple Sensors

In-Game Editor System and Autonomous Driving Test

Each autonomous vehicle will take a simulated driving test before taking to the road, to make sure it is truly safe and road-worthy. We have created HD environments suitable for a virtual driving test modelled around a ‘Sim City’- style simulated world. These HD environments will pose various realistic hazards for autonomous vehicles.

CGA has developed a simulator that integrates complex 3D models of urban environments, accurately depicting Conwy and the San Jose area with realistic vehicle traffic flow, people's actions, and other spatial data. Our detailed AI vehicles feature varying driver behaviours, indicate before manoeuvring, and produce realistic traffic flow.
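Varying driver behaviour is often parameterised through a car-following model. As an illustration (not necessarily the model ALEAD's AI traffic uses), the Intelligent Driver Model produces cautious or aggressive driving simply by changing desired speed, headway and acceleration limits.

    # Intelligent Driver Model (IDM) acceleration: one common way to give AI
    # traffic distinct driving styles; not necessarily the model ALEAD uses.
    import math

    def idm_acceleration(speed, gap, lead_speed,
                         v0=13.9,   # desired speed (m/s)
                         T=1.5,     # desired time headway (s)
                         a=1.0,     # maximum acceleration (m/s^2)
                         b=2.0,     # comfortable deceleration (m/s^2)
                         s0=2.0):   # minimum gap (m)
        dv = speed - lead_speed
        s_star = s0 + speed * T + speed * dv / (2.0 * math.sqrt(a * b))
        return a * (1.0 - (speed / v0) ** 4 - (s_star / gap) ** 2)

    # Same traffic situation, different personalities: the cautious driver
    # eases off while the aggressive one keeps accelerating.
    cautious   = idm_acceleration(10.0, 30.0, 8.0, T=2.0, a=0.8)   # slightly negative
    aggressive = idm_acceleration(10.0, 30.0, 8.0, T=0.8, a=2.5)   # clearly positive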

We have an integrated Editor System to allow different scenarios and tests to be created easily. This In-Game Editor allows end-users to set up their own CCAV and general driving challenges, allowing bespoke driving tests to be created and shared.
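A scenario built in the editor can be thought of as a declarative document listing a route, a start trigger, hazards and conditions. The JSON layout below is an assumption made for illustration, not the In-Game Editor's actual save format.

    # Hypothetical scenario document; the schema is an illustrative assumption,
    # not the In-Game Editor's actual save format.
    import json

    scenario_json = """
    {
      "name": "Junction approach in heavy rain",
      "weather": "heavy_rain",
      "route": [[53.362, -2.268], [53.365, -2.262]],
      "start_trigger": {"type": "ego_enters_zone", "radius_m": 10.0},
      "hazards": [
        {"type": "pedestrian_crossing", "at_waypoint": 1},
        {"type": "road_debris", "at_waypoint": 1}
      ]
    }
    """

    scenario = json.loads(scenario_json)
    print(scenario["name"], "-", len(scenario["hazards"]), "hazards on the route")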

Features

  • A growing variety of variables in the in-game editor, e.g. pedestrians and weather
  • Functional ‘Edit Route’ button – allows the route to be inspected and edited
  • Start Trigger Visual – shows where the trigger sits in the scene
  • Custom UI element for selecting assets more easily in the inspector (drop-down menu plus a search bar that suggests assets)
  • Route Visualizer
  • Array UI element (displays an array of one object type), used to display, add and remove hazards on a route
  • Scenario Creation
  • Steamworks Integration – create and share content with other users
  • Created Scenario Play-Through