Racing Lounge

Best practices and teamwork for student competitions

Students aiming for SAE Level 4 Autonomy by 2020 – AutoDrive Challenge™

Posted by Christoph Hahn

This post is the first in a loose series on the Racing Lounge blog featuring highlights of the AutoDrive Challenge™, especially those around modeling and simulation.

Figure: Panoramic photo of all teams and partners at AutoDrive Challenge™ Year 2 finals, MCity (Source: SAE’s CDS Photography Portal)

AutoDrive Challenge™

In case you are not yet aware of the AutoDrive Challenge™, allow me to introduce the initiative. If you are already familiar with it, feel free to skip ahead to the next section about the simulation challenge.

The AutoDrive Challenge™ is a three-year autonomous vehicle competition run by SAE International and General Motors (GM), tasking students to develop and demonstrate a fully autonomous passenger vehicle. The technical goal of the competition is to navigate an urban driving course in an automated driving mode, as described by the SAE Standard J3016 Level 4 definition, by 2020.

MathWorks is the official software supplier and supports the teams alongside other sponsors such as hardware suppliers and product sponsors; find a complete list of sponsors linked. MathWorks assigns a mentor, essentially a technical specialist, to each team to help with issues as they come up and to make sure the teams use simulation effectively and efficiently. And because our mentors bring decades of industry experience, they also give the teams another perspective on general engineering questions.

Eight engineering schools from the US and Canada were granted access to the challenge, which meant receiving an all-electric Chevrolet Bolt, an Intel compute platform, and numerous sensors such as lidars, cameras, and radars, to name just the hardware highlights.

Here is a list of teams with car number, school and team name:

  • 011 – Kettering Univ – BulldogBolt
  • 012 – Michigan State Univ – MSU AutoDrive
  • 013 – Michigan Tech Univ – Prometheus Borealis
  • 014 – North Carolina A & T State Univ – Aggies Autonomous Auto
  • 015 – Texas A & M Univ – The 12th Unmanned
  • 016 – Univ of Toronto – Autoronto
  • 017 – Univ of Waterloo – Watonomous
  • 018 – Virginia Tech – Victor Tango

Year 1 (2018) focused on concept selection, with the university teams becoming familiar with their sensing and computation software. They were tasked with completing a written concept design paper as well as simple missions for on-site evaluation in Yuma, AZ.

Year 2 (2019) focused on urban environment driving scenarios with static and dynamic objects. Finals were held in early June 2019 in Ann Arbor (Michigan, USA) at MCity, a test facility run by the University of Michigan.

Figure: Autoronto’s Chevrolet Bolt equipped with multiple sensors (Source: SAE’s CDS Photography Portal)

MathWorks Simulation Challenge

In year 2, MathWorks provided teams with a virtual environment of MCity, based on Unreal Engine 4 (UE4) and its Simulink integration in Vehicle Dynamics Blockset™.

Figure: Virtual environment of MCity based on Unreal Engine 4 (Source: MathWorks Documentation)

MathWorks also provided software models of sensors that matched the exact sensor types the teams would use in the real-world challenge, namely cameras, lidar, and radar. I am referencing an SAE paper here that discusses in depth how sensors are used in a virtual UE4 environment.
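The actual sensor models live in Simulink, but the core idea behind such a model is simple enough to sketch in plain code. Below is a minimal, hypothetical Python illustration (not MathWorks' implementation): a radar/lidar-like sensor that reports range and bearing to targets inside its field of view and maximum range, with additive Gaussian noise on the range measurement.

```python
import math
import random


class IdealizedRangeSensor:
    """Toy sensor model, for illustration only: reports noisy
    (range, bearing) detections for targets that fall inside the
    sensor's field of view and maximum range."""

    def __init__(self, max_range=100.0, fov_deg=90.0, range_noise=0.5, seed=None):
        self.max_range = max_range
        self.half_fov = math.radians(fov_deg) / 2.0
        self.range_noise = range_noise          # std. dev. of range error (m)
        self.rng = random.Random(seed)

    def detect(self, ego_xy, ego_heading, targets):
        """Return a list of (range, bearing) tuples for visible targets."""
        detections = []
        ex, ey = ego_xy
        for tx, ty in targets:
            dx, dy = tx - ex, ty - ey
            rng = math.hypot(dx, dy)
            bearing = math.atan2(dy, dx) - ego_heading
            # Wrap the bearing to [-pi, pi] before the FOV check.
            bearing = math.atan2(math.sin(bearing), math.cos(bearing))
            if rng <= self.max_range and abs(bearing) <= self.half_fov:
                noisy_rng = rng + self.rng.gauss(0.0, self.range_noise)
                detections.append((noisy_rng, bearing))
        return detections
```

For example, with the ego vehicle at the origin facing along the x-axis, a target at (10, 0) is detected while a target at (0, 50) lies outside a 90-degree field of view and is not. Real sensor models add far more (occlusion, detection probability, per-sensor noise characteristics), but the interface idea is the same.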

Video: Virtual environment of MCity based on Unreal Engine 4

In essence, teams had to navigate their vehicle 'around the block' while obeying traffic rules and avoiding both static and dynamic objects. This closely mirrors the real-world control design exercises, called dynamic challenges, in year 2:

  • Traffic Control Sign Challenge, to evaluate the vehicle’s ability to react to various regulatory traffic control signs that it might encounter in a typical urban driving situation.
  • Intersection Challenge, to evaluate the vehicle’s ability to properly react to stop lights and properly navigate four-way intersections.
  • Pedestrian Challenge, to evaluate the vehicle’s pedestrian recognition and response abilities. The vehicle had to react safely and appropriately to pedestrians at dedicated pedestrian crossings and traffic lights.
  • MCity Challenge, to evaluate the vehicle’s ability to travel through a defined layout, obeying all traffic laws as they apply to what the vehicle encounters.
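At its core, each of these challenges comes down to a reaction policy: perceive something (a sign, a light, a pedestrian), change mode, act, and resume. As a rough, hypothetical sketch of what the Traffic Control Sign Challenge demands (not any team's actual stack), here is a minimal stop-sign state machine in Python:

```python
from enum import Enum, auto


class Mode(Enum):
    CRUISE = auto()   # normal driving, no sign in range
    BRAKING = auto()  # stop sign detected within braking distance
    STOPPED = auto()  # at rest, dwelling for the required stop time
    PROCEED = auto()  # stop complete, clear to continue


class StopSignPolicy:
    """Toy state machine, for illustration only: cruise until a stop
    sign is within braking distance, brake to a halt, hold for the
    required stop time, then proceed."""

    def __init__(self, brake_distance=20.0, stop_time=3.0):
        self.brake_distance = brake_distance  # m
        self.stop_time = stop_time            # s
        self.mode = Mode.CRUISE
        self.stopped_for = 0.0

    def step(self, dist_to_sign, speed, dt):
        """Advance one control step; dist_to_sign is None if no sign is seen."""
        if self.mode is Mode.CRUISE:
            if dist_to_sign is not None and dist_to_sign <= self.brake_distance:
                self.mode = Mode.BRAKING
        elif self.mode is Mode.BRAKING:
            if speed <= 0.01:
                self.mode = Mode.STOPPED
                self.stopped_for = 0.0
        elif self.mode is Mode.STOPPED:
            self.stopped_for += dt
            if self.stopped_for >= self.stop_time:
                self.mode = Mode.PROCEED
        return self.mode
```

The value of the virtual MCity environment is precisely that logic like this can be exercised against simulated signs, intersections, and pedestrians thousands of times before the vehicle ever drives the real course.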

Figure: Detecting pedestrians and avoiding collisions (Source: SAE’s CDS Photography Portal)

The simulation challenge was worth 100 of the maximum 1,000 points each team could score overall in year 2. The points alone would justify a significant amount of effort going into the simulation challenge. But that is not all: we are working with some of the brightest engineering students in North America, and the teams had two major insights right from the beginning:

  1. The extra workload for simulation at the beginning is small compared to the gains later in the project.
  2. Having a racetrack in a high-definition 3D virtual environment is a great benefit while preparing for the real-world challenges.

And ‘Yes!’, we couldn’t agree more. Let me share some examples. From my correspondence with Team “Autoronto” (University of Toronto), it seemed to me that the simulation challenge was being tackled by one person, Mollie Bianchi. Since most other teams had 2 to 4 people dedicated to the challenge, I asked Mollie how she could possibly do all the work on her own. “When there’s a controls problem,” Mollie said, “I get in touch with our controls team. The same holds true for mapping, planning, and perception issues. We don’t consider the simulation challenge an extra task; it goes along with our other development work.”

Another supporting thought was added by teams based in Michigan and Canada, who were more than happy to start their virtual testing season much earlier than on-road testing, which had to wait until the last residue of snow from the long winter had disappeared.

Before concluding, I thought the figure below would nicely support the shared vision the teams were acting upon. It is admittedly borrowed from a keynote lecture (slides / video) that Andy Grace, one of our VPs of Engineering, presented at the 2019 MathWorks Automotive Conference.

Investing a critical amount of effort (the industry term would be ‘cost’) into simulation results in key savings as well as deeper technical insight later on.

Figure: Cost and benefits of simulation

Conclusion

On a side note, the top-scoring teams in the simulation challenge were the University of Toronto, who won, followed by Texas A & M and Michigan Tech, which shared second place with the exact same number of points. If there is interest, we will be happy to share more files and context from the simulation challenge, as well as team achievements, on MATLAB Central File Exchange and in upcoming blog posts.

Also worth a note: we are currently getting our hands dirty preparing interesting tasks for Year 3, which kicks off at the beginning of the academic year 2019/20.

We are always keen to hear your comments and questions, posted below in the comments section. Looking forward to a fruitful discussion!