The Center for Autonomy at the University of Illinois at Urbana-Champaign has been busy putting an AutonomouStuff GEM Automated Research Development Platform enabled by the PACMod drive-by-wire system to good use: exploring autonomous vehicle safety in a variety of contexts.
Electrical and Computer Engineering Professor Sayan Mitra is publishing research derived from data generated with the GEM platform, has secured a half-million-dollar grant to further safety analysis for highway-speed autonomy, and is training the next generation of engineers through a course in safe autonomy.
Publication: Mitra’s paper “Online monitoring for safe pedestrian-vehicle interactions,” based entirely on work with the GEM platform, is slated to appear at the IEEE International Conference on Intelligent Transportation Systems (ITSC 2020). The paper is already available for download.
Grant: Professors at The Grainger College of Engineering have created at least 10 proposals using the GEM platform as the basis for experimentation, and Prof. Mitra has already secured funding for the first round of investigation into predictive online safety analysis for high-speed autonomy. Learn more about how that project aims to extend an automated vehicle’s path-planning horizon and its data sharing with other vehicles.
Safety course: Prof. Mitra has also adapted the GEM platform and a simulator for a course in safe autonomy, with homework and programming assignments designed around the GEM. Check out the material he’s using to train the next generation of engineers working to solve autonomy challenges on the Principles of Safe Autonomy class page.
As part of a partnership between the Center for Autonomy at the University of Illinois at Urbana-Champaign’s Grainger College of Engineering and AutonomouStuff, professors and students have long-term access to an automated Polaris GEM. Projects involving the AutonomouStuff Automated Research Development Platform began in the spring of 2019 with a focus on autonomous software. Additional projects and courses are already planned, and the GEM’s availability for other research initiatives on campus will expand.
Assistant Professor Katherine Driggs-Campbell of electrical and computer engineering is among the first researchers at the University to work with the GEM after its arrival in the spring of 2019. The initial efforts concentrate on autonomous software development.
“This summer, we have a few master’s students and undergraduate students working on getting the autonomous stack up and running smoothly,” Driggs-Campbell said. “Our goal for the end of the summer is to get a nice demo of the GEM smoothly navigating campus scenarios, like following and overtaking a cyclist and effectively navigating crosswalks with pedestrians.”
Other professors currently using the vehicle in their research include: Professor Geir Dullerud of mechanical science and engineering; Professor Minh Do of electrical and computer engineering; Professor Sayan Mitra of electrical and computer engineering; and Professor David Forsyth of computer science.
Mitra will be using the car in his Principles of Safe Autonomy course in spring 2020. The first months with the GEM will be spent learning the current software modules and developing additional applications for autonomous and intelligent path following in the campus setting, including path and object detection; path planning and control; and real-time predictive safety analysis.
“We are still in the very early stages of software development and exploration,” Mitra said. “Hopefully, by the end of the summer, the software we develop and the insights we gain will be beneficial for courses and forthcoming research projects.”
Do plans to use the vehicle to collect and synchronize data from LiDAR, radar, and vision sensors in order to develop sensor-fusion algorithms for automotive applications.
Forsyth plans to teach a class using the GEM that will investigate problems including: vision-based detection; multi-lane model detection; on-road free-space detection; traffic-light detection; camera-based localization; traffic-sign and speed-limit recognition; surround view with object detection and classification; bird’s-eye view 360 degrees around the vehicle; and sensor fusion combining camera objects, LiDAR objects, and radar tracks.