Researchers at the University of North Texas are investigating novel ways to expand the capabilities of sensor-based perception systems fundamental to autonomous vehicle operation — with an AutonomouStuff Automated Research Development Platform at the heart of concepts that have vast implications for next-generation mobility.
Assistant Professor Qing Yang in the Department of Computer Science and Engineering and his colleagues are exploring how connected autonomous vehicle (CAV) technology can overcome the limitations inherent to single-vehicle perception systems operating independently. As with human drivers, the ability of individual sensors to detect objects such as vehicles and pedestrians depends on an unobstructed view.
“This is not a problem with the sensor or software, but with the physical setting — it’s a line of sight issue,” Yang said. “This problem will be there forever because that’s the environment where we’re driving.”
The University of North Texas approach to this problem involves collective perception: vehicles and infrastructure equipped with sensors and communication technology that share object-detection data and other information. Unlike similar concepts, however, their work involves sharing larger quantities of data in a wider variety of forms.
While other collective perception models involve the sharing of processed data, Yang’s research highlights the advantages of sharing raw sensor data, which gives individual vehicles within the system a richer, more detailed and expansive understanding of the surrounding environment, including areas outside the line of sight.
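The idea can be illustrated with a minimal sketch: if a nearby vehicle transmits its raw LiDAR points along with its pose, the receiving vehicle can transform those points into its own coordinate frame and merge them with its own sweep, effectively seeing around an occlusion. The function names, point counts, and poses below are illustrative assumptions, not the UNT team's actual pipeline.

```python
import numpy as np

def transform_to_ego(points, rotation, translation):
    """Rotate and translate another vehicle's points into the ego frame."""
    return points @ rotation.T + translation

def fuse_point_clouds(ego_points, remote_points, rotation, translation):
    """Merge the ego sweep with a remote vehicle's raw sweep (hypothetical fusion step)."""
    aligned = transform_to_ego(remote_points, rotation, translation)
    return np.vstack([ego_points, aligned])

# Ego vehicle sees 2 points; the remote vehicle sees 1 point the ego cannot.
ego = np.array([[1.0, 0.0, 0.0], [2.0, 1.0, 0.0]])
remote = np.array([[0.0, 5.0, 0.0]])   # in the remote vehicle's frame
R = np.eye(3)                          # assume identical heading, for simplicity
t = np.array([10.0, 0.0, 0.0])         # remote vehicle is 10 m ahead of the ego
fused = fuse_point_clouds(ego, remote, R, t)
print(fused.shape)  # (3, 3): the ego cloud now includes the occluded point at (10, 5, 0)
```

In practice the relative pose would come from GPS/IMU or point-cloud registration rather than being known exactly, but the fusion step itself is this simple in principle.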
Fundamental challenges to that approach remain, however: data must be shared at a rate fast enough for computer systems to analyze it and make split-second driving decisions.
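A back-of-envelope calculation shows why the data rate is the sticking point. The point count, frame rate, and link capacity below are rough illustrative figures (not taken from the UNT papers), but they capture the order of magnitude.

```python
# Rough check (illustrative numbers): can a raw LiDAR stream
# fit through a typical vehicle-to-vehicle link?

points_per_frame = 100_000   # one sweep of a typical spinning LiDAR
bytes_per_point = 16         # x, y, z, intensity as 32-bit floats
frames_per_second = 10       # common LiDAR rotation rate

raw_bits_per_second = points_per_frame * bytes_per_point * 8 * frames_per_second
raw_mbps = raw_bits_per_second / 1e6
print(f"raw stream: {raw_mbps:.0f} Mbps")   # 128 Mbps

dsrc_mbps = 27               # upper data rate of a DSRC (IEEE 802.11p) channel
print(f"fits in a DSRC channel: {raw_mbps <= dsrc_mbps}")  # False
```

Under these assumptions the raw stream exceeds the link capacity several times over, which motivates the two solutions described next.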
One possible solution proposed by Yang’s team is centralized perception systems — making computers part of the infrastructure, rather than the vehicles, for increased power and performance.
Another potential solution can be found in data packaging. The University of North Texas research found collective perception advantages with a data format called feature maps, which sits between fully processed data and raw sensor data in terms of volume and utility.
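The trade-off between the three formats can be made concrete by comparing payload sizes. The array shapes and data types below are assumed for illustration only; actual sizes depend on the sensor and the detection network used.

```python
import numpy as np

# Illustrative payloads for one sensor frame (assumed sizes, not measurements):
raw_points = np.zeros((100_000, 4), dtype=np.float32)     # raw LiDAR sweep
feature_map = np.zeros((64, 64, 64), dtype=np.float16)    # intermediate network features
detections = np.zeros((20, 7), dtype=np.float32)          # boxes: x, y, z, l, w, h, yaw

for name, arr in [("raw sensor data", raw_points),
                  ("feature map", feature_map),
                  ("detected objects", detections)]:
    print(f"{name}: {arr.nbytes / 1e6:.3f} MB")
```

Under these assumptions the feature map is several times smaller than the raw sweep yet far richer than a short list of detected boxes, which is exactly the middle ground the research exploits.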
For more information on the University of North Texas’ research, check out recent papers they’ve published here and here. Yang said the AutonomouStuff GEM Automated Research Development Platform made much of the work possible.
“Without AutonomouStuff’s support, we could not start up our research so quickly. I am positive that the research platform we got from AS has placed us in a unique position,” Yang wrote. “We have witnessed a positive impact of the platform on our proposals and research projects.”
For example, with the advanced CAV cyberinfrastructure established at UNT, including the AutonomouStuff GEM Automated Research Development Platform, Yang’s team recently received a $50,000 NSF award to develop a collaborative, integrated training program that supports research workforce development in the CAV domain. The training materials and modules developed as part of the program will become publicly available to the CAV community.