A GW professor will launch a $6 million project in September to develop methods to protect autonomous aircraft from cyberattacks.
Peng Wei, an associate professor of mechanical and aerospace engineering, will lead the three-year, NASA-funded project, in which students and faculty from multiple universities will work to defend autonomous urban drones that deliver commercial goods against cyberattacks. Wei said the security of artificial intelligence in the aviation sector is “critical” because hacks or malfunctions can lead to crashes or breaches of information.
“If the AI or machine learning component makes a mistake, there will be serious consequences,” Wei said.
Wei leads one of three multi-university teams that will receive a combined $18 million in funding from NASA over the next three years. Wei’s team will include faculty and engineering students from GW and seven partner institutions, who will work to secure aircraft autonomy and train engineering students in the field of cybersecurity.
The project will begin in September with a meeting at GW of all participating institutions, including Vanderbilt, Purdue and Tennessee State universities; the universities of California, Irvine and Texas at Austin; Northern Virginia Community College; and Collins Aerospace.
Wei said the project will study small commercial vehicles used to ship goods and small, helicopter-like vehicles used to transport a handful of passengers. He said researchers will study the algorithms that control these vehicles to ensure they can fly correct paths on their own and use AI to adjust during emergency landings.
“Humans have a certain cognitive limit,” Wei said. “So that’s why we need to rely on AI technology to make sure we can scale up to more operations and also to have safe operations.”
Wei said the project’s advisory board includes representatives from different government agencies and companies, like the Federal Aviation Administration and Boeing. He said experts from different universities will present background information to the research team and then tackle practical research on how to secure the AI programs.
“We want to bring people together to ground our research to practical application,” Wei said. “We want to develop something that is useful to the industry, to its elements and also to train our students.”
The project team will include faculty and students from the participating universities. Wei said he will split the group into red and blue teams, with the red team acting like the “bad guys” and hacking into the systems to find vulnerabilities, while the blue team works to fix them.
“We bring people together and then we try to train students at different levels,” Wei said. “So from undergrad from community college to masters students to PhD to postdoc.”
Bryan Ward, an assistant professor of electrical and computer engineering at Vanderbilt University and a collaborator on the project, said the project targets an emerging sector of commercial autonomous aerial vehicles used to transport goods and people across urban areas. Ward said it is useful to start securing devices in emerging markets early on so that security problems don’t surface later, when it is harder to change existing infrastructure.
“This is an emerging area, and so it’s really important to be able to kind of get in on the ground floor and develop security solutions that are tailored to these applications,” Ward said. “And make sure that we aren’t five years down the road, dealing with kind of legacy architectures and designs that are more difficult to secure.”
Ward said Vanderbilt researchers are experts in the “lower level” security of the systems that run flight software on the drones and ensure operating system compatibility. He said other researchers on the project focus more on detecting malicious activity from outside sources than on dissecting the mechanics of the drones themselves.
“One of the cool things about this project is that it’s pretty broad and all encompassing and we have people from different universities who are bringing different expertise and experience levels from a bunch of different angles,” Ward said.
Ward said cybersecurity has become a more pressing concern due to an increase in large-scale cyberattacks, which could spread to commercial aircraft.
“That’s a really important need to really focus more attention on security, and particularly on drones or other devices that are flying in our airspace over our heads,” Ward said. “And we don’t want them crashing or doing other malicious things.”
Lanier Watkins, a research assistant at Johns Hopkins University and a collaborator on the project, said ways to secure autonomous aircraft include having a person monitor a drone’s flight path for unusual activity or embedding an artificial intelligence program in the drone to detect outside attacks. Watkins said researchers will examine these methods and identify the most effective approach.
Watkins said the attacks can come from various types of bad actors. He said these could be criminals looking to intercept a package delivery, hackers attacking drones for fun or even another nation attempting to obtain U.S. intelligence.
“There are various forms of different threat models that exist,” Watkins said. “It really depends on what is actually happening, what type of servers that it is and who is the end receiver of that delivery.”
Watkins said commercial vendors like UPS are increasingly using drones to ship goods over long distances, underscoring the need to secure these vehicles.
“This is pretty much the wave of the future,” Watkins said. “You’re gonna have more and more key vendors that support critical infrastructure that are gonna start using these types of services.”