Four Great Valley professors receive seed grant for AI research


Youakim Badr, Partha Mukherjee, Raghu Sangwan, and Satish Srinivasan will bring a multidisciplinary approach to evaluating and improving artificial intelligence systems.

Credit: Penn State

MALVERN, Pa. — Four Penn State Great Valley faculty members recently received a Multi-Disciplinary Research Grant for their research proposal “Managing Risks in AI Systems: Mitigating Vulnerabilities and Threats Using Design Tactics and Patterns.”

The team — composed of Youakim Badr, associate professor of data analytics; Partha Mukherjee, assistant professor of data analytics; Raghu Sangwan, associate professor of software engineering; and Satish Srinivasan, assistant professor of information science — was one of eight faculty teams from across all Penn State campuses to receive one-year seed grants to fund research on cybersecurity for artificial intelligence (AI). The grants are funded in concert with the 2020 industryXchange, an annual University-wide event hosted by the College of Engineering.

Badr, Mukherjee, Sangwan and Srinivasan’s proposal brings a multidisciplinary approach to evaluating intelligent systems, identifying their vulnerabilities, and developing solutions to mitigate the resulting risks.

“It’s an excellent opportunity for our campus and faculty to bring together different expertise around AI, cybersecurity and software engineering,” Badr said. “This research proposal makes it possible to integrate our expertise in a multidisciplinary approach to create a risk management framework for developing AI systems that are testable, secure and reliable. We are confident that the research topic will attract industry partners and have a significant impact on the industry.”

The team’s diverse research backgrounds create a distinct approach to testing for vulnerabilities when developing AI systems. Badr will focus on the risk management framework for AI-based systems; Mukherjee on monitoring and evaluating risks when these systems are distributed; Sangwan on developing a software engineering approach to architecting and designing AI systems with an emphasis on their testability and security; and Srinivasan on fault tolerance and prediction.

The project also includes Prasenjit Mitra, professor of information sciences and technology, associate dean for research in the College of Information Sciences and Technology, and director of the Center for Socially Responsible Artificial Intelligence. Mitra will serve as a technical consultant, contributing to the risk assessment of adversarial attacks and their mitigation using resilient federated learning techniques.

The impetus for the project came when Badr and Sangwan noticed significant vulnerabilities in AI systems, such as self-driving cars that could be tricked into misreading traffic signs. Because intelligent systems aren’t typically designed from a software or systems engineering perspective, the team saw an opportunity to explore how those frameworks could help advance AI systems.

“From a software and systems engineering perspective, it’s always a good idea to think about designing qualities into the system,” Sangwan said. “What we are hoping is that we’ll be able to come up with a systematic approach for people who are interested in developing intelligent systems so that, number one, they are aware of vulnerabilities, and, number two, they think about these vulnerabilities up front before they develop the product and put it out in the market.”

The multidisciplinary approach extends beyond the faculty, the researchers said. The broad reach of cybersecurity and AI will provide opportunities for graduate students from multiple programs to contribute to the research.

The project also aligns with the campus’s focus on bridging the gap between industry and academia, both for full-time students preparing to enter the workforce and for students already working full-time in industry.

“We seek to create the synergy needed to provide the best opportunities between research and academic programs for the students,” Badr said. “We hope that the project’s outcomes can be transferred to our classrooms and support our campus mission of providing high-quality, innovative and technologically progressive opportunities to collaborate with companies and industry.”