This October, the University of Maryland (UMD) is recognizing the incredible innovations fostered by artificial intelligence (AI). As part of this celebration, UMD welcomed media and guests to join researchers from across campus who are leveraging AI for real-world impact: underwater drones for oyster farming, robotic dogs for mass casualty response, voice bots that break down barriers to food access, and AI-powered communication for stroke survivors.
Local journalists and UMD leaders were welcomed by Hal Daumé III, director of the Artificial Intelligence Interdisciplinary Institute at Maryland (AIM), who highlighted the institute's mission: bringing together AI experts from across UMD to focus on the responsible, ethical development and use of AI to advance the public good in industry, government and society. UMD has more than 200 faculty engaged in AI research and education, and offers students over 200 AI-focused courses across 50 departments, along with multiple dedicated degree programs.
Below, explore some of the demos featuring student- and faculty-led projects.
RoboScout: AI-Powered Triage for Mass Casualty Events
Presented by Derek Paley and Kleio Baxevani
College Represented: ENGR

Researchers are developing a system of drones and ground robots designed to autonomously locate survivors and assess injuries when there are more casualties than first responders can handle. An AI voice assistant communicates with victims to determine injury severity while sensors collect health data, feeding into a triage system that would help responders prioritize care. The technology aims to address the critical gap between when disaster strikes and when help arrives.
Underwater Oyster Drone: Precision Aquaculture for Sustainable Seafood
Presented by Miao Yu, Michael Xu, Yisheng Zhang, Kaustubh Joshi, Chiao-Yi Wang and William Chen
Colleges Represented: ENGR, CMNS
Oyster farmers have been probing murky bay water with bamboo poles for centuries—a technique one waterman calls "caveman-ish." AI-trained sonar mounted on autonomous drones can now see through turbid water to locate oysters, determine their size, and distinguish living shellfish from dead ones. The technology aims to modernize Maryland's $350 million oyster industry while supporting oysters' role as natural water filters.
Mario Kart Safety Training: Teaching AI to Drive Responsibly
Presented by Mumu Xu, Kristy Sakano and Alexis Chen
Colleges Represented: ENGR, CMNS
A 33-year-old Nintendo game is helping researchers tackle one of autonomous vehicles' biggest problems: the "black box" algorithms that even their designers can't fully explain. By training AI to race through Super Mario Kart tracks while prioritizing safety over speed, researchers are developing a relatable framework for certifying autonomous systems.
Food Security Voice Bot: Breaking Down Barriers to Food Access
Presented by Arden Lawson, Amira Abujuma and Saurav Vidyadhara
College Represented: ENGR
Finding food assistance often means navigating a maze of phone calls and waiting on hold—especially difficult when dealing with transportation barriers or language differences.
This voice-based AI system was designed by undergraduate students as part of the Smith School of Business AI and Food Insecurity case competition, in partnership with Capital Area Food Bank. It guides callers through personalized questions about location, transportation, dietary restrictions and other needs, and can communicate in dozens of languages.
Augmentative and Alternative Communication for Disabilities: AI-Powered Voice for Stroke Survivors
Presented by Stephanie Valencia, Claire O'Connor and Jong Ho Lee
College Represented: INFO
Nearly 800,000 Americans experience strokes each year, and many face long-term speech impairments that turn even simple communication into a frustrating challenge. This AI-powered device lets users enter as few as one to four words, then generates complete sentences they can vocalize. While initially tested with stroke survivors, it is designed for anyone whose speech is affected, whether by aphasia, dementia or autism.
GenAI Architecture: Democratizing Design Through AI
Presented by Brittany Williams, Lindsey May and Michael Ezban
College Represented: ARCH
A hand-drawn sketch of a building becomes a photorealistic architectural rendering in seconds with generative AI tools. Researchers have been studying how this technology can support the design process, finding that it accelerates early ideation and lets designers spend less time on preliminary drafts. Their work with architecture students found that using generative AI encouraged more creative thinking and sparked richer conversations about design possibilities.
Perception and Robotics: Bio-Inspired Drones
Presented by Naitri Rajyaguru and Levi Burner
Colleges Represented: CMNS, ENGR
These aren't your typical quadcopters—Flapper drones have bird-like flapping wings that require AI-powered camera stabilization to capture steady footage for agricultural monitoring.
Separately, palm-sized Crazyflie drones use computer vision to autonomously weave through obstacles, with potential applications in emergency response scenarios like navigating smoke-filled buildings or assessing hard-to-reach civil infrastructure.
Small Artifacts (SMART) Lab: Wearable Robots and Tangible Design for Accessibility
Presented by Jiasheng Li and Dannuo Li
Colleges Represented: ENGR, CMNS
Calico is a miniature robot that rides along tracks sewn into clothing and can relocate anywhere on the wearer's body, with applications ranging from dance training to health monitoring; it was co-designed with elderly users to better suit their needs. TangibleGrid takes a different approach to accessibility: shape-changing physical brackets, developed through co-design sessions with blind users, that let them design webpage layouts by touch. Both projects demonstrate how involving end users in the design process can create more inclusive technology.
Every day, UMD faculty and students are developing more applications to use AI for good. To learn more about AI at the University of Maryland, visit ai.umd.edu.