Past Projects


Agriculture Robotics
IoT4Ag: Xu Liu, Alice Li, Matt, Fernando, Sandeep, Alex
NRI: Xu Liu, Yuwei, Steven Chen
Video: Large-scale Autonomous Flight with Real-time Semantic SLAM under Dense Forest Canopy

Multi Robot Systems
DCIST: Lifeng, Dan, Dinesh, Austin, Ian, Fernando
Qualcomm (self-driving cars): Pratik, Anish
IMOD: Dinesh, Yuezhan, Malakhi
Human–Robot Collaboration: Laura H
Video: Swarm of Inexpensive Heterogeneous Micro Aerial Vehicles

Agile Autonomous Flight
DCIST: Laura JL, Fernando, Alex
C-BRIC: Fernando, Ty, Elijah, Wenxin
IARPA: Fernando, Anish, Anthony
Dynamic Aerial Robots: Jake, Pratik, Spencer, Anish
Video: Fast flight in GPS-denied environments

Autonomous Micro UAVs

We are creating autonomous flying robots that can navigate complex, three-dimensional environments with or without GPS, with applications to search and rescue, first response, and precision farming. The robots are quadrotors with onboard sensors and processors customized for state estimation, control, and planning. We rely on a range of sensors that can include an IMU, cameras, a laser range scanner, an altimeter, and a GPS receiver. Our smallest robot has a mass of 20 grams and can navigate at 6 m/s in indoor environments, or about 53 body lengths per second. Our largest robot is nearly 2 kg and can navigate indoors or outdoors, through forests, buildings, and farms. For more information, see the websites of the researchers below.
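The "body lengths per second" figure above is just the forward speed normalized by the vehicle's size. A minimal sketch of that arithmetic, using only the 6 m/s and 53 BL/s numbers quoted in the text (the implied body length is an inference, not a published spec):

```python
# Relate forward speed to body lengths per second (BL/s), the normalized
# speed metric quoted above. The smallest robot flies at 6 m/s, which the
# text equates to ~53 BL/s; the implied body length follows directly.
def body_lengths_per_second(speed_m_s: float, body_length_m: float) -> float:
    """Normalized speed: how many body lengths the robot covers per second."""
    return speed_m_s / body_length_m

# Implied body length from the figures in the text: 6 / 53 ~= 0.113 m.
implied_body_length = 6.0 / 53.0
print(round(implied_body_length, 3))                                 # -> 0.113
print(round(body_lengths_per_second(6.0, implied_body_length), 1))   # -> 53.0
```

This normalization is why a 20-gram robot at 6 m/s is remarkable: a larger vehicle would need to fly far faster to cover the same number of body lengths each second.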

Grants: MAST, ONR SMART, DARPA FLA, NSF Printable Robots, Exyn Technologies, NASA
Research Staff: Giuseppe Loianno
Students: Yash Mulgaonkar, Kartik Mohta, Sikang Liu, Tolga Ozaslan, Sarah Tang, Justin Thomas, Ke Sun, Anurag Makineni, Mike Watterson, Sadat Shaik
Autonomous Inspection of Critical Infrastructure using Micro Aerial Vehicles

Penstocks are long, featureless, wide, dark tunnels that carry water from the lake to the turbines of a dam. This infrastructure requires regular maintenance because failures can have catastrophic consequences, ranging from cracking of the penstock to complete failure of the dam. Maintenance engineers and workers inspect penstocks manually, either by building scaffolds inside the tunnel to climb through it or by swinging down from the gate in steep tunnels such as the one at Glen Canyon Dam, AZ. This practice is labor- and time-intensive and poses significant danger to the maintenance personnel.

In this work we replace human inspection personnel with an autonomous Micro Aerial Vehicle (MAV) that collects high-resolution imagery from inside the penstock autonomously. Even in very large dams, the inspection can be completed within tens of seconds by a moderately trained operator, whose role is limited to preparing the robot for flight and issuing high-level commands such as take-off, land, and inspect.

Details of the project can be found at the project website.

Grants: U.S. Army Corps of Engineers
Research Staff: Giuseppe Loianno
Students: Tolga Ozaslan, James F. Keller
Scalable sWarms of Autonomous Robots and Mobile Sensors (SWARMS)

We are interested in developing a framework and methodology for the analysis of swarming behavior in biology and the synthesis of bio-inspired swarming behavior for engineered systems. Among the questions we ask: Can large numbers of autonomously functioning vehicles be reliably deployed in the form of a “swarm” to carry out a prescribed mission and to respond as a group to high-level management commands? Can such a group successfully function in a potentially hostile environment, without a designated leader, with limited communications between its members, and/or with different and potentially dynamically changing “roles” for its members? What can we learn about how to organize these teams from biological groupings such as insect swarms, bird flocks, and fish schools? Is there a hierarchy of “compatible” models appropriate to swarming/schooling/flocking that is rich enough to explain these behaviors at various “resolutions,” ranging from aggregate characterizations of emergent behavior to detailed descriptions that model individual vehicle dynamics? For more information, visit the websites of the researchers below.
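One classic answer to the leaderless-coordination question above is a consensus protocol, in which every agent repeatedly averages its state with its neighbors' and the whole group converges without any designated leader. This is a generic textbook sketch, not the project's actual model; the ring topology and gain are illustrative assumptions:

```python
# Minimal leaderless consensus: each agent i updates
#   x_i <- x_i + eps * sum_j (x_j - x_i)
# using only locally communicated neighbor values, and the swarm
# converges to the average of the initial states with no leader.
def consensus_step(values, neighbors, eps=0.2):
    """One synchronous consensus update over a fixed communication graph."""
    return [
        x + eps * sum(values[j] - x for j in neighbors[i])
        for i, x in enumerate(values)
    ]

# Four agents on a ring, with arbitrary initial headings in degrees.
values = [0.0, 90.0, 180.0, 270.0]
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
for _ in range(200):
    values = consensus_step(values, ring)
print([round(v, 1) for v in values])  # every agent converges to the mean, 135.0
```

The same local-averaging structure underlies many flocking and formation behaviors: agreement emerges from repeated neighbor-to-neighbor exchange, which is exactly the "limited communications, no designated leader" regime the questions above describe.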

Grants: ONR Antidote, ONR Science of Autonomy, Terraswarms, MAST, NSF, UTRC
Research Staff: Ani Hsieh (Visiting Professor), Amanda Prorok
Students: Tee Ramaithitima, Kelsey Saulnier, David Saldana, Sarah Tang
Robot Motion Planning

Our work in motion planning has three goals. First, we are interested in topological representations of the set of possible trajectories, known as homology classes; the effect of environment uncertainty on these classes; and the exploration of partially known environments by multiple robots using topological (rather than metric) information. Second, we are interested in how these topological representations can lead to natural human–robot interaction. Finally, we are interested in motion planning algorithms for large teams of robots with dynamic constraints, especially when the robots are identical and therefore interchangeable. For more information, visit the websites of the researchers below.
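Interchangeability matters because any robot may take any goal, so the planner also chooses an assignment. A common formulation (an illustrative assumption here, not necessarily the lab's algorithm) picks the assignment minimizing total squared travel distance; brute force over permutations suffices for small teams:

```python
# Goal assignment for interchangeable robots: since any robot can serve
# any goal, search over goal permutations for the one minimizing the sum
# of squared straight-line distances from each start to its assigned goal.
from itertools import permutations

def assign_goals(starts, goals):
    """Return a tuple giving the goal index assigned to each robot."""
    def cost(perm):
        return sum(
            (sx - goals[g][0]) ** 2 + (sy - goals[g][1]) ** 2
            for (sx, sy), g in zip(starts, perm)
        )
    return min(permutations(range(len(goals))), key=cost)

starts = [(0.0, 0.0), (1.0, 0.0)]
goals = [(1.0, 1.0), (0.0, 1.0)]
print(assign_goals(starts, goals))  # -> (1, 0): each robot takes the nearer goal
```

Brute force is factorial in team size; for large teams the same minimum-cost assignment can be found in polynomial time with the Hungarian algorithm (e.g. `scipy.optimize.linear_sum_assignment`), and squared-distance costs have the useful property that optimally assigned straight-line paths do not collide.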

Research Staff: Subhrajit Bhattacharya
Students: Tee Ramaithitima
RAPID: Aerial Robots for Remote Autonomous Exploration and Mapping

We are interested in exploring the possibility of leveraging an autonomous quadrotor in earthquake-damaged environments through field experiments that focus on cooperative mapping using both ground and aerial robots. Aerial robots offer several advantages over ground robots, including the ability to maneuver through complex three-dimensional (3D) environments and gather data from vantages inaccessible to ground robots. We consider an earthquake-damaged building with multiple floors that are generally accessible to ground robots. However, various locations in the environment are inaccessible to the ground robots due to debris or clutter. The goal is the generation of 3D maps that capture the layout of the environment and provide insight into the degree of damage inside the building.

Grants: NSF Rapid, MAST
Students: Yash Mulgaonkar, Kartik Mohta, Tolga Ozaslan
Human-Robot Coordinated Manipulation and Transportation

We will address the fundamental challenges of cooperative human-robot object manipulation and transportation, based on the precise formulation and rigorous solution of problems in perception, cognition, and control. The key concepts that this research seeks to promote are adaptability to human activity under minimal communication, and robustness to variability and uncertainty in the environment, achieved through a layered representation and deliberate processing of the available information. Moreover, this project aims to make maximum use of a minimal set of sensors to plan and control the actions of the robot, while ensuring safe and efficient cooperative transportation. For more information, visit websites of researchers below.

Grants: Human-robot Coordinated Manipulation and Transportation of Large Objects, TROOPER
Research Staff: Dinesh Thakur
Students: Monroe Kennedy, Mike Watterson
Autonomous Robotic Rotorcraft for Exploration, Surveillance and Transportation (ARREST)

In this NSF-sponsored Partnerships for Innovation (PFI) project, we are creating partnerships with small-business entrepreneurs in the area of micro aerial vehicles, with applications to agriculture, security, law enforcement, and first response. The partnership will enable the translation of fundamental, federally sponsored research results into products with societal benefit and commercial impact by implementing a loosely structured, commercially focused “play-like sandbox” environment among its partners. The Y-Prize competition at Penn is designed to explore novel applications and create new companies.

Students: Mickey Whitzer, Justin Thomas, Kartik Mohta