Current Projects


Autonomous Micro UAVs

We are creating autonomous flying robots that can navigate complex, three-dimensional environments with or without GPS, with applications to search and rescue, first response, and precision farming. The robots are quadrotors with onboard sensors and processors customized for state estimation, control, and planning. We rely on a range of sensors that can include an IMU, cameras, a laser range scanner, an altimeter, and a GPS receiver. Our smallest robot has a mass of 20 grams and can navigate indoor environments at 6 m/s, or about 53 body lengths per second. Our largest robot weighs nearly 2 kg and can navigate indoors or outdoors, through forests, buildings, and farms. For more information, see the websites of the researchers below.
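As a quick back-of-the-envelope check (not a published specification), the quoted top speed and normalized speed together imply a body length of roughly 11 cm for the smallest robot:

```python
# Back-compute the implied body length from the figures quoted above.
speed_m_s = 6.0            # top indoor speed of the 20-gram robot
body_lengths_per_s = 53.0  # normalized speed quoted above

implied_body_length_m = speed_m_s / body_lengths_per_s
print(f"Implied body length: {implied_body_length_m * 100:.1f} cm")  # ~11.3 cm
```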

Grants: MAST, ONR SMART, DARPA FLA, NSF Printable Robots, Exyn Technologies, NASA
Research Staff: Giuseppe Loianno
Students: Yash Mulgaonkar, Kartik Mohta, Sikang Liu, Tolga Ozaslan, Sarah Tang, Justin Thomas, Ke Sun, Anurag Makineni, Mike Watterson, Sadat Shaik
 
Scalable sWarms of Autonomous Robots and Mobile Sensors (SWARMS)

We are developing a framework and methodology for analyzing swarming behavior in biology and synthesizing bio-inspired swarming behavior for engineered systems. Questions of interest include: Can large numbers of autonomously functioning vehicles be reliably deployed as a “swarm” to carry out a prescribed mission and respond as a group to high-level management commands? Can such a group function successfully in a potentially hostile environment, without a designated leader, with limited communication between its members, and/or with different and potentially dynamically changing “roles” for its members? What can we learn about organizing these teams from biological groupings such as insect swarms, bird flocks, and fish schools? Is there a hierarchy of “compatible” models for swarming, schooling, and flocking that is rich enough to explain these behaviors at resolutions ranging from aggregate characterizations of emergent behavior to detailed models of individual vehicle dynamics? For more information, visit the websites of the researchers below.
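As a toy illustration of leaderless coordination with limited communication (a generic consensus update, not any of the algorithms developed in this project), each agent can simply step toward the average position of the neighbors it can hear:

```python
import numpy as np

def consensus_step(positions, adjacency, gain=0.1):
    """One leaderless consensus update: each agent steps toward its neighbors' centroid.

    positions: (n, 2) array of planar agent positions
    adjacency: (n, n) 0/1 matrix; adjacency[i, j] = 1 if agent i can hear agent j
    """
    new_positions = positions.copy()
    for i in range(len(positions)):
        neighbors = np.flatnonzero(adjacency[i])
        if neighbors.size:
            new_positions[i] += gain * (positions[neighbors].mean(axis=0) - positions[i])
    return new_positions

# Example: five agents on a ring communication graph drift toward agreement.
pos = np.random.rand(5, 2)
adj = np.roll(np.eye(5), 1, axis=1) + np.roll(np.eye(5), -1, axis=1)
for _ in range(100):
    pos = consensus_step(pos, adj)
```

Varying the communication graph, the gain, and the agents' roles in models like this is one simple way to move between aggregate descriptions of emergent behavior and individual vehicle dynamics.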

Grants: ONR Antidote, ONR Science of Autonomy, Terraswarms, MAST, NSF, UTRC
Research Staff: Ani Hsieh (Visiting Professor), Amanda Prorok
Students: Tee Ramaithitima, Kelsey Saulnier, David Saldana, Sarah Tang
 
Micro Bio Robots

We are interested in synthesizing Micro Bio Robots (MBRs): robots 10-100 µm in size powered by biological sensors and/or actuators and guided by intelligent control and planning software. MBRs capable of navigating micro-scale environments have applications in drug discovery, proteomics, therapeutics, and micro assembly. Our research ranges from fundamental problems underlying the mobility and adhesion of bacteria, swarming interactions between bacteria, and micro robot navigation in fluid channels to the synthesis of biosensors and biological circuits for sensor-actuator-communication loops. For more information, visit the websites of the researchers below.

Grants: ONR MBR, ONR Antidote, NSF Bio-CPS
Research Staff: Ed Steager
Students: Elizabeth Beattie, Denise Wong
 
Cooperative Manipulation and Transport

How can independent, autonomous robots collaborate to perform manipulation tasks such as lifting and transporting large or heavy payloads? There are several examples in nature of individuals cooperating to perform tasks they cannot perform alone. In collaboration with biologists, we study how ants engage in cooperative prey retrieval, carrying large, awkwardly shaped morsels of food back to their nest. We also study bacteria lifting and swimming with large payloads. Drawing on this biological inspiration, we develop models, design algorithms, and build robotic systems that cooperate both on the ground and in the air.
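As a minimal illustration of the statics involved (a textbook calculation, not the cooperative-transport method developed in this project), two robots lifting a rigid payload straight up share the load according to where they grip it relative to its center of mass:

```python
G = 9.81  # gravitational acceleration, m/s^2

def load_share(mass_kg, d1_m, d2_m):
    """Vertical forces (N) on two robots gripping a payload d1 and d2 from its center of mass."""
    total = mass_kg * G
    f1 = total * d2_m / (d1_m + d2_m)  # the robot gripping closer to the center of mass carries more
    f2 = total * d1_m / (d1_m + d2_m)
    return f1, f2

# Hypothetical example: a 2 kg payload gripped 0.2 m and 0.4 m from its center of mass.
print(load_share(2.0, 0.2, 0.4))  # ≈ (13.1 N, 6.5 N)
```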

Grants: ONR Antidote, NSF Human-robot Coordinated Manipulation and Transportation of Large Objects
Students: Monroe Kennedy, Denise Wong
 
Printable Robots

We are creating desktop technology that prints programmable, printable robots, lowering the barrier to entry in robotics and enabling consumers to use robots in their lives. Just as the personal computer made it possible to synthesize and manipulate information, analyze data, and send the results to a printer, we want to make it possible to create and manipulate designs of physical objects and use technologies such as 3-D printing to produce robots on demand. Our immediate focus is on small, autonomous flying robots and modular robots that can be reconfigured for grasping, manipulation, and locomotion. For more information, visit the websites of the researchers below.

Grants: NSF Expedition in Computing for Compiling Printable Programmable Machines
Research Staff: Giuseppe Loianno
Students: Yash Mulgaonkar, Kartik Mohta, Mickey Whitzer, Chao Qu
 
Flying Smart Phones

In this project, our goal is to build autonomous Micro Aerial Vehicles (MAVs) using regular off-the-shelf smartphones and their chipsets. Smartphones are among the most compact, highest-performance, and lowest-cost computation and sensor packages available. On board an average smartphone are an excellent CPU, two cameras, a battery, GPS, Wi-Fi and Bluetooth modules, an inertial measurement unit (IMU), cellular connectivity, memory, and sometimes a GPU. There are more smartphones in the world than people, and roughly two billion people globally have access to one. Through this project, we hope a similar number of people around the world will soon have access to robots capable of making their lives better.

Grants: Qualcomm Research
Research Staff: Giuseppe Loianno
Students: Yash Mulgaonkar, Adam Cho, Tanmay Chordia
 
Robotic First Response – Search and Rescue

Robotic first responders can react quickly to disasters and emergencies, providing support and situational awareness before human responders reach the scene. We envision a team of heterogeneous robots launched either by high-level commands from a dispatcher or triggered automatically by a disaster detection system. Using onboard sensors, these robots relay information back to the command station to guide further efforts. With such guidance, we believe human efforts can be more focused and more effective.

Grants: NSF, DARPA
Research Staff: Ani Hsieh (Visiting Professor), Amanda Prorok, Nikolay Atanasov
Students: Kartik Mohta
 
Robot Scouts for Precision Agriculture

We are developing smart robotic systems to improve efficiency and yield of farm operations. Our goal is to provide specialty crop growers with a data-driven deployment strategy that makes synergistic use of a networked robotic system working interactively with a human scout. First, we have developed a lightweight and self-contained multi-spectral 3-D imaging system that has been deployed using unmanned aerial vehicles (UAVs), ground vehicles, and carried by a human scout. Acquired data have been used to train statistical models enabling persistent monitoring of crop yield, morphology, and health. Second, we are developing the framework and algorithms to deploy multiple UAVs that can collaborate with and be controlled by a single human scout. Finally, an agricultural decision support system (AgDSS) is being developed to facilitate annotation of field data acquired by our systems, and introspection of learned predictive models. Our technology stack will enable a human scout and a swarm of co-robots to operate in concert over extended periods while accommodating constraints on sensing, navigation speeds, and power consumption.
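As a loose sketch of how multi-spectral imagery can feed a statistical yield model (NDVI is a standard vegetation index; the specific features, models, and numbers used in the project are not stated here, and the data below are hypothetical):

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """Normalized Difference Vegetation Index from near-infrared and red bands."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)

# Hypothetical per-plot summaries: mean NDVI versus measured yield (t/ha).
mean_ndvi = np.array([0.42, 0.55, 0.61, 0.70, 0.78])
yield_t_ha = np.array([3.1, 4.0, 4.6, 5.2, 5.9])

# Simple least-squares fit: yield ≈ a * NDVI + b, usable for persistent monitoring.
a, b = np.polyfit(mean_ndvi, yield_t_ha, 1)
print(f"Predicted yield at NDVI 0.65: {a * 0.65 + b:.1f} t/ha")
```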

Grants: USDA Robot Swarms and Human Scouts for Persistent Monitoring of Specialty Crops, NSF, ONR, Berkman Opportunity Fund
Research Staff: Jnaneshwar Das
Students: Chao Qu, Steven Chen, Shreyas Skandan, Anuj Panda, Andrew Block, Daniel Orol, Delaney Kaufman, Sandeep Dcunha, Jacob Beckerman, Ryan Kortvelesy
 
UAV Testbed for the CPS and Robotics Community

We are developing a cloud-enabled testbed for UAV education and research. Supported by the NSF CPS Virtual Organization’s Active Resources initiative, the testbed includes standardized UAV hardware and an end-to-end simulation stack built on open-source technologies. The testbed facilitated a pilot student UAV challenge held at the TIMPA airfield in Arizona on October 3-4, 2016, where four student teams from Vanderbilt University, the University of Arizona, UCLA, and UPenn demonstrated, with varying degrees of autonomy, the deployment and retrieval of a mosquito trap. This task was motivated by Microsoft Research’s Project Premonition, which also funded the hardware for the participating teams.

Project Website: http://openuav.us

Grants: NSF, Microsoft Research
Research Staff: Jnaneshwar Das
Students: Lukas Vacek, Edward Atter, Ben Kramer, Pedro Rizo, Brian Nam, Ryan Kortvelesy, Anurag Makineni, Anuj Panda, Delaney Kaufman
 