Current Projects


Agriculture Robotics

Smart robotic systems can improve the efficiency and yield of farm operations. Our goal is to provide specialty crop and tree growers with the data they need to monitor and plan agricultural operations. We are creating autonomous robotic systems that are able to navigate in unstructured environments such as forests and orchards and build semantic maps that provide data for carbon management and climate adaptation, as well as actionable information for farmers and forest management experts. Some recent work in autonomous under-the-canopy flight and semantic mapping can be seen in this video. We are part of the IoT4Ag center. Some of our work has been commercialized by Treeswift, a spin-out from our lab.
Researchers: Steven Chen, Fernando Cladera, Alice Li, Xu Liu, Matt Malencia, Sandeep Manjanna, Alex Zhou, and Yuwei Wu

Multi-Robot Systems

We are interested in developing a framework and methodology for the analysis of cooperative and collaborative behavior and the synthesis of bio-inspired collective behavior for engineered systems.

We are interested in such questions as: Can large numbers of autonomously functioning robots be reliably deployed in the form of a “swarm” to carry out a prescribed mission and to respond as a group to high-level management commands? Can such a group successfully function in an unstructured and potentially dangerous environment, without a designated leader, with limited communications between its members, and/or with different and potentially dynamically changing “roles” for its members? What can we learn about how to organize these teams from biological groupings such as insect swarms, bird flocks, and fish schools? Is there a hierarchy of “compatible” models appropriate to swarming, schooling, and flocking that is rich enough to explain these behaviors at various “resolutions,” ranging from aggregate characterizations of emergent behavior to detailed descriptions that model individual vehicle dynamics? Can we create networked robotic systems that can assist humans in complex physical tasks?

Examples of our recent work can be seen on SWAP-constrained platforms, swarms of quadrotors, cross-view localization and mapping by UAVs and UGVs for exploration, and midair self-assembly and vision-based coordinated flight.

Researchers: Anish Bhattacharya, Austin Chen, Fernando Cladera, Laura Hallock, Malakhi Hopkins, Pratik Kunapulli, Ian Miller, Dan Mox, Ty Nguyen, Yuezhan Tao, Dinesh Thakur, and Lifeng Zheng

Agile Autonomous Flight

We are creating small, safe, autonomous flying robots that are able to navigate in complex, three-dimensional environments with or without GPS, with applications to search and rescue, first response, and precision farming. The robots are quadrotors with onboard sensors and processors customized for state estimation, control, and planning. We rely on a range of sensors that can include an IMU, cameras, a laser range scanner, an altimeter, and a GPS sensor. Our smallest robot has a mass of 20 grams and is capable of navigating at 6 m/s in indoor environments. Our largest robot is over 4 kg and can navigate indoors or outdoors, through forests, buildings, and farms. Our robots can also manipulate and interact with the environment. Exyn Technologies, a spin-out from our lab, explores applications for mining and logistics.

Researchers: Anish Bhattacharya, Anthony Bisulco, Austin Chen, Fernando Cladera, Spencer Folk, Laura Hallock, Malakhi Hopkins, Pratik Kunapulli, Elijah Lee, Wenxin Liu, Laura Jarin-Lipschitz, Ian Miller, Dan Mox, Yuezhan Tao, Dinesh Thakur, Jake Welde, and Alex Zhou.

Our work is supported by NSF, NVIDIA, Treeswift, the Army Research Laboratory, Qualcomm, NASA, IARPA, USDA, ONR, Lockheed Martin, and the Semiconductor Research Corporation.