Videos
This video presents an autonomous 250 g quadrotor performing aggressive maneuvers using a Qualcomm Snapdragon Flight, relying only on on-board computation and sensing. The control, planning, and estimation tasks are solved using the information provided by a single camera and an IMU.
Highlights from data-collection tests for DARPA's Fast Lightweight Autonomy (FLA) program. The fastest flight was by Penn's "Team Falcon" from the GRASP Lab! The other teams were composed of MIT/Draper and SSCI/AV.
In this paper, we present the Flying Monkey, a novel robotic platform with three main capabilities: walking, grasping, and flight. This new platform merges one of the world's smallest quadrotor aircraft with a lightweight, single-degree-of-freedom walking mechanism and an SMA-actuated gripper to enable all three functions in a 30 g package. The main goal and key contribution of this paper is the design and prototyping of the Flying Monkey, which gains increased mission life and capabilities by combining the functionalities of legged and aerial robots. Yash Mulgaonkar, Brandon Araki, Je-sung Koh, Luis Guerrero-Bonilla, Daniel M. Aukes, Anurag Makineni, Michael T. Tolley, Daniela Rus, Robert J. Wood, and Vijay Kumar.
Sikang Liu, Michael Watterson, Sarah Tang, and Vijay Kumar
Justin Thomas, Giuseppe Loianno, Morgan Pope, Elliot W. Hawkes, Matthew A. Estrada, Hao Jiang, Mark R. Cutkosky, Vijay Kumar.
This video showcases a research collaboration between our group at the University of Pennsylvania and Qualcomm Research in creating autonomous flying robots powered by smartphones. The robot (750 g, 54 cm in diameter) was designed and built at Penn. The sensing, sensor fusion, control, and planning are all done on an off-the-shelf Samsung Galaxy S5 phone. The robot senses its environment and estimates its pose using information from the camera and IMU on the phone.
Quadrotor flying along the horizontal region of the penstock without external illumination. We collect imagery using the on-board camera and use visual odometry to estimate position along the tunnel axis.
Quadrotor flying along the inclined region of the penstock, approximately 50 meters up the incline. We collect imagery using the on-board camera and use visual odometry to estimate position along the tunnel axis.
Quadrotor flying along the inclined region of the penstock, approximately 15 meters up the incline. We collect imagery using the on-board camera and use visual odometry to estimate position along the tunnel axis.
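As a minimal sketch of how such a position estimate can be obtained (not the actual onboard pipeline), the snippet below composes hypothetical frame-to-frame visual-odometry motions and projects the accumulated position onto the tunnel axis; the function name and its inputs are illustrative, not part of the system described above.

```python
import numpy as np

def distance_along_axis(relative_motions, axis):
    """Integrate frame-to-frame VO motions and project onto the tunnel axis.

    relative_motions: list of (R, t) rotation/translation pairs between
    consecutive frames, as a VO front end might report them (assumed here).
    axis: unit vector of the tunnel axis in the world frame.
    """
    axis = np.asarray(axis, dtype=float)
    axis /= np.linalg.norm(axis)
    R_world = np.eye(3)       # orientation of the current camera frame
    position = np.zeros(3)    # integrated camera position
    for R, t in relative_motions:
        position += R_world @ np.asarray(t, dtype=float)  # compose translation
        R_world = R_world @ np.asarray(R, dtype=float)    # compose rotation
    return float(position @ axis)  # signed distance along the tunnel axis

# Three hypothetical 0.5 m forward steps along the tunnel (x) axis.
steps = [(np.eye(3), [0.5, 0.0, 0.0])] * 3
print(distance_along_axis(steps, axis=[1.0, 0.0, 0.0]))  # -> 1.5
```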
State estimation and autonomous navigation in complex indoor/outdoor environments
Shaojie Shen, Yash Mulgaonkar, Nathan Michael and Vijay Kumar, ICRA 2014 (to be published)
A 740 g quadrotor flies fully autonomously at speeds of up to 4 m/s using only onboard sensing and computation. The quadrotor is equipped with two cameras, an IMU, and a 1.6 GHz Intel Atom processor.
A troupe of 16 quadrotors (flying robots) dance to and manipulate sound and light at the Saatchi & Saatchi New Directors’ Showcase 2012.
Quadrotors create and execute a plan for building a 3-D structure. They also navigate new areas with a laser and camera, learning about the environment as they fly through it; they detect obstacles and fly around them.
Flying Robots Build Dorm Room Shelves.
Teams of quadrotors autonomously build tower-like cubic structures from modular parts. Work done by Quentin Lindsey, Daniel Mellinger, and Vijay Kumar at the GRASP Lab, University of Pennsylvania.
Vision-based autonomous navigation and mapping using a 740 gram quadrotor equipped with two fisheye cameras and an IMU.
Autonomous robotic “fetch” operation, where a robot is shown a novel object and then asked to locate it in the field, retrieve it and bring it back to the human operator.
We present a methodology for estimating the state of a micro-aerial vehicle (MAV) as it transitions between operating environments in which different sensors are applicable. We ensure that the estimate is smooth and continuous throughout and provide an associated quality measure of the state estimate. The resulting onboard state estimate is directly applied for feedback control. This video shows experimental results of a MAV autonomously flying through indoor and outdoor environments. Work done by Shaojie Shen and Nathan Michael at the GRASP Lab at the University of Pennsylvania.
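The estimator itself is not detailed in this caption; as a minimal sketch of why such an estimate can stay smooth as sensors come and go, consider inverse-variance (information-weighted) fusion of whichever sensors are currently usable. The sensor values and noise figures below are invented for illustration.

```python
def fuse_available(measurements):
    """Information-weighted fusion of the currently usable sensors.

    measurements: list of (value, variance) pairs, one per available sensor.
    Returns the fused estimate and its variance (a quality measure). When a
    sensor drops out, its term simply vanishes and the variance grows, so
    the estimate degrades gracefully instead of jumping discontinuously.
    """
    info = sum(1.0 / var for _, var in measurements)
    mean = sum(val / var for val, var in measurements) / info
    return mean, 1.0 / info

# Indoors: accurate laser plus vision.  Outdoors: GPS plus vision.
print(fuse_available([(2.00, 0.01), (2.10, 0.05)]))  # laser-dominated
print(fuse_available([(2.30, 0.25), (2.10, 0.05)]))  # vision-dominated
```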
We present a methodology that enables a quadrotor aerial robot to autonomously explore single- or multi-floor indoor environments without any human interaction. The quadrotor, purchased from Ascending Technologies, comes with an IMU and low-level attitude stabilization. We outfitted the robot with a laser scanner, a Microsoft Kinect sensor, and deflective mirrors to create a fully autonomous platform. We developed a navigation system that enables real-time localization, mapping, planning, and control of the robot in confined indoor environments. All computation is done onboard the 1.6 GHz Atom processor, with no external infrastructure required. The exploration algorithm interacts with the planner and controller and provides continuous guidance to the robot.
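The caption does not specify the exploration strategy; one common approach consistent with this setup is frontier-based exploration, where the robot repeatedly flies to known-free cells that border unexplored space. The grid encoding and function below are illustrative, not the system's actual implementation.

```python
import numpy as np

FREE, OCCUPIED, UNKNOWN = 0, 1, -1  # illustrative occupancy-grid encoding

def frontier_cells(grid):
    """Return free cells adjacent to unknown space in an occupancy grid.

    Frontier cells are natural goal candidates for an exploration planner:
    flying to them is safe (they are known-free) and informative (new area
    becomes visible from there).
    """
    frontiers = []
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            if grid[r, c] != FREE:
                continue
            neighbors = grid[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
            if (neighbors == UNKNOWN).any():
                frontiers.append((r, c))
    return frontiers

grid = np.full((4, 4), UNKNOWN)
grid[:2, :2] = FREE              # a small explored pocket
print(frontier_cells(grid))      # cells bordering the unknown region
```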
This video presents experimental results of autonomous navigation in confined indoor environments using an aerial robot. The robot is equipped with an IMU, camera, and laser scanner with deflective mirrors. All computations are performed onboard using a 1.6 GHz Atom processor. The robot is able to navigate autonomously in indoor or outdoor, GPS-denied environments. A SLAM module with vision-based loop closure allows the robot to map large-scale, multi-floor environments. A sparse 3D map is generated on the robot based on sensor data, enabling high-level planning and visualization.
This video shows our results on autonomous multi-floor indoor navigation with a quadrotor. We designed a system that is capable of autonomous navigation with real-time performance on a mobile processor using only onboard sensors. Specifically, we address multi-floor mapping with loop closure, localization, planning, and autonomous control, including adaptation to aerodynamic effects during traversal through spaces with low vertical clearance or strong external disturbances. All of the computation is done onboard the 1.6 GHz Intel Atom processor, with ROS used for interprocess communication. Human interaction is limited to providing high-level goals to the robot.
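To illustrate why loop closure matters for multi-floor mapping, the toy example below optimizes a one-dimensional pose graph: odometry constraints accumulate drift, and a single loop-closure constraint redistributes it in a least-squares sense. The problem size and numbers are invented for illustration and are not taken from the system above.

```python
import numpy as np

def optimize_pose_graph(odometry, loop_closures, n):
    """Least-squares pose-graph sketch in 1-D (position along a corridor).

    odometry / loop_closures: lists of (i, j, d) constraints x_j - x_i = d.
    Solving all constraints jointly spreads the accumulated odometric drift
    across the trajectory instead of leaving it at the revisited pose.
    """
    constraints = odometry + loop_closures
    A = np.zeros((len(constraints) + 1, n))
    b = np.zeros(len(constraints) + 1)
    for k, (i, j, d) in enumerate(constraints):
        A[k, j], A[k, i], b[k] = 1.0, -1.0, d   # encode x_j - x_i = d
    A[-1, 0] = 1.0                               # anchor the first pose at 0
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

odom = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.1)]   # slightly drifting odometry
loops = [(0, 3, 3.0)]                            # revisit detected by vision
print(optimize_pose_graph(odom, loops, n=4))     # drift is redistributed
```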
Scalable sWarms of Autonomous Robots and Mobile Sensors (SWARMS) project.
The SWARMS project brings together experts in artificial intelligence, control theory, robotics, systems engineering and biology with the goal of understanding swarming behaviors in nature and applications of biologically-inspired models of swarm behaviors to large networked groups of autonomous vehicles.
This video accompanies our submission to ICRA 2019 on multi-MAV mutual localization under the following assumptions (a toy data-association sketch follows the list):
1. Initial relative poses between robots are unknown.
2. Robot detection provides no identity information.
3. Robot detection can include false negatives and false positives.
4. Vision-based distance and bearing measurements are noisy.
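Because detections are anonymous and may be spurious (assumptions 2 and 3), each robot must first decide which detection corresponds to which teammate. As a minimal sketch, not the method in the submission, one can solve a gated global nearest-neighbor assignment between predicted teammate positions and incoming detections; the names, gate, and numbers below are illustrative.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def associate(predicted, detected, gate=1.0):
    """Assign anonymous detections to predicted teammate positions.

    Detections carry no identities (assumption 2), so we solve a global
    nearest-neighbor assignment; a gating distance rejects implausible
    pairings, which absorbs false positives and tolerates missed
    detections (assumption 3).
    """
    cost = np.linalg.norm(predicted[:, None, :] - detected[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)          # Hungarian assignment
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] < gate]

predicted = np.array([[0.0, 0.0], [5.0, 0.0]])   # filter predictions
detected = np.array([[0.2, 0.1], [9.0, 9.0]])    # one match, one clutter
print(associate(predicted, detected))            # -> [(0, 0)]
```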
Aerial Robots for Remote Autonomous Exploration and Mapping
We are interested in exploring the possibility of leveraging an autonomous quadrotor in earthquake-damaged environments through field experiments that focus on cooperative mapping using both ground and aerial robots. Aerial robots offer several advantages over ground robots, including the ability to maneuver through complex three-dimensional (3D) environments and gather data from vantages inaccessible to ground robots.
N. Michael, S. Shen, K. Mohta, Y. Mulgaonkar, V. Kumar, K. Nagatani, Y. Okada, S. Kiribayashi, K. Otake, K. Yoshida, K. Ohno, E. Takeuchi, and S. Tadokoro, “Collaborative mapping of an earthquake-damaged building via ground and aerial robots,” J. Field Robotics, vol. 29, no. 5, pp. 832–841, 2012.
CPS: Frontier: Collaborative Research: bioCPS for Engineering Living Cells
Calin Belta, Boston University (Lead PI)
Doug Densmore, Boston University (Co-PI)
Vijay Kumar, University of Pennsylvania (PI)
Ron Weiss, Massachusetts Institute of Technology (PI)
We combine strategies for passive particle assembly in soft matter with robotics to develop new means of controlled interaction. In capillary assembly, particles distort fluid interfaces and move in directions that minimize the surface area. In particular, they migrate along principal axes on curved interfaces to sites of high curvature. We propose a robot that serves as a programmable source of fluid curvature and allows the collection of passive particles. When settled on a fluid interface, the magnetic robot distorts the interface, which strongly influences curvature capillary migration. The shape of the robot dictates the interface shape; for example, high interface curvature near its corners creates sites of preferred assembly. This freedom to manipulate interface curvature dynamically and to migrate laterally on the interface creates new possibilities for directed bottom-up particle assembly and precise manipulation of the resulting structures. Since the passive particles can be functionalized to sense, report, and interact with their surroundings, this work paves the way to new schemes for the creation and control of functionalized microrobots.
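The capillary energetics are only summarized above; as a deliberately simplified caricature of curvature capillary migration, the sketch below moves a passive particle up a made-up curvature field that peaks at one corner of the robot, mirroring how the particle's capillary energy drops at high-curvature sites. The field, step size, and convergence behavior are illustrative assumptions, not the physics of the paper.

```python
import numpy as np

def curvature(p, corner):
    """Schematic curvature field, largest near the robot's corner (assumed)."""
    return 1.0 / (1.0 + np.linalg.norm(p - corner) ** 2)

def migrate(p, corner, steps=200, eta=0.5, eps=1e-3):
    """Gradient-ascent caricature of curvature capillary migration.

    The particle drifts up the local curvature gradient toward the corner,
    mimicking the energy decrease that drives assembly at high-curvature
    sites. All constants here are illustrative, not fit to experiment.
    """
    for _ in range(steps):
        grad = np.array([
            (curvature(p + d, corner) - curvature(p - d, corner)) / (2 * eps)
            for d in np.eye(2) * eps])   # central finite differences
        p = p + eta * grad
    return p

corner = np.array([1.0, 1.0])
print(migrate(np.array([0.0, 0.0]), corner))  # converges toward the corner
```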
Independent Control of Identical Magnetic Robots in a Plane
Denise Wong, Edward Steager, Vijay Kumar