Videos


Smartphones Power Flying Robots (CES 2015)

This video showcases a research collaboration between our group at the University of Pennsylvania and Qualcomm Research to create autonomous flying robots powered by smartphones. The robot (750 g, 54 cm diameter) was designed and built at Penn. Sensing, sensor fusion, control, and planning are all done on an off-the-shelf Samsung Galaxy S5 phone. The robot senses its environment and estimates its pose using information from the camera and IMU on the phone.

Carters Dam Flight 1

Quadrotor flying along the horizontal region of the penstock without external illumination. We collect imagery using the on-board camera and use visual odometry to estimate position along the tunnel axis.

Carters Dam Flight 2

Quadrotor flying along the inclined region of the penstock, approximately 50 meters up the incline. We collect imagery using the on-board camera and use visual odometry to estimate position along the tunnel axis.

Carters Dam Flight 3

Quadrotor flying along the inclined region of the penstock, approximately 15 meters up the incline. We collect imagery using the on-board camera and use visual odometry to estimate position along the tunnel axis.
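
The position estimates in these flights come from visual odometry. As a rough illustration of the idea only (not the pipeline flown on the vehicle), the sketch below accumulates frame-to-frame camera motion from a monocular image stream with OpenCV and projects it onto the tunnel axis; the camera intrinsics, scale handling, and function names are assumptions made for the example.

```python
# Toy monocular visual-odometry sketch (illustrative only; the on-board
# pipeline from the videos is not reproduced here). Monocular VO recovers
# translation only up to scale, so `scale` must come from elsewhere
# (e.g. the IMU or known tunnel geometry).
import cv2
import numpy as np

K = np.array([[525.0, 0.0, 320.0],   # assumed camera intrinsics
              [0.0, 525.0, 240.0],
              [0.0, 0.0, 1.0]])

orb = cv2.ORB_create(2000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def frame_to_frame_motion(img_prev, img_curr):
    """Estimate relative rotation R and unit translation t between two frames."""
    kp1, des1 = orb.detectAndCompute(img_prev, None)
    kp2, des2 = orb.detectAndCompute(img_curr, None)
    matches = matcher.match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t

def distance_along_tunnel(frames, scale=1.0, axis=np.array([0.0, 0.0, 1.0])):
    """Accumulate frame-to-frame motion and project it onto the (assumed known) tunnel axis."""
    position = np.zeros(3)
    R_total = np.eye(3)
    for prev, curr in zip(frames[:-1], frames[1:]):
        R, t = frame_to_frame_motion(prev, curr)
        position += scale * (R_total @ t).ravel()
        R_total = R @ R_total
    return float(position @ axis)
```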

State estimation and autonomous navigation in complex indoor/outdoor environments

Shaojie Shen, Yash Mulgaonkar, Nathan Michael, and Vijay Kumar, ICRA 2014 (to be published)

Vision-based, autonomous flight in indoor environments (4 m/s)

A 740 g quadrotor flies fully autonomously at speeds up to 4 m/s using only onboard sensing and computation. The quadrotor is equipped with two cameras, an IMU, and a 1.6 GHz Intel Atom processor.

Air-ground collaboration for mapping in a damaged building after the 2011 Japan Earthquake

KMel quadrotors dance to music and lights at Saatchi and Saatchi event in Cannes

A troupe of 16 quadrotors (flying robots) dance to and manipulate sound and light at the Saatchi & Saatchi New Directors’ Showcase 2012.

A Swarm of Nano Quadrotors


Discovery Channel: Quadrotors build autonomously

Quadrotors create and execute a plan for building a 3-D structure. They also navigate new areas with a laser and camera, learning about the environment as they fly through it: they detect obstacles and fly around them.

The Colbert Report (watch from about 3 minutes in)

Flying Robots Build Dorm Room Shelves.

Assembly of Structures for Construction with Multiple Quadrotors

Teams of quadrotors autonomously build tower-like cubic structures from modular parts. Work done by Quentin Lindsey, Daniel Mellinger, and Vijay Kumar at the GRASP Lab, University of Pennsylvania.

Vision-based Autonomous Navigation and Mapping with a Small Quadrotor

Vision-based autonomous navigation and mapping using a 740 gram quadrotor equipped with two fisheye cameras and an IMU.

CANINE – GRASP Lab, UPenn

Autonomous robotic “fetch” operation, where a robot is shown a novel object and then asked to locate it in the field, retrieve it and bring it back to the human operator.

Vision-Based Aggressive Flight with a Small Quadrotor

A 740 g quadrotor flies fully autonomously at speeds up to 4 m/s using only onboard sensing and computation. The quadrotor is equipped with two cameras, an IMU, and a 1.6 GHz Intel Atom processor.

Avian-Inspired Grasping For Quadrotor Micro Aerial Vehicles

Collaborative mapping of an earthquake-damaged building via ground and aerial robots

Vision-based state estimation for autonomous rotorcraft

State Estimation for Indoor and Outdoor Operation with a Micro-Aerial Vehicle

We present a methodology for estimating the state of a micro-aerial vehicle (MAV) as it transitions between operating environments in which different sensors are applicable. We ensure that the estimate is smooth and continuous throughout and provide an associated quality measure of the state estimate. The resulting onboard state estimate is directly applied for feedback control. This video shows experimental results of a MAV autonomously flying through indoor and outdoor environments. Work done by Shaojie Shen and Nathan Michael at the GRASP Lab at the University of Pennsylvania.
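
One way to picture the smooth hand-off between sensing modalities is a filter that fuses whichever measurements are currently available and reports its covariance as the quality measure. The sketch below is a deliberately simplified 1-D constant-velocity Kalman filter in that spirit; it is not the estimator from the video, and the sensor names and noise values are assumptions.

```python
# Simplified 1-D constant-velocity Kalman filter that fuses whichever
# position sensors are currently available (e.g. GPS outdoors, vision
# indoors). The state estimate stays continuous across sensor changes,
# and the covariance P serves as a quality measure. All values assumed.
import numpy as np

class SwitchingFilter:
    def __init__(self, dt=0.02):
        self.x = np.zeros(2)                          # [position, velocity]
        self.P = np.eye(2)                            # estimate covariance
        self.F = np.array([[1.0, dt], [0.0, 1.0]])    # constant-velocity model
        self.Q = np.diag([1e-4, 1e-2])                # process noise (assumed)
        self.H = np.array([[1.0, 0.0]])               # both sensors measure position
        self.noise = {"gps": 1.0, "vision": 0.05}     # measurement variances (assumed)

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, measurements):
        """measurements: dict mapping sensor name -> position reading; may be empty."""
        for sensor, z in measurements.items():
            R = np.array([[self.noise[sensor]]])
            y = z - self.H @ self.x                   # innovation
            S = self.H @ self.P @ self.H.T + R
            K = self.P @ self.H.T @ np.linalg.inv(S)
            self.x = self.x + (K @ y).ravel()
            self.P = (np.eye(2) - K @ self.H) @ self.P

    def quality(self):
        return float(self.P[0, 0])                    # position variance as quality measure

# Example: vision is used indoors, then GPS takes over outdoors; the
# estimate x evolves continuously across the switch.
f = SwitchingFilter()
for k in range(200):
    f.predict()
    truth = 0.5 * k * 0.02
    if k < 100:
        f.update({"vision": truth + np.random.normal(0, 0.05)})
    else:
        f.update({"gps": truth + np.random.normal(0, 1.0)})
```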

3D Indoor Exploration with a Computationally Constrained MAV

We present a methodology that enables a quadrotor aerial robot to autonomously explore single- or multi-floor indoor environments without any human interaction. The quadrotor is purchased from Ascending Technologies and comes with an IMU and low-level attitude stabilization. We outfitted the robot with a laser scanner, a Microsoft Kinect sensor, and deflective mirrors to create a fully autonomous platform. We developed a navigation system that enables real-time localization, mapping, planning, and control of the robot in confined indoor environments. All computations are done onboard the 1.6 GHz Atom processor with no reliance on external infrastructure. The exploration algorithm interacts with the planner and controller and provides continuous guidance to the robot.
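
The exploration algorithm from the video is not reproduced here, but the core idea of autonomous exploration is often illustrated with frontier detection on an occupancy grid: find free cells that border unknown space and send the robot toward the nearest one. The sketch below is such a toy illustration; the grid encoding and threshold choices are assumptions.

```python
# Toy frontier detection on a 2-D occupancy grid (illustrative only).
# Cell values: -1 = unknown, 0 = free, 1 = occupied (encoding assumed).
# A frontier cell is a free cell with at least one unknown neighbor; an
# exploration loop would repeatedly plan to the nearest frontier until
# none remain.
import numpy as np

UNKNOWN, FREE, OCCUPIED = -1, 0, 1

def frontier_cells(grid):
    """Return (row, col) indices of free cells adjacent to unknown space."""
    frontiers = []
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            if grid[r, c] != FREE:
                continue
            neighbors = grid[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
            if np.any(neighbors == UNKNOWN):
                frontiers.append((r, c))
    return frontiers

def next_goal(grid, robot_rc):
    """Pick the frontier closest to the robot; None means exploration is done."""
    frontiers = frontier_cells(grid)
    if not frontiers:
        return None
    dists = [np.hypot(r - robot_rc[0], c - robot_rc[1]) for r, c in frontiers]
    return frontiers[int(np.argmin(dists))]

# Example: a small, partially known map with one obstacle.
grid = np.full((6, 6), UNKNOWN)
grid[0:3, 0:3] = FREE
grid[1, 1] = OCCUPIED
print(next_goal(grid, robot_rc=(0, 0)))
```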

Autonomous Aerial Navigation in Confined Indoor Environments

This video presents experimental results of autonomous navigation in confined indoor environments using an aerial robot. The robot is equipped with an IMU, a camera, and a laser scanner with deflective mirrors. All computations are performed onboard using a 1.6 GHz Atom processor. The robot is able to navigate autonomously in indoor or outdoor, GPS-denied environments. A SLAM module with vision-based loop closure allows the robot to map large-scale, multi-floor environments. A sparse 3D map is generated on the robot based on sensor data, enabling high-level planning and visualization.

Autonomous Multi-Floor Indoor Navigation with a Computationally Constrained MAV

This video shows our results on autonomous multi-floor indoor navigation with a quadrotor. We designed a system that is capable of autonomous navigation with real-time performance on a mobile processor using only onboard sensors. Specifically, we address multi-floor mapping with loop closure, localization, planning, and autonomous control, including adaptation to aerodynamic effects during traversal through spaces with low vertical clearance or strong external disturbances. All of the computation is done onboard the 1.6 GHz Intel Atom processor and uses ROS for interprocess communication. Human interaction is limited to providing high-level goals to the robot.
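
Loop closure appears in both of the preceding descriptions. As a rough illustration of why it matters for large multi-floor maps (not the method used in these videos), the sketch below shows the simplest possible correction: when the robot recognizes a previously visited place, the accumulated drift is spread back over the intervening poses. Real systems instead optimize a pose graph; the linear redistribution here is an assumed simplification.

```python
# Toy loop-closure correction (illustrative only). When the robot revisits
# the place recorded at pose index `loop_idx` and recognition says it should
# be back at `true_pose`, the revealed drift is distributed linearly over the
# poses recorded since then. Real SLAM systems optimize a pose graph instead.
import numpy as np

def correct_loop(poses, loop_idx, true_pose):
    """poses: (N, 2) array of x,y estimates; returns a drift-corrected copy."""
    poses = np.asarray(poses, dtype=float).copy()
    drift = poses[-1] - true_pose                 # error revealed by the loop closure
    n = len(poses) - 1 - loop_idx
    for i, k in enumerate(range(loop_idx + 1, len(poses)), start=1):
        poses[k] -= drift * i / n                 # spread the correction along the path
    return poses

# Example: the robot ends up (0.5, 0.1) m off when it recognizes its start point.
path = np.array([[0, 0], [1, 0], [2, 0], [2, 1], [1, 1], [0.5, 0.1]])
print(correct_loop(path, loop_idx=0, true_pose=np.array([0.0, 0.0])))
```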



Scalable sWarms of Autonomous Robots and Mobile Sensors (SWARMS) project.

The SWARMS project brings together experts in artificial intelligence, control theory, robotics, systems engineering and biology with the goal of understanding swarming behaviors in nature and applications of biologically-inspired models of swarm behaviors to large networked groups of autonomous vehicles.

20 Robo Swarm

Alex Kushleyev, Daniel Mellinger, and Vijay Kumar. Towards A Swarm of Agile Micro Quadrotors. Robotics: Science and Systems, July 2012

Anonymity in Pattern Formation by Swarms

M. Turpin, N. Michael, and V. Kumar, “CAPT: Concurrent assignment and planning of trajectories for multiple robots,” International Journal of Robotics Research, 2014 (in press).

Control of formation shape and position/orientation

M. Turpin, N. Michael, and V. Kumar, “Trajectory design and control for aggressive formation flight with quadrotors,” Autonomous Robots, Feb. 2012.

Cooperation in Construction

Quentin Lindsey, Daniel Mellinger and Vijay Kumar, “Construction with quadrotor teams,” Autonomous Robots, 33, (3), 2012.

Design of Small, Safe and Robust Quadrotor Swarms.

Networks of Leaders and Followers

Autonomous boats caging and manipulating objects

In collaboration with Gaurav Sukhatme, USC. This video is supplementary content for the RSS 2013 submission “A Topological Approach to Object Separation and Caging Using a Cable.”

Lexus Commercial

A KMel Robotics tour de force performance.


Aerial Robots for Remote Autonomous Exploration and Mapping
We are interested in exploring the possibility of leveraging an autonomous quadrotor in earthquake-damaged environments through field experiments that focus on cooperative mapping using both ground and aerial robots. Aerial robots offer several advantages over ground robots, including the ability to maneuver through complex three-dimensional (3D) environments and gather data from vantages inaccessible to ground robots.

Collaborative mapping of an earthquake-damaged building via ground and aerial robots

N. Michael, S. Shen, K. Mohta, Y. Mulgaonkar, V. Kumar, K. Nagatani, Y. Okada, S. Kiribayashi, K. Otake, K. Yoshida, K. Ohno, E. Takeuchi, and S. Tadokoro, “Collaborative mapping of an earthquake-damaged building via ground and aerial robots,” J. Field Robotics, vol. 29, no. 5, pp. 832–841, 2012.

Y-Prize – 2013

Created from a vast trove of newly discovered footage from the early days of robotics, going back to the early 1980s, most of which the public has never seen. Collected from the archives of three preeminent engineers, the footage tells the story of the creation of quadrotors, RHex, CKBots and more.

Produced by Kurtis Films | http://www.kurtisfilms.com
Music by Helen Jane Long: http://www.helenjanelong.com

