Organizers: Cynthia Matuszek, Dieter Fox
Abstract: PechaKucha 20x20 is a new approach to giving presentations. From the FAQ: "PechaKucha 20x20 is a simple presentation format where you show 20 images, each for 20 seconds. The images advance automatically and you talk along to the images." While the presentation format was originally developed for architecture presentations, it has been successfully applied to fields as diverse as art, cooking, design, and journalism. This talk will give an overview of the format and some examples, in the interest of stimulating discussion about the role of such a format in technology.
Abstract: In this session, eight interested (and interesting!) robotics researchers will use one of the popular "Flash presentation" styles to spend a few minutes covering information about their own work, related work that they think is worth knowing about, or any other robotics-related topic they wish.
Abstract: Kiva's mobile fulfillment system blends techniques from AI, Control Systems, Machine Learning, Operations Research, and other engineering disciplines into the world's largest mobile robotic platform. Kiva uses hundreds of mobile robots to carry inventory shelves around distribution centers for customers like Staples, Walgreens, and The Gap. Kiva currently has equipment in over 30 warehouses in three countries. This talk will describe the application domain, the business solution, and some of the practical engineering problems that Kiva has solved along the way.
Abstract: We often assume that general-purpose robot hands should be complex, perhaps even as complex as human hands. Yet humans can do a lot even when using tongs. This talk describes ongoing work with simple hands, inspired by very simple tools like tongs. We explore a robot's ability to grasp, recognize, localize, place, and even manipulate objects in the hand, all with a very simple hand. The perception and planning algorithms are based on learned models, which are in turn based on thousands of experiments with the objects in question.
Biography: Dr. Matthew T. Mason earned the BS, MS, and PhD degrees in Computer Science and Artificial Intelligence at MIT, finishing his PhD in 1982. Since that time he has been on the faculty at Carnegie Mellon University, where he is presently Professor of Robotics and Computer Science, and Director of the Robotics Institute. His prior work includes force control, automated assembly planning, mechanics of pushing and grasping, automated parts orienting and feeding, and mobile robotics. He is co-author of "Robot Hands and the Mechanics of Manipulation" (MIT Press 1985), co-editor of "Robot Motion: Planning and Control" (MIT Press 1982), and author of "Mechanics of Robotic Manipulation" (MIT Press 2001). He is a Fellow of the AAAI and a Fellow of the IEEE. He is a winner of the System Development Foundation Prize and the IEEE Robotics and Automation Society's Pioneer Award.
Abstract: All living creatures process information from multiple sensory modalities and, in turn, control movement through multiple actuators. They do so to navigate through spatially and temporally complex environments with amazing agility. Among the most successful of nature's robots are insects, occupying every major habitat. This talk will review sensorimotor control of movement in flying insects, with a focus on where the functional roles of sensing and actuation become blurred.
Biography: Dr. Tom Daniel holds the Joan and Richard Komen Endowed Chair at the University of Washington, with appointments in the Department of Biology, the Department of Computer Science & Engineering, and the Program in Neurobiology and Behavior. He is currently the Interim Director of the National Science Foundation Center for Sensorimotor Neural Engineering (CSNE). He has served as a UW faculty member since his initial appointment in 1984 and was the founding chair of the Department of Biology at the University of Washington (2000-2008). Prior to the UW, he was the Myron A. Bantrell Postdoctoral Fellow in Engineering Sciences at the California Institute of Technology. He received his PhD from Duke University. He was named a MacArthur Fellow in 1996 and has received the University of Washington Distinguished Teaching Award and the University of Washington Distinguished Graduate Mentor Award. He serves on the editorial boards of Science and the Proceedings of the Royal Society (Biology Letters). He is also on the Board of Directors and the Scientific Advisory Board of the Allen Institute for Brain Science, and the Scientific Advisory Board for the NSF Mathematical Biosciences Institute. His research programs focus on biomechanics, neurobiology, and sensory systems, addressing questions about the physics, engineering, and neural control of movement in biological systems.
Abstract: Although physical interaction with the world is at the core of human experience, few computer and machine interfaces provide the operator with high-fidelity touch feedback, limiting their usability. Similarly, autonomous robots rarely take advantage of touch perception and thus struggle to match the manipulation capabilities of humans. My long-term research goal is to leverage scientific knowledge about the sense of touch to engineer haptic interfaces and robotic systems that increase the range and quality of tasks humans can accomplish. This talk will describe my group's three main research thrusts: haptic texture rendering, touch feedback for robotic surgery, and touch perception for autonomous robots. First, most haptic interfaces struggle to mimic the feel of a tool dragging along a surface due to both software and hardware limitations. We pioneered a data-driven method of capturing and recreating the high-bandwidth vibrations that characterize tool-mediated interactions with real textured surfaces. Second, although commercial robotic surgery systems are approved for use on human patients, they provide the surgeon with little to no haptic feedback. We have invented, refined, and studied a practical method for giving the surgeon realistic tactile feedback of instrument vibrations during robotic surgery. Third, household robots will need to know how to grasp and manipulate a wide variety of objects. We have invented a set of methods that enable a robot equipped with commercial tactile sensors to delicately and firmly grasp real-world objects and perceive their haptic properties. Our work in all three of these areas has been principally enabled by a single insight: although less studied than kinesthetic cues, tactile sensations convey much of the richness of physical interactions.
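The data-driven texture-rendering idea in the abstract above, recording the high-bandwidth vibrations of a tool dragging on a real surface and then resynthesizing them, can be illustrated with a toy sketch. This is a hypothetical simplification for intuition, not the group's actual rendering pipeline: it fits a low-order autoregressive (AR) model to a recorded vibration signal by least squares, then drives the fitted filter with white noise to generate new vibration of arbitrary length with similar spectral character. The function names and the choice of model order are assumptions.

```python
import numpy as np


def fit_ar(signal, order=8):
    """Fit AR coefficients a so that x[n] ~ sum_k a[k] * x[n-1-k], via least squares.

    Returns the coefficients and the standard deviation of the residual,
    which serves as the noise level for resynthesis.
    """
    # Each column k holds the lag-(k+1) samples aligned with the targets y.
    X = np.column_stack(
        [signal[order - 1 - k : len(signal) - 1 - k] for k in range(order)]
    )
    y = signal[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coeffs
    return coeffs, resid.std()


def synthesize(coeffs, noise_std, n_samples, seed=None):
    """Excite the fitted AR filter with white noise to generate a new vibration."""
    rng = np.random.default_rng(seed)
    order = len(coeffs)
    x = np.zeros(n_samples + order)
    for n in range(order, len(x)):
        # x[n - order : n][::-1] is [x[n-1], x[n-2], ..., x[n-order]].
        x[n] = coeffs @ x[n - order : n][::-1] + rng.normal(0.0, noise_std)
    return x[order:]
```

In a real system the synthesized signal would be scaled by measured contact speed and force and sent to a vibrotactile actuator; here the point is only the capture-then-resynthesize structure.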
Biography: Dr. Katherine J. Kuchenbecker is the Skirkanich Assistant Professor of Innovation in Mechanical Engineering and Applied Mechanics at the University of Pennsylvania. Her research centers on the design and control of haptic interfaces for applications such as robot-assisted surgery, medical simulation, stroke rehabilitation, and personal computing. She directs the Penn Haptics Group, which is part of the General Robotics, Automation, Sensing, and Perception (GRASP) Laboratory. She has won several awards for her research, including an NSF CAREER Award in 2009, Popular Science Brilliant 10 in 2010, and the IEEE Robotics and Automation Society Academic Early Career Award in 2012. Prior to becoming a professor, she completed a postdoctoral fellowship at the Johns Hopkins University, and she earned her Ph.D. in Mechanical Engineering at Stanford University in 2006.
Abstract: Robots are typically far less capable in autonomous mode than in teleoperated mode. The few exceptions tend to stem from long days (and more often weeks, or even years) of expert engineering for a specific robot and its operating environment. Current control methodology is quite slow and labor-intensive. I believe advances in machine learning and optimization have the potential to revolutionize robotics. First, I will present new machine learning techniques we have developed that are tailored to robotics. I will describe in depth "apprenticeship learning," a new approach to high-performance robot control based on learning for control from ensembles of expert human demonstrations. Our initial work in apprenticeship learning has enabled the most advanced helicopter aerobatics to date, including maneuvers such as chaos, tic-tocs, and auto-rotation landings, which only exceptional expert human pilots can fly. Our most recent work in apprenticeship learning is inspired by challenges in surgical robotics: we are studying how a robot could learn to perform challenging manipulation tasks, such as knot-tying. Then I will describe our recent advances in optimization-based planning, both in state space and in belief space. Finally, I will briefly highlight our recent work on enabling robots to learn on their own through non-parametric model-based reinforcement learning.
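The core idea of learning control from expert demonstrations can be illustrated in its simplest form. The sketch below is a much-reduced behavioral-cloning-style illustration, not the apprenticeship-learning machinery described in the abstract (which involves trajectory alignment, dynamics modeling, and inverse optimal control): it ridge-regresses a linear policy u = K s + b from demonstrated (state, action) pairs. All names and the linear-policy assumption are hypothetical choices for illustration.

```python
import numpy as np


def fit_linear_policy(states, actions, reg=1e-3):
    """Learn a linear policy u = K @ s + b from expert (state, action) pairs.

    states:  (N, d_s) array of observed states
    actions: (N, d_u) array of the expert's controls in those states
    reg:     ridge regularizer to keep the solve well-conditioned
    """
    # Append a constant column so the bias b is learned jointly with K.
    S = np.column_stack([states, np.ones(len(states))])
    # Ridge-regularized least squares: W = (S^T S + reg I)^-1 S^T U
    W = np.linalg.solve(S.T @ S + reg * np.eye(S.shape[1]), S.T @ actions)
    K, b = W[:-1].T, W[-1]
    return K, b


def policy(K, b, s):
    """Apply the learned policy to a state s."""
    return K @ s + b
```

Given demonstrations generated by some expert controller, the regression recovers that controller; the interesting (and hard) part in practice is everything this sketch omits: noisy, suboptimal, misaligned demonstrations and unknown dynamics.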
Biography: Dr. Pieter Abbeel received a BS/MS in Electrical Engineering from KU Leuven (Belgium) and received his Ph.D. degree in Computer Science from Stanford University in 2008. He joined the faculty at UC Berkeley in Fall 2008, with an appointment in the Department of Electrical Engineering and Computer Sciences. He has won various awards, including best paper awards at ICML and ICRA, a Sloan Fellowship, the Air Force Office of Scientific Research Young Investigator Program (AFOSR-YIP) award, the Office of Naval Research Young Investigator Program (ONR-YIP) award, the Okawa Foundation award, the TR35, the IEEE Robotics and Automation Society (RAS) Early Career Award, and the Dick Volz Best U.S. Ph.D. Thesis in Robotics and Automation Award. He has developed apprenticeship learning algorithms which have enabled advanced helicopter aerobatics, including maneuvers such as tic-tocs, chaos, and auto-rotation, which only exceptional human pilots can perform. His group has also achieved the first reliable end-to-end pickup and folding of a crumpled laundry article. His work has been featured in many popular press outlets, including BBC, New York Times, MIT Technology Review, Discovery Channel, SmartPlanet, and Wired. His current research focuses on robotics and machine learning, with a particular emphasis on challenges in personal robotics, surgical robotics, and connectomics.