The history of robotic grasping and manipulation spans twenty-five years and millions of dollars in research funding, yet we have no successful devices for grasping in unstructured environments. The new generation of highly successful mobile and humanoid robots still lacks basic hands that can reliably grasp arbitrary objects. This proposal aims to close this glaring gap in robotic capabilities, and is founded on four principles:

Start with robust grasping as the goal, not full dexterous manipulation. Current robotic hands cannot even acquire objects in unstructured environments, let alone manipulate them: robot hand research has tried to run before learning how to walk. It has proved too difficult to jump to dexterous manipulation from our present level of understanding. We aim to define the basis for lower-complexity hands that can grasp a wide variety of objects despite the noise and uncertainty that prevail in unstructured environments. This will provide the physical understanding and experimental foundation for future work in dexterous manipulation, as well as design principles for functional grasping devices.

Learn from the human hand, not replicate it. Many robot hands in the literature have complex kinematics and actuation schemes that attempt to replicate the functionality of the human musculoskeletal system. In contrast, we propose looking beyond the kinematic structure to some of the key principles that enable human grasping, in particular low-dimensional control. We will extend and apply this principle to define hand mechanisms that capture much of human grasping functionality with few degrees of freedom (DOFs).

Simplicity is essential. Reducing complexity brings major benefits. By determining the minimal hand configuration (number and structure of joints, actuators, sensors, etc.), we can limit the cost of implementation. This not only enables the use of robot hands in cost-sensitive applications (e.g. household assistants, elder or disabled care), it also greatly speeds research: low-complexity hands can be fabricated easily, so designs can be iterated quickly as experiments lend new insight into functionality.

Put functionality in passive mechanics, not elaborate sensing and control. Task execution with robot hands is usually based on complex actuation and control algorithms for every joint, using detailed contact sensing. This approach has not succeeded to date. In contrast, our preliminary results suggest that carefully tuned kinematic configurations and joint compliance allow hands to passively adapt to objects with minimal forces.

Using these ideas, we propose to build a low-cost, low degree-of-freedom grasping device (a hand) that is based on hard human grasping data. Further, using new tools we have developed, we can test our designs in simulation and build hardware that is functionally proven for a given set of robotic grasping tasks. The simulator also lets us iterate designs quickly, and new fabrication methods allow us to build different grasping devices inexpensively and quickly. This research project also establishes a set of hardware methods, modeling and simulation tools, and insights into human grasping that will drive the future research agenda in robotic and prosthetic hands.

The collaborative team contains experts in mechanical design and biomechanics (Howe), the neuroscience of human grasping (Santello), and robotic grasping (Allen). Progress in robotic hands can only occur with this kind of collaboration on a very difficult problem.

The intellectual merit of this proposal includes: development of a new class of low-dimensional robotic hands using inexpensive and robust fabrication and sensor technologies; human experiments to gain insights from human grasping that can influence the design and control of robotic and prosthetic hands; and study of human adaptive compliance during grasping, and its applicability
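The "low-dimensional control" principle above refers to the empirical finding that a few coordinated joint patterns (often called postural synergies) account for most of the variance across human hand postures. As an illustrative sketch only — the data below are synthetic and every variable name is invented for this example, not taken from the project — principal component analysis (PCA) can recover such low-dimensional structure from a set of recorded joint angles:

```python
import numpy as np

# Hypothetical illustration: recover "postural synergies" from hand-posture
# data via PCA. Real studies record joint angles while subjects grasp many
# objects; here we synthesize postures that lie near a 2-D subspace plus
# noise, mimicking the finding that a few synergies dominate.
rng = np.random.default_rng(0)
n_postures, n_joints = 200, 15          # e.g. 15 finger-joint angles per posture

synergies_true = rng.standard_normal((2, n_joints))   # two underlying patterns
weights = rng.standard_normal((n_postures, 2))        # per-posture activations
postures = weights @ synergies_true \
    + 0.05 * rng.standard_normal((n_postures, n_joints))  # sensor noise

# PCA: center the data, then take the singular value decomposition.
centered = postures - postures.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
explained = (s ** 2) / np.sum(s ** 2)   # fraction of variance per component

# If two components capture nearly all the variance, a 2-D controller
# could in principle command this 15-joint hand.
print(f"variance explained by first 2 components: {explained[:2].sum():.3f}")
```

The rows of `vt[:2]` are the recovered synergy directions; driving the hand by blending them is one way a low-DOF actuation scheme can approximate high-DOF human grasp shapes.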
Effective start/end date: 7/1/09 → 6/30/13
- National Science Foundation (NSF): $236,003.00