We propose a novel data-driven animation method for synthesizing natural-looking human grasps. Motion data captured from human grasp actions is used to train a probabilistic model of the human grasp space. This model reduces the many degrees of freedom of the human hand to a few dimensions in a continuous grasp space. The low dimensionality of the grasp space in turn allows for efficient optimization when synthesizing grasps for arbitrary objects. The method requires only a short training phase and no preprocessing of the graphical objects for which grasps are to be synthesized.
Keywords

- Gaussian mixture models
- Grasp synthesis
- Principal component analysis
ASJC Scopus subject areas
- Computer Graphics and Computer-Aided Design