The direct neural control of external prosthetic devices such as robot hands requires the accurate decoding of neural activity representing continuous movement. This requirement becomes formidable when multiple degrees of freedom (DoFs) are to be controlled, as in the case of the fingers of a robotic hand. In this paper, a methodology is proposed for estimating grasp aperture using the spiking activity of multiple neurons recorded with an electrode array implanted in the arm/hand area of primary motor cortex (M1). Grasp aperture provides a reasonable approximation to the hand configuration during grasping tasks, while offering a large reduction in the number of DoFs that must be estimated. A family of state space models with hidden variables is used to decode the grasp aperture of each finger with respect to the thumb from a population of motor-cortical neurons. The firing rates of multiple neurons in M1 were found to be correlated with grasp aperture and were used as inputs to our decoding algorithm. The proposed decoding architecture was evaluated off-line by decoding pre-recorded neural activity from monkey motor cortex during a natural grasping task. We found that our model was able to accurately reconstruct finger grasp aperture from a small population of cells. This demonstrates the first decoding of continuous grasp aperture from M1, suggesting the feasibility of neural control of prosthetic robotic hands from neuronal population signals.
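To make the decoding pipeline concrete, the following is a minimal sketch of a linear-Gaussian state-space decoder of the general kind the abstract describes: a hidden aperture state evolves over time, firing rates are treated as noisy linear observations of it, and a Kalman filter recursively estimates the aperture from the rates. All data here are synthetic, and the specific linear observation model and least-squares parameter fits are illustrative assumptions, not the authors' exact model.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Synthetic stand-in for recorded data: a smooth aperture trace (the
#     hidden state x_t) and firing rates correlated with it (observations y_t).
T, n_cells = 500, 12
t = np.linspace(0.0, 10.0, T)
aperture = 0.5 + 0.4 * np.sin(2 * np.pi * 0.3 * t)
tuning = rng.normal(0.0, 1.0, size=n_cells)               # per-cell gain (assumed)
rates = np.outer(tuning, aperture) + 0.3 * rng.normal(size=(n_cells, T))

# --- Fit model parameters by least squares from "training" data:
#     x_t = A x_{t-1} + w,  w ~ N(0, W);   y_t = C x_t + q,  q ~ N(0, Q).
x0, x1 = aperture[:-1], aperture[1:]
A = (x1 @ x0) / (x0 @ x0)                                 # state transition (scalar)
W = np.var(x1 - A * x0)                                   # process noise variance
C = (rates @ aperture) / (aperture @ aperture)            # observation gains, shape (n_cells,)
Q = np.cov(rates - np.outer(C, aperture))                 # observation noise covariance

# --- Kalman filter: recursively estimate the aperture from firing rates.
def kalman_decode(Y, A, W, C, Q, x_init, P_init):
    x_hat, P = x_init, P_init
    out = np.empty(Y.shape[1])
    for k in range(Y.shape[1]):
        # Predict the state forward one step.
        x_pred = A * x_hat
        P_pred = A * P * A + W
        # Update with the observed rate vector for this time bin.
        S = np.outer(C, C) * P_pred + Q                   # innovation covariance
        K = P_pred * (C @ np.linalg.inv(S))               # Kalman gain, shape (n_cells,)
        x_hat = x_pred + K @ (Y[:, k] - C * x_pred)
        P = (1.0 - K @ C) * P_pred
        out[k] = x_hat
    return out

decoded = kalman_decode(rates, A, W, C, Q, aperture[0], W)
r = np.corrcoef(decoded, aperture)[0, 1]
print(f"correlation between decoded and true aperture: {r:.2f}")
```

In an actual off-line evaluation, the parameters would be fit on one portion of the recorded trials and the filter run on held-out trials; extending the scalar state to a vector of per-finger apertures follows the same recursion with matrix-valued A, W, and C.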