The quest for highly functional neuroprosthetics for activities of daily living has implicitly assumed that the neural interface would include both motor and sensory (i.e. tactile and proprioceptive) functionalities. For reaching and grasping tasks in particular, dynamic sensorimotor programs will likely need to be developed to enable dexterous control. Interestingly, the neural decoding, stimulation, and hardware principles for sensorimotor interfaces are often developed in isolation, in motor-only or sensory-only studies. In this week’s issue of the journal Nature, Prof. Nicolelis’s group at Duke University published a new study attempting to create a bidirectional sensorimotor neural interface for reaching tasks. Primates used both direct brain motor control and artificial tactile sensory feedback delivered back to the brain to complete the task. Both the motor and sensory channels bypassed the subject’s body, effectively liberating the brain from the physical constraints of slow nerve impulse propagation through the nervous system. Potential uses of such bidirectional control are not limited to artificial limbs and could include fast communication with a variety of external sensors and actuators.
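The bidirectional loop described above — decoding motor intent from neural activity while encoding artificial tactile feedback as stimulation — can be sketched in miniature. This is an illustrative toy, not the study's actual method: the linear decoder, the weight matrix, and the texture-to-pulse-rate mapping are all simplifying assumptions introduced here for clarity.

```python
def decode_velocity(rates, weights):
    # Hypothetical linear decoder: map recorded firing rates (spikes/s)
    # to a 2-D cursor velocity. Real decoders are fit to training data.
    return [sum(w * r for w, r in zip(row, rates)) for row in weights]

def encode_feedback(texture):
    # Hypothetical sensory encoder: map a virtual texture value in [0, 1]
    # to a microstimulation pulse rate (Hz). The scaling is arbitrary.
    return max(0.0, 50.0 + 100.0 * texture)

def closed_loop_step(rates, weights, pos, texture_at, dt=0.05):
    # One iteration of the brain-machine-brain loop:
    # 1. decode motor intent, 2. move the virtual actuator,
    # 3. sense the virtual world, 4. encode stimulation feedback.
    vx, vy = decode_velocity(rates, weights)
    new_pos = (pos[0] + vx * dt, pos[1] + vy * dt)
    pulse_rate = encode_feedback(texture_at(new_pos))
    return new_pos, pulse_rate
```

For example, with identity weights, firing rates `[2.0, -1.0]` move the cursor right and slightly down, and a texture value of 0.5 at the new position yields a 100 Hz feedback pulse train. The point of the sketch is the loop structure: each cycle closes through the machine rather than the body, which is why the latency of peripheral nerve conduction drops out entirely.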