
HAND AND ARM CONTROL
The hand is the most versatile manipulative organ in the known universe. While manual behaviors can reach the complexity of virtuoso
pianism or the assembly of a clockwork mechanism, even everyday activities, such as
grasping irregularly shaped objects or opening bottles, pose significant challenges for the most advanced robots. The remarkable dexterity of the hand is enabled by its intricate anatomy and sophisticated control systems.
Here we investigate how the nervous system orchestrates manual behaviors, with a particular focus on unconstrained object grasping and force application. Using a variety of electrophysiological techniques, we examine how different brain regions contribute to the control of reaching, grasping, transporting objects, and applying force with individual digits. By linking neural activity to kinematic and kinetic descriptors of behavior, we aim to build a new understanding of the motor system that controls the hand and to drive the advancement of neural prosthetics.

NEUROPROSTHETICS
Neuroprosthetics hold the promise of restoring lost function and quality of life to people who have suffered life-changing
injuries. Our work focuses on prosthetic hands and arms that interface directly with the brain. This interface allows people with paralyzed hands and arms to control the prosthetic by thinking about moving their own arm. When the prosthetic hand touches an object, we can stimulate the brain so that the user feels a sense of touch as though it came from their own hand. Taken together, these capabilities allow someone who has lost the use of their hand to move and feel with a robotic hand as though it were their own. We expect this work to lead to
devices that restore independence and the ability to quickly and accurately complete the activities of daily life that are often lost following spinal cord injury, brain stem stroke, or high-level amputation.

TACTILE CODING
The sense of touch enables dexterous interactions with objects and provides rich information about them: size, shape, and texture. The physical properties of each object and the manner in which we interact with it produce skin vibrations with distinct frequency compositions, which activate the skin's mechanoreceptors. Furthermore, the skin is a unique sensory organ in that these object interactions cause it to deform in three-dimensional space, driving an integration of tactile and proprioceptive signals that enables object recognition, also known as stereognosis. One goal of our group is to understand how these precise and complex signals are transformed and integrated along the neuraxis; we record from peripheral afferents, the cuneate nucleus, the thalamus, and somatosensory cortex to understand how touch is perceived. These recordings allow us to build models of each region that can inform intracortical microstimulation (ICMS) feedback to help restore touch.