Project Summary/Abstract

The goal of this study is to investigate the feasibility of incorporating force-modulating neural signals into intracortical brain-computer interfaces (BCIs). BCIs have emerged as a promising assistive technology for restoring hand grasping and object interaction in individuals with tetraplegia. To date, most BCIs have focused on decoding position- and velocity-related information, or kinematics, from the motor cortex to control the motion of external effectors. Kinematics have provided sufficient information to achieve BCI-controlled reaching and postural tasks. However, unlike reaching, natural hand grasping involves a combination of kinematic and kinetic (force-related) information. Incorporating force-related information into BCIs could enhance their functionality; however, how force is represented in the brain has yet to be fully elucidated in individuals with chronic tetraplegia, who are unable to produce forces. Furthermore, while force-based intracortical BCI systems have been implemented in non-human primates, the proposed system would be among the first developed for human use. The proposed research therefore seeks to achieve two aims: 1) to determine how neural activity modulates with attempted hand grasp force production in persons with tetraplegia, and 2) to decode force-related information from the motor cortex for real-time control of force using an intracortical BCI. These investigations will occur in human participants already enrolled in the BrainGate2 Pilot Clinical Trial, a multi-site study that assesses the safety and feasibility of chronic, intracortical BCI operation in persons with tetraplegia. In the first study aim, we will collect neural data while participants attempt to produce four discrete force levels. We will extract time-varying features from the neural data.
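As an illustrative sketch only (not part of the proposal), time-varying neural features of the kind described above are often computed by binning spike times into firing rates and smoothing with a Gaussian kernel; the bin width, kernel width, and example spike times below are assumptions, not values from this study:

```python
# Hypothetical sketch: extracting time-varying features (binned, smoothed
# firing rates) from spike times. All parameter values are illustrative.
import numpy as np

def binned_rates(spike_times, t_start, t_stop, bin_width=0.02):
    """Count spikes in fixed-width bins and convert counts to rates (Hz)."""
    edges = np.arange(t_start, t_stop + bin_width, bin_width)
    counts, _ = np.histogram(spike_times, bins=edges)
    return counts / bin_width

def gaussian_smooth(rates, bin_width=0.02, sigma=0.05):
    """Smooth a rate trace with a normalized Gaussian kernel (sigma in s)."""
    half = int(np.ceil(3 * sigma / bin_width))
    t = np.arange(-half, half + 1) * bin_width
    kernel = np.exp(-0.5 * (t / sigma) ** 2)
    kernel /= kernel.sum()
    return np.convolve(rates, kernel, mode="same")

# Example: one 1-second trial with six spikes at known times.
spikes = np.array([0.10, 0.12, 0.40, 0.41, 0.42, 0.80])
rates = binned_rates(spikes, 0.0, 1.0)       # 50 bins of 20 ms
smoothed = gaussian_smooth(rates)            # same length as rates
```

The smoothed trace is one feature channel; a multielectrode array would yield one such trace per recorded unit.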
We will apply a linear and a state-based tuning model to these features to elucidate how force information is encoded in the motor cortex. To address the second study aim, we will use force-tuned neural features to train a neural force decoder (NFD), which will translate kinetic representations in the motor cortex into output force commands for an external effector. Participants will acquire various force targets using both the NFD and a standard kinematic decoder. They will then perform a virtual grasp-and-reach task using 1) pure kinematic control, in which extracted movement-related parameters will control both reaching and grasping; and 2) combined control, in which the kinematic decoder output will control reaching movements while the NFD output will control grasping force. We will quantify performance with metrics such as the number of targets successfully acquired and the time to trial completion. NFD performance greater than chance would suggest that kinetic information can be successfully incorporated into human-operated BCIs. Additionally, if performance during combined control exceeds performance during pure kinematic control, then incorporating force-related information into BCIs could move the technology toward restoring grasping, object interaction, and other modes of functional hand control in persons with tetraplegia.
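A linear tuning model of the kind mentioned above can be sketched as a per-channel least-squares regression of firing rate on attempted force level. The sketch below uses simulated data; the channel count, four force levels, and noise model are illustrative assumptions, not details of the proposed study:

```python
# Hypothetical sketch: fitting a linear force-tuning model
# (rate = b0 + b1 * force) to simulated neural features.
import numpy as np

rng = np.random.default_rng(0)

n_trials, n_channels = 200, 96                      # e.g., a 96-channel array
forces = rng.choice([1, 2, 3, 4], size=n_trials)    # four discrete force levels

# Simulate firing rates with a linear dependence on force plus noise.
true_slope = rng.normal(0.0, 1.0, n_channels)
baseline = rng.uniform(5.0, 20.0, n_channels)
rates = baseline + np.outer(forces, true_slope) \
        + rng.normal(0.0, 1.0, (n_trials, n_channels))

# Per-channel least-squares fit: columns of `coefs` are (intercept, slope).
X = np.column_stack([np.ones(n_trials), forces])
coefs, *_ = np.linalg.lstsq(X, rates, rcond=None)   # shape (2, n_channels)
slopes = coefs[1]

# Channels whose rates correlate strongly with force would be called
# force-tuned and passed on as decoder inputs.
r = np.array([np.corrcoef(forces, rates[:, c])[0, 1]
              for c in range(n_channels)])
print("channels with |r| > 0.5:", int(np.sum(np.abs(r) > 0.5)))
```

A state-based tuning model would instead treat each force level as a discrete state; that variant is not sketched here.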