Work Description

Title: Dataset for "Error detection and correction in intracortical brain–machine interfaces controlling two finger groups" Open Access Deposited

Attribute Value
Methodology
  • The following methodology is also described in depth in the related published work: Two adult male rhesus macaques (Monkey W and Monkey N) were implanted with Utah arrays (Blackrock Microsystems, Salt Lake City, Utah) in the hand area of the precentral gyrus (PCG), which typically includes both M1 and PMd in monkey cortex; one monkey was additionally implanted with a Utah array in sensory cortex (not used in this study). The arrays in both monkeys were over one year old at the time of the study. The monkeys were trained to sit in a chair and use a hand manipulandum to control virtual fingers on a screen, moving the virtual fingers to target positions. The angles of the virtual fingers were determined from bend sensors embedded within the hand manipulandum. Neural data were recorded from 96 channels within the motor cortex and processed via a Cerebus neural signal processor (Blackrock Microsystems). Spiking-band power (SBP), which was demonstrated to be well correlated with single-unit activity by Nason et al [1], was used as the neural feature. The quality of the neural signals was inspected visually during daily setup, and channels with visible noise were excluded from decoding and analysis. Channels were referenced to the average activity of all the remaining channels, as detailed in Ludwig et al [2]. Monkey W performed a single degree-of-freedom finger task, detailed in Vaskov et al [3], while Monkey N performed a two-finger task, previously developed in Nason et al [4]. References: [1] Nason S R et al 2020 A low-power band of neuronal spiking activity dominated by local single units improves the performance of brain-machine interfaces Nat. Biomed. Eng. 4 973–83 [2] Ludwig K A, Miriani R M, Langhals N B, Joseph M D, Anderson D J and Kipke D R 2009 Using a common average reference to improve cortical neuron recordings from microelectrode arrays J. Neurophysiol. 101 1679–89 [3] Vaskov A K, Irwin Z T, Nason S R, Vu P P, Nu C S, Bullard A J, Hill M, North N, Patil P G and Chestek C A 2018 Cortical decoding of individual finger group motions using ReFIT Kalman filter Front. Neurosci. 12 26–36 [4] Nason S R, Mender M J, Vaskov A K, Willsey M S, Ganesh Kumar N, Kung T A, Patil P G and Chestek C A 2021 Real-time linear prediction of simultaneous and independent movements of two finger groups using an intracortical brain-machine interface Neuron 109 3164–77.e8
Description
  • This is data from Wallace, Benyamini et al., 2023, Journal of Neural Engineering. There are two sets of data included: 1. Neural features and error labels used to train error classifiers for each day used in the study 2. Trial data from an example experiment day (Monkey N, Day 6), with runs for offline calibration, online brain control, error monitoring, and error correction. The purpose of this study was to investigate the use of error signals in motor cortex to improve brain-machine interface (BMI) performance for control of two finger groups. All data is contained in .mat files, which can be opened using MATLAB or the Python SciPy library.
Creator
Creator ORCID
Depositor
  • dywallac@umich.edu
Contact information
Discipline
Funding agency
  • Other Funding Agency
  • National Science Foundation (NSF)
Other Funding agency
  • Dan and Betty Kahn Foundation
ORSP grant number
  • Dan and Betty Kahn Foundation Grant 2029755, NSF Grant 1926576
Keyword
Citations to related material
  • Wallace, D. M., Benyamini, M., Nason-Tomaszewski, S. R., Costello, J. T., Cubillos, L. H., Mender, M. J., Temmar, H., Willsey, M. S., Patil, P. G., Chestek, C. A., & Zacksenhouse, M. (2023). Error detection and correction in intracortical brain–machine interfaces controlling two finger groups. Journal of Neural Engineering, 20(4), 046037. https://doi.org/10.1088/1741-2552/acef95
Resource type
Last modified
  • 12/04/2023
Published
  • 12/04/2023
Language
DOI
  • https://doi.org/10.7302/np99-cz36
License
To Cite this Work:
Wallace, D. M., Benyamini, M., Nason-Tomaszewski, S. R., Costello, J. T., Cubillos, L. H., Mender, M. J., Temmar, H., Willsey, M. S., Patil, P. G., Chestek, C. A., & Zacksenhouse, M. (2023). Dataset for "Error detection and correction in intracortical brain–machine interfaces controlling two finger groups" [Data set], University of Michigan - Deep Blue Data. https://doi.org/10.7302/np99-cz36

Relationships

This work is not a member of any user collections.

Files (Count: 14; Size: 1020 MB)

======
Dataset for "Error detection and correction in intracortical brain–machine interfaces controlling two finger groups"
======
Prepared by Dylan M Wallace
======
Contact cchestek@umich.edu with any questions or requests for additional data
======
Last Updated Nov 30 2023
======
Description
======

This is data from Wallace, Benyamini et al., 2023, Journal of Neural Engineering. There are two sets of data included:

1. Neural features and error labels used to train error classifiers for each day used in the study

2. Trial data from an example experiment day (Monkey N, Day 6), with runs for offline calibration, online brain control, error monitoring, and error correction.

The purpose of this study was to investigate the use of error signals in motor cortex to improve brain-machine interface (BMI) performance for control of two finger groups. All data is contained in .mat files, which can be opened using MATLAB or the Python SciPy library.
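Opening the .mat files with SciPy looks like the following. This sketch round-trips a small synthetic array through the .mat format so it is self-contained; for the real data, pass one of the filenames listed below (e.g. "MonkeyN_Offline_Calibration_Trials.mat") to loadmat instead.

```python
import io

import numpy as np
from scipy.io import loadmat, savemat

# Write a small synthetic .mat file to an in-memory buffer, then read it
# back, to show the save/load round trip. The variable name below is
# illustrative only.
buf = io.BytesIO()
savemat(buf, {"NeuralFeature": np.zeros((100, 96))})
buf.seek(0)
data = loadmat(buf)

# MATLAB bookkeeping keys start with "__"; the rest are the stored variables.
names = [k for k in data if not k.startswith("__")]
```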

======
Methodology
======

Two adult male rhesus macaques (Monkey W and Monkey N) were implanted with Utah arrays (Blackrock Microsystems, Salt Lake City, Utah) in the hand area of the precentral gyrus (PCG), which typically includes both M1 and PMd in monkey cortex; one monkey was additionally implanted with a Utah array in sensory cortex (not used in this study). The arrays in both monkeys were over one year old at the time of the study. The monkeys were trained to sit in a chair and use a hand manipulandum to control virtual fingers on a screen, moving the virtual fingers to target positions. The angles of the virtual fingers were determined from bend sensors embedded within the hand manipulandum.
Neural data were recorded from 96 channels within the motor cortex and processed via a Cerebus neural signal processor (Blackrock Microsystems). Spiking-band power (SBP), which was demonstrated to be well correlated with single-unit activity by Nason et al [1], was used as the neural feature. The quality of the neural signals was inspected visually during daily setup, and channels with visible noise were excluded from decoding and analysis. Channels were referenced to the average activity of all the remaining channels, as detailed in Ludwig et al [2]. Monkey W performed a single degree-of-freedom finger task, detailed in Vaskov et al [3], while Monkey N performed a two-finger task, previously developed in Nason et al [4].

References:
[1] Nason S R et al 2020 A low-power band of neuronal spiking activity dominated by local single units improves the performance of brain-machine interfaces Nat. Biomed. Eng. 4 973–83
[2] Ludwig K A, Miriani R M, Langhals N B, Joseph M D, Anderson D J and Kipke D R 2009 Using a common average reference to improve cortical neuron recordings from microelectrode arrays J. Neurophysiol. 101 1679–89
[3] Vaskov A K, Irwin Z T, Nason S R, Vu P P, Nu C S, Bullard A J, Hill M, North N, Patil P G and Chestek C A 2018 Cortical decoding of individual finger group motions using ReFIT Kalman filter Front. Neurosci. 12 26–36
[4] Nason S R, Mender M J, Vaskov A K, Willsey M S, Ganesh Kumar N, Kung T A, Patil P G and Chestek C A 2021 Real-time linear prediction of simultaneous and independent movements of two finger groups using an intracortical brain-machine interface Neuron 109 3164–77.e8

======
Files Included
======

1. Classifier Data:
a. MonkeyN_Error_Classifier_Data_Day_1.mat
b. MonkeyN_Error_Classifier_Data_Day_2.mat
c. MonkeyN_Error_Classifier_Data_Day_3.mat
d. MonkeyN_Error_Classifier_Data_Day_4.mat
e. MonkeyN_Error_Classifier_Data_Day_5.mat
f. MonkeyN_Error_Classifier_Data_Day_6.mat
g. MonkeyW_Error_Classifier_Data_Day_1.mat
h. MonkeyW_Error_Classifier_Data_Day_2.mat

2. Trial Data:
a. MonkeyN_Offline_Calibration_Trials.mat
b. MonkeyN_Online_Brain_Control_Trials.mat
c. MonkeyN_Online_Error_Monitoring_Trials.mat
d. MonkeyN_Online_Error_Correction_Trials.mat

======
Data Description
======

1. Classifier Data: Each of these .mat files contains the data used to train and test (offline) the error classifiers used in this study, for one experiment day. Each file contains ‘TestData’ and ‘TrainingData’ structs. Within these structs are data for each finger group (Index & MRS for Monkey N, Index for Monkey W). Two matrices per finger (flex & extend) hold the neural data (T bins for each masked neural channel used for decoding), with rows representing bins and columns representing features (channels*T). Two vectors per finger (flex & extend) hold the ground-truth labels used to train and test the error classifiers: 0 represents no prediction (inconsistent direction), 1 represents predicted error, and -1 represents no predicted error.
2. Trial Data: Each of these .mat files was created by a MATLAB script to generate an indexable data format that includes the important data and information gathered during a real-time BMI experiment. The file is generated from the behavior, parameter, and neural data files logged for every trial during the experiment. Each row of the structure represents one trial of the experiment run, and the columns include parameters and data collected during that trial. Relevant fields in the structure are described below:
a. ‘MoveMask’: A row vector describing which fingers are being used for control:
i. Index 1 = Thumb
ii. Index 2 = Index
iii. Index 3 = Middle
iv. Index 4 = Ring (Also Middle-Ring-Small (MRS) group)
v. Index 5 = Small
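The index-to-finger mapping above can be captured in a small helper; the list and function names here are illustrative, not part of the dataset.

```python
# Finger-group names for MoveMask indices 1-5 (0-indexed here in Python).
FINGER_GROUPS = ["Thumb", "Index", "Middle", "Ring (MRS)", "Small"]

def active_fingers(move_mask):
    """Return the names of the finger groups flagged in a MoveMask row vector."""
    return [name for name, on in zip(FINGER_GROUPS, move_mask) if on]
```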

b. ‘TargetHoldTime’ (ms): The length of time that the finger(s) from the MoveMask must be held in the target(s) to complete the trial. It is typically not changed during a run; 750 ms was used for “offline” training runs and 500 ms for “online” brain-control runs.
c. ‘TargetScaling’ (%): The relative scaling of the finger target(s). All runs used in this study use a scaling of 100%.
d. ‘TargetPos’: A row vector describing the positions of the target(s) used for each trial. A target of -1 indicates no target present for that finger (if the finger is not specified in ‘MoveMask’). Positions range from 0.0 to 1.0 and represent the fractional percentage of flexion, where 1.0 is 100% flexed. The position represents the center of the target, and at 100% scaling the target will take up 15% of the flexion range, centered at the given position.
e. ‘TrialTimeoutms’ (ms): The total amount of time the monkey has to complete a trial before the trial is considered a failure.
f. ‘ClosedLoop’ (0/1): A boolean flag representing whether the trial is controlled in closed loop (i.e., brain control in this study). For online runs/trials this is 1; for offline runs/trials it is 0.
g. ‘TrialSuccess’ (0/1): A boolean flag representing whether a trial was completed within the given trial timeout.
h. ‘FingerAnglesTIMRL’: A matrix representing the real-time positions of each physical finger group, sampled every 1 ms. Positions range from 0.0 to 1.0 and represent the fractional percentage of flexion, where 1.0 is 100% flexed. Rows represent samples and columns are each finger group as described in ‘MoveMask’. These values are only used for offline calibration runs, as the ‘Decode’ field controls the virtual hand during online brain control runs. During online runs, they still represent the position of the physical hand, but do not necessarily represent the position of the virtual hand.
i. ‘ExperimentTime’ (ms): A vector representing the time in ms since the run began; for each trial, it gives the time of each sample in that trial.
j. ‘NeuralFeature’: A matrix representing the spiking band power for all 96 channels accumulated each ms corresponding to ‘ExperimentTime’ (rows = ms, columns = channels), acquired via the following procedure:
i. Filtered to 300–1,000 Hz using the Digital Filter Editor in the Central Software Suite (Blackrock Microsystems, LLC).
ii. Sampled at 2 kilo-samples per second.
iii. Transmitted from the Cerebus system to the xPC Target system (see methods in the main paper) at 2 kilo-samples per second.
iv. All samples received by the xPC within 1 ms are absolute-valued then summed within each channel, contained in the ‘NeuralFeature’ field.
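The four-step procedure above can be sketched offline for a single channel. SciPy's Butterworth filter stands in for the Central Software Suite's digital filter, and the raw acquisition rate (30 kS/s) is an assumption for illustration; only the 300–1,000 Hz band, the 2 kS/s rate, and the rectify-and-sum step come from this description.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def sbp_per_ms(raw, fs_raw=30000, fs_sbp=2000):
    """Sketch of the spiking-band power feature for one channel.

    Steps, per the description above: band-pass 300-1,000 Hz, sample at
    2 kS/s, then absolute-value and sum all samples within each 1 ms.
    fs_raw is an assumed acquisition rate, not stated in this deposit.
    """
    # Step i: band-pass filter to the spiking band (offline stand-in).
    sos = butter(2, [300, 1000], btype="bandpass", fs=fs_raw, output="sos")
    filtered = sosfiltfilt(sos, raw)
    # Steps ii-iii: reduce to 2 kS/s (simple decimation as a stand-in
    # for the Cerebus-to-xPC resampling).
    ds = filtered[:: fs_raw // fs_sbp]
    # Step iv: rectify and sum the samples falling in each 1 ms window
    # (2 samples per ms at 2 kS/s).
    per_ms = fs_sbp // 1000
    n_ms = len(ds) // per_ms
    return np.abs(ds[: n_ms * per_ms]).reshape(n_ms, per_ms).sum(axis=1)
```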

k. ‘SampleWidth’: A vector representing the number of ‘NeuralFeature’ samples received by the xPC during each 1 ms, corresponding to ‘ExperimentTime’.
l. ‘Decode’: A matrix representing the decoder output for each 1 ms. Decoder output is only updated every bin (50 ms for all data in this study), so each value repeats for 50 samples before updating. This field is not used during the offline calibration runs, but is used during all online runs.
i. Columns 1-5: Represent the position predictions for each finger group using the threshold-crossing firing rate (TCFR) neural feature, which is not used in this study.
ii. Columns 6-10: Represent the position predictions for each finger group using the spiking band power neural feature, which is recorded in ‘NeuralFeature’. Positions range from 0.0 to 1.0 and represent the fractional percentage of flexion, where 1.0 is 100% flexed. Finger groups not set in ‘MoveMask’ are set to -1.
iii. Columns 11-15: Represent the velocity predictions for each finger group using the spiking band power neural feature. Velocity is in fractional flexion (percentage flexion/100) per bin. Finger groups not set in ‘MoveMask’ are set to 0.
iv. Columns 16-20: Represent the thresholded error classifier predictions for each finger group using the spiking band power neural feature. 0 represents no prediction (inconsistent direction), 1 represents predicted error, and -1 represents no predicted error. Finger groups not set in ‘MoveMask’ are set to 0.
v. Columns 21-25: Represent the unthresholded error classifier predictions for each finger group using the spiking band power neural feature. Finger groups not set in ‘MoveMask’ are set to 0.
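The column layout above can be split into named blocks, keeping one row per 50 ms bin since the decoder output repeats between updates. The function and key names are illustrative; note the 1-indexed column ranges above become 0-indexed slices in Python.

```python
import numpy as np

def decode_fields(decode, bin_ms=50):
    """Split a per-ms (n_ms, 25) 'Decode' matrix into its five column
    blocks, keeping one row per decoder bin (default 50 ms)."""
    binned = decode[::bin_ms]            # decoder only updates once per bin
    return {
        "tcfr_pos": binned[:, 0:5],      # columns 1-5 in the description
        "sbp_pos": binned[:, 5:10],      # columns 6-10
        "sbp_vel": binned[:, 10:15],     # columns 11-15
        "err_thresh": binned[:, 15:20],  # columns 16-20
        "err_raw": binned[:, 20:25],     # columns 21-25
    }
```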

m. ‘TrialNumber’: A vector representing the number of each trial during the experiment run.
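MATLAB structs such as ‘TestData’ and ‘TrainingData’ come back from SciPy's loadmat as 1x1 structured arrays that need unwrapping. This sketch builds a synthetic stand-in with hypothetical field names (the real field names may differ; inspect dtype.names on the actual files) and shows the access pattern.

```python
import io

import numpy as np
from scipy.io import loadmat, savemat

# Synthetic stand-in for a classifier file; nested dicts become MATLAB
# structs on save. The inner field names here are hypothetical.
fake = {"TrainingData": {"IndexFlexNeural": np.zeros((10, 15)),
                         "IndexFlexLabels": np.ones((10, 1))}}
buf = io.BytesIO()
savemat(buf, fake)
buf.seek(0)
data = loadmat(buf)

train = data["TrainingData"]             # 1x1 structured array
fields = train.dtype.names               # the struct's field names
neural = train["IndexFlexNeural"][0, 0]  # unwrap to the plain matrix
```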
======
