fnirs_decodingspatialattention

Codes for Decoding Attended Spatial Location during Complex Scene Analysis with fNIRS.

https://github.com/nsnc-lab/fnirs_decodingspatialattention

Science Score: 49.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 2 DOI reference(s) in README
  • Academic publication links
    Links to: biorxiv.org
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (10.3%) to scientific vocabulary
Last synced: 6 months ago

Repository

Codes for Decoding Attended Spatial Location during Complex Scene Analysis with fNIRS.

Basic Info
  • Host: GitHub
  • Owner: NSNC-Lab
  • License: gpl-3.0
  • Language: MATLAB
  • Default Branch: main
  • Size: 101 MB
Statistics
  • Stars: 7
  • Watchers: 3
  • Forks: 1
  • Open Issues: 1
  • Releases: 0
Created about 3 years ago · Last pushed almost 2 years ago
Metadata Files
Readme License Citation

README.md

Codes for Decoding Attended Spatial Location during Complex Scene Analysis with fNIRS


For "Decoding Attended Spatial Location during Complex Scene Analysis with fNIRS."

Authors: Matthew Ning, Meryem Yücel, Alexander von Lühmann, David A. Boas, Kamal Sen.

Affiliation: Neurophotonics Center, Department of Biomedical Engineering, Boston University.

Preprint: https://www.biorxiv.org/content/10.1101/2022.09.06.506821v2

Citation:

Ning, M., Yücel, M. A., von Lühmann, A., Boas, D. A. & Sen, K. Decoding Attended Spatial Location during Complex Scene Analysis with fNIRS. bioRxiv (2022) doi:10.1101/2022.09.06.506821.

Publication: Coming soon.

Dataset: Google Drive

Abstract: When analyzing complex scenes, humans often focus their attention on an object at a particular spatial location. The ability to decode the attended spatial location would facilitate brain computer interfaces for complex scene analysis. Here, we investigated the capability of functional near-infrared spectroscopy (fNIRS) to decode audio-visual spatial attention in the presence of competing stimuli from multiple locations. We targeted the dorsal frontoparietal network, including the frontal eye field (FEF) and intra-parietal sulcus (IPS), as well as the superior temporal gyrus/planum temporale (STG/PT), all of which were shown in previous fMRI studies to be activated by auditory, visual, or audio-visual spatial tasks. We found that fNIRS provides robust decoding of attended spatial locations for most participants, and that decoding performance correlates with behavioral performance. Moreover, we found that FEF makes a large contribution to decoding performance. Surprisingly, performance was significantly above chance level at ~1 s, well before the peak of the fNIRS response. Our results demonstrate that fNIRS is a promising platform for a compact, wearable technology that could be applied to decode attended spatial location and reveal contributions of specific brain regions during complex scene analysis.

Running Instructions:

  • To run everything in one click, run the runAllPreprocessing.m script within the SbjLvlProcessing folder.
  • WARNING: this will take 1-3 days depending on your computer. A parallel computing version is described under Parallel Computing Instructions below.

  • To run the preprocessing pipeline for Figures 5 and 6, in the SbjLvlProcessing folder, run the runPreprocessingMain2Class.m script for the 2-class classification and, similarly, the runPreprocessingMain3Class.m script for the 3-class classification.

    • Estimated running time: 8-20 hours.
  • To run the preprocessing pipeline for Figures 3, 4, 7 & 8, in the SbjLvlProcessing folder, run the runPreprocessing_Others.m script.

    • Estimated running time: 1-4 hours.
  • To generate Figures 3-8, in the SbjLvlProcessing folder, run the genFigures.m script.

Parallel Computing Instructions (for CPU)

List of directories for the binaural lab:

  • /usr3/graduate/(your username) (smaller storage quota, for small personal projects).
  • /projectnb/binaural/(your username) (larger storage quota, shared across the lab, for bigger projects; this is where the parallel computing jobs were run).

Before running the batch file, convert the line endings with:

dos2unix *.sh *.m

Then check the file type with:

file runBatchfNIRSJob.m

The output should report the file as ASCII text.
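What dos2unix does here is strip the carriage-return bytes that Windows (CRLF) line endings add, which would otherwise confuse the shell and the scheduler. If dos2unix is unavailable, tr achieves the same thing; a minimal sketch using a throwaway demo file:

```shell
# Create a demo file with Windows (CRLF) line endings.
printf 'echo hello\r\n' > demo_crlf.sh
# Strip the carriage returns, as dos2unix would.
tr -d '\r' < demo_crlf.sh > demo_unix.sh
# demo_unix.sh now contains plain LF line endings, which `file` reports as ASCII text.
```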

Finally, submit the job with:

qsub -pe omp 8 -l h_rt=28:00:00 ./runfNIRSBash.sh

You can vary the parameter values (the number of cores in -pe omp 8 and the wall-clock limit in -l h_rt=28:00:00) if you know what you're doing.
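The contents of runfNIRSBash.sh are not shown here. As a hedged sketch only: an SGE job script for this kind of run typically embeds its options as #$ directives and then launches MATLAB headlessly. The directive values below mirror the qsub command above, but the script body is an assumption, not taken from the repository:

```shell
#!/bin/bash
# Hypothetical SGE job script sketch (not the repository's actual runfNIRSBash.sh).
#$ -pe omp 8              # request 8 slots in the "omp" parallel environment
#$ -l h_rt=28:00:00       # 28-hour wall-clock limit, matching the qsub example

# Load MATLAB and run the batch entry point without a display.
module load matlab
matlab -nodisplay -r "runBatchfNIRSJob; exit"
```

Options given on the qsub command line override the #$ directives, so either style works with the submission command above.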

Helpful tips:

To view current status of job:

qstat -u (your username)

To view outputs:

view runfNIRSBash.sh.o(jobnumber)
view runfNIRSBash.sh.po(jobnumber)

To view error messages:

view runfNIRSBash.sh.e(jobnumber)
view runfNIRSBash.sh.pe(jobnumber)
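For reference, SGE builds these file names by appending .o/.e (and .po/.pe for the parallel environment) plus the job number to the script name. A tiny sketch, with a made-up job number for illustration:

```shell
# Sketch of SGE's output-file naming; 123456 is a made-up job number.
JOB_SCRIPT=runfNIRSBash.sh
JOB_ID=123456
echo "${JOB_SCRIPT}.o${JOB_ID}"   # standard output file
echo "${JOB_SCRIPT}.e${JOB_ID}"   # standard error file
```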

vim tips/commands:

  • go to the top of the file: ESC + gg
  • go to the bottom of the file: SHIFT + g
  • go to a line: in view mode, enter the line number
  • to save and quit: ":wq"
  • to quit without saving: ":q" or ":q!"
  • to enter edit mode: type "i"

To run MATLAB without desktop:

module load matlab
matlab -nodisplay

To view/manage storage quota:

scc-ondemand.bu.edu

To debug MATLAB without desktop but from command line: https://www.mathworks.com/help/releases/R2019b/matlab/debugging-code.html

Advanced options for Sun Grid Engine: to view a list of parallel computing environments and their configurations, type qmon; when the main GUI window pops up, click Parallel Environment Configuration.

For a description of the different parallel environments on the BU SCC, refer to [4].

To run on multiple cores, use the -pe option in the qsub command. To submit an array of jobs, use the -t option in the qsub command.

For the full list of SGE options, refer to [3].
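As a sketch of the -t option: a submission like qsub -t 1-14 ./runfNIRSBash.sh launches one task per index, and SGE exports a distinct $SGE_TASK_ID to each task, which the job script can map to a subject. The subject-naming scheme below is a hypothetical illustration, not the repository's actual convention:

```shell
# Inside an array-job script: SGE exports SGE_TASK_ID per task.
# Default to 1 so the snippet also runs outside of SGE.
SGE_TASK_ID=${SGE_TASK_ID:-1}
SUBJECT=$(printf 'sbj%02d' "$SGE_TASK_ID")   # e.g. task 3 maps to sbj03 (hypothetical naming)
echo "Processing ${SUBJECT}"
```

This lets one script process all subjects in parallel instead of looping over them serially.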

References & Additional Readings:

[1] https://docs.oracle.com/cd/E19957-01/820-0698/6ncdvjcmd/index.html
[2] https://www.bu.edu/tech/support/research/software-and-programming/common-languages/matlab/matlab-batch/
[3] https://gridscheduler.sourceforge.net/htmlman/htmlman1/qsub.html
[4] https://www.bu.edu/tech/support/research/system-usage/running-jobs/parallel-batch/#pe

Owner

  • Name: NSNC-Lab
  • Login: NSNC-Lab
  • Kind: organization

GitHub Events

Total
  • Watch event: 5
Last Year
  • Watch event: 5