https://github.com/lokinou/p3k_offline_analysis
P300 speller offline analysis recorded with Openvibe and BCI2000. Using python-MNE
Science Score: 10.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ○ codemeta.json file
- ○ .zenodo.json file
- ○ DOI references
- ○ Academic publication links
- ✓ Committers with academic emails: 2 of 4 committers (50.0%) from academic institutions
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (12.2%) to scientific vocabulary
Keywords
Repository
P300 speller offline analysis recorded with Openvibe and BCI2000. Using python-MNE
Basic Info
Statistics
- Stars: 1
- Watchers: 1
- Forks: 1
- Open Issues: 1
- Releases: 0
Topics
Metadata Files
README.md
p3k - yet another offline ERP analysis tool
Offline analysis of P300 speller recordings from BCI2000 and OpenVibe, based on mne-python.
- Supported BCI software
  - BCI2000 (BCI2kReader by @markusadamek)
  - OpenVibe
- Preprocessing features
  - REST infinity-reference re-referencing
  - Artifact subspace reconstruction (ASR; meegkit by @nbara)
  - Current source density (CSD)
  - Channel- and trial-based artifact rejection
- ERP visualization
  - Target vs. non-target plots and topographic maps
  - Signed r-square heatmaps (wyrm by @bbci)
- Classification
  - Cross-fold shrinkage LDA (see the sketch below this list)
- Sample data
  - OpenVibe P300 data provided in ./data_sample
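The classifier listed above is a shrinkage-regularized LDA evaluated across folds. As a rough, self-contained illustration of that technique (not p3k's actual code), here is a scikit-learn sketch on synthetic epoch features; the data shapes, fold count, and ROC-AUC metric are assumptions made only for this example:
```
# Illustrative cross-validated shrinkage LDA on ERP-like features.
# NOT p3k's internal pipeline; synthetic data, arbitrary shapes.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)
n_epochs, n_channels, n_times = 200, 10, 20                 # hypothetical epoch dimensions
X = rng.standard_normal((n_epochs, n_channels * n_times))   # flattened epochs
y = rng.integers(0, 2, size=n_epochs)                       # 1 = target, 0 = non-target

# Shrinkage LDA: 'lsqr' solver with Ledoit-Wolf ('auto') shrinkage
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")

# Cross-fold evaluation (p3k exposes the fold count via ParamLDA, see below)
cv = StratifiedKFold(n_splits=8, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv, scoring="roc_auc")
print(f"AUC per fold: {scores.round(2)}, mean {scores.mean():.2f}")
```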
Example output figures (images not shown here): signed r-square maps and offline classification results.
Requirements
- OpenViBE (for converting .ov recordings to .gdf)
- Python 3.7+
- If you want to use Artifact Subspace Reconstruction (ASR), you must manually install:
pip install "git+https://github.com/nbara/python-meegkit"
pip install statsmodels pyriemann
Install via pip
pip install p3k
or install from git
git clone https://github.com/lokinou/p3k_offline_analysis.git
cd p3k_offline_analysis
- Create an anaconda environment
conda env create -f environment.yml python=3.8.1
- Activate the environment
conda activate p3k
- If you want to use Artifact Subspace Reconstruction, install its dependencies:
pip install "git+https://github.com/nbara/python-meegkit"
pip install statsmodels pyriemann
- Install the p3k package
pip install .
- Finally, check that p3k works; this should trigger no error:
python -c "import p3k"
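If you also want to confirm which MNE version the environment picked up (mne is a core dependency), a slightly more verbose check is possible; this is just a convenience snippet, not part of p3k:
```
# Quick sanity check of the installed environment.
import p3k   # should import without error
import mne   # core dependency of p3k

print("p3k module:", p3k)
print("mne version:", mne.__version__)
```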
Usage
If you are modifying the sources, set DEVELOP = True in the first lines of P300Analysis.py so that Python loads the current source files instead of the pip-installed package located in site-packages.
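What the flag controls is defined in P300Analysis.py itself; conceptually, a develop switch of this kind usually just puts the local checkout ahead of site-packages on the import path. A generic illustration, not copied from p3k:
```
# Generic sketch of a "develop mode" switch: prefer the local source tree
# over the pip-installed copy. NOT the actual contents of P300Analysis.py.
import sys
from pathlib import Path

DEVELOP = True  # flag described above

if DEVELOP:
    repo_root = Path(__file__).resolve().parent  # assumes this script sits in the repo root
    sys.path.insert(0, str(repo_root))           # imports now resolve to the checkout
```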
Test the sample data
from p3k.P300Analysis import run_analysis
run_analysis()
BCI2000 data
Put the file(s) inside a folder
```
from p3k.P300Analysis import run_analysis
from p3k.params import ParamData

# Define the path to the data
p_data = ParamData(data_dir='./data_bci2k')

# Run the analysis
run_analysis(param_data=p_data)
```
If the electrode names were not defined in the .dat files, you must specify them manually:
from p3k.params import ParamChannels
p_channels = ParamChannels(cname=['Fz','FC1','FC2','C1','Cz','C2','P3','Pz','P4','Oz'])
run_analysis(..., param_channels=p_channels)
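Putting the two snippets together (using the parameter names as written above), a complete call for BCI2000 data with manually specified channel names could look like this:
```
# Combine the data folder and channel-name parameters from the snippets above.
from p3k.P300Analysis import run_analysis
from p3k.params import ParamData, ParamChannels

p_data = ParamData(data_dir='./data_bci2k')
p_channels = ParamChannels(cname=['Fz','FC1','FC2','C1','Cz','C2','P3','Pz','P4','Oz'])

run_analysis(param_data=p_data, param_channels=p_channels)
```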
OpenVibe data
Check my tutorial to convert .ov to .gdf
OpenVibe-to-gdf conversion does not carry channel names or the P300 speller description, so we must define them here:
```
from p3k.P300Analysis import run_analysis
from p3k.params import ParamChannels, ParamData, SpellerInfo

# Channel names
p_channels = ParamChannels(cname=['Fz','FC1','FC2','C1','Cz','C2','P3','Pz','P4','Oz'])

# P300 speller description
speller_info = SpellerInfo(nb_stimulus_rows=7, nb_stimulus_cols=7, nb_seq=10)

# gdf file location
p_data = ParamData(data_files=r'./data_ov')

# Run the analysis
run_analysis(param_data=p_data, param_channels=p_channels, speller_info=speller_info)
```
Changing any parameter
If a parameter object is not initialized and passed to run_analysis(), default parameters apply. You can change them easily:
```
from p3k.P300Analysis import run_analysis
from p3k.params import ParamData, ParamPreprocessing, ParamArtifacts, ParamEpochs, ParamLDA, ParamInterface, DisplayPlots, SpellerInfo

p_data = ParamData(data_files='./data')

# Change the length of the ERP window and baseline
p_epoch = ParamEpochs(time_epoch=(-0.5, 0.8), time_baseline=(-.1, 0))

# Use artifact subspace reconstruction for noisy data, and select another bandpass
p_preproc = ParamPreprocessing(apply_ASR=True, bandpass=(.5, 30))

# Change the number of cross folds to match the number of trials (e.g. 8)
p_lda = ParamLDA(nb_cross_fold=8)

# Select which plots to display
p_plots = DisplayPlots(butterfly_topomap=True)

# Visualize a few of those parameters
print(p_epoch)
print(p_plots)

# Launch the analysis
run_analysis(param_data=p_data, param_epoch=p_epoch, param_preprocessing=p_preproc, param_lda=p_lda, param_plots=p_plots)
```
Output
By default, figures are saved into ./out/<name_first_datafile>/*
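To see what a run produced, you can simply list that folder. A minimal sketch assuming the default ./out/ location (the subfolder name depends on your first data file):
```
# List every file the analysis wrote under the default output directory.
from pathlib import Path

out_dir = Path("./out")  # default output location mentioned above
if out_dir.exists():
    for f in sorted(p for p in out_dir.rglob("*") if p.is_file()):
        print(f.relative_to(out_dir))
else:
    print("No output yet: run the analysis first")
```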
Accessing the notebook
Follow the instructions to "Install from git"
- Move to the repository folder
cd %USERPROFILE%\Desktop\p3k_offline_analysis
- Activate the environment
conda activate p3k
- Execute the notebook:
jupyter lab p300_analysis.ipynb
- If jupyter lab crashes (win32api error), reinstall it from conda
conda install pywin32 jupyterlab BCI2kReader
- If jupyter lab still does not work, use jupyter notebook instead:
jupyter notebook p300_analysis.ipynb
Owner
- Login: lokinou
- Kind: user
- Repositories: 1
- Profile: https://github.com/lokinou
GitHub Events
Committers
Last synced: about 2 years ago
Top Committers
| Name | Email | Commits |
|---|---|---|
| lokinou | l****l@g****m | 75 |
| Manamthi | m****l@g****e | 9 |
| Matthias Eidel | m****l@u****e | 7 |
| Loic Botrel | l****l@u****e | 2 |
Committer Domains (Top 20 + Academic)
Issues and Pull Requests
Last synced: 6 months ago
All Time
- Total issues: 1
- Total pull requests: 1
- Average time to close issues: N/A
- Average time to close pull requests: less than a minute
- Total issue authors: 1
- Total pull request authors: 1
- Average comments per issue: 1.0
- Average comments per pull request: 0.0
- Merged pull requests: 1
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 0
- Pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Issue authors: 0
- Pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- nbara (1)
Pull Request Authors
- lokinou (1)
Top Labels
Issue Labels
Pull Request Labels
Dependencies
- BCI2kReader *
- matplotlib *
- mne >=0.23
- numpy >=1.20.2
- pandas >=1.3.2
- scikit-learn >=0.22
- seaborn >=0.11.2
- anyio ==3.1.0
- argon2-cffi ==20.1.0
- async-generator ==1.10
- attrs ==21.2.0
- babel ==2.9.1
- backcall ==0.2.0
- bleach ==3.3.0
- cffi ==1.14.5
- chardet ==4.0.0
- colorama ==0.4.4
- decorator ==5.0.9
- defusedxml ==0.7.1
- entrypoints ==0.3
- et-xmlfile ==1.1.0
- idna ==2.10
- ipykernel ==5.5.5
- ipython ==7.24.1
- ipython-genutils ==0.2.0
- jedi ==0.18.0
- jinja2 ==3.0.1
- json5 ==0.9.5
- jsonschema ==3.2.0
- jupyter-client ==6.1.12
- jupyter-core ==4.7.1
- jupyter-server ==1.8.0
- jupyterlab ==3.0.16
- jupyterlab-pygments ==0.1.2
- jupyterlab-server ==2.6.0
- markupsafe ==2.0.1
- matplotlib-inline ==0.1.2
- meegkit ==0.1.1
- mistune ==0.8.4
- mne ==0.23.0
- nbclassic ==0.3.1
- nbclient ==0.5.3
- nbconvert ==6.0.7
- nbformat ==5.1.3
- nest-asyncio ==1.5.1
- notebook ==6.4.0
- openpyxl ==3.0.7
- packaging ==20.9
- pandocfilters ==1.4.3
- parso ==0.8.2
- pickleshare ==0.7.5
- prometheus-client ==0.11.0
- prompt-toolkit ==3.0.18
- pycparser ==2.20
- pygments ==2.9.0
- pyrsistent ==0.17.3
- pywin32 ==301
- pywinpty ==1.1.1
- pyzmq ==22.1.0
- requests ==2.25.1
- seaborn ==0.11.1
- send2trash ==1.5.0
- sniffio ==1.2.0
- terminado ==0.10.0
- testpath ==0.5.0
- traitlets ==5.0.5
- urllib3 ==1.26.5
- wcwidth ==0.2.5
- webencodings ==0.5.1
- websocket-client ==1.0.1
- xlrd ==2.0.1