Demos

demos/
    ├── face_repetition
    ├── lesion_detection
    ├── MoAE
    ├── openneuro
    ├── tSNR
    └── vismotion

The demos show you different ways to use bidspm.

MoAE

/demos/MoAE
    ├── models
    └── options

This “Mother of All Experiments” is based on the block design dataset of SPM.

The options folder contains several examples of how to encode the options of your analysis in a JSON file.

The models folder contains the BIDS statistical model used to run the GLM of this demo.

moae_01_bids_app

# MoAE demo

This script shows how to use the bidspm BIDS app.

  • Download

    • downloads the dataset for the SPM block design tutorial from the FIL website

  • Preprocessing

    • copies the necessary data from the raw to the derivative folder,

    • runs spatial preprocessing

      those are otherwise handled by the workflows:

    • bidsCopyInputFolder.m

    • bidsSpatialPrepro.m

  • Stats

    This will run the subject-level GLM of the MoAE dataset and the contrasts on it:

    • GLM specification + estimation

    • compute contrasts

    • show results

    These steps are otherwise handled by the workflows:

    • bidsFFX.m

    • bidsResults.m

Note

Results might differ slightly from those in the SPM manual, as some default options are different in this pipeline (e.g. use of FAST instead of AR(1), motion regressors added).

Type bidspm help or bidspm('action', 'help') or see this page: https://bidspm.readthedocs.io/en/stable/bids_app_api.html for more information on which parameters are obligatory or optional.

Copyright 2022 Remi Gau

moae_fmriprep

This script will run the FFX and the contrasts of the MoAE dataset using the fMRIPrep preprocessed data.

If you want to get the preprocessed data and you have datalad on your computer you can run the following commands to get the necessary data:

datalad install --source git@gin.g-node.org:/SPM_datasets/spm_moae_fmriprep.git \
        inputs/fmriprep
cd inputs/fmriprep && datalad get *.json \
                  */*/*tsv \
                  */*/*json \
                  */*/*desc-preproc*.nii.gz \
                  */*/*desc-brain*.nii.gz

Otherwise you can also grab the data from OSF: https://osf.io/vufjs/download

Copyright 2019 Remi Gau

moae_02_create_roi_extract_data

This script shows how to create a ROI and extract data from it.

Warning

This is “double dipping”, since we use the same data to create the ROI and to extract values from it.
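To see why this matters, here is a toy illustration in Python (pure numpy, unrelated to the bidspm code): voxels selected on the same noisy data you then average will show an inflated effect, while an independent data split does not.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground truth: no activation at all, pure noise in 1000 "voxels"
session1 = rng.normal(0.0, 1.0, size=1000)
session2 = rng.normal(0.0, 1.0, size=1000)  # independent second session

# "ROI" = the 10% most active voxels, selected on session1
roi = session1 > np.quantile(session1, 0.9)

double_dipped = session1[roi].mean()  # select and measure on the same data
independent = session2[roi].mean()    # measure on held-out data instead
```

Even though there is no signal at all, double_dipped comes out strongly positive, while independent stays near zero.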

Copyright 2021 Remi Gau

moae_03_slice_display

This script shows how to display the results of a GLM by overlaying on the same image:

  • the beta estimates

  • the t statistics

  • ROI contours

Copyright 2021 Remi Gau

Face repetition

This is based on the event-related design dataset of SPM.

face_rep_01_bids_app

This script will download the face repetition dataset from SPM and will run the basic preprocessing.

Download

  • downloads and BIDSify the dataset from the FIL website

Preprocessing

  • copies the necessary data from the raw to the derivative folder,

  • runs slice time correction

  • runs spatial preprocessing

those are otherwise handled by the workflows:

  • bidsCopyInputFolder.m

  • bidsSTC.m

  • bidsSpatialPrepro.m

Type bidspm help or bidspm('action', 'help') or see this page: https://bidspm.readthedocs.io/en/stable/bids_app_api.html for more information on which parameters are obligatory or optional.

face_rep_01_anat_only

This shows what an anatomical-only pipeline would look like.

Download

  • downloads and BIDSify the dataset from the FIL website

Preprocessing

  • copies the necessary data from the raw to the derivative folder,

  • runs spatial preprocessing

those are otherwise handled by the workflows:

  • bidsCopyInputFolder.m

  • bidsSpatialPrepro.m

Type bidspm help or bidspm('action', 'help') or see this page: https://bidspm.readthedocs.io/en/stable/bids_app_api.html for more information on which parameters are obligatory or optional.

Copyright 2022 Remi Gau

face_rep_02_stats

Warning

This script assumes you have already preprocessed the data with face_rep_01_bids_app.m

stats

This script will run the FFX and contrasts on the face repetition dataset from SPM:

  • GLM specification + estimation

  • compute contrasts

  • show results

These steps are otherwise handled by the workflows:

  • bidsFFX.m

  • bidsResults.m

Note

Results might differ slightly from those in the SPM manual, as some default options are different in this pipeline (e.g. use of FAST instead of AR(1), motion regressors added).

Type bidspm help or bidspm('action', 'help') or see this page: https://bidspm.readthedocs.io/en/stable/bids_app_api.html for more information on which parameters are obligatory or optional.

Copyright 2022 Remi Gau

face_rep_03_roi_analysis

Creates a ROI in MNI space from the retinotopic probabilistic atlas.

Creates its equivalent in subject space (inverse normalization).

Then uses marsbar to run an ROI-based GLM.

Copyright 2019 Remi Gau

Visual motion localizers

Small demo using visual motion localizer data to show how to set up an analysis with bidspm from scratch with datalad.

Using bidspm and datalad

Ideally it is better to use the datalad fMRI template we have set up; this demo shows a step-by-step approach instead.

Note

The bash script vismotion_demo.sh will run all the steps described here in one fell swoop.

You can run it by typing the following from within the bidspm/demos/vismotion folder:

bash vismotion_demo.sh

Set up

Create a new datalad dataset with a YODA config

datalad create -c yoda visual_motion_localiser
cd visual_motion_localiser

Add the bidspm code as a subdataset and recursively initialize all its submodules:

datalad install \
    -d . \
    -s https://github.com/cpp-lln-lab/bidspm.git \
    -r \
    code/bidspm

In case you get errors when installing the submodules, you may have to initialize them manually and then save that change to your dataset:

cd code/bidspm
git checkout main
git submodule update --init --recursive && git submodule update --recursive
cd ..
datalad save -m 'update bidspm submodules'

Now let’s get the raw data as a subdataset and put it in an inputs/raw folder.

The data from the CPP lab is openly available on GIN: https://gin.g-node.org/cpp-lln-lab/Toronto_VisMotionLocalizer_MR_raw

Note that to install it you will need to have set up Datalad to play nicely with GIN: see the datalad handbook.

This will install the data:

datalad install -d . \
                -s git@gin.g-node.org:/cpp-lln-lab/Trento_VisMotionLocalizer_MR_raw.git \
                --recursive \
                --jobs 12 \
                inputs/raw

After this your datalad dataset should look something like this:

├── code
│   └── bidspm
└── inputs
    └── raw
        ├── derivatives
        │   └── fmriprep
        ├── sub-con07
        ├── sub-con08
        └── sub-con15

To finish the setup you need to download the data:

cd inputs/raw
datalad get .

Note that you could have installed the dataset and downloaded the data in one command by adding the --get-data flag:

datalad install -d . \
                -s git@gin.g-node.org:/cpp-lln-lab/Trento_VisMotionLocalizer_MR_raw.git \
                --recursive \
                --get-data \
                --jobs 12 \
                inputs/raw

Running the analysis

Start MATLAB and run the step_1_preprocess.m and step_2_stats.m scripts.

In the end your whole analysis should look like this:

├── code
│   └── bidspm
│       ├── binder
│       ├── demos
│       │   ├── face_repetition
│       │   ├── lesion_detection
│       │   ├── MoAE
│       │   ├── openneuro
│       │   ├── tSNR
│       │   └── vismotion          # <--- your scripts are there
│       ├── docs
│       ├── lib
│       ├── manualTests
│       ├── notebooks
│       ├── src
│       ├── templates
│       └── tests
├── inputs
│   └── raw                        # <--- input data
│       ├── derivatives
│       │   └── fmriprep           # <--- fmriprep data
│       ├── sub-con07
│       │   └── ses-01
│       ├── sub-con08
│       │   └── ses-01
│       └── sub-con15
│           └── ses-01
└── outputs
    └── derivatives
        ├── bidspm-preproc        # <--- smoothed data
        │   ├── jobs
        │   ├── sub-con07
        │   ├── sub-con08
        │   └── sub-con15
        └── bidspm-stats          # <--- stats output
            ├── jobs
            ├── sub-con07
            ├── sub-con08
            └── sub-con15

OpenNeuro-based demos

Demos based on OpenNeuro datasets:

  • ds000001: one task, one session, several runs

  • ds000114: several tasks, several sessions, one or several runs depending on task

  • ds001168: resting state, several sessions, several acquisitions, fieldmaps, physio data

  • ds002799: resting state and task, several sessions, with fmriprep data

Download with datalad

All of those datasets can be installed with datalad.

Datalad datasets can be accessed via their siblings on: https://github.com/OpenNeuroDatasets

Check the content of the Makefile to see the code snippets you need to run to install those datasets.

Otherwise you can also get them by using the Datalad superdataset.

For example:

datalad install ///
cd datasets.datalad.org/
datalad install openneuro
datalad install openneuro/dsXXXXXX
cd openneuro/dsXXXXXX
# get the resting-state data of the first subject
datalad get sub-0001/func/sub-0001*

NARPS: ds001734

More details here:

https://docs.google.com/spreadsheets/d/1FU_F6kdxOD4PRQDIHXGHS4zTi_jEVaUqY_Zwg0z6S64/edit#gid=1019165812&range=A51

TODO: add expected value to the model

% compute euclidean distance to the indifference line defined by
% gain twice as big as losses
% https://en.wikipedia.org/wiki/Distance_from_a_point_to_a_line
a = 0.5;
b = -1;
c = 0;
x = onsets{iRun}.gain;
y = onsets{iRun}.loss;
dist = abs(a * x + b * y + c) / (a^2 + b^2)^.5;
onsets{iRun}.EV = dist; % create an "expected value" regressor
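For illustration only, the same point-to-line distance can be written in Python; numpy is assumed, and the gain and loss arrays below are made-up stand-ins for the values read from onsets{iRun}:

```python
import numpy as np

# Indifference line 0.5 * gain - loss = 0 (gains weighted twice as much as losses)
a, b, c = 0.5, -1.0, 0.0

# Hypothetical gain/loss values for one run (stand-ins for onsets{iRun})
gain = np.array([10.0, 20.0, 40.0])
loss = np.array([5.0, 5.0, 10.0])

# Distance from each (gain, loss) point to the line a*x + b*y + c = 0
dist = np.abs(a * gain + b * loss + c) / np.sqrt(a**2 + b**2)
```

A point exactly on the indifference line (e.g. gain 10, loss 5) gets a distance of 0, so the "expected value" regressor grows as trials move away from that line.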

TODO: transformers cannot yet be applied to confounds

{
    "Description": "Time points with a framewise displacement (as calculated by fMRIprep) > 0.5 mm were censored (no interpolation) at the subject level GLM..",
    "Name": "Threshold",
    "Input": [
        "framewise_displacement"
    ],
    "Threshold": 0.5,
    "Binarize": true,
    "Output": [
        "thres_framewise_displacement"
    ]
},
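The effect this transformer is meant to have can be sketched in plain Python; threshold() below is a hypothetical helper written for this example (pandas assumed), not the actual BIDS stats model implementation:

```python
import pandas as pd

def threshold(confounds, column, cutoff=0.5, binarize=True, output=None):
    """Rough sketch of a 'Threshold' transformation: flag time points
    where `column` exceeds `cutoff` (missing values treated as 0)."""
    values = confounds[column].fillna(0)
    flagged = values > cutoff
    # Binarize to a 0/1 censoring regressor, or keep the raw values above cutoff
    result = flagged.astype(int) if binarize else values.where(flagged, 0)
    confounds[output or column] = result
    return confounds

# Hypothetical framewise displacement trace from an fMRIPrep confounds file
df = pd.DataFrame({"framewise_displacement": [None, 0.1, 0.7, 0.2, 0.9]})
df = threshold(df, "framewise_displacement",
               output="thres_framewise_displacement")
```

With the values above, the new thres_framewise_displacement column flags the two time points whose displacement exceeds 0.5 mm.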