Cross-talk function & extraction of ROI activity

This repository contains a set of scripts that show how the cross-talk function can be used to understand the differences between pipelines for extracting M/EEG activity from brain areas of interest. For more details about the project, please refer to the accompanying paper and consider citing it if you use the scripts from this repository:

Kapralov, N., Studenova, A., Eguinoa, R., Nolte, G., Haufe, S., Villringer, A., & Nikulin, V. (2025). Non-uniform effects of remaining field spread on the estimation of M/EEG activity and connectivity between regions of interest. bioRxiv. https://doi.org/10.64898/2025.12.09.688708
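
As a brief illustration (not using the actual functions from this repository): for a linear inverse solution, the cross-talk function of a source is the corresponding row of the resolution matrix, i.e., the product of the inverse operator and the leadfield, and it describes how activity from all sources leaks into the estimate for that source. A minimal NumPy sketch with made-up dimensions and a pseudoinverse as the inverse operator:

    import numpy as np

    # Made-up dimensions for illustration only
    n_channels, n_sources = 60, 500
    rng = np.random.default_rng(0)

    L = rng.standard_normal((n_channels, n_sources))  # leadfield (channels x sources)
    W = np.linalg.pinv(L)                              # a linear inverse operator (sources x channels)

    R = W @ L        # resolution matrix (sources x sources)
    ctf_0 = R[0, :]  # cross-talk function of source 0: contribution of every
                     # source to the estimated activity of source 0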

Prerequisites

  1. The LEMON dataset.

  2. The RIFT dataset.

    NOTE: We used the scratch/sub*-snr-and-spectra.mat files to keep track of the subjects that were included in the original analysis, so please make sure to download them!

  3. Python (tested with 3.12.11). The dependencies are described in pyproject.toml and can be installed with pip install -e .[dev].

  4. MATLAB (tested with R2023b) and the FieldTrip toolbox (tested with v20241025) are used to export the RIFT data to MNE-Python.

Folder structure

  • assets - static files used in analyses or data visualization
    • lemon-info.fif - an mne.Info object that contains template locations of EEG channels present in the LEMON dataset
    • plots.mplstyle - a Matplotlib theme used for all plots
    • rift-*-setup.png - stimuli that were presented to participants in the RIFT experiment
  • data (not tracked)
    • derivatives - intermediate results of all analysis steps
    • review - source data for the literature review (to be published)
    • simulations - simulated data for all experiments
  • matlab - MATLAB scripts for exporting the conditions of interest from the RIFT dataset
  • paper - source files required for generating the resulting PDF of the manuscript
    • figure_components - figure components that were created or post-processed manually
    • figures - final figures
    • numbers - numbers (settings, results) exported from scripts to TeX
    • sections - source TeX code for the main text
    • supplementary - source TeX code for the supplementary material
  • results (not tracked) - plots generated by the scripts
  • scripts - Python scripts that describe the main analysis
  • src - functions that were implemented for the analysis, provided as a Python package ctfeval that can be installed locally
  • tests - unit tests for the analysis functions
  • workflow - a set of Snakemake rules that describe the workflow and can be used to automatically run all scripts either locally or on the SLURM cluster

Creating a local copy of the project

  1. Clone the repository or download its contents.

  2. Navigate to the project folder.

  3. Install the Python dependencies by running the following command (optionally, in a dedicated environment):

    pip install -e .[dev]
    

    The command above makes the analysis functions available as a ctfeval package, e.g.:

    from ctfeval.connectivity import data2cs_fourier
    
  4. Create a copy of the .env.example file and name it .env. This file is used to set all variables (e.g., paths) that are specific to your local environment. Ideally, specify absolute paths rather than relative ones.
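
    For illustration, the file contains simple KEY=value lines; the variable names and paths below are made up, and the actual names are listed in .env.example:

    # Hypothetical .env contents (variable names are placeholders)
    LEMON_ROOT=/absolute/path/to/LEMON
    RIFT_ROOT=/absolute/path/to/RIFT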

  5. Run the script that initializes the workspace and downloads the fsaverage head model:

    python scripts/00_prepare/00_init_workspace.py
    
  6. [optional, but recommended] Run the tests to make sure that the previous steps were performed successfully. For this, a forward model also needs to be created first:

    # Create the template forward model with oct6 spacing
    python scripts/00_prepare/01_prepare_head_model.py --spacing oct6
    
    # All tests should pass
    pytest
    
  7. The main analysis is described as a set of Snakemake rules and can be run using one of the following commands:

    # local execution, parallelized over 8 CPU cores
    snakemake --cores 8
    
    # execution on a SLURM cluster (additional config file is required, check the docs)
    snakemake --executor slurm
    

    Alternatively, the numeric prefixes of the script files indicate the order in which they are supposed to be executed if you run them manually.
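
    For reference, the rules in the workflow folder essentially wrap calls to these scripts. A purely hypothetical sketch of such a rule (the rule name and output path are made up; the script call is taken from step 6):

    # Hypothetical rule for illustration only; the actual rules live in workflow/
    rule prepare_head_model:
        output:
            "data/derivatives/fsaverage-oct6-fwd.fif"  # made-up output path
        shell:
            "python scripts/00_prepare/01_prepare_head_model.py --spacing oct6"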

  8. The PDF of the paper can be built using the Makefile in two commands:

    # collect the generated results (except manually post-processed figures)
    make collect
    
    # build the PDF
    make paper
    
  9. Done!
