
Batch improvements #345

Merged (13 commits) Sep 23, 2024

Conversation

matsvanes
Collaborator

Changes:

  • implemented group processing in osl.preprocessing.run_proc_batch
  • added a read_dataset wrapper to osl
  • added a skip_save argument to run_proc_chain/batch and write_dataset
  • added log statements when saving data in write_dataset
  • added a Study.refresh() function
  • added save_pkl support for glm_epochs
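
The headline change is that `run_proc_batch` can now run group-level stages after the per-subject preprocessing. As illustrated by the test script further down, the config gains a `group:` section alongside `preproc:`; its entries (here a user-supplied `second_level` function passed via `extra_funcs`) operate on the collected outputs of all subjects rather than a single recording. A minimal sketch of that config shape:

```yaml
preproc:
  - resample:     {sfreq: 250}
  - first_level:  {}             # per-subject custom function (via extra_funcs)
group:
  - second_level: {target: glm}  # runs once, over all subjects' 'glm' outputs
```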

@matsvanes matsvanes requested a review from cgohil8 September 19, 2024 11:21
@matsvanes (Collaborator, Author)

@cgohil8 I've tested the changes (especially the group processing in osl) with something like the script below (adapt the paths if you want to run it yourself):

"""
Use this script to preprocess the data.
"""
import os
import osl
import logging
import glmtools as glm

raw_dir = "/Users/matsvanes/Documents/Werk/osl/meguk-debugging/raw"
outdir = "/Users/matsvanes/Documents/Werk/osl/meguk-debugging/output_mve"
study = osl.utils.Study(os.path.join(raw_dir, "sub-oxf{i_sub}/meg/sub-oxf{i_sub}_task-{task}_meg.fif"))
infiles = sorted(study.get(task="resteyesclosed"))[:2]
overwrite = True
use_dask = False

subjects = ["sub-oxf001_task-resteyesclosed", "sub-oxf002_task-resteyesclosed"]

  
def first_level(dataset, userargs):
    """Compute a first-level GLM spectrum for a single subject."""
    dataset['raw'].info['bads'] = []  # bads were already interpolated upstream
    dataset['glm'] = osl.glm.glm_spectrum(
        dataset['raw'].pick_types(meg=True),
        fmin=1, fmax=95,
        nperseg=500, noverlap=250,
        mode='magnitude',
        standardise_data=True,
    )
    return dataset
  
def second_level(dataset, userargs):
    """Fit a group-level GLM with one categorical regressor per subject."""
    target = userargs.get("target", "glm")
    group_design = glm.design.DesignConfig()
    for subj_ind in range(len(dataset[target])):
        group_design.add_regressor(
            name=f"Subj{subj_ind + 1}", rtype="Categorical", codes=subj_ind
        )
    group_design.add_contrast(name="Difference", values={"Subj1": 1, "Subj2": -1})
    dataset['group_glm'] = osl.glm.group_glm_spectrum(dataset[target], group_design)
    return dataset


config = """
meta:
  event_codes:
preproc:
  - crop:               {tmin: 30}
  - filter:             {l_freq: 0.5, h_freq: 125, method: iir, iir_params: {order: 5, ftype: butter}}
  - notch_filter:       {freqs: 50 100, notch_widths: 2} 
  - resample:           {sfreq: 250}
  - bad_segments:       {segment_len: 500, picks: grad, significance_level: 0.1}
  - bad_segments:       {segment_len: 500, picks: mag, significance_level: 0.1}
  - bad_segments:       {segment_len: 500, picks: grad, mode: diff, significance_level: 0.1}
  - bad_segments:       {segment_len: 500, picks: mag, mode: diff, significance_level: 0.1}
  - bad_channels:       {picks: meg, significance_level: 0.1}
  - interpolate_bads:   {reset_bads: False}
  - first_level:        {}
group:
  - second_level:       {target: glm}
"""

if use_dask:
    from dask.distributed import Client
    client = Client(n_workers=2, threads_per_worker=1)

goods = osl.preprocessing.run_proc_batch(
    config, 
    infiles, 
    subjects=subjects,
    outdir=outdir,
    extra_funcs=[first_level, second_level],
    overwrite=overwrite,
    dask_client=use_dask,
)

@cgohil8
Collaborator

cgohil8 commented Sep 20, 2024

This looks good.

@matsvanes matsvanes merged commit 95eefaa into main Sep 23, 2024
1 check passed
@matsvanes matsvanes deleted the batch_improvements branch September 23, 2024 09:56