Update subject segmentation outputs (#146)
* Update subject segmentation outputs

* desc -> atlas_desc to make the source explicit

* parametrise the output name test

* ensure the file name changes are reflected in the workflow

* FIX test should reflect new change

* Update documentation related to introducing the seg entity

* Reflect the change of the Schaefer atlas name in the GitHub workflow

* [full_test] test deprecation

* FIX type of the new deprecation action class

* [full_test] deprecation warning for atlas name

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* [full_test] check type of potentially deprecated atlas input

* [full_test] TEST capture log correctly

* add deprecation-related details to doc

* TEST improve coverage

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
htwangtw and pre-commit-ci[bot] authored May 28, 2024
1 parent 777f0e0 commit 692948a
Showing 15 changed files with 393 additions and 199 deletions.
6 changes: 3 additions & 3 deletions .github/workflows/docker.yml
@@ -71,7 +71,7 @@ jobs:
needs: [download-test-data, docker-build]
strategy:
matrix:
atlas: ['Schaefer20187Networks', 'MIST', 'DiFuMo', 'HarvardOxfordCortical', 'HarvardOxfordCorticalSymmetricSplit', 'HarvardOxfordSubcortical']
atlas: ['Schaefer2018', 'MIST', 'DiFuMo', 'HarvardOxfordCortical', 'HarvardOxfordCorticalSymmetricSplit', 'HarvardOxfordSubcortical']
steps:
- uses: actions/checkout@v4
with:
@@ -92,12 +92,12 @@ jobs:
docker run --rm \
-v ${{ env.DATA }}:/test_data \
-v ./outputs:/outputs \
-v ./outputs/working_dir:/work \
-v ./outputs/atlases:/atlases \
${{env.USER_NAME}}/${{env.REPO_NAME}} \
/test_data/ds000017-fmriprep22.0.1-downsampled-nosurface \
/outputs \
participant \
-w /work \
-a /atlases \
--atlas ${{ matrix.atlas }} \
--participant_label 1 \
--reindex-bids
7 changes: 6 additions & 1 deletion docs/source/changes.md
@@ -1,11 +1,13 @@
# What’s new

## 0.5.1.dev
## 0.6.0.dev

**Released MONTH YEAR**

### New

- [ENH] The default atlas `Schaefer20187Networks` is renamed to `Schaefer2018`. `Schaefer20187Networks` will be deprecated in 0.7.0. (@htwangtw)
- [ENH] `--work-dir` is renamed to `--atlases-dir`. `--work-dir` will be deprecated in 0.7.0. (@htwangtw)
- [ENH] Add details of the denoising strategy to the metadata of the time series extraction. (@htwangtw) [#144](https://github.com/bids-apps/giga_connectome/issues/144)

### Fixes
@@ -17,6 +19,9 @@

### Changes

- [ENH] Merge `atlas-` and the atlas description `desc-` into one field, `seg-`, defined under 'Derivatives - Image data type' in BIDS (see the sketch below). (@htwangtw) [#143](https://github.com/bids-apps/giga_connectome/issues/143)
- [ENH] The working directory is now renamed to `atlases/` to reflect the atlases directory described in BEP017.
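To make the entity merge above concrete, here is a minimal sketch of the old versus new naming, assuming the atlas name and its description are simply concatenated into the `seg-` label; the entity values are made up, and the authoritative filename template lives in `docs/source/outputs.md`, which is not rendered in this diff.

```python
# Hypothetical illustration of the atlas-/desc- to seg- merge; the entity
# values below are made up, not taken from the repository.
atlas_name, atlas_desc = "Schaefer2018", "100Parcels7Networks"

old_style = f"space-MNI152NLin2009cAsym_atlas-{atlas_name}_desc-{atlas_desc}_timeseries.tsv"
new_style = f"space-MNI152NLin2009cAsym_seg-{atlas_name}{atlas_desc}_timeseries.tsv"

print(old_style)
print(new_style)
```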

## 0.5.0

Released April 2024
145 changes: 97 additions & 48 deletions docs/source/outputs.md

Large diffs are not rendered by default.

12 changes: 6 additions & 6 deletions docs/source/usage.md
@@ -47,18 +47,18 @@ An example using Apptainer (formerly known as Singularity):
```bash
FMRIPREP_DIR=/path/to/fmriprep_output
OUTPUT_DIR=/path/to/connectome_output
WORKING_DIR=/path/to/working_dir
ATLASES_DIR=/path/to/atlases
DENOISE_CONFIG=/path/to/denoise_config.json

GIGA_CONNECTOME=/path/to/giga-connectome.simg

apptainer run \
--bind ${FMRIPREP_DIR}:/data/input \
--bind ${OUTPUT_DIR}:/data/output \
--bind ${WORKING_DIR}:/data/working \
--bind ${ATLASES_DIR}:/data/atlases \
--bind ${DENOISE_CONFIG}:/data/denoise_config.json \
${GIGA_CONNECTOME} \
-w /data/working \
-a /data/atlases \
--denoise-strategy /data/denoise_config.json \
/data/input \
/data/output \
Expand Down Expand Up @@ -128,7 +128,7 @@ An example using Apptainer (formerly known as Singularity):
```bash
FMRIPREP_DIR=/path/to/fmriprep_output
OUTPUT_DIR=/path/to/connectome_output
WORKING_DIR=/path/to/working_dir
ATLASES_DIR=/path/to/atlases
ATLAS_CONFIG=/path/to/atlas_config.json

GIGA_CONNECTOME=/path/to/giga-connectome.simg
@@ -138,10 +138,10 @@ export APPTAINERENV_TEMPLATEFLOW_HOME=/data/atlas
apptainer run \
--bind ${FMRIPREP_DIR}:/data/input \
--bind ${OUTPUT_DIR}:/data/output \
--bind ${WORKING_DIR}:/data/working \
--bind ${ATLASES_DIR}:/data/atlases \
--bind ${ATLAS_CONFIG}:/data/atlas_config.json \
${GIGA_CONNECTOME} \
-w /data/working \
-a /data/atlases \
--atlas /data/atlas_config.json \
/data/input \
/data/output \
60 changes: 37 additions & 23 deletions giga_connectome/atlas.py
@@ -30,6 +30,12 @@
{"name": str, "file_paths": Dict[str, List[Path]], "type": str},
)

deprecations = {
# deprecated atlas name:
# (replacement, version slated to be removed in)
"Schaefer20187Networks": ("Schaefer2018", "0.7.0"),
}


def load_atlas_setting(
atlas: str | Path | dict[str, Any],
Expand Down Expand Up @@ -94,60 +100,55 @@ def load_atlas_setting(


def resample_atlas_collection(
template: str,
subject_seg_file_names: list[str],
atlas_config: ATLAS_SETTING_TYPE,
group_mask_dir: Path,
group_mask: Nifti1Image,
subject_mask_dir: Path,
subject_mask: Nifti1Image,
) -> list[Path]:
"""Resample a atlas collection to group grey matter mask.
Parameters
----------
template: str
Templateflow template name. This template should match the template of
`all_masks`.
subject_seg_file_names: list of str
File names of subject atlas segmentations.
atlas_config: dict
Atlas configuration. Currently supports Schaefer2018, MIST, DiFuMo.
group_mask_dir: pathlib.Path
subject_mask_dir: pathlib.Path
Path to where the outputs are saved.
group_mask : nibabel.nifti1.Nifti1Image
EPI (grey matter) mask for the current group of subjects.
subject_mask : nibabel.nifti1.Nifti1Image
EPI (grey matter) mask for the subject.
Returns
-------
list of pathlib.Path
Paths to atlases sampled to group level grey matter mask.
Paths to subject specific segmentations created from atlases sampled
to individual grey matter mask.
"""
gc_log.info("Resample atlas to group grey matter mask.")
resampled_atlases = []
subject_seg = []

with progress_bar(text="Resampling atlases") as progress:
task = progress.add_task(
description="resampling", total=len(atlas_config["file_paths"])
)

for desc in atlas_config["file_paths"]:
for seg_file, desc in zip(
subject_seg_file_names, atlas_config["file_paths"]
):
parcellation = atlas_config["file_paths"][desc]
parcellation_resampled = resample_to_img(
parcellation, group_mask, interpolation="nearest"
)
filename = (
f"tpl-{template}_"
f"atlas-{atlas_config['name']}_"
"res-dataset_"
f"desc-{desc}_"
f"{atlas_config['type']}.nii.gz"
parcellation, subject_mask, interpolation="nearest"
)
save_path = group_mask_dir / filename
save_path = subject_mask_dir / seg_file
nib.save(parcellation_resampled, save_path)
resampled_atlases.append(save_path)
subject_seg.append(save_path)

progress.update(task, advance=1)

return resampled_atlases
return subject_seg
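
As a point of reference, the snippet below reproduces the core of this resampling step outside the package: each parcellation is resampled onto the subject grey-matter mask grid with nearest-neighbour interpolation and saved under a subject-specific segmentation name. It is a minimal sketch; the file names and output directory are placeholders rather than values from the repository.

```python
# Minimal sketch of the resampling step performed above; file names and
# paths are placeholders, not values taken from the repository.
from pathlib import Path

import nibabel as nib
from nilearn.image import resample_to_img

subject_mask = nib.load("sub-1_space-MNI152NLin2009cAsym_label-GM_mask.nii.gz")
parcellations = {
    "100Parcels7Networks": "atlas-Schaefer2018_desc-100Parcels7Networks_dseg.nii.gz",
}
out_dir = Path("atlases/sub-1/func")
out_dir.mkdir(parents=True, exist_ok=True)

for desc, parcellation in parcellations.items():
    # Nearest-neighbour interpolation keeps integer parcel labels intact.
    resampled = resample_to_img(parcellation, subject_mask, interpolation="nearest")
    nib.save(resampled, out_dir / f"sub-1_seg-Schaefer2018{desc}_dseg.nii.gz")
```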


def get_atlas_labels() -> List[str]:
Expand Down Expand Up @@ -176,9 +177,18 @@ def _check_altas_config(
KeyError
atlas configuration not containing the correct keys.
"""
if isinstance(atlas, str) and atlas in deprecations:
new_name, version = deprecations[atlas]
gc_log.warning(
f"{atlas} has been deprecated and will be removed in "
f"{version}. Please use {new_name} instead."
)
atlas = new_name

# load the file first if the input is not already a dictionary
atlas_dir = resource_filename("giga_connectome", "data/atlas")
preset_atlas = [p.stem for p in Path(atlas_dir).glob("*.json")]

if isinstance(atlas, (str, Path)):
if atlas in preset_atlas:
config_path = Path(
@@ -188,6 +198,10 @@
)
elif Path(atlas).exists():
config_path = Path(atlas)
else:
raise FileNotFoundError(
f"Atlas configuration file {atlas} not found."
)

with open(config_path, "r") as file:
atlas_config = json.load(file)
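
The deprecation shim added to `_check_altas_config` follows a common mapping-plus-warning pattern. The stand-alone sketch below reproduces that pattern with the standard `logging` module; the logger and function name are placeholders, not the package's `gc_logger`.

```python
# Stand-alone sketch of the deprecation pattern added above; the logger and
# function name are placeholders, not the package's gc_logger.
import logging

logging.basicConfig(level=logging.WARNING)
log = logging.getLogger("atlas_deprecation_demo")

DEPRECATIONS = {
    # deprecated atlas name: (replacement, version slated for removal)
    "Schaefer20187Networks": ("Schaefer2018", "0.7.0"),
}


def resolve_atlas_name(atlas: str) -> str:
    """Return the current atlas name, warning when a deprecated one is given."""
    if atlas in DEPRECATIONS:
        new_name, version = DEPRECATIONS[atlas]
        log.warning(
            "%s has been deprecated and will be removed in %s. "
            "Please use %s instead.",
            atlas,
            version,
            new_name,
        )
        return new_name
    return atlas


print(resolve_atlas_name("Schaefer20187Networks"))  # warns, then prints "Schaefer2018"
```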
@@ -1,5 +1,5 @@
{
"name": "Schaefer20187Networks",
"name": "Schaefer2018",
"parameters": {
"atlas": "Schaefer2018",
"template": "MNI152NLin2009cAsym",
114 changes: 63 additions & 51 deletions giga_connectome/mask.py
@@ -21,59 +21,79 @@

from giga_connectome.atlas import ATLAS_SETTING_TYPE, resample_atlas_collection
from giga_connectome.logger import gc_logger
from giga_connectome import utils

gc_log = gc_logger()


def generate_gm_mask_atlas(
working_dir: Path,
atlases_dir: Path,
atlas: ATLAS_SETTING_TYPE,
template: str,
masks: list[BIDSImageFile],
) -> tuple[Path, list[Path]]:
""" """
# check masks; isolate this part and make sure to validate it as a
# templateflow template with a config file

group_mask_dir = working_dir / "groupmasks" / f"tpl-{template}"
group_mask_dir.mkdir(exist_ok=True, parents=True)

group_mask, resampled_atlases = None, None
if group_mask_dir.exists():
group_mask, resampled_atlases = _check_pregenerated_masks(
template, working_dir, atlas
subject, _, _ = utils.parse_bids_name(masks[0].path)
subject_mask_dir = atlases_dir / subject / "func"
subject_mask_dir.mkdir(exist_ok=True, parents=True)
target_subject_mask_file_name: str = utils.output_filename(
source_file=masks[0].path,
atlas="",
suffix="mask",
extension="nii.gz",
strategy="",
atlas_desc="",
)
target_subject_seg_file_names: list[str] = [
utils.output_filename(
source_file=masks[0].path,
atlas=atlas["name"],
suffix=atlas["type"],
extension="nii.gz",
strategy="",
atlas_desc=atlas_desc,
)
for atlas_desc in atlas["file_paths"]
]
target_subject_mask, target_subject_seg = _check_pregenerated_masks(
subject_mask_dir,
target_subject_mask_file_name,
target_subject_seg_file_names,
)

if not group_mask:
if not target_subject_mask:
# grey matter group mask is only supplied in MNI152NLin2009c(A)sym
group_mask_nii = generate_group_mask(
subject_mask_nii = generate_subject_gm_mask(
[m.path for m in masks], "MNI152NLin2009cAsym"
)
current_file_name = (
f"tpl-{template}_res-dataset_label-GM_desc-group_mask.nii.gz"
nib.save(
subject_mask_nii, subject_mask_dir / target_subject_mask_file_name
)
group_mask = group_mask_dir / current_file_name
nib.save(group_mask_nii, group_mask)

if not resampled_atlases:
resampled_atlases = resample_atlas_collection(
template, atlas, group_mask_dir, group_mask
if not target_subject_seg:
subject_seg_niis = resample_atlas_collection(
target_subject_seg_file_names,
atlas,
subject_mask_dir,
subject_mask_nii,
)

return group_mask, resampled_atlases
return subject_mask_nii, subject_seg_niis


def generate_group_mask(
def generate_subject_gm_mask(
imgs: Sequence[Path | str | Nifti1Image],
template: str = "MNI152NLin2009cAsym",
templateflow_dir: Path | None = None,
n_iter: int = 2,
) -> Nifti1Image:
"""
Generate a group EPI grey matter mask, and overlaid with a MNI grey
Generate a subject EPI grey matter mask, overlaid with an MNI grey
matter template.
The Group EPI mask will ensure the signal extraction is from the most
overlapping voxels.
The subject EPI mask will ensure the signal extraction is from the most
overlapping voxels for all scans of the subject.
Parameters
----------
@@ -267,38 +287,30 @@ def _check_mask_affine(


def _check_pregenerated_masks(
template: str, working_dir: Path, atlas: ATLAS_SETTING_TYPE
) -> tuple[Path | None, list[Path] | None]:
subject_mask_dir: Path,
subject_mask_file_name: str,
subject_seg_file_names: list[str],
) -> tuple[bool, bool]:
"""Check if the working directory is populated with needed files."""
output_dir = working_dir / "groupmasks" / f"tpl-{template}"
group_mask: Path | None = (
output_dir
/ f"tpl-{template}_res-dataset_label-GM_desc-group_mask.nii.gz"
)
if group_mask and not group_mask.exists():
group_mask = None
else:
# subject grey matter mask
if target_subject_mask := (
subject_mask_dir / subject_mask_file_name
).exists():
gc_log.info(
f"Found pregenerated group level grey matter mask: {group_mask}"
"Found pregenerated group level grey matter mask: "
f"{subject_mask_dir / subject_mask_file_name}"
)

# atlas
resampled_atlases: list[Path] = []
for desc in atlas["file_paths"]:
filename = (
f"tpl-{template}_"
f"atlas-{atlas['name']}_"
"res-dataset_"
f"desc-{desc}_"
f"{atlas['type']}.nii.gz"
)
resampled_atlases.append(output_dir / filename)
all_exist = [file_path.exists() for file_path in resampled_atlases]
if not all(all_exist):
return group_mask, None
else:
all_exist = [
(subject_mask_dir / file_path).exists()
for file_path in subject_seg_file_names
]
if target_subject_seg := all(all_exist):
gc_log.info(
f"Found resampled atlases:\n{[str(x) for x in resampled_atlases]}."
"\nSkipping group level mask generation step."
"Found resampled atlases:\n"
f"{[filepath for filepath in subject_seg_file_names]} "
f"in {subject_mask_dir}."
"\nSkipping individual segmentation generation step."
)
return group_mask, resampled_atlases
return target_subject_mask, target_subject_seg
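
The body of `generate_subject_gm_mask` is truncated in this diff, so, for orientation only, the sketch below shows one plausible way to produce a subject EPI grey-matter mask of the kind the docstring describes: intersect the subject's EPI masks, then restrict the result to an MNI grey-matter template. The templateflow query, the 0.2 threshold, and the file names are assumptions, not the package's implementation.

```python
# Plausible sketch of a subject EPI grey-matter mask, NOT the package's
# implementation; the templateflow query, threshold, and file names are
# assumptions made for illustration.
import templateflow.api as tflow
from nilearn.image import math_img, resample_to_img
from nilearn.masking import intersect_masks

epi_masks = [
    "sub-1_task-rest_run-1_space-MNI152NLin2009cAsym_desc-brain_mask.nii.gz",
    "sub-1_task-rest_run-2_space-MNI152NLin2009cAsym_desc-brain_mask.nii.gz",
]

# Keep only voxels covered by every scan of the subject.
subject_epi_mask = intersect_masks(epi_masks, threshold=1, connected=False)

# Grey-matter probability map from templateflow, binarised at an assumed cut-off.
gm_probseg = tflow.get(
    "MNI152NLin2009cAsym", label="GM", suffix="probseg", resolution=1
)
gm_mask = math_img("img > 0.2", img=str(gm_probseg))
gm_mask = resample_to_img(gm_mask, subject_epi_mask, interpolation="nearest")

# Final subject grey-matter mask: EPI coverage restricted to grey matter.
subject_gm_mask = math_img("(img1 > 0) & (img2 > 0)", img1=subject_epi_mask, img2=gm_mask)
```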