♻️ Split ResourcePool into three classes #2131

Open · wants to merge 77 commits into base `develop`

Commits (77)
b414d17
:truck: Move engine resources to own submodule
shnizzedy Jul 8, 2024
821bcaa
:construction: Split ResourcePool into three classes with docstrings
shnizzedy Jul 8, 2024
06612ff
:white_check_mark: Move BIDS examples to a fixture
shnizzedy Jul 9, 2024
7a76603
:construction: WIP :recycle: Move `ResourcePool` init from functions …
shnizzedy Jul 10, 2024
646f49d
:white_check_mark: Restore engine unit tests
shnizzedy Jul 10, 2024
3334a2a
:construction_worker: Replace non-`\w`-non-dot characters with hyphens
shnizzedy Jul 10, 2024
bbf3e97
:recycle: Fold `initiate_rpool` into `ResourcePool.__init__`
shnizzedy Jul 10, 2024
f1f7705
:recycle: Finish moving `create_func_datasource` from func to method
shnizzedy Jul 10, 2024
515b791
:recycle: EmptyConfiguration → Preconfiguration('blank')
shnizzedy Jul 11, 2024
3c2220d
:pencil2: Import Path for signature
shnizzedy Jul 11, 2024
8a24441
:recycle: Pass `part_id` through to `DataPaths`
shnizzedy Jul 11, 2024
87fb1c0
:white_check_mark: Skip NHP configs if no torch installed
shnizzedy Jul 11, 2024
3af1407
:white_check_mark: Use abspath for BIDS-examples
shnizzedy Jul 11, 2024
98c8bb8
:construction: :recycle: Continue updating calls to `ResourcePool` me…
shnizzedy Jul 11, 2024
4d9934d
:construction: WIP :recycle: Fix `StratPool.__init__`
shnizzedy Jul 11, 2024
1cc3914
:art: :technologist: Clarify typing
shnizzedy Jul 12, 2024
199bd60
:white_check_mark: `pytest.Cache.makedir` → `pytest.Cache.mkdir`
shnizzedy Jul 12, 2024
ec32f34
:pencil2: Pipe inside quotation marks
shnizzedy Jul 13, 2024
b1f16ed
:art: Type `ResourcePool.get_strats`
shnizzedy Jul 13, 2024
02d1a14
:white_check_mark: Remove dir instead of file
shnizzedy Jul 13, 2024
1b11462
:technologist: Add `__repr__` method to `Resource`
shnizzedy Jul 13, 2024
bb6cbae
:bug: Fix circular import
shnizzedy Jul 15, 2024
1058c94
:art: Define `_Pool.__contains__` and `Resource.__contains__`
shnizzedy Jul 15, 2024
aaa37a9
:recycle: Move `StratDict().rpool[json]` to `StratDict().json`
shnizzedy Jul 15, 2024
4280fc6
:recycle: Dedupe loop through `self.node_blocks.items()`
shnizzedy Jul 15, 2024
c22db11
:bug: Remove extra curly braces
shnizzedy Jul 15, 2024
90d96b1
:white_check_mark: Update test for new `Resource` class
shnizzedy Jul 15, 2024
e25479f
:bug: Fix auto-`quick_single`
shnizzedy Jul 15, 2024
faf8ab9
:white_check_mark: Check if `bids-examples` is empty before moving on
shnizzedy Jul 15, 2024
5ea19ef
:twisted_rightwards_arrow: Merge `develop` into `engine/(th)r(e)esour…
shnizzedy Jul 15, 2024
abb4809
:art: Type `connect_pipeline`
shnizzedy Jul 15, 2024
6207348
:zap: Replace some `deepcopy` calls
shnizzedy Jul 15, 2024
588df00
:recycle: Typehint `StratPool.append_name`
shnizzedy Jul 16, 2024
fc2714a
:recycle: Clarify `ResourcePool.gather_pipes`
shnizzedy Jul 16, 2024
d30496c
:recycle: Move `connect_pipeline` from standalone function to `Resour…
shnizzedy Jul 17, 2024
a9a3c48
:coffin: Remove `_Pool.node_data` method
shnizzedy Jul 17, 2024
3613f8c
:construction_worker: Livelog pytest
shnizzedy Jul 17, 2024
4bf5f00
:recycle: Move `post_process` method back into `ResourcePool`
shnizzedy Jul 17, 2024
b0b94c9
:recycle: Move `filter_name` method into `StratPool`
shnizzedy Jul 17, 2024
29b481d
:recycle: Move `filtered_movement` property into `StratPool`
shnizzedy Jul 17, 2024
48e8b90
:recycle: Move `derivative_xfm` back into `ResourcePool`
shnizzedy Jul 17, 2024
74629ac
:coffin: Remove unused `flatten_prov` method
shnizzedy Jul 17, 2024
748b98e
:recycle: Move `get_resource_strats_from_prov` back into `ResourcePool`
shnizzedy Jul 17, 2024
52c38bf
:coffin: Remove unused `generate_prov_list` method
shnizzedy Jul 17, 2024
f2423a2
:recycle: Move `get_cpac_provenance` and `regressor_dct` into `StratP…
shnizzedy Jul 17, 2024
0f9099c
:recycle: Split `get_json` across `ResourcePool` and `StratPool`
shnizzedy Jul 17, 2024
6fb1dc1
:coffin: Remove unused `get_pipe_idxs` method
shnizzedy Jul 17, 2024
a004ab6
:coffin: Remove unused `update_resource` method
shnizzedy Jul 17, 2024
69cb603
:recycle: Move `copy_resource` method into `StratPool`
shnizzedy Jul 17, 2024
0a0b5a0
:recycle: Move `get_json_info` back into `ResourcePool`
shnizzedy Jul 17, 2024
839c7cd
:coffin: Remove unused `set_json_info` method
shnizzedy Jul 17, 2024
b8ca36c
:coffin: Remove unused `get_strat_info` method
shnizzedy Jul 17, 2024
2fc7244
:recycle: Move `get_raw_label` back into `ResourcePool`
shnizzedy Jul 17, 2024
14c3e32
:coffin: Remove unused `get_entire_rpool` method
shnizzedy Jul 17, 2024
77ffc16
:coffin: Remove unused `wrap_block` function
shnizzedy Jul 17, 2024
ef402e5
:recycle: Move `_get_pipe_number` back into `ResourcePool`
shnizzedy Jul 17, 2024
c944940
:recycle: Move `create_func_datasource` method into `ResourcePool`
shnizzedy Jul 17, 2024
cf85273
:coffin: Remove unused `get_name` method
shnizzedy Jul 17, 2024
84274b9
:recycle: Move `_config_lookup` method into `ResourcePool`
shnizzedy Jul 17, 2024
8389d5b
:recycle: Move `json_outdir_ingress` method into `ResourcePool`
shnizzedy Jul 17, 2024
2e97e51
:recycle: Move `initialize_nipype_wf` method into `ResourcePool`
shnizzedy Jul 17, 2024
a855222
:coffin: Remove unused function `run_node_blocks`
shnizzedy Jul 17, 2024
0d848b9
:recycle: Replace calls to `grab_tiered_dct` with direct config lookup
shnizzedy Jul 17, 2024
7571455
:recycle: Move `_check_null` from method to private function
shnizzedy Jul 17, 2024
4a05442
:construction_worker: Pre-clone `bids-examples`
shnizzedy Jul 17, 2024
dd0985f
:pencil2: Fix typo (~~"tpyes"~~ → "types")
shnizzedy Jul 18, 2024
7b04cc8
:white_check_mark: Unlink symlink instead of rmtree
shnizzedy Jul 18, 2024
3b00764
:twisted_rightwards_arrows: Merge 'origin/develop' into 'engine/(th)r…
shnizzedy Jul 18, 2024
75e38e0
:memo: Update CHANGELOG re: #2131
shnizzedy Jul 18, 2024
f5dd824
:art: Standardize docstring format across changes.
shnizzedy Jul 18, 2024
6c96667
:bug: Fix conflicting class name
shnizzedy Jul 18, 2024
763af4e
:art: More docstring updates for ResourcePool refactor
shnizzedy Jul 18, 2024
61eeef4
:bug: Fix memoization
shnizzedy Jul 19, 2024
2402a2c
:pencil2: A little more docstring cleanup
shnizzedy Jul 19, 2024
8d80941
:art: Qualify refs to documented functions.
shnizzedy Jul 19, 2024
836d100
:art: Remove duplicate imports
shnizzedy Jul 19, 2024
0a53108
:goal_net: Catch and release no-regressors
shnizzedy Jul 19, 2024
8 changes: 5 additions & 3 deletions .circleci/main.yml
@@ -68,7 +68,9 @@ commands:
steps:
- run:
name: Getting Sample BIDS Data
command: git clone https://github.com/bids-standard/bids-examples.git
command: |
mkdir -p /home/circleci/project/dev/circleci_data/.pytest_cache/d/bids-examples
git clone https://github.com/bids-standard/bids-examples.git /home/circleci/project/dev/circleci_data/.pytest_cache/d/bids-examples
get-singularity:
parameters:
version:
@@ -156,7 +158,7 @@ commands:
then
TAG=nightly
else
TAG="${CIRCLE_BRANCH//\//_}"
TAG=`echo ${CIRCLE_BRANCH} | sed 's/[^a-zA-Z0-9._]/-/g'`
fi
DOCKER_TAG="ghcr.io/${CIRCLE_PROJECT_USERNAME,,}/${CIRCLE_PROJECT_REPONAME,,}:${TAG,,}"
if [[ -n "<< parameters.variant >>" ]]
@@ -172,7 +174,7 @@ commands:
name: Testing Singularity installation
command: |
pip install -r dev/circleci_data/requirements.txt
coverage run -m pytest --junitxml=test-results/junit.xml --continue-on-collection-errors dev/circleci_data/test_install.py
coverage run -m pytest --capture=no --junitxml=test-results/junit.xml --continue-on-collection-errors dev/circleci_data/test_install.py

jobs:
combine-coverage:
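The CI change above (and the matching edits in the GitHub Actions workflows below) swaps the bash substitution, which only turned `/` into `_`, for a `sed` pass that replaces every character outside `[a-zA-Z0-9._]` with a hyphen, per the commit ":construction_worker: Replace non-`\w`-non-dot characters with hyphens". A rough Python sketch of the same rule, using a hypothetical branch name:

```python
import re


def sanitize_tag(branch: str) -> str:
    """Replace every character outside [a-zA-Z0-9._] with a hyphen (sketch of the sed rule)."""
    return re.sub(r"[^a-zA-Z0-9._]", "-", branch)


# The old substitution only rewrote slashes; the new rule also catches
# characters like parentheses that are invalid in Docker/OCI tags.
assert "feature/engine(rework)".replace("/", "_") == "feature_engine(rework)"
assert sanitize_tag("feature/engine(rework)") == "feature-engine-rework-"
```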
2 changes: 1 addition & 1 deletion .github/workflows/build_C-PAC.yml
@@ -42,7 +42,7 @@ jobs:
GITHUB_BRANCH=$(echo ${GITHUB_REF} | cut -d '/' -f 3-)
if [[ ! $GITHUB_BRANCH == 'main' ]] && [[ ! $GITHUB_BRANCH == 'develop' ]]
then
TAG=${GITHUB_BRANCH//\//_}
TAG=`echo ${GITHUB_BRANCH} | sed 's/[^a-zA-Z0-9._]/-/g'`
DOCKERFILE=.github/Dockerfiles/C-PAC.develop$VARIANT-$OS.Dockerfile
elif [[ $GITHUB_BRANCH == 'develop' ]]
then
2 changes: 1 addition & 1 deletion .github/workflows/regression_test_full.yml
@@ -13,7 +13,7 @@ jobs:
GITHUB_BRANCH=$(echo ${GITHUB_REF} | cut -d '/' -f 3-)
if [[ ! $GITHUB_BRANCH == 'main' ]] && [[ ! $GITHUB_BRANCH == 'develop' ]]
then
TAG=${GITHUB_BRANCH//\//_}
TAG=`echo ${GITHUB_BRANCH} | sed 's/[^a-zA-Z0-9._]/-/g'`
elif [[ $GITHUB_BRANCH == 'develop' ]]
then
TAG=nightly
2 changes: 1 addition & 1 deletion .github/workflows/regression_test_lite.yml
@@ -37,7 +37,7 @@ jobs:
run: |
if [[ ! $GITHUB_REF_NAME == 'main' ]] && [[ ! $GITHUB_REF_NAME == 'develop' ]]
then
TAG=${GITHUB_REF_NAME//\//_}
TAG=`echo ${GITHUB_REF_NAME} | sed 's/[^a-zA-Z0-9._]/-/g'`
elif [[ $GITHUB_REF_NAME == 'develop' ]]
then
TAG=nightly
6 changes: 3 additions & 3 deletions .github/workflows/smoke_test_participant.yml
@@ -68,7 +68,7 @@ jobs:
GITHUB_BRANCH=$(echo ${GITHUB_REF} | cut -d '/' -f 3-)
if [[ ! $GITHUB_BRANCH == 'main' ]] && [[ ! $GITHUB_BRANCH == 'develop' ]]
then
TAG=${GITHUB_BRANCH//\//_}
TAG=`echo ${GITHUB_BRANCH} | sed 's/[^a-zA-Z0-9._]/-/g'`
elif [[ $GITHUB_BRANCH == 'develop' ]]
then
TAG=nightly
@@ -133,7 +133,7 @@ jobs:
GITHUB_BRANCH=$(echo ${GITHUB_REF} | cut -d '/' -f 3-)
if [[ ! $GITHUB_BRANCH == 'main' ]] && [[ ! $GITHUB_BRANCH == 'develop' ]]
then
TAG=${GITHUB_BRANCH//\//_}
TAG=`echo ${GITHUB_BRANCH} | sed 's/[^a-zA-Z0-9._]/-/g'`
elif [[ $GITHUB_BRANCH == 'develop' ]]
then
TAG=nightly
@@ -192,7 +192,7 @@ jobs:
GITHUB_BRANCH=$(echo ${GITHUB_REF} | cut -d '/' -f 3-)
if [[ ! $GITHUB_BRANCH == 'main' ]] && [[ ! $GITHUB_BRANCH == 'develop' ]]
then
TAG=${GITHUB_BRANCH//\//_}
TAG=`echo ${GITHUB_BRANCH} | sed 's/[^a-zA-Z0-9._]/-/g'`
elif [[ $GITHUB_BRANCH == 'develop' ]]
then
TAG=nightly
1 change: 1 addition & 0 deletions .ruff.toml
@@ -13,6 +13,7 @@ external = ["T20"] # Don't autoremove 'noqa` comments for these rules
"CPAC/utils/sklearn.py" = ["RUF003"]
"CPAC/utils/tests/old_functions.py" = ["C", "D", "E", "EM", "PLW", "RET"]
"CPAC/utils/utils.py" = ["T201"] # until `repickle` is removed
"dev/circleci_data/conftest.py" = ["F401"]
"setup.py" = ["D1"]

[lint.flake8-import-conventions.extend-aliases]
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -23,6 +23,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
### Changed

- Moved `pygraphviz` from requirements to `graphviz` optional dependencies group.
- Split `ResourcePool` into three classes: `Resource`, `ResourcePool`, and `StratPool`.

### Fixed

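The CHANGELOG line above is the one-sentence summary of the refactor. A rough skeleton of the resulting layout, inferred from the commit messages and the call sites changed in this diff (the method names appear in the PR; the signatures here are illustrative, not verbatim):

```python
class Resource:
    """A single resource entry, now with `__repr__` and `__contains__`."""


class ResourcePool:
    """The full pool for a pipeline run; former standalone engine functions become methods."""

    def connect_pipeline(self, wf, cfg, pipeline_blocks): ...
    def ingress_func_metadata(self): ...
    def create_func_datasource(self, func_paths_dict, workflow_name): ...


class StratPool:
    """The pool for a single strategy, holding e.g. `get_data`, `append_name`, `filter_name`."""

    def get_data(self, resource): ...
```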
2 changes: 1 addition & 1 deletion CPAC/alff/alff.py
@@ -22,7 +22,7 @@

from CPAC.alff.utils import get_opt_string
from CPAC.pipeline import nipype_pipeline_engine as pe
from CPAC.pipeline.nodeblock import nodeblock
from CPAC.pipeline.engine.nodeblock import nodeblock
from CPAC.registration.registration import apply_transform
from CPAC.utils.interfaces import Function
from CPAC.utils.utils import check_prov_for_regtool
2 changes: 1 addition & 1 deletion CPAC/anat_preproc/anat_preproc.py
@@ -34,7 +34,7 @@
wb_command,
)
from CPAC.pipeline import nipype_pipeline_engine as pe
from CPAC.pipeline.nodeblock import nodeblock
from CPAC.pipeline.engine.nodeblock import nodeblock
from CPAC.utils.interfaces import Function
from CPAC.utils.interfaces.fsl import Merge as fslMerge

32 changes: 32 additions & 0 deletions CPAC/conftest.py
@@ -0,0 +1,32 @@
# Copyright (C) 2024 C-PAC Developers

# This file is part of C-PAC.

# C-PAC is free software: you can redistribute it and/or modify it under
# the terms of the GNU Lesser General Public License as published by the
# Free Software Foundation, either version 3 of the License, or (at your
# option) any later version.

# C-PAC is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public
# License for more details.

# You should have received a copy of the GNU Lesser General Public
# License along with C-PAC. If not, see <https://www.gnu.org/licenses/>.
"""Global pytest configuration."""

from pathlib import Path

import pytest


@pytest.fixture
def bids_examples(cache: pytest.Cache) -> Path:
"""Get cached example BIDS directories."""
bids_dir = cache.mkdir("bids-examples").absolute()
if not (bids_dir.exists() and list(bids_dir.iterdir())):
from git import Repo

Repo.clone_from("https://github.com/bids-standard/bids-examples.git", bids_dir)
return bids_dir
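The new `bids_examples` fixture clones `bids-examples` into pytest's cache directory the first time it runs and reuses the cached copy afterwards (the empty-directory check guards against a half-finished clone). A hypothetical test consuming it might look like this; the test and dataset names are illustrative, not part of the PR:

```python
from pathlib import Path


def test_bids_examples_available(bids_examples: Path) -> None:
    """pytest injects the cached clone; `ds001` is one of the example datasets."""
    assert bids_examples.is_dir()
    assert (bids_examples / "ds001").is_dir()
```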
13 changes: 2 additions & 11 deletions CPAC/distortion_correction/distortion_correction.py
@@ -32,7 +32,7 @@
run_fsl_topup,
)
from CPAC.pipeline import nipype_pipeline_engine as pe
from CPAC.pipeline.nodeblock import nodeblock
from CPAC.pipeline.engine.nodeblock import nodeblock
from CPAC.utils import function
from CPAC.utils.datasource import match_epi_fmaps
from CPAC.utils.interfaces.function import Function
@@ -438,11 +438,6 @@ def distcor_blip_afni_qwarp(wf, cfg, strat_pool, pipe_num, opt=None):
node, out = strat_pool.get_data("pe-direction")
wf.connect(node, out, match_epi_fmaps_node, "bold_pedir")

# interface = {'bold': (match_epi_fmaps_node, 'opposite_pe_epi'),
# 'desc-brain_bold': 'opposite_pe_epi_brain'}
# wf, strat_pool = wrap_block([bold_mask_afni, bold_masking],
# interface, wf, cfg, strat_pool, pipe_num, opt)

func_get_brain_mask = pe.Node(
interface=preprocess.Automask(), name=f"afni_mask_opposite_pe_{pipe_num}"
)
@@ -530,10 +525,6 @@ def distcor_blip_afni_qwarp(wf, cfg, strat_pool, pipe_num, opt=None):
wf.connect(node, out, undistort_func_mean, "reference_image")
wf.connect(convert_afni_warp, "ants_warp", undistort_func_mean, "transforms")

# interface = {'desc-preproc_bold': (undistort_func_mean, 'output_image')}
# wf, strat_pool = wrap_block([bold_mask_afni],
# interface, wf, cfg, strat_pool, pipe_num, opt)

remask = pe.Node(
interface=preprocess.Automask(), name=f"afni_remask_boldmask_{pipe_num}"
)
@@ -764,7 +755,7 @@ def distcor_blip_fsl_topup(wf, cfg, strat_pool, pipe_num, opt=None):
wf.connect(run_topup, "out_jacs", vnum_base, "jac_matrix_list")
wf.connect(run_topup, "out_warps", vnum_base, "warp_field_list")

mean_bold = strat_pool.node_data("sbref")
mean_bold = strat_pool.get_data("sbref")

flirt = pe.Node(interface=fsl.FLIRT(), name="flirt")
flirt.inputs.dof = 6
23 changes: 16 additions & 7 deletions CPAC/func_preproc/func_ingress.py
@@ -14,12 +14,21 @@

# You should have received a copy of the GNU Lesser General Public
# License along with C-PAC. If not, see <https://www.gnu.org/licenses/>.
from CPAC.utils.datasource import create_func_datasource, ingress_func_metadata
"""Ingress functional data for preprocessing."""

from CPAC.utils.strategy import Strategy


def connect_func_ingress(
workflow, strat_list, c, sub_dict, subject_id, input_creds_path, unique_id=None
workflow,
strat_list: list[Strategy],
c,
sub_dict,
subject_id,
input_creds_path,
unique_id=None,
):
"""Connect functional ingress workflow."""
for num_strat, strat in enumerate(strat_list):
if "func" in sub_dict:
func_paths_dict = sub_dict["func"]
@@ -31,7 +40,9 @@ def connect_func_ingress(
else:
workflow_name = f"func_gather_{unique_id}_{num_strat}"

func_wf = create_func_datasource(func_paths_dict, workflow_name)
func_wf = strat._resource_pool.create_func_datasource(
func_paths_dict, workflow_name
)

func_wf.inputs.inputnode.set(
subject=subject_id,
@@ -47,8 +58,6 @@ def connect_func_ingress(
}
)

(workflow, strat.rpool, diff, blip, fmap_rp_list) = ingress_func_metadata(
workflow, c, strat.rpool, sub_dict, subject_id, input_creds_path, unique_id
)
diff, blip, fmap_rp_list = strat.rpool.ingress_func_metadata()

return (workflow, diff, blip, fmap_rp_list)
return strat.rpool.wf, diff, blip, fmap_rp_list
4 changes: 2 additions & 2 deletions CPAC/func_preproc/func_motion.py
@@ -31,7 +31,7 @@
motion_power_statistics,
)
from CPAC.pipeline import nipype_pipeline_engine as pe
from CPAC.pipeline.nodeblock import nodeblock
from CPAC.pipeline.engine.nodeblock import nodeblock
from CPAC.pipeline.schema import valid_options
from CPAC.utils.interfaces.function import Function
from CPAC.utils.utils import check_prov_for_motion_tool
@@ -830,7 +830,7 @@ def motion_estimate_filter(wf, cfg, strat_pool, pipe_num, opt=None):
notch.inputs.lowpass_cutoff = opt.get("lowpass_cutoff")
notch.inputs.filter_order = opt.get("filter_order")

movement_parameters = strat_pool.node_data("desc-movementParameters_motion")
movement_parameters = strat_pool.get_data("desc-movementParameters_motion")
wf.connect(
movement_parameters.node, movement_parameters.out, notch, "motion_params"
)
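Here, as in distortion_correction.py above, the removed `node_data` method gives way to `get_data`. The call sites in this PR unpack its return value as a tuple in one place (`node, out = strat_pool.get_data("pe-direction")`) and read `.node`/`.out` attributes in another, which suggests a NamedTuple-style value; the class below is a minimal illustration of that inferred shape, not C-PAC's actual definition:

```python
from typing import NamedTuple


class ResourceDataSketch(NamedTuple):
    """Hypothetical stand-in for whatever `get_data` returns."""

    node: object  # nipype node providing the resource
    out: str      # name of that node's output field


data = ResourceDataSketch(node="some_nipype_node", out="outputspec.out_file")
node, out = data  # tuple unpacking, as in distortion_correction.py
assert (data.node, data.out) == (node, out)  # attribute access, as in func_motion.py
```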
4 changes: 2 additions & 2 deletions CPAC/func_preproc/func_preproc.py
@@ -22,7 +22,7 @@

from CPAC.func_preproc.utils import nullify
from CPAC.pipeline import nipype_pipeline_engine as pe
from CPAC.pipeline.nodeblock import nodeblock
from CPAC.pipeline.engine.nodeblock import nodeblock
from CPAC.utils.interfaces import Function
from CPAC.utils.interfaces.ants import (
AI, # niworkflows
@@ -993,7 +993,7 @@ def bold_mask_fsl_afni(wf, cfg, strat_pool, pipe_num, opt=None):
# and this function has been changed.

# CHANGES:
# * Converted from a plain function to a CPAC.pipeline.nodeblock.NodeBlockFunction
# * Converted from a plain function to a CPAC.pipeline.engine.nodeblock.NodeBlockFunction
# * Removed Registration version check
# * Hardcoded Registration parameters instead of loading epi_atlasbased_brainmask.json
# * Uses C-PAC's ``FSL-AFNI-brain-probseg`` template in place of ``templateflow.api.get("MNI152NLin2009cAsym", resolution=1, label="brain", suffix="probseg")``
5 changes: 2 additions & 3 deletions CPAC/func_preproc/tests/test_preproc_connections.py
@@ -36,7 +36,6 @@
)
from CPAC.func_preproc.func_preproc import func_normalize
from CPAC.nuisance.nuisance import choose_nuisance_blocks
from CPAC.pipeline.cpac_pipeline import connect_pipeline
from CPAC.pipeline.engine import ResourcePool
from CPAC.pipeline.nipype_pipeline_engine import Workflow
from CPAC.registration.registration import (
@@ -81,7 +80,7 @@
"from-template_to-T1w_mode-image_desc-linear_xfm",
]

NUM_TESTS = 48 # number of parameterizations to run for many-parameter tests
NUM_TESTS = 8 # number of parameterizations to run for many-parameter tests


def _filter_assertion_message(
@@ -268,7 +267,7 @@ def test_motion_filter_connections(
if not rpool.check_rpool("desc-cleaned_bold"):
pipeline_blocks += choose_nuisance_blocks(c, generate_only)
wf = Workflow(re.sub(r"[\[\]\-\:\_ \'\",]", "", str(rpool)))
connect_pipeline(wf, c, rpool, pipeline_blocks)
rpool.connect_pipeline(wf, c, pipeline_blocks)
# Check that filtering is happening as expected
filter_switch_key = [
"functional_preproc",