+ dssource=ria+file:///data/project/QC_workflow/TMP/RIA_QCworkflow/inputstore#aae8905a-985f-46fb-91f5-35c772654ddd
+ pushgitremote=file:///data/project/QC_workflow/TMP/RIA_QCworkflow/aae/8905a-985f-46fb-91f5-35c772654ddd
+ subid=sub-cIVs025
+ export 'DUCT_OUTPUT_PREFIX=logs/duct/sub-cIVs025_{datetime_filesafe}-{pid}_'
+ DUCT_OUTPUT_PREFIX='logs/duct/sub-cIVs025_{datetime_filesafe}-{pid}_'
+ datalad clone ria+file:///data/project/QC_workflow/TMP/RIA_QCworkflow/inputstore#aae8905a-985f-46fb-91f5-35c772654ddd ds
[INFO] Attempting a clone into /var/lib/condor/execute/dir_3950045/ds
[INFO] Attempting to clone from file:///data/project/QC_workflow/TMP/RIA_QCworkflow/inputstore/aae/8905a-985f-46fb-91f5-35c772654ddd to /var/lib/condor/execute/dir_3950045/ds
[INFO] Completed clone attempts for Dataset(/var/lib/condor/execute/dir_3950045/ds)
+ cd ds
+ git remote add outputstore file:///data/project/QC_workflow/TMP/RIA_QCworkflow/aae/8905a-985f-46fb-91f5-35c772654ddd
+ git checkout -b job_sub-cIVs025_10066455
Switched to a new branch 'job_sub-cIVs025_10066455'
+ datalad get -n sourcedata/raw/
[INFO] Attempting a clone into /var/lib/condor/execute/dir_3950045/ds/sourcedata/raw
[INFO] Attempting to clone from file:///data/project/QC_workflow/TMP/RIA_QCworkflow/inputstore/9e5/e9b46-fb3d-48cf-b766-89923247370d to /var/lib/condor/execute/dir_3950045/ds/sourcedata/raw
[INFO] Attempting to clone from https://github.com/OpenNeuroDatasets/ds003416.git to /var/lib/condor/execute/dir_3950045/ds/sourcedata/raw
[INFO] Start enumerating objects
[INFO] Start counting objects
[INFO] Start compressing objects
[INFO] Start receiving objects
[INFO] Start resolving deltas
[INFO] Completed clone attempts for Dataset(/var/lib/condor/execute/dir_3950045/ds/sourcedata/raw)
[INFO] Remote origin not usable by git-annex; setting annex-ignore
[INFO] https://github.com/OpenNeuroDatasets/ds003416.git/config download failed: Not Found
[INFO] access to 1 dataset sibling s3-PRIVATE not auto-enabled, enable with:
| datalad siblings -d "/var/lib/condor/execute/dir_3950045/ds/sourcedata/raw" enable -s s3-PRIVATE
+ datalad containers-run -m 'Compute MRIQC for sub-cIVs025' -n bids-mriqc -i sourcedata/raw/sub-cIVs025 -i sourcedata/raw/dataset_description.json mriqc sourcedata/raw . participant --participant-label sub-cIVs025 --no-datalad-get --no-sub --verbose --nprocs 1 --mem 3000 --work-dir /tmp --float32 --verbose-reports
[INFO] Making sure inputs are available (this may take some time)
[INFO] Attempting a clone into /var/lib/condor/execute/dir_3950045/ds/code/containers
[INFO] Attempting to clone from file:///data/project/QC_workflow/TMP/RIA_QCworkflow/inputstore/b02/e63c2-62c1-11e9-82b0-52540040489c to /var/lib/condor/execute/dir_3950045/ds/code/containers
[INFO] Attempting to clone from https://github.com/ReproNim/containers.git to /var/lib/condor/execute/dir_3950045/ds/code/containers
[INFO] Start enumerating objects
[INFO] Start counting objects
[INFO] Start compressing objects
[INFO] Start receiving objects
[INFO] Start resolving deltas
[INFO] Completed clone attempts for Dataset(/var/lib/condor/execute/dir_3950045/ds/code/containers)
[INFO] Remote origin not usable by git-annex; setting annex-ignore
[INFO] https://github.com/ReproNim/containers.git/config download failed: Not Found
[INFO] == Command start (output follows) =====
2025-10-09T23:57:17+0200 [INFO ] con-duct: duct 0.16.0 is executing 'singularity exec -W /tmp/singtmp.lGjCDi -B /var/lib/condor/execute/dir_3950045/ds/code/containers/binds/zoneinfo/UTC:/etc/localtime -B /tmp/singtmp.lGjCDi/tmp:/tmp -B /tmp/singtmp.lGjCDi/var/tmp:/var/tmp -e -B /var/lib/condor/execute/dir_3950045/ds -H /var/lib/condor/execute/dir_3950045/ds/code/containers/binds/HOME --pwd /var/lib/condor/execute/dir_3950045/ds code/containers/images/bids/bids-mriqc--24.0.2.sing mriqc sourcedata/raw . participant --participant-label sub-cIVs025 --no-datalad-get --no-sub --verbose --nprocs 1 --mem 3000 --work-dir /tmp --float32 --verbose-reports'...
2025-10-09T23:57:17+0200 [INFO ] con-duct: Log files will be written to logs/duct/sub-cIVs025_2025.10.09T23.57.17-4074447_
Fontconfig error: No writable cache directories
Fontconfig error: No writable cache directories
Fontconfig error: No writable cache directories
Fontconfig error: No writable cache directories
Fontconfig error: No writable cache directories
Fontconfig error: No writable cache directories
Fontconfig error: No writable cache directories
Fontconfig error: No writable cache directories
Fontconfig error: No writable cache directories
/opt/conda/lib/python3.11/site-packages/mriqc/interfaces/anatomical.py:490: RuntimeWarning: divide by zero encountered in divide
  bg_data[bg_data > 0] = bg_data[bg_data > 0] / bg_spread
/opt/conda/lib/python3.11/site-packages/mriqc/qc/anatomical.py:491: RuntimeWarning: divide by zero encountered in scalar divide
  data *= 100 / np.percentile(data, 99)
/opt/conda/lib/python3.11/site-packages/mriqc/qc/anatomical.py:491: RuntimeWarning: invalid value encountered in multiply
  data *= 100 / np.percentile(data, 99)
Traceback (most recent call last):
  File "/opt/conda/bin/mriqc", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/mriqc/cli/run.py", line 178, in main
    mriqc_wf.run(**_plugin)
  File "/opt/conda/lib/python3.11/site-packages/nipype/pipeline/engine/workflows.py", line 638, in run
    runner.run(execgraph, updatehash=updatehash, config=self.config)
  File "/opt/conda/lib/python3.11/site-packages/mriqc/engine/plugin.py", line 196, in run
    notrun.append(self._clean_queue(jobid, graph, result=result))
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/mriqc/engine/plugin.py", line 259, in _clean_queue
    raise RuntimeError(''.join(result['traceback']))
RuntimeError: Traceback (most recent call last):
  File "/opt/conda/lib/python3.11/site-packages/mriqc/engine/plugin.py", line 64, in run_node
    result['result'] = node.run(updatehash=updatehash)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/nipype/pipeline/engine/nodes.py", line 527, in run
    result = self._run_interface(execute=True)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/nipype/pipeline/engine/nodes.py", line 645, in _run_interface
    return self._run_command(execute)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/nipype/pipeline/engine/nodes.py", line 771, in _run_command
    raise NodeExecutionError(msg)
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node ComputeQI2.

Traceback:
Traceback (most recent call last):
  File "/opt/conda/lib/python3.11/site-packages/nipype/interfaces/base/core.py", line 397, in run
    runtime = self._run_interface(runtime)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/mriqc/interfaces/anatomical.py", line 378, in _run_interface
    qi2, out_file = art_qi2(imdata, airdata)
                    ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/mriqc/qc/anatomical.py", line 497, in art_qi2
    kde_skl = KernelDensity(kernel='gaussian', bandwidth=4.0).fit(modelx[:, np.newaxis])
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/sklearn/base.py", line 1351, in wrapper
    return fit_method(estimator, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/sklearn/neighbors/_kde.py", line 226, in fit
    X = self._validate_data(X, order="C", dtype=np.float64)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/sklearn/base.py", line 633, in _validate_data
    out = check_array(X, input_name="X", **check_params)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/sklearn/utils/validation.py", line 1003, in check_array
    _assert_all_finite(
  File "/opt/conda/lib/python3.11/site-packages/sklearn/utils/validation.py", line 126, in _assert_all_finite
    _assert_all_finite_element_wise(
  File "/opt/conda/lib/python3.11/site-packages/sklearn/utils/validation.py", line 175, in _assert_all_finite_element_wise
    raise ValueError(msg_err)
ValueError: Input X contains NaN.
KernelDensity does not accept missing values encoded as NaN natively. For supervised learning, you might want to consider sklearn.ensemble.HistGradientBoostingClassifier and Regressor which accept missing values encoded as NaNs natively. Alternatively, it is possible to preprocess the data, for instance by using an imputer transformer in a pipeline or drop samples with missing values. See https://scikit-learn.org/stable/modules/impute.html You can find a list of all estimators that handle NaN values at the following page: https://scikit-learn.org/stable/modules/impute.html#estimators-that-handle-nan-values

2025-10-10T00:55:08+0200 [INFO ] con-duct: Summary:
Exit Code: 1
Command: singularity exec -W /tmp/singtmp.lGjCDi -B /var/lib/condor/execute/dir_3950045/ds/code/containers/binds/zoneinfo/UTC:/etc/localtime -B /tmp/singtmp.lGjCDi/tmp:/tmp -B /tmp/singtmp.lGjCDi/var/tmp:/var/tmp -e -B /var/lib/condor/execute/dir_3950045/ds -H /var/lib/condor/execute/dir_3950045/ds/code/containers/binds/HOME --pwd /var/lib/condor/execute/dir_3950045/ds code/containers/images/bids/bids-mriqc--24.0.2.sing mriqc sourcedata/raw . participant --participant-label sub-cIVs025 --no-datalad-get --no-sub --verbose --nprocs 1 --mem 3000 --work-dir /tmp --float32 --verbose-reports
Log files location: logs/duct/sub-cIVs025_2025.10.09T23.57.17-4074447_
Wall Clock Time: 3470.880 sec
Memory Peak Usage (RSS): 13.5 GB
Memory Average Usage (RSS): 1.6 GB
Virtual Memory Peak Usage (VSZ): 19.1 GB
Virtual Memory Average Usage (VSZ): 4.8 GB
Memory Peak Percentage: 2.30%
Memory Average Percentage: 0.11%
CPU Peak Usage: 308.40%
Average CPU Usage: 90.67%

[INFO] == Command exit (modification check follows) =====