Snakemake workflow for building an ANTS template (from antsMultivariateTemplateConstruction2.sh)
This is the template for a new Snakemake workflow. Replace this text with a comprehensive description covering the purpose and domain.
Insert your code into the respective folders, i.e. scripts, rules, and envs. Define the entry point of the workflow in the Snakefile and the main configuration in the config.yaml file.
Authors
- Ali Khan (@akhanf)
Usage
If you use this workflow in a paper, don't forget to give credits to the authors by citing the URL of this (original) repository and, if available, its DOI (see above).
Step 1: Obtain a copy of this workflow
- Create a new github repository using this workflow as a template.
- Clone the newly created repository to your local system, into the place where you want to perform the data analysis.
Step 2: Configure workflow
Configure the workflow according to your needs by editing the files in the config/ folder. Adjust config.yaml to configure the workflow execution, and samples.tsv to specify your sample setup.
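For orientation, the keys referenced by the rules and scripts shown further below include template_name, resolution_index, max_iters, and template_description_extras. A minimal config.yaml sketch with placeholder values follows; the values are illustrative only, and your copy will also need the workflow-specific keys (e.g. input images/channels) that are not shown here:

    template_name: MyTemplate        # used for the tpl-{name} output naming
    resolution_index: 1              # resolution index recorded in the template description and filenames
    max_iters: 3                     # number of template-building iterations
    template_description_extras:     # extra fields merged into the template description JSON
      Name: MyTemplate study template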
Step 3: Install Snakemake
Install Snakemake using conda:
conda create -c bioconda -c conda-forge -n snakemake snakemake
For installation details, see the instructions in the Snakemake documentation.
Step 4: Execute workflow
Activate the conda environment:
conda activate snakemake
Test your configuration by performing a dry-run via
snakemake --use-conda -n
Execute the workflow locally via
snakemake --use-conda --cores $N
using $N cores, or run it in a cluster environment via
snakemake --use-conda --cluster qsub --jobs 100
or
snakemake --use-conda --drmaa --jobs 100
If you want to fix not only the software stack but also the underlying OS, use
snakemake --use-conda --use-singularity
in combination with any of the modes above. See the Snakemake documentation for further details.
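For example, to run on a cluster while also pinning the underlying OS, the flags above can simply be combined (the scheduler command and job count are placeholders for your own environment):

    snakemake --use-conda --use-singularity --cluster qsub --jobs 100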
Step 5: Investigate results
After successful execution, you can create a self-contained interactive HTML report with all results via:
snakemake --report report.html
This report can, e.g., be forwarded to your collaborators. An example (using some trivial test data) can be seen here.
Step 6: Commit changes
Whenever you change something, don't forget to commit the changes back to your github copy of the repository:
git commit -a
git push
Step 7: Obtain updates from upstream
Whenever you want to synchronize your workflow copy with new developments from upstream, do the following.
- Once, register the upstream repository in your local copy:
git remote add -f upstream git@github.com:snakemake-workflows/ants_build_template_smk.git
or
git remote add -f upstream https://github.com/snakemake-workflows/ants_build_template_smk.git
if you have not set up ssh keys.
- Update the upstream version:
git fetch upstream
- Create a diff with the current version:
git diff HEAD upstream/master workflow > upstream-changes.diff
- Investigate the changes:
vim upstream-changes.diff
- Apply the modified diff via:
git apply upstream-changes.diff
- Carefully check whether you need to update the config files:
git diff HEAD upstream/master config
If so, do it manually, and only where necessary, since you would otherwise likely overwrite your settings and samples.
Step 8: Contribute back
In case you have also changed or added steps, please consider contributing them back to the original repository:
- Fork the original repo to a personal or lab account.
- Clone the fork to your local system, to a different place than where you ran your analysis.
- Copy the modified files from your analysis to the clone of your fork, e.g.,
cp -r workflow path/to/fork
Make sure not to accidentally copy config file contents or sample sheets. Instead, manually update the example config files if necessary.
- Commit and push your changes to your fork.
- Create a pull request against the original repository.
Testing
Test cases are in the subfolder .test. They are automatically executed via continuous integration with Github Actions.
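To try the workflow on that test data locally, a dry-run along the following lines should work (the exact invocation used by the CI may differ):

    snakemake -n --use-conda --snakefile workflow/Snakefile --directory .test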
Code Snippets
import nibabel as nib
import numpy as np

img = nib.load(snakemake.input[0])

print('config:')
print(snakemake.config)

hdr = img.header
shape = np.array(hdr.get_data_shape()).astype('float').tolist()
zooms = np.array(hdr.get_zooms()).astype('float').tolist()

affine = img.affine
origin = affine @ np.array([0, 0, 0, 1]).T
origin = origin[0:3].astype('float').tolist()

template_dict = dict()

#add extras from config file
template_dict.update(snakemake.config['template_description_extras'])

#add shape, zooms, origin, for the resolution
template_dict.update(
    {
        'res': {
            '{res:02d}'.format(res=snakemake.config['resolution_index']): {
                'origin': origin,
                'shape': shape,
                'zooms': zooms
            }
        }
    }
)

import json
with open(snakemake.output[0], 'w') as outfile:
    json.dump(template_dict, outfile, indent=2)
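For reference, this script nests the image geometry under a res key indexed by the zero-padded resolution_index, alongside whatever is passed in via template_description_extras. With made-up extras and geometry values, the resulting JSON might look like:

    {
      "Name": "MyTemplate study template",
      "res": {
        "01": {
          "origin": [-96.0, -132.0, -78.0],
          "shape": [193.0, 229.0, 193.0],
          "zooms": [1.0, 1.0, 1.0]
        }
      }
    }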
shell:
    'AverageImages {params.dim} {output} {params.use_n4} {input} &> {log}'
shell:
    'cp -v {input} {output} &> {log}'
shell:
    '{params.cmd} &> {log}'
shell:
    'ITK_GLOBAL_DEFAULT_NUMBER_OF_THREADS={threads} '
    'antsRegistration {params.base_opts} {params.intensity_opts} '
    '{params.init_translation} '  #initial translation
    '-t Rigid[0.1] {params.linear_metric} {params.linear_multires} '  #rigid registration
    '-t Affine[0.1] {params.linear_metric} {params.linear_multires} '  #affine registration
    '{params.deform_model} {params.deform_metric} {params.deform_multires} '  #deformable registration
    '-o {params.out_prefix} &> {log}'
shell:
    'ITK_GLOBAL_DEFAULT_NUMBER_OF_THREADS={threads} '
    'antsApplyTransforms {params.base_opts} -i {input.target} -o {output.warped} -r {input.template} -t {input.warp} -t {input.affine} &> {log}'
shell:
    'AverageImages {params.dim} {output} {params.use_n4} {input} &> {log}'
shell:
    'AverageImages {params.dim} {output} {params.use_n4} {input} &> {log}'
shell:
    'MultiplyImages {params.dim} {input} {params.gradient_step} {output} &> {log}'
shell:
    'AverageAffineTransformNoRigid {params.dim} {output} {input} &> {log}'
shell:
    'antsApplyTransforms {params.dim} -e vector -i {input.invwarp} -o {output} -t [{input.affine},1] -r {input.ref} --verbose 1 &> {log}'
shell:
    'antsApplyTransforms {params.dim} --float 1 --verbose 1 -i {input.template} -o {output.template} -t [{input.affine},1] '
    ' -t {input.invwarp} -t {input.invwarp} -t {input.invwarp} -t {input.invwarp} -r {input.template} &> {log}'  #apply warp 4 times

rule cp_nii_to_templateflow_naming:
    input:
        template = 'results/iter_{iteration}/template_{{channel}}.nii.gz'.format(iteration=config['max_iters'])
    output:
        template = 'results/tpl-{name}/tpl-{name}_res-{res}_{{channel}}.nii.gz'.format(name=config['template_name'], res=config['resolution_index'])
    shell:
        'cp {input} {output}'
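To make the renaming concrete: with hypothetical config values template_name: MyTemplate, resolution_index: 1, and max_iters: 3, the final template for a T1w channel (the channel name is just an example; channels come from your own configuration) would effectively be copied as:

    cp results/iter_3/template_T1w.nii.gz results/tpl-MyTemplate/tpl-MyTemplate_res-1_T1w.nii.gz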
script: 'scripts/create_template_description_json.py'