├── README.md ├── fMRI_1st_Level.ipynb ├── fMRI_2nd_Level.ipynb ├── fMRI_preprocessing.ipynb ├── graph_1st_level.png ├── graph_2nd_level.png └── graph_preprocessing.png /README.md: -------------------------------------------------------------------------------- 1 | # BIDS_fMRI_analysis_nipype 2 | ### Analysis pipelines for fMRI (and structural) images in BIDS format 3 | 4 | This repository contains a series of processing pipelines, in the form of Jupyter notebooks, for fMRI (and structural) datasets in BIDS format. 5 | The pipelines are implemented in [nipype](https://nipype.readthedocs.io/en/latest/) and are adapted from [Michael Notter's tutorial](https://miykael.github.io/nipype_tutorial/). 6 | The scripts were developed to analyze specific fMRI data collected during a Motor Imagery Neurofeedback task; however, they can easily be extended to other datasets in BIDS format. 7 | To use these notebooks and adapt them to your own analysis, you need to have the dependencies required by Nipype installed. 8 | 9 | The pipeline is organized in three modules: 10 | 11 | ### 1. fMRI_preprocessing.ipynb 12 | Preprocessing pipeline for fMRI and structural scans, including: 13 | 1. Motion Correction 14 | 2. Slice Timing Correction 15 | 3. Outlier Detection 16 | 4. Coregistration with the Structural Scan 17 | 18 | 19 | ### 2. fMRI_1st_Level.ipynb 20 | Individual GLM analysis: model definition and estimation. Normalization of the obtained contrasts to the MNI template. 21 | 22 | ### 3. fMRI_2nd_Level.ipynb 23 | Group GLM analysis (one-sample T-test model) and examples of visualization. 
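As a quick illustration of module 1, the preprocessing notebook derives the slice acquisition order from the `SliceTiming` field of the BIDS sidecar. The sketch below uses the same one-liner as the notebook; the timing values are illustrative numbers for a 4-slice interleaved acquisition, not values from the actual dataset:

```python
# Rank slices by acquisition time to recover the acquisition order,
# as done in fMRI_preprocessing.ipynb from the BIDS "SliceTiming" metadata.
# The timings below are made-up example values (4 slices, interleaved).
slice_timing = [0.0, 1.0, 0.5, 1.5]  # seconds, one entry per slice

# slice_order[k] = index of the slice acquired k-th
slice_order = [slice_timing.index(t) for t in sorted(slice_timing)]
print(slice_order)  # → [0, 2, 1, 3]: slice 0 first, then 2, then 1, then 3
```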
24 | -------------------------------------------------------------------------------- /fMRI_preprocessing.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "### fMRI PREPROCESSING \n", 8 | "\n", 9 | "This pipeline is adapted from the example_preprocessing module in the nipype tutorial https://miykael.github.io/nipype_tutorial/notebooks/example_preprocessing.html\n", 10 | "\n", 11 | "**steps implemented**\n", 12 | "\n", 13 | "1. slice time correction (fsl)\n", 14 | "2. motion correction (fsl)\n", 15 | "3. artifact detection to identify outliers (nipype rapidart)\n", 16 | "4. segmentation (fsl FAST)\n", 17 | "5. coregistration with anatomical (fsl)\n", 18 | "6. smoothing (fsl)\n", 19 | "\n" 20 | ] 21 | }, 22 | { 23 | "cell_type": "markdown", 24 | "metadata": {}, 25 | "source": [ 26 | "## import modules" 27 | ] 28 | }, 29 | { 30 | "cell_type": "code", 31 | "execution_count": null, 32 | "metadata": {}, 33 | "outputs": [], 34 | "source": [ 35 | "import os\n", 36 | "import json\n", 37 | "import numpy as np\n", 38 | "import pylab as plt\n", 39 | "import nibabel as nb\n", 40 | "from nipype import Node,Workflow\n", 41 | "from nipype.interfaces.io import SelectFiles, DataSink\n", 42 | "from nipype.interfaces.utility import IdentityInterface\n", 43 | "from nilearn.plotting import plot_anat\n", 44 | "from os.path import join as opj\n", 45 | "from nipype.interfaces.matlab import MatlabCommand\n", 46 | "from nipype.interfaces.spm import Normalize12, NewSegment\n", 47 | "from nipype.algorithms.misc import Gunzip\n", 48 | "from bids.layout import BIDSLayout\n", 49 | "from nipype.algorithms.rapidart import ArtifactDetect\n", 50 | "from nipype.interfaces.fsl import (BET,ExtractROI,FAST,FLIRT,MCFLIRT,ImageMaths,SliceTimer,Threshold,Smooth)\n", 51 | "\n", 52 | "\n", 53 | "#tpm_img ='/home/ubuntu/Documents/MATLAB/spm12/tpm/TPM.nii'\n", 54 | "#\n", 55 | "#tissue1 = 
((tpm_img, 1), 1, (True,False), (False, False))\n", 56 | "#tissue2 = ((tpm_img, 2), 1, (True,False), (False, False))\n", 57 | "#tissue3 = ((tpm_img, 3), 2, (True,False), (False, False))\n", 58 | "#tissue4 = ((tpm_img, 4), 3, (False,False), (False, False))\n", 59 | "#tissue5 = ((tpm_img, 5), 4, (False,False), (False, False))\n", 60 | "#tissue6 = ((tpm_img, 6), 2, (False,False), (False, False))\n", 61 | "#\n", 62 | "#tissues = [tissue1, tissue2, tissue3, tissue4, tissue5, tissue6]\n" 63 | ] 64 | }, 65 | { 66 | "cell_type": "markdown", 67 | "metadata": {}, 68 | "source": [ 69 | "## experiment parameters" 70 | ] 71 | }, 72 | { 73 | "cell_type": "code", 74 | "execution_count": null, 75 | "metadata": {}, 76 | "outputs": [], 77 | "source": [ 78 | "data_dir='/home/ubuntu/Documents/BIDs/XP2'\n", 79 | "experiment_dir='/home/ubuntu/Documents/windowshare/output/XP2'\n", 80 | "output_dir='datasink'\n", 81 | "working_dir='workingdir'\n", 82 | "\n", 83 | "subject_list_2d=['xp204','xp205','xp207','xp210','xp212','xp213','xp215','xp216','xp217','xp221','xp223']\n", 84 | "subject_list=['xp201','xp202','xp203','xp206','xp208','xp209','xp211','xp214','xp218','xp219','xp220','xp222']\n", 85 | "\n", 86 | "task_list=['1dNF']\n", 87 | "run_list=['01','02']\n", 88 | "fwhm_list=[6]\n", 89 | "\n", 90 | "# isotropic voxel resolution\n", 91 | "desired_voxel_iso = 4" 92 | ] 93 | }, 94 | { 95 | "cell_type": "markdown", 96 | "metadata": {}, 97 | "source": [ 98 | "### metadata info from BIDS layout" 99 | ] 100 | }, 101 | { 102 | "cell_type": "code", 103 | "execution_count": null, 104 | "metadata": {}, 105 | "outputs": [], 106 | "source": [ 107 | "#FIND INFO FROM BIDS LAYOUT\n", 108 | "func_file='/home/ubuntu/Documents/BIDs/XP2/sub-xp201/func/sub-xp201_task-1dNF_run-01_bold.nii.gz'\n", 109 | "layout_data=BIDSLayout(data_dir)\n", 110 | "sub_list=layout_data.get_subjects()\n", 111 | "layout_data.get_tasks()\n", 112 | "TR=layout_data.get_metadata(func_file)[\"RepetitionTime\"]\n", 113 | 
"sliceTiming=layout_data.get_metadata(func_file)[\"SliceTiming\"]\n", 114 | "slice_order=[sliceTiming.index(x) for x in sorted(sliceTiming)]\n", 115 | "\n", 116 | "print(\"TR: \"+str(TR))\n", 117 | "print(\"slice order: \"+str(slice_order))\n", 118 | "print(\"image dim: \"+ str(layout_data.get_metadata(func_file)[\"PhaseEncodingSteps\"])+ \" X \"+ str(layout_data.get_metadata(func_file)[\"PhaseEncodingSteps\"])+ \" X \" + str(len(sliceTiming)))" 119 | ] 120 | }, 121 | { 122 | "cell_type": "markdown", 123 | "metadata": {}, 124 | "source": [ 125 | "## workflow nodes" 126 | ] 127 | }, 128 | { 129 | "cell_type": "code", 130 | "execution_count": null, 131 | "metadata": {}, 132 | "outputs": [], 133 | "source": [ 134 | "#SLICE TIME CORRECTION\n", 135 | "slicetimer = Node(SliceTimer(index_dir=False,\n", 136 | " interleaved=True,\n", 137 | " output_type='NIFTI',\n", 138 | " time_repetition=TR),\n", 139 | " name=\"slicetimer\")\n", 140 | "\n", 141 | "#MOTION CORRECTION\n", 142 | "mcflirt=Node(MCFLIRT(mean_vol=True,\n", 143 | " save_plots=True,\n", 144 | " output_type='NIFTI'),\n", 145 | " name=\"mcflirt\")\n", 146 | "\n", 147 | "#ARTIFACT DETECTION\n", 148 | "art=Node(ArtifactDetect(norm_threshold=2,\n", 149 | " zintensity_threshold=2,\n", 150 | " mask_type='spm_global',\n", 151 | " parameter_source='FSL',\n", 152 | " use_differences=[True,False],\n", 153 | " plot_type='svg'), name='art')\n", 154 | "\n", 155 | "\n", 156 | "#COREGISTRATION FUNC-ANAT IMAGES\n", 157 | "coreg = Node(FLIRT(dof=6,\n", 158 | " cost='bbr',\n", 159 | " schedule=opj(os.getenv('FSLDIR'),'etc/flirtsch/bbr.sch'),\n", 160 | " output_type='NIFTI'),\n", 161 | " name=\"coreg\")\n", 162 | "\n", 163 | "#SMOOTH\n", 164 | "\n", 165 | "smooth = Node(Smooth(), name=\"smooth\")\n", 166 | "smooth.iterables = (\"fwhm\", fwhm_list)\n", 167 | "\n" 168 | ] 169 | }, 170 | { 171 | "cell_type": "markdown", 172 | "metadata": {}, 173 | "source": [ 174 | "### coregistration workflow" 175 | ] 176 | }, 177 | { 178 | "cell_type": "code", 179 | 
"execution_count": null, 180 | "metadata": {}, 181 | "outputs": [], 182 | "source": [ 183 | "# BET - skull-strip anatomical image\n", 184 | "bet_anat=Node(BET(frac=0.4,\n", 185 | " robust=True,\n", 186 | " output_type='NIFTI_GZ'),\n", 187 | " name=\"bet_anat\")\n", 188 | "#SEGMENTATION\n", 189 | "\n", 190 | "# with FAST \n", 191 | "segmentation=Node(FAST(output_type='NIFTI_GZ'),name=\"segmentation\")\n", 192 | "\n", 193 | "#with SPM\n", 194 | "#segmentation = Node(NewSegment(tissues=tissues), name='segmentation')\n", 195 | "\n", 196 | "\n", 197 | "# Select WM segmentation file from segmentation output\n", 198 | "# for fast\n", 199 | "def get_wm(files):\n", 200 | " return files[-1]\n", 201 | "\n", 202 | "#for spm\n", 203 | "#def get_wm(files):\n", 204 | " # return files[1][0]\n", 205 | "\n", 206 | "#Threshold white matter probability map \n", 207 | "threshold_WM = Node(Threshold(thresh=0.5,\n", 208 | " args='-bin',\n", 209 | " output_type='NIFTI'),\n", 210 | " name=\"threshold_WM\")\n", 211 | "\n", 212 | "#FLIRT - pre-alignment (6 DOF) of functional to anatomical image\n", 213 | "coreg_pre=Node(FLIRT(dof=6, output_type='NIFTI_GZ'),name=\"coreg_pre\")\n", 214 | "\n", 215 | "#FLIRT - coregistration of functional to anatomical with BBR\n", 216 | "coreg_bbr=Node(FLIRT(dof=6,cost='bbr',\n", 217 | " schedule=opj(os.getenv('FSLDIR'),'etc/flirtsch/bbr.sch'),\n", 218 | " output_type='NIFTI_GZ'), name='coreg_bbr')\n", 219 | "\n", 220 | "#APPLY coregistration warp to functional image\n", 221 | "applywarp = Node(FLIRT(interp='spline',\n", 222 | " apply_isoxfm=desired_voxel_iso,\n", 223 | " output_type='NIFTI'),\n", 224 | " name=\"applywarp\")\n", 225 | "\n", 226 | "#apply coregistration warp to mean file\n", 227 | "applywarp_mean=Node(FLIRT(interp='spline',\n", 228 | " apply_isoxfm=desired_voxel_iso,\n", 229 | " output_type='NIFTI_GZ'),\n", 230 | " name=\"applywarp_mean\")\n", 231 | "\n" 232 | ] 233 | }, 234 | { 235 | "cell_type": "markdown", 236 | "metadata": {}, 237 | "source": 
[ 238 | "### create a coregistration workflow and connect nodes" 239 | ] 240 | }, 241 | { 242 | "cell_type": "code", 243 | "execution_count": null, 244 | "metadata": {}, 245 | "outputs": [], 246 | "source": [ 247 | "coregwf=Workflow(name='coregwf')\n", 248 | "coregwf.base_dir=opj(experiment_dir,working_dir)\n", 249 | "\n", 250 | "\n", 251 | "coregwf.connect([\n", 252 | " #(bet_anat,segmentation,[('out_file','channel_files')]),\n", 253 | " (bet_anat,segmentation,[('out_file','in_files')]),\n", 254 | " (segmentation, threshold_WM, [(('partial_volume_files', get_wm),\n", 255 | " 'in_file')]),\n", 256 | " # (segmentation, threshold_WM, [(('native_class_images', get_wm),\n", 257 | " # 'in_file')]),\n", 258 | " (bet_anat,coreg_pre,[('out_file','reference')]),\n", 259 | " (threshold_WM,coreg_bbr,[('out_file','wm_seg')]),\n", 260 | " (coreg_pre,coreg_bbr,[('out_matrix_file','in_matrix_file')]),\n", 261 | " (coreg_bbr,applywarp,[('out_matrix_file','in_matrix_file')]),\n", 262 | " (bet_anat,applywarp,[('out_file','reference')]),\n", 263 | " (coreg_bbr,applywarp_mean,[('out_matrix_file','in_matrix_file')]),\n", 264 | " (bet_anat,applywarp_mean,[('out_file','reference')]),\n", 265 | " ])" 266 | ] 267 | }, 268 | { 269 | "cell_type": "markdown", 270 | "metadata": {}, 271 | "source": [ 272 | "## data input and output" 273 | ] 274 | }, 275 | { 276 | "cell_type": "code", 277 | "execution_count": null, 278 | "metadata": {}, 279 | "outputs": [], 280 | "source": [ 281 | "# Infosource - iterate over the subject, task and run lists\n", 282 | "# the field is named 'zrun_id' so that run ends up last in the output folder tree (IdentityInterface orders iterables alphabetically)\n", 283 | "infosource=Node(IdentityInterface(fields=['subject_id','task_id','zrun_id']), \n", 284 | " name=\"infosource\")\n", 285 | "infosource.iterables=[('subject_id',subject_list),\n", 286 | " ('task_id',task_list),\n", 287 | " ('zrun_id',run_list)]\n", 288 | "\n", 289 | "# String templates with {}-based placeholders\n", 290 | "templates = 
{'anat': 'sub-{subject_id}/anat/'\n", 291 | " 'sub-{subject_id}_T1w.nii.gz',\n", 292 | " 'func': 'sub-{subject_id}/func/'\n", 293 | " 'sub-{subject_id}_task-{task_id}_run-{zrun_id}_bold.nii.gz'}\n", 294 | "\n", 295 | "# Create SelectFiles node\n", 296 | "selectfiles = Node(SelectFiles(templates,\n", 297 | " base_directory=data_dir,\n", 298 | " sort_filelist=True),\n", 299 | " name='selectfiles')\n", 300 | "\n", 301 | "\n", 302 | "# DataSink- creates output folder for important outputs\n", 303 | "datasink=Node(DataSink(base_directory=experiment_dir,\n", 304 | " container=output_dir), name=\"datasink\")\n", 305 | "\n", 306 | "substitutions=[('_task_id_','/task-'),\n", 307 | " ('_subject_id_','sub-'),\n", 308 | " ('_zrun_id_','/run-'),\n", 309 | " ('_fwhm_','fwhm-'),\n", 310 | " ('_roi',''),\n", 311 | " ('_mcf',''),\n", 312 | " ('_st',''),\n", 313 | " ('_flirt',''),\n", 314 | " ('_smooth',''),\n", 315 | " ('.nii_mean_reg','_mean'),\n", 316 | " ('.nii.par','.par')]\n", 317 | "\n", 318 | "subjFolders=[('fwhm-%s/' % f, 'fwhm-%s-' % f) for f in fwhm_list]\n", 319 | "substitutions.extend(subjFolders)\n", 320 | "datasink.inputs.substitutions=substitutions" 321 | ] 322 | }, 323 | { 324 | "cell_type": "markdown", 325 | "metadata": {}, 326 | "source": [ 327 | "## main workflow" 328 | ] 329 | }, 330 | { 331 | "cell_type": "code", 332 | "execution_count": null, 333 | "metadata": {}, 334 | "outputs": [], 335 | "source": [ 336 | "preproc=Workflow(name='preprocessing',base_dir=opj(experiment_dir,working_dir))\n", 337 | "\n", 338 | "preproc.connect([(infosource,selectfiles,[('subject_id','subject_id'),\n", 339 | " ('task_id','task_id'),\n", 340 | " ('zrun_id','zrun_id')]),\n", 341 | " (selectfiles,mcflirt,[('func','in_file')]),\n", 342 | " (mcflirt,slicetimer,[('out_file','in_file')]),\n", 343 | " (selectfiles,coregwf,[('anat','bet_anat.in_file'),\n", 344 | " ('anat','coreg_bbr.reference')]),\n", 345 | " (mcflirt,coregwf,[('mean_img','coreg_pre.in_file'),\n", 346 | " 
('mean_img','coreg_bbr.in_file'),\n", 347 | " ('mean_img','applywarp_mean.in_file')]),\n", 348 | " (slicetimer,coregwf,[('slice_time_corrected_file','applywarp.in_file')]),\n", 349 | " (coregwf,smooth,[('applywarp.out_file','in_file')]),\n", 350 | " # (coregwf,normalize,[('applywarp.out_file','image_to_align')]),\n", 351 | " #(normalize,smooth,[('normalized_image','in_file')]),\n", 352 | " \n", 353 | " (mcflirt,datasink,[('par_file','preproc.@par')]),\n", 354 | " (smooth,datasink,[('smoothed_file','preproc.@smooth')]),\n", 355 | " # (normalize,datasink,[('normalized_files','norm_spm.@files'),\n", 356 | " # ('normalized_image','norm_spm.@image')]),\n", 357 | " (coregwf,datasink,[('applywarp_mean.out_file','preproc.@mean')]),\n", 358 | " \n", 359 | " (coregwf,art,[('applywarp.out_file','realigned_files')]),\n", 360 | " (mcflirt,art,[('par_file','realignment_parameters')]),\n", 361 | " \n", 362 | " (coregwf,datasink,[('coreg_bbr.out_matrix_file','preproc.@mat_file'),\n", 363 | " ('bet_anat.out_file','preproc.@brain')]),\n", 364 | " (art,datasink,[('outlier_files','preproc.@outlier_files'),\n", 365 | " ('plot_files','preproc.@plot_files')]),\n", 366 | " ])" 367 | ] 368 | }, 369 | { 370 | "cell_type": "markdown", 371 | "metadata": {}, 372 | "source": [ 373 | "## visualize" 374 | ] 375 | }, 376 | { 377 | "cell_type": "code", 378 | "execution_count": null, 379 | "metadata": {}, 380 | "outputs": [], 381 | "source": [ 382 | "# Create preproc output graph\n", 383 | "preproc.write_graph(graph2use='colored', format='png', simple_form=True)\n", 384 | "\n", 385 | "# Visualize the graph\n", 386 | "from IPython.display import Image\n", 387 | "Image(filename=opj(preproc.base_dir,'preprocessing','graph.png'), width=750)" 388 | ] 389 | }, 390 | { 391 | "cell_type": "markdown", 392 | "metadata": {}, 393 | "source": [ 394 | "## run" 395 | ] 396 | }, 397 | { 398 | "cell_type": "code", 399 | "execution_count": null, 400 | "metadata": {}, 401 | "outputs": [], 402 | "source": [ 403 | 
"#preproc.run('MultiProc', plugin_args={'n_procs': 4})\n", 404 | "preproc.run()" 405 | ] 406 | }, 407 | { 408 | "cell_type": "markdown", 409 | "metadata": {}, 410 | "source": [ 411 | "## visualize results tree" 412 | ] 413 | }, 414 | { 415 | "cell_type": "code", 416 | "execution_count": null, 417 | "metadata": {}, 418 | "outputs": [], 419 | "source": [ 420 | "!tree /home/ubuntu/Documents/windowshare/output/XP2/datasink/preproc/\n" 421 | ] 422 | }, 423 | { 424 | "cell_type": "markdown", 425 | "metadata": {}, 426 | "source": [ 427 | "## outliers" 428 | ] 429 | }, 430 | { 431 | "cell_type": "code", 432 | "execution_count": null, 433 | "metadata": {}, 434 | "outputs": [], 435 | "source": [ 436 | "num_outliers=np.zeros((len(subject_list),len(task_list),len(run_list)))\n", 437 | " \n", 438 | "for ii in range(len(subject_list)) :\n", 439 | "\n", 440 | " for tt in range(len(task_list)) : \n", 441 | "\n", 442 | " for rr in range(len(run_list)) :\n", 443 | " out_dir='/home/ubuntu/Documents/windowshare/output/XP2/datasink/preproc/sub-%s/task-%s/run-%s' % (subject_list[ii],task_list[tt],run_list[rr])\n", 444 | " outliers_file='art.sub-%s_task-%s_run-%s_bold_outliers.txt' %(subject_list[ii],task_list[tt],run_list[rr])\n", 445 | " outliers = np.loadtxt(opj(out_dir,outliers_file))\n", 446 | " a=np.atleast_1d(outliers.astype(int))  # robust to outlier files with zero or one entry\n", 447 | " num_outliers[ii,tt,rr]=len(a[a<320])  # count outlier volumes within the 320 volumes of the run\n", 448 | " \n", 449 | "np.save(opj('/home/ubuntu/Documents/windowshare/output/XP2/datasink/preproc','array_outliers_task_%s.npy' %task_list[tt]),num_outliers)\n", 450 | "\n" 451 | ] 452 | }, 453 | { 454 | "cell_type": "code", 455 | "execution_count": null, 456 | "metadata": {}, 457 | "outputs": [], 458 | "source": [ 459 | "# mean and std of the number of outliers\n", 460 | "mean_out=num_outliers.mean()\n", 461 | "\n", 462 | "std_out=num_outliers.std()\n", 463 | "print(mean_out,std_out)\n", 464 | "print((mean_out*100)/320)  # mean outliers as a percentage of 320 volumes" 465 | ] 466 | }, 467 | { 468 | "cell_type": "code", 469 | "execution_count": null, 470 | 
"metadata": {}, 471 | "outputs": [], 472 | "source": [] 473 | } 474 | ], 475 | "metadata": { 476 | "kernelspec": { 477 | "display_name": "Python 3", 478 | "language": "python", 479 | "name": "python3" 480 | }, 481 | "language_info": { 482 | "codemirror_mode": { 483 | "name": "ipython", 484 | "version": 3 485 | }, 486 | "file_extension": ".py", 487 | "mimetype": "text/x-python", 488 | "name": "python", 489 | "nbconvert_exporter": "python", 490 | "pygments_lexer": "ipython3", 491 | "version": "3.6.8" 492 | } 493 | }, 494 | "nbformat": 4, 495 | "nbformat_minor": 2 496 | } 497 | -------------------------------------------------------------------------------- /graph_1st_level.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/glioi/BIDS_fMRI_analysis_nipype/e19d7b0cc1864d0f88fcbcd0e3a96a9c6e162062/graph_1st_level.png -------------------------------------------------------------------------------- /graph_2nd_level.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/glioi/BIDS_fMRI_analysis_nipype/e19d7b0cc1864d0f88fcbcd0e3a96a9c6e162062/graph_2nd_level.png -------------------------------------------------------------------------------- /graph_preprocessing.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/glioi/BIDS_fMRI_analysis_nipype/e19d7b0cc1864d0f88fcbcd0e3a96a9c6e162062/graph_preprocessing.png --------------------------------------------------------------------------------