├── .gitignore ├── Dockerfile ├── README.md ├── assets ├── raw_examples │ ├── brain-lesion_T1w.nii.gz │ ├── brain-lesion_T1w_mask.nii.gz │ ├── fsl-open-dev_sub-001_T1w.nii.gz │ ├── listen-task_sub-UTS01_ses-1_T1w.nii.gz │ └── wash-120_sub-001_T1w.nii.gz └── templates │ ├── mni_icbm152_t1_tal_nlin_sym_09a.nii │ └── mni_icbm152_t1_tal_nlin_sym_09a_mask.nii ├── docker-compose.yml ├── notebooks ├── 00_libs_review.ipynb ├── 01_img_orientation.ipynb ├── 02_common_operations.ipynb ├── 03_bias_field_correction.ipynb ├── 04_templates_and_masks.ipynb ├── 05_intensity_normalization.ipynb ├── 06_registration.ipynb ├── 07_registration_and_masks.ipynb ├── 08_brain_extraction_with_antspynet.ipynb ├── 09_brain_extraction_with_template.ipynb └── helpers.py └── requirements.txt /.gitignore: -------------------------------------------------------------------------------- 1 | # Byte-compiled / optimized / DLL files 2 | __pycache__/ 3 | *.py[cod] 4 | *$py.class 5 | 6 | # C extensions 7 | *.so 8 | 9 | # Distribution / packaging 10 | .Python 11 | build/ 12 | develop-eggs/ 13 | dist/ 14 | downloads/ 15 | eggs/ 16 | .eggs/ 17 | lib/ 18 | lib64/ 19 | parts/ 20 | sdist/ 21 | var/ 22 | wheels/ 23 | pip-wheel-metadata/ 24 | share/python-wheels/ 25 | *.egg-info/ 26 | .installed.cfg 27 | *.egg 28 | MANIFEST 29 | 30 | # PyInstaller 31 | # Usually these files are written by a python script from a template 32 | # before PyInstaller builds the exe, so as to inject date/other infos into it. 33 | *.manifest 34 | *.spec 35 | 36 | # Installer logs 37 | pip-log.txt 38 | pip-delete-this-directory.txt 39 | 40 | # Unit test / coverage reports 41 | htmlcov/ 42 | .tox/ 43 | .nox/ 44 | .coverage 45 | .coverage.* 46 | .cache 47 | nosetests.xml 48 | coverage.xml 49 | *.cover 50 | *.py,cover 51 | .hypothesis/ 52 | .pytest_cache/ 53 | 54 | # Translations 55 | *.mo 56 | *.pot 57 | 58 | # Django stuff: 59 | *.log 60 | local_settings.py 61 | db.sqlite3 62 | db.sqlite3-journal 63 | 64 | # Flask stuff: 65 | instance/ 66 | .webassets-cache 67 | 68 | # Scrapy stuff: 69 | .scrapy 70 | 71 | # Sphinx documentation 72 | docs/_build/ 73 | 74 | # PyBuilder 75 | target/ 76 | 77 | # Jupyter Notebook 78 | .ipynb_checkpoints 79 | 80 | # IPython 81 | profile_default/ 82 | ipython_config.py 83 | 84 | # pyenv 85 | .python-version 86 | 87 | # pipenv 88 | # According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control. 89 | # However, in case of collaboration, if having platform-specific dependencies or dependencies 90 | # having no cross-platform support, pipenv may install dependencies that don't work, or not 91 | # install all needed dependencies. 92 | #Pipfile.lock 93 | 94 | # PEP 582; used by e.g. 
github.com/David-OConnor/pyflow 95 | __pypackages__/ 96 | 97 | # Celery stuff 98 | celerybeat-schedule 99 | celerybeat.pid 100 | 101 | # SageMath parsed files 102 | *.sage.py 103 | 104 | # Environments 105 | .env 106 | .venv 107 | env/ 108 | venv/ 109 | ENV/ 110 | env.bak/ 111 | venv.bak/ 112 | 113 | # Spyder project settings 114 | .spyderproject 115 | .spyproject 116 | 117 | # Rope project settings 118 | .ropeproject 119 | 120 | # mkdocs documentation 121 | /site 122 | 123 | # mypy 124 | .mypy_cache/ 125 | .dmypy.json 126 | dmypy.json 127 | 128 | # Pyre type checker 129 | .pyre/ 130 | 131 | # custom folders 132 | preprocessed/ -------------------------------------------------------------------------------- /Dockerfile: -------------------------------------------------------------------------------- 1 | # Use a minimal Python image as the base 2 | FROM python:3.12-slim 3 | 4 | WORKDIR /app 5 | 6 | COPY . . 7 | 8 | RUN pip3 install -r requirements.txt 9 | 10 | # Expose the port Jupyter will run on 11 | EXPOSE 8888 12 | 13 | # Set environment variables 14 | ENV JUPYTER_ENABLE_LAB=true 15 | 16 | # Run Jupyter Notebook when the container launches 17 | CMD ["jupyter", "notebook", "--ip=0.0.0.0", "--port=8888", "--no-browser", "--allow-root"] 18 | 19 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # MRI-preprocessing-techniques in Python 2 | Code examples for the free YouTube course [brain MRI preprocessing techniques in Python](https://www.youtube.com/playlist?list=PLI3eIHxETjX4a5NMmgayg3wuM232fYIxy) 3 | 4 | ## Setup Instructions 5 | ### Using venv (for Linux only) 6 | ``` 7 | python -m venv .venv 8 | source .venv/bin/activate 9 | 10 | pip install -r requirements.txt 11 | ``` 12 | ### Using Docker 13 | (Update 2025) I added this option to make the setup easier on Linux-based operating systems (e.g. Ubuntu) and to add support for Windows. This step requires `Docker>=27.3.1` 1. Make sure you have [Docker](https://docs.docker.com/get-started/) installed 2. Open a terminal and execute the following command. It builds the image and runs the service (this step may take between 8 and 10 minutes, so be patient; all the dependencies are installed for you): ``` 18 | docker compose up 19 | ``` 20 | 21 | 3. Once the previous command has finished, look in the console log for a message similar to the following and open the highlighted URL: ``` 23 | To access the server, open this file in a browser: 24 | jupyter-1 | file:///root/.local/share/jupyter/runtime/jpserver-1-open.html 25 | jupyter-1 | Or copy and paste one of these URLs: 26 | jupyter-1 | http://7090c5cd23eb:8888/tree?token={secret} 27 | jupyter-1 | http://127.0.0.1:8888/tree?token={secret} <-- Open this URL 28 | ``` 29 | 30 | 4. Browse the notebooks and execute them as you wish. 31 | 32 | Please feel free to report any issues or bugs. Tested and working on a laptop with the following specs: 33 | ``` 34 | Windows 10 Home 35 | Python version 3.12.8 36 | CPU Core i7-8550U 37 | NVIDIA GeForce MX130, 2 GB dedicated RAM 38 | 8 GB RAM 39 | ``` 40 | In my experience, the project should run normally on a PC with a 2-core (or better) processor and 8 GB (or more) of RAM.
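### Quick check (optional)
Once the environment is ready (venv or the Docker container), a minimal sketch along the lines of `notebooks/00_libs_review.ipynb` can confirm that the dependencies and the sample data are in place. Run it from the project root; the file name below is just one of the images shipped in `assets/raw_examples`, so any of the other samples would work as well.
```python
import os

import ants  # AntsPy, installed from requirements.txt

# One of the sample volumes bundled with the repo (path relative to the project root)
raw_img_path = os.path.join('assets', 'raw_examples', 'fsl-open-dev_sub-001_T1w.nii.gz')

# Load the image and print its basic metadata (dimensions, spacing, origin, direction)
raw_img = ants.image_read(raw_img_path)
print(raw_img)

# Numpy representation of the volume
print(f'shape = {raw_img.numpy().shape}')

# Quick visual check of the slices along one axis
ants.plot(raw_img, figsize=3, axis=2)
```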
41 | 42 | ## About `/assets` 43 | I selected the sample images and templates from the following sources. 44 | 45 | ### Datasets: 46 | - FSL open science dev dataset 47 | - Washington University 120 48 | - Kung fu panda 49 | - An fMRI dataset during a passive natural language listening task 50 | 51 | *Source: https://openneuro.org/* 52 | 53 | ### Templates: 54 | - ICBM 2009a Nonlinear Symmetric (NIFTI) 55 | 56 | *Source: https://nist.mni.mcgill.ca/icbm-152-nonlinear-atlases-2009/* 57 | 58 | ### Papers: 59 | 60 | - Mena, R., Pelaez, E., Loayza, F., Macas, A., & Franco-Maldonado, H. (2023). An artificial intelligence approach for segmenting and classifying brain lesions caused by stroke. Computer Methods in Biomechanics and Biomedical Engineering: Imaging & Visualization, 11(7), 2736–2747. https://doi.org/10.1080/21681163.2023.2264410 61 | - Mena, R., Macas, A., Pelaez, C., Loayza, F., & Franco-Maldonado, H. (2022). A Pipeline for Segmenting and Classifying Brain Lesions Caused by Stroke: A Machine Learning Approach. https://doi.org/10.1007/978-3-031-04829-6_37 62 | -------------------------------------------------------------------------------- /assets/raw_examples/brain-lesion_T1w.nii.gz: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Angeluz-07/MRI-preprocessing-techniques/50e3fd8fe15204423fa16a8a24553614ccf385e5/assets/raw_examples/brain-lesion_T1w.nii.gz -------------------------------------------------------------------------------- /assets/raw_examples/brain-lesion_T1w_mask.nii.gz: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Angeluz-07/MRI-preprocessing-techniques/50e3fd8fe15204423fa16a8a24553614ccf385e5/assets/raw_examples/brain-lesion_T1w_mask.nii.gz -------------------------------------------------------------------------------- /assets/raw_examples/fsl-open-dev_sub-001_T1w.nii.gz: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Angeluz-07/MRI-preprocessing-techniques/50e3fd8fe15204423fa16a8a24553614ccf385e5/assets/raw_examples/fsl-open-dev_sub-001_T1w.nii.gz -------------------------------------------------------------------------------- /assets/raw_examples/listen-task_sub-UTS01_ses-1_T1w.nii.gz: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Angeluz-07/MRI-preprocessing-techniques/50e3fd8fe15204423fa16a8a24553614ccf385e5/assets/raw_examples/listen-task_sub-UTS01_ses-1_T1w.nii.gz -------------------------------------------------------------------------------- /assets/raw_examples/wash-120_sub-001_T1w.nii.gz: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Angeluz-07/MRI-preprocessing-techniques/50e3fd8fe15204423fa16a8a24553614ccf385e5/assets/raw_examples/wash-120_sub-001_T1w.nii.gz -------------------------------------------------------------------------------- /assets/templates/mni_icbm152_t1_tal_nlin_sym_09a.nii: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Angeluz-07/MRI-preprocessing-techniques/50e3fd8fe15204423fa16a8a24553614ccf385e5/assets/templates/mni_icbm152_t1_tal_nlin_sym_09a.nii -------------------------------------------------------------------------------- /assets/templates/mni_icbm152_t1_tal_nlin_sym_09a_mask.nii:
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/Angeluz-07/MRI-preprocessing-techniques/50e3fd8fe15204423fa16a8a24553614ccf385e5/assets/templates/mni_icbm152_t1_tal_nlin_sym_09a_mask.nii -------------------------------------------------------------------------------- /docker-compose.yml: -------------------------------------------------------------------------------- 1 | services: 2 | jupyter: 3 | build: . 4 | ports: 5 | - "8888:8888" 6 | volumes: 7 | - ./notebooks:/app/notebooks 8 | - ./assets:/app/assets 9 | environment: 10 | - JUPYTER_ENABLE_LAB=true 11 | -------------------------------------------------------------------------------- /notebooks/00_libs_review.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "### **Review of the libraries**" 8 | ] 9 | }, 10 | { 11 | "cell_type": "markdown", 12 | "metadata": {}, 13 | "source": [ 14 | "**Learning outcomes:**\n", 15 | "- Load .nii.gz/.nii images using AntsPy and SITK\n", 16 | "- Get basic information\n", 17 | "- Get numpy representation\n", 18 | "- Plot MRI images" 19 | ] 20 | }, 21 | { 22 | "cell_type": "code", 23 | "execution_count": null, 24 | "metadata": {}, 25 | "outputs": [], 26 | "source": [ 27 | "%matplotlib inline\n", 28 | "\n", 29 | "import os\n", 30 | "from helpers import *\n", 31 | "\n", 32 | "import ants\n", 33 | "import SimpleITK as sitk\n", 34 | "\n", 35 | "print(f'AntsPy version = {ants.__version__}')\n", 36 | "print(f'SimpleITK version = {sitk.__version__}')" 37 | ] 38 | }, 39 | { 40 | "cell_type": "code", 41 | "execution_count": null, 42 | "metadata": {}, 43 | "outputs": [], 44 | "source": [ 45 | "BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(\"__file__\")))\n", 46 | "print(f'project folder = {BASE_DIR}')" 47 | ] 48 | }, 49 | { 50 | "cell_type": "code", 51 | "execution_count": null, 52 | "metadata": {}, 53 | "outputs": [], 54 | "source": [ 55 | "raw_img_path = os.path.join(BASE_DIR, 'assets', 'raw_examples', 'fsl-open-dev_sub-001_T1w.nii.gz')\n", 56 | "print(f'raw_img_path = {raw_img_path}')" 57 | ] 58 | }, 59 | { 60 | "cell_type": "markdown", 61 | "metadata": {}, 62 | "source": [ 63 | "### AntsPY" 64 | ] 65 | }, 66 | { 67 | "cell_type": "code", 68 | "execution_count": 5, 69 | "metadata": {}, 70 | "outputs": [], 71 | "source": [ 72 | "raw_img_ants = ants.image_read(raw_img_path)" 73 | ] 74 | }, 75 | { 76 | "cell_type": "code", 77 | "execution_count": null, 78 | "metadata": {}, 79 | "outputs": [], 80 | "source": [ 81 | "print(raw_img_ants)" 82 | ] 83 | }, 84 | { 85 | "cell_type": "code", 86 | "execution_count": null, 87 | "metadata": {}, 88 | "outputs": [], 89 | "source": [ 90 | "raw_img_ants_arr = raw_img_ants.numpy()\n", 91 | "\n", 92 | "print(f'type = {type(raw_img_ants_arr)}')\n", 93 | "print(f'shape = {raw_img_ants_arr.shape}')" 94 | ] 95 | }, 96 | { 97 | "cell_type": "code", 98 | "execution_count": null, 99 | "metadata": {}, 100 | "outputs": [], 101 | "source": [ 102 | "ants.plot(raw_img_ants, figsize=3, axis=2)" 103 | ] 104 | }, 105 | { 106 | "cell_type": "code", 107 | "execution_count": null, 108 | "metadata": {}, 109 | "outputs": [], 110 | "source": [ 111 | "explore_3D_array(raw_img_ants_arr)" 112 | ] 113 | }, 114 | { 115 | "cell_type": "markdown", 116 | "metadata": {}, 117 | "source": [ 118 | "### Simple ITK" 119 | ] 120 | }, 121 | { 122 | "cell_type": "code", 123 | "execution_count": 11, 124 | 
"metadata": {}, 125 | "outputs": [], 126 | "source": [ 127 | "raw_img_sitk = sitk.ReadImage(raw_img_path, sitk.sitkFloat32)" 128 | ] 129 | }, 130 | { 131 | "cell_type": "code", 132 | "execution_count": 12, 133 | "metadata": {}, 134 | "outputs": [], 135 | "source": [ 136 | "#print(raw_img_sitk)" 137 | ] 138 | }, 139 | { 140 | "cell_type": "code", 141 | "execution_count": null, 142 | "metadata": {}, 143 | "outputs": [], 144 | "source": [ 145 | "show_sitk_img_info(raw_img_sitk)" 146 | ] 147 | }, 148 | { 149 | "cell_type": "code", 150 | "execution_count": null, 151 | "metadata": {}, 152 | "outputs": [], 153 | "source": [ 154 | "raw_img_sitk_arr = sitk.GetArrayFromImage(raw_img_sitk)\n", 155 | "\n", 156 | "print(f'type = {type(raw_img_sitk_arr)}')\n", 157 | "print(f'shape = {raw_img_sitk_arr.shape}')" 158 | ] 159 | }, 160 | { 161 | "cell_type": "code", 162 | "execution_count": 15, 163 | "metadata": {}, 164 | "outputs": [], 165 | "source": [ 166 | "#sitk.Show(raw_img_sitk)" 167 | ] 168 | }, 169 | { 170 | "cell_type": "code", 171 | "execution_count": null, 172 | "metadata": {}, 173 | "outputs": [], 174 | "source": [ 175 | "explore_3D_array(raw_img_sitk_arr)" 176 | ] 177 | } 178 | ], 179 | "metadata": { 180 | "kernelspec": { 181 | "display_name": ".venv", 182 | "language": "python", 183 | "name": "python3" 184 | }, 185 | "language_info": { 186 | "codemirror_mode": { 187 | "name": "ipython", 188 | "version": 3 189 | }, 190 | "file_extension": ".py", 191 | "mimetype": "text/x-python", 192 | "name": "python", 193 | "nbconvert_exporter": "python", 194 | "pygments_lexer": "ipython3", 195 | "version": "3.13.1" 196 | } 197 | }, 198 | "nbformat": 4, 199 | "nbformat_minor": 4 200 | } 201 | -------------------------------------------------------------------------------- /notebooks/01_img_orientation.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "### Image Orientation" 8 | ] 9 | }, 10 | { 11 | "cell_type": "markdown", 12 | "metadata": {}, 13 | "source": [ 14 | "**Learning outcomes:**\n", 15 | "- Load .nii.gz/.nii images using AntsPy and SITK using different orientations" 16 | ] 17 | }, 18 | { 19 | "cell_type": "code", 20 | "execution_count": null, 21 | "metadata": {}, 22 | "outputs": [], 23 | "source": [ 24 | "%matplotlib inline\n", 25 | "\n", 26 | "import os\n", 27 | "from helpers import *\n", 28 | "\n", 29 | "import ants\n", 30 | "import SimpleITK as sitk\n", 31 | "\n", 32 | "print(f'AntsPy version = {ants.__version__}')\n", 33 | "print(f'SimpleITK version = {sitk.__version__}')" 34 | ] 35 | }, 36 | { 37 | "cell_type": "code", 38 | "execution_count": null, 39 | "metadata": {}, 40 | "outputs": [], 41 | "source": [ 42 | "BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(\"__file__\")))\n", 43 | "print(f'project folder = {BASE_DIR}')" 44 | ] 45 | }, 46 | { 47 | "cell_type": "code", 48 | "execution_count": null, 49 | "metadata": {}, 50 | "outputs": [], 51 | "source": [ 52 | "raw_examples = [\n", 53 | " 'fsl-open-dev_sub-001_T1w.nii.gz',\n", 54 | " 'wash-120_sub-001_T1w.nii.gz',\n", 55 | " 'kf-panda_sub-01_ses-3T_T1w.nii.gz',\n", 56 | " 'listen-task_sub-UTS01_ses-1_T1w.nii.gz'\n", 57 | "]" 58 | ] 59 | }, 60 | { 61 | "cell_type": "markdown", 62 | "metadata": {}, 63 | "source": [ 64 | "### AntsPy" 65 | ] 66 | }, 67 | { 68 | "cell_type": "code", 69 | "execution_count": null, 70 | "metadata": {}, 71 | "outputs": [], 72 | "source": [ 73 | "raw_img_path = 
os.path.join(BASE_DIR, 'assets', 'raw_examples', raw_examples[0])" 74 | ] 75 | }, 76 | { 77 | "cell_type": "code", 78 | "execution_count": null, 79 | "metadata": {}, 80 | "outputs": [], 81 | "source": [ 82 | "raw_img_ants = ants.image_read(raw_img_path) \n", 83 | "print(raw_img_ants)" 84 | ] 85 | }, 86 | { 87 | "cell_type": "code", 88 | "execution_count": null, 89 | "metadata": {}, 90 | "outputs": [], 91 | "source": [ 92 | "# LPI = Left-to-right, Posterior-to-anterior, Inferior-to-superior\n", 93 | "print(raw_img_ants.get_orientation())" 94 | ] 95 | }, 96 | { 97 | "cell_type": "code", 98 | "execution_count": null, 99 | "metadata": {}, 100 | "outputs": [], 101 | "source": [ 102 | "arr = raw_img_ants.numpy()\n", 103 | "print(raw_img_ants.get_orientation())\n", 104 | "print(arr.shape, '-> (Z,Y,X)')" 105 | ] 106 | }, 107 | { 108 | "cell_type": "code", 109 | "execution_count": null, 110 | "metadata": {}, 111 | "outputs": [], 112 | "source": [ 113 | "# LPI = Left-to-right, Posterior-to-anterior, Inferior-to-superior\n", 114 | "arr = raw_img_ants.numpy()\n", 115 | "print(raw_img_ants.get_orientation())\n", 116 | "print(arr.shape, '-> (Z,Y,X)')\n", 117 | "explore_3D_array(arr=raw_img_ants.numpy()) " 118 | ] 119 | }, 120 | { 121 | "cell_type": "code", 122 | "execution_count": null, 123 | "metadata": {}, 124 | "outputs": [], 125 | "source": [ 126 | "# Pixel arrangement\n", 127 | "# Z, Y, X = (↑,↓,→)" 128 | ] 129 | }, 130 | { 131 | "cell_type": "code", 132 | "execution_count": null, 133 | "metadata": {}, 134 | "outputs": [], 135 | "source": [ 136 | "# LPI = Left-to-right, Posterior-to-anterior, Inferior-to-superior\n", 137 | "# IAL = Inferior-to-superior, Anterior-to-posterior, Left-to-right\n", 138 | "raw_img_ants = ants.image_read(raw_img_path, reorient='IAL') \n", 139 | "\n", 140 | "print(raw_img_ants.get_orientation())\n", 141 | "print(arr.shape, '-> (Z,Y,X)')\n", 142 | "explore_3D_array(arr=raw_img_ants.numpy()) " 143 | ] 144 | }, 145 | { 146 | "cell_type": "markdown", 147 | "metadata": {}, 148 | "source": [ 149 | "### Simple ITK" 150 | ] 151 | }, 152 | { 153 | "cell_type": "code", 154 | "execution_count": null, 155 | "metadata": {}, 156 | "outputs": [], 157 | "source": [ 158 | "raw_img_path = os.path.join(BASE_DIR, 'assets', 'raw_examples', raw_examples[0])\n", 159 | "raw_img_sitk = sitk.ReadImage(raw_img_path, sitk.sitkFloat32)" 160 | ] 161 | }, 162 | { 163 | "cell_type": "code", 164 | "execution_count": null, 165 | "metadata": {}, 166 | "outputs": [], 167 | "source": [ 168 | "raw_img_sitk_arr = sitk.GetArrayFromImage(raw_img_sitk)\n", 169 | "print(raw_img_sitk_arr.shape)\n", 170 | "explore_3D_array(raw_img_sitk_arr)" 171 | ] 172 | }, 173 | { 174 | "cell_type": "markdown", 175 | "metadata": {}, 176 | "source": [ 177 | "For AntsPy:\n", 178 | "- Internal axis are (Z,Y,X). It means, when we get numpy array dimensions are (Z,Y,X)\n", 179 | "- When we define orientation, orientation string is according to internal axis.\n", 180 | "\n", 181 | "For SimpleITK:\n", 182 | "- Internal axis are (X,Y,Z). It means, when we get numpy array dimensions are (Z,Y,X) i.e. shifted.\n", 183 | "- When we define orientation, orientation string is according to internal axis. \n", 184 | "- The orientation string is set with the latest letter, e.g. 
: \n", 185 | " - \"RPS\" = (left-to-Right, anterior-to-Posterior, inferior-to-Superior)\n", 186 | " - \"PSR\" = (anterior-to-Posterior, inferior-to-Superior, left-to-Right)" 187 | ] 188 | }, 189 | { 190 | "cell_type": "code", 191 | "execution_count": null, 192 | "metadata": {}, 193 | "outputs": [], 194 | "source": [ 195 | "raw_img_sitk = sitk.ReadImage(raw_img_path, sitk.sitkFloat32)\n", 196 | "raw_img_sitk = sitk.DICOMOrient(raw_img_sitk,'RPS')\n", 197 | "\n", 198 | "raw_img_sitk_arr = sitk.GetArrayFromImage(raw_img_sitk)\n", 199 | "print(raw_img_sitk_arr.shape)\n", 200 | "explore_3D_array(raw_img_sitk_arr)" 201 | ] 202 | }, 203 | { 204 | "cell_type": "code", 205 | "execution_count": null, 206 | "metadata": {}, 207 | "outputs": [], 208 | "source": [ 209 | "# Internal Pixel arrangement for SimpleItk\n", 210 | "# (X, Y, Z) = (→, ↓, ↑)" 211 | ] 212 | } 213 | ], 214 | "metadata": { 215 | "interpreter": { 216 | "hash": "418825faa255fc22419b3421ba9be6bb08852f7738a4e2e9757a921549f74527" 217 | }, 218 | "kernelspec": { 219 | "display_name": "Python 3.7.1 ('.venv': venv)", 220 | "language": "python", 221 | "name": "python3" 222 | }, 223 | "language_info": { 224 | "codemirror_mode": { 225 | "name": "ipython", 226 | "version": 3 227 | }, 228 | "file_extension": ".py", 229 | "mimetype": "text/x-python", 230 | "name": "python", 231 | "nbconvert_exporter": "python", 232 | "pygments_lexer": "ipython3", 233 | "version": "3.7.1" 234 | }, 235 | "orig_nbformat": 4 236 | }, 237 | "nbformat": 4, 238 | "nbformat_minor": 2 239 | } 240 | -------------------------------------------------------------------------------- /notebooks/02_common_operations.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "### Common operations" 8 | ] 9 | }, 10 | { 11 | "cell_type": "markdown", 12 | "metadata": {}, 13 | "source": [ 14 | "**Learning outcomes:**\n", 15 | "- Learn how to apply basic filters and transformations using AntsPy and SITK:\n", 16 | " - Denoise\n", 17 | " - Morphological operations\n", 18 | " - Shrink\n", 19 | " - Cropping\n", 20 | " - Padding\n", 21 | " - Blurring\n", 22 | " - Thresholding\n", 23 | " - Statistics" 24 | ] 25 | }, 26 | { 27 | "cell_type": "code", 28 | "execution_count": null, 29 | "metadata": {}, 30 | "outputs": [], 31 | "source": [ 32 | "%matplotlib inline\n", 33 | "\n", 34 | "import os\n", 35 | "from helpers import *\n", 36 | "\n", 37 | "import ants\n", 38 | "import SimpleITK as sitk\n", 39 | "\n", 40 | "print(f'AntsPy version = {ants.__version__}')\n", 41 | "print(f'SimpleITK version = {sitk.__version__}')" 42 | ] 43 | }, 44 | { 45 | "cell_type": "code", 46 | "execution_count": null, 47 | "metadata": {}, 48 | "outputs": [], 49 | "source": [ 50 | "BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(\"__file__\")))\n", 51 | "print(f'project folder = {BASE_DIR}')" 52 | ] 53 | }, 54 | { 55 | "cell_type": "code", 56 | "execution_count": null, 57 | "metadata": {}, 58 | "outputs": [], 59 | "source": [ 60 | "raw_examples = [\n", 61 | " 'fsl-open-dev_sub-001_T1w.nii.gz',\n", 62 | " 'wash-120_sub-001_T1w.nii.gz',\n", 63 | " 'kf-panda_sub-01_ses-3T_T1w.nii.gz',\n", 64 | " 'listen-task_sub-UTS01_ses-1_T1w.nii.gz'\n", 65 | "]" 66 | ] 67 | }, 68 | { 69 | "cell_type": "markdown", 70 | "metadata": {}, 71 | "source": [ 72 | "### AntsPy" 73 | ] 74 | }, 75 | { 76 | "cell_type": "markdown", 77 | "metadata": {}, 78 | "source": [ 79 | "#### Raw Image" 80 | ] 81 | }, 82 | { 83 | 
"cell_type": "code", 84 | "execution_count": null, 85 | "metadata": {}, 86 | "outputs": [], 87 | "source": [ 88 | "raw_img_path = os.path.join(BASE_DIR, 'assets', 'raw_examples', raw_examples[0])\n", 89 | "raw_img_ants = ants.image_read(raw_img_path, reorient='IAL') \n", 90 | "\n", 91 | "print(f'shape = {raw_img_ants.numpy().shape} -> (Z, X, Y)')\n", 92 | "\n", 93 | "explore_3D_array(\n", 94 | " arr=raw_img_ants.numpy(),\n", 95 | " cmap='viridis'\n", 96 | ")" 97 | ] 98 | }, 99 | { 100 | "cell_type": "markdown", 101 | "metadata": {}, 102 | "source": [ 103 | "#### Denoise" 104 | ] 105 | }, 106 | { 107 | "cell_type": "code", 108 | "execution_count": null, 109 | "metadata": {}, 110 | "outputs": [], 111 | "source": [ 112 | "transformed = ants.denoise_image(raw_img_ants, shrink_factor=8)\n", 113 | "\n", 114 | "explore_3D_array_comparison(\n", 115 | " arr_before=raw_img_ants.numpy(),\n", 116 | " arr_after=transformed.numpy(),\n", 117 | " cmap='viridis'\n", 118 | ")" 119 | ] 120 | }, 121 | { 122 | "cell_type": "markdown", 123 | "metadata": {}, 124 | "source": [ 125 | "#### Morphological operations" 126 | ] 127 | }, 128 | { 129 | "cell_type": "code", 130 | "execution_count": null, 131 | "metadata": {}, 132 | "outputs": [], 133 | "source": [ 134 | "\"\"\"\n", 135 | "operation to apply\n", 136 | " \"close\" Morphological closing\n", 137 | " \"dilate\" Morphological dilation\n", 138 | " \"erode\" Morphological erosion\n", 139 | " \"open\" Morphological opening\n", 140 | "\"\"\"\n", 141 | "\n", 142 | "transformed = ants.morphology(raw_img_ants, radius=1, operation='erode', mtype='grayscale')\n", 143 | "\n", 144 | "explore_3D_array_comparison(\n", 145 | " arr_before=raw_img_ants.numpy(),\n", 146 | " arr_after=transformed.numpy(),\n", 147 | " cmap='viridis'\n", 148 | ")" 149 | ] 150 | }, 151 | { 152 | "cell_type": "markdown", 153 | "metadata": {}, 154 | "source": [ 155 | "### Simple ITK" 156 | ] 157 | }, 158 | { 159 | "cell_type": "markdown", 160 | "metadata": {}, 161 | "source": [ 162 | "#### Raw Image" 163 | ] 164 | }, 165 | { 166 | "cell_type": "code", 167 | "execution_count": null, 168 | "metadata": {}, 169 | "outputs": [], 170 | "source": [ 171 | "raw_img_path = os.path.join(BASE_DIR, 'assets', 'raw_examples', raw_examples[0])\n", 172 | "raw_img_sitk = sitk.ReadImage(raw_img_path, sitk.sitkFloat32)\n", 173 | "raw_img_sitk = sitk.DICOMOrient(raw_img_sitk,'RPS')\n", 174 | "\n", 175 | "print(f'shape = {sitk.GetArrayFromImage(raw_img_sitk).shape} -> (Z, X, Y)')\n", 176 | "explore_3D_array(\n", 177 | " arr=sitk.GetArrayFromImage(raw_img_sitk),\n", 178 | " cmap='viridis'\n", 179 | ")" 180 | ] 181 | }, 182 | { 183 | "cell_type": "markdown", 184 | "metadata": {}, 185 | "source": [ 186 | "#### Shrink" 187 | ] 188 | }, 189 | { 190 | "cell_type": "code", 191 | "execution_count": null, 192 | "metadata": {}, 193 | "outputs": [], 194 | "source": [ 195 | "shrinkFactor = 3\n", 196 | "transformed = sitk.Shrink( raw_img_sitk, [ shrinkFactor ] * raw_img_sitk.GetDimension() )\n", 197 | "\n", 198 | "print(f'shape before = {sitk.GetArrayFromImage(raw_img_sitk).shape}')\n", 199 | "print(f'shape after = {sitk.GetArrayFromImage(transformed).shape}')\n", 200 | "\n", 201 | "explore_3D_array(sitk.GetArrayFromImage(transformed))" 202 | ] 203 | }, 204 | { 205 | "cell_type": "markdown", 206 | "metadata": {}, 207 | "source": [ 208 | "#### Crop" 209 | ] 210 | }, 211 | { 212 | "cell_type": "code", 213 | "execution_count": null, 214 | "metadata": {}, 215 | "outputs": [], 216 | "source": [ 217 | "# Cropping takes the orientation of 
the pixels for the reference of lower & upper boundaries vectors\n", 218 | "# Pixel orientation = RPS = (left-to-Right, anterior-to-Posterior, inferior-to-Superior)\n", 219 | "\n", 220 | "# crop nothing\n", 221 | "#transformed = sitk.Crop(raw_img_sitk)\n", 222 | "#transformed = sitk.Crop(raw_img_sitk, (0,0,0), (0,0,0))\n", 223 | "\n", 224 | "# crop 20 from left to right X,Y,Z\n", 225 | "#transformed = sitk.Crop(raw_img_sitk, (20,0,0), (0,0,0))\n", 226 | "\n", 227 | "# crop 20 from left to right, crop 30 from anterior to posterior\n", 228 | "#transformed = sitk.Crop(raw_img_sitk, (20,30,0), (0,0,0))\n", 229 | "\n", 230 | "# crop 20 from left to right, crop 30 from anterior to posterior, \n", 231 | "# crop 10 from right to left, crop 5 from posterior to anterior. \n", 232 | "#transformed = sitk.Crop(raw_img_sitk, (20,30,0), (10,5,0)) \n", 233 | "\n", 234 | "# crop 40 from inferior to superior, crop 50 from superior to inferior\n", 235 | "transformed = sitk.Crop(raw_img_sitk, (0,0,40), (0,0,50)) \n", 236 | "\n", 237 | "\n", 238 | "print(f'shape before = {sitk.GetArrayFromImage(raw_img_sitk).shape}')\n", 239 | "print(f'shape after = {sitk.GetArrayFromImage(transformed).shape}')\n", 240 | "\n", 241 | "explore_3D_array(sitk.GetArrayFromImage(transformed))" 242 | ] 243 | }, 244 | { 245 | "cell_type": "markdown", 246 | "metadata": {}, 247 | "source": [ 248 | "#### Padding" 249 | ] 250 | }, 251 | { 252 | "cell_type": "code", 253 | "execution_count": null, 254 | "metadata": {}, 255 | "outputs": [], 256 | "source": [ 257 | "constant = int(sitk.GetArrayFromImage(raw_img_sitk).min())\n", 258 | "constant" 259 | ] 260 | }, 261 | { 262 | "cell_type": "code", 263 | "execution_count": null, 264 | "metadata": {}, 265 | "outputs": [], 266 | "source": [ 267 | "# Padding (as Cropping) takes the orientation of the pixels for the reference of lower & upper boundaries vectors\n", 268 | "# Pixel orientation = RPS = (left-to-Right, anterior-to-Posterior, inferior-to-Superior)\n", 269 | "\n", 270 | "# pad nothing\n", 271 | "#transformed = sitk.ConstantPad(raw_img_sitk)\n", 272 | "#transformed = sitk.ConstantPad(raw_img_sitk,(0,0,0),(0,0,0), constant)\n", 273 | "\n", 274 | "# pad 10 from left to right\n", 275 | "#transformed = sitk.ConstantPad(raw_img_sitk,(10,0,0),(0,0,0),constant)\n", 276 | "\n", 277 | "# pad 10 from left to right, pad 15 from anterior to posterior\n", 278 | "#transformed = sitk.ConstantPad(raw_img_sitk,(10,15,0),(0,0,0),constant)\n", 279 | "\n", 280 | "# pad 10 from left to right, pad 15 from anterior to posterior, \n", 281 | "# pad 5 from right to left, pad 8 from posterior to anterior. 
\n", 282 | "transformed = sitk.ConstantPad(raw_img_sitk,(10,15,0),(5,8,0),constant)\n", 283 | "\n", 284 | "\n", 285 | "print(f'shape before = {sitk.GetArrayFromImage(raw_img_sitk).shape}')\n", 286 | "print(f'shape after = {sitk.GetArrayFromImage(transformed).shape}')\n", 287 | "\n", 288 | "explore_3D_array(sitk.GetArrayFromImage(transformed), cmap='viridis')" 289 | ] 290 | }, 291 | { 292 | "cell_type": "markdown", 293 | "metadata": {}, 294 | "source": [ 295 | "#### Denoise" 296 | ] 297 | }, 298 | { 299 | "cell_type": "markdown", 300 | "metadata": {}, 301 | "source": [ 302 | "Curvature Flow filter" 303 | ] 304 | }, 305 | { 306 | "cell_type": "code", 307 | "execution_count": null, 308 | "metadata": {}, 309 | "outputs": [], 310 | "source": [ 311 | "transformed = sitk.CurvatureFlow(raw_img_sitk)\n", 312 | "\n", 313 | "explore_3D_array_comparison(\n", 314 | " arr_before=sitk.GetArrayFromImage(raw_img_sitk),\n", 315 | " arr_after=sitk.GetArrayFromImage(transformed),\n", 316 | " cmap='viridis'\n", 317 | ")" 318 | ] 319 | }, 320 | { 321 | "cell_type": "markdown", 322 | "metadata": {}, 323 | "source": [ 324 | "#### Morphological Operations" 325 | ] 326 | }, 327 | { 328 | "cell_type": "code", 329 | "execution_count": null, 330 | "metadata": {}, 331 | "outputs": [], 332 | "source": [ 333 | "\"\"\"\n", 334 | "sitk.GrayscaleMorphologicalClosing\n", 335 | "sitk.GrayscaleDilate\n", 336 | "sitk.GrayscaleErode\n", 337 | "sitk.GrayscaleMorphologicalOpening\n", 338 | "\n", 339 | "sitk.BinaryMorphologicalClosing\n", 340 | "sitk.BinaryDilate\n", 341 | "sitk.BinaryErode\n", 342 | "sitk.BinaryMorphologicalOpening\n", 343 | "\"\"\"\n", 344 | "\n", 345 | "transformed = sitk.GrayscaleErode(raw_img_sitk)\n", 346 | "\n", 347 | "explore_3D_array_comparison(\n", 348 | " arr_before=sitk.GetArrayFromImage(raw_img_sitk),\n", 349 | " arr_after=sitk.GetArrayFromImage(transformed),\n", 350 | " cmap='viridis'\n", 351 | ")" 352 | ] 353 | }, 354 | { 355 | "cell_type": "markdown", 356 | "metadata": {}, 357 | "source": [ 358 | "#### Blurring" 359 | ] 360 | }, 361 | { 362 | "cell_type": "code", 363 | "execution_count": null, 364 | "metadata": {}, 365 | "outputs": [], 366 | "source": [ 367 | "transformed = sitk.DiscreteGaussian(raw_img_sitk)\n", 368 | "\n", 369 | "explore_3D_array_comparison(\n", 370 | " arr_before=sitk.GetArrayFromImage(raw_img_sitk),\n", 371 | " arr_after=sitk.GetArrayFromImage(transformed),\n", 372 | " cmap='viridis'\n", 373 | ")" 374 | ] 375 | }, 376 | { 377 | "cell_type": "markdown", 378 | "metadata": {}, 379 | "source": [ 380 | "#### Thresholding" 381 | ] 382 | }, 383 | { 384 | "cell_type": "code", 385 | "execution_count": null, 386 | "metadata": {}, 387 | "outputs": [], 388 | "source": [ 389 | "\"\"\"\n", 390 | "sitk.OtsuThreshold\n", 391 | "sitk.LiThreshold\n", 392 | "sitk.TriangleThreshold\n", 393 | "sitk.MomentsThreshold\n", 394 | "\"\"\"\n", 395 | "\n", 396 | "transformed = sitk.TriangleThreshold(raw_img_sitk, 0, 1)\n", 397 | "\n", 398 | "explore_3D_array_comparison(\n", 399 | " arr_before=sitk.GetArrayFromImage(raw_img_sitk),\n", 400 | " arr_after=sitk.GetArrayFromImage(transformed)\n", 401 | ")" 402 | ] 403 | }, 404 | { 405 | "cell_type": "markdown", 406 | "metadata": {}, 407 | "source": [ 408 | "#### Statistics" 409 | ] 410 | }, 411 | { 412 | "cell_type": "code", 413 | "execution_count": null, 414 | "metadata": {}, 415 | "outputs": [], 416 | "source": [ 417 | "stats = sitk.StatisticsImageFilter()\n", 418 | "stats.Execute(raw_img_sitk)\n", 419 | "\n", 420 | "\n", 421 | "print('\\tRaw img')\n", 422 | 
"print(\"min =\", stats.GetMinimum())\n", 423 | "print(\"max =\", stats.GetMaximum())\n", 424 | "print(\"mean =\", stats.GetMean())\n" 425 | ] 426 | } 427 | ], 428 | "metadata": { 429 | "interpreter": { 430 | "hash": "418825faa255fc22419b3421ba9be6bb08852f7738a4e2e9757a921549f74527" 431 | }, 432 | "kernelspec": { 433 | "display_name": "Python 3.7.1 ('.venv': venv)", 434 | "language": "python", 435 | "name": "python3" 436 | }, 437 | "language_info": { 438 | "codemirror_mode": { 439 | "name": "ipython", 440 | "version": 3 441 | }, 442 | "file_extension": ".py", 443 | "mimetype": "text/x-python", 444 | "name": "python", 445 | "nbconvert_exporter": "python", 446 | "pygments_lexer": "ipython3", 447 | "version": "3.7.1" 448 | }, 449 | "orig_nbformat": 4 450 | }, 451 | "nbformat": 4, 452 | "nbformat_minor": 2 453 | } 454 | -------------------------------------------------------------------------------- /notebooks/03_bias_field_correction.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "attachments": {}, 5 | "cell_type": "markdown", 6 | "metadata": {}, 7 | "source": [ 8 | "### Bias Field Correction" 9 | ] 10 | }, 11 | { 12 | "attachments": {}, 13 | "cell_type": "markdown", 14 | "metadata": {}, 15 | "source": [ 16 | "**Learning outcomes:**\n", 17 | "- Learn how to apply Bias Field Correction to an image using SITK.\n", 18 | "- Inspect Bias Field visually." 19 | ] 20 | }, 21 | { 22 | "cell_type": "code", 23 | "execution_count": null, 24 | "metadata": {}, 25 | "outputs": [], 26 | "source": [ 27 | "%matplotlib inline\n", 28 | "\n", 29 | "import os\n", 30 | "from helpers import *\n", 31 | "\n", 32 | "import ants\n", 33 | "import SimpleITK as sitk\n", 34 | "\n", 35 | "print(f'AntsPy version = {ants.__version__}')\n", 36 | "print(f'SimpleITK version = {sitk.__version__}')" 37 | ] 38 | }, 39 | { 40 | "cell_type": "code", 41 | "execution_count": null, 42 | "metadata": {}, 43 | "outputs": [], 44 | "source": [ 45 | "BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(\"__file__\")))\n", 46 | "print(f'project folder = {BASE_DIR}')" 47 | ] 48 | }, 49 | { 50 | "cell_type": "code", 51 | "execution_count": null, 52 | "metadata": {}, 53 | "outputs": [], 54 | "source": [ 55 | "raw_examples = [\n", 56 | " 'fsl-open-dev_sub-001_T1w.nii.gz',\n", 57 | " 'wash-120_sub-001_T1w.nii.gz',\n", 58 | " 'kf-panda_sub-01_ses-3T_T1w.nii.gz',\n", 59 | " 'listen-task_sub-UTS01_ses-1_T1w.nii.gz',\n", 60 | "]" 61 | ] 62 | }, 63 | { 64 | "attachments": {}, 65 | "cell_type": "markdown", 66 | "metadata": {}, 67 | "source": [ 68 | "#### Load image" 69 | ] 70 | }, 71 | { 72 | "cell_type": "code", 73 | "execution_count": null, 74 | "metadata": {}, 75 | "outputs": [], 76 | "source": [ 77 | "raw_example = raw_examples[0]\n", 78 | "raw_img_path = os.path.join(BASE_DIR, 'assets', 'raw_examples', raw_example)\n", 79 | "raw_img_sitk = sitk.ReadImage(raw_img_path, sitk.sitkFloat32)\n", 80 | "raw_img_sitk = sitk.DICOMOrient(raw_img_sitk,'RPS')\n", 81 | "\n", 82 | "raw_img_sitk_arr = sitk.GetArrayFromImage(raw_img_sitk)\n", 83 | "print(f'shape = {raw_img_sitk_arr.shape} -> (Z, X, Y)')\n", 84 | "explore_3D_array(raw_img_sitk_arr, cmap='nipy_spectral')" 85 | ] 86 | }, 87 | { 88 | "attachments": {}, 89 | "cell_type": "markdown", 90 | "metadata": {}, 91 | "source": [ 92 | "#### Create head mask" 93 | ] 94 | }, 95 | { 96 | "cell_type": "code", 97 | "execution_count": null, 98 | "metadata": {}, 99 | "outputs": [], 100 | "source": [ 101 | "transformed = 
sitk.RescaleIntensity(raw_img_sitk, 0, 255)\n", 102 | "\n", 103 | "#transformed = sitk.TriangleThreshold(transformed, 0, 1)\n", 104 | "transformed = sitk.LiThreshold(transformed,0,1)\n", 105 | "\n", 106 | "head_mask = transformed\n", 107 | "\n", 108 | "explore_3D_array_comparison(\n", 109 | " arr_before=sitk.GetArrayFromImage(raw_img_sitk),\n", 110 | " arr_after=sitk.GetArrayFromImage(head_mask)\n", 111 | ")" 112 | ] 113 | }, 114 | { 115 | "attachments": {}, 116 | "cell_type": "markdown", 117 | "metadata": {}, 118 | "source": [ 119 | "#### Bias Correction" 120 | ] 121 | }, 122 | { 123 | "cell_type": "code", 124 | "execution_count": null, 125 | "metadata": {}, 126 | "outputs": [], 127 | "source": [ 128 | "shrinkFactor = 4\n", 129 | "inputImage = raw_img_sitk\n", 130 | "\n", 131 | "inputImage = sitk.Shrink( raw_img_sitk, [ shrinkFactor ] * inputImage.GetDimension() )\n", 132 | "maskImage = sitk.Shrink( head_mask, [ shrinkFactor ] * inputImage.GetDimension() )\n", 133 | "\n", 134 | "bias_corrector = sitk.N4BiasFieldCorrectionImageFilter()\n", 135 | "\n", 136 | "corrected = bias_corrector.Execute(inputImage, maskImage)" 137 | ] 138 | }, 139 | { 140 | "attachments": {}, 141 | "cell_type": "markdown", 142 | "metadata": {}, 143 | "source": [ 144 | "#### Get image corrected" 145 | ] 146 | }, 147 | { 148 | "cell_type": "code", 149 | "execution_count": null, 150 | "metadata": {}, 151 | "outputs": [], 152 | "source": [ 153 | "log_bias_field = bias_corrector.GetLogBiasFieldAsImage(raw_img_sitk)\n", 154 | "corrected_image_full_resolution = raw_img_sitk / sitk.Exp( log_bias_field )\n", 155 | "\n", 156 | "explore_3D_array_comparison(\n", 157 | " sitk.GetArrayFromImage(raw_img_sitk),\n", 158 | " sitk.GetArrayFromImage(corrected_image_full_resolution), \n", 159 | " cmap='nipy_spectral')\n" 160 | ] 161 | }, 162 | { 163 | "attachments": {}, 164 | "cell_type": "markdown", 165 | "metadata": {}, 166 | "source": [ 167 | "#### Inspect the bias field" 168 | ] 169 | }, 170 | { 171 | "cell_type": "code", 172 | "execution_count": null, 173 | "metadata": {}, 174 | "outputs": [], 175 | "source": [ 176 | "# bias field\n", 177 | "temp = sitk.Exp(log_bias_field)\n", 178 | "temp = sitk.Mask(temp, head_mask)\n", 179 | "explore_3D_array(sitk.GetArrayFromImage(temp), cmap='gray')" 180 | ] 181 | }, 182 | { 183 | "attachments": {}, 184 | "cell_type": "markdown", 185 | "metadata": {}, 186 | "source": [ 187 | "#### Save the image" 188 | ] 189 | }, 190 | { 191 | "cell_type": "code", 192 | "execution_count": null, 193 | "metadata": {}, 194 | "outputs": [], 195 | "source": [ 196 | "out_folder = os.path.join(BASE_DIR, 'assets', 'preprocessed')\n", 197 | "out_folder = os.path.join(out_folder, raw_example.split('.')[0]) # create folder with name of the raw file\n", 198 | "os.makedirs(out_folder, exist_ok=True) # create folder if not exists\n", 199 | "\n", 200 | "out_filename = add_suffix_to_filename(raw_example, suffix='biasFieldCorrected')\n", 201 | "out_path = os.path.join(out_folder, out_filename)\n", 202 | "\n", 203 | "print(raw_img_path[len(BASE_DIR):])\n", 204 | "print(out_path[len(BASE_DIR):])" 205 | ] 206 | }, 207 | { 208 | "cell_type": "code", 209 | "execution_count": null, 210 | "metadata": {}, 211 | "outputs": [], 212 | "source": [ 213 | "sitk.WriteImage(corrected_image_full_resolution, out_path)" 214 | ] 215 | } 216 | ], 217 | "metadata": { 218 | "interpreter": { 219 | "hash": "418825faa255fc22419b3421ba9be6bb08852f7738a4e2e9757a921549f74527" 220 | }, 221 | "kernelspec": { 222 | "display_name": "Python 3.7.1 ('.venv': 
venv)", 223 | "language": "python", 224 | "name": "python3" 225 | }, 226 | "language_info": { 227 | "codemirror_mode": { 228 | "name": "ipython", 229 | "version": 3 230 | }, 231 | "file_extension": ".py", 232 | "mimetype": "text/x-python", 233 | "name": "python", 234 | "nbconvert_exporter": "python", 235 | "pygments_lexer": "ipython3", 236 | "version": "3.7.1" 237 | }, 238 | "orig_nbformat": 4 239 | }, 240 | "nbformat": 4, 241 | "nbformat_minor": 2 242 | } 243 | -------------------------------------------------------------------------------- /notebooks/04_templates_and_masks.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "attachments": {}, 5 | "cell_type": "markdown", 6 | "metadata": {}, 7 | "source": [ 8 | "### Templates and masks" 9 | ] 10 | }, 11 | { 12 | "attachments": {}, 13 | "cell_type": "markdown", 14 | "metadata": {}, 15 | "source": [ 16 | "**Learning outcomes:**\n", 17 | "- How to load a template image and a mask with ants.\n", 18 | "- How to mask a brain using ants.\n", 19 | "- Inspect visually region delimited by a mask." 20 | ] 21 | }, 22 | { 23 | "cell_type": "code", 24 | "execution_count": null, 25 | "metadata": {}, 26 | "outputs": [], 27 | "source": [ 28 | "%matplotlib inline\n", 29 | "\n", 30 | "import os\n", 31 | "from helpers import *\n", 32 | "\n", 33 | "import ants\n", 34 | "import SimpleITK as sitk\n", 35 | "\n", 36 | "print(f'AntsPy version = {ants.__version__}')\n", 37 | "print(f'SimpleITK version = {sitk.__version__}')" 38 | ] 39 | }, 40 | { 41 | "cell_type": "code", 42 | "execution_count": null, 43 | "metadata": {}, 44 | "outputs": [], 45 | "source": [ 46 | "BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(\"__file__\")))\n", 47 | "print(f'project folder = {BASE_DIR}')" 48 | ] 49 | }, 50 | { 51 | "cell_type": "markdown", 52 | "metadata": {}, 53 | "source": [ 54 | "#### Template example" 55 | ] 56 | }, 57 | { 58 | "cell_type": "code", 59 | "execution_count": null, 60 | "metadata": {}, 61 | "outputs": [], 62 | "source": [ 63 | "template_img_path = os.path.join(BASE_DIR, 'assets', 'templates', 'mni_icbm152_t1_tal_nlin_sym_09a.nii')\n", 64 | "template_img_ants = ants.image_read(template_img_path, reorient='IAL')\n", 65 | "\n", 66 | "explore_3D_array(arr = template_img_ants.numpy())" 67 | ] 68 | }, 69 | { 70 | "cell_type": "code", 71 | "execution_count": null, 72 | "metadata": {}, 73 | "outputs": [], 74 | "source": [ 75 | "print('\\t\\tTEMPLATE IMG')\n", 76 | "print(template_img_ants)\n" 77 | ] 78 | }, 79 | { 80 | "cell_type": "markdown", 81 | "metadata": {}, 82 | "source": [ 83 | "#### Brain mask" 84 | ] 85 | }, 86 | { 87 | "cell_type": "code", 88 | "execution_count": null, 89 | "metadata": {}, 90 | "outputs": [], 91 | "source": [ 92 | "brain_mask_img_path = os.path.join(BASE_DIR, 'assets', 'templates', 'mni_icbm152_t1_tal_nlin_sym_09a_mask.nii')\n", 93 | "brain_mask_img_ants = ants.image_read(brain_mask_img_path, reorient='IAL')\n", 94 | "\n", 95 | "explore_3D_array(arr = brain_mask_img_ants.numpy())" 96 | ] 97 | }, 98 | { 99 | "cell_type": "code", 100 | "execution_count": null, 101 | "metadata": {}, 102 | "outputs": [], 103 | "source": [ 104 | "print('\\t\\tTEMPLATE IMG')\n", 105 | "print(template_img_ants)\n", 106 | "\n", 107 | "print('\\t\\tBRAIN MASK IMG')\n", 108 | "print(brain_mask_img_ants)" 109 | ] 110 | }, 111 | { 112 | "cell_type": "markdown", 113 | "metadata": {}, 114 | "source": [ 115 | "#### Mask out the brain" 116 | ] 117 | }, 118 | { 119 | "cell_type": "code", 120 | 
"execution_count": null, 121 | "metadata": {}, 122 | "outputs": [], 123 | "source": [ 124 | "brain_masked = ants.mask_image(template_img_ants, brain_mask_img_ants)\n", 125 | "\n", 126 | "explore_3D_array_comparison(template_img_ants.numpy(), brain_masked.numpy())" 127 | ] 128 | }, 129 | { 130 | "cell_type": "code", 131 | "execution_count": null, 132 | "metadata": {}, 133 | "outputs": [], 134 | "source": [ 135 | "explore_3D_array_with_mask_contour(template_img_ants.numpy(), brain_mask_img_ants.numpy())" 136 | ] 137 | }, 138 | { 139 | "cell_type": "markdown", 140 | "metadata": {}, 141 | "source": [ 142 | "#### Brain Lesion example" 143 | ] 144 | }, 145 | { 146 | "cell_type": "code", 147 | "execution_count": null, 148 | "metadata": {}, 149 | "outputs": [], 150 | "source": [ 151 | "raw_img_path = os.path.join(BASE_DIR, 'assets', 'raw_examples', 'brain-lesion_T1w.nii.gz')\n", 152 | "raw_img_ants = ants.image_read(raw_img_path, reorient='IAL')\n", 153 | "\n", 154 | "explore_3D_array(arr = raw_img_ants.numpy())" 155 | ] 156 | }, 157 | { 158 | "cell_type": "markdown", 159 | "metadata": {}, 160 | "source": [ 161 | "#### Tissue mask" 162 | ] 163 | }, 164 | { 165 | "cell_type": "code", 166 | "execution_count": null, 167 | "metadata": {}, 168 | "outputs": [], 169 | "source": [ 170 | "mask_img_path = os.path.join(BASE_DIR, 'assets', 'raw_examples', 'brain-lesion_T1w_mask.nii.gz')\n", 171 | "mask_img_ants = ants.image_read(mask_img_path, reorient='IAL')\n", 172 | "\n", 173 | "explore_3D_array(arr = mask_img_ants.numpy())" 174 | ] 175 | }, 176 | { 177 | "cell_type": "code", 178 | "execution_count": null, 179 | "metadata": {}, 180 | "outputs": [], 181 | "source": [ 182 | "print('\\t\\tRAW IMG')\n", 183 | "print(raw_img_ants)\n", 184 | "\n", 185 | "print('\\t\\tMASK IMG')\n", 186 | "print(mask_img_ants)" 187 | ] 188 | }, 189 | { 190 | "cell_type": "code", 191 | "execution_count": null, 192 | "metadata": {}, 193 | "outputs": [], 194 | "source": [ 195 | "explore_3D_array_with_mask_contour(raw_img_ants.numpy(),mask_img_ants.numpy())" 196 | ] 197 | } 198 | ], 199 | "metadata": { 200 | "interpreter": { 201 | "hash": "418825faa255fc22419b3421ba9be6bb08852f7738a4e2e9757a921549f74527" 202 | }, 203 | "kernelspec": { 204 | "display_name": "Python 3.7.1 64-bit ('.venv': venv)", 205 | "language": "python", 206 | "name": "python3" 207 | }, 208 | "language_info": { 209 | "codemirror_mode": { 210 | "name": "ipython", 211 | "version": 3 212 | }, 213 | "file_extension": ".py", 214 | "mimetype": "text/x-python", 215 | "name": "python", 216 | "nbconvert_exporter": "python", 217 | "pygments_lexer": "ipython3", 218 | "version": "3.7.1" 219 | }, 220 | "orig_nbformat": 4 221 | }, 222 | "nbformat": 4, 223 | "nbformat_minor": 2 224 | } 225 | -------------------------------------------------------------------------------- /notebooks/05_intensity_normalization.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "attachments": {}, 5 | "cell_type": "markdown", 6 | "metadata": {}, 7 | "source": [ 8 | "### Intensity Normalization" 9 | ] 10 | }, 11 | { 12 | "attachments": {}, 13 | "cell_type": "markdown", 14 | "metadata": {}, 15 | "source": [ 16 | "**Learning outcomes:**\n", 17 | "- How to apply intensity normalization by Histogram Matching." 
18 | ] 19 | }, 20 | { 21 | "cell_type": "code", 22 | "execution_count": null, 23 | "metadata": {}, 24 | "outputs": [], 25 | "source": [ 26 | "%matplotlib inline\n", 27 | "\n", 28 | "import os\n", 29 | "from helpers import *\n", 30 | "\n", 31 | "import ants\n", 32 | "import SimpleITK as sitk\n", 33 | "\n", 34 | "print(f'AntsPy version = {ants.__version__}')\n", 35 | "print(f'SimpleITK version = {sitk.__version__}')" 36 | ] 37 | }, 38 | { 39 | "cell_type": "code", 40 | "execution_count": null, 41 | "metadata": {}, 42 | "outputs": [], 43 | "source": [ 44 | "BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(\"__file__\")))\n", 45 | "print(f'project folder = {BASE_DIR}')" 46 | ] 47 | }, 48 | { 49 | "cell_type": "code", 50 | "execution_count": null, 51 | "metadata": {}, 52 | "outputs": [], 53 | "source": [ 54 | "raw_examples = [\n", 55 | " 'fsl-open-dev_sub-001_T1w.nii.gz',\n", 56 | " 'wash-120_sub-001_T1w.nii.gz',\n", 57 | " 'kf-panda_sub-01_ses-3T_T1w.nii.gz',\n", 58 | " 'listen-task_sub-UTS01_ses-1_T1w.nii.gz'\n", 59 | "]" 60 | ] 61 | }, 62 | { 63 | "cell_type": "markdown", 64 | "metadata": {}, 65 | "source": [ 66 | "### Simple ITK" 67 | ] 68 | }, 69 | { 70 | "cell_type": "markdown", 71 | "metadata": {}, 72 | "source": [ 73 | "#### Raw Image" 74 | ] 75 | }, 76 | { 77 | "cell_type": "code", 78 | "execution_count": null, 79 | "metadata": {}, 80 | "outputs": [], 81 | "source": [ 82 | "raw_example = raw_examples[0]\n", 83 | "raw_img_path = os.path.join(BASE_DIR, 'assets', 'raw_examples', raw_example)\n", 84 | "raw_img_sitk = sitk.ReadImage(raw_img_path, sitk.sitkFloat32)\n", 85 | "raw_img_sitk = sitk.DICOMOrient(raw_img_sitk,'RPS')\n", 86 | "\n", 87 | "print(f'shape = {sitk.GetArrayFromImage(raw_img_sitk).shape} -> (Z, X, Y)')\n", 88 | "explore_3D_array(\n", 89 | " arr=sitk.GetArrayFromImage(raw_img_sitk),\n", 90 | " cmap='viridis'\n", 91 | ")" 92 | ] 93 | }, 94 | { 95 | "cell_type": "code", 96 | "execution_count": null, 97 | "metadata": {}, 98 | "outputs": [], 99 | "source": [ 100 | "template_img_path = os.path.join(BASE_DIR, 'assets', 'templates', 'mni_icbm152_t1_tal_nlin_sym_09a.nii')\n", 101 | "\n", 102 | "template_img_sitk = sitk.ReadImage(template_img_path, sitk.sitkFloat32)\n", 103 | "template_img_sitk = sitk.DICOMOrient(template_img_sitk,'RPS')\n", 104 | "explore_3D_array(arr = sitk.GetArrayFromImage(template_img_sitk))" 105 | ] 106 | }, 107 | { 108 | "cell_type": "code", 109 | "execution_count": null, 110 | "metadata": {}, 111 | "outputs": [], 112 | "source": [ 113 | "transformed = sitk.HistogramMatching(raw_img_sitk, template_img_sitk)\n", 114 | "\n", 115 | "explore_3D_array_comparison(\n", 116 | " arr_before=sitk.GetArrayFromImage(raw_img_sitk),\n", 117 | " arr_after=sitk.GetArrayFromImage(transformed),\n", 118 | " cmap='gray'\n", 119 | ")" 120 | ] 121 | }, 122 | { 123 | "cell_type": "code", 124 | "execution_count": null, 125 | "metadata": {}, 126 | "outputs": [], 127 | "source": [ 128 | "stats = sitk.StatisticsImageFilter()\n", 129 | "\n", 130 | "stats.Execute(raw_img_sitk)\n", 131 | "print('\\tRaw img')\n", 132 | "print(\"min =\", stats.GetMinimum())\n", 133 | "print(\"max =\", stats.GetMaximum())\n", 134 | "print()\n", 135 | "\n", 136 | "\n", 137 | "stats.Execute(template_img_sitk)\n", 138 | "print('\\tTemplate img')\n", 139 | "print(\"min =\", stats.GetMinimum())\n", 140 | "print(\"max =\", stats.GetMaximum())\n", 141 | "print()\n", 142 | "\n", 143 | "stats.Execute(transformed)\n", 144 | "print('\\tTransformed img')\n", 145 | "print(\"min =\", stats.GetMinimum())\n", 146 | 
"print(\"max =\", stats.GetMaximum())\n", 147 | "\n" 148 | ] 149 | } 150 | ], 151 | "metadata": { 152 | "interpreter": { 153 | "hash": "418825faa255fc22419b3421ba9be6bb08852f7738a4e2e9757a921549f74527" 154 | }, 155 | "kernelspec": { 156 | "display_name": "Python 3.7.1 ('.venv': venv)", 157 | "language": "python", 158 | "name": "python3" 159 | }, 160 | "language_info": { 161 | "codemirror_mode": { 162 | "name": "ipython", 163 | "version": 3 164 | }, 165 | "file_extension": ".py", 166 | "mimetype": "text/x-python", 167 | "name": "python", 168 | "nbconvert_exporter": "python", 169 | "pygments_lexer": "ipython3", 170 | "version": "3.7.1 (default, Jul 27 2021, 18:42:28) \n[GCC 9.3.0]" 171 | }, 172 | "orig_nbformat": 4 173 | }, 174 | "nbformat": 4, 175 | "nbformat_minor": 2 176 | } 177 | -------------------------------------------------------------------------------- /notebooks/06_registration.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "attachments": {}, 5 | "cell_type": "markdown", 6 | "metadata": {}, 7 | "source": [ 8 | "### Registration" 9 | ] 10 | }, 11 | { 12 | "attachments": {}, 13 | "cell_type": "markdown", 14 | "metadata": {}, 15 | "source": [ 16 | "**Learning outcomes:**\n", 17 | "- How to apply registration to an image using ants." 18 | ] 19 | }, 20 | { 21 | "cell_type": "code", 22 | "execution_count": null, 23 | "metadata": {}, 24 | "outputs": [], 25 | "source": [ 26 | "%matplotlib inline\n", 27 | "\n", 28 | "import os\n", 29 | "from helpers import *\n", 30 | "\n", 31 | "import ants\n", 32 | "import SimpleITK as sitk\n", 33 | "\n", 34 | "print(f'AntsPy version = {ants.__version__}')\n", 35 | "print(f'SimpleITK version = {sitk.__version__}')" 36 | ] 37 | }, 38 | { 39 | "cell_type": "code", 40 | "execution_count": null, 41 | "metadata": {}, 42 | "outputs": [], 43 | "source": [ 44 | "BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(\"__file__\")))\n", 45 | "print(f'project folder = {BASE_DIR}')" 46 | ] 47 | }, 48 | { 49 | "cell_type": "code", 50 | "execution_count": null, 51 | "metadata": {}, 52 | "outputs": [], 53 | "source": [ 54 | "raw_examples = [\n", 55 | " 'fsl-open-dev_sub-001_T1w.nii.gz',\n", 56 | " 'wash-120_sub-001_T1w.nii.gz',\n", 57 | " 'kf-panda_sub-01_ses-3T_T1w.nii.gz',\n", 58 | " 'listen-task_sub-UTS01_ses-1_T1w.nii.gz'\n", 59 | "]" 60 | ] 61 | }, 62 | { 63 | "cell_type": "markdown", 64 | "metadata": {}, 65 | "source": [ 66 | "#### Raw image" 67 | ] 68 | }, 69 | { 70 | "cell_type": "code", 71 | "execution_count": null, 72 | "metadata": {}, 73 | "outputs": [], 74 | "source": [ 75 | "raw_example = raw_examples[0]\n", 76 | "raw_img_path = os.path.join(BASE_DIR, 'assets', 'raw_examples', raw_example)\n", 77 | "raw_img_ants = ants.image_read(raw_img_path, reorient='IAL')\n", 78 | "\n", 79 | "explore_3D_array(arr=raw_img_ants.numpy())" 80 | ] 81 | }, 82 | { 83 | "cell_type": "markdown", 84 | "metadata": {}, 85 | "source": [ 86 | "#### Template image" 87 | ] 88 | }, 89 | { 90 | "cell_type": "code", 91 | "execution_count": null, 92 | "metadata": {}, 93 | "outputs": [], 94 | "source": [ 95 | "template_img_path = os.path.join(BASE_DIR, 'assets', 'templates', 'mni_icbm152_t1_tal_nlin_sym_09a.nii')\n", 96 | "template_img_ants = ants.image_read(template_img_path, reorient='IAL')\n", 97 | "\n", 98 | "explore_3D_array(arr = template_img_ants.numpy())" 99 | ] 100 | }, 101 | { 102 | "cell_type": "code", 103 | "execution_count": null, 104 | "metadata": {}, 105 | "outputs": [], 106 | "source": [ 107 | 
"print('\\t\\tRAW IMG')\n", 108 | "print(raw_img_ants)\n", 109 | "\n", 110 | "print('\\t\\tTEMPLATE IMG')\n", 111 | "print(template_img_ants)\n" 112 | ] 113 | }, 114 | { 115 | "cell_type": "markdown", 116 | "metadata": {}, 117 | "source": [ 118 | "#### Registration" 119 | ] 120 | }, 121 | { 122 | "cell_type": "code", 123 | "execution_count": null, 124 | "metadata": {}, 125 | "outputs": [], 126 | "source": [ 127 | "transformation = ants.registration(\n", 128 | " fixed=template_img_ants,\n", 129 | " moving=raw_img_ants, \n", 130 | " type_of_transform='SyN',\n", 131 | " verbose=True\n", 132 | ")" 133 | ] 134 | }, 135 | { 136 | "cell_type": "code", 137 | "execution_count": null, 138 | "metadata": {}, 139 | "outputs": [], 140 | "source": [ 141 | "print(transformation)" 142 | ] 143 | }, 144 | { 145 | "cell_type": "code", 146 | "execution_count": null, 147 | "metadata": {}, 148 | "outputs": [], 149 | "source": [ 150 | "registered_img_ants = transformation['warpedmovout']\n", 151 | "\n", 152 | "explore_3D_array(arr=registered_img_ants.numpy())" 153 | ] 154 | }, 155 | { 156 | "cell_type": "code", 157 | "execution_count": null, 158 | "metadata": {}, 159 | "outputs": [], 160 | "source": [ 161 | "out_folder = os.path.join(BASE_DIR, 'assets', 'preprocessed')\n", 162 | "out_folder = os.path.join(out_folder, raw_example.split('.')[0]) # create folder with name of the raw file\n", 163 | "os.makedirs(out_folder, exist_ok=True) # create folder if not exists\n", 164 | "\n", 165 | "out_filename = add_suffix_to_filename(raw_example, suffix='registered')\n", 166 | "out_path = os.path.join(out_folder, out_filename)\n", 167 | "\n", 168 | "print(raw_img_path[len(BASE_DIR):])\n", 169 | "print(out_path[len(BASE_DIR):])" 170 | ] 171 | }, 172 | { 173 | "cell_type": "code", 174 | "execution_count": null, 175 | "metadata": {}, 176 | "outputs": [], 177 | "source": [ 178 | "registered_img_ants.to_file(out_path)" 179 | ] 180 | } 181 | ], 182 | "metadata": { 183 | "interpreter": { 184 | "hash": "418825faa255fc22419b3421ba9be6bb08852f7738a4e2e9757a921549f74527" 185 | }, 186 | "kernelspec": { 187 | "display_name": "Python 3.7.1 64-bit ('.venv': venv)", 188 | "language": "python", 189 | "name": "python3" 190 | }, 191 | "language_info": { 192 | "codemirror_mode": { 193 | "name": "ipython", 194 | "version": 3 195 | }, 196 | "file_extension": ".py", 197 | "mimetype": "text/x-python", 198 | "name": "python", 199 | "nbconvert_exporter": "python", 200 | "pygments_lexer": "ipython3", 201 | "version": "3.7.1 (default, Jul 27 2021, 18:42:28) \n[GCC 9.3.0]" 202 | }, 203 | "orig_nbformat": 4 204 | }, 205 | "nbformat": 4, 206 | "nbformat_minor": 2 207 | } 208 | -------------------------------------------------------------------------------- /notebooks/07_registration_and_masks.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "attachments": {}, 5 | "cell_type": "markdown", 6 | "metadata": {}, 7 | "source": [ 8 | "### Registration and masks" 9 | ] 10 | }, 11 | { 12 | "attachments": {}, 13 | "cell_type": "markdown", 14 | "metadata": {}, 15 | "source": [ 16 | "**Learning outcomes:**\n", 17 | "- How to apply registration to an image and its corresponding mask using ants." 
18 | ] 19 | }, 20 | { 21 | "cell_type": "code", 22 | "execution_count": null, 23 | "metadata": {}, 24 | "outputs": [], 25 | "source": [ 26 | "%matplotlib inline\n", 27 | "\n", 28 | "import os\n", 29 | "from helpers import *\n", 30 | "\n", 31 | "import ants\n", 32 | "import SimpleITK as sitk\n", 33 | "\n", 34 | "print(f'AntsPy version = {ants.__version__}')\n", 35 | "print(f'SimpleITK version = {sitk.__version__}')" 36 | ] 37 | }, 38 | { 39 | "cell_type": "code", 40 | "execution_count": null, 41 | "metadata": {}, 42 | "outputs": [], 43 | "source": [ 44 | "BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(\"__file__\")))\n", 45 | "print(f'project folder = {BASE_DIR}')" 46 | ] 47 | }, 48 | { 49 | "cell_type": "code", 50 | "execution_count": null, 51 | "metadata": {}, 52 | "outputs": [], 53 | "source": [ 54 | "raw_examples = [\n", 55 | " 'brain-lesion_T1w.nii.gz',\n", 56 | "]" 57 | ] 58 | }, 59 | { 60 | "cell_type": "markdown", 61 | "metadata": {}, 62 | "source": [ 63 | "#### Raw image" 64 | ] 65 | }, 66 | { 67 | "cell_type": "code", 68 | "execution_count": null, 69 | "metadata": {}, 70 | "outputs": [], 71 | "source": [ 72 | "raw_example = raw_examples[0]\n", 73 | "raw_img_path = os.path.join(BASE_DIR, 'assets', 'raw_examples', raw_example)\n", 74 | "raw_img_ants = ants.image_read(raw_img_path, reorient='IAL')\n", 75 | "\n", 76 | "explore_3D_array(arr=raw_img_ants.numpy())" 77 | ] 78 | }, 79 | { 80 | "cell_type": "code", 81 | "execution_count": null, 82 | "metadata": {}, 83 | "outputs": [], 84 | "source": [ 85 | "mask_example = add_suffix_to_filename(raw_example, suffix='mask')\n", 86 | "mask_img_path = os.path.join(BASE_DIR, 'assets', 'raw_examples', mask_example)\n", 87 | "mask_img_ants = ants.image_read(mask_img_path, reorient='IAL')\n", 88 | "\n", 89 | "explore_3D_array_with_mask_contour(\n", 90 | " arr=raw_img_ants.numpy(),\n", 91 | " mask=mask_img_ants.numpy()\n", 92 | ")" 93 | ] 94 | }, 95 | { 96 | "cell_type": "markdown", 97 | "metadata": {}, 98 | "source": [ 99 | "#### Template image" 100 | ] 101 | }, 102 | { 103 | "cell_type": "code", 104 | "execution_count": null, 105 | "metadata": {}, 106 | "outputs": [], 107 | "source": [ 108 | "template_img_path = os.path.join(BASE_DIR, 'assets', 'templates', 'mni_icbm152_t1_tal_nlin_sym_09a.nii')\n", 109 | "template_img_ants = ants.image_read(template_img_path, reorient='IAL')\n", 110 | "\n", 111 | "explore_3D_array(arr = template_img_ants.numpy())" 112 | ] 113 | }, 114 | { 115 | "cell_type": "code", 116 | "execution_count": null, 117 | "metadata": {}, 118 | "outputs": [], 119 | "source": [ 120 | "print('\\t\\tRAW IMG')\n", 121 | "print(raw_img_ants)\n", 122 | "\n", 123 | "print('\\t\\tTEMPLATE IMG')\n", 124 | "print(template_img_ants)\n" 125 | ] 126 | }, 127 | { 128 | "cell_type": "markdown", 129 | "metadata": {}, 130 | "source": [ 131 | "#### Registration" 132 | ] 133 | }, 134 | { 135 | "cell_type": "code", 136 | "execution_count": null, 137 | "metadata": {}, 138 | "outputs": [], 139 | "source": [ 140 | "transformation = ants.registration(\n", 141 | " fixed=template_img_ants,\n", 142 | " moving=raw_img_ants, \n", 143 | " type_of_transform='SyN',\n", 144 | " verbose=True\n", 145 | ")" 146 | ] 147 | }, 148 | { 149 | "cell_type": "code", 150 | "execution_count": null, 151 | "metadata": {}, 152 | "outputs": [], 153 | "source": [ 154 | "print(transformation)" 155 | ] 156 | }, 157 | { 158 | "cell_type": "code", 159 | "execution_count": null, 160 | "metadata": {}, 161 | "outputs": [], 162 | "source": [ 163 | "registered_img_ants = 
transformation['warpedmovout']\n", 164 | "\n", 165 | "explore_3D_array(arr=registered_img_ants.numpy())" 166 | ] 167 | }, 168 | { 169 | "cell_type": "code", 170 | "execution_count": null, 171 | "metadata": {}, 172 | "outputs": [], 173 | "source": [ 174 | "out_folder = os.path.join(BASE_DIR, 'assets', 'preprocessed')\n", 175 | "out_folder = os.path.join(out_folder, raw_example.split('.')[0]) # create folder with name of the raw file\n", 176 | "os.makedirs(out_folder, exist_ok=True) # create folder if not exists\n", 177 | "\n", 178 | "out_filename = add_suffix_to_filename(raw_example, suffix='registered')\n", 179 | "out_path = os.path.join(out_folder, out_filename)\n", 180 | "\n", 181 | "print(raw_img_path[len(BASE_DIR):])\n", 182 | "print(out_path[len(BASE_DIR):])" 183 | ] 184 | }, 185 | { 186 | "cell_type": "code", 187 | "execution_count": null, 188 | "metadata": {}, 189 | "outputs": [], 190 | "source": [ 191 | "registered_img_ants.to_file(out_path)" 192 | ] 193 | }, 194 | { 195 | "cell_type": "markdown", 196 | "metadata": {}, 197 | "source": [ 198 | "#### Move raw mask from native space." 199 | ] 200 | }, 201 | { 202 | "cell_type": "code", 203 | "execution_count": null, 204 | "metadata": {}, 205 | "outputs": [], 206 | "source": [ 207 | "registered_mask_img_ants = ants.apply_transforms(\n", 208 | " moving=mask_img_ants,\n", 209 | " fixed=transformation['warpedmovout'],\n", 210 | " transformlist=transformation['fwdtransforms'],\n", 211 | " verbose=True\n", 212 | ")" 213 | ] 214 | }, 215 | { 216 | "cell_type": "code", 217 | "execution_count": null, 218 | "metadata": {}, 219 | "outputs": [], 220 | "source": [ 221 | "explore_3D_array_with_mask_contour(\n", 222 | " arr=registered_img_ants.numpy(),\n", 223 | " mask=registered_mask_img_ants.numpy()\n", 224 | ")" 225 | ] 226 | }, 227 | { 228 | "cell_type": "code", 229 | "execution_count": null, 230 | "metadata": {}, 231 | "outputs": [], 232 | "source": [ 233 | "out_filename = add_suffix_to_filename(mask_example, suffix='registered')\n", 234 | "out_path = os.path.join(out_folder, out_filename)\n", 235 | "\n", 236 | "print(out_path[len(BASE_DIR):])" 237 | ] 238 | }, 239 | { 240 | "cell_type": "code", 241 | "execution_count": null, 242 | "metadata": {}, 243 | "outputs": [], 244 | "source": [ 245 | "registered_mask_img_ants.to_file(out_path)" 246 | ] 247 | } 248 | ], 249 | "metadata": { 250 | "interpreter": { 251 | "hash": "418825faa255fc22419b3421ba9be6bb08852f7738a4e2e9757a921549f74527" 252 | }, 253 | "kernelspec": { 254 | "display_name": "Python 3.7.1 64-bit ('.venv': venv)", 255 | "language": "python", 256 | "name": "python3" 257 | }, 258 | "language_info": { 259 | "codemirror_mode": { 260 | "name": "ipython", 261 | "version": 3 262 | }, 263 | "file_extension": ".py", 264 | "mimetype": "text/x-python", 265 | "name": "python", 266 | "nbconvert_exporter": "python", 267 | "pygments_lexer": "ipython3", 268 | "version": "3.7.1" 269 | }, 270 | "orig_nbformat": 4 271 | }, 272 | "nbformat": 4, 273 | "nbformat_minor": 2 274 | } 275 | -------------------------------------------------------------------------------- /notebooks/08_brain_extraction_with_antspynet.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "attachments": {}, 5 | "cell_type": "markdown", 6 | "metadata": {}, 7 | "source": [ 8 | "### Brain extraction(Skull stripping)" 9 | ] 10 | }, 11 | { 12 | "attachments": {}, 13 | "cell_type": "markdown", 14 | "metadata": {}, 15 | "source": [ 16 | "**Learning outcomes:**\n", 17 | "- How 
to do quick brain extraction using ants(antspynet module)" 18 | ] 19 | }, 20 | { 21 | "cell_type": "code", 22 | "execution_count": null, 23 | "metadata": {}, 24 | "outputs": [], 25 | "source": [ 26 | "%matplotlib inline\n", 27 | "\n", 28 | "import os\n", 29 | "from helpers import *\n", 30 | "\n", 31 | "import ants\n", 32 | "import SimpleITK as sitk\n", 33 | "\n", 34 | "print(f'AntsPy version = {ants.__version__}')\n", 35 | "print(f'SimpleITK version = {sitk.__version__}')" 36 | ] 37 | }, 38 | { 39 | "cell_type": "code", 40 | "execution_count": null, 41 | "metadata": {}, 42 | "outputs": [], 43 | "source": [ 44 | "BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(\"__file__\")))\n", 45 | "print(f'project folder = {BASE_DIR}')" 46 | ] 47 | }, 48 | { 49 | "cell_type": "code", 50 | "execution_count": 5, 51 | "metadata": {}, 52 | "outputs": [], 53 | "source": [ 54 | "raw_examples = [\n", 55 | " 'fsl-open-dev_sub-001_T1w.nii.gz',\n", 56 | " 'wash-120_sub-001_T1w.nii.gz',\n", 57 | " 'kf-panda_sub-01_ses-3T_T1w.nii.gz',\n", 58 | " 'listen-task_sub-UTS01_ses-1_T1w.nii.gz',\n", 59 | " 'brain-lesion_T1w.nii.gz'\n", 60 | "]" 61 | ] 62 | }, 63 | { 64 | "cell_type": "markdown", 65 | "metadata": {}, 66 | "source": [ 67 | "#### Raw Image" 68 | ] 69 | }, 70 | { 71 | "cell_type": "code", 72 | "execution_count": null, 73 | "metadata": {}, 74 | "outputs": [], 75 | "source": [ 76 | "raw_example = raw_examples[4]\n", 77 | "raw_img_path = os.path.join(BASE_DIR, 'assets', 'raw_examples', raw_example)\n", 78 | "raw_img_ants = ants.image_read(raw_img_path, reorient='IAL')\n", 79 | "\n", 80 | "print(f'shape = {raw_img_ants.numpy().shape} -> (Z, X, Y)')\n", 81 | "\n", 82 | "explore_3D_array(arr=raw_img_ants.numpy(), cmap='nipy_spectral')" 83 | ] 84 | }, 85 | { 86 | "cell_type": "markdown", 87 | "metadata": {}, 88 | "source": [ 89 | "#### Deep Learning based method" 90 | ] 91 | }, 92 | { 93 | "cell_type": "markdown", 94 | "metadata": {}, 95 | "source": [ 96 | "#### Load Model via AntsPyNet API and predict" 97 | ] 98 | }, 99 | { 100 | "cell_type": "code", 101 | "execution_count": null, 102 | "metadata": {}, 103 | "outputs": [], 104 | "source": [ 105 | "from antspynet.utilities import brain_extraction" 106 | ] 107 | }, 108 | { 109 | "cell_type": "code", 110 | "execution_count": null, 111 | "metadata": {}, 112 | "outputs": [], 113 | "source": [ 114 | "prob_brain_mask = brain_extraction(raw_img_ants,modality=\"t1\",verbose=True)" 115 | ] 116 | }, 117 | { 118 | "cell_type": "markdown", 119 | "metadata": {}, 120 | "source": [ 121 | "#### Inspect probabilities array" 122 | ] 123 | }, 124 | { 125 | "cell_type": "code", 126 | "execution_count": null, 127 | "metadata": {}, 128 | "outputs": [], 129 | "source": [ 130 | "print(prob_brain_mask)\n", 131 | "explore_3D_array(prob_brain_mask.numpy())" 132 | ] 133 | }, 134 | { 135 | "attachments": {}, 136 | "cell_type": "markdown", 137 | "metadata": {}, 138 | "source": [ 139 | "#### Generate final mask" 140 | ] 141 | }, 142 | { 143 | "cell_type": "code", 144 | "execution_count": null, 145 | "metadata": {}, 146 | "outputs": [], 147 | "source": [ 148 | "brain_mask = ants.get_mask(prob_brain_mask, low_thresh=0.5)" 149 | ] 150 | }, 151 | { 152 | "cell_type": "code", 153 | "execution_count": null, 154 | "metadata": {}, 155 | "outputs": [], 156 | "source": [ 157 | "explore_3D_array_with_mask_contour(raw_img_ants.numpy(), brain_mask.numpy())" 158 | ] 159 | }, 160 | { 161 | "cell_type": "code", 162 | "execution_count": null, 163 | "metadata": {}, 164 | "outputs": [], 165 | "source": [ 
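        "# The derived files for this example are collected under assets/preprocessed/<raw file name>/,\n",
        "# with a suffix (here 'brainMaskByDL') appended to the original filename.\n",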
166 | "out_folder = os.path.join(BASE_DIR, 'assets', 'preprocessed')\n", 167 | "out_folder = os.path.join(out_folder, raw_example.split('.')[0]) # create folder with name of the raw file\n", 168 | "os.makedirs(out_folder, exist_ok=True) # create folder if not exists\n", 169 | "\n", 170 | "out_filename = add_suffix_to_filename(raw_example, suffix='brainMaskByDL')\n", 171 | "out_path = os.path.join(out_folder, out_filename)\n", 172 | "\n", 173 | "print(raw_img_path[len(BASE_DIR):])\n", 174 | "print(out_path[len(BASE_DIR):])" 175 | ] 176 | }, 177 | { 178 | "cell_type": "code", 179 | "execution_count": null, 180 | "metadata": {}, 181 | "outputs": [], 182 | "source": [ 183 | "brain_mask.to_file(out_path)" 184 | ] 185 | }, 186 | { 187 | "attachments": {}, 188 | "cell_type": "markdown", 189 | "metadata": {}, 190 | "source": [ 191 | "#### Generate brain masked" 192 | ] 193 | }, 194 | { 195 | "cell_type": "code", 196 | "execution_count": null, 197 | "metadata": {}, 198 | "outputs": [], 199 | "source": [ 200 | "masked = ants.mask_image(raw_img_ants, brain_mask)\n", 201 | "\n", 202 | "explore_3D_array(masked.numpy())" 203 | ] 204 | }, 205 | { 206 | "cell_type": "code", 207 | "execution_count": null, 208 | "metadata": {}, 209 | "outputs": [], 210 | "source": [ 211 | "out_filename = add_suffix_to_filename(raw_example, suffix='brainMaskedByDL')\n", 212 | "out_path = os.path.join(out_folder, out_filename)\n", 213 | "\n", 214 | "print(raw_img_path[len(BASE_DIR):])\n", 215 | "print(out_path[len(BASE_DIR):])" 216 | ] 217 | }, 218 | { 219 | "cell_type": "code", 220 | "execution_count": null, 221 | "metadata": {}, 222 | "outputs": [], 223 | "source": [ 224 | "masked.to_file(out_path)" 225 | ] 226 | } 227 | ], 228 | "metadata": { 229 | "interpreter": { 230 | "hash": "418825faa255fc22419b3421ba9be6bb08852f7738a4e2e9757a921549f74527" 231 | }, 232 | "kernelspec": { 233 | "display_name": "Python 3 (ipykernel)", 234 | "language": "python", 235 | "name": "python3" 236 | }, 237 | "language_info": { 238 | "codemirror_mode": { 239 | "name": "ipython", 240 | "version": 3 241 | }, 242 | "file_extension": ".py", 243 | "mimetype": "text/x-python", 244 | "name": "python", 245 | "nbconvert_exporter": "python", 246 | "pygments_lexer": "ipython3", 247 | "version": "3.12.8" 248 | } 249 | }, 250 | "nbformat": 4, 251 | "nbformat_minor": 4 252 | } 253 | -------------------------------------------------------------------------------- /notebooks/09_brain_extraction_with_template.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "attachments": {}, 5 | "cell_type": "markdown", 6 | "metadata": {}, 7 | "source": [ 8 | "### Brain extraction(Skull stripping)" 9 | ] 10 | }, 11 | { 12 | "attachments": {}, 13 | "cell_type": "markdown", 14 | "metadata": {}, 15 | "source": [ 16 | "**Learning outcomes:**\n", 17 | "- How to do brain extraction using registration." 
18 | ] 19 | }, 20 | { 21 | "cell_type": "code", 22 | "execution_count": null, 23 | "metadata": {}, 24 | "outputs": [], 25 | "source": [ 26 | "%matplotlib inline\n", 27 | "\n", 28 | "import os\n", 29 | "from helpers import *\n", 30 | "\n", 31 | "import ants\n", 32 | "import SimpleITK as sitk\n", 33 | "\n", 34 | "print(f'AntsPy version = {ants.__version__}')\n", 35 | "print(f'SimpleITK version = {sitk.__version__}')" 36 | ] 37 | }, 38 | { 39 | "cell_type": "code", 40 | "execution_count": null, 41 | "metadata": {}, 42 | "outputs": [], 43 | "source": [ 44 | "BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(\"__file__\")))\n", 45 | "print(f'project folder = {BASE_DIR}')" 46 | ] 47 | }, 48 | { 49 | "cell_type": "code", 50 | "execution_count": null, 51 | "metadata": {}, 52 | "outputs": [], 53 | "source": [ 54 | "raw_examples = [\n", 55 | " 'fsl-open-dev_sub-001_T1w.nii.gz',\n", 56 | " 'wash-120_sub-001_T1w.nii.gz',\n", 57 | " 'kf-panda_sub-01_ses-3T_T1w.nii.gz',\n", 58 | " 'listen-task_sub-UTS01_ses-1_T1w.nii.gz',\n", 59 | "]" 60 | ] 61 | }, 62 | { 63 | "cell_type": "markdown", 64 | "metadata": {}, 65 | "source": [ 66 | "#### Raw Image" 67 | ] 68 | }, 69 | { 70 | "cell_type": "code", 71 | "execution_count": null, 72 | "metadata": {}, 73 | "outputs": [], 74 | "source": [ 75 | "raw_example = raw_examples[0]\n", 76 | "raw_img_path = os.path.join(BASE_DIR, 'assets', 'raw_examples', raw_example)\n", 77 | "raw_img_ants = ants.image_read(raw_img_path, reorient='IAL')\n", 78 | "\n", 79 | "explore_3D_array(arr=raw_img_ants.numpy(), cmap='nipy_spectral')" 80 | ] 81 | }, 82 | { 83 | "cell_type": "markdown", 84 | "metadata": {}, 85 | "source": [ 86 | "### Template based method (Native space)" 87 | ] 88 | }, 89 | { 90 | "cell_type": "markdown", 91 | "metadata": {}, 92 | "source": [ 93 | "#### Template Image" 94 | ] 95 | }, 96 | { 97 | "cell_type": "code", 98 | "execution_count": null, 99 | "metadata": {}, 100 | "outputs": [], 101 | "source": [ 102 | "template_img_path = os.path.join(BASE_DIR, 'assets', 'templates', 'mni_icbm152_t1_tal_nlin_sym_09a.nii')\n", 103 | "template_img_ants = ants.image_read(template_img_path, reorient='IAL')\n", 104 | "\n", 105 | "explore_3D_array(arr = template_img_ants.numpy())" 106 | ] 107 | }, 108 | { 109 | "cell_type": "markdown", 110 | "metadata": {}, 111 | "source": [ 112 | "#### Brain Mask of the template" 113 | ] 114 | }, 115 | { 116 | "cell_type": "code", 117 | "execution_count": null, 118 | "metadata": {}, 119 | "outputs": [], 120 | "source": [ 121 | "mask_template_img_path = os.path.join(BASE_DIR, 'assets', 'templates', 'mni_icbm152_t1_tal_nlin_sym_09a_mask.nii')\n", 122 | "mask_template_img_ants = ants.image_read(mask_template_img_path, reorient='IAL')\n", 123 | "\n", 124 | "explore_3D_array(mask_template_img_ants.numpy())" 125 | ] 126 | }, 127 | { 128 | "cell_type": "code", 129 | "execution_count": null, 130 | "metadata": {}, 131 | "outputs": [], 132 | "source": [ 133 | "np.unique(mask_template_img_ants.numpy())" 134 | ] 135 | }, 136 | { 137 | "cell_type": "markdown", 138 | "metadata": {}, 139 | "source": [ 140 | "#### Register template to raw image" 141 | ] 142 | }, 143 | { 144 | "cell_type": "code", 145 | "execution_count": null, 146 | "metadata": {}, 147 | "outputs": [], 148 | "source": [ 149 | "transformation = ants.registration(\n", 150 | " fixed=raw_img_ants,\n", 151 | " moving=template_img_ants, \n", 152 | " type_of_transform='SyN',\n", 153 | " verbose=True\n", 154 | ")" 155 | ] 156 | }, 157 | { 158 | "cell_type": "code", 159 | "execution_count": null, 
160 | "metadata": {}, 161 | "outputs": [], 162 | "source": [ 163 | "print(transformation)" 164 | ] 165 | }, 166 | { 167 | "cell_type": "code", 168 | "execution_count": null, 169 | "metadata": {}, 170 | "outputs": [], 171 | "source": [ 172 | "registered_img_ants = transformation['warpedmovout']\n", 173 | "\n", 174 | "explore_3D_array_comparison(\n", 175 | " arr_before=raw_img_ants.numpy(), \n", 176 | " arr_after=registered_img_ants.numpy()\n", 177 | ")" 178 | ] 179 | }, 180 | { 181 | "cell_type": "markdown", 182 | "metadata": {}, 183 | "source": [ 184 | "#### Apply the generated transformations to the mask of template" 185 | ] 186 | }, 187 | { 188 | "cell_type": "code", 189 | "execution_count": null, 190 | "metadata": {}, 191 | "outputs": [], 192 | "source": [ 193 | "brain_mask = ants.apply_transforms(\n", 194 | " fixed=transformation['warpedmovout'],\n", 195 | " moving=mask_template_img_ants,\n", 196 | " transformlist=transformation['fwdtransforms'],\n", 197 | " interpolator='nearestNeighbor',\n", 198 | " verbose=True\n", 199 | ")" 200 | ] 201 | }, 202 | { 203 | "cell_type": "code", 204 | "execution_count": null, 205 | "metadata": {}, 206 | "outputs": [], 207 | "source": [ 208 | "explore_3D_array(brain_mask.numpy())" 209 | ] 210 | }, 211 | { 212 | "cell_type": "code", 213 | "execution_count": null, 214 | "metadata": {}, 215 | "outputs": [], 216 | "source": [ 217 | "explore_3D_array_with_mask_contour(raw_img_ants.numpy(), brain_mask.numpy())" 218 | ] 219 | }, 220 | { 221 | "cell_type": "code", 222 | "execution_count": null, 223 | "metadata": {}, 224 | "outputs": [], 225 | "source": [ 226 | "brain_mask_dilated = ants.morphology(brain_mask, radius=4, operation='dilate', mtype='binary')\n", 227 | "\n", 228 | "explore_3D_array_with_mask_contour(raw_img_ants.numpy(), brain_mask_dilated.numpy())" 229 | ] 230 | }, 231 | { 232 | "cell_type": "markdown", 233 | "metadata": {}, 234 | "source": [ 235 | "#### Save brain mask" 236 | ] 237 | }, 238 | { 239 | "cell_type": "code", 240 | "execution_count": null, 241 | "metadata": {}, 242 | "outputs": [], 243 | "source": [ 244 | "out_folder = os.path.join(BASE_DIR, 'assets', 'preprocessed')\n", 245 | "out_folder = os.path.join(out_folder, raw_example.split('.')[0]) # create folder with name of the raw file\n", 246 | "os.makedirs(out_folder, exist_ok=True) # create folder if not exists\n", 247 | "\n", 248 | "out_filename = add_suffix_to_filename(raw_example, suffix='brainMaskByTemplate')\n", 249 | "out_path = os.path.join(out_folder, out_filename)\n", 250 | "\n", 251 | "print(raw_img_path[len(BASE_DIR):])\n", 252 | "print(out_path[len(BASE_DIR):])" 253 | ] 254 | }, 255 | { 256 | "cell_type": "code", 257 | "execution_count": null, 258 | "metadata": {}, 259 | "outputs": [], 260 | "source": [ 261 | "brain_mask_dilated.to_file(out_path)" 262 | ] 263 | }, 264 | { 265 | "attachments": {}, 266 | "cell_type": "markdown", 267 | "metadata": {}, 268 | "source": [ 269 | "#### Generate brain masked" 270 | ] 271 | }, 272 | { 273 | "cell_type": "code", 274 | "execution_count": null, 275 | "metadata": {}, 276 | "outputs": [], 277 | "source": [ 278 | "masked = ants.mask_image(raw_img_ants, brain_mask_dilated)\n", 279 | "\n", 280 | "explore_3D_array(masked.numpy())" 281 | ] 282 | }, 283 | { 284 | "cell_type": "code", 285 | "execution_count": null, 286 | "metadata": {}, 287 | "outputs": [], 288 | "source": [ 289 | "out_filename = add_suffix_to_filename(raw_example, suffix='brainMaskedByTemplate')\n", 290 | "out_path = os.path.join(out_folder, out_filename)\n", 291 | "\n", 292 | 
"print(raw_img_path[len(BASE_DIR):])\n", 293 | "print(out_path[len(BASE_DIR):])" 294 | ] 295 | }, 296 | { 297 | "cell_type": "code", 298 | "execution_count": null, 299 | "metadata": {}, 300 | "outputs": [], 301 | "source": [ 302 | "masked.to_file(out_path)" 303 | ] 304 | } 305 | ], 306 | "metadata": { 307 | "interpreter": { 308 | "hash": "418825faa255fc22419b3421ba9be6bb08852f7738a4e2e9757a921549f74527" 309 | }, 310 | "kernelspec": { 311 | "display_name": "Python 3.7.1 ('.venv': venv)", 312 | "language": "python", 313 | "name": "python3" 314 | }, 315 | "language_info": { 316 | "codemirror_mode": { 317 | "name": "ipython", 318 | "version": 3 319 | }, 320 | "file_extension": ".py", 321 | "mimetype": "text/x-python", 322 | "name": "python", 323 | "nbconvert_exporter": "python", 324 | "pygments_lexer": "ipython3", 325 | "version": "3.7.1" 326 | }, 327 | "orig_nbformat": 4 328 | }, 329 | "nbformat": 4, 330 | "nbformat_minor": 2 331 | } 332 | -------------------------------------------------------------------------------- /notebooks/helpers.py: -------------------------------------------------------------------------------- 1 | import matplotlib.pyplot as plt 2 | 3 | from ipywidgets import interact 4 | import numpy as np 5 | import SimpleITK as sitk 6 | import cv2 7 | 8 | def explore_3D_array(arr: np.ndarray, cmap: str = 'gray'): 9 | """ 10 | Given a 3D array with shape (Z,X,Y) This function will create an interactive 11 | widget to check out all the 2D arrays with shape (X,Y) inside the 3D array. 12 | The purpose of this function to visual inspect the 2D arrays in the image. 13 | 14 | Args: 15 | arr : 3D array with shape (Z,X,Y) that represents the volume of a MRI image 16 | cmap : Which color map use to plot the slices in matplotlib.pyplot 17 | """ 18 | 19 | def fn(SLICE): 20 | plt.figure(figsize=(7,7)) 21 | plt.imshow(arr[SLICE, :, :], cmap=cmap) 22 | plt.show() 23 | 24 | interact(fn, SLICE=(0, arr.shape[0]-1)) 25 | 26 | 27 | def explore_3D_array_comparison(arr_before: np.ndarray, arr_after: np.ndarray, cmap: str = 'gray'): 28 | """ 29 | Given two 3D arrays with shape (Z,X,Y) This function will create an interactive 30 | widget to check out all the 2D arrays with shape (X,Y) inside the 3D arrays. 31 | The purpose of this function to visual compare the 2D arrays after some transformation. 32 | 33 | Args: 34 | arr_before : 3D array with shape (Z,X,Y) that represents the volume of a MRI image, before any transform 35 | arr_after : 3D array with shape (Z,X,Y) that represents the volume of a MRI image, after some transform 36 | cmap : Which color map use to plot the slices in matplotlib.pyplot 37 | """ 38 | 39 | assert arr_after.shape == arr_before.shape 40 | 41 | def fn(SLICE): 42 | fig, (ax1, ax2) = plt.subplots(1, 2, sharex='col', sharey='row', figsize=(10,10)) 43 | 44 | ax1.set_title('Before', fontsize=15) 45 | ax1.imshow(arr_before[SLICE, :, :], cmap=cmap) 46 | 47 | ax2.set_title('After', fontsize=15) 48 | ax2.imshow(arr_after[SLICE, :, :], cmap=cmap) 49 | 50 | plt.tight_layout() 51 | plt.show() 52 | 53 | interact(fn, SLICE=(0, arr_before.shape[0]-1)) 54 | 55 | 56 | def show_sitk_img_info(img: sitk.Image): 57 | """ 58 | Given a sitk.Image instance prints the information about the MRI image contained. 
59 | 60 | Args: 61 | img : instance of the sitk.Image to check out 62 | """ 63 | pixel_type = img.GetPixelIDTypeAsString() 64 | origin = img.GetOrigin() 65 | dimensions = img.GetSize() 66 | spacing = img.GetSpacing() 67 | direction = img.GetDirection() 68 | 69 | info = {'Pixel Type' : pixel_type, 'Dimensions': dimensions, 'Spacing': spacing, 'Origin': origin, 'Direction' : direction} 70 | for k,v in info.items(): 71 | print(f' {k} : {v}') 72 | 73 | 74 | def add_suffix_to_filename(filename: str, suffix:str) -> str: 75 | """ 76 | Takes a NIfTI filename and appends a suffix. 77 | 78 | Args: 79 | filename : NIfTI filename 80 | suffix : suffix to append 81 | 82 | Returns: 83 | str : filename after append the suffix 84 | """ 85 | if filename.endswith('.nii'): 86 | result = filename.replace('.nii', f'_{suffix}.nii') 87 | return result 88 | elif filename.endswith('.nii.gz'): 89 | result = filename.replace('.nii.gz', f'_{suffix}.nii.gz') 90 | return result 91 | else: 92 | raise RuntimeError('filename with unknown extension') 93 | 94 | 95 | def rescale_linear(array: np.ndarray, new_min: int, new_max: int): 96 | """Rescale an array linearly.""" 97 | minimum, maximum = np.min(array), np.max(array) 98 | m = (new_max - new_min) / (maximum - minimum) 99 | b = new_min - m * minimum 100 | return m * array + b 101 | 102 | 103 | def explore_3D_array_with_mask_contour(arr: np.ndarray, mask: np.ndarray, thickness: int = 1): 104 | """ 105 | Given a 3D array with shape (Z,X,Y) This function will create an interactive 106 | widget to check out all the 2D arrays with shape (X,Y) inside the 3D array. The binary 107 | mask provided will be used to overlay contours of the region of interest over the 108 | array. The purpose of this function is to visual inspect the region delimited by the mask. 109 | 110 | Args: 111 | arr : 3D array with shape (Z,X,Y) that represents the volume of a MRI image 112 | mask : binary mask to obtain the region of interest 113 | """ 114 | assert arr.shape == mask.shape 115 | 116 | _arr = rescale_linear(arr,0,1) 117 | _mask = rescale_linear(mask,0,1) 118 | _mask = _mask.astype(np.uint8) 119 | 120 | def fn(SLICE): 121 | arr_rgb = cv2.cvtColor(_arr[SLICE, :, :], cv2.COLOR_GRAY2RGB) 122 | contours, _ = cv2.findContours(_mask[SLICE, :, :], cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE) 123 | 124 | arr_with_contours = cv2.drawContours(arr_rgb, contours, -1, (0,1,0), thickness) 125 | 126 | plt.figure(figsize=(7,7)) 127 | plt.imshow(arr_with_contours) 128 | plt.show() 129 | 130 | interact(fn, SLICE=(0, arr.shape[0]-1)) 131 | -------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- 1 | uv 2 | ipykernel 3 | ipywidgets 4 | matplotlib 5 | opencv-python-headless 6 | jupyter 7 | notebook 8 | SimpleITK==2.4.0 9 | antspyx==0.5.4 10 | antspynet 11 | --------------------------------------------------------------------------------