├── .github
│   └── workflows
│       └── test-tutorials.yml
├── .gitignore
├── 01-BasicConcepts.ipynb
├── 02-DataFeatures.ipynb
├── 03-ProbabilisticAssignment.ipynb
├── 04-DifferentialGeneExpressions.ipynb
├── 05-CombinedWorkflow.ipynb
├── LICENSE
├── README.md
├── requirements.txt
├── runtime.txt
└── workshops
    ├── BigBrainWorkshop2023.ipynb
    ├── HIBALL-winterschool-2023.ipynb
    ├── imb9345-2024.ipynb
    ├── ohbm-2023-example-input.nii.gz
    └── ohbm-2023-example.ipynb
/runtime.txt: -------------------------------------------------------------------------------- 1 | python-3.8 2 | -------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- 1 | nibabel>=4.0.0 2 | siibra>=1.0.1-alpha.2 3 | siibra-jugex==1.22 4 | matplotlib 5 | nilearn 6 | -------------------------------------------------------------------------------- /workshops/ohbm-2023-example-input.nii.gz: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/FZJ-INM1-BDA/siibra-tutorials/HEAD/workshops/ohbm-2023-example-input.nii.gz -------------------------------------------------------------------------------- /.github/workflows/test-tutorials.yml: -------------------------------------------------------------------------------- 1 | name: '[Test] Test siibra-tutorials' 2 | 3 | on: 4 | push: 5 | branches: ["**"] 6 | workflow_dispatch: 7 | inputs: 8 | siibra_python_branch: 9 | description: 'siibra-python branch to use' 10 | required: false 11 | default: 'main' 12 | type: string 13 | 14 | 15 | jobs: 16 | test_tutorials: 17 | runs-on: ubuntu-latest 18 | steps: 19 | - uses: actions/checkout@v4 20 | - name: Set up Python 3.10 21 | uses: actions/setup-python@v4 22 | with: 23 | python-version: '3.10' 24 | 25 | - name: Set siibra-python branch 26 | id: set_branch 27 | run: | 28 | INPUT="${{ github.event.inputs.siibra_python_branch }}" 29 | echo "value=${INPUT:-main}" >> "$GITHUB_OUTPUT" 30 | 31 | - name: Install dependencies 32 | run: | 33 | python -m pip install --upgrade pip 34 | pip install git+https://github.com/FZJ-INM1-BDA/siibra-python.git@${{ steps.set_branch.outputs.value }} 35 | pip install siibra_jugex 36 | pip install matplotlib 37 | pip install pytest nbmake 38 | 39 | - name: Test tutorial notebooks with nbmake 40 | run: pytest --nbmake --nbmake-timeout=1200 ./ 41 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | # Byte-compiled / optimized / DLL files 2 | __pycache__/ 3 | *.py[cod] 4 | *$py.class 5 | 6 | # C extensions 7 | *.so 8 | 9 | # Distribution / packaging 10 | .Python 11 | build/ 12 | develop-eggs/ 13 | dist/ 14 | downloads/ 15 | eggs/ 16 | .eggs/ 17 | lib/ 18 | lib64/ 19 | parts/ 20 | sdist/ 21 | var/ 22 | wheels/ 23 | pip-wheel-metadata/ 24 | share/python-wheels/ 25 | *.egg-info/ 26 | .installed.cfg 27 | *.egg 28 | MANIFEST 29 | 30 | # PyInstaller 31 | # Usually these files are written by a python script from a template 32 | # before PyInstaller builds the exe, so as to inject date/other infos into it. 
33 | *.manifest 34 | *.spec 35 | 36 | # Installer logs 37 | pip-log.txt 38 | pip-delete-this-directory.txt 39 | 40 | # Unit test / coverage reports 41 | htmlcov/ 42 | .tox/ 43 | .nox/ 44 | .coverage 45 | .coverage.* 46 | .cache 47 | nosetests.xml 48 | coverage.xml 49 | *.cover 50 | *.py,cover 51 | .hypothesis/ 52 | .pytest_cache/ 53 | 54 | # Translations 55 | *.mo 56 | *.pot 57 | 58 | # Django stuff: 59 | *.log 60 | local_settings.py 61 | db.sqlite3 62 | db.sqlite3-journal 63 | 64 | # Flask stuff: 65 | instance/ 66 | .webassets-cache 67 | 68 | # Scrapy stuff: 69 | .scrapy 70 | 71 | # Sphinx documentation 72 | docs/_build/ 73 | 74 | # PyBuilder 75 | target/ 76 | 77 | # Jupyter Notebook 78 | .ipynb_checkpoints 79 | 80 | # IPython 81 | profile_default/ 82 | ipython_config.py 83 | 84 | # pyenv 85 | .python-version 86 | 87 | # pipenv 88 | # According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control. 89 | # However, in case of collaboration, if having platform-specific dependencies or dependencies 90 | # having no cross-platform support, pipenv may install dependencies that don't work, or not 91 | # install all needed dependencies. 92 | #Pipfile.lock 93 | 94 | # PEP 582; used by e.g. github.com/David-OConnor/pyflow 95 | __pypackages__/ 96 | 97 | # Celery stuff 98 | celerybeat-schedule 99 | celerybeat.pid 100 | 101 | # SageMath parsed files 102 | *.sage.py 103 | 104 | # Environments 105 | .env 106 | .venv 107 | env/ 108 | venv/ 109 | ENV/ 110 | env.bak/ 111 | venv.bak/ 112 | 113 | # Spyder project settings 114 | .spyderproject 115 | .spyproject 116 | 117 | # Rope project settings 118 | .ropeproject 119 | 120 | # mkdocs documentation 121 | /site 122 | 123 | # mypy 124 | .mypy_cache/ 125 | .dmypy.json 126 | dmypy.json 127 | 128 | # Pyre type checker 129 | .pyre/ 130 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | [License: Apache 2.0](https://opensource.org/licenses/Apache-2.0) 2 | 3 | # siibra-tutorials 4 | 5 | Some tutorial notebooks for programmatic use of the [EBRAINS human brain atlas](https://ebrains.eu/service/human-brain-atlas) with [siibra-python](https://github.com/FZJ-INM1-BDA/siibra-python). 6 | Since siibra-python is under active development, these tutorials are regularly updated and typically point to specific development snapshots of the library. 
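To run the tutorials locally, install the pinned dependencies and open a notebook (a minimal sketch; it assumes a local Python environment in which Jupyter is available):

```sh
pip install -r requirements.txt
jupyter notebook 01-BasicConcepts.ipynb
```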
 7 | 8 | Tutorial | About | Link 9 | :---: | --- | --- 10 | 1 | Basic Concepts of reference spaces, brain parcellations, maps and regions | [Binder](https://mybinder.org/v2/gh/FZJ-INM1-BDA/siibra-tutorials/HEAD?filepath=01-BasicConcepts.ipynb) [lab.ebrains.eu](https://lab.ebrains.eu/hub/user-redirect/git-pull?repo=https%3A%2F%2Fgithub.com%2FFZJ-INM1-BDA%2Fsiibra-tutorials.git&urlpath=tree%2Fsiibra-tutorials.git%2F01-BasicConcepts.ipynb&branch=main) 11 | 2 | Retrieving multimodal data features for brain regions | [Binder](https://mybinder.org/v2/gh/FZJ-INM1-BDA/siibra-tutorials/HEAD?filepath=02-DataFeatures.ipynb) [lab.ebrains.eu](https://lab.ebrains.eu/hub/user-redirect/git-pull?repo=https%3A%2F%2Fgithub.com%2FFZJ-INM1-BDA%2Fsiibra-tutorials.git&urlpath=tree%2Fsiibra-tutorials.git%2F02-DataFeatures.ipynb&branch=main) 12 | 3 | Probabilistic assignment of regions to coordinates | [Binder](https://mybinder.org/v2/gh/FZJ-INM1-BDA/siibra-tutorials/HEAD?filepath=03-ProbabilisticAssignment.ipynb) [lab.ebrains.eu](https://lab.ebrains.eu/hub/user-redirect/git-pull?repo=https%3A%2F%2Fgithub.com%2FFZJ-INM1-BDA%2Fsiibra-tutorials.git&urlpath=tree%2Fsiibra-tutorials.git%2F03-ProbabilisticAssignment.ipynb&branch=main) 13 | 4 | Differential Gene Expression Analysis | [Binder](https://mybinder.org/v2/gh/FZJ-INM1-BDA/siibra-tutorials/HEAD?filepath=04-DifferentialGeneExpressions.ipynb) [lab.ebrains.eu](https://lab.ebrains.eu/hub/user-redirect/git-pull?repo=https%3A%2F%2Fgithub.com%2FFZJ-INM1-BDA%2Fsiibra-tutorials.git&urlpath=tree%2Fsiibra-tutorials.git%2F04-DifferentialGeneExpressions.ipynb&branch=main) 14 | 5 | Combined Workflow | [Binder](https://mybinder.org/v2/gh/FZJ-INM1-BDA/siibra-tutorials/HEAD?filepath=05-CombinedWorkflow.ipynb) [lab.ebrains.eu](https://lab.ebrains.eu/hub/user-redirect/git-pull?repo=https%3A%2F%2Fgithub.com%2FFZJ-INM1-BDA%2Fsiibra-tutorials.git&urlpath=tree%2Fsiibra-tutorials.git%2F05-CombinedWorkflow.ipynb&branch=main) 15 | Workshop | OHBM 2023 | [Binder](https://mybinder.org/v2/gh/FZJ-INM1-BDA/siibra-tutorials/HEAD?filepath=workshops/ohbm-2023-example.ipynb) [lab.ebrains.eu](https://lab.ebrains.eu/hub/user-redirect/git-pull?repo=https%3A%2F%2Fgithub.com%2FFZJ-INM1-BDA%2Fsiibra-tutorials.git&urlpath=tree%2Fsiibra-tutorials.git%2Fworkshops/ohbm-2023-example.ipynb&branch=main) 16 | Workshop | HIBALL Winterschool 2023 | [Binder](https://mybinder.org/v2/gh/FZJ-INM1-BDA/siibra-tutorials/HEAD?filepath=workshops/HIBALL-winterschool-2023.ipynb) [lab.ebrains.eu](https://lab.ebrains.eu/hub/user-redirect/git-pull?repo=https%3A%2F%2Fgithub.com%2FFZJ-INM1-BDA%2Fsiibra-tutorials.git&urlpath=tree%2Fsiibra-tutorials.git%2Fworkshops/HIBALL-winterschool-2023.ipynb&branch=main) 17 | Workshop | BigBrain Workshop 2023 | [Binder](https://mybinder.org/v2/gh/FZJ-INM1-BDA/siibra-tutorials/HEAD?filepath=workshops/BigBrainWorkshop2023.ipynb) [lab.ebrains.eu](https://lab.ebrains.eu/hub/user-redirect/git-pull?repo=https%3A%2F%2Fgithub.com%2FFZJ-INM1-BDA%2Fsiibra-tutorials.git&urlpath=tree%2Fsiibra-tutorials.git%2Fworkshops/BigBrainWorkshop2023.ipynb&branch=main) 18 | Workshop | IMB9345 atlas course 2024 | [Binder](https://mybinder.org/v2/gh/FZJ-INM1-BDA/siibra-tutorials/HEAD?filepath=workshops/imb9345-2024.ipynb) [lab.ebrains.eu](https://lab.ebrains.eu/hub/user-redirect/git-pull?repo=https%3A%2F%2Fgithub.com%2FFZJ-INM1-BDA%2Fsiibra-tutorials.git&urlpath=tree%2Fsiibra-tutorials.git%2Fworkshops/imb9345-2024.ipynb&branch=main) 19 | 
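All notebooks can be executed end-to-end with `nbmake`, mirroring this repository's CI workflow (see `.github/workflows/test-tutorials.yml`):

```sh
pip install pytest nbmake
pytest --nbmake --nbmake-timeout=1200 ./
```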
-------------------------------------------------------------------------------- /05-CombinedWorkflow.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "# install siibra\n", 10 | "!pip install -r requirements.txt --user" 11 | ] 12 | }, 13 | { 14 | "cell_type": "markdown", 15 | "metadata": {}, 16 | "source": [ 17 | "# Using siibra-python to characterize motor and language areas in the Julich-Brain cytoarchitectonic atlas \n", 18 | "\n", 19 | "\n", 20 | "### 1. Import the package \n", 21 | "\n", 22 | "We start by importing the `siibra` package." 23 | ] 24 | }, 25 | { 26 | "cell_type": "code", 27 | "execution_count": null, 28 | "metadata": {}, 29 | "outputs": [], 30 | "source": [ 31 | "import siibra\n", 32 | "from packaging.version import Version\n", 33 | "assert Version(siibra.__version__) >= Version('1.0.1-alpha.2')\n", 34 | "\n", 35 | "import matplotlib.pyplot as plt\n", 36 | "%matplotlib notebook" 37 | ] 38 | }, 39 | { 40 | "cell_type": "markdown", 41 | "metadata": {}, 42 | "source": [ 43 | "### 2. Select two regions from Julich-Brain" 44 | ] 45 | }, 46 | { 47 | "cell_type": "code", 48 | "execution_count": null, 49 | "metadata": {}, 50 | "outputs": [], 51 | "source": [ 52 | "# access the most recent version of Julich-Brain\n", 53 | "julichbrain = siibra.parcellations.get('julich')\n", 54 | "\n", 55 | "# Get two region objects from Julich-Brain\n", 56 | "regions = [\n", 57 | " julichbrain.get_region('4p left'),\n", 58 | " julichbrain.get_region('44 left')\n", 59 | "]" 60 | ] 61 | }, 62 | { 63 | "cell_type": "code", 64 | "execution_count": null, 65 | "metadata": {}, 66 | "outputs": [], 67 | "source": [ 68 | "julich_mni_pmaps = julichbrain.get_map(\n", 69 | " space=siibra.spaces.MNI_152_ICBM_2009C_NONLINEAR_ASYMMETRIC, \n", 70 | " maptype=siibra.MapType.STATISTICAL\n", 71 | ")" 72 | ] 73 | }, 74 | { 75 | "cell_type": "code", 76 | "execution_count": null, 77 | "metadata": {}, 78 | "outputs": [], 79 | "source": [ 80 | "# fetch and display their probability maps\n", 81 | "from nilearn import plotting\n", 82 | "for region in regions:\n", 83 | " pmap = julich_mni_pmaps.fetch(region)\n", 84 | " plotting.plot_stat_map(pmap, title=region.name)" 85 | ] 86 | }, 87 | { 88 | "cell_type": "markdown", 89 | "metadata": {}, 90 | "source": [ 91 | "### 3. Access GABAB receptor distributions" 92 | ] 93 | }, 94 | { 95 | "cell_type": "code", 96 | "execution_count": null, 97 | "metadata": {}, 98 | "outputs": [], 99 | "source": [ 100 | "receptor = 'GABAB'" 101 | ] 102 | }, 103 | { 104 | "cell_type": "code", 105 | "execution_count": null, 106 | "metadata": {}, 107 | "outputs": [], 108 | "source": [ 109 | "fig, axs = plt.subplots(1, len(regions), figsize=(8, 3), sharex=True, sharey=True)\n", 110 | "\n", 111 | "for i, region in enumerate(regions):\n", 112 | " \n", 113 | " # retrieve receptor profiles\n", 114 | " features = siibra.features.get(\n", 115 | " region, siibra.features.molecular.ReceptorDensityProfile\n", 116 | " )\n", 117 | " \n", 118 | " # get one with GABAB type and plot it\n", 119 | " features[0].get_element('GABAB').plot(ax=axs[i])\n", 120 | "\n", 121 | "plt.tight_layout()" 122 | ] 123 | },
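 { "cell_type": "markdown", "metadata": {}, "source": [ "As an optional cross-check, we can also compare the full receptor fingerprints of the two regions, which summarize densities over all measured receptor types. This is a sketch using only feature types shown elsewhere in these tutorials; it assumes fingerprint features are anchored to both regions:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# sketch: compare receptor fingerprints of the two regions\n", "# (assumes ReceptorDensityFingerprint features exist for both regions)\n", "fig, axs = plt.subplots(1, len(regions), figsize=(8, 3))\n", "for i, region in enumerate(regions):\n", "    fps = siibra.features.get(region, siibra.features.molecular.ReceptorDensityFingerprint)\n", "    if fps:  # plot only if a fingerprint was found\n", "        fps[0].plot(ax=axs[i])\n", "plt.tight_layout()" ] },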
 124 | { 125 | "cell_type": "markdown", 126 | "metadata": {}, 127 | "source": [ 128 | "### 4. Run a differential gene expression analysis for the two regions" 129 | ] 130 | }, 131 | { 132 | "cell_type": "code", 133 | "execution_count": null, 134 | "metadata": {}, 135 | "outputs": [], 136 | "source": [ 137 | "# setup and run the analysis\n", 138 | "genes = ['GABBR1', 'GABBR2']\n", 139 | "\n", 140 | "fig, axs = plt.subplots(1, len(regions), figsize=(8, 3), sharex=True, sharey=True)\n", 141 | "\n", 142 | "geneexps = {}\n", 143 | "for i, region in enumerate(regions):\n", 144 | " # retrieve gene expression measurements for the region\n", 145 | " geneexps[region] = siibra.features.get(region, \"GeneExpression\", gene=genes)[0]\n", 146 | " # plot the expression levels of the candidate genes\n", 147 | " geneexps[region].plot(ax=axs[i])\n", 148 | "\n", 149 | "plt.tight_layout()" 150 | ] 151 | }, 152 | { 153 | "cell_type": "code", 154 | "execution_count": null, 155 | "metadata": {}, 156 | "outputs": [], 157 | "source": [ 158 | "# plot the extracted sample positions of the microarray data\n", 159 | "from nilearn import plotting\n", 160 | "for region in regions:\n", 161 | " pmap = region.get_regional_map(\n", 162 | " siibra.spaces.MNI_152_ICBM_2009C_NONLINEAR_ASYMMETRIC,\n", 163 | " siibra.MapType.STATISTICAL\n", 164 | " )\n", 165 | " display = plotting.plot_glass_brain(pmap.fetch(), cmap=\"viridis\", title=region.name)\n", 166 | " display.add_markers(geneexps[region].anchor.location.coordinates, marker_size=5)" 167 | ] 168 | }, 169 | { 170 | "cell_type": "code", 171 | "execution_count": null, 172 | "metadata": {}, 173 | "outputs": [], 174 | "source": [] 175 | } 176 | ], 177 | "metadata": { 178 | "kernelspec": { 179 | "display_name": "Python 3 (ipykernel)", 180 | "language": "python", 181 | "name": "python3" 182 | }, 183 | "language_info": { 184 | "codemirror_mode": { 185 | "name": "ipython", 186 | "version": 3 187 | }, 188 | "file_extension": ".py", 189 | "mimetype": "text/x-python", 190 | "name": "python", 191 | "nbconvert_exporter": "python", 192 | "pygments_lexer": "ipython3", 193 | "version": "3.9.15" 194 | }, 195 | "vscode": { 196 | "interpreter": { 197 | "hash": "c6134497cd5410b1b275ebc88f99d14855849379b696eb9e04cff5dd9aa5e77a" 198 | } 199 | } 200 | }, 201 | "nbformat": 4, 202 | "nbformat_minor": 4 203 | } 204 | -------------------------------------------------------------------------------- /04-DifferentialGeneExpressions.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Differential gene expression analysis in two brain regions\n", 8 | "\n", 9 | "JuGEx performs differential gene expression analysis in two different brain regions, as described in the publication\n", 10 | "\n", 11 | "> *Sebastian Bludau, Thomas W. Mühleisen, Simon B. Eickhoff, Michael J. Hawrylycz, Sven Cichon, Katrin Amunts. Integration of transcriptomic and cytoarchitectonic data implicates a role for MAOA and TAC1 in the limbic-cortical network. 2018, Brain Structure and Function. https://doi.org/10.1007/s00429-018-1620-6*\n", 12 | "\n", 13 | "For the gene expression data, `siibra` accesses the Allen Brain Atlas API (© 2015 Allen Institute for Brain Science. Allen Brain Atlas API. Available from: brain-map.org/api/index.html). " 14 | ] 15 | }, 16 | { 17 | "cell_type": "markdown", 18 | "metadata": {}, 19 | "source": [ 20 | "### Initialize the analysis\n", 21 | "\n", 22 | "The analysis is initialized with a `siibra` parcellation object. 
It will check that the selected parcellation is suitable for performing the analysis, which includes verifying that it provides maps in the MNI ICBM 152 space. We explicitly select the Julich-Brain probabilistic cytoarchitectonic maps, and threshold the probability maps for filtering gene expressions instead of using the simplified labelled volume. " 23 | ] 24 | }, 25 | { 26 | "cell_type": "code", 27 | "execution_count": null, 28 | "metadata": {}, 29 | "outputs": [], 30 | "source": [ 31 | "import siibra, siibra_jugex\n", 32 | "from packaging.version import Version\n", 33 | "assert Version(siibra.__version__) >= Version('1.0.1-alpha.2')\n", 34 | "assert Version(siibra_jugex.__version__) >= Version(\"1.2\")" 35 | ] 36 | }, 37 | { 38 | "cell_type": "code", 39 | "execution_count": null, 40 | "metadata": {}, 41 | "outputs": [], 42 | "source": [ 43 | "julichbrain = siibra.parcellations.get(\"julich\")\n", 44 | "jugex = siibra_jugex.DifferentialGeneExpression(julichbrain)" 45 | ] 46 | }, 47 | { 48 | "cell_type": "markdown", 49 | "metadata": {}, 50 | "source": [ 51 | "### Configure the experiment with brain regions and candidate genes\n", 52 | "\n", 53 | "The analysis is configured by specifying some candidate genes of interest, and two regions of interest (ROI) given by brain area names that `siibra` can resolve. Note that siibra does fuzzy string matching to resolve region names, so you can try a simple name for each region to see how siibra interprets it. Also, gene names can easily be looked up and autocompleted in `siibra.gene_names`. \n" 54 | ] 55 | }, 56 | { 57 | "cell_type": "code", 58 | "execution_count": null, 59 | "metadata": {}, 60 | "outputs": [], 61 | "source": [ 62 | "candidate_regions = [\"v1 right\", \"v2 right\"]\n", 63 | "candidate_genes = [\"MAOA\", \"TAC1\"]\n", 64 | "jugex.add_candidate_genes(candidate_genes)\n", 65 | "threshold=0.2\n", 66 | "jugex.define_roi1(candidate_regions[0], maptype=siibra.MapType.STATISTICAL, threshold=threshold)\n", 67 | "jugex.define_roi2(candidate_regions[1], maptype=siibra.MapType.STATISTICAL, threshold=threshold)" 68 | ] 69 | }, 70 | { 71 | "cell_type": "markdown", 72 | "metadata": {}, 73 | "source": [ 74 | "### Run the analysis" 75 | ] 76 | }, 77 | { 78 | "cell_type": "code", 79 | "execution_count": null, 80 | "metadata": {}, 81 | "outputs": [], 82 | "source": [ 83 | "result = jugex.run(permutations=1000)\n", 84 | "print(result['p-values'])" 85 | ] 86 | },
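 { "cell_type": "markdown", "metadata": {}, "source": [ "Because two candidate genes are tested, the printed p-values should be judged against a multiple-comparison-corrected threshold. The following cell is a small sketch of a Bonferroni correction; it only computes the corrected threshold, against which the printed p-values can be compared:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# sketch: Bonferroni-corrected significance threshold for the candidate genes\n", "alpha = 0.05\n", "print(f\"corrected threshold: {alpha / len(candidate_genes):.4f}\")" ] },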
 87 | { 88 | "cell_type": "markdown", 89 | "metadata": {}, 90 | "source": [ 91 | "The aggregated input parameters can be stored to disk." 92 | ] 93 | }, 94 | { 95 | "cell_type": "code", 96 | "execution_count": null, 97 | "metadata": {}, 98 | "outputs": [], 99 | "source": [ 100 | "jugex.save('jugex_{}_{}.json'.format(\n", 101 | " \"_\".join(candidate_regions),\n", 102 | " \"_\".join(candidate_genes) ))" 103 | ] 104 | }, 105 | { 106 | "cell_type": "markdown", 107 | "metadata": {}, 108 | "source": [ 109 | "### Look at filtered positions of microarray samples in MNI space" 110 | ] 111 | }, 112 | { 113 | "cell_type": "markdown", 114 | "metadata": {}, 115 | "source": [ 116 | "Let's have a look at the sample positions that have been found in the Allen atlas. Since we configured siibra to prefer thresholded statistical maps for region filtering over the simplified parcellation map, we also plot the probability maps here." 117 | ] 118 | }, 119 | { 120 | "cell_type": "code", 121 | "execution_count": null, 122 | "metadata": {}, 123 | "outputs": [], 124 | "source": [ 125 | "from nilearn import plotting\n", 126 | "\n", 127 | "for regionname in candidate_regions:\n", 128 | " samples = jugex.get_samples(regionname)\n", 129 | " region = julichbrain.get_region(regionname)\n", 130 | " pmap = region.get_regional_map(\n", 131 | " siibra.spaces.MNI_152_ICBM_2009C_NONLINEAR_ASYMMETRIC, \n", 132 | " siibra.MapType.STATISTICAL\n", 133 | " ) \n", 134 | " display = plotting.plot_glass_brain(pmap.fetch(), cmap=\"viridis\", title=region.name)\n", 135 | " display.add_markers([s['mnicoord'] for s in samples.values()])" 136 | ] 137 | }, 138 | { 139 | "cell_type": "code", 140 | "execution_count": null, 141 | "metadata": {}, 142 | "outputs": [], 143 | "source": [] 144 | } 145 | ], 146 | "metadata": { 147 | "kernelspec": { 148 | "display_name": "Python 3 (ipykernel)", 149 | "language": "python", 150 | "name": "python3" 151 | }, 152 | "language_info": { 153 | "codemirror_mode": { 154 | "name": "ipython", 155 | "version": 3 156 | }, 157 | "file_extension": ".py", 158 | "mimetype": "text/x-python", 159 | "name": "python", 160 | "nbconvert_exporter": "python", 161 | "pygments_lexer": "ipython3", 162 | "version": "3.9.15" 163 | } 164 | }, 165 | "nbformat": 4, 166 | "nbformat_minor": 4 167 | } 168 | -------------------------------------------------------------------------------- /03-ProbabilisticAssignment.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "import siibra\n", 10 | "from packaging.version import Version\n", 11 | "assert Version(siibra.__version__) >= Version('1.0.1-alpha.2')\n", 12 | "\n", 13 | "from nilearn import plotting\n", 14 | "import matplotlib.pyplot as plt\n", 15 | "%matplotlib notebook" 16 | ] 17 | }, 18 | { 19 | "cell_type": "markdown", 20 | "metadata": {}, 21 | "source": [ 22 | "### Define two points in MNI space\n", 23 | "\n", 24 | "We start by specifying some points in the reference space. Here, we use the MNI space. We can find such points, amongst other possibilities, by clicking them in the [siibra-explorer](https://siibra-explorer.apps.hbp.eu/). `siibra` has specific data types for points and point sets, which are aware of the reference space used, and can be warped between reference spaces on demand." 25 | ] 26 | }, 27 | { 28 | "cell_type": "code", 29 | "execution_count": null, 30 | "metadata": {}, 31 | "outputs": [], 32 | "source": [ 33 | "# These are copy-pasted from the interactive atlas viewer:\n", 34 | "points = siibra.PointCloud(\n", 35 | " (\n", 36 | " [-25.65, -2.75, -33.75],\n", 37 | " [-37.35, -81.05, -6.3],\n", 38 | " ),\n", 39 | " space='mni152', sigma_mm=5)" 40 | ] 41 | }, 42 | { 43 | "cell_type": "code", 44 | "execution_count": null, 45 | "metadata": {}, 46 | "outputs": [], 47 | "source": [ 48 | "plotting.view_markers(list(map(tuple,points)), ['red', 'cyan'], marker_size=10) " 49 | ] 50 | },
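 { "cell_type": "markdown", "metadata": {}, "source": [ "As a quick illustration of the warping mentioned above, the same points can be expressed in another reference space, for example the BigBrain histological space. This is a sketch; it assumes the nonlinear spatial warping service used by `warp` is reachable:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# sketch: warp the MNI points to BigBrain space\n", "print(points)\n", "print(points.warp('bigbrain'))" ] },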
 51 | { 52 | "cell_type": "markdown", 53 | "metadata": {}, 54 | "source": [ 55 | "### Assign brain regions to the 3D points\n", 56 | "\n", 57 | "We assign the points to brain regions from the Julich-Brain cytoarchitectonic atlas, using a location uncertainty of 5 mm standard deviation." 58 | ] 59 | }, 60 | { 61 | "cell_type": "code", 62 | "execution_count": null, 63 | "metadata": {}, 64 | "outputs": [], 65 | "source": [ 66 | "julich_pmaps = siibra.get_map(\n", 67 | " parcellation=siibra.parcellations.get('julich 3.0.3'),\n", 68 | " space=siibra.spaces.get('mni152'),\n", 69 | " maptype=siibra.MapType.STATISTICAL\n", 70 | ")\n", 71 | "assignments = julich_pmaps.assign(points)" 72 | ] 73 | }, 74 | { 75 | "cell_type": "markdown", 76 | "metadata": {}, 77 | "source": [ 78 | "The result of the assignment is a pandas DataFrame, where each row describes a region assigned to one of the input structures (which in this case are the provided two points, 0 and 1). Each assignment has some scores describing the relation between the point, represented as a 3D Gaussian kernel with bandwidth corresponding to the point uncertainty, and the respective statistical map from the parcellation:\n", 79 | "\n", 80 | "- \"Value\" refers to the average value of the statistical map under the point kernel\n", 81 | "- \"Correlation\" is Pearson's correlation coefficient between the statistical map and the point kernel\n", 82 | "- \"IoU\" is the intersection over union between both\n", 83 | "- \"Contains\" is the intersection over area of the statistical map (indicating containedness of the point)\n", 84 | "- \"Contained\" is the intersection over area of the point (indicating containedness of the statistical map)\n" 85 | ] 86 | }, 87 | { 88 | "cell_type": "markdown", 89 | "metadata": {}, 90 | "source": [ 91 | "Let's look at two of the matched areas." 92 | ] 93 | }, 94 | { 95 | "cell_type": "code", 96 | "execution_count": null, 97 | "metadata": {}, 98 | "outputs": [], 99 | "source": [ 100 | "assignments.sort_values(by='intersection over union', ascending=False, inplace=True)\n", 101 | "\n", 102 | "a1 = assignments[assignments['input structure']==0].iloc[0]\n", 103 | "a2 = assignments[assignments['input structure']==1].iloc[0]\n", 104 | "\n", 105 | "for a in [a1, a2]:\n", 106 | " \n", 107 | " # fetch the probability map corresponding to the assigned regions\n", 108 | " statmap = julich_pmaps.fetch(region=a.region)\n", 109 | "\n", 110 | " # plot the map, centered at the point\n", 111 | " view = plotting.plot_stat_map(\n", 112 | " statmap, \n", 113 | " title=f\"{a.region} ({a['correlation']:.2f})\", \n", 114 | " cmap='viridis', \n", 115 | " cut_coords=a.centroid)\n", 116 | "\n", 117 | " # add the point\n", 118 | " view.add_markers([a.centroid])" 119 | ] 120 | }, 121 | { 122 | "cell_type": "markdown", 123 | "metadata": {}, 124 | "source": [ 125 | "### Find connectivity in terms of streamlines from DTI\n", 126 | "\n", 127 | "We also investigate the connectivity of the first region as measured by in-vivo imaging. To do so, we select the first region in the atlas, and search for connectivity profiles."
128 | ] 129 | }, 130 | { 131 | "cell_type": "code", 132 | "execution_count": null, 133 | "metadata": {}, 134 | "outputs": [], 135 | "source": [ 136 | "# get structural connectivity matrices for Julich Brain\n", 137 | "features = siibra.features.get(\n", 138 | " julich_pmaps.parcellation,\n", 139 | " siibra.features.connectivity.StreamlineCounts\n", 140 | ")\n", 141 | "\n", 142 | "# get one of the connectivity matrices from the first compound\n", 143 | "sc = features[0].get_element(\"000\")\n", 144 | "\n", 145 | "# get the connectivity profile with the strongest 10 connections for the first matched region\n", 146 | "profile = sc.get_profile(a1.region, max_rows=10)\n", 147 | "\n", 148 | "profile.plot()" 149 | ] 150 | }, 151 | { 152 | "cell_type": "code", 153 | "execution_count": null, 154 | "metadata": {}, 155 | "outputs": [], 156 | "source": [] 157 | } 158 | ], 159 | "metadata": { 160 | "kernelspec": { 161 | "display_name": "Python 3 (ipykernel)", 162 | "language": "python", 163 | "name": "python3" 164 | }, 165 | "language_info": { 166 | "codemirror_mode": { 167 | "name": "ipython", 168 | "version": 3 169 | }, 170 | "file_extension": ".py", 171 | "mimetype": "text/x-python", 172 | "name": "python", 173 | "nbconvert_exporter": "python", 174 | "pygments_lexer": "ipython3", 175 | "version": "3.9.15" 176 | }, 177 | "vscode": { 178 | "interpreter": { 179 | "hash": "c6134497cd5410b1b275ebc88f99d14855849379b696eb9e04cff5dd9aa5e77a" 180 | } 181 | } 182 | }, 183 | "nbformat": 4, 184 | "nbformat_minor": 4 185 | } 186 | -------------------------------------------------------------------------------- /workshops/BigBrainWorkshop2023.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "a62e7e53", 6 | "metadata": {}, 7 | "source": [ 8 | "# BigBrain workshop short tutorial: Retrieving a BigBrain 1 micron patch " 9 | ] 10 | }, 11 | { 12 | "cell_type": "code", 13 | "execution_count": null, 14 | "id": "2a2d0855", 15 | "metadata": {}, 16 | "outputs": [], 17 | "source": [ 18 | "import siibra\n", 19 | "from packaging.version import Version\n", 20 | "assert Version(siibra.__version__) >= Version('1.0.1-alpha.2')" 21 | ] 22 | }, 23 | { 24 | "cell_type": "markdown", 25 | "id": "2946c8a8", 26 | "metadata": {}, 27 | "source": [ 28 | "## Step 1: Load a coronal 2D annotation pasted over from siibra-explorer annotation mode" 29 | ] 30 | }, 31 | { 32 | "cell_type": "code", 33 | "execution_count": null, 34 | "id": "1d4cf9cb", 35 | "metadata": {}, 36 | "outputs": [], 37 | "source": [ 38 | "structure='''{\n", 39 | " \"@id\": \"c882cfb9\",\n", 40 | " \"@type\": \"tmp/poly\",\n", 41 | " \"coordinateSpace\": {\n", 42 | " \"@id\": \"minds/core/referencespace/v1.0.0/a1655b99-82f1-420f-a3c2-fe80fd4c8588\"\n", 43 | " },\n", 44 | " \"coordinates\": [\n", 45 | " [\n", 46 | " {\n", 47 | " \"@id\": \"26a2b847\",\n", 48 | " \"@type\": \"https://openminds.ebrains.eu/core/QuantitativeValue\",\n", 49 | " \"value\": -35.721144,\n", 50 | " \"unit\": {\n", 51 | " \"@id\": \"id.link/mm\"\n", 52 | " }\n", 53 | " },\n", 54 | " {\n", 55 | " \"@id\": \"d7f76f43\",\n", 56 | " \"@type\": \"https://openminds.ebrains.eu/core/QuantitativeValue\",\n", 57 | " \"value\": -1.87,\n", 58 | " \"unit\": {\n", 59 | " \"@id\": \"id.link/mm\"\n", 60 | " }\n", 61 | " },\n", 62 | " {\n", 63 | " \"@id\": \"365dfad0\",\n", 64 | " \"@type\": \"https://openminds.ebrains.eu/core/QuantitativeValue\",\n", 65 | " \"value\": 35.022472,\n", 66 | " \"unit\": {\n", 67 | " 
\"@id\": \"id.link/mm\"\n", 68 | " }\n", 69 | " }\n", 70 | " ],\n", 71 | " [\n", 72 | " {\n", 73 | " \"@id\": \"4e160292\",\n", 74 | " \"@type\": \"https://openminds.ebrains.eu/core/QuantitativeValue\",\n", 75 | " \"value\": -33.692304,\n", 76 | " \"unit\": {\n", 77 | " \"@id\": \"id.link/mm\"\n", 78 | " }\n", 79 | " },\n", 80 | " {\n", 81 | " \"@id\": \"12cb8664\",\n", 82 | " \"@type\": \"https://openminds.ebrains.eu/core/QuantitativeValue\",\n", 83 | " \"value\": -1.87,\n", 84 | " \"unit\": {\n", 85 | " \"@id\": \"id.link/mm\"\n", 86 | " }\n", 87 | " },\n", 88 | " {\n", 89 | " \"@id\": \"71387d2a\",\n", 90 | " \"@type\": \"https://openminds.ebrains.eu/core/QuantitativeValue\",\n", 91 | " \"value\": 36.182644,\n", 92 | " \"unit\": {\n", 93 | " \"@id\": \"id.link/mm\"\n", 94 | " }\n", 95 | " }\n", 96 | " ],\n", 97 | " [\n", 98 | " {\n", 99 | " \"@id\": \"dafcc00b\",\n", 100 | " \"@type\": \"https://openminds.ebrains.eu/core/QuantitativeValue\",\n", 101 | " \"value\": -32.572944,\n", 102 | " \"unit\": {\n", 103 | " \"@id\": \"id.link/mm\"\n", 104 | " }\n", 105 | " },\n", 106 | " {\n", 107 | " \"@id\": \"5d00aa25\",\n", 108 | " \"@type\": \"https://openminds.ebrains.eu/core/QuantitativeValue\",\n", 109 | " \"value\": -1.87,\n", 110 | " \"unit\": {\n", 111 | " \"@id\": \"id.link/mm\"\n", 112 | " }\n", 113 | " },\n", 114 | " {\n", 115 | " \"@id\": \"f5aab4e8\",\n", 116 | " \"@type\": \"https://openminds.ebrains.eu/core/QuantitativeValue\",\n", 117 | " \"value\": 34.328704,\n", 118 | " \"unit\": {\n", 119 | " \"@id\": \"id.link/mm\"\n", 120 | " }\n", 121 | " }\n", 122 | " ],\n", 123 | " [\n", 124 | " {\n", 125 | " \"@id\": \"389a2439\",\n", 126 | " \"@type\": \"https://openminds.ebrains.eu/core/QuantitativeValue\",\n", 127 | " \"value\": -35.639524,\n", 128 | " \"unit\": {\n", 129 | " \"@id\": \"id.link/mm\"\n", 130 | " }\n", 131 | " },\n", 132 | " {\n", 133 | " \"@id\": \"74e3ecaa\",\n", 134 | " \"@type\": \"https://openminds.ebrains.eu/core/QuantitativeValue\",\n", 135 | " \"value\": -1.87,\n", 136 | " \"unit\": {\n", 137 | " \"@id\": \"id.link/mm\"\n", 138 | " }\n", 139 | " },\n", 140 | " {\n", 141 | " \"@id\": \"90fece85\",\n", 142 | " \"@type\": \"https://openminds.ebrains.eu/core/QuantitativeValue\",\n", 143 | " \"value\": 35.185712,\n", 144 | " \"unit\": {\n", 145 | " \"@id\": \"id.link/mm\"\n", 146 | " }\n", 147 | " }\n", 148 | " ]\n", 149 | " ],\n", 150 | " \"closed\": true\n", 151 | "}'''" 152 | ] 153 | }, 154 | { 155 | "cell_type": "code", 156 | "execution_count": null, 157 | "id": "2b2c9af6", 158 | "metadata": {}, 159 | "outputs": [], 160 | "source": [ 161 | "# we can directly instantiate a location object from the structure in siibra\n", 162 | "# the annotation is cast to a siibra pointcloud.\n", 163 | "# Note that it has already the proper reference space encoded!\n", 164 | "points = siibra.from_json(structure)\n", 165 | "print(points)" 166 | ] 167 | }, 168 | { 169 | "cell_type": "code", 170 | "execution_count": null, 171 | "id": "c45ebe7f", 172 | "metadata": {}, 173 | "outputs": [], 174 | "source": [ 175 | "# we choose as a region (or volume) of interst the bounding box of the points\n", 176 | "voi = points.boundingbox\n", 177 | "print(voi.warp('mni152'))" 178 | ] 179 | }, 180 | { 181 | "cell_type": "markdown", 182 | "id": "b650432f", 183 | "metadata": {}, 184 | "source": [ 185 | "## Step 2: Extract a lower resolution surrounding cube from the BigBrain template" 186 | ] 187 | }, 188 | { 189 | "cell_type": "code", 190 | "execution_count": null, 191 | "id": "35de465b", 192 | 
"metadata": {}, 193 | "outputs": [], 194 | "source": [ 195 | "# get a cube of BigBrain context centered at this annotation\n", 196 | "bigbrain = siibra.get_template('bigbrain')\n", 197 | "cube = bigbrain.fetch(\n", 198 | " voi=voi.center.get_enclosing_cube(width_mm=10),\n", 199 | " resolution_mm=0.32\n", 200 | ")" 201 | ] 202 | }, 203 | { 204 | "cell_type": "markdown", 205 | "id": "bf47db66", 206 | "metadata": {}, 207 | "source": [ 208 | "## Step 3: Find the related 1 micron section to extract a 1 micron cortical patch\n", 209 | "\n", 210 | "Note that siibra implements a lazy loading strategy. We find the 1 micron sections as data features with a bounding box. Retrieval of actual image data is postponed until we call `fetch` explitly.\n", 211 | "\n", 212 | "Here we fetch with a volume of interest defintion (the bounding box of the annotation), which makes fetching the full resolution feasible. \n", 213 | "\n", 214 | "With the current siibra version, we fist retrieve all 1 micron sections and intersect explictly.\n", 215 | "The next siibra version allows to query features directly with the annotation's bounding box." 216 | ] 217 | }, 218 | { 219 | "cell_type": "code", 220 | "execution_count": null, 221 | "id": "a62c4953", 222 | "metadata": {}, 223 | "outputs": [], 224 | "source": [ 225 | "matched_sections = siibra.features.get(\n", 226 | " points, siibra.features.cellular.CellbodyStainedSection\n", 227 | ")\n", 228 | "assert len(matched_sections) > 0 " 229 | ] 230 | }, 231 | { 232 | "cell_type": "code", 233 | "execution_count": null, 234 | "id": "68d39585", 235 | "metadata": {}, 236 | "outputs": [], 237 | "source": [ 238 | "# extract the full resolution patch\n", 239 | "patch_1mu = matched_sections[0].fetch(voi=voi, resolution_mm=-1)" 240 | ] 241 | }, 242 | { 243 | "cell_type": "markdown", 244 | "id": "5986434a", 245 | "metadata": {}, 246 | "source": [ 247 | "### Step 4: Plot the 3D context and 2D patch" 248 | ] 249 | }, 250 | { 251 | "cell_type": "code", 252 | "execution_count": null, 253 | "id": "0e93f883", 254 | "metadata": {}, 255 | "outputs": [], 256 | "source": [ 257 | "from nilearn import plotting\n", 258 | "import matplotlib.pyplot as plt\n", 259 | "%matplotlib notebook" 260 | ] 261 | }, 262 | { 263 | "cell_type": "code", 264 | "execution_count": null, 265 | "id": "951b218c", 266 | "metadata": {}, 267 | "outputs": [], 268 | "source": [ 269 | "# plot a composite 3D view with nilearn\n", 270 | "view=plotting.plot_img(cube, cmap='gray', cut_coords=tuple(voi.center), vmin=0, vmax=255)\n", 271 | "view.add_overlay(patch_1mu, cmap='magma')" 272 | ] 273 | }, 274 | { 275 | "cell_type": "code", 276 | "execution_count": null, 277 | "id": "ae470b98", 278 | "metadata": {}, 279 | "outputs": [], 280 | "source": [ 281 | "# plot only the 1 micron patch\n", 282 | "plt.figure()\n", 283 | "plt.imshow(patch_1mu.get_fdata().squeeze(), cmap='gray')" 284 | ] 285 | }, 286 | { 287 | "cell_type": "code", 288 | "execution_count": null, 289 | "id": "f1f8de42", 290 | "metadata": {}, 291 | "outputs": [], 292 | "source": [] 293 | } 294 | ], 295 | "metadata": { 296 | "kernelspec": { 297 | "display_name": ".venv", 298 | "language": "python", 299 | "name": "python3" 300 | }, 301 | "language_info": { 302 | "codemirror_mode": { 303 | "name": "ipython", 304 | "version": 3 305 | }, 306 | "file_extension": ".py", 307 | "mimetype": "text/x-python", 308 | "name": "python", 309 | "nbconvert_exporter": "python", 310 | "pygments_lexer": "ipython3", 311 | "version": "3.12.7" 312 | } 313 | }, 314 | "nbformat": 4, 315 | 
"nbformat_minor": 5 316 | } 317 | -------------------------------------------------------------------------------- /workshops/ohbm-2023-example.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "4c0f45d7", 6 | "metadata": {}, 7 | "source": [ 8 | "# The siibra tool suite – interfaces to a multilevel brain atlas spanning scales and modalities\n", 9 | "\n", 10 | "Studying the human brain requires to capture its structural and functional organization in a common spatial framework. siibra is a tool suite that implements a multilevel atlas of the human brain by providing access to reference templates at different spatial scales, complementary parcellation maps, and multimodal data features. \n", 11 | "\n", 12 | "For interactive and explorative use, the tool suite includes a web-based 3D viewer hosted [on EBRAINS](https://atlases.ebrains.eu/viewer). [siibra-python](https://siibra-python.readthedocs.io) is a software library for using the framework in computational workflows, and provides good compatibility with established (neuro)data science tools such as [nibabel](https://nipy.org/nibabel/) and [pandas](https://pandas.pydata.org).\n", 13 | "\n", 14 | "this notebook walks you through some examples that are intended for presenting siibra at the OHBM 2023 conference. It uses a recent develoment snapshot of siibra." 15 | ] 16 | }, 17 | { 18 | "cell_type": "code", 19 | "execution_count": null, 20 | "id": "64562619", 21 | "metadata": {}, 22 | "outputs": [], 23 | "source": [ 24 | "import siibra as sb\n", 25 | "from packaging.version import Version\n", 26 | "assert Version(sb.__version__) >= Version('1.0.1-alpha.2')\n", 27 | "from nilearn import plotting\n", 28 | "import matplotlib.pyplot as plt\n", 29 | "%matplotlib notebook" 30 | ] 31 | }, 32 | { 33 | "cell_type": "markdown", 34 | "id": "d38c80d2", 35 | "metadata": {}, 36 | "source": [ 37 | "# Load example image with regions of interest" 38 | ] 39 | }, 40 | { 41 | "cell_type": "markdown", 42 | "id": "ef10c44c", 43 | "metadata": {}, 44 | "source": [ 45 | "The example starts with a NIfTI image containing some regions of interesta in MNI space, which we are going to assign to cytoarchitectonic brain regions, and use for multimodel data queries. Here we just load a predefined NIfTI file, but in general you might use a thresholded functional activation map or similar signal here." 46 | ] 47 | }, 48 | { 49 | "cell_type": "code", 50 | "execution_count": null, 51 | "id": "2101caa9", 52 | "metadata": {}, 53 | "outputs": [], 54 | "source": [ 55 | "img = sb.volumes.from_file(\n", 56 | " \"https://github.com/FZJ-INM1-BDA/siibra-tutorials/raw/main/workshops/ohbm-2023-example-input.nii.gz\",\n", 57 | " name=\"Example input\", \n", 58 | " space=\"mni152\",\n", 59 | ")\n", 60 | "\n", 61 | "# display the image\n", 62 | "plotting.plot_stat_map(img.fetch(), cmap='viridis', colorbar=False, draw_cross=False)" 63 | ] 64 | }, 65 | { 66 | "cell_type": "markdown", 67 | "id": "89a7ccca", 68 | "metadata": {}, 69 | "source": [ 70 | "# Assign activations to cytoarchitectonic brain regions\n", 71 | "\n", 72 | "We request the Julich-Brain probabilistic cytoarchitectonic maps, and perform an assignment of the image signal to the cytoarchitectonic structures. Siibra will automatically identify separated structures in the image. The result is pandas dataframe with regions assigned to each structure identified in the input image, and several scores for the assignment. 
" 73 | ] 74 | }, 75 | { 76 | "cell_type": "code", 77 | "execution_count": null, 78 | "id": "979701eb", 79 | "metadata": {}, 80 | "outputs": [], 81 | "source": [ 82 | "# retrieve Julich-Brain probabilistic cytoarchitectonic maps\n", 83 | "julich_pmaps = sb.get_map(\n", 84 | " sb.parcellations.get('julich 2.9'),\n", 85 | " sb.spaces.get('mni 152'),\n", 86 | " sb.MapType.STATISTICAL\n", 87 | ")\n", 88 | "\n", 89 | "# assign the input image from above to the cytoarchitectonic maps.\n", 90 | "# This results in a pandas DataFrame, which we sort by correlation.\n", 91 | "matches = julich_pmaps.assign(img).sort_values('correlation', ascending=False)\n", 92 | "\n", 93 | "# For this example, we work only with a few of the assigned structures.\n", 94 | "# We filter by strong correlation and containedness scores.\n", 95 | "matches = matches[(matches.correlation > 0.3) & (matches['map containedness'] < 0.25)]\n", 96 | "\n", 97 | "# display some columns of the filtered table\n", 98 | "matches[['input structure', 'region', 'correlation']].round(2)" 99 | ] 100 | }, 101 | { 102 | "cell_type": "code", 103 | "execution_count": null, 104 | "id": "ee50f9da", 105 | "metadata": {}, 106 | "outputs": [], 107 | "source": [ 108 | "matches" 109 | ] 110 | }, 111 | { 112 | "cell_type": "markdown", 113 | "id": "f5bc1b46", 114 | "metadata": {}, 115 | "source": [ 116 | "Show these filtered matched brain regions in 3D." 117 | ] 118 | }, 119 | { 120 | "cell_type": "code", 121 | "execution_count": null, 122 | "id": "c25bf33e", 123 | "metadata": {}, 124 | "outputs": [], 125 | "source": [ 126 | "fig, axs = plt.subplots(1, len(matches), figsize=(8, 4))\n", 127 | "for i, match in enumerate(matches.itertuples()):\n", 128 | " plotting.plot_stat_map(\n", 129 | " match.region.get_regional_map('mni152', maptype='statistical').fetch(),\n", 130 | " axes=axs[i], colorbar=False, display_mode='z',\n", 131 | " cut_coords=[int(match.region.compute_centroids('mni152')[0][2])],\n", 132 | " title=match.region.name\n", 133 | " )" 134 | ] 135 | }, 136 | { 137 | "cell_type": "markdown", 138 | "id": "2250cbd7", 139 | "metadata": {}, 140 | "source": [ 141 | "# Query multimodal regional features\n", 142 | "\n", 143 | "Retrieving regional data features is as simple as specifying an atlas concept - such as a region, parcellation or space - and a feature modality. So in order to find receptor density fingerprints in the left primary visual cortex, we could do:" 144 | ] 145 | }, 146 | { 147 | "cell_type": "code", 148 | "execution_count": null, 149 | "id": "5231d9e8", 150 | "metadata": {}, 151 | "outputs": [], 152 | "source": [ 153 | "# retrieve the features for v1 left\n", 154 | "features = sb.features.get(\n", 155 | " sb.get_region('julich 2.9', 'v1 left'),\n", 156 | " sb.features.molecular.ReceptorDensityFingerprint,\n", 157 | ")\n", 158 | "\n", 159 | "# display the data array of the first of the returned features\n", 160 | "features[0].data.T" 161 | ] 162 | }, 163 | { 164 | "cell_type": "markdown", 165 | "id": "25a7d64c", 166 | "metadata": {}, 167 | "source": [ 168 | "We use the same mechanism now to build a table of three different feature modalities measured in the two selected brain regions. We choose receptor densities and cell densities, and complement them with connectivity profiles below. The connectivity uses a slightly different query, since the connectivity matrix is linked to the parcellation object, not a single region." 
169 | ] 170 | }, 171 | { 172 | "cell_type": "code", 173 | "execution_count": null, 174 | "id": "0c39911b", 175 | "metadata": {}, 176 | "outputs": [], 177 | "source": [ 178 | "# create the plot\n", 179 | "fig, axs = plt.subplots(\n", 180 | " 3,\n", 181 | " len(matches),\n", 182 | " sharey='row', figsize=(9,9)\n", 183 | ")\n", 184 | "\n", 185 | "# row 1 - receptor densities for different receptor types\n", 186 | "for i, m in enumerate(matches.itertuples()):\n", 187 | " features = sb.features.get(\n", 188 | " m.region, \n", 189 | " sb.features.molecular.ReceptorDensityFingerprint\n", 190 | " )\n", 191 | " features[0].plot(ax=axs[0, i])\n", 192 | "\n", 193 | "# row 2 - cell densities per cortical layer\n", 194 | "for i, m in enumerate(matches.itertuples()):\n", 195 | " features = sb.features.get(\n", 196 | " m.region, \n", 197 | " sb.features.cellular.LayerwiseCellDensity\n", 198 | " )\n", 199 | " features[0].plot(ax=axs[1, i])\n", 200 | " \n", 201 | "# row 3 - structural connectivity profiles\n", 202 | "for i, m in enumerate(matches.itertuples()):\n", 203 | " features = sb.features.get(\n", 204 | " m.region.parcellation, sb.features.connectivity.StreamlineLengths\n", 205 | " )\n", 206 | " f = features[0][0].get_profile(m.region, max_rows=12)\n", 207 | " f.plot(ax=axs[2, i])\n", 208 | "\n", 209 | "# optimize plot layout\n", 210 | "plt.tight_layout()" 211 | ] 212 | }, 213 | { 214 | "cell_type": "markdown", 215 | "id": "90cd24de", 216 | "metadata": {}, 217 | "source": [ 218 | "# Extract BigBrain image data\n", 219 | "\n", 220 | "Finally, we sample 3D chunks of the BigBrain volume located in the identified brain regions." 221 | ] 222 | }, 223 | { 224 | "cell_type": "code", 225 | "execution_count": null, 226 | "id": "ffa4d0e6", 227 | "metadata": {}, 228 | "outputs": [], 229 | "source": [ 230 | "# access the BigBrain reference template\n", 231 | "bigbrain = sb.get_template('bigbrain')\n", 232 | "\n", 233 | "# fetch the whole-brain volume at reduced resolution\n", 234 | "bigbrain_volume = bigbrain.fetch(resolution_mm=.64)" 235 | ] 236 | }, 237 | { 238 | "cell_type": "code", 239 | "execution_count": null, 240 | "id": "21292efa", 241 | "metadata": {}, 242 | "outputs": [], 243 | "source": [ 244 | "# prepare the plot\n", 245 | "f, axs = plt.subplots(2, len(matches), figsize=(9, 7))\n", 246 | "plot_kwargs = {\n", 247 | " \"bg_img\": None,\n", 248 | " \"cmap\": 'gray',\n", 249 | " \"colorbar\": False, \n", 250 | " \"draw_cross\": False,\n", 251 | " \"annotate\": False, \n", 252 | " 'vmin': 0,\n", 253 | " 'vmax': 255\n", 254 | "}\n", 255 | "\n", 256 | "# for each matched brain region, sample a random 3D location in MNI space,\n", 257 | "# warp it to bigbrain space, and fetch a 3D chunk of 3mm sidelength\n", 258 | "# from the full resolution Big Brain (20 micron) at this position.\n", 259 | "for i, match in enumerate(matches.itertuples()):\n", 260 | "\n", 261 | " point = julich_pmaps.sample_locations(match.region, 1)[0].warp('bigbrain')\n", 262 | " view = plotting.plot_img(bigbrain_volume, axes=axs[0, i], cut_coords=tuple(point), **plot_kwargs)\n", 263 | " view.add_markers([tuple(point)])\n", 264 | "\n", 265 | " voi = point.get_enclosing_cube(width_mm=3)\n", 266 | " chunk = bigbrain.fetch(voi=voi, resolution_mm=0.02)\n", 267 | " plotting.plot_img(chunk, axes=axs[1, i], **plot_kwargs)\n", 268 | "\n", 269 | "plt.tight_layout()" 270 | ] 271 | }, 272 | { 273 | "cell_type": "code", 274 | "execution_count": null, 275 | "id": "55878936", 276 | "metadata": {}, 277 | "outputs": [], 278 | "source": [] 279 | } 280 | 
], 281 | "metadata": { 282 | "kernelspec": { 283 | "display_name": "venv", 284 | "language": "python", 285 | "name": "python3" 286 | }, 287 | "language_info": { 288 | "codemirror_mode": { 289 | "name": "ipython", 290 | "version": 3 291 | }, 292 | "file_extension": ".py", 293 | "mimetype": "text/x-python", 294 | "name": "python", 295 | "nbconvert_exporter": "python", 296 | "pygments_lexer": "ipython3", 297 | "version": "3.12.7" 298 | } 299 | }, 300 | "nbformat": 4, 301 | "nbformat_minor": 5 302 | } 303 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Apache License 2 | Version 2.0, January 2004 3 | http://www.apache.org/licenses/ 4 | 5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 6 | 7 | 1. Definitions. 8 | 9 | "License" shall mean the terms and conditions for use, reproduction, 10 | and distribution as defined by Sections 1 through 9 of this document. 11 | 12 | "Licensor" shall mean the copyright owner or entity authorized by 13 | the copyright owner that is granting the License. 14 | 15 | "Legal Entity" shall mean the union of the acting entity and all 16 | other entities that control, are controlled by, or are under common 17 | control with that entity. For the purposes of this definition, 18 | "control" means (i) the power, direct or indirect, to cause the 19 | direction or management of such entity, whether by contract or 20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the 21 | outstanding shares, or (iii) beneficial ownership of such entity. 22 | 23 | "You" (or "Your") shall mean an individual or Legal Entity 24 | exercising permissions granted by this License. 25 | 26 | "Source" form shall mean the preferred form for making modifications, 27 | including but not limited to software source code, documentation 28 | source, and configuration files. 29 | 30 | "Object" form shall mean any form resulting from mechanical 31 | transformation or translation of a Source form, including but 32 | not limited to compiled object code, generated documentation, 33 | and conversions to other media types. 34 | 35 | "Work" shall mean the work of authorship, whether in Source or 36 | Object form, made available under the License, as indicated by a 37 | copyright notice that is included in or attached to the work 38 | (an example is provided in the Appendix below). 39 | 40 | "Derivative Works" shall mean any work, whether in Source or Object 41 | form, that is based on (or derived from) the Work and for which the 42 | editorial revisions, annotations, elaborations, or other modifications 43 | represent, as a whole, an original work of authorship. For the purposes 44 | of this License, Derivative Works shall not include works that remain 45 | separable from, or merely link (or bind by name) to the interfaces of, 46 | the Work and Derivative Works thereof. 47 | 48 | "Contribution" shall mean any work of authorship, including 49 | the original version of the Work and any modifications or additions 50 | to that Work or Derivative Works thereof, that is intentionally 51 | submitted to Licensor for inclusion in the Work by the copyright owner 52 | or by an individual or Legal Entity authorized to submit on behalf of 53 | the copyright owner. 
For the purposes of this definition, "submitted" 54 | means any form of electronic, verbal, or written communication sent 55 | to the Licensor or its representatives, including but not limited to 56 | communication on electronic mailing lists, source code control systems, 57 | and issue tracking systems that are managed by, or on behalf of, the 58 | Licensor for the purpose of discussing and improving the Work, but 59 | excluding communication that is conspicuously marked or otherwise 60 | designated in writing by the copyright owner as "Not a Contribution." 61 | 62 | "Contributor" shall mean Licensor and any individual or Legal Entity 63 | on behalf of whom a Contribution has been received by Licensor and 64 | subsequently incorporated within the Work. 65 | 66 | 2. Grant of Copyright License. Subject to the terms and conditions of 67 | this License, each Contributor hereby grants to You a perpetual, 68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 69 | copyright license to reproduce, prepare Derivative Works of, 70 | publicly display, publicly perform, sublicense, and distribute the 71 | Work and such Derivative Works in Source or Object form. 72 | 73 | 3. Grant of Patent License. Subject to the terms and conditions of 74 | this License, each Contributor hereby grants to You a perpetual, 75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 76 | (except as stated in this section) patent license to make, have made, 77 | use, offer to sell, sell, import, and otherwise transfer the Work, 78 | where such license applies only to those patent claims licensable 79 | by such Contributor that are necessarily infringed by their 80 | Contribution(s) alone or by combination of their Contribution(s) 81 | with the Work to which such Contribution(s) was submitted. If You 82 | institute patent litigation against any entity (including a 83 | cross-claim or counterclaim in a lawsuit) alleging that the Work 84 | or a Contribution incorporated within the Work constitutes direct 85 | or contributory patent infringement, then any patent licenses 86 | granted to You under this License for that Work shall terminate 87 | as of the date such litigation is filed. 88 | 89 | 4. Redistribution. 
You may reproduce and distribute copies of the 90 | Work or Derivative Works thereof in any medium, with or without 91 | modifications, and in Source or Object form, provided that You 92 | meet the following conditions: 93 | 94 | (a) You must give any other recipients of the Work or 95 | Derivative Works a copy of this License; and 96 | 97 | (b) You must cause any modified files to carry prominent notices 98 | stating that You changed the files; and 99 | 100 | (c) You must retain, in the Source form of any Derivative Works 101 | that You distribute, all copyright, patent, trademark, and 102 | attribution notices from the Source form of the Work, 103 | excluding those notices that do not pertain to any part of 104 | the Derivative Works; and 105 | 106 | (d) If the Work includes a "NOTICE" text file as part of its 107 | distribution, then any Derivative Works that You distribute must 108 | include a readable copy of the attribution notices contained 109 | within such NOTICE file, excluding those notices that do not 110 | pertain to any part of the Derivative Works, in at least one 111 | of the following places: within a NOTICE text file distributed 112 | as part of the Derivative Works; within the Source form or 113 | documentation, if provided along with the Derivative Works; or, 114 | within a display generated by the Derivative Works, if and 115 | wherever such third-party notices normally appear. The contents 116 | of the NOTICE file are for informational purposes only and 117 | do not modify the License. You may add Your own attribution 118 | notices within Derivative Works that You distribute, alongside 119 | or as an addendum to the NOTICE text from the Work, provided 120 | that such additional attribution notices cannot be construed 121 | as modifying the License. 122 | 123 | You may add Your own copyright statement to Your modifications and 124 | may provide additional or different license terms and conditions 125 | for use, reproduction, or distribution of Your modifications, or 126 | for any such Derivative Works as a whole, provided Your use, 127 | reproduction, and distribution of the Work otherwise complies with 128 | the conditions stated in this License. 129 | 130 | 5. Submission of Contributions. Unless You explicitly state otherwise, 131 | any Contribution intentionally submitted for inclusion in the Work 132 | by You to the Licensor shall be under the terms and conditions of 133 | this License, without any additional terms or conditions. 134 | Notwithstanding the above, nothing herein shall supersede or modify 135 | the terms of any separate license agreement you may have executed 136 | with Licensor regarding such Contributions. 137 | 138 | 6. Trademarks. This License does not grant permission to use the trade 139 | names, trademarks, service marks, or product names of the Licensor, 140 | except as required for reasonable and customary use in describing the 141 | origin of the Work and reproducing the content of the NOTICE file. 142 | 143 | 7. Disclaimer of Warranty. Unless required by applicable law or 144 | agreed to in writing, Licensor provides the Work (and each 145 | Contributor provides its Contributions) on an "AS IS" BASIS, 146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 147 | implied, including, without limitation, any warranties or conditions 148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 149 | PARTICULAR PURPOSE. 
You are solely responsible for determining the 150 | appropriateness of using or redistributing the Work and assume any 151 | risks associated with Your exercise of permissions under this License. 152 | 153 | 8. Limitation of Liability. In no event and under no legal theory, 154 | whether in tort (including negligence), contract, or otherwise, 155 | unless required by applicable law (such as deliberate and grossly 156 | negligent acts) or agreed to in writing, shall any Contributor be 157 | liable to You for damages, including any direct, indirect, special, 158 | incidental, or consequential damages of any character arising as a 159 | result of this License or out of the use or inability to use the 160 | Work (including but not limited to damages for loss of goodwill, 161 | work stoppage, computer failure or malfunction, or any and all 162 | other commercial damages or losses), even if such Contributor 163 | has been advised of the possibility of such damages. 164 | 165 | 9. Accepting Warranty or Additional Liability. While redistributing 166 | the Work or Derivative Works thereof, You may choose to offer, 167 | and charge a fee for, acceptance of support, warranty, indemnity, 168 | or other liability obligations and/or rights consistent with this 169 | License. However, in accepting such obligations, You may act only 170 | on Your own behalf and on Your sole responsibility, not on behalf 171 | of any other Contributor, and only if You agree to indemnify, 172 | defend, and hold each Contributor harmless for any liability 173 | incurred by, or claims asserted against, such Contributor by reason 174 | of your accepting any such warranty or additional liability. 175 | 176 | END OF TERMS AND CONDITIONS 177 | 178 | APPENDIX: How to apply the Apache License to your work. 179 | 180 | To apply the Apache License to your work, attach the following 181 | boilerplate notice, with the fields enclosed by brackets "[]" 182 | replaced with your own identifying information. (Don't include 183 | the brackets!) The text should be enclosed in the appropriate 184 | comment syntax for the file format. We also recommend that a 185 | file or class name and description of purpose be included on the 186 | same "printed page" as the copyright notice for easier 187 | identification within third-party archives. 188 | 189 | Copyright [yyyy] [name of copyright owner] 190 | 191 | Licensed under the Apache License, Version 2.0 (the "License"); 192 | you may not use this file except in compliance with the License. 193 | You may obtain a copy of the License at 194 | 195 | http://www.apache.org/licenses/LICENSE-2.0 196 | 197 | Unless required by applicable law or agreed to in writing, software 198 | distributed under the License is distributed on an "AS IS" BASIS, 199 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 200 | See the License for the specific language governing permissions and 201 | limitations under the License. 202 | -------------------------------------------------------------------------------- /workshops/imb9345-2024.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "4c0f45d7", 6 | "metadata": {}, 7 | "source": [ 8 | "# The siibra tool suite – interfaces to a multilevel brain atlas spanning scales and modalities\n", 9 | "\n", 10 | "Studying the human brain requires to capture its structural and functional organization in a common spatial framework. 
siibra is a tool suite that implements a multilevel atlas of the human brain by providing access to reference templates at different spatial scales, complementary parcellation maps, and multimodal data features. \n",
11 | "\n",
12 | "For interactive and explorative use, the tool suite includes a web-based 3D viewer hosted [on EBRAINS](https://atlases.ebrains.eu/viewer). [siibra-python](https://siibra-python.readthedocs.io) is a software library for using the framework in computational workflows, and provides good compatibility with established (neuro)data science tools such as [nibabel](https://nipy.org/nibabel/) and [pandas](https://pandas.pydata.org).\n",
13 | "\n",
14 | "This notebook walks you through some examples that are intended for presenting siibra at the OHBM 2023 conference. It uses a recent development snapshot of siibra."
15 | ]
16 | },
17 | {
18 | "cell_type": "code",
19 | "execution_count": null,
20 | "id": "78209d92",
21 | "metadata": {},
22 | "outputs": [],
23 | "source": [
24 | "!pip install -U --user siibra"
25 | ]
26 | },
27 | {
28 | "cell_type": "code",
29 | "execution_count": null,
30 | "id": "64562619",
31 | "metadata": {},
32 | "outputs": [],
33 | "source": [
34 | "import siibra as sb\n",
35 | "from packaging.version import Version\n",
36 | "assert Version(sb.__version__) >= Version('1.0.1-alpha.2')\n",
37 | "from nilearn import plotting\n",
38 | "import matplotlib.pyplot as plt\n",
39 | "%matplotlib notebook"
40 | ]
41 | },
42 | {
43 | "cell_type": "markdown",
44 | "id": "d38c80d2",
45 | "metadata": {},
46 | "source": [
47 | "# Load example image with regions of interest"
48 | ]
49 | },
50 | {
51 | "cell_type": "markdown",
52 | "id": "ef10c44c",
53 | "metadata": {},
54 | "source": [
55 | "The example starts with a NIfTI image containing some regions of interest in MNI space, which we are going to assign to cytoarchitectonic brain regions, and use for multimodal data queries. Here we just load a predefined NIfTI file, but in general you might use a thresholded functional activation map or a similar signal."
56 | ]
57 | },
58 | {
59 | "cell_type": "code",
60 | "execution_count": null,
61 | "id": "2101caa9",
62 | "metadata": {},
63 | "outputs": [],
64 | "source": [
65 | "img = sb.volumes.from_file(\n",
66 | " \"https://github.com/FZJ-INM1-BDA/siibra-tutorials/raw/main/workshops/ohbm-2023-example-input.nii.gz\",\n",
67 | " name=\"Example input\", \n",
68 | " space=\"mni152\",\n",
69 | ")\n",
70 | "\n",
71 | "# display the image\n",
72 | "plotting.plot_stat_map(img.fetch(), cmap='viridis', colorbar=False, draw_cross=False)"
73 | ]
74 | },
75 | {
76 | "cell_type": "markdown",
77 | "id": "89a7ccca",
78 | "metadata": {},
79 | "source": [
80 | "# Assign activations to cytoarchitectonic brain regions\n",
81 | "\n",
82 | "We request the Julich-Brain probabilistic cytoarchitectonic maps, and perform an assignment of the image signal to the cytoarchitectonic structures. Siibra will automatically identify separate structures in the image. The result is a pandas DataFrame with regions assigned to each structure identified in the input image, and several scores for the assignment.
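The full set of assignment scores can be inspected directly on the returned table — a minimal sketch, to be run after the next cell, using only plain pandas on the `matches` table defined there:

```python
# Hedged sketch (run after the assignment in the next cell): the same
# table can be re-ranked by any other score column, e.g. the
# 'map containedness' score used for filtering below.
print(list(matches.columns))                    # available score columns
top = matches.sort_values('map containedness')  # plain pandas re-ranking
print(top[['region', 'correlation']].head())
```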
" 83 | ] 84 | }, 85 | { 86 | "cell_type": "code", 87 | "execution_count": null, 88 | "id": "979701eb", 89 | "metadata": {}, 90 | "outputs": [], 91 | "source": [ 92 | "# retrieve Julich-Brain probabilistic cytoarchitectonic maps\n", 93 | "julich_pmaps = sb.get_map(\n", 94 | " sb.parcellations.get('julich 2.9'),\n", 95 | " sb.spaces.get('mni 152'),\n", 96 | " sb.MapType.STATISTICAL\n", 97 | ")\n", 98 | "\n", 99 | "# assign the input image from above to the cytoarchitectonic maps.\n", 100 | "# This results in a pandas DataFrame, which we sort by correlation.\n", 101 | "matches = julich_pmaps.assign(img).sort_values('correlation', ascending=False)\n", 102 | "\n", 103 | "# For this example, we work only with a few of the assigned structures.\n", 104 | "# We filter by strong correlation and containedness scores.\n", 105 | "matches = matches[(matches.correlation > 0.3) & (matches['map containedness'] < 0.25)]\n", 106 | "\n", 107 | "# display some columns of the filtered table\n", 108 | "matches[['input structure', 'region', 'correlation']].round(2)" 109 | ] 110 | }, 111 | { 112 | "cell_type": "code", 113 | "execution_count": null, 114 | "id": "ee50f9da", 115 | "metadata": {}, 116 | "outputs": [], 117 | "source": [ 118 | "matches" 119 | ] 120 | }, 121 | { 122 | "cell_type": "markdown", 123 | "id": "f5bc1b46", 124 | "metadata": {}, 125 | "source": [ 126 | "Show these filtered matched brain regions in 3D." 127 | ] 128 | }, 129 | { 130 | "cell_type": "code", 131 | "execution_count": null, 132 | "id": "c25bf33e", 133 | "metadata": {}, 134 | "outputs": [], 135 | "source": [ 136 | "fig, axs = plt.subplots(1, len(matches), figsize=(8, 4))\n", 137 | "for i, match in enumerate(matches.itertuples()):\n", 138 | " plotting.plot_stat_map(\n", 139 | " julich_pmaps.fetch(region=match.region),\n", 140 | " axes=axs[i], colorbar=False, display_mode='z',\n", 141 | " cut_coords=[int(match.region.compute_centroids('mni152')[0][2])],\n", 142 | " title=match.region.name\n", 143 | " )" 144 | ] 145 | }, 146 | { 147 | "cell_type": "markdown", 148 | "id": "2250cbd7", 149 | "metadata": {}, 150 | "source": [ 151 | "# Query multimodal regional features\n", 152 | "\n", 153 | "Retrieving regional data features is as simple as specifying an atlas concept - such as a region, parcellation or space - and a feature modality. So in order to find receptor density fingerprints in the left primary visual cortex, we could do:" 154 | ] 155 | }, 156 | { 157 | "cell_type": "code", 158 | "execution_count": null, 159 | "id": "5231d9e8", 160 | "metadata": {}, 161 | "outputs": [], 162 | "source": [ 163 | "# retrieve the features for v1 left\n", 164 | "features = sb.features.get(\n", 165 | " sb.get_region('julich 2.9', 'v1 left'),\n", 166 | " sb.features.molecular.ReceptorDensityFingerprint,\n", 167 | ")\n", 168 | "\n", 169 | "# display the data array of the first of the returned features\n", 170 | "features[0].data.T" 171 | ] 172 | }, 173 | { 174 | "cell_type": "markdown", 175 | "id": "25a7d64c", 176 | "metadata": {}, 177 | "source": [ 178 | "We use the same mechanism now to build a table of three different feature modalities measured in the two selected brain regions. We choose receptor densities and cell densities, and complement them with connectivity profiles below. The connectivity uses a slightly different query, since the connectivity matrix is linked to the parcellation object, not a single region." 
179 | ] 180 | }, 181 | { 182 | "cell_type": "code", 183 | "execution_count": null, 184 | "id": "34261ea7", 185 | "metadata": {}, 186 | "outputs": [], 187 | "source": [ 188 | "# retrieve the features for v1 left\n", 189 | "features = sb.features.get(\n", 190 | " sb.get_region('julich 2.9', 'v1 left'),\n", 191 | " sb.features.molecular.GeneExpressions,\n", 192 | " gene=['GABBR1', 'GABBR2']\n", 193 | ")\n", 194 | "\n", 195 | "# display the data array of the first of the returned features\n", 196 | "features[0].data" 197 | ] 198 | }, 199 | { 200 | "cell_type": "code", 201 | "execution_count": null, 202 | "id": "3ef089e9", 203 | "metadata": {}, 204 | "outputs": [], 205 | "source": [ 206 | "# show their locations\n", 207 | "plotting.plot_markers(node_coords = list(features[0].data.mni_xyz), node_values=features[0].data['sample'])" 208 | ] 209 | }, 210 | { 211 | "cell_type": "code", 212 | "execution_count": null, 213 | "id": "0c39911b", 214 | "metadata": {}, 215 | "outputs": [], 216 | "source": [ 217 | "# create the plot\n", 218 | "fig, axs = plt.subplots(\n", 219 | " 3,\n", 220 | " len(matches),\n", 221 | " sharey='row', figsize=(9,9)\n", 222 | ")\n", 223 | "\n", 224 | "# row 1 - receptor densities for different receptor types\n", 225 | "for i, m in enumerate(matches.itertuples()):\n", 226 | " features = sb.features.get(\n", 227 | " m.region, \n", 228 | " sb.features.molecular.ReceptorDensityFingerprint\n", 229 | " )\n", 230 | " features[0].plot(ax=axs[0, i])\n", 231 | "\n", 232 | "# row 2 - cell densities per cortical layer\n", 233 | "for i, m in enumerate(matches.itertuples()):\n", 234 | " features = sb.features.get(\n", 235 | " m.region, \n", 236 | " sb.features.cellular.LayerwiseCellDensity\n", 237 | " )\n", 238 | " features[0].plot(ax=axs[1, i])\n", 239 | " \n", 240 | "# row 3 - gene expression\n", 241 | "for i, m in enumerate(matches.itertuples()):\n", 242 | " features = sb.features.get(\n", 243 | " m.region, \n", 244 | " sb.features.molecular.GeneExpressions,\n", 245 | " gene=['MAOA', 'TAC1', 'GABBR1', 'GABBR2']\n", 246 | " )\n", 247 | " features[0].data.boxplot(column='zscore', by='gene', ax=axs[2, i])\n", 248 | "\n", 249 | "# optimize plot layout\n", 250 | "plt.tight_layout()" 251 | ] 252 | }, 253 | { 254 | "cell_type": "markdown", 255 | "id": "90cd24de", 256 | "metadata": {}, 257 | "source": [ 258 | "# Extract BigBrain image data\n", 259 | "\n", 260 | "Finally, we sample 3D chunks of the BigBrain volume located in the identified brain regions." 
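The key step in this section is that siibra locations can be warped between reference spaces. A minimal sketch of the point warping used in the next cells — the exact output coordinates will vary, since locations are sampled at random:

```python
# Hedged sketch: sample one random location inside the first matched
# region (in MNI space) and warp it to BigBrain space, exactly as the
# cell below does for all matches.
pt_mni = julich_pmaps.sample_locations(matches.iloc[0].region, 1)[0]
pt_bigbrain = pt_mni.warp('bigbrain')
print(pt_mni, '->', pt_bigbrain)
```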
261 | ] 262 | }, 263 | { 264 | "cell_type": "code", 265 | "execution_count": null, 266 | "id": "ffa4d0e6", 267 | "metadata": {}, 268 | "outputs": [], 269 | "source": [ 270 | "# access the BigBrain reference template\n", 271 | "bigbrain = sb.get_template('bigbrain')\n", 272 | "\n", 273 | "# fetch the whole-brain volume at reduced resolution\n", 274 | "bigbrain_volume = bigbrain.fetch(resolution_mm=.64)" 275 | ] 276 | }, 277 | { 278 | "cell_type": "code", 279 | "execution_count": null, 280 | "id": "21292efa", 281 | "metadata": {}, 282 | "outputs": [], 283 | "source": [ 284 | "# prepare the plot\n", 285 | "f, axs = plt.subplots(2, len(matches), figsize=(9, 7))\n", 286 | "plot_kwargs = {\n", 287 | " \"bg_img\": None,\n", 288 | " \"cmap\": 'gray',\n", 289 | " \"colorbar\": False, \n", 290 | " \"draw_cross\": False,\n", 291 | " \"annotate\": False, \n", 292 | " 'vmin': 0,\n", 293 | " 'vmax': 255\n", 294 | "}\n", 295 | "\n", 296 | "# for each matched brain region, sample a random 3D location in MNI space,\n", 297 | "# warp it to bigbrain space, and fetch a 3D chunk of 3mm sidelength\n", 298 | "# from the full resolution Big Brain (20 micron) at this position.\n", 299 | "for i, match in enumerate(matches.itertuples()):\n", 300 | "\n", 301 | " point = julich_pmaps.sample_locations(match.region, 1)[0].warp('bigbrain')\n", 302 | " view = plotting.plot_img(bigbrain_volume, axes=axs[0, i], cut_coords=tuple(point), **plot_kwargs)\n", 303 | " view.add_markers([tuple(point)])\n", 304 | "\n", 305 | " voi = point.get_enclosing_cube(width_mm=3)\n", 306 | " chunk = bigbrain.fetch(voi=voi, resolution_mm=0.02)\n", 307 | " plotting.plot_img(chunk, axes=axs[1, i], **plot_kwargs)\n", 308 | "\n", 309 | "plt.tight_layout()" 310 | ] 311 | }, 312 | { 313 | "cell_type": "code", 314 | "execution_count": null, 315 | "id": "55878936", 316 | "metadata": {}, 317 | "outputs": [], 318 | "source": [] 319 | } 320 | ], 321 | "metadata": { 322 | "kernelspec": { 323 | "display_name": "venv", 324 | "language": "python", 325 | "name": "python3" 326 | }, 327 | "language_info": { 328 | "codemirror_mode": { 329 | "name": "ipython", 330 | "version": 3 331 | }, 332 | "file_extension": ".py", 333 | "mimetype": "text/x-python", 334 | "name": "python", 335 | "nbconvert_exporter": "python", 336 | "pygments_lexer": "ipython3", 337 | "version": "3.12.7" 338 | } 339 | }, 340 | "nbformat": 4, 341 | "nbformat_minor": 5 342 | } 343 | -------------------------------------------------------------------------------- /02-DataFeatures.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "87b3cac8", 6 | "metadata": { 7 | "slideshow": { 8 | "slide_type": "slide" 9 | } 10 | }, 11 | "source": [ 12 | "# Extracting multimodal data features\n", 13 | "\n", 14 | "`siibra` provides access to data features of different modalities. The features and their query functions are bundled in the module `siibra.features`. " 15 | ] 16 | }, 17 | { 18 | "cell_type": "code", 19 | "execution_count": null, 20 | "id": "cf1f23d0", 21 | "metadata": {}, 22 | "outputs": [], 23 | "source": [ 24 | "import siibra\n", 25 | "from packaging.version import Version\n", 26 | "assert Version(siibra.__version__) >= Version('1.0.1-alpha.2')" 27 | ] 28 | }, 29 | { 30 | "cell_type": "markdown", 31 | "id": "7b75cdf8", 32 | "metadata": {}, 33 | "source": [ 34 | "We can choose different types of features from this module. 
The feature types are organized in a hierarchy under the most abstract type `siibra.features.Feature`. All other feature types are subclasses of it. The current hierarchy can be obtained by"
35 | ]
36 | },
37 | {
38 | "cell_type": "code",
39 | "execution_count": 26,
40 | "id": "96143ef5",
41 | "metadata": {},
42 | "outputs": [
43 | {
44 | "name": "stdout",
45 | "output_type": "stream",
46 | "text": [
47 | "Feature\n",
48 | "├── CompoundFeature\n",
49 | "├── Tabular\n",
50 | "│ ├── CorticalProfile\n",
51 | "│ │ ├── BigBrainIntensityProfile\n",
52 | "│ │ ├── CellDensityProfile\n",
53 | "│ │ └── ReceptorDensityProfile\n",
54 | "│ ├── GeneExpressions\n",
55 | "│ │ ├── ProxyFeature\n",
56 | "│ │ └── ProxyFeature\n",
57 | "│ ├── LayerwiseBigBrainIntensities\n",
58 | "│ ├── LayerwiseCellDensity\n",
59 | "│ ├── ReceptorDensityFingerprint\n",
60 | "│ └── RegionalTimeseriesActivity\n",
61 | "│ └── RegionalBOLD\n",
62 | "├── RegionalConnectivity\n",
63 | "│ ├── FunctionalConnectivity\n",
64 | "│ ├── AnatomoFunctionalConnectivity\n",
65 | "│ ├── StreamlineCounts\n",
66 | "│ ├── StreamlineLengths\n",
67 | "│ └── TracingConnectivity\n",
68 | "├── Image\n",
69 | "│ ├── CellBodyStainedVolumeOfInterest\n",
70 | "│ ├── BlockfaceVolumeOfInterest\n",
71 | "│ ├── DTIVolumeOfInterest\n",
72 | "│ ├── PLIVolumeOfInterest\n",
73 | "│ ├── MRIVolumeOfInterest\n",
74 | "│ ├── XPCTVolumeOfInterest\n",
75 | "│ ├── LSFMVolumeOfInterest\n",
76 | "│ └── CellbodyStainedSection\n",
77 | "└── EbrainsDataFeature\n"
78 | ]
79 | }
80 | ],
81 | "source": [
82 | "siibra.features.render_ascii_tree(\"Feature\")"
83 | ]
84 | },
85 | {
86 | "cell_type": "markdown",
87 | "id": "3d47c73a",
88 | "metadata": {},
89 | "source": [
90 | "### Example: Densities of neurotransmitter receptors\n",
91 | "\n",
92 | "Features can be queried for brain regions, parcellations and location objects (such as volumes of interest in a reference space) with `siibra.features.get()`, which accepts a query object and a feature type. It will query all subclasses of the given feature type, if any. Here is a simple example for getting a receptor density fingerprint in region V1. The data is of tabular type, provided as a pandas DataFrame."
93 | ]
94 | },
95 | {
96 | "cell_type": "code",
97 | "execution_count": null,
98 | "id": "fd6faf9f",
99 | "metadata": {},
100 | "outputs": [],
101 | "source": [
102 | "v1 = siibra.get_region(parcellation='julich 2.9', region='v1')"
103 | ]
104 | },
105 | {
106 | "cell_type": "code",
107 | "execution_count": null,
108 | "id": "5aababb4",
109 | "metadata": {},
110 | "outputs": [],
111 | "source": [
112 | "features = siibra.features.get(\n",
113 | " v1, siibra.features.molecular.ReceptorDensityFingerprint\n",
114 | ")\n",
115 | "print(features[0].name)\n",
116 | "features[0].data.T # we transpose the table for display"
117 | ]
118 | },
119 | {
120 | "cell_type": "markdown",
121 | "id": "30fa6632",
122 | "metadata": {},
123 | "source": [
124 | "Most features provide a plot method for easy inspection."
125 | ]
126 | },
127 | {
128 | "cell_type": "code",
129 | "execution_count": null,
130 | "id": "51cf64ec",
131 | "metadata": {},
132 | "outputs": [],
133 | "source": [
134 | "features[0].plot()"
135 | ]
136 | },
137 | {
138 | "cell_type": "markdown",
139 | "id": "db81a0e1",
140 | "metadata": {},
141 | "source": [
142 | "Typically, not all regions have all data modalities linked. Neurotransmitter density fingerprints are currently available for a subset of the cytoarchitectonic brain regions in the human brain.
We can see this by querying with the parcellation, which represents the root region and thus provides all features linked to any of the Julich-Brain regions.\n",
143 | "\n",
144 | "Here we just print the set of unique region names to which receptor densities are linked."
145 | ]
146 | },
147 | {
148 | "cell_type": "code",
149 | "execution_count": null,
150 | "id": "8aaac974",
151 | "metadata": {},
152 | "outputs": [],
153 | "source": [
154 | "features = siibra.features.get(\n",
155 | " siibra.parcellations.get('julich'),\n",
156 | " siibra.features.molecular.ReceptorDensityFingerprint\n",
157 | ")\n",
158 | "print(\n",
159 | " \"Regions with receptor features:\\n - \"\n",
160 | " + \"\\n - \".join(\n",
161 | " {r.name for f in features for r in f.anchor.regions}\n",
162 | " )\n",
163 | ")"
164 | ]
165 | },
166 | {
167 | "cell_type": "markdown",
168 | "id": "301ce31b",
169 | "metadata": {},
170 | "source": [
171 | "### Cortical cell densities\n",
172 | "\n",
173 | "The cellular level contains layer-specific cell distributions extracted from different regions of BigBrain."
174 | ]
175 | },
176 | {
177 | "cell_type": "code",
178 | "execution_count": null,
179 | "id": "9e1afe52",
180 | "metadata": {},
181 | "outputs": [],
182 | "source": [
183 | "features = siibra.features.get(v1, siibra.features.cellular.LayerwiseCellDensity)\n",
184 | "print(\n",
185 | " f\"{features[0].modality}\\n\\n\"\n",
186 | " f\"{features[0].description}\"\n",
187 | ")\n",
188 | "fig = features[0].plot()"
189 | ]
190 | },
191 | {
192 | "cell_type": "markdown",
193 | "id": "6b614ad5",
194 | "metadata": {},
195 | "source": [
196 | "We can retrieve a similar feature representing staining intensities in BigBrain - `siibra.features.cellular.LayerwiseBigBrainIntensities`. This uses cortical staining profiles computed by Konrad Wagstyl. At first call, the query will take a bit to retrieve the profiles."
197 | ]
198 | },
199 | {
200 | "cell_type": "code",
201 | "execution_count": null,
202 | "id": "b4b4ea67",
203 | "metadata": {},
204 | "outputs": [],
205 | "source": [
206 | "features = siibra.features.get(v1, siibra.features.cellular.LayerwiseBigBrainIntensities)\n",
207 | "print(\n",
208 | " f\"{features[0].modality}\\n\\n\"\n",
209 | " f\"{features[0].description}\"\n",
210 | ")\n",
211 | "fig = features[0].plot()"
212 | ]
213 | },
214 | {
215 | "cell_type": "markdown",
216 | "id": "7f0c1bdc",
217 | "metadata": {
218 | "slideshow": {
219 | "slide_type": "slide"
220 | }
221 | },
222 | "source": [
223 | "### Gene Expressions from the Allen Atlas \n",
224 | "\n",
225 | "`siibra` also implements a live query to gene expression data from the Allen atlas to extract regional gene expression levels. Gene expressions are linked to atlas regions by coordinates of their probes in MNI space. When called with a brain region, `siibra.features.get` will generate a mask of this region in MNI space to filter the probes. It provides the regional expression levels as tabular data, together with the MNI coordinates."
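Because the query below returns an ordinary table, standard pandas aggregation applies. A minimal sketch, to be run after the next cell — the `gene` and `zscore` column names are assumptions based on how the workshop notebooks in this repository use this table:

```python
# Hedged sketch (run after the next cell): average z-scored expression
# per gene via plain pandas. Column names are assumptions taken from
# the workshop notebooks in this repository.
print(features[0].data.groupby('gene')['zscore'].mean())
```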
226 | ]
227 | },
228 | {
229 | "cell_type": "code",
230 | "execution_count": null,
231 | "id": "4aad09b3",
232 | "metadata": {},
233 | "outputs": [],
234 | "source": [
235 | "features = siibra.features.get(\n",
236 | " v1, siibra.features.molecular.GeneExpressions, gene=[\"TAC1\", \"MAOA\", \"GABARAPL2\"]\n",
237 | ")\n",
238 | "fig = features[0].plot(title=f'Gene Expressions in {v1}')\n",
239 | "features[0].data"
240 | ]
241 | },
242 | {
243 | "cell_type": "markdown",
244 | "id": "7e0192ee",
245 | "metadata": {},
246 | "source": [
247 | "We can plot the MNI coordinates to localize the measures and verify they are located in V1."
248 | ]
249 | },
250 | {
251 | "cell_type": "code",
252 | "execution_count": null,
253 | "id": "6261be77",
254 | "metadata": {},
255 | "outputs": [],
256 | "source": [
257 | "from nilearn import plotting\n",
258 | "mask = v1.get_regional_mask(\"mni152\")\n",
259 | "display = plotting.plot_glass_brain(mask.fetch(), cmap='viridis')\n",
260 | "locations = features[0].anchor.location\n",
261 | "display.add_markers(locations.as_list(), marker_size=2) "
262 | ]
263 | },
264 | {
265 | "cell_type": "markdown",
266 | "id": "5e5957f0",
267 | "metadata": {
268 | "slideshow": {
269 | "slide_type": "slide"
270 | }
271 | },
272 | "source": [
273 | "### Connectivity matrices\n",
274 | "\n",
275 | "`siibra` provides connectivity matrices with parcellation-averaged structural and functional measurements for individual subjects from different cohorts. Here we request streamline counts for Julich-Brain. Since the datasets comprise hundreds of connectivity matrices for the same cohort and modality, they are grouped into \"compounds\"."
276 | ]
277 | },
278 | {
279 | "cell_type": "code",
280 | "execution_count": null,
281 | "id": "5cae2bd7",
282 | "metadata": {},
283 | "outputs": [],
284 | "source": [
285 | "features = siibra.features.get(\n",
286 | " siibra.parcellations.get('julich 2.9'),\n",
287 | " siibra.features.connectivity.StreamlineCounts\n",
288 | ")"
289 | ]
290 | },
291 | {
292 | "cell_type": "markdown",
293 | "id": "345906a7",
294 | "metadata": {},
295 | "source": [
296 | "Let's check the cohort and indices of individual feature elements for the first compound. The elements are indexed by subject ID."
297 | ]
298 | },
299 | {
300 | "cell_type": "code",
301 | "execution_count": null,
302 | "id": "a243aa83",
303 | "metadata": {},
304 | "outputs": [],
305 | "source": [
306 | "print(\n",
307 | " f\"{features[0].cohort}\\n\\n\"\n",
308 | " \"Indices: \"\n",
309 | " f\"{', '.join(features[0].indices)}\"\n",
310 | ")"
311 | ]
312 | },
313 | {
314 | "cell_type": "markdown",
315 | "id": "456679fe",
316 | "metadata": {},
317 | "source": [
318 | "We can retrieve the matrix of a particular subject using the `get_element()` method of the compound. It includes the actual data as a pandas DataFrame, with the notable property that the row and column indices are valid region objects for further reference. In particular, this means that we can directly associate each measure with the corresponding information in the parcellation, and with a mask of a parcellation map."
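Since every element of a compound is itself a tabular feature, group-level summaries reduce to plain pandas. A minimal sketch using only the `indices` and `get_element()` API introduced here:

```python
# Hedged sketch: average the streamline-count matrices across all
# subjects of the first compound (pure pandas arithmetic; summing
# DataFrames with identical region indices works element-wise).
subject_tables = [features[0].get_element(i).data for i in features[0].indices]
group_average = sum(subject_tables) / len(subject_tables)
group_average
```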
319 | ]
320 | },
321 | {
322 | "cell_type": "code",
323 | "execution_count": null,
324 | "id": "6b2db85a",
325 | "metadata": {},
326 | "outputs": [],
327 | "source": [
328 | "sc_hcp_000 = features[0].get_element('000')\n",
329 | "sc_hcp_000.data"
330 | ]
331 | },
332 | {
333 | "cell_type": "markdown",
334 | "id": "ed50d6e3",
335 | "metadata": {},
336 | "source": [
337 | "As an example, we retrieve the centroids in MNI152 space and plot the connectivity graph in 3D:"
338 | ]
339 | },
340 | {
341 | "cell_type": "code",
342 | "execution_count": null,
343 | "id": "5956a1f3",
344 | "metadata": {},
345 | "outputs": [],
346 | "source": [
347 | "node_coords = sc_hcp_000.compute_centroids(space='mni152')\n",
348 | "plotting.view_connectome(\n",
349 | " adjacency_matrix=sc_hcp_000.data,\n",
350 | " node_coords=node_coords,\n",
351 | " edge_threshold=\"99%\",\n",
352 | " node_size=3, colorbar=False,\n",
353 | " edge_cmap=\"bwr\"\n",
354 | ")"
355 | ]
356 | },
357 | {
358 | "cell_type": "code",
359 | "execution_count": null,
360 | "id": "08fa9c2e",
361 | "metadata": {},
362 | "outputs": [],
363 | "source": []
364 | }
365 | ],
366 | "metadata": {
367 | "kernelspec": {
368 | "display_name": ".venv",
369 | "language": "python",
370 | "name": "python3"
371 | },
372 | "language_info": {
373 | "codemirror_mode": {
374 | "name": "ipython",
375 | "version": 3
376 | },
377 | "file_extension": ".py",
378 | "mimetype": "text/x-python",
379 | "name": "python",
380 | "nbconvert_exporter": "python",
381 | "pygments_lexer": "ipython3",
382 | "version": "3.12.7"
383 | },
384 | "vscode": {
385 | "interpreter": {
386 | "hash": "c6134497cd5410b1b275ebc88f99d14855849379b696eb9e04cff5dd9aa5e77a"
387 | }
388 | }
389 | },
390 | "nbformat": 4,
391 | "nbformat_minor": 5
392 | }
393 |
--------------------------------------------------------------------------------
/01-BasicConcepts.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "## Import siibra"
8 | ]
9 | },
10 | {
11 | "cell_type": "code",
12 | "execution_count": null,
13 | "metadata": {},
14 | "outputs": [],
15 | "source": [
16 | "import siibra\n",
17 | "from packaging.version import Version\n",
18 | "assert Version(siibra.__version__) >= Version('1.0.1-alpha.2')\n",
19 | "import os\n",
20 | "import matplotlib\n",
21 | "%matplotlib inline"
22 | ]
23 | },
24 | {
25 | "cell_type": "markdown",
26 | "metadata": {},
27 | "source": [
28 | "## Accessing brain parcellations\n",
29 | "\n",
30 | "Preconfigured reference parcellations are stored in the instance table `siibra.parcellations`. \n",
31 | "The configuration is retrieved automatically from a GitHub repository that we maintain for siibra.\n",
32 | "Instance tables provide a tabular overview of their elements via their `dataframe` attribute, which yields a [pandas DataFrame](https://pandas.pydata.org/docs/reference/api/pandas.DataFrame.html) - a rich object with spreadsheet-like functionality."
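For example, the overview can be filtered with ordinary pandas operations. A minimal sketch — the `name` column is an assumption about the table layout:

```python
# Hedged sketch: filter the parcellation overview by name.
# Assumption: the dataframe exposes a 'name' column.
df = siibra.parcellations.dataframe
print(df[df['name'].str.contains('julich', case=False)])
```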
33 | ]
34 | },
35 | {
36 | "cell_type": "code",
37 | "execution_count": null,
38 | "metadata": {},
39 | "outputs": [],
40 | "source": [
41 | "siibra.parcellations.dataframe"
42 | ]
43 | },
44 | {
45 | "cell_type": "markdown",
46 | "metadata": {},
47 | "source": [
48 | "Elements in an instance table can be accessed in a couple of ways, in particular\n",
49 | "\n",
50 | " - by iterating over all instances\n",
51 | " - by fuzzy matching of keyword or name with the index operator `[]` or the `get()` method\n",
52 | " - by tab-completion\n",
53 | "\n",
54 | "Let's use keyword matching to retrieve the most recent Julich-Brain parcellation."
55 | ]
56 | },
57 | {
58 | "cell_type": "code",
59 | "execution_count": null,
60 | "metadata": {},
61 | "outputs": [],
62 | "source": [
63 | "julichbrain = siibra.parcellations.get('julich')\n",
64 | "julichbrain"
65 | ]
66 | },
67 | {
68 | "cell_type": "markdown",
69 | "metadata": {},
70 | "source": [
71 | "There is also an instance table of atlases, which we could use to access the parcellations linked with the human atlas."
72 | ]
73 | },
74 | {
75 | "cell_type": "code",
76 | "execution_count": null,
77 | "metadata": {},
78 | "outputs": [],
79 | "source": [
80 | "siibra.atlases.get('human').parcellations"
81 | ]
82 | },
83 | {
84 | "cell_type": "code",
85 | "execution_count": null,
86 | "metadata": {},
87 | "outputs": [],
88 | "source": [
89 | "# let's look at some metadata\n",
90 | "print(f\"Name: {julichbrain.name}\")\n",
91 | "print(f\"Id: {julichbrain.id}\")\n",
92 | "print(f\"Modality: {julichbrain.modality}\\n\")\n",
93 | "print(f\"{julichbrain.description}\\n\")\n",
94 | "for p in julichbrain.publications:\n",
95 | " print(p['citation'])"
96 | ]
97 | },
98 | {
99 | "cell_type": "markdown",
100 | "metadata": {},
101 | "source": [
102 | "The resulting parcellation is a semantic object. It represents the region hierarchy of the parcellation.\n",
103 | "We can find regions by name using the `find` function. If we know unique keywords and expect a single match, we can also use `get`."
104 | ]
105 | },
106 | {
107 | "cell_type": "code",
108 | "execution_count": null,
109 | "metadata": {},
110 | "outputs": [],
111 | "source": [
112 | "for region in julichbrain.find('v1'):\n",
113 | " print(region.name)"
114 | ]
115 | },
116 | {
117 | "cell_type": "markdown",
118 | "metadata": {},
119 | "source": [
120 | "As you see, areas often appear three times: Julich-Brain defines them separately for the left and right hemisphere, and additionally defines a common parent region. In fact the parent object represents the corresponding subtree. We can more easily access individual regions by using `get_region` instead of `find`. This method assumes the region specification is unique, and either returns a single region object or fails (a defensive variant is sketched below). If it finds multiple matches, it will check whether they have a common parent."
121 | ]
122 | },
123 | {
124 | "cell_type": "code",
125 | "execution_count": null,
126 | "metadata": {},
127 | "outputs": [],
128 | "source": [
129 | "# the whole v1 subtree, covering both hemispheres\n",
130 | "julichbrain.get_region('v1')"
131 | ]
132 | },
133 | {
134 | "cell_type": "markdown",
135 | "metadata": {},
136 | "source": [
137 | "You may output the subtree anchored at a given region, if any, using `Region.tree2str()`. This is useful for inspecting a region object."
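The defensive variant mentioned above could look like this — a minimal sketch, where the exact exception type raised by siibra is an assumption:

```python
# Hedged sketch: guard against ambiguous or unknown region specs.
try:
    region = julichbrain.get_region('hoc')  # deliberately ambiguous spec
except Exception as err:  # the specific exception class may differ
    print(f"Could not resolve region spec: {err}")
```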
138 | ]
139 | },
140 | {
141 | "cell_type": "code",
142 | "execution_count": null,
143 | "metadata": {},
144 | "outputs": [],
145 | "source": [
146 | "occ = julichbrain.get_region('occipital cortex')\n",
147 | "print(occ.tree2str())"
148 | ]
149 | },
150 | {
151 | "cell_type": "markdown",
152 | "metadata": {},
153 | "source": [
154 | "## Accessing parcellation maps\n",
155 | "\n",
156 | "A parcellation map or region map is a spatial object corresponding to a parcellation. \n",
157 | "We can access maps with the `get_map` function of parcellation objects.\n",
158 | "Since parcellations may provide maps in different spaces, `siibra` expects you to specify the space. \n",
159 | "Note: Preconfigured reference spaces are managed in another instance table - `siibra.spaces` (you might have guessed it). \n",
160 | "\n",
161 | "\n",
162 | "Let's access the maximum probability map of Julich-Brain in the MNI152 space to see how that works."
163 | ]
164 | },
165 | {
166 | "cell_type": "code",
167 | "execution_count": null,
168 | "metadata": {},
169 | "outputs": [],
170 | "source": [
171 | "julich_mpm = julichbrain.get_map(space=siibra.spaces.MNI_152_ICBM_2009C_NONLINEAR_ASYMMETRIC)\n",
172 | "julich_mpm"
173 | ]
174 | },
175 | {
176 | "cell_type": "markdown",
177 | "metadata": {},
178 | "source": [
179 | "## Fetching the actual image of a parcellation map\n",
180 | "The returned map provides all information required to fetch the actual image.\n",
181 | "To access it we need to retrieve the actual data using the `fetch()` method, which returns a Nifti1Image object.\n",
182 | "This step is separate for two reasons:\n",
183 | "- The parcellation map is more than just the image - it provides information about the space and parcellation of the map, and possibly multiple resources where the data is stored.\n",
184 | "- `siibra` uses a lazy strategy for data loading. `fetch` is the typical last step to actually retrieve the underlying content.\n",
185 | "\n",
186 | "We can use the wonderful `nilearn` library for plotting the map. Its plotting functions assume MNI152 space by default, so as long as we work in this space, plotting is straightforward.\n",
187 | "\n",
188 | "Some parcellations (and other 3D volumes) are split into multiple fragments represented in separate image volumes. For Julich-Brain 2.9, each hemisphere is in a different fragment. We can fetch individual fragments, but if no fragment is specified, siibra will merge the available ones into a single volume:"
189 | ]
190 | },
191 | {
192 | "cell_type": "code",
193 | "execution_count": null,
194 | "metadata": {},
195 | "outputs": [],
196 | "source": [
197 | "from nilearn import plotting\n",
198 | "cmap = julich_mpm.get_colormap()\n",
199 | "plotting.plot_roi(julich_mpm.fetch(), cmap=cmap, title=julich_mpm.parcellation.name)"
200 | ]
201 | },
202 | {
203 | "cell_type": "markdown",
204 | "metadata": {},
205 | "source": [
206 | "## Fetching probability maps\n",
207 | "\n",
208 | "Julich-Brain, like some other parcellations, is a probabilistic parcellation. The labelled volumes in the maximum probability map (mpm) above are only a summary representation, displaying for each voxel the brain region of highest probability. \n",
209 | "Each region is additionally available as a probability map, which provides statistical information in the reference space for each particular region.\n",
210 | "\n",
211 | "We received the labelled volumes above because `siibra` uses labelled volumes as the default map type.
\n", 212 | "To retrieve probability maps, we explicitly request `siibra.MapType.STATISTICAL` as maptype from the parcellation.\n", 213 | "It returns a sparse map representation, since the set of all probability maps contains several 100 of NIfTI volumes with mostly empty voxels." 214 | ] 215 | }, 216 | { 217 | "cell_type": "code", 218 | "execution_count": null, 219 | "metadata": {}, 220 | "outputs": [], 221 | "source": [ 222 | "julich_pmaps = julichbrain.get_map(\n", 223 | " space=siibra.spaces.MNI_152_ICBM_2009C_NONLINEAR_ASYMMETRIC,\n", 224 | " maptype=siibra.MapType.STATISTICAL\n", 225 | ")\n", 226 | "julich_pmaps" 227 | ] 228 | }, 229 | { 230 | "cell_type": "markdown", 231 | "metadata": {}, 232 | "source": [ 233 | "To access the probability maps, we will call fetch again. However, this time, we need to specify a region.\n", 234 | "The sparse representation will then generate a (dense) Nifti1Image which we can use as expected.\n", 235 | "Plotting of probability maps works nicely with nilearn's `plot_stat_map`." 236 | ] 237 | }, 238 | { 239 | "cell_type": "code", 240 | "execution_count": null, 241 | "metadata": {}, 242 | "outputs": [], 243 | "source": [ 244 | "pmap = julich_pmaps.fetch(region='hoc5 right')\n", 245 | "plotting.plot_stat_map(pmap, title=f'hOc5 right of {julich_pmaps.parcellation.name}')" 246 | ] 247 | }, 248 | { 249 | "cell_type": "markdown", 250 | "metadata": {}, 251 | "source": [ 252 | "In the background, `siibra` uses an index to identify regions in a parcellation map.\n", 253 | "The index informs about the image volume and the label used to map the region.\n", 254 | "Usually we don't need to, but we can request and use these indices as well for fetching.\n", 255 | "We will see that a region in the probability map is indexed by the volume, not by a label." 256 | ] 257 | }, 258 | { 259 | "cell_type": "code", 260 | "execution_count": null, 261 | "metadata": {}, 262 | "outputs": [], 263 | "source": [ 264 | "index = julich_pmaps.get_index(region='hoc5 right')\n", 265 | "index" 266 | ] 267 | }, 268 | { 269 | "cell_type": "markdown", 270 | "metadata": {}, 271 | "source": [ 272 | "This is different if we request the index of the same region in the maximum probability map, which is a labelled parcellation and represents all regions by their voxel label in the same volume:" 273 | ] 274 | }, 275 | { 276 | "cell_type": "code", 277 | "execution_count": null, 278 | "metadata": {}, 279 | "outputs": [], 280 | "source": [ 281 | "index = julich_mpm.get_index(region='hoc5 right')\n", 282 | "index" 283 | ] 284 | }, 285 | { 286 | "cell_type": "markdown", 287 | "metadata": {}, 288 | "source": [ 289 | "As mentioned before, while not recommended, we can also use this index to fetch from the map instead of using a region or region name:" 290 | ] 291 | }, 292 | { 293 | "cell_type": "code", 294 | "execution_count": null, 295 | "metadata": {}, 296 | "outputs": [], 297 | "source": [ 298 | "pmap = julich_pmaps.fetch(index=index)\n", 299 | "plotting.plot_stat_map(pmap, title=f'hOc5 right of {julich_pmaps.parcellation.name}')" 300 | ] 301 | }, 302 | { 303 | "cell_type": "markdown", 304 | "metadata": {}, 305 | "source": [ 306 | "If we request a specific region when fetching from the labelled map, `siibra` will construct a binary mask of the region. This is different in shape from the probabilistic maps, but of course sits at the same location." 
307 | ]
308 | },
309 | {
310 | "cell_type": "code",
311 | "execution_count": null,
312 | "metadata": {},
313 | "outputs": [],
314 | "source": [
315 | "mask = julich_mpm.fetch(region='hoc5 right')\n",
316 | "plotting.plot_roi(mask, title=f'hOc5 right of {julich_pmaps.parcellation.name}')"
317 | ]
318 | },
319 | {
320 | "cell_type": "markdown",
321 | "metadata": {},
322 | "source": [
323 | "## Extracting volumes of interest from high-resolution images\n",
324 | "\n",
325 | "Accessing image volumes is at the heart of `siibra`, and also works for high-resolution images such as the BigBrain model. \n",
326 | "\n",
327 | "BigBrain is a reference space in `siibra`, and the corresponding image is the template of that space.\n",
328 | "Getting a template from a space corresponds to getting the map of a parcellation - we call the `get_template` method of the space object.\n",
329 | "\n",
330 | "To get access to the image data of the template, we use `fetch` again on the template object.\n",
331 | "However, fetching BigBrain at full resolution is not a good idea - it is a 1 TByte dataset. \n",
332 | "`siibra` will therefore by default fetch a downscaled version!"
333 | ]
334 | },
335 | {
336 | "cell_type": "code",
337 | "execution_count": null,
338 | "metadata": {},
339 | "outputs": [],
340 | "source": [
341 | "bigbrain = siibra.spaces.get('bigbrain').get_template()\n",
342 | "bigbrain_img = bigbrain.fetch()\n",
343 | "plotting.plot_img(bigbrain_img, cmap='gray')"
344 | ]
345 | },
346 | {
347 | "cell_type": "markdown",
348 | "metadata": {},
349 | "source": [
350 | "If we request the full resolution, `siibra` will complain and fall back to a coarser but feasible resolution."
351 | ]
352 | },
353 | {
354 | "cell_type": "code",
355 | "execution_count": null,
356 | "metadata": {},
357 | "outputs": [],
358 | "source": [
359 | "bigbrain.fetch(resolution_mm=0.02)"
360 | ]
361 | },
362 | {
363 | "cell_type": "markdown",
364 | "metadata": {},
365 | "source": [
366 | "To work with full-resolution data, we typically fetch volumes of interest only.\n",
367 | "`siibra` represents these as bounding boxes (`siibra.locations.BoundingBox`).\n",
368 | "Bounding boxes are one type of location provided by `siibra`, and all locations are uniquely associated with a reference space.\n",
369 | "We construct a bounding box in BigBrain space by using the min and max points (-2.979, -61.256, 1.906) and (2.863, -57.356, -2.087):"
370 | ]
371 | },
372 | {
373 | "cell_type": "code",
374 | "execution_count": null,
375 | "metadata": {},
376 | "outputs": [],
377 | "source": [
378 | "voi = siibra.locations.BoundingBox(\n",
379 | " (-2.979, -61.256, 1.906),\n",
380 | " (2.863, -57.356, -2.087),\n",
381 | " space='bigbrain'\n",
382 | ")"
383 | ]
384 | },
385 | {
386 | "cell_type": "markdown",
387 | "metadata": {},
388 | "source": [
389 | "This bounding box can be used to fetch a full-resolution chunk from BigBrain.\n",
390 | "To look around in the chunk, nilearn's `view_img` is nice!"
391 | ]
392 | },
393 | {
394 | "cell_type": "code",
395 | "execution_count": null,
396 | "metadata": {},
397 | "outputs": [],
398 | "source": [
399 | "bigbrainchunk = bigbrain.fetch(resolution_mm=-1, voi=voi)\n",
400 | "plotting.view_img(bigbrainchunk, bg_img=None, cmap='gray')"
401 | ]
402 | },
403 | {
404 | "cell_type": "markdown",
405 | "metadata": {},
406 | "source": [
407 | "The resulting image chunk sits properly in its reference space, so we can also plot it on top of the low-resolution whole brain image that we already fetched above.
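Bounding boxes need not be spelled out corner by corner; the workshop notebook in this repository derives them from a point instead. A sketch — the `Point` constructor and the center coordinate here are assumptions:

```python
# Hedged sketch: derive a cubic volume of interest from a point, as in
# workshops/imb9345-2024.ipynb. The Point constructor and coordinates
# are assumptions; get_enclosing_cube() mirrors that notebook.
center = siibra.locations.Point((-0.06, -59.3, -0.09), space='bigbrain')
voi2 = center.get_enclosing_cube(width_mm=4)
chunk2 = bigbrain.fetch(voi=voi2, resolution_mm=0.08)
```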
408 | ]
409 | },
410 | {
411 | "cell_type": "code",
412 | "execution_count": null,
413 | "metadata": {},
414 | "outputs": [],
415 | "source": [
416 | "plotting.plot_roi(bigbrainchunk, bg_img=bigbrain_img)"
417 | ]
418 | },
419 | {
420 | "cell_type": "markdown",
421 | "metadata": {},
422 | "source": [
423 | "We can apply the same bounding box to extract chunks from other objects in the same space, like parcellation maps. \n",
424 | "Here we use the cortical layer maps in BigBrain space (in labelled map format), and fetch them at high resolution.\n",
425 | "For the superimposition, we can use `view_img` with reduced `opacity`."
426 | ]
427 | },
428 | {
429 | "cell_type": "code",
430 | "execution_count": null,
431 | "metadata": {},
432 | "outputs": [],
433 | "source": [
434 | "layermap = siibra.parcellations.get('cortical layers').get_map(space='bigbrain')\n",
435 | "mask = layermap.fetch(resolution_mm=0.16, voi=voi, fragment='left')\n",
436 | "plotting.view_img(mask, bg_img=bigbrainchunk, opacity=.1, symmetric_cmap=False)"
437 | ]
438 | },
439 | {
440 | "cell_type": "code",
441 | "execution_count": null,
442 | "metadata": {},
443 | "outputs": [],
444 | "source": []
445 | }
446 | ],
447 | "metadata": {
448 | "kernelspec": {
449 | "display_name": ".venv",
450 | "language": "python",
451 | "name": "python3"
452 | },
453 | "language_info": {
454 | "codemirror_mode": {
455 | "name": "ipython",
456 | "version": 3
457 | },
458 | "file_extension": ".py",
459 | "mimetype": "text/x-python",
460 | "name": "python",
461 | "nbconvert_exporter": "python",
462 | "pygments_lexer": "ipython3",
463 | "version": "3.12.7"
464 | }
465 | },
466 | "nbformat": 4,
467 | "nbformat_minor": 4
468 | }
469 |
--------------------------------------------------------------------------------
/workshops/HIBALL-winterschool-2023.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "999772f5",
6 | "metadata": {},
7 | "source": [
8 | "# Instantiation"
9 | ]
10 | },
11 | {
12 | "cell_type": "code",
13 | "execution_count": null,
14 | "id": "0e71a4c5",
15 | "metadata": {},
16 | "outputs": [],
17 | "source": [
18 | "import siibra\n",
19 | "from packaging.version import Version\n",
20 | "assert Version(siibra.__version__) >= Version('1.0.1-alpha.2')\n",
21 | "\n",
22 | "import matplotlib\n",
23 | "from nilearn import plotting\n",
24 | "\n",
25 | "# ignore the following lines at this point - we just touch some objects to trigger data loading \n",
26 | "# while we are still in the introduction\n"
27 | ]
28 | },
29 | {
30 | "cell_type": "markdown",
31 | "id": "b46e900c",
32 | "metadata": {},
33 | "source": [
34 | "For the purpose of this tutorial, we request some objects here already to retrieve data while your tutor is still giving the introductory remarks :-) Please just run the cell and ignore it for now.
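The warm-up in the next cell can take a while on a fresh machine; if you are curious what it costs, you could time it. A minimal sketch using only calls shown in this notebook:

```python
# Hedged sketch: time the cache warm-up once; repeated runs should be
# much faster, since siibra caches downloaded data locally.
import time
t0 = time.time()
siibra.warm_cache()
print(f"warm-up took {time.time() - t0:.1f} s")
```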
" 35 | ] 36 | }, 37 | { 38 | "cell_type": "code", 39 | "execution_count": null, 40 | "id": "3af4ed33", 41 | "metadata": {}, 42 | "outputs": [], 43 | "source": [ 44 | "julich_brain = siibra.parcellations.get('julich 3.1')\n", 45 | "julich_pmaps = julich_brain.get_map('mni152', 'statistical')\n", 46 | "julich_pmaps.fetch(region='4a left')\n", 47 | "siibra.warm_cache()" 48 | ] 49 | }, 50 | { 51 | "cell_type": "markdown", 52 | "id": "bbc1c43d", 53 | "metadata": {}, 54 | "source": [ 55 | "# Part I: Accessing reference spaces, parcellations and regions\n", 56 | "\n", 57 | "\n", 58 | "## Instance tables of key concepts\n", 59 | "\n", 60 | "`siibra` is structured around the key concepts atlas, reference space, parcellation and parcellation region. Each of these concepts has a specific type. `siibra` comes with preconfigured instances of these concepts, which can be easily accessed via *instance tables*. When you load siibra for the first time, it pulls this preconfiguration information from our online repository and caches it on your computer. Therefore, `siibra` will be a bit slower when you use it for the first time (or after a version upgrade).\n", 61 | "\n", 62 | "Here is an overview of the key concepts:\n", 63 | "\n", 64 | "| Concept | siibra type | instance table | description |\n", 65 | "| :-- | :-- | :-- | :-- |\n", 66 | "| Atlases | Atlas | `siibra.atlases` | A collection of related reference spaces and parcellations, typically per species |\n", 67 | "| Reference spaces | Space | `siibra.spaces` | 3D coordinate systems of the brain |\n", 68 | "| Parcellations | Parcellation | `siibra.parcellations` | Different brain parcellations schemes with their region hierarchies |\n", 69 | "| regions | Region | - | Structures defined within a parcellation, each representing a subtree of a parcellation's hierarchy| \n", 70 | "\n", 71 | "**Note:** These concepts are just semantic objects - they mostly give names and relationships to atlas concepts. We will deal with parcellation maps in the next notebook" 72 | ] 73 | }, 74 | { 75 | "cell_type": "code", 76 | "execution_count": 3, 77 | "id": "ecd71f02", 78 | "metadata": {}, 79 | "outputs": [], 80 | "source": [ 81 | "import siibra" 82 | ] 83 | }, 84 | { 85 | "cell_type": "markdown", 86 | "id": "b3affd86", 87 | "metadata": {}, 88 | "source": [ 89 | "## Select parcellations from their instance table\n", 90 | "\n", 91 | "The instance table for parcellations is `siibra.parcellations`. To get an overview, we can simply print its elements." 92 | ] 93 | }, 94 | { 95 | "cell_type": "code", 96 | "execution_count": null, 97 | "id": "38937f61", 98 | "metadata": {}, 99 | "outputs": [], 100 | "source": [ 101 | "siibra.parcellations" 102 | ] 103 | }, 104 | { 105 | "cell_type": "code", 106 | "execution_count": null, 107 | "id": "bc9a7167", 108 | "metadata": {}, 109 | "outputs": [], 110 | "source": [ 111 | "print(siibra.parcellations)" 112 | ] 113 | }, 114 | { 115 | "cell_type": "markdown", 116 | "id": "b1be34e2", 117 | "metadata": {}, 118 | "source": [ 119 | "To actually access an element from an instance table, there are several options:\n", 120 | "\n", 121 | " - by tab-completion on an interactive Python shell. 
This is very convenient, since it allows you to browse while typing.\n",
122 | " - by fuzzy keyword matching via the `get()` method\n",
123 | " - by using the index operator `[]` with keywords or numbers"
124 | ]
125 | },
126 | {
127 | "cell_type": "code",
128 | "execution_count": null,
129 | "id": "8bbee7cd",
130 | "metadata": {},
131 | "outputs": [],
132 | "source": [
133 | "siibra.parcellations['julich']"
134 | ]
135 | },
136 | {
137 | "cell_type": "code",
138 | "execution_count": null,
139 | "id": "895779b9",
140 | "metadata": {},
141 | "outputs": [],
142 | "source": [
143 | "siibra.parcellations.get('bundles')"
144 | ]
145 | },
146 | {
147 | "cell_type": "code",
148 | "execution_count": null,
149 | "id": "fa3fe22d",
150 | "metadata": {},
151 | "outputs": [],
152 | "source": [
153 | "# this is equivalent to the above\n",
154 | "siibra.parcellations['bundles']"
155 | ]
156 | },
157 | {
158 | "cell_type": "markdown",
159 | "id": "753a3cfd",
160 | "metadata": {},
161 | "source": [
162 | "## Your turn! Browse predefined reference spaces\n",
163 | "