├── .gitignore ├── LICENSE ├── README.md ├── _config.yml ├── docs ├── _config.yml ├── images │ ├── ca_network.png │ ├── ca_parcels.png │ ├── image.png │ ├── mmp.png │ ├── out1.png │ ├── out2.png │ ├── out3.png │ ├── out3a.png │ ├── out3b.png │ ├── out4.png │ ├── out5.png │ ├── out6.png │ ├── out7.png │ ├── out8.png │ ├── standard.png │ ├── yeo17.png │ └── yeo7.png ├── index.md └── parcellation_labels.md ├── hcp_utils ├── __init__.py ├── data │ ├── S1200.L.flat.32k_fs_LR.surf.gii │ ├── S1200.L.inflated_MSMAll.32k_fs_LR.surf.gii │ ├── S1200.L.midthickness_MSMAll.32k_fs_LR.surf.gii │ ├── S1200.L.pial_MSMAll.32k_fs_LR.surf.gii │ ├── S1200.L.sphere.32k_fs_LR.surf.gii │ ├── S1200.L.very_inflated_MSMAll.32k_fs_LR.surf.gii │ ├── S1200.L.white_MSMAll.32k_fs_LR.surf.gii │ ├── S1200.R.flat.32k_fs_LR.surf.gii │ ├── S1200.R.inflated_MSMAll.32k_fs_LR.surf.gii │ ├── S1200.R.midthickness_MSMAll.32k_fs_LR.surf.gii │ ├── S1200.R.pial_MSMAll.32k_fs_LR.surf.gii │ ├── S1200.R.sphere.32k_fs_LR.surf.gii │ ├── S1200.R.very_inflated_MSMAll.32k_fs_LR.surf.gii │ ├── S1200.R.white_MSMAll.32k_fs_LR.surf.gii │ ├── S1200.sulc_MSMAll.32k_fs_LR.dscalar.nii │ ├── ca_network_1.1.npz │ ├── ca_parcels_1.1.npz │ ├── cortical_adjacency.npz │ ├── fMRI_vertex_info_32k.npz │ ├── mmp_1.0.npz │ ├── standard.npz │ ├── yeo17.npz │ └── yeo7.npz └── hcp_utils.py ├── images └── image.png ├── prepare ├── README.md ├── prepare_adjacency.py ├── prepare_ca.py ├── prepare_mmp.py ├── prepare_standard.py └── prepare_yeo.py ├── setup.py └── source_data ├── CortexSubcortex_ColeAnticevic_NetPartition_wSubcorGSR_netassignments_LR.dlabel.nii ├── CortexSubcortex_ColeAnticevic_NetPartition_wSubcorGSR_parcels_LR.dlabel.nii ├── Q1-Q6_RelatedValidation210.CorticalAreas_dil_Final_Final_Areas_Group_Colors.32k_fs_LR.dlabel.nii ├── README.md ├── RSN-networks.32k_fs_LR.dlabel.nii └── RSN.dlabel.nii /.gitignore: -------------------------------------------------------------------------------- 1 | __pycache__ 2 | notebooks 3 | build 4 | dist 5 | hcp_utils.egg-info 6 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2019 by Romuald A. Janik 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 
22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # hcp-utils 2 | 3 | This package contains utilities to use [Human Connectome Project](https://www.humanconnectome.org/) (HCP) data and HCP-like data (e.g. obtained from legacy data using [ciftify](https://github.com/edickie/ciftify)) as well as corresponding parcellations with [nilearn](https://nilearn.github.io/) and other Python tools. 4 | 5 | The HCP data differs from conventional volumetric fMRI data which records the BOLD signal from each *voxel* in a 3D volume in that the signal from the cortical surface is treated as a folded two dimensional surface, and hence the data is associated with vertices of a predefined surface mesh, while the subcortical structures are described volumetrically using voxels. 6 | 7 | The CIFTI (more precisely CIFTI-2) file format encompasses both the cortical 2D surface data as well as the subcortical volume data. However, only the voxels associated with relevant subcortical structures are kept. 8 | Thus these data are quite richly structured. Although the standard Python tools for dealing with fMRI data like [nibabel](https://nipy.org/nibabel/) can read both the CIFTI-2 files containing the fMRI signals and the GIFTI files containing the surface mesh definitions, there is not much that one could do further out-of-the-box, in particular visualization using [nilearn](https://nilearn.github.io/) or processing parcellated data using e.g. machine learning tools which work exclusively with `numpy` arrays. The goal of this package is to ease the interoperability of HCP data and these standard Python tools. 9 | 10 | 11 | ![brain image](images/image.png) 12 | 13 | 14 | The utilities mainly deal with plotting surface data, accessing the predefined subcortical structures as well as using various parcellations and identifying connected components on the cortical surface. Various helper functions aid e.g. in mapping the HCP fMRI cortical data to surface vertices for visualization etc. The functions work directly with numpy arrays of shape `Tx91282` or `91282` for fMRI data, with `T` being the number of time frames, while `91282` is the standard HCP dimensionality for the 3T cortical surface and subcortical data. 15 | 16 | ## Documentation 17 | 18 | Find the documentation at [rmldj.github.io/hcp-utils](https://rmldj.github.io/hcp-utils/) 19 | 20 | ## Installation 21 | 22 | Make sure that you have the following packages installed (e.g. using [miniconda](https://docs.conda.io/en/latest/miniconda.html)) 23 | ``` 24 | nibabel, nilearn, numpy, scikit-learn, matplotlib, pandas, scipy 25 | ``` 26 | Then install with 27 | ``` 28 | pip install hcp_utils 29 | ``` 30 | upgrade with 31 | ``` 32 | pip install --upgrade hcp_utils 33 | ``` 34 | 35 | 36 | ## External data 37 | 38 | ### Surface meshes 39 | 40 | The default surface meshes for 3D visualization come from the group average of the Human Connectome Project (HCP) 1200 Subjects (S1200) data release (March 2017) processed using HCP pipelines. 
They can be obtained on BALSA: https://balsa.wustl.edu/reference/show/pkXDZ 41 | 42 | 43 | These group average files are redistributed under the [HCP Open Access Data Use Terms](https://www.humanconnectome.org/study/hcp-young-adult/document/wu-minn-hcp-consortium-open-access-data-use-terms) with the acknowledgment: 44 | 45 | *"Data were provided [in part] by the Human Connectome Project, WU-Minn Consortium (Principal Investigators: David Van Essen and Kamil Ugurbil; 1U54MH091657) funded by the 16 NIH Institutes and Centers that support the NIH Blueprint for Neuroscience Research; and by the McDonnell Center for Systems Neuroscience at Washington University."* 46 | 47 | 48 | ### Parcellations 49 | 50 | When using the included parcellations, please cite the relevant papers, which include full details. 51 | 52 | **The Glasser MMP1.0 Parcellation:** Glasser, Matthew F., Timothy S. Coalson, Emma C. Robinson, Carl D. Hacker, John Harwell, Essa Yacoub, Kamil Ugurbil, et al. 2016. “A Multi-Modal Parcellation of Human Cerebral Cortex.” Nature 536 (7615): 171–78. [http://doi.org/10.1038/nature18933](http://doi.org/10.1038/nature18933) (see in particular the details in *Supplementary Neuroanatomical Results*). 53 | 54 | **Yeo 7 or (17) Network Parcellation:** Yeo, B. T. Thomas, Fenna M. Krienen, Jorge Sepulcre, Mert R. Sabuncu, Danial Lashkari, Marisa Hollinshead, Joshua L. Roffman, et al. 2011. “The Organization of the Human Cerebral Cortex Estimated by Intrinsic Functional Connectivity.” Journal of Neurophysiology 106 (3): 1125–65. [https://doi.org/10.1152/jn.00338.2011](https://doi.org/10.1152/jn.00338.2011). 55 | 56 | **The Cole-Anticevic Brain-wide Network Partition:** Ji JL, Spronk M, Kulkarni K, Repovs G, Anticevic A, Cole MW (2019). "Mapping the human brain's cortical-subcortical functional network organization". NeuroImage. 185:35–57. 
[https://doi.org/10.1016/j.neuroimage.2018.10.006](https://doi.org/10.1016/j.neuroimage.2018.10.006) (also available as an open access bioRxiv preprint: [http://doi.org/10.1101/206292](http://doi.org/10.1101/206292)) and [https://github.com/ColeLab/ColeAnticevicNetPartition/](https://github.com/ColeLab/ColeAnticevicNetPartition/) 57 | 58 | 59 | * * * 60 | 61 | *This package was initiated as a tool within the project ["Bio-inspired artificial neural networks"](http://bionn.matinf.uj.edu.pl/) funded by the Foundation for Polish Science (FNP).* 62 | 63 | 64 | -------------------------------------------------------------------------------- /_config.yml: -------------------------------------------------------------------------------- 1 | theme: jekyll-theme-slate -------------------------------------------------------------------------------- /docs/_config.yml: -------------------------------------------------------------------------------- 1 | theme: jekyll-theme-slate -------------------------------------------------------------------------------- /docs/images/ca_network.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/docs/images/ca_network.png -------------------------------------------------------------------------------- /docs/images/ca_parcels.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/docs/images/ca_parcels.png -------------------------------------------------------------------------------- /docs/images/image.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/docs/images/image.png -------------------------------------------------------------------------------- /docs/images/mmp.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/docs/images/mmp.png -------------------------------------------------------------------------------- /docs/images/out1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/docs/images/out1.png -------------------------------------------------------------------------------- /docs/images/out2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/docs/images/out2.png -------------------------------------------------------------------------------- /docs/images/out3.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/docs/images/out3.png -------------------------------------------------------------------------------- /docs/images/out3a.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/docs/images/out3a.png -------------------------------------------------------------------------------- /docs/images/out3b.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/docs/images/out3b.png -------------------------------------------------------------------------------- /docs/images/out4.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/docs/images/out4.png -------------------------------------------------------------------------------- /docs/images/out5.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/docs/images/out5.png -------------------------------------------------------------------------------- /docs/images/out6.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/docs/images/out6.png -------------------------------------------------------------------------------- /docs/images/out7.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/docs/images/out7.png -------------------------------------------------------------------------------- /docs/images/out8.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/docs/images/out8.png -------------------------------------------------------------------------------- /docs/images/standard.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/docs/images/standard.png -------------------------------------------------------------------------------- /docs/images/yeo17.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/docs/images/yeo17.png -------------------------------------------------------------------------------- /docs/images/yeo7.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/docs/images/yeo7.png -------------------------------------------------------------------------------- /docs/index.md: -------------------------------------------------------------------------------- 1 | # Introduction 2 | 3 | This package contains utilities to use [Human Connectome Project](https://www.humanconnectome.org/) (HCP) data and HCP-like data (e.g. obtained from legacy data using [ciftify](https://github.com/edickie/ciftify)) as well as corresponding parcellations with [nilearn](https://nilearn.github.io/) and other Python tools. 4 | 5 | The HCP data differs from conventional volumetric fMRI data which records the BOLD signal from each *voxel* in a 3D volume in that the signal from the cortical surface is treated as a folded two dimensional surface, and hence the data is associated with vertices of a predefined surface mesh, while the subcortical structures are described volumetrically using voxels. 
6 | 7 | The CIFTI (more precisely CIFTI-2) file format encompasses both the cortical 2D surface data as well as the subcortical volume data. However, only the voxels associated with relevant subcortical structures are kept. 8 | Thus these data are quite richly structured. Although the standard Python tools for dealing with fMRI data like [nibabel](https://nipy.org/nibabel/) can read both the CIFTI-2 files containing the fMRI signals and the GIFTI files containing the surface mesh definitions, there is not much that one could do further out-of-the-box, in particular visualization using [nilearn](https://nilearn.github.io/) or processing parcellated data using e.g. machine learning tools which work exclusively with `numpy` arrays. The goal of this package is to ease the interoperability of HCP data and these standard Python tools. 9 | 10 | 11 | ![brain image](images/image.png) 12 | 13 | 14 | The utilities mainly deal with plotting surface data, accessing the predefined subcortical structures as well as using various parcellations and identifying connected components on the cortical surface. Various helper functions aid e.g. in mapping the HCP fMRI cortical data to surface vertices for visualization etc. The functions work directly with numpy arrays of shape `Tx91282` or `91282` for fMRI data, with `T` being the number of time frames, while `91282` is the standard HCP dimensionality for the 3T cortical surface and subcortical data. 15 | 16 | 17 | # Installation 18 | 19 | Make sure that you have the following packages installed (e.g. using [miniconda](https://docs.conda.io/en/latest/miniconda.html)) 20 | ``` 21 | nibabel, nilearn, numpy, scikit-learn, matplotlib, pandas, scipy 22 | ``` 23 | Then install with 24 | ``` 25 | pip install hcp_utils 26 | ``` 27 | upgrade with 28 | ``` 29 | pip install --upgrade hcp_utils 30 | ``` 31 | 32 | 33 | # Usage 34 | 35 | Here we assume that the commands will be run in a Jupyter notebook as this allows for rotating and scaling the 3D plots with your mouse. 36 | 37 | ## Getting started 38 | 39 | First import the prerequisites. 40 | ``` 41 | import nibabel as nib 42 | import nilearn.plotting as plotting 43 | import numpy as np 44 | import matplotlib.pyplot as plt 45 | %matplotlib inline 46 | ``` 47 | Then import `hcp_utils`: 48 | ``` 49 | import hcp_utils as hcp 50 | ``` 51 | 52 | We use `nibabel` to load a CIFTI file with the fMRI time series. We also extract the fMRI time series to a `numpy` array. 53 | ``` 54 | img = nib.load('path/to/fMRI_data_file.dtseries.nii') 55 | X = img.get_fdata() 56 | X.shape # e.g. (700, 91282) 57 | ``` 58 | which corresponds to 700 time-steps and 91282 grayordinates. 59 | 60 | The CIFTI file format for HCP 3T data partitions the 91282 grayordinates into the left- and right- cortex as well as 19 subcortical regions.
We can easily extract, say, the signal in the left hippocampus using `hcp.struct` 61 | 62 | ``` 63 | X_hipL = X[:, hcp.struct.hippocampus_left] 64 | X_hipL.shape # (700, 764) 65 | ``` 66 | 67 | The other available regions are 68 | ``` 69 | hcp.struct.keys() 70 | ``` 71 | 72 | > dict_keys(['cortex_left', 'cortex_right', 'cortex', 'subcortical', 'accumbens_left', 'accumbens_right', 'amygdala_left', 'amygdala_right', 'brainStem', 'caudate_left', 'caudate_right', 'cerebellum_left', 'cerebellum_right', 'diencephalon_left', 'diencephalon_right', 'hippocampus_left', 'hippocampus_right', 'pallidum_left', 'pallidum_right', 'putamen_left', 'putamen_right', 'thalamus_left', 'thalamus_right']) 73 | 74 | For convenience we define a function which normalizes the data so that each grayordinate has zero (temporal) mean and unit standard deviation: 75 | ``` 76 | Xn = hcp.normalize(X) 77 | ``` 78 | 79 | ## Plotting 80 | 81 | In order to plot the cortical surface data for the whole brain, one has to have the surface meshes appropriate for HCP data and combine the ones corresponding to the left and right hemispheres into a single mesh. 82 | 83 | In addition, the HCP fMRI data are defined on a *subset* of the surface vertices (29696 out of 32492 for the left cortex and 29716 out of 32492 for the right cortex). Hence we have to construct an auxiliary array of size 32492 or 64984 with the fMRI data points inserted in appropriate places and a constant (zero by default) elsewhere. This is achieved by the `cortex_data(arr, fill=0)`, `left_cortex_data(arr, fill=0)` and `right_cortex_data(arr, fill=0)` functions. 84 | 85 | `hcp_utils` comes with preloaded surface meshes from the HCP S1200 group average data as well as the whole brain meshes composed of both the left and right meshes. In addition, sulcal depth data is included for shading. These data are packaged in the following way: 86 | 87 | ``` 88 | hcp.mesh.keys() 89 | ``` 90 | 91 | > dict_keys(['white_left', 'white_right', 'white', 'midthickness_left', 'midthickness_right', 'midthickness', 'pial_left', 'pial_right', 'pial', 'inflated_left', 'inflated_right', 'inflated', 'very_inflated_left', 'very_inflated_right', 'very_inflated', 'flat_left', 'flat_right', 'flat', 'sphere_left', 'sphere_right', 'sphere', 'sulc', 'sulc_left', 'sulc_right']) 92 | 93 | Here `white` is the top of white matter, `pial` is the surface of the brain, `midthickness` is halfway between them, while `inflated` and `very_inflated` are mostly useful for visualization. `flat` is a 2D flat representation. 94 | 95 | In order to make an interactive 3D surface plot using `nilearn` of the normalized fMRI data (thresholded at 1.5) at t=29 on the inflated group average mesh, we write 96 | 97 | ``` 98 | plotting.view_surf(hcp.mesh.inflated, hcp.cortex_data(Xn[29]), 99 | threshold=1.5, bg_map=hcp.mesh.sulc) 100 | ``` 101 | 102 | ![brain image](images/out1.png) 103 | 104 | The group average surfaces are much smoother than the ones for individual subjects. If we have the latter ones at our disposal, we can also load them and use them for visualization. 105 | 106 | ``` 107 | mesh_sub = hcp.load_surfaces(example_filename='path/to/fsaverage_LR32k/ 108 | sub-XX.R.pial.32k_fs_LR.surf.gii') 109 | ``` 110 | 111 | Here we give just one example filename as the argument, and `hcp_utils` will try to load all the other surface versions for both hemispheres, as well as the sulcal depth file, assuming HCP-like naming conventions (the `.R.pial` part here).
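If some of the surface files are missing, `load_surfaces` only prints a message and skips them, so it can be worth checking what was actually loaded before plotting. A minimal check (continuing the session above, with the hypothetical subject path from the previous cell) is:

```
sorted(mesh_sub.keys())    # the meshes that were found, plus 'sulc*' entries if the sulcal depth file was located
'inflated' in mesh_sub and 'sulc' in mesh_sub
```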
112 | 113 | Let's look at the same data as previously, but now on the inflated single subject surface: 114 | 115 | ``` 116 | plotting.view_surf(mesh_sub.inflated, hcp.cortex_data(Xn[29]), 117 | threshold=1.5, bg_map=mesh_sub.sulc) 118 | ``` 119 | 120 | ![brain image](images/out2.png) 121 | 122 | ## Parcellations 123 | 124 | `hcp_utils` comes with several parcellations preloaded. In particular we have the following ones (where we also indicate the name of the variable holding the parcellation data) 125 | 126 | * the *Glasser et al.* Multi-Modal Parcellation (MMP 1.0) which partitions each hemisphere of the cortex into 180 regions - `hcp.mmp` 127 | * the Cole-Anticevic Brain-wide Network Partition (version 1.1) which 128 | * extends the Multi-Modal Parcellation of the cortex by a parcellation of the subcortical regions - `hcp.ca_parcels` 129 | * groups the cortical (MMP) and subcortical regions into functional networks - `hcp.ca_network` 130 | * the *Yeo et al.* 7- and 17-region (cortical) functional networks - `hcp.yeo7` and `hcp.yeo17` 131 | * the standard CIFTI partition into the main subcortical regions - `hcp.standard` 132 | 133 | For references and details see below or the [GitHub page](https://github.com/rmldj/hcp-utils). These parcellations were extracted from the relevant `.dlabel.nii` files by the scripts in the `prepare/` folder of the package repository. Please cite the relevant papers if you make use of these parcellations. 134 | 135 | All the labels and the corresponding numerical ids for these parcellations are displayed on the [parcellation labels](./parcellation_labels.html) page. 136 | 137 | The data for a parcellation contained e.g. in `hcp.mmp` has the following fields: 138 | * `parcellation.ids` - numerical ids of the parcels. 0 means unassigned. 139 | * `parcellation.nontrivial_ids` - same but with the unassigned one omitted. 140 | * `parcellation.labels` - a dictionary which maps the numerical id to the name of the region 141 | * `parcellation.map_all` - an integer array of size 91282, giving the id of each grayordinate 142 | * `parcellation.rgba` - a dictionary which maps the numerical id to the rgba color (extracted from the source files) 143 | 144 | One can view a cortical parcellation on the 3D surface plot using the following function 145 | 146 | ``` 147 | hcp.view_parcellation(mesh_sub.inflated, hcp.yeo7) 148 | ``` 149 | 150 | ![yeo-7 parcellation image](images/out3.png) 151 | 152 | The color coding of the labels together with the numerical ids can be displayed through 153 | 154 | ``` 155 | hcp.parcellation_labels(hcp.yeo7) 156 | ``` 157 | 158 | ![yeo-7 parcellation labels](images/out4.png) 159 | 160 | 161 | Sometimes it may be convenient to differentiate between the networks in the left and right hemispheres. For that one can use the function `make_lr_parcellation(parcellation)` which takes a parcellation and separates the parcels in the left and right hemisphere. The subcortical part is set to zero (unassigned). 162 | 163 | ``` 164 | yeo7lr = hcp.make_lr_parcellation(hcp.yeo7) 165 | hcp.view_parcellation(mesh_sub.inflated, yeo7lr) 166 | ``` 167 | 168 | ![yeo-7 lr parcellation image](images/out3a.png) 169 | 170 | ``` 171 | hcp.parcellation_labels(yeo7lr) 172 | ``` 173 | 174 | ![yeo-7 lr parcellation labels](images/out3b.png) 175 | 176 | 177 | The main use of a parcellation is to obtain the parcellated time series (by taking the mean over each parcel) for further analysis.
This can be done through 178 | 179 | ``` 180 | Xp = hcp.parcellate(Xn, hcp.yeo7) 181 | Xp.shape # (700, 7) 182 | ``` 183 | 184 | One could e.g. take the maximum by writing instead 185 | ``` 186 | hcp.parcellate(Xn, hcp.yeo7, method=np.amax) 187 | ``` 188 | For visualization it might be interesting to plot the parcellated value on each location of the brain. For that we use the `unparcellate(Xp, parcellation)` function: 189 | 190 | ``` 191 | plotting.view_surf(mesh_sub.inflated, 192 | hcp.cortex_data(hcp.unparcellate(Xp[29], hcp.yeo7)), 193 | threshold=0.1, bg_map=mesh_sub.sulc) 194 | ``` 195 | 196 | ![yeo-7 parcellated data](images/out5.png) 197 | 198 | If we wanted to focus just on the activity in the somatomotor network of Yeo-7, we can mask out to 0 the remaining parts of the brain using `mask(X, mask, fill=0)`. This is of course only useful for visualization: 199 | 200 | ``` 201 | plotting.view_surf(mesh_sub.inflated, 202 | hcp.cortex_data(hcp.mask(Xn[29], hcp.yeo7.map_all==2)), 203 | threshold=0.1, bg_map=mesh_sub.sulc) 204 | ``` 205 | 206 | ![somatomotor masked data](images/out6.png) 207 | 208 | Finally, once we have parcellated 1D data (e.g. a snapshot in time or the results of some analysis), we can order the regions according to the values using the function `ranking(Xp, parcellation, descending=True)`: 209 | 210 | ``` 211 | df = hcp.ranking(Xp[29], hcp.yeo7) 212 | df 213 | ``` 214 | 215 | ![region ranking](images/out7.png) 216 | 217 | The above function returns a `Pandas` data frame which of course can be used for further analysis. 218 | 219 | ## Connected components 220 | 221 | Once some computation on the cortex data has been done and some boolean condition determined, it may be useful to decompose the region where the condition is satisfied into connected components. 222 | 223 | `cortical_adjacency` is the 59412x59412 (sparse) adjacency matrix of the grayordinates on both hemispheres of the cortex. 224 | 225 | The decomposition of the region where the boolean condition is satisfied into connected components is done by the function `cortical_components(condition, cutoff=0)` which returns the number of components, their sizes in descending order and an integer array with the labels of each grayordinate (0 means unassigned, labels ordered according to decreasing size of the connected components). 226 | If the cutoff parameter is specified, components smaller than the cutoff are neglected and the corresponding labels set to zero. 227 | 228 | E.g. if we would insert a condition which is always true 229 | ``` 230 | n_components, sizes, rois = hcp.cortical_components(Xn[29]>-1000.0) 231 | n_components, sizes 232 | ``` 233 | > (2, array([29716, 29696])) 234 | 235 | we would get just the two cortical hemispheres as the connected components. 
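The two sizes are simply the numbers of grayordinates in the right and left cortex, which can be checked against the slices defined in `hcp.struct`:

```
hcp.struct.cortex_right    # slice(29696, 59412) - 29716 grayordinates
hcp.struct.cortex_left     # slice(0, 29696)     - 29696 grayordinates
```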
236 | 237 | A more realistic example would be 238 | ``` 239 | n_components, sizes, rois = hcp.cortical_components(Xn[29]>1.0, cutoff=36) 240 | n_components, sizes 241 | ``` 242 | > (42, array([1227, 415, 296, 276, 201, 168, 165, 143, 141, 115, 103, 243 | 101, 79, 77, 74, 69, 63, 62, 60, 52, 51, 51, 244 | 49, 48, 48, 45, 44, 43, 43, 43, 42, 40, 40, 245 | 39, 38, 38, 38, 38, 37, 37, 37, 36])) 246 | 247 | Then the largest connected component of size 1227 could be displayed by 248 | ``` 249 | plotting.view_surf(hcp.mesh.inflated, 250 | hcp.cortex_data(hcp.mask(Xn[29], rois==1)), 251 | threshold=1.0, bg_map=hcp.mesh.sulc) 252 | ``` 253 | 254 | ![connected component](images/out8.png) 255 | 256 | 257 | ## External data and references 258 | 259 | ### Surface meshes 260 | 261 | The default surface meshes for 3D visualization come from the group average of the Human Connectome Project (HCP) 1200 Subjects (S1200) data release (March 2017) processed using HCP pipelines. They can be obtained on BALSA: [https://balsa.wustl.edu/reference/show/pkXDZ](https://balsa.wustl.edu/reference/show/pkXDZ) 262 | 263 | 264 | These group average files are redistributed under the [HCP Open Access Data Use Terms](https://www.humanconnectome.org/study/hcp-young-adult/document/wu-minn-hcp-consortium-open-access-data-use-terms) with the acknowledgment: 265 | 266 | *"Data were provided [in part] by the Human Connectome Project, WU-Minn Consortium (Principal Investigators: David Van Essen and Kamil Ugurbil; 1U54MH091657) funded by the 16 NIH Institutes and Centers that support the NIH Blueprint for Neuroscience Research; and by the McDonnell Center for Systems Neuroscience at Washington University."* 267 | 268 | 269 | ### Parcellations 270 | 271 | When using the included parcellations, please cite the relevant papers, which include full details. 272 | 273 | **The Glasser MMP1.0 Parcellation:** Glasser, Matthew F., Timothy S. Coalson, Emma C. Robinson, Carl D. Hacker, John Harwell, Essa Yacoub, Kamil Ugurbil, et al. 2016. “A Multi-Modal Parcellation of Human Cerebral Cortex.” Nature 536 (7615): 171–78. [http://doi.org/10.1038/nature18933](http://doi.org/10.1038/nature18933) (see in particular the details in *Supplementary Neuroanatomical Results*). 274 | 275 | **Yeo 7 or (17) Network Parcellation:** Yeo, B. T. Thomas, Fenna M. Krienen, Jorge Sepulcre, Mert R. Sabuncu, Danial Lashkari, Marisa Hollinshead, Joshua L. Roffman, et al. 2011. “The Organization of the Human Cerebral Cortex Estimated by Intrinsic Functional Connectivity.” Journal of Neurophysiology 106 (3): 1125–65. [https://doi.org/10.1152/jn.00338.2011](https://doi.org/10.1152/jn.00338.2011). 276 | 277 | **The Cole-Anticevic Brain-wide Network Partition:** Ji JL, Spronk M, Kulkarni K, Repovs G, Anticevic A, Cole MW (2019). "Mapping the human brain's cortical-subcortical functional network organization". NeuroImage. 185:35–57. 
[https://doi.org/10.1016/j.neuroimage.2018.10.006](https://doi.org/10.1016/j.neuroimage.2018.10.006) (also available as an open access bioRxiv preprint: [http://doi.org/10.1101/206292](http://doi.org/10.1101/206292)) and [https://github.com/ColeLab/ColeAnticevicNetPartition/](https://github.com/ColeLab/ColeAnticevicNetPartition/) 278 | 279 | 280 | 281 | * * * 282 | 283 | *This package was initiated as a tool within the project ["Bio-inspired artificial neural networks"](http://bionn.matinf.uj.edu.pl/) funded by the Foundation for Polish Science (FNP).* 284 | 285 | 286 | 287 | 288 | 289 | 290 | 291 | 292 | 293 | 294 | 295 | 296 | 297 | -------------------------------------------------------------------------------- /docs/parcellation_labels.md: -------------------------------------------------------------------------------- 1 | # Parcellation labels 2 | 3 | Here we summarize the labels and the corresponding numerical ids of regions of the parcellations included in `hcp_utils`. For details see the references in the [package repository README](https://github.com/rmldj/hcp-utils). 4 | We start with the smaller parcellations ending with the largest ones. 5 | 6 | [Return to the main documentation](index.html) 7 | 8 | 9 | ### Standard CIFTI subcortical structures 10 | 11 | ``` 12 | hcp.parcellation_labels(hcp.standard) 13 | ``` 14 | 15 | ![standard parcellation labels](images/standard.png) 16 | 17 | 18 | ### Yeo 7-region networks 19 | 20 | ``` 21 | hcp.parcellation_labels(hcp.yeo7) 22 | ``` 23 | 24 | ![yeo-7 parcellation labels](images/yeo7.png) 25 | 26 | 27 | ### Yeo 17-region networks 28 | 29 | ``` 30 | hcp.parcellation_labels(hcp.yeo17) 31 | ``` 32 | 33 | ![yeo-17 parcellation labels](images/yeo17.png) 34 | 35 | ### Cole-Anticevic brain-wide functional networks 36 | 37 | These functional networks differ from the previous ones in the following ways: 38 | 39 | * they always contain whole cortical parcels from the MMP parcellation 40 | * they extend also to the subcortical structures 41 | 42 | ``` 43 | hcp.parcellation_labels(hcp.ca_network) 44 | ``` 45 | 46 | ![ca_network parcellation labels](images/ca_network.png) 47 | 48 | 49 | ### Multi Modal Parcellation MMP 50 | 51 | To the 360 cortical regions we added the standard subcortical structures (ids 361-379). 52 | We exchanged the internal numerical ids between the left and right hemispheres for consistency with the Cole-Anticevic Brain-wide Network Partition. 
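With this ordering, ids 1-180 label the left-hemisphere areas, ids 181-360 the right-hemisphere areas and ids 361-379 the added subcortical structures, so individual entries can be looked up directly (a small sketch; the exact label strings come from the source `.dlabel.nii` file):

```
hcp.mmp.labels[1]      # a left-hemisphere cortical area
hcp.mmp.labels[181]    # a right-hemisphere cortical area
hcp.mmp.labels[361]    # one of the added subcortical structures
```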
53 | 54 | ``` 55 | hcp.parcellation_labels(hcp.mmp) 56 | ``` 57 | 58 | ![mmp parcellation labels](images/mmp.png) 59 | 60 | ### Cole-Anticevic extension of MMP to a parcellation of the subcortical regions 61 | 62 | ``` 63 | hcp.parcellation_labels(hcp.ca_parcels) 64 | ``` 65 | 66 | ![ca_parcels parcellation labels](images/ca_parcels.png) 67 | 68 | 69 | [Return to the main documentation](index.html) 70 | 71 | -------------------------------------------------------------------------------- /hcp_utils/__init__.py: -------------------------------------------------------------------------------- 1 | from .hcp_utils import struct, vertex_info, mesh 2 | from .hcp_utils import standard, mmp, ca_network, ca_parcels, yeo7, yeo17 3 | from .hcp_utils import view_parcellation, parcellation_labels, make_lr_parcellation 4 | from .hcp_utils import parcellate, unparcellate, mask, ranking, normalize 5 | from .hcp_utils import left_cortex_data, right_cortex_data, cortex_data, combine_meshes, load_surfaces 6 | from .hcp_utils import get_HCP_vertex_info 7 | from .hcp_utils import cortical_adjacency, cortical_components 8 | 9 | __version__ = '0.1.0' 10 | -------------------------------------------------------------------------------- /hcp_utils/data/S1200.sulc_MSMAll.32k_fs_LR.dscalar.nii: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/hcp_utils/data/S1200.sulc_MSMAll.32k_fs_LR.dscalar.nii -------------------------------------------------------------------------------- /hcp_utils/data/ca_network_1.1.npz: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/hcp_utils/data/ca_network_1.1.npz -------------------------------------------------------------------------------- /hcp_utils/data/ca_parcels_1.1.npz: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/hcp_utils/data/ca_parcels_1.1.npz -------------------------------------------------------------------------------- /hcp_utils/data/cortical_adjacency.npz: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/hcp_utils/data/cortical_adjacency.npz -------------------------------------------------------------------------------- /hcp_utils/data/fMRI_vertex_info_32k.npz: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/hcp_utils/data/fMRI_vertex_info_32k.npz -------------------------------------------------------------------------------- /hcp_utils/data/mmp_1.0.npz: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/hcp_utils/data/mmp_1.0.npz -------------------------------------------------------------------------------- /hcp_utils/data/standard.npz: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/hcp_utils/data/standard.npz -------------------------------------------------------------------------------- 
/hcp_utils/data/yeo17.npz: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/hcp_utils/data/yeo17.npz -------------------------------------------------------------------------------- /hcp_utils/data/yeo7.npz: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/hcp_utils/data/yeo7.npz -------------------------------------------------------------------------------- /hcp_utils/hcp_utils.py: -------------------------------------------------------------------------------- 1 | import nibabel as nib 2 | from nilearn import surface 3 | from nilearn import plotting 4 | from sklearn.utils import Bunch 5 | import matplotlib 6 | import matplotlib.pyplot as plt 7 | import matplotlib.patches as mpatches 8 | import numpy as np 9 | import pandas as pd 10 | from scipy.sparse import load_npz 11 | from scipy.sparse.csgraph import connected_components 12 | import os 13 | import re 14 | from pathlib import Path 15 | 16 | # define standard structures (for 3T HCP-like data) 17 | 18 | struct = Bunch() 19 | 20 | struct.cortex_left = slice(0,29696) 21 | struct.cortex_right = slice(29696,59412) 22 | struct.cortex = slice(0,59412) 23 | struct.subcortical = slice(59412,None) 24 | 25 | struct.accumbens_left = slice(59412,59547) 26 | struct.accumbens_right = slice(59547,59687) 27 | struct.amygdala_left = slice(59687,60002) 28 | struct.amygdala_right = slice(60002,60334) 29 | struct.brainStem = slice(60334,63806) 30 | struct.caudate_left = slice(63806,64534) 31 | struct.caudate_right = slice(64534,65289) 32 | struct.cerebellum_left = slice(65289,73998) 33 | struct.cerebellum_right = slice(73998,83142) 34 | struct.diencephalon_left = slice(83142,83848) 35 | struct.diencephalon_right = slice(83848,84560) 36 | struct.hippocampus_left = slice(84560,85324) 37 | struct.hippocampus_right = slice(85324,86119) 38 | struct.pallidum_left = slice(86119,86416) 39 | struct.pallidum_right = slice(86416,86676) 40 | struct.putamen_left = slice(86676,87736) 41 | struct.putamen_right = slice(87736,88746) 42 | struct.thalamus_left = slice(88746,90034) 43 | struct.thalamus_right = slice(90034,None) 44 | 45 | # The fMRI data are not defined on all 32492 vertices of the 32k surface meshes 46 | # Hence we need to record what is the mapping between the cortex grayordinates from fMRI 47 | # and the vertices of the 32k surface meshes. 48 | # This information is kept in vertex_info 49 | # 50 | # for a standard 3T HCP style fMRI image get_HCP_vertex_info(img) should coincide with vertex_info 51 | 52 | 53 | def _make_vertex_info(grayl, grayr, num_meshl, num_meshr): 54 | vertex_info = Bunch() 55 | vertex_info.grayl = grayl 56 | vertex_info.grayr = grayr 57 | vertex_info.num_meshl = num_meshl 58 | vertex_info.num_meshr = num_meshr 59 | return vertex_info 60 | 61 | PKGDATA = Path(__file__).parent / 'data' 62 | 63 | vertex_data = np.load(PKGDATA / 'fMRI_vertex_info_32k.npz') 64 | vertex_info = _make_vertex_info(vertex_data['grayl'], vertex_data['grayr'], int(vertex_data['num_meshl']), int(vertex_data['num_meshr'])) 65 | 66 | def get_HCP_vertex_info(img): 67 | """ 68 | Extracts information about the relation of indices in the fMRI data to the surface meshes and the left/right cortex. 69 | Use only for meshes different from the 32k standard one which is loaded by default. 
70 | """ 71 | assert isinstance(img, nib.cifti2.cifti2.Cifti2Image) 72 | 73 | map1 = img.header.get_index_map(1) 74 | bms = list(map1.brain_models) 75 | 76 | grayl = np.array(bms[0].vertex_indices) 77 | grayr = np.array(bms[1].vertex_indices) 78 | num_meshl = bms[0].surface_number_of_vertices 79 | num_meshr = bms[1].surface_number_of_vertices 80 | return _make_vertex_info(grayl, grayr, num_meshl, num_meshr) 81 | 82 | 83 | # The following three functions take a 1D array of fMRI grayordinates 84 | # and return the array on the left-, right- or both surface meshes 85 | 86 | def left_cortex_data(arr, fill=0, vertex_info=vertex_info): 87 | """ 88 | Takes a 1D array of fMRI grayordinates and returns the values on the vertices of the left cortex mesh which is necessary for surface visualization. 89 | The unused vertices are filled with a constant (zero by default). 90 | """ 91 | out = np.zeros(vertex_info.num_meshl) 92 | out[:] = fill 93 | out[vertex_info.grayl] = arr[:len(vertex_info.grayl)] 94 | return out 95 | 96 | def right_cortex_data(arr, fill=0, vertex_info=vertex_info): 97 | """ 98 | Takes a 1D array of fMRI grayordinates and returns the values on the vertices of the right cortex mesh which is necessary for surface visualization. 99 | The unused vertices are filled with a constant (zero by default). 100 | """ 101 | out = np.zeros(vertex_info.num_meshr) 102 | out[:] = fill 103 | if len(arr) == len(vertex_info.grayr): 104 | # means arr is already just the right cortex 105 | out[vertex_info.grayr] = arr 106 | else: 107 | out[vertex_info.grayr] = arr[len(vertex_info.grayl):len(vertex_info.grayl) + len(vertex_info.grayr)] 108 | return out 109 | 110 | def cortex_data(arr, fill=0, vertex_info=vertex_info): 111 | """ 112 | Takes a 1D array of fMRI grayordinates and returns the values on the vertices of the full cortex mesh which is necessary for surface visualization. 113 | The unused vertices are filled with a constant (zero by default). 114 | """ 115 | dataL = left_cortex_data(arr, fill=fill, vertex_info=vertex_info) 116 | dataR = right_cortex_data(arr, fill=fill, vertex_info=vertex_info) 117 | return np.hstack((dataL, dataR)) 118 | 119 | # utility function for making a mesh for both hemispheres 120 | # used internally by load_surfaces 121 | 122 | def combine_meshes(meshL, meshR): 123 | """ 124 | Combines left and right meshes into a single mesh for both hemispheres. 125 | """ 126 | coordL, facesL = meshL 127 | coordR, facesR = meshR 128 | coord = np.vstack((coordL, coordR)) 129 | faces = np.vstack((facesL, facesR+len(coordL))) 130 | return coord, faces 131 | 132 | # loads all available surface meshes 133 | 134 | def load_surfaces(example_filename=None, filename_sulc=None): 135 | """ 136 | Loads all available surface meshes and sulcal depth file. 137 | Combines the left and right hemispheres into joint meshes for the whole brain. 138 | With no arguments loads the HCP S1200 group average meshes. 139 | If loading subject specific meshes it is enough to specify a single `example_filename` being one of 140 | `white|midthickness|pial|inflated|very_inflated` type e.g. 141 | 142 | ``` 143 | mesh = load_surfaces(example_filename='PATH/sub-44.L.pial.32k_fs_LR.surf.gii') 144 | ``` 145 | The function will load all available surfaces from that location.
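If the sulcal depth file does not follow this naming convention, its path can be given explicitly via the `filename_sulc` argument; otherwise it is guessed from the same pattern.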
146 | 147 | """ 148 | if example_filename is None: 149 | filename_pattern = str(PKGDATA / 'S1200.{}.{}_MSMAll.32k_fs_LR.surf.gii') 150 | else: 151 | filename_pattern = re.sub('\.(L|R)\.', '.{}.', example_filename) 152 | filename_pattern = re.sub('white|midthickness|pial|inflated|very_inflated', '{}', filename_pattern) 153 | 154 | flatsphere_pattern = str(PKGDATA / 'S1200.{}.{}.32k_fs_LR.surf.gii') 155 | 156 | meshes = Bunch() 157 | for variant in ['white', 'midthickness', 'pial', 'inflated', 'very_inflated', 'flat' , 'sphere']: 158 | count = 0 159 | for hemisphere, hemisphere_name in [('L', 'left'), ('R', 'right')]: 160 | if variant in ['flat' , 'sphere']: 161 | filename = flatsphere_pattern.format(hemisphere, variant) 162 | else: 163 | filename = filename_pattern.format(hemisphere, variant) 164 | if os.path.exists(filename): 165 | coord, faces = surface.load_surf_mesh(filename) 166 | if variant=='flat': 167 | coordnew = np.zeros_like(coord) 168 | coordnew[:, 1] = coord[:, 0] 169 | coordnew[:, 2] = coord[:, 1] 170 | coordnew[:, 0] = 0 171 | coord = coordnew 172 | meshes[variant+'_'+hemisphere_name] = coord, faces 173 | count += 1 174 | else: 175 | print('Cannot find', filename) 176 | if count==2: 177 | if variant == 'flat': 178 | coordl, facesl = meshes['flat_left'] 179 | coordr, facesr = meshes['flat_right'] 180 | coordlnew = coordl.copy() 181 | coordlnew[:, 1] = coordl[:, 1] - 250.0 182 | coordrnew = coordr.copy() 183 | coordrnew[:, 1] = coordr[:, 1] + 250.0 184 | meshes['flat'] = combine_meshes( (coordlnew, facesl), (coordrnew, facesr) ) 185 | else: 186 | meshes[variant] = combine_meshes(meshes[variant+'_left'], meshes[variant+'_right']) 187 | 188 | if filename_sulc is None: 189 | filename_sulc = filename_pattern.format('XX','XX').replace('XX.XX', 'sulc').replace('surf.gii','dscalar.nii') 190 | if os.path.exists(filename_sulc): 191 | sulc_data = - nib.load(filename_sulc).get_fdata()[0] 192 | if len(sulc_data)==59412: 193 | # this happens for HCP S1200 group average data 194 | sulc_data = cortex_data(sulc_data) 195 | meshes['sulc'] = sulc_data 196 | num = len(meshes.sulc) 197 | meshes['sulc_left'] = meshes.sulc[:num//2] 198 | meshes['sulc_right'] = meshes.sulc[num//2:] 199 | else: 200 | print('Cannot load file {} with sulcal depth data'.format(filename_sulc)) 201 | 202 | return meshes 203 | 204 | mesh = load_surfaces() 205 | 206 | # parcellations 207 | 208 | def _load_hcp_parcellation(variant=None): 209 | allowed = ['mmp', 'ca_network', 'ca_parcels', 'yeo7', 'yeo17', 'standard'] 210 | if variant not in allowed: 211 | print('argument should be one of ' + ','.join(allowed)) 212 | return 213 | 214 | if variant=='standard': 215 | parcnpz = np.load(PKGDATA / 'standard.npz') 216 | if variant=='mmp': 217 | parcnpz = np.load(PKGDATA / 'mmp_1.0.npz') 218 | if variant=='ca_network': 219 | parcnpz = np.load(PKGDATA / 'ca_network_1.1.npz') 220 | if variant=='ca_parcels': 221 | parcnpz = np.load(PKGDATA / 'ca_parcels_1.1.npz') 222 | if variant=='yeo7': 223 | parcnpz = np.load(PKGDATA / 'yeo7.npz') 224 | if variant=='yeo17': 225 | parcnpz = np.load(PKGDATA / 'yeo17.npz') 226 | 227 | parcellation = Bunch() 228 | parcellation.ids = parcnpz['ids'] 229 | parcellation.map_all = parcnpz['map_all'] 230 | 231 | labels = parcnpz['labels'] 232 | labelsdict = dict() 233 | rgba = parcnpz['rgba'] 234 | rgbadict = dict() 235 | for i, k in enumerate(parcellation.ids): 236 | labelsdict[k] = labels[i] 237 | rgbadict[k] = rgba[i] 238 | 239 | parcellation.labels = labelsdict 240 | parcellation.rgba = rgbadict 241 | 242 
| i = 0 243 | nontrivial_ids = [] 244 | for k in parcellation.ids: 245 | if k!=0: 246 | nontrivial_ids.append(k) 247 | i += 1 248 | parcellation.nontrivial_ids = np.array(nontrivial_ids) 249 | 250 | return parcellation 251 | 252 | # predefined parcellations 253 | 254 | mmp = _load_hcp_parcellation('mmp') 255 | ca_network = _load_hcp_parcellation('ca_network') 256 | ca_parcels = _load_hcp_parcellation('ca_parcels') 257 | yeo7 = _load_hcp_parcellation('yeo7') 258 | yeo17 = _load_hcp_parcellation('yeo17') 259 | standard = _load_hcp_parcellation('standard') 260 | 261 | def view_parcellation(meshLR, parcellation): 262 | """ 263 | View the given parcellation on a whole brain surface mesh. 264 | """ 265 | # for some parcellations the numerical ids need not be consecutive 266 | cortex_map = cortex_data(parcellation.map_all) 267 | ids = np.unique(cortex_map) 268 | normalized_cortex_map = np.zeros_like(cortex_map) 269 | rgba = np.zeros((len(ids), 4)) 270 | for i in range(len(ids)): 271 | ind = cortex_map==ids[i] 272 | normalized_cortex_map[ind] = i 273 | rgba[i,:] = parcellation.rgba[ids[i]] 274 | 275 | cmap = matplotlib.colors.ListedColormap(rgba) 276 | return plotting.view_surf(meshLR, normalized_cortex_map, symmetric_cmap=False, cmap=cmap) 277 | 278 | def parcellation_labels(parcellation): 279 | """ 280 | Displays names of ROIs in a parcellation together with color coding and the corresponding numeric ids. 281 | """ 282 | n = len(parcellation.ids) 283 | ncols = 4 284 | nrows = n // ncols + 1 285 | 286 | dpi = 72 287 | h = 12 288 | dh = 6 289 | H = h + dh 290 | 291 | Y = (nrows + 1) * H 292 | fig_height = Y / dpi 293 | 294 | fig, ax = plt.subplots(figsize=(18, fig_height)) 295 | X, _ = fig.get_dpi() * fig.get_size_inches() 296 | w = X/ncols 297 | 298 | for i in range(n): 299 | k = parcellation.ids[i] 300 | label = parcellation.labels[k] 301 | if label == '': 302 | label = 'None' 303 | 304 | name = '{} ({})'.format(label, k) 305 | 306 | col = i // nrows 307 | row = i % nrows 308 | y = Y - (row * H) - H 309 | 310 | xi = w * (col + 0.05) 311 | xf = w * (col + 0.25) 312 | xt = w * (col + 0.3) 313 | 314 | ax.text(xt, y + h/2 , name, fontsize=h, horizontalalignment='left', verticalalignment='center') 315 | 316 | ax.add_patch(mpatches.Rectangle((xi, y), xf-xi, h ,linewidth=1,edgecolor='k',facecolor=parcellation.rgba[k])) 317 | 318 | ax.set_xlim(0, X) 319 | ax.set_ylim(0, Y) 320 | ax.set_axis_off() 321 | 322 | plt.subplots_adjust(left=0, right=1, top=1, bottom=0, hspace=0, wspace=0) 323 | 324 | 325 | def parcellate(X, parcellation, method=np.mean): 326 | """ 327 | Parcellates the data into ROIs using `method` (mean by default). Ignores the unassigned grayordinates with id=0. 328 | Works both for time-series 2D data and snapshot 1D data. 329 | """ 330 | n = np.sum(parcellation.ids!=0) 331 | if X.ndim==2: 332 | Xp = np.zeros((len(X), n), dtype=X.dtype) 333 | else: 334 | Xp = np.zeros(n, dtype=X.dtype) 335 | i = 0 336 | for k in parcellation.ids: 337 | if k!=0: 338 | if X.ndim==2: 339 | Xp[:, i] = method(X[:, parcellation.map_all==k], axis=1) 340 | else: 341 | Xp[i] = method(X[parcellation.map_all==k]) 342 | i += 1 343 | return Xp 344 | 345 | def unparcellate(Xp, parcellation): 346 | """ 347 | Takes as input time-series (2D) or snapshot (1D) parcellated data. 348 | Creates full grayordinate data with grayordinates set to the value of the parcellated data. 349 | Can be useful for visualization.
350 | """ 351 | n = len(parcellation.map_all) 352 | if Xp.ndim==2: 353 | X = np.zeros((len(Xp), n), dtype=Xp.dtype) 354 | else: 355 | X = np.zeros(n, dtype=Xp.dtype) 356 | i = 0 357 | for k in parcellation.ids: 358 | if k!=0: 359 | if Xp.ndim==2: 360 | X[:, parcellation.map_all==k] = Xp[:,i][:,np.newaxis] 361 | else: 362 | X[parcellation.map_all==k] = Xp[i] 363 | i += 1 364 | return X 365 | 366 | def mask(X, mask, fill=0): 367 | """ 368 | Takes 1D data `X` and a mask `mask`. Sets the exterior of mask to a constant (by default zero). 369 | Can be useful for visualization. 370 | """ 371 | X_masked = np.zeros_like(X) 372 | X_masked[:] = fill 373 | X_masked[mask] = X[mask] 374 | return X_masked 375 | 376 | 377 | def ranking(Xp, parcellation, descending=True): 378 | """ 379 | Returns a dataframe with sorted values in the 1D parcellated array with appropriate labels 380 | """ 381 | ind = np.argsort(Xp) 382 | if descending: 383 | ind = ind[::-1] 384 | labels = [] 385 | ids = [] 386 | for i in range(len(Xp)): 387 | j = ind[i] 388 | k = parcellation.nontrivial_ids[j] 389 | labels.append(parcellation.labels[k]) 390 | ids.append(k) 391 | return pd.DataFrame({'region':labels, 'id':ids, 'data':Xp[ind]}) 392 | 393 | 394 | def make_lr_parcellation(parcellation): 395 | """ 396 | Takes the given parcellation and produces a new one where parcels in the left and right hemisphere are made to be distinct. 397 | Subcortical voxels are set to 0 (unassigned). 398 | """ 399 | map_all = np.zeros_like(parcellation.map_all) 400 | left = parcellation.map_all[struct.cortex_left] 401 | right = parcellation.map_all[struct.cortex_right] 402 | left_ids = np.unique(left) 403 | right_ids = np.unique(right) 404 | if left_ids[0]==0: 405 | left_ids = left_ids[1:] 406 | if right_ids[0]==0: 407 | right_ids = right_ids[1:] 408 | 409 | n_left_ids = len(left_ids) 410 | n_right_ids = len(right_ids) 411 | new_left_ids = np.arange(n_left_ids) + 1 412 | new_right_ids = np.arange(n_right_ids) + new_left_ids[-1] + 1 413 | 414 | labels = dict() 415 | labels[0] = '' 416 | ids = [0] 417 | nontrivial_ids = [] 418 | rgba = dict() 419 | rgba[0] = np.array([1.0,1.0,1.0,1.0]) 420 | 421 | for i in range(n_left_ids): 422 | old_id = left_ids[i] 423 | new_id = new_left_ids[i] 424 | map_all[struct.cortex_left][left==old_id] = new_id 425 | labels[new_id] = parcellation.labels[old_id] + ' L' 426 | ids.append(new_id) 427 | nontrivial_ids.append(new_id) 428 | color = parcellation.rgba[old_id].copy() 429 | color[:3] = color[:3] * 0.7 430 | rgba[new_id] = color 431 | 432 | for i in range(n_right_ids): 433 | old_id = right_ids[i] 434 | new_id = new_right_ids[i] 435 | map_all[struct.cortex_right][right==old_id] = new_id 436 | labels[new_id] = parcellation.labels[old_id] + ' R' 437 | ids.append(new_id) 438 | nontrivial_ids.append(new_id) 439 | rgba[new_id] = parcellation.rgba[old_id] 440 | 441 | new_parcellation = Bunch() 442 | new_parcellation.map_all = map_all 443 | new_parcellation.labels = labels 444 | new_parcellation.ids = ids 445 | new_parcellation.nontrivial_ids = nontrivial_ids 446 | new_parcellation.rgba = rgba 447 | 448 | return new_parcellation 449 | 450 | # Other utilities 451 | 452 | def normalize(X): 453 | """ 454 | Normalizes data so that each grayordinate has zero (temporal) mean and unit standard deviation. 
455 | """ 456 | return (X - np.mean(X,axis=0))/np.std(X,axis=0) 457 | 458 | 459 | # cortical adjacency matrix 460 | 461 | cortical_adjacency = load_npz(PKGDATA / 'cortical_adjacency.npz') 462 | 463 | def cortical_components(condition, cutoff=0): 464 | """ 465 | Decomposes boolean array condition into connected components on the cortex. 466 | Returns `n_components`, `sizes` and an integer array `rois` with corresponding labels. 467 | 0 means unassigned. 468 | """ 469 | 470 | condition_cortex = condition[struct.cortex] 471 | rois = np.zeros(len(condition), dtype=int) 472 | G = cortical_adjacency[condition_cortex, :][:, condition_cortex] 473 | n_components, labels = connected_components(G) 474 | _, counts = np.unique(labels, return_counts=True) 475 | 476 | perm = np.argsort(counts)[::-1] 477 | invperm = np.argsort(perm) 478 | labels = invperm[labels] + 1 479 | rois[struct.cortex][condition_cortex] = labels 480 | sizes = counts[perm] 481 | 482 | if cutoff>0: 483 | maxc_arr = np.where(sizes < cutoff)[0] 484 | if len(maxc_arr) > 0: 485 | maxc = maxc_arr[0] 486 | n_components = maxc 487 | sizes = sizes[:maxc] 488 | rois[rois>maxc] = 0 489 | 490 | return n_components, sizes, rois 491 | 492 | 493 | 494 | -------------------------------------------------------------------------------- /images/image.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/images/image.png -------------------------------------------------------------------------------- /prepare/README.md: -------------------------------------------------------------------------------- 1 | The scripts in this folder were used to prepare the `numpy` arrays with the parcellation data in the `data` folder used by `hcp_utils.py`.
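Each generated parcellation archive stores the same four arrays that `_load_hcp_parcellation` in `hcp_utils.py` reads back: `map_all`, `labels`, `rgba` and `ids`. A quick sanity check after regenerating one of the files (a sketch, assuming it is run from the repository root) could be:

```
import numpy as np

parc = np.load('hcp_utils/data/ca_network_1.1.npz')
print(sorted(parc.files))                  # ['ids', 'labels', 'map_all', 'rgba']
print(parc['map_all'].shape)               # (91282,) - one id per grayordinate
print(len(parc['ids']) == len(parc['labels']) == len(parc['rgba']))   # True
```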
2 | 3 | -------------------------------------------------------------------------------- /prepare/prepare_adjacency.py: -------------------------------------------------------------------------------- 1 | from nilearn import surface 2 | import numpy as np 3 | from scipy.sparse import csr_matrix, save_npz 4 | from scipy.sparse.csgraph import connected_components 5 | 6 | coordl, facesl = surface.load_surf_mesh('../hcp_utils/data/S1200.L.pial_MSMAll.32k_fs_LR.surf.gii') 7 | coordr, facesr = surface.load_surf_mesh('../hcp_utils/data/S1200.R.pial_MSMAll.32k_fs_LR.surf.gii') 8 | 9 | faces = np.vstack((facesl, facesr+len(coordl))) 10 | 11 | vertex_data = np.load('../hcp_utils/data/fMRI_vertex_info_32k.npz') 12 | 13 | grayl = vertex_data['grayl'] 14 | grayr = vertex_data['grayr'] 15 | 16 | gray = np.hstack((grayl, grayr + len(coordl))) 17 | 18 | print(len(gray), np.amax(gray)) 19 | 20 | gray_vertex = dict() 21 | 22 | for i, g in enumerate(gray): 23 | gray_vertex[g] = i 24 | 25 | 26 | print() 27 | print(gray_vertex[180], gray_vertex[235], gray_vertex[12]) 28 | 29 | nv = len(grayl) + len(grayr) 30 | 31 | def adjacency1(triangles, nv=nv): 32 | # standard adjacency matrix containing all edges between cortical grayordinates 33 | 34 | adj_rows = [] 35 | adj_cols = [] 36 | dist = [] 37 | 38 | used=set() 39 | for i1,i2,i3 in triangles: 40 | for p in [(i1,i2), (i2,i1), (i1,i3), (i3,i1), (i2,i3), (i3,i2)]: 41 | if p in used: 42 | continue 43 | i,j=p 44 | 45 | if i in gray_vertex and j in gray_vertex: 46 | adj_rows.append(gray_vertex[i]) 47 | adj_cols.append(gray_vertex[j]) 48 | dist.append(1) 49 | used.add(p) 50 | 51 | adj=csr_matrix((dist, (adj_rows, adj_cols)), shape=(nv,nv), dtype=int) 52 | return adj 53 | 54 | 55 | adj1 = adjacency1(faces) 56 | 57 | 58 | print(94, adj1.getrow(94)) # 11, 12, 20, 21, 95, 102 59 | print(21, adj1.getrow(21)) # should contain 20, 94, 102 but not 11 and 95 60 | 61 | #print(21, adj2.getrow(21)) # should contain 20, 94, 102 and 11 and 95 62 | 63 | n_components, labels = connected_components(adj1, directed=False) 64 | print(len(grayl), len(grayr)) 65 | print(n_components, np.unique(labels, return_counts=True)) # one should get 2 components: left and right cortex 66 | 67 | save_npz('../hcp_utils/data/cortical_adjacency.npz', adj1) 68 | -------------------------------------------------------------------------------- /prepare/prepare_ca.py: -------------------------------------------------------------------------------- 1 | import nibabel as nib 2 | import numpy as np 3 | 4 | ca_net = nib.load('../source_data/CortexSubcortex_ColeAnticevic_NetPartition_wSubcorGSR_netassignments_LR.dlabel.nii') 5 | 6 | # rois of grayordinates in the cortex 7 | 8 | rois = ca_net.dataobj[0].astype(int) 9 | 10 | axis0 = ca_net.header.get_index_map(0) 11 | nmap = list(axis0.named_maps)[0] 12 | 13 | labels = [] 14 | rgba = [] 15 | keys = [] 16 | 17 | # left hemisphere 18 | for i in nmap.label_table: 19 | roi = nmap.label_table[i] 20 | labels.append(roi.label[:-4]) 21 | keys.append(roi.key) 22 | rgba.append((roi.red, roi.green, roi.blue, roi.alpha)) 23 | 24 | # replace shorthand labels by full names from ColeAnticevicNetPartition/network_labelfile.txt 25 | labels = ['', 'Visual1', 'Visual2', 'Somatomotor', 'Cingulo-Opercular', 'Dorsal-attention', 'Language', 'Frontoparietal', 'Auditory', 'Default', 'Posterior-Multimodal', 'Ventral-Multimodal', 'Orbito-Affective'] 26 | 27 | labels = np.array(labels) 28 | rgba = np.array(rgba) 29 | keys = np.array(keys) 30 | 31 | 
np.savez_compressed('../hcp_utils/data/ca_network_1.1.npz', map_all=rois, labels=labels, rgba=rgba, ids=keys) 32 | 33 | #print(np.unique(rois)) 34 | 35 | ca_parcels = nib.load('../source_data/CortexSubcortex_ColeAnticevic_NetPartition_wSubcorGSR_parcels_LR.dlabel.nii') 36 | 37 | # parcel id for each grayordinate 38 | 39 | rois = ca_parcels.dataobj[0].astype(int) 40 | 41 | axis0 = ca_parcels.header.get_index_map(0) 42 | nmap = list(axis0.named_maps)[0] 43 | 44 | labels = [] 45 | rgba = [] 46 | keys = [] 47 | 48 | # all parcels from the label table 49 | for i in nmap.label_table: 50 | roi = nmap.label_table[i] 51 | labels.append(roi.label) 52 | keys.append(roi.key) 53 | rgba.append((roi.red, roi.green, roi.blue, roi.alpha)) 54 | 55 | #print(labels) 56 | #print() 57 | #print(rgba) 58 | #print() 59 | 60 | #print(np.unique(rois)) 61 | #print(len(rois)) 62 | 63 | labels = np.array(labels) 64 | rgba = np.array(rgba) 65 | keys = np.array(keys) 66 | 67 | np.savez_compressed('../hcp_utils/data/ca_parcels_1.1.npz', map_all=rois, labels=labels, rgba=rgba, ids=keys) 68 | -------------------------------------------------------------------------------- /prepare/prepare_mmp.py: -------------------------------------------------------------------------------- 1 | import nibabel as nib 2 | import numpy as np 3 | 4 | mmp = nib.load('../source_data/Q1-Q6_RelatedValidation210.CorticalAreas_dil_Final_Final_Areas_Group_Colors.32k_fs_LR.dlabel.nii') 5 | 6 | # rois of grayordinates in the cortex 7 | 8 | rois = mmp.dataobj[0].astype(int) 9 | 10 | # change ids so that smaller numbers are in the left hemisphere 11 | # for consistency with Cole-Anticevic parcellation 12 | 13 | flip_ordering = np.array([0] + list(range(181, 361)) + list(range(1, 181))) 14 | rois = flip_ordering[rois] 15 | 16 | # extract parcel names and colors 17 | 18 | axis0=mmp.header.get_index_map(0) 19 | nmap=list(axis0.named_maps)[0] 20 | 21 | keys = [0] 22 | labels = [''] 23 | rgba = [(0.0, 0.0, 0.0, 0.0)] 24 | 25 | # left hemisphere 26 | for i in range(181, 361): 27 | roi = nmap.label_table[i] 28 | labels.append(roi.label[:-4]) 29 | rgba.append((roi.red, roi.green, roi.blue, roi.alpha)) 30 | keys.append(i - 180) 31 | 32 | # right hemisphere 33 | for i in range(1, 181): 34 | roi = nmap.label_table[i] 35 | labels.append(roi.label[:-4]) 36 | rgba.append((roi.red, roi.green, roi.blue, roi.alpha)) 37 | keys.append(i + 180) 38 | 39 | 40 | # extend the cortical parcellation by the standard subcortical structures from CIFTI-2 41 | 42 | map_all = np.zeros(91282, dtype=int) 43 | 44 | map_all[:59412] = rois 45 | 46 | accumbens_left_=slice(59412,59547) 47 | accumbens_right_=slice(59547,59687) 48 | amygdala_left_=slice(59687,60002) 49 | amygdala_right_=slice(60002,60334) 50 | brainStem_=slice(60334,63806) 51 | caudate_left_=slice(63806,64534) 52 | caudate_right_=slice(64534,65289) 53 | cerebellum_left_=slice(65289,73998) 54 | cerebellum_right_=slice(73998,83142) 55 | diencephalon_left_=slice(83142,83848) 56 | diencephalon_right_=slice(83848,84560) 57 | hippocampus_left_=slice(84560,85324) 58 | hippocampus_right_=slice(85324,86119) 59 | pallidum_left_=slice(86119,86416) 60 | pallidum_right_=slice(86416,86676) 61 | putamen_left_=slice(86676,87736) 62 | putamen_right_=slice(87736,88746) 63 | thalamus_left_=slice(88746,90034) 64 | thalamus_right_=slice(90034,None) 65 | 66 | structures_subcortical=[accumbens_left_, accumbens_right_, amygdala_left_, amygdala_right_, 67 | caudate_left_, caudate_right_, cerebellum_left_, cerebellum_right_, diencephalon_left_, diencephalon_right_,
68 | hippocampus_left_, hippocampus_right_, pallidum_left_, pallidum_right_, putamen_left_, putamen_right_, 69 | thalamus_left_, thalamus_right_, brainStem_] 70 | 71 | names_subcortical=['accumbens_left', 'accumbens_right', 'amygdala_left', 'amygdala_right', 72 | 'caudate_left', 'caudate_right', 'cerebellum_left', 'cerebellum_right', 'diencephalon_left', 'diencephalon_right', 73 | 'hippocampus_left', 'hippocampus_right', 'pallidum_left', 'pallidum_right', 'putamen_left', 'putamen_right', 74 | 'thalamus_left', 'thalamus_right', 'brainStem'] 75 | 76 | num_subcortical = len(structures_subcortical) 77 | 78 | for i in range(num_subcortical): 79 | struct = structures_subcortical[i] 80 | name = names_subcortical[i] 81 | map_all[struct] = 361 + i 82 | keys.append(361 + i) 83 | 84 | labels.extend(names_subcortical) 85 | 86 | # random colors for subcortical parts 87 | np.random.seed(555) 88 | colors2 = np.random.uniform(size=((num_subcortical-1)//2, 4)) 89 | colors1 = 0.8 * colors2 90 | 91 | rgba_sc = np.zeros((num_subcortical, 4)) 92 | rgba_sc[:-1][::2] = colors1 93 | rgba_sc[1::2] = colors2 94 | rgba_sc[:,3] = 1.0 95 | 96 | 97 | labels = np.array(labels) 98 | rgba = np.vstack((np.array(rgba), rgba_sc)) 99 | keys = np.array(keys) 100 | 101 | 102 | np.savez_compressed('../hcp_utils/data/mmp_1.0.npz', map_all=map_all, labels=labels, rgba=rgba, ids=keys) 103 | 104 | -------------------------------------------------------------------------------- /prepare/prepare_standard.py: -------------------------------------------------------------------------------- 1 | import nibabel as nib 2 | import numpy as np 3 | 4 | 5 | map_all = np.zeros(91282, dtype=int) 6 | 7 | cortex_left_=slice(0,29696) 8 | cortex_right_=slice(29696,59412) 9 | 10 | keys = [0, 1, 2] 11 | labels = ['', 'cortex_left', 'cortex_right'] 12 | rgba = [(1.0, 1.0, 1.0, 1.0), (0.87, 0.09, 0.16, 1.0), (0.92, 0.12, 0.18, 1.0)] 13 | 14 | map_all[cortex_left_] = 1 15 | map_all[cortex_right_] = 2 16 | 17 | # extend the cortical parcellation by the standard subcortical structures from CIFTI-2 18 | 19 | 20 | accumbens_left_=slice(59412,59547) 21 | accumbens_right_=slice(59547,59687) 22 | amygdala_left_=slice(59687,60002) 23 | amygdala_right_=slice(60002,60334) 24 | brainStem_=slice(60334,63806) 25 | caudate_left_=slice(63806,64534) 26 | caudate_right_=slice(64534,65289) 27 | cerebellum_left_=slice(65289,73998) 28 | cerebellum_right_=slice(73998,83142) 29 | diencephalon_left_=slice(83142,83848) 30 | diencephalon_right_=slice(83848,84560) 31 | hippocampus_left_=slice(84560,85324) 32 | hippocampus_right_=slice(85324,86119) 33 | pallidum_left_=slice(86119,86416) 34 | pallidum_right_=slice(86416,86676) 35 | putamen_left_=slice(86676,87736) 36 | putamen_right_=slice(87736,88746) 37 | thalamus_left_=slice(88746,90034) 38 | thalamus_right_=slice(90034,None) 39 | 40 | structures_subcortical=[accumbens_left_, accumbens_right_, amygdala_left_, amygdala_right_, 41 | caudate_left_, caudate_right_, cerebellum_left_, cerebellum_right_, diencephalon_left_, diencephalon_right_, 42 | hippocampus_left_, hippocampus_right_, pallidum_left_, pallidum_right_, putamen_left_, putamen_right_, 43 | thalamus_left_, thalamus_right_, brainStem_] 44 | 45 | names_subcortical=['accumbens_left', 'accumbens_right', 'amygdala_left', 'amygdala_right', 46 | 'caudate_left', 'caudate_right', 'cerebellum_left', 'cerebellum_right', 'diencephalon_left', 'diencephalon_right', 47 | 'hippocampus_left', 'hippocampus_right', 'pallidum_left', 'pallidum_right', 'putamen_left', 'putamen_right', 48 |
'thalamus_left', 'thalamus_right', 'brainStem'] 49 | 50 | num_subcortical = len(structures_subcortical) 51 | 52 | for i in range(num_subcortical): 53 | struct = structures_subcortical[i] 54 | name = names_subcortical[i] 55 | map_all[struct] = 361 + i 56 | keys.append(361 + i) 57 | 58 | labels.extend(names_subcortical) 59 | 60 | # random colors for subcortical parts 61 | np.random.seed(555) 62 | colors2 = np.random.uniform(size=((num_subcortical-1)//2, 4)) 63 | colors1 = 0.8 * colors2 64 | 65 | rgba_sc = np.zeros((num_subcortical, 4)) 66 | rgba_sc[:-1][::2] = colors1 67 | rgba_sc[1::2] = colors2 68 | rgba_sc[:,3] = 1.0 69 | 70 | 71 | labels = np.array(labels) 72 | rgba = np.vstack((np.array(rgba), rgba_sc)) 73 | keys = np.array(keys) 74 | 75 | 76 | np.savez_compressed('../hcp_utils/data/standard.npz', map_all=map_all, labels=labels, rgba=rgba, ids=keys) 77 | 78 | -------------------------------------------------------------------------------- /prepare/prepare_yeo.py: -------------------------------------------------------------------------------- 1 | import nibabel as nib 2 | import numpy as np 3 | 4 | rsn = nib.load('../source_data/RSN.dlabel.nii') 5 | 6 | #print(rsn.shape) 7 | 8 | rois = rsn.get_fdata().astype(int) 9 | 10 | # for i in range(4): 11 | # print(i, np.unique(rois[i])) 12 | 13 | vertex_data = np.load('../hcp_utils/data/fMRI_vertex_info_32k.npz') 14 | grayl = vertex_data['grayl'] 15 | grayr = vertex_data['grayr'] 16 | 17 | cortex_left_=slice(0,29696) 18 | cortex_right_=slice(29696,59412) 19 | 20 | 21 | 22 | # Yeo 7-network parcellation 23 | 24 | yeo7lv=rois[0,:32492] 25 | yeo7rv=rois[0,32492:] 26 | 27 | yeo7 = np.zeros(91282, dtype=int) 28 | yeo7l = yeo7[cortex_left_] 29 | yeo7r = yeo7[cortex_right_] 30 | 31 | yeo7l[:] = yeo7lv[grayl] 32 | yeo7r[:] = yeo7rv[grayr] 33 | 34 | #print(np.unique(yeo7l)) 35 | #print(np.unique(yeo7r)) 36 | 37 | mapyeo7=dict() 38 | mapyeo7[37]=0 39 | mapyeo7[41]=1 40 | mapyeo7[43]=2 41 | mapyeo7[38]=3 42 | mapyeo7[44]=4 43 | mapyeo7[42]=5 44 | mapyeo7[39]=6 45 | mapyeo7[40]=7 46 | 47 | for v in mapyeo7: 48 | yeo7[yeo7==v]=mapyeo7[v] 49 | 50 | #print(np.unique(yeo7)) 51 | 52 | labels = ['', 'Visual', 'Somatomotor', 'Dorsal Attention', 'Ventral Attention', 'Limbic', 'Frontoparietal', 'Default'] 53 | 54 | axis0=rsn.header.get_index_map(0) 55 | nmap=list(axis0.named_maps)[0] 56 | 57 | rgba = [(1.0,1.0,1.0,1.0)] 58 | 59 | for i in [41, 43, 38, 44, 42, 39, 40]: 60 | lab=nmap.label_table[i] 61 | rgba.append((lab.red, lab.green, lab.blue, 1.0)) 62 | 63 | labels = np.array(labels) 64 | rgba = np.array(rgba) 65 | keys = np.arange(8) 66 | 67 | #print(labels) 68 | #print(rgba) 69 | 70 | np.savez_compressed('../hcp_utils/data/yeo7.npz', map_all=yeo7, labels=labels, rgba=rgba, ids=keys) 71 | 72 | 73 | # Yeo 17-network parcellation 74 | 75 | yeo17lv=rois[1,:32492] 76 | yeo17rv=rois[1,32492:] 77 | 78 | yeo17 = np.zeros(91282, dtype=int) 79 | yeo17l = yeo17[cortex_left_] 80 | yeo17r = yeo17[cortex_right_] 81 | 82 | yeo17l[:] = yeo17lv[grayl] 83 | yeo17r[:] = yeo17rv[grayr] 84 | 85 | #print(np.unique(yeo17l)) 86 | #print(np.unique(yeo17r)) 87 | 88 | mapyeo17=dict() 89 | mapyeo17[37]=0 90 | mapyeo17[54]=1 91 | mapyeo17[45]=2 92 | mapyeo17[58]=3 93 | mapyeo17[55]=4 94 | mapyeo17[50]=5 95 | mapyeo17[47]=6 96 | mapyeo17[60]=7 97 | mapyeo17[59]=8 98 | mapyeo17[56]=9 99 | mapyeo17[49]=10 100 | mapyeo17[57]=11 101 | mapyeo17[48]=12 102 | mapyeo17[51]=13 103 | mapyeo17[61]=14 104 | mapyeo17[53]=15 105 | mapyeo17[46]=16 106 | mapyeo17[52]=17 107 | 108 | for v in mapyeo17: 109 |
yeo17[yeo17==v]=mapyeo17[v] 110 | 111 | #print(np.unique(yeo17)) 112 | 113 | labels = [''] + ['network_{}'.format(i) for i in range(1,18)] 114 | 115 | axis0=rsn.header.get_index_map(0) 116 | nmap=list(axis0.named_maps)[0] 117 | 118 | rgba = [(1.0,1.0,1.0,1.0)] 119 | 120 | for i in [54, 45, 58, 55, 50, 47, 60, 59, 56, 49, 57, 48, 51, 61, 53, 46, 52]: 121 | lab=nmap.label_table[i] 122 | rgba.append((lab.red, lab.green, lab.blue, 1.0)) 123 | 124 | labels = np.array(labels) 125 | rgba = np.array(rgba) 126 | keys = np.arange(18) 127 | 128 | #print(labels) 129 | #print(rgba) 130 | 131 | np.savez_compressed('../hcp_utils/data/yeo17.npz', map_all=yeo17, labels=labels, rgba=rgba, ids=keys) 132 | -------------------------------------------------------------------------------- /setup.py: -------------------------------------------------------------------------------- 1 | import setuptools 2 | 3 | with open("README.md", "r") as fh: 4 | long_description = fh.read() 5 | 6 | setuptools.setup( 7 | name="hcp_utils", 8 | version="0.1.0", 9 | author="Romuald A. Janik", 10 | author_email="romuald.janik@gmail.com", 11 | description="A set of utilities for using HCP-style fMRI data with nilearn.", 12 | long_description=long_description, 13 | long_description_content_type="text/markdown", 14 | url="https://github.com/rmldj/hcp-utils", 15 | packages=['hcp_utils'], 16 | package_data={'hcp_utils': ['data/*']}, 17 | classifiers=[ 18 | "Programming Language :: Python :: 3", 19 | "License :: OSI Approved :: MIT License", 20 | "Operating System :: OS Independent", 21 | ], 22 | python_requires='>=3.6', 23 | zip_safe=False 24 | ) 25 | -------------------------------------------------------------------------------- /source_data/CortexSubcortex_ColeAnticevic_NetPartition_wSubcorGSR_netassignments_LR.dlabel.nii: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/source_data/CortexSubcortex_ColeAnticevic_NetPartition_wSubcorGSR_netassignments_LR.dlabel.nii -------------------------------------------------------------------------------- /source_data/CortexSubcortex_ColeAnticevic_NetPartition_wSubcorGSR_parcels_LR.dlabel.nii: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/source_data/CortexSubcortex_ColeAnticevic_NetPartition_wSubcorGSR_parcels_LR.dlabel.nii -------------------------------------------------------------------------------- /source_data/Q1-Q6_RelatedValidation210.CorticalAreas_dil_Final_Final_Areas_Group_Colors.32k_fs_LR.dlabel.nii: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/source_data/Q1-Q6_RelatedValidation210.CorticalAreas_dil_Final_Final_Areas_Group_Colors.32k_fs_LR.dlabel.nii -------------------------------------------------------------------------------- /source_data/README.md: -------------------------------------------------------------------------------- 1 | 2 | # MMP 1.0 3 | 4 | 'Q1-Q6_RelatedValidation210.CorticalAreas_dil_Final_Final_Areas_Group_Colors.32k_fs_LR.dlabel.nii' 5 | 6 | from 7 | 8 | https://balsa.wustl.edu/78X3 9 | 10 | 11 | # Cole-Anticevic network partition and parcellation 12 | 13 | The files are from https://github.com/ColeLab/ColeAnticevicNetPartition 14 | 15 | 16 |
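For reference, the scripts in `prepare/` read these `.dlabel.nii` files with `nibabel`: the per-grayordinate label ids come from the first row of the data array, and the names and colors come from the CIFTI-2 label table. A minimal sketch of this access pattern (run from the repository root; the Cole-Anticevic network assignment file serves as the example):

```python
import nibabel as nib

img = nib.load('source_data/CortexSubcortex_ColeAnticevic_NetPartition_wSubcorGSR_netassignments_LR.dlabel.nii')

# integer label id for each grayordinate
rois = img.dataobj[0].astype(int)

# label names and RGBA colors from the CIFTI-2 label table
axis0 = img.header.get_index_map(0)
nmap = list(axis0.named_maps)[0]
for key in nmap.label_table:
    lab = nmap.label_table[key]
    print(key, lab.label, (lab.red, lab.green, lab.blue, lab.alpha))
```

`RSN.dlabel.nii` stores several maps in one file, so `prepare_yeo.py` instead takes rows 0 and 1 of `img.get_fdata()` for the Yeo 7- and 17-network parcellations.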
-------------------------------------------------------------------------------- /source_data/RSN-networks.32k_fs_LR.dlabel.nii: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/source_data/RSN-networks.32k_fs_LR.dlabel.nii -------------------------------------------------------------------------------- /source_data/RSN.dlabel.nii: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmldj/hcp-utils/41a3f1567dd1fa95da09113acc0a62dad4516fb2/source_data/RSN.dlabel.nii --------------------------------------------------------------------------------