├── .all-contributorsrc ├── .gitignore ├── LICENSE ├── README.md ├── img └── handbook.png └── md_files ├── hcp_datalad_dataset.md ├── template.md ├── timeseries_to_file.md └── values_on_parcels.md /.all-contributorsrc: -------------------------------------------------------------------------------- 1 | { 2 | "files": [ 3 | "README.md" 4 | ], 5 | "imageSize": 100, 6 | "commit": false, 7 | "contributors": [ 8 | { 9 | "login": "htwangtw", 10 | "name": "Hao-Ting Wang", 11 | "avatar_url": "https://avatars.githubusercontent.com/u/13743617?v=4", 12 | "profile": "https://wanghaoting.com/", 13 | "contributions": [ 14 | "ideas", 15 | "design", 16 | "content", 17 | "example" 18 | ] 19 | }, 20 | { 21 | "login": "iamdamion", 22 | "name": "Rockets2theMoon", 23 | "avatar_url": "https://avatars.githubusercontent.com/u/6740413?v=4", 24 | "profile": "http://damiondemeter.com", 25 | "contributions": [ 26 | "ideas", 27 | "design", 28 | "content", 29 | "example" 30 | ] 31 | }, 32 | { 33 | "login": "axiezai", 34 | "name": "Xihe Xie", 35 | "avatar_url": "https://avatars.githubusercontent.com/u/21200014?v=4", 36 | "profile": "https://axiezai.github.io", 37 | "contributions": [ 38 | "example" 39 | ] 40 | } 41 | ], 42 | "contributorsPerLine": 7, 43 | "projectName": "HCP-snippets", 44 | "projectOwner": "iamdamion", 45 | "repoType": "github", 46 | "repoHost": "https://github.com", 47 | "skipCi": true 48 | } 49 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | # mac stuff 2 | .DS_Store 3 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Creative Commons Legal Code 2 | 3 | CC0 1.0 Universal 4 | 5 | CREATIVE COMMONS CORPORATION IS NOT A LAW FIRM AND DOES NOT PROVIDE 6 | LEGAL SERVICES. DISTRIBUTION OF THIS DOCUMENT DOES NOT CREATE AN 7 | ATTORNEY-CLIENT RELATIONSHIP. CREATIVE COMMONS PROVIDES THIS 8 | INFORMATION ON AN "AS-IS" BASIS. CREATIVE COMMONS MAKES NO WARRANTIES 9 | REGARDING THE USE OF THIS DOCUMENT OR THE INFORMATION OR WORKS 10 | PROVIDED HEREUNDER, AND DISCLAIMS LIABILITY FOR DAMAGES RESULTING FROM 11 | THE USE OF THIS DOCUMENT OR THE INFORMATION OR WORKS PROVIDED 12 | HEREUNDER. 13 | 14 | Statement of Purpose 15 | 16 | The laws of most jurisdictions throughout the world automatically confer 17 | exclusive Copyright and Related Rights (defined below) upon the creator 18 | and subsequent owner(s) (each and all, an "owner") of an original work of 19 | authorship and/or a database (each, a "Work"). 20 | 21 | Certain owners wish to permanently relinquish those rights to a Work for 22 | the purpose of contributing to a commons of creative, cultural and 23 | scientific works ("Commons") that the public can reliably and without fear 24 | of later claims of infringement build upon, modify, incorporate in other 25 | works, reuse and redistribute as freely as possible in any form whatsoever 26 | and for any purposes, including without limitation commercial purposes. 27 | These owners may contribute to the Commons to promote the ideal of a free 28 | culture and the further production of creative, cultural and scientific 29 | works, or to gain reputation or greater distribution for their Work in 30 | part through the use and efforts of others. 
31 | 32 | For these and/or other purposes and motivations, and without any 33 | expectation of additional consideration or compensation, the person 34 | associating CC0 with a Work (the "Affirmer"), to the extent that he or she 35 | is an owner of Copyright and Related Rights in the Work, voluntarily 36 | elects to apply CC0 to the Work and publicly distribute the Work under its 37 | terms, with knowledge of his or her Copyright and Related Rights in the 38 | Work and the meaning and intended legal effect of CC0 on those rights. 39 | 40 | 1. Copyright and Related Rights. A Work made available under CC0 may be 41 | protected by copyright and related or neighboring rights ("Copyright and 42 | Related Rights"). Copyright and Related Rights include, but are not 43 | limited to, the following: 44 | 45 | i. the right to reproduce, adapt, distribute, perform, display, 46 | communicate, and translate a Work; 47 | ii. moral rights retained by the original author(s) and/or performer(s); 48 | iii. publicity and privacy rights pertaining to a person's image or 49 | likeness depicted in a Work; 50 | iv. rights protecting against unfair competition in regards to a Work, 51 | subject to the limitations in paragraph 4(a), below; 52 | v. rights protecting the extraction, dissemination, use and reuse of data 53 | in a Work; 54 | vi. database rights (such as those arising under Directive 96/9/EC of the 55 | European Parliament and of the Council of 11 March 1996 on the legal 56 | protection of databases, and under any national implementation 57 | thereof, including any amended or successor version of such 58 | directive); and 59 | vii. other similar, equivalent or corresponding rights throughout the 60 | world based on applicable law or treaty, and any national 61 | implementations thereof. 62 | 63 | 2. Waiver. To the greatest extent permitted by, but not in contravention 64 | of, applicable law, Affirmer hereby overtly, fully, permanently, 65 | irrevocably and unconditionally waives, abandons, and surrenders all of 66 | Affirmer's Copyright and Related Rights and associated claims and causes 67 | of action, whether now known or unknown (including existing as well as 68 | future claims and causes of action), in the Work (i) in all territories 69 | worldwide, (ii) for the maximum duration provided by applicable law or 70 | treaty (including future time extensions), (iii) in any current or future 71 | medium and for any number of copies, and (iv) for any purpose whatsoever, 72 | including without limitation commercial, advertising or promotional 73 | purposes (the "Waiver"). Affirmer makes the Waiver for the benefit of each 74 | member of the public at large and to the detriment of Affirmer's heirs and 75 | successors, fully intending that such Waiver shall not be subject to 76 | revocation, rescission, cancellation, termination, or any other legal or 77 | equitable action to disrupt the quiet enjoyment of the Work by the public 78 | as contemplated by Affirmer's express Statement of Purpose. 79 | 80 | 3. Public License Fallback. Should any part of the Waiver for any reason 81 | be judged legally invalid or ineffective under applicable law, then the 82 | Waiver shall be preserved to the maximum extent permitted taking into 83 | account Affirmer's express Statement of Purpose. 
In addition, to the 84 | extent the Waiver is so judged Affirmer hereby grants to each affected 85 | person a royalty-free, non transferable, non sublicensable, non exclusive, 86 | irrevocable and unconditional license to exercise Affirmer's Copyright and 87 | Related Rights in the Work (i) in all territories worldwide, (ii) for the 88 | maximum duration provided by applicable law or treaty (including future 89 | time extensions), (iii) in any current or future medium and for any number 90 | of copies, and (iv) for any purpose whatsoever, including without 91 | limitation commercial, advertising or promotional purposes (the 92 | "License"). The License shall be deemed effective as of the date CC0 was 93 | applied by Affirmer to the Work. Should any part of the License for any 94 | reason be judged legally invalid or ineffective under applicable law, such 95 | partial invalidity or ineffectiveness shall not invalidate the remainder 96 | of the License, and in such case Affirmer hereby affirms that he or she 97 | will not (i) exercise any of his or her remaining Copyright and Related 98 | Rights in the Work or (ii) assert any associated claims and causes of 99 | action with respect to the Work, in either case contrary to Affirmer's 100 | express Statement of Purpose. 101 | 102 | 4. Limitations and Disclaimers. 103 | 104 | a. No trademark or patent rights held by Affirmer are waived, abandoned, 105 | surrendered, licensed or otherwise affected by this document. 106 | b. Affirmer offers the Work as-is and makes no representations or 107 | warranties of any kind concerning the Work, express, implied, 108 | statutory or otherwise, including without limitation warranties of 109 | title, merchantability, fitness for a particular purpose, non 110 | infringement, or the absence of latent or other defects, accuracy, or 111 | the present or absence of errors, whether or not discoverable, all to 112 | the greatest extent permissible under applicable law. 113 | c. Affirmer disclaims responsibility for clearing rights of other persons 114 | that may apply to the Work or any use thereof, including without 115 | limitation any person's Copyright and Related Rights in the Work. 116 | Further, Affirmer disclaims responsibility for obtaining any necessary 117 | consents, permissions or other rights required for any use of the 118 | Work. 119 | d. Affirmer understands and acknowledges that Creative Commons is not a 120 | party to this document and has no duty or obligation with respect to 121 | this CC0 or use of the Work. 122 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | # Handbook for the Recently HCP'd 4 | Snippets of code, examples, and simple walkthroughs to help grad students/new users complete basic tasks with workbench command and common HCP pipeline output files. 5 | 6 | 7 | [![All Contributors](https://img.shields.io/badge/all_contributors-3-orange.svg?style=flat-square)](#contributors-) 8 | 9 | 10 | 11 | 12 | ## Background 13 | Working with HCP files has been an adventure. There are many resources out there and the workbench commands are very powerful. However, while attempting to complete many tasks I would simply and quickly do using fsl commands on volume files, I found myself a little confused with surface/HCP files. 
I was constantly searching for brief, simple examples or step-by-step instructions for the tasks I was trying to complete, but was unable to find them. If you are anything like me, you learn best from clear, brief, step-by-step examples that produce real outputs you can then experiment with and build upon.

This repository is meant to be a living resource for people in the position I was in: those searching for simple step-by-step instructions and example commands or code to build upon for their own work. Hopefully, it will mitigate that struggle for grad students and new workbench and HCP users.

---
## Table of Contents
- [The Most Clear Basic File Type Overview I've Found](https://mandymejia.com/2015/08/10/a-laymans-guide-to-working-with-cifti-files/) (External Link)
- [Extract Timeseries to File](md_files/timeseries_to_file.md)
- [Write Arbitrary Values to Surface Parcels](md_files/values_on_parcels.md)
- [Obtain HCP dataset with Datalad](md_files/hcp_datalad_dataset.md)

---
## Contributors ✨
Thanks to Hao-Ting for the mini "brainhack" sessions that inspired making this resource in the first place! ([emoji key](https://allcontributors.org/docs/en/emoji-key)):

- [Hao-Ting Wang](https://wanghaoting.com/): 🤔 🎨 🖋 💡
- [Rockets2theMoon](http://damiondemeter.com): 🤔 🎨 🖋 💡
- [Xihe Xie](https://axiezai.github.io): 💡
This project follows the [all-contributors](https://github.com/all-contributors/all-contributors) specification. Contributions of any kind welcome!

---
## Please Contribute!!
If you'd like to contribute any brief tutorials or tips you wish you'd known when you started working with HCP pipeline output files or workbench commands, please write something up. We have provided a [template markdown file](https://github.com/iamdamion/HCP-snippets/blob/main/md_files/template.md), and any images should be linked from the /img folder in this repository. Submit a pull request and we'll get it added ASAP. We've all learned a lot from scattered resources, but I'm hoping this can evolve into something that reduces that learning-curve pain for new grad students.

--------------------------------------------------------------------------------
/img/handbook.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/iamdamion/HCP-snippets/b73ff6fa8dfd12df09eff630717d7f63b1f6e98c/img/handbook.png
--------------------------------------------------------------------------------
/md_files/hcp_datalad_dataset.md:
--------------------------------------------------------------------------------

# Get HCP Data with Datalad

## What is this snippet for?
The HCP Open Access dataset is a collection of more than a thousand subjects' multimodal neuroimaging data. Obtaining the data through [https://db.humanconnectome.org/](https://db.humanconnectome.org/) requires thorough navigation and querying of the web interface, followed by a painfully long download process for large collections.

[Datalad](http://handbook.datalad.org/en/latest/) is an open-source data management tool that simplifies the data download process. The HCP dataset hosted on Amazon S3 buckets has been converted to a datalad dataset and is remotely hosted on GitHub at: [https://github.com/datalad-datasets/human-connectome-project-openaccess](https://github.com/datalad-datasets/human-connectome-project-openaccess).

## Why do we need this?
Instead of downloading huge images for a large collection of subjects, you can clone this GitHub remote and use datalad to obtain the entire dataset as lightweight symbolic links pointing to the dataset's origin (an Amazon S3 bucket). So instead of downloading TBs of data in one go, you simply download a few MBs of symbolic links, and you can explore the dataset on your computer as if you already have all the data. Once you've found the specific files you'd like to obtain, you can get the full file with `datalad get`.

## Code example!
See the GitHub repo for instructions on how to obtain a `datalad` dataset: [https://github.com/datalad-datasets/human-connectome-project-openaccess](https://github.com/datalad-datasets/human-connectome-project-openaccess)

To summarize (a minimal code sketch follows below):

- Install `datalad` with either `pip` or `conda` (see the handbook below if you've never done this).

- Obtain your [ConnectomeDB](https://db.humanconnectome.org) account and enable Amazon S3 bucket access for your Amazon account so you can obtain your own access key ID and secret access key.

- Clone the dataset with `datalad clone https://github.com/datalad-datasets/human-connectome-project-openaccess.git`

- Each subject folder and its corresponding data folders are "subdatasets"; you can obtain specific files with `datalad get <path to a subdataset>` or `datalad get <path to a specific file>`.

- The first time you access an Amazon S3 bucket, you will be prompted to enter the access key ID and secret key obtained in the earlier step.

For people unfamiliar with `datalad`, check out their wonderful handbook: [http://handbook.datalad.org/en/latest/](http://handbook.datalad.org/en/latest/)
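For anyone who prefers to script it, here is a minimal sketch of the workflow summarized above, shelling out to the datalad CLI from Python (the same two commands can simply be typed in a terminal). The clone URL is the repository above; the path passed to `datalad get` is a hypothetical example, so browse the cloned tree to find the actual subject files you need.

```python
import subprocess

# clone the lightweight dataset: a few MBs of symbolic links, not the imaging data
clone_url = 'https://github.com/datalad-datasets/human-connectome-project-openaccess.git'
subprocess.run(['datalad', 'clone', clone_url, 'hcp-openaccess'], check=True)

# hypothetical example path - replace it with whatever you find while exploring the
# cloned tree; the first `datalad get` against the S3 bucket will prompt for the
# AWS access key ID and secret key tied to your ConnectomeDB account
example_path = 'HCP1200/100307/MNINonLinear/Results'
subprocess.run(['datalad', 'get', example_path], cwd='hcp-openaccess', check=True)
```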
### Example Outputs
See [https://github.com/datalad-datasets/human-connectome-project-openaccess](https://github.com/datalad-datasets/human-connectome-project-openaccess) for all files available through the GitHub remote host.
--------------------------------------------------------------------------------
/md_files/template.md:
--------------------------------------------------------------------------------

# SHORT AND CLEAR TITLE GOES HERE

## What is this snippet for?
WALKTHROUGH TEXT HERE

## Why do we need this?
BRIEF OVERVIEW TEXT HERE

## Code example!
### Step I: PLEASE SEPARATE INTO CLEAR STEPS
```python

# python code would go here

```

```bash

# you can specify language for clarity with markdown

```

### Example Outputs
EXAMPLES OF OUTPUTS WOULD GO HERE. PLEASE LINK IMAGES FROM THE /img FOLDER OF THIS REPOSITORY.
--------------------------------------------------------------------------------
/md_files/timeseries_to_file.md:
--------------------------------------------------------------------------------

# Extract Timeseries to File Using Surface Parcels

## What is this snippet for?
A simple walkthrough and example code (Python) to extract timeseries vectors from a dense timeseries (dtseries) file, using a parcel set's dlabel file.

## Why do we need this?
Even the most basic needs of any analysis are a bit different when using surface files. Here is an example of the most basic of scenarios: you have a dense timeseries file created by the core HCP pipeline (or any offshoot, such as the [DCAN lab](https://collection3165.readthedocs.io/en/stable/pipeline/) pipeline or [fMRIPrep](https://www.nature.com/articles/s41592-018-0235-4)). Below is how you would:
1. Create a ptseries (parcellated timeseries) file. This is the average value of all vertices within each surface parcel, for every TR (repetition time).
2. Extract the timeseries vectors from that ptseries file (to a .txt file for use in any script), using the 333 cortical surface parcels of the Gordon 333 parcellation set.

Note: This ptseries (parcellated timeseries) file can also be used with other workbench commands, etc.

## Code example!
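A quick note before the steps below: these snippets shell out to `wb_command`, so they assume Connectome Workbench is installed and on your PATH. A minimal sanity check (nothing HCP-specific, just an assumption worth verifying first):

```python
import shutil
import subprocess

# wb_command must be discoverable on PATH for every snippet in this handbook
if shutil.which('wb_command') is None:
    raise RuntimeError('wb_command not found - install Connectome Workbench '
                       'and/or add it to your PATH first.')

# optional: print the version so you know which workbench build you are using
subprocess.run(['wb_command', '-version'], check=True)
```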
### Step I: Create a ptseries (parcellated timeseries file) from the dtseries
```python
import subprocess

# the dtseries file you want to extract
dt_path = '/your_dense_timeseries_file.dtseries.nii'
# whatever parcel set you want to use
parc_path = '/Gordon333_Parcels_LR.dlabel.nii'
# output path to the ptseries you want to create
pt_path = '/your_parcellated_timeseries_file.ptseries.nii'

pt_comm = ' '.join(['wb_command',
                    '-cifti-parcellate',
                    dt_path,
                    parc_path,
                    'COLUMN',
                    pt_path
                    ])
try:
    # check=True makes a failed wb_command call raise CalledProcessError
    subprocess.run(pt_comm, shell=True, check=True)
except subprocess.CalledProcessError as err:
    print('ERROR:', err)
```

### Step II: Extract the timeseries of average vertex values within each parcel to a .txt file.

```python
# output path to the .txt - this will have one vector per line.
# NOTE: Each line is a parcel (in the order the dlabel file has them labeled).
# Each vector has one value for each TR of the scan.
tc_path = '/your_timeseries_text_file.txt'

tcext_comm = ' '.join(['wb_command',
                       '-cifti-convert',
                       '-to-text',
                       pt_path,
                       tc_path
                       ])
try:
    subprocess.run(tcext_comm, shell=True, check=True)
except subprocess.CalledProcessError as err:
    print('ERROR:', err)
```

### Example Outputs
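The resulting .txt file is plain text, so you can read it straight back into Python. A minimal sketch of inspecting it, assuming `numpy` is installed and reusing the placeholder output path from Step II (for the Gordon 333 parcel set you should see 333 rows, one per parcel):

```python
import numpy as np

# each row is one parcel's timeseries; each column is one TR
# (pass delimiter=... if your workbench version writes something other than whitespace)
ts = np.loadtxt('/your_timeseries_text_file.txt')
print(ts.shape)   # expected: (333, number_of_TRs)

# e.g., a quick parcel-by-parcel functional connectivity matrix
conn_mat = np.corrcoef(ts)
print(conn_mat.shape)   # (333, 333)
```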
--------------------------------------------------------------------------------
/md_files/values_on_parcels.md:
--------------------------------------------------------------------------------

# Populate surface parcels with arbitrary values

## What is this snippet for?
A simple walkthrough and example code (Python) to create a "template" surface pscalar, then populate the Gordon 333 surface parcels with arbitrary values.
(Good for visualization of any metric or value you are computing on parcels.)

## Why do we need this?
For one project I was identifying parcels of interest. To show the overlap across a group of subjects, I needed to place a number on each parcel representing how many times that parcel was a parcel of interest across all subjects in my group. This example code allows you to place arbitrary values on any parcel, which you can then bring into Workbench Viewer to make a nice figure, including a color map for your results, etc.

This example will:
1. Show you how to create a "blank" pscalar (parcellated scalar surface file) using a source surface scalar file and a parcel set of your choosing.
2. Explain the text file needed to place values onto each parcel in the set you've chosen.
3. Place those values onto the template pscalar file you made and save it out as a new pscalar.nii file.

This file can then be brought into Workbench View to make figures.

## Code example!
### Step I: Make a TEMPLATE pscalar from the original dlabel (this really only needs to be done once)
```python
import subprocess

import numpy as np

# any dscalar in the surface space you want to use
hcp1200_dscalar_source = '/S1200.sulc_MSMAll.32k_fs_LR.dscalar.nii'
# whatever parcel set you want to use
gordon_dlabel = '/Gordon333_Parcels_LR.dlabel.nii'
# the template you want to put values onto going forward
out_pscalar_TEMPLATE = '/Gordon333_TEMPLATE.pscalar.nii'

# the wb_command
wb_comm = ' '.join(['wb_command',
                    '-cifti-parcellate',
                    hcp1200_dscalar_source,
                    gordon_dlabel,
                    'COLUMN',
                    out_pscalar_TEMPLATE
                    ])
subprocess.run(wb_comm, shell=True, check=True)
```

### Step II: Make and save whatever values.
- A 333-element vector (each line of the text file is one parcel, 1-333; careful of Python's 0 indexing).
- Save it out as a text file, one value per line.

```python
value_vect_out = '/TEST_values.txt'
# making a manual text file just for testing here
the_vect = np.zeros(333, dtype=int).tolist()
the_vect[0] = 1
the_vect[161] = 162
the_vect[116] = 117
the_vect[278] = 279
with open(value_vect_out, 'w') as f:
    for item in the_vect:
        f.write(str(item) + '\n')
```

### Step III: Write that vector of values to a pscalar file for wb_view!!

```python
output_pscalar = '/YOUR_OUTPUT_G333.pscalar.nii'
wb_comm = ' '.join(['wb_command',
                    '-cifti-convert',
                    '-from-text',
                    value_vect_out,
                    out_pscalar_TEMPLATE,
                    output_pscalar
                    ])
subprocess.run(wb_comm, shell=True, check=True)
```

### Example Outputs

--------------------------------------------------------------------------------