├── course ├── static │ ├── readme.md │ ├── GitHub_fork.png │ └── repronim_logo.png ├── _toc.yml ├── materials.md ├── outline.md ├── conf.py ├── index.md ├── CoC.md ├── _config.yml ├── dei.md ├── overview.md ├── setup.md └── exercise.md ├── pics ├── SEC_Alsh.png ├── GitHub_fork.png └── pull_request.png ├── .gitignore ├── requirements.txt ├── Exercise ├── Groups │ ├── Group_01 │ ├── Group_02 │ ├── Group_03 │ ├── Group_04 │ ├── Group_05 │ ├── Group_06 │ ├── Group_07 │ ├── Group_08 │ ├── Group_09 │ ├── Group_10 │ ├── Group_11 │ ├── Group_12 │ ├── Group_13 │ ├── Group_14 │ ├── Group_15 │ ├── Group_16 │ ├── Group_17 │ ├── Group_18 │ ├── Group_19 │ └── Group_20 └── bidsmri2nidm.txt ├── JupyterHub └── README.md ├── stats ├── README.md ├── ENIGMA_cohensD_functions.R ├── OpenNeuro_results.csv ├── demog_groupStats_loop_OHBM.Rmd ├── mass_uv_regr_loop_OHBM_simplified.Rmd ├── metr_SubCortical_OHBM.csv └── all4sites_data_OHBM.csv ├── .github └── workflows │ └── book.yml ├── scripts └── data_subset.py ├── README.md └── LICENSE /course/static/readme.md: -------------------------------------------------------------------------------- 1 | # Home of course related graphics 2 | -------------------------------------------------------------------------------- /pics/SEC_Alsh.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ReproNim/OHBMEducation-2022/main/pics/SEC_Alsh.png -------------------------------------------------------------------------------- /pics/GitHub_fork.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ReproNim/OHBMEducation-2022/main/pics/GitHub_fork.png -------------------------------------------------------------------------------- /pics/pull_request.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ReproNim/OHBMEducation-2022/main/pics/pull_request.png 
-------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | # ignore local build files 2 | course/_build/* 3 | 4 | # ignore macos .DS_Store files 5 | .DS_Store -------------------------------------------------------------------------------- /course/static/GitHub_fork.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ReproNim/OHBMEducation-2022/main/course/static/GitHub_fork.png -------------------------------------------------------------------------------- /course/static/repronim_logo.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ReproNim/OHBMEducation-2022/main/course/static/repronim_logo.png -------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- 1 | myst-nb==0.12.0 2 | myst-parser==0.13.6 3 | jupyter-book==0.11.1 4 | nbclient==0.5.1 5 | jinja2==3.0.0 6 | jupytext -------------------------------------------------------------------------------- /Exercise/Groups/Group_01: -------------------------------------------------------------------------------- 1 | Group 01 2 | sub-RC4101 3 | sub-RC4103 4 | sub-RC4106 5 | sub-RC4107 6 | sub-RC4109 7 | sub-RC4201 8 | sub-RC4204 9 | sub-RC4205 10 | sub-RC4206 11 | sub-RC4207 12 | -------------------------------------------------------------------------------- /Exercise/Groups/Group_02: -------------------------------------------------------------------------------- 1 | Group 02 2 | sub-RC4110 3 | sub-RC4111 4 | sub-RC4112 5 | sub-RC4113 6 | sub-RC4114 7 | sub-RC4208 8 | sub-RC4210 9 | sub-RC4211 10 | sub-RC4212 11 | sub-RC4213 12 | -------------------------------------------------------------------------------- /Exercise/Groups/Group_03: 
-------------------------------------------------------------------------------- 1 | Group 03 2 | sub-RC4115 3 | sub-RC4116 4 | sub-RC4117 5 | sub-RC4118 6 | sub-RC4119 7 | sub-RC4214 8 | sub-RC4215 9 | sub-RC4217 10 | sub-RC4218 11 | sub-RC4219 12 | -------------------------------------------------------------------------------- /Exercise/Groups/Group_04: -------------------------------------------------------------------------------- 1 | Group 04 2 | sub-RC4120 3 | sub-RC4121 4 | sub-RC4122 5 | sub-RC4123 6 | sub-RC4125 7 | sub-RC4220 8 | sub-RC4221 9 | sub-RC4224 10 | sub-RC4225 11 | sub-RC4226 12 | -------------------------------------------------------------------------------- /Exercise/Groups/Group_05: -------------------------------------------------------------------------------- 1 | Group 05 2 | sub-RC4126 3 | sub-RC4128 4 | sub-RC4129 5 | sub-RC4130 6 | sub-RC4131 7 | sub-RC4201 8 | sub-RC4204 9 | sub-RC4205 10 | sub-RC4206 11 | sub-RC4227 12 | -------------------------------------------------------------------------------- /Exercise/Groups/Group_06: -------------------------------------------------------------------------------- 1 | Group 06 2 | sub-RC4103 3 | sub-RC4106 4 | sub-RC4107 5 | sub-RC4109 6 | sub-RC4110 7 | sub-RC4207 8 | sub-RC4208 9 | sub-RC4210 10 | sub-RC4211 11 | sub-RC4212 12 | -------------------------------------------------------------------------------- /Exercise/Groups/Group_07: -------------------------------------------------------------------------------- 1 | Group 07 2 | sub-RC4111 3 | sub-RC4112 4 | sub-RC4113 5 | sub-RC4114 6 | sub-RC4115 7 | sub-RC4213 8 | sub-RC4214 9 | sub-RC4215 10 | sub-RC4217 11 | sub-RC4218 12 | -------------------------------------------------------------------------------- /Exercise/Groups/Group_08: -------------------------------------------------------------------------------- 1 | Group 08 2 | sub-RC4116 3 | sub-RC4117 4 | sub-RC4118 5 | sub-RC4119 6 | sub-RC4120 7 | sub-RC4219 8 | sub-RC4220 9 | 
sub-RC4221 10 | sub-RC4224 11 | sub-RC4225 12 | -------------------------------------------------------------------------------- /Exercise/Groups/Group_09: -------------------------------------------------------------------------------- 1 | Group 09 2 | sub-RC4121 3 | sub-RC4122 4 | sub-RC4123 5 | sub-RC4125 6 | sub-RC4126 7 | sub-RC4201 8 | sub-RC4204 9 | sub-RC4205 10 | sub-RC4226 11 | sub-RC4227 12 | -------------------------------------------------------------------------------- /Exercise/Groups/Group_10: -------------------------------------------------------------------------------- 1 | Group 10 2 | sub-RC4101 3 | sub-RC4128 4 | sub-RC4129 5 | sub-RC4130 6 | sub-RC4131 7 | sub-RC4206 8 | sub-RC4207 9 | sub-RC4208 10 | sub-RC4210 11 | sub-RC4211 12 | -------------------------------------------------------------------------------- /Exercise/Groups/Group_11: -------------------------------------------------------------------------------- 1 | Group 11 2 | sub-RC4106 3 | sub-RC4107 4 | sub-RC4109 5 | sub-RC4110 6 | sub-RC4111 7 | sub-RC4212 8 | sub-RC4213 9 | sub-RC4214 10 | sub-RC4215 11 | sub-RC4217 12 | -------------------------------------------------------------------------------- /Exercise/Groups/Group_12: -------------------------------------------------------------------------------- 1 | Group 12 2 | sub-RC4112 3 | sub-RC4113 4 | sub-RC4114 5 | sub-RC4115 6 | sub-RC4116 7 | sub-RC4218 8 | sub-RC4219 9 | sub-RC4220 10 | sub-RC4221 11 | sub-RC4224 12 | -------------------------------------------------------------------------------- /Exercise/Groups/Group_13: -------------------------------------------------------------------------------- 1 | Group 13 2 | sub-RC4117 3 | sub-RC4118 4 | sub-RC4119 5 | sub-RC4120 6 | sub-RC4121 7 | sub-RC4201 8 | sub-RC4204 9 | sub-RC4225 10 | sub-RC4226 11 | sub-RC4227 12 | -------------------------------------------------------------------------------- /Exercise/Groups/Group_14: 
-------------------------------------------------------------------------------- 1 | Group 14 2 | sub-RC4122 3 | sub-RC4123 4 | sub-RC4125 5 | sub-RC4126 6 | sub-RC4128 7 | sub-RC4205 8 | sub-RC4206 9 | sub-RC4207 10 | sub-RC4208 11 | sub-RC4210 12 | -------------------------------------------------------------------------------- /Exercise/Groups/Group_15: -------------------------------------------------------------------------------- 1 | Group 15 2 | sub-RC4101 3 | sub-RC4103 4 | sub-RC4129 5 | sub-RC4130 6 | sub-RC4131 7 | sub-RC4211 8 | sub-RC4212 9 | sub-RC4213 10 | sub-RC4214 11 | sub-RC4215 12 | -------------------------------------------------------------------------------- /Exercise/Groups/Group_16: -------------------------------------------------------------------------------- 1 | Group 16 2 | sub-RC4107 3 | sub-RC4109 4 | sub-RC4110 5 | sub-RC4111 6 | sub-RC4112 7 | sub-RC4217 8 | sub-RC4218 9 | sub-RC4219 10 | sub-RC4220 11 | sub-RC4221 12 | -------------------------------------------------------------------------------- /Exercise/Groups/Group_17: -------------------------------------------------------------------------------- 1 | Group 17 2 | sub-RC4113 3 | sub-RC4114 4 | sub-RC4115 5 | sub-RC4116 6 | sub-RC4117 7 | sub-RC4201 8 | sub-RC4224 9 | sub-RC4225 10 | sub-RC4226 11 | sub-RC4227 12 | -------------------------------------------------------------------------------- /Exercise/Groups/Group_18: -------------------------------------------------------------------------------- 1 | Group 18 2 | sub-RC4118 3 | sub-RC4119 4 | sub-RC4120 5 | sub-RC4121 6 | sub-RC4122 7 | sub-RC4204 8 | sub-RC4205 9 | sub-RC4206 10 | sub-RC4207 11 | sub-RC4208 12 | -------------------------------------------------------------------------------- /Exercise/Groups/Group_19: -------------------------------------------------------------------------------- 1 | Group 19 2 | sub-RC4123 3 | sub-RC4125 4 | sub-RC4126 5 | sub-RC4128 6 | sub-RC4129 7 | sub-RC4210 8 | sub-RC4211 9 | 
sub-RC4212 10 | sub-RC4213 11 | sub-RC4214 12 | -------------------------------------------------------------------------------- /Exercise/Groups/Group_20: -------------------------------------------------------------------------------- 1 | Group 20 2 | sub-RC4101 3 | sub-RC4103 4 | sub-RC4106 5 | sub-RC4130 6 | sub-RC4131 7 | sub-RC4215 8 | sub-RC4217 9 | sub-RC4218 10 | sub-RC4219 11 | sub-RC4220 12 | -------------------------------------------------------------------------------- /course/_toc.yml: -------------------------------------------------------------------------------- 1 | root: index 2 | format: jb-book 3 | chapters: 4 | - file: dei 5 | - file: overview 6 | - file: setup 7 | - file: outline 8 | - file: materials 9 | - file: exercise 10 | - file: CoC -------------------------------------------------------------------------------- /JupyterHub/README.md: -------------------------------------------------------------------------------- 1 | # JupyterHub for the Education Course 2 | We will be using a JupyterHub instance for the course in order to provide a common, comprehensive, and accessible computing platform. 3 | 4 | ## Details 5 | We will run a variant of the [JupyterHub](https://jupyter.org/hub) used by the [ABCD-ReproNim course](https://www.abcd-repronim.org/). The instructions they used can be found 6 | at https://github.com/ABCD-ReproNim/reprohub/. 7 | -------------------------------------------------------------------------------- /stats/README.md: -------------------------------------------------------------------------------- 1 | 2 | To set up your regression model to calculate Cohen's d effect sizes for case/control differences between brain volumes, you will need to edit the input to the following notebook: 3 | mass_uv_regr_loop_OHBM_simplified.Rmd 4 | 5 | We will go through this step by step in the course.
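For orientation, the core effect-size arithmetic implemented by the R helpers in `ENIGMA_cohensD_functions.R` (Cohen's d from an unpaired t statistic, its standard error, and a 95% confidence interval) can be sketched in Python. This is a minimal illustrative translation with made-up example numbers, not part of the course materials — the course itself works through the R notebooks named above:

```python
import math

def cohens_d_unpaired(t_val, n1, n2):
    """Cohen's d from an unpaired t statistic (mirrors d.t.unpaired)."""
    return t_val * math.sqrt((n1 + n2) / (n1 * n2))

def se_d(d, n1, n2):
    """Standard error of d (mirrors se.d2)."""
    return math.sqrt((n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2 - 2)))

def ci_95(es, se):
    """95% confidence interval around an effect size (mirrors CI_95)."""
    return es - 1.96 * se, es + 1.96 * se

# Illustrative numbers only: t = 2.5 from a case/control volume comparison
# with 40 cases and 45 controls.
d = cohens_d_unpaired(2.5, 40, 45)
lower, upper = ci_95(d, se_d(d, 40, 45))
print(f"d = {d:.3f}, 95% CI [{lower:.3f}, {upper:.3f}]")
```

The function names mirror the R helpers (`d.t.unpaired`, `se.d2`, `CI_95`) so results can be cross-checked between the two languages.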
6 | 7 | We will also plot histograms of the data for each site in the notebook: 8 | demog_groupStats_loop_OHBM.Rmd 9 | 10 | You will have the option of expanding these notebooks for future work. 11 | -------------------------------------------------------------------------------- /stats/ENIGMA_cohensD_functions.R: -------------------------------------------------------------------------------- 1 | #/* 2 | # * ENIGMA_DTI 2014. 3 | # */ 4 | #################################################################### 5 | 6 | 7 | 8 | ###Functions used in the code for Cohens d 9 | d.t.unpaired<-function(t.val,n1,n2){ 10 | d<-t.val*sqrt((n1+n2)/(n1*n2)) 11 | names(d)<-"effect size d" 12 | return(d) 13 | } 14 | 15 | partial.d<-function(t.val,df,n1,n2){ 16 | d<-t.val*(n1+n2)/(sqrt(n1*n2)*sqrt(df)) 17 | names(d)<-"effect size d" 18 | return(d) 19 | } 20 | 21 | CI_95<-function(ES,se){ 22 | ci<-c((ES-(1.96)*se),(ES+(1.96)*se)) 23 | names(ci)<-c("95% CI lower","95% CI upper") 24 | return(ci) 25 | } 26 | 27 | se.d2<-function(d,n1,n2){ 28 | se<-sqrt((n1+n2)/(n1*n2)+(d^2)/(2*(n1+n2-2))) 29 | names(se)<-"se for d" 30 | return(se) 31 | } 32 | ###### 33 | unweightedMean <- function(xL, xR) { 34 | xM <- (xL + xR)/2 35 | output=list("meanLR"=xM) 36 | return(output) 37 | } 38 | 39 | -------------------------------------------------------------------------------- /.github/workflows/book.yml: -------------------------------------------------------------------------------- 1 | name: deploy-book 2 | 3 | # Only run this when the main branch changes 4 | on: 5 | push: 6 | branches: 7 | - main 8 | # If your git repository has the Jupyter Book within some-subfolder next to 9 | # unrelated files, you can make this run only if a file within that specific 10 | # folder has been modified. 
11 | # 12 | # paths: 13 | # - content/** 14 | 15 | # This job installs dependencies, builds the book, and pushes it to `gh-pages` 16 | jobs: 17 | deploy-book: 18 | runs-on: ubuntu-latest 19 | steps: 20 | - uses: actions/checkout@v2 21 | 22 | # Install dependencies 23 | - name: Set up Python 3.7 24 | uses: actions/setup-python@v1 25 | with: 26 | python-version: 3.7 27 | 28 | - name: Install dependencies 29 | run: | 30 | pip install -r requirements.txt 31 | # Build the page 32 | - name: Build the book 33 | run: | 34 | jupyter-book build course/ 35 | # Push the book's HTML to github-pages 36 | - name: GitHub Pages action 37 | uses: peaceiris/actions-gh-pages@v3.6.1 38 | with: 39 | github_token: ${{ secrets.GITHUB_TOKEN }} 40 | publish_dir: ./course/_build/html 41 | -------------------------------------------------------------------------------- /scripts/data_subset.py: -------------------------------------------------------------------------------- 1 | import os, shutil 2 | from pathlib import Path 3 | 4 | 5 | def select_subjects(group):
6 | """Read the file listing the selected subjects for a group.""" 7 | group_formatted = f"{int(group):02d}" 8 | selections_file = Path(os.path.dirname(__file__)) / "../Exercise/Groups/" / f"Group_{group_formatted}" 9 | with selections_file.open() as f: 10 | selected = f.readlines() 11 | selected_sub = [el.rstrip() for el in selected[1:]] 12 | return selected_sub 13 | 14 | 15 | def participate_file_update(dirname, selected_sub): 16 | """Update the participants.tsv file, keeping only the selected subjects.""" 17 | participant_file = Path(dirname) / "participants.tsv" 18 | participant_orig = Path(dirname) / "participants_orig.tsv" 19 | participant_file.rename(participant_orig) 20 | 21 | # reading an original list of the participants 22 | with participant_orig.open() as f: 23 | all_lines = f.readlines() 24 | os.remove(participant_orig) 25 | 26 | # removing most of the lines from the list 27 | selected_lines = [] 28 | for el in all_lines: 29 | if el.startswith("sub-"): 30 | subj =
el.split("\t")[0] 31 | if subj in selected_sub: 32 | selected_lines.append(el) 33 | else: 34 | selected_lines.append(el) 35 | 36 | # saving the subset of the lines 37 | with participant_file.open("w") as f: 38 | f.writelines(selected_lines) 39 | 40 | 41 | def main(dirname, group): 42 | selected_sub = select_subjects(group) 43 | # removing directories 44 | dirs_all = os.listdir(dirname) 45 | for dir in dirs_all: 46 | if dir.startswith("sub-") and dir not in selected_sub: 47 | shutil.rmtree(Path(dirname) / dir) 48 | 49 | # removing additional files 50 | os.remove(Path(dirname) / "nidm.ttl") 51 | os.remove(Path(dirname) / "demographics.csv") 52 | 53 | # updating participant file 54 | participate_file_update(dirname, selected_sub) 55 | 56 | 57 | if __name__ == '__main__': 58 | from argparse import ArgumentParser, RawTextHelpFormatter 59 | parser = ArgumentParser(description=__doc__, 60 | formatter_class=RawTextHelpFormatter) 61 | parser.add_argument("-g", dest="group", 62 | help="group id") 63 | parser.add_argument("-d", dest="data_dir", 64 | help="path to the data", 65 | default="/home/jovyan/my_data/my_ds001907-EDU") 66 | 67 | args = parser.parse_args() 68 | main(dirname=args.data_dir, group=args.group) 69 | 70 | 71 | -------------------------------------------------------------------------------- /course/materials.md: -------------------------------------------------------------------------------- 1 | # Course materials 2 | 3 | ## Introduction to the course 4 | 5 | ```{tabbed} slides 6 | 7 | You can find an interactive version of the slides below. 8 | 9 | 10 | 11 | ``` 12 | ```{tabbed} recording 13 | Recording will be added here. 14 | ``` 15 | 16 | ## What is a ReproPub and why would we want one? 17 | 18 | ```{tabbed} slides 19 | Slides will be added here. 20 | ``` 21 | ```{tabbed} recording 22 | Recording will be added here. 23 | ``` 24 | 25 | 26 | ## Introduction to the publication you are going to do 27 | 28 | ```{tabbed} slides 29 | Slides will be added here. 
30 | ``` 31 | ```{tabbed} recording 32 | Recording will be added here. 33 | ``` 34 | 35 | ## What are containers, in general 36 | 37 | ```{tabbed} slides 38 | Slides will be added here. 39 | ``` 40 | ```{tabbed} recording 41 | Recording will be added here. 42 | ``` 43 | 44 | ## How to make a specific “simple container” 45 | 46 | ```{tabbed} slides 47 | 48 | You can find an interactive version of the slides below. 49 | 50 | 51 | 52 | ``` 53 | ```{tabbed} recording 54 | Recording will be added here. 55 | ``` 56 | 57 | ## Brief introduction to `DataLad` 58 | 59 | ```{tabbed} slides 60 | 61 | You can find an interactive version of the slides below or find the original version [here](https://jsheunis.github.io/ohbm-2022/talks/ohbm-2022-educational-jsheunis.html#/ 62 | ). 63 | 64 | 66 | 67 | ``` 68 | ```{tabbed} recording 69 | Recording will be added here. 70 | ``` 71 | 72 | ## “Just do it” aka exercise 73 | 74 | ```{tabbed} slides 75 | Slides will be added here. 76 | ``` 77 | ```{tabbed} recording 78 | Recording will be added here. 79 | ``` 80 | 81 | ## Meta and mega analysis of your results 82 | 83 | ```{tabbed} slides 84 | Slides will be added here. 85 | ``` 86 | ```{tabbed} recording 87 | Recording will be added here. 88 | ``` 89 | 90 | ## Recap and Summary 91 | 92 | ```{tabbed} slides 93 | Slides will be added here. 94 | ``` 95 | ```{tabbed} recording 96 | Recording will be added here. 97 | ``` 98 | -------------------------------------------------------------------------------- /course/outline.md: -------------------------------------------------------------------------------- 1 | # Outline 2 | The [overview]() already specified it, but just to be sure: the course will be split into two distinct parts. While the first half will consist of going through important `basics in writing re-executable publications`, the second half will entail working through a hands-on example to provide participants with first practical experience regarding the introduced concepts. 
As mentioned before, this `Jupyter Book` will contain all materials utilized in the course. 3 | 4 | ### When and where do we meet? 5 | 6 | As mentioned in the [overview]() section, the course will take place via two distinct options: in-person, 19/06/2022, 1-5 PM BST & virtually. The respective links will be provided for registered participants shortly before the course. 7 | 8 | ### Schedule 9 | In general, we will aim for a 5-hour course. The content of the respective sessions will span everything from `interactive lectures` to `hands-on tutorials` and will in part entail a mixture of the two. The different parts are roughly indicated in the schedule below like this: 10 | 11 | 💡 - input from the instructor(s) 12 | 👩🏽‍🏫 - instructor(s) presents content 13 | 👨🏻‍💻🧑🏾‍💻 - hands-on computational work (e.g., coding) 14 | 15 | 16 | Our **very optimistic** schedule looks as follows (all times in BST): 17 | 18 | | Time slot | Topic | Instructor(s) | 19 | |--------------|:-----:| ---- | 20 | | 1 - 1:05 PM | [Introduction to the course](http://www.repronim.org/OHBMEducation-2022/basics/introduction.html) 💡👩🏽‍🏫 | [Dave Kennedy]() & [JB Poline]() | 21 | | 1:05 - 1:20 PM | [What is a ReproPub and why would we want one?](http://www.repronim.org/OHBMEducation-2022/basics/reproducible_neuroimaging.html) 💡👩🏽‍🏫 | [Julie Bates ]() | 22 | | 1:20 - 1:35 PM | [Introduction to the publication you are going to do](http://www.repronim.org/OHBMEducation-2022/basics/version_control_code.html) 💡👩🏽‍🏫 👨🏻‍💻🧑🏾‍💻 | [Neda Jahanshad]() | 23 | | 1:35 - 1:50 PM | [What are containers, in general](http://www.repronim.org/OHBMEducation-2022/basics/virtualization.html) 💡👩🏽‍🏫 | [Dorota Jarecka ]() | 24 | | 1:50 - 2:05 PM | [How to make a specific “simple container” ](http://www.repronim.org/OHBMEducation-2022/basics/standardization.html) 💡👩🏽‍🏫 (👨🏻‍💻🧑🏾‍💻) | [Peer Herholz]() | 25 | | 2:05 - 2:20 PM | [Brief introduction to `DataLad`](http://www.repronim.org/OHBMEducation-2022/advanced/recap.html) 💡👩🏽‍🏫 |
[Stephan Heunis ]()| 26 | | 2:20 - 4:30 PM| [“Just do it”](http://www.repronim.org/OHBMEducation-2022/advanced/version_control_data.html) 💡👩🏽‍🏫👨🏻‍💻🧑🏾‍💻 | All instructors | 27 | | 4:30 - 4:45 PM| [Meta and mega analysis of your results](http://www.repronim.org/OHBMEducation-2022/advanced/meta_data.html) 💡👩🏽‍🏫 | [Dave Kennedy]() & [Neda Jahanshad]() | 28 | | 4:45 - 5 PM| [Recap and Summary](http://www.repronim.org/OHBMEducation-2022/advanced/workflow.html) 💡👩🏽‍🏫| [Dave Kennedy]() & [JB Poline]() | 29 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # OHBMEducation-2022 2 | Repo for the [2022 OHBM Education](https://www.humanbrainmapping.org/i4a/pages/index.cfm?pageid=4055) 1/2 day course entitled: ["How to Write a Re-Executable Publication"](https://www.humanbrainmapping.org/files/2022/2022%20Annual%20Meeting/%231162_Education_Course_Half_Day_-_How_to_Write_a_Re-executable_Publication.pdf) 3 | 4 | **When**: June 19th, 1pm-5pm local time 5 | 6 | **Where**: Glasgow, Scotland, Scottish Event Campus (SEC), Alsh Room 7 | 8 | 9 | ![picture](pics/SEC_Alsh.png) 10 | 11 | # Objective 12 | The goal of this 1/2 day (4 hour) course is to have the students **DO** a re-executable publication. 13 | 14 | # Instructors (for the credit) 15 | * [Julie Bates](https://profiles.umassmed.edu/display/11661391) - University of Massachusetts Chan Medical School 16 | * [Dorota Jarecka](https://gablab.mit.edu/team/jarecka-dorota/) - MIT 17 | * [Peer Herholz](https://peerherholz.github.io/) - McGill and ???
18 | * [Stephan Heunis](https://jsheunis.github.io/) - Institute of Neuroscience and Medicine, Forschungszentrum Jülich 19 | * [Neda Jahanshad](https://keck.usc.edu/faculty-search/neda-jahanshad/) - University of Southern California 20 | * [Stephen Strother](TODO) 21 | 22 | # Organizers (for the blame) 23 | * [David Kennedy](https://profiles.umassmed.edu/display/130002) - University of Massachusetts Chan Medical School 24 | * [JB Poline](https://www.mcgill.ca/neuro/jean-baptiste-poline-phd) - McGill 25 | 26 | # Prerequisites 27 | * A computer with internet and browser 28 | * A GitHub Account (Need one? Go [here](https://github.com/signup?ref_cta=Sign+up&ref_loc=header+logged+out&ref_page=%2F&source=header-home)!) 29 | 30 | 31 | # Tentative Schedule 32 | 1:00 - 1:05 (5 minutes) Introduction to the course (DNK/JBP) - [Link to slides](https://docs.google.com/presentation/d/1-PXARrXyHaMOStJNLLt47pPA26EdA1pDnAc9dLB0aik/edit?usp=sharing) - Link to video 33 | 34 | 1:05 - 1:20 (15 minutes) What is a ReproPub and why would we want one? (JFB) - Link to slides - Link to video 35 | 36 | 1:20 - 1:35 (15 minutes) Introduction to the publication you are going to do (NJ) - Link to slides - Link to video 37 | * ENIGMA has some Parkinson’s Disease (PD) results 38 | * You have some ‘new’ PD data 39 | * Do you see what ENIGMA did? 40 | 41 | 1:35 - 1:50 (15 minutes) What are containers, in general (DJ) - Link to slides - Link to video 42 | 43 | 1:50 - 2:05 (15 minutes) How to make a specific “simple container” (containing FSL) (PH) - Link to slides - Link to video 44 | 45 | 2:05 - 2:20 (15 minutes) Brief intro to DataLad (aka DataLad does it all...)
(SH) - [Link to slides](https://jsheunis.github.io/ohbm-2022/talks/ohbm-2022-educational-jsheunis.html#/) - [Link to video](https://www.youtube.com/watch?v=s1zrB_sDbDU) 46 | * installing data 47 | * running containers 48 | * publishing results 49 | 50 | 2:20 - 4:30 “Just do it” - [Exercise](Exercise/README.md) 51 | * Here are your command lines 52 | * Here is the JupyterHub 53 | * Do it, lots of ReproStaff around to help answer questions or problems 54 | 55 | 2:30 - 2:45 Break 56 | 57 | 4:30 - 4:45 (15 minutes) Meta and mega analysis of your results (NJ+DNK) - Link to slides - Link to video 58 | 59 | 4:45 - 5:00 (15 minutes) Recap and Summary (SS) - Link to slides - Link to video 60 | 61 | 62 | # Approach Overview 63 | What do we mean by 'do a re-executable' publication? 64 | * [DataLad](https://www.datalad.org/) install a particular dataset - [DataLad@NITRC](https://www.nitrc.org/projects/datalad) 65 | * [DataLad containers-run](http://handbook.datalad.org/en/latest/basics/101-133-containersrun.html) a particular container (that generates some derived images and results in NIDM) 66 | * [DataLad "Publish"](http://docs.datalad.org/projects/deprecated/en/latest/generated/man/datalad-publish.html) the resulting dataset 67 | * [pynidm](https://github.com/incf-nidash/PyNIDM) query the results, and run a specific statistical test - [PyNIDM@NITRC](https://www.nitrc.org/projects/pynidm) 68 | * "Publish" the NIDM results to the [ReproLake](https://www.youtube.com/watch?v=VQ5t24mrvJI) 69 | -------------------------------------------------------------------------------- /course/conf.py: -------------------------------------------------------------------------------- 1 | # Configuration file for the Sphinx documentation builder. 2 | # 3 | # This file only contains a selection of the most common options. 
For a full 4 | # list see the documentation: 5 | # https://www.sphinx-doc.org/en/master/usage/configuration.html 6 | 7 | # -- Path setup -------------------------------------------------------------- 8 | 9 | # If extensions (or modules to document with autodoc) are in another directory, 10 | # add these directories to sys.path here. If the directory is relative to the 11 | # documentation root, use os.path.abspath to make it absolute, like shown here. 12 | # 13 | import os  # needed for the `os.environ` check near the end of this file 14 | # import sys 15 | # sys.path.insert(0, os.path.abspath('.')) 16 | 17 | # -- Project information ----------------------------------------------------- 18 | 19 | project = 'OHBMEducation-2022' 20 | copyright = '2022, ReproNim' 21 | author = 'ReproNim' 22 | 23 | master_doc = "index" 24 | 25 | # -- General configuration --------------------------------------------------- 26 | 27 | # Add any Sphinx extension module names here, as strings. They can be 28 | # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom 29 | # ones. 30 | extensions = [ 31 | "myst_nb", 32 | "sphinx_togglebutton", 33 | "sphinx_copybutton", 34 | "sphinx.ext.intersphinx", 35 | "sphinx.ext.autodoc", 36 | "sphinx.ext.viewcode", 37 | ] 38 | 39 | # Add any paths that contain templates here, relative to this directory. 40 | templates_path = ["_templates"] 41 | 42 | # List of patterns, relative to source directory, that match files and 43 | # directories to ignore when looking for source files. 44 | # This pattern also affects html_static_path and html_extra_path. 45 | exclude_patterns = ["_build", "Thumbs.db", ".DS_Store", "**.ipynb_checkpoints"] 46 | 47 | 48 | # -- Options for HTML output ------------------------------------------------- 49 | 50 | # The theme to use for HTML and HTML Help pages. See the documentation for 51 | # a list of builtin themes.
52 | # 53 | html_title = "" 54 | html_theme = "sphinx_book_theme" 55 | html_logo = "_static/repronim_logo.png" 56 | html_theme_options = { 57 | "github_url": "https://github.com/ReproNim/OHBMEducation-2022", 58 | "repository_url": "https://github.com/ReproNim/OHBMEducation-2022", 59 | "repository_branch": "main", 60 | "use_edit_page_button": True, 61 | "path_to_docs": "docs/", 62 | "expand_sections": ["use/index", "examples/index"], 63 | } 64 | 65 | intersphinx_mapping = { 66 | "python": ("https://docs.python.org/3.8", None), 67 | "jb": ("https://jupyterbook.org/", None), 68 | "myst": ("https://myst-parser.readthedocs.io/en/latest/", None), 69 | "markdown_it": ("https://markdown-it-py.readthedocs.io/en/latest", None), 70 | "nbclient": ("https://nbclient.readthedocs.io/en/latest", None), 71 | "nbformat": ("https://nbformat.readthedocs.io/en/latest", None), 72 | "sphinx": ("https://www.sphinx-doc.org/en/3.x", None), 73 | } 74 | 75 | intersphinx_cache_limit = 5 76 | 77 | nitpick_ignore = [ 78 | ("py:class", "docutils.nodes.document"), 79 | ("py:class", "docutils.nodes.Node"), 80 | ("py:class", "docutils.nodes.container"), 81 | ("py:class", "docutils.nodes.system_message"), 82 | ("py:class", "nbformat.notebooknode.NotebookNode"), 83 | ("py:class", "pygments.lexer.RegexLexer"), 84 | ] 85 | 86 | # Add any paths that contain custom static files (such as style sheets) here, 87 | # relative to this directory. They are copied after the builtin static files, 88 | # so a file named "default.css" will overwrite the builtin "default.css". 
89 | html_static_path = ["static"] 90 | 91 | copybutton_selector = "div:not(.output) > div.highlight pre" 92 | 93 | nb_custom_formats = {".Rmd": ["jupytext.reads", {"fmt": "Rmd"}]} 94 | jupyter_execute_notebooks = "cache" 95 | execution_show_tb = "READTHEDOCS" in os.environ 96 | execution_timeout = 60 # Note: 30 was timing out on RTD 97 | 98 | myst_admonition_enable = True 99 | myst_amsmath_enable = True 100 | myst_html_img_enable = True 101 | myst_deflist_enable = True 102 | myst_url_schemes = ("http", "https", "mailto") 103 | panels_add_bootstrap_css = False 104 | -------------------------------------------------------------------------------- /course/index.md: -------------------------------------------------------------------------------- 1 | 2 | ```{toctree} 3 | --- 4 | hidden: true 5 | includehidden: true 6 | titlesonly: true 7 | --- 8 | ``` 9 | workshop logo 10 | 11 | [![Jupyter Book Badge](https://jupyterbook.org/badge.svg)](http://www.repronim.org/OHBMEducation-2022/) 12 | [![GitHub size](https://img.shields.io/github/repo-size/ReproNim/OHBMEducation-2022)](https://github.com/repronim/OHBMEducation-2022/archive/master.zip) 13 | [![GitHub issues](https://img.shields.io/github/issues/ReproNim/OHBMEducation-2022?style=plastic)](https://github.com/repronim/OHBMEducation-2022/issues) 14 | [![GitHub PR](https://img.shields.io/github/issues-pr/ReproNim/OHBMEducation-2022)](https://github.com/repronim/OHBMEducation-2022/pulls) 15 | [![License](https://img.shields.io/github/license/repronim/OHBMEducation-2022)](https://github.com/repronim/OHBMEducation-2022) 16 | 17 | # Welcome!
18 | 19 | Hello everyone and welcome to the [ReproNim](https://repronim.org/) - [OHBM 2022](https://www.humanbrainmapping.org/i4a/pages/index.cfm?pageid=4114) [educational course](https://www.humanbrainmapping.org/files/OHBM_2022_Educational_Courses_-_Schedule_at_a_Glance_FINAL.pdf) on ["How to Write a Re-Executable Publication"](https://www.humanbrainmapping.org/files/2022/2022%20Annual%20Meeting/%231162_Education_Course_Half_Day_-_How_to_Write_a_Re-executable_Publication.pdf), conducted during the 2022 OHBM meeting both in person and virtually. We're glad to see you here! 20 | 21 | Within these pages, we provide information on how to follow the course, as well as all respective materials. This [jupyter book](https://jupyterbook.org/intro.html) includes the slides, recordings, and code in a form that can be explored interactively. You can navigate through the respective sections via the TOC on the left side and within sections via the TOC on the right side. The three symbols at the top let you enable full-screen mode, visit the underlying [Github repository](https://github.com/repronim/OHBMEducation-2022/), and download the contents as a PDF or in other formats. Some sections will additionally have a little rocket icon in that row, which allows you to interactively rerun certain analyses via cloud computing. Additionally, we support public reviews and comments through a [hypothes.is plugin](https://web.hypothes.is/) with which you can interact on the right side. All of this awesomeness (the infrastructure and resources, that is) is made possible by the dedicated, second-to-none work of the [Jupyter community](https://jupyter.org/community), specifically the [Executable/Jupyter Book](https://executablebooks.org/en/latest/) and [mybinder](https://mybinder.org/) projects. 22 | 23 | ````{margin} 24 | ```{warning} 25 | These pages are currently under construction and will be updated continuously.
26 | Please visit this page again in the next few weeks for further information. 27 | ``` 28 | ```` 29 | # How to Write a Re-Executable Publication 30 | 31 | 32 | You can check out the respective sections: 33 | 34 | * [An overview](http://www.repronim.org/OHBMEducation-2022/overview.html) 35 | 36 | What's this course about and how is it organized? 37 | 38 | * [Setup](http://www.repronim.org/OHBMEducation-2022/setup.html) 39 | 40 | How are things implemented and supposed to work? 41 | 42 | * [General outline](http://www.repronim.org/OHBMEducation-2022/outline.html) 43 | 44 | What are the specific topics and aspects taught? 45 | 46 | * [Presentations](http://www.repronim.org/OHBMEducation-2022/presentations.html) 47 | 48 | What are the basics one should know to write a `Re-Executable Publication`? 49 | 50 | * [Exercises](http://www.repronim.org/OHBMEducation-2022/exercise.html) 51 | 52 | Once the basics are clear, how do things work "in action"? 53 | 54 | * [Code of Conduct](http://www.repronim.org/OHBMEducation-2022/CoC.html) 55 | 56 | Necessities for creating an open, fair, safe and inclusive learning 57 | experience. 58 | 59 | # I've got a question! 🙋🏼‍♀️ ⁉️ 60 | 61 | In case you have any questions or difficulties going through the workshop, please don’t hesitate a single second to get in touch with us. A great way to do this is to [open an issue](https://github.com/repronim/OHBMEducation-2022/issues) on the 62 | [GitHub site of the Workshop](https://github.com/repronim/OHBMEducation-2022) (also possible via the GitHub button at the top of the pages). We would also highly appreciate and value any feedback or ideas you might have (via [issues](https://github.com/repronim/OHBMEducation-2022) or the [hypothes.is annotation feature](https://web.hypothes.is/) on the right). 63 | 64 | ## 🙏 Acknowledgements 65 | 66 | This workshop was initialized and organized by [ReproNim](https://www.repronim.org/), as well as fantastic collaborators.
67 | -------------------------------------------------------------------------------- /course/CoC.md: -------------------------------------------------------------------------------- 1 | # Code of Conduct 2 | 3 | Hello 👋, 4 | 5 | this course is dedicated to providing an environment where people are kind and respectful to each other, a harassment-free learning experience for everyone, regardless of gender, gender identity and expression, sexual orientation, disability, physical appearance, body size, race, age or religion. We do not tolerate harassment of event participants in any form. Sexual language and imagery is not appropriate for any event venue, including talks. Participants violating these rules may be sanctioned or expelled from the lecture at the discretion of the organizers. 6 | 7 | This could really be the end of that code of conduct, but some forms of harassment and negative behavior are fairly hard to identify at first. Please read carefully through the rest of the document to make sure you avoid them. There is also a section to know what to do and expect if you experience behavior that deviates from this code of conduct. 8 | 9 | Harassment includes, but _is not limited to_: 10 | ⛔ Verbal comments that reinforce social structures of domination related to gender, gender identity and expression, sexual orientation, disability, physical appearance, body size, race, age or religion. 11 | ⛔ Sexual images in public spaces 12 | ⛔ Deliberate intimidation, stalking, or following 13 | ⛔ Harassing photography or recording 14 | ⛔ Sustained disruption of talks or other events 15 | ⛔ Inappropriate physical contact 16 | ⛔ Advocating for, or encouraging, any of the above behaviour 17 | ⛔ Unwelcome sexual attention 18 | 19 | ### *Microaggressions* 20 | Incidents can take the form of “microaggressions,” which is a damaging form of harassment. 
Microaggressions are the everyday slights or insults which communicate negative messages to target individuals, often based upon their marginalized group membership. Over time, microaggressions can take a great toll on mental and emotional health, and the target’s feeling of belonging in science and academia. The following examples can all be labeled microaggressions: 21 | ⛔ Commenting on a woman’s appearance rather than her work; 22 | ⛔ Only directing questions at male colleagues when there are female experts in the room; 23 | ⛔ Telling someone of colour that they “speak such good English”; 24 | ⛔ Forcefully praising meat to an individual with a vegetarian diet; 25 | ⛔ Praising alcoholic drinks to an individual who does not consume them. 26 | ⛔ Exclusion from a group can be a common nonverbal form of microaggression. 27 | ⛔ Microaggressions can be couched in the form of a “compliment” (e.g. “you’re too attractive to be a scientist”). 28 | 29 | ### *Respecting differences* 30 | Participants come from many cultures and backgrounds. We therefore expect everyone to be very respectful of different cultural practices, attitudes, and beliefs. This includes being aware of preferred titles and pronouns, as well as using a respectful tone of voice. 31 | While we do not assume participants know the cultural practices of every ethnic and cultural group, we expect members to recognize and respect differences within our community. This means being open to learning from and educating others, as well as educating yourself. 32 | 33 | ### *How to treat each other* 34 | 👍 Be respectful and value each other’s ideas, styles and viewpoints. 35 | 👍 Be direct but professional; we cannot withhold hard truths. 36 | 👍 Be inclusive and help new perspectives be heard. 37 | 👍 Appreciate and accommodate our many cultural practices, attitudes and beliefs. 38 | 👍 Be open to learning from others. 39 | 👍 Lead by example and match your actions with your words.
40 | 41 | 42 | ## **Enforcement** 43 | Participants asked to stop any harassing behavior are expected to comply immediately. Organizers and presenters are also subject to the anti-harassment policy. In particular, they should not use sexualized images, activities, or other material. Event organizers may take action to redress anything designed to, or with the clear impact of, disrupting the event or making the environment hostile for any participants. 44 | 45 | ## **Reporting** 46 | If someone makes you or anyone else feel unsafe or unwelcome, please report it as soon as possible to the organizers either in person or by email. 47 | 48 | ### *Reporting Issues* 49 | 50 | Any incident of unacceptable behavior hurts the learning environment for everyone. We are therefore committed to supporting students to access corresponding reporting resources and ensure a supportive environment. 51 | 52 | When necessary, an organizer will assist students to file a report at the appropriate university office. Complaints can be filed, without limitation periods, for any incident of harassment that has occurred in the course of university activities, whether on or off-campus. 53 | 54 | Also, if you need moral support, feel welcome to ping the organizers in person or by email. 55 | 56 | 57 | ## Attribution 58 | Thanks and credit go to the [Montreal Brainhack School]() for providing the CoC on which this one is based. 59 | -------------------------------------------------------------------------------- /course/_config.yml: -------------------------------------------------------------------------------- 1 | ####################################################################################### 2 | # A default configuration that will be loaded for all jupyter books 3 | # Users are expected to override these values in their own `_config.yml` file. 4 | # This is also the "master list" of all allowed keys and values.
5 | 6 | ####################################################################################### 7 | # Book settings 8 | title : "ReproNim OHBM 2022 Educational Course" # The title of the book. Will be placed in the left navbar. 9 | author : ReproNim # The author of the book 10 | copyright : "2022" # Copyright year to be placed in the footer 11 | logo : "static/repronim_logo.png" # A path to the book logo 12 | exclude_patterns : [] # Patterns to skip when building the book. Can be glob-style (e.g. "*skip.ipynb") 13 | 14 | ####################################################################################### 15 | # Execution settings 16 | execute: 17 | execute_notebooks : off # Whether to execute notebooks at build time. Must be one of ("auto", "force", "cache", "off") 18 | cache : "" # A path to the jupyter cache that will be used to store execution artifacts. Defaults to `_build/.jupyter_cache/` 19 | exclude_patterns : [] # A list of patterns to *skip* in execution (e.g. a notebook that takes a really long time) 20 | 21 | ####################################################################################### 22 | # HTML-specific settings 23 | html: 24 | favicon : "" # A path to a favicon image 25 | use_edit_page_button : True # Whether to add an "edit this page" button to pages. If `true`, repository information in repository: must be filled in 26 | use_repository_button : True # Whether to add a link to your repository button 27 | use_issues_button : True # Whether to add an "open an issue" button 28 | extra_navbar : Powered by Jupyter Book # Will be displayed underneath the left navbar. 29 | extra_footer : "" # Will be displayed underneath the footer. 30 | google_analytics_id : "" # A GA id that can be used to track book views. 31 | home_page_in_navbar : true # Whether to include your home page in the left Navigation Bar 32 | baseurl : "" # The base URL where your book will be hosted. Used for creating image previews and social links.
e.g.: https://mypage.com/mybook/ 33 | comments: 34 | hypothesis: true 35 | 36 | ####################################################################################### 37 | # Launch button settings 38 | launch_buttons: 39 | notebook_interface : "classic" # The interface interactive links will activate ["classic", "jupyterlab"] 40 | binderhub_url : "https://mybinder.org" # The URL of the BinderHub (e.g., https://mybinder.org) 41 | thebe : false # Add a thebe button to pages (requires the repository to run on Binder) 42 | 43 | repository: 44 | url : https://github.com/ReproNim/OHBMEducation-2022 # The URL to your book's repository 45 | path_to_book : "course/" # A path to your book's folder, relative to the repository root. 46 | branch : main # Which branch of the repository should be used when creating links 47 | 48 | ####################################################################################### 49 | # Advanced and power-user settings 50 | sphinx: 51 | config: 52 | nb_custom_formats: 53 | .Rmd: 54 | - jupytext.reads 55 | - fmt: Rmd 56 | # TODO: #917 this path will be the default in sphinx v4 57 | # mathjax_path: https://cdn.jsdelivr.net/npm/mathjax@3/es5/tex-mml-chtml.js 58 | # However, it is incompatible with the mathjax config below for macros 59 | mathjax_config: 60 | TeX: 61 | Macros: 62 | "N": "\\mathbb{N}" 63 | "floor": ["\\lfloor#1\\rfloor", 1] 64 | "bmat": ["\\left[\\begin{array}"] 65 | "emat": ["\\end{array}\\right]"] 66 | latex_elements: 67 | preamble: | 68 | \newcommand\N{\mathbb{N}} 69 | \newcommand\floor[1]{\lfloor#1\rfloor} 70 | \newcommand{\bmat}{\left[\begin{array}} 71 | \newcommand{\emat}{\end{array}\right]} 72 | intersphinx_mapping: 73 | ebp: 74 | - "https://executablebooks.org/en/latest/" 75 | - null 76 | myst-parser: 77 | - "https://myst-parser.readthedocs.io/en/latest/" 78 | - null 79 | myst-nb: 80 | - "https://myst-nb.readthedocs.io/en/latest/" 81 | - null 82 | sphinx: 83 | - "https://www.sphinx-doc.org/en/master" 84 | - null 85 | 
nbformat: 86 | - "https://nbformat.readthedocs.io/en/latest" 87 | - null 88 | sphinx-panels: 89 | - https://sphinx-panels.readthedocs.io/en/sphinx-book-theme/ 90 | - null 91 | -------------------------------------------------------------------------------- /stats/OpenNeuro_results.csv: -------------------------------------------------------------------------------- 1 | Subject,age,sex,diagnosis,Left-Accumbens-area (mm^3),Left-Amygdala (mm^3),Left-Caudate (mm^3),Left-Hippocampus (mm^3),Left-Pallidum (mm^3),Left-Putamen (mm^3),Left-Thalamus-Proper (mm^3),Right-Accumbens-area (mm^3),Right-Amygdala (mm^3),Right-Caudate (mm^3),Right-Hippocampus (mm^3),Right-Pallidum (mm^3),Right-Putamen (mm^3),Right-Thalamus-Proper (mm^3),csf (mm^3),gray (mm^3),white (mm^3) 2 | sub-RC4101,66.28288,1,0,537,1593,3977,3547,1999,5262,9252,331,1616,4063,3896,1962,5312,9069,330623,593671,647740 3 | sub-RC4103,57.45623,0,0,508,1237,3038,4282,1609,4122,8734,449,1073,3202,3549,1903,4317,8147,302765,587373,607962 4 | sub-RC4106,70.1174,1,0,342,1735,2721,3111,2101,3963,9448,278,1691,2760,3584,1954,4525,9306,410846,549299,691729 5 | sub-RC4107,44.87712,0,0,493,1450,2885,3462,1577,3826,7478,414,1037,2862,3079,1643,3972,7283,272493,487332,581691 6 | sub-RC4109,68.16394,1,0,273,1486,3643,3573,1783,4540,7928,169,1366,3877,4296,2016,4732,7955,363395,568592,623537 7 | sub-RC4110,54.44408,0,0,598,1715,3434,4530,2170,5492,9306,424,1664,3674,4669,2370,5131,9090,301515,615080,673255 8 | sub-RC4111,49.37169,0,0,623,1205,3055,3846,1590,4225,8039,381,1468,3201,3946,1646,3740,7698,212815,570766,616937 9 | sub-RC4112,64.32507,0,0,347,1408,2684,3484,1489,3759,7189,245,1438,2872,3607,1488,3652,6982,335399,485283,514474 10 | sub-RC4113,60.15017,1,0,409,881,2643,2469,1507,4308,6317,263,989,2860,1928,1524,3948,5801,330034,501675,512778 11 | sub-RC4114,60.70497,1,0,662,1492,3062,4283,2075,5717,8720,346,1570,3309,4498,1856,5290,8560,355136,611124,710692 12 | 
sub-RC4115,70.02037,1,0,509,1725,3257,3507,1773,4885,8452,459,1468,3311,3301,1700,5115,7677,343812,580270,615709 13 | sub-RC4116,64.57782,1,0,297,1064,3825,3293,1732,4639,8195,250,1355,3620,3643,1589,5177,7741,282795,530982,549929 14 | sub-RC4117,76.10251,0,0,487,1482,2418,3752,1579,3895,7222,312,1436,2476,3599,1594,3817,7006,338891,465316,547667 15 | sub-RC4118,59.68854,1,0,488,1701,4017,4131,2106,5332,9989,510,1512,4139,4298,2173,4956,9753,360436,632656,747455 16 | sub-RC4119,66.4221,1,0,624,1277,3459,4781,2105,5387,9158,376,1897,3609,4510,1721,5175,8759,337158,602517,644105 17 | sub-RC4120,70.93856,1,0,460,2226,3483,4031,1944,5006,10251,365,1938,3716,4572,2127,4953,10061,331371,611339,704581 18 | sub-RC4121,77.01095,1,0,269,1433,3593,2995,1815,4361,8072,302,1272,3197,3052,1644,4151,8054,398752,603870,645358 19 | sub-RC4122,78.33601,1,0,242,1200,3973,3719,1828,4967,8913,397,1385,4065,3691,1928,5295,8722,373187,570212,684299 20 | sub-RC4123,83.29109,1,0,359,1024,3228,2874,1497,4305,6746,252,927,3666,3233,1499,4375,6505,455810,513687,632572 21 | sub-RC4125,76.8756,1,0,424,1679,3517,3953,1874,5033,8866,388,1383,3847,4059,2094,4981,8671,396823,570455,645455 22 | sub-RC4126,86.66116,1,0,396,1357,3157,2867,1658,4761,8308,222,1342,3585,2835,1595,3690,8057,389528,537152,594477 23 | sub-RC4128,62.45495,1,0,481,1649,2891,3206,1645,4632,6704,414,1544,3153,3530,1724,4249,6580,317394,540355,534939 24 | sub-RC4129,58.43304,1,0,602,1601,3113,3768,1821,4466,8558,394,1366,3296,3949,1805,4455,8285,285447,584367,643937 25 | sub-RC4130,61.95773,0,0,382,1255,2566,3468,1893,4008,7909,338,1041,2526,3548,1867,3987,7842,283104,490673,546584 26 | sub-RC4131,64.8826,1,0,526,1621,3855,3817,2109,5396,9239,458,1298,3613,3809,2064,5135,8546,334295,568161,684419 27 | sub-RC4201,59.41114,1,1,691,1604,3606,4178,1894,5504,9056,412,1453,3681,4013,1882,5413,8766,352131,620965,622409 28 | sub-RC4204,67.90571,0,1,563,1533,3246,3686,1603,4578,7698,372,1350,3501,3402,1522,4131,7550,266391,463421,540949 
29 | sub-RC4205,56.63652,1,1,493,1305,3295,4146,1803,4520,7896,427,1243,3265,4620,1734,4312,7860,236285,545132,548659 30 | sub-RC4206,51.35526,0,1,553,1397,3447,3610,1603,4254,7842,396,1354,3491,3725,1504,4217,7424,221753,553575,570470 31 | sub-RC4207,41.83067,0,1,628,1409,3108,3671,1800,4903,7697,513,938,3196,3094,1892,4596,7486,236314,569331,631658 32 | sub-RC4208,51.32668,0,1,423,1661,3316,3948,1739,4155,8851,408,1536,3335,3370,1841,4226,8181,283738,507151,635472 33 | sub-RC4210,55.04227,0,1,732,1667,3023,2907,1797,5082,9000,492,1281,3192,3050,1923,4585,8251,274302,574893,712764 34 | sub-RC4211,67.73355,0,1,490,1541,3612,3777,1926,4680,7828,479,1151,3758,3971,1913,4524,7726,273234,511451,560906 35 | sub-RC4212,55.52033,1,1,472,1294,3570,3893,1715,5129,7510,353,1501,4050,3924,1758,4843,7232,280742,592814,597749 36 | sub-RC4213,47.33602,1,1,555,1298,3685,3852,1837,5214,8782,497,1441,3562,3520,1822,4791,8772,236686,602123,631160 37 | sub-RC4214,63.33054,1,1,530,1719,4074,4214,1971,4614,8720,404,1728,4043,4347,2016,4863,8652,263110,577747,623542 38 | sub-RC4215,51.01917,0,1,575,1517,3342,3589,1712,4430,8339,569,1095,3253,3213,1652,4303,8001,224930,562658,528511 39 | sub-RC4217,74.85095,1,1,459,1382,3276,3413,1572,4274,7509,312,1133,3139,3722,1646,4278,7582,362238,616231,650695 40 | sub-RC4218,66.91393,0,1,443,1272,3471,4109,2156,4479,7808,391,1399,3669,4215,2501,4183,7776,297417,556091,580050 41 | sub-RC4219,75.99059,0,1,464,1426,3218,3650,1705,4212,7609,356,1292,3736,3327,1667,4407,7463,293428,522049,571782 42 | sub-RC4220,67.73355,1,1,570,1732,3262,4340,1907,5219,8916,478,1597,3316,4145,1966,5106,8697,399826,688676,676429 43 | sub-RC4221,66.09977,0,1,551,1472,3262,3496,1716,4513,7970,426,1388,3385,3616,1967,4520,7551,254501,472127,522697 44 | sub-RC4224,73.11893,0,1,427,1483,3268,3439,1703,4434,7937,350,1540,3556,3786,1670,3961,7695,351686,534419,593624 45 | 
sub-RC4225,72.94404,1,1,536,1927,3598,3903,1952,5024,8600,344,1522,3576,3598,1842,5066,8294,419650,570713,689512 46 | sub-RC4226,69.34278,0,1,288,1711,3705,3860,1896,4980,8950,345,1730,3785,3956,2014,4698,8942,337286,608081,625692 47 | sub-RC4227,69.04774,1,1,636,1393,3472,3691,1914,5389,8840,345,873,3613,3515,1798,5051,8142,359828,562204,710459 -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Creative Commons Legal Code 2 | 3 | CC0 1.0 Universal 4 | 5 | CREATIVE COMMONS CORPORATION IS NOT A LAW FIRM AND DOES NOT PROVIDE 6 | LEGAL SERVICES. DISTRIBUTION OF THIS DOCUMENT DOES NOT CREATE AN 7 | ATTORNEY-CLIENT RELATIONSHIP. CREATIVE COMMONS PROVIDES THIS 8 | INFORMATION ON AN "AS-IS" BASIS. CREATIVE COMMONS MAKES NO WARRANTIES 9 | REGARDING THE USE OF THIS DOCUMENT OR THE INFORMATION OR WORKS 10 | PROVIDED HEREUNDER, AND DISCLAIMS LIABILITY FOR DAMAGES RESULTING FROM 11 | THE USE OF THIS DOCUMENT OR THE INFORMATION OR WORKS PROVIDED 12 | HEREUNDER. 13 | 14 | Statement of Purpose 15 | 16 | The laws of most jurisdictions throughout the world automatically confer 17 | exclusive Copyright and Related Rights (defined below) upon the creator 18 | and subsequent owner(s) (each and all, an "owner") of an original work of 19 | authorship and/or a database (each, a "Work"). 20 | 21 | Certain owners wish to permanently relinquish those rights to a Work for 22 | the purpose of contributing to a commons of creative, cultural and 23 | scientific works ("Commons") that the public can reliably and without fear 24 | of later claims of infringement build upon, modify, incorporate in other 25 | works, reuse and redistribute as freely as possible in any form whatsoever 26 | and for any purposes, including without limitation commercial purposes. 
27 | These owners may contribute to the Commons to promote the ideal of a free 28 | culture and the further production of creative, cultural and scientific 29 | works, or to gain reputation or greater distribution for their Work in 30 | part through the use and efforts of others. 31 | 32 | For these and/or other purposes and motivations, and without any 33 | expectation of additional consideration or compensation, the person 34 | associating CC0 with a Work (the "Affirmer"), to the extent that he or she 35 | is an owner of Copyright and Related Rights in the Work, voluntarily 36 | elects to apply CC0 to the Work and publicly distribute the Work under its 37 | terms, with knowledge of his or her Copyright and Related Rights in the 38 | Work and the meaning and intended legal effect of CC0 on those rights. 39 | 40 | 1. Copyright and Related Rights. A Work made available under CC0 may be 41 | protected by copyright and related or neighboring rights ("Copyright and 42 | Related Rights"). Copyright and Related Rights include, but are not 43 | limited to, the following: 44 | 45 | i. the right to reproduce, adapt, distribute, perform, display, 46 | communicate, and translate a Work; 47 | ii. moral rights retained by the original author(s) and/or performer(s); 48 | iii. publicity and privacy rights pertaining to a person's image or 49 | likeness depicted in a Work; 50 | iv. rights protecting against unfair competition in regards to a Work, 51 | subject to the limitations in paragraph 4(a), below; 52 | v. rights protecting the extraction, dissemination, use and reuse of data 53 | in a Work; 54 | vi. database rights (such as those arising under Directive 96/9/EC of the 55 | European Parliament and of the Council of 11 March 1996 on the legal 56 | protection of databases, and under any national implementation 57 | thereof, including any amended or successor version of such 58 | directive); and 59 | vii. 
other similar, equivalent or corresponding rights throughout the 60 | world based on applicable law or treaty, and any national 61 | implementations thereof. 62 | 63 | 2. Waiver. To the greatest extent permitted by, but not in contravention 64 | of, applicable law, Affirmer hereby overtly, fully, permanently, 65 | irrevocably and unconditionally waives, abandons, and surrenders all of 66 | Affirmer's Copyright and Related Rights and associated claims and causes 67 | of action, whether now known or unknown (including existing as well as 68 | future claims and causes of action), in the Work (i) in all territories 69 | worldwide, (ii) for the maximum duration provided by applicable law or 70 | treaty (including future time extensions), (iii) in any current or future 71 | medium and for any number of copies, and (iv) for any purpose whatsoever, 72 | including without limitation commercial, advertising or promotional 73 | purposes (the "Waiver"). Affirmer makes the Waiver for the benefit of each 74 | member of the public at large and to the detriment of Affirmer's heirs and 75 | successors, fully intending that such Waiver shall not be subject to 76 | revocation, rescission, cancellation, termination, or any other legal or 77 | equitable action to disrupt the quiet enjoyment of the Work by the public 78 | as contemplated by Affirmer's express Statement of Purpose. 79 | 80 | 3. Public License Fallback. Should any part of the Waiver for any reason 81 | be judged legally invalid or ineffective under applicable law, then the 82 | Waiver shall be preserved to the maximum extent permitted taking into 83 | account Affirmer's express Statement of Purpose. 
In addition, to the 84 | extent the Waiver is so judged Affirmer hereby grants to each affected 85 | person a royalty-free, non transferable, non sublicensable, non exclusive, 86 | irrevocable and unconditional license to exercise Affirmer's Copyright and 87 | Related Rights in the Work (i) in all territories worldwide, (ii) for the 88 | maximum duration provided by applicable law or treaty (including future 89 | time extensions), (iii) in any current or future medium and for any number 90 | of copies, and (iv) for any purpose whatsoever, including without 91 | limitation commercial, advertising or promotional purposes (the 92 | "License"). The License shall be deemed effective as of the date CC0 was 93 | applied by Affirmer to the Work. Should any part of the License for any 94 | reason be judged legally invalid or ineffective under applicable law, such 95 | partial invalidity or ineffectiveness shall not invalidate the remainder 96 | of the License, and in such case Affirmer hereby affirms that he or she 97 | will not (i) exercise any of his or her remaining Copyright and Related 98 | Rights in the Work or (ii) assert any associated claims and causes of 99 | action with respect to the Work, in either case contrary to Affirmer's 100 | express Statement of Purpose. 101 | 102 | 4. Limitations and Disclaimers. 103 | 104 | a. No trademark or patent rights held by Affirmer are waived, abandoned, 105 | surrendered, licensed or otherwise affected by this document. 106 | b. Affirmer offers the Work as-is and makes no representations or 107 | warranties of any kind concerning the Work, express, implied, 108 | statutory or otherwise, including without limitation warranties of 109 | title, merchantability, fitness for a particular purpose, non 110 | infringement, or the absence of latent or other defects, accuracy, or 111 | the present or absence of errors, whether or not discoverable, all to 112 | the greatest extent permissible under applicable law. 113 | c. 
Affirmer disclaims responsibility for clearing rights of other persons 114 | that may apply to the Work or any use thereof, including without 115 | limitation any person's Copyright and Related Rights in the Work. 116 | Further, Affirmer disclaims responsibility for obtaining any necessary 117 | consents, permissions or other rights required for any use of the 118 | Work. 119 | d. Affirmer understands and acknowledges that Creative Commons is not a 120 | party to this document and has no duty or obligation with respect to 121 | this CC0 or use of the Work. 122 | -------------------------------------------------------------------------------- /course/dei.md: -------------------------------------------------------------------------------- 1 | # Diversity, Equity and Inclusion 2 | 3 | Let's rip one band-aid off right away: the current scientific and academic system is incredibly flawed and tremendously biased. Unfortunately, most folks are not brought into contact with the respective issues during their studies or work and thus quite often perceive the present inequalities as "normal" and a given. This led to the proliferation of exceptionally unfair "standards" and practices towards minorities and underrepresented groups. What's needed to combat these things are Universities, Departments, Lecturers, etc. that address these issues, as well as inform and enable folks to actively engage in the process towards increasing [Diversity, Equity and Inclusion](https://en.wikipedia.org/wiki/Diversity_(business)) ([#DEI](https://twitter.com/hashtag/DEI?src=hashtag_click)) at all levels. 4 | 5 | You might think: "Why should I care and what does it have to do with me? I just want to learn something here." Well, even though you might not be (knowingly) affected, someone you know most likely will be, and it will take all of us to change this for the better.
In order to become aware of the systemic problems and issues others are facing and take corresponding actions, one needs to be able to make informed decisions based on a broad foundation of knowledge, and this needs to be enabled as soon as possible. 6 | 7 | Thus, please take the time and effort to delve into this crucial topic by checking the resources linked below and utilizing them as a starting point. Please note that these things will be frequently updated. If you stumble across something that should be included, please contact the instructor so that it can be added. 8 | 9 | 10 | ## Overview materials & introduction to DEI 11 | 12 | Below you'll find some useful resources to familiarize yourself with [#DEI](https://twitter.com/hashtag/DEI?src=hashtag_click) and related problems. The list contains different media, ranging from YouTube videos and journal publications to workshops and other information material. 13 | 14 | ### Introductory videos 15 | 16 | A short overview video from the [Department of Medicine](https://healthsci.mcmaster.ca/medicine) at [McMaster University](https://www.mcmaster.ca/) about `Understanding Equity, Diversity and Inclusion in the Academy`. 17 | 18 | 19 | 20 | A set of videos from the [Southern Regional Education Board](www.SREB.org) focusing on `Equity and Diversity in Higher Education`, including multiple presentations. 21 | 22 | 23 | 24 | A [TED talk](https://www.ted.com/talks) by [Janet Stovall](https://www.ted.com/speakers/janet_stovall) about `How to get serious about diversity and inclusion in the workplace`. 25 | 26 | 27 | 28 | And another [TED talk](https://www.ted.com/talks), this time by [Anthony Jack](https://www.tedxcambridge.com/speaker/anthony-jack/), called `On Diversity: Access Ain’t Inclusion`. 29 | 30 | 31 | 32 | A set of presentations about `Queer & Trans Perspectives in Academia` organized by [SAGE](https://us.sagepub.com/en-us/nam/home).
33 | 34 | 35 | 36 | 37 | 38 | ### Journal publications 39 | 40 | There is an increasing number of journal articles that address [#DEI](https://twitter.com/hashtag/DEI?src=hashtag_click) via various approaches. Below you'll find a short list of them, ranging from publications with a rather broad overview character to ones that focus on a very specific topic. 41 | 42 | [Gender bias in academia: A lifetime problem that needs solutions](https://www.sciencedirect.com/science/article/pii/S0896627321004177) 43 | 44 | [Racial and ethnic imbalance in neuroscience reference lists and intersections with gender](https://www.biorxiv.org/content/10.1101/2020.10.12.336230v1) 45 | 46 | [The extent and drivers of gender imbalance in neuroscience reference lists](https://www.nature.com/articles/s41593-020-0658-y) 47 | 48 | [(In)citing Action to Realize an Equitable Future](https://www.cell.com/neuron/fulltext/S0896-6273(20)30357-3?_returnURL=https%3A%2F%2Flinkinghub.elsevier.com%2Fretrieve%2Fpii%2FS0896627320303573%3Fshowall%3Dtrue#relatedArticles) 49 | 50 | [The Citation Diversity Statement: A Practice of Transparency, A Way of Life](https://www.sciencedirect.com/science/article/pii/S1364661320301649?casa_token=PVTfHFUVDzUAAAAA:8OqlKsrlBRVTPcA-wk71rBVYbyM5OSPGRXhuJSVyLxHIF_Wh5KDTxytmllK0-ecEhFGPHN0KnZo) 51 | 52 | 53 | 54 | ### Workshops 55 | 56 | TBA 57 | 58 | ### Other material 59 | 60 | Here's an informative Wikipedia article on ["First-generation college students in the United States"](https://en.wikipedia.org/wiki/First-generation_college_students_in_the_United_States). While it's obviously focused on the US education system, a lot of the outlined issues/problems/obstacles/hurdles are faced by first-gen students all over the world. 61 | 62 | ## Initiatives 63 | 64 | Please note that this list will be rather "neuroscience-focused" as it's the instructor's main field of expertise/work. However, other resources will be added soon and frequently.
65 | 66 | ### Women in Neuroscience Repository 67 | 68 | The [Women in Neuroscience Repository](https://www.winrepo.org/) aims to "identify and recommend female neuroscientists for conferences, symposia or collaborations" by providing a fantastic interactive resource organized by a multitude of keywords. There's also a [Twitter](https://twitter.com/WINRePo1) account. 69 | 70 | 71 | 72 | ### Queer in Neuro 73 | 74 | The [Queer in Neuro](https://queerinneuro.com/) initiative is currently under development, but exciting things will happen soon. Make sure to check their [Twitter](https://twitter.com/QueerInNeuro). 75 | 76 | 77 | 78 | 79 | ### Queer Engineer International 80 | 81 | [Queer Engineer International](https://www.queerengineer.org/) is an initiative "to build resources and authentic community at the intersection of LGBTQIA+ and STEM". 82 | 83 | 84 | 85 | 86 | ### Letters to a Pre-Scientist 87 | 88 | The goal of [Letters to a Pre-Scientist](https://prescientist.org/) is to "inspire all students to explore a future in STEM" via connecting "students with real scientists to demystify STEM careers and empower all students to see themselves as future STEM professionals". 89 | 90 | 91 | -------------------------------------------------------------------------------- /stats/demog_groupStats_loop_OHBM.Rmd: -------------------------------------------------------------------------------- 1 | 2 | # OHBM Educational Course 2022 - Re-executing a publication 3 | 4 | In this R notebook, you will be able to make plots for looking through some of the data that will be used to calculate case/control effect sizes and evaluate descriptive statistics for your cohort along the way. 5 | 6 | First, you need to make sure to set all the appropriate parameters. 7 | 8 | We will plot age distributions for diagnostic and sex differences for each cohort, but you can use this setup for other plots as well. 9 | 10 | The first part of this notebook is set up just as before.
11 | You will need to install some R packages and load the libraries to ensure plots are displayed.
12 | 
13 | ### Read in the data and set the output folder and file names!
14 | ```{r}
15 | datafile <- read.csv("all4sites_data_OHBM.csv"); #Read in the file
16 | #datafile <- read.csv("metr_SubCortical_OHBM.csv"); #Read in the file
17 | #datafile <- read.csv("OpenNeuro_results.csv"); #Read in the file
18 | 
19 | ## REMEMBER TO CHANGE THE OUTPUT NAME EACH RUN
20 | outFolder="./AllDATA_allROIs_woICV/" ; dir.create(outFolder, showWarnings = FALSE);
21 | cohortName="MyOHBMcohort-AllDATAFull-woICV"
22 | eName="ENIGMA-PD"
23 | ```
24 | ```{r}
25 | # Double check the column names of the file you read in:
26 | columnnames = colnames(datafile);
27 | colnames(datafile)
28 | ```
29 | 
30 | Do you need to calculate ICV?
31 | ```{r}
32 | if (length(datafile[,which(columnnames=="ICV")]) == 0
33 | && length(datafile[,which(columnnames=="csf..mm.3.")]) > 0 ) {
34 | print("It looks like you are working with the OHBM results and ICV is not included. We will add volumes of CSF, gray and white matter for you here -- this will serve as a proxy for ICV.")
35 | ICV=datafile$csf..mm.3. + datafile$white..mm.3. + datafile$gray..mm.3.
36 | ncol=dim(datafile)[2] ;
37 | datafile[,ncol+1]=ICV ; colnames(datafile) <- c(columnnames,"ICV"); columnnames = colnames(datafile) }
38 | ```
39 | 
40 | ### This code is exclusively for case/control effect sizes as described by the Cohen's D statistic. It is important to ensure the code uses the proper column and indicator for cases vs controls:
41 | ```{r}
42 | # what is the column header for case/control or indicating diagnostic status?
43 | caseColumnHeader= "Dx" #"Dx"
44 | # how are cases coded (1,0,"patients","cases")?
45 | caseIndicator=1 # in this code those not coded as cases are assumed to be controls
46 | ```
47 | 
48 | ### We will be including age and sex in all models here. However, you can turn this feature off.
49 | ```{r} 50 | # what is the column header for age? 51 | ageColumnHeader="Age" # "Age" 52 | # what is the column header for sex? 53 | sexColumnHeader="Sex" # "Sex" 54 | # how are males coded (1,0,"M","males",-0.5)? 55 | maleIndicator=1 #sex is included in the regression as a categorical variable, or factor 56 | # would you like to include age, sex, and an age-x-sex interaction term in the regression? 57 | includeAgeSexinModel=TRUE 58 | # includeAgeSexInteractioninModel == TRUE # this option has been removed but you can add it back in (experts only -- be careful!) 59 | ``` 60 | 61 | ### Do you have other covariates to include? We will ask about continuous and categorical variables separately 62 | #### First, continuous: 63 | ```{r} 64 | # how many continuous covariates other than age should be used from your spreadsheet? 65 | otherCovariatesN=0 66 | # list out the covariates and separate them with a semi-colon 67 | otherCovariates="ICV" 68 | #note if your ROI is ICV, you will not want ICV as a covariate, so you'd have to run through the script again for the separate model 69 | ``` 70 | #### Next, categorical: 71 | ```{r} 72 | # how many categorical covariates are in your spreadsheet other than sex? 73 | ## (examples might include site / scanner / study / different diagnostic groups) 74 | otherCatCovariatesN=0 75 | # list out the categorical covariates (coded as factors) and separate them with a semi-colon 76 | otherCatCovariates="site" 77 | ``` 78 | 79 | ### What ROIs will you be looping over with the same statistical model? 
80 | ```{r} 81 | #ROINames<-c("Left-Accumbens-area (mm^3)","Left-Amygdala (mm^3)","Left-Caudate (mm^3)","Left-Hippocampus (mm^3)","Left-Pallidum (mm^3)","Left-Putamen (mm^3)","Left-Thalamus-Proper (mm^3)","Right-Accumbens-area (mm^3)","Right-Amygdala (mm^3)","Right-Caudate (mm^3)","Right-Hippocampus (mm^3)","Right-Pallidum (mm^3)","Right-Putamen (mm^3)","Right-Thalamus-Proper (mm^3)"); 82 | 83 | ROINames<-c("Lthal","Rthal","Lcaud","Rcaud","Lput","Rput","Lpal","Rpal","Lhippo","Rhippo","Lamyg","Ramyg","Laccumb","Raccumb") 84 | 85 | # ICV would require a different model since we will covary for ICV for the subcortical regions, so we'd run that separately 86 | #ROINames<-c("ICV") 87 | ROINames<-make.names(ROINames) 88 | #Get number of structures to test 89 | Nrois=length(ROINames) 90 | print(paste("We will be looping Cohen's D statistics over data from",Nrois,"regions of interest.")) 91 | ``` 92 | 93 | ### We will source some functions that we will use for Cohen's D calculations from a linear regression output. Feel free to take a look at these functions to see what they do 94 | ```{r} 95 | # If this file is in the same folder as your current working directory, no need to change the input, otherwise please make sure the file with the appropriate path is sourced. 96 | source('ENIGMA_cohensD_functions.R') 97 | ``` 98 | 99 | # Here we go! 100 | 101 | ## Sanity checks: 102 | 103 | ```{r} 104 | # Check for duplicated SubjIDs that may cause issues with merging data sets. 
105 | if(anyDuplicated(datafile[,1]) != 0) { stop('You have duplicate SubjIDs in your file or subject ID is not your first column.\nMake sure there are no repeat SubjIDs.') }
106 | 
107 | # Check all ROIs are column headers in the input CSV
108 | missingROIS=NULL; l_missing=0;
109 | for (roi in 1:Nrois) {
110 | if (length(datafile[,which(columnnames==ROINames[roi])]) == 0) {
111 | print(ROINames[roi])
112 | missingROIS=paste(missingROIS,ROINames[roi])
113 | l_missing=l_missing+1; }
114 | }
115 | if (l_missing >= 1) { warning(paste("You are missing ROIs in your file.\n Please check.",sep=""))}
116 | ```
117 | ###################################################################
118 | 
119 | ## Group statistics and plots
120 | 
121 | Make sure packages are installed
122 | ```{r, eval=FALSE}
123 | install.packages(c('tidyverse','ggpubr','DT','RColorBrewer','scales'))
124 | ```
125 | 
126 | Load libraries
127 | ```{r, eval=TRUE}
128 | library(ggpubr)
129 | library(tidyverse)
130 | library(DT)
131 | library(scales)
132 | library(RColorBrewer)
133 | ```
134 | ```{r}
135 | p1<- ggplot(data=datafile, aes(.data[[ageColumnHeader]], fill=factor(.data[[caseColumnHeader]]))) +
136 | geom_histogram(aes(y = (..count..)/sum(..count..)),bins=10, position = "dodge",color="black" ) +
137 | scale_y_continuous(labels = percent) +
138 | xlab("Age")+ ylab("Percent")
139 | p2<- ggplot(data=datafile, aes(.data[[ageColumnHeader]], fill=factor(.data[[sexColumnHeader]]))) +
140 | geom_histogram(aes(y = (..count..)/sum(..count..)),bins=10, position = "dodge",color="black" ) +
141 | scale_y_continuous(labels = percent) +
142 | xlab("Age")+ ylab("Percent")
143 | ggarrange(p1,p2, labels=c("Age by diagnosis", "Age by sex"),ncol=2,nrow=1,common.legend = FALSE, legend = "right")
144 | 
145 | # Box plot for multiple sites
146 | if (length(datafile[,which(columnnames=="site")]) != 0 ) {
147 | bp<-ggplot(datafile, aes(x=site, y=.data[[ageColumnHeader]], fill=site)) +
148 | geom_boxplot()
149 | figure<-bp
150
| print(figure) 151 | for (s in unique(datafile$site)) { 152 | datasite = subset.data.frame(datafile,site==s) 153 | p1<- ggplot(data=datasite, aes(.data[[ageColumnHeader]], fill=factor(.data[[caseColumnHeader]]))) + 154 | geom_histogram(aes(y = (..count..)/sum(..count..)),bins=5, position = "dodge",color="black" ) + 155 | scale_y_continuous(labels = percent) + 156 | xlab("Age")+ ylab("Percent") 157 | p2<- ggplot(data=datasite, aes(.data[[ageColumnHeader]], fill=factor(.data[[sexColumnHeader]]))) + 158 | geom_histogram(aes(y = (..count..)/sum(..count..)),bins=5, position = "dodge",color="black" ) + 159 | scale_y_continuous(labels = percent) + 160 | xlab("Age")+ ylab("Percent") 161 | figure<-ggarrange(p1,p2, labels=c(paste(s,"age X diagnosis"), paste(s,"age X sex")),ncol=2,nrow=1,common.legend = FALSE, legend = "right") 162 | print(figure) 163 | } 164 | } 165 | 166 | 167 | 168 | ``` 169 | 170 | -------------------------------------------------------------------------------- /Exercise/bidsmri2nidm.txt: -------------------------------------------------------------------------------- 1 | # The following is an example from a bidsmri2nidm run 2 | % bidsmri2nidm -d $PWD/rawdata -o $PWD/rawdata/my_nidm.ttl 3 | [2022-05-22 09:56:44,626] - WARNING - ontquery - interlex.py:58 - You have not set an API key for the SciCrunch API! InterLexRemote will error if you try to use it. 4 | 5 | You will now be asked a series of questions to annotate your term: age 6 | Please enter a full name to associate with the term [age]: 7 | Please enter a definition for this term: Age at scan 8 | Please enter the value type for this term from the following list: 9 | 1: string - The string datatype represents character strings 10 | 2: categorical - A variable that can take on one of a limited number of possible values, assigning each to a nominal category on the basis of some qualitative property. 
11 | 3: boolean - Binary-valued logic:{true,false} 12 | 4: integer - Integer is a number that can be written without a fractional component 13 | 5: float - Float consists of the values m × 2^e, where m is an integer whose absolute value is less than 2^24, and e is an integer between -149 and 104, inclusive 14 | 6: double - Double consists of the values m × 2^e, where m is an integer whose absolute value is less than 2^53, and e is an integer between -1075 and 970, inclusive 15 | 7: duration - Duration represents a duration of time 16 | 8: dateTime - Values with integer-valued year, month, day, hour and minute properties, a decimal-valued second property, and a boolean timezoned property. 17 | 9: time - Time represents an instant of time that recurs every day 18 | 10: date - Date consists of top-open intervals of exactly one day in length on the timelines of dateTime, beginning on the beginning moment of each day (in each timezone) 19 | 11: anyURI - anyURI represents a Uniform Resource Identifier Reference (URI). An anyURI value can be absolute or relative, and may have an optional fragment identifier 20 | Please enter the datatype [1:11]: 4 21 | Please enter the minimum value [NA]: 22 | Please enter the maximum value [NA]: 23 | Please enter the units [NA]: years 24 | 25 | ************************************************************************************* 26 | Stored mapping: age -> 27 | label: age 28 | source variable: age 29 | description: Age at scan 30 | valueType: http://www.w3.org/2001/XMLSchema#integer 31 | hasUnit: years 32 | minimumValue: 33 | maximumValue: 34 | --------------------------------------------------------------------------------------- 35 | 36 | Concept Association 37 | Query String: age 38 | 39 | NIDM-Terms Concepts: 40 | 1: Label: Date Definition: Date value for this name-value Item. URL: http://uri.interlex.org/ilx_0383109 41 | 2: Label: Age Definition: A time quality inhering in a bearer by virtue of how long it has existed. 
URL: http://uri.interlex.org/ilx_0100400
42 | 3: Label: language Definition: The mental ability to encode and decode information, and translate this information into verbal, acoustic and visual representations, according to a set of rules that are common across a population. URL: http://cognitiveatlas.org/concept/json/trm_4a3fd79d0a769/
43 | 
44 | 4: Broaden Search (includes interlex, cogatlas, and nidm ontology)
45 | 5: Change query string from: "age"
46 | 6: No concept needed for this variable
47 | ---------------------------------------------------------------------------------------
48 | Please select an option (1:6) from above: 2
49 | 
50 | Concept annotation added for source variable: age
51 | WARNING: WIP: Data element not submitted to InterLex.
52 | 
53 | You will now be asked a series of questions to annotate your term: sex
54 | Please enter a full name to associate with the term [sex]: 
55 | Please enter a definition for this term: Sex of participant assigned at birth
56 | Please enter the value type for this term from the following list:
57 | 1: string - The string datatype represents character strings
58 | 2: categorical - A variable that can take on one of a limited number of possible values, assigning each to a nominal category on the basis of some qualitative property.
59 | 3: boolean - Binary-valued logic:{true,false}
60 | 4: integer - Integer is a number that can be written without a fractional component
61 | 5: float - Float consists of the values m × 2^e, where m is an integer whose absolute value is less than 2^24, and e is an integer between -149 and 104, inclusive
62 | 6: double - Double consists of the values m × 2^e, where m is an integer whose absolute value is less than 2^53, and e is an integer between -1075 and 970, inclusive
63 | 7: duration - Duration represents a duration of time
64 | 8: dateTime - Values with integer-valued year, month, day, hour and minute properties, a decimal-valued second property, and a boolean timezoned property.
65 | 9: time - Time represents an instant of time that recurs every day
66 | 10: date - Date consists of top-open intervals of exactly one day in length on the timelines of dateTime, beginning on the beginning moment of each day (in each timezone)
67 | 11: anyURI - anyURI represents a Uniform Resource Identifier Reference (URI). An anyURI value can be absolute or relative, and may have an optional fragment identifier
68 | Please enter the datatype [1:11]: 1
69 | Please enter the minimum value [NA]: 
70 | Please enter the maximum value [NA]: 
71 | Please enter the units [NA]: 
72 | 
73 | *************************************************************************************
74 | Stored mapping: sex ->
75 | label: sex
76 | source variable: sex
77 | description: Sex of participant assigned at birth
78 | valueType: http://www.w3.org/2001/XMLSchema#string
79 | hasUnit: 
80 | minimumValue: 
81 | maximumValue: 
82 | ---------------------------------------------------------------------------------------
83 | 
84 | Concept Association
85 | Query String: sex 
86 | 
87 | NIDM-Terms Concepts:
88 | 1: Label: SEX Definition: gender URL: http://uri.interlex.org/ilx_0738439
89 | 
90 | 2: Broaden Search (includes interlex, cogatlas, and nidm ontology)
91 | 3: Change query string from: "sex"
92 | 4: No concept needed for this variable
93 | ---------------------------------------------------------------------------------------
94 | Please select an option (1:4) from above: 1
95 | 
96 | Concept annotation added for source variable: sex
97 | WARNING: WIP: Data element not submitted to InterLex.
98 | 99 | You will now be asked a series of questions to annotate your term: diagnosis 100 | Please enter a full name to associate with the term [diagnosis]: 101 | Please enter a definition for this term: Diagnosis of participant 102 | Please enter the value type for this term from the following list: 103 | 1: string - The string datatype represents character strings 104 | 2: categorical - A variable that can take on one of a limited number of possible values, assigning each to a nominal category on the basis of some qualitative property. 105 | 3: boolean - Binary-valued logic:{true,false} 106 | 4: integer - Integer is a number that can be written without a fractional component 107 | 5: float - Float consists of the values m × 2^e, where m is an integer whose absolute value is less than 2^24, and e is an integer between -149 and 104, inclusive 108 | 6: double - Double consists of the values m × 2^e, where m is an integer whose absolute value is less than 2^53, and e is an integer between -1075 and 970, inclusive 109 | 7: duration - Duration represents a duration of time 110 | 8: dateTime - Values with integer-valued year, month, day, hour and minute properties, a decimal-valued second property, and a boolean timezoned property. 111 | 9: time - Time represents an instant of time that recurs every day 112 | 10: date - Date consists of top-open intervals of exactly one day in length on the timelines of dateTime, beginning on the beginning moment of each day (in each timezone) 113 | 11: anyURI - anyURI represents a Uniform Resource Identifier Reference (URI). 
An anyURI value can be absolute or relative, and may have an optional fragment identifier 114 | Please enter the datatype [1:11]: 1 115 | Please enter the minimum value [NA]: 116 | Please enter the maximum value [NA]: 117 | Please enter the units [NA]: 118 | 119 | ************************************************************************************* 120 | Stored mapping: diagnosis -> 121 | label: diagnosis 122 | source variable: diagnosis 123 | description: Diagnosis of participant 124 | valueType: http://www.w3.org/2001/XMLSchema#string 125 | hasUnit: 126 | minimumValue: 127 | maximumValue: 128 | --------------------------------------------------------------------------------------- 129 | 130 | Concept Association 131 | Query String: diagnosis 132 | 133 | NIDM-Terms Concepts: 134 | 1: Label: Diagnosis, Psychiatric Definition: Psychiatric-based diagnosis URL: http://uri.interlex.org/ilx_0497879 135 | 2: Label: Diagnosis Definition: The representation of a conclusion of a diagnostic process. URL: http://uri.interlex.org/ilx_0778153 136 | 3: Label: mania (Diagnosis) Definition: current diagnosis of mania. URL: http://uri.interlex.org/ilx_0148913 137 | 138 | 4: Broaden Search (includes interlex, cogatlas, and nidm ontology) 139 | 5: Change query string from: "diagnosis" 140 | 6: No concept needed for this variable 141 | --------------------------------------------------------------------------------------- 142 | Please select an option (1:6) from above: 1 143 | 144 | Concept annotation added for source variable: diagnosis 145 | WARNING: WIP: Data element not submitted to InterLex. 
146 | WARNING:root:Cannot find T1w.json file...looking for session-specific one
147 | WARNING:root:Cannot find session-specific T1w.json file which is required in the BIDS spec..continuing anyway
148 | WARNING:root:Cannot find T1w.json file...looking for session-specific one
149 | WARNING:root:Cannot find session-specific T1w.json file which is required in the BIDS spec..continuing anyway
150 | WARNING:root:Cannot find T1w.json file...looking for session-specific one
151 | WARNING:root:Cannot find session-specific T1w.json file which is required in the BIDS spec..continuing anyway
152 | WARNING:root:Cannot find T1w.json file...looking for session-specific one
153 | WARNING:root:Cannot find session-specific T1w.json file which is required in the BIDS spec..continuing anyway
154 | 
-------------------------------------------------------------------------------- /course/overview.md: --------------------------------------------------------------------------------
1 | # course overview
2 | 
3 | As mentioned on the [Welcome page](https://repronim.github.io/OHBMEducation-2022/index.html), this course will focus on `re-executable publications` and cover important aspects, ranging from an introduction to the problem and the motivation to adapt neuroimaging research workflows accordingly, to the use of basic and advanced tools/resources. In general, the idea is to split the course into two parts: first, the basics of how to write a `re-executable publication` and, second, first hands-on experience based on examples. Regarding the first, we will try to provide attendees with a brief overview of central aspects as well as important issues; concerning the second, we will work through an example workflow from beginning to end together, going through and practicing the use of the respective software, tools and resources. We will further explain both aspects, as well as the `setup`, etc. below. 
For a precise outline of this course, please consult the respective [page](https://repronim.github.io/OHBMEducation-2022/outline.html).
 4 | 
 5 | 
 6 | ## The framework and setup
 7 | 
 8 | All course materials will be provided within the [Jupyter Book](https://jupyterbook.org/intro.html) format you're currently looking at, free for everyone to check out, try and utilize further. The course itself will use a mixture of slides, code and other media within presentations, practical hands-on sessions and discussion rounds to enable a holistic introduction paired with firsthand experience. Depending on a given participant's computational resources and infrastructure, we provide multiple ways to participate in the course, as outlined in the [Setup for the course](https://repronim.github.io/OHBMEducation-2022/setup.html) section.
 9 | 
10 | ## Instructors
11 | 
12 | To provide a holistic introduction to the topic of `re-executable publications` and its subcomponents, we assembled a stellar team of instructors comprising highly experienced experts from both the `ReproNim` team and colleagues & collaborators. You can find them below and get further information by clicking on the respective names.
13 | 
- Julie Bates
- Dorota Jarecka (she/her)
- David Kennedy (he/him)
- Jean-Baptiste Poline (he/him)
- Neda Jahanshad
- Stephan Heunis
- Peer Herholz (he/him)
60 | 
61 | 
62 | ```{admonition} How to address one another?
63 | :class: dropdown
64 | When contacting us, please refrain from using titles and overly formal language when addressing us; using our first names is fine, and it's way more important that the content is respectful, fair and constructive (we aim for the same when we reply). However, please let us know if you have a preferred way of interacting with other folks, including how you would like to be addressed, your pronouns and the level of formality.
65 | ```
66 | 
67 | ### Gimme the details
68 | 
69 | Below you will find important details regarding the course summarized in a compact form. Please consult and familiarize yourself with the information presented there prior to the course.
70 | 
71 | #### When and where
72 | 
73 | The course will take place via two distinct options: in-person, 19/06/2022, 1-5 PM BST & virtually. The respective links will be provided for registered participants shortly before the course.
74 | 
75 | #### Can I use my calculator? 🖥️
76 | 
77 | Short answer: no. The workflows, tools and resources we are going to explore in this course require a certain setup and thus computational infrastructure. In order to keep things simple, we will provide all attendees with access to [cloud computing resources]() that fulfill all requirements. However, you can also use your own machine, but it should be able to run `bash`, `git`, `docker`/`singularity` and `python` among other things. Furthermore, it will need to be running a standard operating system like `Windows`, `Mac OS X`, or `Linux`. Unfortunately, tablets running mobile operating systems (`iOS`, `Android`) probably won't work for this purpose. If this is an issue for you, please get in touch with the instructors as soon as possible so that we can try to figure out a solution. Regarding `software` and `installation` thereof, please check the next section.
78 | 
79 | 
80 | 
81 | #### How do I get all the software locally and do I have to apply for a loan to get it? 🖥️
82 | 
83 | If you decide to use your own machine: Don't worry at all. First, in order to help you get all the software required for the course, [comprehensive installation instructions](http://www.repronim.org/OHBMEducation-2022/setup.html) were compiled. In a step-by-step manner they guide you through the installation process, covering several `OS`: `windows`, `macos` and `linux`. Second, everything will be completely free of charge as we will only use publicly available [open-source software](https://en.wikipedia.org/wiki/Open-source_software). Why? Because teaching students via [proprietary software](https://en.wikipedia.org/wiki/Proprietary_software) is just not fair and won't help anyone: students have to obtain licenses or use those from the university (which usually doesn't have enough for everyone), leading to tremendous problems regarding inequity now and in the future. Additionally, [open-source software](https://en.wikipedia.org/wiki/Open-source_software) can do everything that [proprietary software](https://en.wikipedia.org/wiki/Proprietary_software) can, if not more, and is furthermore usually better supported, tested and documented, creating a fantastic sense of community.
84 | 
85 | 
86 | 
87 | #### Where is everything?
88 | 
89 | All course materials (lecture slides, recordings, lecture demo notebooks, etc.) will be available on the [course website](http://www.repronim.org/OHBMEducation-2022/index.html), i.e. the one you're looking at right now. Everything will be completely open and free to use, thus constituting an [open educational resource](https://en.wikipedia.org/wiki/Open_educational_resources) you are free to explore, enhance and share. Accordingly, this website and all materials will also remain up for the entire duration of the course and beyond, ideally until the end of the internet. The usage of this resource and the materials therein will be explained at the beginning and throughout the course.
90 | 
91 | 
92 | 93 | #### Syllabus and Text 94 | 95 | As noted above, this page serves as the syllabus for this course, with the precise outline indicated in the [respective section](http://www.repronim.org/OHBMEducation-2022/outline.html). As usual, the syllabus is subject to change. 96 | 97 | #### Code of conduct 98 | 99 | This course has a `Code of conduct`. Please inform yourself about the specifics by carefully reading through the [respective section](http://www.repronim.org/OHBMEducation-2022/CoC.html). 100 | 101 |
102 | 
103 | 
104 | ### How to Get Your Question(s) answered and/or provide feedback 🙋🏼‍♀️ ⁉️
105 | 
106 | It’s great that we have so many ways to communicate, but it can get tricky to figure out who to contact, where your question belongs or when to expect a response. These guidelines are here to help you get your question answered as quickly as possible and to ensure that we’re able to get to everyone’s questions.
107 | 
108 | That said, to ensure that we’re respecting everyone's time, we will mainly answer pre-/post-course questions during normal working hours (M-F 9AM-5PM). The instructors are also going to do their best to stick to these working hours when they want to share information. However, they know that’s not necessarily when you may be doing your work. So, please feel free to post messages whenever is best for you while knowing that if you post late at night or on a weekend, you may not get a response until the next weekday. As such, do your best not to wait until the last minute to ask a question.
109 | 
110 | If you have:
111 | 
112 | - questions about course content - these are awesome! We want everyone to see them and have their questions answered too, so either use the [hypothes.is](https://web.hypothes.is/) plugin, `e-mail` or the [GitHub repository](https://github.com/repronim/OHBMEducation-2022/issues).
113 | 
114 | - a technical assignment question - come to us. Be as specific as you can in the question you ask. And, for those answering, help your fellow participants as much as you can without just giving the answer. Help guide them, point them in a direction and provide pseudo code.
115 | 
116 | - been stuck on something for a while (>30min) and aren’t even really sure where to start - Computational work can be frustrating and it may not always be obvious what is going wrong or why something isn’t working. That’s OK - we’ve all been there! If you are stuck, you can and should reach out for help, even if you aren’t exactly sure what your specific question is.
To determine when to reach out, consider the 2-hour rule. This rule states that if you are stuck, work on that problem for an hour. Then, take a 30 minute break and do something else. When you come back after your break, try for another 30 minutes or so to solve your problem. If you are still completely stuck, stop and contact us (office hours). If you don’t have a specific question, include the information you have (what you’re stuck on, the code you’ve been trying that hasn’t been working, and/or the error messages you’ve been getting).
117 | 
118 | - questions about course logistics - first, check the [overview](http://www.repronim.org/OHBMEducation-2022/overview.html) & [syllabus](http://www.repronim.org/OHBMEducation-2022/outline.html). If you can’t find the answer there, ask a fellow participant or instructor.
119 | 
120 | - something super cool to share related to the course or want to talk about a topic in further depth - feel free to share or contact the instructors.
121 | 
122 | - some feedback about the course you want to share anonymously - If you’ve been offended by an example in the course, really liked or disliked a lesson, or wish there were something covered in the course that wasn’t but would rather not share this publicly, etc., please fill out the anonymous [Google Form]()*
123 | 
124 | *This form can be taken down at any time if it’s not being used for its intended purpose; however, you all will be notified should that happen.
125 | 
126 | ### Acknowledgements
127 | 
128 | Several parts of this section are directly taken or adapted from [Alexander Huth's Neuro Data Analysis in Python syllabus](https://github.com/alexhuth/ndap-fa2020) licensed under a [BSD-3-Clause License](https://github.com/alexhuth/ndap-fa2020/blob/master/LICENSE) and [Shannon Ellis' COGS 18: Introduction to Python](https://cogs18.github.io/assets/intro/syllabus.html).
-------------------------------------------------------------------------------- /stats/mass_uv_regr_loop_OHBM_simplified.Rmd: --------------------------------------------------------------------------------
 1 | 
 2 | # OHBM Educational Course 2022 - Re-executing a publication
 3 | 
 4 | In this R notebook, you will be able to calculate case/control effect sizes
 5 | and evaluate descriptive statistics for your cohort along the way.
 6 | 
 7 | First, you need to make sure to set all the appropriate parameters.
 8 | 
 9 | We will focus on a fixed regression here, but you will have some flexibility
10 | with this script to test out other models.
11 | 
12 | For this educational course we will run the following models:
13 | 
14 | + your sample of 5 cases vs 5 controls from the ANT cohort (ds001907; Day et al 2022)
15 | + the full OpenNeuro ANT cohort (ds001907; Day et al 2022)
16 | + analysis of 3 other open cohorts used in the ENIGMA-PD publication (Laansma et al 2021)
17 | + analysis of your ANT cohort merged with the above for a total of 4 cohorts
18 | 
19 | 
20 | ### Read in the data and set the output folder and file names!
21 | ```{r}
22 | #datafile <- read.csv("all4sites_data_OHBM.csv"); #Read in the file
23 | #datafile <- read.csv("metr_SubCortical_OHBM.csv"); #Read in the file
24 | datafile <- read.csv("OpenNeuro_results.csv"); #Read in the file
25 | outFolder="./OpenNeuro_allROIs_woICV/" ; dir.create(outFolder, showWarnings = FALSE);
26 | cohortName="MyOHBMcohort-OpenNeuroFull-woICV"
27 | eName="ENIGMA-PD"
28 | ```
29 | ```{r}
30 | # Double check the column names of the file you read in:
31 | columnnames = colnames(datafile);
32 | colnames(datafile)
33 | ```
34 | 
35 | Do you need to calculate ICV?
36 | ```{r}
37 | if (length(datafile[,which(columnnames=="ICV")]) == 0
38 | && length(datafile[,which(columnnames=="csf..mm.3.")]) > 0 ) {
39 | print("It looks like you are working with the OHBM results and ICV is not included.
We will add volumes of CSF, gray and white matter for you here -- this will serve as a proxy for ICV.") 40 | ICV=datafile$csf..mm.3. + datafile$white..mm.3. + datafile$gray..mm.3. 41 | ncol=dim(datafile)[2] ; 42 | datafile[,ncol+1]=ICV ; colnames(datafile) <- c(columnnames,"ICV"); columnnames = colnames(datafile) } 43 | ``` 44 | 45 | ### This code is exclusively for case/control effect sizes as described by the Cohen's D statistic. It is important to ensure the code uses the proper column and indicator for cases vs controls: 46 | ```{r} 47 | # what is the column header for case/control or indicating diagnostic status? 48 | caseColumnHeader= "diagnosis" #"Dx" 49 | # how are cases coded (1,0,"patients","cases")? 50 | caseIndicator=1 # in this code those not coded as cases are assumed to be controls 51 | ``` 52 | 53 | ### We will be including age and sex in all models here. However, you can turn this feature off. 54 | ```{r} 55 | # what is the column header for age? 56 | ageColumnHeader="age" # "Age" 57 | # what is the column header for sex? 58 | sexColumnHeader="sex" # "Sex" 59 | # how are males coded (1,0,"M","males",-0.5)? 60 | maleIndicator=1 #sex is included in the regression as a categorical variable, or factor 61 | # would you like to include age, sex, and an age-x-sex interaction term in the regression? 62 | includeAgeSexinModel=TRUE 63 | # includeAgeSexInteractioninModel == TRUE # this option has been removed but you can add it back in (experts only -- be careful!) 64 | ``` 65 | 66 | ### Do you have other covariates to include? We will ask about continuous and categorical variables separately 67 | #### First, continuous: 68 | ```{r} 69 | # how many continuous covariates other than age should be used from your spreadsheet? 
70 | otherCovariatesN=0 71 | # list out the covariates and separate them with a semi-colon 72 | otherCovariates="ICV" # this list is ignored unless otherCovariatesN > 0 73 | #note if your ROI is ICV, you will not want ICV as a covariate, so you'd have to run through the script again for the separate model 74 | ``` 75 | #### Next, categorical: 76 | ```{r} 77 | # how many categorical covariates are in your spreadsheet other than sex? 78 | ## (examples might include site / scanner / study / different diagnostic groups) 79 | otherCatCovariatesN=0 80 | # list out the categorical covariates (coded as factors) and separate them with a semi-colon 81 | otherCatCovariates="site" # this list is ignored unless otherCatCovariatesN > 0 82 | ``` 83 | 84 | ### What ROIs will you be looping over with the same statistical model? 85 | ```{r} 86 | ROINames<-c("Left-Accumbens-area (mm^3)","Left-Amygdala (mm^3)","Left-Caudate (mm^3)","Left-Hippocampus (mm^3)","Left-Pallidum (mm^3)","Left-Putamen (mm^3)","Left-Thalamus-Proper (mm^3)","Right-Accumbens-area (mm^3)","Right-Amygdala (mm^3)","Right-Caudate (mm^3)","Right-Hippocampus (mm^3)","Right-Pallidum (mm^3)","Right-Putamen (mm^3)","Right-Thalamus-Proper (mm^3)"); 87 | 88 | #ROINames<-c("Lthal","Rthal","Lcaud","Rcaud","Lput","Rput","Lpal","Rpal","Lhippo","Rhippo","Lamyg","Ramyg","Laccumb","Raccumb") 89 | 90 | # ICV would require a different model since we will covary for ICV for the subcortical regions, so we'd run that separately 91 | #ROINames<-c("ICV") 92 | ROINames<-make.names(ROINames) 93 | #Get number of structures to test 94 | Nrois=length(ROINames) 95 | print(paste("We will be looping Cohen's D statistics over data from",Nrois,"regions of interest.")) 96 | ``` 97 | 98 | ### We will source some functions that we will use for Cohen's D calculations from a linear regression output. Feel free to take a look at these functions to see what they do. 99 | ```{r} 100 | # If this file is in the same folder as your current working directory, there is no need to change this input; otherwise, make sure you source the file with the appropriate path.
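# For orientation only, a sketch of what the sourced functions are expected to compute.
# This is an assumption based on standard effect-size formulas -- the authoritative
# definitions live in ENIGMA_cohensD_functions.R, sourced next, which override anything here:
#   partial.d : converts a regression t-statistic to Cohen's d, typically
#                 d = t * (n1 + n2) / (sqrt(n1 * n2) * sqrt(df))
#   se.d2     : an approximate standard error of d, commonly
#                 se = sqrt((n1 + n2) / (n1 * n2) + d^2 / (2 * (n1 + n2)))
#   CI_95     : the normal-approximation 95% confidence interval, d +/- 1.96 * se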
101 | source('ENIGMA_cohensD_functions.R') 102 | ``` 103 | 104 | # Here we go! 105 | 106 | ## Sanity checks: 107 | 108 | ```{r} 109 | # Check for duplicated SubjIDs that may cause issues with merging data sets. 110 | if(anyDuplicated(datafile[,1]) != 0) { stop('You have duplicate SubjIDs in your file or subject ID is not your first column.\nMake sure there are no repeat SubjIDs.') } 111 | 112 | # Check all ROIs are column headers in the input CSV 113 | missingROIS=NULL; l_missing=0; 114 | for (roi in 1:Nrois) { 115 | if (length(datafile[,which(columnnames==ROINames[roi])]) == 0) { 116 | print(ROINames[roi]) 117 | missingROIS=paste(missingROIS,ROINames[roi]) 118 | l_missing=l_missing+1; } 119 | } 120 | if (l_missing >= 1) { warning(paste("You are missing these ROIs in your file:",missingROIS,"\nPlease check."))} 121 | ``` 122 | ################################################################### 123 | 124 | 125 | 126 | 127 | 128 | 129 | 130 | 131 | 132 | 133 | 134 | 135 | 136 | 137 | 138 | 139 | 140 | 141 | 142 | 143 | 144 | 145 | 146 | 147 | 148 | 149 | 150 | 151 | 152 | 153 | ################################################################### 154 | 155 | Here we initialize the variables that we will fill in, save, and print later 156 | ```{r} 157 | #Store models for troubleshooting 158 | models.all=NULL; # This will become a list where we store all of the models made by lm 159 | 160 | #allocate an empty matrix to store adjusted effect sizes, se, ci, counts and p-values 161 | stats.d=matrix(NA,7,Nrois); 162 | 163 | featureNames=c("d","se","low.ci","up.ci","n.controls","n.cases","p-value") 164 | colnames(stats.d) <- ROINames; rownames(stats.d) <- featureNames; 165 | ``` 166 | 167 | Let's check that the user inputs are consistent 168 | ```{r} 169 | # check that the number indicated matches the parsed text 170 | covariates=parse(text=otherCovariates); 171 | Ncov_parsed=length(covariates) 172 | covariatesF=parse(text=otherCatCovariates) 173 | Ncatcov_parsed=length(covariatesF) 174 | ``` 175 | 176 | ```{r} 177 | if
(otherCovariatesN > 0) { 178 | if (Ncov_parsed != otherCovariatesN) { 179 | stop(paste("You have indicated a different number of covariates than you have listed.\nPlease check listed variables are separated by a semicolon (;)",sep="")) } } 180 | ``` 181 | 182 | ```{r} 183 | if (otherCatCovariatesN > 0) { 184 | if (Ncatcov_parsed != otherCatCovariatesN) { 185 | stop(paste("You have indicated a different number of categorical covariates than you have listed.\nPlease check listed variables are separated by a semicolon (;)",sep="")) } } 186 | ``` 187 | 188 | ```{r} 189 | #Loop through and perform each regression -- add covariates 190 | for (roi in 1:Nrois) { 191 | print(ROINames[roi]) 192 | z=which(columnnames==ROINames[roi]) 193 | rows2use=!is.na(datafile[,z]) 194 | trait=datafile[rows2use,z] 195 | ## uncomment the following line to see how many samples are left in the ROI (after labeling NA for bad segmentations) 196 | #print(length(trait)) 197 | if(length(trait)==0){ 198 | next;} # Skip the whole structure if there are no observations 199 | 200 | if (includeAgeSexinModel == TRUE ) { 201 | # if (includeAgeSexInteractioninModel == TRUE ) { 202 | # caseControl_covariatesforRFX = data.frame(NA,nrow=length(trait),ncol=otherCovariatesN+otherCatCovariatesN+4); 203 | # ageXsex=(datafile[rows2use,sexColumnHeader] - mean(datafile[rows2use,sexColumnHeader])) *datafile[rows2use,ageColumnHeader] 204 | # caseControl_covariatesforRFX = datafile[rows2use,c(caseColumnHeader,ageColumnHeader,sexColumnHeader)] 205 | # origcolnames=colnames(caseControl_covariatesforRFX) 206 | # caseControl_covariatesforRFX[,4]=ageXsex 207 | # colnames(caseControl_covariatesforRFX)<-c(origcolnames,"ageXsex") 208 | # origcolnames=colnames(caseControl_covariatesforRFX) 209 | # string2run=paste(caseColumnHeader," + ",ageColumnHeader," + factor(",sexColumnHeader,") + ageXsex",sep="") } 210 | caseControl_covariatesforRFX = data.frame(NA,nrow=length(trait),ncol=otherCovariatesN+otherCatCovariatesN+3); 211 | 
caseControl_covariatesforRFX = datafile[rows2use,c(caseColumnHeader,ageColumnHeader,sexColumnHeader)] 212 | origcolnames=colnames(caseControl_covariatesforRFX) 213 | string2run=paste(caseColumnHeader," + ",ageColumnHeader," + factor(",sexColumnHeader,")",sep="") } 214 | 215 | else { 216 | caseControl_covariatesforRFX = data.frame(NA,nrow=length(trait),ncol=otherCovariatesN+otherCatCovariatesN+1); 217 | caseControl_covariatesforRFX = datafile[rows2use,c(caseColumnHeader)] 218 | origcolnames=colnames(caseControl_covariatesforRFX) 219 | string2run=paste(caseColumnHeader) } 220 | 221 | if (otherCovariatesN > 0) { 222 | for (covariate in 1:otherCovariatesN) { 223 | covName=as.character(covariates[covariate]) 224 | caseControl_covariatesforRFX[,length(caseControl_covariatesforRFX)+1] = datafile[rows2use,covName]; 225 | colnames(caseControl_covariatesforRFX) = c(origcolnames,covName); 226 | origcolnames=colnames(caseControl_covariatesforRFX) 227 | string2run=paste(string2run,"+",covName,sep=" ") } } 228 | 229 | if (otherCatCovariatesN > 0) { 230 | for (covariate in 1:otherCatCovariatesN) { 231 | covName=as.character(covariatesF[covariate]) 232 | caseControl_covariatesforRFX[,length(caseControl_covariatesforRFX)+1] = datafile[rows2use,covName]; 233 | colnames(caseControl_covariatesforRFX) = c(origcolnames,covName); 234 | origcolnames=colnames(caseControl_covariatesforRFX) 235 | string2run=paste(string2run," + factor(",covName,") ",sep="") } } 236 | 237 | ## find and get rid of all rows with NAs in them 238 | iNA=which(apply(caseControl_covariatesforRFX,1,function(x)any(is.na(x)))); 239 | if (length(iNA) != 0 ) { caseControl_covariatesforRFX<-caseControl_covariatesforRFX[-which(apply(caseControl_covariatesforRFX,1,function(x)any(is.na(x)))),] } 240 | 241 | #Run the model 242 | #attach main file 243 | attach(caseControl_covariatesforRFX,warn.conflicts = FALSE) 244 | eval(parse(text=paste("tmp=lm(trait ~ ", string2run, ")", sep=''))) 245 | models.all[[roi]]=tmp #Store the 
model fit for future reference 246 | 247 | #subjects can be dropped if they are missing so we can get the precise number of controls/patients for each region tested 248 | stats.d[5,roi] = length(which(tmp$model[,2] != caseIndicator)) 249 | stats.d[6,roi] = length(which(tmp$model[,2] == caseIndicator)) 250 | 251 | #Convert the lm model to a summary format so we can extract statistics 252 | tmp=summary(tmp) 253 | ## uncomment the following line to see the full summary outputs of the models run 254 | #print(tmp) 255 | tstat=tmp$coefficients[2,3] # Get t-statistic from regression to convert to Cohens d 256 | pval=tmp$coefficients[2,4] # Get p-value from regression 257 | tstat.df=tmp$df[2] 258 | stats.d[7,roi] = pval 259 | 260 | #collect effect size data 261 | stats.d[1,roi]=partial.d(tstat,tstat.df, stats.d[5,roi], stats.d[6,roi]) 262 | stats.d[2,roi]=se.d2(stats.d[1,roi],stats.d[5,roi], stats.d[6,roi]) 263 | bound.cort=CI_95(stats.d[1,roi],stats.d[2,roi]) 264 | stats.d[3,roi]=bound.cort[1] 265 | stats.d[4,roi]=bound.cort[2] 266 | rm(caseControl_covariatesforRFX) 267 | } 268 | 269 | ``` 270 | ```{r} 271 | #save(stats.d, file=paste(outFolder,"/",eName, "_",cohortName,"_EffectSizes.Rdata", sep="")) 272 | save(models.all, file=paste(outFolder,"/",eName, "_",cohortName,"_models.rda",sep="")) 273 | write.csv(stats.d,paste(outFolder,"/",eName, "_",cohortName,"_EffectSizes.csv",sep="")) 274 | stats.d 275 | 276 | ``` 277 | -------------------------------------------------------------------------------- /stats/metr_SubCortical_OHBM.csv: -------------------------------------------------------------------------------- 1 | SubjID,Dx,Sex,Age,site,cohort,DURILL,AO,HY1234,Hoehn_Yahr_stage,ICV,LLatVent,RLatVent,Lthal,Rthal,Lcaud,Rcaud,Lput,Rput,Lpal,Rpal,Lhippo,Rhippo,Lamyg,Ramyg,Laccumb,Raccumb 2 | EPD_0_032014,0,1,53,Neurocon_1,Neurocon,NA,NA,0,NA,1837978.779,9050.4,8720.6,9097.1,7425.4,3488.4,3392.5,NA,4932.2,1188.4,1343.7,4960.9,5108.5,1480.2,1761,532.2,495 3 | 
EPD_0_032015,0,1,64,Neurocon_1,Neurocon,NA,NA,0,NA,1522398.796,18878.2,16179.6,5662.3,6024.6,3325,3603,4458.5,4230.2,868,1070.8,3053.9,3047.2,1285.8,1305.3,339.5,376.2 4 | EPD_0_032016,0,2,58,Neurocon_1,Neurocon,NA,NA,0,NA,1469688.415,10721.5,8714.2,6967.9,6675.2,3304.6,3528,NA,4004.7,NA,1149.8,4176.2,4340.2,1650.1,1620.7,367.8,328.4 5 | EPD_0_032017,0,2,77,Neurocon_1,Neurocon,NA,NA,0,NA,1420165.924,12266.1,10716.8,5872.1,6192.2,2737,2786.8,3965.4,3518.8,1522.5,1348.9,3272.2,3280.7,1308.1,1201.3,326.2,362.9 6 | EPD_0_032018,0,2,55,Neurocon_1,Neurocon,NA,NA,0,NA,1521936.585,17258.3,18006.5,NA,NA,3224.5,3260.4,4733.1,4259,1239.2,1324.7,3358.6,3576.5,1646.4,1519.1,485.2,364.2 7 | EPD_0_032019,0,2,56,Neurocon_1,Neurocon,NA,NA,0,NA,1478313.968,10741.7,11631.6,8387.5,7785.6,3283.5,3314.7,4610.2,4640.8,1348.3,1371,3982.1,4113.8,1334,1402.1,303.8,319.5 8 | EPD_0_032020,0,2,76,Neurocon_1,Neurocon,NA,NA,0,NA,1267057.355,5289.9,5239.7,6885,6037.6,2762.2,2895.7,NA,NA,NA,NA,3726.5,3733,1338.1,1415.6,290.7,300.9 9 | EPD_0_032021,0,2,56,Neurocon_1,Neurocon,NA,NA,0,NA,1558683.691,5929,7937,8013.5,7000,3249.8,3226,NA,NA,NA,NA,4536.9,4556,1254.8,1335.5,544.7,400 10 | EPD_0_032022,0,2,46,Neurocon_1,Neurocon,NA,NA,0,NA,1408076.473,4807.1,3501,7139.8,7201.6,2705.4,2728.7,4423.9,4383.5,1253.6,1422.6,3958.8,4198.5,1574.3,1717.4,482.9,395.9 11 | EPD_0_032023,0,2,82,Neurocon_1,Neurocon,NA,NA,0,NA,1381769.489,13527.6,15282.4,6051.9,5419.6,2663.7,2615.1,3768,3338.4,1328.4,1134.8,3661.9,3526.8,1092.8,1241.1,343.5,332.5 12 | EPD_0_032024,0,2,71,Neurocon_1,Neurocon,NA,NA,0,NA,1584213.063,27543,24076.2,6671.4,6379.1,3488.1,3200.7,4599.9,4111.4,1390.9,1276.4,3913.6,3737.2,1450.2,1536.4,372.3,316.7 13 | EPD_0_032025,0,1,74,Neurocon_1,Neurocon,NA,NA,0,NA,1482952.211,12699.6,14751.7,NA,6656.3,3522.7,3299.4,4036.8,4414.7,1038.7,1297.7,3610.7,3491.8,1317.4,1239.1,350.6,316.6 14 | 
EPD_0_032026,0,2,78,Neurocon_1,Neurocon,NA,NA,0,NA,1528881.186,31053.3,27368,6358,5929.9,3180.1,3213.4,3981.9,3736.5,1442.6,1356.7,NA,3365.8,1201.8,1340.3,330.9,270.7 15 | EPD_0_032028,0,2,76,Neurocon_1,Neurocon,NA,NA,0,NA,1472691.333,19409.4,22622.7,6669.2,6007.8,3341.3,3413.5,4160.8,4109.4,1350.3,1477.9,3377.3,3304.6,1107.9,1310.3,374,237.8 16 | EPD_0_032029,0,2,79,Neurocon_1,Neurocon,NA,NA,0,NA,1594083.416,26870.8,33685.5,6226.5,5980.6,3292.9,3342.1,4090.9,3458.8,1309.1,1235.4,3303.7,2939.9,1247,1176.9,266,272.4 17 | EPD_1_032030,1,2,76,Neurocon_1,Neurocon,NA,NA,2,2,1208589.99,9053.4,10473.9,5720.8,5044.7,2607,2620.2,2691,3028.9,1375.3,1368.7,2732.7,2913.7,930.4,1111.1,277.5,310.2 18 | EPD_1_032031,1,2,75,Neurocon_1,Neurocon,NA,NA,2,2,1526462.563,13828.5,14292.4,7836.2,NA,2663.8,3031.2,3878.1,4294.2,1573.7,1235.7,4107.6,3968.3,1232.6,1237.1,330,385.6 19 | EPD_1_032032,1,1,74,Neurocon_1,Neurocon,NA,NA,2,2,1737015.518,27438,25295.2,6632,6802.7,3410.2,3478.5,4119,3861,1366.1,1179.6,3478.3,3668.3,1212,1417.2,317.2,217.4 20 | EPD_1_032033,1,1,67,Neurocon_1,Neurocon,NA,NA,2,2,1519402.825,11953.2,9648.7,7248.4,6854.6,3135.3,3108.4,5118,4483.2,1603.1,1605.5,3684,4029.9,1400.5,1672.9,431.9,272.5 21 | EPD_1_032034,1,1,79,Neurocon_1,Neurocon,NA,NA,2,2,1552273.934,19634.4,16131.1,6400.1,6164.1,2894.9,2926.7,3783.6,3671.7,1322.7,1262,NA,3462.5,1238,1262,241.9,218.7 22 | EPD_1_032035,1,1,60,Neurocon_1,Neurocon,NA,NA,2,2,1661380.915,29399.8,20224.4,6978.8,6931.8,3440.8,3492.7,3924.1,3643.2,1367.3,1405.1,3342,3815.3,1327.8,1396.6,341.2,334.5 23 | EPD_1_032036,1,2,67,Neurocon_1,Neurocon,NA,NA,2,2,1258755.001,7611.5,6916.9,6040.6,6185.9,2567.5,2726.6,3720.7,3479.2,1314,1248.4,3521.9,3542.5,1213.5,1195.7,309.8,383.5 24 | EPD_1_032037,1,1,73,Neurocon_1,Neurocon,NA,NA,2,2,2170191.6,23813.5,22081.9,8167.8,8385.7,5008.1,4830.2,5563,5565.8,1755.3,1685.3,4646.4,4872.6,1803,2017.4,528.1,522.1 25 | 
EPD_1_032038,1,2,68,Neurocon_1,Neurocon,NA,NA,2,2,1444163.031,13530.3,11785.3,7791.6,6856.9,2481.5,2627,3370.4,3235,1244.1,1074.7,3700.7,3937.4,1260.6,1270.9,336.8,264.4 26 | EPD_1_032039,1,1,60,Neurocon_1,Neurocon,NA,NA,1,1,1780970.392,18495.9,11676.6,8033.6,7495,3355.9,3412.3,4572.6,4156.5,1727,1646.3,4306.9,4522.8,1409.6,1514.4,429,352 27 | EPD_1_032040,1,1,66,Neurocon_1,Neurocon,NA,NA,2,2,1938646.424,19429.6,23082.5,7930.6,8388,4218.1,4919.9,5671.6,5809,2001.8,1542.9,4148.6,4007.4,1699.6,1969.4,514,351.2 28 | EPD_1_032041,1,1,54,Neurocon_1,Neurocon,NA,NA,2,2,1514653.879,9995.2,7741.1,6491.9,7105.5,3167.2,3336.6,3422.3,3864.1,928,1094.3,4411,4562.2,1513.4,1537.9,376.1,309.1 29 | EPD_1_032042,1,2,74,Neurocon_1,Neurocon,NA,NA,2,2,1564840.712,22794.4,20891.1,5901.3,5790.7,3903.1,3729.1,4299.6,3894.4,1475.4,1399.8,3216.7,3402.4,1146.6,1154.5,321.3,248.8 30 | EPD_1_032043,1,2,64,Neurocon_1,Neurocon,NA,NA,2,2,1312780.31,9055.7,9592.9,6517.5,6230.8,4082.2,3760.8,4875.8,4706.7,1561.4,1433.7,3622.1,3638,1061.1,1079.6,309.1,397 31 | EPD_1_032044,1,1,84,Neurocon_1,Neurocon,NA,NA,2,2,1357163.732,19696.1,19655.5,5458.3,4972.4,3005.1,2943.1,3483.2,3343.4,1177.5,1020.6,2742.7,2873,966.5,1114.5,277.1,223.4 32 | EPD_1_032045,1,1,82,Neurocon_1,Neurocon,NA,NA,2,2,1578076.97,22523,22047.4,6716.8,6453.8,5036.3,4804.1,5159,4735.3,1447.3,1277.5,3107.6,3440,944.9,1174.9,242.1,208 33 | EPD_1_032046,1,1,61,Neurocon_1,Neurocon,NA,NA,2,2,1609068.498,11975.2,11537.9,7497.3,7042.8,3132.9,3309.1,3990,4020.3,1392.9,1355.6,4126.1,4432,1563.2,1368.4,504.6,287.7 34 | EPD_1_032047,1,1,45,Neurocon_1,Neurocon,NA,NA,1,1,1555404.625,2488.9,2419.6,7381.4,7142.4,3418.3,3328.1,4524.4,4491.1,1299.8,1106.9,4742.3,4618,1800.9,1812,562,486.8 35 | EPD_1_032048,1,1,54,Neurocon_1,Neurocon,NA,NA,2,1.5,1688699.98,9851.7,7566.8,7783.7,7689.2,3195.8,3058.3,5046,2957.2,1243.3,1355.5,4364.2,4475,1604.8,1519.1,456.5,406.3 36 | 
EPD_1_032049,1,2,74,Neurocon_1,Neurocon,NA,NA,2,2,1530455.055,14272.4,13457.9,6574.7,6254.1,2721.8,2875.1,4063,3621.4,1361.8,1272.1,3356.9,3242.9,1171.8,1072.4,316.1,251 37 | EPD_1_032050,1,1,86,Neurocon_1,Neurocon,NA,NA,2,2,1681670.548,21261.8,16858.2,7403.1,6751.4,4565.8,4277.6,4903.4,5058.6,1550.5,1480.5,3140.4,2799.9,1076.3,1049.3,267.7,334.4 38 | EPD_1_032051,1,1,84,Neurocon_1,Neurocon,NA,NA,2,2.5,1676259.169,21303.6,15465.2,6953.4,7310.8,4006.7,3721.4,4173.3,5205,1641.6,1581.3,3440.9,3021.2,1136,1432.4,396.2,258.1 39 | EPD_1_032052,1,1,63,Neurocon_1,Neurocon,NA,NA,2,2,1651600.267,10693.2,10294.7,8445.4,NA,3184.6,3323.4,4245.3,4182.7,1223.3,1386.9,4377,4247.8,1486.4,1456.6,419.4,380.7 40 | EPD_1_032053,1,1,51,Neurocon_1,Neurocon,NA,NA,2,1.5,1735240.259,7631.9,6863.1,9172.5,8601.7,4017.6,3835.6,5165.4,4608.1,1355.5,1488.2,5016.2,5345.8,1710.9,1750.2,369.5,379.2 41 | EPD_1_032054,1,2,66,Neurocon_1,Neurocon,NA,NA,2,2.5,1653376.364,5937.2,10426.3,7542.2,7348.4,3133.7,3330.4,4596.2,4478.9,1477,1541.5,4285.7,4336.4,1436.8,1391.1,405.2,357.2 42 | EPD_1_032055,1,2,74,Neurocon_1,Neurocon,NA,NA,2,2,1585578.151,8828.2,5870.2,7312.7,7343.6,2710.9,2585.3,3535.8,2746.2,1359.8,628.8,4010.8,4096.2,1466.6,1444,374,312.6 43 | EPD_1_032056,1,2,74,Neurocon_1,Neurocon,NA,NA,2,2,1754656.017,22118.5,17878,8237.1,6696.5,4081.3,3769.3,5736.2,5502.2,1556.6,1598.9,3732.2,4295,1209.1,1463,343,482 44 | EPD_001,0,2,62,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1620724.937,24457.8,19152.6,6422.1,6379.7,3137.8,3122.5,5173.2,4525.7,1923.9,2322.5,4030.5,4300.6,1819.9,1736.8,327.6,349.2 45 | EPD_002,0,2,58,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1298359.679,16945.3,16556.1,8367.4,6140.2,3271.6,3373,3890.1,3708.2,1322.9,1407.5,3341.7,3729.1,1462,1439.4,384,402.7 46 | EPD_003,0,1,62,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1605639.56,21546.5,20705.9,NA,6805.9,3258.2,3083,4465.6,4541.9,950.6,1268.7,3643.5,4202.1,816.9,1564.9,460.4,416.7 47 | 
EPD_004,0,2,58,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1505840.774,11874.7,10032.3,7917,5944.6,3396.3,3684.9,NA,5224.1,NA,1997.9,4161.1,4394,1610.2,1623.2,526.1,442.3 48 | EPD_005,0,1,65,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1794421.948,29856.5,23538.3,NA,NA,3406.8,3406,6376.1,4888.2,1754.7,2183.7,3941.8,4143.3,1818.9,1703,471.9,473.1 49 | EPD_006,0,2,66,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1749751.576,33915.9,24083.1,7742.1,7028.4,3105.2,3027.2,4938.8,4759.7,1125.2,1968.1,3285.2,3834.1,1498.2,1898.8,407.2,432.3 50 | EPD_007,0,2,68,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1621011.582,12962.3,8831.1,7331.3,7340.7,2997.8,2951.4,6307.8,5318.8,1887.1,1772.8,3722,4382.4,1728.9,1726.1,737.4,612.9 51 | EPD_008,0,2,60,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1480815.747,13592.3,10593.1,8309.2,7483.6,2871.1,2914.1,3752.2,3920.3,882,1512.6,3818.6,4368.7,1383.6,1537.7,429.1,339.6 52 | EPD_009,0,1,62,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1465221.972,23462.6,21137.3,5328.2,5451.8,3057.1,2859.5,4463.7,4133.9,1554.8,1487.6,3611.7,3769.9,1852.3,1784.3,190.8,272.7 53 | EPD_010,0,1,71,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1310621.335,8695.7,9207.3,6395.6,5852.4,3088.9,3197,4197.6,4269.3,1135.6,1536.8,3047.7,3112.9,1526.2,1776.8,443.5,410.7 54 | EPD_011,0,2,61,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1351287.794,22773.7,16856.6,NA,NA,2671.4,2848.1,4690.3,3932.9,1665.6,1846.6,3066.9,3351.1,1364.5,1222.8,325.7,381.9 55 | EPD_012,0,2,58,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1266821.229,9667,9035.3,6327.1,5758.1,3379,3388.7,4894.5,4532.4,1387.6,1257.3,3278.4,3368.9,1487.5,1572.2,314.3,406.7 56 | EPD_013,0,1,56,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1268008.249,12745.9,12368.6,7556.3,6219.2,3755.8,3492.4,4643.1,4689.9,1850.2,1773.6,3517.9,3486.7,1437.8,1157.6,175.8,323.8 57 | EPD_014,0,1,72,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1217391.629,17928.9,15235.9,6458.3,5873.5,2891.1,2857.5,4202,3913.6,1010.4,1262.5,3813.5,4001.8,1656.3,1473.6,364.9,411.4 58 | 
EPD_015,0,1,71,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1449977.253,40508.8,34076.6,7861.1,6457,3415,3667.7,4232.5,3747.9,999,1458.9,3073.6,3519.8,1100.4,1234.1,229.8,251.2 59 | EPD_016,1,2,64,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1387866.494,8312.6,8874.9,8445.3,6075.4,2964.8,3088.2,5135.5,4502.3,1475.7,1484.4,4343.8,4664.2,1564.3,1561.8,333.4,368.3 60 | EPD_017,1,2,61,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1309411.248,12299.6,10123.2,6609.5,6310.6,3550.3,3779.8,4977.9,4745.4,1368.1,1412.6,4369.2,4343.3,1556.6,1511.6,509.5,423.2 61 | EPD_018,1,2,72,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1285303.1,13741.2,10835.1,6147.5,5770.6,2962.3,3048.8,5060.1,4707.3,1134.7,1489.5,3670.4,3858.2,1251.6,1259.2,537.5,432.3 62 | EPD_019,1,2,67,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1225976.857,12230.6,11181.8,5576.4,5673.7,3249.7,2927.5,4487.8,4330.3,1424.6,1472.2,3414.3,3671.1,1419.6,1215.4,292.4,305.6 63 | EPD_020,1,2,58,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1331464.088,15084.6,16228.1,8244.9,5876,2977.1,3105.6,3870.6,4588.9,945.1,1608.4,4102.4,3928.8,1411.5,1588.5,286.8,275.7 64 | EPD_021,1,1,73,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1657333.761,26584.4,24601.3,6471,6685.9,3550.7,3309.7,4326.8,5285.3,2260.1,1914.7,3277.5,3716.6,1127.7,1425.6,204.4,434.7 65 | EPD_022,1,2,69,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1180883.526,17586.4,11121,5892.7,5646.8,3596.5,3289.4,5590.1,5237.9,1614.9,1484.8,2961.8,3400.2,1353.8,1356,439.2,372.2 66 | EPD_023,1,2,66,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1262688.299,10273.3,9206.5,6045.2,5542.4,3282.4,3028.9,5096.7,4201.9,1337.2,1386.4,3765.7,4230.3,1379.4,1625.8,471.8,379 67 | EPD_024,1,2,66,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1205206.594,13270.3,13959.6,6024.6,5828.2,3303.4,3206.8,4689,4455.3,1315.8,1596,3470,3391.9,1299.1,1367.7,297,466.9 68 | EPD_025,1,2,75,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1381811.434,19841.1,14969.4,7036.8,6149.8,2555.7,2457.6,4430.4,3812.5,1455.3,1954,3791.6,3998.1,1480.7,1380.1,176.7,375.8 69 | 
EPD_026,1,1,54,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1707199.154,16022.1,16162.9,8734.7,NA,4190.6,4161.8,5212.1,4612,1694.6,1821,4185.4,NA,1834,1447.5,458,557.4 70 | EPD_027,1,1,54,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1556353.411,15652.3,10794.5,9212.5,7219.6,3773.6,3971.1,5377,5387.8,1241.1,1953.5,4340.2,4865,2089.8,2091.5,581.7,512.5 71 | EPD_028,1,1,67,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1603290.502,18457.1,17164.2,6741.2,5955.3,3427.6,3519.2,6199.8,5658.5,1458.9,1997.3,4214.3,4430.3,1828.1,1920.3,373.6,268.6 72 | EPD_029,1,1,52,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1718704.653,6454.8,6796,9171.7,8449.7,3256.4,2700.3,6037.7,5208.8,1306,1756,4710.7,4791.2,1689.2,1543.5,526.7,662.5 73 | EPD_030,1,1,68,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1598778.571,10856.8,11959.1,7126.7,5946.6,3946.4,3742.4,5888,5573.2,1312.1,1733.4,4724.9,4974.8,1868.4,1639.8,283.8,491 74 | EPD_031,1,1,72,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1303148.994,8148.8,6742,5827.6,5649.7,3081.1,3149.6,4730.4,4113.7,1210.7,1367.8,3945.8,4174.7,1813,1777.6,501.3,413.6 75 | EPD_032,1,2,73,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1475484.971,10824.5,10155,6423,6193.9,3429.5,3491.6,5421,4552.5,1530.2,1662,4055.8,4315,1322.3,1452.8,766.2,625.8 76 | EPD_033,1,1,74,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1251387.265,20674.8,17654.4,5176,4720.7,2893.2,3002.5,NA,4484.5,NA,1429.5,3617.9,3545.3,1419.4,1603.7,452.6,442.7 77 | EPD_034,1,1,67,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1404323.247,15810.1,14863.1,6137.3,5937.8,3234,3327.7,4860,4316.7,1286.4,1522.1,3692.6,3839.7,1501.2,1597,494,370.5 78 | EPD_035,1,1,70,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1419258.031,10253.7,7562,6421.7,6172,3180,3208.3,5581.8,4824.2,1399.4,1401.4,4269.9,4732.1,1529.3,1736.6,692.3,401.8 79 | EPD_036,1,1,60,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1315001.009,18323.3,18812.7,5181.9,5158,2817.1,3154.7,4328.7,3831.4,1258.8,1101.8,4003.3,3989.1,1356.6,1470.4,625.1,453.4 80 | 
EPD_037,1,1,65,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1258298.305,22384.8,22368.8,7159.6,6223.2,3641.1,3246.8,3791.6,3637.7,1330.6,1274.5,3202.9,3440.3,1310.8,1381.8,435.5,436.7 81 | EPD_038,1,2,65,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1364160.354,5109.8,3694.9,6059.4,6246.7,3267,2975.5,5448.3,5055.8,1336.8,1178.1,4168.4,4170.3,1509.9,1539,482.1,619.7 82 | EPD_039,1,1,73,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1736462.908,17926.2,14713.9,6772,6744.8,4346.2,4318.3,6318.2,5692.5,1665.7,1664.9,4144.9,4788.1,1790.4,1839.9,766.2,646.7 83 | EPD_040,1,2,68,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1551951.093,18719,14494.3,6228,5755.8,3204.7,3275.8,5300,4919.9,1593.3,1767.7,3440.9,3678.5,1708.1,1728,366.3,313.8 84 | EPD_041,1,2,73,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1368785.299,8922.4,4808.4,7807.4,6104.2,3344.7,2979,NA,5109.9,NA,1500.3,3842,4503.9,1489.2,1727.3,464.2,514 85 | EPD_042,1,2,74,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1353967.499,8580.6,6864.6,NA,5601.5,3072.9,3141,NA,4802.2,NA,1320.5,3572.7,3664.9,1388.4,1536.6,237.7,366.7 86 | EPD_043,1,2,75,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1416784.956,9433.5,7326.7,8458.2,6086.6,3154.5,3158.5,NA,4945.5,NA,1435.7,3968.5,4139.5,1680.8,1400.3,240.4,436.6 87 | EPD_044,1,2,74,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1456538.522,12628.3,12323.8,9302.4,6552.9,4280,4293.4,NA,5158.2,NA,1947.8,4081.9,4111.1,1406.7,1499.1,323,348.2 88 | EPD_045,1,2,78,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1416813.085,13674.3,12396.3,7874,6618.7,3128.9,2923.4,4424.8,3943.4,2076.7,1716,3984.6,4196.3,1369,1767.5,203.4,329.8 89 | EPD_0_032057,0,2,64,TaoWu_1,TaoWu,NA,NA,0,NA,1136988.062,6636.4,6880.7,5665.3,5543.6,2870.8,2891.3,3707.7,3667,1068.1,1266.4,3920.9,4028.2,1264.4,1525.8,369.9,446.3 90 | EPD_0_032058,0,1,73,TaoWu_1,TaoWu,NA,NA,0,NA,1503204.57,18710.7,13936.9,7472.3,6923.1,3404.9,3096.6,4568.4,3740,1472.7,1442.3,NA,3850.8,1249.3,1358.1,328.6,362.1 91 | 
EPD_0_032059,0,1,67,TaoWu_1,TaoWu,NA,NA,0,NA,1597061.121,27406.6,27718.6,7712.7,6643.9,3786.1,3157,5536.2,4968.7,1621,1845.6,3765.2,3927.2,1557.1,1781.9,282.7,396 92 | EPD_0_032060,0,1,70,TaoWu_1,TaoWu,NA,NA,0,NA,1577992.856,20284,11651.9,7383.5,6632.7,4496.7,3715.7,NA,4477.2,NA,1460.1,3695,3962.6,1481.9,1412.3,239.7,430.1 93 | EPD_0_032061,0,2,75,TaoWu_1,TaoWu,NA,NA,0,NA,1486946.65,12162,9041,7805.8,7050.2,3566,3227.5,4069.7,3805,1290.7,1393.4,4357.9,4341.9,1408,1295.9,410.7,405.9 94 | EPD_0_032062,0,1,62,TaoWu_1,TaoWu,NA,NA,0,NA,1555999.832,14868.2,11097,8698.9,8143.1,3718.5,3109.1,4849.2,4138.3,1376.3,1664.8,4243.1,4615.3,1581.2,1589,318.5,423.7 95 | EPD_0_032063,0,1,71,TaoWu_1,TaoWu,NA,NA,0,NA,1564971.686,18008.5,16485.4,8916.4,7354.9,3609.9,3284.4,4506.4,4831.4,1064.3,1205.7,4043.8,4177.7,1531.6,1838.4,253.2,490.4 96 | EPD_0_032064,0,1,75,TaoWu_1,TaoWu,NA,NA,0,NA,1418112.397,6426.1,7797.9,8653.9,6659.1,2626.4,2798.3,NA,4038.6,NA,1527.1,4227.1,4439.1,1489.8,1560.4,342.3,386.6 97 | EPD_0_032065,0,1,64,TaoWu_1,TaoWu,NA,NA,0,NA,1651372.119,16010,8317.3,8178.9,7786.7,3641.9,3434.2,5458.5,4912.3,1845.8,1833.1,4015.6,4653.4,1661.3,1734.7,381.2,396.4 98 | EPD_0_032066,0,1,60,TaoWu_1,TaoWu,NA,NA,0,NA,1416979.177,5745.9,4362.9,7583.1,6629.6,2950.8,3047.8,5029.6,5365.4,1360.8,1510.8,4584.3,4934.9,1643.7,1780.2,479.8,499.4 99 | EPD_0_032067,0,1,59,TaoWu_1,TaoWu,NA,NA,0,NA,1480896.396,10649.9,9880.7,7936.5,7081.8,3824.2,3671.5,5753,5478.5,1405.9,1502.2,4740.1,4966.3,1666.2,1847.2,707.3,720.7 100 | EPD_0_032068,0,2,63,TaoWu_1,TaoWu,NA,NA,0,NA,1315379.197,9034.6,8144.3,7265.6,6229.6,3590.7,3506,4761.1,4506.3,1103.4,1284.8,3983.1,3991.3,1586.5,1470.4,383.9,362 101 | EPD_0_032069,0,1,60,TaoWu_1,TaoWu,NA,NA,0,NA,1631434.917,13706.6,8101.2,8355.8,7501.5,3278.5,2923.5,4524.4,4241.3,1237.5,1263.3,4440.4,4987.6,1690.6,1708.9,425.6,482.8 102 | 
EPD_0_032070,0,2,60,TaoWu_1,TaoWu,NA,NA,0,NA,1290174.933,6752.1,5676.3,7199.5,6203.3,3080.4,3056.3,4451.3,4279.9,1196.1,1371.6,4214.9,4271.7,1375.3,1521.3,464.6,393.2 103 | EPD_0_032071,0,2,59,TaoWu_1,TaoWu,NA,NA,0,NA,1333641.617,6915.1,6851.3,7664.6,6401.1,3126.7,3085.6,NA,4311.5,NA,1209.4,4511.9,4575.3,1262,1402.9,431.2,516.5 104 | EPD_0_032072,0,1,68,TaoWu_1,TaoWu,NA,NA,0,NA,1508014.348,18744.1,22032.2,7040.2,6034.8,3464.1,3222,4928.4,4673.2,1504.9,1624.9,4122.2,4184.3,1749.1,1674.2,452.5,488.3 105 | EPD_0_032073,0,2,63,TaoWu_1,TaoWu,NA,NA,0,NA,1371255.954,10859.1,11444.2,7214.5,6206.4,3321.6,3250.2,4783.3,4806.7,1152.7,1493.1,3825.7,3973.4,1340.3,1502.4,238.4,394.7 106 | EPD_0_032074,0,1,65,TaoWu_1,TaoWu,NA,NA,0,NA,1552521.831,9115.3,7363,8606,6791.7,3213.9,3214.7,5248.1,5801.3,1245.2,1600.9,4708.7,4694.4,1801.5,1724.7,341.6,428.4 107 | EPD_0_032075,0,2,60,TaoWu_1,TaoWu,NA,NA,0,NA,1408744.562,12708.1,13068.3,6869.3,6411.9,3414.3,3019.6,5396.3,4789,1363.2,1529.8,NA,NA,1657,1630,477.7,510.6 108 | EPD_0_032076,0,2,57,TaoWu_1,TaoWu,NA,NA,0,NA,1422592.744,3761.7,4151.4,8390.7,6233.9,2885.4,2712.1,3837.4,4600.9,722.4,1382.4,4075.8,4395.8,1763.7,1731.5,349.1,467.7 109 | EPD_1_032077,1,2,61,TaoWu_1,TaoWu,4,57,1,1,1349827.754,9221.6,5435,6299.4,NA,3148.8,2949,4832.3,4374.9,1459.8,1424.8,3981.5,4340.9,1531,1368.2,481.5,599.3 110 | EPD_1_032078,1,1,71,TaoWu_1,TaoWu,4,67,2,2,1797227.991,17769.8,17059.3,6677,5948.4,2590.3,2754.3,4281.6,4042.8,1289.5,1320.8,3113.4,3637.3,1884.9,1907.7,227.2,435.7 111 | EPD_1_032079,1,1,74,TaoWu_1,TaoWu,5,69,1,1,1449605.026,16118.2,13578.5,6598.2,6288.7,3905.3,3946.5,5414.1,5142.8,1863.9,1691.8,NA,4179.4,1717.1,1561.5,417.1,450.9 112 | EPD_1_032080,1,2,62,TaoWu_1,TaoWu,6,56,3,3,1381155.1,6831.6,4265.1,6233.2,5501.7,2802.1,2751.5,4860.1,4546.9,1327.8,1340.5,3471.7,3638.7,1255.5,1312.2,454,552.1 113 | 
EPD_1_032081,1,1,66,TaoWu_1,TaoWu,15,51,2,2,1498806.466,11477.9,10658.7,6803.9,7040.4,3499,3482.7,4782.5,4498,1539.5,1414.8,4550.9,4493,1881.3,1773.7,489.2,512.1 114 | EPD_1_032082,1,1,66,TaoWu_1,TaoWu,3,63,1,1,1433323.514,15944.9,14704.6,6417.2,6535,3146.8,3249.8,4256.1,4333.5,1546.6,1577.1,3669.7,4057.7,1316.7,1577.3,384.2,407.2 115 | EPD_1_032083,1,1,67,TaoWu_1,TaoWu,6,61,2,2,1431851.149,10325,9605,7411.9,6854.1,4228.9,4311,5904.1,5663.3,1713.3,1955.5,4117.2,4326.8,1781.1,1791.7,528,577 116 | EPD_1_032084,1,1,67,TaoWu_1,TaoWu,8,59,2,2,1634767.157,12837.7,9580,7355.2,7387.1,3431.3,3202.4,5380.4,4582.7,1768,1710.7,4039.1,4379,1480.7,1631,536.5,577.1 117 | EPD_1_032085,1,2,64,TaoWu_1,TaoWu,6,58,2,2.5,1406747.643,8815.6,6792.9,7073,6095.1,3716.1,3809.2,5580.2,5152,1188.7,1504.6,3746.6,3830,1338.2,1400.3,562.9,501.6 118 | EPD_1_032086,1,2,73,TaoWu_1,TaoWu,8,65,2,2,1236369.836,7632.9,7447.1,6150.7,5657,2845.6,2661.9,3808,3700.6,1073.5,1273.2,3580.4,3684,1342,1208.8,330,382.9 119 | EPD_1_032087,1,2,62,TaoWu_1,TaoWu,3,59,2,2.5,1520593.232,16473.3,11142.7,7164.6,6507.2,3588.6,3541.7,5055.7,4421,1422.4,1497.5,3415,3932.5,1585.9,1660.3,440.6,412 120 | EPD_1_032088,1,2,70,TaoWu_1,TaoWu,15,55,2,2,1289342.782,5853.9,7289.6,7300.6,6068.4,3519.5,3604.9,4165.7,4087.6,1398,1429.1,4107.4,4062.1,1461.5,1593,426.3,469.2 121 | EPD_1_032089,1,1,62,TaoWu_1,TaoWu,2,60,1,1,1528251.515,7430.2,8270.1,7728.6,7114.2,3759.1,4096.7,6190.3,6297.6,1355.5,1434.9,3975.5,4118.1,1526.5,1816.1,509.9,494 122 | EPD_1_032090,1,1,64,TaoWu_1,TaoWu,6,58,2,1.5,1455132.113,9820.3,7320.3,7325.6,6236.7,3465.8,3279,4536.7,4578.9,1262.8,1537.4,4392.9,4654.7,1496.2,1624.2,213.9,387.8 123 | EPD_1_032091,1,2,61,TaoWu_1,TaoWu,3,58,2,2,1391234.096,7111.7,10277.4,7441.3,6833.9,3551.2,3529.2,4727.5,4846.4,1443.9,1452.4,3424,3434,1197.3,1079.4,493.9,522.4 124 | 
EPD_1_032092,1,1,63,TaoWu_1,TaoWu,1,62,2,2.5,1568084.039,13046.9,14254.9,7498.8,7942.1,3733.6,3770.6,5980.9,5349.5,1772.3,1635.5,4372.8,4600.6,1528.6,1672.5,414.7,405.4 125 | EPD_1_032093,1,1,64,TaoWu_1,TaoWu,1,63,1,1,1696557.061,19154.3,17458.3,7830.4,7850.1,3740.4,3652.9,5555.1,4994.6,1557.6,1659.3,5104,4721.7,1492,1708.9,405.5,424.5 126 | EPD_1_032094,1,2,60,TaoWu_1,TaoWu,3,57,2,1.5,1474907.825,17642.5,13562.7,8318.5,7790.1,3695.7,3428.8,4683.7,4073.5,1333.2,1505.9,4118.5,4164.9,1665,1489,322.6,434.4 127 | EPD_1_032095,1,2,58,TaoWu_1,TaoWu,2,56,2,2.5,1375128.363,5068.1,4614.4,5613.6,6192.3,2899.5,2901,4796.5,4596.5,1468.7,1329.7,4135.9,4243.2,1465,1860.7,750.9,448.1 -------------------------------------------------------------------------------- /stats/all4sites_data_OHBM.csv: -------------------------------------------------------------------------------- 1 | SubjID,Dx,Sex,Age,site,cohort,DURILL,AO,HY1234,Hoehn_Yahr_stage,ICV,LLatVent,RLatVent,Lthal,Rthal,Lcaud,Rcaud,Lput,Rput,Lpal,Rpal,Lhippo,Rhippo,Lamyg,Ramyg,Laccumb,Raccumb 2 | EPD_0_032014,0,1,53,Neurocon_1,Neurocon,NA,NA,0,NA,1837978.779,9050.4,8720.6,9097.1,7425.4,3488.4,3392.5,NA,4932.2,1188.4,1343.7,4960.9,5108.5,1480.2,1761,532.2,495 3 | EPD_0_032015,0,1,64,Neurocon_1,Neurocon,NA,NA,0,NA,1522398.796,18878.2,16179.6,5662.3,6024.6,3325,3603,4458.5,4230.2,868,1070.8,3053.9,3047.2,1285.8,1305.3,339.5,376.2 4 | EPD_0_032016,0,2,58,Neurocon_1,Neurocon,NA,NA,0,NA,1469688.415,10721.5,8714.2,6967.9,6675.2,3304.6,3528,NA,4004.7,NA,1149.8,4176.2,4340.2,1650.1,1620.7,367.8,328.4 5 | EPD_0_032017,0,2,77,Neurocon_1,Neurocon,NA,NA,0,NA,1420165.924,12266.1,10716.8,5872.1,6192.2,2737,2786.8,3965.4,3518.8,1522.5,1348.9,3272.2,3280.7,1308.1,1201.3,326.2,362.9 6 | EPD_0_032018,0,2,55,Neurocon_1,Neurocon,NA,NA,0,NA,1521936.585,17258.3,18006.5,NA,NA,3224.5,3260.4,4733.1,4259,1239.2,1324.7,3358.6,3576.5,1646.4,1519.1,485.2,364.2 7 | 
EPD_0_032019,0,2,56,Neurocon_1,Neurocon,NA,NA,0,NA,1478313.968,10741.7,11631.6,8387.5,7785.6,3283.5,3314.7,4610.2,4640.8,1348.3,1371,3982.1,4113.8,1334,1402.1,303.8,319.5 8 | EPD_0_032020,0,2,76,Neurocon_1,Neurocon,NA,NA,0,NA,1267057.355,5289.9,5239.7,6885,6037.6,2762.2,2895.7,NA,NA,NA,NA,3726.5,3733,1338.1,1415.6,290.7,300.9 9 | EPD_0_032021,0,2,56,Neurocon_1,Neurocon,NA,NA,0,NA,1558683.691,5929,7937,8013.5,7000,3249.8,3226,NA,NA,NA,NA,4536.9,4556,1254.8,1335.5,544.7,400 10 | EPD_0_032022,0,2,46,Neurocon_1,Neurocon,NA,NA,0,NA,1408076.473,4807.1,3501,7139.8,7201.6,2705.4,2728.7,4423.9,4383.5,1253.6,1422.6,3958.8,4198.5,1574.3,1717.4,482.9,395.9 11 | EPD_0_032023,0,2,82,Neurocon_1,Neurocon,NA,NA,0,NA,1381769.489,13527.6,15282.4,6051.9,5419.6,2663.7,2615.1,3768,3338.4,1328.4,1134.8,3661.9,3526.8,1092.8,1241.1,343.5,332.5 12 | EPD_0_032024,0,2,71,Neurocon_1,Neurocon,NA,NA,0,NA,1584213.063,27543,24076.2,6671.4,6379.1,3488.1,3200.7,4599.9,4111.4,1390.9,1276.4,3913.6,3737.2,1450.2,1536.4,372.3,316.7 13 | EPD_0_032025,0,1,74,Neurocon_1,Neurocon,NA,NA,0,NA,1482952.211,12699.6,14751.7,NA,6656.3,3522.7,3299.4,4036.8,4414.7,1038.7,1297.7,3610.7,3491.8,1317.4,1239.1,350.6,316.6 14 | EPD_0_032026,0,2,78,Neurocon_1,Neurocon,NA,NA,0,NA,1528881.186,31053.3,27368,6358,5929.9,3180.1,3213.4,3981.9,3736.5,1442.6,1356.7,NA,3365.8,1201.8,1340.3,330.9,270.7 15 | EPD_0_032028,0,2,76,Neurocon_1,Neurocon,NA,NA,0,NA,1472691.333,19409.4,22622.7,6669.2,6007.8,3341.3,3413.5,4160.8,4109.4,1350.3,1477.9,3377.3,3304.6,1107.9,1310.3,374,237.8 16 | EPD_0_032029,0,2,79,Neurocon_1,Neurocon,NA,NA,0,NA,1594083.416,26870.8,33685.5,6226.5,5980.6,3292.9,3342.1,4090.9,3458.8,1309.1,1235.4,3303.7,2939.9,1247,1176.9,266,272.4 17 | EPD_1_032030,1,2,76,Neurocon_1,Neurocon,NA,NA,2,2,1208589.99,9053.4,10473.9,5720.8,5044.7,2607,2620.2,2691,3028.9,1375.3,1368.7,2732.7,2913.7,930.4,1111.1,277.5,310.2 18 | 
EPD_1_032031,1,2,75,Neurocon_1,Neurocon,NA,NA,2,2,1526462.563,13828.5,14292.4,7836.2,NA,2663.8,3031.2,3878.1,4294.2,1573.7,1235.7,4107.6,3968.3,1232.6,1237.1,330,385.6 19 | EPD_1_032032,1,1,74,Neurocon_1,Neurocon,NA,NA,2,2,1737015.518,27438,25295.2,6632,6802.7,3410.2,3478.5,4119,3861,1366.1,1179.6,3478.3,3668.3,1212,1417.2,317.2,217.4 20 | EPD_1_032033,1,1,67,Neurocon_1,Neurocon,NA,NA,2,2,1519402.825,11953.2,9648.7,7248.4,6854.6,3135.3,3108.4,5118,4483.2,1603.1,1605.5,3684,4029.9,1400.5,1672.9,431.9,272.5 21 | EPD_1_032034,1,1,79,Neurocon_1,Neurocon,NA,NA,2,2,1552273.934,19634.4,16131.1,6400.1,6164.1,2894.9,2926.7,3783.6,3671.7,1322.7,1262,NA,3462.5,1238,1262,241.9,218.7 22 | EPD_1_032035,1,1,60,Neurocon_1,Neurocon,NA,NA,2,2,1661380.915,29399.8,20224.4,6978.8,6931.8,3440.8,3492.7,3924.1,3643.2,1367.3,1405.1,3342,3815.3,1327.8,1396.6,341.2,334.5 23 | EPD_1_032036,1,2,67,Neurocon_1,Neurocon,NA,NA,2,2,1258755.001,7611.5,6916.9,6040.6,6185.9,2567.5,2726.6,3720.7,3479.2,1314,1248.4,3521.9,3542.5,1213.5,1195.7,309.8,383.5 24 | EPD_1_032037,1,1,73,Neurocon_1,Neurocon,NA,NA,2,2,2170191.6,23813.5,22081.9,8167.8,8385.7,5008.1,4830.2,5563,5565.8,1755.3,1685.3,4646.4,4872.6,1803,2017.4,528.1,522.1 25 | EPD_1_032038,1,2,68,Neurocon_1,Neurocon,NA,NA,2,2,1444163.031,13530.3,11785.3,7791.6,6856.9,2481.5,2627,3370.4,3235,1244.1,1074.7,3700.7,3937.4,1260.6,1270.9,336.8,264.4 26 | EPD_1_032039,1,1,60,Neurocon_1,Neurocon,NA,NA,1,1,1780970.392,18495.9,11676.6,8033.6,7495,3355.9,3412.3,4572.6,4156.5,1727,1646.3,4306.9,4522.8,1409.6,1514.4,429,352 27 | EPD_1_032040,1,1,66,Neurocon_1,Neurocon,NA,NA,2,2,1938646.424,19429.6,23082.5,7930.6,8388,4218.1,4919.9,5671.6,5809,2001.8,1542.9,4148.6,4007.4,1699.6,1969.4,514,351.2 28 | EPD_1_032041,1,1,54,Neurocon_1,Neurocon,NA,NA,2,2,1514653.879,9995.2,7741.1,6491.9,7105.5,3167.2,3336.6,3422.3,3864.1,928,1094.3,4411,4562.2,1513.4,1537.9,376.1,309.1 29 | 
EPD_1_032042,1,2,74,Neurocon_1,Neurocon,NA,NA,2,2,1564840.712,22794.4,20891.1,5901.3,5790.7,3903.1,3729.1,4299.6,3894.4,1475.4,1399.8,3216.7,3402.4,1146.6,1154.5,321.3,248.8 30 | EPD_1_032043,1,2,64,Neurocon_1,Neurocon,NA,NA,2,2,1312780.31,9055.7,9592.9,6517.5,6230.8,4082.2,3760.8,4875.8,4706.7,1561.4,1433.7,3622.1,3638,1061.1,1079.6,309.1,397 31 | EPD_1_032044,1,1,84,Neurocon_1,Neurocon,NA,NA,2,2,1357163.732,19696.1,19655.5,5458.3,4972.4,3005.1,2943.1,3483.2,3343.4,1177.5,1020.6,2742.7,2873,966.5,1114.5,277.1,223.4 32 | EPD_1_032045,1,1,82,Neurocon_1,Neurocon,NA,NA,2,2,1578076.97,22523,22047.4,6716.8,6453.8,5036.3,4804.1,5159,4735.3,1447.3,1277.5,3107.6,3440,944.9,1174.9,242.1,208 33 | EPD_1_032046,1,1,61,Neurocon_1,Neurocon,NA,NA,2,2,1609068.498,11975.2,11537.9,7497.3,7042.8,3132.9,3309.1,3990,4020.3,1392.9,1355.6,4126.1,4432,1563.2,1368.4,504.6,287.7 34 | EPD_1_032047,1,1,45,Neurocon_1,Neurocon,NA,NA,1,1,1555404.625,2488.9,2419.6,7381.4,7142.4,3418.3,3328.1,4524.4,4491.1,1299.8,1106.9,4742.3,4618,1800.9,1812,562,486.8 35 | EPD_1_032048,1,1,54,Neurocon_1,Neurocon,NA,NA,2,1.5,1688699.98,9851.7,7566.8,7783.7,7689.2,3195.8,3058.3,5046,2957.2,1243.3,1355.5,4364.2,4475,1604.8,1519.1,456.5,406.3 36 | EPD_1_032049,1,2,74,Neurocon_1,Neurocon,NA,NA,2,2,1530455.055,14272.4,13457.9,6574.7,6254.1,2721.8,2875.1,4063,3621.4,1361.8,1272.1,3356.9,3242.9,1171.8,1072.4,316.1,251 37 | EPD_1_032050,1,1,86,Neurocon_1,Neurocon,NA,NA,2,2,1681670.548,21261.8,16858.2,7403.1,6751.4,4565.8,4277.6,4903.4,5058.6,1550.5,1480.5,3140.4,2799.9,1076.3,1049.3,267.7,334.4 38 | EPD_1_032051,1,1,84,Neurocon_1,Neurocon,NA,NA,2,2.5,1676259.169,21303.6,15465.2,6953.4,7310.8,4006.7,3721.4,4173.3,5205,1641.6,1581.3,3440.9,3021.2,1136,1432.4,396.2,258.1 39 | EPD_1_032052,1,1,63,Neurocon_1,Neurocon,NA,NA,2,2,1651600.267,10693.2,10294.7,8445.4,NA,3184.6,3323.4,4245.3,4182.7,1223.3,1386.9,4377,4247.8,1486.4,1456.6,419.4,380.7 40 | 
EPD_1_032053,1,1,51,Neurocon_1,Neurocon,NA,NA,2,1.5,1735240.259,7631.9,6863.1,9172.5,8601.7,4017.6,3835.6,5165.4,4608.1,1355.5,1488.2,5016.2,5345.8,1710.9,1750.2,369.5,379.2 41 | EPD_1_032054,1,2,66,Neurocon_1,Neurocon,NA,NA,2,2.5,1653376.364,5937.2,10426.3,7542.2,7348.4,3133.7,3330.4,4596.2,4478.9,1477,1541.5,4285.7,4336.4,1436.8,1391.1,405.2,357.2 42 | EPD_1_032055,1,2,74,Neurocon_1,Neurocon,NA,NA,2,2,1585578.151,8828.2,5870.2,7312.7,7343.6,2710.9,2585.3,3535.8,2746.2,1359.8,628.8,4010.8,4096.2,1466.6,1444,374,312.6 43 | EPD_1_032056,1,2,74,Neurocon_1,Neurocon,NA,NA,2,2,1754656.017,22118.5,17878,8237.1,6696.5,4081.3,3769.3,5736.2,5502.2,1556.6,1598.9,3732.2,4295,1209.1,1463,343,482 44 | EPD_001,0,2,62,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1620724.937,24457.8,19152.6,6422.1,6379.7,3137.8,3122.5,5173.2,4525.7,1923.9,2322.5,4030.5,4300.6,1819.9,1736.8,327.6,349.2 45 | EPD_002,0,2,58,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1298359.679,16945.3,16556.1,8367.4,6140.2,3271.6,3373,3890.1,3708.2,1322.9,1407.5,3341.7,3729.1,1462,1439.4,384,402.7 46 | EPD_003,0,1,62,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1605639.56,21546.5,20705.9,NA,6805.9,3258.2,3083,4465.6,4541.9,950.6,1268.7,3643.5,4202.1,816.9,1564.9,460.4,416.7 47 | EPD_004,0,2,58,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1505840.774,11874.7,10032.3,7917,5944.6,3396.3,3684.9,NA,5224.1,NA,1997.9,4161.1,4394,1610.2,1623.2,526.1,442.3 48 | EPD_005,0,1,65,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1794421.948,29856.5,23538.3,NA,NA,3406.8,3406,6376.1,4888.2,1754.7,2183.7,3941.8,4143.3,1818.9,1703,471.9,473.1 49 | EPD_006,0,2,66,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1749751.576,33915.9,24083.1,7742.1,7028.4,3105.2,3027.2,4938.8,4759.7,1125.2,1968.1,3285.2,3834.1,1498.2,1898.8,407.2,432.3 50 | EPD_007,0,2,68,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1621011.582,12962.3,8831.1,7331.3,7340.7,2997.8,2951.4,6307.8,5318.8,1887.1,1772.8,3722,4382.4,1728.9,1726.1,737.4,612.9 51 | 
EPD_008,0,2,60,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1480815.747,13592.3,10593.1,8309.2,7483.6,2871.1,2914.1,3752.2,3920.3,882,1512.6,3818.6,4368.7,1383.6,1537.7,429.1,339.6 52 | EPD_009,0,1,62,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1465221.972,23462.6,21137.3,5328.2,5451.8,3057.1,2859.5,4463.7,4133.9,1554.8,1487.6,3611.7,3769.9,1852.3,1784.3,190.8,272.7 53 | EPD_010,0,1,71,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1310621.335,8695.7,9207.3,6395.6,5852.4,3088.9,3197,4197.6,4269.3,1135.6,1536.8,3047.7,3112.9,1526.2,1776.8,443.5,410.7 54 | EPD_011,0,2,61,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1351287.794,22773.7,16856.6,NA,NA,2671.4,2848.1,4690.3,3932.9,1665.6,1846.6,3066.9,3351.1,1364.5,1222.8,325.7,381.9 55 | EPD_012,0,2,58,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1266821.229,9667,9035.3,6327.1,5758.1,3379,3388.7,4894.5,4532.4,1387.6,1257.3,3278.4,3368.9,1487.5,1572.2,314.3,406.7 56 | EPD_013,0,1,56,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1268008.249,12745.9,12368.6,7556.3,6219.2,3755.8,3492.4,4643.1,4689.9,1850.2,1773.6,3517.9,3486.7,1437.8,1157.6,175.8,323.8 57 | EPD_014,0,1,72,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1217391.629,17928.9,15235.9,6458.3,5873.5,2891.1,2857.5,4202,3913.6,1010.4,1262.5,3813.5,4001.8,1656.3,1473.6,364.9,411.4 58 | EPD_015,0,1,71,OpenNeuro_1,OpenNeuro,NA,NA,0,NA,1449977.253,40508.8,34076.6,7861.1,6457,3415,3667.7,4232.5,3747.9,999,1458.9,3073.6,3519.8,1100.4,1234.1,229.8,251.2 59 | EPD_016,1,2,64,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1387866.494,8312.6,8874.9,8445.3,6075.4,2964.8,3088.2,5135.5,4502.3,1475.7,1484.4,4343.8,4664.2,1564.3,1561.8,333.4,368.3 60 | EPD_017,1,2,61,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1309411.248,12299.6,10123.2,6609.5,6310.6,3550.3,3779.8,4977.9,4745.4,1368.1,1412.6,4369.2,4343.3,1556.6,1511.6,509.5,423.2 61 | EPD_018,1,2,72,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1285303.1,13741.2,10835.1,6147.5,5770.6,2962.3,3048.8,5060.1,4707.3,1134.7,1489.5,3670.4,3858.2,1251.6,1259.2,537.5,432.3 62 | 
EPD_019,1,2,67,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1225976.857,12230.6,11181.8,5576.4,5673.7,3249.7,2927.5,4487.8,4330.3,1424.6,1472.2,3414.3,3671.1,1419.6,1215.4,292.4,305.6 63 | EPD_020,1,2,58,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1331464.088,15084.6,16228.1,8244.9,5876,2977.1,3105.6,3870.6,4588.9,945.1,1608.4,4102.4,3928.8,1411.5,1588.5,286.8,275.7 64 | EPD_021,1,1,73,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1657333.761,26584.4,24601.3,6471,6685.9,3550.7,3309.7,4326.8,5285.3,2260.1,1914.7,3277.5,3716.6,1127.7,1425.6,204.4,434.7 65 | EPD_022,1,2,69,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1180883.526,17586.4,11121,5892.7,5646.8,3596.5,3289.4,5590.1,5237.9,1614.9,1484.8,2961.8,3400.2,1353.8,1356,439.2,372.2 66 | EPD_023,1,2,66,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1262688.299,10273.3,9206.5,6045.2,5542.4,3282.4,3028.9,5096.7,4201.9,1337.2,1386.4,3765.7,4230.3,1379.4,1625.8,471.8,379 67 | EPD_024,1,2,66,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1205206.594,13270.3,13959.6,6024.6,5828.2,3303.4,3206.8,4689,4455.3,1315.8,1596,3470,3391.9,1299.1,1367.7,297,466.9 68 | EPD_025,1,2,75,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1381811.434,19841.1,14969.4,7036.8,6149.8,2555.7,2457.6,4430.4,3812.5,1455.3,1954,3791.6,3998.1,1480.7,1380.1,176.7,375.8 69 | EPD_026,1,1,54,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1707199.154,16022.1,16162.9,8734.7,NA,4190.6,4161.8,5212.1,4612,1694.6,1821,4185.4,NA,1834,1447.5,458,557.4 70 | EPD_027,1,1,54,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1556353.411,15652.3,10794.5,9212.5,7219.6,3773.6,3971.1,5377,5387.8,1241.1,1953.5,4340.2,4865,2089.8,2091.5,581.7,512.5 71 | EPD_028,1,1,67,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1603290.502,18457.1,17164.2,6741.2,5955.3,3427.6,3519.2,6199.8,5658.5,1458.9,1997.3,4214.3,4430.3,1828.1,1920.3,373.6,268.6 72 | EPD_029,1,1,52,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1718704.653,6454.8,6796,9171.7,8449.7,3256.4,2700.3,6037.7,5208.8,1306,1756,4710.7,4791.2,1689.2,1543.5,526.7,662.5 73 | 
EPD_030,1,1,68,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1598778.571,10856.8,11959.1,7126.7,5946.6,3946.4,3742.4,5888,5573.2,1312.1,1733.4,4724.9,4974.8,1868.4,1639.8,283.8,491 74 | EPD_031,1,1,72,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1303148.994,8148.8,6742,5827.6,5649.7,3081.1,3149.6,4730.4,4113.7,1210.7,1367.8,3945.8,4174.7,1813,1777.6,501.3,413.6 75 | EPD_032,1,2,73,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1475484.971,10824.5,10155,6423,6193.9,3429.5,3491.6,5421,4552.5,1530.2,1662,4055.8,4315,1322.3,1452.8,766.2,625.8 76 | EPD_033,1,1,74,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1251387.265,20674.8,17654.4,5176,4720.7,2893.2,3002.5,NA,4484.5,NA,1429.5,3617.9,3545.3,1419.4,1603.7,452.6,442.7 77 | EPD_034,1,1,67,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1404323.247,15810.1,14863.1,6137.3,5937.8,3234,3327.7,4860,4316.7,1286.4,1522.1,3692.6,3839.7,1501.2,1597,494,370.5 78 | EPD_035,1,1,70,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1419258.031,10253.7,7562,6421.7,6172,3180,3208.3,5581.8,4824.2,1399.4,1401.4,4269.9,4732.1,1529.3,1736.6,692.3,401.8 79 | EPD_036,1,1,60,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1315001.009,18323.3,18812.7,5181.9,5158,2817.1,3154.7,4328.7,3831.4,1258.8,1101.8,4003.3,3989.1,1356.6,1470.4,625.1,453.4 80 | EPD_037,1,1,65,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1258298.305,22384.8,22368.8,7159.6,6223.2,3641.1,3246.8,3791.6,3637.7,1330.6,1274.5,3202.9,3440.3,1310.8,1381.8,435.5,436.7 81 | EPD_038,1,2,65,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1364160.354,5109.8,3694.9,6059.4,6246.7,3267,2975.5,5448.3,5055.8,1336.8,1178.1,4168.4,4170.3,1509.9,1539,482.1,619.7 82 | EPD_039,1,1,73,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1736462.908,17926.2,14713.9,6772,6744.8,4346.2,4318.3,6318.2,5692.5,1665.7,1664.9,4144.9,4788.1,1790.4,1839.9,766.2,646.7 83 | EPD_040,1,2,68,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1551951.093,18719,14494.3,6228,5755.8,3204.7,3275.8,5300,4919.9,1593.3,1767.7,3440.9,3678.5,1708.1,1728,366.3,313.8 84 | 
EPD_041,1,2,73,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1368785.299,8922.4,4808.4,7807.4,6104.2,3344.7,2979,NA,5109.9,NA,1500.3,3842,4503.9,1489.2,1727.3,464.2,514 85 | EPD_042,1,2,74,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1353967.499,8580.6,6864.6,NA,5601.5,3072.9,3141,NA,4802.2,NA,1320.5,3572.7,3664.9,1388.4,1536.6,237.7,366.7 86 | EPD_043,1,2,75,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1416784.956,9433.5,7326.7,8458.2,6086.6,3154.5,3158.5,NA,4945.5,NA,1435.7,3968.5,4139.5,1680.8,1400.3,240.4,436.6 87 | EPD_044,1,2,74,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1456538.522,12628.3,12323.8,9302.4,6552.9,4280,4293.4,NA,5158.2,NA,1947.8,4081.9,4111.1,1406.7,1499.1,323,348.2 88 | EPD_045,1,2,78,OpenNeuro_1,OpenNeuro,NA,NA,NA,NA,1416813.085,13674.3,12396.3,7874,6618.7,3128.9,2923.4,4424.8,3943.4,2076.7,1716,3984.6,4196.3,1369,1767.5,203.4,329.8 89 | EPD_0_032057,0,2,64,TaoWu_1,TaoWu,NA,NA,0,NA,1136988.062,6636.4,6880.7,5665.3,5543.6,2870.8,2891.3,3707.7,3667,1068.1,1266.4,3920.9,4028.2,1264.4,1525.8,369.9,446.3 90 | EPD_0_032058,0,1,73,TaoWu_1,TaoWu,NA,NA,0,NA,1503204.57,18710.7,13936.9,7472.3,6923.1,3404.9,3096.6,4568.4,3740,1472.7,1442.3,NA,3850.8,1249.3,1358.1,328.6,362.1 91 | EPD_0_032059,0,1,67,TaoWu_1,TaoWu,NA,NA,0,NA,1597061.121,27406.6,27718.6,7712.7,6643.9,3786.1,3157,5536.2,4968.7,1621,1845.6,3765.2,3927.2,1557.1,1781.9,282.7,396 92 | EPD_0_032060,0,1,70,TaoWu_1,TaoWu,NA,NA,0,NA,1577992.856,20284,11651.9,7383.5,6632.7,4496.7,3715.7,NA,4477.2,NA,1460.1,3695,3962.6,1481.9,1412.3,239.7,430.1 93 | EPD_0_032061,0,2,75,TaoWu_1,TaoWu,NA,NA,0,NA,1486946.65,12162,9041,7805.8,7050.2,3566,3227.5,4069.7,3805,1290.7,1393.4,4357.9,4341.9,1408,1295.9,410.7,405.9 94 | EPD_0_032062,0,1,62,TaoWu_1,TaoWu,NA,NA,0,NA,1555999.832,14868.2,11097,8698.9,8143.1,3718.5,3109.1,4849.2,4138.3,1376.3,1664.8,4243.1,4615.3,1581.2,1589,318.5,423.7 95 | EPD_0_032063,0,1,71,TaoWu_1,TaoWu,NA,NA,0,NA,1564971.686,18008.5,16485.4,8916.4,7354.9,3609.9,3284.4,4506.4,4831.4,1064.3,1205.7,4043.8,4177.7,1531.6,1838.4,253.2,490.4 
96 | EPD_0_032064,0,1,75,TaoWu_1,TaoWu,NA,NA,0,NA,1418112.397,6426.1,7797.9,8653.9,6659.1,2626.4,2798.3,NA,4038.6,NA,1527.1,4227.1,4439.1,1489.8,1560.4,342.3,386.6 97 | EPD_0_032065,0,1,64,TaoWu_1,TaoWu,NA,NA,0,NA,1651372.119,16010,8317.3,8178.9,7786.7,3641.9,3434.2,5458.5,4912.3,1845.8,1833.1,4015.6,4653.4,1661.3,1734.7,381.2,396.4 98 | EPD_0_032066,0,1,60,TaoWu_1,TaoWu,NA,NA,0,NA,1416979.177,5745.9,4362.9,7583.1,6629.6,2950.8,3047.8,5029.6,5365.4,1360.8,1510.8,4584.3,4934.9,1643.7,1780.2,479.8,499.4 99 | EPD_0_032067,0,1,59,TaoWu_1,TaoWu,NA,NA,0,NA,1480896.396,10649.9,9880.7,7936.5,7081.8,3824.2,3671.5,5753,5478.5,1405.9,1502.2,4740.1,4966.3,1666.2,1847.2,707.3,720.7 100 | EPD_0_032068,0,2,63,TaoWu_1,TaoWu,NA,NA,0,NA,1315379.197,9034.6,8144.3,7265.6,6229.6,3590.7,3506,4761.1,4506.3,1103.4,1284.8,3983.1,3991.3,1586.5,1470.4,383.9,362 101 | EPD_0_032069,0,1,60,TaoWu_1,TaoWu,NA,NA,0,NA,1631434.917,13706.6,8101.2,8355.8,7501.5,3278.5,2923.5,4524.4,4241.3,1237.5,1263.3,4440.4,4987.6,1690.6,1708.9,425.6,482.8 102 | EPD_0_032070,0,2,60,TaoWu_1,TaoWu,NA,NA,0,NA,1290174.933,6752.1,5676.3,7199.5,6203.3,3080.4,3056.3,4451.3,4279.9,1196.1,1371.6,4214.9,4271.7,1375.3,1521.3,464.6,393.2 103 | EPD_0_032071,0,2,59,TaoWu_1,TaoWu,NA,NA,0,NA,1333641.617,6915.1,6851.3,7664.6,6401.1,3126.7,3085.6,NA,4311.5,NA,1209.4,4511.9,4575.3,1262,1402.9,431.2,516.5 104 | EPD_0_032072,0,1,68,TaoWu_1,TaoWu,NA,NA,0,NA,1508014.348,18744.1,22032.2,7040.2,6034.8,3464.1,3222,4928.4,4673.2,1504.9,1624.9,4122.2,4184.3,1749.1,1674.2,452.5,488.3 105 | EPD_0_032073,0,2,63,TaoWu_1,TaoWu,NA,NA,0,NA,1371255.954,10859.1,11444.2,7214.5,6206.4,3321.6,3250.2,4783.3,4806.7,1152.7,1493.1,3825.7,3973.4,1340.3,1502.4,238.4,394.7 106 | EPD_0_032074,0,1,65,TaoWu_1,TaoWu,NA,NA,0,NA,1552521.831,9115.3,7363,8606,6791.7,3213.9,3214.7,5248.1,5801.3,1245.2,1600.9,4708.7,4694.4,1801.5,1724.7,341.6,428.4 107 | 
EPD_0_032075,0,2,60,TaoWu_1,TaoWu,NA,NA,0,NA,1408744.562,12708.1,13068.3,6869.3,6411.9,3414.3,3019.6,5396.3,4789,1363.2,1529.8,NA,NA,1657,1630,477.7,510.6 108 | EPD_0_032076,0,2,57,TaoWu_1,TaoWu,NA,NA,0,NA,1422592.744,3761.7,4151.4,8390.7,6233.9,2885.4,2712.1,3837.4,4600.9,722.4,1382.4,4075.8,4395.8,1763.7,1731.5,349.1,467.7 109 | EPD_1_032077,1,2,61,TaoWu_1,TaoWu,4,57,1,1,1349827.754,9221.6,5435,6299.4,NA,3148.8,2949,4832.3,4374.9,1459.8,1424.8,3981.5,4340.9,1531,1368.2,481.5,599.3 110 | EPD_1_032078,1,1,71,TaoWu_1,TaoWu,4,67,2,2,1797227.991,17769.8,17059.3,6677,5948.4,2590.3,2754.3,4281.6,4042.8,1289.5,1320.8,3113.4,3637.3,1884.9,1907.7,227.2,435.7 111 | EPD_1_032079,1,1,74,TaoWu_1,TaoWu,5,69,1,1,1449605.026,16118.2,13578.5,6598.2,6288.7,3905.3,3946.5,5414.1,5142.8,1863.9,1691.8,NA,4179.4,1717.1,1561.5,417.1,450.9 112 | EPD_1_032080,1,2,62,TaoWu_1,TaoWu,6,56,3,3,1381155.1,6831.6,4265.1,6233.2,5501.7,2802.1,2751.5,4860.1,4546.9,1327.8,1340.5,3471.7,3638.7,1255.5,1312.2,454,552.1 113 | EPD_1_032081,1,1,66,TaoWu_1,TaoWu,15,51,2,2,1498806.466,11477.9,10658.7,6803.9,7040.4,3499,3482.7,4782.5,4498,1539.5,1414.8,4550.9,4493,1881.3,1773.7,489.2,512.1 114 | EPD_1_032082,1,1,66,TaoWu_1,TaoWu,3,63,1,1,1433323.514,15944.9,14704.6,6417.2,6535,3146.8,3249.8,4256.1,4333.5,1546.6,1577.1,3669.7,4057.7,1316.7,1577.3,384.2,407.2 115 | EPD_1_032083,1,1,67,TaoWu_1,TaoWu,6,61,2,2,1431851.149,10325,9605,7411.9,6854.1,4228.9,4311,5904.1,5663.3,1713.3,1955.5,4117.2,4326.8,1781.1,1791.7,528,577 116 | EPD_1_032084,1,1,67,TaoWu_1,TaoWu,8,59,2,2,1634767.157,12837.7,9580,7355.2,7387.1,3431.3,3202.4,5380.4,4582.7,1768,1710.7,4039.1,4379,1480.7,1631,536.5,577.1 117 | EPD_1_032085,1,2,64,TaoWu_1,TaoWu,6,58,2,2.5,1406747.643,8815.6,6792.9,7073,6095.1,3716.1,3809.2,5580.2,5152,1188.7,1504.6,3746.6,3830,1338.2,1400.3,562.9,501.6 118 | EPD_1_032086,1,2,73,TaoWu_1,TaoWu,8,65,2,2,1236369.836,7632.9,7447.1,6150.7,5657,2845.6,2661.9,3808,3700.6,1073.5,1273.2,3580.4,3684,1342,1208.8,330,382.9 119 | 
EPD_1_032087,1,2,62,TaoWu_1,TaoWu,3,59,2,2.5,1520593.232,16473.3,11142.7,7164.6,6507.2,3588.6,3541.7,5055.7,4421,1422.4,1497.5,3415,3932.5,1585.9,1660.3,440.6,412 120 | EPD_1_032088,1,2,70,TaoWu_1,TaoWu,15,55,2,2,1289342.782,5853.9,7289.6,7300.6,6068.4,3519.5,3604.9,4165.7,4087.6,1398,1429.1,4107.4,4062.1,1461.5,1593,426.3,469.2 121 | EPD_1_032089,1,1,62,TaoWu_1,TaoWu,2,60,1,1,1528251.515,7430.2,8270.1,7728.6,7114.2,3759.1,4096.7,6190.3,6297.6,1355.5,1434.9,3975.5,4118.1,1526.5,1816.1,509.9,494 122 | EPD_1_032090,1,1,64,TaoWu_1,TaoWu,6,58,2,1.5,1455132.113,9820.3,7320.3,7325.6,6236.7,3465.8,3279,4536.7,4578.9,1262.8,1537.4,4392.9,4654.7,1496.2,1624.2,213.9,387.8 123 | EPD_1_032091,1,2,61,TaoWu_1,TaoWu,3,58,2,2,1391234.096,7111.7,10277.4,7441.3,6833.9,3551.2,3529.2,4727.5,4846.4,1443.9,1452.4,3424,3434,1197.3,1079.4,493.9,522.4 124 | EPD_1_032092,1,1,63,TaoWu_1,TaoWu,1,62,2,2.5,1568084.039,13046.9,14254.9,7498.8,7942.1,3733.6,3770.6,5980.9,5349.5,1772.3,1635.5,4372.8,4600.6,1528.6,1672.5,414.7,405.4 125 | EPD_1_032093,1,1,64,TaoWu_1,TaoWu,1,63,1,1,1696557.061,19154.3,17458.3,7830.4,7850.1,3740.4,3652.9,5555.1,4994.6,1557.6,1659.3,5104,4721.7,1492,1708.9,405.5,424.5 126 | EPD_1_032094,1,2,60,TaoWu_1,TaoWu,3,57,2,1.5,1474907.825,17642.5,13562.7,8318.5,7790.1,3695.7,3428.8,4683.7,4073.5,1333.2,1505.9,4118.5,4164.9,1665,1489,322.6,434.4 127 | EPD_1_032095,1,2,58,TaoWu_1,TaoWu,2,56,2,2.5,1375128.363,5068.1,4614.4,5613.6,6192.3,2899.5,2901,4796.5,4596.5,1468.7,1329.7,4135.9,4243.2,1465,1860.7,750.9,448.1 128 | sub-RC4101,0,1,66.28288,ANT_1,ANT,NA,NA,NA,NA,1572034,NA,NA,9252,9069,3977,4063,5262,5312,1999,1962,3547,3896,1593,1616,537,331 129 | sub-RC4103,0,2,57.45623,ANT_1,ANT,NA,NA,NA,NA,1498100,NA,NA,8734,8147,3038,3202,4122,4317,1609,1903,4282,3549,1237,1073,508,449 130 | sub-RC4106,0,1,70.1174,ANT_1,ANT,NA,NA,NA,NA,1651874,NA,NA,9448,9306,2721,2760,3963,4525,2101,1954,3111,3584,1735,1691,342,278 131 | 
sub-RC4107,0,2,44.87712,ANT_1,ANT,NA,NA,NA,NA,1341516,NA,NA,7478,7283,2885,2862,3826,3972,1577,1643,3462,3079,1450,1037,493,414 132 | sub-RC4109,0,1,68.16394,ANT_1,ANT,NA,NA,NA,NA,1555524,NA,NA,7928,7955,3643,3877,4540,4732,1783,2016,3573,4296,1486,1366,273,169 133 | sub-RC4110,0,2,54.44408,ANT_1,ANT,NA,NA,NA,NA,1589850,NA,NA,9306,9090,3434,3674,5492,5131,2170,2370,4530,4669,1715,1664,598,424 134 | sub-RC4111,0,2,49.37169,ANT_1,ANT,NA,NA,NA,NA,1400518,NA,NA,8039,7698,3055,3201,4225,3740,1590,1646,3846,3946,1205,1468,623,381 135 | sub-RC4112,0,2,64.32507,ANT_1,ANT,NA,NA,NA,NA,1335156,NA,NA,7189,6982,2684,2872,3759,3652,1489,1488,3484,3607,1408,1438,347,245 136 | sub-RC4113,0,1,60.15017,ANT_1,ANT,NA,NA,NA,NA,1344487,NA,NA,6317,5801,2643,2860,4308,3948,1507,1524,2469,1928,881,989,409,263 137 | sub-RC4114,0,1,60.70497,ANT_1,ANT,NA,NA,NA,NA,1676952,NA,NA,8720,8560,3062,3309,5717,5290,2075,1856,4283,4498,1492,1570,662,346 138 | sub-RC4115,0,1,70.02037,ANT_1,ANT,NA,NA,NA,NA,1539791,NA,NA,8452,7677,3257,3311,4885,5115,1773,1700,3507,3301,1725,1468,509,459 139 | sub-RC4116,0,1,64.57782,ANT_1,ANT,NA,NA,NA,NA,1363706,NA,NA,8195,7741,3825,3620,4639,5177,1732,1589,3293,3643,1064,1355,297,250 140 | sub-RC4117,0,2,76.10251,ANT_1,ANT,NA,NA,NA,NA,1351874,NA,NA,7222,7006,2418,2476,3895,3817,1579,1594,3752,3599,1482,1436,487,312 141 | sub-RC4118,0,1,59.68854,ANT_1,ANT,NA,NA,NA,NA,1740547,NA,NA,9989,9753,4017,4139,5332,4956,2106,2173,4131,4298,1701,1512,488,510 142 | sub-RC4119,0,1,66.4221,ANT_1,ANT,NA,NA,NA,NA,1583780,NA,NA,9158,8759,3459,3609,5387,5175,2105,1721,4781,4510,1277,1897,624,376 143 | sub-RC4120,0,1,70.93856,ANT_1,ANT,NA,NA,NA,NA,1647291,NA,NA,10251,10061,3483,3716,5006,4953,1944,2127,4031,4572,2226,1938,460,365 144 | sub-RC4121,0,1,77.01095,ANT_1,ANT,NA,NA,NA,NA,1647980,NA,NA,8072,8054,3593,3197,4361,4151,1815,1644,2995,3052,1433,1272,269,302 145 | 
sub-RC4122,0,1,78.33601,ANT_1,ANT,NA,NA,NA,NA,1627698,NA,NA,8913,8722,3973,4065,4967,5295,1828,1928,3719,3691,1200,1385,242,397 146 | sub-RC4123,0,1,83.29109,ANT_1,ANT,NA,NA,NA,NA,1602069,NA,NA,6746,6505,3228,3666,4305,4375,1497,1499,2874,3233,1024,927,359,252 147 | sub-RC4125,0,1,76.8756,ANT_1,ANT,NA,NA,NA,NA,1612733,NA,NA,8866,8671,3517,3847,5033,4981,1874,2094,3953,4059,1679,1383,424,388 148 | sub-RC4126,0,1,86.66116,ANT_1,ANT,NA,NA,NA,NA,1521157,NA,NA,8308,8057,3157,3585,4761,3690,1658,1595,2867,2835,1357,1342,396,222 149 | sub-RC4128,0,1,62.45495,ANT_1,ANT,NA,NA,NA,NA,1392688,NA,NA,6704,6580,2891,3153,4632,4249,1645,1724,3206,3530,1649,1544,481,414 150 | sub-RC4129,0,1,58.43304,ANT_1,ANT,NA,NA,NA,NA,1513751,NA,NA,8558,8285,3113,3296,4466,4455,1821,1805,3768,3949,1601,1366,602,394 151 | sub-RC4130,0,2,61.95773,ANT_1,ANT,NA,NA,NA,NA,1320361,NA,NA,7909,7842,2566,2526,4008,3987,1893,1867,3468,3548,1255,1041,382,338 152 | sub-RC4131,0,1,64.8826,ANT_1,ANT,NA,NA,NA,NA,1586875,NA,NA,9239,8546,3855,3613,5396,5135,2109,2064,3817,3809,1621,1298,526,458 153 | sub-RC4201,1,1,59.41114,ANT_1,ANT,NA,NA,NA,NA,1595505,NA,NA,9056,8766,3606,3681,5504,5413,1894,1882,4178,4013,1604,1453,691,412 154 | sub-RC4204,1,2,67.90571,ANT_1,ANT,NA,NA,NA,NA,1270761,NA,NA,7698,7550,3246,3501,4578,4131,1603,1522,3686,3402,1533,1350,563,372 155 | sub-RC4205,1,1,56.63652,ANT_1,ANT,NA,NA,NA,NA,1330076,NA,NA,7896,7860,3295,3265,4520,4312,1803,1734,4146,4620,1305,1243,493,427 156 | sub-RC4206,1,2,51.35526,ANT_1,ANT,NA,NA,NA,NA,1345798,NA,NA,7842,7424,3447,3491,4254,4217,1603,1504,3610,3725,1397,1354,553,396 157 | sub-RC4207,1,2,41.83067,ANT_1,ANT,NA,NA,NA,NA,1437303,NA,NA,7697,7486,3108,3196,4903,4596,1800,1892,3671,3094,1409,938,628,513 158 | sub-RC4208,1,2,51.32668,ANT_1,ANT,NA,NA,NA,NA,1426361,NA,NA,8851,8181,3316,3335,4155,4226,1739,1841,3948,3370,1661,1536,423,408 159 | sub-RC4210,1,2,55.04227,ANT_1,ANT,NA,NA,NA,NA,1561959,NA,NA,9000,8251,3023,3192,5082,4585,1797,1923,2907,3050,1667,1281,732,492 
160 | sub-RC4211,1,2,67.73355,ANT_1,ANT,NA,NA,NA,NA,1345591,NA,NA,7828,7726,3612,3758,4680,4524,1926,1913,3777,3971,1541,1151,490,479 161 | sub-RC4212,1,1,55.52033,ANT_1,ANT,NA,NA,NA,NA,1471305,NA,NA,7510,7232,3570,4050,5129,4843,1715,1758,3893,3924,1294,1501,472,353 162 | sub-RC4213,1,1,47.33602,ANT_1,ANT,NA,NA,NA,NA,1469969,NA,NA,8782,8772,3685,3562,5214,4791,1837,1822,3852,3520,1298,1441,555,497 163 | sub-RC4214,1,1,63.33054,ANT_1,ANT,NA,NA,NA,NA,1464399,NA,NA,8720,8652,4074,4043,4614,4863,1971,2016,4214,4347,1719,1728,530,404 164 | sub-RC4215,1,2,51.01917,ANT_1,ANT,NA,NA,NA,NA,1316099,NA,NA,8339,8001,3342,3253,4430,4303,1712,1652,3589,3213,1517,1095,575,569 165 | sub-RC4217,1,1,74.85095,ANT_1,ANT,NA,NA,NA,NA,1629164,NA,NA,7509,7582,3276,3139,4274,4278,1572,1646,3413,3722,1382,1133,459,312 166 | sub-RC4218,1,2,66.91393,ANT_1,ANT,NA,NA,NA,NA,1433558,NA,NA,7808,7776,3471,3669,4479,4183,2156,2501,4109,4215,1272,1399,443,391 167 | sub-RC4219,1,2,75.99059,ANT_1,ANT,NA,NA,NA,NA,1387259,NA,NA,7609,7463,3218,3736,4212,4407,1705,1667,3650,3327,1426,1292,464,356 168 | sub-RC4220,1,1,67.73355,ANT_1,ANT,NA,NA,NA,NA,1764931,NA,NA,8916,8697,3262,3316,5219,5106,1907,1966,4340,4145,1732,1597,570,478 169 | sub-RC4221,1,2,66.09977,ANT_1,ANT,NA,NA,NA,NA,1249325,NA,NA,7970,7551,3262,3385,4513,4520,1716,1967,3496,3616,1472,1388,551,426 170 | sub-RC4224,1,2,73.11893,ANT_1,ANT,NA,NA,NA,NA,1479729,NA,NA,7937,7695,3268,3556,4434,3961,1703,1670,3439,3786,1483,1540,427,350 171 | sub-RC4225,1,1,72.94404,ANT_1,ANT,NA,NA,NA,NA,1679875,NA,NA,8600,8294,3598,3576,5024,5066,1952,1842,3903,3598,1927,1522,536,344 172 | sub-RC4226,1,2,69.34278,ANT_1,ANT,NA,NA,NA,NA,1571059,NA,NA,8950,8942,3705,3785,4980,4698,1896,2014,3860,3956,1711,1730,288,345 173 | sub-RC4227,1,1,69.04774,ANT_1,ANT,NA,NA,NA,NA,1632491,NA,NA,8840,8142,3472,3613,5389,5051,1914,1798,3691,3515,1393,873,636,345 -------------------------------------------------------------------------------- /course/setup.md: 
-------------------------------------------------------------------------------- 1 | # Setup for the workshop 2 | 3 | There are three ways to follow this workshop: 4 | 5 | 1. Full install (recommended): the full workshop experience, interactive, with all software dependencies. 6 | 2. Partial install (okay if it is the only possible option): almost the full workshop experience, interactive, with some contents as "just watch". 7 | 3. No install/passive (not recommended): a partial workshop experience, not interactive, with all content being "just watch". 8 | 9 | While option 2 is easy to follow in an interactive manner and option 3 might work for a quick overview, we don't recommend either of them, as getting `Docker`, `git`, etc. to work reliably on your machine is going to be very beneficial. This holds true for the workshop and especially beyond. By installing these tools, you will be equipped to continue right 10 | away and start using them, with a focus on reproducible neuroimaging, in your everyday research workflow. With that in mind, and to integrate other tools/resources focused on open and reproducible (neuro-/data) science, we 11 | put together a rather comprehensive set of install instructions below. While not all of them may be strictly necessary for the workshop, they will all help you a great deal going forward and are especially useful/needed given that we have to hold the workshop virtually due to the COVID-19 pandemic. 12 | 13 | Don't worry, you've got this! 14 | 15 | ![logo](https://media1.tenor.com/images/f72cb542d6b3e3c3421889e0a3d9628d/tenor.gif?itemid=4533805)\ 16 | https://media1.tenor.com/images/f72cb542d6b3e3c3421889e0a3d9628d/tenor.gif?itemid=4533805 17 | 18 | 19 | ## General things 20 | 21 | There are a few computing requirements for the course that are absolutely necessary (beyond the few software packages we would like you to install, described below): 22 | 23 | 1. 
You must have `administrator access` to your computer (i.e., you must be able to install things yourself without requesting IT approval). 24 | 1. You must have `at least 20 GB` of free disk space on your computer (but we would recommend more, to be safe). 25 | 1. If you are using `Windows` you must be using `Windows 10`; Windows 7 and 8 will not be sufficient for this course. 26 | 27 | If you foresee any of these being a problem, please reach out to one of the instructors about what steps you can take to ensure you are ready for the course start. 28 | 29 | ## Required software 30 | 31 | To get the most out of our course, we ask that you arrive with the following software already installed: 32 | 33 | - A command-line shell: `Bash` 34 | - A version control system: `Git` & `DataLad` 35 | - A remote-capable text editor: `VSCode` 36 | - `Python 3` via `Miniconda` 37 | - A virtualization system: `Docker` 38 | - A `GitHub` account 39 | - A modern browser 40 | 41 | If you already have all of the above software tools/packages installed, or are confident you’ll be able to install them by the time the course starts, you can jump straight to [checking your install](#checking-your-install). 42 | The rest of this page provides more detail on installation procedures for each of the above elements, with separate instructions for each of the three major operating systems (`Windows`, `Mac OS`, and `Linux`). 43 | 44 | ### Some quick general notes on instructions 45 | 46 | - There is no difference between `Enter` and `Return` in these instructions, so just press whichever equivalent your keyboard has whenever one is stated. 47 | - If you already have some of these things installed on your computer, that should (theoretically) be okay. 48 | However, you need to make sure that you are able to complete the steps described in [checking your install](#checking-your-install) without issue. 
49 | - For example, having multiple different `Python` installations on your computer can lead to incredibly frustrating issues that are very difficult to debug. 50 | As such, if you have already installed `Python` via some other application (not `Miniconda`/`Anaconda`), we strongly encourage you to uninstall it before following the instructions below. 51 | You _must_ have `Python` installed via `Miniconda` for this course. 52 | 53 | ### OS-specific installation instructions 54 | 55 | Select the tab that corresponds to your operating system and follow the instructions therein. 56 | 57 | ```{tabbed} Windows 58 | **Windows Subsystem for Linux (WSL)** 59 | 60 | 1. Search for `Windows PowerShell` in your applications; right click and select `Run as administrator`. 61 | Select `Yes` on the prompt that appears asking if you want to allow the app to make changes to your device. 62 | 2. Type the following into the PowerShell and then press `Enter`: 63 | 64 | Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Windows-Subsystem-Linux 65 | 66 | 3. Press `Enter` again when prompted to reboot your computer. 67 | 4. Once your computer has rebooted, open the Microsoft Store and search for "Ubuntu." 68 | Install the program labelled "Ubuntu 18.04" (not "Ubuntu 16.04" or "Ubuntu") by clicking the tile, pressing `Get`, and then `Install`. 69 | 5. Search for and open Ubuntu from your applications. 70 | There will be a slight delay (of a few minutes) while it finishes installing. 71 | 6. You will be prompted to `Enter new UNIX username`. 72 | You can use any combination of alphanumeric characters here for your username, but a good choice is your first initial followed by your last name (e.g., `jsmith` for John Smith). 73 | You will then be prompted to enter a new password. 74 | (Choose something easy to remember, as you will find yourself using it frequently.) 75 | 7. Right click on the top bar of the Ubuntu application and select "Properties". 
76 | Under the "Options" tab, under the "Edit Options" heading, make sure the box reading "Use Ctrl+Shift+C/V as Copy/Paste" is checked. 77 | Under the "Terminal" tab, under the "Cursor Shape" heading, make sure the box reading "Vertical Bar" is checked. 78 | Press "Okay" to save these settings and then exit the application. 79 | 80 | (The above step-by-step WSL instructions are distilled from [here](https://docs.microsoft.com/en-us/windows/wsl/install-win10) and [here](https://docs.microsoft.com/en-us/windows/wsl/initialize-distro). 81 | If you have questions during the installation procedure those resources may have answers!) 82 | 83 | From this point on, whenever the instructions specify to "open a terminal," please assume you are supposed to open the Ubuntu application. 84 | 85 | **Bash shell** 86 | 87 | You already have it, now that you’ve installed the WSL! 88 | 89 | **Git** 90 | 91 | You already have it, now that you’ve installed the WSL! 92 | 93 | **DataLad** 94 | 95 | Please follow the [fantastic install instructions of the `DataLad handbook`](http://handbook.datalad.org/en/latest/intro/installation.html#installation-and-configuration). 96 | 97 | **VSCode** 98 | 99 | 1. Go to https://code.visualstudio.com/ and click the download button, then run the `.exe` file. 100 | 1. Leave all the defaults during the installation with the following exception: 101 | - Please make sure the box labelled "Register Code as an editor for supported file types" is selected. 102 | 103 | **VSCode extensions** 104 | 105 | 1. Open the `Ubuntu` application. 106 | 1. Type `code .` into the terminal and press `Enter`. 107 | You should see a message reading "Installing VS Code Server" and then a new window will open up. 108 | 1. Press `Ctrl+Shift+P` in the new window that opens and type "Extensions: Install extensions" into the search bar that appears at the top of the screen.
109 | Select the appropriate entry from the dropdown menu that appears (there should be four entries; simply select the one that reads "Extensions: Install extensions"). 110 | 1. A new panel should appear on the left-hand side of the screen with a search bar. 111 | Search for each of the following extensions and press `Install` for the first entry that appears. (The author listed for all of these extensions should be "Microsoft".) 112 | - `Python` (n.b., you will need to reload `VSCode` after installing this) 113 | - `Live Share` (n.b., you may need to press "Ctrl/Cmd+Shift+P" and type "install extensions" again after installing this) 114 | - `Live Share Extension Pack` 115 | - `Docker` 116 | - `Remote - WSL` 117 | 118 | **Python** 119 | 120 | 1. Open a new terminal and type the following lines (separately) into the terminal, pressing `Enter` after each one: 121 | 122 | wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh 123 | bash Miniconda3-latest-Linux-x86_64.sh 124 | 125 | 1. A license agreement will be displayed and the bottom of the terminal will read `--More--`. 126 | Press `Enter` or the space bar until you are prompted with "Do you accept the license terms? [yes|no]." 127 | Type `yes` and then press `Enter` 128 | 1. The installation script will inform you that it is going to install into a default directory (e.g., `/home/$USER/miniconda3`). 129 | Leave this default and press `Enter`. 130 | 1. When you are asked "Do you wish the installer to initialize Miniconda3 by running conda init? [yes|no]," type `yes` and press `Enter`. 131 | Exit the `terminal` once the installation has finished. 132 | 1. Re-open the `Ubuntu` application. 133 | Type `which python` into the terminal and it should return a path (e.g., `/home/$USER/miniconda3/bin/python`). 134 | - If you do not see a path like this then please try typing `conda init`, closing your terminal, and repeating this step. 
135 | If your issue is still not resolved skip the following step and contact an instructor. 136 | 1. Type the following to remove the installation script that was downloaded: 137 | 138 | rm ./Miniconda3-latest-Linux-x86_64.sh 139 | 140 | 141 | **Python packages** 142 | 143 | Open a `terminal` and type the following commands: 144 | 145 | conda config --append channels conda-forge 146 | conda config --set channel_priority strict 147 | conda install -y flake8 ipython jupyter jupyterlab matplotlib nibabel nilearn numpy pandas scipy seaborn 148 | 149 | **Docker** 150 | 151 | Unfortunately, `Docker` for `Windows` is a bit of a mess. 152 | The recommended version of `Docker` to install varies dramatically depending not only on which version of `Windows` you have installed (e.g., `Windows 10 Home` versus `Professional`/`Enterprise`/`Education`), but also which _build_ of `Windows` you have. 153 | As such, developing a comprehensive set of instructions for installing `Docker` is rather difficult. 154 | 155 | For this course, you will need to install either [Docker Toolbox for Windows](https://docs.docker.com/toolbox/toolbox_install_windows/) or [Docker for Windows Desktop](https://docs.docker.com/docker-for-windows/install/). 156 | Which you install will depend on your OS. 157 | **PLEASE NOTE** that installing `Docker for Windows Desktop` will disable `VirtualBox` on your computer. 158 | If you actively use `VirtualBox` we recommend you install `Docker Toolbox`. 159 | 160 | (**Note**: the below instructions assume you are installing `Docker Toolbox`. 161 | Because there are fewer requirements for `Docker Toolbox`, it is likely that you will be able to install this more easily.) 162 | 163 | 1. Download the latest [Docker Toolbox installer](https://github.com/docker/toolbox/releases/download/v19.03.1/DockerToolbox-19.03.1.exe) (note: that link will automatically download the file) 164 | 1. 
Run the downloaded `.exe` file and leave all the defaults during the installation procedure. 165 | Click `Yes` on the prompt that appears asking if the application can make changes to your computer. 166 | 1. Search for and open the newly-installed "Docker Quickstart" application. 167 | Again, click `Yes` on the prompt that appears asking if the application can make changes to your computer. 168 | The application will do a number of things to finish installing and setting up Docker. 169 | 1. Once you see a `$` prompt, type `docker run hello-world`. 170 | A brief introductory message should be printed to the screen. 171 | 1. Close the "Docker Quickstart" application and open a terminal (i.e., the Ubuntu application). 172 | 1. Copy-paste the following commands. 173 | You will be prompted to enter your password once. 174 | 175 | # Update the apt package list. 176 | sudo apt-get update -y 177 | # Install Docker's package dependencies. 178 | sudo apt-get install -y \ 179 | apt-transport-https \ 180 | ca-certificates \ 181 | curl \ 182 | software-properties-common 183 | # Download and add Docker's official public PGP key. 184 | curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add - 185 | # Verify the fingerprint. 186 | sudo apt-key fingerprint 0EBFCD88 187 | # Add the `stable` channel's Docker upstream repository. 188 | sudo add-apt-repository \ 189 | "deb [arch=amd64] https://download.docker.com/linux/ubuntu \ 190 | $(lsb_release -cs) \ 191 | stable" 192 | # Update the apt package list (for the new apt repo). 193 | sudo apt-get update -y 194 | # Install the latest version of Docker CE. 195 | sudo apt-get install -y docker-ce 196 | # Allow your user to access the Docker CLI without needing root access. 197 | sudo usermod -aG docker $USER 198 | 199 | 1. Close and re-open the terminal. 200 | 1. Type `pip install docker-compose`. 201 | 1. Type `powershell.exe "docker-machine config"`.
202 | You should get output similar to the following: 203 | 204 | --tlsverify 205 | --tlscacert="C:\\Users\\\\.docker\\machine\\machines\\default\\ca.pem" 206 | --tlscert="C:\\Users\\\\.docker\\machine\\machines\\default\\cert.pem" 207 | --tlskey="C:\\Users\\\\.docker\\machine\\machines\\default\\key.pem" 208 | -H=tcp://xxx.xxx.xx.xxx:xxxx 209 | 210 | 211 | where `` will have an actual value (likely your Windows username), and `xxx.xxx.xx.xxx:xxxx` will be a series of numbers. 212 | If you don't get this output then something has gone wrong. 213 | Please make sure you were able to run the `docker run hello-world` command, above. 214 | If you were and you still don't receive this output, please contact one of the instructors. 215 | 216 | 1. You will use the outputs of the above command to modify the commands below before running them in the terminal. 217 | First, take the numbers printed in place of the `x`s on the output of the line `-H=tcp://xxx.xxx.xx.xxx:xxxx` from above and replace the placeholder `xxx.xxx.xx.xxx:xxxx` on the first command below (`export DOCKER_HOST`). 218 | Second, take whatever value is printed in place of `` above and replace the `` placeholder on the second command below (`export DOCKER_CERT_PATH`). 219 | Once you have updated the commands appropriately, copy and paste them into the terminal: 220 | 221 | echo "export DOCKER_HOST=tcp://xxx.xxx.xx.xxx:xxxx" >> $HOME/.bashrc 222 | echo "export DOCKER_CERT_PATH=/mnt/c/Users//.docker/machine/certs" >> $HOME/.bashrc 223 | echo "export DOCKER_TLS_VERIFY=1" >> $HOME/.bashrc 224 | 225 | 226 | 1. Close and re-open a terminal (i.e., the Ubuntu application). 227 | Type `docker run hello-world`. 228 | The same brief introductory message you saw before should be printed to the screen.
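Before re-opening the terminal, you can optionally sanity-check that the three `DOCKER_*` variables are visible in your shell. The snippet below is our own small sketch (it is not part of the official Docker instructions); it uses `bash` indirect expansion to report each variable:

    # Report which of the three Docker variables are set in this shell.
    check_docker_env () {
        local var missing=0
        for var in DOCKER_HOST DOCKER_CERT_PATH DOCKER_TLS_VERIFY; do
            if [ -z "${!var}" ]; then
                echo "MISSING: $var"
                missing=1
            else
                echo "OK: $var=${!var}"
            fi
        done
        return $missing
    }
    # If anything is reported MISSING, re-open the terminal so ~/.bashrc is re-read.
    check_docker_env || echo "Re-open your terminal and try again."
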
229 | 230 | **Note**: If you restart your computer (or somehow otherwise shut down the Docker VM) you will need to re-open the "Docker Quickstart" application and wait until you see the `$` prompt again before your `docker` commands will work! 231 | If you are having problems running `docker` commands in the terminal, try re-opening the "Docker Quickstart" application. 232 | 233 | (The above step-by-step instructions are distilled from [here](https://docs.docker.com/toolbox/toolbox_install_windows/) and [here](https://medium.com/@joaoh82/setting-up-docker-toolbox-for-windows-home-10-and-wsl-to-work-perfectly-2fd34ed41d51). 234 | If you have questions during the installation procedure please check those links for potential answers!) 235 | ``` 236 | 237 | ```{tabbed} Linux 238 | **Bash shell** 239 | 240 | You already have it! 241 | Depending on which version of `Linux` you’re running, you may need to type `bash` inside the `terminal` to access it. 242 | To check whether this is necessary, follow these steps: 243 | 244 | 1. Open a `terminal` and type `echo $SHELL`. 245 | If it reads `/bin/bash` then you are all set! 246 | If not, whenever the instructions read "open a terminal," please assume you are to open a terminal, type `bash`, and then proceed with the instructions as specified. 247 | 248 | **Git** 249 | 250 | You may already have it; try typing `sudo apt-get install git` (Ubuntu, Debian) or `sudo yum install git` (Fedora) inside the `terminal`. 251 | If you are prompted to install it, follow the on-screen instructions to do so. 252 | 253 | **DataLad** 254 | 255 | Please follow the [fantastic install instructions of the `DataLad handbook`](http://handbook.datalad.org/en/latest/intro/installation.html#installation-and-configuration). 256 | 257 | **VSCode** 258 | 259 | 1. Go to https://code.visualstudio.com/ and click the download button for either the .deb (`Ubuntu`, `Debian`) or the .rpm (`Fedora`, `CentOS`) file. 260 | 1.
Double-click the downloaded file to install `VSCode`. 261 | (You may be prompted to type your administrator password during the install). 262 | 263 | **VSCode extensions** 264 | 265 | 1. Open the `Visual Studio Code` application. 266 | 1. Press `Ctrl+Shift+P` in the new window that opens and type "Extensions: Install extensions" into the search bar that appears at the top of the screen. 267 | Select the appropriate entry from the dropdown menu that appears (there should be four entries; simply select the one that reads "Extensions: Install extensions"). 268 | 1. A new panel should appear on the left-hand side of the screen with a search bar. 269 | Search for each of the following extensions and press `Install` for the first entry that appears. (The author listed for all of these extensions should be "Microsoft".) 270 | - `Python` (n.b., you will need to reload `VSCode` after installing this) 271 | - `Live Share` (n.b., you may need to press "Ctrl/Cmd+Shift+P" and type "install extensions" again after installing this) 272 | - `Live Share Extension Pack` 273 | - `Docker` 274 | 275 | **Python** 276 | 277 | 1. Open a new terminal and type the following lines (separately) into the `terminal`, pressing `Enter` after each one: 278 | 279 | wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh 280 | bash Miniconda3-latest-Linux-x86_64.sh 281 | 282 | 283 | 1. A license agreement will be displayed and the bottom of the terminal will read `--More--`. 284 | Press `Enter` or the space bar until you are prompted with "Do you accept the license terms? [yes|no]." 285 | Type `yes` and then press `Enter` 286 | 1. The installation script will inform you that it is going to install into a default directory (e.g., `/home/$USER/miniconda3`). 287 | Leave this default and press `Enter`. 288 | 1. When you are asked "Do you wish the installer to initialize Miniconda3 by running conda init? [yes|no]," type `yes` and press `Enter`. 
289 | Exit the `terminal` once the installation has finished. 290 | 1. Re-open a new `terminal`. 291 | Type `which python` into the `terminal` and it should return a path (e.g., `/home/$USER/miniconda3/bin/python`). 292 | - If you do not see a path like this then please try typing `conda init`, closing your terminal, and repeating this step. 293 | If your issue is still not resolved, skip the following step and contact an instructor. 294 | 1. Type the following to remove the installation script that was downloaded: 295 | 296 | rm ./Miniconda3-latest-Linux-x86_64.sh 297 | 298 | **Python packages** 299 | 300 | Open a terminal and type the following commands: 301 | 302 | conda config --append channels conda-forge 303 | conda config --set channel_priority strict 304 | conda install -y flake8 ipython jupyter jupyterlab matplotlib nibabel nilearn numpy pandas scipy seaborn 305 | 306 | 307 | **Docker** 308 | 309 | 1. You will be following different instructions depending on your distro ([Ubuntu](https://docs.docker.com/engine/install/ubuntu/), [Debian](https://docs.docker.com/engine/install/debian/), [Fedora](https://docs.docker.com/engine/install/fedora/), [CentOS](https://docs.docker.com/engine/install/centos/)). 310 | Make sure to follow the “Install using the repository” method! 311 | 1. Once you’ve installed Docker, make sure to follow the [post-install instructions](https://docs.docker.com/engine/install/linux-postinstall/) as well. 312 | You only need to do the “Manage Docker as a non-root user” and “Configure Docker to start on boot” steps. 313 | 1. Open a new terminal and type `docker run hello-world`. 314 | A brief introductory message should be printed to the screen. 315 | ``` 316 | 317 | ```{tabbed} Mac OS 318 | **Bash shell** 319 | 320 | You already have it! 321 | Depending on which version of Mac OS you’re running, you may need to type `bash` inside the `terminal` to access it. 322 | To check whether this is necessary, follow these steps: 323 | 324 | 1.
Open a terminal and type `echo $SHELL`. 325 | If it reads `/bin/bash` then you are all set! 326 | 327 | Note: If you are using `Mac Catalina (10.15.X)`, your default `shell` is likely `zsh`, not `bash`. 328 | To set the default to `bash`, type `chsh -s /bin/bash` in the `terminal`, enter your password when prompted, and then close + re-open the `terminal`. 329 | 330 | **Git** 331 | 332 | You may already have it! 333 | Try opening a terminal and typing `git --version`. 334 | If you do not see something like “git version X.XX.X” printed out, then follow these steps: 335 | 336 | 1. Follow [this link](https://sourceforge.net/projects/git-osx-installer/files/git-2.23.0-intel-universal-mavericks.dmg/download?use_mirror=autoselect) to automatically download an installer. 337 | 1. Double-click the downloaded file (`git-2.23.0-intel-universal-mavericks.dmg`) and then double-click the `git-2.23.0-intel-universal-mavericks.pkg` icon inside the dmg that is opened. 338 | 1. Follow the on-screen instructions to install the package. 339 | 340 | **DataLad** 341 | 342 | Please follow the [fantastic install instructions of the `DataLad handbook`](http://handbook.datalad.org/en/latest/intro/installation.html#installation-and-configuration). 343 | 344 | **VSCode** 345 | 346 | 1. Go to https://code.visualstudio.com/ and click the download button. 347 | 1. Unzip the downloaded file (e.g., `VSCode-darwin-stable.zip`) and move the resulting `Visual Studio Code` file to your Applications directory. 348 | 349 | **VSCode extensions** 350 | 351 | 1. Open the Visual Studio Code application. 352 | 1. Type `Cmd+Shift+P` and then enter "Shell command: Install 'code' command in PATH" into the search bar that appears at the top of the screen. 353 | Select the highlighted entry. 354 | A notification box should appear in the bottom-right corner indicating that the command was installed successfully. 355 | 1.
Type `Cmd+Shift+P` again and then enter "Extensions: Install extensions" into the search bar. 356 | Select the appropriate entry from the dropdown menu that appears (there should be four entries; simply select the one that reads "Extensions: Install extensions"). 357 | 1. A new panel should appear on the left-hand side of the screen with a search bar. 358 | Search for each of the following extensions and press `Install` for the first entry that appears. (The author listed for all of these extensions should be "Microsoft".) 359 | - `Python` (n.b., you will need to reload VSCode after installing this) 360 | - `Live Share` (n.b., you may need to press "Ctrl/Cmd+Shift+P" and type "install extensions" again after installing this) 361 | - `Live Share Extension Pack` 362 | - `Docker` 363 | 364 | **Python** 365 | 366 | 1. Open a new `terminal` and type the following lines (separately) into the terminal, pressing `Enter` after each one: 367 | 368 | curl -O https://repo.anaconda.com/miniconda/Miniconda3-latest-MacOSX-x86_64.sh 369 | bash Miniconda3-latest-MacOSX-x86_64.sh 370 | 371 | 372 | 1. A license agreement will be displayed and the bottom of the `terminal` will read `--More--`. 373 | Press `Enter` or the space bar until you are prompted with "Do you accept the license terms? [yes|no]." 374 | Type `yes` and then press `Enter` 375 | 1. The installation script will inform you that it is going to install into a default directory (e.g., `/home/$USER/miniconda3`). 376 | Leave this default and press `Enter`. 377 | 1. When you are asked "Do you wish the installer to initialize Miniconda3 by running conda init? [yes|no]," type `yes` and press `Enter`. 378 | Exit the terminal once the installation has finished. 379 | 1. Re-open a terminal. 380 | Type `which python` into the terminal and it should return a path (e.g., `/home/$USER/miniconda3/bin/python`). 
381 | - If you do not see a path like this then please try typing `conda init`, closing your terminal, and repeating this step. 382 | If your issue is still not resolved, skip the following step and contact an instructor. 383 | 1. Type the following to remove the installation script that was downloaded: 384 | 385 | rm ./Miniconda3-latest-MacOSX-x86_64.sh 386 | 387 | 388 | **Python packages** 389 | 390 | Open a terminal and type the following commands: 391 | 392 | conda config --append channels conda-forge 393 | conda config --set channel_priority strict 394 | conda install -y flake8 ipython jupyter jupyterlab matplotlib nibabel nilearn numpy pandas scipy seaborn 395 | 396 | 397 | **Docker** 398 | 399 | 1. Go to https://hub.docker.com/editions/community/docker-ce-desktop-mac/ and press “Get Docker”. 400 | 1. Open the “Docker.dmg” file that is downloaded and drag and drop the icon to the Applications folder. 401 | 1. Open the Docker application and enter your password. 402 | An icon will appear in the menu bar at the top-right of the screen. 403 | Wait until it reads “Docker Desktop is now up and running!” 404 | 1. Open a new terminal and type `docker run hello-world`. 405 | A brief introductory message should be printed to the screen. 406 | 407 | (The above step-by-step Docker instructions are distilled from [here](https://docs.docker.com/docker-for-mac/install/). 408 | If you have questions during the installation procedure please check that link for potential answers!) 409 | ``` 410 | 411 | **Note**: If the instructions aren't working and you have spent more than 15-20 minutes troubleshooting on your own, reach out with the exact problems you're having or join one of the installation clinics. 412 | One of the instructors will try to get back to you quickly to help resolve the situation. 413 | 414 | 415 | 416 | ### GitHub account 417 | 418 | Go to https://github.com/join/ and follow the on-screen instructions to create an account.
419 | It is a good idea to associate this with your university e-mail (if you have one) as this will entitle you to sign up for the [GitHub Student Developer Pack](https://education.github.com/pack), which comes with some nice free bonuses. 420 | 421 | 422 | ### Modern web browser 423 | 424 | Install Firefox or Chrome. 425 | (Safari will also work.) 426 | Microsoft Edge is not modern, despite what Microsoft might try and otherwise tell you. 427 | 428 | ## Checking your install 429 | 430 | Now that you've installed everything, it's time to check that everything works as expected! 431 | Type the following into your terminal: 432 | 433 | bash <( curl -s https://raw.githubusercontent.com/ReproNim/DGPA_workshop_2022/main/check_install.sh) 434 | 435 | If you installed everything correctly, you should see a message informing you as such. 436 | If any problems were detected, you should receive some brief instructions on what is wrong, with potential suggestions on how to remedy it. 437 | If you followed these instructions step-by-step and cannot resolve the issue, please contact one of the course instructors for more help. 438 | 439 | Yeah, you did it! Great job! 440 | 441 | ![logo](https://media1.tenor.com/images/d5ebabf248130ec3842ed3b8627fd4f2/tenor.gif?itemid=4770158)\ 442 | https://media1.tenor.com/images/d5ebabf248130ec3842ed3b8627fd4f2/tenor.gif?itemid=4770158 443 | 444 | 445 | ## Enter the matrix 446 | 447 | Once you have reached this point, you should be ready to enter the matrix and follow the workshop in your preferred way. Congrats, fantastic work!
448 | 449 | ![logo](https://media1.tenor.com/images/e5c21d98f56c4af119b4e14b6a9df893/tenor.gif?itemid=4011236)\ 450 | https://media1.tenor.com/images/e5c21d98f56c4af119b4e14b6a9df893/tenor.gif?itemid=4011236 451 | 452 | -------------------------------------------------------------------------------- /course/exercise.md: -------------------------------------------------------------------------------- 1 | # The Publication Exercise for the course 2 | 3 | # Prerequisites 4 | Don't have an account on GitHub? Please create one on [GitHub](https://github.com). 5 | You will need a **token**: navigate to your account (top right), go to "Developer settings" and then "Personal access tokens", and create a token. Select the "repo" scope, and keep the token handy to do things like `git push` (see below). 6 | 7 | For the exercises below, it will also be important to have a 'fork' of the "ReproNim/OHBMEducation-2022" GitHub repository. To do this: 8 | 1. Navigate your browser to https://github.com/ReproNim/OHBMEducation-2022 9 | 2. Select 'fork' in the row of operations at the top of the window 10 | 3. Select 'Create Fork' 11 | 12 | You now have your own copy of the course materials to use in the future. 13 | 14 | # JupyterHub 15 | JupyterHub provides the computational platform we will use. We can all use a common platform with all our needed software prerequisites 16 | and environment pre-set. This eliminates the hassle of making all this work on everyone's local computers. However, the downside is that you 17 | may not be able to replicate this example 'at home' unless/until you solve the software prerequisites on your own computational platform. 18 | That's OK, ***ReproNim*** is here to help after the course with your own 'home' computational platform. 19 | 20 | ## Meet your ReproHub 21 | The ReproHub is hosted at a URL that we will give you in class. Point your browser to this URL. You will use your GitHub authorization in order to log in.
22 | Once you are on the site, you will see the 'Launcher', from which you can 'launch' many ships. For now, we'll want a command-line "terminal". 23 | Click that icon to get started. 24 | 25 | Amongst the first things we need to do on this platform is to let *git* know who you are. To do this: 26 | 27 | Set your username: ```git config --global user.name "FirstName LastName"``` 28 | 29 | Set your email address: ```git config --global user.email "YourUserName@example.com"``` 30 | 31 | # The Steps 32 | We will do our re-executable publication in a sequence of steps that are designed to accomplish all the necessary tasks in the timeframe allocated by our 33 | course. 34 | 35 | ## Create a GitHub Repo for your Publication 36 | ***Note:*** We're not going to do this today, but in the event of a 'real' paper we would do the following. 37 | 38 | Our final publication should include a GitHub repository, where we can connect all the parts of our publication in a central, 39 | shared location. A GitHub repo is FAIR: Findable, Accessible, Interoperable, and Reusable. Since this will be 'self-published' and not peer-reviewed, 40 | we really should think of this as a pre-print of our 'publication'. Were this an actual publication, we would create a more formal document, complying 41 | with the norms of traditional scientific publication, and submit it to a peer-reviewed journal (after depositing the preprint as well). 42 | 43 | ## Pre-Registration 44 | In this case, we do have a specific set of hypotheses we are testing. In general, we *should* pre-register this plan (or consider a [registered report]()) 45 | but those steps are considered to be outside the scope of what we can accomplish in our time together. We highly recommend you read more about this 46 | topic [here](https://help.osf.io/article/158-create-a-preregistration). 47 | 48 | ## Collect Data 49 | In this case we will use pre-collected data.
Specifically, each of you will be assigned a subset of the 50 | [OpenNeuro ds001907](https://openneuro.org/datasets/ds001907/versions/3.0.2) collection. The description of this study can be found at the 51 | OpenNeuro website, and this data was published in the following [data paper](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7217223/). There are 52 | 46 subjects in this overall dataset (25 healthy aging, 21 Parkinson's Disease). There are many other potential sources of Parkinson's disease data: [NITRC-IR](https://www.nitrc.org/xnat/index.php), [Parkinson's Progression Markers Initiative](https://www.nitrc.org/projects/ppmi), [OpenNeuro](https://www.nitrc.org/projects/openneuro), etc. 53 | 54 | ### Data Subset 55 | Each student (or student group) will be assigned a subset of the above dataset for their analysis. This is both for practical purposes, as there is 56 | time to do only a limited amount of data analysis in this course, but also for didactic purposes, so that we can share (and aggregate) our individual 57 | results in support of both a *meta-analysis* amongst our various subsets and a *mega-analysis* by combining all of our individual results together. 58 | For our purposes here, each student/team will get 5 randomly assigned healthy aging subjects and 5 subjects with Parkinson's Disease (PD). We know, 59 | 'your study' will be underpowered; but, frankly, it would still be underpowered with hundreds of subjects. We'll talk about *that* separately. 60 | 61 | #### Let's Do It! 62 | On the JupyterHub, in your home directory (i.e., the prompt in your terminal window will look something like ```jovyan@jupyter-YourGithubUsername:~$```), let's make a directory for your data: 63 | ``` 64 | cd 65 | mkdir my_data 66 | cd my_data 67 | ``` 68 | 69 | Now we are going to 'fork' (make a copy of) our specially prepared, reduced version of the ds001907 dataset. To do this, we will: 70 | 1.
In your browser, open up a new tab to the dataset's repo: https://github.com/ReproNim/ds001907-EDU 71 | 2. 'Fork' this dataset to your own GitHub account. 72 | 73 | ![picture](../pics/GitHub_fork.png) 74 | 75 | Now, on the JupyterHub in your my_data directory, let's clone your fork of this dataset (please replace "YOUR-GITHUB-LOGIN" with your GitHub username): 76 | ``` 77 | cd ~/my_data 78 | git clone https://github.com/YOUR-GITHUB-LOGIN/ds001907-EDU.git my_ds001907-EDU 79 | ``` 80 | You now have the 'complete' dataset in the my_ds001907-EDU directory. 81 | 82 | Now, you can teach git to remember you, to save typing later on: 83 | ``` 84 | cd my_ds001907-EDU 85 | git config credential.helper 'cache --timeout=3600' 86 | ``` 87 | 88 | Let's create your own specific working subset from this master dataset. You (or your working group) will receive a "Group_Number" (e.g., Group_10), which assigns you a set of 10 cases to work with in class. We then need to: 89 | 1. Remove the sub-* directories from the master dataset (that you forked and cloned to your home directory) that are **not** in your assigned dataset 90 | 2. Remove the lines of the *participants.tsv* file that are not for your subjects (your assigned 10 cases) 91 | 92 | You can see the cases that are in your group with the following command: 93 | ``` 94 | more ~/Exercise-OHBM2022/Exercise/Groups/Group_10 95 | ``` 96 | which should generate output like: 97 | ``` 98 | Group 10 99 | sub-RC4101 100 | sub-RC4128 101 | sub-RC4129 102 | sub-RC4130 103 | sub-RC4131 104 | sub-RC4206 105 | sub-RC4207 106 | sub-RC4208 107 | sub-RC4210 108 | sub-RC4211 109 | ``` 110 | We have a simple 'helper script' that will create this subset of directories for you (replace "Group_Number" with your assigned group, e.g., Group_10): 111 | ``` 112 | python ~/Exercise-OHBM2022/scripts/data_subset.py -g Group_Number 113 | ``` 114 | 115 | Next, let's confirm that your working subset is still a happy BIDS dataset.
The dataset you cloned is actually a DataLad dataset (which we will discuss later), so let's clean up the DataLad aspect 116 | of this new data subset and run the BIDS validator (locally as a Singularity container application, in a way that deals with a DataLad dataset): 117 | ``` 118 | datalad save -m "My new dataset" . 119 | ``` 120 | 121 | And then actually run the validator (using singularity): 122 | ``` 123 | cd ~/my_data 124 | singularity exec --bind $PWD/my_ds001907-EDU:/data /shared/sing/bids_validator.simg bids-validator /data --ignoreSymlinks --ignoreNiftiHeaders 125 | ``` 126 | 127 | This dataset will (hopefully) pass, and if it does, that outcome will be reported to you ("This dataset appears to be BIDS compatible.") by the validator. You can ignore the warnings about 128 | "...Tabular file contains custom columns..." for now. 129 | 130 | Great, you now have a valid BIDS (and DataLad) dataset. 131 | 132 | Now you can "publish" this dataset by pushing it back to your GitHub repo: 133 | ``` 134 | cd ~/my_data/my_ds001907-EDU 135 | git push 136 | ``` 137 | ***Note:*** Git may or may not ask you for your username and password, depending on how you have your git account set up; by password, it really means the ***token*** that you saved earlier. Make sure you have your token... 138 | 139 | Congratulations! You have now published your dataset by putting it in a GitHub repo, where it is now both findable and citable. 140 | 141 | Now that we have our dataset, and have it 'published', we can prepare to use this dataset in our analysis. We will do this in two main steps: 1) setting up our analysis environment, and 2) running the analysis. 142 | 143 | 144 | ## Setting Up the Analysis Environment 145 | ``` 146 | cd 147 | datalad create -c text2git my_analysis 148 | cd my_analysis 149 | ``` 150 | Install all desired "components": 151 | ``` 152 | datalad install -d .
-s https://github.com/YOUR-GITHUB-LOGIN/ds001907-EDU.git rawdata 153 | ``` 154 | ``` 155 | datalad install -d . -s https://github.com/ReproNim/containers containers 156 | mkdir code 157 | datalad install -d . -s https://github.com/proj-nuisance/simple_workflow code/simple_workflow 158 | ``` 159 | Make some working space and tell git to ignore it: 160 | ``` 161 | mkdir workdir 162 | echo workdir > .gitignore 163 | datalad save -m "ignore workdir" .gitignore 164 | ``` 165 | 166 | 167 | # Data Analysis 168 | We need to execute an analysis that supports the hypotheses we are considering. 169 | 170 | ## The Hypotheses 171 | Our hypotheses are drawn from the findings of the [ENIGMA Consortium](https://enigma.ini.usc.edu/). Specifically, the 172 | [ENIGMA Parkinson's Disease subgroup](https://enigma.ini.usc.edu/ongoing/enigma-parkinsons/) has identified a number of salient structural findings, 173 | reported in [this paper](https://pubmed.ncbi.nlm.nih.gov/34288137/). From these findings, we have selected the following *a priori* hypotheses: 174 | 1. ICV is slightly larger in PD patients 175 | 2. Thalamic volumes are larger (especially in earlier-stage PD) 176 | 3. Putamen volumes are smaller 177 | 178 | We can assess these questions using software that performs a volumetric analysis of the T1 structural imaging. For an efficient volumetric 179 | analysis, we have selected the [FSL software]() suite: using the [Brain Extraction Tool]() we can determine total brain volume, [FAST]() gives us 180 | tissue (gray, white, CSF) volumes, and [FIRST]() yields subcortical structural volumes. The run time for each subject should be approximately 181 | 10 minutes.
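All of the FSL volumetric outputs mentioned above come down to the same arithmetic: a structure's volume in mm^3 is its voxel count scaled by the volume of a single voxel. A minimal sketch of that conversion (the numbers are illustrative, not part of the course pipeline):

```python
def volume_mm3(n_voxels, pixdims_mm):
    """Structure volume = voxel count x single-voxel volume (mm^3),
    the same arithmetic behind FSL's volumetric reports."""
    vox_vol_mm3 = 1.0
    for dim in pixdims_mm:
        vox_vol_mm3 *= dim
    return n_voxels * vox_vol_mm3

# e.g. 8142 voxels at 1.0 x 1.0 x 1.0 mm resolution
print(volume_mm3(8142, (1.0, 1.0, 1.0)))  # 8142.0
```

Note that this is why voxel dimensions matter when comparing volumes across scans: the same voxel count at 1.2 mm isotropic resolution is a very different volume.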
182 | 183 | ## The Container 184 | It turns out we already have a container that runs the set of FSL tools we need (colloquially known as the 'simple1' container). It was used in 185 | the paper [Ghosh, et al, "A very simple, re-executable neuroimaging publication"](https://f1000research.com/articles/6-124), and is accessible 186 | already through the DataLad and ReproNim/Containers infrastructure. 187 | 188 | ``` 189 | datalad containers-run \ 190 | -n containers/repronim-simple-workflow \ 191 | --input 'rawdata/sub-RC4*/ses-1/anat/sub-*_ses-1_T1w.nii.gz' \ 192 | code/simple_workflow/run_demo_workflow.py \ 193 | -o . -w workdir --plugin_args 'dict(n_procs=10)' '{inputs}' 194 | ``` 195 | 196 | ***Congratulations! You have now run a completely re-executable analysis to generate volumetric results on your anatomic data!*** 197 | Now, let's do something with these results. 198 | 199 | First, let's get all of our *stuff* represented in a more completely standard representation. 200 | 201 | # Standards 202 | ## Imaging Data 203 | Our raw imaging data is in pretty good shape. It is in BIDS already (although we did have a warning about not having a participants.json 'dictionary'). 204 | An issue with BIDS, however, is that even compliant BIDS datasets are not completely 'self-described'. The "diagnosis" field, for example, could have been 205 | called anything (dx, DIAG, condition, etc.), and therefore 'someone' would still need to tell you what this field actually is, if they handed you 206 | this file. [NIDM](http://nidm.nidash.org/) was developed to provide a standardised way to represent what our various data fields 'mean'. For 'diagnosis', to continue the 207 | example, I could associate that field with a reference to a definition of what we mean for this field, like this [link](TODO). So, our next step is 208 | to represent our imaging data in the NIDM representation. 209 | 210 | In the my_analysis directory...
211 | ``` 212 | bidsmri2nidm -d $PWD/rawdata -o $PWD/rawdata/my_nidm.ttl 213 | ``` 214 | 215 | You will need to answer a number of questions about your data. Details of an example session are shown [here](bidsmri2nidm.txt). 216 | You may get some 'warnings'; don't worry about these (including one regarding a missing INTERLEX API key). This process generates, in your 217 | rawdata BIDS directory, a 'participants.json' file and the 'my_nidm.ttl' file. 218 | 219 | 220 | ## Standardized Representation of the Results 221 | The volumetric results for each structure measured are 222 | packaged in a .json representation. This .json can be converted into the NIDM semantically encoded results (.ttl). 223 | First, let's generate a listing of your cases: 224 | ``` 225 | ls -d1 rawdata/sub-* | xargs -n 1 basename > cases.txt 226 | ``` 227 | cases.txt now lists your cases. Next, looping over all of your cases, let's convert the .json into NIDM and merge it into your main NIDM representation: 228 | 229 | ***At the moment (6/9/22) this will fail, since we don't have fslsegstats2nidm installed globally. If you are checking this, you can install it 230 | for yourself by ```pip install https://github.com/ReproNim/fsl_seg_to_nidm/archive/refs/heads/master.zip```*** 231 | 232 | ***BASH*** 233 | ``` 234 | for i in `cat cases.txt`; 235 | do 236 | echo "Working on $i file..."; 237 | fslsegstats2nidm -f $PWD/${i}_ses-1_T1w/segstats.json -subjid $i -o $PWD/file.ttl -n $PWD/rawdata/my_nidm.ttl; 238 | done 239 | ``` 240 | 241 | This should generate output like the following: 242 | ``` 243 | sub-RC4101 244 | Found subject ID: sub-RC4101 in NIDM file (agent: http://iri.nidash.org/c485338e-d9d7-11ec-8cef-acde48001122) 245 | Writing Augmented NIDM file... 246 | sub-RC4227 247 | Found subject ID: sub-RC4227 in NIDM file (agent: http://iri.nidash.org/d4a9e7ce-d9d7-11ec-8cef-acde48001122) 248 | Writing Augmented NIDM file... 249 | ``` 250 | and your volumes should now be included in your my_nidm.ttl file!
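Conceptually, the BASH loop above walks each case's workflow output directory and folds its segstats.json volume measures into the shared NIDM graph. A minimal Python sketch of that gathering step, for intuition only (the flat `{"measure": value}` segstats.json layout used here is an assumption for illustration, not fslsegstats2nidm's actual schema):

```python
import json
import pathlib

def collect_volumes(workdir):
    """Gather per-subject volume measures from hypothetical
    sub-*_ses-1_T1w/segstats.json files into one dict keyed by subject ID."""
    results = {}
    for seg in sorted(pathlib.Path(workdir).glob("sub-*_ses-1_T1w/segstats.json")):
        # directory name is e.g. "sub-RC4101_ses-1_T1w" -> subject "sub-RC4101"
        subjid = seg.parent.name.split("_")[0]
        results[subjid] = json.loads(seg.read_text())
    return results
```

The real tool does the extra work this sketch omits: matching each subject ID to an existing NIDM agent and writing the merged graph back out as turtle.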
251 | 252 | Let's make DataLad happy with the results we have just updated... 253 | ``` 254 | datalad save -m "added imaging and volume metadata" -r 255 | ``` 256 | 257 | # Querying the results 258 | Now, what can we do with our results? We can interrogate the my_nidm.ttl file for the contents it has. For example, what subjects are in this data file? 259 | ``` 260 | pynidm query -nl rawdata/my_nidm.ttl -u /subjects 261 | ``` 262 | 263 | What do we know about a particular subject? 264 | ``` 265 | pynidm query -nl rawdata/my_nidm.ttl -u /subjects/YOUR-SUBJECT-ID 266 | ``` 267 | (for example sub-RC4101). 268 | 269 | Extract specific fields, where fsl_000030 is the code for "Right-Thalamus-Proper (mm^3)": 270 | ``` 271 | pynidm query -nl rawdata/my_nidm.ttl --get_fields age,fsl_000030,sex 272 | ``` 273 | Codes for various structural volumes: 274 | 275 | | Structure | Code | 276 | |:---------:|:----:| 277 | | Left-Accumbens-area (mm^3) | fsl_000004 | 278 | | Left-Amygdala (mm^3) | fsl_000006 | 279 | | Left-Caudate (mm^3) | fsl_000008 | 280 | | Left-Hippocampus (mm^3) | fsl_000010 | 281 | | Left-Pallidum (mm^3) | fsl_000012 | 282 | | Left-Putamen (mm^3) | fsl_000014 | 283 | | Left-Thalamus-Proper (mm^3) | fsl_000016 | 284 | | Right-Accumbens-area (mm^3) | fsl_000018 | 285 | | Right-Amygdala (mm^3) | fsl_000020 | 286 | | Right-Caudate (mm^3) | fsl_000022 | 287 | | Right-Hippocampus (mm^3) | fsl_000024 | 288 | | Right-Pallidum (mm^3) | fsl_000026 | 289 | | Right-Putamen (mm^3) | fsl_000028 | 290 | | Right-Thalamus-Proper (mm^3) | fsl_000030 | 291 | | csf (mm^3) | fsl_000032 | 292 | | gray (mm^3) | fsl_000034 | 293 | | white (mm^3) | fsl_000036 | 294 | 295 | 296 | ## Linear Regression 297 | Using the my_nidm.ttl file we can also perform simple statistical analyses. Returning to our hypotheses, our nidm file knows about age, sex, diagnosis, and the regional brain volumes.
298 | For our anatomic regions, we do need to recall the codes that are assigned: brain volume (TODO), thalamus (fsl_000016, fsl_000030), putamen (fsl_000014, fsl_000028), for left and right, respectively. 299 | So, for any of these, we can use *pynidm* to run a linear regression for a specific model. For example, for the Right Thalamus, a comprehensive 300 | model that includes terms for diagnosis, age, sex, and an age-by-sex interaction, with a contrast on diagnosis, would look like: 301 | 302 | ``` 303 | pynidm linear-regression -nl rawdata/my_nidm.ttl -model "fsl_000030 = age*sex+sex+age+diagnosis" -contrast "diagnosis" 304 | ``` 305 | A simpler model that just uses diagnosis to predict Right Thalamus volume would be: 306 | ``` 307 | pynidm linear-regression -nl rawdata/my_nidm.ttl -model "fsl_000030 = diagnosis" -contrast "diagnosis" 308 | ``` 309 | 310 |
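For a binary predictor like diagnosis, the slope fitted by the simpler model is just the difference between the two group means of the outcome. A toy check of that intuition, using the two data rows printed in the example pynidm session (illustrative values, not your group's results):

```python
# Toy illustration: with a single binary predictor, the OLS slope
# equals the difference in group means of the outcome.
# (diagnosis, Right-Thalamus volume in mm^3) pairs:
data = [(0, 9069.0), (1, 8142.0)]

def mean(values):
    return sum(values) / len(values)

group0 = mean([v for d, v in data if d == 0])  # controls
group1 = mean([v for d, v in data if d == 1])  # PD patients
print("diagnosis effect (mm^3):", group1 - group0)  # -927.0
```

With only two observations and several model terms, the comprehensive model is over-parameterized, which is why the example output below is full of `nan`/`inf` standard errors.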
311 | Click to expand! 312 | 313 | ```javascript 314 | *********************************************************************************************************** 315 | Your command was: pynidm linear-regression -nl rawdata/my_nidm.ttl -model "fsl_000030 = age*sex+sex+age+diagnosis" -contrast "diagnosis" 316 | 317 | Your data set has less than 20 points, which means the model calculated may not be accurate due to a lack of data. 318 | This means you cannot regularize the data either. 319 | Continue anyways? Y or N: Y 320 | age fsl_000030 diagnosis sex 321 | 0 66.28288 9069.0 0 0 322 | 1 69.04774 8142.0 1 0 323 | 324 | *********************************************************************************************************** 325 | 326 | Model Results: 327 | fsl_000030 ~ diagnosis + age + sex + age*sex 328 | 329 | *********************************************************************************************************** 330 | 331 | 332 | 333 | Treatment (Dummy) Coding: Dummy coding compares each level of the categorical variable to a base reference level. The base reference level is the value of the intercept. 334 | With contrast (treatment coding) 335 | OLS Regression Results 336 | ============================================================================== 337 | Dep. Variable: fsl_000030 R-squared: 1.000 338 | Model: OLS Adj. R-squared: nan 339 | Method: Least Squares F-statistic: nan 340 | Date: Mon, 23 May 2022 Prob (F-statistic): nan 341 | Time: 08:04:01 Log-Likelihood: 49.724 342 | No. 
Observations: 2 AIC: -95.45 343 | Df Residuals: 0 BIC: -98.06 344 | Df Model: 1 345 | Covariance Type: nonrobust 346 | ================================================================================================ 347 | coef std err t P>|t| [0.025 0.975] 348 | ------------------------------------------------------------------------------------------------ 349 | Intercept 56.4011 inf 0 nan nan nan 350 | C(diagnosis, Treatment)[T.1] -1302.9428 inf -0 nan nan nan 351 | age 135.9717 inf 0 nan nan nan 352 | sex 0 nan nan nan nan nan 353 | age:sex 0 nan nan nan nan nan 354 | ============================================================================== 355 | Omnibus: nan Durbin-Watson: 1.000 356 | Prob(Omnibus): nan Jarque-Bera (JB): 0.333 357 | Skew: 0.000 Prob(JB): 0.846 358 | Kurtosis: 1.000 Cond. No. 138. 359 | ============================================================================== 360 | 361 | Notes: 362 | [1] Standard Errors assume that the covariance matrix of the errors is correctly specified. 363 | [2] The input rank is higher than the number of observations. 364 | 365 | 366 | Simple Coding: Like Treatment Coding, Simple Coding compares each level to a fixed reference level. However, with simple coding, the intercept is the grand mean of all the levels of the factors. 367 | OLS Regression Results 368 | ============================================================================== 369 | Dep. Variable: fsl_000030 R-squared: 1.000 370 | Model: OLS Adj. R-squared: nan 371 | Method: Least Squares F-statistic: nan 372 | Date: Mon, 23 May 2022 Prob (F-statistic): nan 373 | Time: 08:04:01 Log-Likelihood: 50.534 374 | No. 
Observations: 2 AIC: -97.07 375 | Df Residuals: 0 BIC: -99.68 376 | Df Model: 1 377 | Covariance Type: nonrobust 378 | ================================================================================================ 379 | coef std err t P>|t| [0.025 0.975] 380 | ------------------------------------------------------------------------------------------------ 381 | Intercept 54.0233 inf 0 nan nan nan 382 | C(diagnosis, Simple)[Simp.0] -1276.4203 inf -0 nan nan nan 383 | age 126.3790 inf 0 nan nan nan 384 | sex 0 nan nan nan nan nan 385 | age:sex 0 nan nan nan nan nan 386 | ============================================================================== 387 | Omnibus: nan Durbin-Watson: 1.000 388 | Prob(Omnibus): nan Jarque-Bera (JB): 0.333 389 | Skew: 0.000 Prob(JB): 0.846 390 | Kurtosis: 1.000 Cond. No. 135. 391 | ============================================================================== 392 | 393 | Notes: 394 | [1] Standard Errors assume that the covariance matrix of the errors is correctly specified. 395 | [2] The input rank is higher than the number of observations. 396 | 397 | 398 | Sum (Deviation) Coding: Sum coding compares the mean of the dependent variable for a given level to the overall mean of the dependent variable over all the levels. 399 | OLS Regression Results 400 | ============================================================================== 401 | Dep. Variable: fsl_000030 R-squared: 1.000 402 | Model: OLS Adj. R-squared: nan 403 | Method: Least Squares F-statistic: nan 404 | Date: Mon, 23 May 2022 Prob (F-statistic): nan 405 | Time: 08:04:01 Log-Likelihood: 49.133 406 | No. 
Observations: 2 AIC: -94.27 407 | Df Residuals: 0 BIC: -96.88 408 | Df Model: 1 409 | Covariance Type: nonrobust 410 | ========================================================================================== 411 | coef std err t P>|t| [0.025 0.975] 412 | ------------------------------------------------------------------------------------------ 413 | Intercept 14.9315 inf 0 nan nan nan 414 | C(diagnosis, Sum)[S.0] 639.0088 inf 0 nan nan nan 415 | age 126.9568 inf 0 nan nan nan 416 | sex 0 nan nan nan nan nan 417 | age:sex 0 nan nan nan nan nan 418 | ============================================================================== 419 | Omnibus: nan Durbin-Watson: 0.138 420 | Prob(Omnibus): nan Jarque-Bera (JB): 0.333 421 | Skew: 0.000 Prob(JB): 0.846 422 | Kurtosis: 1.000 Cond. No. 67.7 423 | ============================================================================== 424 | 425 | Notes: 426 | [1] Standard Errors assume that the covariance matrix of the errors is correctly specified. 427 | [2] The input rank is higher than the number of observations. 428 | 429 | 430 | Backward Difference Coding: In backward difference coding, the mean of the dependent variable for a level is compared with the mean of the dependent variable for the prior level. 431 | OLS Regression Results 432 | ============================================================================== 433 | Dep. Variable: fsl_000030 R-squared: 1.000 434 | Model: OLS Adj. R-squared: nan 435 | Method: Least Squares F-statistic: nan 436 | Date: Mon, 23 May 2022 Prob (F-statistic): nan 437 | Time: 08:04:01 Log-Likelihood: 50.534 438 | No. 
Observations: 2 AIC: -97.07 439 | Df Residuals: 0 BIC: -99.68 440 | Df Model: 1 441 | Covariance Type: nonrobust 442 | =========================================================================================== 443 | coef std err t P>|t| [0.025 0.975] 444 | ------------------------------------------------------------------------------------------- 445 | Intercept 54.0233 inf 0 nan nan nan 446 | C(diagnosis, Diff)[D.0] -1276.4203 inf -0 nan nan nan 447 | age 126.3790 inf 0 nan nan nan 448 | sex 0 nan nan nan nan nan 449 | age:sex 0 nan nan nan nan nan 450 | ============================================================================== 451 | Omnibus: nan Durbin-Watson: 1.000 452 | Prob(Omnibus): nan Jarque-Bera (JB): 0.333 453 | Skew: 0.000 Prob(JB): 0.846 454 | Kurtosis: 1.000 Cond. No. 135. 455 | ============================================================================== 456 | 457 | Notes: 458 | [1] Standard Errors assume that the covariance matrix of the errors is correctly specified. 459 | [2] The input rank is higher than the number of observations. 460 | 461 | 462 | Helmert Coding: Our version of Helmert coding is sometimes referred to as Reverse Helmert Coding. The mean of the dependent variable for a level is compared to the mean of the dependent variable over all previous levels. Hence, the name ‘reverse’ being sometimes applied to differentiate from forward Helmert coding. 463 | OLS Regression Results 464 | ============================================================================== 465 | Dep. Variable: fsl_000030 R-squared: 1.000 466 | Model: OLS Adj. R-squared: nan 467 | Method: Least Squares F-statistic: nan 468 | Date: Mon, 23 May 2022 Prob (F-statistic): nan 469 | Time: 08:04:01 Log-Likelihood: 49.133 470 | No. 
Observations: 2 AIC: -94.27 471 | Df Residuals: 0 BIC: -96.88 472 | Df Model: 1 473 | Covariance Type: nonrobust 474 | ============================================================================================== 475 | coef std err t P>|t| [0.025 0.975] 476 | ---------------------------------------------------------------------------------------------- 477 | Intercept 14.9315 inf 0 nan nan nan 478 | C(diagnosis, Helmert)[H.1] -639.0088 inf -0 nan nan nan 479 | age 126.9568 inf 0 nan nan nan 480 | sex 0 nan nan nan nan nan 481 | age:sex 0 nan nan nan nan nan 482 | ============================================================================== 483 | Omnibus: nan Durbin-Watson: 0.138 484 | Prob(Omnibus): nan Jarque-Bera (JB): 0.333 485 | Skew: 0.000 Prob(JB): 0.846 486 | Kurtosis: 1.000 Cond. No. 67.7 487 | ============================================================================== 488 | 489 | Notes: 490 | [1] Standard Errors assume that the covariance matrix of the errors is correctly specified. 491 | [2] The input rank is higher than the number of observations. 492 | 493 | 494 | ``` 495 |
496 | 497 | 498 | # Publishing the results to the ReproLake 499 | The "ReproLake" is a public repository of metadata (imaging and results). We have implemented a ReproLake using a StarDog graph database. You can 'publish' your results to this accessible database with the following command: 500 | 501 | ``` 502 | curl --location --request POST 'https://stardog.scicrunch.io:5821/Repronim_OHBM_2022?graph=urn:http://repronim.org/YOUR-INITIALS' --header 'Content-Type: text/turtle' --user repro-student:XXXX --data-binary '@/home/jovyan/my_analysis/rawdata/my_nidm.ttl' 503 | ``` 504 | where the XXXX password will be provided in class, and your own initials should be inserted where specified (this provides a unique identifier). 505 | 506 | In addition to entering our data into the ReproLake database, we can collect publicly accessible NIDM files for use with other query engines. Specifically, let's collect all the NIDM files created by this class. To do this, we will add our NIDM file (again with your initials, to keep the filename unique) to a folder in our educational course GitHub repo with the following commands: 507 | ``` 508 | cd ~/my_analysis 509 | cp rawdata/my_nidm.ttl ~/Exercise-OHBM2022/Results/YOUR-INITIALS_nidm.ttl 510 | cd ~/Exercise-OHBM2022 511 | git add Results/YOUR-INITIALS_nidm.ttl 512 | git commit -m "Added NIDM file" 513 | git remote add OHBM https://github.com/YOUR-GITHUB-LOGIN/OHBMEducation-2022.git 514 | git push OHBM 515 | ``` 516 | Now, on YOUR fork of the OHBMEducation-2022 repository, you can issue a 'pull request' to contribute these results back to the main educational course repository. 517 | 518 | ![picture](../pics/pull_request.png) 519 | 520 | 521 | 522 | # Publish the Complete Package 523 | There are numerous places you can now share this complete dataset. GitHub does not 'like' large binary datasets; all the imaging data we have 524 | been using so far actually resides, through the magic of 'git-annex', in OpenNeuro's AWS S3 data hosting. In our processing, we 525 | have generated new data that does not have an alternate location.
So, when we publish this complete data, we need to allow for the storage of 526 | this newly generated imaging data. For this example, we will use Git LFS, whereby a GitHub subscription allows up to 1GB of free storage and up 527 | to 1GB of bandwidth monthly. The following steps are summarized from the [DataLad Handbook here](http://handbook.datalad.org/en/latest/basics/101-139-gitlfs.html). 528 | 529 | In order to store annexed dataset contents on GitHub, we first need to create a so-called "sibling" of our DataLad dataset: 530 | ``` 531 | datalad create-sibling-github my_experiment 532 | ``` 533 | Remind git to remember you for this repository, to save lots of authenticating later on: 534 | ``` 535 | git config credential.helper 'cache --timeout=3600' 536 | ``` 537 | To avoid a possible shortcoming in the DataLad GitHub interaction, we do the following first: 538 | ``` 539 | datalad push --to=github 540 | ``` 541 | 542 | And then we initialize a special remote of type `git-lfs`, pointing to the same GitHub repository: 543 | ``` 544 | git annex initremote github-lfs type=git-lfs url=https://github.com/$Your_GitHub_Username/my_experiment encryption=none embedcreds=no autoenable=true 545 | ``` 546 | 547 | By running `datalad siblings` from the dataset directory, it will be evident that we now have two siblings of the original DataLad dataset, for example: 548 | ``` 549 | datalad siblings 550 | .: here(+) [git] 551 | .: github(-) [https://github.com/$Your_GitHub_Username/my_experiment.git (git)] 552 | .: github-lfs(+) [git] 553 | ``` 554 | 555 | In order to link the annexed contents in the LFS special remote to the GitHub sibling such that we can update both simultaneously, we need to configure a publication dependency using the `publish-depends` option. We'll set it such that the `github` sibling depends on the `github-lfs` sibling.
556 | 557 | ``` 558 | datalad siblings configure -s github --publish-depends github-lfs 559 | ``` 560 | 561 | Finally, with this single step it becomes possible to transfer the entire dataset, including annexed file content, to the same GitHub repository: 562 | ``` 563 | datalad push --to=github 564 | ``` 565 | 566 | Others can now fork or clone this repository, see your complete process, and re-run the analysis to exactly replicate your results, 567 | and, importantly, extend the exact same analysis to other data. 568 | 569 | # Write Your Paper 570 | In your GitHub repo, create a document that describes your re-executable publication. Make sure to indicate the raw data that you used, 571 | the analysis workflow and processing script you used, the complete results, and the statistical analysis. Indicate how someone else can 572 | rerun your analysis. --------------------------------------------------------------------------------