├── .github
│   ├── ISSUE_TEMPLATE
│   │   ├── bug_report.md
│   │   ├── extraction-issues.md
│   │   ├── feature_request.md
│   │   ├── postmortem_report.md
│   │   └── usage-question.md
│   └── workflows
│       ├── build_docs.yml
│       ├── deploy_docs.yml
│       └── main.yaml
├── .gitignore
├── LICENSE
├── README.md
├── docs_gh_pages
│   ├── 010_api_reference.rst
│   ├── 02_installation.md
│   ├── 06_examples.rst
│   ├── 09_contribution.md
│   ├── Makefile
│   ├── README.md
│   ├── _static
│   │   ├── 03b_tuto_matlab_cluster.png
│   │   ├── IBL_data.png
│   │   ├── css
│   │   │   └── style.css
│   │   └── one_demo.html
│   ├── _templates
│   │   ├── autosummary
│   │   │   └── module.rst
│   │   ├── custom-class-template.rst
│   │   ├── custom-module-template.rst
│   │   └── style.css.txt
│   ├── atlas_examples.rst
│   ├── conf.py
│   ├── docs_external
│   │   ├── alf_intro.md
│   │   └── ibl_viewer.md
│   ├── documentation_contribution_guidelines.md
│   ├── genindex.rst
│   ├── index.rst
│   ├── loading_examples.rst
│   ├── loading_examples
│   │   └── loading_spike_waveforms.ipynb
│   ├── make.bat
│   ├── make_script.py
│   ├── public_docs
│   │   ├── data_release_pilot.md
│   │   ├── information_contact.md
│   │   └── public_introduction.md
│   ├── requirements-docs.txt
│   ├── scripts
│   │   ├── execute_notebooks.py
│   │   ├── gh_push.sh
│   │   ├── myavi_to_png.py
│   │   └── one_setup.py
│   └── templates
│       ├── colab_template.ipynb
│       ├── docs_example_ipynb.ipynb
│       └── docs_example_py.py
└── requirements.txt

--------------------------------------------------------------------------------
/.github/ISSUE_TEMPLATE/bug_report.md:
--------------------------------------------------------------------------------
---
name: Bug report
about: Create a report to help us improve
title: "[Bug report] - Add a title to your issue here"
labels: bug
assignees: ''

---

**Describe the bug**
A clear and concise description of what the bug is.

**To Reproduce**
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error

**Expected behavior**
A clear and concise description of what you expected to happen.

**Screenshots**
If applicable, add screenshots to help explain your problem.

**Desktop (please complete the following information):**
- OS: [e.g. iOS]
- Browser [e.g. chrome, safari]
- Version [e.g. 22]

**Smartphone (please complete the following information):**
- Device: [e.g. iPhone6]
- OS: [e.g. iOS8.1]
- Browser [e.g. stock browser, safari]
- Version [e.g. 22]

**Additional context**
Add any other context about the problem here.

--------------------------------------------------------------------------------
/.github/ISSUE_TEMPLATE/extraction-issues.md:
--------------------------------------------------------------------------------
---
name: Extraction issues
about: Create a report on data extraction issues
title: "[EXTRACT]"
labels: ''
assignees: ''

---

**EID / PID**
Session EID / probe ID and path, e.g.
- 09156021-9a1d-4e1d-ae59-48cbde3c5d42 hausserlab/PL015/2022-02-22/001
(it can be a list if you have multiple failures)

**What task has failed, with what error message**
e.g. EphysSynchPulses failed for all the above sessions with error `gnagna` and I do not know what I can do about it, please advise.

**Background on the recording**
Is this session a dud? Is it worth trying to extract it? Has anything unusual happened during recording?
--------------------------------------------------------------------------------
/.github/ISSUE_TEMPLATE/feature_request.md:
--------------------------------------------------------------------------------
---
name: Feature request
about: Suggest an idea for this project
title: "[Feature request] - Add a title to your issue here"
labels: enhancement
assignees: ''

---

**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]

**Describe the solution you'd like**
A clear and concise description of what you want to happen.

**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.

**Additional context**
Add any other context or screenshots about the feature request here.

--------------------------------------------------------------------------------
/.github/ISSUE_TEMPLATE/postmortem_report.md:
--------------------------------------------------------------------------------
---
name: Postmortem report
about: Used to describe a downtime event that has since been resolved
title: "[Postmortem report]"
labels: postmortem
assignees: ''

---

**Status:** {draft|final}
**Owners:** {who worked on finding the resolution}

## Summary
Description: {brief description of symptoms and root cause}
Component: {affected area}
Date/time: {YYYY-MM-DD HH:MM}
Duration: {time from initial breakage to final resolution}
User impact: {who was affected by the incident}

## Timeline (all times in UTC+00:00)

### 2022-01-01

14:44 - something happened
14:45 - next thing happened **<START OF OUTAGE>**

### 2022-01-02

09:12 - another thing happened **<END OF OUTAGE>**

### Impact & root cause

{a more thorough summary of the problems that the outage caused and, **without blame**, a description of the root cause of the outage}

### What worked

{list things that worked as expected in a positive manner}

### Where we got lucky

{list things that mitigated this incident, but not because of our foresight}

### What didn't work

{things that failed or prevented a quicker resolution}

## Action items for the future

{each item here should have an owner}

### Prevention

{things that would have prevented this failure from happening in the first place, such as input validation, pinning dependencies, etc}

### Detection

{things that would have detected this failure before it became an incident, such as better testing, monitoring, etc}

### Mitigation

{things that would have made this failure less serious, such as graceful degradation, better exception handling, etc}

### Process

{things that would have helped us resolve this failure faster, such as documented processes and protocols, etc}

### Fixes

{the fixes that were necessary to resolve this incident}

## Other

{any other useful information, such as relevant logs}

--------------------------------------------------------------------------------
/.github/ISSUE_TEMPLATE/usage-question.md:
--------------------------------------------------------------------------------
---
name: Usage question
about: Ask 'how to' questions related to product usage
title: "[Usage question] - Add your issue title here"
labels: question
assignees: ''

---

**Describe the question you have**
A clear and concise description of the question you have. Ex. How can I do [...]?

**Is your question related to a specific product?**
Add links to relevant repositories or documents if so.

**Additional context**
Add any other context or screenshots about the usage question here.

--------------------------------------------------------------------------------
/.github/workflows/build_docs.yml:
--------------------------------------------------------------------------------
name: Build Docs
on:
  workflow_dispatch: # manual trigger to kick off workflow
    inputs:
      logLevel:
        description: "Log level"
        required: true
        default: "warning"

jobs:
  build_docs:
    runs-on: ubuntu-latest
    steps:
      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1

      - name: Checkout iblenv doc build branch
        uses: actions/checkout@v3
        with:
          ref: docs

      - name: Checkout ibllib doc build branch
        uses: actions/checkout@v3
        with:
          repository: int-brain-lab/ibllib
          ref: docs
          path: ibllib-repo

      - name: Checkout ONE-api
        uses: actions/checkout@v3
        with:
          repository: int-brain-lab/one
          path: ONE

      - name: Checkout iblatlas
        uses: actions/checkout@v3
        with:
          repository: int-brain-lab/iblatlas
          path: iblatlas

      - name: Move ibllib and ONE up a directory
        run: |
          mv ibllib-repo ..
          mv ONE ..
          mv iblatlas ..

      - name: Setup Python
        uses: actions/setup-python@v4
        with:
          python-version: 3.12

      - name: Install docs requirements
        run: |
          sudo apt-get install -y pandoc
          export TQDM_DISABLE=1
          pip install -r docs_gh_pages/requirements-docs.txt
          pip install -e ../ibllib-repo
          pip install git+https://github.com/jcouto/wfield.git
          pip install jupyter

      - name: ONE setup and build docs
        run: |
          cd docs_gh_pages
          python scripts/one_setup.py
          python make_script.py -e
          ls -l _build

      - name: Zip up documentation
        run: |
          sudo apt-get install -y zip
          zip -r build_zip docs_gh_pages/_build

      - name: Store zip file as artifacts
        uses: actions/upload-artifact@v4
        with:
          name: build_zip
          path: |
            build_zip.zip

      - name: Copy files to the test website with the AWS CLI
        run: |
          cd docs_gh_pages/_build/html
          aws s3 sync . s3://testdocs.internationalbrainlab.org
--------------------------------------------------------------------------------
/.github/workflows/deploy_docs.yml:
--------------------------------------------------------------------------------
name: Deploy Docs
on:
  workflow_dispatch: # manual trigger to kick off workflow
    inputs:
      logLevel:
        description: "Log level"
        required: true
        default: "warning"

jobs:
  deploy_docs:
    runs-on: ubuntu-latest
    steps:
      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1

      - name: Checkout iblenv doc build branch
        uses: actions/checkout@v3
        with:
          ref: master

      - name: Checkout ibllib doc build branch
        uses: actions/checkout@v3
        with:
          repository: int-brain-lab/ibllib
          ref: develop
          path: ibllib-repo

      - name: Checkout ONE-api
        uses: actions/checkout@v3
        with:
          repository: int-brain-lab/one
          path: ONE

      - name: Checkout iblatlas
        uses: actions/checkout@v3
        with:
          repository: int-brain-lab/iblatlas
          path: iblatlas

      - name: Move ibllib and ONE up a directory
        run: |
          mv ibllib-repo ..
          mv ONE ..
          mv iblatlas ..

      - name: Setup Python
        uses: actions/setup-python@v4
        with:
          python-version: 3.12

      - name: Install docs requirements
        run: |
          sudo apt-get install -y pandoc
          pip install -r docs_gh_pages/requirements-docs.txt
          pip install -e ../ibllib-repo
          pip install git+https://github.com/jcouto/wfield.git
          pip install jupyter

      - name: ONE setup and build docs
        run: |
          cd docs_gh_pages
          python scripts/one_setup.py
          python make_script.py -e
          ls -l _build

      - name: Clean up documentation
        run: |
          cd docs_gh_pages
          python make_script.py -pc

      - name: Commit documentation changes
        run: |
          git clone https://github.com/int-brain-lab/iblenv.git --branch gh-pages --single-branch gh-pages
          rm -rf gh-pages/*
          cp -r docs_gh_pages/_build/html/* gh-pages/
          cd gh-pages
          touch .nojekyll
          git config --local user.email "action@github.com"
          git config --local user.name "GitHub Action"
          git add .
          git commit -m "Update documentation" -a || true

      - name: Push changes
        uses: ad-m/github-push-action@master
        with:
          branch: gh-pages
          directory: gh-pages
          force: True
          github_token: ${{ secrets.GITHUB_TOKEN }}


      - name: Copy files to the production website with the AWS CLI
        run: |
          cd docs_gh_pages/_build/html
          aws s3 sync . s3://docs.internationalbrainlab.org
--------------------------------------------------------------------------------
/.github/workflows/main.yaml:
--------------------------------------------------------------------------------
name: CI
on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]

jobs:
  incubator:
    name: build (${{ matrix.python-version }}, ${{ matrix.os }})
    runs-on: ${{ matrix.os }}
    strategy:
      max-parallel: 3
      matrix:
        os: ["ubuntu-latest", "macos-latest", "windows-latest"]
        python-version: ["3.12"]
    steps:
      - name: Checkout branch
        uses: actions/checkout@v3

      - uses: conda-incubator/setup-miniconda@v3.0.3
        with:
          auto-update-conda: true
          python-version: ${{ matrix.python-version }}

      - name: Install all packages
        shell: bash -l {0}
        run: |
          conda activate test
          cd ..
          git clone https://github.com/int-brain-lab/iblapps.git
          pip install --editable iblapps
          cd iblenv
          pip install --requirement requirements.txt
          echo "----- pip list -----"
          pip list

--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# Jupyter Notebook
.ipynb_checkpoints

# IPython
profile_default/
ipython_config.py

# pyenv
.python-version

# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock

# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/

# Celery stuff
celerybeat-schedule
celerybeat.pid

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/

# miscellaneous
.DS_Store
/.idea/*

--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
MIT License

Copyright (c) 2020 International Brain Laboratory

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
# IBLENV installation guide
Unified environment and issue tracker for IBL github repositories.

## Update environment

In a terminal, navigate to your working directory, the one in which you previously cloned the `iblenv` and `iblapps` repositories
(typically something like `int-brain-lab`). Run the following commands in your terminal:

```bash
conda activate iblenv
cd iblapps
git pull
cd ..
cd iblenv
git pull
pip install -r requirements.txt --upgrade
```

If any errors are encountered, it is recommended to follow the "Removing an old installation" instructions and then the "Install
from scratch" instructions.

## Install from scratch
In order to create the unified environment for using IBL repositories, first download and install
[Anaconda](https://www.anaconda.com/distribution/#download-section) and [git](https://git-scm.com/downloads), and follow their
installer instructions to add each to the system path. Also, please ensure Anaconda is installed to your home directory. The
instructions below explain how to set up and activate the unified conda environment (`iblenv`) and properly install
multiple repositories within this environment.
In your git terminal, navigate to the directory in which you want to install the IBL repositories (e.g. create a folder named
something like `int-brain-lab` and work from within it). Then run the following commands:

```commandline
conda update -n base -c defaults conda
conda create --name iblenv python=3.10 --yes
conda activate iblenv
git clone https://github.com/int-brain-lab/iblapps
pip install --editable iblapps
git clone https://github.com/int-brain-lab/iblenv
cd iblenv
pip install --requirement requirements.txt
```

## Removing an old installation
The following command will completely remove an anaconda environment and all of its packages: `conda remove --name iblenv --all`

## Notes:
- Whenever you run IBL code in Python you should activate the `iblenv` environment, i.e. `conda activate iblenv`
- If you want to launch GUIs that rely on pyqt (e.g. the IBL data exploration gui or phy) from IPython, you should first run the
IPython magic command `%gui qt`.

[Additional documentation here for working with iblenv](https://int-brain-lab.github.io/iblenv/)

## Troubleshooting:

### Spyder
If using Anaconda's Spyder IDE, please take note. When installing Spyder in a virtual environment such as iblenv, conda
will add many packages to that virtual environment. In the case of iblenv, some packages installed by Spyder are in direct
conflict with the pip-installed packages. This creates an inconsistent and unstable environment, especially when attempting
to perform any sort of update on those packages. For more information about how to work with pip within conda, please read the
following [article](https://www.anaconda.com/blog/using-pip-in-a-conda-environment).

It is not recommended to use the Spyder IDE in conjunction with iblenv. Please seek alternatives, like
[PyCharm](https://www.jetbrains.com/pycharm/) or [Visual Studio Code](https://code.visualstudio.com/).

### brotli error
If attempting to set up this environment in an older version of anaconda, or a version of anaconda that has been upgraded from an older version, you may be presented with the following error when attempting to import the ONE api:
```
activate iblenv
python -c "from one.api import ONE"
Traceback (most recent call last):
  File "", line 1, in
  ...
  File "C:\Users\username\Anaconda3\envs\iblenv\lib\site-packages\urllib3\response.py", line 396, in HTTPResponse
    DECODER_ERROR_CLASSES += (brotli.error,)
AttributeError: module 'brotli' has no attribute 'error'
```

The source of this issue looks to be the way anaconda handled the brotlipy package. One potential solution is to run the following:
```
activate iblenv
conda install brotlipy
python -c "from one.api import ONE"
```

If this results in the same error, a full removal of anaconda (Windows uninstall followed by the manual removal of various files and directories hiding in several areas) and then a fresh install of Anaconda should correct the problem.

More details can be found in this github [issue](https://github.com/conda/conda/issues/9903).
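## Verifying the installation
As a quick sanity check after installing or updating, the imports below should succeed without a traceback. This is a minimal sketch, assuming the environment was created as described above; if `ibllib.__version__` is unavailable in your version, a bare `import ibllib` serves the same purpose.

```commandline
conda activate iblenv
python -c "import ibllib; print(ibllib.__version__)"
python -c "from one.api import ONE"
```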
--------------------------------------------------------------------------------
/docs_gh_pages/010_api_reference.rst:
--------------------------------------------------------------------------------
API Reference
######################

.. toctree::
   :maxdepth: 4
   :titlesonly:
   :glob:


.. autosummary::
   :toctree: _autosummary
   :template: custom-module-template.rst
   :recursive:

   ibllib

.. autosummary::
   :toctree: _autosummary
   :template: custom-module-template.rst
   :recursive:

   brainbox

.. autosummary::
   :toctree: _autosummary
   :template: custom-module-template.rst
   :recursive:

   iblatlas

.. autosummary::
   :toctree: _autosummary
   :template: custom-module-template.rst
   :recursive:

   iblutil

--------------------------------------------------------------------------------
/docs_gh_pages/02_installation.md:
--------------------------------------------------------------------------------
## Unified Environment
To facilitate the use of `ibllib` and `IBL-pipeline`, we have compiled all the dependencies into a unified Python
environment, `iblenv`. In addition to these two libraries, this environment is also compatible with other visualisation
tools and analysis pipelines being developed as part of the IBL.

To install this Python environment and get started using the IBL data pipeline, please follow
[these](https://github.com/int-brain-lab/iblenv) installation instructions.

--------------------------------------------------------------------------------
/docs_gh_pages/06_examples.rst:
--------------------------------------------------------------------------------
Examples
========

Below is a list of short examples using public IBL data

.. toctree::
   :maxdepth: 1

   notebooks_external/docs_get_training_status
   notebooks_external/docs_get_rms_data
   notebooks_external/docs_get_power_spectrum_data
   notebooks_external/docs_compute_drift
   notebooks_external/docs_load_spike_sorting
   notebooks_external/docs_raw_data_decompress
   notebooks_external/docs_scatter_raster_plot
   notebooks_external/docs_explore_passive
   notebooks_external/docs_get_first_pass_map_sessions
   notebooks_external/docs_find_nearby_trajectories
   notebooks_external/docs_find_dist_neighbouring_region
   notebooks_external/docs_visualize_session_coronal_tilted
   notebooks_external/docs_visualization3D_subject_channels
   notebooks_external/docs_access_DLC
   notebooks_external/docs_load_video

--------------------------------------------------------------------------------
/docs_gh_pages/09_contribution.md:
--------------------------------------------------------------------------------
# How to contribute

## Code
### Linting
We use the `flake8` Python module to enforce a consistent style in the CI.

### Testing
Unit testing. For Python we use the `pytest` module and alternate between `unittest` and `pytest` syntax.
For `Matlab` we use the embedded test framework.

### Continuous Integration
For production repositories such as ibllib and alyx, continuous integration is set up on Travis to install the application and run the tests on each pull request.
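As a sketch of the same checks run locally before opening a pull request (assuming `flake8` and `pytest` are installed in your environment; exact paths and options vary by repository):

```bash
# check code style against the project's flake8 configuration
flake8 .
# run the unit test suite
pytest
```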

## Contributions and releases
### Contributions and code review
The branches `develop` and `master` are protected.

Contributions are developed either:
- on a separate branch
- on a separate fork of the repository

and then merged into `develop` through a pull request.


#### Practical tips:
- to avoid merge conflicts, merge `develop` into your branch (or rebase your branch) before submitting the PR!
- make sure your branch passes tests:
    - before pushing, by running unit tests and flake8 locally
    - after pushing, by looking at the continuous integration results on GitHub
- go through the review process with maintainers on the pull request interface on GitHub. Remind them if this is not done in a timely manner; GitHub sends them a bazillion emails daily.


### Branching model: gitflow
The branching model is as close as possible to gitflow, i.e. one master branch, with every commit tagged with a version number, and one develop branch to integrate the different volatile feature branches.
**Both develop and master should pass CI tests**

### Releasing scheme: semantic versioning
For any repository with a release scheme (such as ibllib), we use semantic versioning with the major.minor.micro or major.minor.patch model: patch/micro for bugfixes; minor for augmented functionality; major for breaks in backward compatibility.
Versions 0.*.* are development versions with no guarantee of backward compatibility.

It is good practice to document the changes in a `RELEASE_NOTES.md` document at the root of the repository.
NB: those notes should be geared towards users, not other fellow contributors.

--------------------------------------------------------------------------------
/docs_gh_pages/Makefile:
--------------------------------------------------------------------------------
# Minimal makefile for Sphinx documentation
#

# You can set these variables from the command line.
SPHINXOPTS    =
SPHINXBUILD   = sphinx-build
SPHINXPROJ    = ibllib
SOURCEDIR     = .
BUILDDIR      = _build

# Put it first so that "make" without argument is like "make help".
help:
	@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

.PHONY: help Makefile

# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

--------------------------------------------------------------------------------
/docs_gh_pages/README.md:
--------------------------------------------------------------------------------
# Overview of documentation

The documentation is built locally and hosted on a github-pages website at this address:
https://int-brain-lab.github.io/iblenv/

The website is generated using
1. The markdown files in the `./docs-gh-pages` folder
2. The python or ipython notebooks in the `./docs-gh-pages/notebooks` folder
3. The python or ipython notebooks in the ibllib repo `./examples` and `./brainbox/examples` folders
4. The docstrings in the source code of the `./ibllib`, `./alf`, `./one` and `./brainbox` folders

# Contributing to documentation

### Adding examples or tutorials to the documentation
Examples or tutorials should be placed in the following folders (or in sub-folders within them):
`ibllib-repo/examples`
or
`ibllib-repo/brainbox/examples`

They can be in either `.py` or `.ipynb` form but must have a prefix of `docs` to be included in the documentation,
e.g. `docs_coronal_slice.py` or `docs_get_LFP_data.ipynb`. Each example/tutorial must start with a title and a brief
description of the content. Please refer to the templates in the [templates folder](./templates) for examples of
how to lay out the title and description in order for it to be correctly rendered and displayed on the website.

Once you have created the example/tutorial, you should link to the file in the appropriate `.rst` file: `index.rst`, `06_examples.rst`,
`atlas_examples.rst`, etc.
The link should be made by adding a line of the form `notebooks_external\name_of_example_without_extension`, e.g.

`notebooks_external\docs_coronal_slice`

`notebooks_external\docs_get_LFP_data`

An example implementation can be seen in the `06_examples.rst` file.

### Tips to create and edit example notebooks

#### Hide a cell in the documentation
In the cell metadata, add the key `nbsphinx` with the value `hidden` to hide the cell in the documentation.

```json
{
  "nbsphinx": "hidden",
  "trusted": false
}
```

#### Prevent execution of a cell in the built documentation
Let's say an example is using too large a dataset. A single cell can be disabled by adding the following key to the cell metadata.

```json
{
  "ibl_execute": false
}
```

#### Prevent execution of the whole notebook in the built documentation
If the full notebook is to be skipped, you can also set the `ibl_execute` flag to `false` in the notebook metadata.

#### Disable logging and tqdm output
To have clean output in the documentation, it is recommended to disable logging and tqdm output in the example by adding a hidden cell at the top of the notebook
(make sure the cell metadata contains the key `nbsphinx` with the value `hidden`, as specified above).

```python
# Turn off logging and disable tqdm; this is a hidden cell on the docs page
import logging
import os

logger = logging.getLogger('ibllib')
logger.setLevel(logging.CRITICAL)

os.environ["TQDM_DISABLE"] = "1"
```

## Making documentation using GitHub Actions
Two GitHub Actions workflows have been made available to automate the building and the deployment of the docs. These are located in the int-brain-lab/iblenv repository and can be accessed under the Actions tab.

### Developing docs

Steps:
- create documentation branches called `docs` on the `ibllib` and `iblenv` repositories. The workflow will only run if the branch exists in both repos (TODO if it doesn't exist, make the github action fall back to master)
- add your changes to the documentation
- run the [Build docs workflow](https://github.com/int-brain-lab/iblenv/actions/workflows/build_docs.yml).
  To run the workflow, click on the `run_workflow` button in the top left corner and choose the branch you want to launch it from (this should normally be docs).

After the docs build has completed successfully, your documentation will appear at this site: http://testdocs.internationalbrainlab.org.s3-website-us-east-1.amazonaws.com


### Deploying docs
**WARNING: Do not run this workflow unless you have run the build docs workflow above and checked that the documentation is correct**

Steps:
- merge the `docs` branch into `master` on the `iblenv` repository
- merge the `docs` branch into `develop` on the `ibllib` repository
- run the [Deploy docs workflow](https://github.com/int-brain-lab/iblenv/actions/workflows/deploy_docs.yml). To run the workflow, click on the `run_workflow` button in the top left corner and choose the branch you want to launch it from (this should be master).

The new docs will then be deployed to the main documentation website https://int-brain-lab.github.io/iblenv/


## Making documentation locally
### Install dependencies to build the website locally
Activate your iblenv environment first and install the dependencies on top using pip:
```shell
pip install -r ./docs_gh_pages/requirements-docs.txt
```

### Option 1: Only building changes to documentation
If you have only made changes to the documentation (any of the files with a `.md` or `.rst` extension), you can build the
documentation without running the examples. The examples previously updated on the website will remain. To only
build the documentation, the following command can be used:

```shell
cd ./docs_gh_pages
python make_script.py -d
```

### Option 2: Building changes to documentation and specific examples
If you want to add a new example or change a few of the existing examples, it is possible to build the documentation
while only executing a few specified examples. The documentation can be built using the following command, providing
the path to your `.ipynb` or `.py` example scripts:

```shell
cd ./docs_gh_pages
python make_script.py -e -s <path to example files>
```

An example would be
```
python make_script.py -e -s C:\Users\Mayo\iblenv\ibllib-repo\brainbox\examples\docs_get_training_status.py C:\Users\Mayo\iblenv\iblenv\docs_gh_pages\notebooks\one_basics\one_basics.ipynb
```

### Option 3: Building changes to documentation and all examples
If you want to rebuild the documentation and all examples, you can use the following command:

```shell
cd ./docs_gh_pages
python make_script.py -e
```

### Previewing the built documentation
Once `make_script.py` has completed, a preview of the documentation can be viewed by opening
`./docs-gh-pages/_build/html/index.html` in a web browser.

Check that all notebooks have run without errors and that your changes have been implemented correctly! (N.B. if you have
run `make_script.py` using option 1 or 2, some or all of the examples will not have executed; this is expected.)
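If your browser blocks resources when the build is opened directly from the file system, serving it over a local HTTP server is an alternative. This sketch uses only the Python standard library and assumes you are in the `docs_gh_pages` directory:

```shell
python -m http.server 8000 --directory _build/html
```

Then browse to http://localhost:8000 to preview the site.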

## Pushing changes to gh-pages
Once you are happy with the built documentation, the changes can be deployed to the website by running the following
command:

```shell
python make_script.py -gh -m "your commit message"
```

## Cleaning up your build
Once your changes have been pushed to github, run the following command to clean up your ibllib and iblenv
directories and unexecute the example notebooks:
```shell
python make_script.py -c
```

--------------------------------------------------------------------------------
/docs_gh_pages/_static/03b_tuto_matlab_cluster.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/int-brain-lab/iblenv/6bcbb996999c8ee0439bdeb1cf89e990bdbac127/docs_gh_pages/_static/03b_tuto_matlab_cluster.png

--------------------------------------------------------------------------------
/docs_gh_pages/_static/IBL_data.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/int-brain-lab/iblenv/6bcbb996999c8ee0439bdeb1cf89e990bdbac127/docs_gh_pages/_static/IBL_data.png

--------------------------------------------------------------------------------
/docs_gh_pages/_static/css/style.css:
--------------------------------------------------------------------------------
.highlight {
    background: #f5f5f5;
}

--------------------------------------------------------------------------------
/docs_gh_pages/_templates/autosummary/module.rst:
--------------------------------------------------------------------------------
{{ fullname }}
{{ underline }}

.. automodule:: {{ fullname }}
   :members:
   :undoc-members:
   :show-inheritance:

--------------------------------------------------------------------------------
/docs_gh_pages/_templates/custom-class-template.rst:
--------------------------------------------------------------------------------
{{ fullname | escape | underline}}

.. currentmodule:: {{ module }}

.. autoclass:: {{ objname }}
   :members:
   :show-inheritance:

   {% block methods %}
   {% if methods %}
   .. rubric:: {{ _('Methods') }}

   .. autosummary::
      :nosignatures:
   {% for item in methods %}
      {%- if not item.startswith('_') %}
      ~{{ name }}.{{ item }}
      {%- endif -%}
   {%- endfor %}
   {% endif %}
   {% endblock %}

   {% block attributes %}
   {% if attributes %}
   .. rubric:: {{ _('Attributes') }}

   .. autosummary::
   {% for item in attributes %}
      ~{{ name }}.{{ item }}
   {%- endfor %}
   {% endif %}
   {% endblock %}

--------------------------------------------------------------------------------
/docs_gh_pages/_templates/custom-module-template.rst:
--------------------------------------------------------------------------------
{{ fullname | escape | underline}}

.. automodule:: {{ fullname }}

   {% block attributes %}
   {% if attributes %}
   .. rubric:: Module attributes

   .. autosummary::
      :toctree:
   {% for item in attributes %}
      {{ item }}
   {%- endfor %}
   {% endif %}
   {% endblock %}

   {% block functions %}
   {% if functions %}
   .. rubric:: {{ _('Functions') }}

   .. autosummary::
      :nosignatures:
   {% for item in functions %}
      {{ item }}
   {%- endfor %}
   {% endif %}
   {% endblock %}

   {% block classes %}
   {% if classes %}
   .. rubric:: {{ _('Classes') }}

   .. autosummary::
      :nosignatures:
   {% for item in classes %}
      {{ item }}
   {%- endfor %}
   {% endif %}
   {% endblock %}

   {% block exceptions %}
   {% if exceptions %}
   .. rubric:: {{ _('Exceptions') }}

   .. autosummary::
   {% for item in exceptions %}
      {{ item }}
   {%- endfor %}
   {% endif %}
   {% endblock %}

{% block modules %}
{% if modules %}
.. autosummary::
   :toctree:
   :template: custom-module-template.rst
   :recursive:
{% for item in modules %}
   {{ item }}
{%- endfor %}
{% endif %}
{% endblock %}

--------------------------------------------------------------------------------
/docs_gh_pages/_templates/style.css.txt:
--------------------------------------------------------------------------------
.highlight {
    background: #ffffff !important;
}

--------------------------------------------------------------------------------
/docs_gh_pages/atlas_examples.rst:
--------------------------------------------------------------------------------
Atlas Examples
==============

We present a set of hands-on examples to illustrate how to manipulate and visualize hierarchical brain atlas ontologies.
The full package documentation can be found `here `_

Anatomical Atlases
******************

Below is a list of examples using the ibllib.atlas module

The Allen Mouse Brain Common Coordinate Framework: A 3D Reference Atlas. Cell. 181(4):936-953.e20. doi: 10.1016/j.cell.2020.04.007.
https://www.sciencedirect.com/science/article/pii/S0092867420304025

.. toctree::
   :maxdepth: 1

   notebooks_external/atlas_working_with_ibllib_atlas
   notebooks_external/atlas_mapping
   notebooks_external/atlas_plotting_scalar_on_slice
   notebooks_external/atlas_dorsal_cortex_flatmap
   notebooks_external/atlas_circular_pyramidal_flatmap
   notebooks_external/atlas_plotting_points_on_slice
   notebooks_external/atlas_swanson_flatmap


Gene expression
***************

.. toctree::
   :maxdepth: 1

   notebooks_external/atlas_genomics_load_agea

Ng, L., Bernard, A., Lau, C. et al. An anatomic gene expression atlas of the adult mouse brain. Nat Neurosci 12, 356–362 (2009). https://doi.org/10.1038/nn.2281
https://www.nature.com/articles/nn.2281

Lein, E.S. et al. (2007). Genome-wide atlas of gene expression in the adult mouse brain, Nature 445: 168-176. https://doi.org/10.1038/nature05453

--------------------------------------------------------------------------------
/docs_gh_pages/conf.py:
--------------------------------------------------------------------------------
# -*- coding: utf-8 -*-
#
# Configuration file for the Sphinx documentation builder.
#
# This file does only contain a selection of the most common options. For a
# full list see the documentation:
# http://www.sphinx-doc.org/en/master/config

# -- Path setup --------------------------------------------------------------

# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here.
# If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#

import sys
from pathlib import Path
import matplotlib
matplotlib.use('agg')

print(Path.cwd().parent.parent)
sys.path.insert(0, Path.cwd().parent.parent)

print('Python %s on %s' % (sys.version, sys.platform))
print(sys.path)

# -- Project information -----------------------------------------------------

project = 'IBL Library'
copyright = '2020, International Brain Laboratory'
author = 'International Brain Laboratory'

# The short X.Y version
version = ''
# The full version, including alpha/beta/rc tags
release = ''


# -- General configuration ---------------------------------------------------

# If your documentation needs a minimal Sphinx version, state it here.
#
# needs_sphinx = '1.0'

# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = ['sphinx.ext.autodoc',
              'sphinx.ext.autosummary',
              'sphinx.ext.mathjax',
              'sphinx.ext.githubpages',
              'sphinx.ext.intersphinx',
              'sphinx_copybutton',
              'nbsphinx',
              'nbsphinx_link',
              'myst_parser',
              'sphinx.ext.napoleon',
              'sphinx.ext.viewcode']
# 'sphinx_gallery.gen_gallery'

# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']

# Looks for objects in external projects
intersphinx_mapping = {
    'one_api': ('https://int-brain-lab.github.io/ONE/', None),
}


# sphinx_gallery_conf = {
#     'examples_dirs': '../../ibllib-repo/examples/one/ephys',  # path to your example scripts
#     'gallery_dirs': 'auto_examples',  # path to where to save gallery generated output
#     'filename_pattern': 'docs_',
# }


# autoapi_add_toctree_entry = False
# autoapi_dirs = ['../../ibllib-repo/ibllib', '../../ibllib-repo/alf', '../../ibllib-repo/oneibl']
# The master toctree document.
master_doc = 'index'

# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
#
# This is also used if you do content translation via gettext catalogs.
# Usually you set "language" from the command line for these cases.
language = None

# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# This pattern also affects html_static_path and html_extra_path.
exclude_patterns = ['_build', '_templates', 'documentation_contribution_guidelines.md',
                    '.ipynb_checkpoints', 'templates', 'README.md', 'gh-pages']

# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'


# -- Options for HTML output -------------------------------------------------

# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.

html_theme = 'sphinx_rtd_theme'


# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
# documentation.
#
# html_theme_options = {}

# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static']
html_css_files = ['css/style.css']

# Custom sidebar templates, must be a dictionary that maps document names
# to template names.
#
# The default sidebars (for documents that don't match any pattern) are
# defined by theme itself. Builtin themes are using these templates by
# default: ``['localtoc.html', 'relations.html', 'sourcelink.html',
# 'searchbox.html']``.
#
html_sidebars = {}


# -- Options for HTMLHelp output ---------------------------------------------

# Output file base name for HTML help builder.
htmlhelp_basename = 'ibllibdoc'


# -- Options for LaTeX output ------------------------------------------------

latex_elements = {
    # The paper size ('letterpaper' or 'a4paper').
    #
    # 'papersize': 'letterpaper',

    # The font size ('10pt', '11pt' or '12pt').
    #
    # 'pointsize': '10pt',

    # Additional stuff for the LaTeX preamble.
    #
    # 'preamble': '',

    # Latex figure (float) alignment
    #
    # 'figure_align': 'htbp',
}

# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title,
#  author, documentclass [howto, manual, or own class]).
latex_documents = [
    (master_doc, 'ibllib.tex', 'ibllib Documentation',
     'International Brain Laboratory', 'manual'),
]


# -- Options for manual page output ------------------------------------------

# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
    (master_doc, 'ibllib', 'ibllib Documentation',
     [author], 1)
]


# -- Options for Texinfo output ----------------------------------------------

# Grouping the document tree into Texinfo files.
# List of tuples
# (source start file, target name, title, author,
#  dir menu entry, description, category)
texinfo_documents = [
    (master_doc, 'ibllib', 'ibllib Documentation',
     author, 'ibllib', 'One line description of project.',
     'Miscellaneous'),
]


# -- Options for autosummary and autodoc -------------------------------------
autosummary_generate = True
# Don't add module names to function docs
add_module_names = False

autodoc_default_options = {
    'members': True,
    'member-order': 'bysource',
    'undoc-members': True,
    'show-inheritance': False
}


def param_line_break(app, what, name, obj, options, lines):
    # Find the first ':param' line of the docstring; if it is not preceded by a
    # blank line, insert one so the parameter list renders correctly.
    first_param = next((i for i, j in enumerate(lines) if ':param' in j), -1)
    if first_param != -1:
        if lines[first_param - 1] != '':
            lines.insert(first_param, '')
    return


def setup(app):
    # Connect the autodoc-process-docstring event to the callback
    app.connect('autodoc-process-docstring', param_line_break)

# def autodoc_skip_member_handler(app, what, name, obj, skip, options):
#     # Basic approach; you might want a regex instead
#     # TODO still makes the folder structure, need to figure out how not to do that; also makes
#     # all the private methods which we don't want
#     if 'test' in name.lower():
#         return True
#     else:
#         return False
#
# # Automatically called by sphinx at startup
# def setup(app):
#     # Connect the autodoc-skip-member event from apidoc to the callback
#     app.connect('autodoc-skip-member', autodoc_skip_member_handler)

# -- Options for nbsphinx ------------------------------------

# Only use nbsphinx for formatting the notebooks, i.e. never execute
nbsphinx_execute = 'never'
# Cancel compile on errors in notebooks
nbsphinx_allow_errors = False
# Add cell execution out number
nbsphinx_output_prompt = 'Out[%s]:'
# Configuration for images
nbsphinx_execute_arguments = [
    "--InlineBackend.figure_formats={'svg', 'pdf'}",
    "--InlineBackend.rc={'figure.dpi': 96}",
]
plot_formats = [('png', 512)]

# Add extra prolog to beginning of each .ipynb file
# Add option to download notebook and link to github page
# nbsphinx_prolog = r"""
#
# {% if env.metadata[env.docname]['nbsphinx-link-target'] %}
# {% set nb_path = env.metadata[env.docname]['nbsphinx-link-target'] | dirname %}
# {% set nb_name = env.metadata[env.docname]['nbsphinx-link-target'] | basename %}
# {% else %}
# {% set nb_name = env.doc2path(env.docname, base=None) | basename %}
# {% set nb_path = env.doc2path(env.docname, base=None) | dirname %}
# {% endif %}
#
# .. raw:: html
#
#
#
#
# """

--------------------------------------------------------------------------------
/docs_gh_pages/docs_external/alf_intro.md:
--------------------------------------------------------------------------------
```{include} ../../../ONE/one/alf/README.md
```

--------------------------------------------------------------------------------
/docs_gh_pages/docs_external/ibl_viewer.md:
--------------------------------------------------------------------------------
```{include} ../../../iblviewer-repo/README.md
:relative-docs: assests/
:relative-images:
```

--------------------------------------------------------------------------------
/docs_gh_pages/documentation_contribution_guidelines.md:
--------------------------------------------------------------------------------
# Overview of documentation

The documentation is built locally and hosted on a github-pages website at this address:
https://int-brain-lab.github.io/iblenv/

The website is generated using
1. The markdown files in the `./docs-gh-pages` folder
2. The python or ipython notebooks in the `./docs-gh-pages/notebooks` folder
3. The python or ipython notebooks in the ibllib repo `./examples` and `./brainbox/examples` folders
4. The docstrings in the source code of the `./ibllib`, `./alf`, `./one` and `./brainbox` folders


# Contributing to documentation

### Adding examples or tutorials to the documentation
Examples or tutorials should be placed in the following folders (or in sub-folders within them):
`ibllib-repo/examples`
or
`ibllib-repo/brainbox/examples`

They can be in either `.py` or `.ipynb` form but must have a prefix of `docs` to be included in the documentation,
e.g. `docs_coronal_slice.py` or `docs_get_LFP_data.ipynb`. Each example/tutorial must start with a title and a brief
description of the content. Please refer to the templates in the [templates folder](./templates) for examples of
how to lay out the title and description in order for it to be correctly rendered and displayed on the website.

Once you have created the example/tutorial, you should link to the file in either `05_tutorials.rst` or `06_examples.rst`.
The link should be made by adding a line of the form `notebooks_external\name_of_example_without_extension`, e.g.

`notebooks_external\docs_coronal_slice`

`notebooks_external\docs_get_LFP_data`

An example implementation can be seen in the `06_examples.rst` file.

## Making documentation
### Install dependencies to build the website locally
```shell
pip install -r ./docs_gh_pages/requirements-docs.txt
```

### Option 1: Only building changes to documentation
If you have only made changes to the documentation (any of the files with a `.md` or `.rst` extension), you can build the
documentation without running the examples. The examples previously updated on the website will remain. To only
build the documentation, the following command can be used:

```shell
cd ./docs_gh_pages
python make_script.py -d
```

### Option 2: Building changes to documentation and specific examples
If you want to add a new example or change a few of the existing examples, it is possible to build the documentation
while only executing a few specified examples.
The documentation can be built using the following command, providing
the path to your `.ipynb` or `.py` example scripts:

```shell
cd ./docs_gh_pages
python make_script.py -e -s <path to example files>
```

An example would be
```
python make_script.py -e -s C:\Users\Mayo\iblenv\ibllib-repo\brainbox\examples\docs_get_training_status.py C:\Users\Mayo\iblenv\iblenv\docs_gh_pages\notebooks\one_basics\one_basics.ipynb
```

### Option 3: Building changes to documentation and all examples
If you want to rebuild the documentation and all examples, you can use the following command:

```shell
cd ./docs_gh_pages
python make_script.py -e
```

### Previewing the built documentation
Once `make_script.py` has completed, a preview of the documentation can be viewed by opening
`./docs-gh-pages/_build/html/index.html` in a web browser.

Check that all notebooks have run without errors and that your changes have been implemented correctly! (N.B. if you have
run `make_script.py` using option 1 or 2, some or all of the examples will not have executed; this is expected.)


## Pushing changes to gh-pages
Once you are happy with the built documentation, the changes can be deployed to the website by running the following
command:

```shell
python make_script.py -gh -m "your commit message"
```

## Cleaning up your build
Once your changes have been pushed to github, run the following command to clean up your ibllib and iblenv
directories and unexecute the example notebooks:
```shell
python make_script.py -c
```

--------------------------------------------------------------------------------
/docs_gh_pages/genindex.rst:
--------------------------------------------------------------------------------
Detailed Index
==============

--------------------------------------------------------------------------------
/docs_gh_pages/index.rst:
--------------------------------------------------------------------------------
.. one_ibl documentation master file, created by
   sphinx-quickstart on Fri Jul 20 17:20:00 2018.

Welcome to IBL code library documentation!
##########################################

IBL data structure
*************************
.. image:: ./_static/IBL_data.png
   :alt: Alyx data structure

In the IBL, data acquired in laboratories spread across countries needs to be centralized into a
common store, accessible from anywhere in the world, at all times.
This challenge is met by the IBL data architecture, documented briefly below; a thorough description
can be found in our `preprint `_.

The central store has two components:

* A **Bulk Data Store** that stores **large raw data files** (e.g. raw electrophysiology and video data) as well as **pre-processed data** (e.g. results of spike sorting or video segmentation). This database is accessible through HTTP, FTP and Globus. It is known informally as the "Flatiron server", as our original data server was generously hosted by the `Flatiron Institute `_.
* A **Relational Database** that stores **metadata** (e.g. information on each experiment and experimental subject) in a structured manner, together with links to the bulk data files. This database is known as `Alyx `_, for reasons no-one can remember.
28 |
29 | The full IBL data will be publicly released when we have completed collection, preprocessing, curation, and quality control. In the meantime, a subset of curated data is publicly available.
30 |
31 | Software to analyze IBL data
32 | ****************************
33 | IBL has released a suite of tools to process and visualize our data.
34 |
35 | * `Brainbox `_: A library of analysis functions that can be used on IBL data or other neurophysiology recordings.
36 | * `IBL Viewer `_: A simple and fast interactive visualization tool based on VTK that uses GPU-accelerated volume and surface rendering. From electrophysiological data to neuronal connectivity, this tool allows simple and effective 3D visualization for many use cases, such as multi-slicing and time series (even on volumes), and can be embedded within Jupyter Lab/Notebook and Qt user interfaces.
37 |
38 | .. attention::
39 |    To get all the software, including ONE, brainbox and visualization tools, install the
40 |    `Unified Environment <./02_installation.html>`_. This is recommended for IBL members.
41 |
42 | .. toctree::
43 |    :hidden:
44 |    :caption: The Open Neurophysiology Environment
45 |    :maxdepth: 1
46 |
47 |    notebooks_external/one_quickstart
48 |    Full documentation Website for ONE
49 |
50 | .. toctree::
51 |    :hidden:
52 |    :caption: Public
53 |    :maxdepth: 1
54 |
55 |    public_docs/public_introduction
56 |    notebooks_external/data_release_behavior
57 |    public_docs/data_release_pilot
58 |    notebooks_external/data_release_repro_ephys
59 |    notebooks_external/data_release_brainwidemap
60 |    notebooks_external/data_release_spikesorting_benchmarks
61 |    public_docs/information_contact
62 |
63 | .. toctree::
64 |    :hidden:
65 |    :caption: Exploring IBL Data
66 |    :maxdepth: 1
67 |
68 |    notebooks_external/data_structure
69 |    notebooks_external/data_download
70 |    loading_examples
71 |
72 |
73 | .. toctree::
74 |    :hidden:
75 |    :caption: Miscellaneous
76 |    :maxdepth: 1
77 |
78 |    02_installation
79 |    09_contribution
80 |
81 | .. toctree::
82 |    :hidden:
83 |    :caption: Examples & Tutorials
84 |    :maxdepth: 1
85 |
86 |    atlas_examples
87 |    notebooks_external/docs_wheel_moves
88 |    notebooks_external/docs_wheel_screen_stimulus
89 |
90 | .. toctree::
91 |    :hidden:
92 |    :caption: API Reference
93 |    :maxdepth: 1
94 |
95 |    010_api_reference.rst
96 |    genindex
--------------------------------------------------------------------------------
/docs_gh_pages/loading_examples.rst:
--------------------------------------------------------------------------------
1 | Loading Data
2 | ============
3 |
4 | Below is a list of examples showing how to load different types of IBL data
5 |
6 | .. 
toctree:: 7 | :maxdepth: 1 8 | 9 | notebooks_external/loading_trials_data 10 | notebooks_external/loading_wheel_data 11 | notebooks_external/loading_spikesorting_data 12 | loading_examples/loading_spike_waveforms 13 | notebooks_external/loading_passive_data 14 | notebooks_external/loading_ephys_data 15 | notebooks_external/loading_raw_ephys_data 16 | notebooks_external/loading_video_data 17 | notebooks_external/loading_raw_video_data 18 | notebooks_external/loading_widefield_data 19 | notebooks_external/loading_multi_photon_imaging_data 20 | notebooks_external/loading_raw_mesoscope_data 21 | notebooks_external/loading_photometry_data -------------------------------------------------------------------------------- /docs_gh_pages/make.bat: -------------------------------------------------------------------------------- 1 | @ECHO OFF 2 | 3 | pushd %~dp0 4 | 5 | REM Command file for Sphinx documentation 6 | 7 | if "%SPHINXBUILD%" == "" ( 8 | set SPHINXBUILD=sphinx-build 9 | ) 10 | set SOURCEDIR=. 11 | set BUILDDIR=_build 12 | set SPHINXPROJ=one_ibl 13 | 14 | if "%1" == "" goto help 15 | 16 | %SPHINXBUILD% >NUL 2>NUL 17 | if errorlevel 9009 ( 18 | echo. 19 | echo.The 'sphinx-build' command was not found. Make sure you have Sphinx 20 | echo.installed, then set the SPHINXBUILD environment variable to point 21 | echo.to the full path of the 'sphinx-build' executable. Alternatively you 22 | echo.may add the Sphinx directory to PATH. 23 | echo. 24 | echo.If you don't have Sphinx installed, grab it from 25 | echo.http://sphinx-doc.org/ 26 | exit /b 1 27 | ) 28 | 29 | %SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% 30 | goto end 31 | 32 | :help 33 | %SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% 34 | 35 | :end 36 | popd 37 | -------------------------------------------------------------------------------- /docs_gh_pages/make_script.py: -------------------------------------------------------------------------------- 1 | import os 2 | import sys 3 | import shutil 4 | import argparse 5 | import subprocess 6 | from pathlib import Path 7 | 8 | from iblutil.util import setup_logger 9 | from scripts.execute_notebooks import process_notebooks 10 | 11 | os.environ["TQDM_DISABLE"] = "1" 12 | _logger = setup_logger(name='íbllib', level=20) 13 | 14 | root = Path.cwd() 15 | scripts_path = root.joinpath('scripts') 16 | 17 | nb_path = root.joinpath('notebooks') 18 | nb_path_external = [# Path(root.parent.parent).joinpath('ibllib-repo', 'examples'), 19 | Path(root.parent.parent).joinpath('ibllib-repo', 'examples', 'loading_data'), 20 | Path(root.parent.parent).joinpath('iblatlas', 'examples'), 21 | Path(root.parent.parent).joinpath('ibllib-repo', 'examples', 'data_release'), 22 | Path(root.parent.parent).joinpath('ibllib-repo', 'examples', 'exploring_data'), 23 | Path(root.parent.parent).joinpath('ibllib-repo', 'brainbox', 'examples'), 24 | Path(root.parent.parent).joinpath('ONE', 'docs', 'notebooks')] 25 | # external_file_patterns = ['docs', 'loading', 'atlas', 'docs', 'quickstart'] 26 | external_file_patterns = ['loading', 'atlas', 'data', 'data', 'docs_wheel', 'quickstart'] 27 | 28 | 29 | def make_documentation(execute, force, documentation, clean, specific, github, message, pre_clean): 30 | 31 | # Clean up any nblink files 32 | nb_external_files = root.joinpath('notebooks_external').glob('*') 33 | for file in nb_external_files: 34 | os.remove(file) 35 | 36 | assert len(external_file_patterns) == len(nb_path_external) 37 | status = 0 38 | # Case where we want to rebuild all examples 39 | if 
execute and not specific: 40 | # Execute notebooks in docs folder 41 | #### remove the running of datajoint docs 42 | # status += process_notebooks(nb_path, execute=True, force=force) 43 | # Execute notebooks in external folders 44 | for nb_path_ext, pattern in zip(nb_path_external, external_file_patterns): 45 | status += process_notebooks(nb_path_ext, execute=True, force=force, 46 | link=True, filename_pattern=pattern) 47 | _logger.info("Finished processing notebooks") 48 | 49 | if status != 0: 50 | # One or more examples returned an error 51 | sys.exit(1) 52 | else: 53 | # If no errors make the documentation 54 | _logger.info("Cleaning up previous documentation") 55 | os.system("make clean") 56 | _logger.info("Making documentation") 57 | os.system("make html") 58 | sys.exit(0) 59 | 60 | # Case where we only want to build specific examples 61 | if execute and specific: 62 | for nb in specific: 63 | if str(nb).startswith(str(root)): 64 | status += process_notebooks(nb, execute=True, force=force) 65 | else: 66 | status += process_notebooks(nb, execute=True, force=force, link=True) 67 | _logger.info("Finished processing notebooks") 68 | 69 | # Create the link files for the other notebooks in external paths that we haven't 70 | # executed. N.B this must be run after the above commands 71 | for nb_path_ext, pattern in zip(nb_path_external, external_file_patterns): 72 | process_notebooks(nb_path_ext, execute=False, link=True, filename_pattern=pattern) 73 | 74 | if status != 0: 75 | # One or more examples returned an error 76 | sys.exit(1) 77 | else: 78 | # If no errors make the documentation 79 | _logger.info("Cleaning up previous documentation") 80 | os.system("make clean") 81 | _logger.info("Making documentation") 82 | os.system("make html") 83 | sys.exit(0) 84 | 85 | if documentation: 86 | for nb_path_ext, pattern in zip(nb_path_external, external_file_patterns): 87 | process_notebooks(nb_path_ext, execute=False, link=True, filename_pattern=pattern) 88 | 89 | _logger.info("Cleaning up previous documentation") 90 | os.system("make clean") 91 | _logger.info("Making documentation") 92 | os.system("make html") 93 | sys.exit(0) 94 | 95 | if pre_clean: 96 | # clean up for github but don't commit. In the examples only notebooks with an execute flag=True are kept, 97 | # the rest are deleted. 98 | # Clean up the build path regardless 99 | build_nb_path = root.joinpath('_build', 'html', 'notebooks') 100 | build_nb_external_path = root.joinpath('_build', 'html', 'notebooks_external') 101 | process_notebooks(build_nb_path, execute=False, cleanup=True, remove_gh=True) 102 | process_notebooks(build_nb_external_path, execute=False, cleanup=True, remove_gh=True) 103 | 104 | # remove the _sources folder as we don't need this 105 | build_nb_source_path = root.joinpath('_build', 'html', '_sources') 106 | if build_nb_source_path.exists(): 107 | shutil.rmtree(build_nb_source_path) 108 | 109 | 110 | if github: 111 | # clean up for github. In the examples only notebooks with an execute flag=True are kept, 112 | # the rest are deleted. 
113 | # Clean up the build path regardless 114 | build_nb_path = root.joinpath('_build', 'html', 'notebooks') 115 | build_nb_external_path = root.joinpath('_build', 'html', 'notebooks_external') 116 | process_notebooks(build_nb_path, execute=False, cleanup=True, remove_gh=True) 117 | process_notebooks(build_nb_external_path, execute=False, cleanup=True, remove_gh=True) 118 | 119 | # remove the _sources folder as we don't need this 120 | build_nb_source_path = root.joinpath('_build', 'html', '_sources') 121 | if build_nb_source_path.exists(): 122 | shutil.rmtree(build_nb_source_path) 123 | 124 | # Need to figure out how to do this 125 | if not message: 126 | message = "commit latest documentation" 127 | 128 | exec = Path('scripts').joinpath('gh_push.sh') 129 | command = f'{exec} "{message}"' 130 | print(command) 131 | subprocess.call(command, shell=True) # noqa: E605 132 | 133 | # Clean up notebooks in directory if also specified 134 | if clean: 135 | _logger.info("Cleaning up notebooks") 136 | process_notebooks(nb_path, execute=False, cleanup=True) 137 | for nb_path_ext, pattern in zip(nb_path_external, external_file_patterns): 138 | process_notebooks(nb_path_ext, execute=False, cleanup=True, 139 | filename_pattern=pattern) 140 | 141 | try: 142 | build_path = root.joinpath('_build') 143 | if build_path.exists(): 144 | shutil.rmtree(build_path) 145 | except Exception as err: 146 | print(err) 147 | _logger.error('Could not remove _build directory in iblenv/docs_gh_pages, please ' 148 | 'delete manually') 149 | try: 150 | autosummary_path = root.joinpath('_autosummary') 151 | if autosummary_path.exists(): 152 | shutil.rmtree(autosummary_path) 153 | except Exception as err: 154 | print(err) 155 | _logger.error('Could not remove _autosummary directory in iblenv/docs_gh_pages, please' 156 | ' delete manually') 157 | 158 | 159 | if __name__ == "__main__": 160 | 161 | parser = argparse.ArgumentParser(description='Make IBL documentation') 162 | 163 | parser.add_argument('-e', '--execute', default=False, action='store_true', 164 | help='Execute notebooks') 165 | parser.add_argument('-f', '--force', default=False, action='store_true', 166 | help='Force notebook execution even if already run') 167 | parser.add_argument('-d', '--documentation', default=False, action='store_true', 168 | help='Make documentation') 169 | parser.add_argument('-s', '--specific', nargs='+', required=False, 170 | help='List of specific files to execute') 171 | parser.add_argument('-c', '--cleanup', default=False, action='store_true', 172 | help='Cleanup notebooks once documentation made') 173 | parser.add_argument('-gh', '--github', default=False, action='store_true', 174 | help='Push documentation to gh-pages') 175 | parser.add_argument('-pc', '--preclean', default=False, action='store_true', 176 | help='Clean up documentation for gh-pages') 177 | parser.add_argument('-m', '--message', default=None, required=False, type=str, 178 | help='Commit message') 179 | args = parser.parse_args() 180 | make_documentation(execute=args.execute, force=args.force, documentation=args.documentation, 181 | clean=args.cleanup, specific=args.specific, github=args.github, 182 | message=args.message, pre_clean=args.preclean) 183 | -------------------------------------------------------------------------------- /docs_gh_pages/public_docs/data_release_pilot.md: -------------------------------------------------------------------------------- 1 | # Data Release - Pilot Dataset 2 | The IBL has released the datasets associated with 4 Neuropixels pilot 
sessions.
3 |
4 | You can view the datasets in a web browser by clicking on these links:
5 |
6 | - [churchlandlab/Subjects/CSHL049/2020-01-08/001](https://ibl.flatironinstitute.org/public/churchlandlab/Subjects/CSHL049/2020-01-08/001/)
7 | - [cortexlab/Subjects/KS023/2019-12-10/001](https://ibl.flatironinstitute.org/public/cortexlab/Subjects/KS023/2019-12-10/001/)
8 | - [hoferlab/Subjects/SWC_043/2020-09-21/001](https://ibl.flatironinstitute.org/public/hoferlab/Subjects/SWC_043/2020-09-21/001/)
9 | - [zadorlab/Subjects/CSH_ZAD_029/2020-09-19/001](https://ibl.flatironinstitute.org/public/zadorlab/Subjects/CSH_ZAD_029/2020-09-19/001/)
10 |
11 | ## Data structure and download
12 | The organisation of the data follows the standard IBL data structure.
13 |
14 | Please see
15 |
16 | - [These instructions](https://int-brain-lab.github.io/iblenv/notebooks_external/data_structure.html) to download an example dataset for one session and get familiarised with the data structure
17 | - [These instructions](https://int-brain-lab.github.io/iblenv/notebooks_external/data_download.html) to learn how to use the ONE-api to search for and download the released datasets
18 | - [These instructions](https://int-brain-lab.github.io/iblenv/loading_examples.html) to get familiarised with specific data loading functions
19 |
20 | Note:
21 | - The tag associated with this release is `2021_Q2_PreRelease`
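
As a concrete starting point, the datasets of one of these sessions can be listed programmatically once ONE is configured for the public server (a sketch; `path2eid` assumes the session above is present in the remote cache):

```python
from one.api import ONE

one = ONE(base_url='https://openalyx.internationalbrainlab.org',
          password='international', silent=True)
# resolve the session path of one released session to its experiment id (eid)
eid = one.path2eid('cortexlab/Subjects/KS023/2019-12-10/001')
# list all datasets available for that session
datasets = one.list_datasets(eid)
```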
--------------------------------------------------------------------------------
/docs_gh_pages/public_docs/information_contact.md:
--------------------------------------------------------------------------------
1 | # Information and troubleshooting
2 | - Issues with the data? Post an issue here: with the tag `ibl`
3 | - Alternatively post an issue here:
4 | - General questions about the datasets or publications? Email: [info@internationalbrainlab.org](mailto:info@internationalbrainlab.org)
--------------------------------------------------------------------------------
/docs_gh_pages/public_docs/public_introduction.md:
--------------------------------------------------------------------------------
1 | # Publicly available IBL data
2 |
3 | ## Introduction to the IBL experiments
4 | The aim of the International Brain Laboratory (IBL) is to understand the brain functions
5 | underlying decision making. Understanding these processes is a problem with a scale and complexity
6 | that far exceed what can be tackled by any single laboratory, and that demands computational theory
7 | be interwoven with experimental design and analysis in a manner not yet achieved. To overcome these
8 | challenges, we have created a virtual laboratory, unifying 22 highly experienced neuroscience
9 | groups distributed across the world.
10 |
11 | Datasets are acquired in a dozen laboratories performing experimental work (e.g. Churchland lab, Mainen lab, Zador lab).
12 | In each of these laboratories, mice are first trained in the IBL decision-making task
13 | [following a standardized training pipeline](https://elifesciences.org/articles/63711). Briefly, the mice are fixed
14 | in front of a screen and have to turn a Lego wheel in order to position a visual stimulus, which appears on one
15 | side of the screen, at its center. The complexity of the task increases as the contrast of the visual stimulus is lowered,
16 | and the probability of the stimulus appearing on a given side varies.
17 |
18 | Various sensors are used to monitor the animal's performance and condition (e.g. a rotary encoder attached to the Lego wheel,
19 | camera(s) to view the mouse posture). Behavior data is acquired throughout the learning phase on "Training rigs" (see our
20 | [article on behavioral training](https://elifesciences.org/articles/63711) for details).
21 | Once a mouse has reached proficiency in the IBL task, it is moved to an "Ephys rig" where its brain activity is recorded.
22 |
23 | We aim to use different brain recording modalities so as to have complementary views of brain activity. For example,
24 | Neuropixels, Mesoscope, Fiberfluorophotometry and Widefield Imaging techniques all have their unique advantages over one another.
25 | Our most advanced project to date is the one involving Neuropixels recordings.
26 |
27 | ### Neuropixels datasets
28 | The data consist of neurophysiological and behavior measurements acquired in mice using Neuropixels probes.
29 | In a single recording session, up to two Neuropixels probes (typically labelled `probe00` or `probe01`)
30 | are inserted in the mouse's brain. The location of these probes in the brain varies from mouse to mouse, as the
31 | aim of this IBL project is to tile the whole mouse brain using these probes.
32 |
33 | The data are acquired on three computers (dedicated to acquiring the raw ephys, raw video and raw behavioral data,
34 | respectively), and saved into their corresponding folders (see the sections below for details).
35 | At the end of a Neuropixels recording session, some stimuli are replayed whilst the mouse is passive
36 | (i.e. not engaged in the IBL task). The behavioral data acquired during this replay of stimuli are also saved in a dedicated folder.
37 |
38 | Once acquired, the data are centralised onto a single computer and processed (using heavy algorithms, such as
39 | DeepLabCut for tracking points on video data, or pyKilosort for detecting and sorting the spikes of cells on the ephys traces),
40 | before being sent to and stored on our centralised database (see our [article on data architecture](https://www.biorxiv.org/content/10.1101/827873v3) for details).
41 |
42 |
43 | ## What is available for download
44 |
45 | ### Behavioral data associated with our 2020 publication
46 | The IBL has released all of the behavior sessions associated with the publication
47 | [Standardized and reproducible measurement of decision-making in mice](https://elifesciences.org/articles/63711)
48 | via ONE and Datajoint.
49 | * Please follow this [link](../notebooks_external/data_release_behavior) for instructions on how to access this data
50 |
51 | ### Pilot Neuropixels datasets
52 | The IBL has released a handful of pilot datasets that are available for download through ONE.
53 | * Please follow this [link](data_release_pilot) to explore these datasets
54 |
55 | ### Reproducible ephys data associated with our 2022 preprint
56 | The IBL has released all data associated with the preprint [Reproducibility of in vivo electrophysiological measurements in mice](https://www.biorxiv.org/content/10.1101/2022.05.09.491042v3).
57 | * Please follow this [link](../notebooks_external/data_release_repro_ephys) to access this data
58 |
59 |
60 | ### Brain wide map data release - Q4 2022
61 | The IBL has released all data described in our [technical paper](https://doi.org/10.6084/m9.figshare.21400815).
62 | * Please follow this [link](../notebooks_external/data_release_brainwidemap) to access this data
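
Across these releases the loading pattern is the same once ONE points at the public server. For example, spike sorting for a single insertion can be loaded as follows (a sketch; the probe insertion id `pid` is illustrative):

```python
from one.api import ONE
from brainbox.io.one import SpikeSortingLoader

one = ONE(base_url='https://openalyx.internationalbrainlab.org',
          password='international', silent=True)
pid = 'da8dfec1-d265-44e8-84ce-6ae9c109b8bd'  # illustrative probe insertion id
ssl = SpikeSortingLoader(pid=pid, one=one)
spikes, clusters, channels = ssl.load_spike_sorting()
```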
63 |
64 | ### Coming soon: benchmark dataset for spike sorting
65 | A list of insertions for which we will provide benchmarks is available [here](../notebooks_external/data_release_spikesorting_benchmarks).
--------------------------------------------------------------------------------
/docs_gh_pages/requirements-docs.txt:
--------------------------------------------------------------------------------
1 | ipyevents
2 | jupyter
3 | nbsphinx
4 | nbsphinx-link
5 | myst_parser
6 | pandoc
7 | sphinx >=3.1.2
8 | sphinx-copybutton
9 | sphinx-gallery
10 | sphinx_rtd_theme
--------------------------------------------------------------------------------
/docs_gh_pages/scripts/execute_notebooks.py:
--------------------------------------------------------------------------------
1 | import logging
2 | import os
3 | import json
4 | import time
5 | import shutil
6 | from pathlib import Path
7 | import re
8 |
9 |
10 |
11 | from nbconvert.preprocessors import (ExecutePreprocessor, CellExecutionError,
12 |                                      ClearOutputPreprocessor)
13 | import nbformat
14 | import sphinx_gallery.notebook as sph_nb
15 | import sphinx_gallery.gen_gallery as gg
16 |
17 | _logger = logging.getLogger('íbllib')
18 | IPYTHON_VERSION = 4
19 | TIMEOUT_CELLS = 1200
20 |
21 |
22 | class NotebookConverter(object):
23 |
24 |     def __init__(self, nb_path, output_path=None, overwrite=True, kernel_name=None):
25 |         """
26 |         Parameters
27 |         ----------
28 |         nb_path : str
29 |             Path to the ipython notebook
30 |         output_path: str, default=None
31 |             Path to where the executed notebook, rst file and colab notebook will be saved. Default is
32 |             to save in the same directory as the notebook
33 |         overwrite: bool, default=True
34 |             Whether to save the executed notebook under the same filename as the unexecuted notebook, or to create a new
35 |             file with the naming convention 'executed_...'. Default is to write to the same file
36 |         kernel_name: str
37 |             Kernel to use to run notebooks. If not specified, defaults to 'python3'
38 |         """
39 |         self.nb_path = Path(nb_path).absolute()
40 |         self.nb_link_path = Path(__file__).parent.parent.joinpath('notebooks_external')
41 |         os.makedirs(self.nb_link_path, exist_ok=True)
42 |         self.nb = self.nb_path.parts[-1]
43 |         self.nb_dir = self.nb_path.parent
44 |         self.nb_name = self.nb_path.stem
45 |         self.overwrite = overwrite
46 |
47 |         # If no output path is specified, save everything into the directory containing the notebook
48 |         if output_path is not None:
49 |             self.output_path = Path(output_path).absolute()
50 |             os.makedirs(self.output_path, exist_ok=True)
51 |         else:
52 |             self.output_path = self.nb_dir
53 |
54 |         # If overwrite is True, write the executed notebook to the same name as the notebook
55 |         if self.overwrite:
56 |             self.executed_nb_path = self.output_path.joinpath(self.nb)
57 |             self.temp_nb_path = self.output_path.joinpath(f'executed_{self.nb}')
58 |         else:
59 |             self.executed_nb_path = self.output_path.joinpath(f'executed_{self.nb}')
60 |
61 |         if kernel_name is not None:
62 |             self.execute_kwargs = dict(timeout=TIMEOUT_CELLS, kernel_name=kernel_name, allow_errors=False)
63 |         else:
64 |             self.execute_kwargs = dict(timeout=TIMEOUT_CELLS, kernel_name='python3', allow_errors=False)
65 |
66 |     @staticmethod
67 |     def py_to_ipynb(py_path):
68 |         """
69 |         Convert a python script to an ipython notebook and return the path
70 |         to the converted notebook, creating it if it does not already exist
71 |
72 |         """
73 |         nb_path = sph_nb.replace_py_ipynb(py_path)
74 |         if not Path(nb_path).exists():
75 |             file_conf, blocks = sph_nb.split_code_and_text_blocks(py_path)
76 |             gallery_config = gg.DEFAULT_GALLERY_CONF
77 |             gallery_config['first_notebook_cell'] = None
78 |             example_nb = sph_nb.jupyter_notebook(blocks, gallery_config, nb_path)
79 |             sph_nb.save_notebook(example_nb, nb_path)
80 |         return nb_path
81 |
82 |     def link(self):
83 |         """
84 |         Create an nbsphinx link file for notebooks external to the docs directory
85 |         """
86 |         link_path = os.path.relpath(self.nb_path, self.nb_link_path)
87 |         link_dict = {"path": link_path}
88 |         link_save_path = self.nb_link_path.joinpath(str(self.nb_name) + '.nblink')
89 |
90 |         with open(link_save_path, 'w') as f:
91 |             json.dump(link_dict, f)
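    # The generated .nblink file is a small JSON pointer consumed by the
    # nbsphinx-link extension at build time, e.g. (illustrative path):
    #     {"path": "../../../ibllib-repo/examples/loading_data/docs_demo.ipynb"}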
92 |
93 |     def execute(self, force=False):
94 |         """
95 |         Executes the specified notebook file, and writes the executed notebook to a
96 |         new file.
97 |         Parameters
98 |         ----------
99 |         force : bool, optional
100 |             To force a rerun of the notebook even if it has already been executed
101 |         Returns
102 |         -------
103 |         executed_nb_path : str
104 |             The path to the executed notebook
105 |         status: int
106 |             Whether the notebook executed without errors: 0 = ran without error, 1 = error
107 |         """
108 |
109 |         with open(self.nb_path, encoding='utf-8') as f:
110 |             nb = nbformat.read(f, as_version=IPYTHON_VERSION)
111 |
112 |         skip_execution = not nb['metadata'].get('ibl_execute', True)
113 |         is_executed = nb['metadata'].get('docs_executed')
114 |
115 |         if skip_execution:
116 |             _logger.info(f"Notebook {self.nb} in {self.nb_dir} has the 'ibl_execute' flag set to False, "
117 |                          f"skipping")
118 |             status = 0
119 |         elif is_executed == 'executed' and not force:
120 |             _logger.info(f"Notebook {self.nb} in {self.nb_dir} already executed, skipping; "
121 |                          f"to force execution, pass the -f argument")
122 |             status = 0
123 |         else:
124 |
125 |             # Execute the notebook
126 |             _logger.info(f"Executing notebook {self.nb} in {self.nb_dir}")
127 |             t0 = time.time()
128 |
129 |             clear_executor = ClearOutputPreprocessor()
130 |             executor = ExecuteNotebooks(**self.execute_kwargs)
131 |
132 |             # First clean up the notebook and remove any cells that have been run
133 |             clear_executor.preprocess(nb, {})
134 |
135 |             try:
136 |                 executor.preprocess(nb, {'metadata': {'path': self.nb_dir}})
137 |                 execute_dict = {'docs_executed': 'executed'}
138 |                 nb['metadata'].update(execute_dict)
139 |                 status = 0
140 |             except CellExecutionError as err:
141 |                 execute_dict = {'docs_executed': 'errored'}
142 |                 nb['metadata'].update(execute_dict)
143 |                 _logger.error(f"Error executing notebook {self.nb}")
144 |                 _logger.error(err)
145 |                 status = 1
146 |
147 |             _logger.info(f"Finished running notebook ({time.time() - t0})")
148 |
149 |         _logger.info(f"Writing executed notebook to {self.executed_nb_path}")
150 |         # Makes sure the original notebook isn't left blank in case of an error during writing
151 |         if self.overwrite:
152 |             with open(self.temp_nb_path, 'w', encoding='utf-8') as f:
153 |                 nbformat.write(nb, f)
154 |             shutil.copyfile(self.temp_nb_path, self.executed_nb_path)
155 |             os.remove(self.temp_nb_path)
156 |         else:
157 |             with open(self.executed_nb_path, 'w', encoding='utf-8') as f:
158 |                 nbformat.write(nb, f)
159 |
160 |         return self.executed_nb_path, status
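    # Metadata flags used by execute()/unexecute() (a sketch of a notebook's
    # top-level "metadata" after a successful docs run):
    #     {"ibl_execute": true, "docs_executed": "executed", ...}
    # "ibl_execute": false marks notebooks that were run manually (e.g. against
    # large datasets) and that the docs build must neither execute nor clear;
    # "docs_executed" records the outcome of the last automated run.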
161 |
162 |     def unexecute(self, remove_gh=False):
163 |         """
164 |         Unexecutes the notebook, i.e. removes all output cells. If remove_gh=True, checks whether the
165 |         notebook metadata contains an 'executed' tag. If it doesn't, the notebook either
166 |         errored or was not run (the case when only specific notebooks are chosen to build the examples), and
167 |         it is removed so that the version already on the website can be used.
168 |
169 |         If the notebook has the flag `ibl_execute` set to False, we do not interfere with any of the outputs
170 |         """
171 |         _logger.info(f"Cleaning up notebook {self.nb} in {self.nb_dir}")
172 |         if not self.executed_nb_path.exists():
173 |             _logger.warning(f"{self.executed_nb_path} not found, nothing to clean")
174 |             return
175 |
176 |         with open(self.executed_nb_path, encoding='utf-8') as f:
177 |             nb = nbformat.read(f, as_version=IPYTHON_VERSION)
178 |
179 |         # if the flag for automatic execution is set to false, it means the notebook has
180 |         # been run manually as it may rely on large datasets, and in this case we do not interfere with outputs
181 |         skip_execution = not nb['metadata'].get('ibl_execute', True)
182 |         if skip_execution:
183 |             return
184 |
185 |         if not remove_gh:
186 |             if nb['metadata'].get('docs_executed', None):
187 |                 nb['metadata'].pop('docs_executed')
188 |
189 |             clear_executor = ClearOutputPreprocessor()
190 |             clear_executor.remove_metadata_fields.add('execution')
191 |             clear_executor.preprocess(nb, {})
192 |
193 |             with open(self.executed_nb_path, 'w', encoding='utf-8') as f:
194 |                 nbformat.write(nb, f)
195 |
196 |         elif remove_gh:
197 |             executed_flag = nb['metadata'].get('docs_executed', None)
198 |             if executed_flag != 'executed':
199 |                 _logger.warning(f"Notebook {self.nb} was not executed or errored; the "
200 |                                 f"version already on the website will be used")
201 |                 os.remove(self.executed_nb_path)
202 |                 os.remove(self.output_path.joinpath(self.nb_name + '.html'))
203 |             else:
204 |                 _logger.info(f"Notebook {self.nb} executed, "
205 |                              f"new version will be uploaded to the website")
206 |                 clear_executor = ClearOutputPreprocessor()
207 |                 clear_executor.preprocess(nb, {})
208 |
209 |                 with open(self.executed_nb_path, 'w', encoding='utf-8') as f:
210 |                     nbformat.write(nb, f)
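# Example usage (a sketch; the path is illustrative): convert/execute a single
# external notebook and create its .nblink pointer for the docs build.
#     nbc = NotebookConverter('/path/to/ibllib-repo/examples/docs_demo.ipynb')
#     nbc.link()
#     executed_path, status = nbc.execute(force=True)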
211 |
212 |
213 | def process_notebooks(nbfile_or_path, execute=True, force=False, link=False, cleanup=False,
214 |                       filename_pattern='', remove_gh=False, **kwargs):
215 |     """
216 |     Execute and optionally convert the specified notebook file or directory of
217 |     notebook files.
218 |     Wrapper for the `NotebookConverter` class that does all the file handling.
219 |     Parameters
220 |     ----------
221 |     nbfile_or_path : str
222 |         Either a single notebook filename or a path containing notebook files.
223 |     execute : bool
224 |         Whether or not to execute the notebooks
225 |     link : bool, default = False
226 |         Whether to create an nbsphinx link file
227 |     cleanup : bool, default = False
228 |         Whether to unexecute the notebook and clean up files. To clean up, this must be set to True
229 |         and the execute argument to False
230 |     filename_pattern: str, default = ''
231 |         Filename pattern to look for in .py or .ipynb files to include in the docs
232 |     remove_gh: bool, default = False
233 |         Whether to remove the notebook from the built examples (in the case where we want to use the old version)
234 |     **kwargs
235 |         Other keyword arguments that are passed to the `NotebookConverter`
236 |     """
237 |
238 |     overall_status = 0
239 |     if os.path.isdir(nbfile_or_path):
240 |         # It's a path, so we need to walk through recursively and find any
241 |         # notebook files
242 |         for root, dirs, files in os.walk(nbfile_or_path):
243 |             for name in files:
244 |
245 |                 _, ext = os.path.splitext(name)
246 |                 full_path = os.path.join(root, name)
247 |
248 |                 # skip checkpoints
249 |                 if 'ipynb_checkpoints' in full_path:
250 |                     if cleanup:
251 |                         os.remove(full_path)
252 |                         continue
253 |                     else:
254 |                         continue
255 |
256 |                 # if the file has an 'ipynb' extension, create the NotebookConverter object
257 |                 if ext == '.ipynb':
258 |                     if re.search(filename_pattern, name):
259 |                         nbc = NotebookConverter(full_path, **kwargs)
260 |                         # Want to create the link file
261 |                         if link:
262 |                             nbc.link()
263 |                         # Execute the notebook
264 |                         if execute:
265 |                             _logger.info(f"Executing notebook {full_path}")
266 |                             _, status = nbc.execute(force=force)
267 |                             overall_status += status
268 |                         # If cleanup is true and execute is false, unexecute the notebook
269 |                         if cleanup:
270 |                             _logger.info(f"Cleaning up notebook {full_path}")
271 |                             nbc.unexecute(remove_gh=remove_gh)
272 |
273 |                 # if the file has a 'py' extension, convert it to '.ipynb' and then execute
274 |                 elif ext == '.py':
275 |                     if re.search(filename_pattern, name):
276 |                         # See if the ipynb version already exists
277 |                         ipy_path = sph_nb.replace_py_ipynb(full_path)
278 |                         if Path(ipy_path).exists():
279 |                             # If it does and we want to execute, skip as it would have been
280 |                             # executed above already
281 |                             if execute:
282 |                                 continue
283 |                             # If cleanup then we want to delete this file
284 |                             if cleanup:
285 |                                 os.remove(ipy_path)
286 |                         else:
287 |                             # If it doesn't exist, we need to make it
288 |                             full_path = NotebookConverter.py_to_ipynb(full_path)
289 |                             nbc = NotebookConverter(full_path, **kwargs)
290 |                             if link:
291 |                                 nbc.link()
292 |                             # Execute the notebook
293 |                             if execute:
294 |                                 _, status = nbc.execute(force=force)
295 |                                 overall_status += status
296 |                             # If cleanup then we want to delete this file
297 |                             if cleanup:
298 |                                 os.remove(full_path)
299 |
300 |     else:
301 |         full_path = Path(nbfile_or_path)
302 |         ext = full_path.suffix
303 |
304 |         if ext == '.py':
305 |             ipy_path = sph_nb.replace_py_ipynb(full_path)
306 |             if not Path(ipy_path).exists():
307 |                 full_path = NotebookConverter.py_to_ipynb(full_path)
308 |             else:
309 |                 full_path = ipy_path
310 |
311 |         nbc = NotebookConverter(full_path, **kwargs)
312 |         # Want to create the link file
313 |         if link:
314 |             nbc.link()
315 |         # Execute the notebook
316 |         if execute:
317 |             _, status = nbc.execute(force=force)
318 |             overall_status += status
319 |         # If cleanup is true and execute is false, unexecute the notebook
320 |         if cleanup:
321 |             nbc.unexecute()
322 |             if ext == '.py':
323 |                 os.remove(full_path)
324 |
325 |     return overall_status
326 |
327 |
328 | class ExecuteNotebooks(ExecutePreprocessor):
329 |
330 |     def __init__(self, **kw):
331 |         super().__init__(**kw)
332 |
333 |     def preprocess_cell(self, cell, resources, index):
334 |         """
335 |         Override if you want to apply some preprocessing to each cell.
336 |         Must return the modified cell and resource dictionary.
337 | 338 | Parameters 339 | ---------- 340 | cell : NotebookNode cell 341 | Notebook cell being processed 342 | resources : dictionary 343 | Additional resources used in the conversion process. Allows 344 | preprocessors to pass variables into the Jinja engine. 345 | index : int 346 | Index of the cell being processed 347 | """ 348 | self._check_assign_resources(resources) 349 | if self.get_execute_meta(cell['metadata']): 350 | cell = self.execute_cell(cell, index, store_history=True) 351 | return cell, self.resources 352 | 353 | def get_execute_meta(self, metadata): 354 | return metadata.get('ibl_execute', True) 355 | 356 | -------------------------------------------------------------------------------- /docs_gh_pages/scripts/gh_push.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | # Script to push html doc files to gh pages 3 | # Clone the gh-pages branch to local documentation directory 4 | git clone -b gh-pages https://github.com/int-brain-lab/iblenv.git gh-pages 5 | cd gh-pages 6 | 7 | # Copy everything from output of build into gh-pages branch 8 | cp -R ../_build/html/* ./ 9 | 10 | # Add and commit all changes 11 | git add -A . 12 | git commit -m "$1"; 13 | 14 | # Push the changes 15 | git push -q origin gh-pages 16 | 17 | # Leave gh-pages repo and delete 18 | cd ../ 19 | rm -rf gh-pages 20 | -------------------------------------------------------------------------------- /docs_gh_pages/scripts/myavi_to_png.py: -------------------------------------------------------------------------------- 1 | import nbformat 2 | from pathlib import Path 3 | 4 | file_path = Path('C:/Users/Mayo/iblenv/ibllib-repo/examples/one/histology/docs_find_nearby_trajectories.ipynb') 5 | 6 | -------------------------------------------------------------------------------- /docs_gh_pages/scripts/one_setup.py: -------------------------------------------------------------------------------- 1 | from one.api import ONE 2 | pw = 'international' 3 | one = ONE(base_url='https://openalyx.internationalbrainlab.org', password=pw, silent=True) -------------------------------------------------------------------------------- /docs_gh_pages/templates/colab_template.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "nbformat": 4, 3 | "nbformat_minor": 0, 4 | "metadata": { 5 | "colab": { 6 | "name": "Untitled0.ipynb", 7 | "provenance": [], 8 | "collapsed_sections": [] 9 | }, 10 | "kernelspec": { 11 | "name": "python3", 12 | "display_name": "Python 3" 13 | } 14 | }, 15 | "cells": [ 16 | { 17 | "cell_type": "code", 18 | "metadata": { 19 | "id": "HzRwehxFukh_", 20 | "colab_type": "code", 21 | "colab": {} 22 | }, 23 | "source": [ 24 | "!pip install ibllib\n", 25 | "from google.colab import drive\n", 26 | "drive.mount('/content/drive')\n", 27 | "!cp drive/My\\ Drive/params/.one_params ../root" 28 | ], 29 | "execution_count": null, 30 | "outputs": [] 31 | } 32 | ] 33 | } -------------------------------------------------------------------------------- /docs_gh_pages/templates/docs_example_ipynb.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Title of your example\n", 8 | "Brief description of example should go here. 
This example shows how to structure the introduction for\n", 9 | "the title and description so that it is rendered correctly during building of documentation" 10 | ] 11 | }, 12 | { 13 | "cell_type": "code", 14 | "execution_count": null, 15 | "metadata": {}, 16 | "outputs": [], 17 | "source": [ 18 | "# Author: Mayo\n", 19 | "# Code should follow\n", 20 | "import numpy as np\n", 21 | "import pandas as pd\n", 22 | "\n", 23 | "# etc etc" 24 | ] 25 | } 26 | ], 27 | "metadata": { 28 | "kernelspec": { 29 | "display_name": "Python [conda env:ibl_docs] *", 30 | "language": "python", 31 | "name": "conda-env-ibl_docs-py" 32 | }, 33 | "language_info": { 34 | "codemirror_mode": { 35 | "name": "ipython", 36 | "version": 3 37 | }, 38 | "file_extension": ".py", 39 | "mimetype": "text/x-python", 40 | "name": "python", 41 | "nbconvert_exporter": "python", 42 | "pygments_lexer": "ipython3", 43 | "version": "3.7.7" 44 | } 45 | }, 46 | "nbformat": 4, 47 | "nbformat_minor": 4 48 | } 49 | -------------------------------------------------------------------------------- /docs_gh_pages/templates/docs_example_py.py: -------------------------------------------------------------------------------- 1 | """ 2 | Title of your example 3 | ===================== 4 | Brief description of example should go here. This example shows how to structure the docstring for 5 | the title and description so that it is rendered correctly during building of documentation 6 | """ 7 | 8 | # Author: Mayo 9 | # Code should follow 10 | import numpy as np 11 | import pandas as pd 12 | 13 | # etc etc 14 | -------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- 1 | apptools >=4.5.0 2 | boto3 3 | click 4 | colorcet 5 | colorlog 6 | cython 7 | dataclasses 8 | datajoint 9 | flake8 10 | globus-sdk 11 | graphviz 12 | h5py 13 | ibl-neuropixel 14 | ibllib 15 | iblutil 16 | jupyter 17 | jupyterlab 18 | matplotlib 19 | mtscomp 20 | nbformat 21 | numba 22 | numpy 23 | opencv-python # macOS 10.13 and prior are incompatible with modern versions of opencv 24 | ONE-api 25 | pandas 26 | phylib 27 | pillow 28 | plotly 29 | pyarrow 30 | pyflakes >= 2.4.0 31 | pynrrd 32 | pyopengl 33 | PyQt5 34 | pyqtgraph 35 | pytest 36 | requests 37 | scikits-bootstrap 38 | scikit-learn 39 | scipy >=1.4.1 40 | seaborn 41 | SimpleITK 42 | soundfile 43 | sphinx_gallery 44 | statsmodels 45 | tqdm 46 | --------------------------------------------------------------------------------