├── 2017
│   └── logo_mathworks.png
├── 2019
│   └── results
│       └── hackathon.tsv
├── 2020
│   ├── papers
│   │   ├── 7.pdf
│   │   ├── 107.pdf
│   │   ├── 112.pdf
│   │   ├── 116.pdf
│   │   ├── 124.pdf
│   │   ├── 127.pdf
│   │   ├── 128.pdf
│   │   ├── 130.pdf
│   │   ├── 133.pdf
│   │   ├── 134.pdf
│   │   ├── 135.pdf
│   │   ├── 138.pdf
│   │   ├── 139.pdf
│   │   ├── 144.pdf
│   │   ├── 148.pdf
│   │   ├── 161.pdf
│   │   ├── 162.pdf
│   │   ├── 171.pdf
│   │   ├── 185.pdf
│   │   ├── 189.pdf
│   │   ├── 196.pdf
│   │   ├── 198.pdf
│   │   ├── 202.pdf
│   │   ├── 217.pdf
│   │   ├── 225.pdf
│   │   ├── 227.pdf
│   │   ├── 229.pdf
│   │   ├── 236.pdf
│   │   ├── 253.pdf
│   │   ├── 277.pdf
│   │   ├── 281.pdf
│   │   ├── 282.pdf
│   │   ├── 285.pdf
│   │   ├── 297.pdf
│   │   ├── 305.pdf
│   │   ├── 307.pdf
│   │   ├── 32.pdf
│   │   ├── 328.pdf
│   │   ├── 339.pdf
│   │   ├── 340.pdf
│   │   ├── 349.pdf
│   │   ├── 35.pdf
│   │   ├── 353.pdf
│   │   ├── 356.pdf
│   │   ├── 363.pdf
│   │   ├── 374.pdf
│   │   ├── 39.pdf
│   │   ├── 406.pdf
│   │   ├── 417.pdf
│   │   ├── 424.pdf
│   │   ├── 435.pdf
│   │   ├── 44.pdf
│   │   ├── 445.pdf
│   │   ├── 456.pdf
│   │   ├── 462.pdf
│   │   ├── 61.pdf
│   │   ├── 63.pdf
│   │   ├── 71.pdf
│   │   ├── 72.pdf
│   │   ├── 74.pdf
│   │   ├── 85.pdf
│   │   ├── 95.pdf
│   │   ├── 2020ChallengePaper.pdf
│   │   └── index.md
│   ├── logo_mathworks.png
│   ├── matfile_format.pdf
│   ├── logo_google_cloud.png
│   ├── logo_moore_foundation.png
│   ├── Dx_map.csv
│   ├── faq.md
│   ├── submissions.md
│   ├── eqn_jaccard_index.svg
│   ├── eqn_sum_g_measure.svg
│   └── eqn_sum_f_measure.svg
├── 2021
│   ├── logo_mathworks.png
│   ├── matfile_format.pdf
│   ├── logo_google_cloud.png
│   ├── papers
│   │   ├── cinc_draft.pdf
│   │   ├── cinc_template.pdf
│   │   ├── cinc_template.zip
│   │   ├── cinc_template.docx
│   │   ├── 2020ChallengePaper.pdf
│   │   └── index.md
│   ├── logo_moore_foundation.png
│   ├── leaderboard
│   │   └── index.md
│   ├── faq
│   │   └── index.md
│   └── submissions
│       └── index.md
├── .gitignore
├── update_steps.txt
├── assets
│   ├── css
│   │   └── style.scss
│   ├── img
│   │   ├── cinc.jpg
│   │   ├── ecg-ml.gif
│   │   ├── physionet.gif
│   │   └── physionet-cinc.gif
│   ├── fonts
│   │   ├── Noto-Sans-700
│   │   │   ├── Noto-Sans-700.eot
│   │   │   ├── Noto-Sans-700.ttf
│   │   │   ├── Noto-Sans-700.woff
│   │   │   └── Noto-Sans-700.woff2
│   │   ├── Noto-Sans-italic
│   │   │   ├── Noto-Sans-italic.eot
│   │   │   ├── Noto-Sans-italic.ttf
│   │   │   ├── Noto-Sans-italic.woff
│   │   │   └── Noto-Sans-italic.woff2
│   │   ├── Noto-Sans-regular
│   │   │   ├── Noto-Sans-regular.eot
│   │   │   ├── Noto-Sans-regular.ttf
│   │   │   ├── Noto-Sans-regular.woff
│   │   │   └── Noto-Sans-regular.woff2
│   │   └── Noto-Sans-700italic
│   │       ├── Noto-Sans-700italic.eot
│   │       ├── Noto-Sans-700italic.ttf
│   │       ├── Noto-Sans-700italic.woff
│   │       └── Noto-Sans-700italic.woff2
│   └── js
│       └── scale.fix.js
├── thumbnail.png
├── Gemfile
├── script
│   ├── bootstrap
│   ├── cibuild
│   ├── validate-html
│   └── release
├── .rubocop.yml
├── .travis.yml
├── .github
│   ├── CODEOWNERS
│   ├── no-response.yml
│   ├── settings.yml
│   ├── stale.yml
│   └── config.yml
├── _layouts
│   ├── post.html
│   ├── default.html
│   ├── 2017.html
│   ├── 2020.html
│   └── 2021.html
├── docs
│   ├── SUPPORT.md
│   ├── CODE_OF_CONDUCT.md
│   └── CONTRIBUTING.md
├── jekyll-theme-minimal.gemspec
├── LICENSE
├── _config.yml
├── _sass
│   ├── fonts.scss
│   ├── rouge-github.scss
│   └── jekyll-theme-minimal.scss
├── README.md
├── about
│   └── index.md
├── index.md
└── faq
    └── index.md

--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
1 | _site
2 | .sass-cache
3 | Gemfile.lock
4 | *.gem
--------------------------------------------------------------------------------
/update_steps.txt:
--------------------------------------------------------------------------------
1 | - Update CNAME file to moody-challenge.physionet.org
--------------------------------------------------------------------------------
/assets/css/style.scss:
--------------------------------------------------------------------------------
1 | ---
2 | ---
3 | 
4 | @import "jekyll-theme-minimal";
--------------------------------------------------------------------------------
/thumbnail.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/thumbnail.png
--------------------------------------------------------------------------------
/2020/papers/7.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/7.pdf
--------------------------------------------------------------------------------
/Gemfile:
--------------------------------------------------------------------------------
1 | # frozen_string_literal: true
2 | 
3 | source 'https://rubygems.org'
4 | 
5 | gemspec
--------------------------------------------------------------------------------
/script/bootstrap:
--------------------------------------------------------------------------------
1 | #!/bin/sh
2 | 
3 | set -e
4 | 
5 | gem install bundler
6 | bundle install
--------------------------------------------------------------------------------
/2020/papers/107.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/107.pdf
--------------------------------------------------------------------------------
/2020/papers/112.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/112.pdf
--------------------------------------------------------------------------------
/2020/papers/116.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/116.pdf
--------------------------------------------------------------------------------
/2020/papers/124.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/124.pdf
--------------------------------------------------------------------------------
/2020/papers/127.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/127.pdf
--------------------------------------------------------------------------------
/2020/papers/128.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/128.pdf
--------------------------------------------------------------------------------
/2020/papers/130.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/130.pdf
--------------------------------------------------------------------------------
/2020/papers/133.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/133.pdf
--------------------------------------------------------------------------------
/2020/papers/134.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/134.pdf
--------------------------------------------------------------------------------
/2020/papers/135.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/135.pdf
--------------------------------------------------------------------------------
/2020/papers/138.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/138.pdf
--------------------------------------------------------------------------------
/2020/papers/139.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/139.pdf
--------------------------------------------------------------------------------
/2020/papers/144.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/144.pdf
--------------------------------------------------------------------------------
/2020/papers/148.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/148.pdf
--------------------------------------------------------------------------------
/2020/papers/161.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/161.pdf
--------------------------------------------------------------------------------
/2020/papers/162.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/162.pdf
--------------------------------------------------------------------------------
/2020/papers/171.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/171.pdf
--------------------------------------------------------------------------------
/2020/papers/185.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/185.pdf
--------------------------------------------------------------------------------
/2020/papers/189.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/189.pdf
--------------------------------------------------------------------------------
/2020/papers/196.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/196.pdf
--------------------------------------------------------------------------------
/2020/papers/198.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/198.pdf
--------------------------------------------------------------------------------
/2020/papers/202.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/202.pdf
--------------------------------------------------------------------------------
/2020/papers/217.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/217.pdf
--------------------------------------------------------------------------------
/2020/papers/225.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/225.pdf
--------------------------------------------------------------------------------
/2020/papers/227.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/227.pdf
--------------------------------------------------------------------------------
/2020/papers/229.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/229.pdf
--------------------------------------------------------------------------------
/2020/papers/236.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/236.pdf
--------------------------------------------------------------------------------
/2020/papers/253.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/253.pdf
--------------------------------------------------------------------------------
/2020/papers/277.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/277.pdf
--------------------------------------------------------------------------------
/2020/papers/281.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/281.pdf
--------------------------------------------------------------------------------
/2020/papers/282.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/282.pdf
--------------------------------------------------------------------------------
/2020/papers/285.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/285.pdf
--------------------------------------------------------------------------------
/2020/papers/297.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/297.pdf
--------------------------------------------------------------------------------
/2020/papers/305.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/305.pdf
--------------------------------------------------------------------------------
/2020/papers/307.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/307.pdf
--------------------------------------------------------------------------------
/2020/papers/32.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/32.pdf
--------------------------------------------------------------------------------
/2020/papers/328.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/328.pdf
--------------------------------------------------------------------------------
/2020/papers/339.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/339.pdf
--------------------------------------------------------------------------------
/2020/papers/340.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/340.pdf
--------------------------------------------------------------------------------
/2020/papers/349.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/349.pdf
--------------------------------------------------------------------------------
/2020/papers/35.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/35.pdf
--------------------------------------------------------------------------------
/2020/papers/353.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/353.pdf
--------------------------------------------------------------------------------
/2020/papers/356.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/356.pdf
--------------------------------------------------------------------------------
/2020/papers/363.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/363.pdf
--------------------------------------------------------------------------------
/2020/papers/374.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/374.pdf
--------------------------------------------------------------------------------
/2020/papers/39.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/39.pdf
--------------------------------------------------------------------------------
/2020/papers/406.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/406.pdf
--------------------------------------------------------------------------------
/2020/papers/417.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/417.pdf
--------------------------------------------------------------------------------
/2020/papers/424.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/424.pdf
--------------------------------------------------------------------------------
/2020/papers/435.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/435.pdf
--------------------------------------------------------------------------------
/2020/papers/44.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/44.pdf
--------------------------------------------------------------------------------
/2020/papers/445.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/445.pdf
--------------------------------------------------------------------------------
/2020/papers/456.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/456.pdf
--------------------------------------------------------------------------------
/2020/papers/462.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/462.pdf
--------------------------------------------------------------------------------
/2020/papers/61.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/61.pdf
--------------------------------------------------------------------------------
/2020/papers/63.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/63.pdf
--------------------------------------------------------------------------------
/2020/papers/71.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/71.pdf
--------------------------------------------------------------------------------
/2020/papers/72.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/72.pdf
--------------------------------------------------------------------------------
/2020/papers/74.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/74.pdf
--------------------------------------------------------------------------------
/2020/papers/85.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/85.pdf
--------------------------------------------------------------------------------
/2020/papers/95.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/95.pdf
--------------------------------------------------------------------------------
/assets/img/cinc.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/assets/img/cinc.jpg
--------------------------------------------------------------------------------
/assets/img/ecg-ml.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/assets/img/ecg-ml.gif
--------------------------------------------------------------------------------
/.rubocop.yml:
--------------------------------------------------------------------------------
1 | AllCops:
2 |   Exclude:
3 |     - _site/**/*
4 | 
5 | Metrics/LineLength:
6 |   Enabled: false
--------------------------------------------------------------------------------
/2017/logo_mathworks.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2017/logo_mathworks.png
--------------------------------------------------------------------------------
/2020/logo_mathworks.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/logo_mathworks.png
--------------------------------------------------------------------------------
/2020/matfile_format.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/matfile_format.pdf
--------------------------------------------------------------------------------
/2021/logo_mathworks.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2021/logo_mathworks.png
--------------------------------------------------------------------------------
/2021/matfile_format.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2021/matfile_format.pdf
--------------------------------------------------------------------------------
/assets/img/physionet.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/assets/img/physionet.gif
--------------------------------------------------------------------------------
/2020/logo_google_cloud.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/logo_google_cloud.png
--------------------------------------------------------------------------------
/2021/logo_google_cloud.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2021/logo_google_cloud.png
--------------------------------------------------------------------------------
/2021/papers/cinc_draft.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2021/papers/cinc_draft.pdf
--------------------------------------------------------------------------------
/2021/papers/cinc_template.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2021/papers/cinc_template.pdf
--------------------------------------------------------------------------------
/2021/papers/cinc_template.zip:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2021/papers/cinc_template.zip
--------------------------------------------------------------------------------
/assets/img/physionet-cinc.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/assets/img/physionet-cinc.gif
--------------------------------------------------------------------------------
/.travis.yml:
--------------------------------------------------------------------------------
1 | language: ruby
2 | cache: bundler
3 | rvm: 2.6
4 | 
5 | install: script/bootstrap
6 | script: script/cibuild
7 | 
--------------------------------------------------------------------------------
/2020/logo_moore_foundation.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/logo_moore_foundation.png
--------------------------------------------------------------------------------
/2021/logo_moore_foundation.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2021/logo_moore_foundation.png
--------------------------------------------------------------------------------
/2021/papers/cinc_template.docx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2021/papers/cinc_template.docx
--------------------------------------------------------------------------------
/2020/papers/2020ChallengePaper.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2020/papers/2020ChallengePaper.pdf
--------------------------------------------------------------------------------
/2021/papers/2020ChallengePaper.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/2021/papers/2020ChallengePaper.pdf
--------------------------------------------------------------------------------
/assets/fonts/Noto-Sans-700/Noto-Sans-700.eot:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/assets/fonts/Noto-Sans-700/Noto-Sans-700.eot
--------------------------------------------------------------------------------
/assets/fonts/Noto-Sans-700/Noto-Sans-700.ttf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/assets/fonts/Noto-Sans-700/Noto-Sans-700.ttf
--------------------------------------------------------------------------------
/assets/fonts/Noto-Sans-700/Noto-Sans-700.woff:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/assets/fonts/Noto-Sans-700/Noto-Sans-700.woff
--------------------------------------------------------------------------------
/assets/fonts/Noto-Sans-700/Noto-Sans-700.woff2:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/assets/fonts/Noto-Sans-700/Noto-Sans-700.woff2
--------------------------------------------------------------------------------
/assets/fonts/Noto-Sans-italic/Noto-Sans-italic.eot:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/assets/fonts/Noto-Sans-italic/Noto-Sans-italic.eot
--------------------------------------------------------------------------------
/assets/fonts/Noto-Sans-italic/Noto-Sans-italic.ttf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/assets/fonts/Noto-Sans-italic/Noto-Sans-italic.ttf
--------------------------------------------------------------------------------
/assets/fonts/Noto-Sans-italic/Noto-Sans-italic.woff:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/assets/fonts/Noto-Sans-italic/Noto-Sans-italic.woff
--------------------------------------------------------------------------------
/assets/fonts/Noto-Sans-italic/Noto-Sans-italic.woff2:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/assets/fonts/Noto-Sans-italic/Noto-Sans-italic.woff2
--------------------------------------------------------------------------------
/assets/fonts/Noto-Sans-regular/Noto-Sans-regular.eot:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/assets/fonts/Noto-Sans-regular/Noto-Sans-regular.eot
--------------------------------------------------------------------------------
/assets/fonts/Noto-Sans-regular/Noto-Sans-regular.ttf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/assets/fonts/Noto-Sans-regular/Noto-Sans-regular.ttf
--------------------------------------------------------------------------------
/assets/fonts/Noto-Sans-regular/Noto-Sans-regular.woff:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/assets/fonts/Noto-Sans-regular/Noto-Sans-regular.woff
--------------------------------------------------------------------------------
/assets/fonts/Noto-Sans-regular/Noto-Sans-regular.woff2:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/assets/fonts/Noto-Sans-regular/Noto-Sans-regular.woff2
--------------------------------------------------------------------------------
/assets/fonts/Noto-Sans-700italic/Noto-Sans-700italic.eot:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/assets/fonts/Noto-Sans-700italic/Noto-Sans-700italic.eot
--------------------------------------------------------------------------------
/assets/fonts/Noto-Sans-700italic/Noto-Sans-700italic.ttf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/assets/fonts/Noto-Sans-700italic/Noto-Sans-700italic.ttf
--------------------------------------------------------------------------------
/assets/fonts/Noto-Sans-700italic/Noto-Sans-700italic.woff:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/assets/fonts/Noto-Sans-700italic/Noto-Sans-700italic.woff
--------------------------------------------------------------------------------
/assets/fonts/Noto-Sans-700italic/Noto-Sans-700italic.woff2:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MIT-LCP/moody-ch.github.io/master/assets/fonts/Noto-Sans-700italic/Noto-Sans-700italic.woff2
--------------------------------------------------------------------------------
/.github/CODEOWNERS:
--------------------------------------------------------------------------------
1 | # Require maintainer's :+1: for changes to the .github/ repo-config files
2 | # mainly due to https://github.com/probot/settings privilege escalation
3 | .github/* @pages-themes/maintainers
--------------------------------------------------------------------------------
/script/cibuild:
--------------------------------------------------------------------------------
1 | #!/bin/sh
2 | 
3 | set -e
4 | 
5 | bundle exec jekyll build
6 | bundle exec htmlproofer ./_site --check-html --check-sri
7 | bundle exec rubocop -D
8 | bundle exec script/validate-html
9 | gem build jekyll-theme-minimal.gemspec
10 | 
--------------------------------------------------------------------------------
/_layouts/post.html:
--------------------------------------------------------------------------------
1 | ---
2 | layout: default
3 | ---
4 | 
5 | <small>{{ page.date | date: "%-d %B %Y" }}</small>
6 | <h1>{{ page.title }}</h1>
7 | 
8 | <p class="view">by {{ page.author | default: site.author }}</p>
9 | 
10 | {{content}}
11 | 
12 | {% if page.tags %}
13 |   <small>tags: <em>{{ page.tags | join: "</em> - <em>" }}</em></small>
14 | {% endif %}
15 | 
--------------------------------------------------------------------------------
/docs/SUPPORT.md:
--------------------------------------------------------------------------------
1 | ## Where to get help
2 | 
3 | If you think you've found a bug in The Minimal Theme, please [check the existing issues](https://github.com/pages-themes/minimal/issues), and if no one has reported the problem, [open a new issue](https://github.com/pages-themes/minimal/issues/new).
4 | 
5 | If you have a general question about the theme, how to implement it, or how to customize it for your site you have two options:
6 | 
7 | 1. [Contact GitHub Support](https://github.com/contact?form%5Bsubject%5D=GitHub%20Pages%20theme%20pages-themes/minimal), or
8 | 
9 | 2. Ask your question of the Jekyll community on [talk.jekyllrb.com](https://talk.jekyllrb.com/)
--------------------------------------------------------------------------------
/script/validate-html:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env ruby
2 | # frozen_string_literal: true
3 | 
4 | require 'w3c_validators'
5 | 
6 | def validator(file)
7 |   extension = File.extname(file)
8 |   if extension == '.html'
9 |     W3CValidators::NuValidator.new
10 |   elsif extension == '.css'
11 |     W3CValidators::CSSValidator.new
12 |   end
13 | end
14 | 
15 | def validate(file)
16 |   puts "Checking #{file}..."
17 | 
18 |   path = File.expand_path "../_site/#{file}", __dir__
19 |   results = validator(file).validate_file(path)
20 | 
21 |   return puts 'Valid!' if results.errors.empty?
22 | 23 | results.errors.each { |err| puts err.to_s } 24 | exit 1 25 | end 26 | 27 | validate 'index.html' 28 | validate File.join 'assets', 'css', 'style.css' 29 | -------------------------------------------------------------------------------- /.github/no-response.yml: -------------------------------------------------------------------------------- 1 | # Configuration for probot-no-response - https://github.com/probot/no-response 2 | 3 | # Number of days of inactivity before an Issue is closed for lack of response 4 | daysUntilClose: 14 5 | # Label requiring a response 6 | responseRequiredLabel: more-information-needed 7 | # Comment to post when closing an Issue for lack of response. Set to `false` to disable 8 | closeComment: > 9 | This issue has been automatically closed because there has been no response 10 | to our request for more information from the original author. With only the 11 | information that is currently in the issue, we don't have enough information 12 | to take action. Please reach out if you have or find the answers we need so 13 | that we can investigate further. 
14 | -------------------------------------------------------------------------------- /.github/settings.yml: -------------------------------------------------------------------------------- 1 | # Repository settings set via https://github.com/probot/settings 2 | 3 | repository: 4 | has_issues: true 5 | has_wiki: false 6 | has_projects: false 7 | has_downloads: false 8 | 9 | labels: 10 | - name: help wanted 11 | oldname: help-wanted 12 | color: 0e8a16 13 | - name: more-information-needed 14 | color: d93f0b 15 | - name: bug 16 | color: b60205 17 | - name: feature 18 | color: 1d76db 19 | - name: good first issue 20 | color: "5319e7" 21 | 22 | # Not currently implemented by probot/settings, but manually implemented in script/deploy 23 | branch_protection: 24 | restrictions: null 25 | enforce_admins: false 26 | required_status_checks: 27 | strict: true 28 | contexts: 29 | - "continuous-integration/travis-ci" 30 | required_pull_request_reviews: 31 | require_code_owner_reviews: true 32 | -------------------------------------------------------------------------------- /.github/stale.yml: -------------------------------------------------------------------------------- 1 | # Configuration for probot-stale - https://github.com/probot/stale 2 | 3 | # Number of days of inactivity before an Issue or Pull Request becomes stale 4 | daysUntilStale: 60 5 | 6 | # Number of days of inactivity before a stale Issue or Pull Request is closed 7 | daysUntilClose: 7 8 | 9 | # Issues or Pull Requests with these labels will never be considered stale 10 | exemptLabels: 11 | - pinned 12 | - security 13 | 14 | # Label to use when marking as stale 15 | staleLabel: wontfix 16 | 17 | # Comment to post when marking as stale. Set to `false` to disable 18 | markComment: > 19 | This issue has been automatically marked as stale because it has not had 20 | recent activity. It will be closed if no further activity occurs. Thank you 21 | for your contributions. 
22 | 23 | # Comment to post when closing a stale Issue or Pull Request. Set to `false` to disable 24 | closeComment: false 25 | 26 | # Limit to only `issues` or `pulls` 27 | # only: issues 28 | -------------------------------------------------------------------------------- /jekyll-theme-minimal.gemspec: -------------------------------------------------------------------------------- 1 | # frozen_string_literal: true 2 | 3 | Gem::Specification.new do |s| 4 | s.name = 'jekyll-theme-minimal' 5 | s.version = '0.1.1' 6 | s.license = 'CC0-1.0' 7 | s.authors = ['Steve Smith', 'GitHub, Inc.'] 8 | s.email = ['opensource+jekyll-theme-minimal@github.com'] 9 | s.homepage = 'https://github.com/pages-themes/minimal' 10 | s.summary = 'Minimal is a Jekyll theme for GitHub Pages' 11 | 12 | s.files = `git ls-files -z`.split("\x0").select do |f| 13 | f.match(%r{^((_includes|_layouts|_sass|assets)/|(LICENSE|README)((\.(txt|md|markdown)|$)))}i) 14 | end 15 | 16 | s.platform = Gem::Platform::RUBY 17 | s.add_runtime_dependency 'jekyll', '> 3.5', '< 5.0' 18 | s.add_runtime_dependency 'jekyll-seo-tag', '~> 2.0' 19 | s.add_development_dependency 'html-proofer', '~> 3.0' 20 | s.add_development_dependency 'rubocop', '~> 0.50' 21 | s.add_development_dependency 'w3c_validators', '~> 1.3' 22 | end 23 | -------------------------------------------------------------------------------- /script/release: -------------------------------------------------------------------------------- 1 | #!/bin/sh 2 | # Tag and push a release. 3 | 4 | set -e 5 | 6 | # Make sure we're in the project root. 7 | 8 | cd $(dirname "$0")/.. 9 | 10 | # Make sure the darn thing works 11 | 12 | bundle update 13 | 14 | # Build a new gem archive. 15 | 16 | rm -rf jekyll-theme-minimal-*.gem 17 | gem build -q jekyll-theme-minimal.gemspec 18 | 19 | # Make sure we're on the master branch. 20 | 21 | (git branch | grep -q 'master') || { 22 | echo "Only release from the master branch." 
23 | exit 1 24 | } 25 | 26 | # Figure out what version we're releasing. 27 | 28 | tag=v`ls jekyll-theme-minimal-*.gem | sed 's/^jekyll-theme-minimal-\(.*\)\.gem$/\1/'` 29 | 30 | # Make sure we haven't released this version before. 31 | 32 | git fetch -t origin 33 | 34 | (git tag -l | grep -q "$tag") && { 35 | echo "Whoops, there's already a '${tag}' tag." 36 | exit 1 37 | } 38 | 39 | # Tag it and bag it. 40 | 41 | gem push jekyll-theme-minimal-*.gem && git tag "$tag" && 42 | git push origin master && git push origin "$tag" 43 | -------------------------------------------------------------------------------- /assets/js/scale.fix.js: -------------------------------------------------------------------------------- 1 | (function(document) { 2 | var metas = document.getElementsByTagName('meta'), 3 | changeViewportContent = function(content) { 4 | for (var i = 0; i < metas.length; i++) { 5 | if (metas[i].name == "viewport") { 6 | metas[i].content = content; 7 | } 8 | } 9 | }, 10 | initialize = function() { 11 | changeViewportContent("width=device-width, minimum-scale=1.0, maximum-scale=1.0"); 12 | }, 13 | gestureStart = function() { 14 | changeViewportContent("width=device-width, minimum-scale=0.25, maximum-scale=1.6"); 15 | }, 16 | gestureEnd = function() { 17 | initialize(); 18 | }; 19 | 20 | 21 | if (navigator.userAgent.match(/iPhone/i)) { 22 | initialize(); 23 | 24 | document.addEventListener("touchstart", gestureStart, false); 25 | document.addEventListener("touchend", gestureEnd, false); 26 | } 27 | })(document); 28 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | BSD 2-Clause License 2 | 3 | Copyright (c) 2020, PhysioNet/CinC Challenge 4 | All rights reserved. 5 | 6 | Redistribution and use in source and binary forms, with or without 7 | modification, are permitted provided that the following conditions are met: 8 | 9 | 1. 
Redistributions of source code must retain the above copyright notice, this 10 | list of conditions and the following disclaimer. 11 | 12 | 2. Redistributions in binary form must reproduce the above copyright notice, 13 | this list of conditions and the following disclaimer in the documentation 14 | and/or other materials provided with the distribution. 15 | 16 | THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" 17 | AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE 18 | IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE 19 | DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE 20 | FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL 21 | DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR 22 | SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER 23 | CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, 24 | OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE 25 | OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 26 | -------------------------------------------------------------------------------- /_config.yml: -------------------------------------------------------------------------------- 1 | title: PhysioNet/CinC Challenges 2 | logo: /assets/img/physionet-cinc.gif 3 | logo_2017: /assets/img/ecg-ml.gif 4 | logo_2020: /assets/img/ecg-ml.gif 5 | logo_2021: /assets/img/ecg-ml.gif 6 | description: "Quick links for this year's Challenge: Please post questions and comments in the forum. However, if your question reveals information about your entry, then please email challenge at physionet.org. We may post parts of our reply publicly if we feel that all Challengers should benefit from it. We will not answer emails about the Challenge to any other address."
7 | show_downloads: false 8 | google_analytics: 9 | theme: jekyll-theme-minimal 10 | -------------------------------------------------------------------------------- /.github/config.yml: -------------------------------------------------------------------------------- 1 | # Behaviorbot config. See https://github.com/behaviorbot/ for more information. 2 | # Note: Please Don't edit this file directly. 3 | # Edit https://github.com/pages-themes/maintenance-scripts instead. 4 | 5 | # Configuration for update-docs - https://github.com/behaviorbot/update-docs 6 | updateDocsComment: "Thanks for the pull request! If you are making any changes to the user-facing functionality, please be sure to update the documentation in the `README` or `docs/` folder alongside your change. :heart:" 7 | 8 | # Configuration for request-info - https://github.com/behaviorbot/request-info 9 | requestInfoReplyComment: Thanks for this. Do you mind providing a bit more information about what problem you're trying to solve? 10 | requestInfoLabelToAdd: more-information-needed 11 | 12 | # Configuration for new-issue-welcome - https://github.com/behaviorbot/new-issue-welcome 13 | #newIssueWelcomeComment: > 14 | # Welcome! 15 | 16 | # Configuration for new-pr-welcome - https://github.com/behaviorbot/new-pr-welcome 17 | newPRWelcomeComment: Welcome! Congrats on your first pull request to The Minimal Theme. If you haven't already, please be sure to check out [the contributing guidelines](https://github.com/pages-themes/minimal/blob/master/docs/CONTRIBUTING.md). 18 | 19 | # Configuration for first-pr-merge - https://github.com/behaviorbot/first-pr-merge 20 | firstPRMergeComment: "Congrats on getting your first pull request to The Minimal Theme merged! Without amazing humans like you submitting pull requests, we couldn’t run this project. You rock! :tada:

If you're interested in tackling another bug or feature, take a look at [the open issues](https://github.com/pages-themes/minimal/issues), especially those [labeled `help wanted`](https://github.com/pages-themes/minimal/issues?q=is%3Aopen+is%3Aissue+label%3A%22help+wanted%22)." 21 | -------------------------------------------------------------------------------- /2019/results/hackathon.tsv: -------------------------------------------------------------------------------- 1 | Rank Team name Team members Title of abstract Utility score on full test set Utility score on test set A Utility score on test set B Utility score on test set C AUROC on test set A AUROC on test set B AUROC on test set C AUPRC on test set A AUPRC on test set B AUPRC on test set C Accuracy on test set A Accuracy on test set B Accuracy on test set C F-measure on test set A F-measure on test set B F-measure on test set C Execution time on test set A (h:m:s) Number of successful entries submitted 2 | 1 SailOcean Meicheng Yang, Hongxiang Gao, Xingyao Wang, Yuwen Li, Xing Liu, Jianqing Li, Chengyu Liu N/A 0.364 0.430 0.422 -0.045 0.823 0.859 0.814 0.106 0.129 0.071 0.784 0.885 0.777 0.123 0.136 0.050 4:43:08 2 3 | 2 prna Jonathan Rubin, Yale Chang, Saman Parvaneh, Gregory Boverman N/A 0.342 0.417 0.420 -0.156 0.167 0.173 0.155 0.014 0.015 0.005 0.778 0.887 0.740 0.120 0.137 0.044 10:38:44 2 4 | 3 Sepsyd John Anda Du, Miquel Alfaras, Naoki Nonaka, Inès Krissaane, Edwar Hernando Macias Toro, Matthieu Scherpf N/A 0.329 0.399 0.389 -0.104 0.804 0.845 0.793 0.102 0.107 0.055 0.804 0.889 0.772 0.124 0.131 0.046 1:35:37 2 5 | 4 TAG Xiaopeng Zhao, Simon Lyra, Marcus Vollmer, Christoph Hoog Antink N/A 0.308 0.397 0.375 -0.220 0.000 0.000 0.000 0.000 0.000 0.000 0.831 0.899 0.731 0.133 0.135 0.041 9:29:15 2 6 | 5 vn ByeongTak Lee, KyungJae Cho, Oyeon Kwon N/A 0.301 0.372 0.345 -0.101 0.802 0.815 0.782 0.097 0.090 0.041 0.882 0.914 0.786 0.156 0.139 0.046 3:47:44 1 7 | 6 cinc_sepsis_pass Mengsha Fu N/A 0.087 
0.154 0.072 -0.155 0.689 0.719 0.701 0.046 0.033 0.021 0.806 0.892 0.848 0.081 0.066 0.036 14:08:44 2 8 | 7 Ulster University Pardis Biglarbeigi, Donal McLaughlin, Khaled Rjoob, Abdullah Abdullah, Niamh McCallan, Alicja Jasinska-Piadlo, Raymond Bond, Dewar Finlay, Kok Yew Ng, Alan Kennedy, Jim McLaughlin N/A 0.035 0.042 0.037 0.002 0.000 0.000 0.500 0.000 0.000 0.009 0.950 0.960 0.958 0.065 0.062 0.047 10:49:01 2 9 | 8 CRASHers Marco AF Pimentel, Adam Mahdi, Oliver Redfern, Mauro Santos N/A 0.000 0.000 0.000 0.000 0.500 0.500 0.500 0.022 0.014 0.009 0.978 0.986 0.991 0.000 0.000 0.000 0:48:54 1 10 | x ARUL Induparkavi Murugesan, Lingeshwaran Balasubramanian, Malathi Murugesan N/A N/A 1 11 | 12 | * Failed to satisfy all conditions of competition 13 | x Failed to submit successful entry 14 | ? Did not identify team members 15 | # Disqualified for submitting code substantially similar to another team -------------------------------------------------------------------------------- /2021/papers/index.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: 2021 3 | --- 4 | 5 | # Will Two Do? Varying Dimensions in Electrocardiography: The PhysioNet/Computing in Cardiology Challenge 2021 6 | 7 | ## Citations 8 | 9 | For this year's Challenge, please cite: 10 | 11 | [Perez Alday EA, Gu A, Shah AJ, Robichaux C, Wong AI, Liu C, Liu F, Rad AB, Elola A, Seyedi S, Li Q, Sharma A, Clifford GD*, Reyna MA*. Classification of 12-lead ECGs: the PhysioNet/Computing in Cardiology Challenge 2020. Physiol Meas. 2020 Nov 11. doi: 10.1088/1361-6579/abc960](2020ChallengePaper.pdf) 12 | 13 | [Reyna MA, Sadr N, Perez Alday EA, Gu A, Shah AJ, Robichaux C, Rad AB, Elola A, Seyedi S, Ansari S, Ghanbari H, Li Q, Sharma A, Clifford GD. Will Two Do? Varying Dimensions in Electrocardiography: the PhysioNet/Computing in Cardiology Challenge 2021. Computing in Cardiology 2021; 48: 1-4](cinc_draft.pdf) 14 | 15 | The above CinC paper is a draft. 
We will update it after CinC to include information about the conclusion of the Challenge. 16 | 17 | ## Templates 18 | 19 | Please follow the [preparation and submission instructions](https://www.cinc.org/instructions-for-preparing-and-submitting-full-papers/) for your CinC papers. 20 | 21 | We have prepared templates to help you prepare your CinC papers. Please use the LaTeX template ([Overleaf](https://www.overleaf.com/read/qqsvqvyrqkzr) 22 | or [download](cinc_template.zip)) or the [Word template](cinc_template.docx). These [templates](cinc_template.pdf) include important instructions, advice, and references. 23 | 24 | Please *read* these instructions and one of the templates. Each year, we need to provide feedback for many teams that miss some of the paper requirements, and each year, some papers are unfortunately rejected because the teams were unable to address important issues. 25 | 26 | ## Focus Issue Papers 27 | 28 | The [Physiological Measurement Focus Collection: Classification of Multilead ECGs](https://iopscience.iop.org/journal/0967-3334/page/Focus%20issues) includes the above [paper](https://iopscience.iop.org/article/10.1088/1361-6579/abc960). 29 | 30 | ## Conference Papers 31 | 32 | The conference papers for [Computing in Cardiology 2021](https://www.cinc2021.org/) will be available on the [CinC](https://www.cinc.org/cinc-papers-on-line/) and [IEEE](https://ieeexplore.ieee.org/xpl/conhome/1000157/all-proceedings) websites. 33 | 34 | --- 35 | 36 | Supported by the [National Institute of Biomedical Imaging and Bioengineering (NIBIB)](https://www.nibib.nih.gov/) under NIH grant R01EB030362.
37 | 38 | [Back](../) 39 | -------------------------------------------------------------------------------- /_sass/fonts.scss: -------------------------------------------------------------------------------- 1 | @font-face { 2 | font-family: 'Noto Sans'; 3 | font-weight: 400; 4 | font-style: normal; 5 | src: url('../fonts/Noto-Sans-regular/Noto-Sans-regular.eot'); 6 | src: url('../fonts/Noto-Sans-regular/Noto-Sans-regular.eot?#iefix') format('embedded-opentype'), 7 | local('Noto Sans'), 8 | local('Noto-Sans-regular'), 9 | url('../fonts/Noto-Sans-regular/Noto-Sans-regular.woff2') format('woff2'), 10 | url('../fonts/Noto-Sans-regular/Noto-Sans-regular.woff') format('woff'), 11 | url('../fonts/Noto-Sans-regular/Noto-Sans-regular.ttf') format('truetype'), 12 | url('../fonts/Noto-Sans-regular/Noto-Sans-regular.svg#NotoSans') format('svg'); 13 | } 14 | 15 | @font-face { 16 | font-family: 'Noto Sans'; 17 | font-weight: 700; 18 | font-style: normal; 19 | src: url('../fonts/Noto-Sans-700/Noto-Sans-700.eot'); 20 | src: url('../fonts/Noto-Sans-700/Noto-Sans-700.eot?#iefix') format('embedded-opentype'), 21 | local('Noto Sans Bold'), 22 | local('Noto-Sans-700'), 23 | url('../fonts/Noto-Sans-700/Noto-Sans-700.woff2') format('woff2'), 24 | url('../fonts/Noto-Sans-700/Noto-Sans-700.woff') format('woff'), 25 | url('../fonts/Noto-Sans-700/Noto-Sans-700.ttf') format('truetype'), 26 | url('../fonts/Noto-Sans-700/Noto-Sans-700.svg#NotoSans') format('svg'); 27 | } 28 | 29 | @font-face { 30 | font-family: 'Noto Sans'; 31 | font-weight: 400; 32 | font-style: italic; 33 | src: url('../fonts/Noto-Sans-italic/Noto-Sans-italic.eot'); 34 | src: url('../fonts/Noto-Sans-italic/Noto-Sans-italic.eot?#iefix') format('embedded-opentype'), 35 | local('Noto Sans Italic'), 36 | local('Noto-Sans-italic'), 37 | url('../fonts/Noto-Sans-italic/Noto-Sans-italic.woff2') format('woff2'), 38 | url('../fonts/Noto-Sans-italic/Noto-Sans-italic.woff') format('woff'), 39 | 
url('../fonts/Noto-Sans-italic/Noto-Sans-italic.ttf') format('truetype'), 40 | url('../fonts/Noto-Sans-italic/Noto-Sans-italic.svg#NotoSans') format('svg'); 41 | } 42 | 43 | @font-face { 44 | font-family: 'Noto Sans'; 45 | font-weight: 700; 46 | font-style: italic; 47 | src: url('../fonts/Noto-Sans-700italic/Noto-Sans-700italic.eot'); 48 | src: url('../fonts/Noto-Sans-700italic/Noto-Sans-700italic.eot?#iefix') format('embedded-opentype'), 49 | local('Noto Sans Bold Italic'), 50 | local('Noto-Sans-700italic'), 51 | url('../fonts/Noto-Sans-700italic/Noto-Sans-700italic.woff2') format('woff2'), 52 | url('../fonts/Noto-Sans-700italic/Noto-Sans-700italic.woff') format('woff'), 53 | url('../fonts/Noto-Sans-700italic/Noto-Sans-700italic.ttf') format('truetype'), 54 | url('../fonts/Noto-Sans-700italic/Noto-Sans-700italic.svg#NotoSans') format('svg'); 55 | } 56 | -------------------------------------------------------------------------------- /_layouts/default.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | {% seo %} 9 | 10 | 13 | 14 | 15 | 16 |
17 |
18 |

{{ site.title | default: site.github.repository_name }}

19 | 20 | {% if site.logo %} 21 | Logo 22 | {% endif %} 23 | 24 |

{{ site.description }}

25 | 26 | 27 | 28 | 29 | 30 | 31 | 32 | 33 | 34 | {% if site.show_downloads %} 35 | 40 | {% endif %} 41 | 42 | 43 |
44 |
45 | 46 | {{ content }} 47 | 48 |
49 | 55 |
56 | 57 | {% if site.google_analytics %} 58 | 66 | {% endif %} 67 | 68 | 69 | -------------------------------------------------------------------------------- /_layouts/2017.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | {% seo %} 9 | 10 | 13 | 14 | 15 | 16 |
17 |
18 |

{{ site.title | default: site.github.repository_name }}

19 | 20 | {% if site.logo %} 21 | Logo 22 | {% endif %} 23 | 24 |

{{ site.description }}

25 | 26 | 27 | 28 | 29 | 30 | 31 | 32 | 33 | 34 | {% if site.show_downloads %} 35 | 40 | {% endif %} 41 | 42 | 43 |
44 |
45 | 46 | {{ content }} 47 | 48 |
49 | 55 |
56 | 57 | {% if site.google_analytics %} 58 | 66 | {% endif %} 67 | 68 | 69 | -------------------------------------------------------------------------------- /_layouts/2020.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | {% seo %} 9 | 10 | 13 | 14 | 15 | 16 |
17 |
18 |

{{ site.title | default: site.github.repository_name }}

19 | 20 | {% if site.logo %} 21 | Logo 22 | {% endif %} 23 | 24 |

{{ site.description }}

25 | 26 | 27 | 28 | 29 | 30 | 31 | 32 | 33 | 34 | {% if site.show_downloads %} 35 | 40 | {% endif %} 41 | 42 | 43 |
44 |
45 | 46 | {{ content }} 47 | 48 |
49 | 55 |
56 | 57 | {% if site.google_analytics %} 58 | 66 | {% endif %} 67 | 68 | 69 | -------------------------------------------------------------------------------- /_layouts/2021.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | {% seo %} 9 | 10 | 13 | 14 | 15 | 16 |
17 |
18 |

{{ site.title | default: site.github.repository_name }}

19 | 20 | {% if site.logo %} 21 | Logo 22 | {% endif %} 23 | 24 |

{{ site.description }}

25 | 26 | 27 | 28 | 29 | 30 | 31 | 32 | 33 | 34 | {% if site.show_downloads %} 35 | 40 | {% endif %} 41 | 42 | 43 |
44 |
45 | 46 | {{ content }} 47 | 48 |
49 | 55 |
56 | 57 | {% if site.google_analytics %} 58 | 66 | {% endif %} 67 | 68 | 69 | -------------------------------------------------------------------------------- /docs/CODE_OF_CONDUCT.md: -------------------------------------------------------------------------------- 1 | # Contributor Covenant Code of Conduct 2 | 3 | ## Our Pledge 4 | 5 | In the interest of fostering an open and welcoming environment, we as contributors and maintainers pledge to making participation in our project and our community a harassment-free experience for everyone, regardless of age, body size, disability, ethnicity, gender identity and expression, level of experience, nationality, personal appearance, race, religion, or sexual identity and orientation. 6 | 7 | ## Our Standards 8 | 9 | Examples of behavior that contributes to creating a positive environment include: 10 | 11 | * Using welcoming and inclusive language 12 | * Being respectful of differing viewpoints and experiences 13 | * Gracefully accepting constructive criticism 14 | * Focusing on what is best for the community 15 | * Showing empathy towards other community members 16 | 17 | Examples of unacceptable behavior by participants include: 18 | 19 | * The use of sexualized language or imagery and unwelcome sexual attention or advances 20 | * Trolling, insulting/derogatory comments, and personal or political attacks 21 | * Public or private harassment 22 | * Publishing others' private information, such as a physical or electronic address, without explicit permission 23 | * Other conduct which could reasonably be considered inappropriate in a professional setting 24 | 25 | ## Our Responsibilities 26 | 27 | Project maintainers are responsible for clarifying the standards of acceptable behavior and are expected to take appropriate and fair corrective action in response to any instances of unacceptable behavior. 
28 | 29 | Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful. 30 | 31 | ## Scope 32 | 33 | This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community. Examples of representing a project or community include using an official project e-mail address, posting via an official social media account, or acting as an appointed representative at an online or offline event. Representation of a project may be further defined and clarified by project maintainers. 34 | 35 | ## Enforcement 36 | 37 | Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting the project team at opensource@github.com. The project team will review and investigate all complaints, and will respond in a way that it deems appropriate to the circumstances. The project team is obligated to maintain confidentiality with regard to the reporter of an incident. Further details of specific enforcement policies may be posted separately. 38 | 39 | Project maintainers who do not follow or enforce the Code of Conduct in good faith may face temporary or permanent repercussions as determined by other members of the project's leadership. 
40 | 41 | ## Attribution 42 | 43 | This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4, available at [http://contributor-covenant.org/version/1/4][version] 44 | 45 | [homepage]: http://contributor-covenant.org 46 | [version]: http://contributor-covenant.org/version/1/4/ 47 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # The Minimal theme 2 | 3 | [![Build Status](https://travis-ci.org/pages-themes/minimal.svg?branch=master)](https://travis-ci.org/pages-themes/minimal) [![Gem Version](https://badge.fury.io/rb/jekyll-theme-minimal.svg)](https://badge.fury.io/rb/jekyll-theme-minimal) 4 | 5 | *Minimal is a Jekyll theme for GitHub Pages. You can [preview the theme to see what it looks like](http://pages-themes.github.io/minimal), or even [use it today](#usage).* 6 | 7 | ![Thumbnail of minimal](thumbnail.png) 8 | 9 | ## Usage 10 | 11 | To use the Minimal theme: 12 | 13 | 1. Add the following to your site's `_config.yml`: 14 | 15 | ```yml 16 | theme: jekyll-theme-minimal 17 | ``` 18 | 19 | 2. 
Optionally, if you'd like to preview your site on your computer, add the following to your site's `Gemfile`: 20 | 21 | ```ruby 22 | gem "github-pages", group: :jekyll_plugins 23 | ``` 24 | 25 | 26 | 27 | ## Customizing 28 | 29 | ### Configuration variables 30 | 31 | Minimal will respect the following variables, if set in your site's `_config.yml`: 32 | 33 | ```yml 34 | title: [The title of your site] 35 | description: [A short description of your site's purpose] 36 | ``` 37 | 38 | Additionally, you may choose to set the following optional variables: 39 | 40 | ```yml 41 | logo: [Location of the logo] 42 | show_downloads: ["true" or "false" to indicate whether to provide a download URL] 43 | google_analytics: [Your Google Analytics tracking ID] 44 | ``` 45 | 46 | ### Stylesheet 47 | 48 | If you'd like to add your own custom styles: 49 | 50 | 1. Create a file called `/assets/css/style.scss` in your site 51 | 2. Add the following content to the top of the file, exactly as shown: 52 | ```scss 53 | --- 54 | --- 55 | 56 | @import "{{ site.theme }}"; 57 | ``` 58 | 3. Add any custom CSS (or Sass, including imports) you'd like immediately after the `@import` line 59 | 60 | ### Layouts 61 | 62 | If you'd like to change the theme's HTML layout: 63 | 64 | 1. [Copy the original template](https://github.com/pages-themes/minimal/blob/master/_layouts/default.html) from the theme's repository
(*Pro-tip: click "raw" to make copying easier*) 65 | 2. Create a file called `/_layouts/default.html` in your site 66 | 3. Paste the default layout content copied in the first step 67 | 4. Customize the layout as you'd like 68 | 69 | ## Roadmap 70 | 71 | See the [open issues](https://github.com/pages-themes/minimal/issues) for a list of proposed features (and known issues). 72 | 73 | ## Project philosophy 74 | 75 | The Minimal theme is intended to make it quick and easy for GitHub Pages users to create their first (or 100th) website. The theme should meet the vast majority of users' needs out of the box, erring on the side of simplicity rather than flexibility, and provide users the opportunity to opt-in to additional complexity if they have specific needs or wish to further customize their experience (such as adding custom CSS or modifying the default layout). It should also look great, but that goes without saying. 76 | 77 | ## Contributing 78 | 79 | Interested in contributing to Minimal? We'd love your help. Minimal is an open source project, built one contribution at a time by users like you. See [the CONTRIBUTING file](docs/CONTRIBUTING.md) for instructions on how to contribute. 80 | 81 | ### Previewing the theme locally 82 | 83 | If you'd like to preview the theme locally (for example, in the process of proposing a change): 84 | 85 | 1. Clone down the theme's repository (`git clone https://github.com/pages-themes/minimal`) 86 | 2. `cd` into the theme's directory 87 | 3. Run `script/bootstrap` to install the necessary dependencies 88 | 4. Run `bundle exec jekyll serve` to start the preview server 89 | 5. Visit [`localhost:4000`](http://localhost:4000) in your browser to preview the theme 90 | 91 | ### Running tests 92 | 93 | The theme contains a minimal test suite, to ensure a site with the theme would build successfully. To run the tests, simply run `script/cibuild`. You'll need to run `script/bootstrap` once before the test script will work.
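The preview and test steps above condense into a short shell session. This is an illustrative sketch only: the `minimal` directory name and default port 4000 are assumptions taken from the README, and the clone/serve commands are left as comments so the sketch itself has no network or server side effects.

```shell
# Sketch of the README's local-preview and test workflow.
# Assumes Ruby and Bundler are installed.
repo="https://github.com/pages-themes/minimal"   # repository named in the README

# git clone "$repo"            # step 1: fetch the theme
# cd minimal                   # step 2: enter the theme directory
# script/bootstrap             # step 3: install dependencies (run once)
# bundle exec jekyll serve     # step 4: start the preview server
# script/cibuild               # run the test suite (after bootstrap)

preview_url="http://localhost:4000"              # step 5: open this in a browser
echo "Preview URL: $preview_url"
```

Running `script/cibuild` exercises the same build and validation commands shown in the `script/cibuild` file dumped earlier in this snapshot.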
94 | -------------------------------------------------------------------------------- /_sass/rouge-github.scss: -------------------------------------------------------------------------------- 1 | .highlight table td { padding: 5px; } 2 | .highlight table pre { margin: 0; } 3 | .highlight .cm { 4 | color: #999988; 5 | font-style: italic; 6 | } 7 | .highlight .cp { 8 | color: #999999; 9 | font-weight: bold; 10 | } 11 | .highlight .c1 { 12 | color: #999988; 13 | font-style: italic; 14 | } 15 | .highlight .cs { 16 | color: #999999; 17 | font-weight: bold; 18 | font-style: italic; 19 | } 20 | .highlight .c, .highlight .cd { 21 | color: #999988; 22 | font-style: italic; 23 | } 24 | .highlight .err { 25 | color: #a61717; 26 | background-color: #e3d2d2; 27 | } 28 | .highlight .gd { 29 | color: #000000; 30 | background-color: #ffdddd; 31 | } 32 | .highlight .ge { 33 | color: #000000; 34 | font-style: italic; 35 | } 36 | .highlight .gr { 37 | color: #aa0000; 38 | } 39 | .highlight .gh { 40 | color: #999999; 41 | } 42 | .highlight .gi { 43 | color: #000000; 44 | background-color: #ddffdd; 45 | } 46 | .highlight .go { 47 | color: #888888; 48 | } 49 | .highlight .gp { 50 | color: #555555; 51 | } 52 | .highlight .gs { 53 | font-weight: bold; 54 | } 55 | .highlight .gu { 56 | color: #aaaaaa; 57 | } 58 | .highlight .gt { 59 | color: #aa0000; 60 | } 61 | .highlight .kc { 62 | color: #000000; 63 | font-weight: bold; 64 | } 65 | .highlight .kd { 66 | color: #000000; 67 | font-weight: bold; 68 | } 69 | .highlight .kn { 70 | color: #000000; 71 | font-weight: bold; 72 | } 73 | .highlight .kp { 74 | color: #000000; 75 | font-weight: bold; 76 | } 77 | .highlight .kr { 78 | color: #000000; 79 | font-weight: bold; 80 | } 81 | .highlight .kt { 82 | color: #445588; 83 | font-weight: bold; 84 | } 85 | .highlight .k, .highlight .kv { 86 | color: #000000; 87 | font-weight: bold; 88 | } 89 | .highlight .mf { 90 | color: #009999; 91 | } 92 | .highlight .mh { 93 | color: #009999; 94 | } 95 | 
.highlight .il { 96 | color: #009999; 97 | } 98 | .highlight .mi { 99 | color: #009999; 100 | } 101 | .highlight .mo { 102 | color: #009999; 103 | } 104 | .highlight .m, .highlight .mb, .highlight .mx { 105 | color: #009999; 106 | } 107 | .highlight .sb { 108 | color: #d14; 109 | } 110 | .highlight .sc { 111 | color: #d14; 112 | } 113 | .highlight .sd { 114 | color: #d14; 115 | } 116 | .highlight .s2 { 117 | color: #d14; 118 | } 119 | .highlight .se { 120 | color: #d14; 121 | } 122 | .highlight .sh { 123 | color: #d14; 124 | } 125 | .highlight .si { 126 | color: #d14; 127 | } 128 | .highlight .sx { 129 | color: #d14; 130 | } 131 | .highlight .sr { 132 | color: #009926; 133 | } 134 | .highlight .s1 { 135 | color: #d14; 136 | } 137 | .highlight .ss { 138 | color: #990073; 139 | } 140 | .highlight .s { 141 | color: #d14; 142 | } 143 | .highlight .na { 144 | color: #008080; 145 | } 146 | .highlight .bp { 147 | color: #999999; 148 | } 149 | .highlight .nb { 150 | color: #0086B3; 151 | } 152 | .highlight .nc { 153 | color: #445588; 154 | font-weight: bold; 155 | } 156 | .highlight .no { 157 | color: #008080; 158 | } 159 | .highlight .nd { 160 | color: #3c5d5d; 161 | font-weight: bold; 162 | } 163 | .highlight .ni { 164 | color: #800080; 165 | } 166 | .highlight .ne { 167 | color: #990000; 168 | font-weight: bold; 169 | } 170 | .highlight .nf { 171 | color: #990000; 172 | font-weight: bold; 173 | } 174 | .highlight .nl { 175 | color: #990000; 176 | font-weight: bold; 177 | } 178 | .highlight .nn { 179 | color: #555555; 180 | } 181 | .highlight .nt { 182 | color: #000080; 183 | } 184 | .highlight .vc { 185 | color: #008080; 186 | } 187 | .highlight .vg { 188 | color: #008080; 189 | } 190 | .highlight .vi { 191 | color: #008080; 192 | } 193 | .highlight .nv { 194 | color: #008080; 195 | } 196 | .highlight .ow { 197 | color: #000000; 198 | font-weight: bold; 199 | } 200 | .highlight .o { 201 | color: #000000; 202 | font-weight: bold; 203 | } 204 | .highlight .w { 205 | 
color: #bbbbbb; 206 | } 207 | .highlight { 208 | background-color: #f8f8f8; 209 | } 210 | -------------------------------------------------------------------------------- /about/index.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | --- 4 | 5 | # About Us and Contact Information 6 | 7 | The PhysioNet Challenges are an activity of PhysioNet (the moniker of the Research Resource for Complex Physiologic Signals). 8 | 9 | PhysioNet was established in 1999 under the auspices of the National Institutes of Health (NIH), co-founded by Principal Investigators [Ary L. Goldberger](http://reylab.bidmc.harvard.edu/people/Ary.shtml) at Beth Israel Deaconess Medical Center's/Harvard Medical School's Margret & H. A. Rey Institute for Nonlinear Dynamics in Medicine ([ReyLab](http://reylab.bidmc.harvard.edu/index.shtml)) and [Roger G. Mark](https://imes.mit.edu/people/faculty/mark-roger/) at MIT's Laboratory for Computational Physiology ([LCP](https://lcp.mit.edu/)). The PhysioNet resource's original and ongoing missions focus on conducting and catalyzing biomedical research and education, in part by offering free access to large collections of physiological and clinical data and related open-source software. 10 | 11 | In cooperation with the annual Computing in Cardiology (CinC) conference, PhysioNet also hosts an annual series of biomedical 'Challenges', focusing research on unsolved problems in clinical and basic science. These Challenges have been supported by the NIH, Google, MathWorks, and the Gordon and Betty Moore Foundation. For the first 15 years, these Challenges were led by George Moody at the LCP. Since 2015, the Challenges have been led by [Gari Clifford](http://gdclifford.info) at Emory University and the Georgia Institute of Technology.
12 | 13 | Currently supported by the National Institute of Biomedical Imaging and Bioengineering (NIBIB) (R01EB030362), PhysioNet has three specific aims, managed by its three interactive research groups: 14 | 15 | 1. The Laboratory for Computational Physiology (LCP) at MIT runs the [PhysioNet.org](https://physionet.org) website to distribute data and resources such as open-source software. It maintains this software, including the WFDB libraries for accessing waveform data. 16 | 17 | 2. The ReyLab at Beth Israel Deaconess/Harvard Medical School develops computational software for time series analysis and other open-source resources and research posted through [PhysioNet.org](https://physionet.org). 18 | 19 | 3. The Department of Biomedical Informatics (BMI) at Emory runs the Challenges and [physionetchallenges.org](https://physionetchallenges.org). It supports the code and data provided by the organizers for the current Challenge, but not for past Challenges or other projects. 20 | 21 | For support of all libraries and code provided on PhysioNet.org, please contact the LCP, which can be reached directly at [https://physionet.org/about/#contact](https://physionet.org/about/#contact). 22 | 23 | For questions about Computing in Cardiology (CinC), please email contact[at]cinc.org. Although we partner with CinC, we cannot provide exceptions to the conference rules, discounts on fees, visa support letters for attendance, or other specific information about the conference not directly related to the Challenge. 24 | 25 | For support with the current Challenge, please contact help[at]physionetchallenges.org. Previous Challenges are not generally supported, but requests will be considered, particularly if resources are available. The submission of open-source code to be re-run on past Challenge data will be considered if the requestor can provide resources to assist in this.
Unfortunately, the funding from the NIH and our generous sponsors cannot always match the large number of requests that we receive. Test data from previous Challenges will not be released under any circumstances to prevent overfitting on the test data. 26 | 27 | If you are interested in contributing to, or posing, a Challenge, please feel free to contact us with details of the databases you can provide, the nature of the problem you wish to solve, and some demo code which makes a basic attempt to solve the problem. We strongly recommend having at least three independent databases, two to become public, and one to remain private/hidden. For more information on the general aims and framework of the Challenge, and the criteria for a successful event, please see [here](https://arxiv.org/abs/2007.10502). 28 | 29 | --- 30 | 31 | Supported by the [National Institute of Biomedical Imaging and Bioengineering (NIBIB)](https://www.nibib.nih.gov/) under NIH grant R01EB030362. 32 | 33 | [Back](../) 34 | -------------------------------------------------------------------------------- /2021/leaderboard/index.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: 2021 3 | --- 4 | 5 | # PhysioNet/CinC Challenge 2021 Results 6 | 7 | This page contains the final scores for the 2021 PhysioNet/CinC Challenge: 8 | - __Official entries__: [only Challenge metric scores](https://docs.google.com/spreadsheets/d/1cTLRmSLS1_TOwx-XnY-QVoUyO2rFyPUGTHRzNm3u8EM/edit?usp=sharing) and [all scores](https://docs.google.com/spreadsheets/d/1Zf7A_w_Pn3A--jSOHinrSQ-FnJ8CrptzeHm6U9Gph4M/edit?usp=sharing) 9 | - __Unofficial entries__: [only Challenge metric scores](https://docs.google.com/spreadsheets/d/1iMKPXDvqfyQlwhsd4N6CjKZccikhsIkSDygLEsICqsw/edit?usp=sharing) and [all scores](https://docs.google.com/spreadsheets/d/1iPDrWi1SsWUO-VNMwQK8W63iA_mJgzDN-Z-mmhaGcN4/edit?usp=sharing) 10 | - __Team eligibility__:
[here](https://docs.google.com/spreadsheets/d/1sSKA9jMp8oT2VqyX4CTirIT3m5lSohIuk5GWf-Cq8FU/edit?usp=sharing) (determines whether entries are official or unofficial) 11 | 12 | These tables contain scores for the 12-lead, 6-lead, 4-lead, 3-lead, and 2-lead versions of the hidden validation and test sets. The [2021 Challenge webpage](../) and [2021 Challenge paper](../papers/cinc_draft.pdf) describe the [lead combinations](../#data) and the validation and test [data sources](../#data-sources). We also included an _all-lead_ score in the tables, which is computed as the mean of the 12-lead, 3-lead, and 2-lead scores. 13 | 14 | We used the [Challenge scoring metric](../#scoring) that we implemented in the [evaluation code repository](https://github.com/physionetchallenges/evaluation-2021) to rank the official entries on the test set. We sorted the unofficial entries alphabetically by team name. 15 | 16 | In these tables, you can find the following information: 17 | - The [only Challenge metric scores of the official entries](https://docs.google.com/spreadsheets/d/1cTLRmSLS1_TOwx-XnY-QVoUyO2rFyPUGTHRzNm3u8EM/edit?usp=sharing) and [only Challenge metric scores of the unofficial entries](https://docs.google.com/spreadsheets/d/1iMKPXDvqfyQlwhsd4N6CjKZccikhsIkSDygLEsICqsw/edit?usp=sharing) spreadsheets provide the Challenge scoring metric on the hidden datasets as well as the run times for training the models and running the trained models. Please click on the tabs at the bottom of the spreadsheets to switch between the 12-lead, 6-lead, 4-lead, 3-lead, 2-lead, and all-lead (the mean of the 12-lead, 3-lead, and 2-lead scores) scores.
18 | - The [all scores of the official entries](https://docs.google.com/spreadsheets/d/1Zf7A_w_Pn3A--jSOHinrSQ-FnJ8CrptzeHm6U9Gph4M/edit?usp=sharing) and [all scores of the unofficial entries](https://docs.google.com/spreadsheets/d/1iPDrWi1SsWUO-VNMwQK8W63iA_mJgzDN-Z-mmhaGcN4/edit?usp=sharing) spreadsheets provide more evaluation metrics: the area under the receiver operating characteristic curve (AUROC), area under the precision-recall curve (AUPRC), _F_-measure, accuracy, and the Challenge scoring metric scores. The accuracy metric is the fraction of correctly diagnosed recordings, i.e., recordings for which all classes are correct. Please click on the tabs at the bottom of the spreadsheets to switch between the 12-lead, 6-lead, 4-lead, 3-lead, and 2-lead scores. 19 | - The [team eligibility table](https://docs.google.com/spreadsheets/d/1sSKA9jMp8oT2VqyX4CTirIT3m5lSohIuk5GWf-Cq8FU/edit?usp=sharing) contains information about all of the 2021 Challenge entries (official and unofficial entries), including satisfaction of the [rules](../#rules) for rankings and prize eligibility. 20 | 21 | To refer to these tables in your publication, please cite the following papers: 22 | 23 | [Perez Alday EA, Gu A, Shah AJ, Robichaux C, Wong AI, Liu C, Liu F, Rad AB, Elola A, Seyedi S, Li Q, Sharma A, Clifford GD*, Reyna MA*. Classification of 12-lead ECGs: the PhysioNet/Computing in Cardiology Challenge 2020. Physiol Meas. 41 (2020). doi: 10.1088/1361-6579/abc960](https://iopscience.iop.org/article/10.1088/1361-6579/abc960). 24 | 25 | [Reyna MA, Sadr N, Perez Alday EA, Gu A, Shah AJ, Robichaux C, Rad AB, Elola A, Seyedi S, Ansari S, Ghanbari H, Li Q, Sharma A, Clifford GD. Will Two Do? Varying Dimensions in Electrocardiography: the PhysioNet/Computing in Cardiology Challenge 2021. Computing in Cardiology 2021; 48: 1-4](../papers/cinc_draft.pdf).
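The _all-lead_ score described above is a simple mean of three of the per-lead scores. As a small illustration (the function name and example values below are ours, for demonstration only; they are not part of the official evaluation code):

```python
def all_lead_score(score_12, score_3, score_2):
    """All-lead score as described above: the mean of the 12-lead, 3-lead,
    and 2-lead scores (the 6-lead and 4-lead scores are not included)."""
    return (score_12 + score_3 + score_2) / 3

# Hypothetical score values, chosen only to illustrate the computation:
print(all_lead_score(0.75, 0.50, 0.25))  # prints 0.5
```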
26 | 27 | --- 28 | 29 | Supported by the [National Institute of Biomedical Imaging and Bioengineering (NIBIB)](https://www.nibib.nih.gov/) under NIH grant R01EB030362. 30 | 31 | [Back](../) 32 | -------------------------------------------------------------------------------- /2020/Dx_map.csv: -------------------------------------------------------------------------------- 1 | Dx,SNOMED CT Code,Abbreviation 2 | 1st degree av block,270492004,IAVB 3 | 2nd degree av block,195042002,IIAVB 4 | abnormal QRS,164951009,abQRS 5 | accelerated junctional rhythm,426664006,AJR 6 | acute myocardial infarction,57054005,AMI 7 | acute myocardial ischemia,413444003,AMIs 8 | anterior ischemia,426434006,AnMIs 9 | anterior myocardial infarction,54329005,AnMI 10 | atrial bigeminy,251173003,AB 11 | atrial fibrillation,164889003,AF 12 | atrial fibrillation and flutter,195080001,AFAFL 13 | atrial flutter,164890007,AFL 14 | atrial hypertrophy,195126007,AH 15 | atrial pacing pattern,251268003,AP 16 | atrial tachycardia,713422000,ATach 17 | atrioventricular junctional rhythm,29320008,AVJR 18 | av block,233917008,AVB 19 | blocked premature atrial contraction,251170000,BPAC 20 | brady tachy syndrome,74615001,BTS 21 | bradycardia,426627000,Brady 22 | bundle branch block,6374002,BBB 23 | cardiac dysrhythmia,698247007,CD 24 | chronic atrial fibrillation,426749004,CAF 25 | chronic myocardial ischemia,413844008,CMI 26 | complete heart block,27885002,CHB 27 | complete right bundle branch block,713427006,CRBBB 28 | congenital incomplete atrioventricular heart block,204384007,CIAHB 29 | coronary heart disease,53741008,CHD 30 | decreased qt interval,77867006,SQT 31 | diffuse intraventricular block,82226007,DIB 32 | early repolarization,428417006,ERe 33 | fusion beats,13640000,FB 34 | heart failure,84114007,HF 35 | heart valve disorder,368009,HVD 36 | high t-voltage,251259000,HTV 37 | idioventricular rhythm,49260003,IR 38 | incomplete left bundle branch block,251120003,ILBBB 39 | incomplete right bundle 
branch block,713426002,IRBBB 40 | indeterminate cardiac axis,251200008,ICA 41 | inferior ischaemia,425419005,IIs 42 | inferior ST segment depression,704997005,ISTD 43 | junctional escape,426995002,JE 44 | junctional premature complex,251164006,JPC 45 | junctional tachycardia,426648003,JTach 46 | lateral ischaemia,425623009,LIs 47 | left anterior fascicular block,445118002,LAnFB 48 | left atrial abnormality,253352002,LAA 49 | left atrial enlargement,67741000119109,LAE 50 | left atrial hypertrophy,446813000,LAH 51 | left axis deviation,39732003,LAD 52 | left bundle branch block,164909002,LBBB 53 | left posterior fascicular block,445211001,LPFB 54 | left ventricular hypertrophy,164873001,LVH 55 | left ventricular strain,370365005,LVS 56 | low qrs voltages,251146004,LQRSV 57 | mobitz type i wenckebach atrioventricular block,54016002,MoI 58 | myocardial infarction,164865005,MI 59 | myocardial ischemia,164861001,MIs 60 | nonspecific intraventricular conduction disorder,698252002,NSIVCB 61 | nonspecific st t abnormality,428750005,NSSTTA 62 | old myocardial infarction,164867002,OldMI 63 | pacing rhythm,10370003,PR 64 | paired ventricular premature complexes,251182009,VPVC 65 | paroxysmal atrial fibrillation,282825002,PAF 66 | paroxysmal supraventricular tachycardia,67198005,PSVT 67 | paroxysmal ventricular tachycardia,425856008,PVT 68 | premature atrial contraction,284470004,PAC 69 | premature ventricular contractions,427172004,PVC 70 | ventricular premature beats,17338001,VPB 71 | prolonged pr interval,164947007,LPR 72 | prolonged qt interval,111975006,LQT 73 | qwave abnormal,164917005,QAb 74 | r wave abnormal,164921003,RAb 75 | rapid atrial fibrillation,314208002,RAF 76 | right atrial abnormality,253339007,RAAb 77 | right atrial hypertrophy,446358003,RAH 78 | right axis deviation,47665007,RAD 79 | right bundle branch block,59118001,RBBB 80 | right ventricular hypertrophy,89792004,RVH 81 | s t changes,55930002,STC 82 | shortened pr interval,49578007,SPRI 83 | sinoatrial 
block,65778007,SAB 84 | sinus arrhythmia,427393009,SA 85 | sinus bradycardia,426177001,SB 86 | sinus node dysfunction,60423000,SND 87 | sinus rhythm,426783006,NSR 88 | sinus tachycardia,427084000,STach 89 | st depression,429622005,STD 90 | st elevation,164931005,STE 91 | st interval abnormal,164930006,STIAb 92 | supraventricular bigeminy,251168009,SVB 93 | supraventricular premature beats,63593006,SVPB 94 | supraventricular tachycardia,426761007,SVT 95 | suspect arm ecg leads reversed,251139008,ALR 96 | t wave abnormal,164934002,TAb 97 | t wave inversion,59931005,TInv 98 | transient ischemic attack,266257000,TIA 99 | u wave abnormal,164937009,UAb 100 | ventricular bigeminy,11157007,VBig 101 | ventricular ectopics,164884008,VEB 102 | ventricular escape beat,75532003,VEsB 103 | ventricular escape rhythm,81898007,VEsR 104 | ventricular fibrillation,164896001,VF 105 | ventricular flutter,111288001,VFL 106 | ventricular hypertrophy,266249003,VH 107 | ventricular pacing pattern,251266004,VPP 108 | ventricular pre excitation,195060002,VPEx 109 | ventricular tachycardia,164895002,VTach 110 | ventricular trigeminy,251180001,VTrig 111 | wandering atrial pacemaker,195101003,WAP 112 | wolff parkinson white pattern,74390002,WPW 113 | -------------------------------------------------------------------------------- /_sass/jekyll-theme-minimal.scss: -------------------------------------------------------------------------------- 1 | @import "fonts"; 2 | @import "rouge-github"; 3 | 4 | body { 5 | background-color: #fff; 6 | padding:50px; 7 | font: 14px/1.5 "Noto Sans", "Helvetica Neue", Helvetica, Arial, sans-serif; 8 | color:#727272; 9 | font-weight:400; 10 | } 11 | 12 | h1, h2, h3, h4, h5, h6 { 13 | color:#222; 14 | margin:0 0 20px; 15 | } 16 | 17 | p, ul, ol, table, pre, dl { 18 | margin:0 0 20px; 19 | } 20 | 21 | h1, h2, h3 { 22 | line-height:1.1; 23 | } 24 | 25 | h1 { 26 | font-size:28px; 27 | } 28 | 29 | h2 { 30 | color:#393939; 31 | } 32 | 33 | h3, h4, h5, h6 { 34 | 
color:#494949; 35 | } 36 | 37 | a { 38 | color:#267CB9; 39 | text-decoration:none; 40 | } 41 | 42 | a:hover, a:focus { 43 | color:#069; 44 | font-weight: bold; 45 | } 46 | 47 | a small { 48 | font-size:11px; 49 | color:#777; 50 | margin-top:-0.3em; 51 | display:block; 52 | } 53 | 54 | a:hover small { 55 | color:#777; 56 | } 57 | 58 | .wrapper { 59 | width:860px; 60 | margin:0 auto; 61 | } 62 | 63 | blockquote { 64 | border-left:1px solid #e5e5e5; 65 | margin:0; 66 | padding:0 0 0 20px; 67 | font-style:italic; 68 | } 69 | 70 | code, pre { 71 | font-family:Monaco, Bitstream Vera Sans Mono, Lucida Console, Terminal, Consolas, Liberation Mono, DejaVu Sans Mono, Courier New, monospace; 72 | color:#333; 73 | } 74 | 75 | pre { 76 | padding:8px 15px; 77 | background: #f8f8f8; 78 | border-radius:5px; 79 | border:1px solid #e5e5e5; 80 | overflow-x: auto; 81 | } 82 | 83 | table { 84 | width:100%; 85 | border-collapse:collapse; 86 | } 87 | 88 | th, td { 89 | text-align:left; 90 | padding:5px 10px; 91 | border-bottom:1px solid #e5e5e5; 92 | } 93 | 94 | dt { 95 | color:#444; 96 | font-weight:700; 97 | } 98 | 99 | th { 100 | color:#444; 101 | } 102 | 103 | img { 104 | max-width:100%; 105 | } 106 | 107 | header { 108 | width:270px; 109 | float:left; 110 | position:fixed; 111 | -webkit-font-smoothing:subpixel-antialiased; 112 | } 113 | 114 | ul.downloads { 115 | list-style:none; 116 | height:40px; 117 | padding:0; 118 | background: #f4f4f4; 119 | border-radius:5px; 120 | border:1px solid #e0e0e0; 121 | width:270px; 122 | } 123 | 124 | .downloads li { 125 | width:89px; 126 | float:left; 127 | border-right:1px solid #e0e0e0; 128 | height:40px; 129 | } 130 | 131 | .downloads li:first-child a { 132 | border-radius:5px 0 0 5px; 133 | } 134 | 135 | .downloads li:last-child a { 136 | border-radius:0 5px 5px 0; 137 | } 138 | 139 | .downloads a { 140 | line-height:1; 141 | font-size:11px; 142 | color:#676767; 143 | display:block; 144 | text-align:center; 145 | padding-top:6px; 146 | 
height:34px; 147 | } 148 | 149 | .downloads a:hover, .downloads a:focus { 150 | color:#675C5C; 151 | font-weight:bold; 152 | } 153 | 154 | .downloads ul a:active { 155 | background-color:#f0f0f0; 156 | } 157 | 158 | strong { 159 | color:#222; 160 | font-weight:700; 161 | } 162 | 163 | .downloads li + li + li { 164 | border-right:none; 165 | width:89px; 166 | } 167 | 168 | .downloads a strong { 169 | font-size:14px; 170 | display:block; 171 | color:#222; 172 | } 173 | 174 | section { 175 | width:500px; 176 | float:right; 177 | padding-bottom:50px; 178 | } 179 | 180 | small { 181 | font-size:11px; 182 | } 183 | 184 | hr { 185 | border:0; 186 | background:#e5e5e5; 187 | height:1px; 188 | margin:0 0 20px; 189 | } 190 | 191 | footer { 192 | width:270px; 193 | float:left; 194 | position:fixed; 195 | bottom:50px; 196 | -webkit-font-smoothing:subpixel-antialiased; 197 | } 198 | 199 | @media print, screen and (max-width: 960px) { 200 | 201 | div.wrapper { 202 | width:auto; 203 | margin:0; 204 | } 205 | 206 | header, section, footer { 207 | float:none; 208 | position:static; 209 | width:auto; 210 | } 211 | 212 | header { 213 | padding-right:320px; 214 | } 215 | 216 | section { 217 | border:1px solid #e5e5e5; 218 | border-width:1px 0; 219 | padding:20px 0; 220 | margin:0 0 20px; 221 | } 222 | 223 | header a small { 224 | display:inline; 225 | } 226 | 227 | header ul { 228 | position:absolute; 229 | right:50px; 230 | top:52px; 231 | } 232 | } 233 | 234 | @media print, screen and (max-width: 720px) { 235 | body { 236 | word-wrap:break-word; 237 | } 238 | 239 | header { 240 | padding:0; 241 | } 242 | 243 | header ul, header p.view { 244 | position:static; 245 | } 246 | 247 | pre, code { 248 | word-wrap:normal; 249 | } 250 | } 251 | 252 | @media print, screen and (max-width: 480px) { 253 | body { 254 | padding:15px; 255 | } 256 | 257 | .downloads { 258 | width:99%; 259 | } 260 | 261 | .downloads li, .downloads li + li + li { 262 | width:33%; 263 | } 264 | } 265 | 266 | @media 
print { 267 | body { 268 | padding:0.4in; 269 | font-size:12pt; 270 | color:#444; 271 | } 272 | } 273 | -------------------------------------------------------------------------------- /docs/CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | # Contributing to The Minimal Theme 2 | 3 | Hi there! We're thrilled that you'd like to contribute to The Minimal Theme. Your help is essential for keeping it great. 4 | 5 | The Minimal Theme is an open source project supported by the efforts of an entire community and built one contribution at a time by users like you. We'd love for you to get involved. Whatever your level of skill or however much time you can give, your contribution is greatly appreciated. There are many ways to contribute, from writing tutorials or blog posts, improving the documentation, submitting bug reports and feature requests, and helping other users by commenting on issues, to writing code which can be incorporated into The Minimal Theme itself. 6 | 7 | Following these guidelines helps to communicate that you respect the time of the developers managing and developing this open source project. In return, they should reciprocate that respect in addressing your issue, assessing changes, and helping you finalize your pull requests. 8 | 9 | 10 | ## Looking for support? 11 | 12 | We'd love to help. Check out [the support guidelines](SUPPORT.md). 13 | 14 | ## How to report a bug 15 | 16 | Think you found a bug? Please check [the list of open issues](https://github.com/pages-themes/minimal/issues) to see if your bug has already been reported. If it hasn't, please [submit a new issue](https://github.com/pages-themes/minimal/issues/new).
17 | 18 | Here are a few tips for writing *great* bug reports: 19 | 20 | * Describe the specific problem (e.g., "widget doesn't turn clockwise" versus "getting an error") 21 | * Include the steps to reproduce the bug, what you expected to happen, and what happened instead 22 | * Check that you are using the latest version of the project and its dependencies 23 | * Include what version of the project you're using, as well as any relevant dependencies 24 | * Only include one bug per issue. If you have discovered two bugs, please file two issues 25 | * Even if you don't know how to fix the bug, including a failing test may help others track it down 26 | 27 | **If you find a security vulnerability, do not open an issue. Please email security@github.com instead.** 28 | 29 | ## How to suggest a feature or enhancement 30 | 31 | If you find yourself wishing for a feature that doesn't exist in The Minimal Theme, you are probably not alone. There are bound to be others out there with similar needs. Many of the features that The Minimal Theme has today have been added because our users saw the need. 32 | 33 | Feature requests are welcome. But take a moment to find out whether your idea fits with the scope and goals of the project. It's up to you to make a strong case to convince the project's developers of the merits of this feature. Please provide as much detail and context as possible, including describing the problem you're trying to solve. 34 | 35 | [Open an issue](https://github.com/pages-themes/minimal/issues/new) which describes the feature you would like to see, why you want it, how it should work, etc. 36 | 37 | 38 | 39 | ## Your first contribution 40 | 41 | We'd love for you to contribute to the project. Unsure where to begin contributing to The Minimal Theme?
You can start by looking through these "good first issue" and "help wanted" issues: 42 | 43 | * [Good first issues](https://github.com/pages-themes/minimal/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22) - issues which should only require a few lines of code and a test or two 44 | * [Help wanted issues](https://github.com/pages-themes/minimal/issues?q=is%3Aissue+is%3Aopen+label%3A%22help+wanted%22) - issues which may be a bit more involved, but are specifically seeking community contributions 45 | 46 | *p.s. Feel free to ask for help; everyone is a beginner at first* :smiley_cat: 47 | 48 | ## How to propose changes 49 | 50 | Here are a few general guidelines for proposing changes: 51 | 52 | * If you are making visual changes, include a screenshot of what the affected element looks like, both before and after. 53 | * Follow the [Jekyll style guide](https://ben.balter.com/jekyll-style-guide). 54 | * If you are changing any user-facing functionality, please be sure to update the documentation 55 | * Each pull request should implement **one** feature or bug fix. If you want to add or fix more than one thing, submit more than one pull request 56 | * Do not commit changes to files that are irrelevant to your feature or bug fix 57 | * Don't bump the version number in your pull request (it will be bumped prior to release) 58 | * Write [a good commit message](http://tbaggery.com/2008/04/19/a-note-about-git-commit-messages.html) 59 | 60 | At a high level, [the process for proposing changes](https://guides.github.com/introduction/flow/) is: 61 | 62 | 1. [Fork](https://github.com/pages-themes/minimal/fork) and clone the project 63 | 2. Configure and install the dependencies: `script/bootstrap` 64 | 3. Make sure the tests pass on your machine: `script/cibuild` 65 | 4. Create a new branch: `git checkout -b my-branch-name` 66 | 5. Make your change, add tests, and make sure the tests still pass 67 | 6.
Push to your fork and [submit a pull request](https://github.com/pages-themes/minimal/compare) 68 | 7. Pat yourself on the back and wait for your pull request to be reviewed and merged 69 | 70 | **Interested in submitting your first Pull Request?** It's easy! You can learn how from this *free* series [How to Contribute to an Open Source Project on GitHub](https://egghead.io/series/how-to-contribute-to-an-open-source-project-on-github) 71 | 72 | ## Bootstrapping your local development environment 73 | 74 | `script/bootstrap` 75 | 76 | ## Running tests 77 | 78 | `script/cibuild` 79 | 80 | ## Code of conduct 81 | 82 | This project is governed by [the Contributor Covenant Code of Conduct](CODE_OF_CONDUCT.md). By participating, you are expected to uphold this code. 83 | 84 | ## Additional Resources 85 | 86 | * [Contributing to Open Source on GitHub](https://guides.github.com/activities/contributing-to-open-source/) 87 | * [Using Pull Requests](https://help.github.com/articles/using-pull-requests/) 88 | * [GitHub Help](https://help.github.com) 89 | -------------------------------------------------------------------------------- /index.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | --- 4 | 5 | # The PhysioNet/Computing in Cardiology Challenges 6 | 7 | For the past 22 years, [PhysioNet](https://physionet.org) and [Computing in Cardiology](http://www.cinc.org/) have co-hosted a series of annual challenges to tackle clinically interesting questions that are either unsolved or not well-solved. 8 | 9 | The [PhysioNet/Computing in Cardiology Challenge 2021](/2021/) invites participants to identify clinical diagnoses from reduced-lead ECG recordings, extending last year's Challenge on twelve-lead ECGs.
10 | 11 | We ask participants to design and implement working, open-source algorithms that can, based only on the provided clinical data, automatically identify any cardiac abnormalities present in an ECG recording. The winners of the Challenge will be the teams whose algorithms achieve the highest score for hidden test sets of twelve-lead, six-lead, four-lead, three-lead, or two-lead ECG recordings. 12 | 13 | Please check the links below for information about [current](#current) and [past](#past) Challenges, including [important details](/faq/) about scoring and test data for previous Challenges. 14 | 15 | ## Current Challenge 16 | - __2021__: [Will Two Do? Varying Dimensions in Electrocardiography](/2021/) 17 | 18 | ## Past Challenges 19 | - __2020__: [Classification of 12-lead ECGs](/2020/) 20 | Papers and contributed software ongoing 21 | - __2019__: [Early Prediction of Sepsis from Clinical Data](https://physionet.org/content/challenge-2019/) 22 | Papers and contributed software ongoing 23 | - __2018__: [You Snooze, You Win](https://physionet.org/content/challenge-2018/) 24 | Papers and contributed software ongoing 25 | - __2017__: [AF Classification from a Short Single Lead ECG Recording](https://physionet.org/content/challenge-2017/) 26 | [57 papers](https://archive.physionet.org/challenge/2017/papers/) and [64 contributed software](https://archive.physionet.org/challenge/2017/sources/) 27 | - __2016__: [Classification of Normal/Abnormal Heart Sound Recordings](https://physionet.org/content/challenge-2016/) 28 | [11 papers](https://archive.physionet.org/challenge/2016/papers/) and [48 contributed software](https://archive.physionet.org/challenge/2016/sources/) 29 | - __2015__: [Reducing False Arrhythmia Alarms in the ICU](https://physionet.org/content/challenge-2015/) 30 | [20 papers](https://archive.physionet.org/challenge/2015/papers/) and [28 contributed software](https://archive.physionet.org/challenge/2015/sources/) 31 | - __2014__: [Robust Detection of
Heart Beats in Multimodal Data](https://physionet.org/content/challenge-2014/) 32 | [15 papers](https://archive.physionet.org/challenge/2014/papers/) and [35 contributed software](https://archive.physionet.org/challenge/2014/sources/) 33 | - __2013__: [Noninvasive Fetal ECG](https://physionet.org/content/challenge-2013/) 34 | [29 papers](https://archive.physionet.org/challenge/2013/papers/) and [17 contributed software](https://archive.physionet.org/challenge/2013/sources/) 35 | - __2012__: [Predicting Mortality of ICU Patients](https://physionet.org/content/challenge-2012/) 36 | [17 papers](https://archive.physionet.org/challenge/2012/papers/) and [58 contributed software](https://archive.physionet.org/challenge/2012/sources/) 37 | - __2011__: [Improving the Quality of ECGs Collected using Mobile Phones](https://physionet.org/content/challenge-2011/) 38 | [17 papers](https://archive.physionet.org/challenge/2011/papers/) and [7 contributed software](https://archive.physionet.org/challenge/2011/sources/) 39 | - __2010__: [Mind the Gap](https://physionet.org/content/challenge-2010/) 40 | [13 papers](https://archive.physionet.org/challenge/2010/papers/) and [5 contributed software](https://archive.physionet.org/challenge/2010/sources/) 41 | - __2009__: [Predicting Acute Hypotensive Episodes](https://physionet.org/content/challenge-2009/) 42 | [11 papers](https://archive.physionet.org/challenge/2009/papers/) and [4 contributed software](https://archive.physionet.org/challenge/2009/sources/) 43 | - __2008__: [Detecting and Quantifying T-Wave Alternans](https://physionet.org/content/challenge-2008/) 44 | [19 papers](https://archive.physionet.org/challenge/2008/papers/) and [5](https://archive.physionet.org/challenge/2008/sources/) + [1](https://archive.physionet.org/physiotools/TWAnalyser/) contributed software 45 | - __2007__: [Electrocardiographic Imaging of Myocardial Infarction](https://physionet.org/content/challenge-2007/) 46 | [6 
papers](https://archive.physionet.org/challenge/2007/papers/) 47 | - __2006__: [QT Interval Measurement](https://physionet.org/content/challenge-2006/) 48 | [20 papers](https://archive.physionet.org/challenge/2006/papers/) and [6 contributed software](https://archive.physionet.org/challenge/2006/sources/) 49 | - __2005__: [The First Five Challenges Revisited](https://physionet.org/content/challenge-2005/) 50 | [5 papers](https://archive.physionet.org/challenge/2005/papers/) 51 | - __2004__: [Spontaneous Termination of Atrial Fibrillation](https://physionet.org/content/challenge-2004/) 52 | [12 papers](https://archive.physionet.org/challenge/2004/papers/) and [1 contributed software](https://archive.physionet.org/challenge/2004/cantini-src/) 53 | - __2003__: [Distinguishing Ischemic from Non-Ischemic ST Changes](https://physionet.org/content/challenge-2003/) 54 | [3 papers](https://archive.physionet.org/challenge/2003/papers/) and [1 contributed software](https://archive.physionet.org/challenge/2003/code/) 55 | - __2002__: [RR Interval Time Series Modeling](https://physionet.org/content/challenge-2002/) 56 | [12 papers](https://archive.physionet.org/challenge/2002/papers/) and [10 contributed software](https://archive.physionet.org/challenge/2002/generators/) 57 | - __2001__: [Predicting Paroxysmal Atrial Fibrillation](https://physionet.org/content/challenge-2001/) 58 | [9 papers](https://archive.physionet.org/challenge/2001/papers/) 59 | - __2000__: [Detecting Sleep Apnea from the ECG](https://physionet.org/content/challenge-2000/) 60 | [13 papers](https://archive.physionet.org/challenge/2000/papers/) and [1 contributed software](https://archive.physionet.org/physiotools/apdet/) 61 | 62 | --- 63 | 64 | Supported by the [National Institute of Biomedical Imaging and Bioengineering (NIBIB)](https://www.nibib.nih.gov/) under NIH grant R01EB030362. 
65 | -------------------------------------------------------------------------------- /2020/faq.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: 2020 3 | --- 4 | 5 | # Frequently Asked Questions (FAQ) 6 | 7 | __Can I attend this year's conference remotely? Will I be eligible for prizes?__ 8 | 9 | Yes, due to the unique circumstances of this year, remote attendance is allowed for CinC 2020. You are still eligible for prizes if you attend remotely (as long as you satisfy the other criteria). 10 | 11 | __Why don't you merge PVC, VEB, and VPB labels?__ 12 | 13 | Each database is labelled using a different ontology, or subset of terms in an ontology (or sometimes no ontology, i.e., just a free-text description). We needed to decide how to map these ontologies to a consistent set of labels. For example, we have the following four labels for ventricular ectopic beats: 14 | 15 | | Description | SNOMED Code | Abbreviation | 16 | | -----------------------------------| ------------|--------------| 17 | | Premature ventricular complexes | 164884008 | PVC | 18 | | Premature ventricular contractions | 427172004 | PVC | 19 | | Ventricular ectopic beats | 17338001 | VEB | 20 | | Ventricular premature beat | 17338001 | VPB | 21 | 22 | We have chosen to retain the distinction between these in terms of SNOMED codes (but merged PVCs because we could really see no reason they had two separate codes), but the scored labels carry the same weight in the scoring matrix, so mixing them up doesn't cost you any points. You may then ask, 'why not merge them all in the labelling'? That's a question you have to answer for yourself! You are certainly welcome to do that - but you may not want to. You may note that only VPB indicates the temporal location of the beat relative to the preceding normal beat. This may, or may not, affect your algorithm, depending on how you write your code. 
You may or may not want it to affect your algorithm - the relative timing of beats certainly gives you information! 23 | 24 | In general, we have tried to provide you with as much useful information as possible, without overwhelming you with a complete data dump. 25 | 26 | __Are the scores currently on the leaderboard the final scores for the Challenge?__ 27 | 28 | No, the leaderboard contains scores on a subset of the test data during the unofficial and official phases of the Challenge. The final scores on the full test data are released after the conference for the "best" model selected by each team. 29 | 30 | __How will you choose which model to evaluate on the full test data? The latest entry? The best-scoring entry?__ 31 | 32 | You will be able to choose which model you would like to have scored on the full test set. We will ask teams to choose their “best” model shortly before the end of the official phase of the Challenge. If you do not choose a model, or if there is any ambiguity about your choice, then we will use the model with the highest score on the current subset of the test data. 33 | 34 | __Are we allowed to do transfer learning using pre-trained networks?__ 35 | 36 | Yes, most certainly. We encourage you to do this. You do not need to include your data in the code stack for training the algorithm, but you do need to include the pre-trained model in the code and provide code to retrain (continue training) on the training data we provide. 37 | You must also thoroughly document the content of the database you used to pre-train. 38 | 39 | __Do we need to provide our training code?__ 40 | 41 | Yes, this is a new and required (and exciting) part of this year's Challenge. 42 | 43 | __Can I provide my training code but request that you not use/run it?__ 44 | 45 | No, the training code is an important part of this year's Challenge. 
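The equal weighting described in the label-merging answer above can be sketched in a few lines. This is an illustrative sketch, not the official evaluation code: the SNOMED codes come from the table above, but the binary 0/1 weights are an assumption (the actual Challenge scoring matrix uses graded weights across many classes).

```python
# Illustrative sketch of equivalent labels carrying equal weight in a scoring
# matrix. Codes are from the table above; the 0/1 weights are an assumption
# for illustration (the real scoring matrix is graded, not binary).
PVC, VEB_VPB = "164884008", "17338001"     # merged PVC codes; shared VEB/VPB code
EQUIVALENT = {frozenset({PVC, VEB_VPB})}   # pairs of classes scored identically

def weight(true_label, predicted_label):
    """Reward for outputting `predicted_label` when `true_label` is correct."""
    if true_label == predicted_label:
        return 1.0
    if frozenset({true_label, predicted_label}) in EQUIVALENT:
        return 1.0  # mixing up equivalent labels costs no points
    return 0.0
```

Under this sketch, a classifier that outputs the VEB/VPB code where the label is PVC still receives full credit, which is exactly the "mixing them up doesn't cost you any points" behaviour described above.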
46 | 47 | __What computational resources do you provide for our code?__ 48 | 49 | We run your training code on Google Cloud using 8 vCPUs, 64 GB RAM, and an optional NVIDIA T4 Tensor Core GPU. The training code has a 72 hour time limit. 50 | 51 | We run your trained model on Google Cloud using 2 vCPUs, 13 GB RAM, and an optional NVIDIA T4 Tensor Core GPU. Your trained model has a 24 hour time limit on the test set. 52 | 53 | __For the license, can we make it open source but restrict to non-commercial use?__ 54 | 55 | Yes, the philosophy of the Challenge is to encourage researchers to make their code free to use for research. We hope that companies will approach you to license the code, too! If you do not specify any license, then we will assume that the license is the BSD 3-Clause License. 56 | 57 | __Are the provided records a more accurate representation of the hold-out evaluation data than what was previously provided?__ 58 | 59 | We are creating a large database of heterogeneous data with varying labels, some of which are wrong or incomplete. Leads can be inverted, noisy, or mislabeled. We have deliberately made no attempt to clean this up. The test data contains better labels, but it is not perfect either, and although it roughly corresponds to the training data, it includes some deliberate differences. 60 | 61 | __Should I submit your example code to test the submission system?__ 62 | 63 | No, please only submit your code to the submission system. 64 | 65 | __Should I submit an empty repository to test the submission system?__ 66 | 67 | No, please only submit an entry after you have finished and tested your code. 68 | 69 | __I left out a file, or I missed the deadline, or something else. Can I email you my code?__ 70 | 71 | No, please use the submission form to submit your entry through a repository. 72 | 73 | __Do I need to upload your training data? 
What about the code for evaluating my algorithm?__ 74 | 75 | No, we have our training and test data and our evaluation code. 76 | 77 | __Do you run the code that was in my repository at the time of submission?__ 78 | 79 | No, not yet. If you change your code after submitting, then we may or may not run the updated version of your code. If you want to update your code but do not want us to run the updates (yet), then please make changes in a subdirectory or in another branch of your repository. 80 | 81 | __Why is my entry unsuccessful on your submission system? It works on my computer.__ 82 | 83 | If you used Python, then make sure that it runs in Docker. 84 | 85 | __My entry had some kind of error. Did I lose one of my total entries?__ 86 | 87 | No, only scored entries (submitted entries that receive a score) count against the total number of allowed entries. 88 | 89 | --- 90 | 91 | Supported by the [National Institute of Biomedical Imaging and Bioengineering (NIBIB)](https://www.nibib.nih.gov/) under NIH grant R01EB030362. 92 | 93 | [Back](../index.html) 94 | -------------------------------------------------------------------------------- /faq/index.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: default 3 | --- 4 | 5 | # Frequently Asked Questions (FAQ) - General 6 | 7 | This page provides general FAQs for the Challenges. Please see the [current Challenge FAQs](../2021/faq/) for more specific information about the current Challenge. 8 | 9 | [Challenge History](#history) 10 | 11 | - [What is the history of the Challenges?](#challenge-history) 12 | 13 | [Data](#data) 14 | - [I want to evaluate my code on the test data. Can you provide either the test data or labels for the current Challenge or a previous Challenge?](#test-data) 15 | - [Are we allowed to use external public or private data? 
Can we use transfer learning with pre-trained networks?](#external-data) 16 | 17 | [Scoring](#score) 18 | 19 | - [I missed the Challenge, but I still want to run my code on the test data. If you aren't providing test data or labels, then can you run my code for me?](#run-code) 20 | - [Can you score my algorithm for one of the previous Challenges?](#previous-scoring) 21 | 22 | [Contribution](#contribution) 23 | 24 | - [I would like to suggest/help organize/contribute software or data to a Challenge - how can I do this?](#software-data) 25 | 26 | 27 | ## Challenge History 28 | 29 | __What is the history of the Challenges?__ 30 | 31 | For information about the history of the Challenges, please see [here](https://physionetchallenges.org/about/). 32 | 33 | ## Data 34 | 35 | __I want to evaluate my code on the test data. Can you provide either the test data or labels for the current Challenge or a previous Challenge?__ 36 | 37 | No, we prohibit access to both the test data and test labels to prevent significant information leakage from the out-of-sample test data to the training process. This is true for both current and past Challenges. Having access to the test labels provides the researcher too much of an opportunity to look at the test data and perform multiple re-tests. These re-tests constitute an outer training loop that leads to overfitting and an overly optimistic value for the performance metric. Access to the test data (even without labels) provides the opportunity to employ techniques to extract information about the test data that are not representative of the 'future' use of an algorithm. These include extracting population statistics (mean, distributions), unsupervised clustering, and even hand-labelling of the data. 38 | 39 | Reducing information leakage from test to training is also the reason why you will only be allowed one shot at the full test data (or perhaps two, if you publish in the follow-up journal focus issue). 
We understand that providing teams up to 15 entries on a subset of the test data causes a small leak, but whenever possible we keep one database separate until the final run. One final reason for not sharing the test data and labels publicly is that we sometimes use data from previous Challenges in future Challenges. 40 | 41 | Moreover, we try to source diverse datasets for the Challenges, and we often use datasets in the Challenges that we are unable to release as test data. 42 | 43 | __Are we allowed to use external public or private data? Can we use transfer learning with pre-trained networks?__ 44 | 45 | Yes, most certainly. We encourage you to do this. You do not need to include your data in the code stack for training the algorithm, but you do need to include the pre-trained model in the code and provide code to retrain (continue training) on the training data we provide. The pre-trained network must have a license compatible with the rest of your code. You must also thoroughly document the content of the database you used to pre-train your network. If you are able to provide access to the data, or it is already public, please include links in both your README and the article documenting your entry. If you would like to contribute data to the Challenge for others to use (or as test data), please contact us directly. We'd be delighted to add you to the team/authorship of the resulting articles if the data adds value. 46 | 47 | ## Scoring 48 | 49 | __I missed the Challenge, but I still want to run my code on the test data. If you aren't providing test data or labels, then can you run my code for me?__ 50 | 51 | Yes - under certain conditions. First, check with us that we are able to resource your request. We are really busy and have more work than we are funded to do. If you are able to donate resources to fund an engineer's time, your request will be prioritized. 
Second, you must provide a 90% complete draft of the article you are writing to describe the method, pointing out where it differs from other known approaches, particularly in the Challenge. We prioritize novelty. We won't judge your training and validation statistics too heavily, since we are more interested in adding to the discussion around methods, rather than adding a few percent to the top score. We also prioritize open source approaches, although if you do wish to keep your method secret, we may be able to offer a sponsorship plan. Finally, you need to package the code exactly as in the Challenges, and ensure it works in the containerized environment provided (not just on your personal computer or an arbitrarily configured cluster). Please note that bugs in the code will increase the likelihood that you do not receive a score, as we cannot invest large amounts of time into providing a single group with support. 52 | 53 | __Can you score my algorithm for one of the previous Challenges?__ 54 | 55 | Yes, we are happy to support ongoing research with past Challenges subject to available resources (the Challenges are largely run by volunteers) and whether you are able to do the following: 56 | 57 | 1. You must share your code in a GitHub or Gitlab repository. 58 | 2. You must include your code, including your training code and forward model. 59 | 3. You must include an open-source LICENSE file, an AUTHORS file, and a README file that describes the results on the training set. 60 | 4. You must include a detailed draft article describing your method, including the results on the training set (matching the results in your README), with the target journal for your submission. 61 | 5. Your article must describe how your technique differs from all other methods from the Challenge and the subsequent focus issue, especially any previous methods that you may have developed. 62 | 6. 
You must include a statement that no one from your team will attempt to submit another entry. Each team receives at most one follow-up shot at the data. 63 | 7. We must be able to run your code. For recent Challenges, we expect you to follow the submission instructions and format your submission in the same way. 64 | 65 | If you agree to the above conditions, then please contact us at challenge@physionet.org to submit your entry. Even so, we cannot guarantee that we will be able to run your code. 66 | 67 | ## Contribution 68 | 69 | __I would like to suggest/help organize/contribute software or data to a Challenge - how can I do this?__ 70 | 71 | If you are interested in contributing to, or posing a Challenge, please feel free to contact us with details of the databases you can provide, the nature of the problem you wish to solve, and some demo code which makes a basic attempt to solve the problem. We strongly recommend having at least three independent databases, two to become public, and one to remain private/hidden. For more information on the general aims and framework of the Challenge, and the criteria for a successful event, please see [here](https://arxiv.org/abs/2007.10502). 72 | 73 | __For the current Challenge FAQs, please visit [here](../2021/faq/).__ 74 | 75 | --- 76 | 77 | Supported by the [National Institute of Biomedical Imaging and Bioengineering (NIBIB)](https://www.nibib.nih.gov/) under NIH grant R01EB030362. 78 | 79 | [Back](../index.html) 80 | -------------------------------------------------------------------------------- /2021/faq/index.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: 2021 3 | --- 4 | 5 | # Frequently Asked Questions (FAQ) for the 2021 Challenge 6 | 7 | This page provides specific FAQs for the 2021 Challenge. Please read the [general Challenge FAQs](../../faq/) for more general questions about the Challenges. 8 | 9 | [Rules](#rules) 10 | - [I missed the abstract deadline. 
Can I still participate in the Challenge?](#missed-abstract-deadline) 11 | - [Can I attend this year's conference remotely? Will I be eligible for prizes?](#remotely) 12 | - [Can I collaborate with another team from the 2020 Challenge?](#collaborate) 13 | - [For the license, can we make it open source but restrict to non-commercial use?](#license) 14 | 15 | [Labels and scores](#scoring) 16 | 17 | - [Why don't you merge PVC, VEB, and VPB labels?](#merge) 18 | - [Are the scores currently on the leaderboard the final scores for the Challenge?](#leaderboard) 19 | 20 | [Data](#data) 21 | 22 | - [Are the provided records a more accurate representation of the held-out evaluation data than what was previously provided?](#hold-out) 23 | - [Do I need to upload your training data? What about the code for evaluating my algorithm?](#upload-data) 24 | 25 | [Training and testing](#train-test) 26 | 27 | - [How will you choose which model to evaluate on the full test data? The latest entry? The best-scoring entry?](#choose-model) 28 | - [Are we allowed to do transfer learning using pre-trained networks?](#pre-trained) 29 | - [Do we need to provide our training code?](#training-code) 30 | - [Can I provide my training code but request that you not use/run it?](#not-train) 31 | - [What computational resources do you provide for our code?](#computational) 32 | - [Should I submit your example code to test the submission system?](#test-submission) 33 | - [Should I submit an empty repository to test the submission system?](#empty) 34 | - [I left out a file, or I missed the deadline, or something else. Can I email you my code?](#email-code) 35 | - [Do you run the code that was in my repository at the time of submission?](#repository) 36 | - [Why is my entry unsuccessful on your submission system? It works on my computer.](#unsuccessful-entry) 37 | - [My entry had some kind of error. 
Did I lose one of my total entries?](#error-lose-entry) 38 | 39 | 40 | ## Rules 41 | 42 | __I missed the abstract deadline. Can I still participate in the Challenge?__ 43 | 44 | Yes, you can still participate. An accepted CinC abstract is required for prizes, rankings, and the opportunity to present your work at CinC, but you can still submit algorithms to the official phase without an accepted abstract. 45 | 46 | A 'wildcard' entry is reserved for a high-scoring team who submitted an abstract that was not accepted or who was unable to submit an abstract by the deadline. Please read [here](../#wild-card) for more details, including the deadline. 47 | 48 | __Can I attend this year's conference remotely? Will I be eligible for prizes?__ 49 | 50 | Due to the unique circumstances of 2020 and 2021, remote attendance is allowed for both CinC 2020 and 2021. Participants are still eligible for prizes if they attend remotely (as long as they satisfy the other criteria). 51 | 52 | __Can I collaborate with another team from the 2020 Challenge?__ 53 | 54 | Yes, as long as no one from either team competes on a different team. 55 | 56 | __For the license, can we make it open source but restrict to non-commercial use?__ 57 | 58 | Yes, the philosophy of the Challenge is to encourage researchers to make their code free to use for research. We hope that companies will approach you to license the code, too! If you do not specify any license, then we will assume that the license is the BSD 3-Clause License. 59 | 60 | ## Labels and scores 61 | 62 | __Why don't you merge PVC, VEB, and VPB labels?__ 63 | 64 | Each database is labelled using a different ontology, or subset of terms in an ontology (or sometimes no ontology, i.e., just a free-text description). We needed to decide how to map these ontologies to a consistent set of labels. 
For example, we have the following four labels for ventricular ectopic beats: 65 | 66 | | Description | SNOMED Code | Abbreviation | 67 | | -----------------------------------| ------------|--------------| 68 | | Premature ventricular complexes | 164884008 | PVC | 69 | | Premature ventricular contractions | 427172004 | PVC | 70 | | Ventricular ectopic beats | 17338001 | VEB | 71 | | Ventricular premature beat | 17338001 | VPB | 72 | 73 | We have chosen to retain the distinction between these in terms of SNOMED codes (but merged PVCs because we could really see no reason they had two separate codes), but the scored labels carry the same weight in the scoring matrix, so mixing them up doesn't cost you any points. You may then ask, 'why not merge them all in the labelling'? That's a question you have to answer for yourself! You are certainly welcome to do that - but you may not want to. You may note that only VPB indicates the temporal location of the beat relative to the preceding normal beat. This may, or may not, affect your algorithm, depending on how you write your code. You may or may not want it to affect your algorithm - the relative timing of beats certainly gives you information! 74 | 75 | In general, we have tried to provide you with as much useful information as possible, without overwhelming you with a complete data dump. 76 | 77 | __Are the scores currently on the leaderboard the final scores for the Challenge?__ 78 | 79 | No, the leaderboard contains scores on a subset of the hidden data during the unofficial and official phases of the Challenge. The final scores on the full test data are released after the conference for the "best" model selected by each team. 80 | 81 | ## Data 82 | 83 | __Are the provided records a more accurate representation of the held-out evaluation data than what was previously provided?__ 84 | 85 | We are creating a large database of heterogeneous data with varying labels, some of which are wrong or incomplete. 
Leads can be inverted, noisy, and/or mislabeled. We have deliberately made no attempt to clean this up. The test data contains better labels, but it is not perfect either, and although it roughly corresponds to the training data, it includes some deliberate differences. 86 | 87 | __Do I need to upload your training data? What about the code for evaluating my algorithm?__ 88 | 89 | No, we have the training, validation, and test data along with the evaluation code. 90 | 91 | ## Training and Testing 92 | 93 | __How will you choose which model to evaluate on the test set? The latest entry? The best-scoring entry?__ 94 | 95 | You will be able to choose which model you would like to have scored on the full test set. We will ask teams to choose their “best” model shortly before the end of the official phase of the Challenge. If you do not choose a model, or if there is any ambiguity about your choice, then we will use the model with the highest score on the current subset of the test data. 96 | 97 | __Are we allowed to do transfer learning using pre-trained networks?__ 98 | 99 | Yes, most certainly. We encourage you to do this. You do not need to include your data in the code stack for training the algorithm, but you do need to include the pre-trained model in the code and provide code to retrain (continue training) on the training data we provide. You must also thoroughly document the content of the database you used to pre-train. 100 | 101 | __Do we need to provide our training code?__ 102 | 103 | Yes, this is a required (and exciting) part of this year's Challenge. 104 | 105 | __Can I provide my training code but request that you not use/run it?__ 106 | 107 | No, the training code is an important part of this year's Challenge. 
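The transfer-learning requirement above — bundle the pre-trained model with your code, and provide code that continues training on the provided data while documenting the pre-training data — might look like the following sketch. All names here (`train_model`, the JSON model format, the field names) are hypothetical and are not the official Challenge API; only the overall shape is illustrated.

```python
# Hypothetical sketch (not the official Challenge API): a training entry point
# that starts from a pre-trained model bundled in the repository, continues
# training on the provided Challenge data, and records pre-training provenance.
import json
import os

# Pre-trained weights shipped inside the repository (illustrative values).
PRETRAINED = {
    "weights": [0.1, 0.2, 0.3],
    "pretrained_on": "a documented, compatibly licensed public ECG database",
}

def train_model(data_directory, model_directory):
    """Continue training the bundled model on the provided training data."""
    model = dict(PRETRAINED)                      # start from the bundled model
    records = sorted(os.listdir(data_directory))  # provided training records
    # ... fine-tuning on `records` omitted; only the I/O shape is sketched ...
    model["finetuned_records"] = len(records)
    os.makedirs(model_directory, exist_ok=True)
    with open(os.path.join(model_directory, "model.json"), "w") as f:
        json.dump(model, f)  # provenance is saved alongside the weights
```

The point of the sketch is that the saved model documents both the bundled pre-trained starting point and the retraining on the provided data, so the organizers can rerun the full pipeline.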
108 | 109 | __What computational resources do you provide for our code?__ 110 | 111 | We run your training code on Google Cloud using 10 vCPUs, 65 GB RAM, and an optional [NVIDIA T4 Tensor Core GPU](https://www.nvidia.com/en-us/data-center/tesla-t4/) with 16 GB VRAM. Your training code has a 48 hour time limit using the GPU and a 72 hour time limit without using a GPU. 112 | 113 | We run your trained model on Google Cloud using 6 vCPUs, 39 GB RAM, and an optional [NVIDIA T4 Tensor Core GPU](https://www.nvidia.com/en-us/data-center/tesla-t4/) with 16 GB VRAM. Your trained models have a 24 hour time limit on each of the validation and test sets. 114 | 115 | Each submission is also allocated 100 GB of disk space, and for submissions requiring a GPU, one instance of the [NVIDIA-TESLA-T4](https://www.nvidia.com/en-us/data-center/tesla-t4/) with NVIDIA driver version [418.40.04](https://www.nvidia.com/en-us/data-center/tesla-t4/) is attached and configured. 116 | We are using an [N1 custom machine type](https://cloud.google.com/compute/docs/machine-types#n1_high-memory_machine_types). If you would like to use a predefined machine type, then the n1-highmem-8 is the closest but with slightly fewer vCPUs and slightly less RAM. 117 | 118 | __Should I submit your example code to test the submission system?__ 119 | 120 | No, please only submit your code to the submission system. 121 | 122 | __Should I submit an empty repository to test the submission system?__ 123 | 124 | No, please only submit an entry after you have finished and tested your code. 125 | 126 | __I left out a file, or I missed the deadline, or something else. Can I email you my code?__ 127 | 128 | No, please use the submission form to submit your entry through a repository. 129 | 130 | __Do you run the code that was in my repository at the time of submission?__ 131 | 132 | No, not yet. If you change your code after submitting, then we may or may not run the updated version of your code. 
If you want to update your code but do not want us to run the updates (yet), then please make changes in a subdirectory or in another branch of your repository. 133 | 134 | __Why is my entry unsuccessful on your submission system? It works on my computer.__ 135 | 136 | If you used Python for your entry, then test it in Docker. 137 | 138 | __My entry had some kind of error. Did I lose one of my total entries?__ 139 | 140 | No, only scored entries (submitted entries that receive a score) count against the total number of allowed entries. 141 | 142 | __For more general Challenge FAQs, please visit [here](../../faq/).__ 143 | 144 | --- 145 | 146 | Supported by the [National Institute of Biomedical Imaging and Bioengineering (NIBIB)](https://www.nibib.nih.gov/) under NIH grant R01EB030362. 147 | 148 | [Back](../) 149 | -------------------------------------------------------------------------------- /2020/papers/index.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: 2020 3 | --- 4 | 5 | # Classification of 12-lead ECGs: the PhysioNet/Computing in Cardiology Challenge 2020 6 | 7 | ## Citations 8 | 9 | For this year's Challenge, please cite: 10 | 11 | [Perez Alday EA, Gu A, Shah AJ, Robichaux C, Wong AI, Liu C, Liu F, Rad AB, Elola A, Seyedi S, Li Q, Sharma A, Clifford GD*, Reyna MA*. Classification of 12-lead ECGs: the PhysioNet/Computing in Cardiology Challenge 2020. Physiol Meas. 2020 Nov 11. doi: 10.1088/1361-6579/abc960](2020ChallengePaper.pdf) 12 | 13 | ## Focus Issue Papers 14 | 15 | The [Physiological Measurement Focus Collection: Classification of Multilead ECGs](https://iopscience.iop.org/journal/0967-3334/page/Focus%20issues) includes the above [paper](https://iopscience.iop.org/article/10.1088/1361-6579/abc960). 
16 | 17 | ## Conference Papers 18 | 19 | The conference papers for [Computing in Cardiology 2020](https://www.cinc2020.org/) are available on the [CinC](https://www.cinc.org/cinc-papers-on-line/) and [IEEE](https://ieeexplore.ieee.org/xpl/conhome/1000157/all-proceedings) websites. 20 | 21 | The scores for the teams are available [here](https://github.com/physionetchallenges/evaluation-2020/tree/master/Results), including scores for [official entries](https://github.com/physionetchallenges/evaluation-2020/blob/master/Results/physionet_2020_official_scores.csv) that were scored on the validation and test data, scores for [unofficial entries](https://github.com/physionetchallenges/evaluation-2020/blob/master/Results/physionet_2020_unofficial_scores.csv) that were scored on the validation and test data but did not satisfy one or more of the Challenge rules or were unsuccessful on one or more of the test databases, and [additional](https://github.com/physionetchallenges/evaluation-2020/blob/master/Results/physionet_2020_full_metrics_official_entries.csv) [metrics](https://github.com/physionetchallenges/evaluation-2020/blob/master/Results/physionet_2020_validation_metrics_by_class_official_entries.csv) for the official entries. 
22 | 23 | | Abstract | Team Name | Title | Author(s) | 24 | | --- | --- | --- | --- | 25 | | [7](7.pdf) | SpaceOn Flattop | Multi-label Classification of Electrocardiogram With Modified Residual Networks | Shan Yang, Heng Xiang, Qingda Kong and Chunli Wang | 26 | | [32](32.pdf) | ISIBrno | Utilization of Residual CNN-GRU with Attention Mechanism for Classification of 12-lead ECG | Petr Nejedly, Adam Ivora, Ivo Viscor, Josef Halamek, Pavel Jurak and Filip Plesinger | 27 | | [35](35.pdf) | PALab | Arrhythmia Detection and Classification of 12-lead ECGs Using a Deep Neural Network | wenxiao jia, Xiao Xu, Xian Xu, Yuyao Sun and Xiaoshuang Liu | 28 | | [39](39.pdf) | nebula | Automatic 12-lead ECG Classification Using Deep Neural Networks | Wenjie Cai, Shuaicong Hu, Jingying Yang and Jianjian Cao | 29 | | [44](44.pdf) | DeepHeart | Automatic Classification of Arrhythmias by Residual Network and BiGRU With Attention Mechanism | Runnan He, Kuanquan Wang, Na Zhao, Qiang Sun, Yacong Li, Qince Li and Henggui Zhang | 30 | | [61](61.pdf) | Alba_W.O. 
| Selected Features for Classification of 12-lead ECGs | Marek Żyliński and Gerard Cybulski | 31 | | [63](63.pdf) | NN-MIH | Electrocardiogram Classification by Modified EfficientNet with Data Augmentation | Naoki Nonaka and Jun Seita | 32 | | [71](71.pdf) | DSC | Multi-Class Classification of Pathologies Found on Short ECG Signals | Georgi Nalbantov, Svetoslav Ivanov and Jeffrey van Prehn | 33 | | [72](72.pdf) | NTU-Accesslab | Explainable Deep Neural Network for Identifying Cardiac Abnormalities Using Class Activation Map | Yu-Cheng Lin, Yun-Chieh Lee, Wen-Chiao Tsai, Win-Ken Beh and An-Yeu Wu | 34 | | [74](74.pdf) | Marquette | A Deep Neural Network and Reconstructed Phase Space Approach to Classifying 12-lead ECGs | David Kaftan and Richard Povinelli | 35 | | [85](85.pdf) | CQUPT_ECG | SE-ECGNet: Multi-scale SE-Net for Multi-lead ECG Data | Jiabo Chen, Tianlong Chen, Bin Xiao, Xiuli Bi, Yongchao Wang, Weisheng Li, Han Duan, Junhui Zhang and Xu Ma | 36 | | [95](95.pdf) | ECGLearner | Deep Multi-Label Multi-Instance Classification on 12-Lead ECG | Yingjing Feng and Edward Vigmond | 37 | | [107](107.pdf) | prna | A Wide and Deep Transformer Neural Network for 12-Lead ECG Classification | Annamalai Natarajan, Yale Chang, Sara Mariani, Asif Rahman, Gregory Boverman, Shruti Vij and Jonathan Rubin | 38 | | [112](112.pdf) | Between a ROC and a heart place | Adaptive lead weighted ResNet trained with different duration signals for classifying 12-lead ECGs | Zhibin Zhao, Hui Fang, Samuel Relton, Ruqiang Yan, Yuhong Liu, Zhijing Li, Jing Qin and David Wong | 39 | | [116](116.pdf) | Gio_Ivo | Rule-Based methods and Deep Learning Networks for Automatic Classification of ECG | giovanni bortolan, Ivaylo Christov and Iana Simova | 40 | | [124](124.pdf) | BioS | 12-lead ECG Arrythmia Classification Using Convolutional Neural Network for Mutually Non-Exclusive Classes | Mateusz Soliński, Michał Łepek, Antonina Pater, Katarzyna Muter, Przemysław Wiszniewski, Dorota Kokosińska, 
Judyta Salamon and Zuzanna Puzio | 41 | | [127](127.pdf) | Care4MyHeart | Identification of Cardiac Arrhythmias from 12-lead ECG using Beat-wise Analysis and a Combination of CNN and LSTM | Mohanad Alkhodari, Leontios J. Hadjileontiadis and Ahsan H. Khandoker | 42 | | [128](128.pdf) | CVC | Multilabel 12-Lead Electrocardiogram Classification Using Gradient Boosting Tree Ensemble | Alexander William Wong, Weijie Sun, Sunil Vasu Kalmady, Padma Kaul and Abram Hindle | 43 | | [130](130.pdf) | Code Team | Automatic 12-lead ECG Classification Using a Convolutional Network Ensemble | Antonio H. Ribeiro, Daniel Gedon, Daniel Martins Teixeira, Manoel Horta Ribeiro, Antonio Luiz Ribeiro, Thomas B. Schön and Wagner Meira Jr | 44 | | [133](133.pdf) | Triage | Combining Scatter Transform and Deep Neural Networks for Multilabel ECG Signal Classification | Maximilian Oppelt, Maximilian Riehl, Felix Kemeth and Jan Steffan | 45 | | [134](134.pdf) | JuJuRock | Multi-label Arrhythmia Classification From 12-Lead Electrocardiograms | Po-Ya Hsu, Po-Han Hsu, Tsung-Han Lee and Hsin-Li Liu | 46 | | [135](135.pdf) | Leicester-Fox | Diagnostic of Multiple Cardiac Disorders from 12-lead ECGs Using Graph Convolutional Network Based Multi-label Classification | Zheheng Jiang, Tiago Paggi de Almeida, Fernando Schlindwein, G. 
André Ng, Huiyu Zhou and Xin Li | 47 | | [138](138.pdf) | Eagles | Automated Classification of Electrocardiograms Using Wavelet Analysis and Deep Learning | Andrew Demonbreun and Grace Mirsky | 48 | | [139](139.pdf) | EASTBLUE | Multi-label Classification of Abnormalities in 12-Lead ECG Using Deep Learning | Ao Ran, Dongsheng Ruan, Yuan Zheng and Huafeng Liu | 49 | | [144](144.pdf) | CardioLux - Unicauca | ECG Arrhythmia Classification using Non-Linear Features and Convolutional Neural Networks | Sebastian Cajas, Pedro Astaiza, David Santiago Garcia Chicangana, Camilo Segura and Diego Lopez | 50 | | [148](148.pdf) | easyG | Multi-Stream Deep Neural Network for 12-Lead ECG Classification | Martin Baumgartner, Dieter Hayn, Andreas Ziegl, Alphons Eggerth and Günter Schreier | 51 | | [161](161.pdf) | ECU | Impact of Neural Architecture Design on Cardiac Abnormality Classification Using 12-lead ECG Signals | Najmeh Fayyazifar, Selam Ahderom, David Suter, Andrew Maiorana and Girish dwivedi | 52 | | [162](162.pdf) | Triology | A Real-time ECG Classification Scheme Using Anti-aliased Blocks with Low Sampling Rate | Yunkai Yu, Zhihong Yang, Zhicheng Yang, Peiyao Li and Yuyang You | 53 | | [171](171.pdf) | HITTING | Cardiac Pathologies Detection and Classification in 12-lead ECG | Radovan Smisek, Andrea Nemcova, Lucie Marsanova, Lukas Smital, Martin Vítek and Jiri Kozumplik | 54 | | [185](185.pdf) | Madhardmax | Interpretable XGBoost Based Classification of 12-lead ECGs Applying Information Theory Measures From Neuroscience | Hardik Rajpal, Madalina Sas, Rebecca Joakim, Chris Lockwood, Nicholas S. 
Peters and Max Falkenberg | 55 | | [189](189.pdf) | BUTTeam | ECG Abnormalities Recognition Using Convolutional Network With Global Skip Connections and Custom Loss Function | Tomas Vicar, Jakub Hejc, Petra Novotna, Marina Ronzhina and Oto Janousek | 56 | | [196](196.pdf) | MetaHeart | A Novel Convolutional Neural Network for Arrhythmia Detection From 12-lead Electrocardiograms | Zhengling He, Pengfei Zhang, Lirui Xu, Zhongrui Bai, Hao Zhang, Weisong Li, Pan Xia and Xianxiang Chen | 57 | | [198](198.pdf) | Pink Irish Hat | ECG Classification With a Convolutional Recurrent Neural Network | Halla Sigurthorsdottir, Jérôme Van Zaen, Ricard Delgado-Gonzalo and Mathieu Lemay | 58 | | [202](202.pdf) | Technion_AIMLAB | Classification of 12-lead ECGs using digital biomarkers and representation learning | David Assaraf, Jeremy Levy, Janmajay Singh, Armand Chocron and Joachim A. Behar | 59 | | [217](217.pdf) | Whitaker's Lab | Detecting Cardiac Abnormalities from 12-lead ECG Signals Using Feature Extraction, Dimensionality Reduction, and Machine Learning Classification | Garrett Perkins, J. 
Chase McGlinn, Muhammad Rizwan and Bradley Whitaker | 60 | | [225](225.pdf) | Cardio-Challengers | A Bio-toolkit for Multi-Cardiac Abnormality Diagnosis Using ECG Signal and Deep Learning | Akash Kirodiwal, Apoorva Srivastava, Ashutosh Dash, Ayantika Saha, Gopi Vamsi Penaganti, Sawon Pratiher, sazedul alam, Amit Patra, Nirmalya Ghosh and Nilanjan Banerjee | 61 | | [227](227.pdf) | Team UIO | Convolutional Neural Network and Rule-Based Algorithms for Classifying 12-lead ECGs | Bjørn-Jostein Singstad and Christian Tronstad | 62 | | [229](229.pdf) | UC_Lab_Kn | Cardiac Abnormality Detection in 12-lead ECGs with Deep Convolutional Neural Networks Using Data Augmentation | Lucas Weber, Maksym Gaiduk, Wilhelm Daniel Scherz and Ralf Seepold | 63 | | [236](236.pdf) | PhysioNet Challenge Organizers | Classification of 12-lead ECGs: the PhysioNet/Computing in Cardiology Challenge 2020 | Matthew Reyna, Erick Andres Perez Alday, Annie Gu, Chengyu Liu, Salman Seyedi, Ali Bahrami Rad, Andoni Elola, Qiao Li, Ashish Sharma and Gari Clifford | 64 | | [253](253.pdf) | UMCUVA | Automated Comprehensive Interpretation of 12-lead Electrocardiograms Using Pre-trained Exponentially Dilated Causal Convolutional Neural Networks | Max Bos, Rutger van de Leur, Jeroen Vranken, Deepak Gupta, Pim van der Harst, Pieter Doevendans and René van Es | 65 | | [277](277.pdf) | AI Strollers | Classification of 12 Lead ECG Signal Using 1D-CNN With Class Dependent Threshold | Rohit Pardasani and Navchetan Awasthi | 66 | | [281](281.pdf) | HeartBeats | Classification of Cardiac Abnormalities From ECG Signals Using SE-ResNet | Zhaowei Zhu, Han Wang, Tingting Zhao, Yangming Guo, Zhuoyang Xu, Zhuo Liu, Siqi Liu, Xiang Lan, Xingzhi Sun and Mengling Feng | 67 | | [282](282.pdf) | Minibus | MADNN: A Multi-scale Attention Deep Neural Network for Arrythmia Classification | Ran Duan, Xiaodong He and Ouyang Zhuoran | 68 | | [285](285.pdf) | ECGMaster | Multi-Label Classification of 12-lead ECGs by Using Residual CNN 
and Class-Wise Attention | Yang Liu, Kuanquan Wang, Yongfeng Yuan, Qince Li, Yacong Li, Yongpeng Xu and Henggui Zhang | 69 | | [297](297.pdf) | Cordi-Ak | A Topology Informed Random Forest Classifier for ECG Classification | Paul Samuel Ignacio, Jay-Anne Bulauan and John Rick Manzanares | 70 | | [305](305.pdf) | ELBIT | A Deep Learning Solution for Automatized Interpretation of 12-Lead ECGs | Alvaro Huerta Herraiz, Arturo Martinez-Rodrigo, José J Rieta and Raul Alcaraz | 71 | | [307](307.pdf) | SBU_AI | Classification of 12-lead ECGs Using Intra-Heartbeat Discrete-time Fourier Transform and Inter-Heartbeat Attention | Ibrahim Hammoud, IV Ramakrishnan and Petar Djuric | 72 | | [328](328.pdf) | DSAIL_SNU | Bag of Tricks for Electrocardiogram Classification with Deep Neural Networks | Seonwoo Min, Hyun-Soo Choi, Hyeongrok Han, Minji Seo, Jin-Kook Kim, Junsang Park, Sunghoon Jung, Il-Young Oh, Byunghan Lee and Sungroh Yoon | 73 | | [339](339.pdf) | MIndS | Detection of Cardiac Arrhythmias From Varied Length Multichannel Electrocardiogram Recordings Using Deep Convolutional Neural Networks | Marwen Sallem, Adnen Saadaoui, Amina Ghrissi and Vicente Zarzoso | 74 | | [349](349.pdf) | CardiUniBo | On the Application of Convolutional Neural Networks for 12-lead ECG Multi-label Classification using Datasets from Multiple Centers | Davide Borra, Alice Andalò, Stefano Severi and Cristiana Corsi | 75 | | [353](353.pdf) | LaussenLabs | Rhythm classification of 12-lead ECGs using deep neural network and class-activation maps for improved explainability | Sebastian Goodfellow, Dmitrii Shubin, Danny Eytan, Andrew Goodwin, Anusha Jega, Azadeh Assadi, Mjaye Mazwi, Robert Greer, Sujay Nagaraj, Peter Laussen, William Dixon and Carson McLean | 76 | | [356](356.pdf) | heartly-ai | ECG Segmentation using a Neural Network as the Basis for Detection of Cardiac Pathologies | Philipp Sodmann and Marcus Vollmer | 77 | | [363](363.pdf) | Desafinado | Classification of 12-lead ECGs Using Gradient 
Boosting on Features Acquired With Domain-Specific and Domain-Agnostic Methods | Durmus Umutcan Uguz, Felix Berief, Steffen Leonhardt and Christoph Hoog Antink | 78 | | [374](374.pdf) | MCIRCC | Classification of 12-Lead Electrocardiograms Using Residual Neural Networks and Transfer Learning | Sardar Ansari, Christopher Gillies, Brandon Cummings, Jonathan Motyka, Guan Wang, Kevin Ward and Hamid Ghanbari | 79 | | [406](406.pdf) | BiSP Lab | Classification of 12-lead ECG with an Ensemble Machine Learning Approach | Matteo Bodini, Massimo W Rivolta and Roberto Sassi | 80 | | [417](417.pdf) | AUTh Team | Convolutional Recurrent Neural Network and LightGBM Ensemble Model for 12-lead ECG Classification | Charilaos Zisou, Andreas Sochopoulos and Konstantinos Kitsios | 81 | | [424](424.pdf) | deepzx987 | Automatic Concurrent Arrhythmia Classification Using Deep Residual Neural Networks | Deepankar Nankani, Pallabi Saikia and Rashmi Dutta Baruah | 82 | | [435](435.pdf) | Germinating | ECG Morphological Decomposition for Automatic Rhythm Identification | Guadalupe García Isla, Rita Laureanti, Valentina Corino and Luca Mainardi | 83 | | [445](445.pdf) | Sharif AI Team | Classification of 12-lead ECG Signals with Adversarial Multi-Source Domain Generalization | Hosein Hasani, Adeleh Bitarafan and Mahdieh Soleymani | 84 | | [456](456.pdf) | UIDT-UNAM | Cardiac Arrhythmias Identification by Parallel CNNs and ECG Time-Frequency Representation | Jonathan Roberto Torres Castillo, K De Los Rios and Miguel Ángel Padilla Castañeda | 85 | | [462](462.pdf) | BitScattered | Arrhythmia classification of 12-lead Electrocardiograms by Hybrid Scattering-LSTM networks | Philip Warrick, Masun Nabhan Homsi, Vincent Lostanlen, Michael Eikenberg and Joakim Andén | 86 | 87 | Supported by the [National Institute of Biomedical Imaging and Bioengineering (NIBIB)](https://www.nibib.nih.gov/) under NIH grant R01EB030362. 
88 | 89 | [Back](../) 90 | -------------------------------------------------------------------------------- /2020/submissions.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: 2020 3 | --- 4 | 5 | # Submission Instructions and Form 6 | 7 | ## PhysioNet/CinC Challenge 2020: Cloud Submission Instructions 8 | 9 | ### Table of contents 10 | 11 | - [Introduction](#introduction) 12 | - [Preparation and submission instructions](#preparation) 13 | - [MATLAB-specific instructions](#matlab) 14 | - [Python-specific instructions](#python) 15 | - [R-specific instructions](#r) 16 | - [Julia-specific instructions](#julia) 17 | - [Docker-specific FAQs](#docker) 18 | - [FAQ](#faq) 19 | - [Submission form](#submission-form) 20 | 21 | ### Introduction 22 | For the first time in a public competition, teams must submit both the code for their models and the code for training their models. To help, we have shared simple baseline models in Python and MATLAB, and we encourage teams to use our Python and MATLAB code as templates for their entries. To add the code for training your model to your entry, please edit the train_12ECG_classifier script, and to add the code for running your model to your entry, please edit the run_12ECG_classifier script. Please see the following sections for more detailed, language-specific instructions. 23 | 24 | ### Preparation and submission instructions 25 | 1. Create a private GitHub or GitLab repository for your code. We recommend cloning our example code and replacing it with your code. Add `physionetchallengeshelper` as a collaborator to your repository. 26 | 2. Add your classification code to your repository. Like the example code, your code must be in the root directory of the master branch. 27 | 3. Do not include anything that is not needed to create and run your classification code. 28 | 4. Follow the instructions for the programming language of your submission. 29 | 5.
Use Google Forms to submit your entry. We will clone your repository using the HTTPS URL that ends in `.git`. On GitHub, you can get this URL by clicking on “Clone or download” and copying and pasting the URL, e.g., `https://github.com/physionetchallenges/python-classifier-2020.git`. Please see [here](https://help.github.com/en/articles/which-remote-url-should-i-use) for an example. 30 | 6. We will put the scores for successful entries on the leaderboard. The leaderboard will publicly show your team name, run time, and score. 31 | 32 | ### MATLAB-specific instructions 33 | 34 | 1. Confirm that your MATLAB code compiles and runs in MATLAB R2019b or R2020a. 35 | 2. Using our sample MATLAB classification code ([link](https://github.com/physionetchallenges/matlab-classifier-2020)) as a template, format your code in the following way. Consider downloading this repository, replacing our code with your code, and adding the updated files to your repository. 36 | 3. AUTHORS.txt, LICENSE.txt, README.md: Update as needed. Unfortunately, our submission system will be unable to read your README. 37 | 4. train_12ECG_classifier.m: Update this script to create and save your model. It also performs all file input and model output. It takes the header with all the data and demographics information and a matrix of 12-lead ECG data (columns are ECG leads and rows are time windows), and outputs your model (weights and any needed parameters). You can edit this script as much as you need. 38 | 5. train_model.m: Do not edit this script. It calls your train_12ECG_classifier.m script. We will not use the train_model.m script from your repository, so any change made to this code will not be included. 39 | 6. load_12ECG_model.m: Update this script to load your model weights and any parameters from files in your submission. It takes no input (place any filenames, etc. in the body of the function itself) and returns any output that you choose.
You must implement this function in the load_12ECG_model.m script. 40 | 7. run_12ECG_classifier.m: Update this script to run your model. It takes the header with all the data and demographics information, a matrix of 12-lead ECG data (columns are ECG leads and rows are time windows), and the output from load_12ECG_model as input and returns a probability or confidence score and a binary classification for each class as output. You must implement this function in the run_12ECG_classifier.m script. 41 | 8. driver.m: Do not change this script. It calls your load_12ECG_model function once, and it calls your run_12ECG_classifier function for each 12-lead ECG recording. It also performs all file input and output. We will not use the driver.m script from your repository, so any change made to this code will not be included. 42 | 9. Add your code to the root/base directory of the master branch of your repository. 43 | 10. We will download your code, compile it using the MATLAB compiler: training (`mcc -m train_model.m -a .`) and running (`mcc -m driver.m -a .`) your classifier, and run them on Google Cloud. 44 | 11. Here is a sample repository that you can use as a template: [MATLAB classifier](https://github.com/physionetchallenges/matlab-classifier-2020). 45 | 46 | ### Python-specific instructions 47 | 1. Using our sample Python classification code ([link](https://github.com/physionetchallenges/python-classifier-2020)) as a template, format your code in the following way. Consider downloading this repository, replacing our code with your code, and adding the updated files to your repository. 48 | 2. Dockerfile: Update to specify the version of Python that you are using on your machine. Add any additional packages that you need. Do not change the name or location of this file. The structure of this file is important, especially the 3 lines that are marked as "DO NOT EDIT". 49 | 3. requirements.txt: Add Python packages to be installed with pip.
Specify the versions of these packages that you are using on your machine. Remove unnecessary packages, such as Matplotlib, that your classification code does not need. 50 | 4. AUTHORS.txt, LICENSE.txt, README.md: Update as needed. Unfortunately, our submission system will be unable to read your README. 51 | 5. run_12ECG_classifier.py: Update this script to load and run your model using the following functions. 52 | - load_12ECG_model: Update this function to load your model weights and any parameters from files in your submission. It takes no input (place any filenames, etc. in the body of the function itself) and returns any output that you choose. You must implement this function in the run_12ECG_classifier.py script. 53 | - run_12ECG_classifier: Update this function to run your model. It takes the header with all the data and demographics information, a matrix of 12-lead ECG data (columns are ECG leads and rows are time windows), and the output from load_12ECG_model as input and returns a risk score and a binary classification for each class as output. You must implement this function in the run_12ECG_classifier.py script. 54 | 6. driver.py: Do not change this script. It calls your load_12ECG_model function only once and your run_12ECG_classifier function for each 12-lead ECG recording. It also performs all file input and output. We will not use the driver.py script from your repository, so any change made to this code will not be included. 55 | 7. Add your code to the root/base directory of the master branch of your repository. 56 | 8. We will download your code, build a Docker container from your Dockerfile, and run it on Google Cloud. 57 | 9. Here is a sample repository that you can use as a template: [Python classifier](https://github.com/physionetchallenges/python-classifier-2020). 58 | 59 | ### Docker-specific FAQs 60 | 61 | __Why containers?__ 62 | 63 | Containers allow you to define the environment that you think is best suited for your algorithm.
For example, if you think your algorithm needs a specific version of CentOS, a certain version of a library, and specific frameworks, then you can use containers to specify this. Here are two links with good, data science-centric introductions to Docker: 64 | [https://towardsdatascience.com/how-docker-can-help-you-become-a-more-effective-data-scientist-7fc048ef91d5](https://towardsdatascience.com/how-docker-can-help-you-become-a-more-effective-data-scientist-7fc048ef91d5) 65 | [https://link.medium.com/G87RxYuQIV](https://link.medium.com/G87RxYuQIV) 66 | 67 | __Quickly, how can I test my submission locally?__ 68 | 69 | Install Docker. Clone your repository. Build an image. Run it on a single recording. 70 | 71 | __Less quickly, how can I test my submission locally? Please give me commands that I can copy and paste.__ 72 | 73 | Here are instructions for testing the [Python example code](https://github.com/physionetchallenges/python-classifier-2020) in Linux. You can test the non-Python example code, or test on a Mac, in a similar way. If you have trouble testing your code, then make sure that you can test the example code, which is known to work. 74 | First, create a folder, `docker_test`, in your home directory. Then, put the example code from GitHub in `docker_test/python-classifier-2020-master`, some of the training data in `docker_test/input_directory` and `docker_test/input_training_directory`, an empty folder for the output of the training code in `docker_test/output_training_directory`, and an empty folder for the classifications in `docker_test/output_directory`. Finally, build a Docker image and run the example code using the following steps: 75 | 76 | ``` 77 | 78 | user@computer:~/docker_test$ ls 79 | input_directory output_directory python-classifier-2020-master 80 | 81 | user@computer:~/docker_test$ ls input_directory/ 82 | A0001.hea A0001.mat A0002.hea A0002.mat A0003.hea ...
83 | 84 | user@computer:~/docker_test$ cd python-classifier-2020-master/ 85 | 86 | user@computer:~/docker_test/python-classifier-2020-master$ docker build -t image . 87 | 88 | Sending build context to Docker daemon 30.21kB 89 | [...] 90 | Successfully tagged image:latest 91 | 92 | user@computer:~/docker_test/python-classifier-2020-master$ docker run -it -v 93 | ~/docker_test/input_training_directory:/physionet/input_training_directory -v 94 | ~/docker_test/output_training_directory:/physionet/output_training_directory -v 95 | ~/docker_test/input_directory:/physionet/input_directory -v ~/docker_test/output_directory:/physionet/output_directory image bash 96 | 97 | root@[...]:/physionet# ls 98 | AUTHORS.txt Dockerfile LICENSE.txt README.md driver.py run_12ECG_classifier.py get_12ECG_features.py input_directory output_directory requirements.txt 99 | 100 | root@[...]:/physionet# python train_model.py input_training_directory/ output_training_directory/ 101 | 102 | root@[...]:/physionet# python driver.py output_training_directory/ input_directory/ output_directory/ 103 | 104 | root@[...]:/physionet# exit 105 | exit 106 | 107 | user@computer:~/docker_test/python-classifier-2020-master$ cd .. 108 | 109 | user@computer:~/docker_test$ ls output_directory/ 110 | A0001.csv A0002.csv A0003.csv A0004.csv A0005.csv 111 | ``` 112 | 113 | __How do I install Docker?__ 114 | 115 | Go to [https://docs.docker.com/install/](https://docs.docker.com/install/) and install the Docker Community Edition. For troubleshooting, see [https://docs.docker.com/config/daemon/](https://docs.docker.com/config/daemon/). 116 | 117 | __Do I have to use your Dockerfile?__ 118 | 119 | No. The only part of the Dockerfile that we care about is the three lines marked as "DO NOT EDIT". These three lines help ensure that, during the build process of the container, your code is copied into a folder called `physionet` so that our cloud-based pipelines can find your code and run it. Please do not change those three lines.
You are free to change your base image, and at times you should (see next question). 120 | 121 | __What’s the base image in Docker?__ 122 | 123 | Think of Docker as a series of images, or snapshots of a virtual machine, that are layered on top of each other. For example, our image may be built on top of a very lightweight Ubuntu operating system with Python 3.7.3 that we get from the official Docker Hub (think of it as a GitHub for Docker). We can then install our requirements (NumPy and SciPy) on it. If you need the latest version of TensorFlow, then search for it on [hub.docker.com](https://hub.docker.com/) and edit your file so that the first line of your Dockerfile now reads: `FROM tensorflow/tensorflow`. For a specific version, say 1.11, look up the [tags](https://hub.docker.com/r/tensorflow/tensorflow/tags) and change it accordingly to `FROM tensorflow/tensorflow:1.11.0`. We recommend using specific versions for reproducibility. 124 | 125 | __sklearn or scikit-learn?__ 126 | 127 | The single most common error that we noticed in the requirements.txt file for Python submissions was the `sklearn` package. If your entry uses scikit-learn, then you need to install it via pip using the package name `scikit-learn` instead of `sklearn` in your requirements.txt file: [See here](https://scikit-learn.org/stable/install.html). 128 | 129 | __xgboost?__ 130 | 131 | For Python, replace `python:3.7.3-slim` with `python:3.7.3-stretch` in the first line of your Dockerfile. This image includes additional packages, such as GCC, that xgboost needs. Additionally, include xgboost in your requirements.txt file, and specify the version of xgboost that you are using. 132 | For R, add `RUN R -e 'install.packages("xgboost")'` to your Dockerfile. 133 | 134 | __Pandas?__ 135 | 136 | Replace `python:3.7.3-slim` with `python:3.7.3-stretch` in the first line of your Dockerfile.
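Putting the base-image and packaging advice above together, here is a minimal sketch of how a customized Python Dockerfile might look. This is an illustration only, not the official template: base your real Dockerfile on the template in the example repository and keep its marked "DO NOT EDIT" lines intact, and replace the image and package versions shown here with the versions that you actually use.

```dockerfile
# Sketch only -- base your real Dockerfile on the official template.
# Pin a specific base image for reproducibility; the -stretch variant
# includes GCC and other build tools that packages like xgboost need.
FROM python:3.7.3-stretch

# The official template's "DO NOT EDIT" lines copy your code into
# /physionet so that the cloud pipeline can find it; keep those lines
# rather than these illustrative ones.
WORKDIR /physionet
COPY . /physionet

# Install your pinned dependencies.
RUN pip install -r requirements.txt
```

Pair this with a requirements.txt that pins exact versions (for example, `scikit-learn==0.22.1` rather than an unpinned or misnamed `sklearn` entry) so that the versions we install match the versions that you trained with.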
137 | 138 | __Why can’t I install a common Python or R package using Python or R’s package manager?__ 139 | 140 | Some packages have dependencies, such as GCC, that need to be installed. Try replacing `python:3.7.3-slim` with `python:3.7.3-stretch`, which includes more packages by default, or installing the dependencies. 141 | 142 | If the first line of your Dockerfile is `FROM python:3.7.3-slim`, then you are building a Docker image with the Debian Linux distribution, so you can install GCC and other related libraries that many Python and R packages use by adding the line `RUN apt-get update && apt-get install -y build-essential` to your Dockerfile before installing these packages. 143 | 144 | __How do I build my image?__ 145 | 146 | ``` 147 | git clone <> 148 | cd <> 149 | ls 150 | ``` 151 | 152 | You should see a Dockerfile and other relevant files here. 153 | 154 | ``` 155 | docker build -t <> . 156 | docker images 157 | docker run -it <> bash 158 | ``` 159 | 160 | This will take you into your container, and you should see your code. 161 | 162 | __What can I do to make sure that my submission is successful?__ 163 | 164 | You can avoid most submission errors with the following steps: 165 | - Do not change the driver script. We will only use the driver scripts (driver.m, driver.py, driver.R, and driver.jl) in the MATLAB, Python, R, and Julia example repositories ([https://github.com/physionetchallenges](https://github.com/physionetchallenges)), so any changes that you make will not be used. 166 | - Do build your Docker image. The above FAQ provides advice for common Docker-related issues. 167 | - Do test your Docker code on at least one file from the training dataset. 168 | - Do try to reduce the run time of your code by moving repeated tasks from the run_12ECG_classifier function to the load_12ECG_model function. Most submissions run in a couple of hours on the test data. 169 | 170 | __Why is my entry unsuccessful on your submission system?
It works on my computer.__ 171 | 172 | There are several common reasons for unexpected errors: 173 | - You may have changed the driver script. For consistency across submissions from different participants, we will use the driver scripts available on [https://github.com/physionetchallenges/](https://github.com/physionetchallenges/). 174 | - You may have unmet dependencies. Note that packages in the requirements.txt file for Python submissions may have dependencies, such as GCC, that pip is unable to install. You can often identify such issues by trying to build a Docker image from your Dockerfile. 175 | - You may have used a specific version of a Python, R, or Julia package on your computer, but you didn’t specify the version of the package in your Dockerfile or your requirements.txt file, so we installed the latest available version of the package. These versions may be incompatible. For example, if you train your model using one version of a machine learning package and we test it with another version of the package, then your entry may fail. 176 | 177 | ## Submission Form 178 | 179 | The submission form can be found here: 180 | [https://forms.gle/PWu87SqN8frh6aKS7](https://forms.gle/PWu87SqN8frh6aKS7) 181 | 182 | --- 183 | 184 | Supported by the [National Institute of Biomedical Imaging and Bioengineering (NIBIB)](https://www.nibib.nih.gov/) under NIH grant R01EB030362.
185 | 186 | [Back](../index.html) 187 | -------------------------------------------------------------------------------- /2020/eqn_jaccard_index.svg: -------------------------------------------------------------------------------- [SVG image of the Jaccard index equation; markup not extracted.] -------------------------------------------------------------------------------- /2020/eqn_sum_g_measure.svg: -------------------------------------------------------------------------------- [SVG image of the summed G-measure equation; markup not extracted.] -------------------------------------------------------------------------------- /2020/eqn_sum_f_measure.svg: -------------------------------------------------------------------------------- [SVG image of the summed F-measure equation; markup not extracted.]
-------------------------------------------------------------------------------- /2021/submissions/index.md: -------------------------------------------------------------------------------- 1 | --- 2 | layout: 2021 3 | --- 4 | 5 | # Submission Instructions 6 | 7 | ## PhysioNet/CinC Challenge 2021: Cloud Submission Instructions 8 | 9 | ### Table of contents 10 | 11 | - [Introduction](#introduction) 12 | - [Preparation and submission instructions](#preparation) 13 | - [MATLAB-specific instructions](#matlab) 14 | - [Python-specific instructions](#python) 15 | - [Docker-specific FAQs](#docker) 16 | - [FAQ](#faq) 17 | 18 | ### Introduction 19 | 20 | As with [last year's Challenge](../../2020/), teams must submit both the code for their models and the code for training their models. To help, we have implemented example entries in both MATLAB and Python, and we encourage teams to use these example entries as templates for their entries. 21 | 22 | ### Preparation and submission instructions 23 | 24 | 1. Create a private GitHub or GitLab repository for your code. We recommend cloning our example code and replacing it with your code. Add `physionetchallengeshelper` as a collaborator to your repository. 25 | 2. Add your classification code to your repository. Like the example code, your code must be in the root directory of the master branch. 26 | 3.
Do not include extra files that are not required to create and run your classification code, such as the training data. 27 | 4. Follow the instructions for the programming language of your submission. 28 | 5. Submit your entry through [this form](https://docs.google.com/forms/d/e/1FAIpQLSeB5sNtVeLwK48Ba_Qh43eFWmt2tTdIh7Q3ICq_J3G46hSE2g/viewform?usp=sf_link). We will clone your repository using the HTTPS URL that ends in `.git`. On GitHub, you can get this URL by clicking on “Clone or download” and copying and pasting the URL, e.g., `https://github.com/physionetchallenges/python-classifier-2021.git`. Please see [here](https://help.github.com/en/articles/which-remote-url-should-i-use) for an example. 29 | 6. We will put the scores for successful entries on the leaderboard. The leaderboard will publicly show your team name, run time, and score. 30 | 31 | ### MATLAB-specific instructions 32 | 33 | 1. Confirm that your MATLAB code compiles and runs in MATLAB R2020b or R2021a (when available). 34 | 2. Using our sample MATLAB classification code ([link](https://github.com/physionetchallenges/matlab-classifier-2021.git)) as a template, format your code in the following way. Consider downloading this repository, replacing our code with your code, and adding the updated files to your repository. 35 | 3. `AUTHORS.txt`, `LICENSE.txt`, `README.md`: Update as appropriate. Please include your authors. Unfortunately, our submission system will be unable to read your README file to change how we run your code. 36 | 4. `train_model.m`: Do not edit this script. It calls your `team_training_code.m` script. We will not use the `train_model.m` script from your repository, so any change made to this code will not be included. 37 | 5. `team_training_code.m`: Update this script to create and save your model.
It loads the header with the data and demographic information for a recording, extracts features from the data using the `get_features.m` function, which you can update and edit, and outputs and saves your model (weights and any needed parameters). You can edit this script and the `get_features.m` function as much as you need.
6. `test_model.m`: Do not change this script. It loads your models by calling the `load_ECG_*leads_model` functions (`*` = 2, 3, 6, or 12 for the four lead sets: 2-lead, 3-lead, 6-lead, and 12-lead models). It then calls your `team_testing_code` function for each recording and performs all file input and output. We will not use the `test_model.m` script from your repository, so any changes made to it will not be included.
7. `team_testing_code.m`: Update this script to load and run your model weights and any parameters from files in your submission. It takes the input test data, header files, and the loaded models (outputs of your `train_model.m`) and returns a probability or confidence score and a binary classification for each class as output.
8. `get_features.m`: Update this script to extract your choice of features from the ECG recordings.
9. `get_leads.m`: Do not edit this script. It extracts four different lead sets (2-lead, 3-lead, 6-lead, and 12-lead) from the ECG recordings.
10. `extract_data_from_header.m`: Do not edit this script. It extracts the data information from the header files.
11. Add your code to the root/base directory of the master branch of your repository.
12. We will download your code, compile it with the MATLAB compiler (`mcc -m train_model.m -a .` for training and `mcc -m test_model.m -a .` for running your classifier), and run it on Google Cloud.
13. Here is a sample repository that you can use as a template: [MATLAB classifier](https://github.com/physionetchallenges/matlab-classifier-2021).

### Python-specific instructions

1.
Using our sample Python classification code ([link](https://github.com/physionetchallenges/python-classifier-2021)) as a template, format your code as follows. Consider downloading this repository, replacing our code with your code, and adding the updated files to your repository.
2. `Dockerfile`: Update it to specify the version of Python that you are using on your machine, and add any additional packages that you need. Do not change the name or location of this file. The structure of this file is important, especially the three lines marked "DO NOT EDIT".
3. `requirements.txt`: Add the Python packages to be installed with `pip`, and specify the versions of these packages that you are using on your machine. Remove unnecessary packages, such as Matplotlib, that your classification code does not need.
4. `AUTHORS.txt`, `LICENSE.txt`, `README.md`: Update as appropriate. Please include your authors. Note that our submission system is unable to read your README file to change how we run your code.
5. `team_code.py`: Update this script to load and run your trained model.
6. `train_model.py`: Do not change this script. It calls functions from the `team_code` script to run your training code on the training data.
7. `helper_code.py`: Do not change this script. It provides helper variables and functions used by our code. You are welcome to use them in your code.
8. `test_model.py`: Do not change this script. It calls your trained models to run on the test data. We will not use the `test_model.py` script from your repository, so any changes made to it will not be included.
9. Add your code to the root/base directory of the master branch of your repository.
10. We will download your code, build a Docker image from your Dockerfile, and run it on Google Cloud.
11.
Here is a sample repository that you can use as a template: [Python classifier](https://github.com/physionetchallenges/python-classifier-2021).

### Docker-specific FAQs

__Why containers?__

Containers allow you to define the environment that you think is best suited for your algorithm. For example, if your algorithm needs a specific version of a Linux distribution or a certain version of a library or framework, then you can use a container to specify that environment. Here are two data science-centric introductions to Docker:
[https://towardsdatascience.com/how-docker-can-help-you-become-a-more-effective-data-scientist-7fc048ef91d5](https://towardsdatascience.com/how-docker-can-help-you-become-a-more-effective-data-scientist-7fc048ef91d5)
[https://link.medium.com/G87RxYuQIV](https://link.medium.com/G87RxYuQIV)

__Quickly, how can I test my submission locally?__

Install Docker. Clone your repository. Build an image. Run it on a single recording.

__Less quickly, how can I test my submission locally? Please give me commands that I can copy and paste.__

To guarantee that we can run your code, please [install](https://docs.docker.com/get-docker/) Docker, build a Docker image from your code, and run it on the training data. To quickly check your code for bugs, you may want to run it on a subset of the training data.

If you have trouble running your code, then please try the following steps to run the example code, which is known to work.

1. Create a folder `example` in your home directory with several subfolders.

        user@computer:~$ cd ~/
        user@computer:~$ mkdir example
        user@computer:~$ cd example
        user@computer:~/example$ mkdir training_data test_data model test_outputs

2. Download the training data from the [Challenge website](https://physionetchallenges.org/2021/#data-access).
Put some of the training data in `training_data` and `test_data`. You can use some of the training data to check your code (and should perform cross-validation on the training data to evaluate your algorithm).

3. Download or clone this repository in your terminal.

        user@computer:~/example$ git clone https://github.com/physionetchallenges/python-classifier-2021.git

4. Build a Docker image and run the example code in your terminal.

        user@computer:~/example$ ls
        model  python-classifier-2021  test_data  test_outputs  training_data

        user@computer:~/example$ ls training_data/
        A0001.hea  A0001.mat  A0002.hea  A0002.mat  A0003.hea  ...

        user@computer:~/example$ cd python-classifier-2021/

        user@computer:~/example/python-classifier-2021$ docker build -t image .

        Sending build context to Docker daemon  30.21kB
        [...]
        Successfully tagged image:latest

        user@computer:~/example/python-classifier-2021$ docker run -it -v ~/example/model:/physionet/model -v ~/example/test_data:/physionet/test_data -v ~/example/test_outputs:/physionet/test_outputs -v ~/example/training_data:/physionet/training_data image bash

        root@[...]:/physionet# ls
        Dockerfile             model             test_data      train_model.py
        extract_leads_wfdb.py  README.md         test_model.py
        helper_code.py         requirements.txt  test_outputs
        LICENSE                team_code.py      training_data

        root@[...]:/physionet# python train_model.py training_data model

        root@[...]:/physionet# python test_model.py model test_data test_outputs

        root@[...]:/physionet# exit
        Exit

        user@computer:~/example/python-classifier-2021$ cd ..

        user@computer:~/example$ ls test_outputs/
        A0006.csv  A0007.csv  A0008.csv  A0009.csv  A0010.csv  ...
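If you want to carve out a small subset of the training data for a quick smoke test before a full run, a sketch like the following may help. It assumes the paired `A####.hea`/`A####.mat` naming shown in the listings above, and `make_subset` is a hypothetical helper, not part of the example code; adjust the paths to your own layout.

```python
# Sketch: copy the first few recordings into a smoke-test folder.
# Assumes paired *.hea / *.mat files, as in the training data listing above.
import shutil
from pathlib import Path

def make_subset(src="training_data", dst="smoke_test_data", n=5):
    src, dst = Path(src), Path(dst)
    dst.mkdir(exist_ok=True)
    headers = sorted(src.glob("*.hea"))[:n]
    for hea in headers:
        shutil.copy(hea, dst / hea.name)
        mat = hea.with_suffix(".mat")
        if mat.exists():  # copy the matching signal file if present
            shutil.copy(mat, dst / mat.name)
    return [h.stem for h in headers]
```

You can then point `train_model.py` (or the Docker volume mounts above) at the smoke-test folder instead of the full training set.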
### FAQ

__What computational resources will my entry have?__

We will run your training code on Google Cloud using 10 vCPUs, 65 GB RAM, 100 GB disk space, and an optional [NVIDIA T4 Tensor Core GPU](https://www.nvidia.com/en-us/data-center/tesla-t4/) with 16 GB VRAM. Your training code has a 72-hour time limit without a GPU and a 48-hour time limit with a GPU.

We will run your trained model on Google Cloud using 6 vCPUs, 39 GB RAM, 100 GB disk space, and an optional [NVIDIA T4 Tensor Core GPU](https://www.nvidia.com/en-us/data-center/tesla-t4/) with 16 GB VRAM. Your trained model has a 24-hour time limit on each of the validation and test sets.

We are using an [N1 custom machine type](https://cloud.google.com/compute/docs/machine-types#n1_high-memory_machine_types) to run submissions on [GCP](https://cloud.google.com/compute/). If you would like to use a predefined machine type, then `n1-highmem-8` is the closest predefined machine type, but with 2 fewer vCPUs and 13 GB less RAM. For GPU submissions, we use the [418.40.04 driver version](https://docs.nvidia.com/datacenter/tesla/tesla-release-notes-418-4004/index.html).

__How do I install Docker?__

Go to [https://docs.docker.com/install/](https://docs.docker.com/install/) and install the Docker Community Edition. For troubleshooting, see [https://docs.docker.com/config/daemon/](https://docs.docker.com/config/daemon/).

__Do I have to use your Dockerfile?__

No. The only part of the Dockerfile that we care about is the three lines marked "DO NOT EDIT". These three lines help ensure that, during the build process of the container, your code is copied into a folder called `physionet` so that our cloud-based pipelines can find your code and run it. Please do not change those three lines. You are free to change your base image, and at times you should (see the next question).
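To make this concrete, a minimal Dockerfile in the spirit of the Python example might look like the sketch below. The lines here are paraphrased, not authoritative; copy the exact "DO NOT EDIT" lines from the example repository rather than from this sketch.

```dockerfile
FROM python:3.8.6-slim

## DO NOT EDIT: these lines copy your code into /physionet,
## where our cloud pipelines expect to find it.
RUN mkdir /physionet
COPY ./ /physionet
WORKDIR /physionet

## Install your dependencies here, e.g., with apt-get or pip.
RUN pip install -r requirements.txt
```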
__What's the base image in Docker?__

Think of a Docker image as a series of snapshots of a virtual machine that are layered on top of each other. For example, your image may be built on top of a very lightweight Ubuntu operating system with Python 3.8.6 that we get from the official Docker Hub (think of it as a GitHub for Docker). We can then install our requirements (NumPy and SciPy) on it. If you need the latest version of TensorFlow, then search for it on [hub.docker.com](https://hub.docker.com/) and edit your Dockerfile so that the first line reads `FROM tensorflow`. For a specific version, say 1.11, look up the [tags](https://hub.docker.com/r/tensorflow/tensorflow/tags) and change it accordingly to `FROM tensorflow:1.11.0`. We recommend using specific versions for reproducibility.

__sklearn or scikit-learn?__

For Python, if your entry uses scikit-learn, then you need to install it via `pip` using the package name `scikit-learn` instead of `sklearn` in your `requirements.txt` file: [see here](https://scikit-learn.org/stable/install.html).

__xgboost?__

For Python, try `python:3.8.9-buster` in the first line of your Dockerfile. This image includes additional packages, such as GCC, that xgboost needs. Additionally, include xgboost in your `requirements.txt` file, and specify the version of xgboost that you are using.

For R, add `RUN R -e 'install.packages("xgboost")'` to your Dockerfile.

__Pandas?__

For Python, try `python:3.8.9-buster` in the first line of your Dockerfile if you experience errors.

__GPUs?__

We provide an optional [NVIDIA T4 Tensor Core GPU](https://www.nvidia.com/en-us/data-center/tesla-t4/) with 16 GB VRAM. We use the NVIDIA `418.40.04` driver for the GPU. The latest supported version of CUDA is 10.1, and the latest supported version of PyTorch is therefore 1.7.1.
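Several of the answers above come down to pinning exact package versions in `requirements.txt`. As a quick local sanity check, you can compare your `name==version` pins against what is actually installed on your machine. This is a standard-library sketch, and `check_pins` is a hypothetical helper, not part of the example code; it only handles exact `==` pins and skips other specifiers.

```python
# Sketch: report requirements.txt pins that do not match the installed versions.
from importlib import metadata

def check_pins(requirements_text):
    mismatches = []
    for line in requirements_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "==" not in line:
            continue  # skip comments, blanks, and non-exact specifiers
        name, wanted = line.split("==", 1)
        try:
            installed = metadata.version(name)
        except metadata.PackageNotFoundError:
            mismatches.append((name, wanted, None))  # pinned but not installed
            continue
        if installed != wanted:
            mismatches.append((name, wanted, installed))
    return mismatches

print(check_pins("definitely-not-installed==1.0"))
# -> [('definitely-not-installed', '1.0', None)]
```

An empty result means every exact pin in the file matches your local environment, which makes a version mismatch on our submission system much less likely.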
__Why can't I install a common Python or R package using Python or R's package manager?__

Some packages have dependencies, such as GCC, that need to be installed first. Try `python:3.8.9-buster`, which includes more packages by default, or install the dependencies yourself. If the first line of your Dockerfile is `FROM python:3.8.6-slim`, then you are building a Docker image with the Debian Linux distribution, so you can install GCC and other related libraries that many Python and R packages use by adding the line `RUN apt-get update && apt-get install -y build-essential` to your Dockerfile before installing these packages.

__How do I build my image?__

```
git clone <>
cd <>
ls
```

You should see a Dockerfile and other relevant files here.

```
docker build -t <> .
docker images
docker run -it <> bash
```

This will take you into your container, where you should see your code.

Please see the [Docker-specific FAQs](#docker) for more information.

__What can I do to make sure that my submission is successful?__

You can avoid most submission errors with the following steps:
- Do not change the `train_model` or `test_model` scripts. We will only use the versions of these scripts in the MATLAB and Python example repositories ([https://github.com/physionetchallenges](https://github.com/physionetchallenges)), so any changes that you make will not be used.
- Do build your Docker image. The above FAQs provide advice for common Docker-related issues.
- Do test your Docker code on at least one file from the training dataset.
- Do try to reduce the run time of your code by moving code for repeated tasks from the `run_model` function to the `load_model` function. Most submissions can run in a couple of hours on the test data.

__Why is my entry unsuccessful on your submission system?
It works on my computer.__

There are several common reasons for unexpected errors:
- You may have changed a driver script. For consistency across submissions from different participants, we will use the driver scripts available at [https://github.com/physionetchallenges/](https://github.com/physionetchallenges/).
- You may have unmet dependencies. Note that the packages in the `requirements.txt` file for Python submissions may have dependencies, such as GCC, that `pip` is unable to install. You can often identify such issues by trying to build a Docker image from your Dockerfile.
- You may have used a specific version of a Python package on your computer but not specified that version in your Dockerfile or your `requirements.txt` file, so we installed the latest available version of the package. These versions may be incompatible. For example, if you train your model using one version of a machine learning package and we test it with another version, then your entry may fail.

## How do I learn more?

Please see the [PhysioNet/CinC Challenge 2021 webpage](https://physionetchallenges.org/2021/) for more details. Please post questions and concerns on the [Challenge discussion forum](https://groups.google.com/forum/#!forum/physionet-challenges).

## Useful links

* [Submission Form](https://docs.google.com/forms/d/e/1FAIpQLSeB5sNtVeLwK48Ba_Qh43eFWmt2tTdIh7Q3ICq_J3G46hSE2g/viewform?usp=sf_link)
* [The PhysioNet/CinC Challenge 2021 webpage](https://physionetchallenges.org/2021/)
* [Frequently Asked Questions (FAQ)](https://physionetchallenges.org/faq/)

---

Supported by the [National Institute of Biomedical Imaging and Bioengineering (NIBIB)](https://www.nibib.nih.gov/) under NIH grant R01EB030362.

[Back](../)
--------------------------------------------------------------------------------