├── .flaskenv
├── .github
│   └── FUNDING.yml
├── .gitignore
├── LICENSE.md
├── README.md
├── data
│   ├── collections
│   │   ├── postman.json
│   │   └── swagger.json
│   └── country_name_to_iso.json
├── example.env
├── migrations
│   ├── README
│   ├── alembic.ini
│   ├── env.py
│   ├── script.py.mako
│   └── versions
│       └── 8fc93c85d7ff_initial_records_table.py
├── requirements.txt
├── scripts
│   ├── init_db.py
│   └── start_covid.sh
└── src
    ├── __init__.py
    ├── app.py
    ├── config.py
    ├── models.py
    ├── templates
    │   └── index.html
    └── wsgi.py

/.flaskenv:
--------------------------------------------------------------------------------
1 | FLASK_APP=app.py
2 | 
--------------------------------------------------------------------------------
/.github/FUNDING.yml:
--------------------------------------------------------------------------------
1 | # These are supported funding model platforms
2 | 
3 | patreon: # Replace with a single Patreon username
4 | ko_fi: backtrackbaba
5 | custom: ['https://www.buymeacoffee.com/I5ndObQ']
--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
1 | # Byte-compiled / optimized / DLL files
2 | __pycache__/
3 | *.py[cod]
4 | *$py.class
5 | 
6 | # C extensions
7 | *.so
8 | 
9 | # Distribution / packaging
10 | .Python
11 | build/
12 | develop-eggs/
13 | dist/
14 | downloads/
15 | eggs/
16 | .eggs/
17 | lib/
18 | lib64/
19 | parts/
20 | sdist/
21 | var/
22 | wheels/
23 | pip-wheel-metadata/
24 | share/python-wheels/
25 | *.egg-info/
26 | .installed.cfg
27 | *.egg
28 | MANIFEST
29 | 
30 | # PyInstaller
31 | # Usually these files are written by a python script from a template
32 | # before PyInstaller builds the exe, so as to inject date/other infos into it.
33 | *.manifest
34 | *.spec
35 | 
36 | # Installer logs
37 | pip-log.txt
38 | pip-delete-this-directory.txt
39 | 
40 | # Unit test / coverage reports
41 | htmlcov/
42 | .tox/
43 | .nox/
44 | .coverage
45 | .coverage.*
46 | .cache
47 | nosetests.xml
48 | coverage.xml
49 | *.cover
50 | *.py,cover
51 | .hypothesis/
52 | .pytest_cache/
53 | 
54 | # Translations
55 | *.mo
56 | *.pot
57 | 
58 | # Django stuff:
59 | *.log
60 | local_settings.py
61 | db.sqlite3
62 | db.sqlite3-journal
63 | 
64 | # Flask stuff:
65 | instance/
66 | .webassets-cache
67 | 
68 | # Scrapy stuff:
69 | .scrapy
70 | 
71 | # Sphinx documentation
72 | docs/_build/
73 | 
74 | # PyBuilder
75 | target/
76 | 
77 | # Jupyter Notebook
78 | .ipynb_checkpoints
79 | 
80 | # IPython
81 | profile_default/
82 | ipython_config.py
83 | 
84 | # pyenv
85 | .python-version
86 | 
87 | # pipenv
88 | # According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
89 | # However, in case of collaboration, if having platform-specific dependencies or dependencies
90 | # having no cross-platform support, pipenv may install dependencies that don't work, or not
91 | # install all needed dependencies.
92 | #Pipfile.lock
93 | 
94 | # PEP 582; used by e.g.
github.com/David-OConnor/pyflow
95 | __pypackages__/
96 | 
97 | # Celery stuff
98 | celerybeat-schedule
99 | celerybeat.pid
100 | 
101 | # SageMath parsed files
102 | *.sage.py
103 | 
104 | # Environments
105 | .env
106 | .venv
107 | env/
108 | venv/
109 | ENV/
110 | env.bak/
111 | venv.bak/
112 | 
113 | # Spyder project settings
114 | .spyderproject
115 | .spyproject
116 | 
117 | # Rope project settings
118 | .ropeproject
119 | 
120 | # mkdocs documentation
121 | /site
122 | 
123 | # mypy
124 | .mypy_cache/
125 | .dmypy.json
126 | dmypy.json
127 | 
128 | # Pyre type checker
129 | .pyre/
130 | 
131 | # IDEs
132 | .idea/
133 | .history/
--------------------------------------------------------------------------------
/LICENSE.md:
--------------------------------------------------------------------------------
1 | MIT License
2 | 
3 | Copyright (c) 2020 Saiprasad Balasubramanian
4 | 
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 | 
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 | 
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 | 
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Covid API
2 | 
3 | # Introduction
4 | This project builds upon the dataset of [Johns Hopkins University](https://github.com/CSSEGISandData/COVID-19), published in CSV form, which was converted to a JSON time-series format by [pomber](https://github.com/pomber/covid19).
5 | 
6 | Our project intends to make that dataset queryable in a manner in which it can be easily consumed to build public dashboards.
7 | 
8 | # Overview
9 | Analyzing the dataset, here are the major points that we came across.
10 | 
11 | The APIs have been mapped to use the [ISO 3166](https://en.wikipedia.org/wiki/ISO_3166) standard to query countries instead of the names used in the source dataset, as those weren't in a standard format.
12 | 
13 | The APIs consume & return dates per the [ISO 8601](https://en.wikipedia.org/wiki/ISO_8601) standard in `yyyy-mm-dd` format. The dates have been normalized from the underlying dataset by zero-padding single-digit days and months.
14 | 
15 | # Authentication
16 | There's no authentication required. Anybody and everybody is welcome to use this widely.
17 | 
18 | # Rate limit
19 | There is no rate limit of any kind, but we hope that you use the API sensibly and, whenever possible, cache responses for a few hours, as the underlying datasets are updated ~~thrice a day~~ daily.
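
If you're polling from a dashboard, a small client-side cache is usually enough to stay within sensible usage. A minimal sketch (the endpoint URL is one of the documented ones below; the `get_cached` helper itself is only an illustration, not part of this project):

```
import time
import requests

_cache = {}  # url -> (fetched_at, parsed_json)

def get_cached(url, ttl=3 * 3600):
    """Fetch `url`, reusing a previously fetched response for up to `ttl` seconds."""
    hit = _cache.get(url)
    if hit and time.time() - hit[0] < ttl:
        return hit[1]
    data = requests.get(url).json()
    _cache[url] = (time.time(), data)
    return data

stats = get_cached("https://covidapi.info/api/v1/global")
```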
20 | 
21 | # Updates
22 | The datasets are updated ~~thrice a day~~ daily. As of now, we trigger the updates to our APIs manually, as we don't have any downstream notifications sent on update. We are also working on having a notification mechanism in place to support all the consumers of the API. PRs are always welcome!
23 | 
24 | # Documentation
25 | A Postman collection has been created, along with documentation, for you to get started with this project. Docs can be found [here](https://documenter.getpostman.com/view/2568274/SzS8rjbe?version=latest)
26 | 
27 | 
28 | # Examples
29 | 
30 | 1) How do I get the global data on any given day?
31 | 
32 | You could use the [`/api/v1/global/2020-03-15`](https://covidapi.info/api/v1/global/2020-03-15) endpoint
33 | 
34 | 2) How do I get the data for a country in a date-range?
35 | 
36 | Ex: To get the data for India between 10th and 19th March 2020, you could use [`/api/v1/country/IND/timeseries/2020-03-10/2020-03-19`](https://covidapi.info/api/v1/country/IND/timeseries/2020-03-10/2020-03-19)
37 | 
38 | 3) How do I get the data for the last record for a country?
39 | 
40 | Ex: You'll need to get the last date for any country by hitting the [`/api/v1/latest-date`](https://covidapi.info/api/v1/latest-date) endpoint and then use that date to query the country endpoint, like this: [`/api/v1/country/IND/2020-03-15`](https://covidapi.info/api/v1/country/IND/2020-03-15). A worked script for this flow is sketched just before the Sources section below.
41 | 
42 | # Local Setup
43 | 
44 | ### Clone the repo locally
45 | `git clone https://github.com/backtrackbaba/covid-api.git`
46 | 
47 | ### Set up a new Python environment and source it
48 | 
49 | Python 3.6+ is required to run the project
50 | ```
51 | virtualenv -p python3 path/for/environment/covid
52 | 
53 | source path/for/environment/covid/bin/activate
54 | ```
55 | 
56 | ### Set and source the env file
57 | ```
58 | cd path/to/cloned/project
59 | 
60 | # Create a .env from example.env and change the values to match your local setup
61 | 
62 | source .env
63 | ```
64 | 
65 | ### Starting the application
66 | ```
67 | cd path/to/cloned/project
68 | 
69 | flask run
70 | ```
71 | 
72 | ### Seeding the database
73 | Once the local instance of Flask is up and running, hit the `/protected/update-db` endpoint to seed the database
74 | 
75 | ### Updating the database
76 | Just as with seeding, hit the same `/protected/update-db` endpoint to update the DB with the latest data
77 | 
78 | ### Debugging
79 | 
80 | While developing an endpoint, you could remove the cache decorator from the endpoint and re-enable it once the whole endpoint is up and running. Changes have been made in v2 so that this is taken care of automatically in the local environment.
81 | 
82 | While hitting any of the global endpoints, you might run into a `TypeError`. This is usually caused when JHU, the data provider, renames a country in the data or adds a name that isn't in the `country_name_to_iso.json` file. You could simply add the missing name to that file and update the database again
83 | 
84 | 
85 | Please open an issue if you get into some other problem and aren't able to figure out why it happened. I'll be glad to discuss any design decisions that you might come across in the code.
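
For reference, here's a minimal sketch of seeding a local instance and then running the latest-date flow from example 3 above. It assumes the default `flask run` port; the `"username"`/`"password"` placeholders stand for whatever `BASIC_AUTH_*` values you set in your `.env`:

```
import requests

BASE = "http://127.0.0.1:5000"

# Seed/refresh the DB; uses the BASIC_AUTH_* credentials from your .env
resp = requests.get(f"{BASE}/protected/update-db", auth=("username", "password"))
print(resp.text)

# Example 3 flow: find the latest available date, then query a country on it
latest = requests.get(f"{BASE}/api/v1/latest-date").text.strip()
india = requests.get(f"{BASE}/api/v1/country/IND/{latest}").json()
print(latest, india["result"])
```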
86 | 
87 | # Sources
88 | 
89 | [Novel Coronavirus (COVID-19) Cases, provided by JHU CSSE](https://github.com/CSSEGISandData/COVID-19)
90 | 
91 | [JSON time-series of coronavirus cases (confirmed, deaths and recovered) per country - updated daily](https://github.com/pomber/covid19)
92 | 
93 | # Contributors
94 | 
95 | Saiprasad Balasubramanian - [LinkedIn](https://www.linkedin.com/in/saiprasadbala/) - [Github](https://github.com/backtrackbaba)
96 | 
97 | Harsh Jain - [LinkedIn](https://www.linkedin.com/in/hrkj-18/)
98 | 
99 | Girisha Navani - [LinkedIn](https://www.linkedin.com/in/girisha-navani-87065215b/)
100 | 
101 | # Contributing
102 | Contributions are always welcome and encouraged!! This code was whipped up in a very short span of time for a friend to query against, so there's some refactoring to be done to remove hacks and build things out properly. Ideas are always welcome
103 | 
104 | # Roadmap
105 | There's a roadmap in mind to build up more endpoints. As of now there are just a handful of endpoints, with plans to add more. I'll put the roadmap out here on a Kanban board and link it with the Issues.
106 | 
107 | # License
108 | MIT Licensed
109 | 
110 | 
--------------------------------------------------------------------------------
/data/country_name_to_iso.json:
--------------------------------------------------------------------------------
1 | {
2 |   "Thailand": "THA",
3 |   "Japan": "JPN",
4 |   "Singapore": "SGP",
5 |   "Nepal": "NPL",
6 |   "Malaysia": "MYS",
7 |   "Canada": "CAN",
8 |   "Australia": "AUS",
9 |   "Cambodia": "KHM",
10 |   "Sri Lanka": "LKA",
11 |   "Germany": "DEU",
12 |   "Finland": "FIN",
13 |   "United Arab Emirates": "ARE",
14 |   "Philippines": "PHL",
15 |   "India": "IND",
16 |   "Italy": "ITA",
17 |   "Sweden": "SWE",
18 |   "Spain": "ESP",
19 |   "Belgium": "BEL",
20 |   "Egypt": "EGY",
21 |   "Lebanon": "LBN",
22 |   "Iraq": "IRQ",
23 |   "Oman": "OMN",
24 |   "Afghanistan": "AFG",
25 |   "Bahrain": "BHR",
26 |   "Kuwait": "KWT",
27 |   "Algeria": "DZA",
28 |   "Croatia": "HRV",
29 |   "Switzerland": "CHE",
30 |   "Austria": "AUT",
31 |   "Israel": "ISR",
32 |   "Pakistan": "PAK",
33 |   "Brazil": "BRA",
34 |   "Georgia": "GEO",
35 |   "Greece": "GRC",
36 |   "North Macedonia": "MKD",
37 |   "Norway": "NOR",
38 |   "Romania": "ROU",
39 |   "Estonia": "EST",
40 |   "San Marino": "SMR",
41 |   "Belarus": "BLR",
42 |   "Iceland": "ISL",
43 |   "Lithuania": "LTU",
44 |   "Mexico": "MEX",
45 |   "New Zealand": "NZL",
46 |   "Nigeria": "NGA",
47 |   "Ireland": "IRL",
48 |   "Luxembourg": "LUX",
49 |   "Monaco": "MCO",
50 |   "Qatar": "QAT",
51 |   "Ecuador": "ECU",
52 |   "Azerbaijan": "AZE",
53 |   "Armenia": "ARM",
54 |   "Dominican Republic": "DOM",
55 |   "Indonesia": "IDN",
56 |   "Portugal": "PRT",
57 |   "Andorra": "AND",
58 |   "Latvia": "LVA",
59 |   "Morocco": "MAR",
60 |   "Saudi Arabia": "SAU",
61 |   "Senegal": "SEN",
62 |   "Argentina": "ARG",
63 |   "Chile": "CHL",
64 |   "Jordan": "JOR",
65 |   "Ukraine": "UKR",
66 |   "Hungary": "HUN",
67 |   "Liechtenstein": "LIE",
68 |   "Poland": "POL",
69 |   "Tunisia": "TUN",
70 |   "Bosnia and Herzegovina": "BIH",
71 |   "Slovenia": "SVN",
72 |   "South Africa": "ZAF",
73 |   "Bhutan": "BTN",
74 |   "Cameroon": "CMR",
75 |   "Colombia": "COL",
76 |   "Costa Rica": "CRI",
77 |   "Peru": "PER",
78 |   "Serbia": "SRB",
79 |   "Slovakia": "SVK",
80 |   "Togo": "TGO",
81 |   "Malta": "MLT",
82 |   "Martinique": "MTQ",
83 |   "Bulgaria": "BGR",
84 |   "Maldives": "MDV",
85 |   "Bangladesh": "BGD",
86 |   "Paraguay": "PRY",
87 |   "Albania": "ALB",
88 |   "Cyprus": "CYP",
89 |   "Brunei": "BRN",
90 |   "US": "USA",
91 |   "Burkina Faso": "BFA",
92 |   "Holy See": "VAT",
93 |   "Mongolia": "MNG",
94 | 
"Panama": "PAN", 95 | "China": "CHN", 96 | "Iran": "IRN", 97 | "Korea, South": "KOR", 98 | "France": "FRA", 99 | "Cruise Ship": "SHP", 100 | "Denmark": "DNK", 101 | "Czechia": "CZE", 102 | "Taiwan*": "TWN", 103 | "Vietnam": "VNM", 104 | "Russia": "RUS", 105 | "Moldova": "MDA", 106 | "Bolivia": "BOL", 107 | "Honduras": "HND", 108 | "United Kingdom": "GBR", 109 | "Congo (Kinshasa)": "COD", 110 | "Cote d'Ivoire": "CIV", 111 | "Jamaica": "JAM", 112 | "Turkey": "TUR", 113 | "Cuba": "CUB", 114 | "Guyana": "GUY", 115 | "Kazakhstan": "KAZ", 116 | "Ethiopia": "ETH", 117 | "Sudan": "SDN", 118 | "Guinea": "GIN", 119 | "Kenya": "KEN", 120 | "Antigua and Barbuda": "ATG", 121 | "Uruguay": "URY", 122 | "Ghana": "GHA", 123 | "Namibia": "NAM", 124 | "Seychelles": "SYC", 125 | "Trinidad and Tobago": "TTO", 126 | "Venezuela": "VEN", 127 | "Eswatini": "SWZ", 128 | "Gabon": "GAB", 129 | "Guatemala": "GTM", 130 | "Mauritania": "MRT", 131 | "Rwanda": "RWA", 132 | "Saint Lucia": "LCA", 133 | "Saint Vincent and the Grenadines": "VCT", 134 | "Suriname": "SUR", 135 | "Kosovo": "RKS", 136 | "Central African Republic": "CAF", 137 | "Congo (Brazzaville)": "COG", 138 | "Equatorial Guinea": "GNQ", 139 | "Uzbekistan": "UZB", 140 | "Netherlands": "NLD", 141 | "Benin": "BEN", 142 | "Liberia": "LBR", 143 | "Somalia": "SOM", 144 | "Tanzania": "TZA", 145 | "Barbados": "BRB", 146 | "Montenegro": "MNE", 147 | "Kyrgyzstan": "KGZ", 148 | "Mauritius": "MUS", 149 | "Zambia": "ZMB", 150 | "Djibouti": "DJI", 151 | "Gambia, The": "GMB", 152 | "Gambia": "GMB", 153 | "Bahamas, The": "BHS", 154 | "Bahamas": "BHS", 155 | "Chad": "TCD", 156 | "El Salvador": "SLV", 157 | "Fiji": "FJI", 158 | "Nicaragua": "NIC", 159 | "Haiti": "HTI", 160 | "Syria": "SYR", 161 | "Angola": "AGO", 162 | "Madagascar": "MDG", 163 | "Cabo Verde": "CPV", 164 | "Niger": "NER", 165 | "Papua New Guinea": "PNG", 166 | "Zimbabwe": "ZWE", 167 | "Cape Verde": "CBV", 168 | "East Timor": "ETL", 169 | "Uganda": "UGA", 170 | "Dominica": "DMA", 171 | "Grenada": "GRD", 172 | "Mozambique": "MOZ", 173 | "Timor-Leste": "TLS", 174 | "Eritrea": "ERI", 175 | "Belize": "BLZ", 176 | "Diamond Princess": "DPS", 177 | "Laos": "LAO", 178 | "Libya": "LBY", 179 | "The West Bank and Gaza": "WBG", 180 | "Guinea-Bissau": "GNB", 181 | "Mali": "MLI", 182 | "Saint Kitts and Nevis": "KNA", 183 | "West Bank and Gaza": "WBG", 184 | "Burma": "MMR", 185 | "Myanmar": "MMR", 186 | "MS Zaandam": "MSZ", 187 | "Botswana": "BWA", 188 | "Sierra Leone": "SLE", 189 | "Burundi": "BDI", 190 | "Malawi": "MWI", 191 | "Western Sahara": "ESH", 192 | "South Sudan": "SSD", 193 | "Sao Tome and Principe": "STP", 194 | "Yemen": "YEM", 195 | "Tajikistan": "TJK", 196 | "Comoros": "COM", 197 | "Lesotho": "LSO", 198 | "Marshall Islands": "MHL", 199 | "Samoa": "WSM", 200 | "Solomon Islands": "SLB", 201 | "Vanuatu": "VUT", 202 | "Micronesia": "FSM", 203 | "Kiribati": "KIR", 204 | "Palau": "PLW", 205 | "Summer Olympics 2020": "SOL" 206 | } -------------------------------------------------------------------------------- /example.env: -------------------------------------------------------------------------------- 1 | export POSTGRESQL_URL="IP:PORT" 2 | export POSTGRESQL_USERNAME="username" 3 | export POSTGRESQL_PASSWORD="password" 4 | export POSTGRESQL_DATABASE="db_name" 5 | 6 | export CACHE_REDIS_HOST="host" 7 | export CACHE_REDIS_PORT="port" 8 | export CACHE_REDIS_PASSWORD="password" 9 | export CACHE_REDIS_DB="db_number" 10 | 11 | export BASIC_AUTH_USERNAME="username" 12 | export BASIC_AUTH_PASSWORD="password" 13 | 14 | 15 | 
export CACHE_REDIS_TIMEOUT=1 16 | 17 | export PYTHONPATH=$PWD/src -------------------------------------------------------------------------------- /migrations/README: -------------------------------------------------------------------------------- 1 | Generic single-database configuration. -------------------------------------------------------------------------------- /migrations/alembic.ini: -------------------------------------------------------------------------------- 1 | # A generic, single database configuration. 2 | 3 | [alembic] 4 | # template used to generate migration files 5 | # file_template = %%(rev)s_%%(slug)s 6 | 7 | # set to 'true' to run the environment during 8 | # the 'revision' command, regardless of autogenerate 9 | # revision_environment = false 10 | 11 | 12 | # Logging configuration 13 | [loggers] 14 | keys = root,sqlalchemy,alembic 15 | 16 | [handlers] 17 | keys = console 18 | 19 | [formatters] 20 | keys = generic 21 | 22 | [logger_root] 23 | level = WARN 24 | handlers = console 25 | qualname = 26 | 27 | [logger_sqlalchemy] 28 | level = WARN 29 | handlers = 30 | qualname = sqlalchemy.engine 31 | 32 | [logger_alembic] 33 | level = INFO 34 | handlers = 35 | qualname = alembic 36 | 37 | [handler_console] 38 | class = StreamHandler 39 | args = (sys.stderr,) 40 | level = NOTSET 41 | formatter = generic 42 | 43 | [formatter_generic] 44 | format = %(levelname)-5.5s [%(name)s] %(message)s 45 | datefmt = %H:%M:%S 46 | -------------------------------------------------------------------------------- /migrations/env.py: -------------------------------------------------------------------------------- 1 | from __future__ import with_statement 2 | 3 | import logging 4 | from logging.config import fileConfig 5 | 6 | from sqlalchemy import engine_from_config 7 | from sqlalchemy import pool 8 | 9 | from alembic import context 10 | 11 | # this is the Alembic Config object, which provides 12 | # access to the values within the .ini file in use. 13 | config = context.config 14 | 15 | # Interpret the config file for Python logging. 16 | # This line sets up loggers basically. 17 | fileConfig(config.config_file_name) 18 | logger = logging.getLogger('alembic.env') 19 | 20 | # add your model's MetaData object here 21 | # for 'autogenerate' support 22 | # from myapp import mymodel 23 | # target_metadata = mymodel.Base.metadata 24 | from flask import current_app 25 | config.set_main_option( 26 | 'sqlalchemy.url', 27 | str(current_app.extensions['migrate'].db.engine.url).replace('%', '%%')) 28 | target_metadata = current_app.extensions['migrate'].db.metadata 29 | 30 | # other values from the config, defined by the needs of env.py, 31 | # can be acquired: 32 | # my_important_option = config.get_main_option("my_important_option") 33 | # ... etc. 34 | 35 | 36 | def run_migrations_offline(): 37 | """Run migrations in 'offline' mode. 38 | 39 | This configures the context with just a URL 40 | and not an Engine, though an Engine is acceptable 41 | here as well. By skipping the Engine creation 42 | we don't even need a DBAPI to be available. 43 | 44 | Calls to context.execute() here emit the given string to the 45 | script output. 46 | 47 | """ 48 | url = config.get_main_option("sqlalchemy.url") 49 | context.configure( 50 | url=url, target_metadata=target_metadata, literal_binds=True 51 | ) 52 | 53 | with context.begin_transaction(): 54 | context.run_migrations() 55 | 56 | 57 | def run_migrations_online(): 58 | """Run migrations in 'online' mode. 
59 | 60 | In this scenario we need to create an Engine 61 | and associate a connection with the context. 62 | 63 | """ 64 | 65 | # this callback is used to prevent an auto-migration from being generated 66 | # when there are no changes to the schema 67 | # reference: http://alembic.zzzcomputing.com/en/latest/cookbook.html 68 | def process_revision_directives(context, revision, directives): 69 | if getattr(config.cmd_opts, 'autogenerate', False): 70 | script = directives[0] 71 | if script.upgrade_ops.is_empty(): 72 | directives[:] = [] 73 | logger.info('No changes in schema detected.') 74 | 75 | connectable = engine_from_config( 76 | config.get_section(config.config_ini_section), 77 | prefix='sqlalchemy.', 78 | poolclass=pool.NullPool, 79 | ) 80 | 81 | with connectable.connect() as connection: 82 | context.configure( 83 | connection=connection, 84 | target_metadata=target_metadata, 85 | process_revision_directives=process_revision_directives, 86 | **current_app.extensions['migrate'].configure_args 87 | ) 88 | 89 | with context.begin_transaction(): 90 | context.run_migrations() 91 | 92 | 93 | if context.is_offline_mode(): 94 | run_migrations_offline() 95 | else: 96 | run_migrations_online() 97 | -------------------------------------------------------------------------------- /migrations/script.py.mako: -------------------------------------------------------------------------------- 1 | """${message} 2 | 3 | Revision ID: ${up_revision} 4 | Revises: ${down_revision | comma,n} 5 | Create Date: ${create_date} 6 | 7 | """ 8 | from alembic import op 9 | import sqlalchemy as sa 10 | ${imports if imports else ""} 11 | 12 | # revision identifiers, used by Alembic. 13 | revision = ${repr(up_revision)} 14 | down_revision = ${repr(down_revision)} 15 | branch_labels = ${repr(branch_labels)} 16 | depends_on = ${repr(depends_on)} 17 | 18 | 19 | def upgrade(): 20 | ${upgrades if upgrades else "pass"} 21 | 22 | 23 | def downgrade(): 24 | ${downgrades if downgrades else "pass"} 25 | -------------------------------------------------------------------------------- /migrations/versions/8fc93c85d7ff_initial_records_table.py: -------------------------------------------------------------------------------- 1 | """initial records table 2 | 3 | Revision ID: 8fc93c85d7ff 4 | Revises: 5 | Create Date: 2020-03-21 06:21:34.000497 6 | 7 | """ 8 | from alembic import op 9 | import sqlalchemy as sa 10 | from sqlalchemy.dialects import postgresql 11 | 12 | # revision identifiers, used by Alembic. 13 | revision = '8fc93c85d7ff' 14 | down_revision = None 15 | branch_labels = None 16 | depends_on = None 17 | 18 | 19 | def upgrade(): 20 | # ### commands auto generated by Alembic - please adjust! ### 21 | op.create_table('records', 22 | sa.Column('uuid', postgresql.UUID(as_uuid=True), nullable=False), 23 | sa.Column('country_iso', sa.String(length=100), nullable=True), 24 | sa.Column('country_name', sa.String(length=100), nullable=True), 25 | sa.Column('date', sa.DATE(), nullable=True), 26 | sa.Column('confirmed', sa.INTEGER(), nullable=True), 27 | sa.Column('deaths', sa.INTEGER(), nullable=True), 28 | sa.Column('recovered', sa.INTEGER(), nullable=True), 29 | sa.PrimaryKeyConstraint('uuid') 30 | ) 31 | op.create_index(op.f('ix_records_country_iso'), 'records', ['country_iso'], unique=False) 32 | # ### end Alembic commands ### 33 | 34 | 35 | def downgrade(): 36 | # ### commands auto generated by Alembic - please adjust! 
###
37 |     op.drop_index(op.f('ix_records_country_iso'), table_name='records')
38 |     op.drop_table('records')
39 |     # ### end Alembic commands ###
40 | 
--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------
1 | Flask==1.1.1
2 | Flask-SQLAlchemy==2.4.1
3 | SQLAlchemy==1.3.15
4 | Flask-Migrate==2.5.3
5 | psycopg2-binary==2.8.4
6 | python-dotenv==0.12.0
7 | requests==2.23.0
8 | click==7.1.1
9 | walrus==0.8.0
10 | gunicorn[gthread]==20.0.4
11 | flask-cors==3.0.9
12 | Flask-BasicAuth==0.2.0
13 | 
14 | 
--------------------------------------------------------------------------------
/scripts/init_db.py:
--------------------------------------------------------------------------------
1 | import json
2 | import os
3 | import uuid
4 | 
5 | import requests
6 | from sqlalchemy import create_engine
7 | from sqlalchemy.ext.declarative import declarative_base
8 | from sqlalchemy.orm import sessionmaker
9 | 
10 | from config import Config
11 | from models import Records
12 | 
13 | MASTER_DATA_URL = "https://pomber.github.io/covid19/timeseries.json"
14 | 
15 | master_data_json = requests.get(MASTER_DATA_URL).json()
16 | 
17 | # Read the connection details from the environment (see example.env);
18 | # config.Config doesn't expose these values individually.
19 | POSTGRESQL_URL = os.environ.get('POSTGRESQL_URL')
20 | POSTGRESQL_USERNAME = os.environ.get('POSTGRESQL_USERNAME')
21 | POSTGRESQL_PASSWORD = os.environ.get('POSTGRESQL_PASSWORD')
22 | POSTGRESQL_DATABASE = os.environ.get('POSTGRESQL_DATABASE')
23 | 
24 | db_string = 'postgresql+psycopg2://{user}:{pw}@{url}/{db}'.format(user=POSTGRESQL_USERNAME,
25 |                                                                  pw=POSTGRESQL_PASSWORD,
26 |                                                                  url=POSTGRESQL_URL,
27 |                                                                  db=POSTGRESQL_DATABASE)
28 | db = create_engine(db_string)
29 | base = declarative_base()
30 | Session = sessionmaker(db)
31 | session = Session()
32 | 
33 | countries = list(master_data_json.keys())
34 | 
35 | # Resolve the mapping file relative to the repo instead of a hardcoded absolute path
36 | with open(os.path.join(Config.DATA_DIR, 'country_name_to_iso.json'), 'r') as fp:
37 |     country_name_to_code = json.loads(fp.read())
38 | 
39 | for country in countries:
40 |     for everyday in master_data_json[country]:
41 |         record = Records()
42 |         record.uuid = uuid.uuid4()
43 |         record.country_name = country
44 |         record.country_iso = country_name_to_code.get(country)
45 |         record.date = everyday["date"]
46 |         record.confirmed = everyday["confirmed"]
47 |         record.deaths = everyday["deaths"]
48 |         record.recovered = everyday["recovered"]
49 |         session.add(record)
50 |     session.commit()
51 |     print(f"Successfully added records for {country}")
--------------------------------------------------------------------------------
/scripts/start_covid.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | 
3 | NAME="libdb"
4 | PROJECT_SRC="/opt/covid/covid-api/src"
5 | LOGFILE="/opt/projects/covid/data/logs/gunicorn/gunicorn.log"
6 | 
7 | USER=$(whoami)
8 | GROUP=$(id -g -n)
9 | 
10 | if [[ "$OSTYPE" == "linux-gnu" ]]; then
11 |     NUM_WORKERS=$((2 * $(grep -c 'core id' /proc/cpuinfo) + 1))
12 | elif [[ "$OSTYPE" == "darwin"* ]]; then
13 |     NUM_WORKERS=3
14 | fi
15 | 
16 | # Explicit override: pin the worker count regardless of the detection above
17 | NUM_WORKERS=3
18 | NUM_THREADS=$((4 * $NUM_WORKERS))
19 | 
20 | WSGI_MODULE=wsgi
21 | 
22 | cd $PROJECT_SRC
23 | 
24 | source /opt/covid/runtime-environments/python/bin/activate
25 | source /opt/covid/covid-devops/environments/.env
26 | 
27 | export PYTHONPATH=$PROJECT_SRC
28 | 
29 | export FLASK_ENV=production
30 | export FLASK_APP=app
31 | 
32 | echo "Starting $NAME with $NUM_WORKERS workers and $NUM_THREADS threads!"
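# The gthread worker class used below serves requests from a per-worker
# thread pool, so effective concurrency is roughly NUM_WORKERS * NUM_THREADS.
# --reload restarts workers on code changes (handy on a dev box), and the
# generous --timeout leaves headroom for slow requests such as the
# /protected/update-db refresh.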
33 | 
34 | exec gunicorn ${WSGI_MODULE}:app \
35 |     --name $NAME \
36 |     --workers $NUM_WORKERS \
37 |     --user=$USER --group=$GROUP \
38 |     --bind=0.0.0.0:7000 \
39 |     --log-level=info \
40 |     --reload \
41 |     --log-file=$LOGFILE \
42 |     --timeout=1200 \
43 |     --threads=$NUM_THREADS \
44 |     --worker-class=gthread
45 | 
--------------------------------------------------------------------------------
/src/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/backtrackbaba/covid-api/74a0c61d67b58ac83fc92dc686e9815e7935470e/src/__init__.py
--------------------------------------------------------------------------------
/src/app.py:
--------------------------------------------------------------------------------
1 | import json
2 | import os
3 | import time
4 | import uuid
5 | from sqlalchemy import and_
6 | 
7 | import requests
8 | from flask import Flask, render_template, abort
9 | from flask_basicauth import BasicAuth
10 | from flask_cors import CORS
11 | from flask_migrate import Migrate
12 | from flask_sqlalchemy import SQLAlchemy
13 | from sqlalchemy import desc, asc
14 | from sqlalchemy.dialects.postgresql import DATE, UUID
15 | from walrus import Database
16 | 
17 | from config import Config
18 | 
19 | app = Flask(__name__)
20 | app.config.from_object(Config)
21 | db = SQLAlchemy(app)
22 | migrate = Migrate(app, db)
23 | CORS(app)
24 | basic_auth = BasicAuth(app)
25 | 
26 | redis_db = Database(host=os.environ.get('CACHE_REDIS_HOST'), port=os.environ.get('CACHE_REDIS_PORT'),
27 |                     db=os.environ.get('CACHE_REDIS_DB'), password=os.environ.get('CACHE_REDIS_PASSWORD'))
28 | cache = redis_db.cache()
29 | 
30 | DATA_DIR = app.config['DATA_DIR']
31 | 
32 | 
33 | class Records(db.Model):
34 |     __tablename__ = 'records'
35 |     __table_args__ = {'extend_existing': True}  # the model is mirrored in src/models.py
36 | 
37 |     uuid = db.Column(UUID(as_uuid=True), primary_key=True)
38 |     country_iso = db.Column(db.String(100), index=True)
39 |     country_name = db.Column(db.String(100))
40 |     date = db.Column(DATE)
41 |     confirmed = db.Column(db.INTEGER)
42 |     deaths = db.Column(db.INTEGER)
43 |     recovered = db.Column(db.INTEGER)
44 | 
45 |     def __repr__(self):
46 |         return f'<Records {self.country_name} {self.country_iso} {self.date}>'
47 | 
48 | 
49 | @app.route('/')
50 | @cache.cached(timeout=86400, metrics=True)
51 | def home():
52 |     return render_template('index.html')
53 | 
54 | 
55 | @app.route('/api/v1/country/<country_iso>')
56 | @cache.cached(timeout=86400, metrics=True)
57 | def country(country_iso):
58 |     result = Records.query.filter_by(country_iso=country_iso.upper()).all()
59 |     if len(result) == 0:
60 |         abort(404)
61 |     data = {
62 |         'count': len(result),
63 |         'result': {}
64 |     }
65 |     for record in result:
66 |         data['result'][record.date.strftime('%Y-%m-%d')] = {"confirmed": record.confirmed, "deaths": record.deaths,
67 |                                                             "recovered": record.recovered}
68 |     return data
69 | 
70 | 
71 | @app.route('/api/v1/country/<country_iso>/<date>')
72 | @cache.cached(timeout=86400, metrics=True)
73 | def country_date(country_iso, date):
74 |     result = fetch_country_on_date(country_iso.upper(), date)
75 |     data = {
76 |         'count': 1,
77 |         'result': {}
78 |     }
79 |     data['result'][result.date.strftime('%Y-%m-%d')] = {"confirmed": result.confirmed, "deaths": result.deaths,
80 |                                                         "recovered": result.recovered}
81 |     return data
82 | 
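# For reference, both country endpoints above return the same shape
# (values elided; illustrative only):
#
#   GET /api/v1/country/IND/2020-03-15
#   {"count": 1, "result": {"2020-03-15": {"confirmed": ..., "deaths": ..., "recovered": ...}}}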
83 | 
84 | @app.route('/api/v1/country/<country_iso>/timeseries/<from_date>/<to_date>')
85 | @cache.cached(timeout=86400, metrics=True)
86 | def country_timeseries(country_iso, from_date, to_date):
87 |     results = Records.query.filter(Records.country_iso == country_iso.upper()).filter(
88 |         and_(Records.date >= from_date, Records.date < to_date)).order_by(asc(Records.date)).all()  # to_date is exclusive
89 |     if len(results) == 0:
90 |         abort(404)
91 |     data_list = []
92 |     for result in results:
93 |         data_list.append({"date": str(result.date), "confirmed": result.confirmed, "deaths": result.deaths,
94 |                           "recovered": result.recovered})
95 |     data = {
96 |         'count': len(results),
97 |         'result': data_list
98 |     }
99 |     return data
100 | 
101 | 
102 | @cache.cached(timeout=86400, metrics=True)
103 | def get_country_time_series(country_iso, from_date, to_date):
104 |     result = Records.query.filter(Records.country_iso == country_iso).filter(
105 |         and_(Records.date >= from_date, Records.date < to_date)).order_by(asc(Records.date)).all()
106 |     return result
107 | 
108 | 
109 | @app.route('/api/v1/global/timeseries/<from_date>/<to_date>')
110 | @cache.cached(timeout=86400, metrics=True)
111 | def global_timeseries(from_date, to_date):
112 |     result = {}
113 |     country_list = Records.query.distinct(Records.country_iso).all()
114 |     data = {
115 |         'count': len(country_list),
116 |         'result': result
117 |     }
118 | 
119 |     for country in country_list:
120 |         country_result = get_country_time_series(country.country_iso, from_date, to_date)
121 |         data_list = []
122 |         for entry in country_result:
123 |             data_list.append({"date": str(entry.date), "confirmed": entry.confirmed, "deaths": entry.deaths,
124 |                               "recovered": entry.recovered})
125 |         result[country.country_iso] = data_list
126 |     return data
127 | 
128 | 
129 | @cache.cached(timeout=86400, metrics=True)
130 | def fetch_country_on_date(country_iso, date):
131 |     result = Records.query.filter(Records.country_iso == country_iso).filter(Records.date == date).first()
132 |     if not result:
133 |         abort(404)
134 |     else:
135 |         return result
136 | 
137 | 
138 | @app.route('/api/v1/global')
139 | @cache.cached(timeout=86400, metrics=True)
140 | def world():
141 |     # India's most recent record is used as a proxy for the dataset's latest date
142 |     date = Records.query.filter(Records.country_iso == "IND").order_by(desc(Records.date)).first().date
143 |     results = Records.query.filter(Records.date == date).all()
144 |     global_confirmed_count, global_death_count, global_recovered_count = 0, 0, 0
145 |     for result in results:
146 |         global_confirmed_count += result.confirmed
147 |         global_death_count += result.deaths
148 |         global_recovered_count += result.recovered
149 |     data = {
150 |         'count': 1,
151 |         'date': str(date),
152 |         'result': {"confirmed": global_confirmed_count, "deaths": global_death_count,
153 |                    "recovered": global_recovered_count}
154 |     }
155 |     return data
156 | 
157 | 
158 | @app.route('/api/v1/global/<date>')
159 | @cache.cached(timeout=86400, metrics=True)
160 | def world_date(date):
161 |     results = Records.query.filter(Records.date == date).all()
162 |     global_confirmed_count, global_death_count, global_recovered_count = 0, 0, 0
163 |     for result in results:
164 |         global_confirmed_count += result.confirmed
165 |         global_death_count += result.deaths
166 |         global_recovered_count += result.recovered
167 |     data = {
168 |         'count': 1,
169 |         'date': date,
170 |         'result': {"confirmed": global_confirmed_count, "deaths": global_death_count,
171 |                    "recovered": global_recovered_count}
172 |     }
173 |     return data
174 | 
175 | 
176 | @app.route('/api/v1/global/<from_date>/<to_date>')
177 | @cache.cached(timeout=86400, metrics=True)
178 | def world_date_window(from_date, to_date):
179 |     from_date_result = Records.query.filter(Records.date == from_date).all()
180 |     to_date_result = Records.query.filter(Records.date == to_date).all()
181 |     from_date_confirmed_count, from_date_death_count, from_date_recovered_count = 0, 0, 0
182 |     to_date_confirmed_count, to_date_death_count, to_date_recovered_count = 0, 0, 0
183 | 
184 |     for result in from_date_result:
185 |         from_date_confirmed_count += result.confirmed
186 |         from_date_death_count += result.deaths
187 |         from_date_recovered_count += result.recovered
188 | 
189 |     for result in to_date_result:
190 |         to_date_confirmed_count += result.confirmed
191 |         to_date_death_count += result.deaths
192 |         to_date_recovered_count += result.recovered
193 | 
194 |     data = {
195 |         'count': 1,
196 |         'from_date': from_date,
197 |         'to_date': to_date,
198 |         'result': {
199 |             "confirmed": abs(from_date_confirmed_count - to_date_confirmed_count),
200 |             "deaths": abs(from_date_death_count - to_date_death_count),
201 |             "recovered": abs(from_date_recovered_count - to_date_recovered_count)}
202 |     }
203 |     return data
204 | 
205 | 
206 | @app.route('/api/v1/latest-date')
207 | @cache.cached(timeout=86400, metrics=True)
208 | def latest_date():
209 |     # India's most recent record is used as a proxy for the dataset's latest date
210 |     latest_date = Records.query.filter(Records.country_iso == "IND").order_by(desc(Records.date)).first().date
211 |     return str(latest_date)
212 | 
213 | 
214 | @app.route('/api/v1/country/<country_iso>/latest')
215 | @cache.cached(timeout=86400, metrics=True)
216 | def country_latest(country_iso):
217 |     latest_date = Records.query.filter(Records.country_iso == "IND").order_by(desc(Records.date)).first().date
218 |     result = fetch_country_on_date(country_iso.upper(), latest_date)
219 |     data = {
220 |         'count': 1,
221 |         'result': {}
222 |     }
223 |     data['result'][result.date.strftime('%Y-%m-%d')] = {"confirmed": result.confirmed, "deaths": result.deaths,
224 |                                                         "recovered": result.recovered}
225 |     return data
226 | 
227 | 
228 | @cache.cached(timeout=86400, metrics=True)
229 | def global_count_on_date(date):
230 |     dates = Records.query.filter(Records.date == date).all()
231 |     return dates
232 | 
233 | 
234 | @app.route('/api/v1/global/count')
235 | @cache.cached(timeout=86400, metrics=True)
236 | def global_count():
237 |     dates = Records.query.distinct(Records.date).all()
238 |     date_result = {}
239 |     for entry in dates:
240 |         date = entry.date
241 |         results = global_count_on_date(date)
242 |         global_confirmed_count, global_death_count, global_recovered_count = 0, 0, 0
243 |         for result in results:
244 |             global_confirmed_count += result.confirmed
245 |             global_death_count += result.deaths
246 |             global_recovered_count += result.recovered
247 |         date_result[str(date)] = {
248 |             "confirmed": global_confirmed_count,
249 |             "deaths": global_death_count,
250 |             "recovered": global_recovered_count
251 |         }
252 |     data = {
253 |         "count": len(date_result),
254 |         "result": date_result
255 |     }
256 |     return data
257 | 
258 | 
259 | @app.route('/api/v1/global/latest')
260 | @cache.cached(timeout=86400, metrics=True)
261 | def global_latest():
262 |     latest_date = Records.query.filter(Records.country_iso == "IND").order_by(desc(Records.date)).first().date
263 |     results = global_count_on_date(latest_date)
264 |     data = {
265 |         'count': len(results),
266 |         'result': [],
267 |         'date': str(latest_date)
268 |     }
269 |     for result in results:
270 |         country_data = {}
271 |         country_data[result.country_iso] = {"confirmed": result.confirmed, "deaths": result.deaths,
272 |                                             "recovered": result.recovered}
273 |         data['result'].append(country_data)
274 |     return data
275 | 
276 | 
277 | @app.route('/protected/update-db')
278 | @basic_auth.required
279 | def update_db():
280 |     t1 = time.time()
281 |     record_counter = 0
282 |     # Fetch the latest dataset
283 |     master_data_url = "https://pomber.github.io/covid19/timeseries.json"
284 |     country_name_to_iso_file = DATA_DIR + '/country_name_to_iso.json'
285 |     master_data_json = requests.get(master_data_url).json()
286 |     countries = list(master_data_json.keys())
287 | 
288 |     with open(country_name_to_iso_file, 'r') as fp:
289 |         country_name_to_code = json.loads(fp.read())
290 | 
291 |     # Full refresh: drop all existing rows, then re-insert the fresh snapshot
292 |     db.session.query(Records).delete()
293 |     db.session.commit()
294 | 
295 |     for country in countries:
296 |         country_records = []
297 |         for everyday in master_data_json[country]:
298 |             everyday_record = {
299 |                 "uuid": uuid.uuid4(),
300 |                 "country_name": country,
301 |                 "country_iso": country_name_to_code.get(country),
302 |                 "date": everyday["date"],
303 |                 "confirmed": everyday["confirmed"],
304 |                 "deaths": everyday["deaths"],
305 |                 "recovered": everyday["recovered"] or 0,
306 |             }
307 |             country_records.append(everyday_record)
308 | 
309 |         db.session.bulk_insert_mappings(Records, country_records)
310 |         db.session.commit()
311 |         print(f"Added records for {country}")
312 |         record_counter += len(country_records)
313 | 
314 |     # Invalidate all cached responses now that the data has changed
315 |     redis_db.flushdb()
316 |     print(f"Added records for {len(countries)} countries!")
317 |     return f"Added {record_counter} records in {time.time() - t1} seconds!"
318 | 
319 | 
320 | @app.route('/protected/clear-redis')
321 | @basic_auth.required
322 | def clear_redis():
323 |     redis_db.flushdb()
324 |     return "Cleared Redis!!!"
325 | 
326 | 
327 | @app.route('/protected/redis-info')
328 | @basic_auth.required
329 | def redis_info():
330 |     info = redis_db.info()
331 |     data = {
332 |         "memory": info.get("used_memory_human"),
333 |         "db": info.get(f"db{os.environ.get('CACHE_REDIS_DB')}"),
334 |         "commands_processed": info.get("total_commands_processed"),
335 |         "keyspace_hits": info.get("keyspace_hits"),
336 |         "keyspace_misses": info.get("keyspace_misses")
337 |     }
338 |     return data
339 | 
--------------------------------------------------------------------------------
/src/config.py:
--------------------------------------------------------------------------------
1 | import os
2 | 
3 | 
4 | class Config(object):
5 |     SQLALCHEMY_DATABASE_URI = 'postgresql+psycopg2://{user}:{pw}@{url}/{db}'.format(
6 |         user=os.environ.get('POSTGRESQL_USERNAME'),
7 |         pw=os.environ.get('POSTGRESQL_PASSWORD'),
8 |         url=os.environ.get('POSTGRESQL_URL'),
9 |         db=os.environ.get('POSTGRESQL_DATABASE'))
10 | 
11 |     SQLALCHEMY_TRACK_MODIFICATIONS = False
12 | 
13 |     BASIC_AUTH_USERNAME = os.environ.get('BASIC_AUTH_USERNAME')
14 |     BASIC_AUTH_PASSWORD = os.environ.get('BASIC_AUTH_PASSWORD')
15 | 
16 |     BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
17 |     DATA_DIR = os.path.join(BASE_DIR, 'data')
18 | 
--------------------------------------------------------------------------------
/src/models.py:
--------------------------------------------------------------------------------
1 | from sqlalchemy.dialects.postgresql import DATE, UUID
2 | 
3 | from src.app import db
4 | 
5 | 
6 | class Records(db.Model):
7 |     __tablename__ = 'records'
8 |     __table_args__ = {'extend_existing': True}  # the model is also declared in src/app.py
9 | 
10 |     uuid = db.Column(UUID(as_uuid=True), primary_key=True)
11 |     country_iso = db.Column(db.String(100), index=True)
12 |     country_name = db.Column(db.String(100))
13 |     date = db.Column(DATE)
14 |     confirmed = db.Column(db.INTEGER)
15 |     deaths = db.Column(db.INTEGER)
16 |     recovered = db.Column(db.INTEGER)
17 | 
18 |     def __repr__(self):
19 |         return f'<Records {self.country_name} {self.country_iso} {self.date}>'
20 | 
--------------------------------------------------------------------------------
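The `Records` model above is what both the seeding script and the API query. As a quick sanity check after seeding, you can inspect a few rows from a `flask shell` session — a sketch, assuming your env file has been sourced so the database is reachable and `src` is on `PYTHONPATH`:

```
# Run inside `flask shell` (or a Python REPL with src/ on PYTHONPATH)
from app import db, Records

# Total rows, then the most recent record for one country
print(db.session.query(Records).count())
latest = (Records.query.filter_by(country_iso="IND")
          .order_by(Records.date.desc())
          .first())
print(latest.date, latest.confirmed, latest.deaths, latest.recovered)
```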
/src/templates/index.html:
--------------------------------------------------------------------------------
1 | <!DOCTYPE html>
2 | <html lang="en">
3 | <head>
4 |     <meta charset="UTF-8">
5 |     <meta name="viewport" content="width=device-width, initial-scale=1.0">
6 |     <title>Covid19 API</title>
7 | </head>
8 | <body>
9 | <h1>Covid19 API</h1>
10 | 
11 | <p>Thank you for the interest in the project.</p>
12 | 
13 | <p>The documentation for the endpoints along with the Postman collection can be found
14 |     <a href="https://documenter.getpostman.com/view/2568274/SzS8rjbe?version=latest">here</a>.</p>
15 | 
16 | <p>The project is opensource and the source code is available on
17 |     <a href="https://github.com/backtrackbaba/covid-api">GitHub</a></p>
18 | 
19 | <p>Please raise a pull request on GitHub for any contribution, suggestion or feedback</p>
20 | </body>
21 | </html>
22 | 
--------------------------------------------------------------------------------
/src/wsgi.py:
--------------------------------------------------------------------------------
1 | from app import app
2 | 
3 | if __name__ == "__main__":
4 |     app.run()
5 | 
--------------------------------------------------------------------------------