├── .env
├── .flake8
├── .gitignore
├── Dockerfile
├── README.md
├── app
│   └── app.py
├── docker-compose.yml
├── makefile
├── mypy.ini
├── pyproject.toml
├── requirements-dev.in
├── requirements-dev.txt
├── requirements.in
└── requirements.txt
/.env:
--------------------------------------------------------------------------------
1 | REDIS_DSN="redis://:ubuntu@localhost:6380/0"
2 | CACHE_EXPIRE="3600"
3 | MAPBOX_ACCESS_TOKEN=pk.eyJ1IjoicmVkbmFmaSIsImEiOiJjanhjcTJ0NW0wMXh0M3VqeXJqa2RrM21hIn0.gmFT-Ve-PfuEoogPnfZoFw
4 |
--------------------------------------------------------------------------------
/.flake8:
--------------------------------------------------------------------------------
1 | [flake8]
2 | extend-exclude =
3 | .git,
4 | __pycache__,
5 | docs/source/conf.py,
6 | old,
7 | build,
8 | dist,
9 | .venv,
10 | venv
11 |
12 | extend-ignore = E203, E266, E501, W605
13 |
14 | # Black's default line length.
15 | max-line-length = 88
16 |
17 | max-complexity = 18
18 |
19 | # Specify the list of error codes you wish Flake8 to report.
20 | select = B,C,E,F,W,T4,B9
21 |
22 | # Parallelism
23 | jobs = 4
24 |
--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
1 | # Byte-compiled / optimized / DLL files
2 | __pycache__/
3 | *.py[cod]
4 | *$py.class
5 |
6 | # C extensions
7 | *.so
8 |
9 | # Distribution / packaging
10 | .Python
11 | build/
12 | develop-eggs/
13 | dist/
14 | downloads/
15 | eggs/
16 | .eggs/
17 | lib/
18 | lib64/
19 | parts/
20 | sdist/
21 | var/
22 | wheels/
23 | pip-wheel-metadata/
24 | share/python-wheels/
25 | *.egg-info/
26 | .installed.cfg
27 | *.egg
28 | MANIFEST
29 |
30 | # PyInstaller
31 | # Usually these files are written by a python script from a template
32 | # before PyInstaller builds the exe, so as to inject date/other infos into it.
33 | *.manifest
34 | *.spec
35 |
36 | # Installer logs
37 | pip-log.txt
38 | pip-delete-this-directory.txt
39 |
40 | # Unit test / coverage reports
41 | htmlcov/
42 | .tox/
43 | .nox/
44 | .coverage
45 | .coverage.*
46 | .cache
47 | nosetests.xml
48 | coverage.xml
49 | *.cover
50 | *.py,cover
51 | .hypothesis/
52 | .pytest_cache/
53 |
54 | # Translations
55 | *.mo
56 | *.pot
57 |
58 | # Django stuff:
59 | *.log
60 | local_settings.py
61 | db.sqlite3
62 | db.sqlite3-journal
63 |
64 | # Flask stuff:
65 | instance/
66 | .webassets-cache
67 |
68 | # Scrapy stuff:
69 | .scrapy
70 |
71 | # Sphinx documentation
72 | docs/_build/
73 |
74 | # PyBuilder
75 | target/
76 |
77 | # Jupyter Notebook
78 | .ipynb_checkpoints
79 |
80 | # IPython
81 | profile_default/
82 | ipython_config.py
83 |
84 | # pyenv
85 | .python-version
86 |
87 | # pipenv
88 | # According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
89 | # However, in case of collaboration, if having platform-specific dependencies or dependencies
90 | # having no cross-platform support, pipenv may install dependencies that don't work, or not
91 | # install all needed dependencies.
92 | #Pipfile.lock
93 |
94 | # PEP 582; used by e.g. github.com/David-OConnor/pyflow
95 | __pypackages__/
96 |
97 | # Celery stuff
98 | celerybeat-schedule
99 | celerybeat.pid
100 |
101 | # SageMath parsed files
102 | *.sage.py
103 |
104 | # Environments
105 | #.env
106 | .venv
107 | env/
108 | venv/
109 | ENV/
110 | env.bak/
111 | venv.bak/
112 |
113 | # Spyder project settings
114 | .spyderproject
115 | .spyproject
116 |
117 | # Rope project settings
118 | .ropeproject
119 |
120 | # mkdocs documentation
121 | /site
122 |
123 | # mypy
124 | .mypy_cache/
125 | .dmypy.json
126 | dmypy.json
127 |
128 | # Pyre type checker
129 | .pyre/
130 | redis-data/dump.rdb
131 |
--------------------------------------------------------------------------------
/Dockerfile:
--------------------------------------------------------------------------------
1 | FROM python:3.9-slim-buster
2 |
3 | RUN mkdir /app
4 | WORKDIR /app
5 | COPY requirements.txt /app
6 | RUN pip install -r requirements.txt
7 |
8 | COPY . /app
9 | EXPOSE 5000
10 | CMD ["uvicorn", "app.app:app", "--host", "0.0.0.0", "--port", "5000"]
11 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 |
2 |
3 | # Redis || Request || Caching
4 |
5 | 🐍 Simple Python Application to Demonstrate API Request Caching with Redis
6 |
7 |
8 |
9 |
10 |
11 | [**|| Blog ||**](https://rednafi.github.io/digressions/python/database/2020/05/25/python-redis-cache.html)
12 |
13 |
14 |
15 | ## Description
16 |
 17 | This app sends requests to [Mapbox](https://www.mapbox.com/)'s [route optimization API](https://docs.mapbox.com/api/navigation/#optimization) and caches the return value in a Redis database for 1 hour. When a new request arrives, the app first checks whether the return value already exists in the Redis cache. If it does, the app serves the cached value; otherwise, it sends a new request to the Mapbox API, caches the result, and then serves it.
18 |
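The cache-aside flow described above can be sketched with a plain in-memory dict standing in for Redis. This is a minimal illustration only; names like `fetch_from_api` are hypothetical and not part of this app, which uses `aioredis` and FastAPI instead.

```python
import json
import time

CACHE: dict[str, tuple[float, str]] = {}  # key -> (expiry time, cached JSON)
CACHE_EXPIRE = 3600  # seconds; mirrors CACHE_EXPIRE in .env


def fetch_from_api(coordinates: str) -> dict:
    # Stand-in for the real Mapbox call.
    return {"code": "Ok", "coordinates": coordinates}


def route_optima(coordinates: str) -> dict:
    entry = CACHE.get(coordinates)
    if entry and time.monotonic() < entry[0]:
        # Cache hit: serve the stored payload and flag it as cached.
        data = json.loads(entry[1])
        data["cache"] = True
        return data
    # Cache miss: call the API, store the serialized result, serve it fresh.
    data = fetch_from_api(coordinates)
    data["cache"] = False
    CACHE[coordinates] = (time.monotonic() + CACHE_EXPIRE, json.dumps(data))
    return data
```

The first call for a given coordinate string returns `"cache": false`; any repeat call within the expiry window returns the stored payload with `"cache": true`, which is exactly the behavior the app's `/route-optima/{coordinates}` endpoint exposes.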
19 | The app uses the following stack:
20 |
21 | * [Httpx](https://github.com/encode/httpx/) for sending the requests
22 | * [Redis](https://redis.io/) for caching
23 | * [RedisInsight](https://redislabs.com/redisinsight/) for monitoring the caches
24 | * [FastAPI](https://github.com/tiangolo/fastapi) for wrapping the original API and exposing a new one
25 | * [Docker](https://www.docker.com/) and [docker-compose](https://docs.docker.com/compose/) for deployment
26 |
27 | ## Requirements
28 |
29 | * [Docker](https://www.docker.com/)
30 | * [Docker-compose](https://docs.docker.com/compose/)
31 |
32 | ## Run the App
33 |
34 | * Clone the repository.
35 |
36 | ```
37 | git clone git@github.com:rednafi/redis-request-caching.git
38 | ```
39 |
 40 | * Open the `.env` file and provide your `MAPBOX_ACCESS_TOKEN`. You can get one from [here.](https://docs.mapbox.com/help/how-mapbox-works/access-tokens/)
41 |
42 | ```
43 | MAPBOX_ACCESS_TOKEN="Your-Mapbox-API-token"
44 | ```
45 |
 46 | * In the `.env` file, replace the host IP with your own local IP.
47 |
48 | * Go to the root directory and run:
49 |
50 | ```bash
51 | docker-compose up -d
52 | ```
53 |
54 | * Go to your browser and hit the following url:
55 |
56 | ```
57 | http://localhost:5000/route-optima/90.3866,23.7182;90.3742,23.7461
58 | ```
59 |
60 | * This should return a response like the following:
61 |
62 | ```json
63 | {
64 | "code":"Ok",
65 | "waypoints":[
66 | {
67 | "distance":26.041809241776583,
68 | "name":"",
69 | "location":[
70 | 90.386855,
71 | 23.718213
72 | ],
73 | "waypoint_index":0,
74 | "trips_index":0
75 | },
76 | {
77 | "distance":6.286653078791968,
78 | "name":"",
79 | "location":[
80 | 90.374253,
81 | 23.746129
82 | ],
83 | "waypoint_index":1,
84 | "trips_index":0
85 | }
86 | ],
87 | "trips":[
88 | {
89 | "geometry":{
90 | "coordinates":[
91 | [
92 | 90.386855,
93 | 23.718213
94 | ],
95 | "...
96 |
97 |
98 | ..."
99 | ],
100 | "type":"LineString"
101 | },
102 | "legs":[
103 | {
104 | "summary":"",
105 | "weight":3303.1,
106 | "duration":2842.8,
107 | "steps":[
108 |
109 | ],
110 | "distance":5250.2
111 | },
112 | {
113 | "summary":"",
114 | "weight":2536.5,
115 | "duration":2297,
116 | "steps":[
117 |
118 | ],
119 | "distance":4554.8
120 | }
121 | ],
122 | "weight_name":"routability",
123 | "weight":5839.6,
124 | "duration":5139.8,
125 | "distance":9805
126 | }
127 | ],
128 | "cache":false
129 | }
130 | ```
131 |
132 | * If you've hit the above URL for the first time, the `cache` attribute of the JSON response should show `false`. This means the response is being served by the original Mapbox API. However, hitting the same URL with the same coordinates again will return the cached response, and this time the `cache` attribute should show `true`.
133 |
134 | * You can also go to `http://localhost:5000/docs` and play around with the Swagger UI.
135 |
136 |
137 | 
138 |
139 |
140 |
141 | * The cached data can be monitored using RedisInsight. Go to `http://localhost:8001`.
142 |
143 | Select the `ADD REDIS DATABASE` button and click `add database`.
144 |
145 | 
146 |
147 | This should bring up a prompt like this:
148 |
149 |
150 | 
151 |
152 | Give your database a name and provide the localhost IP address in the form. Keep the username field blank and use `ubuntu` as the password.
153 |
154 | Once you've done that you'll be taken to a page like the following:
155 |
156 |
157 | 
158 |
159 | Select the `Browser` button in the top-left panel, where you'll be able to inspect your cached responses.
160 |
161 | 
162 |
163 |
164 |
165 | ## Remarks
166 |
167 | All the code in the blog was written and tested with Python 3.9 on a machine running Ubuntu 20.04.
168 |
169 | ## Disclaimer
170 |
171 | This app was made for demonstration purposes only, so it may not reflect the best practices of production-ready applications. Exposing APIs without authentication like this is not recommended.
172 |
173 |
--------------------------------------------------------------------------------
/app/app.py:
--------------------------------------------------------------------------------
1 | import json
2 | from datetime import timedelta
3 | from typing import Any
4 |
5 | import aioredis
6 | import httpx
7 | from fastapi import FastAPI
8 | from konfik import Konfik
9 |
10 | konfik = Konfik(".env")
11 | config = konfik.config
12 | redis = aioredis.from_url(config.REDIS_DSN, encoding="utf-8", decode_responses=True)
13 |
14 |
15 | async def get_routes_from_api(coordinates: str) -> dict:
16 | """Data from mapbox api."""
17 |
18 | async with httpx.AsyncClient() as client:
19 | base_url = "https://api.mapbox.com/optimized-trips/v1/mapbox/driving"
20 |
21 | geometries = "geojson"
22 | access_token = config.MAPBOX_ACCESS_TOKEN
23 |
24 | url = f"{base_url}/{coordinates}?geometries={geometries}&access_token={access_token}"
25 |
26 | response = await client.get(url)
27 | return response.json()
28 |
29 |
30 | async def get_routes_from_cache(redis: aioredis.client.Redis, key: str) -> str:
31 | """Data from redis."""
32 |
33 | val = await redis.get(key)
34 | return val
35 |
36 |
37 | async def set_routes_to_cache(
38 | redis: aioredis.client.Redis, key: str, value: str
39 | ) -> bool:
40 | """Data to redis."""
41 |
42 | state = await redis.setex(
43 | key,
44 | timedelta(seconds=int(config.CACHE_EXPIRE)),
45 | value=value,
46 | )
47 | return state
48 |
49 |
50 | async def route_optima(coordinates: str) -> dict:
51 |
52 |     # First, look for the data in the Redis cache.
53 | val = await get_routes_from_cache(redis, key=coordinates)
54 |
55 |     # On a cache hit, serve the data from the cache.
56 | if val is not None:
57 | data = json.loads(val)
58 | data["cache"] = True
59 | return data
60 |
61 | else:
62 |         # On a cache miss, send a request to the Mapbox API.
63 |         data = await get_routes_from_api(coordinates)
64 | 
65 |         # Save the response to Redis and serve it directly.
66 |         if data.get("code") == "Ok":
67 |             data["cache"] = False
68 |             # Cache the serialized payload, but keep `data` a dict so the
69 |             # endpoint always returns the same type.
70 |             await set_routes_to_cache(
71 |                 redis, key=coordinates, value=json.dumps(data)
72 |             )
73 |         return data
74 |
75 |
76 | app = FastAPI()
77 |
78 |
79 | @app.get("/route-optima/{coordinates}")
80 | async def view(coordinates: str) -> dict[str, Any]:
81 | """This will wrap our original route optimization API and
82 | incorporate Redis Caching. You'll only expose this API to
83 | the end user."""
84 |
85 | # coordinates = "90.3866,23.7182;90.3742,23.7461"
86 |
87 | return await route_optima(coordinates)
88 |
--------------------------------------------------------------------------------
/docker-compose.yml:
--------------------------------------------------------------------------------
1 | version: "3.8"
2 | services:
3 | redis:
4 | container_name: redis-cont
5 | image: redis:alpine
6 |
7 | environment:
8 | - REDIS_PASSWORD=ubuntu
9 | - REDIS_REPLICATION_MODE=master
10 |
11 | volumes:
12 |       # Save redis data to your current working directory.
13 | - ./redis-data:/data
14 |
15 | command:
16 |       # Save a snapshot if at least 1 key changed in the last 900 seconds.
17 | - --save 900 1
18 | - --port 6380
19 |
20 | # Set password.
21 | - --requirepass ubuntu
22 | network_mode: host
23 |
24 |
25 | # redisinsight: # Redis db visualization dashboard.
26 | # container_name: redis-insight
27 | # image: redislabs/redisinsight
28 | # ports:
29 | # - 8001:8001
30 | # volumes:
31 | # - redisinsight:/db
32 | # network_mode: host
33 |
34 |
35 | web: # Traveling Salesman Optimization api.
36 | container_name: web-cont
37 | build:
38 | context: ./
39 | dockerfile: Dockerfile
40 | ports:
41 | - 5000:5000
42 | depends_on:
43 | - redis
44 | env_file:
45 | - .env
46 | network_mode: host
47 |
48 |
49 | volumes:
50 | redis-data:
51 | redisinsight:
52 |
--------------------------------------------------------------------------------
/makefile:
--------------------------------------------------------------------------------
1 | .PHONY: help
2 |
3 | help:
4 | @grep -E '^[a-zA-Z_-]+:.*?## .*$$' $(MAKEFILE_LIST) | sort | awk 'BEGIN {FS = ":.*?## "}; {printf "\033[36m%-30s\033[0m %s\n", $$1, $$2}'
5 |
6 |
 7 | .PHONY: venvcheck
 8 | venvcheck: ## Check if venv is active
9 | ifeq ("$(VIRTUAL_ENV)","")
10 | @echo "Venv is not activated!"
11 | @echo "Activate venv first."
12 | @echo
13 | exit 1
14 | endif
15 |
16 |
17 | .PHONY: install
18 | install: venvcheck ## Install the dependencies
19 | @pip install -r requirements.txt
20 |
21 |
22 | .PHONY: lint
23 | lint: venvcheck ## Run Black and Isort linters
24 | @black .
25 | @isort .
26 | @mypy .
27 |
28 |
29 | .PHONY: update
30 | update: venvcheck
31 | @echo "fastapi uvloop konfik uvicorn httpx" | xargs -n 1 -d ' ' -P 5 -I {} sh -c "pip freeze | grep {}"
32 |
--------------------------------------------------------------------------------
/mypy.ini:
--------------------------------------------------------------------------------
1 | [mypy]
2 | follow_imports = skip
3 | ignore_missing_imports = True
4 | warn_no_return=False
5 | allow_untyped_globals=True
6 | allow_redefinition=True
7 | pretty=True
8 |
--------------------------------------------------------------------------------
/pyproject.toml:
--------------------------------------------------------------------------------
1 | [tool.isort]
2 | profile="black"
3 | atomic=true
4 | extend_skip_glob="migrations,scripts"
5 | line_length=88
6 |
7 | [tool.black]
8 | extend-exclude="migrations,scripts"
9 |
--------------------------------------------------------------------------------
/requirements-dev.in:
--------------------------------------------------------------------------------
1 | black
2 | isort
3 | flake8
4 | mypy
5 | pip-tools
6 |
--------------------------------------------------------------------------------
/requirements-dev.txt:
--------------------------------------------------------------------------------
1 | #
2 | # This file is autogenerated by pip-compile
3 | # To update, run:
4 | #
5 | # pip-compile --output-file=requirements-dev.txt requirements-dev.in
6 | #
7 | appdirs==1.4.4
8 | # via black
9 | black==21.5b0
10 | # via -r requirements-dev.in
11 | click==7.1.2
12 | # via
13 | # black
14 | # pip-tools
15 | flake8==3.9.1
16 | # via -r requirements-dev.in
17 | isort==5.8.0
18 | # via -r requirements-dev.in
19 | mccabe==0.6.1
20 | # via flake8
21 | mypy-extensions==0.4.3
22 | # via
23 | # black
24 | # mypy
25 | mypy==0.812
26 | # via -r requirements-dev.in
27 | pathspec==0.8.1
28 | # via black
29 | pep517==0.10.0
30 | # via pip-tools
31 | pip-tools==6.1.0
32 | # via -r requirements-dev.in
33 | pycodestyle==2.7.0
34 | # via flake8
35 | pyflakes==2.3.1
36 | # via flake8
37 | regex==2021.4.4
38 | # via black
39 | toml==0.10.2
40 | # via black
41 | typed-ast==1.4.3
42 | # via mypy
43 | typing-extensions==3.10.0.0
44 | # via mypy
45 |
46 | # The following packages are considered to be unsafe in a requirements file:
47 | # pip
48 |
--------------------------------------------------------------------------------
/requirements.in:
--------------------------------------------------------------------------------
1 | aioredis==2.0.0a1
2 | konfik
3 | httpx
4 | fastapi
5 | uvicorn
6 | uvloop
7 |
--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------
1 | #
2 | # This file is autogenerated by pip-compile
3 | # To update, run:
4 | #
5 | # pip-compile --output-file=requirements.txt requirements.in
6 | #
7 | aioredis==2.0.0a1
8 | # via -r requirements.in
9 | anyio==3.1.0
10 | # via httpcore
11 | asgiref==3.3.4
12 | # via uvicorn
13 | async-timeout==3.0.1
14 | # via aioredis
15 | certifi==2021.5.30
16 | # via
17 | # httpcore
18 | # httpx
19 | click==8.0.1
20 | # via uvicorn
21 | fastapi==0.65.2
22 | # via -r requirements.in
23 | h11==0.12.0
24 | # via
25 | # httpcore
26 | # uvicorn
27 | httpcore==0.15.0
28 | # via httpx
29 | httpx==0.23.0
30 | # via -r requirements.in
31 | idna==3.2
32 | # via
33 | # anyio
34 | # rfc3986
35 | konfik==2.0.5
36 | # via -r requirements.in
37 | pydantic==1.8.2
38 | # via fastapi
39 | pygments==2.9.0
40 | # via konfik
41 | python-dotenv==0.15.0
42 | # via konfik
43 | pyyaml==5.4.1
44 | # via konfik
45 | rfc3986[idna2008]==1.5.0
46 | # via httpx
47 | sniffio==1.2.0
48 | # via
49 | # anyio
50 | # httpcore
51 | # httpx
52 | starlette==0.14.2
53 | # via fastapi
54 | toml==0.10.2
55 | # via konfik
56 | typing-extensions==3.10.0.0
57 | # via
58 | # aioredis
59 | # pydantic
60 | uvicorn==0.14.0
61 | # via -r requirements.in
62 | uvloop==0.15.2
63 | # via -r requirements.in
64 |
--------------------------------------------------------------------------------