├── .gitignore
├── README.md
├── celery_app
│   ├── app
│   │   ├── __init__.py
│   │   ├── db.py
│   │   ├── service.py
│   │   └── tasks.py
│   ├── demo.py
│   ├── docker
│   │   ├── celery.dockerfile
│   │   ├── flower.dockerfile
│   │   └── postgis
│   │       ├── Dockerfile
│   │       ├── README.txt
│   │       ├── geopython.dump
│   │       ├── initdb-postgis.sh
│   │       └── update-postgis.sh
│   ├── requirements
│   │   ├── celery_app.txt
│   │   └── flower_app.txt
│   └── tests
│       ├── __init__.py
│       └── tests_unit
│           ├── __init__.py
│           └── test_tasks.py
├── docker-compose.yml
└── extras
    ├── geopython-final.key
    └── map.qgs
--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
1 | .idea/
2 | __pycache__/
3 | build_and_run.sh
4 | .pytest_cache
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # GeoPython Workshop 2018
2 | ## Task queues with Celery and RabbitMQ
3 | 
4 | Celery is a popular task queue library for the Python programming language. This demo application asynchronously executes a simple geoprocessing task which creates a geometry and writes it to a PostGIS database.
5 | 
6 | This application is a [docker-compose](https://docs.docker.com/compose/) orchestration of four Docker containers:
7 | 
8 | - [Celery](http://docs.celeryproject.org/en/latest/index.html) - Python Celery application to produce/consume messages (the worker)
9 | - [RabbitMQ](https://www.rabbitmq.com/#getstarted) - Message broker (the queue)
10 | - [Flower](http://flower.readthedocs.io/en/latest/) - Web app to monitor tasks
11 | - [PostGIS](https://postgis.net/) - Spatial database to store task output
12 | 
13 | ## Before you start...
14 | 
15 | This application requires [Docker Community Edition and Docker Compose](https://www.docker.com/community-edition).
16 | 
17 | #### Build and run the containers
18 | 
19 | 1. Open a terminal and change directory to the location of `docker-compose.yml`.
20 | 
21 | 2. 
Build and start the containers (some images are pulled from Docker Hub - be patient!):
22 | 
23 | ```bash
24 | docker-compose up --build
25 | ```
26 | 
27 | The console will display the stdout from each container as they build and run.
28 | 
29 | You can inspect the docker-compose file to gain a further understanding of the application structure.
30 | 
31 | 3. Check that everything is running OK. Use a new terminal window, as the first console is attached to the running containers:
32 | 
33 | ```bash
34 | docker-compose ps
35 | ```
36 | 
37 | You should see the following container IDs listed:
38 | ```bash
39 | celeryworkshopgeopython2018_celery_1
40 | celeryworkshopgeopython2018_flower_1
41 | celeryworkshopgeopython2018_postgis_1
42 | celeryworkshopgeopython2018_rabbitmq_1
43 | ```
44 | 
45 | The state of each container should be "Up".
46 | 
47 | 4. You may wish to execute commands inside a container:
48 | 
49 | ```bash
50 | docker exec -it <container-id> /bin/sh -c "<command>"
51 | ```
52 | For example:
53 | ```bash
54 | docker exec -it celeryworkshopgeopython2018_celery_1 /bin/sh -c "ls"
55 | ```
56 | 
57 | 5. Stop the containers by cancelling the process in the terminal window (Ctrl+C).
58 | 
59 | #### Re-deploying changes to the code
60 | 
61 | Each time you make a change to any Python files, run the command in step 2 - `docker-compose up --build`. This will rebuild the images, incorporating any changes.
62 | 
63 | #### Monitoring tasks with Flower
64 | 
65 | Flower is a web application for monitoring Celery tasks. Once the application is running, navigate to http://localhost:5555 in your web browser.
66 | 
67 | Flower provides monitoring of tasks executed by the worker.
Some of Flower's features include:
68 | 
69 | - View tasks on the queue
70 | - Inspect and control tasks
71 | - Inspect failed tasks' exception messages
72 | - Control worker process pool size
73 | - Monitor [periodic tasks](http://docs.celeryproject.org/en/latest/userguide/periodic-tasks.html)
74 | 
75 | 
76 | Not all of these features are available in this project.
77 | 
78 | #### Troubleshooting
79 | 
80 | The logs of each container can be inspected with the command:
81 | 
82 | ```bash
83 | docker logs <container-id>
84 | ```
85 | 
86 | The container IDs can be identified with the `docker ps` command.
87 | 
88 | If you think your container is broken beyond repair, run:
89 | 
90 | ```bash
91 | docker-compose up --force-recreate
92 | ```
93 | 
94 | **This will force a complete rebuild, so be aware it will also pull the external images again.**
95 | 
96 | ## Workshop begins...
97 | 
98 | #### Instantiate Celery application object
99 | 
100 | Before defining any tasks, a Celery application object must be instantiated.
101 | 
102 | *celery_app/app/tasks.py*
103 | ```python
104 | from celery import Celery
105 | 
106 | # URL to connect to broker
107 | broker_url = 'amqp://celery_user:secret@rabbitmq:5672/celery_app'
108 | 
109 | # Create Celery application
110 | application = Celery('tasks', broker=broker_url)
111 | ...
112 | ```
113 | 
114 | The `Celery` object is given two arguments:
115 | - The name of the current module - 'tasks'
116 | - The URL of the message broker, passed as the `broker` keyword argument
117 | 
118 | #### Defining a task
119 | 
120 | To define a task to run asynchronously, simply apply the `task` decorator to a function. This application has a single task, `do_task`, defined in `tasks.py`.
121 | 
122 | *celery_app/app/tasks.py*
123 | ```python
124 | ...
125 | 
126 | @application.task(base=AppBaseTask, bind=True, max_retries=3, soft_time_limit=5)
127 | def do_task(self, x, y):
128 |     """Performs simple geoprocessing task.
129 | 
130 |     Failed tasks are retried up to max_retries times via the Task class's on_retry method.
131 |     When tasks fail completely, they are handled by the Task class's on_failure method.
132 | 
133 |     Args:
134 |         self: instance of the Task
135 |         x: integer
136 |         y: integer
137 | 
138 |     Raises:
139 |         TaskError: failed tasks are handled by the parent task class.
140 | 
141 |     Returns:
142 |         None
143 |     """
144 |     try:
145 |         geoprocess(x, y)
146 |     except ServiceError as se:
147 |         self.retry(countdown=10, exc=se)
148 |     except Exception as exc:
149 |         raise TaskError(exc)
150 | ...
151 | ```
152 | 
153 | The `task` decorator takes several optional keyword arguments, including the maximum number of retries to attempt in the event of a failure and a timeout limit for hanging tasks.
154 | 
155 | #### Calling tasks asynchronously
156 | 
157 | To "queue" a task, use the `delay` function and provide the arguments which the `do_task` function requires.
158 | 
159 | *celery_app/demo.py*
160 | ```python
161 | from app.tasks import do_task
162 | 
163 | def call_do_task():
164 |     """Call the do_task task a number of times."""
165 |     iterations = 2500
166 | 
167 |     for task_execution in range(iterations):
168 | 
169 |         x = get_random_x()
170 |         y = get_random_y()
171 | 
172 |         do_task.delay(x, y)
173 | 
174 |         print(f"called do_task({x}, {y}) asynchronously")
175 | ...
176 | ```
177 | 
178 | ### Challenge 1
179 | 
180 | The challenge is to write code to add tasks to the queue.
181 | 
182 | Hints:
183 | 
184 | 1. Check out branch `challenge-1`:
185 | ```bash
186 | git checkout challenge-1
187 | ```
188 | 
189 | 2. Look in *celery_app/demo.py* for the TODO comment relating to challenge 1.
190 | 
191 | 3. Look in *celery_app/app/tasks.py* to identify the function names and expected parameters of the Celery task.
192 | 
193 | 4. In *celery_app/demo.py*, write a loop that iterates 2500 times. Inside the loop, use the two helper functions `get_random_x` and `get_random_y` to get suitable x and y values, then add these values to the queue. You will need to use the special Celery `delay` function.
194 | 
195 | 5. 
Rebuild the Docker containers after you have made your changes, [then run `demo.py`](#build-and-run-the-containers) inside the Docker container using `docker exec`. Check the progress of the running/queued tasks in the [Flower web app](#monitoring-tasks-with-flower).
196 | 
197 | 6. If you have pgAdmin or similar, you can connect to the database and see records being added to the grid_squares table. If you have a desktop GIS package (such as QGIS), you will be able to add the table as a vector layer to visualise the data.
198 | 
199 | The database connection details are:
200 | - DB = geopython-db
201 | - Port = 5435 (the host port mapped to 5432 inside the container)
202 | - User = geopython
203 | - Password = geopython
204 | - Schema = geopython
205 | 
206 | ### Challenge 2
207 | 
208 | The challenge is to get Celery to retry processing a task in the event of a recoverable error.
209 | 
210 | Hints:
211 | 
212 | 1. Check out branch `challenge-2`. **Make sure you commit any changes you wish to keep, or discard them, before switching branch.**
213 | 
214 | 2. Rebuild the Docker containers, then run `demo.py`. Observe that some tasks fail and none are retried.
215 | 
216 | 3. Check the Celery docs for [retrying failed tasks](http://docs.celeryproject.org/en/latest/userguide/tasks.html#retrying).
217 | 
218 | 4. Look in *celery_app/app/service.py* to see where a random artificial error is thrown. Note the exception type.
219 | 
220 | 5. In the place where the Celery task is executed (look for `TODO - challenge 2`), handle the above error using Python exception handling, and trigger the Celery retry mechanism. See if you can also add a retry delay.
221 | 
222 | 6. Rebuild the Docker containers after you have made your changes, then run `demo.py`. If you have been successful, you should see the retry counter in Flower incrementing, and most likely no failures. All tasks should pass after being retried.
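For intuition, the behaviour that Celery's `self.retry(countdown=...)` gives you can be mimicked in plain Python. This is an illustrative sketch only: `ServiceError` here is a stand-in for the error raised in *celery_app/app/service.py*, and `run_with_retries` / `flaky_task` are invented names, not part of the workshop code.

```python
import time


class ServiceError(Exception):
    """Illustrative stand-in for the recoverable error raised in app/service.py."""


def run_with_retries(task, max_retries=3, countdown=0):
    """Re-run `task` when it raises ServiceError, waiting `countdown`
    seconds between attempts; give up after `max_retries` retries."""
    retries = 0
    while True:
        try:
            return task()
        except ServiceError:
            if retries >= max_retries:
                raise  # out of retries: the error propagates, as a Celery task would finally fail
            retries += 1
            time.sleep(countdown)


calls = {"n": 0}

def flaky_task():
    """Fails twice with a recoverable error, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ServiceError("Random fail!")
    return "done"


print(run_with_retries(flaky_task, max_retries=3, countdown=0))  # prints: done
```

Celery layers scheduling on top of this idea: instead of sleeping in-process, `self.retry` re-queues the message with an ETA, so the worker stays free to process other tasks in the meantime.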
223 | 
224 | ### Challenge 3
225 | 
226 | This challenge shows multiple Celery workers on different machines all pulling tasks from the same queue, so that tasks are processed faster.
227 | 
228 | In this exercise, if the technology allows, you will connect your Celery worker to a central queue, process tasks, and write the results to a central database.
229 | 
230 | 1. Check out branch `challenge-3`.
231 | 
232 | 2. Edit *celery_app/app/tasks.py* line #7 to replace `***IP_ADDRESS_HERE***` with the IP address given to you in the workshop. This allows the Celery worker to connect to the central queue.
233 | 
234 | 3. Edit *celery_app/app/db.py* line #14 to replace `***IP_ADDRESS_HERE***` with the IP address given to you in the workshop. This connection is used to write results to a central database.
235 | 
236 | 4. Rebuild the Docker containers.
237 | 
238 | 5. Open the Flower web interface. If everything is working correctly, you should see all the workers connected to the central queue and the speeds at which they are processing tasks.
239 | 
240 | The results of the processing are shown on the screen.
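The geoprocessing inside each task (see *celery_app/app/service.py*) snaps the random point to the 1 km grid square containing it and inserts that square as a polygon (SRID 27700). The arithmetic can be sketched without a database; the helper names below are illustrative, not taken from the repo.

```python
def grid_square_bounds(x, y, size=1000):
    """Snap a point to the lower-left corner of its grid square and
    return (xmin, ymin, xmax, ymax); assumes non-negative coordinates."""
    xmin = int(x / size) * size
    ymin = int(y / size) * size
    return xmin, ymin, xmin + size, ymin + size


def polygon_wkt(x, y, size=1000):
    """Build the closed POLYGON WKT ring that the task inserts
    into the grid_squares table."""
    xmin, ymin, xmax, ymax = grid_square_bounds(x, y, size)
    return (f"POLYGON(({xmin} {ymin},{xmax} {ymin},"
            f"{xmax} {ymax},{xmin} {ymax},{xmin} {ymin}))")


print(polygon_wkt(123456, 654321))
# prints: POLYGON((123000 654000,124000 654000,124000 655000,123000 655000,123000 654000))
```

In the real task this WKT is wrapped in `ST_GeometryFromText(..., 27700)` so PostGIS stores it as a geometry in British National Grid coordinates.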
241 | 
242 | 
243 | 
244 | ### Tests
245 | 
246 | ```bash
247 | python -m pytest tests
248 | ```
249 | 
--------------------------------------------------------------------------------
/celery_app/app/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/thinkWhere/Celery-Workshop-GeoPython-2018/355beb3ba38d96cedd7d3680a6a54e60d0c541f8/celery_app/app/__init__.py
--------------------------------------------------------------------------------
/celery_app/app/db.py:
--------------------------------------------------------------------------------
1 | from contextlib import contextmanager
2 | 
3 | from psycopg2.pool import ThreadedConnectionPool
4 | 
5 | 
6 | pool = None
7 | 
8 | def create_connection_pool():
9 |     """Create a connection pool for the worker"""
10 |     pool = ThreadedConnectionPool(1, 4,
11 |                                   dbname="geopython-db",
12 |                                   user="geopython",
13 |                                   host="postgis",
14 |                                   port="5432",
15 |                                   password="geopython")
16 |     return pool
17 | 
18 | 
19 | @contextmanager
20 | def get_db_connection():
21 |     """Context manager for a database connection"""
22 |     connection = None
23 |     try:
24 |         # Set up the connection pool if it doesn't exist
25 |         global pool
26 |         if pool is None:
27 |             pool = create_connection_pool()
28 | 
29 |         connection = pool.getconn()
30 |         yield connection
31 |     finally:
32 |         # Only return the connection if one was actually checked out
33 |         if connection is not None:
34 |             pool.putconn(connection)
35 | 
36 | 
37 | @contextmanager
38 | def get_db_cursor():
39 |     """Context manager for a database cursor"""
40 |     with get_db_connection() as connection:
41 |         cursor = connection.cursor()
42 |         try:
43 |             yield cursor
44 |             connection.commit()
45 |         finally:
46 |             cursor.close()
--------------------------------------------------------------------------------
/celery_app/app/service.py:
--------------------------------------------------------------------------------
1 | import math
2 | import psycopg2
3 | 
4 | from app.db import get_db_cursor
5 | 
6 | 
7 | class ServiceError(Exception):
8 |     """Custom exception for service errors"""
9 |     pass
10 | 11 | 12 | def geoprocess(x, y): 13 | """ 14 | Implementation code for celery task 15 | :param x: x value for task 16 | :param y: y value for task 17 | """ 18 | with get_db_cursor() as cur: 19 | 20 | # check for intersection 21 | query = f"""SELECT gu_a3 22 | from uk_boundary 23 | WHERE ST_Intersects(geom, ST_GeomFromEWKT('SRID=27700;POINT({x} {y})')); 24 | """ 25 | cur.execute(query) 26 | 27 | # determine style code to use 28 | rows = cur.fetchall() 29 | if rows: 30 | style = rows[0][0] 31 | else: 32 | style = 'SEA' 33 | 34 | # get ref code 35 | ref = get_ref(x, y) 36 | 37 | # make object for db and insert it. Insert exceptions will raise to task. 38 | size = 1000 39 | xmin = int(x / size) * size 40 | xmax = xmin + size 41 | ymin = int(y / size) * size 42 | ymax = ymin + size 43 | string_for_db = f"ST_GeometryFromText('POLYGON(({xmin} {ymin},{xmax} {ymin},{xmax} {ymax},{xmin} {ymax},{xmin} {ymin}))',27700)" 44 | cur.execute(f"INSERT INTO grid_squares (grid_ref, style_cat, geom) VALUES('{ref}', '{style}', {string_for_db})") 45 | 46 | 47 | def get_ref(e, n): 48 | # Derived from 49 | # http://www.movable-type.co.uk/scripts/latlong-gridref.html 50 | gridChars = "ABCDEFGHJKLMNOPQRSTUVWXYZ" 51 | e100k = math.floor(e / 100000) 52 | n100k = math.floor(n / 100000) 53 | l1 = (19 - n100k) - (19 - n100k) % 5 + math.floor((e100k + 10) / 5) 54 | l2 = (19 - n100k) * 5 % 25 + e100k % 5 55 | letPair = gridChars[int(l1)] + gridChars[int(l2)] 56 | e100m = math.trunc(round(float(e) / 100)) 57 | egr = str(e100m).rjust(4, "0")[1:] 58 | if n >= 1000000: 59 | n = n - 1000000 60 | n100m = math.trunc(round(float(n) / 100)) 61 | ngr = str(n100m).rjust(4, "0")[1:] 62 | return letPair + egr + ngr 63 | -------------------------------------------------------------------------------- /celery_app/app/tasks.py: -------------------------------------------------------------------------------- 1 | from celery import Celery, Task 2 | 3 | from app.service import geoprocess, ServiceError 4 | 5 | # 
URL to connect to broker
6 | broker_url = 'amqp://celery_user:secret@rabbitmq:5672/celery_app'
7 | 
8 | # Create Celery application
9 | application = Celery('tasks', broker=broker_url)
10 | 
11 | 
12 | class TaskError(Exception):
13 |     """Custom exception for handling task errors"""
14 |     pass
15 | 
16 | 
17 | class AppBaseTask(Task):
18 |     """Base Task
19 | 
20 |     The base Task class can be used to define
21 |     shared behaviour between tasks.
22 | 
23 |     """
24 | 
25 |     abstract = True
26 | 
27 |     def on_retry(self, exc, task_id, args, kwargs, einfo):
28 |         # TODO: log out here
29 |         super(AppBaseTask, self).on_retry(exc, task_id, args, kwargs, einfo)
30 | 
31 |     def on_failure(self, exc, task_id, args, kwargs, einfo):
32 |         # TODO: log out here
33 |         super(AppBaseTask, self).on_failure(exc, task_id, args, kwargs, einfo)
34 | 
35 | 
36 | @application.task(base=AppBaseTask, bind=True, max_retries=3, soft_time_limit=5)
37 | def do_task(self, x, y):
38 |     """Performs a simple geoprocessing task.
39 | 
40 |     Failed tasks are retried up to max_retries times via the Task class's on_retry method.
41 |     When tasks fail completely, they are handled by the Task class's on_failure method.
42 | 
43 |     Args:
44 |         self: instance of the Task
45 |         x: integer
46 |         y: integer
47 | 
48 |     Raises:
49 |         TaskError: failed tasks are handled by the parent task class.
50 | 51 | Returns: 52 | None 53 | """ 54 | try: 55 | geoprocess(x, y) 56 | except ServiceError as se: 57 | self.retry(countdown=10, exc=se) 58 | except Exception as exc: 59 | raise TaskError(exc) 60 | -------------------------------------------------------------------------------- /celery_app/demo.py: -------------------------------------------------------------------------------- 1 | from app.tasks import do_task 2 | import random 3 | 4 | 5 | def call_do_task(): 6 | """Call the do_task task x times.""" 7 | iterations = 2500 8 | 9 | for task_execution in range(iterations): 10 | 11 | x = get_random_x() 12 | y = get_random_y() 13 | 14 | do_task.delay(x, y) 15 | 16 | print(f"called do_task({x}, {y}) asynchronously") 17 | 18 | 19 | def get_random_x(): 20 | return random.randint(0, 700000) 21 | 22 | 23 | def get_random_y(): 24 | return random.randint(0, 1300000) 25 | 26 | 27 | if __name__ == '__main__': 28 | 29 | call_do_task() 30 | -------------------------------------------------------------------------------- /celery_app/docker/celery.dockerfile: -------------------------------------------------------------------------------- 1 | FROM python:3.6.4-alpine3.7 2 | 3 | COPY requirements/celery_app.txt /celery_app/requirements/celery_app.txt 4 | 5 | RUN pip install --upgrade pip 6 | 7 | RUN apk update \ 8 | && apk add --virtual build-deps gcc python3-dev musl-dev \ 9 | && apk add postgresql-dev \ 10 | && pip install -r /celery_app/requirements/celery_app.txt \ 11 | && apk del build-deps 12 | 13 | WORKDIR /celery_app 14 | 15 | COPY . 
/celery_app 16 | 17 | CMD ["celery", "-A", "app.tasks", "worker", "--loglevel=info"] 18 | -------------------------------------------------------------------------------- /celery_app/docker/flower.dockerfile: -------------------------------------------------------------------------------- 1 | FROM python:3.6.4-alpine3.7 2 | 3 | COPY requirements/flower_app.txt /celery_app/requirements/flower_app.txt 4 | 5 | RUN pip install --upgrade pip 6 | 7 | RUN apk update \ 8 | && apk add --virtual build-deps gcc python3-dev musl-dev \ 9 | && apk add postgresql-dev \ 10 | && pip install -r /celery_app/requirements/flower_app.txt \ 11 | && apk del build-deps 12 | 13 | WORKDIR /celery_app 14 | 15 | COPY . /celery_app 16 | 17 | CMD ["celery", "-A", "app.tasks", "flower"] 18 | -------------------------------------------------------------------------------- /celery_app/docker/postgis/Dockerfile: -------------------------------------------------------------------------------- 1 | FROM postgres:9.6-alpine 2 | MAINTAINER Régis Belson 3 | 4 | ENV POSTGIS_VERSION 2.4.3 5 | ENV POSTGIS_SHA256 b9754c7b9cbc30190177ec34b570717b2b9b88ed271d18e3af68eca3632d1d95 6 | 7 | RUN set -ex \ 8 | \ 9 | && apk add --no-cache --virtual .fetch-deps \ 10 | ca-certificates \ 11 | openssl \ 12 | tar \ 13 | \ 14 | && wget -O postgis.tar.gz "https://github.com/postgis/postgis/archive/$POSTGIS_VERSION.tar.gz" \ 15 | && echo "$POSTGIS_SHA256 *postgis.tar.gz" | sha256sum -c - \ 16 | && mkdir -p /usr/src/postgis \ 17 | && tar \ 18 | --extract \ 19 | --file postgis.tar.gz \ 20 | --directory /usr/src/postgis \ 21 | --strip-components 1 \ 22 | && rm postgis.tar.gz \ 23 | \ 24 | && apk add --no-cache --virtual .build-deps \ 25 | autoconf \ 26 | automake \ 27 | g++ \ 28 | json-c-dev \ 29 | libtool \ 30 | libxml2-dev \ 31 | make \ 32 | perl \ 33 | \ 34 | && apk add --no-cache --virtual .build-deps-testing \ 35 | --repository http://dl-cdn.alpinelinux.org/alpine/edge/testing \ 36 | gdal-dev \ 37 | geos-dev \ 38 | 
proj4-dev \ 39 | protobuf-c-dev \ 40 | && cd /usr/src/postgis \ 41 | && ./autogen.sh \ 42 | # configure options taken from: 43 | # https://anonscm.debian.org/cgit/pkg-grass/postgis.git/tree/debian/rules?h=jessie 44 | && ./configure \ 45 | # --with-gui \ 46 | && make \ 47 | && make install \ 48 | && apk add --no-cache --virtual .postgis-rundeps \ 49 | json-c \ 50 | && apk add --no-cache --virtual .postgis-rundeps-testing \ 51 | --repository http://dl-cdn.alpinelinux.org/alpine/edge/testing \ 52 | geos \ 53 | gdal \ 54 | proj4 \ 55 | protobuf-c \ 56 | && cd / \ 57 | && rm -rf /usr/src/postgis \ 58 | && apk del .fetch-deps .build-deps .build-deps-testing 59 | 60 | COPY ./geopython.dump /geopython.dump 61 | RUN chmod 644 /geopython.dump 62 | COPY ./initdb-postgis.sh /docker-entrypoint-initdb.d/postgis.sh 63 | RUN chmod 644 /docker-entrypoint-initdb.d/postgis.sh 64 | COPY ./update-postgis.sh /usr/local/bin -------------------------------------------------------------------------------- /celery_app/docker/postgis/README.txt: -------------------------------------------------------------------------------- 1 | CREDITS - 2 | 3 | Postgis Alpine docker adapted from the container by Matt Dillon https://github.com/appropriate/docker-postgis 4 | Mapping data from Natural Earth http://www.naturalearthdata.com/downloads/10m-cultural-vectors/ -------------------------------------------------------------------------------- /celery_app/docker/postgis/initdb-postgis.sh: -------------------------------------------------------------------------------- 1 | #!/bin/sh 2 | 3 | #set -e 4 | 5 | # Perform all actions as $POSTGRES_USER 6 | export PGUSER="$POSTGRES_USER" 7 | 8 | # Load PostGIS into both template_database and $POSTGRES_DB 9 | echo "Loading PostGIS extensions into $DB" 10 | "${psql[@]}" --dbname=$POSTGRES_DB <<-'EOSQL' 11 | CREATE EXTENSION IF NOT EXISTS postgis; 12 | EOSQL 13 | 14 | "${psql[@]}" --dbname=$POSTGRES_DB -f /geopython.dump 
-------------------------------------------------------------------------------- /celery_app/docker/postgis/update-postgis.sh: -------------------------------------------------------------------------------- 1 | #!/bin/sh 2 | 3 | set -e 4 | 5 | # Perform all actions as $POSTGRES_USER 6 | export PGUSER="$POSTGRES_USER" 7 | 8 | POSTGIS_VERSION="${POSTGIS_VERSION%%+*}" 9 | 10 | # Load PostGIS into both template_database and $POSTGRES_DB 11 | for DB in template_postgis "$POSTGRES_DB"; do 12 | echo "Updating PostGIS extensions '$DB' to $POSTGIS_VERSION" 13 | psql --dbname="$DB" -c " 14 | -- Upgrade PostGIS (includes raster) 15 | CREATE EXTENSION IF NOT EXISTS postgis VERSION '$POSTGIS_VERSION'; 16 | ALTER EXTENSION postgis UPDATE TO '$POSTGIS_VERSION'; 17 | -- Upgrade Topology 18 | CREATE EXTENSION IF NOT EXISTS postgis_topology VERSION '$POSTGIS_VERSION'; 19 | ALTER EXTENSION postgis_topology UPDATE TO '$POSTGIS_VERSION'; 20 | -- Upgrade US Tiger Geocoder 21 | CREATE EXTENSION IF NOT EXISTS postgis_tiger_geocoder VERSION '$POSTGIS_VERSION'; 22 | ALTER EXTENSION postgis_tiger_geocoder UPDATE TO '$POSTGIS_VERSION'; 23 | " 24 | done -------------------------------------------------------------------------------- /celery_app/requirements/celery_app.txt: -------------------------------------------------------------------------------- 1 | amqp==2.2.2 2 | attrs==17.4.0 3 | billiard==3.5.0.3 4 | celery==4.1.0 5 | kombu==4.1.0 6 | pluggy==0.6.0 7 | py==1.5.2 8 | pytest==3.4.1 9 | pytz==2018.3 10 | six==1.11.0 11 | vine==1.1.4 12 | psycopg2==2.7.4 13 | -------------------------------------------------------------------------------- /celery_app/requirements/flower_app.txt: -------------------------------------------------------------------------------- 1 | amqp==2.2.2 2 | Babel==2.5.3 3 | billiard==3.5.0.3 4 | celery==4.1.0 5 | flower==0.9.2 6 | kombu==4.1.0 7 | pytz==2018.3 8 | tornado==5.0 9 | vine==1.1.4 10 | psycopg2==2.7.4 11 | 
-------------------------------------------------------------------------------- /celery_app/tests/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/thinkWhere/Celery-Workshop-GeoPython-2018/355beb3ba38d96cedd7d3680a6a54e60d0c541f8/celery_app/tests/__init__.py -------------------------------------------------------------------------------- /celery_app/tests/tests_unit/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/thinkWhere/Celery-Workshop-GeoPython-2018/355beb3ba38d96cedd7d3680a6a54e60d0c541f8/celery_app/tests/tests_unit/__init__.py -------------------------------------------------------------------------------- /celery_app/tests/tests_unit/test_tasks.py: -------------------------------------------------------------------------------- 1 | from unittest.mock import patch 2 | 3 | from celery.exceptions import Retry 4 | from pytest import raises 5 | 6 | from app.service import ServiceError 7 | from app.tasks import do_task, TaskError 8 | 9 | 10 | @patch('app.tasks.geoprocess') 11 | def test_do_task_success(mock_add): 12 | # Act 13 | do_task(2, 3) 14 | 15 | # Assert 16 | mock_add.assert_called_with(2, 3) 17 | 18 | 19 | @patch('app.tasks.geoprocess') 20 | @patch('app.tasks.do_task.retry') 21 | def test_do_task_handles_serviceerror_with_retry(mock_add_task_retry, mock_geoprocess): 22 | 23 | # Arrange 24 | mock_add_task_retry.side_effect = Retry() 25 | mock_geoprocess.side_effect = ServiceError('Random fail!') 26 | 27 | # Act/assert 28 | with raises(Retry): 29 | do_task(2, 1) 30 | 31 | 32 | @patch('app.tasks.geoprocess') 33 | @patch('app.tasks.do_task.retry') 34 | def test_do_task_handles_unexpected_exception(mock_add_task_retry, mock_geoprocess): 35 | 36 | # Arrange 37 | mock_add_task_retry.side_effect = Retry() 38 | mock_geoprocess.side_effect = NotADirectoryError('Random fail!') 39 | 40 | # Act/assert 41 | with 
raises(TaskError):
42 |         do_task(2, 1)
43 | 
--------------------------------------------------------------------------------
/docker-compose.yml:
--------------------------------------------------------------------------------
1 | version: "2"
2 | 
3 | services:
4 | 
5 |   postgis:
6 |     image: thinkwhere/postgis:geopy
7 |     environment:
8 |       - POSTGRES_USER=geopython
9 |       - POSTGRES_PASSWORD=geopython
10 |       - POSTGRES_DB=geopython-db
11 |     ports:
12 |       - "5435:5432"
13 | 
14 |   rabbitmq:
15 |     # Message broker for Celery
16 |     image: rabbitmq:3.7-management-alpine
17 |     environment:
18 |       - RABBITMQ_DEFAULT_USER=celery_user
19 |       - RABBITMQ_DEFAULT_PASS=secret
20 |       - RABBITMQ_DEFAULT_VHOST=celery_app
21 |     ports:
22 |       # The RabbitMQ management plugin - running on http://localhost:15672
23 |       - "15672:15672"
24 |       - "5672:5672"
25 | 
26 |   celery:
27 |     # Python application which runs the Celery worker
28 |     build:
29 |       context: celery_app
30 |       dockerfile: docker/celery.dockerfile
31 |     links:
32 |       - rabbitmq
33 |     depends_on:
34 |       - rabbitmq
35 |       - postgis
36 | 
37 |   flower:
38 |     # Monitoring application - running on http://localhost:5555
39 |     build:
40 |       context: celery_app
41 |       dockerfile: docker/flower.dockerfile
42 |     links:
43 |       - celery
44 |       - rabbitmq
45 |     depends_on:
46 |       - celery
47 |     ports:
48 |       - "5555:5555"
49 | 
--------------------------------------------------------------------------------
/extras/geopython-final.key:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/thinkWhere/Celery-Workshop-GeoPython-2018/355beb3ba38d96cedd7d3680a6a54e60d0c541f8/extras/geopython-final.key
--------------------------------------------------------------------------------
/extras/map.qgs:
--------------------------------------------------------------------------------
[QGIS project file (XML). The markup was stripped during extraction and only text nodes survive. Recoverable details: the project CRS is EPSG:27700 (OSGB 1936 / British National Grid), and the project contains a single PostGIS vector layer, `grid_squares`, with the datasource `dbname='geopython-db' host=localhost port=5435 sslmode=disable key='id' srid=27700 type=Polygon table="geopython"."grid_squares" (geom) sql=`.]
--------------------------------------------------------------------------------