├── .github
│   └── workflows
│       └── lint.yml
├── .gitignore
├── Dockerfile
├── Makefile
├── README.md
├── app
│   ├── main.py
│   ├── templates
│   │   └── treeCounter.html
│   └── util.py
├── docker-compose.yaml
├── grafana_datasources.yaml
├── poetry.lock
├── prometheus.yaml
├── pyproject.toml
└── slides.pdf

/.github/workflows/lint.yml:
--------------------------------------------------------------------------------
1 | name: Python Lint
2 | 
3 | on: [push]
4 | 
5 | jobs:
6 |   build:
7 | 
8 |     runs-on: ubuntu-latest
9 | 
10 |     steps:
11 |     - uses: actions/checkout@v1
12 |     - name: Set up Python ^3.10
13 |       uses: actions/setup-python@v1
14 |       with:
15 |         python-version: ^3.10
16 |     - name: Lint with pycodestyle
17 |       run: |
18 |         pip install pycodestyle
19 |         pycodestyle --max-line-length 100 ./app
20 |     - name: Lint app with flake8
21 |       run: |
22 |         pip install flake8
23 |         flake8 --max-line-length 100 app
24 |     - name: Lint app with black
25 |       run: |
26 |         pip install black
27 |         black --check --line-length 100 app
--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
1 | .venv
2 | 
3 | __pycache__
--------------------------------------------------------------------------------
/Dockerfile:
--------------------------------------------------------------------------------
1 | FROM python:3.10.1-slim-buster
2 | 
3 | RUN pip install poetry
4 | 
5 | COPY ./pyproject.toml /
6 | COPY ./poetry.lock /
7 | RUN poetry install
8 | 
9 | COPY ./app /python_server
10 | 
11 | WORKDIR /python_server
12 | EXPOSE 8001
13 | CMD ["poetry", "run", "python", "main.py"]
14 | 
--------------------------------------------------------------------------------
/Makefile:
--------------------------------------------------------------------------------
1 | deps:
2 | 	poetry install
3 | 
4 | dev: deps
5 | 	cd app && poetry run python main.py
6 | 
7 | lint: lint-black lint-flake lint-pycodestyle
8 | 
9 | lint-black:
10 | 	poetry run black 
--check --line-length 100 app
11 | 
12 | lint-flake:
13 | 	poetry run flake8 --max-line-length 100 app
14 | 
15 | lint-pycodestyle:
16 | 	poetry run pycodestyle --max-line-length 100 ./app
17 | 
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Workshop: We know what your app did last summer. Do you? 👀
2 | ## Observing Python Applications with Prometheus 🔥🐍
3 | 
4 | ⚠️ ***This repository was created for PyCon DE 2022; please note that the content may not be representative of Ecosia's current engineering standards.***
5 | 
6 | ### Objective
7 | 
8 | In the directory `app/`, we have an application that runs a Python web server with the endpoint `/treecounter`. It displays the total number of trees planted by Ecosia users. We want to start observing the behavior of this application at runtime by tracking and exporting metric data.
9 | 
10 | We will do this using the time-series database system [Prometheus](https://prometheus.io), which uses a "pull" model to collect data from running applications. This means that the applications need to "export" their data, so that Prometheus is able to "scrape" the metric data from them. This is typically done via an HTTP endpoint (`/metrics`, by convention).
11 | 
12 | We will use the [Prometheus Python client library](https://github.com/prometheus/client_python) to track metrics in our code.
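As a quick, standalone preview of what the client library provides (this is not part of the workshop app, and the metric name and label are illustrative), the sketch below creates a labelled counter in its own registry and prints the exposition text that Prometheus would scrape from a `/metrics` endpoint:

```python
from prometheus_client import CollectorRegistry, Counter, generate_latest

# A dedicated registry keeps this preview separate from the library's
# global default registry.
registry = CollectorRegistry()
preview_counter = Counter(
    "preview_requests_total",
    "total number of requests (preview)",
    ["endpoint"],
    registry=registry,
)
preview_counter.labels(endpoint="/treecounter").inc()

# generate_latest renders the text format that is served on /metrics
# and scraped by Prometheus.
exposition = generate_latest(registry).decode("utf-8")
print(exposition)
```

The printed output contains `# HELP`/`# TYPE` comment lines plus one sample line per label combination, which is exactly what you will see in your browser later in the workshop.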
13 | 
14 | ### Agenda
15 | 
16 | * [Section 1: Exposing metrics](#section-1-exposing-metrics)
17 | * [Section 2: Creating custom metrics](#section-2-creating-custom-metrics)
18 | * [Section 3: Scraping Metrics with Prometheus and creating Dashboards with Grafana](#section-3-scraping-metrics-with-prometheus-and-creating-dashboards-with-grafana)
19 | * [Bonus Material: Histograms in Prometheus](#bonus-material-histograms-in-prometheus)
20 | 
21 | ### Prerequisites
22 | 
23 | For this workshop you will need [Python 3.10](https://installpython3.com/), [Poetry](https://python-poetry.org/docs/#installation), [Docker](https://docs.docker.com/get-docker/) and [Docker Compose](https://docs.docker.com/compose/install/) running on your machine. *(On macOS, Docker Compose is installed by default with Docker.)*
24 | 
25 | 
26 | Please note that this repository is linted using [black](https://github.com/psf/black), [flake8](https://flake8.pycqa.org) and [pycodestyle](https://pycodestyle.pycqa.org) with a max line length of 100. This linting is enforced with GitHub Actions, configured [here](./.github/workflows/lint.yml).
27 | 
28 | ## Workshop Content
29 | 
30 | ---
31 | 
32 | ### Section 1: Exposing metrics ⚙️
33 | 
34 | ---
35 | 
36 | For this section, you can use the following command to install dependencies and run the dev server locally.
37 | 
38 | ```sh
39 | # The Makefile allows us to run commands behind a target name
40 | # Make is not available on Windows, so you will need to copy the commands from the Makefile and run them directly
41 | make dev
42 | ```
43 | 
44 | To export our metrics we will need to have a server with a handler to *handle* the metrics. We can do this by changing the base class of our HTTPRequestHandler to the `MetricsHandler` provided by the Prometheus Python client. We also need to add the condition for the `/metrics` endpoint below our `/treecounter` endpoint condition. *(Don't forget to import the `MetricsHandler` from `prometheus_client`.)*
45 | 
46 | ``` python
47 | class HTTPRequestHandler(MetricsHandler):
48 |     ...
49 |     ...
50 |         elif endpoint == '/metrics':
51 |             return super(HTTPRequestHandler, self).do_GET()
52 | ```
53 | 
54 | Now try restarting the server (`Ctrl+C` will stop it) and go to `localhost:8001/metrics`. What do you see? What do you see if you visit `localhost:8001/treecounter` a few times and then go back to the `/metrics` endpoint? What do these base metrics represent?
55 | 
56 | ---
57 | 
58 | ### Section 2: Creating custom metrics 🔧
59 | 
60 | ---
61 | 
62 | Now that we can expose metrics, we need to create them. Prometheus has a few different data types, but the most straightforward is a `Counter`. Counters always increment and can be used to track, for example, the number of requests received (you can then divide this count by time to calculate requests per second). To create a `Counter`, import it from the Prometheus Python client and instantiate it.
63 | 
64 | ``` python
65 | from prometheus_client import Counter
66 | requestCounter = Counter('requests_total', 'total number of requests', ['status', 'endpoint']) # can be declared as a global variable
67 | ```
68 | 
69 | Restart your server again and you should be able to see your metric exposed on `/metrics` - success! (Except, since the counter has labels, no series will appear until it is first incremented - not quite useful, yet)
70 | 
71 | ```
72 | # HELP requests_total total number of requests
73 | # TYPE requests_total counter
74 | 
75 | 
76 | ```
77 | 
78 | To use our metric in practice, we want to increment the counter when tracking events in our code. To increment the `Counter` type by one, we can call `.inc()` - for example, using the request counter we created above, we could call:
79 | 
80 | ``` python
81 | requestCounter.labels(status='200', endpoint='/treecounter').inc()
82 | ```
83 | 
84 | **You should add these `.inc()` calls at the place in your code where the event you want to track occurs.** If you want to increment by an amount other than 1, you can, for example, use `.inc(1.5)`.
85 | 
86 | Add the call to `inc()` in your code.
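As a sketch of one possible placement (not the only valid one), the labels can be set at the point where the outcome of a request is known. The mini handler function below stands in for the real `do_GET` in `app/main.py`, and `get_sample_value` is the client library's helper for reading a metric value back in tests:

```python
from prometheus_client import CollectorRegistry, Counter

registry = CollectorRegistry()  # isolated registry for this sketch
requestCounter = Counter(
    "requests_total", "total number of requests", ["status", "endpoint"], registry=registry
)


def handle_get(path):
    # Count each request with its outcome as labels, at the moment
    # the outcome is known (mirrors the branching in do_GET).
    if path == "/treecounter":
        requestCounter.labels(status="200", endpoint="/treecounter").inc()
        return 200
    requestCounter.labels(status="404", endpoint=path).inc()
    return 404


handle_get("/treecounter")
handle_get("/treecounter")
handle_get("/nope")
value = registry.get_sample_value(
    "requests_total", {"status": "200", "endpoint": "/treecounter"}
)
print(value)  # 2.0
```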
Try experimenting with where you place the call; what difference does the placement make to your metric?
87 | 
88 | ---
89 | 
90 | ### Section 3: Scraping Metrics with Prometheus and creating Dashboards with Grafana 🔥📈
91 | 
92 | ---
93 | 
94 | So far we've been able to instrument our application, such that it is now exporting metrics about its runtime behavior. However, we still need to collect those metrics and store the data so that we can query it back out, graph it over time, and build dashboards.
95 | 
96 | There is a `prometheus.yaml` configuration file here in the repo, which is already set up to scrape metrics from our application. We will run our application, Prometheus, and Grafana inside Docker, so that they are easily able to find each other.
97 | 
98 | #### Run the application, Prometheus and Grafana in Docker
99 | 
100 | To build the application Docker image and start the application container as well as Prometheus and Grafana together, run the following command (from the root of this repo):
101 | 
102 | ``` sh
103 | docker-compose up --build
104 | ```
105 | 
106 | *If you see errors, it may be because the previous version of the application is still running and occupying the port that Docker is now trying to bind.*
107 | 
108 | You should then be able to access the Prometheus dashboard at `http://localhost:9090`.
109 | 
110 | #### Navigating the Prometheus UI and using PromQL to query metrics
111 | 
112 | Prometheus should find and immediately start scraping metrics from the application container. You can check that it has found the application container by looking at the list of "targets" that Prometheus is scraping: `http://localhost:9090/targets`.
113 | 
114 | Prometheus uses its own query language called [PromQL](https://prometheus.io/docs/prometheus/latest/querying/basics/). You can enter PromQL queries on the `/graph` page of the Prometheus UI.
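To build intuition for what a query like `rate()` computes, here is a plain-Python sketch (with made-up sample values) of the core idea: the per-second increase of a counter over a time window. Real PromQL `rate()` additionally handles counter resets and extrapolation at the window edges:

```python
# Hypothetical scrapes of requests_total: (timestamp_seconds, counter_value)
samples = [(0, 10.0), (15, 13.0), (30, 19.0), (45, 22.0), (60, 28.0)]


def simple_rate(samples):
    """Per-second rate over the window, like PromQL's rate(metric[1m]),
    ignoring counter resets and extrapolation."""
    (t0, v0), (t1, v1) = samples[0], samples[-1]
    return (v1 - v0) / (t1 - t0)


print(simple_rate(samples))  # 0.3 requests per second
```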
115 | 
116 | To see the counter exported previously, we can use the PromQL query:
117 | 
118 | ``` promql
119 | requests_total
120 | ```
121 | 
122 | If we want to see this graphed as a per-second rate over time, we use the PromQL query:
123 | 
124 | ``` promql
125 | rate(requests_total[1m])
126 | ```
127 | 
128 | #### Making Dashboards with Grafana
129 | 
130 | [Grafana](http://grafana.com) is an open-source metric visualization tool, which can be used to create dashboards containing many graphs. Grafana can visualize data from multiple sources, including Prometheus. The `docker-compose` command used in the previous section will also start a Grafana container, which uses the Grafana configuration file in this repo to connect to Prometheus. After running the startup command mentioned above (`docker-compose up --build`), you'll be able to find Grafana on `http://localhost:3000`.
131 | 
132 | Grafana uses authentication, which, for this workshop, is configured in the `docker-compose.yaml` file. The credentials configured for this workshop are:
133 | 
134 | ```
135 | username: ecosia
136 | password: workshop
137 | ```
138 | 
139 | Time to get creative and visualize your metrics in a meaningful way, so you can observe your application and even set up alerts for any behavior you want to be informed about! We will show you in the workshop how to build a simple dashboard panel, but there is much more to explore. Useful information can be found on both the [Prometheus](https://prometheus.io) and [Grafana](http://grafana.com) websites.
140 | 
141 | ✨ **Go forth and Monitor!!** ✨
142 | 
143 | ---
144 | 
145 | ### Bonus Material: Histograms in Prometheus 📊
146 | 
147 | ---
148 | 
149 | Due to the workshop length we likely will not get to this material; however, for those who want to continue, this section contains some additional challenges and information to further support you in monitoring your applications.
150 | 
151 | We have already exposed metrics of type `Counter`.
[Prometheus has four core metric types](https://prometheus.io/docs/concepts/metric_types/):
152 | 
153 | - Counter
154 | - Gauge
155 | - Histogram
156 | - Summary
157 | 
158 | A histogram is a little more complicated than a Counter, but it can be very useful!
159 | 
160 | For example, a histogram is useful when you want approximations over a known range of values, such as:
161 | * response duration
162 | * request size
163 | 
164 | In Prometheus, a histogram measures the frequency of value observations that fall into `buckets`.
165 | For example, we can define a set of buckets to measure request latency. These buckets are groupings which give an indication of how long
166 | a single request took, e.g. 0.0 - 0.25s, 0.25 - 0.50s, 0.50 - 0.75s, 0.75 - 1.00s, 1.00s+. The duration of every request will fall into one of these buckets.
167 | 
168 | In Prometheus, a histogram is cumulative, and there are default buckets defined, so you don't need to specify them yourself.
169 | When using a histogram, Prometheus won't store the exact request duration, but instead stores the frequency of requests that fall into these buckets.
170 | 
171 | **Let's make a histogram for request latencies**
172 | 
173 | The first thing we will do is add the import:
174 | 
175 | ``` python
176 | from prometheus_client import Histogram
177 | ```
178 | 
179 | Then define our histogram:
180 | 
181 | ``` python
182 | requestHistogram = Histogram('request_latency_seconds', 'Request latency', ['endpoint'])
183 | requestHistogramTreeCounter = requestHistogram.labels(endpoint='/treecounter')
184 | ```
185 | 
186 | Finally, we add the following decorator to the function whose duration we want to measure:
187 | 
188 | ``` python
189 | @requestHistogramTreeCounter.time()
190 | def xxxx():
191 |     ...
192 | ```
193 | 
194 | Now restart the application and make a few requests.
👀
195 | 
196 | #### How to interpret the histogram
197 | 
198 | If we curl the `/metrics` endpoint again, a portion of the output will look something like this:
199 | 
200 | ```
201 | request_latency_seconds_count{endpoint="/treecounter"} 5.0
202 | ```
203 | 
204 | This is a `count` again! And we can see that the endpoint has received 5 requests.
205 | 
206 | We also see our buckets. Here `le` means `less than or equal to`.
207 | We can see from this output that the histogram is cumulative:
208 | 
209 | ```
210 | request_latency_seconds_bucket{endpoint="/treecounter",le="0.005"} 1.0
211 | request_latency_seconds_bucket{endpoint="/treecounter",le="0.01"} 1.0
212 | request_latency_seconds_bucket{endpoint="/treecounter",le="0.025"} 1.0
213 | request_latency_seconds_bucket{endpoint="/treecounter",le="0.05"} 1.0
214 | request_latency_seconds_bucket{endpoint="/treecounter",le="0.075"} 1.0
215 | request_latency_seconds_bucket{endpoint="/treecounter",le="0.1"} 1.0
216 | request_latency_seconds_bucket{endpoint="/treecounter",le="0.25"} 4.0
217 | request_latency_seconds_bucket{endpoint="/treecounter",le="0.5"} 4.0
218 | request_latency_seconds_bucket{endpoint="/treecounter",le="0.75"} 5.0
219 | request_latency_seconds_bucket{endpoint="/treecounter",le="1.0"} 5.0
220 | request_latency_seconds_bucket{endpoint="/treecounter",le="2.5"} 5.0
221 | request_latency_seconds_bucket{endpoint="/treecounter",le="5.0"} 5.0
222 | request_latency_seconds_bucket{endpoint="/treecounter",le="7.5"} 5.0
223 | request_latency_seconds_bucket{endpoint="/treecounter",le="10.0"} 5.0
224 | request_latency_seconds_bucket{endpoint="/treecounter",le="+Inf"} 5.0
225 | ```
226 | 
227 | Finally, we see the total sum of all observed values:
228 | 
229 | ```
230 | request_latency_seconds_sum{endpoint="/treecounter"} 1.13912788000016
231 | ```
232 | 
233 | To learn more, you can read about [Prometheus Histogram best practices](https://prometheus.io/docs/practices/histograms/).
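Quantiles can be estimated from these cumulative buckets; this is the core idea behind PromQL's `histogram_quantile()`. A stdlib sketch of that idea, using the example bucket data above (real PromQL operates on rates of these series rather than raw totals, and the interpolation details differ slightly):

```python
# (upper_bound_seconds, cumulative_count) pairs from the example output above
buckets = [
    (0.005, 1.0), (0.01, 1.0), (0.025, 1.0), (0.05, 1.0), (0.075, 1.0),
    (0.1, 1.0), (0.25, 4.0), (0.5, 4.0), (0.75, 5.0), (1.0, 5.0),
    (2.5, 5.0), (5.0, 5.0), (7.5, 5.0), (10.0, 5.0), (float("inf"), 5.0),
]


def estimate_quantile(q, buckets):
    """Rough histogram_quantile(): find the first bucket whose cumulative
    count reaches rank q * total, then interpolate linearly inside it."""
    total = buckets[-1][1]
    rank = q * total
    prev_bound, prev_count = 0.0, 0.0
    for bound, count in buckets:
        if count >= rank:
            # Assume observations are evenly spread within the bucket
            return prev_bound + (bound - prev_bound) * (rank - prev_count) / (count - prev_count)
        prev_bound, prev_count = bound, count


print(estimate_quantile(0.5, buckets))  # 0.175 (estimated median latency in seconds)
```

With the data above, the median lands inside the 0.1-0.25s bucket, so the exact `request_latency_seconds_sum / _count` average and this estimate tell a consistent story about typical latency.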
234 | 
235 | ---
236 | 
237 | ## Troubleshooting
238 | 
239 | ### Port conflict
240 | 
241 | If you see the error message below, it is likely because you already have either the Docker or non-Docker version of the application running.
242 | 
243 | ```
244 | Error starting userland proxy: listen tcp4 0.0.0.0:8001: bind: address already in use
245 | ```
246 | 
247 | Check your terminal windows to see if you can find where it is running and use `Ctrl+C` to stop it. Alternatively, you can use `lsof -i :8001` to find the `pid` of the process running on this port and `kill <pid>` to stop it. You may have to run these commands as `sudo`.
248 | 
249 | ### Python version
250 | 
251 | If the app will not start locally and you receive an error referring to the Python version, it may be because you do not have a suitable version of Python available on your machine. The version should be 3.10 or above.
252 | 
253 | ---
254 | 
255 | The latest version of this material has been developed by @vinesse and @sleepypioneer, with previous iterations supported by @emilywoods and @jasongwartz.
--------------------------------------------------------------------------------
/app/main.py:
--------------------------------------------------------------------------------
1 | import requests
2 | from string import Template
3 | import time
4 | import random
5 | from http.server import BaseHTTPRequestHandler, HTTPServer
6 | from util import artificial_503, artificial_latency
7 | 
8 | 
9 | HOST_NAME = "0.0.0.0" # This will map to available port in docker
10 | PORT_NUMBER = 8001
11 | 
12 | trees_api_url = "https://api.ecosia.org/v1/trees/count"
13 | 
14 | with open("./templates/treeCounter.html", "r") as f:
15 |     html_string = f.read()
16 |     html_template = Template(html_string)
17 | 
18 | 
19 | def fetch_tree_count():
20 |     r = requests.get(trees_api_url) if random.random() > 0.15 else artificial_503()
21 |     if r.status_code == 200:
22 |         return r.json()["count"]
23 |     return 0
24 | 
25 | 
26 | class HTTPRequestHandler(BaseHTTPRequestHandler):
27 |     @artificial_latency
28 |     def get_treecounter(self):
29 |         self.do_HEAD()
30 |         tree_count = fetch_tree_count()
31 |         bytes_template = bytes(html_template.substitute(counter=tree_count), "utf-8")
32 |         self.wfile.write(bytes_template)
33 | 
34 |     def do_HEAD(self):
35 |         self.send_response(200)
36 |         self.send_header("Content-type", "text/html")
37 |         self.end_headers()
38 | 
39 |     def do_GET(self):
40 |         endpoint = self.path
41 |         if endpoint == "/treecounter":
42 |             return self.get_treecounter()
43 |         else:
44 |             self.send_error(404)
45 | 
46 | 
47 | if __name__ == "__main__":
48 |     myServer = HTTPServer((HOST_NAME, PORT_NUMBER), HTTPRequestHandler)
49 |     print(time.asctime(), "Server Starts - %s:%s" % (HOST_NAME, PORT_NUMBER))
50 |     try:
51 |         myServer.serve_forever()
52 |     except KeyboardInterrupt:
53 |         pass
54 |     myServer.server_close()
55 |     print(time.asctime(), "Server Stops - %s:%s" % (HOST_NAME, PORT_NUMBER))
56 | 
--------------------------------------------------------------------------------
/app/templates/treeCounter.html:
-------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | Trees Planted by Ecosia Users 6 | 7 | 28 | 29 | 30 |
31 |     <div>
32 |       <img src="..." alt="Ecosia.org logo">
33 |       <div>$counter</div>
34 |       <div>🌳🌳🌳</div>
35 |       <div>Trees planted by Ecosia users</div>
36 |     </div>
37 | 
38 | 
39 | 
--------------------------------------------------------------------------------
/app/util.py:
--------------------------------------------------------------------------------
1 | import random
2 | from requests import Response
3 | from time import sleep
4 | 
5 | 
6 | def artificial_503():
7 |     r = Response()
8 |     r.status_code = 503
9 |     r.reason = "Uh oh - artificial 503"
10 |     r.json = lambda: {}  # keep json() callable, like a real Response
11 |     return r
12 | 
13 | 
14 | def artificial_latency(func):
15 |     def randomised_latency(request_handler):
16 |         random_number = random.random()
17 |         sleep(random_number)
18 |         return func(request_handler)
19 | 
20 |     return randomised_latency
21 | 
--------------------------------------------------------------------------------
/docker-compose.yaml:
--------------------------------------------------------------------------------
1 | version: "3"
2 | 
3 | services:
4 |   app:
5 |     build: .
6 |     ports:
7 |       - "8001:8001"
8 | 
9 |   prometheus:
10 |     image: prom/prometheus
11 |     ports:
12 |       - "9090:9090"
13 |     volumes:
14 |       - "./prometheus.yaml:/etc/prometheus/prometheus.yml"
15 | 
16 |   grafana:
17 |     image: grafana/grafana
18 |     ports:
19 |       - "3000:3000"
20 |     environment:
21 |       - "GF_SECURITY_ADMIN_USER=ecosia"
22 |       - "GF_SECURITY_ADMIN_PASSWORD=workshop"
23 |     volumes:
24 |       - "./grafana_datasources.yaml:/etc/grafana/provisioning/datasources/prometheus_docker-compose.yaml"
25 | 
26 | 
27 | 
--------------------------------------------------------------------------------
/grafana_datasources.yaml:
--------------------------------------------------------------------------------
1 | # config file version
2 | apiVersion: 1
3 | 
4 | # list of datasources to insert/update depending
5 | # what's available in the database
6 | datasources:
7 |   # name of the datasource. Required
8 |   - name: Prometheus(docker-compose)
9 |     # datasource type. Required
10 |     type: prometheus
11 |     # access mode. proxy or direct (Server or Browser in the UI).
Required 12 | access: proxy 13 | # url 14 | url: http://prometheus:9090 15 | # mark as default datasource. Max one per org 16 | isDefault: true 17 | version: 1 18 | # allow users to edit datasources from the UI. 19 | editable: false -------------------------------------------------------------------------------- /poetry.lock: -------------------------------------------------------------------------------- 1 | [[package]] 2 | name = "atomicwrites" 3 | version = "1.4.0" 4 | description = "Atomic file writes." 5 | category = "main" 6 | optional = false 7 | python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*" 8 | 9 | [[package]] 10 | name = "attrs" 11 | version = "21.4.0" 12 | description = "Classes Without Boilerplate" 13 | category = "main" 14 | optional = false 15 | python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*" 16 | 17 | [package.extras] 18 | dev = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "six", "mypy", "pytest-mypy-plugins", "zope.interface", "furo", "sphinx", "sphinx-notfound-page", "pre-commit", "cloudpickle"] 19 | docs = ["furo", "sphinx", "zope.interface", "sphinx-notfound-page"] 20 | tests = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "six", "mypy", "pytest-mypy-plugins", "zope.interface", "cloudpickle"] 21 | tests_no_zope = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "six", "mypy", "pytest-mypy-plugins", "cloudpickle"] 22 | 23 | [[package]] 24 | name = "black" 25 | version = "22.3.0" 26 | description = "The uncompromising code formatter." 
27 | category = "dev" 28 | optional = false 29 | python-versions = ">=3.6.2" 30 | 31 | [package.dependencies] 32 | click = ">=8.0.0" 33 | mypy-extensions = ">=0.4.3" 34 | pathspec = ">=0.9.0" 35 | platformdirs = ">=2" 36 | tomli = {version = ">=1.1.0", markers = "python_version < \"3.11\""} 37 | 38 | [package.extras] 39 | colorama = ["colorama (>=0.4.3)"] 40 | d = ["aiohttp (>=3.7.4)"] 41 | jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"] 42 | uvloop = ["uvloop (>=0.15.2)"] 43 | 44 | [[package]] 45 | name = "certifi" 46 | version = "2021.10.8" 47 | description = "Python package for providing Mozilla's CA Bundle." 48 | category = "main" 49 | optional = false 50 | python-versions = "*" 51 | 52 | [[package]] 53 | name = "charset-normalizer" 54 | version = "2.0.12" 55 | description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet." 56 | category = "main" 57 | optional = false 58 | python-versions = ">=3.5.0" 59 | 60 | [package.extras] 61 | unicode_backport = ["unicodedata2"] 62 | 63 | [[package]] 64 | name = "click" 65 | version = "8.1.3" 66 | description = "Composable command line interface toolkit" 67 | category = "dev" 68 | optional = false 69 | python-versions = ">=3.7" 70 | 71 | [package.dependencies] 72 | colorama = {version = "*", markers = "platform_system == \"Windows\""} 73 | 74 | [[package]] 75 | name = "colorama" 76 | version = "0.4.4" 77 | description = "Cross-platform colored terminal text." 
78 | category = "main" 79 | optional = false 80 | python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*" 81 | 82 | [[package]] 83 | name = "flake8" 84 | version = "4.0.1" 85 | description = "the modular source code checker: pep8 pyflakes and co" 86 | category = "dev" 87 | optional = false 88 | python-versions = ">=3.6" 89 | 90 | [package.dependencies] 91 | mccabe = ">=0.6.0,<0.7.0" 92 | pycodestyle = ">=2.8.0,<2.9.0" 93 | pyflakes = ">=2.4.0,<2.5.0" 94 | 95 | [[package]] 96 | name = "idna" 97 | version = "3.3" 98 | description = "Internationalized Domain Names in Applications (IDNA)" 99 | category = "main" 100 | optional = false 101 | python-versions = ">=3.5" 102 | 103 | [[package]] 104 | name = "iniconfig" 105 | version = "1.1.1" 106 | description = "iniconfig: brain-dead simple config-ini parsing" 107 | category = "main" 108 | optional = false 109 | python-versions = "*" 110 | 111 | [[package]] 112 | name = "mccabe" 113 | version = "0.6.1" 114 | description = "McCabe checker, plugin for flake8" 115 | category = "dev" 116 | optional = false 117 | python-versions = "*" 118 | 119 | [[package]] 120 | name = "mypy-extensions" 121 | version = "0.4.3" 122 | description = "Experimental type system extensions for programs checked with the mypy typechecker." 123 | category = "dev" 124 | optional = false 125 | python-versions = "*" 126 | 127 | [[package]] 128 | name = "packaging" 129 | version = "21.3" 130 | description = "Core utilities for Python packages" 131 | category = "main" 132 | optional = false 133 | python-versions = ">=3.6" 134 | 135 | [package.dependencies] 136 | pyparsing = ">=2.0.2,<3.0.5 || >3.0.5" 137 | 138 | [[package]] 139 | name = "pathspec" 140 | version = "0.9.0" 141 | description = "Utility library for gitignore style pattern matching of file paths." 
142 | category = "dev" 143 | optional = false 144 | python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,>=2.7" 145 | 146 | [[package]] 147 | name = "platformdirs" 148 | version = "2.5.2" 149 | description = "A small Python module for determining appropriate platform-specific dirs, e.g. a \"user data dir\"." 150 | category = "dev" 151 | optional = false 152 | python-versions = ">=3.7" 153 | 154 | [package.extras] 155 | docs = ["furo (>=2021.7.5b38)", "proselint (>=0.10.2)", "sphinx-autodoc-typehints (>=1.12)", "sphinx (>=4)"] 156 | test = ["appdirs (==1.4.4)", "pytest-cov (>=2.7)", "pytest-mock (>=3.6)", "pytest (>=6)"] 157 | 158 | [[package]] 159 | name = "pluggy" 160 | version = "1.0.0" 161 | description = "plugin and hook calling mechanisms for python" 162 | category = "main" 163 | optional = false 164 | python-versions = ">=3.6" 165 | 166 | [package.extras] 167 | dev = ["pre-commit", "tox"] 168 | testing = ["pytest", "pytest-benchmark"] 169 | 170 | [[package]] 171 | name = "prometheus-client" 172 | version = "0.13.1" 173 | description = "Python client for the Prometheus monitoring system." 
174 | category = "main" 175 | optional = false 176 | python-versions = ">=3.6" 177 | 178 | [package.extras] 179 | twisted = ["twisted"] 180 | 181 | [[package]] 182 | name = "py" 183 | version = "1.11.0" 184 | description = "library with cross-python path, ini-parsing, io, code, log facilities" 185 | category = "main" 186 | optional = false 187 | python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*" 188 | 189 | [[package]] 190 | name = "pycodestyle" 191 | version = "2.8.0" 192 | description = "Python style guide checker" 193 | category = "dev" 194 | optional = false 195 | python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*" 196 | 197 | [[package]] 198 | name = "pyflakes" 199 | version = "2.4.0" 200 | description = "passive checker of Python programs" 201 | category = "dev" 202 | optional = false 203 | python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*" 204 | 205 | [[package]] 206 | name = "pyparsing" 207 | version = "3.0.8" 208 | description = "pyparsing module - Classes and methods to define and execute parsing grammars" 209 | category = "main" 210 | optional = false 211 | python-versions = ">=3.6.8" 212 | 213 | [package.extras] 214 | diagrams = ["railroad-diagrams", "jinja2"] 215 | 216 | [[package]] 217 | name = "pytest" 218 | version = "7.1.2" 219 | description = "pytest: simple powerful testing with Python" 220 | category = "main" 221 | optional = false 222 | python-versions = ">=3.7" 223 | 224 | [package.dependencies] 225 | atomicwrites = {version = ">=1.0", markers = "sys_platform == \"win32\""} 226 | attrs = ">=19.2.0" 227 | colorama = {version = "*", markers = "sys_platform == \"win32\""} 228 | iniconfig = "*" 229 | packaging = "*" 230 | pluggy = ">=0.12,<2.0" 231 | py = ">=1.8.2" 232 | tomli = ">=1.0.0" 233 | 234 | [package.extras] 235 | testing = ["argcomplete", "hypothesis (>=3.56)", "mock", "nose", "pygments (>=2.7.2)", "requests", "xmlschema"] 236 | 237 | [[package]] 238 | name = "pytest-httpserver" 239 | 
version = "1.0.4" 240 | description = "pytest-httpserver is a httpserver for pytest" 241 | category = "main" 242 | optional = false 243 | python-versions = ">=3.6,<4.0" 244 | 245 | [package.dependencies] 246 | Werkzeug = ">=2.0.0" 247 | 248 | [package.extras] 249 | dev = ["autopep8", "coverage", "flake8 (>4.0.0)", "ipdb", "mypy", "pytest-cov", "pytest", "reno", "requests", "sphinx", "sphinx-rtd-theme", "types-requests"] 250 | test = ["coverage", "flake8 (>4.0.0)", "mypy", "pytest-cov", "pytest", "requests", "types-requests"] 251 | doc = ["reno", "sphinx", "sphinx-rtd-theme"] 252 | 253 | [[package]] 254 | name = "requests" 255 | version = "2.27.1" 256 | description = "Python HTTP for Humans." 257 | category = "main" 258 | optional = false 259 | python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*" 260 | 261 | [package.dependencies] 262 | certifi = ">=2017.4.17" 263 | charset-normalizer = {version = ">=2.0.0,<2.1.0", markers = "python_version >= \"3\""} 264 | idna = {version = ">=2.5,<4", markers = "python_version >= \"3\""} 265 | urllib3 = ">=1.21.1,<1.27" 266 | 267 | [package.extras] 268 | socks = ["PySocks (>=1.5.6,!=1.5.7)", "win-inet-pton"] 269 | use_chardet_on_py3 = ["chardet (>=3.0.2,<5)"] 270 | 271 | [[package]] 272 | name = "tomli" 273 | version = "2.0.1" 274 | description = "A lil' TOML parser" 275 | category = "main" 276 | optional = false 277 | python-versions = ">=3.7" 278 | 279 | [[package]] 280 | name = "urllib3" 281 | version = "1.26.9" 282 | description = "HTTP library with thread-safe connection pooling, file post, and more." 
283 | category = "main" 284 | optional = false 285 | python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, <4" 286 | 287 | [package.extras] 288 | brotli = ["brotlicffi (>=0.8.0)", "brotli (>=1.0.9)", "brotlipy (>=0.6.0)"] 289 | secure = ["pyOpenSSL (>=0.14)", "cryptography (>=1.3.4)", "idna (>=2.0.0)", "certifi", "ipaddress"] 290 | socks = ["PySocks (>=1.5.6,!=1.5.7,<2.0)"] 291 | 292 | [[package]] 293 | name = "werkzeug" 294 | version = "2.1.2" 295 | description = "The comprehensive WSGI web application library." 296 | category = "main" 297 | optional = false 298 | python-versions = ">=3.7" 299 | 300 | [package.extras] 301 | watchdog = ["watchdog"] 302 | 303 | [metadata] 304 | lock-version = "1.1" 305 | python-versions = "^3.10" 306 | content-hash = "66f35e6c161fce724e56482dccafe610c23cedb2154389e71695dda3317f8d13" 307 | 308 | [metadata.files] 309 | atomicwrites = [ 310 | {file = "atomicwrites-1.4.0-py2.py3-none-any.whl", hash = "sha256:6d1784dea7c0c8d4a5172b6c620f40b6e4cbfdf96d783691f2e1302a7b88e197"}, 311 | {file = "atomicwrites-1.4.0.tar.gz", hash = "sha256:ae70396ad1a434f9c7046fd2dd196fc04b12f9e91ffb859164193be8b6168a7a"}, 312 | ] 313 | attrs = [ 314 | {file = "attrs-21.4.0-py2.py3-none-any.whl", hash = "sha256:2d27e3784d7a565d36ab851fe94887c5eccd6a463168875832a1be79c82828b4"}, 315 | {file = "attrs-21.4.0.tar.gz", hash = "sha256:626ba8234211db98e869df76230a137c4c40a12d72445c45d5f5b716f076e2fd"}, 316 | ] 317 | black = [ 318 | {file = "black-22.3.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:2497f9c2386572e28921fa8bec7be3e51de6801f7459dffd6e62492531c47e09"}, 319 | {file = "black-22.3.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:5795a0375eb87bfe902e80e0c8cfaedf8af4d49694d69161e5bd3206c18618bb"}, 320 | {file = "black-22.3.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:e3556168e2e5c49629f7b0f377070240bd5511e45e25a4497bb0073d9dda776a"}, 321 | {file = "black-22.3.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", 
hash = "sha256:67c8301ec94e3bcc8906740fe071391bce40a862b7be0b86fb5382beefecd968"}, 322 | {file = "black-22.3.0-cp310-cp310-win_amd64.whl", hash = "sha256:fd57160949179ec517d32ac2ac898b5f20d68ed1a9c977346efbac9c2f1e779d"}, 323 | {file = "black-22.3.0-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:cc1e1de68c8e5444e8f94c3670bb48a2beef0e91dddfd4fcc29595ebd90bb9ce"}, 324 | {file = "black-22.3.0-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6d2fc92002d44746d3e7db7cf9313cf4452f43e9ea77a2c939defce3b10b5c82"}, 325 | {file = "black-22.3.0-cp36-cp36m-win_amd64.whl", hash = "sha256:a6342964b43a99dbc72f72812bf88cad8f0217ae9acb47c0d4f141a6416d2d7b"}, 326 | {file = "black-22.3.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:328efc0cc70ccb23429d6be184a15ce613f676bdfc85e5fe8ea2a9354b4e9015"}, 327 | {file = "black-22.3.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:06f9d8846f2340dfac80ceb20200ea5d1b3f181dd0556b47af4e8e0b24fa0a6b"}, 328 | {file = "black-22.3.0-cp37-cp37m-win_amd64.whl", hash = "sha256:ad4efa5fad66b903b4a5f96d91461d90b9507a812b3c5de657d544215bb7877a"}, 329 | {file = "black-22.3.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:e8477ec6bbfe0312c128e74644ac8a02ca06bcdb8982d4ee06f209be28cdf163"}, 330 | {file = "black-22.3.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:637a4014c63fbf42a692d22b55d8ad6968a946b4a6ebc385c5505d9625b6a464"}, 331 | {file = "black-22.3.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:863714200ada56cbc366dc9ae5291ceb936573155f8bf8e9de92aef51f3ad0f0"}, 332 | {file = "black-22.3.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:10dbe6e6d2988049b4655b2b739f98785a884d4d6b85bc35133a8fb9a2233176"}, 333 | {file = "black-22.3.0-cp38-cp38-win_amd64.whl", hash = "sha256:cee3e11161dde1b2a33a904b850b0899e0424cc331b7295f2a9698e79f9a69a0"}, 334 | {file = "black-22.3.0-cp39-cp39-macosx_10_9_universal2.whl", hash = 
"sha256:5891ef8abc06576985de8fa88e95ab70641de6c1fca97e2a15820a9b69e51b20"}, 335 | {file = "black-22.3.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:30d78ba6bf080eeaf0b7b875d924b15cd46fec5fd044ddfbad38c8ea9171043a"}, 336 | {file = "black-22.3.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:ee8f1f7228cce7dffc2b464f07ce769f478968bfb3dd1254a4c2eeed84928aad"}, 337 | {file = "black-22.3.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6ee227b696ca60dd1c507be80a6bc849a5a6ab57ac7352aad1ffec9e8b805f21"}, 338 | {file = "black-22.3.0-cp39-cp39-win_amd64.whl", hash = "sha256:9b542ced1ec0ceeff5b37d69838106a6348e60db7b8fdd245294dc1d26136265"}, 339 | {file = "black-22.3.0-py3-none-any.whl", hash = "sha256:bc58025940a896d7e5356952228b68f793cf5fcb342be703c3a2669a1488cb72"}, 340 | {file = "black-22.3.0.tar.gz", hash = "sha256:35020b8886c022ced9282b51b5a875b6d1ab0c387b31a065b84db7c33085ca79"}, 341 | ] 342 | certifi = [ 343 | {file = "certifi-2021.10.8-py2.py3-none-any.whl", hash = "sha256:d62a0163eb4c2344ac042ab2bdf75399a71a2d8c7d47eac2e2ee91b9d6339569"}, 344 | {file = "certifi-2021.10.8.tar.gz", hash = "sha256:78884e7c1d4b00ce3cea67b44566851c4343c120abd683433ce934a68ea58872"}, 345 | ] 346 | charset-normalizer = [ 347 | {file = "charset-normalizer-2.0.12.tar.gz", hash = "sha256:2857e29ff0d34db842cd7ca3230549d1a697f96ee6d3fb071cfa6c7393832597"}, 348 | {file = "charset_normalizer-2.0.12-py3-none-any.whl", hash = "sha256:6881edbebdb17b39b4eaaa821b438bf6eddffb4468cf344f09f89def34a8b1df"}, 349 | ] 350 | click = [ 351 | {file = "click-8.1.3-py3-none-any.whl", hash = "sha256:bb4d8133cb15a609f44e8213d9b391b0809795062913b383c62be0ee95b1db48"}, 352 | {file = "click-8.1.3.tar.gz", hash = "sha256:7682dc8afb30297001674575ea00d1814d808d6a36af415a82bd481d37ba7b8e"}, 353 | ] 354 | colorama = [ 355 | {file = "colorama-0.4.4-py2.py3-none-any.whl", hash = "sha256:9f47eda37229f68eee03b24b9748937c7dc3868f906e8ba69fbcbdd3bc5dc3e2"}, 356 | {file = 
"colorama-0.4.4.tar.gz", hash = "sha256:5941b2b48a20143d2267e95b1c2a7603ce057ee39fd88e7329b0c292aa16869b"}, 357 | ] 358 | flake8 = [ 359 | {file = "flake8-4.0.1-py2.py3-none-any.whl", hash = "sha256:479b1304f72536a55948cb40a32dce8bb0ffe3501e26eaf292c7e60eb5e0428d"}, 360 | {file = "flake8-4.0.1.tar.gz", hash = "sha256:806e034dda44114815e23c16ef92f95c91e4c71100ff52813adf7132a6ad870d"}, 361 | ] 362 | idna = [ 363 | {file = "idna-3.3-py3-none-any.whl", hash = "sha256:84d9dd047ffa80596e0f246e2eab0b391788b0503584e8945f2368256d2735ff"}, 364 | {file = "idna-3.3.tar.gz", hash = "sha256:9d643ff0a55b762d5cdb124b8eaa99c66322e2157b69160bc32796e824360e6d"}, 365 | ] 366 | iniconfig = [ 367 | {file = "iniconfig-1.1.1-py2.py3-none-any.whl", hash = "sha256:011e24c64b7f47f6ebd835bb12a743f2fbe9a26d4cecaa7f53bc4f35ee9da8b3"}, 368 | {file = "iniconfig-1.1.1.tar.gz", hash = "sha256:bc3af051d7d14b2ee5ef9969666def0cd1a000e121eaea580d4a313df4b37f32"}, 369 | ] 370 | mccabe = [ 371 | {file = "mccabe-0.6.1-py2.py3-none-any.whl", hash = "sha256:ab8a6258860da4b6677da4bd2fe5dc2c659cff31b3ee4f7f5d64e79735b80d42"}, 372 | {file = "mccabe-0.6.1.tar.gz", hash = "sha256:dd8d182285a0fe56bace7f45b5e7d1a6ebcbf524e8f3bd87eb0f125271b8831f"}, 373 | ] 374 | mypy-extensions = [ 375 | {file = "mypy_extensions-0.4.3-py2.py3-none-any.whl", hash = "sha256:090fedd75945a69ae91ce1303b5824f428daf5a028d2f6ab8a299250a846f15d"}, 376 | {file = "mypy_extensions-0.4.3.tar.gz", hash = "sha256:2d82818f5bb3e369420cb3c4060a7970edba416647068eb4c5343488a6c604a8"}, 377 | ] 378 | packaging = [ 379 | {file = "packaging-21.3-py3-none-any.whl", hash = "sha256:ef103e05f519cdc783ae24ea4e2e0f508a9c99b2d4969652eed6a2e1ea5bd522"}, 380 | {file = "packaging-21.3.tar.gz", hash = "sha256:dd47c42927d89ab911e606518907cc2d3a1f38bbd026385970643f9c5b8ecfeb"}, 381 | ] 382 | pathspec = [ 383 | {file = "pathspec-0.9.0-py2.py3-none-any.whl", hash = "sha256:7d15c4ddb0b5c802d161efc417ec1a2558ea2653c2e8ad9c19098201dc1c993a"}, 384 | {file = 
"pathspec-0.9.0.tar.gz", hash = "sha256:e564499435a2673d586f6b2130bb5b95f04a3ba06f81b8f895b651a3c76aabb1"}, 385 | ] 386 | platformdirs = [ 387 | {file = "platformdirs-2.5.2-py3-none-any.whl", hash = "sha256:027d8e83a2d7de06bbac4e5ef7e023c02b863d7ea5d079477e722bb41ab25788"}, 388 | {file = "platformdirs-2.5.2.tar.gz", hash = "sha256:58c8abb07dcb441e6ee4b11d8df0ac856038f944ab98b7be6b27b2a3c7feef19"}, 389 | ] 390 | pluggy = [ 391 | {file = "pluggy-1.0.0-py2.py3-none-any.whl", hash = "sha256:74134bbf457f031a36d68416e1509f34bd5ccc019f0bcc952c7b909d06b37bd3"}, 392 | {file = "pluggy-1.0.0.tar.gz", hash = "sha256:4224373bacce55f955a878bf9cfa763c1e360858e330072059e10bad68531159"}, 393 | ] 394 | prometheus-client = [ 395 | {file = "prometheus_client-0.13.1-py3-none-any.whl", hash = "sha256:357a447fd2359b0a1d2e9b311a0c5778c330cfbe186d880ad5a6b39884652316"}, 396 | {file = "prometheus_client-0.13.1.tar.gz", hash = "sha256:ada41b891b79fca5638bd5cfe149efa86512eaa55987893becd2c6d8d0a5dfc5"}, 397 | ] 398 | py = [ 399 | {file = "py-1.11.0-py2.py3-none-any.whl", hash = "sha256:607c53218732647dff4acdfcd50cb62615cedf612e72d1724fb1a0cc6405b378"}, 400 | {file = "py-1.11.0.tar.gz", hash = "sha256:51c75c4126074b472f746a24399ad32f6053d1b34b68d2fa41e558e6f4a98719"}, 401 | ] 402 | pycodestyle = [ 403 | {file = "pycodestyle-2.8.0-py2.py3-none-any.whl", hash = "sha256:720f8b39dde8b293825e7ff02c475f3077124006db4f440dcbc9a20b76548a20"}, 404 | {file = "pycodestyle-2.8.0.tar.gz", hash = "sha256:eddd5847ef438ea1c7870ca7eb78a9d47ce0cdb4851a5523949f2601d0cbbe7f"}, 405 | ] 406 | pyflakes = [ 407 | {file = "pyflakes-2.4.0-py2.py3-none-any.whl", hash = "sha256:3bb3a3f256f4b7968c9c788781e4ff07dce46bdf12339dcda61053375426ee2e"}, 408 | {file = "pyflakes-2.4.0.tar.gz", hash = "sha256:05a85c2872edf37a4ed30b0cce2f6093e1d0581f8c19d7393122da7e25b2b24c"}, 409 | ] 410 | pyparsing = [ 411 | {file = "pyparsing-3.0.8-py3-none-any.whl", hash = "sha256:ef7b523f6356f763771559412c0d7134753f037822dad1b16945b7b846f7ad06"}, 
412 | {file = "pyparsing-3.0.8.tar.gz", hash = "sha256:7bf433498c016c4314268d95df76c81b842a4cb2b276fa3312cfb1e1d85f6954"}, 413 | ] 414 | pytest = [ 415 | {file = "pytest-7.1.2-py3-none-any.whl", hash = "sha256:13d0e3ccfc2b6e26be000cb6568c832ba67ba32e719443bfe725814d3c42433c"}, 416 | {file = "pytest-7.1.2.tar.gz", hash = "sha256:a06a0425453864a270bc45e71f783330a7428defb4230fb5e6a731fde06ecd45"}, 417 | ] 418 | pytest-httpserver = [ 419 | {file = "pytest_httpserver-1.0.4-py3-none-any.whl", hash = "sha256:17396350d7c0dec067bc8c5ceca7e1137aa0e3395468a9c179bdfc5ede9027a5"}, 420 | {file = "pytest_httpserver-1.0.4.tar.gz", hash = "sha256:6de464ba5f74628d6182ebbdcb56783edf2c9b0caf598dc35c11f014f24a3f0d"}, 421 | ] 422 | requests = [ 423 | {file = "requests-2.27.1-py2.py3-none-any.whl", hash = "sha256:f22fa1e554c9ddfd16e6e41ac79759e17be9e492b3587efa038054674760e72d"}, 424 | {file = "requests-2.27.1.tar.gz", hash = "sha256:68d7c56fd5a8999887728ef304a6d12edc7be74f1cfa47714fc8b414525c9a61"}, 425 | ] 426 | tomli = [ 427 | {file = "tomli-2.0.1-py3-none-any.whl", hash = "sha256:939de3e7a6161af0c887ef91b7d41a53e7c5a1ca976325f429cb46ea9bc30ecc"}, 428 | {file = "tomli-2.0.1.tar.gz", hash = "sha256:de526c12914f0c550d15924c62d72abc48d6fe7364aa87328337a31007fe8a4f"}, 429 | ] 430 | urllib3 = [ 431 | {file = "urllib3-1.26.9-py2.py3-none-any.whl", hash = "sha256:44ece4d53fb1706f667c9bd1c648f5469a2ec925fcf3a776667042d645472c14"}, 432 | {file = "urllib3-1.26.9.tar.gz", hash = "sha256:aabaf16477806a5e1dd19aa41f8c2b7950dd3c746362d7e3223dbe6de6ac448e"}, 433 | ] 434 | werkzeug = [ 435 | {file = "Werkzeug-2.1.2-py3-none-any.whl", hash = "sha256:72a4b735692dd3135217911cbeaa1be5fa3f62bffb8745c5215420a03dc55255"}, 436 | {file = "Werkzeug-2.1.2.tar.gz", hash = "sha256:1ce08e8093ed67d638d63879fd1ba3735817f7a80de3674d293f5984f25fb6e6"}, 437 | ] 438 | -------------------------------------------------------------------------------- /prometheus.yaml: 
-------------------------------------------------------------------------------- 1 | global: 2 | scrape_interval: 10s # By default, scrape targets every 10 seconds. 3 | 4 | scrape_configs: 5 | # The job name is added as a label `job=` to any timeseries scraped from this config. 6 | - job_name: 'prometheus' 7 | static_configs: 8 | - targets: ['localhost:9090'] 9 | 10 | - job_name: 'app' 11 | static_configs: 12 | - targets: ['app:8001'] -------------------------------------------------------------------------------- /pyproject.toml: -------------------------------------------------------------------------------- 1 | [tool.poetry] 2 | name = "python-prometheus-workshop" 3 | version = "0.1.0" 4 | description = "" 5 | authors = ["Your Name "] 6 | 7 | [tool.poetry.dependencies] 8 | python = "^3.10" 9 | prometheus-client = "^0.13.1" 10 | pytest = "^7.0.1" 11 | pytest-httpserver = "^1.0.4" 12 | requests = "^2.27.1" 13 | 14 | [tool.poetry.dev-dependencies] 15 | black = "^22.3.0" 16 | pycodestyle = "^2.8.0" 17 | flake8 = "^4.0.1" 18 | 19 | [build-system] 20 | requires = ["poetry-core>=1.0.0"] 21 | build-backend = "poetry.core.masonry.api" 22 | -------------------------------------------------------------------------------- /slides.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ecosia/pycon22-prometheus-workshop/df49d0ab8f8f0e4af5583f610465417470df7ae6/slides.pdf --------------------------------------------------------------------------------
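The `app:8001` scrape target in prometheus.yaml above, together with the `prometheus-client` dependency in pyproject.toml and the `EXPOSE 8001` in the Dockerfile, implies the app serves Prometheus metrics on port 8001. A minimal sketch of that app side, using the `prometheus_client` API pinned in the lockfile (0.13.1); the metric name `app_requests_total` is illustrative, not the workshop app's actual metric:

```python
from prometheus_client import CollectorRegistry, Counter, generate_latest

# A dedicated registry keeps this sketch isolated from the global default
# registry; a real app would usually just use the module-level default.
registry = CollectorRegistry()

# Hypothetical metric for illustration; the workshop's real metrics live
# in app/main.py.
requests_total = Counter(
    "app_requests_total",
    "Total requests handled by the app",
    registry=registry,
)

requests_total.inc()

# generate_latest() renders the registry in the text exposition format
# that Prometheus scrapes from the /metrics endpoint.
print(generate_latest(registry).decode())
```

In the running container the equivalent wiring would call `prometheus_client.start_http_server(8001)` (or mount the metrics app into the Werkzeug server) so that the `app` job's 10-second scrapes configured above have something to hit.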