├── .github
│   └── workflows
│       └── main.yml
├── .gitignore
├── LICENSE
├── README.md
├── docs
│   └── imgs
│       ├── jsonoutput1.png
│       ├── xlsxbycode.png
│       └── xlsxbysite.png
├── poetry.lock
├── pyproject.toml
├── requirements.txt
├── tests
│   ├── __init__.py
│   ├── test_async_utils.py
│   ├── test_codes.py
│   ├── test_main.py
│   ├── test_output.py
│   └── test_utils.py
└── wayback_google_analytics
    ├── __init__.py
    ├── async_utils.py
    ├── codes.py
    ├── main.py
    ├── output.py
    ├── scraper.py
    └── utils.py
/.github/workflows/main.yml:
--------------------------------------------------------------------------------
1 | name: Run tests and publish
2 |
3 | on:
4 |   pull_request:
5 |   push:
6 |     branches:
7 |       - main
8 |     tags:
9 |       - 'v*.*.*'
10 |
11 | jobs:
12 |   checks:
13 |     runs-on: ubuntu-latest
14 |     steps:
15 |       - name: Checkout repository
16 |         uses: actions/checkout@v4
17 |
18 |       - name: Set up Python 3.9
19 |         uses: actions/setup-python@v4
20 |         with:
21 |           python-version: 3.9
22 |
23 |       - name: Install poetry dependencies
24 |         run: |
25 |           pip install poetry
26 |           poetry config virtualenvs.create false
27 |           poetry install
28 |
29 |       - name: Run tests
30 |         run: |
31 |           poetry run python -m unittest discover
32 |
33 |   release:
34 |     name: Release
35 |     runs-on: ubuntu-latest
36 |     needs: [checks]
37 |     if: startsWith(github.ref, 'refs/tags/')
38 |     steps:
39 |       - uses: actions/checkout@v4
40 |       - name: Build and publish to pypi
41 |         uses: JRubics/poetry-publish@v1.17
42 |         with:
43 |           pypi_token: ${{ secrets.PYPI_TOKEN }}
44 |           python_version: 3.9
--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
1 | venv
2 | .env
3 | output
4 | dist
5 | **/__pycache__
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) 2023 Stichting Bellingcat
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 | [![Contributors][contributors-shield]][contributors-url]
5 | [![Forks][forks-shield]][forks-url]
6 | [![Stargazers][stars-shield]][stars-url]
7 | [![Issues][issues-shield]][issues-url]
8 | [![MIT License][license-shield]][license-url]
9 | [![LinkedIn][linkedin-shield]][linkedin-url]
10 |
11 |
12 |
13 |
14 |

15 |
16 | # Wayback Google Analytics
17 |
18 |
19 | A lightweight tool to gather current and historic Google analytics codes for OSINT investigations.
20 |
21 |
23 | [Report Bug](https://github.com/bellingcat/wayback-google-analytics/issues) · [Request Feature](https://github.com/bellingcat/wayback-google-analytics/issues)
26 |
27 |
28 |
29 |
30 |
31 |
32 |
33 | Table of Contents
34 |
35 | - [About The Project](#about-the-project)
44 | - [Installation](#installation)
51 | - [Usage](#usage)
59 | - [Contributing](#contributing)
65 | - [Development](#development)
72 | - [License](#license)
73 | - [Contact](#contact)
74 | - [Acknowledgments](#acknowledgments)
75 |
76 |
77 |
78 |
79 |
80 |
81 | ## About The Project
82 |
83 | Wayback Google Analytics is a lightweight tool that gathers current and historic
84 | Google analytics data (UA, GA and GTM codes) from a collection of website urls.
85 |
86 | Read Bellingcat's article about using this tool to uncover disinformation networks online [here](https://www.bellingcat.com/resources/2024/01/09/using-the-wayback-machine-and-google-analytics-to-uncover-disinformation-networks/).
87 |
88 | ### Why do I need GA codes?
89 |
90 | Google Analytics codes are a useful data point when examining relationships between websites. If two seemingly disparate websites share the same UA, GA or GTM code then there is a good chance that they are managed by the same individual or group. This useful breadcrumb has been used by researchers and journalists in OSINT investigations regularly over the last decade, but a recent change in how Google handles its analytics codes threatens to limit its effectiveness. Google began phasing out UA codes as part of its Google Analytics 4 upgrade in July 2023, making it significantly more challenging to use this breadcrumb during investigations.
91 |
92 | ### How does this tool help me?
93 |
94 | Luckily, the Internet Archive's [Wayback Machine](https://archive.org/web/) contains useful snapshots of websites, preserving their historic GA IDs. While you could feasibly check each snapshot manually, this tool automates that process with the Wayback Machine's CDX API to make it simpler and faster. Enter a list of urls and a time frame (along with extra, optional parameters) to collect current and historic GA, UA and GTM codes and return them in a format you choose (json, txt, xlsx or csv).
95 |
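Under the hood this boils down to CDX queries against web.archive.org. If you want to see the kind of data the tool works from, here is a minimal, synchronous sketch of such a query using only the standard library. The real project batches these requests asynchronously with aiohttp, and the helper name and defaults below are illustrative rather than the tool's actual API:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

CDX_ENDPOINT = "https://web.archive.org/cdx/search/cdx"

def list_snapshots(url, start="20121001", end="20121025", limit=-100):
    """Return (timestamp, archived_url) pairs for snapshots of `url`.

    `limit=-100` asks the CDX API for the 100 most recent captures,
    mirroring the tool's default --limit behaviour.
    """
    params = urlencode({
        "url": url,
        "from": start,     # CDX timestamps are yyyyMMddhhmmss prefixes
        "to": end,
        "output": "json",
        "limit": limit,
    })
    with urlopen(f"{CDX_ENDPOINT}?{params}") as resp:
        body = resp.read().decode()
    rows = json.loads(body) if body.strip() else []
    if not rows:
        return []
    header, entries = rows[0], rows[1:]
    ts, original = header.index("timestamp"), header.index("original")
    return [
        (row[ts], f"https://web.archive.org/web/{row[ts]}/{row[original]}")
        for row in entries
    ]

if __name__ == "__main__":
    for timestamp, snapshot_url in list_snapshots("someurl.com")[:5]:
        print(timestamp, snapshot_url)
```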
96 | The raw json output for each provided url looks something like this:
97 |
98 | ```json
99 | "someurl.com": {
100 | "current_UA_code": "UA-12345678-1",
101 | "current_GA_code": "G-1234567890",
102 | "current_GTM_code": "GTM-12345678",
103 | "archived_UA_codes": {
104 | "UA-12345678-1": {
105 | "first_seen": "01/01/2019(12:30)",
106 | "last_seen": "03/10/2020(00:00)",
107 | },
108 | },
109 | "archived_GA_codes": {
110 | "G-1234567890": {
111 | "first_seen": "01/01/2019(12:30)",
112 | "last_seen": "01/01/2019(12:30)",
113 | }
114 | },
115 | "archived_GTM_codes": {
116 | "GTM-12345678": {
117 | "first_seen": "01/01/2019(12:30)",
118 | "last_seen": "01/01/2019(12:30)",
119 | },
120 | },
121 | }
122 | ```
123 |
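Because shared codes are the whole point of this output, it is easy to post-process a saved results file and group urls by code. The sketch below simply inverts the structure shown above; the `output/results.json` path is hypothetical, so point it at whatever file the tool wrote for you:

```python
import json
from collections import defaultdict

def group_urls_by_code(results):
    """Map every UA/GA/GTM code to the set of urls it appeared on."""
    urls_by_code = defaultdict(set)
    for url, data in results.items():
        for key in ("current_UA_code", "current_GA_code", "current_GTM_code"):
            if data.get(key):
                urls_by_code[data[key]].add(url)
        for key in ("archived_UA_codes", "archived_GA_codes", "archived_GTM_codes"):
            for code in (data.get(key) or {}):
                urls_by_code[code].add(url)
    return urls_by_code

# Load a saved json results file and print codes shared by 2+ sites.
with open("output/results.json") as f:  # hypothetical path
    results = json.load(f)

for code, urls in group_urls_by_code(results).items():
    if len(urls) > 1:
        print(f"{code} is shared by: {', '.join(sorted(urls))}")
```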
124 | ### Further reading
125 |
126 | - For more info about analytics codes and what the GA-4 rollout means for OSINT: https://digitalinvestigations.substack.com/p/what-the-rollout-of-google-analytics
127 |
128 | - For an example investigation using analytics codes: https://www.bellingcat.com/resources/how-tos/2015/07/23/unveiling-hidden-connections-with-google-analytics-ids/
129 |
130 | (back to top)
131 |
132 |
133 | ### Built With
134 |
135 |
136 |
137 | ![Python][Python]
138 | ![Pandas][Pandas]
139 |
140 |
141 |
142 | Additional libraries/tools: [BeautifulSoup4](https://pypi.org/project/beautifulsoup4/), [Asyncio](https://docs.python.org/3/library/asyncio.html), [Aiohttp](https://docs.aiohttp.org/en/stable/)
143 |
144 | (back to top)
145 |
146 |
147 | ## Installation
148 |
149 | ### Install from [pypi](https://pypi.org/project/wayback-google-analytics/) (with pip)
150 |
151 | [![PyPI][pypi-shield]][pypi-url]
152 |
153 | The easiest way to install Wayback Google Analytics is from the command line with pip.
154 |
155 | 1. Open a terminal window and navigate to your chosen directory.
156 | 2. Create a virtual environment and activate it (optional, but recommended; if you use [Poetry](https://python-poetry.org/) or [pipenv](https://pipenv.pypa.io/en/latest/) those package managers do it for you)
157 | ```terminal
158 | python3 -m venv venv
159 | source venv/bin/activate
160 | ```
161 | 3. Install the project with pip
162 | ```terminal
163 | pip install wayback-google-analytics
164 | ```
165 | 4. Get a high-level overview
166 | ```terminal
167 | wayback-google-analytics -h
168 | ```
169 |
170 | ### Download from source
171 |
172 | You can also clone the repo from GitHub and use the tool locally.
173 |
174 | 1. Clone repo:
175 | ```terminal
176 | git clone git@github.com:bellingcat/wayback-google-analytics.git
177 | ```
178 |
179 | 2. Navigate to root, create a venv and install requirements.txt:
180 | ```terminal
181 | cd wayback-google-analytics
182 | python -m venv venv
183 | source venv/bin/activate
184 | pip install -r requirements.txt
185 | ```
186 |
187 | 3. Get a high-level overview:
188 | ```terminal
189 | python -m wayback_google_analytics.main -h
190 | ```
191 |
192 | (back to top)
193 |
194 |
195 |
196 |
197 | ## Usage
198 |
199 | ### Getting started
200 |
201 | 1. Enter a list of urls manually through the command line using `--urls` (`-u`) or from a given file using `--input_file` (`-i`).
202 |
203 | 2. Specify your output format (.csv, .txt, .xlsx or .json) using `--output` (`-o`).
204 |
205 | 3. Add any of the following options:
206 |
207 |
208 | Options list (run `wayback-google-analytics -h` to see in terminal):
209 |
210 | ```terminal
211 | options:
212 |   -h, --help            show this help message and exit
213 |   -i INPUT_FILE, --input_file INPUT_FILE
214 |                         Enter a file path to a list of urls in a readable file type
215 |                         (e.g. .txt, .csv, .md)
216 |   -u URLS [URLS ...], --urls URLS [URLS ...]
217 |                         Enter a list of urls separated by spaces to get their UA/GA
218 |                         codes (e.g. --urls https://www.google.com
219 |                         https://www.facebook.com)
220 |   -o {csv,txt,json,xlsx}, --output {csv,txt,json,xlsx}
221 |                         Enter an output type to write results to file. Defaults to
222 |                         json.
223 |   -s START_DATE, --start_date START_DATE
224 |                         Start date for time range (dd/mm/YYYY:HH:MM) Defaults to
225 |                         01/10/2012:00:00, when UA codes were adopted.
226 |   -e END_DATE, --end_date END_DATE
227 |                         End date for time range (dd/mm/YYYY:HH:MM). Defaults to None.
228 |   -f {yearly,monthly,daily,hourly}, --frequency {yearly,monthly,daily,hourly}
229 |                         Can limit snapshots to remove duplicates (1 per hr, day, month,
230 |                         etc). Defaults to None.
231 |   -l LIMIT, --limit LIMIT
232 |                         Limits number of snapshots returned. Defaults to -100 (most
233 |                         recent 100 snapshots).
234 |   -sc, --skip_current   Add this flag to skip current UA/GA codes when getting archived
235 |                         codes.
236 |
237 | ```
238 |
239 | Examples:
240 |
241 | To get current codes for two websites and archived codes between Oct 1, 2012 and Oct 25, 2012:
242 | `wayback-google-analytics --urls https://someurl.com https://otherurl.org --output json --start_date 01/10/2012 --end_date 25/10/2012 --frequency hourly`
243 |
244 | To get current codes for a list of websites (from a file) from January 1, 2012 to the present day, checking for snapshots monthly and returning the results as an Excel spreadsheet:
245 | `wayback-google-analytics --input_file path/to/file.txt --output xlsx --start_date 01/01/2012 --frequency monthly`
246 |
247 | To check a single website for its current codes plus codes from the last 2,000 archive.org snapshots:
248 | `wayback-google-analytics --urls https://someurl.com --limit -2000`
249 |
250 |
251 | ## Output files & spreadsheets
252 |
253 | Wayback Google Analytics allows you to export your findings to either `.csv` or `.xlsx` spreadsheets. When you save your findings as a spreadsheet, the tool generates two tables: one where each url is the primary index and another where each identified code is the primary index. With `.xlsx` this is a single workbook with two sheets, while the `.csv` option generates one file sorted by codes and another sorted by websites. All output files can be found in `/output`, which is created in the directory from which the code is executed.
254 |
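If you would rather build those two views yourself from the json output, a rough pandas sketch looks like this. It only approximates the tool's spreadsheets: the column names and the `results` layout follow the json example shown earlier, it covers archived codes only, and the generated sheets will be simpler than the real ones:

```python
import pandas as pd

def to_sheets(results, path="example.xlsx"):
    """Write one sheet indexed by website and one indexed by code.

    `results` is assumed to follow the json layout shown above.
    """
    rows = []
    for url, data in results.items():
        for key in ("archived_UA_codes", "archived_GA_codes", "archived_GTM_codes"):
            for code, seen in (data.get(key) or {}).items():
                rows.append({
                    "url": url,
                    "code": code,
                    "first_seen": seen.get("first_seen"),
                    "last_seen": seen.get("last_seen"),
                })

    df = pd.DataFrame(rows)
    with pd.ExcelWriter(path) as writer:  # pandas writes the .xlsx via openpyxl
        df.set_index("url").to_excel(writer, sheet_name="By website")
        df.set_index("code").to_excel(writer, sheet_name="By code")
```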
255 | #### Example spreadsheet
256 |
257 | Let's say we're looking into data from 4 websites from 2015 to the present and we want to save what we find in an Excel spreadsheet. Our start command looks something like this:
258 |
259 | ```terminal
260 | wayback-google-analytics -u https://yapatriot.ru https://zanogu.com https://whoswho.com.ua https://adamants.ru -s 01/01/2015 -f yearly -o xlsx
261 | ```
262 |
263 | The result is a single `.xlsx` file with two sheets.
264 |
265 | Ordered by website:
266 |
267 | ![Output xlsx ordered by website](docs/imgs/xlsxbysite.png)
268 |
269 |
270 | Ordered by code:
271 |
272 | ![Output xlsx ordered by code](docs/imgs/xlsxbycode.png)
273 |
274 |
275 | (back to top)
276 |
277 |
278 | ## Limitations & Rate Limits
279 |
280 | We recommend that you limit your list of urls to ~10 and your max snapshot limit to <500 per query. While Wayback Google Analytics doesn't hardcode any limits on how many urls or snapshots you can request, large queries can cause 443 errors (rate limiting). Being rate limited can result in a temporary 5-10 minute ban from web.archive.org and the CDX API.
281 |
282 | The app currently uses `asyncio.Semaphore()` along with delays between requests, but large queries or long-running operations can still result in a 443 error. Use your judgment and break large queries into smaller, more manageable pieces if you find yourself getting rate limited.
283 |
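That throttling pattern is roughly the one sketched below: a shared semaphore caps concurrency and a short sleep spaces requests out. This is a generic illustration of the approach rather than the exact code in `async_utils.py`, and the concurrency cap and delay values here are made up:

```python
import asyncio
import aiohttp

async def fetch_all(urls, max_concurrency=10, delay=0.5):
    """Fetch each url while capping concurrency and pacing requests."""
    semaphore = asyncio.Semaphore(max_concurrency)

    async def polite_get(session, url):
        async with semaphore:                 # at most `max_concurrency` in flight
            await asyncio.sleep(delay)        # brief pause to stay under rate limits
            async with session.get(url) as resp:
                return await resp.text()

    async with aiohttp.ClientSession() as session:
        return await asyncio.gather(*(polite_get(session, u) for u in urls))

# Example:
# asyncio.run(fetch_all(["https://web.archive.org/web/20190101000000/http://someurl.com"]))
```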
284 |
285 | (back to top)
286 |
287 |
288 | ## Contributing
289 |
290 | ### Bugs and feature requests
291 |
292 | Please feel free to [open an issue](https://github.com/bellingcat/wayback-google-analytics/issues) should you encounter any bugs or have suggestions for new features or improvements. You can also [reach out to me](#contact) directly with suggestions or thoughts.
293 |
294 | (back to top)
295 |
296 |
297 | ## Development
298 |
299 | ### Testing
300 |
301 | * Run tests with `python -m unittest discover`
302 | * Check coverage with `coverage run -m unittest`
303 |
304 | ### Using Poetry for Development
305 |
306 | Wayback Google Analytics uses [Poetry](https://python-poetry.org/), a Python dependency management and packaging tool. A GitHub workflow runs the tests on pull requests and on pushes to main ([see our workflow here](https://github.com/bellingcat/wayback-google-analytics/actions)). Be sure to update the [semantic](https://semver.org/) version number in `pyproject.toml` when opening a PR.
307 |
308 | If you have push access, follow these steps to trigger the GitHub workflow that builds and releases a new version to PyPI:
309 |
310 | 1. Change the version number in [pyproject.toml](pyproject.toml)
311 | 2. Create a new tag for that version `git tag "vX.0.0"`
312 | 3. Push the tag `git push --tags`
313 |
314 |
315 | ## License
316 |
317 | Distributed under the MIT License. See `LICENSE` for more information.
318 |
319 | (back to top)
320 |
321 |
322 | ## Contact
323 |
324 | You can contact me through email or social media.
325 |
326 | * email: jclarksummit at gmail dot com
327 | * Twitter/X: [@JustinClarkJO](https://twitter.com/JustinClarkJO)
328 | * Linkedin: [Justin Clark](https://linkedin.com/in/justin-w-clark)
329 |
330 | Project Link: [https://github.com/bellingcat/wayback-google-analytics](https://github.com/bellingcat/wayback-google-analytics)
331 |
332 | (back to top)
333 |
334 |
335 |
336 |
337 | ## Acknowledgments
338 |
339 | * [Bellingcat](https://bellingcat.org) for hosting this project
340 | * [Miguel Ramalho](https://github.com/msramalho) for constant support, thoughtful code reviews and suggesting the original idea for this project
341 |
342 | (back to top)
343 |
344 |
345 |
346 |
347 |
348 | [contributors-shield]: https://img.shields.io/github/contributors/bellingcat/wayback-google-analytics.svg?style=for-the-badge
349 | [contributors-url]: https://github.com/bellingcat/wayback-google-analytics/graphs/contributors
350 | [forks-shield]: https://img.shields.io/github/forks/bellingcat/wayback-google-analytics.svg?style=for-the-badge
351 | [forks-url]: https://github.com/bellingcat/wayback-google-analytics/network/members
352 | [stars-shield]: https://img.shields.io/github/stars/bellingcat/wayback-google-analytics.svg?style=for-the-badge
353 | [stars-url]: https://github.com/bellingcat/wayback-google-analytics/stargazers
354 | [issues-shield]: https://img.shields.io/github/issues/bellingcat/wayback-google-analytics.svg?style=for-the-badge
355 | [issues-url]: https://github.com/bellingcat/wayback-google-analytics/issues
356 | [pypi-shield]: https://img.shields.io/pypi/v/wayback-google-analytics
357 | [pypi-url]: https://pypi.org/project/wayback-google-analytics/
358 | [license-shield]: https://img.shields.io/github/license/bellingcat/wayback-google-analytics.svg?style=for-the-badge
359 | [license-url]: https://github.com/bellingcat/wayback-google-analytics/blob/master/LICENSE.txt
360 | [linkedin-shield]: https://img.shields.io/badge/-LinkedIn-black.svg?style=for-the-badge&logo=linkedin&colorB=555
361 | [linkedin-url]: https://linkedin.com/in/justin-w-clark
362 | [Python]: https://img.shields.io/badge/Python-3776AB?style=for-the-badge&logo=python&logoColor=white
363 | [Pandas]: https://img.shields.io/badge/pandas-%23150458.svg?style=for-the-badge&logo=pandas&logoColor=white
364 |
--------------------------------------------------------------------------------
/docs/imgs/jsonoutput1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/bellingcat/wayback-google-analytics/e697ce9d0e2594175867e911b03198ae986ffce6/docs/imgs/jsonoutput1.png
--------------------------------------------------------------------------------
/docs/imgs/xlsxbycode.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/bellingcat/wayback-google-analytics/e697ce9d0e2594175867e911b03198ae986ffce6/docs/imgs/xlsxbycode.png
--------------------------------------------------------------------------------
/docs/imgs/xlsxbysite.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/bellingcat/wayback-google-analytics/e697ce9d0e2594175867e911b03198ae986ffce6/docs/imgs/xlsxbysite.png
--------------------------------------------------------------------------------
/poetry.lock:
--------------------------------------------------------------------------------
1 | # This file is automatically @generated by Poetry 1.6.1 and should not be changed by hand.
2 |
3 | [[package]]
4 | name = "aiohttp"
5 | version = "3.8.5"
6 | description = "Async http client/server framework (asyncio)"
7 | optional = false
8 | python-versions = ">=3.6"
9 | files = [
10 | {file = "aiohttp-3.8.5-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:a94159871304770da4dd371f4291b20cac04e8c94f11bdea1c3478e557fbe0d8"},
11 | {file = "aiohttp-3.8.5-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:13bf85afc99ce6f9ee3567b04501f18f9f8dbbb2ea11ed1a2e079670403a7c84"},
12 | {file = "aiohttp-3.8.5-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:2ce2ac5708501afc4847221a521f7e4b245abf5178cf5ddae9d5b3856ddb2f3a"},
13 | {file = "aiohttp-3.8.5-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:96943e5dcc37a6529d18766597c491798b7eb7a61d48878611298afc1fca946c"},
14 | {file = "aiohttp-3.8.5-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:2ad5c3c4590bb3cc28b4382f031f3783f25ec223557124c68754a2231d989e2b"},
15 | {file = "aiohttp-3.8.5-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0c413c633d0512df4dc7fd2373ec06cc6a815b7b6d6c2f208ada7e9e93a5061d"},
16 | {file = "aiohttp-3.8.5-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:df72ac063b97837a80d80dec8d54c241af059cc9bb42c4de68bd5b61ceb37caa"},
17 | {file = "aiohttp-3.8.5-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c48c5c0271149cfe467c0ff8eb941279fd6e3f65c9a388c984e0e6cf57538e14"},
18 | {file = "aiohttp-3.8.5-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:368a42363c4d70ab52c2c6420a57f190ed3dfaca6a1b19afda8165ee16416a82"},
19 | {file = "aiohttp-3.8.5-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:7607ec3ce4993464368505888af5beb446845a014bc676d349efec0e05085905"},
20 | {file = "aiohttp-3.8.5-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:0d21c684808288a98914e5aaf2a7c6a3179d4df11d249799c32d1808e79503b5"},
21 | {file = "aiohttp-3.8.5-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:312fcfbacc7880a8da0ae8b6abc6cc7d752e9caa0051a53d217a650b25e9a691"},
22 | {file = "aiohttp-3.8.5-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:ad093e823df03bb3fd37e7dec9d4670c34f9e24aeace76808fc20a507cace825"},
23 | {file = "aiohttp-3.8.5-cp310-cp310-win32.whl", hash = "sha256:33279701c04351a2914e1100b62b2a7fdb9a25995c4a104259f9a5ead7ed4802"},
24 | {file = "aiohttp-3.8.5-cp310-cp310-win_amd64.whl", hash = "sha256:6e4a280e4b975a2e7745573e3fc9c9ba0d1194a3738ce1cbaa80626cc9b4f4df"},
25 | {file = "aiohttp-3.8.5-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ae871a964e1987a943d83d6709d20ec6103ca1eaf52f7e0d36ee1b5bebb8b9b9"},
26 | {file = "aiohttp-3.8.5-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:461908b2578955045efde733719d62f2b649c404189a09a632d245b445c9c975"},
27 | {file = "aiohttp-3.8.5-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:72a860c215e26192379f57cae5ab12b168b75db8271f111019509a1196dfc780"},
28 | {file = "aiohttp-3.8.5-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cc14be025665dba6202b6a71cfcdb53210cc498e50068bc088076624471f8bb9"},
29 | {file = "aiohttp-3.8.5-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8af740fc2711ad85f1a5c034a435782fbd5b5f8314c9a3ef071424a8158d7f6b"},
30 | {file = "aiohttp-3.8.5-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:841cd8233cbd2111a0ef0a522ce016357c5e3aff8a8ce92bcfa14cef890d698f"},
31 | {file = "aiohttp-3.8.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5ed1c46fb119f1b59304b5ec89f834f07124cd23ae5b74288e364477641060ff"},
32 | {file = "aiohttp-3.8.5-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:84f8ae3e09a34f35c18fa57f015cc394bd1389bce02503fb30c394d04ee6b938"},
33 | {file = "aiohttp-3.8.5-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:62360cb771707cb70a6fd114b9871d20d7dd2163a0feafe43fd115cfe4fe845e"},
34 | {file = "aiohttp-3.8.5-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:23fb25a9f0a1ca1f24c0a371523546366bb642397c94ab45ad3aedf2941cec6a"},
35 | {file = "aiohttp-3.8.5-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:b0ba0d15164eae3d878260d4c4df859bbdc6466e9e6689c344a13334f988bb53"},
36 | {file = "aiohttp-3.8.5-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:5d20003b635fc6ae3f96d7260281dfaf1894fc3aa24d1888a9b2628e97c241e5"},
37 | {file = "aiohttp-3.8.5-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:0175d745d9e85c40dcc51c8f88c74bfbaef9e7afeeeb9d03c37977270303064c"},
38 | {file = "aiohttp-3.8.5-cp311-cp311-win32.whl", hash = "sha256:2e1b1e51b0774408f091d268648e3d57f7260c1682e7d3a63cb00d22d71bb945"},
39 | {file = "aiohttp-3.8.5-cp311-cp311-win_amd64.whl", hash = "sha256:043d2299f6dfdc92f0ac5e995dfc56668e1587cea7f9aa9d8a78a1b6554e5755"},
40 | {file = "aiohttp-3.8.5-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:cae533195e8122584ec87531d6df000ad07737eaa3c81209e85c928854d2195c"},
41 | {file = "aiohttp-3.8.5-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4f21e83f355643c345177a5d1d8079f9f28b5133bcd154193b799d380331d5d3"},
42 | {file = "aiohttp-3.8.5-cp36-cp36m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a7a75ef35f2df54ad55dbf4b73fe1da96f370e51b10c91f08b19603c64004acc"},
43 | {file = "aiohttp-3.8.5-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2e2e9839e14dd5308ee773c97115f1e0a1cb1d75cbeeee9f33824fa5144c7634"},
44 | {file = "aiohttp-3.8.5-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c44e65da1de4403d0576473e2344828ef9c4c6244d65cf4b75549bb46d40b8dd"},
45 | {file = "aiohttp-3.8.5-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:78d847e4cde6ecc19125ccbc9bfac4a7ab37c234dd88fbb3c5c524e8e14da543"},
46 | {file = "aiohttp-3.8.5-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:c7a815258e5895d8900aec4454f38dca9aed71085f227537208057853f9d13f2"},
47 | {file = "aiohttp-3.8.5-cp36-cp36m-musllinux_1_1_i686.whl", hash = "sha256:8b929b9bd7cd7c3939f8bcfffa92fae7480bd1aa425279d51a89327d600c704d"},
48 | {file = "aiohttp-3.8.5-cp36-cp36m-musllinux_1_1_ppc64le.whl", hash = "sha256:5db3a5b833764280ed7618393832e0853e40f3d3e9aa128ac0ba0f8278d08649"},
49 | {file = "aiohttp-3.8.5-cp36-cp36m-musllinux_1_1_s390x.whl", hash = "sha256:a0215ce6041d501f3155dc219712bc41252d0ab76474615b9700d63d4d9292af"},
50 | {file = "aiohttp-3.8.5-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:fd1ed388ea7fbed22c4968dd64bab0198de60750a25fe8c0c9d4bef5abe13824"},
51 | {file = "aiohttp-3.8.5-cp36-cp36m-win32.whl", hash = "sha256:6e6783bcc45f397fdebc118d772103d751b54cddf5b60fbcc958382d7dd64f3e"},
52 | {file = "aiohttp-3.8.5-cp36-cp36m-win_amd64.whl", hash = "sha256:b5411d82cddd212644cf9360879eb5080f0d5f7d809d03262c50dad02f01421a"},
53 | {file = "aiohttp-3.8.5-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:01d4c0c874aa4ddfb8098e85d10b5e875a70adc63db91f1ae65a4b04d3344cda"},
54 | {file = "aiohttp-3.8.5-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e5980a746d547a6ba173fd5ee85ce9077e72d118758db05d229044b469d9029a"},
55 | {file = "aiohttp-3.8.5-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:2a482e6da906d5e6e653be079b29bc173a48e381600161c9932d89dfae5942ef"},
56 | {file = "aiohttp-3.8.5-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:80bd372b8d0715c66c974cf57fe363621a02f359f1ec81cba97366948c7fc873"},
57 | {file = "aiohttp-3.8.5-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c1161b345c0a444ebcf46bf0a740ba5dcf50612fd3d0528883fdc0eff578006a"},
58 | {file = "aiohttp-3.8.5-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cd56db019015b6acfaaf92e1ac40eb8434847d9bf88b4be4efe5bfd260aee692"},
59 | {file = "aiohttp-3.8.5-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:153c2549f6c004d2754cc60603d4668899c9895b8a89397444a9c4efa282aaf4"},
60 | {file = "aiohttp-3.8.5-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:4a01951fabc4ce26ab791da5f3f24dca6d9a6f24121746eb19756416ff2d881b"},
61 | {file = "aiohttp-3.8.5-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bfb9162dcf01f615462b995a516ba03e769de0789de1cadc0f916265c257e5d8"},
62 | {file = "aiohttp-3.8.5-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:7dde0009408969a43b04c16cbbe252c4f5ef4574ac226bc8815cd7342d2028b6"},
63 | {file = "aiohttp-3.8.5-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:4149d34c32f9638f38f544b3977a4c24052042affa895352d3636fa8bffd030a"},
64 | {file = "aiohttp-3.8.5-cp37-cp37m-win32.whl", hash = "sha256:68c5a82c8779bdfc6367c967a4a1b2aa52cd3595388bf5961a62158ee8a59e22"},
65 | {file = "aiohttp-3.8.5-cp37-cp37m-win_amd64.whl", hash = "sha256:2cf57fb50be5f52bda004b8893e63b48530ed9f0d6c96c84620dc92fe3cd9b9d"},
66 | {file = "aiohttp-3.8.5-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:eca4bf3734c541dc4f374ad6010a68ff6c6748f00451707f39857f429ca36ced"},
67 | {file = "aiohttp-3.8.5-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:1274477e4c71ce8cfe6c1ec2f806d57c015ebf84d83373676036e256bc55d690"},
68 | {file = "aiohttp-3.8.5-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:28c543e54710d6158fc6f439296c7865b29e0b616629767e685a7185fab4a6b9"},
69 | {file = "aiohttp-3.8.5-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:910bec0c49637d213f5d9877105d26e0c4a4de2f8b1b29405ff37e9fc0ad52b8"},
70 | {file = "aiohttp-3.8.5-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5443910d662db951b2e58eb70b0fbe6b6e2ae613477129a5805d0b66c54b6cb7"},
71 | {file = "aiohttp-3.8.5-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2e460be6978fc24e3df83193dc0cc4de46c9909ed92dd47d349a452ef49325b7"},
72 | {file = "aiohttp-3.8.5-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fb1558def481d84f03b45888473fc5a1f35747b5f334ef4e7a571bc0dfcb11f8"},
73 | {file = "aiohttp-3.8.5-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:34dd0c107799dcbbf7d48b53be761a013c0adf5571bf50c4ecad5643fe9cfcd0"},
74 | {file = "aiohttp-3.8.5-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:aa1990247f02a54185dc0dff92a6904521172a22664c863a03ff64c42f9b5410"},
75 | {file = "aiohttp-3.8.5-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:0e584a10f204a617d71d359fe383406305a4b595b333721fa50b867b4a0a1548"},
76 | {file = "aiohttp-3.8.5-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:a3cf433f127efa43fee6b90ea4c6edf6c4a17109d1d037d1a52abec84d8f2e42"},
77 | {file = "aiohttp-3.8.5-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:c11f5b099adafb18e65c2c997d57108b5bbeaa9eeee64a84302c0978b1ec948b"},
78 | {file = "aiohttp-3.8.5-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:84de26ddf621d7ac4c975dbea4c945860e08cccde492269db4e1538a6a6f3c35"},
79 | {file = "aiohttp-3.8.5-cp38-cp38-win32.whl", hash = "sha256:ab88bafedc57dd0aab55fa728ea10c1911f7e4d8b43e1d838a1739f33712921c"},
80 | {file = "aiohttp-3.8.5-cp38-cp38-win_amd64.whl", hash = "sha256:5798a9aad1879f626589f3df0f8b79b3608a92e9beab10e5fda02c8a2c60db2e"},
81 | {file = "aiohttp-3.8.5-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:a6ce61195c6a19c785df04e71a4537e29eaa2c50fe745b732aa937c0c77169f3"},
82 | {file = "aiohttp-3.8.5-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:773dd01706d4db536335fcfae6ea2440a70ceb03dd3e7378f3e815b03c97ab51"},
83 | {file = "aiohttp-3.8.5-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:f83a552443a526ea38d064588613aca983d0ee0038801bc93c0c916428310c28"},
84 | {file = "aiohttp-3.8.5-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1f7372f7341fcc16f57b2caded43e81ddd18df53320b6f9f042acad41f8e049a"},
85 | {file = "aiohttp-3.8.5-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ea353162f249c8097ea63c2169dd1aa55de1e8fecbe63412a9bc50816e87b761"},
86 | {file = "aiohttp-3.8.5-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e5d47ae48db0b2dcf70bc8a3bc72b3de86e2a590fc299fdbbb15af320d2659de"},
87 | {file = "aiohttp-3.8.5-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d827176898a2b0b09694fbd1088c7a31836d1a505c243811c87ae53a3f6273c1"},
88 | {file = "aiohttp-3.8.5-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3562b06567c06439d8b447037bb655ef69786c590b1de86c7ab81efe1c9c15d8"},
89 | {file = "aiohttp-3.8.5-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:4e874cbf8caf8959d2adf572a78bba17cb0e9d7e51bb83d86a3697b686a0ab4d"},
90 | {file = "aiohttp-3.8.5-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:6809a00deaf3810e38c628e9a33271892f815b853605a936e2e9e5129762356c"},
91 | {file = "aiohttp-3.8.5-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:33776e945d89b29251b33a7e7d006ce86447b2cfd66db5e5ded4e5cd0340585c"},
92 | {file = "aiohttp-3.8.5-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:eaeed7abfb5d64c539e2db173f63631455f1196c37d9d8d873fc316470dfbacd"},
93 | {file = "aiohttp-3.8.5-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:e91d635961bec2d8f19dfeb41a539eb94bd073f075ca6dae6c8dc0ee89ad6f91"},
94 | {file = "aiohttp-3.8.5-cp39-cp39-win32.whl", hash = "sha256:00ad4b6f185ec67f3e6562e8a1d2b69660be43070bd0ef6fcec5211154c7df67"},
95 | {file = "aiohttp-3.8.5-cp39-cp39-win_amd64.whl", hash = "sha256:c0a9034379a37ae42dea7ac1e048352d96286626251862e448933c0f59cbd79c"},
96 | {file = "aiohttp-3.8.5.tar.gz", hash = "sha256:b9552ec52cc147dbf1944ac7ac98af7602e51ea2dcd076ed194ca3c0d1c7d0bc"},
97 | ]
98 |
99 | [package.dependencies]
100 | aiosignal = ">=1.1.2"
101 | async-timeout = ">=4.0.0a3,<5.0"
102 | attrs = ">=17.3.0"
103 | charset-normalizer = ">=2.0,<4.0"
104 | frozenlist = ">=1.1.1"
105 | multidict = ">=4.5,<7.0"
106 | yarl = ">=1.0,<2.0"
107 |
108 | [package.extras]
109 | speedups = ["Brotli", "aiodns", "cchardet"]
110 |
111 | [[package]]
112 | name = "aiosignal"
113 | version = "1.3.1"
114 | description = "aiosignal: a list of registered asynchronous callbacks"
115 | optional = false
116 | python-versions = ">=3.7"
117 | files = [
118 | {file = "aiosignal-1.3.1-py3-none-any.whl", hash = "sha256:f8376fb07dd1e86a584e4fcdec80b36b7f81aac666ebc724e2c090300dd83b17"},
119 | {file = "aiosignal-1.3.1.tar.gz", hash = "sha256:54cd96e15e1649b75d6c87526a6ff0b6c1b0dd3459f43d9ca11d48c339b68cfc"},
120 | ]
121 |
122 | [package.dependencies]
123 | frozenlist = ">=1.1.0"
124 |
125 | [[package]]
126 | name = "async-timeout"
127 | version = "4.0.3"
128 | description = "Timeout context manager for asyncio programs"
129 | optional = false
130 | python-versions = ">=3.7"
131 | files = [
132 | {file = "async-timeout-4.0.3.tar.gz", hash = "sha256:4640d96be84d82d02ed59ea2b7105a0f7b33abe8703703cd0ab0bf87c427522f"},
133 | {file = "async_timeout-4.0.3-py3-none-any.whl", hash = "sha256:7405140ff1230c310e51dc27b3145b9092d659ce68ff733fb0cefe3ee42be028"},
134 | ]
135 |
136 | [[package]]
137 | name = "asyncio"
138 | version = "3.4.3"
139 | description = "reference implementation of PEP 3156"
140 | optional = false
141 | python-versions = "*"
142 | files = [
143 | {file = "asyncio-3.4.3-cp33-none-win32.whl", hash = "sha256:b62c9157d36187eca799c378e572c969f0da87cd5fc42ca372d92cdb06e7e1de"},
144 | {file = "asyncio-3.4.3-cp33-none-win_amd64.whl", hash = "sha256:c46a87b48213d7464f22d9a497b9eef8c1928b68320a2fa94240f969f6fec08c"},
145 | {file = "asyncio-3.4.3-py3-none-any.whl", hash = "sha256:c4d18b22701821de07bd6aea8b53d21449ec0ec5680645e5317062ea21817d2d"},
146 | {file = "asyncio-3.4.3.tar.gz", hash = "sha256:83360ff8bc97980e4ff25c964c7bd3923d333d177aa4f7fb736b019f26c7cb41"},
147 | ]
148 |
149 | [[package]]
150 | name = "asynctest"
151 | version = "0.13.0"
152 | description = "Enhance the standard unittest package with features for testing asyncio libraries"
153 | optional = false
154 | python-versions = ">=3.5"
155 | files = [
156 | {file = "asynctest-0.13.0-py3-none-any.whl", hash = "sha256:5da6118a7e6d6b54d83a8f7197769d046922a44d2a99c21382f0a6e4fadae676"},
157 | {file = "asynctest-0.13.0.tar.gz", hash = "sha256:c27862842d15d83e6a34eb0b2866c323880eb3a75e4485b079ea11748fd77fac"},
158 | ]
159 |
160 | [[package]]
161 | name = "attrs"
162 | version = "23.1.0"
163 | description = "Classes Without Boilerplate"
164 | optional = false
165 | python-versions = ">=3.7"
166 | files = [
167 | {file = "attrs-23.1.0-py3-none-any.whl", hash = "sha256:1f28b4522cdc2fb4256ac1a020c78acf9cba2c6b461ccd2c126f3aa8e8335d04"},
168 | {file = "attrs-23.1.0.tar.gz", hash = "sha256:6279836d581513a26f1bf235f9acd333bc9115683f14f7e8fae46c98fc50e015"},
169 | ]
170 |
171 | [package.extras]
172 | cov = ["attrs[tests]", "coverage[toml] (>=5.3)"]
173 | dev = ["attrs[docs,tests]", "pre-commit"]
174 | docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-towncrier", "towncrier", "zope-interface"]
175 | tests = ["attrs[tests-no-zope]", "zope-interface"]
176 | tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
177 |
178 | [[package]]
179 | name = "beautifulsoup4"
180 | version = "4.12.2"
181 | description = "Screen-scraping library"
182 | optional = false
183 | python-versions = ">=3.6.0"
184 | files = [
185 | {file = "beautifulsoup4-4.12.2-py3-none-any.whl", hash = "sha256:bd2520ca0d9d7d12694a53d44ac482d181b4ec1888909b035a3dbf40d0f57d4a"},
186 | {file = "beautifulsoup4-4.12.2.tar.gz", hash = "sha256:492bbc69dca35d12daac71c4db1bfff0c876c00ef4a2ffacce226d4638eb72da"},
187 | ]
188 |
189 | [package.dependencies]
190 | soupsieve = ">1.2"
191 |
192 | [package.extras]
193 | html5lib = ["html5lib"]
194 | lxml = ["lxml"]
195 |
196 | [[package]]
197 | name = "certifi"
198 | version = "2023.7.22"
199 | description = "Python package for providing Mozilla's CA Bundle."
200 | optional = false
201 | python-versions = ">=3.6"
202 | files = [
203 | {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"},
204 | {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"},
205 | ]
206 |
207 | [[package]]
208 | name = "charset-normalizer"
209 | version = "3.2.0"
210 | description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
211 | optional = false
212 | python-versions = ">=3.7.0"
213 | files = [
214 | {file = "charset-normalizer-3.2.0.tar.gz", hash = "sha256:3bb3d25a8e6c0aedd251753a79ae98a093c7e7b471faa3aa9a93a81431987ace"},
215 | {file = "charset_normalizer-3.2.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:0b87549028f680ca955556e3bd57013ab47474c3124dc069faa0b6545b6c9710"},
216 | {file = "charset_normalizer-3.2.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:7c70087bfee18a42b4040bb9ec1ca15a08242cf5867c58726530bdf3945672ed"},
217 | {file = "charset_normalizer-3.2.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:a103b3a7069b62f5d4890ae1b8f0597618f628b286b03d4bc9195230b154bfa9"},
218 | {file = "charset_normalizer-3.2.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:94aea8eff76ee6d1cdacb07dd2123a68283cb5569e0250feab1240058f53b623"},
219 | {file = "charset_normalizer-3.2.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:db901e2ac34c931d73054d9797383d0f8009991e723dab15109740a63e7f902a"},
220 | {file = "charset_normalizer-3.2.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b0dac0ff919ba34d4df1b6131f59ce95b08b9065233446be7e459f95554c0dc8"},
221 | {file = "charset_normalizer-3.2.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:193cbc708ea3aca45e7221ae58f0fd63f933753a9bfb498a3b474878f12caaad"},
222 | {file = "charset_normalizer-3.2.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:09393e1b2a9461950b1c9a45d5fd251dc7c6f228acab64da1c9c0165d9c7765c"},
223 | {file = "charset_normalizer-3.2.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:baacc6aee0b2ef6f3d308e197b5d7a81c0e70b06beae1f1fcacffdbd124fe0e3"},
224 | {file = "charset_normalizer-3.2.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:bf420121d4c8dce6b889f0e8e4ec0ca34b7f40186203f06a946fa0276ba54029"},
225 | {file = "charset_normalizer-3.2.0-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:c04a46716adde8d927adb9457bbe39cf473e1e2c2f5d0a16ceb837e5d841ad4f"},
226 | {file = "charset_normalizer-3.2.0-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:aaf63899c94de41fe3cf934601b0f7ccb6b428c6e4eeb80da72c58eab077b19a"},
227 | {file = "charset_normalizer-3.2.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:d62e51710986674142526ab9f78663ca2b0726066ae26b78b22e0f5e571238dd"},
228 | {file = "charset_normalizer-3.2.0-cp310-cp310-win32.whl", hash = "sha256:04e57ab9fbf9607b77f7d057974694b4f6b142da9ed4a199859d9d4d5c63fe96"},
229 | {file = "charset_normalizer-3.2.0-cp310-cp310-win_amd64.whl", hash = "sha256:48021783bdf96e3d6de03a6e39a1171ed5bd7e8bb93fc84cc649d11490f87cea"},
230 | {file = "charset_normalizer-3.2.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:4957669ef390f0e6719db3613ab3a7631e68424604a7b448f079bee145da6e09"},
231 | {file = "charset_normalizer-3.2.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:46fb8c61d794b78ec7134a715a3e564aafc8f6b5e338417cb19fe9f57a5a9bf2"},
232 | {file = "charset_normalizer-3.2.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f779d3ad205f108d14e99bb3859aa7dd8e9c68874617c72354d7ecaec2a054ac"},
233 | {file = "charset_normalizer-3.2.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f25c229a6ba38a35ae6e25ca1264621cc25d4d38dca2942a7fce0b67a4efe918"},
234 | {file = "charset_normalizer-3.2.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:2efb1bd13885392adfda4614c33d3b68dee4921fd0ac1d3988f8cbb7d589e72a"},
235 | {file = "charset_normalizer-3.2.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f30b48dd7fa1474554b0b0f3fdfdd4c13b5c737a3c6284d3cdc424ec0ffff3a"},
236 | {file = "charset_normalizer-3.2.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:246de67b99b6851627d945db38147d1b209a899311b1305dd84916f2b88526c6"},
237 | {file = "charset_normalizer-3.2.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9bd9b3b31adcb054116447ea22caa61a285d92e94d710aa5ec97992ff5eb7cf3"},
238 | {file = "charset_normalizer-3.2.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:8c2f5e83493748286002f9369f3e6607c565a6a90425a3a1fef5ae32a36d749d"},
239 | {file = "charset_normalizer-3.2.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:3170c9399da12c9dc66366e9d14da8bf7147e1e9d9ea566067bbce7bb74bd9c2"},
240 | {file = "charset_normalizer-3.2.0-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:7a4826ad2bd6b07ca615c74ab91f32f6c96d08f6fcc3902ceeedaec8cdc3bcd6"},
241 | {file = "charset_normalizer-3.2.0-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:3b1613dd5aee995ec6d4c69f00378bbd07614702a315a2cf6c1d21461fe17c23"},
242 | {file = "charset_normalizer-3.2.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:9e608aafdb55eb9f255034709e20d5a83b6d60c054df0802fa9c9883d0a937aa"},
243 | {file = "charset_normalizer-3.2.0-cp311-cp311-win32.whl", hash = "sha256:f2a1d0fd4242bd8643ce6f98927cf9c04540af6efa92323e9d3124f57727bfc1"},
244 | {file = "charset_normalizer-3.2.0-cp311-cp311-win_amd64.whl", hash = "sha256:681eb3d7e02e3c3655d1b16059fbfb605ac464c834a0c629048a30fad2b27489"},
245 | {file = "charset_normalizer-3.2.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:c57921cda3a80d0f2b8aec7e25c8aa14479ea92b5b51b6876d975d925a2ea346"},
246 | {file = "charset_normalizer-3.2.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:41b25eaa7d15909cf3ac4c96088c1f266a9a93ec44f87f1d13d4a0e86c81b982"},
247 | {file = "charset_normalizer-3.2.0-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f058f6963fd82eb143c692cecdc89e075fa0828db2e5b291070485390b2f1c9c"},
248 | {file = "charset_normalizer-3.2.0-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a7647ebdfb9682b7bb97e2a5e7cb6ae735b1c25008a70b906aecca294ee96cf4"},
249 | {file = "charset_normalizer-3.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:eef9df1eefada2c09a5e7a40991b9fc6ac6ef20b1372abd48d2794a316dc0449"},
250 | {file = "charset_normalizer-3.2.0-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e03b8895a6990c9ab2cdcd0f2fe44088ca1c65ae592b8f795c3294af00a461c3"},
251 | {file = "charset_normalizer-3.2.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:ee4006268ed33370957f55bf2e6f4d263eaf4dc3cfc473d1d90baff6ed36ce4a"},
252 | {file = "charset_normalizer-3.2.0-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c4983bf937209c57240cff65906b18bb35e64ae872da6a0db937d7b4af845dd7"},
253 | {file = "charset_normalizer-3.2.0-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:3bb7fda7260735efe66d5107fb7e6af6a7c04c7fce9b2514e04b7a74b06bf5dd"},
254 | {file = "charset_normalizer-3.2.0-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:72814c01533f51d68702802d74f77ea026b5ec52793c791e2da806a3844a46c3"},
255 | {file = "charset_normalizer-3.2.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:70c610f6cbe4b9fce272c407dd9d07e33e6bf7b4aa1b7ffb6f6ded8e634e3592"},
256 | {file = "charset_normalizer-3.2.0-cp37-cp37m-win32.whl", hash = "sha256:a401b4598e5d3f4a9a811f3daf42ee2291790c7f9d74b18d75d6e21dda98a1a1"},
257 | {file = "charset_normalizer-3.2.0-cp37-cp37m-win_amd64.whl", hash = "sha256:c0b21078a4b56965e2b12f247467b234734491897e99c1d51cee628da9786959"},
258 | {file = "charset_normalizer-3.2.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:95eb302ff792e12aba9a8b8f8474ab229a83c103d74a750ec0bd1c1eea32e669"},
259 | {file = "charset_normalizer-3.2.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:1a100c6d595a7f316f1b6f01d20815d916e75ff98c27a01ae817439ea7726329"},
260 | {file = "charset_normalizer-3.2.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:6339d047dab2780cc6220f46306628e04d9750f02f983ddb37439ca47ced7149"},
261 | {file = "charset_normalizer-3.2.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e4b749b9cc6ee664a3300bb3a273c1ca8068c46be705b6c31cf5d276f8628a94"},
262 | {file = "charset_normalizer-3.2.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a38856a971c602f98472050165cea2cdc97709240373041b69030be15047691f"},
263 | {file = "charset_normalizer-3.2.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f87f746ee241d30d6ed93969de31e5ffd09a2961a051e60ae6bddde9ec3583aa"},
264 | {file = "charset_normalizer-3.2.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:89f1b185a01fe560bc8ae5f619e924407efca2191b56ce749ec84982fc59a32a"},
265 | {file = "charset_normalizer-3.2.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e1c8a2f4c69e08e89632defbfabec2feb8a8d99edc9f89ce33c4b9e36ab63037"},
266 | {file = "charset_normalizer-3.2.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2f4ac36d8e2b4cc1aa71df3dd84ff8efbe3bfb97ac41242fbcfc053c67434f46"},
267 | {file = "charset_normalizer-3.2.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a386ebe437176aab38c041de1260cd3ea459c6ce5263594399880bbc398225b2"},
268 | {file = "charset_normalizer-3.2.0-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:ccd16eb18a849fd8dcb23e23380e2f0a354e8daa0c984b8a732d9cfaba3a776d"},
269 | {file = "charset_normalizer-3.2.0-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:e6a5bf2cba5ae1bb80b154ed68a3cfa2fa00fde979a7f50d6598d3e17d9ac20c"},
270 | {file = "charset_normalizer-3.2.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:45de3f87179c1823e6d9e32156fb14c1927fcc9aba21433f088fdfb555b77c10"},
271 | {file = "charset_normalizer-3.2.0-cp38-cp38-win32.whl", hash = "sha256:1000fba1057b92a65daec275aec30586c3de2401ccdcd41f8a5c1e2c87078706"},
272 | {file = "charset_normalizer-3.2.0-cp38-cp38-win_amd64.whl", hash = "sha256:8b2c760cfc7042b27ebdb4a43a4453bd829a5742503599144d54a032c5dc7e9e"},
273 | {file = "charset_normalizer-3.2.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:855eafa5d5a2034b4621c74925d89c5efef61418570e5ef9b37717d9c796419c"},
274 | {file = "charset_normalizer-3.2.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:203f0c8871d5a7987be20c72442488a0b8cfd0f43b7973771640fc593f56321f"},
275 | {file = "charset_normalizer-3.2.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:e857a2232ba53ae940d3456f7533ce6ca98b81917d47adc3c7fd55dad8fab858"},
276 | {file = "charset_normalizer-3.2.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5e86d77b090dbddbe78867a0275cb4df08ea195e660f1f7f13435a4649e954e5"},
277 | {file = "charset_normalizer-3.2.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c4fb39a81950ec280984b3a44f5bd12819953dc5fa3a7e6fa7a80db5ee853952"},
278 | {file = "charset_normalizer-3.2.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2dee8e57f052ef5353cf608e0b4c871aee320dd1b87d351c28764fc0ca55f9f4"},
279 | {file = "charset_normalizer-3.2.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8700f06d0ce6f128de3ccdbc1acaea1ee264d2caa9ca05daaf492fde7c2a7200"},
280 | {file = "charset_normalizer-3.2.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1920d4ff15ce893210c1f0c0e9d19bfbecb7983c76b33f046c13a8ffbd570252"},
281 | {file = "charset_normalizer-3.2.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:c1c76a1743432b4b60ab3358c937a3fe1341c828ae6194108a94c69028247f22"},
282 | {file = "charset_normalizer-3.2.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:f7560358a6811e52e9c4d142d497f1a6e10103d3a6881f18d04dbce3729c0e2c"},
283 | {file = "charset_normalizer-3.2.0-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:c8063cf17b19661471ecbdb3df1c84f24ad2e389e326ccaf89e3fb2484d8dd7e"},
284 | {file = "charset_normalizer-3.2.0-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:cd6dbe0238f7743d0efe563ab46294f54f9bc8f4b9bcf57c3c666cc5bc9d1299"},
285 | {file = "charset_normalizer-3.2.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:1249cbbf3d3b04902ff081ffbb33ce3377fa6e4c7356f759f3cd076cc138d020"},
286 | {file = "charset_normalizer-3.2.0-cp39-cp39-win32.whl", hash = "sha256:6c409c0deba34f147f77efaa67b8e4bb83d2f11c8806405f76397ae5b8c0d1c9"},
287 | {file = "charset_normalizer-3.2.0-cp39-cp39-win_amd64.whl", hash = "sha256:7095f6fbfaa55defb6b733cfeb14efaae7a29f0b59d8cf213be4e7ca0b857b80"},
288 | {file = "charset_normalizer-3.2.0-py3-none-any.whl", hash = "sha256:8e098148dd37b4ce3baca71fb394c81dc5d9c7728c95df695d2dca218edf40e6"},
289 | ]
290 |
291 | [[package]]
292 | name = "coverage"
293 | version = "7.3.2"
294 | description = "Code coverage measurement for Python"
295 | optional = false
296 | python-versions = ">=3.8"
297 | files = [
298 | {file = "coverage-7.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d872145f3a3231a5f20fd48500274d7df222e291d90baa2026cc5152b7ce86bf"},
299 | {file = "coverage-7.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:310b3bb9c91ea66d59c53fa4989f57d2436e08f18fb2f421a1b0b6b8cc7fffda"},
300 | {file = "coverage-7.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f47d39359e2c3779c5331fc740cf4bce6d9d680a7b4b4ead97056a0ae07cb49a"},
301 | {file = "coverage-7.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:aa72dbaf2c2068404b9870d93436e6d23addd8bbe9295f49cbca83f6e278179c"},
302 | {file = "coverage-7.3.2-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:beaa5c1b4777f03fc63dfd2a6bd820f73f036bfb10e925fce067b00a340d0f3f"},
303 | {file = "coverage-7.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:dbc1b46b92186cc8074fee9d9fbb97a9dd06c6cbbef391c2f59d80eabdf0faa6"},
304 | {file = "coverage-7.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:315a989e861031334d7bee1f9113c8770472db2ac484e5b8c3173428360a9148"},
305 | {file = "coverage-7.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:d1bc430677773397f64a5c88cb522ea43175ff16f8bfcc89d467d974cb2274f9"},
306 | {file = "coverage-7.3.2-cp310-cp310-win32.whl", hash = "sha256:a889ae02f43aa45032afe364c8ae84ad3c54828c2faa44f3bfcafecb5c96b02f"},
307 | {file = "coverage-7.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:c0ba320de3fb8c6ec16e0be17ee1d3d69adcda99406c43c0409cb5c41788a611"},
308 | {file = "coverage-7.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:ac8c802fa29843a72d32ec56d0ca792ad15a302b28ca6203389afe21f8fa062c"},
309 | {file = "coverage-7.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:89a937174104339e3a3ffcf9f446c00e3a806c28b1841c63edb2b369310fd074"},
310 | {file = "coverage-7.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e267e9e2b574a176ddb983399dec325a80dbe161f1a32715c780b5d14b5f583a"},
311 | {file = "coverage-7.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2443cbda35df0d35dcfb9bf8f3c02c57c1d6111169e3c85fc1fcc05e0c9f39a3"},
312 | {file = "coverage-7.3.2-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4175e10cc8dda0265653e8714b3174430b07c1dca8957f4966cbd6c2b1b8065a"},
313 | {file = "coverage-7.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:0cbf38419fb1a347aaf63481c00f0bdc86889d9fbf3f25109cf96c26b403fda1"},
314 | {file = "coverage-7.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:5c913b556a116b8d5f6ef834038ba983834d887d82187c8f73dec21049abd65c"},
315 | {file = "coverage-7.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:1981f785239e4e39e6444c63a98da3a1db8e971cb9ceb50a945ba6296b43f312"},
316 | {file = "coverage-7.3.2-cp311-cp311-win32.whl", hash = "sha256:43668cabd5ca8258f5954f27a3aaf78757e6acf13c17604d89648ecc0cc66640"},
317 | {file = "coverage-7.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:e10c39c0452bf6e694511c901426d6b5ac005acc0f78ff265dbe36bf81f808a2"},
318 | {file = "coverage-7.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:4cbae1051ab791debecc4a5dcc4a1ff45fc27b91b9aee165c8a27514dd160836"},
319 | {file = "coverage-7.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:12d15ab5833a997716d76f2ac1e4b4d536814fc213c85ca72756c19e5a6b3d63"},
320 | {file = "coverage-7.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3c7bba973ebee5e56fe9251300c00f1579652587a9f4a5ed8404b15a0471f216"},
321 | {file = "coverage-7.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:fe494faa90ce6381770746077243231e0b83ff3f17069d748f645617cefe19d4"},
322 | {file = "coverage-7.3.2-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f6e9589bd04d0461a417562649522575d8752904d35c12907d8c9dfeba588faf"},
323 | {file = "coverage-7.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:d51ac2a26f71da1b57f2dc81d0e108b6ab177e7d30e774db90675467c847bbdf"},
324 | {file = "coverage-7.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:99b89d9f76070237975b315b3d5f4d6956ae354a4c92ac2388a5695516e47c84"},
325 | {file = "coverage-7.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:fa28e909776dc69efb6ed975a63691bc8172b64ff357e663a1bb06ff3c9b589a"},
326 | {file = "coverage-7.3.2-cp312-cp312-win32.whl", hash = "sha256:289fe43bf45a575e3ab10b26d7b6f2ddb9ee2dba447499f5401cfb5ecb8196bb"},
327 | {file = "coverage-7.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:7dbc3ed60e8659bc59b6b304b43ff9c3ed858da2839c78b804973f613d3e92ed"},
328 | {file = "coverage-7.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:f94b734214ea6a36fe16e96a70d941af80ff3bfd716c141300d95ebc85339738"},
329 | {file = "coverage-7.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:af3d828d2c1cbae52d34bdbb22fcd94d1ce715d95f1a012354a75e5913f1bda2"},
330 | {file = "coverage-7.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:630b13e3036e13c7adc480ca42fa7afc2a5d938081d28e20903cf7fd687872e2"},
331 | {file = "coverage-7.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c9eacf273e885b02a0273bb3a2170f30e2d53a6d53b72dbe02d6701b5296101c"},
332 | {file = "coverage-7.3.2-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d8f17966e861ff97305e0801134e69db33b143bbfb36436efb9cfff6ec7b2fd9"},
333 | {file = "coverage-7.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:b4275802d16882cf9c8b3d057a0839acb07ee9379fa2749eca54efbce1535b82"},
334 | {file = "coverage-7.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:72c0cfa5250f483181e677ebc97133ea1ab3eb68645e494775deb6a7f6f83901"},
335 | {file = "coverage-7.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:cb536f0dcd14149425996821a168f6e269d7dcd2c273a8bff8201e79f5104e76"},
336 | {file = "coverage-7.3.2-cp38-cp38-win32.whl", hash = "sha256:307adb8bd3abe389a471e649038a71b4eb13bfd6b7dd9a129fa856f5c695cf92"},
337 | {file = "coverage-7.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:88ed2c30a49ea81ea3b7f172e0269c182a44c236eb394718f976239892c0a27a"},
338 | {file = "coverage-7.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:b631c92dfe601adf8f5ebc7fc13ced6bb6e9609b19d9a8cd59fa47c4186ad1ce"},
339 | {file = "coverage-7.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:d3d9df4051c4a7d13036524b66ecf7a7537d14c18a384043f30a303b146164e9"},
340 | {file = "coverage-7.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5f7363d3b6a1119ef05015959ca24a9afc0ea8a02c687fe7e2d557705375c01f"},
341 | {file = "coverage-7.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2f11cc3c967a09d3695d2a6f03fb3e6236622b93be7a4b5dc09166a861be6d25"},
342 | {file = "coverage-7.3.2-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:149de1d2401ae4655c436a3dced6dd153f4c3309f599c3d4bd97ab172eaf02d9"},
343 | {file = "coverage-7.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:3a4006916aa6fee7cd38db3bfc95aa9c54ebb4ffbfc47c677c8bba949ceba0a6"},
344 | {file = "coverage-7.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9028a3871280110d6e1aa2df1afd5ef003bab5fb1ef421d6dc748ae1c8ef2ebc"},
345 | {file = "coverage-7.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:9f805d62aec8eb92bab5b61c0f07329275b6f41c97d80e847b03eb894f38d083"},
346 | {file = "coverage-7.3.2-cp39-cp39-win32.whl", hash = "sha256:d1c88ec1a7ff4ebca0219f5b1ef863451d828cccf889c173e1253aa84b1e07ce"},
347 | {file = "coverage-7.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b4767da59464bb593c07afceaddea61b154136300881844768037fd5e859353f"},
348 | {file = "coverage-7.3.2-pp38.pp39.pp310-none-any.whl", hash = "sha256:ae97af89f0fbf373400970c0a21eef5aa941ffeed90aee43650b81f7d7f47637"},
349 | {file = "coverage-7.3.2.tar.gz", hash = "sha256:be32ad29341b0170e795ca590e1c07e81fc061cb5b10c74ce7203491484404ef"},
350 | ]
351 |
352 | [package.extras]
353 | toml = ["tomli"]
354 |
355 | [[package]]
356 | name = "et-xmlfile"
357 | version = "1.1.0"
358 | description = "An implementation of lxml.xmlfile for the standard library"
359 | optional = false
360 | python-versions = ">=3.6"
361 | files = [
362 | {file = "et_xmlfile-1.1.0-py3-none-any.whl", hash = "sha256:a2ba85d1d6a74ef63837eed693bcb89c3f752169b0e3e7ae5b16ca5e1b3deada"},
363 | {file = "et_xmlfile-1.1.0.tar.gz", hash = "sha256:8eb9e2bc2f8c97e37a2dc85a09ecdcdec9d8a396530a6d5a33b30b9a92da0c5c"},
364 | ]
365 |
366 | [[package]]
367 | name = "frozenlist"
368 | version = "1.4.0"
369 | description = "A list-like structure which implements collections.abc.MutableSequence"
370 | optional = false
371 | python-versions = ">=3.8"
372 | files = [
373 | {file = "frozenlist-1.4.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:764226ceef3125e53ea2cb275000e309c0aa5464d43bd72abd661e27fffc26ab"},
374 | {file = "frozenlist-1.4.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d6484756b12f40003c6128bfcc3fa9f0d49a687e171186c2d85ec82e3758c559"},
375 | {file = "frozenlist-1.4.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9ac08e601308e41eb533f232dbf6b7e4cea762f9f84f6357136eed926c15d12c"},
376 | {file = "frozenlist-1.4.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d081f13b095d74b67d550de04df1c756831f3b83dc9881c38985834387487f1b"},
377 | {file = "frozenlist-1.4.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:71932b597f9895f011f47f17d6428252fc728ba2ae6024e13c3398a087c2cdea"},
378 | {file = "frozenlist-1.4.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:981b9ab5a0a3178ff413bca62526bb784249421c24ad7381e39d67981be2c326"},
379 | {file = "frozenlist-1.4.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e41f3de4df3e80de75845d3e743b3f1c4c8613c3997a912dbf0229fc61a8b963"},
380 | {file = "frozenlist-1.4.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6918d49b1f90821e93069682c06ffde41829c346c66b721e65a5c62b4bab0300"},
381 | {file = "frozenlist-1.4.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:0e5c8764c7829343d919cc2dfc587a8db01c4f70a4ebbc49abde5d4b158b007b"},
382 | {file = "frozenlist-1.4.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:8d0edd6b1c7fb94922bf569c9b092ee187a83f03fb1a63076e7774b60f9481a8"},
383 | {file = "frozenlist-1.4.0-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:e29cda763f752553fa14c68fb2195150bfab22b352572cb36c43c47bedba70eb"},
384 | {file = "frozenlist-1.4.0-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:0c7c1b47859ee2cac3846fde1c1dc0f15da6cec5a0e5c72d101e0f83dcb67ff9"},
385 | {file = "frozenlist-1.4.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:901289d524fdd571be1c7be054f48b1f88ce8dddcbdf1ec698b27d4b8b9e5d62"},
386 | {file = "frozenlist-1.4.0-cp310-cp310-win32.whl", hash = "sha256:1a0848b52815006ea6596c395f87449f693dc419061cc21e970f139d466dc0a0"},
387 | {file = "frozenlist-1.4.0-cp310-cp310-win_amd64.whl", hash = "sha256:b206646d176a007466358aa21d85cd8600a415c67c9bd15403336c331a10d956"},
388 | {file = "frozenlist-1.4.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:de343e75f40e972bae1ef6090267f8260c1446a1695e77096db6cfa25e759a95"},
389 | {file = "frozenlist-1.4.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:ad2a9eb6d9839ae241701d0918f54c51365a51407fd80f6b8289e2dfca977cc3"},
390 | {file = "frozenlist-1.4.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:bd7bd3b3830247580de99c99ea2a01416dfc3c34471ca1298bccabf86d0ff4dc"},
391 | {file = "frozenlist-1.4.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bdf1847068c362f16b353163391210269e4f0569a3c166bc6a9f74ccbfc7e839"},
392 | {file = "frozenlist-1.4.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:38461d02d66de17455072c9ba981d35f1d2a73024bee7790ac2f9e361ef1cd0c"},
393 | {file = "frozenlist-1.4.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d5a32087d720c608f42caed0ef36d2b3ea61a9d09ee59a5142d6070da9041b8f"},
394 | {file = "frozenlist-1.4.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:dd65632acaf0d47608190a71bfe46b209719bf2beb59507db08ccdbe712f969b"},
395 | {file = "frozenlist-1.4.0-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:261b9f5d17cac914531331ff1b1d452125bf5daa05faf73b71d935485b0c510b"},
396 | {file = "frozenlist-1.4.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:b89ac9768b82205936771f8d2eb3ce88503b1556324c9f903e7156669f521472"},
397 | {file = "frozenlist-1.4.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:008eb8b31b3ea6896da16c38c1b136cb9fec9e249e77f6211d479db79a4eaf01"},
398 | {file = "frozenlist-1.4.0-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:e74b0506fa5aa5598ac6a975a12aa8928cbb58e1f5ac8360792ef15de1aa848f"},
399 | {file = "frozenlist-1.4.0-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:490132667476f6781b4c9458298b0c1cddf237488abd228b0b3650e5ecba7467"},
400 | {file = "frozenlist-1.4.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:76d4711f6f6d08551a7e9ef28c722f4a50dd0fc204c56b4bcd95c6cc05ce6fbb"},
401 | {file = "frozenlist-1.4.0-cp311-cp311-win32.whl", hash = "sha256:a02eb8ab2b8f200179b5f62b59757685ae9987996ae549ccf30f983f40602431"},
402 | {file = "frozenlist-1.4.0-cp311-cp311-win_amd64.whl", hash = "sha256:515e1abc578dd3b275d6a5114030b1330ba044ffba03f94091842852f806f1c1"},
403 | {file = "frozenlist-1.4.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:f0ed05f5079c708fe74bf9027e95125334b6978bf07fd5ab923e9e55e5fbb9d3"},
404 | {file = "frozenlist-1.4.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:ca265542ca427bf97aed183c1676e2a9c66942e822b14dc6e5f42e038f92a503"},
405 | {file = "frozenlist-1.4.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:491e014f5c43656da08958808588cc6c016847b4360e327a62cb308c791bd2d9"},
406 | {file = "frozenlist-1.4.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:17ae5cd0f333f94f2e03aaf140bb762c64783935cc764ff9c82dff626089bebf"},
407 | {file = "frozenlist-1.4.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1e78fb68cf9c1a6aa4a9a12e960a5c9dfbdb89b3695197aa7064705662515de2"},
408 | {file = "frozenlist-1.4.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d5655a942f5f5d2c9ed93d72148226d75369b4f6952680211972a33e59b1dfdc"},
409 | {file = "frozenlist-1.4.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c11b0746f5d946fecf750428a95f3e9ebe792c1ee3b1e96eeba145dc631a9672"},
410 | {file = "frozenlist-1.4.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e66d2a64d44d50d2543405fb183a21f76b3b5fd16f130f5c99187c3fb4e64919"},
411 | {file = "frozenlist-1.4.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:88f7bc0fcca81f985f78dd0fa68d2c75abf8272b1f5c323ea4a01a4d7a614efc"},
412 | {file = "frozenlist-1.4.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:5833593c25ac59ede40ed4de6d67eb42928cca97f26feea219f21d0ed0959b79"},
413 | {file = "frozenlist-1.4.0-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:fec520865f42e5c7f050c2a79038897b1c7d1595e907a9e08e3353293ffc948e"},
414 | {file = "frozenlist-1.4.0-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:b826d97e4276750beca7c8f0f1a4938892697a6bcd8ec8217b3312dad6982781"},
415 | {file = "frozenlist-1.4.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:ceb6ec0a10c65540421e20ebd29083c50e6d1143278746a4ef6bcf6153171eb8"},
416 | {file = "frozenlist-1.4.0-cp38-cp38-win32.whl", hash = "sha256:2b8bcf994563466db019fab287ff390fffbfdb4f905fc77bc1c1d604b1c689cc"},
417 | {file = "frozenlist-1.4.0-cp38-cp38-win_amd64.whl", hash = "sha256:a6c8097e01886188e5be3e6b14e94ab365f384736aa1fca6a0b9e35bd4a30bc7"},
418 | {file = "frozenlist-1.4.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:6c38721585f285203e4b4132a352eb3daa19121a035f3182e08e437cface44bf"},
419 | {file = "frozenlist-1.4.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:a0c6da9aee33ff0b1a451e867da0c1f47408112b3391dd43133838339e410963"},
420 | {file = "frozenlist-1.4.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:93ea75c050c5bb3d98016b4ba2497851eadf0ac154d88a67d7a6816206f6fa7f"},
421 | {file = "frozenlist-1.4.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f61e2dc5ad442c52b4887f1fdc112f97caeff4d9e6ebe78879364ac59f1663e1"},
422 | {file = "frozenlist-1.4.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:aa384489fefeb62321b238e64c07ef48398fe80f9e1e6afeff22e140e0850eef"},
423 | {file = "frozenlist-1.4.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:10ff5faaa22786315ef57097a279b833ecab1a0bfb07d604c9cbb1c4cdc2ed87"},
424 | {file = "frozenlist-1.4.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:007df07a6e3eb3e33e9a1fe6a9db7af152bbd8a185f9aaa6ece10a3529e3e1c6"},
425 | {file = "frozenlist-1.4.0-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7f4f399d28478d1f604c2ff9119907af9726aed73680e5ed1ca634d377abb087"},
426 | {file = "frozenlist-1.4.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:c5374b80521d3d3f2ec5572e05adc94601985cc526fb276d0c8574a6d749f1b3"},
427 | {file = "frozenlist-1.4.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:ce31ae3e19f3c902de379cf1323d90c649425b86de7bbdf82871b8a2a0615f3d"},
428 | {file = "frozenlist-1.4.0-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7211ef110a9194b6042449431e08c4d80c0481e5891e58d429df5899690511c2"},
429 | {file = "frozenlist-1.4.0-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:556de4430ce324c836789fa4560ca62d1591d2538b8ceb0b4f68fb7b2384a27a"},
430 | {file = "frozenlist-1.4.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:7645a8e814a3ee34a89c4a372011dcd817964ce8cb273c8ed6119d706e9613e3"},
431 | {file = "frozenlist-1.4.0-cp39-cp39-win32.whl", hash = "sha256:19488c57c12d4e8095a922f328df3f179c820c212940a498623ed39160bc3c2f"},
432 | {file = "frozenlist-1.4.0-cp39-cp39-win_amd64.whl", hash = "sha256:6221d84d463fb110bdd7619b69cb43878a11d51cbb9394ae3105d082d5199167"},
433 | {file = "frozenlist-1.4.0.tar.gz", hash = "sha256:09163bdf0b2907454042edb19f887c6d33806adc71fbd54afc14908bfdc22251"},
434 | ]
435 |
436 | [[package]]
437 | name = "idna"
438 | version = "3.4"
439 | description = "Internationalized Domain Names in Applications (IDNA)"
440 | optional = false
441 | python-versions = ">=3.5"
442 | files = [
443 | {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
444 | {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
445 | ]
446 |
447 | [[package]]
448 | name = "multidict"
449 | version = "6.0.4"
450 | description = "multidict implementation"
451 | optional = false
452 | python-versions = ">=3.7"
453 | files = [
454 | {file = "multidict-6.0.4-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:0b1a97283e0c85772d613878028fec909f003993e1007eafa715b24b377cb9b8"},
455 | {file = "multidict-6.0.4-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:eeb6dcc05e911516ae3d1f207d4b0520d07f54484c49dfc294d6e7d63b734171"},
456 | {file = "multidict-6.0.4-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d6d635d5209b82a3492508cf5b365f3446afb65ae7ebd755e70e18f287b0adf7"},
457 | {file = "multidict-6.0.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c048099e4c9e9d615545e2001d3d8a4380bd403e1a0578734e0d31703d1b0c0b"},
458 | {file = "multidict-6.0.4-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ea20853c6dbbb53ed34cb4d080382169b6f4554d394015f1bef35e881bf83547"},
459 | {file = "multidict-6.0.4-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:16d232d4e5396c2efbbf4f6d4df89bfa905eb0d4dc5b3549d872ab898451f569"},
460 | {file = "multidict-6.0.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:36c63aaa167f6c6b04ef2c85704e93af16c11d20de1d133e39de6a0e84582a93"},
461 | {file = "multidict-6.0.4-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:64bdf1086b6043bf519869678f5f2757f473dee970d7abf6da91ec00acb9cb98"},
462 | {file = "multidict-6.0.4-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:43644e38f42e3af682690876cff722d301ac585c5b9e1eacc013b7a3f7b696a0"},
463 | {file = "multidict-6.0.4-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:7582a1d1030e15422262de9f58711774e02fa80df0d1578995c76214f6954988"},
464 | {file = "multidict-6.0.4-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:ddff9c4e225a63a5afab9dd15590432c22e8057e1a9a13d28ed128ecf047bbdc"},
465 | {file = "multidict-6.0.4-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:ee2a1ece51b9b9e7752e742cfb661d2a29e7bcdba2d27e66e28a99f1890e4fa0"},
466 | {file = "multidict-6.0.4-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:a2e4369eb3d47d2034032a26c7a80fcb21a2cb22e1173d761a162f11e562caa5"},
467 | {file = "multidict-6.0.4-cp310-cp310-win32.whl", hash = "sha256:574b7eae1ab267e5f8285f0fe881f17efe4b98c39a40858247720935b893bba8"},
468 | {file = "multidict-6.0.4-cp310-cp310-win_amd64.whl", hash = "sha256:4dcbb0906e38440fa3e325df2359ac6cb043df8e58c965bb45f4e406ecb162cc"},
469 | {file = "multidict-6.0.4-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:0dfad7a5a1e39c53ed00d2dd0c2e36aed4650936dc18fd9a1826a5ae1cad6f03"},
470 | {file = "multidict-6.0.4-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:64da238a09d6039e3bd39bb3aee9c21a5e34f28bfa5aa22518581f910ff94af3"},
471 | {file = "multidict-6.0.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:ff959bee35038c4624250473988b24f846cbeb2c6639de3602c073f10410ceba"},
472 | {file = "multidict-6.0.4-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:01a3a55bd90018c9c080fbb0b9f4891db37d148a0a18722b42f94694f8b6d4c9"},
473 | {file = "multidict-6.0.4-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c5cb09abb18c1ea940fb99360ea0396f34d46566f157122c92dfa069d3e0e982"},
474 | {file = "multidict-6.0.4-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:666daae833559deb2d609afa4490b85830ab0dfca811a98b70a205621a6109fe"},
475 | {file = "multidict-6.0.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:11bdf3f5e1518b24530b8241529d2050014c884cf18b6fc69c0c2b30ca248710"},
476 | {file = "multidict-6.0.4-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7d18748f2d30f94f498e852c67d61261c643b349b9d2a581131725595c45ec6c"},
477 | {file = "multidict-6.0.4-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:458f37be2d9e4c95e2d8866a851663cbc76e865b78395090786f6cd9b3bbf4f4"},
478 | {file = "multidict-6.0.4-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:b1a2eeedcead3a41694130495593a559a668f382eee0727352b9a41e1c45759a"},
479 | {file = "multidict-6.0.4-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:7d6ae9d593ef8641544d6263c7fa6408cc90370c8cb2bbb65f8d43e5b0351d9c"},
480 | {file = "multidict-6.0.4-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:5979b5632c3e3534e42ca6ff856bb24b2e3071b37861c2c727ce220d80eee9ed"},
481 | {file = "multidict-6.0.4-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:dcfe792765fab89c365123c81046ad4103fcabbc4f56d1c1997e6715e8015461"},
482 | {file = "multidict-6.0.4-cp311-cp311-win32.whl", hash = "sha256:3601a3cece3819534b11d4efc1eb76047488fddd0c85a3948099d5da4d504636"},
483 | {file = "multidict-6.0.4-cp311-cp311-win_amd64.whl", hash = "sha256:81a4f0b34bd92df3da93315c6a59034df95866014ac08535fc819f043bfd51f0"},
484 | {file = "multidict-6.0.4-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:67040058f37a2a51ed8ea8f6b0e6ee5bd78ca67f169ce6122f3e2ec80dfe9b78"},
485 | {file = "multidict-6.0.4-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:853888594621e6604c978ce2a0444a1e6e70c8d253ab65ba11657659dcc9100f"},
486 | {file = "multidict-6.0.4-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:39ff62e7d0f26c248b15e364517a72932a611a9b75f35b45be078d81bdb86603"},
487 | {file = "multidict-6.0.4-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:af048912e045a2dc732847d33821a9d84ba553f5c5f028adbd364dd4765092ac"},
488 | {file = "multidict-6.0.4-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b1e8b901e607795ec06c9e42530788c45ac21ef3aaa11dbd0c69de543bfb79a9"},
489 | {file = "multidict-6.0.4-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:62501642008a8b9871ddfccbf83e4222cf8ac0d5aeedf73da36153ef2ec222d2"},
490 | {file = "multidict-6.0.4-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:99b76c052e9f1bc0721f7541e5e8c05db3941eb9ebe7b8553c625ef88d6eefde"},
491 | {file = "multidict-6.0.4-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:509eac6cf09c794aa27bcacfd4d62c885cce62bef7b2c3e8b2e49d365b5003fe"},
492 | {file = "multidict-6.0.4-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:21a12c4eb6ddc9952c415f24eef97e3e55ba3af61f67c7bc388dcdec1404a067"},
493 | {file = "multidict-6.0.4-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:5cad9430ab3e2e4fa4a2ef4450f548768400a2ac635841bc2a56a2052cdbeb87"},
494 | {file = "multidict-6.0.4-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:ab55edc2e84460694295f401215f4a58597f8f7c9466faec545093045476327d"},
495 | {file = "multidict-6.0.4-cp37-cp37m-win32.whl", hash = "sha256:5a4dcf02b908c3b8b17a45fb0f15b695bf117a67b76b7ad18b73cf8e92608775"},
496 | {file = "multidict-6.0.4-cp37-cp37m-win_amd64.whl", hash = "sha256:6ed5f161328b7df384d71b07317f4d8656434e34591f20552c7bcef27b0ab88e"},
497 | {file = "multidict-6.0.4-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:5fc1b16f586f049820c5c5b17bb4ee7583092fa0d1c4e28b5239181ff9532e0c"},
498 | {file = "multidict-6.0.4-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:1502e24330eb681bdaa3eb70d6358e818e8e8f908a22a1851dfd4e15bc2f8161"},
499 | {file = "multidict-6.0.4-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:b692f419760c0e65d060959df05f2a531945af31fda0c8a3b3195d4efd06de11"},
500 | {file = "multidict-6.0.4-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:45e1ecb0379bfaab5eef059f50115b54571acfbe422a14f668fc8c27ba410e7e"},
501 | {file = "multidict-6.0.4-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ddd3915998d93fbcd2566ddf9cf62cdb35c9e093075f862935573d265cf8f65d"},
502 | {file = "multidict-6.0.4-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:59d43b61c59d82f2effb39a93c48b845efe23a3852d201ed2d24ba830d0b4cf2"},
503 | {file = "multidict-6.0.4-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cc8e1d0c705233c5dd0c5e6460fbad7827d5d36f310a0fadfd45cc3029762258"},
504 | {file = "multidict-6.0.4-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d6aa0418fcc838522256761b3415822626f866758ee0bc6632c9486b179d0b52"},
505 | {file = "multidict-6.0.4-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:6748717bb10339c4760c1e63da040f5f29f5ed6e59d76daee30305894069a660"},
506 | {file = "multidict-6.0.4-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:4d1a3d7ef5e96b1c9e92f973e43aa5e5b96c659c9bc3124acbbd81b0b9c8a951"},
507 | {file = "multidict-6.0.4-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4372381634485bec7e46718edc71528024fcdc6f835baefe517b34a33c731d60"},
508 | {file = "multidict-6.0.4-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:fc35cb4676846ef752816d5be2193a1e8367b4c1397b74a565a9d0389c433a1d"},
509 | {file = "multidict-6.0.4-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:4b9d9e4e2b37daddb5c23ea33a3417901fa7c7b3dee2d855f63ee67a0b21e5b1"},
510 | {file = "multidict-6.0.4-cp38-cp38-win32.whl", hash = "sha256:e41b7e2b59679edfa309e8db64fdf22399eec4b0b24694e1b2104fb789207779"},
511 | {file = "multidict-6.0.4-cp38-cp38-win_amd64.whl", hash = "sha256:d6c254ba6e45d8e72739281ebc46ea5eb5f101234f3ce171f0e9f5cc86991480"},
512 | {file = "multidict-6.0.4-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:16ab77bbeb596e14212e7bab8429f24c1579234a3a462105cda4a66904998664"},
513 | {file = "multidict-6.0.4-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:bc779e9e6f7fda81b3f9aa58e3a6091d49ad528b11ed19f6621408806204ad35"},
514 | {file = "multidict-6.0.4-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:4ceef517eca3e03c1cceb22030a3e39cb399ac86bff4e426d4fc6ae49052cc60"},
515 | {file = "multidict-6.0.4-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:281af09f488903fde97923c7744bb001a9b23b039a909460d0f14edc7bf59706"},
516 | {file = "multidict-6.0.4-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:52f2dffc8acaba9a2f27174c41c9e57f60b907bb9f096b36b1a1f3be71c6284d"},
517 | {file = "multidict-6.0.4-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b41156839806aecb3641f3208c0dafd3ac7775b9c4c422d82ee2a45c34ba81ca"},
518 | {file = "multidict-6.0.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d5e3fc56f88cc98ef8139255cf8cd63eb2c586531e43310ff859d6bb3a6b51f1"},
519 | {file = "multidict-6.0.4-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8316a77808c501004802f9beebde51c9f857054a0c871bd6da8280e718444449"},
520 | {file = "multidict-6.0.4-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:f70b98cd94886b49d91170ef23ec5c0e8ebb6f242d734ed7ed677b24d50c82cf"},
521 | {file = "multidict-6.0.4-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:bf6774e60d67a9efe02b3616fee22441d86fab4c6d335f9d2051d19d90a40063"},
522 | {file = "multidict-6.0.4-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:e69924bfcdda39b722ef4d9aa762b2dd38e4632b3641b1d9a57ca9cd18f2f83a"},
523 | {file = "multidict-6.0.4-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:6b181d8c23da913d4ff585afd1155a0e1194c0b50c54fcfe286f70cdaf2b7176"},
524 | {file = "multidict-6.0.4-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:52509b5be062d9eafc8170e53026fbc54cf3b32759a23d07fd935fb04fc22d95"},
525 | {file = "multidict-6.0.4-cp39-cp39-win32.whl", hash = "sha256:27c523fbfbdfd19c6867af7346332b62b586eed663887392cff78d614f9ec313"},
526 | {file = "multidict-6.0.4-cp39-cp39-win_amd64.whl", hash = "sha256:33029f5734336aa0d4c0384525da0387ef89148dc7191aae00ca5fb23d7aafc2"},
527 | {file = "multidict-6.0.4.tar.gz", hash = "sha256:3666906492efb76453c0e7b97f2cf459b0682e7402c0489a95484965dbc1da49"},
528 | ]
529 |
530 | [[package]]
531 | name = "numpy"
532 | version = "1.26.0"
533 | description = "Fundamental package for array computing in Python"
534 | optional = false
535 | python-versions = "<3.13,>=3.9"
536 | files = [
537 | {file = "numpy-1.26.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:f8db2f125746e44dce707dd44d4f4efeea8d7e2b43aace3f8d1f235cfa2733dd"},
538 | {file = "numpy-1.26.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:0621f7daf973d34d18b4e4bafb210bbaf1ef5e0100b5fa750bd9cde84c7ac292"},
539 | {file = "numpy-1.26.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:51be5f8c349fdd1a5568e72713a21f518e7d6707bcf8503b528b88d33b57dc68"},
540 | {file = "numpy-1.26.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:767254ad364991ccfc4d81b8152912e53e103ec192d1bb4ea6b1f5a7117040be"},
541 | {file = "numpy-1.26.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:436c8e9a4bdeeee84e3e59614d38c3dbd3235838a877af8c211cfcac8a80b8d3"},
542 | {file = "numpy-1.26.0-cp310-cp310-win32.whl", hash = "sha256:c2e698cb0c6dda9372ea98a0344245ee65bdc1c9dd939cceed6bb91256837896"},
543 | {file = "numpy-1.26.0-cp310-cp310-win_amd64.whl", hash = "sha256:09aaee96c2cbdea95de76ecb8a586cb687d281c881f5f17bfc0fb7f5890f6b91"},
544 | {file = "numpy-1.26.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:637c58b468a69869258b8ae26f4a4c6ff8abffd4a8334c830ffb63e0feefe99a"},
545 | {file = "numpy-1.26.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:306545e234503a24fe9ae95ebf84d25cba1fdc27db971aa2d9f1ab6bba19a9dd"},
546 | {file = "numpy-1.26.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8c6adc33561bd1d46f81131d5352348350fc23df4d742bb246cdfca606ea1208"},
547 | {file = "numpy-1.26.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e062aa24638bb5018b7841977c360d2f5917268d125c833a686b7cbabbec496c"},
548 | {file = "numpy-1.26.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:546b7dd7e22f3c6861463bebb000646fa730e55df5ee4a0224408b5694cc6148"},
549 | {file = "numpy-1.26.0-cp311-cp311-win32.whl", hash = "sha256:c0b45c8b65b79337dee5134d038346d30e109e9e2e9d43464a2970e5c0e93229"},
550 | {file = "numpy-1.26.0-cp311-cp311-win_amd64.whl", hash = "sha256:eae430ecf5794cb7ae7fa3808740b015aa80747e5266153128ef055975a72b99"},
551 | {file = "numpy-1.26.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:166b36197e9debc4e384e9c652ba60c0bacc216d0fc89e78f973a9760b503388"},
552 | {file = "numpy-1.26.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f042f66d0b4ae6d48e70e28d487376204d3cbf43b84c03bac57e28dac6151581"},
553 | {file = "numpy-1.26.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e5e18e5b14a7560d8acf1c596688f4dfd19b4f2945b245a71e5af4ddb7422feb"},
554 | {file = "numpy-1.26.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7f6bad22a791226d0a5c7c27a80a20e11cfe09ad5ef9084d4d3fc4a299cca505"},
555 | {file = "numpy-1.26.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:4acc65dd65da28060e206c8f27a573455ed724e6179941edb19f97e58161bb69"},
556 | {file = "numpy-1.26.0-cp312-cp312-win32.whl", hash = "sha256:bb0d9a1aaf5f1cb7967320e80690a1d7ff69f1d47ebc5a9bea013e3a21faec95"},
557 | {file = "numpy-1.26.0-cp312-cp312-win_amd64.whl", hash = "sha256:ee84ca3c58fe48b8ddafdeb1db87388dce2c3c3f701bf447b05e4cfcc3679112"},
558 | {file = "numpy-1.26.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:4a873a8180479bc829313e8d9798d5234dfacfc2e8a7ac188418189bb8eafbd2"},
559 | {file = "numpy-1.26.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:914b28d3215e0c721dc75db3ad6d62f51f630cb0c277e6b3bcb39519bed10bd8"},
560 | {file = "numpy-1.26.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c78a22e95182fb2e7874712433eaa610478a3caf86f28c621708d35fa4fd6e7f"},
561 | {file = "numpy-1.26.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:86f737708b366c36b76e953c46ba5827d8c27b7a8c9d0f471810728e5a2fe57c"},
562 | {file = "numpy-1.26.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:b44e6a09afc12952a7d2a58ca0a2429ee0d49a4f89d83a0a11052da696440e49"},
563 | {file = "numpy-1.26.0-cp39-cp39-win32.whl", hash = "sha256:5671338034b820c8d58c81ad1dafc0ed5a00771a82fccc71d6438df00302094b"},
564 | {file = "numpy-1.26.0-cp39-cp39-win_amd64.whl", hash = "sha256:020cdbee66ed46b671429c7265cf00d8ac91c046901c55684954c3958525dab2"},
565 | {file = "numpy-1.26.0-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:0792824ce2f7ea0c82ed2e4fecc29bb86bee0567a080dacaf2e0a01fe7654369"},
566 | {file = "numpy-1.26.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7d484292eaeb3e84a51432a94f53578689ffdea3f90e10c8b203a99be5af57d8"},
567 | {file = "numpy-1.26.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:186ba67fad3c60dbe8a3abff3b67a91351100f2661c8e2a80364ae6279720299"},
568 | {file = "numpy-1.26.0.tar.gz", hash = "sha256:f93fc78fe8bf15afe2b8d6b6499f1c73953169fad1e9a8dd086cdff3190e7fdf"},
569 | ]
570 |
571 | [[package]]
572 | name = "openpyxl"
573 | version = "3.1.2"
574 | description = "A Python library to read/write Excel 2010 xlsx/xlsm files"
575 | optional = false
576 | python-versions = ">=3.6"
577 | files = [
578 | {file = "openpyxl-3.1.2-py2.py3-none-any.whl", hash = "sha256:f91456ead12ab3c6c2e9491cf33ba6d08357d802192379bb482f1033ade496f5"},
579 | {file = "openpyxl-3.1.2.tar.gz", hash = "sha256:a6f5977418eff3b2d5500d54d9db50c8277a368436f4e4f8ddb1be3422870184"},
580 | ]
581 |
582 | [package.dependencies]
583 | et-xmlfile = "*"
584 |
585 | [[package]]
586 | name = "pandas"
587 | version = "2.1.1"
588 | description = "Powerful data structures for data analysis, time series, and statistics"
589 | optional = false
590 | python-versions = ">=3.9"
591 | files = [
592 | {file = "pandas-2.1.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:58d997dbee0d4b64f3cb881a24f918b5f25dd64ddf31f467bb9b67ae4c63a1e4"},
593 | {file = "pandas-2.1.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:02304e11582c5d090e5a52aec726f31fe3f42895d6bfc1f28738f9b64b6f0614"},
594 | {file = "pandas-2.1.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ffa8f0966de2c22de408d0e322db2faed6f6e74265aa0856f3824813cf124363"},
595 | {file = "pandas-2.1.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c1f84c144dee086fe4f04a472b5cd51e680f061adf75c1ae4fc3a9275560f8f4"},
596 | {file = "pandas-2.1.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:75ce97667d06d69396d72be074f0556698c7f662029322027c226fd7a26965cb"},
597 | {file = "pandas-2.1.1-cp310-cp310-win_amd64.whl", hash = "sha256:4c3f32fd7c4dccd035f71734df39231ac1a6ff95e8bdab8d891167197b7018d2"},
598 | {file = "pandas-2.1.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:9e2959720b70e106bb1d8b6eadd8ecd7c8e99ccdbe03ee03260877184bb2877d"},
599 | {file = "pandas-2.1.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:25e8474a8eb258e391e30c288eecec565bfed3e026f312b0cbd709a63906b6f8"},
600 | {file = "pandas-2.1.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b8bd1685556f3374520466998929bade3076aeae77c3e67ada5ed2b90b4de7f0"},
601 | {file = "pandas-2.1.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dc3657869c7902810f32bd072f0740487f9e030c1a3ab03e0af093db35a9d14e"},
602 | {file = "pandas-2.1.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:05674536bd477af36aa2effd4ec8f71b92234ce0cc174de34fd21e2ee99adbc2"},
603 | {file = "pandas-2.1.1-cp311-cp311-win_amd64.whl", hash = "sha256:b407381258a667df49d58a1b637be33e514b07f9285feb27769cedb3ab3d0b3a"},
604 | {file = "pandas-2.1.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:c747793c4e9dcece7bb20156179529898abf505fe32cb40c4052107a3c620b49"},
605 | {file = "pandas-2.1.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3bcad1e6fb34b727b016775bea407311f7721db87e5b409e6542f4546a4951ea"},
606 | {file = "pandas-2.1.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f5ec7740f9ccb90aec64edd71434711f58ee0ea7f5ed4ac48be11cfa9abf7317"},
607 | {file = "pandas-2.1.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:29deb61de5a8a93bdd033df328441a79fcf8dd3c12d5ed0b41a395eef9cd76f0"},
608 | {file = "pandas-2.1.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:4f99bebf19b7e03cf80a4e770a3e65eee9dd4e2679039f542d7c1ace7b7b1daa"},
609 | {file = "pandas-2.1.1-cp312-cp312-win_amd64.whl", hash = "sha256:84e7e910096416adec68075dc87b986ff202920fb8704e6d9c8c9897fe7332d6"},
610 | {file = "pandas-2.1.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:366da7b0e540d1b908886d4feb3d951f2f1e572e655c1160f5fde28ad4abb750"},
611 | {file = "pandas-2.1.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:9e50e72b667415a816ac27dfcfe686dc5a0b02202e06196b943d54c4f9c7693e"},
612 | {file = "pandas-2.1.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cc1ab6a25da197f03ebe6d8fa17273126120874386b4ac11c1d687df288542dd"},
613 | {file = "pandas-2.1.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a0dbfea0dd3901ad4ce2306575c54348d98499c95be01b8d885a2737fe4d7a98"},
614 | {file = "pandas-2.1.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0489b0e6aa3d907e909aef92975edae89b1ee1654db5eafb9be633b0124abe97"},
615 | {file = "pandas-2.1.1-cp39-cp39-win_amd64.whl", hash = "sha256:4cdb0fab0400c2cb46dafcf1a0fe084c8bb2480a1fa8d81e19d15e12e6d4ded2"},
616 | {file = "pandas-2.1.1.tar.gz", hash = "sha256:fecb198dc389429be557cde50a2d46da8434a17fe37d7d41ff102e3987fd947b"},
617 | ]
618 |
619 | [package.dependencies]
620 | numpy = [
621 | {version = ">=1.22.4", markers = "python_version < \"3.11\""},
622 | {version = ">=1.23.2", markers = "python_version == \"3.11\""},
623 | {version = ">=1.26.0", markers = "python_version >= \"3.12\""},
624 | ]
625 | python-dateutil = ">=2.8.2"
626 | pytz = ">=2020.1"
627 | tzdata = ">=2022.1"
628 |
629 | [package.extras]
630 | all = ["PyQt5 (>=5.15.6)", "SQLAlchemy (>=1.4.36)", "beautifulsoup4 (>=4.11.1)", "bottleneck (>=1.3.4)", "dataframe-api-compat (>=0.1.7)", "fastparquet (>=0.8.1)", "fsspec (>=2022.05.0)", "gcsfs (>=2022.05.0)", "html5lib (>=1.1)", "hypothesis (>=6.46.1)", "jinja2 (>=3.1.2)", "lxml (>=4.8.0)", "matplotlib (>=3.6.1)", "numba (>=0.55.2)", "numexpr (>=2.8.0)", "odfpy (>=1.4.1)", "openpyxl (>=3.0.10)", "pandas-gbq (>=0.17.5)", "psycopg2 (>=2.9.3)", "pyarrow (>=7.0.0)", "pymysql (>=1.0.2)", "pyreadstat (>=1.1.5)", "pytest (>=7.3.2)", "pytest-asyncio (>=0.17.0)", "pytest-xdist (>=2.2.0)", "pyxlsb (>=1.0.9)", "qtpy (>=2.2.0)", "s3fs (>=2022.05.0)", "scipy (>=1.8.1)", "tables (>=3.7.0)", "tabulate (>=0.8.10)", "xarray (>=2022.03.0)", "xlrd (>=2.0.1)", "xlsxwriter (>=3.0.3)", "zstandard (>=0.17.0)"]
631 | aws = ["s3fs (>=2022.05.0)"]
632 | clipboard = ["PyQt5 (>=5.15.6)", "qtpy (>=2.2.0)"]
633 | compression = ["zstandard (>=0.17.0)"]
634 | computation = ["scipy (>=1.8.1)", "xarray (>=2022.03.0)"]
635 | consortium-standard = ["dataframe-api-compat (>=0.1.7)"]
636 | excel = ["odfpy (>=1.4.1)", "openpyxl (>=3.0.10)", "pyxlsb (>=1.0.9)", "xlrd (>=2.0.1)", "xlsxwriter (>=3.0.3)"]
637 | feather = ["pyarrow (>=7.0.0)"]
638 | fss = ["fsspec (>=2022.05.0)"]
639 | gcp = ["gcsfs (>=2022.05.0)", "pandas-gbq (>=0.17.5)"]
640 | hdf5 = ["tables (>=3.7.0)"]
641 | html = ["beautifulsoup4 (>=4.11.1)", "html5lib (>=1.1)", "lxml (>=4.8.0)"]
642 | mysql = ["SQLAlchemy (>=1.4.36)", "pymysql (>=1.0.2)"]
643 | output-formatting = ["jinja2 (>=3.1.2)", "tabulate (>=0.8.10)"]
644 | parquet = ["pyarrow (>=7.0.0)"]
645 | performance = ["bottleneck (>=1.3.4)", "numba (>=0.55.2)", "numexpr (>=2.8.0)"]
646 | plot = ["matplotlib (>=3.6.1)"]
647 | postgresql = ["SQLAlchemy (>=1.4.36)", "psycopg2 (>=2.9.3)"]
648 | spss = ["pyreadstat (>=1.1.5)"]
649 | sql-other = ["SQLAlchemy (>=1.4.36)"]
650 | test = ["hypothesis (>=6.46.1)", "pytest (>=7.3.2)", "pytest-asyncio (>=0.17.0)", "pytest-xdist (>=2.2.0)"]
651 | xml = ["lxml (>=4.8.0)"]
652 |
653 | [[package]]
654 | name = "python-dateutil"
655 | version = "2.8.2"
656 | description = "Extensions to the standard Python datetime module"
657 | optional = false
658 | python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7"
659 | files = [
660 | {file = "python-dateutil-2.8.2.tar.gz", hash = "sha256:0123cacc1627ae19ddf3c27a5de5bd67ee4586fbdd6440d9748f8abb483d3e86"},
661 | {file = "python_dateutil-2.8.2-py2.py3-none-any.whl", hash = "sha256:961d03dc3453ebbc59dbdea9e4e11c5651520a876d0f4db161e8674aae935da9"},
662 | ]
663 |
664 | [package.dependencies]
665 | six = ">=1.5"
666 |
667 | [[package]]
668 | name = "python-dotenv"
669 | version = "1.0.0"
670 | description = "Read key-value pairs from a .env file and set them as environment variables"
671 | optional = false
672 | python-versions = ">=3.8"
673 | files = [
674 | {file = "python-dotenv-1.0.0.tar.gz", hash = "sha256:a8df96034aae6d2d50a4ebe8216326c61c3eb64836776504fcca410e5937a3ba"},
675 | {file = "python_dotenv-1.0.0-py3-none-any.whl", hash = "sha256:f5971a9226b701070a4bf2c38c89e5a3f0d64de8debda981d1db98583009122a"},
676 | ]
677 |
678 | [package.extras]
679 | cli = ["click (>=5.0)"]
680 |
681 | [[package]]
682 | name = "pytz"
683 | version = "2023.3.post1"
684 | description = "World timezone definitions, modern and historical"
685 | optional = false
686 | python-versions = "*"
687 | files = [
688 | {file = "pytz-2023.3.post1-py2.py3-none-any.whl", hash = "sha256:ce42d816b81b68506614c11e8937d3aa9e41007ceb50bfdcb0749b921bf646c7"},
689 | {file = "pytz-2023.3.post1.tar.gz", hash = "sha256:7b4fddbeb94a1eba4b557da24f19fdf9db575192544270a9101d8509f9f43d7b"},
690 | ]
691 |
692 | [[package]]
693 | name = "requests"
694 | version = "2.31.0"
695 | description = "Python HTTP for Humans."
696 | optional = false
697 | python-versions = ">=3.7"
698 | files = [
699 | {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
700 | {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
701 | ]
702 |
703 | [package.dependencies]
704 | certifi = ">=2017.4.17"
705 | charset-normalizer = ">=2,<4"
706 | idna = ">=2.5,<4"
707 | urllib3 = ">=1.21.1,<3"
708 |
709 | [package.extras]
710 | socks = ["PySocks (>=1.5.6,!=1.5.7)"]
711 | use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
712 |
713 | [[package]]
714 | name = "six"
715 | version = "1.16.0"
716 | description = "Python 2 and 3 compatibility utilities"
717 | optional = false
718 | python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*"
719 | files = [
720 | {file = "six-1.16.0-py2.py3-none-any.whl", hash = "sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254"},
721 | {file = "six-1.16.0.tar.gz", hash = "sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926"},
722 | ]
723 |
724 | [[package]]
725 | name = "soupsieve"
726 | version = "2.5"
727 | description = "A modern CSS selector implementation for Beautiful Soup."
728 | optional = false
729 | python-versions = ">=3.8"
730 | files = [
731 | {file = "soupsieve-2.5-py3-none-any.whl", hash = "sha256:eaa337ff55a1579b6549dc679565eac1e3d000563bcb1c8ab0d0fefbc0c2cdc7"},
732 | {file = "soupsieve-2.5.tar.gz", hash = "sha256:5663d5a7b3bfaeee0bc4372e7fc48f9cff4940b3eec54a6451cc5299f1097690"},
733 | ]
734 |
735 | [[package]]
736 | name = "tzdata"
737 | version = "2023.3"
738 | description = "Provider of IANA time zone data"
739 | optional = false
740 | python-versions = ">=2"
741 | files = [
742 | {file = "tzdata-2023.3-py2.py3-none-any.whl", hash = "sha256:7e65763eef3120314099b6939b5546db7adce1e7d6f2e179e3df563c70511eda"},
743 | {file = "tzdata-2023.3.tar.gz", hash = "sha256:11ef1e08e54acb0d4f95bdb1be05da659673de4acbd21bf9c69e94cc5e907a3a"},
744 | ]
745 |
746 | [[package]]
747 | name = "urllib3"
748 | version = "2.0.6"
749 | description = "HTTP library with thread-safe connection pooling, file post, and more."
750 | optional = false
751 | python-versions = ">=3.7"
752 | files = [
753 | {file = "urllib3-2.0.6-py3-none-any.whl", hash = "sha256:7a7c7003b000adf9e7ca2a377c9688bbc54ed41b985789ed576570342a375cd2"},
754 | {file = "urllib3-2.0.6.tar.gz", hash = "sha256:b19e1a85d206b56d7df1d5e683df4a7725252a964e3993648dd0fb5a1c157564"},
755 | ]
756 |
757 | [package.extras]
758 | brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
759 | secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
760 | socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
761 | zstd = ["zstandard (>=0.18.0)"]
762 |
763 | [[package]]
764 | name = "xlsxwriter"
765 | version = "3.1.5"
766 | description = "A Python module for creating Excel XLSX files."
767 | optional = false
768 | python-versions = ">=3.6"
769 | files = [
770 | {file = "XlsxWriter-3.1.5-py3-none-any.whl", hash = "sha256:a6d95556f96d6908885523554b3a468d74a711010d0a25b63d47e9ef4ba3bb94"},
771 | {file = "XlsxWriter-3.1.5.tar.gz", hash = "sha256:27eba7b0f9a9ba2b091d5e77f0003a327e022a5fe3e24ee95a5fe5b75784cf45"},
772 | ]
773 |
774 | [[package]]
775 | name = "yarl"
776 | version = "1.9.2"
777 | description = "Yet another URL library"
778 | optional = false
779 | python-versions = ">=3.7"
780 | files = [
781 | {file = "yarl-1.9.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:8c2ad583743d16ddbdf6bb14b5cd76bf43b0d0006e918809d5d4ddf7bde8dd82"},
782 | {file = "yarl-1.9.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:82aa6264b36c50acfb2424ad5ca537a2060ab6de158a5bd2a72a032cc75b9eb8"},
783 | {file = "yarl-1.9.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:c0c77533b5ed4bcc38e943178ccae29b9bcf48ffd1063f5821192f23a1bd27b9"},
784 | {file = "yarl-1.9.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ee4afac41415d52d53a9833ebae7e32b344be72835bbb589018c9e938045a560"},
785 | {file = "yarl-1.9.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9bf345c3a4f5ba7f766430f97f9cc1320786f19584acc7086491f45524a551ac"},
786 | {file = "yarl-1.9.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2a96c19c52ff442a808c105901d0bdfd2e28575b3d5f82e2f5fd67e20dc5f4ea"},
787 | {file = "yarl-1.9.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:891c0e3ec5ec881541f6c5113d8df0315ce5440e244a716b95f2525b7b9f3608"},
788 | {file = "yarl-1.9.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c3a53ba34a636a256d767c086ceb111358876e1fb6b50dfc4d3f4951d40133d5"},
789 | {file = "yarl-1.9.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:566185e8ebc0898b11f8026447eacd02e46226716229cea8db37496c8cdd26e0"},
790 | {file = "yarl-1.9.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:2b0738fb871812722a0ac2154be1f049c6223b9f6f22eec352996b69775b36d4"},
791 | {file = "yarl-1.9.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:32f1d071b3f362c80f1a7d322bfd7b2d11e33d2adf395cc1dd4df36c9c243095"},
792 | {file = "yarl-1.9.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:e9fdc7ac0d42bc3ea78818557fab03af6181e076a2944f43c38684b4b6bed8e3"},
793 | {file = "yarl-1.9.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:56ff08ab5df8429901ebdc5d15941b59f6253393cb5da07b4170beefcf1b2528"},
794 | {file = "yarl-1.9.2-cp310-cp310-win32.whl", hash = "sha256:8ea48e0a2f931064469bdabca50c2f578b565fc446f302a79ba6cc0ee7f384d3"},
795 | {file = "yarl-1.9.2-cp310-cp310-win_amd64.whl", hash = "sha256:50f33040f3836e912ed16d212f6cc1efb3231a8a60526a407aeb66c1c1956dde"},
796 | {file = "yarl-1.9.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:646d663eb2232d7909e6601f1a9107e66f9791f290a1b3dc7057818fe44fc2b6"},
797 | {file = "yarl-1.9.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:aff634b15beff8902d1f918012fc2a42e0dbae6f469fce134c8a0dc51ca423bb"},
798 | {file = "yarl-1.9.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a83503934c6273806aed765035716216cc9ab4e0364f7f066227e1aaea90b8d0"},
799 | {file = "yarl-1.9.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b25322201585c69abc7b0e89e72790469f7dad90d26754717f3310bfe30331c2"},
800 | {file = "yarl-1.9.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:22a94666751778629f1ec4280b08eb11815783c63f52092a5953faf73be24191"},
801 | {file = "yarl-1.9.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8ec53a0ea2a80c5cd1ab397925f94bff59222aa3cf9c6da938ce05c9ec20428d"},
802 | {file = "yarl-1.9.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:159d81f22d7a43e6eabc36d7194cb53f2f15f498dbbfa8edc8a3239350f59fe7"},
803 | {file = "yarl-1.9.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:832b7e711027c114d79dffb92576acd1bd2decc467dec60e1cac96912602d0e6"},
804 | {file = "yarl-1.9.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:95d2ecefbcf4e744ea952d073c6922e72ee650ffc79028eb1e320e732898d7e8"},
805 | {file = "yarl-1.9.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:d4e2c6d555e77b37288eaf45b8f60f0737c9efa3452c6c44626a5455aeb250b9"},
806 | {file = "yarl-1.9.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:783185c75c12a017cc345015ea359cc801c3b29a2966c2655cd12b233bf5a2be"},
807 | {file = "yarl-1.9.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:b8cc1863402472f16c600e3e93d542b7e7542a540f95c30afd472e8e549fc3f7"},
808 | {file = "yarl-1.9.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:822b30a0f22e588b32d3120f6d41e4ed021806418b4c9f0bc3048b8c8cb3f92a"},
809 | {file = "yarl-1.9.2-cp311-cp311-win32.whl", hash = "sha256:a60347f234c2212a9f0361955007fcf4033a75bf600a33c88a0a8e91af77c0e8"},
810 | {file = "yarl-1.9.2-cp311-cp311-win_amd64.whl", hash = "sha256:be6b3fdec5c62f2a67cb3f8c6dbf56bbf3f61c0f046f84645cd1ca73532ea051"},
811 | {file = "yarl-1.9.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:38a3928ae37558bc1b559f67410df446d1fbfa87318b124bf5032c31e3447b74"},
812 | {file = "yarl-1.9.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ac9bb4c5ce3975aeac288cfcb5061ce60e0d14d92209e780c93954076c7c4367"},
813 | {file = "yarl-1.9.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3da8a678ca8b96c8606bbb8bfacd99a12ad5dd288bc6f7979baddd62f71c63ef"},
814 | {file = "yarl-1.9.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:13414591ff516e04fcdee8dc051c13fd3db13b673c7a4cb1350e6b2ad9639ad3"},
815 | {file = "yarl-1.9.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bf74d08542c3a9ea97bb8f343d4fcbd4d8f91bba5ec9d5d7f792dbe727f88938"},
816 | {file = "yarl-1.9.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6e7221580dc1db478464cfeef9b03b95c5852cc22894e418562997df0d074ccc"},
817 | {file = "yarl-1.9.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:494053246b119b041960ddcd20fd76224149cfea8ed8777b687358727911dd33"},
818 | {file = "yarl-1.9.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:52a25809fcbecfc63ac9ba0c0fb586f90837f5425edfd1ec9f3372b119585e45"},
819 | {file = "yarl-1.9.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:e65610c5792870d45d7b68c677681376fcf9cc1c289f23e8e8b39c1485384185"},
820 | {file = "yarl-1.9.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:1b1bba902cba32cdec51fca038fd53f8beee88b77efc373968d1ed021024cc04"},
821 | {file = "yarl-1.9.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:662e6016409828ee910f5d9602a2729a8a57d74b163c89a837de3fea050c7582"},
822 | {file = "yarl-1.9.2-cp37-cp37m-win32.whl", hash = "sha256:f364d3480bffd3aa566e886587eaca7c8c04d74f6e8933f3f2c996b7f09bee1b"},
823 | {file = "yarl-1.9.2-cp37-cp37m-win_amd64.whl", hash = "sha256:6a5883464143ab3ae9ba68daae8e7c5c95b969462bbe42e2464d60e7e2698368"},
824 | {file = "yarl-1.9.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:5610f80cf43b6202e2c33ba3ec2ee0a2884f8f423c8f4f62906731d876ef4fac"},
825 | {file = "yarl-1.9.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:b9a4e67ad7b646cd6f0938c7ebfd60e481b7410f574c560e455e938d2da8e0f4"},
826 | {file = "yarl-1.9.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:83fcc480d7549ccebe9415d96d9263e2d4226798c37ebd18c930fce43dfb9574"},
827 | {file = "yarl-1.9.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5fcd436ea16fee7d4207c045b1e340020e58a2597301cfbcfdbe5abd2356c2fb"},
828 | {file = "yarl-1.9.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:84e0b1599334b1e1478db01b756e55937d4614f8654311eb26012091be109d59"},
829 | {file = "yarl-1.9.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3458a24e4ea3fd8930e934c129b676c27452e4ebda80fbe47b56d8c6c7a63a9e"},
830 | {file = "yarl-1.9.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:838162460b3a08987546e881a2bfa573960bb559dfa739e7800ceeec92e64417"},
831 | {file = "yarl-1.9.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f4e2d08f07a3d7d3e12549052eb5ad3eab1c349c53ac51c209a0e5991bbada78"},
832 | {file = "yarl-1.9.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:de119f56f3c5f0e2fb4dee508531a32b069a5f2c6e827b272d1e0ff5ac040333"},
833 | {file = "yarl-1.9.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:149ddea5abf329752ea5051b61bd6c1d979e13fbf122d3a1f9f0c8be6cb6f63c"},
834 | {file = "yarl-1.9.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:674ca19cbee4a82c9f54e0d1eee28116e63bc6fd1e96c43031d11cbab8b2afd5"},
835 | {file = "yarl-1.9.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:9b3152f2f5677b997ae6c804b73da05a39daa6a9e85a512e0e6823d81cdad7cc"},
836 | {file = "yarl-1.9.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:5415d5a4b080dc9612b1b63cba008db84e908b95848369aa1da3686ae27b6d2b"},
837 | {file = "yarl-1.9.2-cp38-cp38-win32.whl", hash = "sha256:f7a3d8146575e08c29ed1cd287068e6d02f1c7bdff8970db96683b9591b86ee7"},
838 | {file = "yarl-1.9.2-cp38-cp38-win_amd64.whl", hash = "sha256:63c48f6cef34e6319a74c727376e95626f84ea091f92c0250a98e53e62c77c72"},
839 | {file = "yarl-1.9.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:75df5ef94c3fdc393c6b19d80e6ef1ecc9ae2f4263c09cacb178d871c02a5ba9"},
840 | {file = "yarl-1.9.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:c027a6e96ef77d401d8d5a5c8d6bc478e8042f1e448272e8d9752cb0aff8b5c8"},
841 | {file = "yarl-1.9.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:f3b078dbe227f79be488ffcfc7a9edb3409d018e0952cf13f15fd6512847f3f7"},
842 | {file = "yarl-1.9.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:59723a029760079b7d991a401386390c4be5bfec1e7dd83e25a6a0881859e716"},
843 | {file = "yarl-1.9.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b03917871bf859a81ccb180c9a2e6c1e04d2f6a51d953e6a5cdd70c93d4e5a2a"},
844 | {file = "yarl-1.9.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c1012fa63eb6c032f3ce5d2171c267992ae0c00b9e164efe4d73db818465fac3"},
845 | {file = "yarl-1.9.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a74dcbfe780e62f4b5a062714576f16c2f3493a0394e555ab141bf0d746bb955"},
846 | {file = "yarl-1.9.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8c56986609b057b4839968ba901944af91b8e92f1725d1a2d77cbac6972b9ed1"},
847 | {file = "yarl-1.9.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:2c315df3293cd521033533d242d15eab26583360b58f7ee5d9565f15fee1bef4"},
848 | {file = "yarl-1.9.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:b7232f8dfbd225d57340e441d8caf8652a6acd06b389ea2d3222b8bc89cbfca6"},
849 | {file = "yarl-1.9.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:53338749febd28935d55b41bf0bcc79d634881195a39f6b2f767870b72514caf"},
850 | {file = "yarl-1.9.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:066c163aec9d3d073dc9ffe5dd3ad05069bcb03fcaab8d221290ba99f9f69ee3"},
851 | {file = "yarl-1.9.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:8288d7cd28f8119b07dd49b7230d6b4562f9b61ee9a4ab02221060d21136be80"},
852 | {file = "yarl-1.9.2-cp39-cp39-win32.whl", hash = "sha256:b124e2a6d223b65ba8768d5706d103280914d61f5cae3afbc50fc3dfcc016623"},
853 | {file = "yarl-1.9.2-cp39-cp39-win_amd64.whl", hash = "sha256:61016e7d582bc46a5378ffdd02cd0314fb8ba52f40f9cf4d9a5e7dbef88dee18"},
854 | {file = "yarl-1.9.2.tar.gz", hash = "sha256:04ab9d4b9f587c06d801c2abfe9317b77cdf996c65a90d5e84ecc45010823571"},
855 | ]
856 |
857 | [package.dependencies]
858 | idna = ">=2.0"
859 | multidict = ">=4.0"
860 |
861 | [metadata]
862 | lock-version = "2.0"
863 | python-versions = ">=3.9,<3.13"
864 | content-hash = "4a807a737f1265629b836bb63740a705bf13c7d888416b9928df56b21bcb9d95"
865 |
--------------------------------------------------------------------------------
/pyproject.toml:
--------------------------------------------------------------------------------
1 | [tool.poetry]
2 | name = "wayback-google-analytics"
3 | version = "0.2.3"
4 | description = "A tool for gathering current and historic google analytics ids from multiple websites"
5 | authors = ["Justin Clark "]
6 | license = "MIT"
7 | readme = "README.md"
8 |
9 | [tool.poetry.scripts]
10 | wayback-google-analytics = "wayback_google_analytics.main:main_entrypoint"
11 |
12 | [tool.poetry.dependencies]
13 | python = ">=3.9,<3.13"
14 | aiohttp = "3.8.5"
15 | aiosignal = "1.3.1"
16 | async-timeout = "4.0.3"
17 | asyncio = "3.4.3"
18 | asynctest = "0.13.0"
19 | attrs = "23.1.0"
20 | beautifulsoup4 = "4.12.2"
21 | certifi = "2023.7.22"
22 | charset-normalizer = "3.2.0"
23 | coverage = "7.3.2"
24 | et-xmlfile = "1.1.0"
25 | frozenlist = "1.4.0"
26 | idna = "3.4"
27 | multidict = "6.0.4"
28 | numpy = "1.26.0"
29 | openpyxl = "3.1.2"
30 | pandas = "2.1.1"
31 | python-dateutil = "2.8.2"
32 | python-dotenv = "1.0.0"
33 | pytz = "2023.3.post1"
34 | requests = "2.31.0"
35 | six = "1.16.0"
36 | soupsieve = "2.5"
37 | tzdata = "2023.3"
38 | urllib3 = "2.0.6"
39 | xlsxwriter = "3.1.5"
40 | yarl = "1.9.2"
41 |
42 |
43 | [build-system]
44 | requires = ["poetry-core"]
45 | build-backend = "poetry.core.masonry.api"
46 |
--------------------------------------------------------------------------------
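Note on the file above: the `[tool.poetry.scripts]` table is what turns the package into a command-line tool. On install, Poetry generates a `wayback-google-analytics` executable that imports `wayback_google_analytics.main` and calls `main_entrypoint()`. A minimal sketch of that resolution step follows; `main_entrypoint` itself is not shown in this excerpt, so calling it with no arguments is an assumption based on the usual console-script contract, not a statement about how `main.py` is written.

```python
# Sketch of how the "module:attr" spec in [tool.poetry.scripts] resolves.
# Assumption: main_entrypoint() takes no arguments and reads sys.argv itself,
# as console-script targets normally do (main.py is not shown in this excerpt).
from importlib import import_module

ENTRY = "wayback_google_analytics.main:main_entrypoint"


def load_entry_point(spec: str):
    """Split a 'module:attr' spec, import the module, return the callable."""
    module_path, attr = spec.split(":")
    return getattr(import_module(module_path), attr)


if __name__ == "__main__":
    cli = load_entry_point(ENTRY)
    cli()  # equivalent to running `wayback-google-analytics` from a shell
```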
/requirements.txt:
--------------------------------------------------------------------------------
1 | aiohttp==3.8.5
2 | aiosignal==1.3.1
3 | async-timeout==4.0.3
4 | asyncio==3.4.3
5 | asynctest==0.13.0
6 | attrs==23.1.0
7 | beautifulsoup4==4.12.2
8 | certifi==2023.7.22
9 | charset-normalizer==3.2.0
10 | coverage==7.3.2
11 | et-xmlfile==1.1.0
12 | frozenlist==1.4.0
13 | idna==3.4
14 | multidict==6.0.4
15 | numpy==1.26.0
16 | openpyxl==3.1.2
17 | pandas==2.1.1
18 | python-dateutil==2.8.2
19 | python-dotenv==1.0.0
20 | pytz==2023.3.post1
21 | requests==2.31.0
22 | six==1.16.0
23 | soupsieve==2.5
24 | tzdata==2023.3
25 | urllib3==2.0.6
26 | XlsxWriter==3.1.5
27 | yarl==1.9.2
28 |
--------------------------------------------------------------------------------
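Note on the file above: requirements.txt repeats the exact pins declared in `[tool.poetry.dependencies]`, so the two files can drift apart silently when one is updated without the other. A rough consistency check is sketched below; it assumes both files sit in the repository root and uses `tomllib`, which requires Python 3.11+ even though the project itself only requires >=3.9, so treat it as an illustrative helper rather than part of the project.

```python
# Hypothetical helper: flag any package whose pin differs between
# requirements.txt and [tool.poetry.dependencies] in pyproject.toml.
import tomllib  # stdlib in Python 3.11+ (assumption: acceptable for this one-off check)
from pathlib import Path


def requirements_pins(path: str = "requirements.txt") -> dict[str, str]:
    """Parse 'name==version' lines into a {name: version} map."""
    pins = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        name, _, version = line.partition("==")
        pins[name.lower()] = version
    return pins


def poetry_pins(path: str = "pyproject.toml") -> dict[str, str]:
    """Read the pinned versions from [tool.poetry.dependencies]."""
    deps = tomllib.loads(Path(path).read_text())["tool"]["poetry"]["dependencies"]
    return {name.lower(): spec for name, spec in deps.items() if name != "python"}


if __name__ == "__main__":
    reqs = requirements_pins()
    for name, version in poetry_pins().items():
        if reqs.get(name) != version:
            print(f"mismatch: {name} is {version} in pyproject.toml, "
                  f"{reqs.get(name, 'missing')} in requirements.txt")
```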
/tests/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/bellingcat/wayback-google-analytics/e697ce9d0e2594175867e911b03198ae986ffce6/tests/__init__.py
--------------------------------------------------------------------------------
/tests/test_async_utils.py:
--------------------------------------------------------------------------------
1 | import asynctest
2 | from asynctest.mock import patch, MagicMock
3 | import aiohttp
4 |
5 | from wayback_google_analytics.async_utils import (
6 | get_codes_from_single_timestamp,
7 | get_codes_from_snapshots,
8 | get_snapshot_timestamps,
9 | DEFAULT_HEADERS,
10 | )
11 |
12 |
13 | class AsyncUtilsTestCase(asynctest.TestCase):
14 | """Tests for async_utils.py"""
15 |
16 | @patch("aiohttp.ClientSession.get")
17 | async def test_get_snapshot_timestamps(self, mock_get):
18 | """Does get_snapshot_timestamps return correct, formatted timestamps?"""
19 |
20 | # Mock the response from the server
21 | mock_response = MagicMock()
22 |
23 | async def mock_text_method():
24 | return "20120101000000\n20130102000000\n20140103000000\n20150104000000\n20160105000000\n20170106000000\n20180107000000\n20190108000000\n20200109000000\n20210110000000"
25 |
26 | mock_response.text = mock_text_method
27 |
28 | # Mock session.get to return response mocked above
29 | mock_get.return_value.__aenter__.return_value = mock_response
30 |
31 | async with aiohttp.ClientSession() as session:
32 | result = await get_snapshot_timestamps(
33 | session=session,
34 | url="https://www.someurl.com",
35 | start_date="20120101000000",
36 | end_date="20210102000000",
37 | frequency=4,
38 | limit=10,
39 | )
40 |
41 | expected_timestamp_list = [
42 | "20120101000000",
43 | "20130102000000",
44 | "20140103000000",
45 | "20150104000000",
46 | "20160105000000",
47 | "20170106000000",
48 | "20180107000000",
49 | "20190108000000",
50 | "20200109000000",
51 | "20210110000000",
52 | ]
53 |
54 | """Does get_snapshot_timestamps return correct, formatted timestamps?"""
55 | self.assertEqual(result, expected_timestamp_list)
56 |
57 | """Does get_snapshot_timestamps call session.get with correct parameters?"""
58 | expected_CDX_url = "http://web.archive.org/cdx/search/cdx?url=https://www.someurl.com&matchType=domain&filter=statuscode:200&fl=timestamp&output=JSON&collapse=timestamp:4&limit=10&from=20120101000000&to=20210102000000"
59 | mock_get.assert_called_with(expected_CDX_url, headers=DEFAULT_HEADERS)
60 |
61 | @patch("aiohttp.ClientSession.get")
62 | @patch("wayback_google_analytics.async_utils.get_UA_code")
63 | @patch("wayback_google_analytics.async_utils.get_GA_code")
64 | @patch("wayback_google_analytics.async_utils.get_GTM_code")
65 | async def test_get_codes_from_single_timestamp(
66 | self, mock_GTM, mock_GA, mock_UA, mock_get
67 | ):
68 | """Does get_codes_from_single_timestamp return correct codes from a single archive.org snapshot?"""
69 |
70 | # Mock the response from the server
71 | mock_response = MagicMock()
72 |
73 | async def mock_text_method():
74 | return " ... fake data ... "
75 |
76 | mock_response.text = mock_text_method
77 | mock_get.return_value.__aenter__.return_value = mock_response
78 |
79 | # Mock get_code functions
80 | mock_UA.return_value = ["UA-12345678-1"]
81 | mock_GA.return_value = ["G-12345678"]
82 | mock_GTM.return_value = ["GTM-12345678"]
83 |
84 | results = {
85 | "UA_codes": {},
86 | "GA_codes": {},
87 | "GTM_codes": {},
88 | }
89 |
90 | async with aiohttp.ClientSession() as session:
91 | await get_codes_from_single_timestamp(
92 | session=session,
93 | timestamp="20120101000000",
94 | base_url="https://web.archive.org/web/{timestamp}/https://www.someurl.com",
95 | results=results,
96 | )
97 |
98 | """Does it update results accordingly?"""
99 | self.assertIn("UA-12345678-1", results["UA_codes"])
100 | self.assertIn("G-12345678", results["GA_codes"])
101 | self.assertIn("GTM-12345678", results["GTM_codes"])
102 |
103 | """Does it call get with correct parameters?"""
104 | expected_url = (
105 | "https://web.archive.org/web/20120101000000/https://www.someurl.com"
106 | )
107 | mock_get.assert_called_with(expected_url, headers=DEFAULT_HEADERS)
108 |
109 | async def test_get_codes_from_snapshots(self):
110 | """Does get_codes_from_snapshots run once for each timestamp provided?"""
111 |
112 | # Mock get_codes_from_single_timestamp
113 | mock_get_codes_from_single_timestamp = asynctest.CoroutineMock()
114 |
115 | """Does it call get_codes_from_single_timestamp for each timestamp?"""
116 | with asynctest.mock.patch(
117 | "wayback_google_analytics.async_utils.get_codes_from_single_timestamp",
118 | mock_get_codes_from_single_timestamp,
119 | ):
120 | session = asynctest.Mock()
121 | url = "https://www.someurl.com"
122 | timestamps = [
123 | "20120101000000",
124 | "20130102000000",
125 | "20140103000000",
126 | "20150104000000",
127 | "20160105000000",
128 | "20170106000000",
129 | "20180107000000",
130 | "20190108000000",
131 | "20200109000000",
132 | "20210110000000",
133 | ]
134 | result = await get_codes_from_snapshots(session, url, timestamps)
135 |
136 | self.assertEqual(
137 | mock_get_codes_from_single_timestamp.call_count, len(timestamps)
138 | )
139 |
140 | # Resets call_count NOTE: There may be a better way to do this.
141 | mock_get_codes_from_single_timestamp = asynctest.CoroutineMock()
142 |
143 | with asynctest.mock.patch(
144 | "wayback_google_analytics.async_utils.get_codes_from_single_timestamp",
145 | mock_get_codes_from_single_timestamp,
146 | ):
147 | session = asynctest.Mock()
148 | url = "https://www.someurl.com"
149 | timestamps = ["20120101000000", "20130102000000", "20140103000000"]
150 | result = await get_codes_from_snapshots(session, url, timestamps)
151 |
152 | self.assertEqual(
153 | mock_get_codes_from_single_timestamp.call_count, len(timestamps)
154 | )
155 |
156 | # Resets call_count
157 | mock_get_codes_from_single_timestamp = asynctest.CoroutineMock()
158 |
159 | with asynctest.mock.patch(
160 | "wayback_google_analytics.async_utils.get_codes_from_single_timestamp",
161 | mock_get_codes_from_single_timestamp,
162 | ):
163 | session = asynctest.Mock()
164 | url = "https://www.someurl.com"
165 | timestamps = []
166 | result = await get_codes_from_snapshots(session, url, timestamps)
167 |
168 | self.assertEqual(
169 | mock_get_codes_from_single_timestamp.call_count, len(timestamps)
170 | )
171 |
--------------------------------------------------------------------------------
/tests/test_codes.py:
--------------------------------------------------------------------------------
1 | from unittest import TestCase
2 |
3 | from wayback_google_analytics.codes import get_UA_code, get_GA_code, get_GTM_code
4 |
5 |
6 | class CodesTestCase(TestCase):
7 | """Tests for codes.py"""
8 |
9 | def setUp(self):
10 | """Create test html data"""
11 |
12 | self.test_html_1 = """
13 |             <html>
14 |                 <head>
15 |                     <title>Test html 1</title>
16 |                     <script>
17 |                         window.dataLayer = window.dataLayer || [];
18 |                         function gtag(){dataLayer.push(arguments);}
19 |                         gtag('js', new Date());
20 |                         gtag('config', 'UA-12345678-1');
21 |                         gtag('config', 'G-12345678-1');
22 |                         gtag('config', 'GTM-23451');
23 |                     </script>
24 |                 </head>
25 |                 <body>
26 |                     "UA-12345678-2"
27 |                     "G-12345678-2"
28 |                     "GTM-2333234"
29 |                 </body>
30 |             </html>
31 | """
32 |
33 | self.test_html_2 = """
34 |             <html>
35 |                 <head>
36 |                     <title>Test html 2</title>
37 |                     <script>
38 |                         window.dataLayer = window.dataLayer || [];
39 |                         function gtag(){dataLayer.push(arguments);}
40 |                         gtag('js', new Date());
41 |                         gtag('config', 'UA-12345678-1');
42 |                         gtag('config', 'UA-12345678-2');
43 |                         gtag('config', 'UA-12345678');
44 |                         gtag('config', 'G-12345678-1');
45 |                         gtag('config', 'G-12345678-2');
46 |                         gtag('config', 'G-12345678');
47 |                         gtag('config', 'GTM-23451');
48 |                         gtag('config', 'GTM-2333234');
49 |                         gtag('config', 'GTM-2124');
50 |                     </script>
51 |                 </head>
52 |                 <body>
53 |                     "UA-12345678-2"
54 |                     "G-12345678-2"
55 |                     "GTM-2333234"
56 |                 </body>
57 |             </html>
58 | """
59 |
60 | self.test_html_no_UA_code = """
61 |             <html>
62 |                 <head>
63 |                     <title>Test html without analytics codes</title>
64 |                     <script>
65 |                         window.dataLayer = window.dataLayer || [];
66 |                         function gtag(){dataLayer.push(arguments);}
67 |                         gtag('js', new Date());
68 |                         console.log("no analytics codes in this script");
69 |                     </script>
70 |                 </head>
71 |                 <body>
72 |                     "UA-12345678-2"
73 |                     "G-12345678-2"
74 |                     "GTM-2333234"
75 |                 </body>
76 |             </html>
77 | """
78 |
79 | self.test_errorful_html = ""
80 |
81 | def test_get_single_UA_code(self):
82 | """Test get_UA_code w/ single code"""
83 |
84 | """Does it return correct UA code?"""
85 | self.assertEqual(get_UA_code(self.test_html_1)[0], "UA-12345678-1")
86 |
87 | def test_get_multiple_UA_codes(self):
88 | """Test get_UA_code w/ multiple UA codes"""
89 |
90 | """Does it handle multiple UA codes in a single input?"""
91 | UA_codes = get_UA_code(self.test_html_2)
92 | self.assertEqual(len(UA_codes), 3)
93 | self.assertIn("UA-12345678-1", UA_codes)
94 | self.assertIn("UA-12345678-2", UA_codes)
95 | self.assertIn("UA-12345678", UA_codes)
96 |
97 | def test_get_UA_codes_invalid(self):
98 | """Test get_UA_code w/ invalid UA code"""
99 |
100 | """Does it return empty list if no UA code is found?"""
101 | self.assertIsInstance(get_UA_code(self.test_html_no_UA_code), list)
102 | self.assertEqual(len(get_UA_code(self.test_html_no_UA_code)), 0)
103 |
104 | def test_get_single_GA_code(self):
105 | """Test get_GA_code w/ single GA code"""
106 |
107 | """Does it return correct GA code?"""
108 | self.assertEqual(get_GA_code(self.test_html_1)[0], "G-12345678-1")
109 |
110 | def test_get_multiple_GA_codes(self):
111 | """Test get_GA_code w/ multiple GA codes"""
112 |
113 | """Does it handle multiple GA codes in a single input?"""
114 | GA_codes = get_GA_code(self.test_html_2)
115 | self.assertEqual(len(GA_codes), 3)
116 | self.assertIn("G-12345678-1", GA_codes)
117 | self.assertIn("G-12345678-2", GA_codes)
118 | self.assertIn("G-12345678", GA_codes)
119 |
120 | def test_get_GA_codes_invalid(self):
121 | """Test get_GA_code w/ invalid GA code"""
122 |
123 | """Does it return empty list if no GA code is found?"""
124 | self.assertIsInstance(get_GA_code(self.test_html_no_UA_code), list)
125 | self.assertEqual(len(get_GA_code(self.test_html_no_UA_code)), 0)
126 |
127 | def test_get_single_GTM_code(self):
128 | """Test get_GTM_code w/ single GTM code"""
129 |
130 | """Does it return correct GTM code?"""
131 | self.assertEqual(get_GTM_code(self.test_html_1)[0], "GTM-23451")
132 |
133 | def test_get_multiple_GTM_codes(self):
134 | """Test get_GTM_code w/ multiple GTM codes"""
135 |
136 | """Does it handle multiple GTM codes in a single input?"""
137 | GTM_codes = get_GTM_code(self.test_html_2)
138 | self.assertEqual(len(GTM_codes), 3)
139 | self.assertIn("GTM-23451", GTM_codes)
140 | self.assertIn("GTM-2333234", GTM_codes)
141 | self.assertIn("GTM-2124", GTM_codes)
142 |
143 | def test_get_GTM_codes_invalid(self):
144 | """Test get_GTM_code w/ invalid GTM code"""
145 |
146 | """Does it return empty list if no GTM code is found?"""
147 | self.assertIsInstance(get_GTM_code(self.test_html_no_UA_code), list)
148 | self.assertEqual(len(get_GTM_code(self.test_html_no_UA_code)), 0)
149 |
150 |
151 |
--------------------------------------------------------------------------------
/tests/test_main.py:
--------------------------------------------------------------------------------
1 | from wayback_google_analytics.main import main, setup_args
2 | import unittest
3 | import sys
4 | from io import StringIO
5 |
6 |
7 | class TestMain(unittest.TestCase):
8 | """Tests for main.py"""
9 |
10 | def setUp(self):
11 | # Capture any errors to stderr
12 | self.held_stderr = StringIO()
13 | self.held_stdout = StringIO()
14 | sys.stdout = self.held_stdout
15 | sys.stderr = self.held_stderr
16 |
17 | def tearDown(self):
18 | # Reset sys.stderr
19 | self.held_stdout.close()
20 | self.held_stderr.close()
21 | sys.stdout = sys.__stdout__
22 | sys.stderr = sys.__stderr__
23 |
24 | def test_setup_args_help_message(self):
25 | """Does main.py -h print usage message?"""
26 |
27 | sys.argv = ["main.py", "-h"]
28 | with self.assertRaises(SystemExit):
29 | setup_args()
30 |
31 | """Should print help message to terminal"""
32 | output = self.held_stdout.getvalue().strip()
33 | self.assertIn("usage: main.py [-h]", output)
34 |
35 | def test_setup_args_no_arguments(self):
36 | """Does setup_args return error message if no arguments provided?"""
37 | sys.argv = ["main.py"]
38 | with self.assertRaises(SystemExit):
39 | setup_args()
40 |
41 | """Should print help and error message to terminal"""
42 | output = self.held_stderr.getvalue().strip()
43 | self.assertIn("usage: main.py [-h]", output)
44 | self.assertIn("main.py: error:", output)
45 |
46 | def test_setup_args_invalid_multiple_inputs(self):
47 | """Does setup_args return error message if multiple inputs provided?"""
48 |
49 | sys.argv = [
50 | "main.py",
51 | "-i",
52 | "tests/test_urls.txt",
53 | "-u",
54 | "https://www.google.com",
55 | ]
56 | with self.assertRaises(SystemExit):
57 | setup_args()
58 |
59 | """Should print help and error message to terminal"""
60 | output = self.held_stderr.getvalue().strip()
61 | self.assertIn("usage: main.py [-h]", output)
62 | self.assertIn("main.py: error:", output)
63 |
64 | def test_setup_args_invalid_output(self):
65 | """Does setup_args return error message if invalid output provided?"""
66 |
67 | sys.argv = ["main.py", "-i", "tests/test_urls.txt", "-o", "invalid_output"]
68 | with self.assertRaises(SystemExit):
69 | setup_args()
70 |
71 | """Should print error message to terminal"""
72 | output = self.held_stderr.getvalue().strip()
73 | self.assertIn("main.py: error:", output)
74 |
75 | def test_setup_args_valid_args(self):
76 | """Does setup_args return args if valid args provided?"""
77 |
78 | sys.argv = [
79 | "main.py",
80 | "--input_file",
81 | "tests/test_urls.txt",
82 | "--output",
83 | "json",
84 | "--start_date",
85 | "01/01/2012:12:00",
86 | "--end_date",
87 | "01/01/2013:12:00",
88 | "--frequency",
89 | "daily",
90 | "--limit",
91 | "10",
92 | "--skip_current",
93 | ]
94 | args = setup_args()
95 |
96 | """Should return args"""
97 | self.assertIsNotNone(args)
98 |
99 | """Args should contain proper values"""
100 | self.assertEqual(args.input_file, "tests/test_urls.txt")
101 | self.assertEqual(args.output, "json")
102 | self.assertEqual(args.start_date, "01/01/2012:12:00")
103 | self.assertEqual(args.end_date, "01/01/2013:12:00")
104 | self.assertEqual(args.frequency, "daily")
105 | self.assertEqual(args.limit, "10")
106 | self.assertEqual(args.skip_current, True)
107 |
108 | def test_setup_args_valid_args_shorthand(self):
109 | """Does setup_args return args if valid args provided using shorthand commands?"""
110 |
111 | sys.argv = [
112 | "main.py",
113 | "-i",
114 | "tests/test_urls.txt",
115 | "-o",
116 | "json",
117 | "-s",
118 | "01/01/2012:12:00",
119 | "-e",
120 | "01/01/2013:12:00",
121 | "-f",
122 | "daily",
123 | "-l",
124 | "10",
125 | "-sc",
126 | ]
127 | args = setup_args()
128 |
129 | """Should return args"""
130 | self.assertIsNotNone(args)
131 |
132 | """Args should contain proper values"""
133 | self.assertEqual(args.input_file, "tests/test_urls.txt")
134 | self.assertEqual(args.output, "json")
135 | self.assertEqual(args.start_date, "01/01/2012:12:00")
136 | self.assertEqual(args.end_date, "01/01/2013:12:00")
137 | self.assertEqual(args.frequency, "daily")
138 | self.assertEqual(args.limit, "10")
139 | self.assertEqual(args.skip_current, True)
140 |
--------------------------------------------------------------------------------
/tests/test_output.py:
--------------------------------------------------------------------------------
1 | from datetime import datetime
2 | import json
3 | import os
4 | import pandas as pd
5 | from unittest import TestCase
6 | from unittest.mock import patch, Mock
7 | from shutil import rmtree
8 |
9 | from wayback_google_analytics.output import (
10 | init_output,
11 | write_output,
12 | get_codes_df,
13 | get_urls_df,
14 | format_archived_codes,
15 | format_active,
16 | )
17 |
18 |
19 | class OutputTestCase(TestCase):
20 | """Tests for output.py"""
21 |
22 | def setUp(self):
23 | """Create test data"""
24 | self.test_timestamp = "01-01-2023(12-00-00)"
25 | self.test_path = "./test_output"
26 | self.valid_types = ["csv", "txt", "json", "xlsx"]
27 | if not os.path.exists(self.test_path):
28 | os.makedirs(self.test_path)
29 |
30 | def tearDown(self):
31 | """Removes any created directories after each test"""
32 | if os.path.exists(self.test_path):
33 | rmtree(self.test_path)
34 |
35 | def test_format_active(self):
36 | """Does format_active convert list of "active" values from df into formatted string?"""
37 |
38 | test_active_list = [
39 | "Current (at https://www.example.com)",
40 | "2019-01-01 - 2020-01-01 (at https://www.example.com)",
41 | "2018-01-01 - 2019-01-01 (at https://www.example.com)",
42 | ]
43 |
44 | expected = (
45 | "1. Current (at https://www.example.com)\n\n"
46 | "2. 2019-01-01 - 2020-01-01 (at https://www.example.com)\n\n"
47 | "3. 2018-01-01 - 2019-01-01 (at https://www.example.com)"
48 | )
49 |
50 | self.assertEqual(format_active(test_active_list), expected)
51 | self.assertTrue(type(format_active(test_active_list)) is str)
52 |
53 | def test_format_archived_codes(self):
54 | """Does format_archived_codes convert dict of archived codes into formatted string?"""
55 |
56 | archived_codes = {
57 | "UA-21085468-1": {
58 | "first_seen": "04/01/2014:04:02",
59 | "last_seen": "03/01/2012:06:00",
60 | },
61 | "UA-113949143-1": {
62 | "first_seen": "01/01/2019:19:39",
63 | "last_seen": "01/01/2023:00:16",
64 | },
65 | }
66 |
67 | expected = (
68 | "1. UA-21085468-1 (04/01/2014:04:02 - 03/01/2012:06:00)\n\n"
69 | "2. UA-113949143-1 (01/01/2019:19:39 - 01/01/2023:00:16)"
70 | )
71 |
72 | self.assertEqual(format_archived_codes(archived_codes), expected)
73 | self.assertTrue(type(format_archived_codes(archived_codes)) is str)
74 |
75 | @patch("wayback_google_analytics.output.datetime", autospec=True)
76 | def test_init_output_valid_types(self, mock_datetime):
77 | """Does init_output create a dict with correct keys?"""
78 | mock_now = Mock(
79 | return_value=datetime.strptime(self.test_timestamp, "%d-%m-%Y(%H-%M-%S)")
80 | )
81 | mock_datetime.now = mock_now
82 |
83 | for type in self.valid_types:
84 | with self.subTest(type=type):
85 | expected_file_path = os.path.join(
86 | self.test_path, f"{self.test_timestamp}.{type}"
87 | )
88 |
89 | returned_file_path = init_output(type=type, output_dir=self.test_path)
90 |
91 | """Does it return correct file path for each type?"""
92 | self.assertEqual(returned_file_path, expected_file_path)
93 |
94 | """Does it create correct file for each type?"""
95 | if type == "csv":
96 | self.assertTrue(
97 | os.path.exists(
98 | os.path.join(
99 | self.test_path, f"{self.test_timestamp}_codes.csv"
100 | )
101 | )
102 | )
103 | self.assertTrue(
104 | os.path.exists(
105 | os.path.join(
106 | self.test_path, f"{self.test_timestamp}_urls.csv"
107 | )
108 | )
109 | )
110 | else:
111 | self.assertTrue(os.path.exists(returned_file_path))
112 |
113 | def test_init_output_invalid_type(self):
114 | """Does init_output raise error with incorrect type?"""
115 |
116 | """Should raise error with 'docx'"""
117 | with self.assertRaises(ValueError):
118 | init_output("docx")
119 |
120 | """Should raise error with 'md'"""
121 | with self.assertRaises(ValueError):
122 | init_output("md")
123 |
124 | def test_write_output_txt(self):
125 | """Does write_output write results to correct text file?"""
126 |
127 | test_file = "./test_output/test_file.txt"
128 | test_results = {"test": "test"}
129 | with open(test_file, "w") as f:
130 | pass
131 |
132 | write_output(test_file, "txt", test_results)
133 |
134 | with open(test_file, "r") as f:
135 | test_data = json.load(f)
136 |
137 | os.remove(test_file)
138 | self.assertEqual(test_data, test_results)
139 |
140 | def test_write_output_json(self):
141 | """Does write_output write results to correct json file?"""
142 |
143 | test_file = "./test_output/test_file.json"
144 | test_results = {"test": "test"}
145 | with open(test_file, "w") as f:
146 | pass
147 |
148 | write_output(test_file, "json", test_results)
149 |
150 | with open(test_file, "r") as f:
151 | test_data = json.load(f)
152 |
153 | os.remove(test_file)
154 | self.assertEqual(test_data, test_results)
155 |
156 | @patch("wayback_google_analytics.output.get_urls_df", autospec=True)
157 | @patch("wayback_google_analytics.output.get_codes_df", autospec=True)
158 | def test_write_output_csv(self, mock_urls, mock_codes):
159 | """Does write_output write results to correct csv files?"""
160 |
161 | test_file = "./test_output/test_file.csv"
162 | test_file_urls = "./test_output/test_file_urls.csv"
163 | test_file_codes = "./test_output/test_file_codes.csv"
164 | test_results = {"test": "test"}
165 | mock_urls.return_value = pd.DataFrame([test_results])
166 | mock_codes.return_value = pd.DataFrame([test_results])
167 |
168 | with open(test_file_urls, "w") as f:
169 | pass
170 |
171 | with open(test_file_codes, "w") as f:
172 | pass
173 |
174 | write_output(test_file, "csv", test_results)
175 |
176 | test_data_urls = pd.read_csv(test_file_urls).to_dict(orient="records")[0]
177 | test_data_codes = pd.read_csv(test_file_codes).to_dict(orient="records")[0]
178 |
179 | os.remove(test_file_urls)
180 | os.remove(test_file_codes)
181 |
182 | self.assertEqual(test_data_urls, test_results)
183 | self.assertEqual(test_data_codes, test_results)
184 |
185 | @patch("wayback_google_analytics.output.get_urls_df", autospec=True)
186 | @patch("wayback_google_analytics.output.get_codes_df", autospec=True)
187 | def test_write_output_xlsx(self, mock_urls, mock_codes):
188 | """Does write_output write results to correct xlsx file?"""
189 |
190 | test_file = "./test_output/test_file.xlsx"
191 | test_results = {"test": "test"}
192 | mock_urls.return_value = pd.DataFrame([test_results])
193 | mock_codes.return_value = pd.DataFrame([test_results])
194 |
195 | with open(test_file, "w") as f:
196 | pass
197 |
198 | write_output(test_file, "xlsx", test_results)
199 |
200 | with pd.ExcelFile(test_file, engine="openpyxl") as xls:
201 | sheet_names = xls.sheet_names
202 |
203 | df_urls = xls.parse("URLs")
204 | df_codes = xls.parse("Codes")
205 |
206 | os.remove(test_file)
207 |
208 | self.assertEqual(sheet_names, ["URLs", "Codes"])
209 | self.assertEqual(df_urls.to_dict(orient="records")[0], test_results)
210 | self.assertEqual(df_codes.to_dict(orient="records")[0], test_results)
211 |
212 | def test_get_urls_df(self):
213 | """Does get_urls_df create appropriate df from dict?"""
214 |
215 | test_results = {
216 | "someurl.com": {
217 | "current_UA_code": "UA-12345678-1",
218 | "current_GA_code": "G-1234567890",
219 | "current_GTM_code": "GTM-12345678",
220 | "archived_UA_codes": {
221 | "UA-12345678-1": {
222 | "first_seen": "01/01/2019",
223 | "last_seen": "01/01/2019",
224 | },
225 | },
226 | "archived_GA_codes": {
227 | "G-1234567890": {
228 | "first_seen": "01/01/2019",
229 | "last_seen": "01/01/2019",
230 | }
231 | },
232 | "archived_GTM_codes": {
233 | "GTM-12345678": {
234 | "first_seen": "01/01/2019",
235 | "last_seen": "01/01/2019",
236 | },
237 | },
238 | }
239 | }
240 |
241 | expected_results = {
242 | "url": "someurl.com",
243 | "UA_Code": "UA-12345678-1",
244 | "GA_Code": "G-1234567890",
245 | "GTM_Code": "GTM-12345678",
246 | "Archived_UA_Codes": "1. UA-12345678-1 (01/01/2019 - 01/01/2019)",
247 | "Archived_GA_Codes": "1. G-1234567890 (01/01/2019 - 01/01/2019)",
248 | "Archived_GTM_Codes": "1. GTM-12345678 (01/01/2019 - 01/01/2019)",
249 | }
250 |
251 | actual_results = get_urls_df([test_results])
252 |
253 | self.assertEqual(actual_results.to_dict(orient="records")[0], expected_results)
254 | self.assertTrue(type(actual_results) is pd.DataFrame)
255 |
256 | def test_get_codes_df(self):
257 | """Does get_codes_df create appropriate df from dict?"""
258 |
259 | test_results = {
260 | "someurl.com": {
261 | "current_UA_code": "UA-12345678-1",
262 | "current_GA_code": "G-1234567890",
263 | "current_GTM_code": "GTM-12345678",
264 | "archived_UA_codes": {
265 | "UA-12345678-1": {
266 | "first_seen": "01/01/2019",
267 | "last_seen": "01/01/2019",
268 | },
269 | },
270 | "archived_GA_codes": {
271 | "G-1234567890": {
272 | "first_seen": "01/01/2019",
273 | "last_seen": "01/01/2019",
274 | }
275 | },
276 | "archived_GTM_codes": {
277 | "GTM-12345678": {
278 | "first_seen": "01/01/2019",
279 | "last_seen": "01/01/2019",
280 | },
281 | },
282 | }
283 | }
284 |
285 | expected_results = [
286 | {
287 | "code": "G-1234567890",
288 | "websites": "someurl.com",
289 | "active": "1. 01/01/2019 - 01/01/2019(at someurl.com)",
290 | },
291 | {
292 | "code": "GTM-12345678",
293 | "websites": "someurl.com",
294 | "active": "1. 01/01/2019 - 01/01/2019(at someurl.com)",
295 | },
296 | {
297 | "code": "UA-12345678-1",
298 | "websites": "someurl.com",
299 | "active": "1. 01/01/2019 - 01/01/2019(at someurl.com)",
300 | },
301 | ]
302 |
303 | actual_results = get_codes_df([test_results])
304 |
305 | self.assertEqual(actual_results.to_dict(orient="records"), expected_results)
306 | self.assertTrue(type(actual_results) is pd.DataFrame)
307 |
308 | def test_get_codes_df_empty_list(self):
309 | """Does get_codes_df return a df with a message if codes_list is empty?"""
310 |
311 | test_results = []
312 | expected_results = pd.DataFrame({"Message": ["No codes found."]})
313 |
314 | actual_results = get_codes_df(test_results)
315 |
316 | self.assertEqual(actual_results.to_dict(orient="records"), expected_results.to_dict(orient="records"))
317 | self.assertTrue(type(actual_results) is pd.DataFrame)
318 |
319 |
--------------------------------------------------------------------------------
/tests/test_utils.py:
--------------------------------------------------------------------------------
1 | from unittest import TestCase
2 |
3 | from wayback_google_analytics.utils import get_limit_from_frequency, validate_dates, get_14_digit_timestamp, get_date_from_timestamp, COLLAPSE_OPTIONS
4 |
5 | class UtilsTestCase(TestCase):
6 | """Tests for utils.py"""
7 |
8 | def test_get_limit_from_frequency(self):
9 | """Tests that get_limit_from_frequency returns correct limit."""
10 |
11 | """Returns a correct limit for 'yearly' frequency."""
12 | self.assertEqual(get_limit_from_frequency(frequency="yearly", start_date="20120101000000", end_date="20130101000000"), 2)
13 | self.assertEqual(get_limit_from_frequency(frequency="yearly", start_date="20120101000000", end_date="20140101000000"), 3)
14 | self.assertEqual(get_limit_from_frequency(frequency="yearly", start_date="19990101000000", end_date="20150101000000"), 17)
15 |
16 | """Returns a correct limit for 'monthly' frequency."""
17 | self.assertEqual(get_limit_from_frequency(frequency="monthly", start_date="20120101000000", end_date="20120201000000"), 2)
18 | self.assertEqual(get_limit_from_frequency(frequency="monthly", start_date="20120101000000", end_date="20121205000000"), 12)
19 | self.assertEqual(get_limit_from_frequency(frequency="monthly", start_date="20120101000000", end_date="20130101000000"), 13)
20 | self.assertEqual(get_limit_from_frequency(frequency="monthly", start_date="20120101000000", end_date="20140101000000"), 25)
21 | self.assertEqual(get_limit_from_frequency(frequency="monthly", start_date="19990801000000", end_date="20150617000000"), 191)
22 |
23 | """Returns a correct limit for 'daily' frequency."""
24 | self.assertEqual(get_limit_from_frequency(frequency="daily", start_date="20120101000000", end_date="20120102000000"), 2)
25 | self.assertEqual(get_limit_from_frequency(frequency="daily", start_date="20120101000000", end_date="20120131000000"), 31)
26 | self.assertEqual(get_limit_from_frequency(frequency="daily", start_date="19990213000000", end_date="20150617000000"), 5969)
27 |
28 | """Returns a correct limit for 'hourly' frequency."""
29 | self.assertEqual(get_limit_from_frequency(frequency="hourly", start_date="20120101000000", end_date="20120101010000"), 2)
30 | self.assertEqual(get_limit_from_frequency(frequency="hourly", start_date="20120101000000", end_date="20120101020000"), 3)
31 | self.assertEqual(get_limit_from_frequency(frequency="hourly", start_date="20120101000000", end_date="20120101230000"), 24)
32 | self.assertEqual(get_limit_from_frequency(frequency="hourly", start_date="19990213000000", end_date="20150617000000"), 143233)
33 |
34 | def test_get_limit_from_frequency_invalid(self):
35 | """Tests that get_limit_from_frequency raises ValueError if parameters incorrect."""
36 |
37 | """Raises ValueError without valid start date"""
38 | with self.assertRaises(ValueError):
39 | get_limit_from_frequency(frequency="yearly", start_date=None, end_date="20130101000000")
40 |
41 | """Raises ValueError without valid frequency"""
42 | with self.assertRaises(ValueError):
43 | get_limit_from_frequency(frequency=None, start_date="20120101000000", end_date="20130101000000")
44 | with self.assertRaises(ValueError):
45 | get_limit_from_frequency(frequency="weekly", start_date="20120101000000", end_date="20130101000000")
46 |
47 | def test_validate_dates(self):
48 | """Does validate_dates return True for valid dates?"""
49 |
50 | """Returns True for valid dates"""
51 | self.assertTrue(validate_dates(start_date="01/01/2012:12:00", end_date="02/01/2012:12:00"))
52 | self.assertTrue(validate_dates(start_date="01/10/2023:12:00", end_date="03/11/2023:12:00"))
53 |
54 | """Handles dates without 24hr time"""
55 | self.assertTrue(validate_dates(start_date="01/01/2012", end_date="02/01/2012"))
56 | self.assertTrue(validate_dates(start_date="01/10/2023", end_date="03/11/2023"))
57 | self.assertTrue(validate_dates(start_date="01/01/2012:12:30", end_date="02/01/2012"))
58 | self.assertTrue(validate_dates(start_date="01/01/2012", end_date="02/01/2012:01:00"))
59 |
60 | def test_validate_dates_invalid(self):
61 | """Does validate dates return False for invalid dates?"""
62 |
63 | """Returns False for invalid dates"""
64 | self.assertFalse(validate_dates(start_date="01/01/2012:12:00", end_date="01/01/2012:12:00"))
65 | self.assertFalse(validate_dates(start_date="01/01/2012:12:00", end_date="01/01/2010:11:00"))
66 | self.assertFalse(validate_dates(start_date="01/02/2012", end_date="01/01/2012:11:00"))
67 | self.assertFalse(validate_dates(start_date="01/01/2012:12:30", end_date="02/01/2010"))
68 |
69 | """Raises TypeError without valid start date"""
70 | with self.assertRaises(TypeError):
71 | validate_dates(start_date=None, end_date="01/01/2012:12:00")
72 | with self.assertRaises(TypeError):
73 | validate_dates(start_date="01/01/2012:12:00", end_date=None)
74 | with self.assertRaises(TypeError):
75 | validate_dates(start_date=None, end_date=None)
76 |
77 |
78 | def test_get_14_digit_timestamp(self):
79 | """Does get_14_digit_timestamp return correct timestamp?"""
80 |
81 | """Returns correct timestamp"""
82 | self.assertEqual(get_14_digit_timestamp("01/01/2012:12:00"), "20120101120000")
83 | self.assertEqual(get_14_digit_timestamp("01/01/2012:12:00"), "20120101120000")
84 | self.assertEqual(get_14_digit_timestamp("01/01/2012:23:01"), "20120101230100")
85 |
86 |
87 | def test_get_date_from_timestamp(self):
88 | """Does get_date_from_timestamp return correct date?"""
89 |
90 | """Returns correct date"""
91 | self.assertEqual(get_date_from_timestamp("20120101120000"), "01/01/2012:12:00")
92 | self.assertEqual(get_date_from_timestamp("20140101231200"), "01/01/2014:23:12")
93 | self.assertEqual(get_date_from_timestamp("20230112010200"), "12/01/2023:01:02")
94 |
95 | def test_COLLAPSE_OPTIONS(self):
96 | """Does COLLAPSE_OPTIONS return correct frequency?"""
97 |
98 | """Returns correct frequency"""
99 | self.assertEqual(COLLAPSE_OPTIONS["yearly"], "4")
100 | self.assertEqual(COLLAPSE_OPTIONS["monthly"], "6")
101 | self.assertEqual(COLLAPSE_OPTIONS["daily"], "8")
102 | self.assertEqual(COLLAPSE_OPTIONS["hourly"], "10")
103 |
104 |
--------------------------------------------------------------------------------
/wayback_google_analytics/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/bellingcat/wayback-google-analytics/e697ce9d0e2594175867e911b03198ae986ffce6/wayback_google_analytics/__init__.py
--------------------------------------------------------------------------------
/wayback_google_analytics/async_utils.py:
--------------------------------------------------------------------------------
1 | import asyncio
2 | import re
3 | from wayback_google_analytics.codes import get_UA_code, get_GA_code, get_GTM_code
4 | from wayback_google_analytics.utils import get_date_from_timestamp, DEFAULT_HEADERS
5 |
6 |
7 | async def get_snapshot_timestamps(
8 | session,
9 | url,
10 | start_date,
11 | end_date,
12 | frequency,
13 | limit,
14 | semaphore=asyncio.Semaphore(10),
15 | ):
16 | """Takes a url and returns an array of snapshot timestamps for a given time range.
17 |
18 | Args:
19 | session (aiohttp.ClientSession)
20 | url (str)
21 | start_date (str, optional): Start date for time range. Defaults to Oct 1, 2012, when UA codes were adopted.
22 | end_date (str, optional): End date for time range.
23 |         frequency (str, optional): Can limit snapshots to remove duplicates (1 per hour, day, month, etc).
24 | limit (int, optional): Limit number of snapshots returned.
25 | semaphore: asyncio.Semaphore()
26 |
27 | Returns:
28 | Array of timestamps:
29 | ["20190101000000", "20190102000000", ...]
30 | """
31 |
32 | # Default params get snapshots from url domain w/ 200 status codes only.
33 | cdx_url = f"http://web.archive.org/cdx/search/cdx?url={url}&matchType=domain&filter=statuscode:200&fl=timestamp&output=JSON"
34 |
35 | # Add correct params to cdx_url
36 | if frequency:
37 | cdx_url += f"&collapse=timestamp:{frequency}"
38 |
39 | if limit:
40 | cdx_url += f"&limit={limit}"
41 |
42 | if start_date:
43 | cdx_url += f"&from={start_date}"
44 |
45 | if end_date:
46 | cdx_url += f"&to={end_date}"
47 |
48 | print("CDX url: ", cdx_url)
49 |
50 | # Regex pattern to find 14-digit timestamps
51 | pattern = re.compile(r"\d{14}")
52 |
53 | # Use session to get timestamps
54 | async with semaphore:
55 | async with session.get(cdx_url, headers=DEFAULT_HEADERS) as response:
56 | timestamps = pattern.findall(await response.text())
57 |
58 | print("Timestamps from CDX api: ", timestamps)
59 |
60 | # Return sorted timestamps
61 | return sorted(timestamps)
62 |
63 |
64 | async def get_codes_from_snapshots(session, url, timestamps, semaphore=asyncio.Semaphore(10)):
65 |     """Returns a dictionary of UA/GA/GTM codes (with first/last seen dates) for a given url using the Archive.org Wayback Machine.
66 |
67 | Args:
68 | session (aiohttp.ClientSession)
69 | url (str)
70 | timestamps (list): List of timestamps to get codes from.
71 | semaphore: asyncio.Semaphore()
72 |
73 | Returns:
74 | {
75 | "UA_codes": {
76 | "UA-12345678-1": {
77 | "first_seen": "20190101000000",
78 | "last_seen": "20190101000000"
79 |                 },
80 |             },
81 |             "GA_codes": {
82 |                 "G-1234567890": {
83 |                     "first_seen": "20190101000000", "last_seen": "20190101000000"
84 |                 },
85 |             },
86 | "GTM_codes": {
87 | "GTM-1234567890": {
88 | "first_seen": "20190101000000",
89 | "last_seen": "20190101000000"
90 | },
91 | },
92 | }
93 | """
94 |
95 | # Build base url template for wayback machine
96 | base_url = "https://web.archive.org/web/{timestamp}/" + url
97 |
98 | # Initialize results
99 | results = {
100 | "UA_codes": {},
101 | "GA_codes": {},
102 | "GTM_codes": {},
103 | }
104 |
105 | # Get codes from each timestamp with asyncio.gather().
106 | tasks = [
107 | get_codes_from_single_timestamp(session, base_url, timestamp, results, semaphore)
108 | for timestamp in timestamps
109 | ]
110 | await asyncio.gather(*tasks)
111 |
112 | for code_type in results:
113 | for code in results[code_type]:
114 | results[code_type][code]["first_seen"] = get_date_from_timestamp(
115 | results[code_type][code]["first_seen"]
116 | )
117 | results[code_type][code]["last_seen"] = get_date_from_timestamp(
118 | results[code_type][code]["last_seen"]
119 | )
120 |
121 | return results
122 |
123 |
124 | async def get_codes_from_single_timestamp(session, base_url, timestamp, results, semaphore=asyncio.Semaphore(10)):
125 | """Returns UA/GA codes from a single archive.org snapshot and adds it to the results dictionary.
126 |
127 | Args:
128 | session (aiohttp.ClientSession)
129 | base_url (str): Base url for archive.org snapshot.
130 | timestamp (str): 14-digit timestamp.
131 | results (dict): Dictionary to add codes to (inherited from get_codes_from_snapshots()).
132 | semaphore: asyncio.Semaphore()
133 |
134 | Returns:
135 | None
136 | """
137 |
138 | # Use semaphore to limit number of concurrent requests
139 | async with semaphore:
140 | async with session.get(
141 | base_url.format(timestamp=timestamp), headers=DEFAULT_HEADERS
142 | ) as response:
143 | try:
144 | html = await response.text()
145 |
146 | print(
147 | "Retrieving codes from url: ", base_url.format(timestamp=timestamp)
148 | )
149 |
150 | if html:
151 | # Get UA/GA codes from html
152 | UA_codes = get_UA_code(html)
153 | GA_codes = get_GA_code(html)
154 | GTM_codes = get_GTM_code(html)
155 |
156 | # above functions return lists, so iterate thru codes and update
157 | # results dict
158 | for code in UA_codes:
159 | if code not in results["UA_codes"]:
160 | results["UA_codes"][code] = {}
161 | results["UA_codes"][code]["first_seen"] = timestamp
162 | results["UA_codes"][code]["last_seen"] = timestamp
163 |
164 | if code in results["UA_codes"]:
165 | if timestamp < results["UA_codes"][code]["first_seen"]:
166 | results["UA_codes"][code]["first_seen"] = timestamp
167 | if timestamp > results["UA_codes"][code]["last_seen"]:
168 | results["UA_codes"][code]["last_seen"] = timestamp
169 |
170 | for code in GA_codes:
171 | if code not in results["GA_codes"]:
172 | results["GA_codes"][code] = {}
173 | results["GA_codes"][code]["first_seen"] = timestamp
174 | results["GA_codes"][code]["last_seen"] = timestamp
175 |
176 | if code in results["GA_codes"]:
177 | if timestamp < results["GA_codes"][code]["first_seen"]:
178 | results["GA_codes"][code]["first_seen"] = timestamp
179 | if timestamp > results["GA_codes"][code]["last_seen"]:
180 | results["GA_codes"][code]["last_seen"] = timestamp
181 |
182 | for code in GTM_codes:
183 | if code not in results["GTM_codes"]:
184 | results["GTM_codes"][code] = {}
185 | results["GTM_codes"][code]["first_seen"] = timestamp
186 | results["GTM_codes"][code]["last_seen"] = timestamp
187 |
188 | if code in results["GTM_codes"]:
189 | if timestamp < results["GTM_codes"][code]["first_seen"]:
190 | results["GTM_codes"][code]["first_seen"] = timestamp
191 | if timestamp > results["GTM_codes"][code]["last_seen"]:
192 | results["GTM_codes"][code]["last_seen"] = timestamp
193 |
194 | except Exception as e:
195 | print(
196 | f"Error retrieving codes from {base_url.format(timestamp=timestamp)}: ",
197 | e,
198 | )
199 | return None
200 |
201 |     print("Finished gathering codes for: ", base_url.format(timestamp=timestamp))
202 |
--------------------------------------------------------------------------------
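
A minimal sketch of the CDX request that get_snapshot_timestamps builds (not part of the repository; example.com, the dates and the limit are made-up values used purely for illustration):

    import re

    # Rebuild the CDX query string the same way get_snapshot_timestamps does,
    # for a hypothetical yearly crawl of example.com between 2012 and 2021.
    url = "example.com"
    cdx_url = (
        f"http://web.archive.org/cdx/search/cdx?url={url}"
        "&matchType=domain&filter=statuscode:200&fl=timestamp&output=JSON"
        "&collapse=timestamp:4"   # "yearly" -> collapse on the first 4 timestamp digits
        "&limit=10"
        "&from=20120101000000"
        "&to=20211231000000"
    )
    print(cdx_url)

    # The CDX response body is plain text; 14-digit timestamps are pulled out with a
    # regex and sorted, exactly as in get_snapshot_timestamps.
    fake_response_text = "20140103000000\n20120101000000\n20130102000000"
    print(sorted(re.compile(r"\d{14}").findall(fake_response_text)))
    # ['20120101000000', '20130102000000', '20140103000000']
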
/wayback_google_analytics/codes.py:
--------------------------------------------------------------------------------
1 | from bs4 import BeautifulSoup
2 | import re
3 |
4 |
5 | def get_UA_code(html):
6 |     """Returns UA codes (w/o duplicates) from given html as a list (empty if none found).
7 |
8 | Args:
9 | html (str): Raw html.
10 |
11 | Returns:
12 | ["UA-12345678-1", "UA-12345678-2", ...]
13 | """
14 |
15 | # Only search for codes in script tags
16 | script_tags = BeautifulSoup(html, "html.parser").find_all("script")
17 |
18 | # Regex pattern to find UA codes
19 | pattern = re.compile(r"UA-[\d-]{5,15}")
20 |
21 | # Find all UA codes in html
22 |
23 | UA_codes = []
24 | for script in script_tags:
25 | curr_codes = pattern.findall(script.text)
26 | UA_codes += curr_codes
27 |
28 |
29 | # Remove duplicates and return
30 | return list(set(UA_codes))
31 |
32 |
33 | def get_GA_code(html):
34 |     """Returns GA codes (w/o duplicates) from given html as a list (empty if none found).
35 |
36 | Args:
37 | html (str): Raw html.
38 |
39 | Returns:
40 | ["G-1234567890", "G-1234567891", ...]
41 | """
42 |
43 | # Only search for codes in script tags
44 | script_tags = BeautifulSoup(html, "html.parser").find_all("script")
45 |
46 | # Regex pattern to find GA codes
47 | pattern = re.compile(r"G-[\d-]{5,15}")
48 |
49 |     # Find all GA codes in html
50 |
51 | GA_codes = []
52 | for script in script_tags:
53 | curr_codes = pattern.findall(script.text)
54 | GA_codes += curr_codes
55 |
56 | # Remove duplicates and return
57 | return list(set(GA_codes))
58 |
59 |
60 | def get_GTM_code(html):
61 |     """Returns GTM codes (w/o duplicates) from given html as a list (empty if none found).
62 | Args:
63 | html (str): Raw html.
64 |
65 | Returns:
66 | ["GTM-1234567890", "GTM-1234567891", ...]
67 | """
68 |
69 | # Only search for codes in script tags
70 | script_tags = BeautifulSoup(html, "html.parser").find_all("script")
71 |
72 |     # Regex pattern to find GTM codes
73 | pattern = re.compile(r"GTM-[\w-]{1,15}")
74 |
75 |     # Find all GTM codes in html
76 |
77 | GTM_codes = []
78 | for script in script_tags:
79 | curr_codes = pattern.findall(script.text)
80 | GTM_codes += curr_codes
81 |
82 |
83 | # Remove duplicates and return
84 | return list(set(GTM_codes))
85 |
--------------------------------------------------------------------------------
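
A short usage sketch for the three extractors above (illustrative only; the HTML snippet and codes are invented). It shows that only the text of <script> tags is searched, so identifiers that merely appear in the page body are ignored:

    from wayback_google_analytics.codes import get_UA_code, get_GA_code, get_GTM_code

    html = """
    <html><head><script>
        gtag('config', 'UA-12345678-1');
        gtag('config', 'G-12345678');
        gtag('config', 'GTM-ABC123');
    </script></head>
    <body>UA-99999999-9 appears only in the body, so it is not picked up.</body></html>
    """

    print(get_UA_code(html))   # ['UA-12345678-1']
    print(get_GA_code(html))   # ['G-12345678']
    print(get_GTM_code(html))  # ['GTM-ABC123']
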
/wayback_google_analytics/main.py:
--------------------------------------------------------------------------------
1 | import aiohttp
2 | import argparse
3 | import asyncio
4 |
5 | from wayback_google_analytics.utils import (
6 | get_limit_from_frequency,
7 | get_14_digit_timestamp,
8 | validate_dates,
9 | COLLAPSE_OPTIONS,
10 | )
11 |
12 | from wayback_google_analytics.scraper import (
13 | get_analytics_codes,
14 | )
15 |
16 | from wayback_google_analytics.output import (
17 | init_output,
18 | write_output,
19 | )
20 |
21 |
22 | async def main(args):
23 | """Main function. Runs get_analytics_codes() and prints results.
24 |
25 | Args:
26 | args: Command line arguments (argparse)
27 |
28 | Returns:
29 | None
30 | """
31 |
32 | # If input_file is provided, read urls from file path
33 | if args.input_file:
34 | try:
35 | with open(args.input_file, "r") as f:
36 | args.urls = f.read().splitlines()
37 | except FileNotFoundError:
38 | print("File not found. Please enter a valid file path.")
39 | return
40 |
41 | # Throws ValueError immediately if output type is incorrect or there is an issue writing to file
42 | if args.output:
43 | output_file = init_output(args.output)
44 |
45 | # Check if start_date is before end_date
46 | if args.start_date and args.end_date:
47 | if not validate_dates(args.start_date, args.end_date):
48 | raise ValueError("Start date must be before end date.")
49 |
50 | # Update dates to 14-digit format
51 | if args.start_date:
52 | args.start_date = get_14_digit_timestamp(args.start_date)
53 |
54 | if args.end_date:
55 | args.end_date = get_14_digit_timestamp(args.end_date)
56 |
57 | # Gets appropriate limit for given frequency & converts frequency to collapse option
58 | if args.frequency:
59 | args.limit = (
60 | get_limit_from_frequency(
61 | frequency=args.frequency,
62 | start_date=args.start_date,
63 | end_date=args.end_date,
64 | )
65 | + 1
66 | )
67 | args.frequency = COLLAPSE_OPTIONS[args.frequency]
68 |
69 | semaphore = asyncio.Semaphore(10)
70 |
71 | # Warn user if large request
72 | if abs(int(args.limit)) > 500 or len(args.urls) > 9:
73 | response = input(
74 | f"""Large requests can lead to being rate limited by archive.org.\n\n Current limit: {args.limit} (Recommended < 500) \n\n Current # of urls: {len(args.urls)} (Recommended < 10, unless limit < 50)
75 |
76 | Do you wish to proceed? (Yes/no)
77 | """
78 | )
79 | if response.lower() not in ("yes", "y"):
80 | print("Request cancelled.")
81 | exit()
82 |
83 | try:
84 | async with semaphore:
85 | async with aiohttp.ClientSession() as session:
86 | results = await get_analytics_codes(
87 | session=session,
88 | urls=args.urls,
89 | start_date=args.start_date,
90 | end_date=args.end_date,
91 | frequency=args.frequency,
92 | limit=args.limit,
93 | semaphore=semaphore,
94 | skip_current=args.skip_current,
95 | )
96 | print(results)
97 |
98 | # handle printing the output
99 | if args.output:
100 | write_output(output_file, args.output, results)
101 | except aiohttp.ClientError as e:
102 | print(
103 |             "Your request was rate limited. Wait 5 minutes and try again, and consider reducing the limit and the number of urls."
104 | )
105 |
106 |
107 | def setup_args():
108 | """Setup command line arguments. Returns args for use in main().
109 |
110 | CLI Args:
111 | --urls: List of urls to scrape
112 | --start_date: Start date for time range. Defaults to Oct 1, 2012, when UA codes were adopted.
113 | --end_date: End date for time range. Defaults to None.
114 | --frequency: Can limit snapshots to remove duplicates (1 per hr, day, month, etc). Defaults to None.
115 | --limit: Limit number of snapshots returned. Defaults to None.
116 | --skip_current: Add this flag to skip current UA/GA codes when getting archived codes.
117 |
118 | Returns:
119 | Command line arguments (argparse)
120 | """
121 |
122 | parser = argparse.ArgumentParser()
123 |
124 | # Argparse group to prevent user from entering both --input_file and --urls
125 | group = parser.add_mutually_exclusive_group(required=True)
126 | group.add_argument(
127 | "-i",
128 | "--input_file",
129 | default=None,
130 | help="Enter a file path to a list of urls in a readable file type (e.g. .txt, .csv, .md)",
131 | )
132 | group.add_argument(
133 | "-u",
134 | "--urls",
135 | nargs="+",
136 | help="Enter a list of urls separated by spaces to get their UA/GA codes (e.g. --urls https://www.google.com https://www.facebook.com)",
137 | )
138 | parser.add_argument(
139 | "-o",
140 | "--output",
141 | default="json",
142 | help="Enter an output type to write results to file. Defaults to json.",
143 | choices=["csv", "txt", "json", "xlsx"],
144 | )
145 | parser.add_argument(
146 | "-s",
147 | "--start_date",
148 | default="01/10/2012:00:00",
149 | help="Start date for time range (dd/mm/YYYY:HH:MM) Defaults to 01/10/2012:00:00, when UA codes were adopted.",
150 | )
151 | parser.add_argument(
152 | "-e",
153 | "--end_date",
154 | default=None,
155 | help="End date for time range (dd/mm/YYYY:HH:MM). Defaults to None.",
156 | )
157 | parser.add_argument(
158 | "-f",
159 | "--frequency",
160 | default=None,
161 | help="Can limit snapshots to remove duplicates (1 per hr, day, month, etc). Defaults to None.",
162 | choices=["yearly", "monthly", "daily", "hourly"],
163 | )
164 | parser.add_argument(
165 | "-l",
166 | "--limit",
167 | default=-100,
168 | help="Limits number of snapshots returned. Defaults to -100 (most recent 100 snapshots).",
169 | )
170 | parser.add_argument(
171 | "-sc",
172 | "--skip_current",
173 | action="store_true",
174 | help="Add this flag to skip current UA/GA codes when getting archived codes.",
175 | )
176 |
177 | return parser.parse_args()
178 |
179 |
180 | def main_entrypoint():
181 | args = setup_args()
182 | asyncio.run(main(args))
183 |
184 |
185 | if __name__ == "__main__":
186 | main_entrypoint()
187 |
--------------------------------------------------------------------------------
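
To make the frequency handling in main() above concrete, here is a small sketch (hypothetical dates) of how a --frequency value is converted into a CDX limit and collapse option before scraping starts:

    from wayback_google_analytics.utils import (
        COLLAPSE_OPTIONS,
        get_14_digit_timestamp,
        get_limit_from_frequency,
    )

    # Hypothetical CLI values: -s 01/01/2012:00:00 -e 01/01/2013:00:00 -f monthly
    start = get_14_digit_timestamp("01/01/2012:00:00")  # "20120101000000"
    end = get_14_digit_timestamp("01/01/2013:00:00")    # "20130101000000"

    # main() adds 1 to the computed limit and swaps the frequency for its collapse digit.
    limit = get_limit_from_frequency("monthly", start, end) + 1  # 13 + 1 = 14
    collapse = COLLAPSE_OPTIONS["monthly"]                       # "6" -> collapse on YYYYMM

    print(limit, collapse)
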
/wayback_google_analytics/output.py:
--------------------------------------------------------------------------------
1 | from datetime import datetime
2 | import json
3 | import os
4 | import pandas as pd
5 |
6 |
7 | def init_output(type, output_dir="./output"):
8 | """Creates output directory and initializes empty output file.
9 |
10 | Args:
11 |         type (str): csv/txt/json/xlsx.
12 | output_dir (str): Path to output directory. Defaults to ./output.
13 |
14 | Returns:
15 |         str: Path to the initialized output file.
16 | """
17 |
18 | valid_types = ["csv", "txt", "json", "xlsx"]
19 | if type not in valid_types:
20 | raise ValueError(
21 | f"Invalid output type: {type}. Please use csv, txt, xlsx or json."
22 | )
23 |
24 | # Create output directory if it doesn't exist
25 | if not os.path.exists(output_dir):
26 | os.makedirs(output_dir)
27 |
28 | # Get current date and time for file name
29 | file_name = datetime.now().strftime("%d-%m-%Y(%H-%M-%S)")
30 |
31 | # Create empty output file if type is not csv and return filename
32 | if type not in ["csv"]:
33 | with open(os.path.join(f"{output_dir}", f"{file_name}.{type}"), "w") as f:
34 | pass
35 |
36 | return os.path.join(output_dir, f"{file_name}.{type}")
37 |
38 | # If csv, create separate files for urls and codes and return filename
39 | with open(os.path.join(f"{output_dir}", f"{file_name}_urls.{type}"), "w") as f:
40 | pass
41 |
42 | with open(os.path.join(f"{output_dir}", f"{file_name}_codes.{type}"), "w") as f:
43 | pass
44 |
45 | return os.path.join(output_dir, f"{file_name}.{type}")
46 |
47 |
48 | def write_output(output_file, output_type, results):
49 |     """Writes results to the correct output file in json, txt, csv or xlsx.
50 |
51 | Args:
52 | output_file (str): Path to output file.
53 |         output_type (str): csv/txt/json/xlsx.
54 | results (dict): Results from scraper.
55 |
56 | Returns:
57 | None
58 | """
59 |
60 | # If json or txt, write contents directly to file.
61 | if output_type == "json" or output_type == "txt":
62 | with open(output_file, "w") as f:
63 | json.dump(results, f, indent=4)
64 | return
65 | # If csv or xlsx, convert results to pandas dataframes.
66 | urls_df = get_urls_df(results)
67 | codes_df = get_codes_df(results)
68 |
69 | # If csv, write dataframes to separate csv files for urls, codes.
70 | if output_type == "csv":
71 | urls_output_file = output_file.replace(".csv", "_urls.csv")
72 | codes_output_file = output_file.replace(".csv", "_codes.csv")
73 | urls_df.to_csv(urls_output_file, index=False)
74 | codes_df.to_csv(codes_output_file, index=False)
75 |
76 | # If xlsx, write dataframes to separate sheets for urls, codes.
77 | if output_type == "xlsx":
78 | writer = pd.ExcelWriter(output_file, engine="xlsxwriter")
79 | urls_df.to_excel(writer, sheet_name="URLs", index=False)
80 | codes_df.to_excel(writer, sheet_name="Codes", index=False)
81 | writer.close()
82 |
83 |
84 | def get_urls_df(results):
85 |     """Flattens the results json (list of dictionaries) and converts it into a simple Pandas dataframe and returns it.
86 |
87 | Args:
88 |         results (list): Results from scraper.
89 |
90 | Returns:
91 | urls_df (pd.DataFrame): Pandas dataframe of results.
92 | """
93 |
94 | url_list = []
95 |
96 | for item in results:
97 | for url, info in item.items():
98 | url_list.append(
99 | {
100 | "url": url,
101 | "UA_Code": info.get("current_UA_code", ""),
102 | "GA_Code": info.get("current_GA_code", ""),
103 | "GTM_Code": info.get("current_GTM_code", ""),
104 | "Archived_UA_Codes": format_archived_codes(
105 | info.get("archived_UA_codes", {})
106 | ),
107 | "Archived_GA_Codes": format_archived_codes(
108 | info.get("archived_GA_codes", {})
109 | ),
110 | "Archived_GTM_Codes": format_archived_codes(
111 | info.get("archived_GTM_codes", {})
112 | ),
113 | }
114 | )
115 |
116 | return pd.DataFrame(url_list)
117 |
118 |
119 | def format_archived_codes(archived_codes):
120 | """Helper function to flatten archived codes and format them into a single string where
121 | each item is numbered and separated by a newline.
122 |
123 | Args:
124 | archived_codes (dict): Dictionary of archived codes.
125 |
126 | Returns:
127 | str: Formatted string.
128 | """
129 |
130 | results = []
131 | idx = 1
132 |
133 | for code, timeframe in archived_codes.items():
134 | results.append(
135 | f"{idx}. {code} ({timeframe['first_seen']} - {timeframe['last_seen']})"
136 | )
137 | idx += 1
138 |
139 | return "\n\n".join(results)
140 |
141 |
142 | def get_codes_df(results):
143 |     """Flattens the result json (list of dictionaries) into a Pandas dataframe and returns it.
144 |
145 | Args:
146 | results (list): Results from scraper.
147 |
148 | Returns:
149 | codes_df (pd.DataFrame): Pandas dataframe of results.
150 |
151 | """
152 |
153 | code_list = []
154 |
155 | # Flattens results into list of dicts for each code, including duplicates
156 | for item in results:
157 | for url, info in item.items():
158 | for key, code in info.items():
159 | if type(code) is list:
160 | for c in code:
161 | code_list.append(
162 | {
163 | "code": c,
164 | "websites": url,
165 | "active": f"Current (at {url})",
166 | }
167 | )
168 | if type(code) is dict:
169 | for c in code:
170 | code_list.append(
171 | {
172 | "code": c,
173 | "websites": url,
174 | "active": f"{code[c]['first_seen']} - {code[c]['last_seen']}(at {url})",
175 | }
176 | )
177 |
178 | # Return a df w/ string message if no codes found
179 | if not code_list:
180 | return pd.DataFrame([{"Message": "No codes found."}])
181 |
182 | # Convert list of dicts to pandas dataframe
183 | codes_df = pd.DataFrame(code_list)
184 |
185 | # Combine all duplicates and format combined columns
186 | codes_df = (
187 | codes_df.groupby("code")
188 | .agg({"websites": lambda x: ", ".join(x), "active": format_active})
189 | .reset_index()
190 | )
191 |
192 | return codes_df
193 |
194 |
195 | def format_active(list):
196 | """Takes a list of strings and formats them into a single, numbered string where
197 | each item is separated by a newline.
198 |
199 | Args:
200 | list (list): List of strings.
201 |
202 | Returns:
203 | str: Formatted string.
204 |
205 | Example:
206 | ["Current (at https://www.example.com)", "2019-01-01 - 2020-01-01 (at https://www.example.com)"]
207 | ->
208 | "1. Current (at https://www.example.com)\n\n
209 | 2. 2019-01-01 - 2020-01-01 (at https://www.example.com)"
210 |
211 | """
212 |
213 | return "\n\n".join(f"{i + 1}. {item}" for i, item in enumerate(list))
214 |
--------------------------------------------------------------------------------
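
For reference, a minimal sketch (fabricated results data) of how scraper output is flattened into the two dataframes that back the csv and xlsx writers above:

    from wayback_google_analytics.output import get_codes_df, get_urls_df

    # One dict per url, in the shape produced by scraper.process_url (values invented).
    results = [
        {
            "someurl.com": {
                "current_UA_code": ["UA-12345678-1"],
                "archived_UA_codes": {
                    "UA-12345678-1": {"first_seen": "01/01/2019", "last_seen": "01/01/2020"},
                },
            }
        }
    ]

    print(get_urls_df(results))   # one row per url, archived codes as numbered strings
    print(get_codes_df(results))  # one row per unique code, grouped across urls
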
/wayback_google_analytics/scraper.py:
--------------------------------------------------------------------------------
1 | import aiohttp
2 | import asyncio
3 | from wayback_google_analytics.codes import (
4 | get_UA_code,
5 | get_GA_code,
6 | get_GTM_code,
7 | )
8 | from wayback_google_analytics.async_utils import (
9 | get_snapshot_timestamps,
10 | get_codes_from_snapshots,
11 | )
12 |
13 | from wayback_google_analytics.utils import (
14 | DEFAULT_HEADERS,
15 | )
16 |
17 |
18 | async def get_html(session, url, semaphore):
19 | """Returns html from a single url.
20 |
21 | Args:
22 | session (aiohttp.ClientSession)
23 | url (str): Url to scrape html from.
24 | semaphore: asyncio.semaphore
25 |
26 | Returns:
27 | html (str): html from url.
28 | """
29 | async with semaphore:
30 | try:
31 | async with session.get(url, headers=DEFAULT_HEADERS) as response:
32 | return await response.text()
33 | except aiohttp.ServerTimeoutError as e:
34 | print(f"Request to {url} timed out", e)
35 | except aiohttp.ClientError as e:
36 | print(f"Failed to reach {url}", e)
37 | except Exception as e:
38 | print(f"Error getting data from {url}", e)
39 | return None
40 |
41 |
42 | async def process_url(
43 | session, url, start_date, end_date, frequency, limit, semaphore, skip_current
44 | ):
45 | """Returns a dictionary of current and archived UA/GA codes for a single url.
46 |
47 | Args:
48 | session (aiohttp.ClientSession)
49 | url (str): Url to scrape.
50 | start_date (str): Start date for time range
51 | end_date (str): End date for time range
52 |         frequency (str): Collapse option for the CDX api (digits of the timestamp to collapse on)
53 |         limit (int): Max number of snapshots to request
54 | semaphore: asyncio.semaphore
55 | skip_current (bool): Determine whether to skip getting current codes
56 |
57 | Returns:
58 | "someurl.com": {
59 | "current_UA_code": "UA-12345678-1",
60 | "current_GA_code": "G-1234567890",
61 | "current_GTM_code": "GTM-12345678",
62 | "archived_UA_codes": {
63 | "UA-12345678-1": {
64 | "first_seen": "20190101000000",
65 | "last_seen": "20190101000000",
66 | },
67 | },
68 | "archived_GA_codes": {
69 | "G-1234567890": {
70 | "first_seen": "20190101000000",
71 | "last_seen": "20190101000000",
72 | }
73 | },
74 | "archived_GTM_codes": {
75 | "GTM-12345678": {
76 | "first_seen": "20190101000000",
77 | "last_seen": "20190101000000",
78 | },
79 | },
80 |         }
81 | """
82 | async with semaphore:
83 | # Initialize dict for entry
84 | curr_entry = {url: {}}
85 |
86 | # Get html + current codes
87 | if not skip_current:
88 | html = await get_html(session, url, semaphore)
89 | print("Retrieving current codes for: ", url)
90 | if html:
91 | curr_entry[url]["current_UA_code"] = get_UA_code(html)
92 | curr_entry[url]["current_GA_code"] = get_GA_code(html)
93 | curr_entry[url]["current_GTM_code"] = get_GTM_code(html)
94 |
95 | print("Finished gathering current codes for: ", url)
96 |
97 | # Get snapshots for Wayback Machine
98 | print("Retrieving archived codes for: ", url)
99 | archived_snapshots = await get_snapshot_timestamps(
100 | session=session,
101 | url=url,
102 | start_date=start_date,
103 | end_date=end_date,
104 | frequency=frequency,
105 | limit=limit,
106 | semaphore=semaphore,
107 | )
108 |
109 | # Get historic codes from archived snapshots, appending them to curr_entry
110 | archived_codes = await get_codes_from_snapshots(
111 | session=session, url=url, timestamps=archived_snapshots, semaphore=semaphore
112 | )
113 | curr_entry[url]["archived_UA_codes"] = archived_codes["UA_codes"]
114 | curr_entry[url]["archived_GA_codes"] = archived_codes["GA_codes"]
115 | curr_entry[url]["archived_GTM_codes"] = archived_codes["GTM_codes"]
116 |
117 | print("Finished retrieving archived codes for: ", url)
118 |
119 | return curr_entry
120 |
121 |
122 | async def get_analytics_codes(
123 | session,
124 | urls,
125 | start_date="20121001000000",
126 | end_date=None,
127 | frequency=None,
128 | limit=None,
129 | semaphore=None,
130 | skip_current=False,
131 | ):
132 | """Takes array of urls and returns array of dictionaries with all found analytics codes for a given time range.
133 |
134 | Args:
135 | session (aiohttp.ClientSession)
136 | urls (array): Array of urls to scrape.
137 | start_date (str, optional): Start date for time range. Defaults to Oct 1, 2012, when UA codes were adopted.
138 | end_date (str, optional): End date for time range. Defaults to None.
139 | frequency (str, optional): Can limit snapshots to remove duplicates (1 per hr, day, month, etc). Defaults to None.
140 | limit (int, optional): Limit number of snapshots returned. Defaults to None.
141 |
142 | Returns:
143 | {
144 | "someurl.com": {
145 | "current_UA_code": "UA-12345678-1",
146 | "current_GA_code": "G-1234567890",
147 | "current_GTM_code": "GTM-12345678",
148 | "archived_UA_codes": {
149 | "UA-12345678-1": {
150 | "first_seen": "20190101000000",
151 | "last_seen": "20200101000000",
152 | }
153 | "UA-12345678-2": {
154 | "first_seen": "20190101000000",
155 | "last_seen": "20200101000000",
156 | }
157 | },
158 | "archived_GA_codes": {
159 | "G-1234567890": {
160 | "first_seen": "20190101000000",
161 | "last_seen": "20200101000000",
162 | }
163 | },
164 | "archived_GTM_codes": {
165 | "GTM-12345678": {
166 | "first_seen": "20190101000000",
167 | "last_seen": "20200101000000",
168 | }
169 | },
170 |             },
171 |             "someotherurl.com": {...},
172 |         }
172 | """
173 |
174 | tasks = []
175 | for url in urls:
176 | task = asyncio.create_task(
177 | process_url(
178 | session=session,
179 | url=url,
180 | start_date=start_date,
181 | end_date=end_date,
182 | frequency=frequency,
183 | limit=limit,
184 | semaphore=semaphore,
185 | skip_current=skip_current,
186 | )
187 | )
188 | tasks.append(task)
189 | await asyncio.sleep(5)
190 |
191 | # Process urls concurrently and return results
192 | results = await asyncio.gather(*tasks)
193 | return results
194 |
--------------------------------------------------------------------------------
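The file above leaves open how get_analytics_codes is actually driven; main.py presumably builds the session and semaphore before calling it. Below is a minimal driver sketch under that assumption (run_example, the example url, and deriving the CDX limit from the frequency are illustrative choices, not the package's CLI):

import asyncio

import aiohttp

from wayback_google_analytics.async_utils import get_analytics_codes
from wayback_google_analytics.utils import get_limit_from_frequency


async def run_example():
    urls = ["https://example.com"]
    start_date = "20121001000000"

    # Assumption: derive the per-url snapshot limit from the chosen frequency
    limit = get_limit_from_frequency("yearly", start_date, None)

    # One shared semaphore caps concurrent requests across every url
    semaphore = asyncio.Semaphore(10)

    async with aiohttp.ClientSession() as session:
        results = await get_analytics_codes(
            session=session,
            urls=urls,
            start_date=start_date,
            frequency="yearly",
            limit=limit,
            semaphore=semaphore,
        )

    # results is a list with one {url: {...}} entry per input url
    print(results)


if __name__ == "__main__":
    asyncio.run(run_example())
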
/wayback_google_analytics/utils.py:
--------------------------------------------------------------------------------
1 | from datetime import datetime
2 | from dateutil.relativedelta import relativedelta
3 |
4 | # Default headers for requests
5 | DEFAULT_HEADERS = {
6 | "user-agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/75.0.3770.142 Safari/537.36"
7 | }
8 |
9 | # Collapse options for CDX api
10 | COLLAPSE_OPTIONS = {
11 | "hourly": "10",
12 | "daily": "8",
13 | "monthly": "6",
14 | "yearly": "4",
15 | }
16 |
17 |
18 | def get_limit_from_frequency(frequency, start_date, end_date):
19 | """Returns an appropriate limit for a given frequency to be used w/ the CDX api.
20 |
21 | Args:
22 | frequency (str): Frequency (hourly, daily, monthly, yearly)
23 | start_date (str): 14-digit timestamp for starting point
24 | end_date (str): 14-digit timestamp for end of range
25 |
26 | Returns:
27 | int: Limit for CDX api
28 | """
29 |
30 |     # Raise an error if no start date was provided
31 | if not start_date:
32 | raise ValueError("To set a frequency you must provide a start date.")
33 |
34 | # Get end date as current date if not present
35 | if not end_date:
36 | end_date = datetime.now()
37 | else:
38 | end_date = datetime.strptime(end_date, "%Y%m%d%H%M%S")
39 |
40 |     # Get start date as a datetime object
41 | start_date = datetime.strptime(start_date, "%Y%m%d%H%M%S")
42 |
43 | # Get delta between start and end dates
44 | delta = relativedelta(end_date, start_date)
45 |
46 | # Remove whitespace and convert frequency to lower case
47 | if frequency:
48 |         frequency = frequency.strip().lower()
49 |
50 | if frequency == "yearly":
51 | return delta.years + 1
52 |
53 | if frequency == "monthly":
54 | return delta.years * 12 + delta.months + 1
55 |
56 | if frequency == "daily":
57 | total_days = (end_date - start_date).days
58 | return total_days + 1
59 |
60 | if frequency == "hourly":
61 | total_hours = (end_date - start_date).total_seconds() / 3600
62 | return int(total_hours + 1)
63 |
64 |     # Raise error if frequency is not one of the above options
65 | raise ValueError(
66 | f"Invalid frequency: {frequency}. Please use hourly, daily, monthly, or yearly."
67 | )
68 |
69 |
70 | def validate_dates(start_date, end_date):
71 | """Returns True if start_date is before end_date, False otherwise.
72 |
73 | Args:
74 |         start_date (str): Date (dd/mm/YYYY:HH:MM) for starting point
75 |         end_date (str): Date (dd/mm/YYYY:HH:MM) for end of range
76 |
77 | Returns:
78 | bool: True if start_date is before end_date, False otherwise.
79 | """
80 |
81 | # Get start and end dates as datetime objects
82 | try:
83 | start_date = datetime.strptime(start_date, "%d/%m/%Y:%H:%M")
84 | except ValueError:
85 | start_date = datetime.strptime(start_date, "%d/%m/%Y")
86 |
87 | try:
88 | end_date = datetime.strptime(end_date, "%d/%m/%Y:%H:%M")
89 | except ValueError:
90 | end_date = datetime.strptime(end_date, "%d/%m/%Y")
91 |
92 | # Return True if start_date is before end_date
93 | if start_date < end_date:
94 | return True
95 |
96 | # Return False if start_date is after end_date
97 | return False
98 |
99 |
100 | def get_date_from_timestamp(timestamp):
101 | """Takes a 14-digit timestamp (YYYYmmddHHMMSS) and returns a date (dd/mm/YYYY:HH:MM).
102 |
103 | Args:
104 | timestamp (str): 14-digit timestamp (YYYYmmddHHMMSS)
105 |
106 | Returns:
107 | str: Date in format dd/mm/YYYY:HH:MM
108 |
109 | Example: 20121001000000 -> 01/10/2012:00:00
110 | """
111 |
112 | # convert timestamp to datetime object
113 | date = datetime.strptime(timestamp, "%Y%m%d%H%M%S")
114 |
115 | # convert datetime object to date
116 | return date.strftime("%d/%m/%Y:%H:%M")
117 |
118 |
119 | def get_14_digit_timestamp(date):
120 | """Takes a date (dd/mm/YYYY:HH:MM) and converts it to a 14-digit timestamp (YYYYmmddHHMMSS).
121 |
122 | Args:
123 | date (str): Date in format dd/mm/YYYY:HH:MM
124 |
125 | Returns:
126 | str: 14-digit timestamp (YYYYmmddHHMMSS)
127 |
128 | Example: 01/10/2012:00:00 -> 20121001000000
129 | """
130 |
131 | # Convert date to datetime object
132 | try:
133 | date = datetime.strptime(date, "%d/%m/%Y:%H:%M")
134 | except ValueError:
135 | date = datetime.strptime(date, "%d/%m/%Y")
136 |
137 | # Convert datetime object to 14-digit timestamp
138 | return date.strftime("%Y%m%d%H%M%S")
139 |
140 | def generate_semaphore(url_list, limit):
141 |     """Returns an appropriate concurrency limit (used to seed an asyncio.Semaphore) given a list of urls and a per-url snapshot limit."""
142 |
143 | url_count = len(url_list)
144 |
145 | operations = url_count * limit
146 |
147 | if operations <= 100:
148 | return 10
149 |
150 | if operations <= 1000:
151 | return 5
152 |
153 |     # Fall back to a single concurrent request for anything larger
154 |     return 1
--------------------------------------------------------------------------------
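Taken together, the helpers above convert between dd/mm/YYYY:HH:MM dates and 14-digit Wayback Machine timestamps, turn a frequency into a CDX result limit, and pick a concurrency value for the semaphore. A short worked sketch with illustrative values:

from wayback_google_analytics.utils import (
    generate_semaphore,
    get_14_digit_timestamp,
    get_date_from_timestamp,
    get_limit_from_frequency,
)

# dd/mm/YYYY:HH:MM -> 14-digit Wayback timestamp, and back again
start = get_14_digit_timestamp("01/10/2012:00:00")       # "20121001000000"
end = get_14_digit_timestamp("01/10/2014:00:00")         # "20141001000000"
print(get_date_from_timestamp(start))                    # 01/10/2012:00:00

# Two full years between the dates -> 2 + 1 = 3 yearly snapshots
print(get_limit_from_frequency("yearly", start, end))    # 3

# 1 url * 3 snapshots = 3 operations (<= 100) -> concurrency of 10
print(generate_semaphore(["https://example.com"], 3))    # 10
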