├── .gitmodules
├── README.md
├── autojournal
├── __init__.py
├── app_usage_output_parser.py
├── calendar_api.py
├── credentials.py
├── data_model.py
├── drive_api.py
├── gcal_aggregator.py
├── maps_data_parser.py
├── notes_to_calendar.py
├── parsers
│ ├── __pycache__
│ │ └── cronometer.cpython-39.pyc
│ ├── activitywatch.py
│ ├── cgm.py
│ ├── cronometer.py
│ ├── google_fit.py
│ ├── gps.py
│ ├── momentodb.py
│ └── nomie.py
├── photos_api.py
├── process_location.py
├── record_usage_video.py
├── report_generator.py
├── selfspy_api.py
└── utils.py
├── example_computer_usage_calendar.png
├── example_location_calendar.png
├── poetry.lock
├── pyproject.toml
└── run_gcal_aggregator.bash
/.gitmodules:
--------------------------------------------------------------------------------
1 | [submodule "Maps-Location-History"]
2 | path = maps_location_history
3 | url = ./Maps-Location-History/
4 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # AutoJournal
2 |
3 | The overarching goal is to provide a system that automatically summarizes
4 | life events, to augment your memory and allow for further analysis if
5 | desired.
6 |
7 | **This library is obsolete; I have switched to using and developing https://github.com/kovasap/autojournal-on-gas.**
8 |
9 | ## Examples
10 |
11 | Autojournal fundamentally moves personal data around between different services.
12 | The best-supported destination for data is Google Calendar.
13 | Here are some examples of what that looks like.
14 | Each block on the calendar corresponds to a single "event" that I've tracked. For
15 | sleep, this would be time asleep. For location, this would be time at a specific
16 | location, or time travelling. For computer usage, this would be active time interacting
17 | with a computer (not AFK).
18 |
19 | An excellent example of someone else doing exactly what autojournal does
20 | can be found at https://miguelrochefort.com/blog/calendar/.
21 |
22 | Just the location events from a trip I went on:
23 |
24 | 
25 |
26 | All events before the trip, showing usage of multiple different computers, and
27 | sleep:
28 |
29 | 
30 |
31 | These were generated with the help of [this blurring
32 | script](https://gist.github.com/IceCreamYou/4f085b180a1608b99cb2).
33 |
34 | ## Getting Started
35 |
36 | Run:
37 |
38 | ```
39 | # Required for selfspy dependency.
40 | sudo apt-get install python-tk
41 | poetry install
42 | ```
43 |
44 | Then, run:
45 |
46 | ```
47 | source $(poetry env info --path)/bin/activate
48 | ```
49 |
50 | to get into the poetry virtualenv to run scripts.
51 |
52 | You'll also need to get a `credentials.json` file from
53 | https://developers.google.com/calendar/quickstart/python#step_1_turn_on_the.
54 | If you already have a client ID, you can just press the download button and save
55 | the file as `credentials.json`.
56 |
57 | To run once a day at 10pm, run `crontab -e` and add this snippet (assuming you
58 | cloned autojournal into your home directory ~/):
59 |
60 | ```
61 | 0 22 * * * (cd ~/autojournal; nohup poetry run gcal_aggregator --update all &> ~/autojournal.log &)
62 | ```
63 |
64 | ### Raspberry Pi
65 |
66 | Requires these additional packages to be installed before `poetry install`:
67 |
68 | ```
69 | sudo apt-get install python-dev libatlas-base-dev
70 | ```
71 |
72 | ## Nomie with Couchdb
73 |
74 | 1. Set up CouchDB on your machine (for me it's a Raspberry Pi:
75 | https://andyfelong.com/2019/07/couchdb-2-1-on-raspberry-pi-raspbian-stretch/,
76 | https://github.com/jguillod/couchdb-on-raspberry-pi#5-script-for-running-couchdb-on-boot).
77 | 1. Follow
78 | https://github.com/happydata/nomie-docs/blob/master/development/couchdb-setup.md
79 |
80 | Check out this notebook for coding help:
81 | https://colab.research.google.com/drive/1vKOHtu1cLgky6I_4W-aFBqq6e6Hb4qBA
82 |
83 | ## TODOs
84 |
85 | ### Better Location Data
86 |
87 | Mode of travel could be judged more accurately using ML, as in this paper:
88 | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5620731/.
89 |
90 | ### Emailed Report
91 |
92 | Generate a report with the timeline that is already implemented, plus something like https://stackoverflow.com/questions/21604997/how-to-find-significant-correlations-in-a-large-dataset. Email this regularly.
93 |
94 | ### Frontend Idea
95 |
96 | Create a histogram-timeline using
97 | [d3js](https://www.d3-graph-gallery.com/graph/density_basic.html) that all data
98 | gets plotted on. Have checkboxes to turn on/off data series. Idea here is to
99 | plot lots of stuff (keystrokes, temperature, heartrate, stock market, etc.) and
100 | have an easy way to check on whatever. Could use [Joy
101 | plot](http://datavizcatalogue.com/blog/area-graphs/) for this.
102 |
103 | Some ideas to explore in this space:
104 |
105 | - [patternfly-timeline](https://github.com/patternfly/patternfly-timeline)
106 | - [d3 line chart with zoom](https://www.d3-graph-gallery.com/graph/line_brushZoom.html)
107 | - **[plotly stacked range slider chart](https://plotly.com/python/range-slider/)**
108 | - [d3 ridgeline/joy plot](https://www.d3-graph-gallery.com/graph/ridgeline_basic.html)
109 | - [d3 pannable chart](https://observablehq.com/@d3/pannable-chart)
110 |
111 | For plotting GPS logs:
112 |
113 | - [Google maps python API](https://github.com/googlemaps/google-maps-services-python)
114 | - [QGIS open source mapping project](https://qgis.org/en/site/about/index.html)
115 | - **[folium](https://github.com/python-visualization/folium)**
116 |
117 | This could generate an HTML report that would be automatically emailed to me
118 | every week.
119 |
120 | ## GPS Logging
121 |
122 | Uses https://gpslogger.app/. Note: to make this work consistently, refer to
123 | https://gpslogger.app/#sometimestheappwillnotlogforlongperiodsoftime.
124 |
125 | ## Google Photos to Calendar Syncer
126 |
127 | When run, this will make a calendar entry for every photo that exists in a
128 | given album.
129 |
130 | Check out https://github.com/ActivityWatch/activitywatch as a potential data
131 | source.
132 |
133 | For long events (e.g. whole day), think about making the event name long with
134 | newlines, so that you get a "graph" in the calendar UI. For example:
135 |
136 | ```
137 | Temperature event:
138 | 68
139 | 68
140 | 68
141 | 68
142 | 68
143 | 68
144 | 69
145 | 70
146 | 70
147 | 70
148 | 70
149 | ...
150 | ```
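
A minimal stdlib-only sketch of building such a multi-line summary (the helper name and data are hypothetical, not part of this repo):

```python
def graph_event_summary(name, readings):
    """Build a newline-separated event title so that a long calendar event
    renders its readings as a rough vertical 'graph' in the calendar UI."""
    lines = [f'{name}:'] + [str(round(r)) for r in readings]
    return '\n'.join(lines)

print(graph_event_summary('Temperature event', [68, 68, 68.6, 69.4, 70]))
```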
151 |
152 | For complex events (e.g. selfspy activity tracking), try using a word cloud
153 | type program to extract out representative words to put into the calendar event
154 | summary.
155 |
156 | ## Analysis TODOs
157 |
158 | Calculate total daily calories vs time of first meal.
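
A stdlib sketch of that analysis (the data shape here is hypothetical; real meal events would come from the cronometer parser):

```python
from collections import defaultdict
from datetime import datetime


def calories_vs_first_meal(meals):
    """meals: iterable of (datetime, calories), one entry per eating event.

    Returns {date: (hour_of_first_meal, total_daily_calories)} so the two
    values can be plotted against each other.
    """
    by_day = defaultdict(list)
    for ts, kcal in meals:
        by_day[ts.date()].append((ts, kcal))
    return {
        day: (min(ts for ts, _ in events).hour,
              sum(kcal for _, kcal in events))
        for day, events in by_day.items()
    }


meals = [
    (datetime(2021, 5, 1, 8, 30), 400),
    (datetime(2021, 5, 1, 13, 0), 700),
    (datetime(2021, 5, 2, 11, 15), 900),
]
print(calories_vs_first_meal(meals))
```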
159 |
160 | ## Additional Things to Track
161 |
162 | Try https://blog.luap.info/how-i-track-my-life.html.
163 |
164 | ## Cron
165 |
166 | Add this crontab entry to run every other hour, assuming you cloned this project to `~/autojournal`:
167 |
168 | ```
169 | 0 */2 * * * ~/autojournal/run_gcal_aggregator.bash
170 | ```
171 |
172 | Check out this article for ideas about other kinds of tracking on Google Calendar: https://towardsdatascience.com/andorid-activity-1ecc454c636c
173 |
174 | ## Google Cloud
175 |
176 | Set up a free tier compute engine instance. Then do
177 | https://cloud.google.com/compute/docs/gcloud-compute#default-properties and you
178 | should be able to ssh into the instance with:
179 |
180 | ```
181 | gcloud compute ssh kovas
182 | ```
183 |
184 | Currently a free tier instance does not have enough memory to run this tool.
185 | I might be able to fix this by not keeping all events in memory at once,
186 | but that would require some serious code changes.
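
The shape of that fix would be swapping list accumulation for generators, so only one event is held in memory at a time. A hedged sketch (names and event shape are hypothetical, not the current code):

```python
def parse_events(lines):
    """Yield one parsed event at a time instead of returning a full list."""
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines rather than buffering them
        yield {'summary': line}


# Even a huge input is consumed lazily, one event at a time.
events = parse_events(iter(['slept', '', 'ate breakfast']))
print(next(events))
```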
187 |
188 | ## Other Inspiration
189 |
190 | https://vimeo.com/99571921
191 |
192 | https://karpathy.github.io/2014/08/03/quantifying-productivity/
193 |
194 | https://www.smartertime.com/index.html
195 |
--------------------------------------------------------------------------------
/autojournal/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kovasap/autojournal/55309333457eee4ece658d6961cebeceee40308f/autojournal/__init__.py
--------------------------------------------------------------------------------
/autojournal/app_usage_output_parser.py:
--------------------------------------------------------------------------------
1 | import copy
2 | import csv
3 | from datetime import datetime, timedelta
4 | from dateutil import tz
5 | from collections import defaultdict
6 | from dataclasses import dataclass, field
7 | from functools import reduce
8 | from typing import Dict
9 |
10 | from sortedcontainers import SortedSet
11 |
12 | from . import calendar_api
13 | from . import utils
14 |
15 |
16 | # See also http://lightonphiri.org/blog/quantifying-my-phone-usage-android-applications-usage-statistics
17 |
18 |
19 | @dataclass
20 | class PhoneSession:
21 | """Phone usage session"""
22 | start_time: datetime = None
23 | end_time: datetime = None
24 | # Map from app name to the total time it was used in this session.
25 | summed_usages: Dict[str, timedelta] = field(
26 | default_factory=lambda: defaultdict(lambda: timedelta(seconds=0)))
27 | # Number of times the phone was checked (used before locking) during this
28 | # period.
29 | checks: int = 1
30 |
31 | def get_duration(self):
32 | return self.end_time - self.start_time
33 |
34 | def to_calendar_event(self):
35 | used_time = reduce(lambda t1, t2: t1 + t2,
36 | self.summed_usages.values(),
37 | timedelta(seconds=0))
38 | unused_time = self.get_duration() - used_time
39 | # Avoid off by very slight amount errors.
40 | if unused_time.total_seconds() < 1:
41 | unused_time = timedelta(seconds=0)
42 | if unused_time < timedelta(seconds=0):
43 | print(self.get_duration(), used_time, unused_time)
44 | print(self.end_time)
45 | raise Exception()
46 | usage_sums = list(self.summed_usages.items())
47 | if unused_time > timedelta(seconds=2):
48 | usage_sums.append(('Unused', unused_time))
49 | sorted_usage_sums = list(reversed(sorted(usage_sums,
50 | key=lambda item: item[1])))
51 | # TODO look into using selfspy api functionality for this to pick
52 | # significant events to show.
53 | top_event_name = ''
54 | if sorted_usage_sums:
55 | top_event_name = sorted_usage_sums[0][0]
56 | if top_event_name == 'Unused' and len(sorted_usage_sums) > 1:
57 | top_event_name = sorted_usage_sums[1][0]
58 | used_time_str = utils.strfdelta(used_time)
59 | used_secs = used_time.total_seconds()
60 | unused_secs = unused_time.total_seconds()
61 | if used_secs == 0 and unused_secs == 0:
62 | percent_active = 100.0
63 | else:
64 | percent_active = round(
65 | 100 * (used_secs / (unused_secs + used_secs)), 1)
66 | return calendar_api.CalendarEvent(
67 | start=dict(dateTime=self.start_time.isoformat(),
68 | timeZone='America/Los_Angeles'),
69 | end=dict(dateTime=self.end_time.isoformat(),
70 | timeZone='America/Los_Angeles'),
71 | summary=(f'{used_time_str} ({percent_active}%), {self.checks} '
72 | f'checks. {top_event_name} primarily'),
73 | description=reduce(
74 | str.__add__,
75 | [f'{u[0]} -- {round(u[1].total_seconds() / 60, 2)} mins\n'
76 | for u in sorted_usage_sums],
77 | ''),
78 | )
79 |
80 |
81 | def get_app_usages_from_usage_history_app_export_lines(csv_lines):
82 | """Parses export data from
83 | https://play.google.com/store/apps/details?id=com.huybn.UsageHistory
84 | """
85 | reader = csv.DictReader(csv_lines)
86 | # Convert times to datetime and get rid of non-data lines. Also skip
87 | # repeated lines.
88 | parsed_rows = set()
89 | app_usages = []
90 | # i = 0
91 | for row in reader:
92 | # i += 1
93 | # if i > 10:
94 | # raise Exception()
95 | # Ignore header lines from concatenating multiple csvs
96 | if list(row.keys()) == list(row.values()):
97 | continue
98 | # Ignore duplicate rows.
99 | row_tuple = tuple(row.values())
100 | if row_tuple in parsed_rows:
101 | continue
102 | parsed_rows.add(row_tuple)
103 | # Get time from row data.
104 | use_time = datetime.fromtimestamp(
105 | int(row['Time in ms']) / 1000).replace(tzinfo=tz.gettz('PST'))
106 | # Make sure that all times are unique, so in sorting nothing gets
107 | # rearranged. This PROBABLY keeps the initial order, but might still
108 | # be a bit buggy.
109 | if app_usages and use_time == app_usages[-1].start_time:
110 | use_time -= timedelta(seconds=1)
111 | duration = timedelta(seconds=float(row['Duration (s)']))
112 | cur_usage = PhoneSession(
113 | start_time=use_time,
114 | end_time=use_time + duration,
115 | )
116 | cur_usage.summed_usages[row['\ufeff\"App name\"']] += duration
117 | # There is a bug with the usage_history app where events are sometimes
118 | # duplicated ~2 seconds after the original event. This attempts to
119 | # filter those duplicates out of the final data set.
120 | duplicate = False
121 | for existing_usage in app_usages[::-1]:
122 | if existing_usage.summed_usages == cur_usage.summed_usages:
123 | duplicate = True
124 | if ((cur_usage.start_time - existing_usage.start_time)
125 | > timedelta(seconds=4)):
126 | break
127 | if not duplicate:
128 | # print(cur_usage.start_time)
129 | # for k, v in cur_usage.summed_usages.items():
130 | # print(k, v)
131 | app_usages.append(cur_usage)
132 | app_usages.sort(key=lambda usage: usage.start_time)
133 | return app_usages
134 |
135 |
136 | def get_app_usages_from_phone_time_app_export_lines(csv_lines):
137 | """Parses export data from
138 | https://play.google.com/store/apps/details?id=com.smartertime.phonetime
139 | """
140 | reader = csv.DictReader(csv_lines)
141 | # Convert times to datetime and get rid of non-data lines. Also skip
142 | # repeated lines.
143 | parsed_rows = set()
144 | app_usages = []
145 | for row in reader:
146 | # Ignore header lines from concatenating multiple csvs
147 | if list(row.keys()) == list(row.values()):
148 | continue
149 | # Ignore duplicate rows.
150 | row_tuple = tuple(sorted(row.items(), key=lambda t: t[0]))
151 | if row_tuple in parsed_rows:
152 | continue
153 | parsed_rows.add(row_tuple)
154 | # Get time from row data.
155 | use_time = datetime.fromtimestamp(
156 | int(row['Start time ms']) / 1000).replace(tzinfo=tz.gettz('PST'))
157 | # Make sure that all times are unique, so in sorting nothing gets
158 | # rearranged. This PROBABLY keeps the initial order, but might still
159 | # be a bit buggy.
160 | # if app_usages and use_time == app_usages[-1].start_time:
161 | # use_time -= timedelta(seconds=1)
162 | cur_usage = PhoneSession(
163 | start_time=use_time,
164 | end_time=use_time + timedelta(
165 | seconds=float(row['Duration ms']) / 1000),
166 | )
167 | # Truncate single app sessions that have durations that bleed past the
168 | # next session's start time.
169 | if app_usages and cur_usage.start_time < app_usages[-1].end_time:
170 | app_usages[-1].end_time = cur_usage.start_time
171 | last_app = next(iter(app_usages[-1].summed_usages.keys()))
172 | app_usages[-1].summed_usages[
173 | last_app] = app_usages[-1].get_duration()
174 | if not row['App name'].strip():
175 | row['App name'] = 'Unknown'
176 | cur_usage.summed_usages[row['App name']] += cur_usage.get_duration()
177 | app_usages.append(cur_usage)
178 | app_usages.sort(key=lambda usage: usage.start_time)
179 | return app_usages
180 |
181 |
182 | def unify_sessions(sessions):
183 | def join_summed_usages(su1, su2):
184 | new = copy.deepcopy(su1)
185 | for app, duration in su2.items():
186 | new[app] += duration
187 | return new
188 | unified = PhoneSession(
189 | start_time=sessions[0].start_time,
190 | end_time=sessions[-1].end_time,
191 | summed_usages=reduce(join_summed_usages,
192 | [s.summed_usages for s in sessions]),
193 | checks=sum([s.checks for s in sessions]),
194 | )
195 | # try:
196 | # unified.to_calendar_event()
197 | # except:
198 | # for s in sessions:
199 | # print(s.start_time.strftime('%M %S'))
200 | # for k, v in s.summed_usages.items():
201 | # print(k, utils.strfdelta(v))
202 | # print(unified.to_calendar_event())
203 | return unified
204 |
205 |
206 | def collapse_all_sessions(sessions, idle_mins=20):
207 | return [unify_sessions(s) for s in utils.split_on_gaps(
208 | sessions, timedelta(minutes=idle_mins),
209 | key=lambda s: s.start_time, last_key=lambda s: s.end_time)]
210 |
211 |
212 | def create_events(csv_lines):
213 | app_usages = get_app_usages_from_usage_history_app_export_lines(csv_lines)
214 | # Group individual app usages into continuous usage sessions.
215 | sessions = collapse_all_sessions(app_usages, idle_mins=0.01)
216 | # Each session should count as a single phone "check".
217 | for s in sessions:
218 | s.checks = 1
219 | # Create new sessions representing multiple sessions happening quickly
220 | # after each other.
221 | grouped_sessions = collapse_all_sessions(sessions, idle_mins=20)
222 | return [ps.to_calendar_event() for ps in grouped_sessions]
223 |
--------------------------------------------------------------------------------
/autojournal/calendar_api.py:
--------------------------------------------------------------------------------
1 | import time
2 | from typing import List, Iterable, Tuple
3 | from datetime import datetime
4 | from googleapiclient.discovery import build
5 | from pprint import pprint, pformat
6 |
7 | from . import utils
8 |
9 | # https://developers.google.com/calendar/v3/reference/events
10 | CalendarEvent = dict
11 | # https://developers.google.com/calendar/v3/reference/calendarList
12 | CalendarList = dict
13 |
14 |
15 | def print_event(e: CalendarEvent) -> None:
16 | start = datetime.fromisoformat(
17 | e['start']['dateTime']).strftime('%m/%d/%Y %I:%M%p')
18 | end = datetime.fromisoformat(
19 | e['end']['dateTime']).strftime('%m/%d/%Y %I:%M%p')
20 | print(f'{start} - {end} {e["summary"]}')
21 |
22 |
23 | EVENT_DESCRIPTION_LENGTH_LIMIT = 8100 # characters
24 |
25 |
26 | def get_consistant_event_timing(event: CalendarEvent) -> Tuple[str, str]:
27 | """Get start/end time strings that are consistent for a given event.
28 |
29 | Seconds can be rounded strangely by Google calendar, so we only compare up
30 | to the minute.
31 | """
32 | return (event['start']['dateTime'][:16], event['end']['dateTime'][:16])
33 |
34 |
35 | def unique_event_key(event: CalendarEvent) -> str:
36 | """Returns a string that should uniquely identify an event."""
37 | return '|'.join((event.get('description', ''),)
38 | + get_consistant_event_timing(event))
39 |
40 |
41 | def time_started_event_key(event: CalendarEvent) -> str:
42 | """Returns the start-time portion of an event's key."""
43 | return get_consistant_event_timing(event)[0]
44 |
45 |
46 | class CalendarApi(object):
47 |
48 | def __init__(self, creds):
49 | self.service = build('calendar', 'v3', credentials=creds)
50 |
51 | def get_events(self, calendar_id: str) -> List[CalendarEvent]:
52 | page_token = None
53 | events = []
54 | while page_token != '':
55 | response = utils.retry_on_error(
56 | self.service.events().list(calendarId=calendar_id,
57 | pageToken=page_token).execute)
58 | events += response.get('items', [])
59 | page_token = response.get('nextPageToken', '')
60 | return events
61 |
62 | def list_calendars(self) -> List[CalendarList]:
63 | calendars = []
64 | page_token = None
65 | while page_token != '':
66 | page_token = '' if not page_token else page_token
67 | calendar_list = utils.retry_on_error(
68 | self.service.calendarList().list(pageToken=page_token).execute)
69 | calendars += calendar_list['items']
70 | page_token = calendar_list.get('nextPageToken', '')
71 | return calendars
72 |
73 | def get_calendar_id(self, calendar_name: str) -> str:
74 | matches = [
75 | cal['id']
76 | for cal in self.list_calendars()
77 | if cal['summary'] == calendar_name
78 | ]
79 | assert len(matches) == 1
80 | return matches[0]
81 |
82 | def clear_calendar(self,
83 | calendar_name: str,
84 | dry_run: bool = False,
85 | **filter_args):
86 | calendar_id = self.get_calendar_id(calendar_name)
87 | events = filter_events(self.get_events(calendar_id), **filter_args)
88 | print(f'Clearing {len(events)} events from {calendar_name}...')
89 | if dry_run:
90 | print('(DRY RUN)')
91 | print_events(events)
92 | else:
93 | for i, e in enumerate(events):
94 | utils.retry_on_error(
95 | self.service.events().delete(
96 | calendarId=calendar_id,
97 | eventId=e['id']).execute)
98 | print(i, end='\r')
99 | print()
100 |
101 | def add_events(self,
102 | calendar_name: str,
103 | events: Iterable[CalendarEvent],
104 | dry_run: bool = False,
105 | **filter_args):
106 | calendar_id = self.get_calendar_id(calendar_name)
107 | events = filter_events(events, **filter_args)
108 | print(f'Adding {len(events)} new events to {calendar_name}...')
109 |
110 | # Find all existing events. If new events are equal to existing
111 | # events, skip them. However, if new events have the same start time
112 | # as existing events but are otherwise not equal, overwrite the
113 | # existing event.
114 | existing_events = self.get_events(calendar_id)
115 | existing_keys = {unique_event_key(e) for e in existing_events}
116 | pre_filter_num_events = len(events)
117 | events = [e for e in events if unique_event_key(e) not in existing_keys]
118 | print(f'{pre_filter_num_events - len(events)} events were '
119 | 'screened because they already exist on this calendar.')
120 | print('Removing existing events with same start time...')
121 | event_start_keys = {time_started_event_key(e) for e in events}
122 | i = 0
123 | for e in existing_events:
124 | if time_started_event_key(e) in event_start_keys:
125 | utils.retry_on_error(
126 | self.service.events().delete(
127 | calendarId=calendar_id,
128 | eventId=e['id']).execute)
129 | i += 1
130 | print(i, end='\r')
131 |
132 | if dry_run:
133 | print('(DRY RUN)')
134 | print_events(events)
135 | else:
136 | for event in events:
137 | try:
138 | response = utils.retry_on_error(
139 | self.service.events().insert(
140 | calendarId=calendar_id,
141 | body=event).execute)
142 | print(f'Added event {pformat(response)}')
143 | except Exception as e:
144 | print(f'FAILED to add event {pformat(event)}')
145 | print(f'Failure reason: {repr(e)}')
146 | raise
147 |
148 |
149 | def print_events(events):
150 | for e in events:
151 | # pprint(e)
152 | # print('--------------------------------------')
153 | print_event(e)
154 |
155 |
156 | def filter_events(events, start_datetime=None, end_datetime=None):
157 | """Filters out events older than the start_datetime or newer than
158 | end_datetime."""
159 |
160 | def datetime_selector(event):
161 | return ((start_datetime is None or datetime.fromisoformat(
162 | event['start']['dateTime']) >= start_datetime) and
163 | (end_datetime is None or
164 | datetime.fromisoformat(event['end']['dateTime']) <= end_datetime))
165 |
166 | return [e for e in events if datetime_selector(e)]
167 |
--------------------------------------------------------------------------------
/autojournal/credentials.py:
--------------------------------------------------------------------------------
1 | import pickle
2 | import os.path
3 |
4 | from google_auth_oauthlib.flow import InstalledAppFlow
5 | from google.auth.transport.requests import Request
6 |
7 |
8 | def get_credentials(scopes):
9 | creds = None
10 | # The file token.pickle stores the user's access and refresh tokens, and is
11 | # created automatically when the authorization flow completes for the first
12 | # time.
13 | if os.path.exists('token.pickle'):
14 | with open('token.pickle', 'rb') as token:
15 | creds = pickle.load(token)
16 | # If there are no (valid) credentials available, let the user log in.
17 | if not creds or not creds.valid:
18 | if creds and creds.expired and creds.refresh_token:
19 | creds.refresh(Request())
20 | else:
21 | # Get credentials.json from
22 | # https://developers.google.com/calendar/quickstart/python#step_1_turn_on_the
23 | flow = InstalledAppFlow.from_client_secrets_file(
24 | 'credentials.json', scopes)
25 | creds = flow.run_local_server(port=0)
26 | # Save the credentials for the next run
27 | with open('token.pickle', 'wb') as token:
28 | pickle.dump(creds, token)
29 | return creds
30 |
31 |
32 |
--------------------------------------------------------------------------------
/autojournal/data_model.py:
--------------------------------------------------------------------------------
1 | from dataclasses import dataclass, field
2 | from datetime import datetime, timedelta
3 | from dateutil import tz
4 |
5 | from . import calendar_api
6 |
7 | @dataclass
8 | class Event:
9 | timestamp: datetime
10 | summary: str
11 | description: str
12 | data: dict = field(default_factory=dict)
13 | duration: timedelta = timedelta(0)
14 |
15 | def to_calendar_event(self) -> calendar_api.CalendarEvent:
16 | return calendar_api.CalendarEvent(
17 | start=dict(
18 | dateTime=self.timestamp.replace(tzinfo=tz.gettz('PST')).isoformat(),
19 | timeZone='America/Los_Angeles'),
20 | end=dict(
21 | dateTime=(self.duration + self.timestamp).replace(
22 | tzinfo=tz.gettz('PST')).isoformat(),
23 | timeZone='America/Los_Angeles'),
24 | summary=self.summary,
25 | description=self.description,
26 | )
27 |
--------------------------------------------------------------------------------
/autojournal/drive_api.py:
--------------------------------------------------------------------------------
1 | import csv
2 | import os.path as op
3 | import io
4 | import zipfile
5 |
6 | from googleapiclient.discovery import build
7 | from googleapiclient.http import MediaIoBaseDownload
8 |
9 | from . import utils
10 |
11 |
12 | class DriveApi(object):
13 |
14 | def __init__(self, creds):
15 | self.service = build('drive', 'v3', credentials=creds)
16 |
17 | def read_files(self, directory):
18 | """Returns dict mapping filename to lines in file for each file in
19 | given directory.
20 | """
21 | found_files = self._get_files_for_query(
22 | f"'{self.get_folder_id(directory)}' in parents")
23 | return {file.get('name'): self.get_file_lines(file.get('id'))
24 | for file in found_files}
25 |
26 | def read_all_spreadsheet_data(self, directory, only=None):
27 | """Gets all spreadsheet data from directory.
28 |
29 | If the set only is specified, will only get files whose name appears in
30 | the only set.
31 | """
32 | found_files = self._get_files_for_query(
33 | f"'{self.get_folder_id(directory)}' in parents")
34 | return {file.get('name'): self.get_spreadsheet_data(file)
35 | for file in found_files
36 | if only is None or file.get('name') in only}
37 |
38 | def get_spreadsheet_data(self, file):
39 | if file['mimeType'] not in {
40 | 'text/comma-separated-values', 'text/csv', 'application/zip',
41 | 'text/tab-separated-values', 'application/vnd.google-apps.spreadsheet'}:
42 | print(f'File {file} not of supported type.')
43 | return []
44 | if file['mimeType'] == 'application/vnd.google-apps.spreadsheet':
45 | dl_file = self.download_file(file.get('id'), export_mime_type='text/csv')
46 | else:
47 | dl_file = self.download_file(file.get('id'))
48 | if file['mimeType'] == 'application/zip':
49 | dl_file = zipfile.ZipFile(dl_file).open(
50 | op.splitext(file.get('name'))[0] + '.csv')
51 | textio = io.TextIOWrapper(dl_file, encoding='utf-8')
52 | delimiter = ','
53 | if file['mimeType'] == 'text/tab-separated-values':
54 | delimiter = '\t'
55 | return [row for row in csv.DictReader(textio, delimiter=delimiter)]
56 |
57 | def download_file(self, file_id, export_mime_type=None):
58 | def _download():
59 | if export_mime_type:
60 | request = self.service.files().export_media(
61 | fileId=file_id, mimeType=export_mime_type)
62 | else:
63 | request = self.service.files().get_media(fileId=file_id)
64 | fh = io.BytesIO()
65 | downloader = MediaIoBaseDownload(fh, request)
66 | done = False
67 | while done is False:
68 | status, done = downloader.next_chunk()
69 | # print("Downloading file %d%%." % int(status.progress() * 100))
70 | fh.seek(0)
71 | return fh
72 | return utils.retry_on_error(_download)
73 |
74 | def download_file_to_disk(self, folder, filename, filepath):
75 | folder_id = self.get_folder_id(folder)
76 | for file in self._get_files_for_query(f"'{folder_id}' in parents"):
77 | if file['name'] == filename:
78 | with open(filepath, 'wb') as f:
79 | f.write(self.download_file(file['id']).getbuffer())
80 |
81 | def get_file_lines(self, file_id):
82 | return [ln.decode('utf-8')
83 | for ln in self.download_file(file_id).readlines()]
84 |
85 | def get_folder_id(self, folder_name):
86 | found_files = self._get_files_for_query(
87 | f"mimeType = 'application/vnd.google-apps.folder' and "
88 | f"name = '{folder_name}'")
89 | assert len(found_files) == 1, found_files
90 | return found_files[0].get('id')
91 |
92 | def _get_files_for_query(self, query):
93 | page_token = None
94 | found_files = []
95 | while True:
96 | response = utils.retry_on_error(
97 | self.service.files().list(
98 | q=query,
99 | spaces='drive',
100 | fields='nextPageToken, files(id, name, mimeType)',
101 | pageToken=page_token).execute)
102 | found_files += response.get('files', [])
103 | page_token = response.get('nextPageToken', None)
104 | if page_token is None:
105 | break
106 | return found_files
107 |
108 |
109 | if __name__ == "__main__":
110 | import credentials
111 | creds = credentials.get_credentials([
112 | # If modifying scopes, delete the file token.pickle.
113 | 'https://www.googleapis.com/auth/drive.readonly'])
114 | drive_api = DriveApi(creds)
115 |
--------------------------------------------------------------------------------
/autojournal/gcal_aggregator.py:
--------------------------------------------------------------------------------
1 | #!/bin/python3
2 |
3 | from functools import reduce
4 | from datetime import timedelta, datetime, date
5 | from dateutil import tz
6 | import argparse
7 |
8 | from . import credentials
9 | from . import photos_api
10 | from . import selfspy_api
11 | from . import calendar_api
12 | from . import drive_api
13 | from . import app_usage_output_parser
14 | from . import maps_data_parser
15 | from . import utils
16 | from .parsers import gps, nomie, momentodb
17 |
18 |
19 | def photos_to_event(
20 | photos: photos_api.mediaItem,
21 | event_length_mins: int = 15) -> calendar_api.CalendarEvent:
22 | return calendar_api.CalendarEvent(
23 | start=utils.utc_to_timezone(
24 | photos[0]['mediaMetadata']['creationTime']),
25 | end=utils.utc_to_timezone(photos[-1]['mediaMetadata']['creationTime'],
26 | additional_offset_mins=event_length_mins),
27 | description=reduce(
28 | lambda s1, s2: s1 + s2,
29 | [f'Notes: {photo.get("description", "")}\n\n'
30 | f'{photo["productUrl"]}\n\n\n' for photo in photos]),
31 | summary='Ate food',
32 | )
33 |
34 |
35 | calendars = {
36 | 'laptop': 'Laptop Activity',
37 | 'desktop': 'Desktop Activity',
38 | 'phone': 'Android Activity',
39 | 'maps': 'Locations and Travel',
40 | 'food': 'Food',
41 | 'nomie': 'Nomie',
42 | 'momento': 'Momento',
43 | }
44 |
45 |
46 | argparser = argparse.ArgumentParser(
47 | description='Upload data from multiple sources to Google Calendar')
48 | argparser.add_argument(
49 | '--clear', nargs='*', choices=list(calendars.keys()) + ['all'],
50 | default=[],
51 | help='Calendars to REMOVE ALL EVENTS from.')
52 | argparser.add_argument(
53 | '--update', nargs='*', choices=list(calendars.keys()) + ['all'],
54 | default=[],
55 | help='Calendars to update.')
56 | argparser.add_argument(
57 | '--dry_run', action='store_true', default=False,
58 | help='Will print what would be added to the calendar(s) without actually '
59 | 'updating them.')
60 | argparser.add_argument(
61 | '--start_date', type=str,
62 | help='Date (inclusive) at which to start modifying the calendar(s) in '
63 | 'format mm/dd/yyyy.')
64 | argparser.add_argument(
65 | '--end_date', type=str,
66 | help='Date (inclusive) at which to stop modifying the calendar(s) in '
67 | 'format mm/dd/yyyy.')
68 | args = argparser.parse_args()
69 |
70 |
71 | def main():
72 | creds = credentials.get_credentials([
73 | # If modifying scopes, delete the file token.pickle.
74 | 'https://www.googleapis.com/auth/drive.readonly',
75 | 'https://www.googleapis.com/auth/calendar',
76 | 'https://www.googleapis.com/auth/photoslibrary.readonly'])
77 |
78 | cal_api_instance = calendar_api.CalendarApi(creds)
79 | drive_api_instance = drive_api.DriveApi(creds)
80 |
81 | cal_mod_args = dict(dry_run=args.dry_run)
82 | if args.start_date:
83 | cal_mod_args['start_datetime'] = datetime.strptime(
84 | args.start_date, '%m/%d/%Y').replace(tzinfo=tz.gettz('PST'))
85 | if args.end_date:
86 | cal_mod_args['end_datetime'] = datetime.strptime(
87 | args.end_date, '%m/%d/%Y').replace(tzinfo=tz.gettz('PST'))
88 |
89 | # Clear events from calendars.
90 | if 'all' in args.clear:
91 | args.clear = list(calendars.keys())
92 | for c in args.clear:
93 | cal_api_instance.clear_calendar(calendars[c], **cal_mod_args)
94 |
95 | # Add food events from Google Photos.
96 | if 'all' in args.update or 'food' in args.update:
97 | photos_api_instance = photos_api.PhotosApi(creds)
98 | food_pictures = photos_api_instance.get_album_contents(
99 | photos_api_instance.get_album_id('Food!'))
100 | # Collapse multiple food pictures taken within 30 mins to one food
101 | # event.
102 | grouped_photos = utils.split_on_gaps(
103 | food_pictures, threshold=timedelta(minutes=30),
104 | key=lambda photo: datetime.fromisoformat(
105 | photo['mediaMetadata']['creationTime'].rstrip('Z')))
106 | food_events = [photos_to_event(photos) for photos in grouped_photos]
107 | cal_api_instance.add_events(
108 | calendars['food'], food_events, **cal_mod_args)
109 |
110 | # Add laptop activity from selfspy
111 | if 'all' in args.update or 'laptop' in args.update:
112 | drive_api_instance.download_file_to_disk(
113 | 'selfspy-laptop', 'selfspy.sqlite', 'laptop_selfspy.sqlite')
114 | laptop_events = selfspy_api.get_selfspy_usage_events(
115 | db_name='laptop_selfspy.sqlite')
116 | cal_api_instance.add_events(
117 | calendars['laptop'], laptop_events,
118 | **cal_mod_args)
119 |
120 | # Add desktop activity from selfspy db stored in Google Drive
121 | if 'all' in args.update or 'desktop' in args.update:
122 | drive_api_instance.download_file_to_disk(
123 | 'selfspy', 'selfspy.sqlite', 'desktop_selfspy.sqlite')
124 | desktop_events = selfspy_api.get_selfspy_usage_events(
125 | db_name='desktop_selfspy.sqlite')
126 | cal_api_instance.add_events(calendars['desktop'], desktop_events,
127 | **cal_mod_args)
128 |
129 | # Add phone events from phone usage csvs stored in Google Drive
130 | if 'all' in args.update or 'phone' in args.update:
131 | android_activity_files = drive_api_instance.read_files(
132 | directory='android-activity-logs')
133 | android_events = app_usage_output_parser.create_events(
134 | # Combine all "Activity" csvs in directory into single datastream.
135 | reduce(list.__add__, [v for k, v in android_activity_files.items()
136 | if 'usage_events' in k]))
137 | cal_api_instance.add_events(calendars['phone'], android_events,
138 | **cal_mod_args)
139 |
140 | # Add locations and travel.
141 | if 'all' in args.update or 'maps' in args.update:
142 | # From Google Takeout files stored in Google Drive.
143 | # drive_api_instance = drive_api.DriveApi(creds)
144 | # maps_location_history_files = drive_api_instance.read_files(
145 | # directory='maps-location-history')
146 | # location_events = maps_data_parser.parse_semantic_location_history(
147 | # maps_location_history_files)
148 |
149 | # Directly from timeline web "API"
150 | # location_events = maps_data_parser.make_events_from_kml_data(
151 | # '2019-09-01',
152 | # # Get data from yesterday only so that the data from today is fully
153 | # # populated before we send it off to the calendar.
154 | # date.today() - timedelta(days=1))
155 | # cal_api_instance.add_events(calendars['maps'], location_events,
156 | # **cal_mod_args)
157 |
158 | # From GPSLogger files in Google Drive
159 | spreadsheet_data = drive_api_instance.read_all_spreadsheet_data(
160 | 'GPSLogger for Android')
161 | # 'GPS TESTING')
162 | location_events = [
163 | e.to_calendar_event() for e in gps.parse_gps(spreadsheet_data)]
164 | cal_api_instance.add_events(calendars['maps'], location_events,
165 | **cal_mod_args)
166 |
167 | # Add manually tracked event from Nomie
168 | if 'all' in args.update or 'nomie' in args.update:
169 | nomie_data = drive_api_instance.read_all_spreadsheet_data('Nomie')
170 | nomie_events = [
171 | e.to_calendar_event() for e in nomie.parse_nomie(nomie_data)]
172 | cal_api_instance.add_events(calendars['nomie'], nomie_events,
173 | **cal_mod_args)
174 |
175 | if 'all' in args.update or 'momento' in args.update:
176 | momento_data = drive_api_instance.read_all_spreadsheet_data('momentodb')
177 | momento_events = [
178 | e.to_calendar_event() for e in momentodb.parse_momentodb(momento_data)]
179 | cal_api_instance.add_events(calendars['momento'], momento_events,
180 | **cal_mod_args)
181 |
182 | # TODO add journal entries
183 |
184 | # TODO add github commits
185 |
186 | # TODO add sleep
187 |
188 |
189 | if __name__ == '__main__':
190 | main()
191 |
--------------------------------------------------------------------------------
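The food-photo grouping in `gcal_aggregator.py` above relies on `utils.split_on_gaps`, which is defined outside this chunk. A minimal sketch of that kind of gap-based grouping (a hypothetical implementation, assuming the input is sorted by the key function, as the photo timestamps are):

```python
from datetime import datetime, timedelta
from typing import Callable, List, Sequence, TypeVar

T = TypeVar('T')


def split_on_gaps(items: Sequence[T], threshold: timedelta,
                  key: Callable[[T], datetime]) -> List[List[T]]:
    """Splits a time-sorted sequence into groups separated by gaps > threshold."""
    groups: List[List[T]] = []
    for item in items:
        # Start a new group whenever the gap to the previous item is too large.
        if groups and key(item) - key(groups[-1][-1]) <= threshold:
            groups[-1].append(item)
        else:
            groups.append([item])
    return groups
```

With `threshold=timedelta(minutes=30)`, two photos taken 10 minutes apart land in one group (one "Ate food" event), while a photo two hours later starts a new one.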
/autojournal/maps_data_parser.py:
--------------------------------------------------------------------------------
1 | import json
2 | from pprint import pprint
3 | import pandas as pd
4 | from datetime import datetime, timedelta
5 |
6 | from .process_location import get_kml_file, full_df
7 |
8 | from . import calendar_api
9 | from . import utils
10 |
11 |
12 | def _get_coord_str(d):
13 |     return f"{d['latitudeE7'] / 1e7}, {d['longitudeE7'] / 1e7}"  # E7 = degrees * 1e7
14 |
15 |
16 | def parse_semantic_location_history(lines_by_filename):
17 | events = []
18 | for lines in lines_by_filename.values():
19 | data = json.loads('\n'.join(lines))
20 | for o in data['timelineObjects']:
21 | # These dicts should have a single key
22 | obj_type = next(iter(o.keys()))
23 | obj = o[obj_type]
24 | events.append(calendar_api.CalendarEvent(
25 | start=utils.timestamp_ms_to_event_time(
26 | int(obj['duration']['startTimestampMs'])),
27 | end=utils.timestamp_ms_to_event_time(
28 | int(obj['duration']['endTimestampMs'])),
29 | ))
30 |             date = events[-1]['start']['dateTime'][:10]  # YYYY-MM-DD
31 | if obj_type == 'placeVisit':
32 | events[-1]['summary'] = f"At {obj['location']['name']}"
33 | events[-1]['description'] = f"""Details:
34 | Coordinates: {_get_coord_str(obj['location'])}
35 | Address: {obj['location']['address']}
36 | See https://www.google.com/maps/timeline?pb=!1m2!1m1!1s{date} for more.
37 | """
38 | elif obj_type == 'activitySegment':
39 | speed = round(obj['distance'] / ((
40 | int(obj['duration']['endTimestampMs'])
41 | - int(obj['duration']['startTimestampMs'])) / 1000), 2)
42 | travel_mode = obj['activityType'].lower().replace('_', ' ')
43 | events[-1][
44 | 'summary'] = f"{speed} m/s {travel_mode}"
45 | events[-1]['description'] = f"""Details:
46 | Start Coordinates: {_get_coord_str(obj['startLocation'])}
47 | End Coordinates: {_get_coord_str(obj['endLocation'])}
48 | See https://www.google.com/maps/timeline?pb=!1m2!1m1!1s{date} for more.
49 | """
50 | else:
51 |                 raise ValueError(f'Unknown timeline object type {obj_type!r}: {o}')
52 | return events
53 |
54 |
55 | COOKIE_FILE = '/home/kovas/autojournal/timeline_cookie.txt'
56 | KML_OUTPUT_DIRECTORY = '/home/kovas/autojournal/location_data/'
57 |
58 |
59 | def make_events_from_kml_data(start_date, end_date, timezone_name='America/Los_Angeles'):
60 | try:
61 | with open(COOKIE_FILE, 'r') as f:
62 | cookie_content = f.read().strip()
63 | except FileNotFoundError:
64 | print('No cookie for maps timeline data!')
65 | return []
66 | kml_files = []
67 | for date in pd.date_range(start=start_date, end=end_date):
68 | kml_files.append(
69 | get_kml_file(date.year, date.month, date.day, cookie_content,
70 | KML_OUTPUT_DIRECTORY))
71 | df = full_df(kml_files).sort_values('RawBeginTime')
72 | events = []
73 | last_name_id = None
74 | for _, row in df.iterrows():
75 | name_id = str(row.Name) + str(row.Address)
76 | # Collapse events where both places are the same into a single event.
77 | if last_name_id == name_id:
78 | events[-1]['end'] = utils.utc_to_timezone(row.RawEndTime,
79 | timezone_name)
80 | else:
81 | events.append(calendar_api.CalendarEvent(
82 | start=utils.utc_to_timezone(row.RawBeginTime, timezone_name),
83 | end=utils.utc_to_timezone(row.RawEndTime, timezone_name),
84 | summary=(
85 | f'{round(row.Distance / row.TotalSecs, 1)} m/s {row.Category}'
86 | if row.Category else
87 | f'At {row.Name} {row.Address}'
88 | ),
89 | description=f'See https://www.google.com/maps/timeline?pb=!1m2!1m1!1s{row.BeginDate} for details.'
90 | ))
91 | last_name_id = name_id
92 | return events
93 |
94 |
95 | if __name__ == '__main__':
96 | make_events_from_kml_data('2019-09-01', '2019-10-10')
97 |
--------------------------------------------------------------------------------
/autojournal/notes_to_calendar.py:
--------------------------------------------------------------------------------
1 | """TODO: write code that takes notes written in a special format and pushes
2 | them to Google Calendar."""
3 |
--------------------------------------------------------------------------------
/autojournal/parsers/__pycache__/cronometer.cpython-39.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kovasap/autojournal/55309333457eee4ece658d6961cebeceee40308f/autojournal/parsers/__pycache__/cronometer.cpython-39.pyc
--------------------------------------------------------------------------------
/autojournal/parsers/activitywatch.py:
--------------------------------------------------------------------------------
1 | from datetime import datetime, timedelta
2 | import json
3 | import sqlite3
4 | from typing import Dict, List, Any
5 |
6 | from ..data_model import Event
7 |
8 |
9 | def get_events(db_file: str) -> List[Event]:
10 | con = sqlite3.connect(db_file)
11 | cur = con.cursor()
12 |
13 | buckets_by_id = {}
14 | hostnames_by_id = {}
15 | for bid, name, hostname in cur.execute(
16 | 'SELECT key, id, hostname FROM bucketmodel'):
17 | buckets_by_id[bid] = name
18 | hostnames_by_id[bid] = hostname
19 |
20 |
21 | events = []
22 | for bid, timestamp, duration, datastr in cur.execute(
23 | 'SELECT bucket_id, timestamp, duration, datastr FROM eventmodel'):
24 | parsed_time = datetime.fromisoformat(timestamp)
25 |         data = json.loads(datastr)  # datastr holds a JSON payload.
26 | start_event = Event(
27 | timestamp=parsed_time,
28 | data=dict(device=hostnames_by_id[bid]),
29 | )
30 | end_event = Event(
31 | timestamp=parsed_time + timedelta(seconds=duration),
32 | data=dict(device=hostnames_by_id[bid]),
33 | )
34 | if buckets_by_id[bid].startswith('aw-watcher-afk'):
35 | # Ignore all statuses other than "not-afk"
36 | if data['status'] != 'not-afk':
37 | continue
38 | start_event.data['using_laptop'] = 1
39 | end_event.data['using_laptop'] = 0
40 | elif buckets_by_id[bid].startswith('aw-watcher-window'):
41 | start_event.data.update(data)
42 | end_event.data.update(data)
43 | events.append(start_event)
44 | events.append(end_event)
45 |
46 | return events
47 |
48 |
49 | def get_events_from_json(raw_json: str) -> List[Event]:
50 | data = json.loads(raw_json)
51 | events = []
52 | base_data = dict(
53 | device=data['buckets']['aw-watcher-android-test']['hostname'],
54 | )
55 | for use in data['buckets']['aw-watcher-android-test']['events']:
56 | parsed_time = datetime.fromisoformat(
57 | use['timestamp'].replace('Z', '+00:00'))
58 | # Start event
59 | events.append(Event(
60 | timestamp=parsed_time,
61 | data={**base_data, 'app': use['data']['app'], 'using_phone': 1},
62 | ))
63 | # End event
64 | events.append(Event(
65 | timestamp=parsed_time + timedelta(seconds=use['duration']),
66 | data={**base_data, 'app': use['data']['app'], 'using_phone': 0},
67 | ))
68 | return events
69 |
--------------------------------------------------------------------------------
/autojournal/parsers/cgm.py:
--------------------------------------------------------------------------------
1 | from datetime import datetime
2 | from typing import Dict, List, Any
3 |
4 | from ..data_model import Event
5 |
6 |
7 | def parse_cgm(data_by_fname) -> List[Event]:
8 | events = []
9 | for fname, data in data_by_fname.items():
10 | if 'glucose' not in fname:
11 | continue
12 | for line in data:
13 | events.append(Event(
14 | summary='',
15 | description='',
16 | timestamp=datetime.strptime(
17 | line['Device Timestamp'], '%m-%d-%Y %I:%M %p'),
18 | data=line))
19 | return events
20 |
--------------------------------------------------------------------------------
/autojournal/parsers/cronometer.py:
--------------------------------------------------------------------------------
1 | from collections import defaultdict
2 | import copy
3 | from datetime import datetime
4 | from typing import List, Any, Union
5 |
6 | import pytz
7 |
8 | from ..data_model import Event
9 |
10 |
11 | def is_numeric(v: Any) -> bool:
12 | """Returns True if v can be an int/float, False otherwise."""
13 | try:
14 | float(v)
15 |     except (ValueError, TypeError):
16 | return False
17 | return True
18 |
19 |
20 | def _parse_food_timing_note(note: dict) -> dict:
21 | split_note = note['Note'].split(' ')
22 | data = dict(
23 | num_foods=int(split_note[1]),
24 | time=split_note[2],
25 | )
26 | if len(split_note) > 3:
27 | data['description'] = ' '.join(split_note[3:])
28 | return data
29 |
30 |
31 | def _get_data_by_day(csv_data):
32 | data_by_day = defaultdict(list)
33 | for line in csv_data:
34 | data_by_day[line['Day']].append(line)
35 | return data_by_day
36 |
37 |
38 | def parse_time(time: str) -> datetime:
39 | dt = datetime.strptime(time, '%Y-%m-%d %I:%M%p')
40 |     return pytz.timezone('America/Los_Angeles').localize(dt)
41 |
42 |
43 | def add_food(t, d, events, cur_day_events):
44 | event = Event(
45 | summary=f'{d["Food Name"]}, {d["Amount"]}',
46 | description=f'{d["Food Name"]}, {d["Amount"]}',
47 | timestamp=t,
48 | data=d)
49 | events.append(event)
50 | cur_data = cur_day_events[-1].data if len(cur_day_events) else {}
51 | cur_day_event = copy.deepcopy(event)
52 | cur_day_event.data = {k: (cur_data.get(k, 0) + float(v))
53 | if is_numeric(v) else v
54 | for k, v in d.items()}
55 | cur_day_events.append(cur_day_event)
56 |
57 |
58 | def cast_food_value_type(name: str, value: str) -> Union[float, str]:
59 | if is_numeric(value):
60 | return float(value)
61 | elif name in {'Category', 'Food Name', 'Amount'}:
62 | return value
63 | elif value == '':
64 | return 0.0
65 | else:
66 | return value
67 |
68 |
69 | def _make_weight_event(daytime: str, data: dict) -> Event:
70 | plottable_data = {f'{data["Metric"]} ({data["Unit"]})': data['Amount']}
71 | return Event(
72 | summary=f'{data["Metric"]}, {data["Amount"]}, {data["Unit"]}',
73 | description=f'{data["Metric"]}, {data["Amount"]}, {data["Unit"]}',
74 | timestamp=parse_time(daytime),
75 | data=plottable_data)
76 |
77 |
78 | def parse_weight(weight_by_day) -> List[Event]:
79 | events = []
80 | for day, weights in weight_by_day.items():
81 | if not weights:
82 | continue
83 | elif len(weights) == 1:
84 | events.append(_make_weight_event(f'{day} 9:00am', weights[0]))
85 | elif len(weights) == 2:
86 | events.append(_make_weight_event(f'{day} 9:00am', weights[0]))
87 | events.append(_make_weight_event(f'{day} 11:00pm', weights[1]))
88 | elif len(weights) == 3:
89 | events.append(_make_weight_event(f'{day} 9:00am', weights[0]))
90 | events.append(_make_weight_event(f'{day} 12:00pm', weights[1]))
91 | events.append(_make_weight_event(f'{day} 11:00pm', weights[2]))
92 | return events
93 |
94 |
95 | def parse_nutrition(data_by_fname, daily_cumulative: bool = True
96 | ) -> List[Event]:
97 | events = []
98 | daily_cum_events = []
99 |
100 | weight_events = parse_weight(_get_data_by_day(
101 | data_by_fname['biometrics.csv']))
102 | events += weight_events
103 | daily_cum_events += weight_events
104 |
105 | foods_by_day = _get_data_by_day(data_by_fname['servings.csv'])
106 | notes_by_day = _get_data_by_day(data_by_fname['notes.csv'])
107 |
108 | for day, foods in foods_by_day.items():
109 | # Convert to proper types, making empty strings 0 valued
110 | foods = [{k: cast_food_value_type(k, v) for k, v in f.items()}
111 | for f in foods]
112 | cur_day_events = []
113 | foods_iter = iter(f for f in foods if f['Food Name'] != 'Tap Water')
114 | for note in notes_by_day[day]:
115 | if not note['Note'].startswith('eat'):
116 | continue
117 | note_data = _parse_food_timing_note(note)
118 | for _ in range(note_data['num_foods']):
119 | add_food(parse_time(f'{day} {note_data["time"]}'),
120 | next(foods_iter), events, cur_day_events)
121 | # Assume the rest of the foods were eaten at midnight.
122 | for food in foods_iter:
123 | add_food(parse_time(f'{day} 12:00am'),
124 | food, events, cur_day_events)
125 | daily_cum_events += cur_day_events
126 |
127 | return daily_cum_events if daily_cumulative else events
128 |
129 |
130 | def parse_biometrics(data_by_fname) -> List[Event]:
131 |     return []  # TODO: implement biometrics parsing.
132 |
--------------------------------------------------------------------------------
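`add_food` in `cronometer.py` above maintains a running within-day total by adding each numeric nutrition field onto the previous cumulative snapshot, passing non-numeric fields (food name, amount) through unchanged. The accumulation step on its own (a sketch; the dict keys are illustrative Cronometer CSV columns):

```python
def accumulate(prev_totals: dict, new_entry: dict) -> dict:
    """Adds numeric fields of new_entry onto prev_totals; keeps strings as-is."""
    out = {}
    for k, v in new_entry.items():
        try:
            out[k] = prev_totals.get(k, 0) + float(v)
        except (ValueError, TypeError):
            out[k] = v  # Non-numeric fields (e.g. food name) pass through.
    return out
```

Folding each day's foods through this produces the `daily_cum_events` stream: each event's data holds the day's totals up to that meal.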
/autojournal/parsers/google_fit.py:
--------------------------------------------------------------------------------
1 | """Parses json data like:
2 | {
3 | "fitnessActivity": "running",
4 | "startTime": "2021-04-19T20:57:39.686Z",
5 | "endTime": "2021-04-19T21:08:53.872Z",
6 | "duration": "674.186s",
7 | "segment": [{
8 | "fitnessActivity": "running",
9 | "startTime": "2021-04-19T20:57:39.686Z",
10 | "endTime": "2021-04-19T21:08:53.872Z"
11 | }],
12 | "aggregate": [{
13 | "metricName": "com.google.heart_minutes.summary",
14 | "floatValue": 19.0
15 | }, {
16 | "metricName": "com.google.calories.expended",
17 | "floatValue": 144.3837432861328
18 | }, {
19 | "metricName": "com.google.step_count.delta",
20 | "intValue": 1550
21 | }, {
22 | "metricName": "com.google.distance.delta",
23 | "floatValue": 1558.4816045761108
24 | }, {
25 | "metricName": "com.google.speed.summary",
26 | "floatValue": 2.381139442776954
27 | }, {
28 | "metricName": "com.google.active_minutes",
29 | "intValue": 10
30 | }]
31 | }
32 | """
33 |
34 | from datetime import datetime, timedelta
35 | from dateutil import tz
36 | import json
37 | from typing import List
38 |
39 | from ..data_model import Event
40 |
41 |
42 | METERS_IN_MILE = 1609
43 | METRIC_NAME_MAPS = {
44 | 'com.google.calories.expended': 'Burned Calories',
45 | 'com.google.distance.delta': 'Meters Travelled',
46 | 'com.google.speed.summary': 'Speed',
47 | }
48 |
49 |
50 | def get_agg_value(aggregate: dict) -> float:
51 | for key in ['floatValue', 'intValue']:
52 | if key in aggregate:
53 | return round(float(aggregate[key]), 1)
54 | raise Exception(f'Unknown key in aggregate {aggregate}')
55 |
56 |
57 | def activity_json_to_event(activity_json: str) -> Event:
58 | data = json.loads(activity_json)
59 | aggregates = {
60 | METRIC_NAME_MAPS.get(agg['metricName'], agg['metricName']):
61 | get_agg_value(agg) for agg in data['aggregate']}
62 | if data['fitnessActivity'] in {'running', 'walking'}:
63 | calories = aggregates['Burned Calories']
64 | speed = aggregates['Speed']
65 | distance_mi = round(aggregates['Meters Travelled'] / METERS_IN_MILE, 1)
66 | description = (
67 | f'{calories} cal burned {data["fitnessActivity"]} {distance_mi} mi '
68 | f'at {speed} m/s')
69 | else:
70 | calories = aggregates['Burned Calories']
71 | description = f'{calories} cal burned doing {data["fitnessActivity"]}'
72 | return Event(
73 | timestamp=datetime.fromisoformat(
74 | data['startTime'].replace('Z', '+00:00')).astimezone(
75 | tz.gettz('PST')),
76 | duration=timedelta(seconds=float(data['duration'].strip('s'))),
77 | data=aggregates,
78 | summary=data['fitnessActivity'],
79 | description=description)
80 |
81 |
82 | def parse_sessions(drive_api_instance, directory) -> List[Event]:
83 | json_files = drive_api_instance.read_files(directory)
84 | return [activity_json_to_event('\n'.join(lines))
85 | for lines in json_files.values()]
86 |
--------------------------------------------------------------------------------
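The `aggregate` list in the sample JSON at the top of `google_fit.py` is a list of `{metricName, floatValue|intValue}` records. Flattening it into a name-to-value dict, mirroring what `get_agg_value` does (a standalone sketch; the sample data below is a subset of the docstring's example):

```python
import json

SAMPLE = '''[
  {"metricName": "com.google.step_count.delta", "intValue": 1550},
  {"metricName": "com.google.distance.delta", "floatValue": 1558.48}
]'''


def flatten_aggregates(aggregates):
    """Maps metricName -> rounded value, accepting float or int payloads."""
    out = {}
    for agg in aggregates:
        for key in ('floatValue', 'intValue'):
            if key in agg:
                out[agg['metricName']] = round(float(agg[key]), 1)
                break
        else:
            raise ValueError(f'No value field in aggregate {agg}')
    return out
```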
/autojournal/parsers/gps.py:
--------------------------------------------------------------------------------
1 | """Gps data parsing functionality.
2 |
3 | Parses data generated by the GPSLogger for Android app (https://gpslogger.app/).
4 | Make sure that "gps" is the only source for data on the app (not "network").
5 | """
6 |
7 | from dataclasses import dataclass
8 | from dateutil import tz
9 | from datetime import datetime, timedelta
10 | from typing import List, Tuple, Sequence, Set, Optional
11 | import statistics
12 |
13 | import geopy.distance
14 | import geopy.geocoders
15 |
16 | from ..data_model import Event
17 |
18 | # Distance between two readings for movement between them to be ignored.
19 | STATIONARY_DISTANCE_MILES = 0.05
20 | STATIONARY_TIME_BETWEEN_TRIPS_SECS = 60 * 5
21 |
22 | location_bank = []
23 | nominatim = geopy.geocoders.Nominatim(user_agent='autojournal')
24 |
25 | def make_float(s: str) -> float:
26 | return float(s) if s else 0.0
27 |
28 | @dataclass
29 | class Location:
30 | """Stores location data."""
31 |
32 | latitude: float
33 | longitude: float
34 | elevation: float
35 | accuracy_miles: float
36 | speed: float
37 | name: str = 'unknown'
38 |     mode_of_travel: Optional[str] = None
39 |
40 | @classmethod
41 | def from_line(cls, line: dict) -> 'Location':
42 | return cls(
43 | latitude=make_float(line['lat']),
44 | longitude=make_float(line['lon']),
45 | elevation=make_float(line['elevation']),
46 | accuracy_miles=make_float(line['accuracy']) / 1609.34, # meters -> miles
47 | speed=make_float(line['speed']))
48 |
49 | def summary(self) -> str:
50 | if self.mode_of_travel:
51 | return ''
52 | else:
53 | if self.name == 'unknown':
54 | self.lookup_name()
55 | return f'At {self.name}'
56 |
57 | def as_point(self) -> Tuple[float, float]:
58 | return (self.latitude, self.longitude)
59 |
60 | def get_distance(self, other):
61 | return geopy.distance.distance(
62 | self.as_point(), other.as_point()).miles
63 |
64 | def is_same_place(self, other) -> bool:
65 | return (
66 | self.accuracy_miles + other.accuracy_miles + STATIONARY_DISTANCE_MILES
67 | > self.get_distance(other))
68 |
69 |     def lookup_name(self) -> None:
70 |         for banked_location in location_bank:
71 |             if banked_location.is_same_place(self):
72 |                 self.name = banked_location.name
73 |                 return
74 |         self.name = nominatim.reverse(self.as_point()).address
75 |         location_bank.append(self)
76 |
77 | def __str__(self) -> str:
78 | return f'{self.as_point()}, {self.name}'
79 |
80 |
81 | def are_single_location(
82 | locations: Sequence[Location], fraction_required: float=0.9,
83 | samples: Optional[int]=None) -> bool:
84 | sampled_locations = locations
85 | if samples:
86 | sample_spacing = len(locations) // samples
87 | if sample_spacing > 0:
88 | sampled_locations = [locations[i*sample_spacing] for i in range(samples)]
89 | num_locations = len(sampled_locations)
90 | total_num_matching = 0
91 | for l1 in sampled_locations:
92 | num_matching_l1 = 0
93 | for l2 in sampled_locations:
94 | if l1.is_same_place(l2):
95 | num_matching_l1 += 1
96 | if num_matching_l1 / num_locations > fraction_required:
97 | total_num_matching += 1
98 | return total_num_matching / num_locations > fraction_required
99 |
100 |
101 | def get_traveling_description(
102 | timestamps: Sequence[datetime], locations: Sequence[Location]) -> str:
103 | mph_speeds = [
104 | locations[i-1].get_distance(location)
105 | / ((timestamp - timestamps[i-1]).total_seconds() / 60 / 60)
106 | for i, (timestamp, location) in enumerate(zip(timestamps, locations))
107 | if i > 0
108 | ]
109 | if not mph_speeds:
110 | return 'not enough data'
111 | average_mph_speed = statistics.mean(mph_speeds)
112 | stdev_mph_speed = statistics.stdev(mph_speeds)
113 | max_mph_speed = max(mph_speeds)
114 | # https://www.bbc.co.uk/bitesize/guides/zq4mfcw/revision/1
115 | if average_mph_speed < 0:
116 | mode_of_travel = 'not travelling?'
117 | elif average_mph_speed < 4 and max_mph_speed < 10:
118 | mode_of_travel = 'walking'
119 | elif average_mph_speed < 13 and max_mph_speed < 13:
120 | mode_of_travel = 'running'
121 | elif average_mph_speed < 25 and max_mph_speed < 20:
122 | mode_of_travel = 'biking'
123 | elif average_mph_speed < 100:
124 | mode_of_travel = 'driving'
125 | elif average_mph_speed < 300:
126 | mode_of_travel = 'on the train'
127 | else:
128 | mode_of_travel = 'flying'
129 | return (f'{mode_of_travel} at {average_mph_speed:.2f}±{stdev_mph_speed:.2f} '
130 | f'mph (max of {max_mph_speed:.2f} mph)')
131 |
132 |
133 | def make_calendar_event(
134 | timestamps: Sequence[datetime], locations: Sequence[Location],
135 | travel_event: bool) -> Event:
136 | return Event(
137 | timestamp=timestamps[0],
138 | duration=timestamps[-1] - timestamps[0],
139 | summary=(get_traveling_description(timestamps, locations)
140 | if travel_event else locations[-1].summary()),
141 | description='')
142 |
143 |
144 | def make_events(timestamps: List[datetime], locations: List[Location],
145 | window_size: timedelta=timedelta(minutes=2),
146 | min_points_per_window: int=3,
147 |                 ) -> List[Event]:
148 | """Finds sections of input list where location is different."""
149 |
150 | def get_window_size(ts: List[datetime]) -> timedelta:
151 | if not ts:
152 | return timedelta(seconds=0)
153 | return ts[-1] - ts[0]
154 |
155 | # Creates windows of window_size. If a window has less than
156 | # min_points_per_window, then we add more even if we go above window_size.
157 | timestamp_windows = [[]]
158 | location_windows = [[]]
159 | for timestamp, location in zip(timestamps, locations):
160 | if (get_window_size(timestamp_windows[-1]) > window_size and
161 | len(timestamp_windows[-1]) >= min_points_per_window):
162 | timestamp_windows.append([])
163 | location_windows.append([])
164 | timestamp_windows[-1].append(timestamp)
165 | location_windows[-1].append(location)
166 |
167 | events = []
168 | cur_event_timestamps = []
169 | cur_event_locations = []
170 | stationary = True
171 | for timestamp_window, location_window in zip(
172 | timestamp_windows, location_windows):
173 | single_location = are_single_location(location_window)
174 | if cur_event_timestamps and single_location != stationary:
175 | events.append(make_calendar_event(
176 | cur_event_timestamps, cur_event_locations,
177 | travel_event=not stationary))
178 | stationary = not stationary
179 | cur_event_timestamps = []
180 | cur_event_locations = []
181 | cur_event_timestamps += timestamp_window
182 | cur_event_locations += location_window
183 | events.append(make_calendar_event(
184 | cur_event_timestamps, cur_event_locations,
185 | travel_event=not stationary))
186 | return events
187 |
188 |
189 | def parse_gps(data_by_fname) -> List[Event]:
190 | events = []
191 | for fname, data in sorted(data_by_fname.items(), key=lambda t: t[0]):
192 | if not fname.endswith('.zip'):
193 | continue
194 | line_locations = []
195 | line_timestamps = []
196 | for line in data:
197 | line_locations.append(Location.from_line(line))
198 | line_timestamps.append(datetime.fromisoformat(
199 | line['time'].replace('Z', '+00:00')).astimezone(tz.gettz('PST')))
200 | events += make_events(line_timestamps, line_locations)
201 | print(f'Finished parsing {fname}')
202 | return events
203 |
--------------------------------------------------------------------------------
/autojournal/parsers/momentodb.py:
--------------------------------------------------------------------------------
1 | from datetime import datetime
2 | from typing import List
3 |
4 | from ..data_model import Event
5 |
6 |
7 | def parse_momentodb(data_by_fname) -> List[Event]:
8 | events = []
9 | for fname, data in data_by_fname.items():
10 | for line in data:
11 |             # print(line)
12 | for key, val in line.items():
13 | if key in {'Creation', '__id'}:
14 | continue
15 | if val in {'', None, 'FALSE'}:
16 | continue
17 | events.append(Event(
18 | summary=f'{key} {val}' if val != 'TRUE' else key,
19 | description=str(line),
20 | timestamp=datetime.strptime(
21 | line['Creation'], '%m/%d/%Y %H:%M:%S'),
22 | data={key: val}))
23 | return events
24 |
--------------------------------------------------------------------------------
/autojournal/parsers/nomie.py:
--------------------------------------------------------------------------------
1 | from datetime import datetime
2 | from typing import Dict, List
3 |
4 | from ..data_model import Event
5 |
6 |
7 | def parse_nomie(data_by_fname) -> List[Event]:
8 | events = []
9 | for fname, data in data_by_fname.items():
10 | if not fname.startswith('n-'):
11 | continue
12 | for line in data:
13 |             # print(line)
14 | events.append(Event(
15 | summary=f'{line["value"]} {line["tracker"]}',
16 | description=str(line),
17 | timestamp=datetime.strptime(
18 | line['start'], '%Y-%m-%dT%H:%M:%S.%f'),
19 | data={line['tracker']: line['value']}))
20 | return events
21 |
--------------------------------------------------------------------------------
/autojournal/photos_api.py:
--------------------------------------------------------------------------------
1 | from typing import List
2 | from googleapiclient.discovery import build
3 |
4 |
5 | # https://developers.google.com/photos/library/reference/rest/v1/mediaItems
6 | mediaItem = dict
7 |
8 | # Helpful tips at
9 | # https://stackoverflow.com/questions/50573196/access-google-photo-api-with-python-using-google-api-python-client
10 |
11 |
12 | class PhotosApi(object):
13 |
14 | def __init__(self, creds):
15 | self.service = build(
16 | 'photoslibrary', 'v1', credentials=creds, static_discovery=False)
17 |
18 | def get_album_id(self, name: str) -> str:
19 | albums = []
20 | page_token = None
21 | while page_token != '':
22 | page_token = '' if not page_token else page_token
23 | results = self.service.albums().list(
24 | pageToken=page_token,
25 | pageSize=10,
26 | fields="nextPageToken,albums(id,title)",
27 | ).execute()
28 | albums += results.get('albums', [])
29 | page_token = results.get('nextPageToken', '')
30 | for album in albums:
31 | if album['title'] == name:
32 | return album['id']
33 | else:
34 | raise Exception(f'Album {name} not found!')
35 |
36 | def get_album_contents(self, album_id: str) -> List[mediaItem]:
37 | photos = []
38 | page_token = None
39 | while page_token != '':
40 | page_token = '' if not page_token else page_token
41 | results = self.service.mediaItems().search(
42 | body=dict(pageSize=25, pageToken=page_token,
43 | albumId=album_id)
44 | ).execute()
45 | photos += results.get('mediaItems', [])
46 | page_token = results.get('nextPageToken', '')
47 | return photos
48 |
--------------------------------------------------------------------------------
/autojournal/process_location.py:
--------------------------------------------------------------------------------
1 | import pandas as pd
2 | import numpy as np
3 | import datetime as DT
4 | import os
5 | import re
6 | import imp
7 | import glob
8 | import time
9 | import calendar
10 | import requests
11 | from datetime import datetime
12 | from dateutil import tz
13 | from bs4 import BeautifulSoup
14 | # from mpl_toolkits.basemap import Basemap
15 |
16 |
17 | def convert_timezone(dtime):
18 | """
19 |     Convert datetimes from UTC to the local time zone
20 | """
21 | utc_datetime = datetime.strptime(dtime, "%Y-%m-%dT%H:%M:%S.%fZ")
22 | utc_datetime = utc_datetime.replace(tzinfo=tz.tzutc())
23 | local_datetime = utc_datetime.astimezone(tz.tzlocal())
24 | return local_datetime.strftime("%Y-%m-%d %H:%M:%S")
25 |
26 |
27 | def process(bs):
28 | """
29 |     Convert a KML file into a list of dictionaries.
30 |     Every place begins with a Placemark tag in the KML file.
31 | :param bs: beautiful soup object
32 | :return: list of places
33 | """
34 | places = []
35 | for place in bs.find_all('Placemark'):
36 | dic = {}
37 | for elem in place:
38 | if elem.name != 'Point':
39 | c = list(elem.children)
40 | e = elem.find_all('Data')
41 | if len(c) == 1:
42 | dic.update({elem.name.title(): ''.join(c)})
43 | elif len(e) > 1:
44 | for d in e:
45 | dic.update({d.attrs['name']: d.text})
46 | else:
47 | dic.update({elem.name: [d.text for d in c]})
48 | places.append(dic)
49 | return places
50 |
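As a reference for what `process` extracts, here is a standalone sketch of the same Placemark-to-dict idea using only the standard library (`xml.etree.ElementTree` in place of BeautifulSoup); the KML snippet and `parse_placemarks` helper are hypothetical:

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical KML snippet of the kind Google Timeline exports.
KML = '''<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>Home</name>
    <address>123 Example St</address>
  </Placemark>
</kml>'''

NS = {'kml': 'http://www.opengis.net/kml/2.2'}

def parse_placemarks(kml_text):
    """Collect each Placemark's child tags into a dict, like process() does."""
    root = ET.fromstring(kml_text)
    places = []
    for place in root.findall('.//kml:Placemark', NS):
        # Strip the '{namespace}' prefix from each child tag to get a clean key.
        places.append({child.tag.split('}')[-1].title(): (child.text or '').strip()
                       for child in place})
    return places

print(parse_placemarks(KML))  # -> [{'Name': 'Home', 'Address': '123 Example St'}]
```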
51 |
52 | def create_places_list(kml_file):
53 | """
54 | Open and read the KML file, then process it into a list of places.
55 | :param kml_file: KML file path
56 | :return: list of places
57 | """
58 | with open(kml_file, 'r') as f:
59 | s = BeautifulSoup(f, 'xml')
60 | return process(s)
61 |
62 |
63 | def convert_time(row):
64 | """
65 | Convert datetimes into well-formatted dates and compute event duration.
66 | """
67 | b_time = datetime.strptime(row['BeginTime'], "%Y-%m-%dT%H:%M:%S.%fZ")
68 | e_time = datetime.strptime(row['EndTime'], "%Y-%m-%dT%H:%M:%S.%fZ")
69 | delta = (e_time - b_time).total_seconds()
70 | m, s = map(int,divmod(delta, 60))
71 | h, m = divmod(m, 60)
72 | row['RawBeginTime'], row['RawEndTime'] = row['BeginTime'], row['EndTime']
73 | row['Duration'] = '%sh %smin %ssec' % (h, m, s)
74 | row['IndexTime'] = row['BeginTime'] = convert_timezone(row['BeginTime'])
75 | row['BeginDate'], row['BeginTime'] = row['BeginTime'].split(' ')
76 | row['EndDate'], row['EndTime'] = convert_timezone(row['EndTime']).split(' ')
77 | row['WeekDay'] = datetime.strptime(row['BeginDate'], "%Y-%m-%d").weekday()
78 | row['TotalSecs'] = delta
79 | return row
80 |
81 |
82 | def create_df(places):
83 | """
84 | Create a well-formatted pandas DataFrame.
85 | Each row is an event (staying at a place, or moving).
86 | :param places: list of places
87 | :return: DataFrame
88 | """
89 | df = pd.DataFrame(places)
90 | # with pd.option_context('display.max_rows', None, 'display.max_columns', None): # more options can be specified also
91 | # print(df)
92 | times = df['TimeSpan'].apply(pd.Series).rename(columns={0:'BeginTime', 1:'EndTime'})
93 | df = pd.concat([df, times], axis = 1)
94 | # df.drop(['TimeSpan', 'Email', 'Description'], axis=1, inplace=True)
95 | # df['Track'] = df['Track'].apply(lambda x:[d.split(' ') for d in x if d != 'clampToGround'])
96 | df = df.apply(convert_time, axis=1)
97 | return df.sort_values('IndexTime', ascending=False)
98 |
99 |
100 | def get_kml_file(year, month, day, cookie_content, folder, overwrite=False):
101 | """
102 | Get a KML file from your location history and save it in a chosen folder.
103 | :param year: year of the location history
104 | :param month: month of the location history (name or 1-based number)
105 | :param day: day of the location history
106 | :param cookie_content: your cookie (see README); folder: output folder path
107 | """
108 | cookies = dict(cookie=cookie_content)
109 |
110 | if type(month) == str:
111 | month = month[:3].title()
112 | cal = {v: k for k, v in enumerate(calendar.month_abbr, -1)}
113 | month_url = str(cal[month])
114 | else:
115 | month_url = str(int(month - 1))
116 |
117 | year_file = year_url = str(int(year))
118 | month_file = str(int(month_url) + 1)
119 | day_file = day_url = str(int(day))
120 |
121 | if len(month_file) == 1:
122 | month_file = '0' + month_file
123 | if len(day_file) == 1:
124 | day_file = '0' + day_file
125 |
126 | outfilepath = os.path.join(
127 | folder, f'history-{year_file}-{month_file}-{day_file}.kml')
128 | if not overwrite and os.path.isfile(outfilepath):
129 | return outfilepath
130 | print(f'Downloading to {outfilepath}...')
131 |
132 | url = 'https://www.google.com/maps/timeline/kml?authuser=0&pb=!1m8!1m3!1i{0}!2i{1}!3i{2}!2m3!1i{0}!2i{1}!3i{2}'.format(year_url, month_url, day_url)
133 | time.sleep(0.003 * np.random.randint(100))
134 | r = requests.get(url, cookies=cookies)
135 | if r.status_code == 200:
136 | with open(outfilepath, 'w') as f:
137 | f.write(r.text)
138 | else:
139 | print(r.text)
140 | return outfilepath
141 |
142 |
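The month handling in `get_kml_file` leans on `calendar.month_abbr` being a 1-indexed sequence whose first entry is the empty string, so enumerating it from -1 maps 'Jan' to 0 (the zero-based month index the timeline URL expects). A minimal sketch of just that conversion (`month_to_url_index` is a hypothetical stand-in):

```python
import calendar

def month_to_url_index(month):
    """Convert a month name or 1-based number to the 0-based index used in
    the Google Timeline KML URL (mirrors the logic in get_kml_file)."""
    if isinstance(month, str):
        # month_abbr is ['', 'Jan', ..., 'Dec']; enumerating from -1 maps
        # '' -> -1 and 'Jan' -> 0.
        abbr_to_index = {v: k for k, v in enumerate(calendar.month_abbr, -1)}
        return abbr_to_index[month[:3].title()]
    return int(month) - 1

print(month_to_url_index('April'), month_to_url_index(4))  # prints: 3 3
```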
143 | def full_df(kml_files):
144 | """
145 | Create a well-formatted DataFrame from multiple KML files.
146 | :param kml_files: paths of the KML files to concatenate
147 | """
148 | df = pd.DataFrame()
149 | print('{0} KML files (ie {0} days) to concatenate'.format(len(kml_files)))
150 | for file in kml_files:
151 | try:
152 | df = pd.concat([df, create_df(create_places_list(file))])
153 | except KeyError as e:
154 | if 'TimeSpan' not in repr(e):
155 | raise
156 | df = df.sort_values('IndexTime', ascending=False)
157 | # Need hashable elements to drop duplicates, tuples are, list aren't
158 | df = df[['Address', 'BeginDate', 'BeginTime', 'RawBeginTime',
159 | 'Category', 'Distance', 'Duration', 'EndDate', 'EndTime',
160 | 'RawEndTime', 'IndexTime', 'Name', 'WeekDay', 'TotalSecs']]
161 | for elem in df.columns:
162 | df[elem] = df[elem].apply(lambda x: tuple([tuple(p) for p in x])
163 | if type(x) is list else x)
164 | df.drop_duplicates(inplace=True)
165 | df['Distance'] = df['Distance'].apply(int) # This is in meters.
166 | return df.reset_index(drop=True)
167 |
168 |
169 | def sec_to_time(sec):
170 | h, s = divmod(sec, 3600)
171 | m, s = divmod(s, 60)
172 | return h, m, s, "%02d:%02d:%02d" % (h, m, s)
173 |
174 |
175 | if __name__ == '__main__':
176 | output_folder = '/home/kovas/photos_calendar_sync/location_data/'
177 | cookie_content = 'cookie: CONSENT=YES+US.en+20180225-07-0; OTZ=5363995_84_88_104280_84_446940; OGP=-19016807:; S=billing-ui-v3=CkgJ1xHidolfyqc74Vo5xY9UmKXxuvNn:billing-ui-v3-efe=CkgJ1xHidolfyqc74Vo5xY9UmKXxuvNn:sso=aXwL3pK84fymj5WTbi7N006L6xPsxhQq; OGPC=19016807-1:19016664-5:; SEARCH_SAMESITE=CgQItI8B; ANID=AHWqTUmYIrqVeR1ZZwGqusEktPtrg2hJ8HE3Ujyb3WuME-LBzp0Uv8ZtGJlARrBU; NID=202=CK3AzGKuN3feT05S3vtaXY9OC923eX_WJoxOYuawRZ-_A4Rzw1Y3cEpGANem40umJOlZRVmgmanECPyB4lH_5q0ESnyidOOoEbW1T1u6WPf0L1UCVaZdUNL6kxg633RKmNwwtdF4JhKDxJS29bTiuayBhSLyHUZxCntv7zMGqUFYKMwZheomJjLoKnzpyyw8a_9X4QbnHdd_vqokhEnOpimZVjlAl9Rlk9pdG6ZvZ6I6EXoP7ZTHOXV1b5SGEY7rYzQ6vRaHinRI; SID=vweP0laRgn9q3QTpsrfTYutbPETFtBuOysyr8JgWiG3uhKSyY5IMAHTacjonAsaiEBcKTw.; __Secure-3PSID=vweP0laRgn9q3QTpsrfTYutbPETFtBuOysyr8JgWiG3uhKSyL-Bk9j2uozG6zKDuQBKsbw.; HSID=AbN19q3BIUA2o6Apx; SSID=AdSyw-E-fjsebVxoI; APISID=KaGUUyepjr3XehId/AZgZu9uR7Z37HtHEL; SAPISID=JhZoG03R9faqKtuC/AFieFTHgoh9KTwKKo; __Secure-HSID=AbN19q3BIUA2o6Apx; __Secure-SSID=AdSyw-E-fjsebVxoI; __Secure-APISID=KaGUUyepjr3XehId/AZgZu9uR7Z37HtHEL; __Secure-3PAPISID=JhZoG03R9faqKtuC/AFieFTHgoh9KTwKKo; 1P_JAR=2020-04-11-20; DV=w4GNKA2kSS1JMNUx7HxUEtrYZAivFlcvJBy_9KrqZAAAAFD-_UOvWji7ugAAAFjzqyMnUlHpRwAAAA; SIDCC=AJi4QfGbxHacWpHD8vDXT6VZxGhH4WiIk3S2PZ6fd59SdRdrChWABLcX1uCtIs91Yt9gc9ik3rA'
178 | kml_files = [get_kml_file(2020, 4, i, cookie_content, output_folder)
179 | for i in [7, 8, 9, 10, 11, 12]]
180 | with pd.option_context('display.max_rows', None, 'display.max_columns', None):
181 | print(full_df(kml_files))
182 |
183 |
--------------------------------------------------------------------------------
/autojournal/record_usage_video.py:
--------------------------------------------------------------------------------
1 | from datetime import datetime
2 | import glob
3 | import os
4 | import subprocess
5 | import time
6 |
7 | # import credentials
8 | # import photos_api
9 |
10 |
11 | DATA_DIR = os.path.expanduser('~/usage_video_data/')
12 | DATE_PATTERN = '%Y-%m-%d'
13 | IDLE_TIMEOUT_MINS = 15
14 |
15 |
16 | def make_video(input_img_pattern, output_fname='output.mp4'):
17 | cmd = ['ffmpeg',
18 | '-framerate', '5',
19 | '-y', # Overwrite output file without asking.
20 | '-s', '1920x1080',
21 | '-i', input_img_pattern,
22 | '-c:v', 'libx264', '-profile:v', 'high', '-crf', '20',
23 | '-pix_fmt', 'yuv420p',
24 | output_fname]
25 | subprocess.run(cmd, cwd=DATA_DIR)
26 | return DATA_DIR + output_fname
27 |
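Since `make_video` shells out, the argument list can be sanity-checked without invoking ffmpeg; this sketch just rebuilds the command built above (`build_ffmpeg_cmd` is a hypothetical stand-in):

```python
def build_ffmpeg_cmd(input_img_pattern, output_fname='output.mp4'):
    """Hypothetical mirror of the argument list built in make_video()."""
    return ['ffmpeg',
            '-framerate', '5',        # 5 screenshots per second of video.
            '-y',                     # Overwrite output file without asking.
            '-s', '1920x1080',
            '-i', input_img_pattern,  # e.g. '%05d.png'
            '-c:v', 'libx264', '-profile:v', 'high', '-crf', '20',
            '-pix_fmt', 'yuv420p',    # Broadly compatible pixel format.
            output_fname]

print(' '.join(build_ffmpeg_cmd('%05d.png', 'day.mp4')))
```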
28 |
29 | def main():
30 | # creds = credentials.get_credentials([
31 | # # If modifying scopes, delete the file token.pickle.
32 | # 'https://www.googleapis.com/auth/photoslibrary.readonly'])
33 | # photos_api_instance = photos_api.PhotosApi(creds)
34 |
35 | if not os.path.exists(DATA_DIR):
36 | os.makedirs(DATA_DIR)
37 |
38 | i = 0
39 | active = True
40 | while True:
41 | idle_ms = int(subprocess.run(['xprintidle'],
42 | capture_output=True).stdout)
43 | if idle_ms / 1000 / 60 > IDLE_TIMEOUT_MINS:
44 | if active:
45 | i = 0
46 | imgs = glob.glob(f'{DATA_DIR}*.png')
47 | if imgs:
48 | make_video(
49 | '%05d.png',  # make_video runs with cwd=DATA_DIR, so this is relative.
50 | f'{datetime.now().strftime("%Y-%m-%d_%H:%M")}.mp4')
51 | for img in imgs:
52 | print(f'Removing {img}')
53 | os.remove(img)
54 | active = False
55 | else:
56 | active = True
57 | # Each screenshot is ~2 MB, so taking one screenshot every 10
58 | # seconds uses about 2 MB * 6/min * 60 min * 12 hours/day on the
59 | # computer = ~8.6 GB per day.
60 | subprocess.run(
61 | ['scrot',
62 | f'{DATA_DIR}{i:05}.png',
63 | '-p', # Take screenshot with mouse pointer.
64 | '-o', # Overwrite existing files (if program was restarted).
65 | ])
66 | i += 1
67 | time.sleep(10)
68 |
69 |
70 | if __name__ == '__main__':
71 | main()
72 |
--------------------------------------------------------------------------------
/autojournal/report_generator.py:
--------------------------------------------------------------------------------
1 | import os
2 | import copy
3 | from datetime import datetime
4 | from dateutil import tz
5 | import pickle
6 | from typing import Union, Iterable, List
7 |
8 | import click
9 | import plotly.graph_objects as go
10 | import plotly.express as px
11 |
12 | from . import credentials
13 | from . import drive_api
14 | from . import calendar_api
15 | from .parsers import cronometer
16 | from .parsers import cgm
17 | from .parsers import nomie
18 | from .parsers import gps
19 | from .parsers import activitywatch
20 | from .parsers import google_fit
21 | from .data_model import Event
22 |
23 |
24 | def create_altair_report(
25 | data: Iterable[Event], metrics_to_plot: List[str], html_name: str,
26 | ):
27 | pass
28 |
29 |
30 | # Based on https://plotly.com/python/range-slider/.
31 | def create_plotly_report(
32 | data: Iterable[Event], metrics_to_plot: List[str], html_name: str,
33 | metric_colors=px.colors.cyclical.mrybm * 2,
34 | # metric_colors=px.colors.sequential.Electric * 2,
35 | ):
36 | # Create figure
37 | fig = go.Figure()
38 |
39 | def get_metric_data(m):
40 | return [p for p in data if m in p.data]
41 |
42 | metrics_to_data = {
43 | m: get_metric_data(m) for m in metrics_to_plot if get_metric_data(m)
44 | }
45 |
46 | # If sleep data is included, display it as colored backgrounds
47 | if 'asleep' in metrics_to_data:
48 | asleep_periods = []
49 | for i, event in enumerate(metrics_to_data['asleep']):
50 | if event.data['asleep']:
51 | asleep_periods.append((event.timestamp, ))
52 | else:
53 | assert len(asleep_periods[-1]) == 1
54 | asleep_periods[-1] += (event.timestamp, )
55 | if len(asleep_periods[-1]) == 1:
56 | asleep_periods.pop(-1)
57 | fig.update_layout(
58 | shapes=[
59 | dict(
60 | fillcolor="rgba(63, 81, 181, 0.2)",
61 | line={"width": 0},
62 | type="rect",
63 | x0=x0,
64 | x1=x1,
65 | xref="x",
66 | y0=0,
67 | y1=1.0,
68 | yref="paper"
69 | )
70 | for x0, x1 in asleep_periods
71 | ]
72 | )
73 | # COMMENT THIS OUT TO USE BAR CHARTS TO VISUALIZE SLEEP!
74 | del metrics_to_data['asleep']
75 |
76 | axis_domain_size = 1.0 / len(metrics_to_data)
77 | y_axes = {}
78 | for i, (m, pts) in enumerate(metrics_to_data.items()):
79 | # print(m)
80 | # for pt in pts:
81 | # print(pt.data['Day'], pt.data['Food Name'], pt.data['Energy (kcal)'])
82 | y_data = [p.data[m] for p in pts]
83 | y_str = '' if i == 0 else str(i + 1)
84 |
85 | if m == 'asleep':
86 | # Only reached if the 'del' of 'asleep' above is commented out.
87 | fig.add_trace(
88 | go.Bar(x=[p.timestamp for p in pts][:-1],
89 | # base=[p.timestamp for p in pts][1:],
90 | y=[1 if p.data['asleep'] else 2 for p in pts],
91 | yaxis=f'y{y_str}',
92 | marker=dict(color=metric_colors[i]),
93 | showlegend=False,
94 | name=m,
95 | hovertemplate=''))
96 | else:
97 | fig.add_trace(
98 | go.Scatter(
99 | x=[p.timestamp for p in pts],
100 | y=y_data,
101 | name=m,
102 | text=[p.description for p in pts],
103 | yaxis=f'y{y_str}',
104 | marker=dict(color=metric_colors[i], size=8),
105 | hoverinfo='name+x+text',
106 | # https://plotly.com/python/line-charts/
107 | line=dict(width=1.5, shape='hv'),
108 | mode='lines+markers',
109 | showlegend=False,
110 | ))
111 | y_axes[f'yaxis{y_str}'] = dict(
112 | anchor='x',
113 | autorange=True,
114 | domain=[axis_domain_size * i, axis_domain_size * (i + 1)],
115 | linecolor=metric_colors[i],
116 | mirror=True,
117 | range=[min(y_data), max(y_data)],
118 | showline=True,
119 | side='right',
120 | tickfont={'color': metric_colors[i]},
121 | tickmode='auto',
122 | ticks='',
123 | title=m,
124 | titlefont={'color': metric_colors[i]},
125 | type='linear',
126 | zeroline=False)
127 |
128 | # style all the traces
129 | # fig.update_traces(
130 | # hoverinfo='name+x+text',
131 | # # https://plotly.com/python/line-charts/
132 | # line=dict(width=1.5, shape='hv'),
133 | # marker={'size': 8},
134 | # mode='lines+markers',
135 | # showlegend=False)
136 |
137 | # Update axes
138 | fig.update_layout(xaxis=dict(
139 | autorange=True,
140 | range=[data[0].timestamp, data[-1].timestamp],
141 | rangeslider=dict(
142 | autorange=True,
143 | range=[data[0].timestamp, data[-1].timestamp],
144 | ),
145 | type='date'),
146 | **y_axes)
147 |
148 | # Update layout
149 | fig.update_layout(
150 | title='Glucose monitoring data',
151 | legend_title='Legend',
152 | dragmode='zoom',
153 | hovermode='closest',
154 | legend=dict(traceorder='reversed'),
155 | height=2000,
156 | template='plotly_white',
157 | margin=dict(t=50, b=50),
158 | )
159 |
160 | with open('autojournal/image_hover.js', 'r') as f:
161 | fig.write_html(html_name, post_script=''.join(f.readlines()))
162 |
163 |
164 | def parse_date(s: Union[str, datetime]) -> datetime:
165 | if isinstance(s, datetime):
166 | return s
167 | for fmt in ('%Y-%m-%d', '%d.%m.%Y', '%m/%d/%Y'):
168 | try:
169 | return datetime.strptime(s, fmt).replace(tzinfo=DEFAULT_TIMEZONE)
170 | except ValueError:
171 | pass
172 | raise ValueError('no valid date format found')
173 |
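`parse_date` is a try-each-format-in-order parser; the same pattern works standalone (timezone handling omitted here, and `parse_flexible_date` is a hypothetical stand-in):

```python
from datetime import datetime

def parse_flexible_date(s):
    """Try several date formats in order, like parse_date() above."""
    for fmt in ('%Y-%m-%d', '%d.%m.%Y', '%m/%d/%Y'):
        try:
            return datetime.strptime(s, fmt)
        except ValueError:
            pass  # Wrong format; try the next one.
    raise ValueError('no valid date format found')

print(parse_flexible_date('21.04.2021'))  # prints: 2021-04-21 00:00:00
```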
174 |
175 | DEFAULT_TIMEZONE = tz.gettz('PST')
176 |
177 |
178 | @click.command()
179 | @click.option('--start_date', default='2000-01-01')
180 | @click.option('--end_date',
181 | default=datetime.now().replace(tzinfo=DEFAULT_TIMEZONE))
182 | @click.option('--use_cache/--no_cache', default=False)
183 | def main(start_date: str, end_date: str, use_cache: bool):
184 | start_date, end_date = parse_date(start_date), parse_date(end_date)
185 |
186 | if use_cache:
187 | with open('report_data_cache.pickle', 'rb') as f:
188 | event_data = pickle.load(f)
189 | else:
190 | creds = credentials.get_credentials([
191 | # If modifying scopes, delete the file token.pickle.
192 | 'https://www.googleapis.com/auth/drive.readonly',
193 | 'https://www.googleapis.com/auth/calendar',
194 | 'https://www.googleapis.com/auth/photoslibrary.readonly'
195 | ])
196 | drive_api_instance = drive_api.DriveApi(creds)
197 | cal_api_instance = calendar_api.CalendarApi(creds)
198 | print('Done setting up google APIs')
199 |
200 | event_data = []
201 | spreadsheet_data = {}
202 | print('Getting sleep data...')
203 | sleep_data = cal_api_instance.get_events(
204 | cal_api_instance.get_calendar_id('Sleep'))
205 | for e in sleep_data:
206 | event_data.append(Event(
207 | summary='',
208 | description='',
209 | timestamp=datetime.fromisoformat(e['start']['dateTime']),
210 | data={'description': e.get('description', ''), 'asleep': 1},
211 | ))
212 | event_data.append(Event(
213 | summary='',
214 | description='',
215 | timestamp=datetime.fromisoformat(e['end']['dateTime']),
216 | data={'description': e.get('description', ''), 'asleep': 0},
217 | ))
218 | print('Getting Cronometer data...')
219 | spreadsheet_data.update(drive_api_instance.read_all_spreadsheet_data(
220 | 'cronometer'))
221 | event_data += cronometer.parse_nutrition(
222 | spreadsheet_data, daily_cumulative=True)
223 | print('Getting Glucose data...')
224 | spreadsheet_data.update(drive_api_instance.read_all_spreadsheet_data(
225 | '4-21-2021-continuous-glucose-monitoring'))
226 | event_data += cgm.parse_cgm(spreadsheet_data)
227 | print('Getting workout data...')
228 | fit_data = google_fit.parse_sessions(
229 | drive_api_instance, 'google-fit-sessions')
230 | for e in fit_data:
231 | start_event = copy.deepcopy(e)
232 | event_data.append(start_event)
233 | end_event = copy.deepcopy(e)
234 | end_event.data['Burned Calories'] = 0.0
235 | end_event.timestamp = e.timestamp + e.duration
236 | event_data.append(end_event)
237 | # event_data += nomie.parse_nomie(spreadsheet_data)
238 | # spreadsheet_data.update(drive_api_instance.read_all_spreadsheet_data(
239 | # 'GPSLogger for Android'))
240 | # event_data += gps.parse_gps(spreadsheet_data)
241 | # event_data += activitywatch.get_events(
242 | # os.path.expanduser(
243 | # '~/.local/share/activitywatch/aw-server/peewee-sqlite.v2.db'))
244 | # for file_lines in drive_api_instance.read_files(
245 | # 'activitywatch-phone-data').values():
246 | # event_data += activitywatch.get_events_from_json('\n'.join(file_lines))
247 |
248 | # If events don't have a timezone, assume DEFAULT_TIMEZONE.
249 | # Then, shift all times to the DEFAULT_TIMEZONE.
250 | print('Fixing timestamps...')
251 | for e in event_data:
252 | if e.timestamp.tzinfo is None:
253 | e.timestamp = e.timestamp.replace(tzinfo=DEFAULT_TIMEZONE)
254 | e.timestamp = e.timestamp.astimezone(tz=DEFAULT_TIMEZONE)
255 |
256 | print('Writing cache file...')
257 | with open('report_data_cache.pickle', 'wb') as f:
258 | pickle.dump(event_data, f)
259 |
260 | # Filter events by date
261 | print('Filtering events to specified date range...')
262 | event_data = [e for e in event_data if start_date < e.timestamp < end_date]
263 | event_data = sorted(event_data, key=lambda e: e.timestamp)
264 |
265 | print('Making plot...')
266 | create_plotly_report(event_data, [
267 | 'Carbs (g)', 'Sugars (g)', 'Fat (g)', 'Fiber (g)',
268 | 'Monounsaturated (g)', 'Polyunsaturated (g)', 'Saturated (g)',
269 | 'Sodium (mg)',
270 | 'Weight (lbs)', 'Burned Calories',
271 | 'Energy (kcal)', 'asleep', 'Historic Glucose mg/dL',
272 | 'weight', 'speed', 'using_laptop', 'using_phone',
273 | ], 'out.html')
274 |
275 | # TODO Rank activities by time spent in them here.
276 |
277 |
278 | if __name__ == '__main__':
279 | main()
280 |
--------------------------------------------------------------------------------
/autojournal/selfspy_api.py:
--------------------------------------------------------------------------------
1 | import os
2 | import copy
3 | import re
4 | from typing import List, Any
5 | from datetime import datetime, timedelta
6 | from dateutil import tz
7 | from collections import namedtuple
8 | from dataclasses import dataclass
9 | from collections import defaultdict
10 | from functools import reduce
11 | from pprint import pprint
12 | import psutil
13 |
14 | from sortedcontainers import SortedList
15 | from selfspy.modules import models, config as cfg
16 | from selfspy.stats import create_times
17 |
18 | from . import calendar_api
19 | from . import utils
20 |
21 |
22 | ActionTiming = namedtuple('ActionTiming', [
23 | 'time', # type datetime
24 | 'num_moves', # type int; zero if this action is a keystroke, nonzero if
25 |              # it is a click. Selfspy stores the amount of mouse movement
26 |              # before a click in the same row, so we carry that
27 |              # information through here.
28 | ])
29 |
30 |
31 | @dataclass
32 | class WindowSession:
33 | """Describes the time spent in a single window.
34 |
35 | Every time a window is switched to, another instance of this is created.
36 | """
37 | # Title of the window.
38 | title: str = None
39 | # Name of the program that this window is an instance of.
40 | program_name: str = None
41 | # Timestamp for each action that happened while in this window. There is
42 | # always one "action" when the window is moved to. It's safe to say that
43 | # the time in a window is the last time in this list minus the first time.
44 | # TODO might be a bug here where the actions do not quite give the time in
45 | # the window accurately. For instance, if a key is pressed to go to a
46 | # window, then no actions are taken for a while, it might be that the
47 | # window session for the window "starts" when the first key is pressed,
48 | # which is inaccurate.
49 | action_timings: 'Any' = None # SortedList[ActionTiming]
50 |
51 | def get_total_time(self):
52 | return self.action_timings[-1].time - self.action_timings[0].time
53 |
54 | def get_total_actions_by_type(self):
55 | return dict(
56 | keystrokes=len([a for a in self.action_timings
57 | if a.num_moves == 0]),
58 | clicks=len([a for a in self.action_timings if a.num_moves != 0]),
59 | mouse_moves=sum([a.num_moves for a in self.action_timings]),
60 | )
61 |
62 | def summarize(self):
63 | actions = ', '.join([
64 | f'{v} {k}' for k, v in self.get_total_actions_by_type().items()])
65 | total_mins = round(self.get_total_time().total_seconds() / 60, 2)
66 | return f'{total_mins}m : {self.title} --- {actions}'
67 |
68 |
69 | def get_total_time_of_sessions(sessions: List[WindowSession]):
70 | return sum([s.get_total_time().total_seconds() for s in sessions])
71 |
72 |
73 | def get_total_time_of_sessions_str(sessions: List[WindowSession]):
74 | return utils.strfdelta(
75 | timedelta(seconds=get_total_time_of_sessions(sessions)))
76 |
77 |
78 | def remove_urls(s):
79 | return re.sub(r'http\S+', '', s)
80 |
81 |
82 | def remove_redundancy(string):
83 | redundants = [
84 | ' - Google Chrome',
85 | '/google/src/cloud/kovas/chamber_regression_replication/',
86 | ]
87 | return reduce(lambda s, r: s.replace(r, ''), redundants, string)
88 |
89 |
90 | def get_session_group_description(sessions: List[WindowSession], long=True):
91 | unique_sessions = defaultdict(list)
92 | for s in sessions:
93 | unique_sessions[s.title].append(s)
94 | # Sort unique_sessions by the total time of each group of sessions, longest
95 | # first.
96 | unique_sessions = sorted(
97 | unique_sessions.items(),
98 | key=lambda t: get_total_time_of_sessions(t[1]))[::-1]
99 |
100 | total_actions = reduce(
101 | lambda d1, d2: {k: d1[k] + d2[k] for k in d1.keys()},
102 | [s.get_total_actions_by_type() for s in sessions])
103 | action_summary = ''
104 | for k, v in total_actions.items():
105 | action_summary += f'{v} {k} ({round(v / max(get_total_time_of_sessions(sessions) / 60, 0.01), 2)} {k} per minute)\n'
106 | # TODO rank windows both by time used AND by keystrokes/mouse actions
107 | # within.
108 | if long:
109 | desc = f"""Used computer for {get_total_time_of_sessions_str(sessions)}.
110 |
111 | {action_summary}
112 | Windows used ({len(sessions)} total switches):
113 |
114 | """
115 | for title, ss in unique_sessions:
116 | title = remove_redundancy(title)
117 | actions = reduce(
118 | lambda d1, d2: {k: d1[k] + d2[k] for k in d1.keys()},
119 | [s.get_total_actions_by_type() for s in ss])
120 | actions_str = ', '.join(
121 | [f'{v} {k}' for k, v in actions.items()]
122 | ).replace('keystrokes', 'k').replace('clicks', 'c').replace(
123 | 'mouse_moves', 'm') # Save some characters
124 | row = (f'{get_total_time_of_sessions_str(ss)} : {title} '
125 | f'--- {actions_str}\n')
126 | if (len(desc) + len(row)
127 | > calendar_api.EVENT_DESCRIPTION_LENGTH_LIMIT):
128 | break
129 | desc += row
130 | return desc
131 | else:
132 | # Get events that make up majority of time in session
133 | top_session_titles = []
134 | percent_left = 100
135 | total_secs = get_total_time_of_sessions(sessions)
136 | # Round up to at least 0.01 to avoid div by zero errors.
137 | if total_secs == 0:
138 | total_secs = 0.01
139 | for title, ss in unique_sessions:
140 | top_session_titles.append(remove_urls(remove_redundancy(
141 | title.replace('\n', ' '))))
142 | percent_left -= (get_total_time_of_sessions(ss) / total_secs) * 100
143 | if percent_left < 25:
144 | break
145 | kpm = round(total_actions['keystrokes'] / max(total_secs / 60, 0.01), 2)
146 | cpm = round(total_actions['clicks'] / max(total_secs / 60, 0.01), 2)
147 | try:
148 | percent_active = round(
149 | 100 * total_secs / (
150 | sessions[-1].action_timings[-1].time
151 | - sessions[0].action_timings[0].time).total_seconds(),
152 | 1)
153 | except ZeroDivisionError:
154 | percent_active = '0.0'
155 | return f"""{get_total_time_of_sessions_str(sessions)} Active
156 | ({percent_active}%)
157 | -- {' | '.join(top_session_titles)[:50]}
158 | -- {kpm}kpm, {cpm}cpm.""".replace('\n', '')
159 |
160 |
161 | def get_window_sessions(db_name):
162 | # Sessions sorted by the first action that occurred in them.
163 | window_sessions = SortedList(key=lambda ws: ws.action_timings[0])
164 |
165 | # Query "Keys" table for action_timings and basic window info.
166 | session = models.initialize(db_name)()
167 | for keys_row in session.query(models.Keys).order_by(models.Keys.id).all():
168 | window_sessions.add(
169 | WindowSession(
170 | title=str(keys_row.window.title),
171 | program_name=keys_row.process.name,
172 | action_timings=SortedList(
173 | [ActionTiming(
174 | time=datetime.fromtimestamp(t),
175 | num_moves=0)
176 | for t in create_times(keys_row)])))
177 |
178 | # Query "Clicks" table to fill out mouse data in window_sessions.
179 | for click_row in session.query(
180 | models.Click).order_by(models.Click.id).all():
181 | click_row_tuple = ActionTiming(
182 | time=click_row.created_at,
183 | num_moves=click_row.nrmoves)
184 | idx = window_sessions.bisect_left(
185 | WindowSession(action_timings=[click_row_tuple])) - 1
186 | window_sessions[idx].action_timings.add(click_row_tuple)
187 |
188 | return window_sessions
189 |
190 |
191 | # This function groups sessions, assuming that they do not overlap
192 | # (overlapping sessions shouldn't normally happen).
193 | def group_sessions(window_sessions, group_separation_time):
194 | # We assume the window_sessions are sorted by their first action timing
195 | groups = [
196 | [window_sessions[0]],
197 | ]
198 | cur_session = window_sessions[0]
199 | for next_session in window_sessions[1:]:
200 | cur_group_end = (cur_session.action_timings[-1].time
201 | + group_separation_time)
202 | if next_session.action_timings[0].time > cur_group_end:
203 | groups.append([])
204 | groups[-1].append(next_session)
205 | cur_session = next_session
206 | return groups
207 |
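Both `group_sessions` and `utils.split_on_gaps` (which `get_events_from_sessions` actually uses) implement the same idea: start a new chunk whenever the gap to the previous item exceeds a threshold. A minimal numeric sketch, assuming sorted input (`group_by_gaps` is a hypothetical stand-in):

```python
def group_by_gaps(values, max_gap):
    """Split a sorted sequence into runs separated by gaps > max_gap."""
    groups = [[values[0]]]
    for v in values[1:]:
        if v - groups[-1][-1] > max_gap:
            groups.append([])  # Gap too large: start a new group.
        groups[-1].append(v)
    return groups

print(group_by_gaps([1, 2, 3, 10, 11, 30], 5))
# [[1, 2, 3], [10, 11], [30]]
```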
208 |
209 | def get_events_from_sessions(window_sessions, idle_time,
210 | group_separation_time):
211 | # Split up long window sessions with inactive periods into several
212 | # sessions, each containing activity (clicks/keystrokes).
213 |
214 | # Sessions can sometimes overlap (not sure why exactly...), so we at
215 | # least enforce that they are sorted by the first event that happens
216 | # in each of them.
217 | active_sessions = SortedList(key=lambda ws: ws.action_timings[0])
218 | for window_session in window_sessions:
219 | new_timings = utils.split_on_gaps(
220 | window_session.action_timings, idle_time, key=lambda t: t.time)
221 | for timings in new_timings:
222 | active_sessions.add(
223 | WindowSession(
224 | title=window_session.title,
225 | program_name=window_session.program_name,
226 | action_timings=SortedList(timings, key=lambda t: t.time)))
227 |
228 | # prev = None
229 | # for s in active_sessions:
230 | # if prev and prev.action_timings[0].time > s.action_timings[0].time:
231 | # print(prev.title)
232 | # for t in prev.action_timings:
233 | # print(t)
234 | # print(s.title)
235 | # for t in s.action_timings:
236 | # print(t)
237 | # print()
238 | # raise Exception()
239 | # prev = s
240 | # raise Exception('nope')
241 |
242 | # Group window sessions into chunks, where each chunk contains a continuous
243 | # period of activity, with no inactivity longer than idle_time.
244 | # grouped_sessions = group_sessions(active_sessions, group_separation_time)
245 | grouped_sessions = utils.split_on_gaps(
246 | active_sessions, group_separation_time,
247 | key=lambda s: s.action_timings[0].time,
248 | last_key=lambda s: s.action_timings[-1].time)
249 |
250 | return [make_cal_event_from_session_group(sessions)
251 | for sessions in grouped_sessions]
252 |
253 |
254 | def make_cal_event_from_session_group(sessions: List[WindowSession]):
255 | if (sessions[0].action_timings[0].time
256 | > sessions[-1].action_timings[-1].time):
257 | for s in sessions:
258 | print(s.title)
259 | for t in s.action_timings:
260 | print(t)
261 | raise Exception()
262 | return calendar_api.CalendarEvent(
263 | start=dict(
264 | dateTime=sessions[0].action_timings[0].time.replace(
265 | tzinfo=tz.gettz('PST')).isoformat(),
266 | timeZone='America/Los_Angeles'),
267 | end=dict(
268 | dateTime=sessions[-1].action_timings[-1].time.replace(
269 | tzinfo=tz.gettz('PST')).isoformat(),
270 | timeZone='America/Los_Angeles'),
271 | summary=get_session_group_description(sessions, long=False),
272 | description=get_session_group_description(sessions, long=True),
273 | )
274 |
275 |
276 | def get_selfspy_usage_events(
277 | db_name=os.path.expanduser(os.path.join(cfg.DATA_DIR, cfg.DBNAME)),
278 | session_limit=None,
279 | idle_seconds=60 * 3,
280 | event_separation_seconds=60 * 20,
281 | ) -> List[calendar_api.CalendarEvent]:
282 | # db_name = 'test_selfspy_db/selfspy.sqlite'
283 | process = psutil.Process(os.getpid())
284 | print('mem used: ', process.memory_info().rss / 10**6, 'MB')
285 | print('db ', db_name)
286 | window_sessions = get_window_sessions(db_name)
287 | print('mem used: ', process.memory_info().rss / 10**6, 'MB')
288 | if session_limit:
289 | window_sessions = window_sessions[-session_limit:]
290 | print('mem used: ', process.memory_info().rss / 10**6, 'MB')
291 | events = get_events_from_sessions(
292 | window_sessions, timedelta(seconds=idle_seconds),
293 | timedelta(seconds=event_separation_seconds))
294 | print('mem used: ', process.memory_info().rss / 10**6, 'MB')
295 | return events
296 |
297 |
298 | if __name__ == '__main__':
299 | pprint(get_selfspy_usage_events('desktop_selfspy.sqlite', None))
300 |
--------------------------------------------------------------------------------
/autojournal/utils.py:
--------------------------------------------------------------------------------
1 | import time
2 | from typing import Dict
3 | from datetime import datetime, timedelta
4 | from dateutil import tz
5 | import socket
6 |
7 |
8 | def retry_on_error(
9 | function,
10 | errors=(BrokenPipeError, ConnectionResetError, socket.timeout),
11 | sleep_time_secs=3,
12 | num_retries=5):
13 | last_error = None
14 | for retry in range(num_retries):
15 | try:
16 | return function()
17 | except errors as e:
18 | last_error = e
19 | print(
20 | f'Hit {e}, sleeping for {sleep_time_secs}s, then trying again.. '
21 | f'(attempt {retry + 1}/{num_retries})')
22 | time.sleep(sleep_time_secs)
23 | # (No else/break needed: a successful call returns directly from
24 | # the try block above.)
25 | raise last_error
26 |
27 |
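Using `retry_on_error` amounts to wrapping the flaky call in a zero-argument callable. A self-contained sketch with a stand-in mirroring the helper above (logging dropped and sleep shortened to zero) and a deliberately flaky function:

```python
import socket
import time

def retry_on_error(function,
                   errors=(BrokenPipeError, ConnectionResetError,
                           socket.timeout),
                   sleep_time_secs=0, num_retries=5):
    """Stand-in mirroring utils.retry_on_error (no logging, zero sleep)."""
    last_error = None
    for _ in range(num_retries):
        try:
            return function()
        except errors as e:
            last_error = e
            time.sleep(sleep_time_secs)
    raise last_error

calls = {'n': 0}

def flaky():
    # Fails twice with a transient error, then succeeds.
    calls['n'] += 1
    if calls['n'] < 3:
        raise BrokenPipeError('transient failure')
    return 'ok'

print(retry_on_error(flaky))  # prints: ok
```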
28 | def timestamp_ms_to_event_time(
29 | timestamp_ms: int,
30 | timezone_name='America/Los_Angeles',
31 | ) -> Dict[str, str]:
32 | t = datetime.fromtimestamp(timestamp_ms / 1000).replace(
33 | tzinfo=tz.gettz(timezone_name))
34 | return dict(
35 | dateTime=t.isoformat(),
36 | timeZone=timezone_name,
37 | )
38 |
39 |
40 | def utc_to_timezone(
41 | utc_time: str,
42 | timezone_name: str = 'America/Los_Angeles',
43 | additional_offset_mins: int = 0,
44 | round_seconds: bool = False,
45 | ) -> Dict[str, str]:
46 | """Converts utc time string (e.g. from Google Photos timestamp) to pst
47 | string with timezone name (e.g. for Google Calendar event).
48 |
49 | >>> utc_to_timezone('2020-01-29T01:57:06Z', 'America/Los_Angeles', \
50 | round_seconds=False)
51 | {'dateTime': '2020-01-28T17:57:06-08:00', 'timeZone': 'America/Los_Angeles'}
52 | >>> utc_to_timezone('2020-01-29T01:57:06Z', 'America/Los_Angeles', 4, \
53 | round_seconds=False)
54 | {'dateTime': '2020-01-28T18:01:06-08:00', 'timeZone': 'America/Los_Angeles'}
55 | >>> utc_to_timezone('2020-01-29T01:57:06Z', 'America/Los_Angeles', \
56 | round_seconds=True)
57 | {'dateTime': '2020-01-28T17:57:00-08:00', 'timeZone': 'America/Los_Angeles'}
58 | """
59 | utc = datetime.fromisoformat(utc_time.rstrip('Z')).replace(
60 | tzinfo=tz.gettz('UTC'))
61 |   local_time = utc.astimezone(tz.gettz(timezone_name))
62 |   local_time += timedelta(minutes=additional_offset_mins)
63 |   if round_seconds:
64 |     local_time = local_time.replace(second=0)
65 |   return dict(
66 |       dateTime=local_time.isoformat(),
67 |       timeZone=timezone_name,
68 | )
69 |
70 |
71 | def is_subset(ref: dict, query: dict) -> bool:
72 | """Checks to see if the query dict is a subset of the ref dict.
73 |
74 | Taken from
75 | https://stackoverflow.com/questions/49419486/recursive-function-to-check-dictionary-is-a-subset-of-another-dictionary
76 |
77 | """
78 | for key, value in query.items():
79 | if key not in ref:
80 | return False
81 | if isinstance(value, dict):
82 | if not is_subset(ref[key], value):
83 | return False
84 | elif isinstance(value, list):
85 | if not set(value) <= set(ref[key]):
86 | return False
87 | elif isinstance(value, set):
88 | if not value <= ref[key]:
89 | return False
90 | else:
91 | if not value == ref[key]:
92 | return False
93 | return True
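A short sketch of `is_subset` in use. The function is copied from above; the calendar-event-like dicts are made-up example data, not real API responses:

```python
def is_subset(ref: dict, query: dict) -> bool:
  """Checks whether query is a (recursive) subset of ref."""
  for key, value in query.items():
    if key not in ref:
      return False
    if isinstance(value, dict):
      if not is_subset(ref[key], value):
        return False
    elif isinstance(value, list):
      if not set(value) <= set(ref[key]):
        return False
    elif isinstance(value, set):
      if not value <= ref[key]:
        return False
    else:
      if not value == ref[key]:
        return False
  return True


# Hypothetical event body; values are illustrative only.
event = {
    'summary': 'Sleep',
    'extendedProperties': {'private': {'source': 'fitbit'}},
}
# Nested dicts are compared recursively, so a partial query matches.
matches = is_subset(event, {'extendedProperties': {'private': {'source': 'fitbit'}}})
misses = is_subset(event, {'summary': 'Walk'})
```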
94 |
95 |
96 | def split_on_gaps(values, threshold, key=lambda o: o, last_key=None):
97 |   """Splits values into runs at gaps between consecutive keys larger than threshold; input should be sorted.
98 |
99 | >>> split_on_gaps([1,2,3,4,8,9,10], 2)
100 | [[1, 2, 3, 4], [8, 9, 10]]
101 | """
102 | if last_key is None:
103 | last_key = key
104 | last_val = None
105 | split_points = []
106 | for i, orig_value in enumerate(values):
107 | value = key(orig_value)
108 | if last_val is not None and (value - last_val) > threshold:
109 | split_points.append(i)
110 | last_val = last_key(orig_value)
111 | if split_points:
112 | split_points.insert(0, 0)
113 | split_points.append(len(values))
114 | return [
115 | values[split_points[i]:split_points[i + 1]]
116 | for i in range(len(split_points) - 1)]
117 | else:
118 | return [values]
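A sketch of `split_on_gaps` with a `key` function, as one might use it to group timestamped samples into sessions. The function is copied from above; the sample data and the 300-second gap are illustrative assumptions:

```python
def split_on_gaps(values, threshold, key=lambda o: o, last_key=None):
  """Splits sorted values at gaps between consecutive keys larger than threshold."""
  if last_key is None:
    last_key = key
  last_val = None
  split_points = []
  for i, orig_value in enumerate(values):
    value = key(orig_value)
    if last_val is not None and (value - last_val) > threshold:
      split_points.append(i)
    last_val = last_key(orig_value)
  if split_points:
    split_points.insert(0, 0)
    split_points.append(len(values))
    return [
        values[split_points[i]:split_points[i + 1]]
        for i in range(len(split_points) - 1)]
  return [values]


# Hypothetical samples with second-resolution timestamps; a gap of more
# than 300s starts a new session.
samples = [{'ts': 0}, {'ts': 60}, {'ts': 120}, {'ts': 1000}, {'ts': 1060}]
sessions = split_on_gaps(samples, 300, key=lambda s: s['ts'])
# sessions -> [[{'ts': 0}, {'ts': 60}, {'ts': 120}], [{'ts': 1000}, {'ts': 1060}]]
```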
119 |
120 |
121 | def strfdelta(tdelta):
122 | """Creates formatted string for timedelta objects."""
123 | days = tdelta.days
124 | hours, rem = divmod(tdelta.seconds, 3600)
125 | minutes, seconds = divmod(rem, 60)
126 | s = ''
127 | if days:
128 | s += f'{days}d '
129 | if hours:
130 | s += f'{hours}h '
131 | if minutes:
132 | s += f'{minutes}m '
133 | s += f'{seconds}s'
134 | return s
135 |
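A sketch of `strfdelta` output for a concrete duration. The function is copied from above; note that zero-valued components other than seconds are omitted from the result:

```python
from datetime import timedelta


def strfdelta(tdelta):
  """Formats a timedelta as e.g. '1d 2h 5s', omitting zero components (except seconds)."""
  days = tdelta.days
  hours, rem = divmod(tdelta.seconds, 3600)
  minutes, seconds = divmod(rem, 60)
  s = ''
  if days:
    s += f'{days}d '
  if hours:
    s += f'{hours}h '
  if minutes:
    s += f'{minutes}m '
  s += f'{seconds}s'
  return s


label = strfdelta(timedelta(days=1, hours=2, seconds=5))  # '1d 2h 5s'
```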
--------------------------------------------------------------------------------
/example_computer_usage_calendar.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kovasap/autojournal/55309333457eee4ece658d6961cebeceee40308f/example_computer_usage_calendar.png
--------------------------------------------------------------------------------
/example_location_calendar.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kovasap/autojournal/55309333457eee4ece658d6961cebeceee40308f/example_location_calendar.png
--------------------------------------------------------------------------------
/poetry.lock:
--------------------------------------------------------------------------------
1 | # This file is automatically @generated by Poetry 1.4.2 and should not be changed by hand.
2 |
3 | [[package]]
4 | name = "altair"
5 | version = "5.0.0"
6 | description = "Vega-Altair: A declarative statistical visualization library for Python."
7 | category = "main"
8 | optional = false
9 | python-versions = ">=3.7"
10 | files = [
11 | {file = "altair-5.0.0-py3-none-any.whl", hash = "sha256:e7deed321f61a3ec752186ae96e97b44a1353de142928c1934fb211e9f0bfe9e"},
12 | {file = "altair-5.0.0.tar.gz", hash = "sha256:394c3d8be96f9cc90e15a0eee3634cc5b6f19e470fd2045759892623bd9a3fb2"},
13 | ]
14 |
15 | [package.dependencies]
16 | jinja2 = "*"
17 | jsonschema = ">=3.0"
18 | numpy = "*"
19 | pandas = ">=0.18"
20 | toolz = "*"
21 | typing-extensions = {version = ">=4.0.1", markers = "python_version < \"3.11\""}
22 |
23 | [package.extras]
24 | dev = ["black (<24)", "hatch", "ipython", "m2r", "mypy", "pandas-stubs", "pytest", "pytest-cov", "ruff", "types-jsonschema", "types-setuptools", "vega-datasets", "vl-convert-python"]
25 | doc = ["docutils", "geopandas", "jinja2", "myst-parser", "numpydoc", "pillow", "pydata-sphinx-theme", "sphinx", "sphinx-copybutton", "sphinx-design"]
26 |
27 | [[package]]
28 | name = "attrs"
29 | version = "23.1.0"
30 | description = "Classes Without Boilerplate"
31 | category = "main"
32 | optional = false
33 | python-versions = ">=3.7"
34 | files = [
35 | {file = "attrs-23.1.0-py3-none-any.whl", hash = "sha256:1f28b4522cdc2fb4256ac1a020c78acf9cba2c6b461ccd2c126f3aa8e8335d04"},
36 | {file = "attrs-23.1.0.tar.gz", hash = "sha256:6279836d581513a26f1bf235f9acd333bc9115683f14f7e8fae46c98fc50e015"},
37 | ]
38 |
39 | [package.extras]
40 | cov = ["attrs[tests]", "coverage[toml] (>=5.3)"]
41 | dev = ["attrs[docs,tests]", "pre-commit"]
42 | docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-towncrier", "towncrier", "zope-interface"]
43 | tests = ["attrs[tests-no-zope]", "zope-interface"]
44 | tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
45 |
46 | [[package]]
47 | name = "beautifulsoup4"
48 | version = "4.12.2"
49 | description = "Screen-scraping library"
50 | category = "main"
51 | optional = false
52 | python-versions = ">=3.6.0"
53 | files = [
54 | {file = "beautifulsoup4-4.12.2-py3-none-any.whl", hash = "sha256:bd2520ca0d9d7d12694a53d44ac482d181b4ec1888909b035a3dbf40d0f57d4a"},
55 | {file = "beautifulsoup4-4.12.2.tar.gz", hash = "sha256:492bbc69dca35d12daac71c4db1bfff0c876c00ef4a2ffacce226d4638eb72da"},
56 | ]
57 |
58 | [package.dependencies]
59 | soupsieve = ">1.2"
60 |
61 | [package.extras]
62 | html5lib = ["html5lib"]
63 | lxml = ["lxml"]
64 |
65 | [[package]]
66 | name = "bs4"
67 | version = "0.0.1"
68 | description = "Dummy package for Beautiful Soup"
69 | category = "main"
70 | optional = false
71 | python-versions = "*"
72 | files = [
73 | {file = "bs4-0.0.1.tar.gz", hash = "sha256:36ecea1fd7cc5c0c6e4a1ff075df26d50da647b75376626cc186e2212886dd3a"},
74 | ]
75 |
76 | [package.dependencies]
77 | beautifulsoup4 = "*"
78 |
79 | [[package]]
80 | name = "cachetools"
81 | version = "5.3.0"
82 | description = "Extensible memoizing collections and decorators"
83 | category = "main"
84 | optional = false
85 | python-versions = "~=3.7"
86 | files = [
87 | {file = "cachetools-5.3.0-py3-none-any.whl", hash = "sha256:429e1a1e845c008ea6c85aa35d4b98b65d6a9763eeef3e37e92728a12d1de9d4"},
88 | {file = "cachetools-5.3.0.tar.gz", hash = "sha256:13dfddc7b8df938c21a940dfa6557ce6e94a2f1cdfa58eb90c805721d58f2c14"},
89 | ]
90 |
91 | [[package]]
92 | name = "certifi"
93 | version = "2023.5.7"
94 | description = "Python package for providing Mozilla's CA Bundle."
95 | category = "main"
96 | optional = false
97 | python-versions = ">=3.6"
98 | files = [
99 | {file = "certifi-2023.5.7-py3-none-any.whl", hash = "sha256:c6c2e98f5c7869efca1f8916fed228dd91539f9f1b444c314c06eef02980c716"},
100 | {file = "certifi-2023.5.7.tar.gz", hash = "sha256:0f0d56dc5a6ad56fd4ba36484d6cc34451e1c6548c61daad8c320169f91eddc7"},
101 | ]
102 |
103 | [[package]]
104 | name = "cffi"
105 | version = "1.15.1"
106 | description = "Foreign Function Interface for Python calling C code."
107 | category = "main"
108 | optional = false
109 | python-versions = "*"
110 | files = [
111 | {file = "cffi-1.15.1-cp27-cp27m-macosx_10_9_x86_64.whl", hash = "sha256:a66d3508133af6e8548451b25058d5812812ec3798c886bf38ed24a98216fab2"},
112 | {file = "cffi-1.15.1-cp27-cp27m-manylinux1_i686.whl", hash = "sha256:470c103ae716238bbe698d67ad020e1db9d9dba34fa5a899b5e21577e6d52ed2"},
113 | {file = "cffi-1.15.1-cp27-cp27m-manylinux1_x86_64.whl", hash = "sha256:9ad5db27f9cabae298d151c85cf2bad1d359a1b9c686a275df03385758e2f914"},
114 | {file = "cffi-1.15.1-cp27-cp27m-win32.whl", hash = "sha256:b3bbeb01c2b273cca1e1e0c5df57f12dce9a4dd331b4fa1635b8bec26350bde3"},
115 | {file = "cffi-1.15.1-cp27-cp27m-win_amd64.whl", hash = "sha256:e00b098126fd45523dd056d2efba6c5a63b71ffe9f2bbe1a4fe1716e1d0c331e"},
116 | {file = "cffi-1.15.1-cp27-cp27mu-manylinux1_i686.whl", hash = "sha256:d61f4695e6c866a23a21acab0509af1cdfd2c013cf256bbf5b6b5e2695827162"},
117 | {file = "cffi-1.15.1-cp27-cp27mu-manylinux1_x86_64.whl", hash = "sha256:ed9cb427ba5504c1dc15ede7d516b84757c3e3d7868ccc85121d9310d27eed0b"},
118 | {file = "cffi-1.15.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:39d39875251ca8f612b6f33e6b1195af86d1b3e60086068be9cc053aa4376e21"},
119 | {file = "cffi-1.15.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:285d29981935eb726a4399badae8f0ffdff4f5050eaa6d0cfc3f64b857b77185"},
120 | {file = "cffi-1.15.1-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3eb6971dcff08619f8d91607cfc726518b6fa2a9eba42856be181c6d0d9515fd"},
121 | {file = "cffi-1.15.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:21157295583fe8943475029ed5abdcf71eb3911894724e360acff1d61c1d54bc"},
122 | {file = "cffi-1.15.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5635bd9cb9731e6d4a1132a498dd34f764034a8ce60cef4f5319c0541159392f"},
123 | {file = "cffi-1.15.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2012c72d854c2d03e45d06ae57f40d78e5770d252f195b93f581acf3ba44496e"},
124 | {file = "cffi-1.15.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dd86c085fae2efd48ac91dd7ccffcfc0571387fe1193d33b6394db7ef31fe2a4"},
125 | {file = "cffi-1.15.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:fa6693661a4c91757f4412306191b6dc88c1703f780c8234035eac011922bc01"},
126 | {file = "cffi-1.15.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:59c0b02d0a6c384d453fece7566d1c7e6b7bae4fc5874ef2ef46d56776d61c9e"},
127 | {file = "cffi-1.15.1-cp310-cp310-win32.whl", hash = "sha256:cba9d6b9a7d64d4bd46167096fc9d2f835e25d7e4c121fb2ddfc6528fb0413b2"},
128 | {file = "cffi-1.15.1-cp310-cp310-win_amd64.whl", hash = "sha256:ce4bcc037df4fc5e3d184794f27bdaab018943698f4ca31630bc7f84a7b69c6d"},
129 | {file = "cffi-1.15.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3d08afd128ddaa624a48cf2b859afef385b720bb4b43df214f85616922e6a5ac"},
130 | {file = "cffi-1.15.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:3799aecf2e17cf585d977b780ce79ff0dc9b78d799fc694221ce814c2c19db83"},
131 | {file = "cffi-1.15.1-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a591fe9e525846e4d154205572a029f653ada1a78b93697f3b5a8f1f2bc055b9"},
132 | {file = "cffi-1.15.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3548db281cd7d2561c9ad9984681c95f7b0e38881201e157833a2342c30d5e8c"},
133 | {file = "cffi-1.15.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:91fc98adde3d7881af9b59ed0294046f3806221863722ba7d8d120c575314325"},
134 | {file = "cffi-1.15.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:94411f22c3985acaec6f83c6df553f2dbe17b698cc7f8ae751ff2237d96b9e3c"},
135 | {file = "cffi-1.15.1-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:03425bdae262c76aad70202debd780501fabeaca237cdfddc008987c0e0f59ef"},
136 | {file = "cffi-1.15.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:cc4d65aeeaa04136a12677d3dd0b1c0c94dc43abac5860ab33cceb42b801c1e8"},
137 | {file = "cffi-1.15.1-cp311-cp311-win32.whl", hash = "sha256:a0f100c8912c114ff53e1202d0078b425bee3649ae34d7b070e9697f93c5d52d"},
138 | {file = "cffi-1.15.1-cp311-cp311-win_amd64.whl", hash = "sha256:04ed324bda3cda42b9b695d51bb7d54b680b9719cfab04227cdd1e04e5de3104"},
139 | {file = "cffi-1.15.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50a74364d85fd319352182ef59c5c790484a336f6db772c1a9231f1c3ed0cbd7"},
140 | {file = "cffi-1.15.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e263d77ee3dd201c3a142934a086a4450861778baaeeb45db4591ef65550b0a6"},
141 | {file = "cffi-1.15.1-cp36-cp36m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:cec7d9412a9102bdc577382c3929b337320c4c4c4849f2c5cdd14d7368c5562d"},
142 | {file = "cffi-1.15.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4289fc34b2f5316fbb762d75362931e351941fa95fa18789191b33fc4cf9504a"},
143 | {file = "cffi-1.15.1-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:173379135477dc8cac4bc58f45db08ab45d228b3363adb7af79436135d028405"},
144 | {file = "cffi-1.15.1-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:6975a3fac6bc83c4a65c9f9fcab9e47019a11d3d2cf7f3c0d03431bf145a941e"},
145 | {file = "cffi-1.15.1-cp36-cp36m-win32.whl", hash = "sha256:2470043b93ff09bf8fb1d46d1cb756ce6132c54826661a32d4e4d132e1977adf"},
146 | {file = "cffi-1.15.1-cp36-cp36m-win_amd64.whl", hash = "sha256:30d78fbc8ebf9c92c9b7823ee18eb92f2e6ef79b45ac84db507f52fbe3ec4497"},
147 | {file = "cffi-1.15.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:198caafb44239b60e252492445da556afafc7d1e3ab7a1fb3f0584ef6d742375"},
148 | {file = "cffi-1.15.1-cp37-cp37m-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5ef34d190326c3b1f822a5b7a45f6c4535e2f47ed06fec77d3d799c450b2651e"},
149 | {file = "cffi-1.15.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8102eaf27e1e448db915d08afa8b41d6c7ca7a04b7d73af6514df10a3e74bd82"},
150 | {file = "cffi-1.15.1-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5df2768244d19ab7f60546d0c7c63ce1581f7af8b5de3eb3004b9b6fc8a9f84b"},
151 | {file = "cffi-1.15.1-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a8c4917bd7ad33e8eb21e9a5bbba979b49d9a97acb3a803092cbc1133e20343c"},
152 | {file = "cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0e2642fe3142e4cc4af0799748233ad6da94c62a8bec3a6648bf8ee68b1c7426"},
153 | {file = "cffi-1.15.1-cp37-cp37m-win32.whl", hash = "sha256:e229a521186c75c8ad9490854fd8bbdd9a0c9aa3a524326b55be83b54d4e0ad9"},
154 | {file = "cffi-1.15.1-cp37-cp37m-win_amd64.whl", hash = "sha256:a0b71b1b8fbf2b96e41c4d990244165e2c9be83d54962a9a1d118fd8657d2045"},
155 | {file = "cffi-1.15.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:320dab6e7cb2eacdf0e658569d2575c4dad258c0fcc794f46215e1e39f90f2c3"},
156 | {file = "cffi-1.15.1-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1e74c6b51a9ed6589199c787bf5f9875612ca4a8a0785fb2d4a84429badaf22a"},
157 | {file = "cffi-1.15.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a5c84c68147988265e60416b57fc83425a78058853509c1b0629c180094904a5"},
158 | {file = "cffi-1.15.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3b926aa83d1edb5aa5b427b4053dc420ec295a08e40911296b9eb1b6170f6cca"},
159 | {file = "cffi-1.15.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:87c450779d0914f2861b8526e035c5e6da0a3199d8f1add1a665e1cbc6fc6d02"},
160 | {file = "cffi-1.15.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4f2c9f67e9821cad2e5f480bc8d83b8742896f1242dba247911072d4fa94c192"},
161 | {file = "cffi-1.15.1-cp38-cp38-win32.whl", hash = "sha256:8b7ee99e510d7b66cdb6c593f21c043c248537a32e0bedf02e01e9553a172314"},
162 | {file = "cffi-1.15.1-cp38-cp38-win_amd64.whl", hash = "sha256:00a9ed42e88df81ffae7a8ab6d9356b371399b91dbdf0c3cb1e84c03a13aceb5"},
163 | {file = "cffi-1.15.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:54a2db7b78338edd780e7ef7f9f6c442500fb0d41a5a4ea24fff1c929d5af585"},
164 | {file = "cffi-1.15.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:fcd131dd944808b5bdb38e6f5b53013c5aa4f334c5cad0c72742f6eba4b73db0"},
165 | {file = "cffi-1.15.1-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7473e861101c9e72452f9bf8acb984947aa1661a7704553a9f6e4baa5ba64415"},
166 | {file = "cffi-1.15.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6c9a799e985904922a4d207a94eae35c78ebae90e128f0c4e521ce339396be9d"},
167 | {file = "cffi-1.15.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3bcde07039e586f91b45c88f8583ea7cf7a0770df3a1649627bf598332cb6984"},
168 | {file = "cffi-1.15.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:33ab79603146aace82c2427da5ca6e58f2b3f2fb5da893ceac0c42218a40be35"},
169 | {file = "cffi-1.15.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5d598b938678ebf3c67377cdd45e09d431369c3b1a5b331058c338e201f12b27"},
170 | {file = "cffi-1.15.1-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:db0fbb9c62743ce59a9ff687eb5f4afbe77e5e8403d6697f7446e5f609976f76"},
171 | {file = "cffi-1.15.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:98d85c6a2bef81588d9227dde12db8a7f47f639f4a17c9ae08e773aa9c697bf3"},
172 | {file = "cffi-1.15.1-cp39-cp39-win32.whl", hash = "sha256:40f4774f5a9d4f5e344f31a32b5096977b5d48560c5592e2f3d2c4374bd543ee"},
173 | {file = "cffi-1.15.1-cp39-cp39-win_amd64.whl", hash = "sha256:70df4e3b545a17496c9b3f41f5115e69a4f2e77e94e1d2a8e1070bc0c38c8a3c"},
174 | {file = "cffi-1.15.1.tar.gz", hash = "sha256:d400bfb9a37b1351253cb402671cea7e89bdecc294e8016a707f6d1d8ac934f9"},
175 | ]
176 |
177 | [package.dependencies]
178 | pycparser = "*"
179 |
180 | [[package]]
181 | name = "charset-normalizer"
182 | version = "3.1.0"
183 | description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
184 | category = "main"
185 | optional = false
186 | python-versions = ">=3.7.0"
187 | files = [
188 | {file = "charset-normalizer-3.1.0.tar.gz", hash = "sha256:34e0a2f9c370eb95597aae63bf85eb5e96826d81e3dcf88b8886012906f509b5"},
189 | {file = "charset_normalizer-3.1.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:e0ac8959c929593fee38da1c2b64ee9778733cdf03c482c9ff1d508b6b593b2b"},
190 | {file = "charset_normalizer-3.1.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d7fc3fca01da18fbabe4625d64bb612b533533ed10045a2ac3dd194bfa656b60"},
191 | {file = "charset_normalizer-3.1.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:04eefcee095f58eaabe6dc3cc2262f3bcd776d2c67005880894f447b3f2cb9c1"},
192 | {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:20064ead0717cf9a73a6d1e779b23d149b53daf971169289ed2ed43a71e8d3b0"},
193 | {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1435ae15108b1cb6fffbcea2af3d468683b7afed0169ad718451f8db5d1aff6f"},
194 | {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c84132a54c750fda57729d1e2599bb598f5fa0344085dbde5003ba429a4798c0"},
195 | {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:75f2568b4189dda1c567339b48cba4ac7384accb9c2a7ed655cd86b04055c795"},
196 | {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:11d3bcb7be35e7b1bba2c23beedac81ee893ac9871d0ba79effc7fc01167db6c"},
197 | {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:891cf9b48776b5c61c700b55a598621fdb7b1e301a550365571e9624f270c203"},
198 | {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:5f008525e02908b20e04707a4f704cd286d94718f48bb33edddc7d7b584dddc1"},
199 | {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:b06f0d3bf045158d2fb8837c5785fe9ff9b8c93358be64461a1089f5da983137"},
200 | {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:49919f8400b5e49e961f320c735388ee686a62327e773fa5b3ce6721f7e785ce"},
201 | {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:22908891a380d50738e1f978667536f6c6b526a2064156203d418f4856d6e86a"},
202 | {file = "charset_normalizer-3.1.0-cp310-cp310-win32.whl", hash = "sha256:12d1a39aa6b8c6f6248bb54550efcc1c38ce0d8096a146638fd4738e42284448"},
203 | {file = "charset_normalizer-3.1.0-cp310-cp310-win_amd64.whl", hash = "sha256:65ed923f84a6844de5fd29726b888e58c62820e0769b76565480e1fdc3d062f8"},
204 | {file = "charset_normalizer-3.1.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:9a3267620866c9d17b959a84dd0bd2d45719b817245e49371ead79ed4f710d19"},
205 | {file = "charset_normalizer-3.1.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6734e606355834f13445b6adc38b53c0fd45f1a56a9ba06c2058f86893ae8017"},
206 | {file = "charset_normalizer-3.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f8303414c7b03f794347ad062c0516cee0e15f7a612abd0ce1e25caf6ceb47df"},
207 | {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aaf53a6cebad0eae578f062c7d462155eada9c172bd8c4d250b8c1d8eb7f916a"},
208 | {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3dc5b6a8ecfdc5748a7e429782598e4f17ef378e3e272eeb1340ea57c9109f41"},
209 | {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e1b25e3ad6c909f398df8921780d6a3d120d8c09466720226fc621605b6f92b1"},
210 | {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0ca564606d2caafb0abe6d1b5311c2649e8071eb241b2d64e75a0d0065107e62"},
211 | {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b82fab78e0b1329e183a65260581de4375f619167478dddab510c6c6fb04d9b6"},
212 | {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:bd7163182133c0c7701b25e604cf1611c0d87712e56e88e7ee5d72deab3e76b5"},
213 | {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:11d117e6c63e8f495412d37e7dc2e2fff09c34b2d09dbe2bee3c6229577818be"},
214 | {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:cf6511efa4801b9b38dc5546d7547d5b5c6ef4b081c60b23e4d941d0eba9cbeb"},
215 | {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:abc1185d79f47c0a7aaf7e2412a0eb2c03b724581139193d2d82b3ad8cbb00ac"},
216 | {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:cb7b2ab0188829593b9de646545175547a70d9a6e2b63bf2cd87a0a391599324"},
217 | {file = "charset_normalizer-3.1.0-cp311-cp311-win32.whl", hash = "sha256:c36bcbc0d5174a80d6cccf43a0ecaca44e81d25be4b7f90f0ed7bcfbb5a00909"},
218 | {file = "charset_normalizer-3.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:cca4def576f47a09a943666b8f829606bcb17e2bc2d5911a46c8f8da45f56755"},
219 | {file = "charset_normalizer-3.1.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:0c95f12b74681e9ae127728f7e5409cbbef9cd914d5896ef238cc779b8152373"},
220 | {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fca62a8301b605b954ad2e9c3666f9d97f63872aa4efcae5492baca2056b74ab"},
221 | {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ac0aa6cd53ab9a31d397f8303f92c42f534693528fafbdb997c82bae6e477ad9"},
222 | {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c3af8e0f07399d3176b179f2e2634c3ce9c1301379a6b8c9c9aeecd481da494f"},
223 | {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3a5fc78f9e3f501a1614a98f7c54d3969f3ad9bba8ba3d9b438c3bc5d047dd28"},
224 | {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:628c985afb2c7d27a4800bfb609e03985aaecb42f955049957814e0491d4006d"},
225 | {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:74db0052d985cf37fa111828d0dd230776ac99c740e1a758ad99094be4f1803d"},
226 | {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:1e8fcdd8f672a1c4fc8d0bd3a2b576b152d2a349782d1eb0f6b8e52e9954731d"},
227 | {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:04afa6387e2b282cf78ff3dbce20f0cc071c12dc8f685bd40960cc68644cfea6"},
228 | {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:dd5653e67b149503c68c4018bf07e42eeed6b4e956b24c00ccdf93ac79cdff84"},
229 | {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:d2686f91611f9e17f4548dbf050e75b079bbc2a82be565832bc8ea9047b61c8c"},
230 | {file = "charset_normalizer-3.1.0-cp37-cp37m-win32.whl", hash = "sha256:4155b51ae05ed47199dc5b2a4e62abccb274cee6b01da5b895099b61b1982974"},
231 | {file = "charset_normalizer-3.1.0-cp37-cp37m-win_amd64.whl", hash = "sha256:322102cdf1ab682ecc7d9b1c5eed4ec59657a65e1c146a0da342b78f4112db23"},
232 | {file = "charset_normalizer-3.1.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:e633940f28c1e913615fd624fcdd72fdba807bf53ea6925d6a588e84e1151531"},
233 | {file = "charset_normalizer-3.1.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:3a06f32c9634a8705f4ca9946d667609f52cf130d5548881401f1eb2c39b1e2c"},
234 | {file = "charset_normalizer-3.1.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:7381c66e0561c5757ffe616af869b916c8b4e42b367ab29fedc98481d1e74e14"},
235 | {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3573d376454d956553c356df45bb824262c397c6e26ce43e8203c4c540ee0acb"},
236 | {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e89df2958e5159b811af9ff0f92614dabf4ff617c03a4c1c6ff53bf1c399e0e1"},
237 | {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:78cacd03e79d009d95635e7d6ff12c21eb89b894c354bd2b2ed0b4763373693b"},
238 | {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:de5695a6f1d8340b12a5d6d4484290ee74d61e467c39ff03b39e30df62cf83a0"},
239 | {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1c60b9c202d00052183c9be85e5eaf18a4ada0a47d188a83c8f5c5b23252f649"},
240 | {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:f645caaf0008bacf349875a974220f1f1da349c5dbe7c4ec93048cdc785a3326"},
241 | {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:ea9f9c6034ea2d93d9147818f17c2a0860d41b71c38b9ce4d55f21b6f9165a11"},
242 | {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:80d1543d58bd3d6c271b66abf454d437a438dff01c3e62fdbcd68f2a11310d4b"},
243 | {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:73dc03a6a7e30b7edc5b01b601e53e7fc924b04e1835e8e407c12c037e81adbd"},
244 | {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:6f5c2e7bc8a4bf7c426599765b1bd33217ec84023033672c1e9a8b35eaeaaaf8"},
245 | {file = "charset_normalizer-3.1.0-cp38-cp38-win32.whl", hash = "sha256:12a2b561af122e3d94cdb97fe6fb2bb2b82cef0cdca131646fdb940a1eda04f0"},
246 | {file = "charset_normalizer-3.1.0-cp38-cp38-win_amd64.whl", hash = "sha256:3160a0fd9754aab7d47f95a6b63ab355388d890163eb03b2d2b87ab0a30cfa59"},
247 | {file = "charset_normalizer-3.1.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:38e812a197bf8e71a59fe55b757a84c1f946d0ac114acafaafaf21667a7e169e"},
248 | {file = "charset_normalizer-3.1.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6baf0baf0d5d265fa7944feb9f7451cc316bfe30e8df1a61b1bb08577c554f31"},
249 | {file = "charset_normalizer-3.1.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:8f25e17ab3039b05f762b0a55ae0b3632b2e073d9c8fc88e89aca31a6198e88f"},
250 | {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3747443b6a904001473370d7810aa19c3a180ccd52a7157aacc264a5ac79265e"},
251 | {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b116502087ce8a6b7a5f1814568ccbd0e9f6cfd99948aa59b0e241dc57cf739f"},
252 | {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d16fd5252f883eb074ca55cb622bc0bee49b979ae4e8639fff6ca3ff44f9f854"},
253 | {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:21fa558996782fc226b529fdd2ed7866c2c6ec91cee82735c98a197fae39f706"},
254 | {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6f6c7a8a57e9405cad7485f4c9d3172ae486cfef1344b5ddd8e5239582d7355e"},
255 | {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ac3775e3311661d4adace3697a52ac0bab17edd166087d493b52d4f4f553f9f0"},
256 | {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:10c93628d7497c81686e8e5e557aafa78f230cd9e77dd0c40032ef90c18f2230"},
257 | {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:6f4f4668e1831850ebcc2fd0b1cd11721947b6dc7c00bf1c6bd3c929ae14f2c7"},
258 | {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:0be65ccf618c1e7ac9b849c315cc2e8a8751d9cfdaa43027d4f6624bd587ab7e"},
259 | {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:53d0a3fa5f8af98a1e261de6a3943ca631c526635eb5817a87a59d9a57ebf48f"},
260 | {file = "charset_normalizer-3.1.0-cp39-cp39-win32.whl", hash = "sha256:a04f86f41a8916fe45ac5024ec477f41f886b3c435da2d4e3d2709b22ab02af1"},
261 | {file = "charset_normalizer-3.1.0-cp39-cp39-win_amd64.whl", hash = "sha256:830d2948a5ec37c386d3170c483063798d7879037492540f10a475e3fd6f244b"},
262 | {file = "charset_normalizer-3.1.0-py3-none-any.whl", hash = "sha256:3d9098b479e78c85080c98e1e35ff40b4a31d8953102bb0fd7d1b6f8a2111a3d"},
263 | ]
264 |
265 | [[package]]
266 | name = "click"
267 | version = "7.1.2"
268 | description = "Composable command line interface toolkit"
269 | category = "main"
270 | optional = false
271 | python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
272 | files = [
273 | {file = "click-7.1.2-py2.py3-none-any.whl", hash = "sha256:dacca89f4bfadd5de3d7489b7c8a566eee0d3676333fbb50030263894c38c0dc"},
274 | {file = "click-7.1.2.tar.gz", hash = "sha256:d2b5255c7c6349bc1bd1e59e08cd12acbbd63ce649f2588755783aa94dfb6b1a"},
275 | ]
276 |
277 | [[package]]
278 | name = "cryptography"
279 | version = "40.0.2"
280 | description = "cryptography is a package which provides cryptographic recipes and primitives to Python developers."
281 | category = "main"
282 | optional = false
283 | python-versions = ">=3.6"
284 | files = [
285 | {file = "cryptography-40.0.2-cp36-abi3-macosx_10_12_universal2.whl", hash = "sha256:8f79b5ff5ad9d3218afb1e7e20ea74da5f76943ee5edb7f76e56ec5161ec782b"},
286 | {file = "cryptography-40.0.2-cp36-abi3-macosx_10_12_x86_64.whl", hash = "sha256:05dc219433b14046c476f6f09d7636b92a1c3e5808b9a6536adf4932b3b2c440"},
287 | {file = "cryptography-40.0.2-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4df2af28d7bedc84fe45bd49bc35d710aede676e2a4cb7fc6d103a2adc8afe4d"},
288 | {file = "cryptography-40.0.2-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0dcca15d3a19a66e63662dc8d30f8036b07be851a8680eda92d079868f106288"},
289 | {file = "cryptography-40.0.2-cp36-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:a04386fb7bc85fab9cd51b6308633a3c271e3d0d3eae917eebab2fac6219b6d2"},
290 | {file = "cryptography-40.0.2-cp36-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:adc0d980fd2760c9e5de537c28935cc32b9353baaf28e0814df417619c6c8c3b"},
291 | {file = "cryptography-40.0.2-cp36-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:d5a1bd0e9e2031465761dfa920c16b0065ad77321d8a8c1f5ee331021fda65e9"},
292 | {file = "cryptography-40.0.2-cp36-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:a95f4802d49faa6a674242e25bfeea6fc2acd915b5e5e29ac90a32b1139cae1c"},
293 | {file = "cryptography-40.0.2-cp36-abi3-win32.whl", hash = "sha256:aecbb1592b0188e030cb01f82d12556cf72e218280f621deed7d806afd2113f9"},
294 | {file = "cryptography-40.0.2-cp36-abi3-win_amd64.whl", hash = "sha256:b12794f01d4cacfbd3177b9042198f3af1c856eedd0a98f10f141385c809a14b"},
295 | {file = "cryptography-40.0.2-pp38-pypy38_pp73-macosx_10_12_x86_64.whl", hash = "sha256:142bae539ef28a1c76794cca7f49729e7c54423f615cfd9b0b1fa90ebe53244b"},
296 | {file = "cryptography-40.0.2-pp38-pypy38_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:956ba8701b4ffe91ba59665ed170a2ebbdc6fc0e40de5f6059195d9f2b33ca0e"},
297 | {file = "cryptography-40.0.2-pp38-pypy38_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:4f01c9863da784558165f5d4d916093737a75203a5c5286fde60e503e4276c7a"},
298 | {file = "cryptography-40.0.2-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:3daf9b114213f8ba460b829a02896789751626a2a4e7a43a28ee77c04b5e4958"},
299 | {file = "cryptography-40.0.2-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:48f388d0d153350f378c7f7b41497a54ff1513c816bcbbcafe5b829e59b9ce5b"},
300 | {file = "cryptography-40.0.2-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:c0764e72b36a3dc065c155e5b22f93df465da9c39af65516fe04ed3c68c92636"},
301 | {file = "cryptography-40.0.2-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:cbaba590180cba88cb99a5f76f90808a624f18b169b90a4abb40c1fd8c19420e"},
302 | {file = "cryptography-40.0.2-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:7a38250f433cd41df7fcb763caa3ee9362777fdb4dc642b9a349721d2bf47404"},
303 | {file = "cryptography-40.0.2.tar.gz", hash = "sha256:c33c0d32b8594fa647d2e01dbccc303478e16fdd7cf98652d5b3ed11aa5e5c99"},
304 | ]
305 |
306 | [package.dependencies]
307 | cffi = ">=1.12"
308 |
309 | [package.extras]
310 | docs = ["sphinx (>=5.3.0)", "sphinx-rtd-theme (>=1.1.1)"]
311 | docstest = ["pyenchant (>=1.6.11)", "sphinxcontrib-spelling (>=4.0.1)", "twine (>=1.12.0)"]
312 | pep8test = ["black", "check-manifest", "mypy", "ruff"]
313 | sdist = ["setuptools-rust (>=0.11.4)"]
314 | ssh = ["bcrypt (>=3.1.5)"]
315 | test = ["iso8601", "pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-shard (>=0.1.2)", "pytest-subtests", "pytest-xdist"]
316 | test-randomorder = ["pytest-randomly"]
317 | tox = ["tox"]
318 |
319 | [[package]]
320 | name = "geographiclib"
321 | version = "2.0"
322 | description = "The geodesic routines from GeographicLib"
323 | category = "main"
324 | optional = false
325 | python-versions = ">=3.7"
326 | files = [
327 | {file = "geographiclib-2.0-py3-none-any.whl", hash = "sha256:6b7225248e45ff7edcee32becc4e0a1504c606ac5ee163a5656d482e0cd38734"},
328 | {file = "geographiclib-2.0.tar.gz", hash = "sha256:f7f41c85dc3e1c2d3d935ec86660dc3b2c848c83e17f9a9e51ba9d5146a15859"},
329 | ]
330 |
331 | [[package]]
332 | name = "geopy"
333 | version = "2.3.0"
334 | description = "Python Geocoding Toolbox"
335 | category = "main"
336 | optional = false
337 | python-versions = ">=3.7"
338 | files = [
339 | {file = "geopy-2.3.0-py3-none-any.whl", hash = "sha256:4a29a16d41d8e56ba8e07310802a1cbdf098eeb6069cc3d6d3068fc770629ffc"},
340 | {file = "geopy-2.3.0.tar.gz", hash = "sha256:228cd53b6eef699b2289d1172e462a90d5057779a10388a7366291812601187f"},
341 | ]
342 |
343 | [package.dependencies]
344 | geographiclib = ">=1.52,<3"
345 |
346 | [package.extras]
347 | aiohttp = ["aiohttp"]
348 | dev = ["coverage", "flake8 (>=5.0,<5.1)", "isort (>=5.10.0,<5.11.0)", "pytest (>=3.10)", "pytest-asyncio (>=0.17)", "readme-renderer", "sphinx (<=4.3.2)", "sphinx-issues", "sphinx-rtd-theme (>=0.5.0)"]
349 | dev-docs = ["readme-renderer", "sphinx (<=4.3.2)", "sphinx-issues", "sphinx-rtd-theme (>=0.5.0)"]
350 | dev-lint = ["flake8 (>=5.0,<5.1)", "isort (>=5.10.0,<5.11.0)"]
351 | dev-test = ["coverage", "pytest (>=3.10)", "pytest-asyncio (>=0.17)", "sphinx (<=4.3.2)"]
352 | requests = ["requests (>=2.16.2)", "urllib3 (>=1.24.2)"]
353 | timezone = ["pytz"]
354 |
355 | [[package]]
356 | name = "google-api-core"
357 | version = "2.11.0"
358 | description = "Google API client core library"
359 | category = "main"
360 | optional = false
361 | python-versions = ">=3.7"
362 | files = [
363 | {file = "google-api-core-2.11.0.tar.gz", hash = "sha256:4b9bb5d5a380a0befa0573b302651b8a9a89262c1730e37bf423cec511804c22"},
364 | {file = "google_api_core-2.11.0-py3-none-any.whl", hash = "sha256:ce222e27b0de0d7bc63eb043b956996d6dccab14cc3b690aaea91c9cc99dc16e"},
365 | ]
366 |
367 | [package.dependencies]
368 | google-auth = ">=2.14.1,<3.0dev"
369 | googleapis-common-protos = ">=1.56.2,<2.0dev"
370 | protobuf = ">=3.19.5,<3.20.0 || >3.20.0,<3.20.1 || >3.20.1,<4.21.0 || >4.21.0,<4.21.1 || >4.21.1,<4.21.2 || >4.21.2,<4.21.3 || >4.21.3,<4.21.4 || >4.21.4,<4.21.5 || >4.21.5,<5.0.0dev"
371 | requests = ">=2.18.0,<3.0.0dev"
372 |
373 | [package.extras]
374 | grpc = ["grpcio (>=1.33.2,<2.0dev)", "grpcio (>=1.49.1,<2.0dev)", "grpcio-status (>=1.33.2,<2.0dev)", "grpcio-status (>=1.49.1,<2.0dev)"]
375 | grpcgcp = ["grpcio-gcp (>=0.2.2,<1.0dev)"]
376 | grpcio-gcp = ["grpcio-gcp (>=0.2.2,<1.0dev)"]
377 |
378 | [[package]]
379 | name = "google-api-python-client"
380 | version = "2.86.0"
381 | description = "Google API Client Library for Python"
382 | category = "main"
383 | optional = false
384 | python-versions = ">=3.7"
385 | files = [
386 | {file = "google-api-python-client-2.86.0.tar.gz", hash = "sha256:3ca4e93821f4e9ac29b91ab0d9df168b42c8ad0fb8bff65b8c2ccb2d462b0464"},
387 | {file = "google_api_python_client-2.86.0-py2.py3-none-any.whl", hash = "sha256:0f320190ab9d5bd2fdb0cb894e8e53bb5e17d4888ee8dc4d26ba65ce378409e2"},
388 | ]
389 |
390 | [package.dependencies]
391 | google-api-core = ">=1.31.5,<2.0.0 || >2.3.0,<3.0.0dev"
392 | google-auth = ">=1.19.0,<3.0.0dev"
393 | google-auth-httplib2 = ">=0.1.0"
394 | httplib2 = ">=0.15.0,<1dev"
395 | uritemplate = ">=3.0.1,<5"
396 |
397 | [[package]]
398 | name = "google-auth"
399 | version = "2.17.3"
400 | description = "Google Authentication Library"
401 | category = "main"
402 | optional = false
403 | python-versions = ">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*"
404 | files = [
405 | {file = "google-auth-2.17.3.tar.gz", hash = "sha256:ce311e2bc58b130fddf316df57c9b3943c2a7b4f6ec31de9663a9333e4064efc"},
406 | {file = "google_auth-2.17.3-py2.py3-none-any.whl", hash = "sha256:f586b274d3eb7bd932ea424b1c702a30e0393a2e2bc4ca3eae8263ffd8be229f"},
407 | ]
408 |
409 | [package.dependencies]
410 | cachetools = ">=2.0.0,<6.0"
411 | pyasn1-modules = ">=0.2.1"
412 | rsa = {version = ">=3.1.4,<5", markers = "python_version >= \"3.6\""}
413 | six = ">=1.9.0"
414 |
415 | [package.extras]
416 | aiohttp = ["aiohttp (>=3.6.2,<4.0.0dev)", "requests (>=2.20.0,<3.0.0dev)"]
417 | enterprise-cert = ["cryptography (==36.0.2)", "pyopenssl (==22.0.0)"]
418 | pyopenssl = ["cryptography (>=38.0.3)", "pyopenssl (>=20.0.0)"]
419 | reauth = ["pyu2f (>=0.1.5)"]
420 | requests = ["requests (>=2.20.0,<3.0.0dev)"]
421 |
422 | [[package]]
423 | name = "google-auth-httplib2"
424 | version = "0.1.0"
425 | description = "Google Authentication Library: httplib2 transport"
426 | category = "main"
427 | optional = false
428 | python-versions = "*"
429 | files = [
430 | {file = "google-auth-httplib2-0.1.0.tar.gz", hash = "sha256:a07c39fd632becacd3f07718dfd6021bf396978f03ad3ce4321d060015cc30ac"},
431 | {file = "google_auth_httplib2-0.1.0-py2.py3-none-any.whl", hash = "sha256:31e49c36c6b5643b57e82617cb3e021e3e1d2df9da63af67252c02fa9c1f4a10"},
432 | ]
433 |
434 | [package.dependencies]
435 | google-auth = "*"
436 | httplib2 = ">=0.15.0"
437 | six = "*"
438 |
439 | [[package]]
440 | name = "google-auth-oauthlib"
441 | version = "0.4.6"
442 | description = "Google Authentication Library"
443 | category = "main"
444 | optional = false
445 | python-versions = ">=3.6"
446 | files = [
447 | {file = "google-auth-oauthlib-0.4.6.tar.gz", hash = "sha256:a90a072f6993f2c327067bf65270046384cda5a8ecb20b94ea9a687f1f233a7a"},
448 | {file = "google_auth_oauthlib-0.4.6-py2.py3-none-any.whl", hash = "sha256:3f2a6e802eebbb6fb736a370fbf3b055edcb6b52878bf2f26330b5e041316c73"},
449 | ]
450 |
451 | [package.dependencies]
452 | google-auth = ">=1.0.0"
453 | requests-oauthlib = ">=0.7.0"
454 |
455 | [package.extras]
456 | tool = ["click (>=6.0.0)"]
457 |
458 | [[package]]
459 | name = "googleapis-common-protos"
460 | version = "1.59.0"
461 | description = "Common protobufs used in Google APIs"
462 | category = "main"
463 | optional = false
464 | python-versions = ">=3.7"
465 | files = [
466 | {file = "googleapis-common-protos-1.59.0.tar.gz", hash = "sha256:4168fcb568a826a52f23510412da405abd93f4d23ba544bb68d943b14ba3cb44"},
467 | {file = "googleapis_common_protos-1.59.0-py2.py3-none-any.whl", hash = "sha256:b287dc48449d1d41af0c69f4ea26242b5ae4c3d7249a38b0984c86a4caffff1f"},
468 | ]
469 |
470 | [package.dependencies]
471 | protobuf = ">=3.19.5,<3.20.0 || >3.20.0,<3.20.1 || >3.20.1,<4.21.1 || >4.21.1,<4.21.2 || >4.21.2,<4.21.3 || >4.21.3,<4.21.4 || >4.21.4,<4.21.5 || >4.21.5,<5.0.0dev"
472 |
473 | [package.extras]
474 | grpc = ["grpcio (>=1.44.0,<2.0.0dev)"]
475 |
476 | [[package]]
477 | name = "httplib2"
478 | version = "0.22.0"
479 | description = "A comprehensive HTTP client library."
480 | category = "main"
481 | optional = false
482 | python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
483 | files = [
484 | {file = "httplib2-0.22.0-py3-none-any.whl", hash = "sha256:14ae0a53c1ba8f3d37e9e27cf37eabb0fb9980f435ba405d546948b009dd64dc"},
485 | {file = "httplib2-0.22.0.tar.gz", hash = "sha256:d7a10bc5ef5ab08322488bde8c726eeee5c8618723fdb399597ec58f3d82df81"},
486 | ]
487 |
488 | [package.dependencies]
489 | pyparsing = {version = ">=2.4.2,<3.0.0 || >3.0.0,<3.0.1 || >3.0.1,<3.0.2 || >3.0.2,<3.0.3 || >3.0.3,<4", markers = "python_version > \"3.0\""}
490 |
491 | [[package]]
492 | name = "idna"
493 | version = "3.4"
494 | description = "Internationalized Domain Names in Applications (IDNA)"
495 | category = "main"
496 | optional = false
497 | python-versions = ">=3.5"
498 | files = [
499 | {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
500 | {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
501 | ]
502 |
503 | [[package]]
504 | name = "importlib-metadata"
505 | version = "6.6.0"
506 | description = "Read metadata from Python packages"
507 | category = "main"
508 | optional = false
509 | python-versions = ">=3.7"
510 | files = [
511 | {file = "importlib_metadata-6.6.0-py3-none-any.whl", hash = "sha256:43dd286a2cd8995d5eaef7fee2066340423b818ed3fd70adf0bad5f1fac53fed"},
512 | {file = "importlib_metadata-6.6.0.tar.gz", hash = "sha256:92501cdf9cc66ebd3e612f1b4f0c0765dfa42f0fa38ffb319b6bd84dd675d705"},
513 | ]
514 |
515 | [package.dependencies]
516 | zipp = ">=0.5"
517 |
518 | [package.extras]
519 | docs = ["furo", "jaraco.packaging (>=9)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-lint"]
520 | perf = ["ipython"]
521 | testing = ["flake8 (<5)", "flufl.flake8", "importlib-resources (>=1.3)", "packaging", "pyfakefs", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=1.3)", "pytest-flake8", "pytest-mypy (>=0.9.1)", "pytest-perf (>=0.9.2)"]
522 |
523 | [[package]]
524 | name = "importlib-resources"
525 | version = "5.12.0"
526 | description = "Read resources from Python packages"
527 | category = "main"
528 | optional = false
529 | python-versions = ">=3.7"
530 | files = [
531 | {file = "importlib_resources-5.12.0-py3-none-any.whl", hash = "sha256:7b1deeebbf351c7578e09bf2f63fa2ce8b5ffec296e0d349139d43cca061a81a"},
532 | {file = "importlib_resources-5.12.0.tar.gz", hash = "sha256:4be82589bf5c1d7999aedf2a45159d10cb3ca4f19b2271f8792bc8e6da7b22f6"},
533 | ]
534 |
535 | [package.dependencies]
536 | zipp = {version = ">=3.1.0", markers = "python_version < \"3.10\""}
537 |
538 | [package.extras]
539 | docs = ["furo", "jaraco.packaging (>=9)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-lint"]
540 | testing = ["flake8 (<5)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=1.3)", "pytest-flake8", "pytest-mypy (>=0.9.1)"]
541 |
542 | [[package]]
543 | name = "jaraco-classes"
544 | version = "3.2.3"
545 | description = "Utility functions for Python class constructs"
546 | category = "main"
547 | optional = false
548 | python-versions = ">=3.7"
549 | files = [
550 | {file = "jaraco.classes-3.2.3-py3-none-any.whl", hash = "sha256:2353de3288bc6b82120752201c6b1c1a14b058267fa424ed5ce5984e3b922158"},
551 | {file = "jaraco.classes-3.2.3.tar.gz", hash = "sha256:89559fa5c1d3c34eff6f631ad80bb21f378dbcbb35dd161fd2c6b93f5be2f98a"},
552 | ]
553 |
554 | [package.dependencies]
555 | more-itertools = "*"
556 |
557 | [package.extras]
558 | docs = ["jaraco.packaging (>=9)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)"]
559 | testing = ["flake8 (<5)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=1.3)", "pytest-flake8", "pytest-mypy (>=0.9.1)"]
560 |
561 | [[package]]
562 | name = "jeepney"
563 | version = "0.8.0"
564 | description = "Low-level, pure Python DBus protocol wrapper."
565 | category = "main"
566 | optional = false
567 | python-versions = ">=3.7"
568 | files = [
569 | {file = "jeepney-0.8.0-py3-none-any.whl", hash = "sha256:c0a454ad016ca575060802ee4d590dd912e35c122fa04e70306de3d076cce755"},
570 | {file = "jeepney-0.8.0.tar.gz", hash = "sha256:5efe48d255973902f6badc3ce55e2aa6c5c3b3bc642059ef3a91247bcfcc5806"},
571 | ]
572 |
573 | [package.extras]
574 | test = ["async-timeout", "pytest", "pytest-asyncio (>=0.17)", "pytest-trio", "testpath", "trio"]
575 | trio = ["async_generator", "trio"]
576 |
577 | [[package]]
578 | name = "jinja2"
579 | version = "3.1.2"
580 | description = "A very fast and expressive template engine."
581 | category = "main"
582 | optional = false
583 | python-versions = ">=3.7"
584 | files = [
585 | {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
586 | {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
587 | ]
588 |
589 | [package.dependencies]
590 | MarkupSafe = ">=2.0"
591 |
592 | [package.extras]
593 | i18n = ["Babel (>=2.7)"]
594 |
595 | [[package]]
596 | name = "jsonschema"
597 | version = "4.17.3"
598 | description = "An implementation of JSON Schema validation for Python"
599 | category = "main"
600 | optional = false
601 | python-versions = ">=3.7"
602 | files = [
603 | {file = "jsonschema-4.17.3-py3-none-any.whl", hash = "sha256:a870ad254da1a8ca84b6a2905cac29d265f805acc57af304784962a2aa6508f6"},
604 | {file = "jsonschema-4.17.3.tar.gz", hash = "sha256:0f864437ab8b6076ba6707453ef8f98a6a0d512a80e93f8abdb676f737ecb60d"},
605 | ]
606 |
607 | [package.dependencies]
608 | attrs = ">=17.4.0"
609 | importlib-resources = {version = ">=1.4.0", markers = "python_version < \"3.9\""}
610 | pkgutil-resolve-name = {version = ">=1.3.10", markers = "python_version < \"3.9\""}
611 | pyrsistent = ">=0.14.0,<0.17.0 || >0.17.0,<0.17.1 || >0.17.1,<0.17.2 || >0.17.2"
612 |
613 | [package.extras]
614 | format = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3987", "uri-template", "webcolors (>=1.11)"]
615 | format-nongpl = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3986-validator (>0.1.0)", "uri-template", "webcolors (>=1.11)"]
616 |
617 | [[package]]
618 | name = "keyring"
619 | version = "23.13.1"
620 | description = "Store and access your passwords safely."
621 | category = "main"
622 | optional = false
623 | python-versions = ">=3.7"
624 | files = [
625 | {file = "keyring-23.13.1-py3-none-any.whl", hash = "sha256:771ed2a91909389ed6148631de678f82ddc73737d85a927f382a8a1b157898cd"},
626 | {file = "keyring-23.13.1.tar.gz", hash = "sha256:ba2e15a9b35e21908d0aaf4e0a47acc52d6ae33444df0da2b49d41a46ef6d678"},
627 | ]
628 |
629 | [package.dependencies]
630 | importlib-metadata = {version = ">=4.11.4", markers = "python_version < \"3.12\""}
631 | importlib-resources = {version = "*", markers = "python_version < \"3.9\""}
632 | "jaraco.classes" = "*"
633 | jeepney = {version = ">=0.4.2", markers = "sys_platform == \"linux\""}
634 | pywin32-ctypes = {version = ">=0.2.0", markers = "sys_platform == \"win32\""}
635 | SecretStorage = {version = ">=3.2", markers = "sys_platform == \"linux\""}
636 |
637 | [package.extras]
638 | completion = ["shtab"]
639 | docs = ["furo", "jaraco.packaging (>=9)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)"]
640 | testing = ["flake8 (<5)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=1.3)", "pytest-flake8", "pytest-mypy (>=0.9.1)"]
641 |
642 | [[package]]
643 | name = "lockfile"
644 | version = "0.12.2"
645 | description = "Platform-independent file locking module"
646 | category = "main"
647 | optional = false
648 | python-versions = "*"
649 | files = [
650 | {file = "lockfile-0.12.2-py2.py3-none-any.whl", hash = "sha256:6c3cb24f344923d30b2785d5ad75182c8ea7ac1b6171b08657258ec7429d50fa"},
651 | {file = "lockfile-0.12.2.tar.gz", hash = "sha256:6aed02de03cba24efabcd600b30540140634fc06cfa603822d508d5361e9f799"},
652 | ]
653 |
654 | [[package]]
655 | name = "lxml"
656 | version = "4.9.2"
657 | description = "Powerful and Pythonic XML processing library combining libxml2/libxslt with the ElementTree API."
658 | category = "main"
659 | optional = false
660 | python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, != 3.4.*"
661 | files = [
662 | {file = "lxml-4.9.2-cp27-cp27m-macosx_10_15_x86_64.whl", hash = "sha256:76cf573e5a365e790396a5cc2b909812633409306c6531a6877c59061e42c4f2"},
663 | {file = "lxml-4.9.2-cp27-cp27m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b1f42b6921d0e81b1bcb5e395bc091a70f41c4d4e55ba99c6da2b31626c44892"},
664 | {file = "lxml-4.9.2-cp27-cp27m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:9f102706d0ca011de571de32c3247c6476b55bb6bc65a20f682f000b07a4852a"},
665 | {file = "lxml-4.9.2-cp27-cp27m-win32.whl", hash = "sha256:8d0b4612b66ff5d62d03bcaa043bb018f74dfea51184e53f067e6fdcba4bd8de"},
666 | {file = "lxml-4.9.2-cp27-cp27m-win_amd64.whl", hash = "sha256:4c8f293f14abc8fd3e8e01c5bd86e6ed0b6ef71936ded5bf10fe7a5efefbaca3"},
667 | {file = "lxml-4.9.2-cp27-cp27mu-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2899456259589aa38bfb018c364d6ae7b53c5c22d8e27d0ec7609c2a1ff78b50"},
668 | {file = "lxml-4.9.2-cp27-cp27mu-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:6749649eecd6a9871cae297bffa4ee76f90b4504a2a2ab528d9ebe912b101975"},
669 | {file = "lxml-4.9.2-cp310-cp310-macosx_10_15_x86_64.whl", hash = "sha256:a08cff61517ee26cb56f1e949cca38caabe9ea9fbb4b1e10a805dc39844b7d5c"},
670 | {file = "lxml-4.9.2-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:85cabf64adec449132e55616e7ca3e1000ab449d1d0f9d7f83146ed5bdcb6d8a"},
671 | {file = "lxml-4.9.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:8340225bd5e7a701c0fa98284c849c9b9fc9238abf53a0ebd90900f25d39a4e4"},
672 | {file = "lxml-4.9.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:1ab8f1f932e8f82355e75dda5413a57612c6ea448069d4fb2e217e9a4bed13d4"},
673 | {file = "lxml-4.9.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:699a9af7dffaf67deeae27b2112aa06b41c370d5e7633e0ee0aea2e0b6c211f7"},
674 | {file = "lxml-4.9.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:b9cc34af337a97d470040f99ba4282f6e6bac88407d021688a5d585e44a23184"},
675 | {file = "lxml-4.9.2-cp310-cp310-win32.whl", hash = "sha256:d02a5399126a53492415d4906ab0ad0375a5456cc05c3fc0fc4ca11771745cda"},
676 | {file = "lxml-4.9.2-cp310-cp310-win_amd64.whl", hash = "sha256:a38486985ca49cfa574a507e7a2215c0c780fd1778bb6290c21193b7211702ab"},
677 | {file = "lxml-4.9.2-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:c83203addf554215463b59f6399835201999b5e48019dc17f182ed5ad87205c9"},
678 | {file = "lxml-4.9.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:2a87fa548561d2f4643c99cd13131acb607ddabb70682dcf1dff5f71f781a4bf"},
679 | {file = "lxml-4.9.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:d6b430a9938a5a5d85fc107d852262ddcd48602c120e3dbb02137c83d212b380"},
680 | {file = "lxml-4.9.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:3efea981d956a6f7173b4659849f55081867cf897e719f57383698af6f618a92"},
681 | {file = "lxml-4.9.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:df0623dcf9668ad0445e0558a21211d4e9a149ea8f5666917c8eeec515f0a6d1"},
682 | {file = "lxml-4.9.2-cp311-cp311-win32.whl", hash = "sha256:da248f93f0418a9e9d94b0080d7ebc407a9a5e6d0b57bb30db9b5cc28de1ad33"},
683 | {file = "lxml-4.9.2-cp311-cp311-win_amd64.whl", hash = "sha256:3818b8e2c4b5148567e1b09ce739006acfaa44ce3156f8cbbc11062994b8e8dd"},
684 | {file = "lxml-4.9.2-cp35-cp35m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ca989b91cf3a3ba28930a9fc1e9aeafc2a395448641df1f387a2d394638943b0"},
685 | {file = "lxml-4.9.2-cp35-cp35m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:822068f85e12a6e292803e112ab876bc03ed1f03dddb80154c395f891ca6b31e"},
686 | {file = "lxml-4.9.2-cp35-cp35m-win32.whl", hash = "sha256:be7292c55101e22f2a3d4d8913944cbea71eea90792bf914add27454a13905df"},
687 | {file = "lxml-4.9.2-cp35-cp35m-win_amd64.whl", hash = "sha256:998c7c41910666d2976928c38ea96a70d1aa43be6fe502f21a651e17483a43c5"},
688 | {file = "lxml-4.9.2-cp36-cp36m-macosx_10_15_x86_64.whl", hash = "sha256:b26a29f0b7fc6f0897f043ca366142d2b609dc60756ee6e4e90b5f762c6adc53"},
689 | {file = "lxml-4.9.2-cp36-cp36m-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:ab323679b8b3030000f2be63e22cdeea5b47ee0abd2d6a1dc0c8103ddaa56cd7"},
690 | {file = "lxml-4.9.2-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:689bb688a1db722485e4610a503e3e9210dcc20c520b45ac8f7533c837be76fe"},
691 | {file = "lxml-4.9.2-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:f49e52d174375a7def9915c9f06ec4e569d235ad428f70751765f48d5926678c"},
692 | {file = "lxml-4.9.2-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:36c3c175d34652a35475a73762b545f4527aec044910a651d2bf50de9c3352b1"},
693 | {file = "lxml-4.9.2-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:a35f8b7fa99f90dd2f5dc5a9fa12332642f087a7641289ca6c40d6e1a2637d8e"},
694 | {file = "lxml-4.9.2-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:58bfa3aa19ca4c0f28c5dde0ff56c520fbac6f0daf4fac66ed4c8d2fb7f22e74"},
695 | {file = "lxml-4.9.2-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:bc718cd47b765e790eecb74d044cc8d37d58562f6c314ee9484df26276d36a38"},
696 | {file = "lxml-4.9.2-cp36-cp36m-win32.whl", hash = "sha256:d5bf6545cd27aaa8a13033ce56354ed9e25ab0e4ac3b5392b763d8d04b08e0c5"},
697 | {file = "lxml-4.9.2-cp36-cp36m-win_amd64.whl", hash = "sha256:3ab9fa9d6dc2a7f29d7affdf3edebf6ece6fb28a6d80b14c3b2fb9d39b9322c3"},
698 | {file = "lxml-4.9.2-cp37-cp37m-macosx_10_15_x86_64.whl", hash = "sha256:05ca3f6abf5cf78fe053da9b1166e062ade3fa5d4f92b4ed688127ea7d7b1d03"},
699 | {file = "lxml-4.9.2-cp37-cp37m-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:a5da296eb617d18e497bcf0a5c528f5d3b18dadb3619fbdadf4ed2356ef8d941"},
700 | {file = "lxml-4.9.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:04876580c050a8c5341d706dd464ff04fd597095cc8c023252566a8826505726"},
701 | {file = "lxml-4.9.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:c9ec3eaf616d67db0764b3bb983962b4f385a1f08304fd30c7283954e6a7869b"},
702 | {file = "lxml-4.9.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2a29ba94d065945944016b6b74e538bdb1751a1db6ffb80c9d3c2e40d6fa9894"},
703 | {file = "lxml-4.9.2-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:a82d05da00a58b8e4c0008edbc8a4b6ec5a4bc1e2ee0fb6ed157cf634ed7fa45"},
704 | {file = "lxml-4.9.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:223f4232855ade399bd409331e6ca70fb5578efef22cf4069a6090acc0f53c0e"},
705 | {file = "lxml-4.9.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:d17bc7c2ccf49c478c5bdd447594e82692c74222698cfc9b5daae7ae7e90743b"},
706 | {file = "lxml-4.9.2-cp37-cp37m-win32.whl", hash = "sha256:b64d891da92e232c36976c80ed7ebb383e3f148489796d8d31a5b6a677825efe"},
707 | {file = "lxml-4.9.2-cp37-cp37m-win_amd64.whl", hash = "sha256:a0a336d6d3e8b234a3aae3c674873d8f0e720b76bc1d9416866c41cd9500ffb9"},
708 | {file = "lxml-4.9.2-cp38-cp38-macosx_10_15_x86_64.whl", hash = "sha256:da4dd7c9c50c059aba52b3524f84d7de956f7fef88f0bafcf4ad7dde94a064e8"},
709 | {file = "lxml-4.9.2-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:821b7f59b99551c69c85a6039c65b75f5683bdc63270fec660f75da67469ca24"},
710 | {file = "lxml-4.9.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:e5168986b90a8d1f2f9dc1b841467c74221bd752537b99761a93d2d981e04889"},
711 | {file = "lxml-4.9.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:8e20cb5a47247e383cf4ff523205060991021233ebd6f924bca927fcf25cf86f"},
712 | {file = "lxml-4.9.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:13598ecfbd2e86ea7ae45ec28a2a54fb87ee9b9fdb0f6d343297d8e548392c03"},
713 | {file = "lxml-4.9.2-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:880bbbcbe2fca64e2f4d8e04db47bcdf504936fa2b33933efd945e1b429bea8c"},
714 | {file = "lxml-4.9.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:7d2278d59425777cfcb19735018d897ca8303abe67cc735f9f97177ceff8027f"},
715 | {file = "lxml-4.9.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:5344a43228767f53a9df6e5b253f8cdca7dfc7b7aeae52551958192f56d98457"},
716 | {file = "lxml-4.9.2-cp38-cp38-win32.whl", hash = "sha256:925073b2fe14ab9b87e73f9a5fde6ce6392da430f3004d8b72cc86f746f5163b"},
717 | {file = "lxml-4.9.2-cp38-cp38-win_amd64.whl", hash = "sha256:9b22c5c66f67ae00c0199f6055705bc3eb3fcb08d03d2ec4059a2b1b25ed48d7"},
718 | {file = "lxml-4.9.2-cp39-cp39-macosx_10_15_x86_64.whl", hash = "sha256:5f50a1c177e2fa3ee0667a5ab79fdc6b23086bc8b589d90b93b4bd17eb0e64d1"},
719 | {file = "lxml-4.9.2-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:090c6543d3696cbe15b4ac6e175e576bcc3f1ccfbba970061b7300b0c15a2140"},
720 | {file = "lxml-4.9.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:63da2ccc0857c311d764e7d3d90f429c252e83b52d1f8f1d1fe55be26827d1f4"},
721 | {file = "lxml-4.9.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:5b4545b8a40478183ac06c073e81a5ce4cf01bf1734962577cf2bb569a5b3bbf"},
722 | {file = "lxml-4.9.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2e430cd2824f05f2d4f687701144556646bae8f249fd60aa1e4c768ba7018947"},
723 | {file = "lxml-4.9.2-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:6804daeb7ef69e7b36f76caddb85cccd63d0c56dedb47555d2fc969e2af6a1a5"},
724 | {file = "lxml-4.9.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:a6e441a86553c310258aca15d1c05903aaf4965b23f3bc2d55f200804e005ee5"},
725 | {file = "lxml-4.9.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ca34efc80a29351897e18888c71c6aca4a359247c87e0b1c7ada14f0ab0c0fb2"},
726 | {file = "lxml-4.9.2-cp39-cp39-win32.whl", hash = "sha256:6b418afe5df18233fc6b6093deb82a32895b6bb0b1155c2cdb05203f583053f1"},
727 | {file = "lxml-4.9.2-cp39-cp39-win_amd64.whl", hash = "sha256:f1496ea22ca2c830cbcbd473de8f114a320da308438ae65abad6bab7867fe38f"},
728 | {file = "lxml-4.9.2-pp37-pypy37_pp73-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:b264171e3143d842ded311b7dccd46ff9ef34247129ff5bf5066123c55c2431c"},
729 | {file = "lxml-4.9.2-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:0dc313ef231edf866912e9d8f5a042ddab56c752619e92dfd3a2c277e6a7299a"},
730 | {file = "lxml-4.9.2-pp38-pypy38_pp73-macosx_10_15_x86_64.whl", hash = "sha256:16efd54337136e8cd72fb9485c368d91d77a47ee2d42b057564aae201257d419"},
731 | {file = "lxml-4.9.2-pp38-pypy38_pp73-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:0f2b1e0d79180f344ff9f321327b005ca043a50ece8713de61d1cb383fb8ac05"},
732 | {file = "lxml-4.9.2-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:7b770ed79542ed52c519119473898198761d78beb24b107acf3ad65deae61f1f"},
733 | {file = "lxml-4.9.2-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:efa29c2fe6b4fdd32e8ef81c1528506895eca86e1d8c4657fda04c9b3786ddf9"},
734 | {file = "lxml-4.9.2-pp39-pypy39_pp73-macosx_10_15_x86_64.whl", hash = "sha256:7e91ee82f4199af8c43d8158024cbdff3d931df350252288f0d4ce656df7f3b5"},
735 | {file = "lxml-4.9.2-pp39-pypy39_pp73-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:b23e19989c355ca854276178a0463951a653309fb8e57ce674497f2d9f208746"},
736 | {file = "lxml-4.9.2-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:01d36c05f4afb8f7c20fd9ed5badca32a2029b93b1750f571ccc0b142531caf7"},
737 | {file = "lxml-4.9.2-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:7b515674acfdcadb0eb5d00d8a709868173acece5cb0be3dd165950cbfdf5409"},
738 | {file = "lxml-4.9.2.tar.gz", hash = "sha256:2455cfaeb7ac70338b3257f41e21f0724f4b5b0c0e7702da67ee6c3640835b67"},
739 | ]
740 |
741 | [package.extras]
742 | cssselect = ["cssselect (>=0.7)"]
743 | html5 = ["html5lib"]
744 | htmlsoup = ["BeautifulSoup4"]
745 | source = ["Cython (>=0.29.7)"]
746 |
747 | [[package]]
748 | name = "markupsafe"
749 | version = "2.1.2"
750 | description = "Safely add untrusted strings to HTML/XML markup."
751 | category = "main"
752 | optional = false
753 | python-versions = ">=3.7"
754 | files = [
755 | {file = "MarkupSafe-2.1.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:665a36ae6f8f20a4676b53224e33d456a6f5a72657d9c83c2aa00765072f31f7"},
756 | {file = "MarkupSafe-2.1.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:340bea174e9761308703ae988e982005aedf427de816d1afe98147668cc03036"},
757 | {file = "MarkupSafe-2.1.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22152d00bf4a9c7c83960521fc558f55a1adbc0631fbb00a9471e097b19d72e1"},
758 | {file = "MarkupSafe-2.1.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:28057e985dace2f478e042eaa15606c7efccb700797660629da387eb289b9323"},
759 | {file = "MarkupSafe-2.1.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca244fa73f50a800cf8c3ebf7fd93149ec37f5cb9596aa8873ae2c1d23498601"},
760 | {file = "MarkupSafe-2.1.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:d9d971ec1e79906046aa3ca266de79eac42f1dbf3612a05dc9368125952bd1a1"},
761 | {file = "MarkupSafe-2.1.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:7e007132af78ea9df29495dbf7b5824cb71648d7133cf7848a2a5dd00d36f9ff"},
762 | {file = "MarkupSafe-2.1.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:7313ce6a199651c4ed9d7e4cfb4aa56fe923b1adf9af3b420ee14e6d9a73df65"},
763 | {file = "MarkupSafe-2.1.2-cp310-cp310-win32.whl", hash = "sha256:c4a549890a45f57f1ebf99c067a4ad0cb423a05544accaf2b065246827ed9603"},
764 | {file = "MarkupSafe-2.1.2-cp310-cp310-win_amd64.whl", hash = "sha256:835fb5e38fd89328e9c81067fd642b3593c33e1e17e2fdbf77f5676abb14a156"},
765 | {file = "MarkupSafe-2.1.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:2ec4f2d48ae59bbb9d1f9d7efb9236ab81429a764dedca114f5fdabbc3788013"},
766 | {file = "MarkupSafe-2.1.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:608e7073dfa9e38a85d38474c082d4281f4ce276ac0010224eaba11e929dd53a"},
767 | {file = "MarkupSafe-2.1.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:65608c35bfb8a76763f37036547f7adfd09270fbdbf96608be2bead319728fcd"},
768 | {file = "MarkupSafe-2.1.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f2bfb563d0211ce16b63c7cb9395d2c682a23187f54c3d79bfec33e6705473c6"},
769 | {file = "MarkupSafe-2.1.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:da25303d91526aac3672ee6d49a2f3db2d9502a4a60b55519feb1a4c7714e07d"},
770 | {file = "MarkupSafe-2.1.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:9cad97ab29dfc3f0249b483412c85c8ef4766d96cdf9dcf5a1e3caa3f3661cf1"},
771 | {file = "MarkupSafe-2.1.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:085fd3201e7b12809f9e6e9bc1e5c96a368c8523fad5afb02afe3c051ae4afcc"},
772 | {file = "MarkupSafe-2.1.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:1bea30e9bf331f3fef67e0a3877b2288593c98a21ccb2cf29b74c581a4eb3af0"},
773 | {file = "MarkupSafe-2.1.2-cp311-cp311-win32.whl", hash = "sha256:7df70907e00c970c60b9ef2938d894a9381f38e6b9db73c5be35e59d92e06625"},
774 | {file = "MarkupSafe-2.1.2-cp311-cp311-win_amd64.whl", hash = "sha256:e55e40ff0cc8cc5c07996915ad367fa47da6b3fc091fdadca7f5403239c5fec3"},
775 | {file = "MarkupSafe-2.1.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:a6e40afa7f45939ca356f348c8e23048e02cb109ced1eb8420961b2f40fb373a"},
776 | {file = "MarkupSafe-2.1.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cf877ab4ed6e302ec1d04952ca358b381a882fbd9d1b07cccbfd61783561f98a"},
777 | {file = "MarkupSafe-2.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:63ba06c9941e46fa389d389644e2d8225e0e3e5ebcc4ff1ea8506dce646f8c8a"},
778 | {file = "MarkupSafe-2.1.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f1cd098434e83e656abf198f103a8207a8187c0fc110306691a2e94a78d0abb2"},
779 | {file = "MarkupSafe-2.1.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:55f44b440d491028addb3b88f72207d71eeebfb7b5dbf0643f7c023ae1fba619"},
780 | {file = "MarkupSafe-2.1.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:a6f2fcca746e8d5910e18782f976489939d54a91f9411c32051b4aab2bd7c513"},
781 | {file = "MarkupSafe-2.1.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:0b462104ba25f1ac006fdab8b6a01ebbfbce9ed37fd37fd4acd70c67c973e460"},
782 | {file = "MarkupSafe-2.1.2-cp37-cp37m-win32.whl", hash = "sha256:7668b52e102d0ed87cb082380a7e2e1e78737ddecdde129acadb0eccc5423859"},
783 | {file = "MarkupSafe-2.1.2-cp37-cp37m-win_amd64.whl", hash = "sha256:6d6607f98fcf17e534162f0709aaad3ab7a96032723d8ac8750ffe17ae5a0666"},
784 | {file = "MarkupSafe-2.1.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:a806db027852538d2ad7555b203300173dd1b77ba116de92da9afbc3a3be3eed"},
785 | {file = "MarkupSafe-2.1.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:a4abaec6ca3ad8660690236d11bfe28dfd707778e2442b45addd2f086d6ef094"},
786 | {file = "MarkupSafe-2.1.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f03a532d7dee1bed20bc4884194a16160a2de9ffc6354b3878ec9682bb623c54"},
787 | {file = "MarkupSafe-2.1.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4cf06cdc1dda95223e9d2d3c58d3b178aa5dacb35ee7e3bbac10e4e1faacb419"},
788 | {file = "MarkupSafe-2.1.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:22731d79ed2eb25059ae3df1dfc9cb1546691cc41f4e3130fe6bfbc3ecbbecfa"},
789 | {file = "MarkupSafe-2.1.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:f8ffb705ffcf5ddd0e80b65ddf7bed7ee4f5a441ea7d3419e861a12eaf41af58"},
790 | {file = "MarkupSafe-2.1.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:8db032bf0ce9022a8e41a22598eefc802314e81b879ae093f36ce9ddf39ab1ba"},
791 | {file = "MarkupSafe-2.1.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:2298c859cfc5463f1b64bd55cb3e602528db6fa0f3cfd568d3605c50678f8f03"},
792 | {file = "MarkupSafe-2.1.2-cp38-cp38-win32.whl", hash = "sha256:50c42830a633fa0cf9e7d27664637532791bfc31c731a87b202d2d8ac40c3ea2"},
793 | {file = "MarkupSafe-2.1.2-cp38-cp38-win_amd64.whl", hash = "sha256:bb06feb762bade6bf3c8b844462274db0c76acc95c52abe8dbed28ae3d44a147"},
794 | {file = "MarkupSafe-2.1.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:99625a92da8229df6d44335e6fcc558a5037dd0a760e11d84be2260e6f37002f"},
795 | {file = "MarkupSafe-2.1.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:8bca7e26c1dd751236cfb0c6c72d4ad61d986e9a41bbf76cb445f69488b2a2bd"},
796 | {file = "MarkupSafe-2.1.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:40627dcf047dadb22cd25ea7ecfe9cbf3bbbad0482ee5920b582f3809c97654f"},
797 | {file = "MarkupSafe-2.1.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:40dfd3fefbef579ee058f139733ac336312663c6706d1163b82b3003fb1925c4"},
798 | {file = "MarkupSafe-2.1.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:090376d812fb6ac5f171e5938e82e7f2d7adc2b629101cec0db8b267815c85e2"},
799 | {file = "MarkupSafe-2.1.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:2e7821bffe00aa6bd07a23913b7f4e01328c3d5cc0b40b36c0bd81d362faeb65"},
800 | {file = "MarkupSafe-2.1.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:c0a33bc9f02c2b17c3ea382f91b4db0e6cde90b63b296422a939886a7a80de1c"},
801 | {file = "MarkupSafe-2.1.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:b8526c6d437855442cdd3d87eede9c425c4445ea011ca38d937db299382e6fa3"},
802 | {file = "MarkupSafe-2.1.2-cp39-cp39-win32.whl", hash = "sha256:137678c63c977754abe9086a3ec011e8fd985ab90631145dfb9294ad09c102a7"},
803 | {file = "MarkupSafe-2.1.2-cp39-cp39-win_amd64.whl", hash = "sha256:0576fe974b40a400449768941d5d0858cc624e3249dfd1e0c33674e5c7ca7aed"},
804 | {file = "MarkupSafe-2.1.2.tar.gz", hash = "sha256:abcabc8c2b26036d62d4c746381a6f7cf60aafcc653198ad678306986b09450d"},
805 | ]
806 |
807 | [[package]]
808 | name = "more-itertools"
809 | version = "9.1.0"
810 | description = "More routines for operating on iterables, beyond itertools"
811 | category = "main"
812 | optional = false
813 | python-versions = ">=3.7"
814 | files = [
815 | {file = "more-itertools-9.1.0.tar.gz", hash = "sha256:cabaa341ad0389ea83c17a94566a53ae4c9d07349861ecb14dc6d0345cf9ac5d"},
816 | {file = "more_itertools-9.1.0-py3-none-any.whl", hash = "sha256:d2bc7f02446e86a68911e58ded76d6561eea00cddfb2a91e7019bbb586c799f3"},
817 | ]
818 |
819 | [[package]]
820 | name = "numpy"
821 | version = "1.24.3"
822 | description = "Fundamental package for array computing in Python"
823 | category = "main"
824 | optional = false
825 | python-versions = ">=3.8"
826 | files = [
827 | {file = "numpy-1.24.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:3c1104d3c036fb81ab923f507536daedc718d0ad5a8707c6061cdfd6d184e570"},
828 | {file = "numpy-1.24.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:202de8f38fc4a45a3eea4b63e2f376e5f2dc64ef0fa692838e31a808520efaf7"},
829 | {file = "numpy-1.24.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8535303847b89aa6b0f00aa1dc62867b5a32923e4d1681a35b5eef2d9591a463"},
830 | {file = "numpy-1.24.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2d926b52ba1367f9acb76b0df6ed21f0b16a1ad87c6720a1121674e5cf63e2b6"},
831 | {file = "numpy-1.24.3-cp310-cp310-win32.whl", hash = "sha256:f21c442fdd2805e91799fbe044a7b999b8571bb0ab0f7850d0cb9641a687092b"},
832 | {file = "numpy-1.24.3-cp310-cp310-win_amd64.whl", hash = "sha256:ab5f23af8c16022663a652d3b25dcdc272ac3f83c3af4c02eb8b824e6b3ab9d7"},
833 | {file = "numpy-1.24.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:9a7721ec204d3a237225db3e194c25268faf92e19338a35f3a224469cb6039a3"},
834 | {file = "numpy-1.24.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:d6cc757de514c00b24ae8cf5c876af2a7c3df189028d68c0cb4eaa9cd5afc2bf"},
835 | {file = "numpy-1.24.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:76e3f4e85fc5d4fd311f6e9b794d0c00e7002ec122be271f2019d63376f1d385"},
836 | {file = "numpy-1.24.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a1d3c026f57ceaad42f8231305d4653d5f05dc6332a730ae5c0bea3513de0950"},
837 | {file = "numpy-1.24.3-cp311-cp311-win32.whl", hash = "sha256:c91c4afd8abc3908e00a44b2672718905b8611503f7ff87390cc0ac3423fb096"},
838 | {file = "numpy-1.24.3-cp311-cp311-win_amd64.whl", hash = "sha256:5342cf6aad47943286afa6f1609cad9b4266a05e7f2ec408e2cf7aea7ff69d80"},
839 | {file = "numpy-1.24.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:7776ea65423ca6a15255ba1872d82d207bd1e09f6d0894ee4a64678dd2204078"},
840 | {file = "numpy-1.24.3-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:ae8d0be48d1b6ed82588934aaaa179875e7dc4f3d84da18d7eae6eb3f06c242c"},
841 | {file = "numpy-1.24.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ecde0f8adef7dfdec993fd54b0f78183051b6580f606111a6d789cd14c61ea0c"},
842 | {file = "numpy-1.24.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4749e053a29364d3452c034827102ee100986903263e89884922ef01a0a6fd2f"},
843 | {file = "numpy-1.24.3-cp38-cp38-win32.whl", hash = "sha256:d933fabd8f6a319e8530d0de4fcc2e6a61917e0b0c271fded460032db42a0fe4"},
844 | {file = "numpy-1.24.3-cp38-cp38-win_amd64.whl", hash = "sha256:56e48aec79ae238f6e4395886b5eaed058abb7231fb3361ddd7bfdf4eed54289"},
845 | {file = "numpy-1.24.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:4719d5aefb5189f50887773699eaf94e7d1e02bf36c1a9d353d9f46703758ca4"},
846 | {file = "numpy-1.24.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:0ec87a7084caa559c36e0a2309e4ecb1baa03b687201d0a847c8b0ed476a7187"},
847 | {file = "numpy-1.24.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ea8282b9bcfe2b5e7d491d0bf7f3e2da29700cec05b49e64d6246923329f2b02"},
848 | {file = "numpy-1.24.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:210461d87fb02a84ef243cac5e814aad2b7f4be953b32cb53327bb49fd77fbb4"},
849 | {file = "numpy-1.24.3-cp39-cp39-win32.whl", hash = "sha256:784c6da1a07818491b0ffd63c6bbe5a33deaa0e25a20e1b3ea20cf0e43f8046c"},
850 | {file = "numpy-1.24.3-cp39-cp39-win_amd64.whl", hash = "sha256:d5036197ecae68d7f491fcdb4df90082b0d4960ca6599ba2659957aafced7c17"},
851 | {file = "numpy-1.24.3-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:352ee00c7f8387b44d19f4cada524586f07379c0d49270f87233983bc5087ca0"},
852 | {file = "numpy-1.24.3-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1a7d6acc2e7524c9955e5c903160aa4ea083736fde7e91276b0e5d98e6332812"},
853 | {file = "numpy-1.24.3-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:35400e6a8d102fd07c71ed7dcadd9eb62ee9a6e84ec159bd48c28235bbb0f8e4"},
854 | {file = "numpy-1.24.3.tar.gz", hash = "sha256:ab344f1bf21f140adab8e47fdbc7c35a477dc01408791f8ba00d018dd0bc5155"},
855 | ]
856 |
857 | [[package]]
858 | name = "oauthlib"
859 | version = "3.2.2"
860 | description = "A generic, spec-compliant, thorough implementation of the OAuth request-signing logic"
861 | category = "main"
862 | optional = false
863 | python-versions = ">=3.6"
864 | files = [
865 | {file = "oauthlib-3.2.2-py3-none-any.whl", hash = "sha256:8139f29aac13e25d502680e9e19963e83f16838d48a0d71c287fe40e7067fbca"},
866 | {file = "oauthlib-3.2.2.tar.gz", hash = "sha256:9859c40929662bec5d64f34d01c99e093149682a3f38915dc0655d5a633dd918"},
867 | ]
868 |
869 | [package.extras]
870 | rsa = ["cryptography (>=3.0.0)"]
871 | signals = ["blinker (>=1.4.0)"]
872 | signedtoken = ["cryptography (>=3.0.0)", "pyjwt (>=2.0.0,<3)"]
873 |
874 | [[package]]
875 | name = "pandas"
876 | version = "1.5.3"
877 | description = "Powerful data structures for data analysis, time series, and statistics"
878 | category = "main"
879 | optional = false
880 | python-versions = ">=3.8"
881 | files = [
882 | {file = "pandas-1.5.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:3749077d86e3a2f0ed51367f30bf5b82e131cc0f14260c4d3e499186fccc4406"},
883 | {file = "pandas-1.5.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:972d8a45395f2a2d26733eb8d0f629b2f90bebe8e8eddbb8829b180c09639572"},
884 | {file = "pandas-1.5.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:50869a35cbb0f2e0cd5ec04b191e7b12ed688874bd05dd777c19b28cbea90996"},
885 | {file = "pandas-1.5.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c3ac844a0fe00bfaeb2c9b51ab1424e5c8744f89860b138434a363b1f620f354"},
886 | {file = "pandas-1.5.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7a0a56cef15fd1586726dace5616db75ebcfec9179a3a55e78f72c5639fa2a23"},
887 | {file = "pandas-1.5.3-cp310-cp310-win_amd64.whl", hash = "sha256:478ff646ca42b20376e4ed3fa2e8d7341e8a63105586efe54fa2508ee087f328"},
888 | {file = "pandas-1.5.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:6973549c01ca91ec96199e940495219c887ea815b2083722821f1d7abfa2b4dc"},
889 | {file = "pandas-1.5.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:c39a8da13cede5adcd3be1182883aea1c925476f4e84b2807a46e2775306305d"},
890 | {file = "pandas-1.5.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f76d097d12c82a535fda9dfe5e8dd4127952b45fea9b0276cb30cca5ea313fbc"},
891 | {file = "pandas-1.5.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e474390e60ed609cec869b0da796ad94f420bb057d86784191eefc62b65819ae"},
892 | {file = "pandas-1.5.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5f2b952406a1588ad4cad5b3f55f520e82e902388a6d5a4a91baa8d38d23c7f6"},
893 | {file = "pandas-1.5.3-cp311-cp311-win_amd64.whl", hash = "sha256:bc4c368f42b551bf72fac35c5128963a171b40dce866fb066540eeaf46faa003"},
894 | {file = "pandas-1.5.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:14e45300521902689a81f3f41386dc86f19b8ba8dd5ac5a3c7010ef8d2932813"},
895 | {file = "pandas-1.5.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:9842b6f4b8479e41968eced654487258ed81df7d1c9b7b870ceea24ed9459b31"},
896 | {file = "pandas-1.5.3-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:26d9c71772c7afb9d5046e6e9cf42d83dd147b5cf5bcb9d97252077118543792"},
897 | {file = "pandas-1.5.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5fbcb19d6fceb9e946b3e23258757c7b225ba450990d9ed63ccceeb8cae609f7"},
898 | {file = "pandas-1.5.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:565fa34a5434d38e9d250af3c12ff931abaf88050551d9fbcdfafca50d62babf"},
899 | {file = "pandas-1.5.3-cp38-cp38-win32.whl", hash = "sha256:87bd9c03da1ac870a6d2c8902a0e1fd4267ca00f13bc494c9e5a9020920e1d51"},
900 | {file = "pandas-1.5.3-cp38-cp38-win_amd64.whl", hash = "sha256:41179ce559943d83a9b4bbacb736b04c928b095b5f25dd2b7389eda08f46f373"},
901 | {file = "pandas-1.5.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c74a62747864ed568f5a82a49a23a8d7fe171d0c69038b38cedf0976831296fa"},
902 | {file = "pandas-1.5.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:c4c00e0b0597c8e4f59e8d461f797e5d70b4d025880516a8261b2817c47759ee"},
903 | {file = "pandas-1.5.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a50d9a4336a9621cab7b8eb3fb11adb82de58f9b91d84c2cd526576b881a0c5a"},
904 | {file = "pandas-1.5.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dd05f7783b3274aa206a1af06f0ceed3f9b412cf665b7247eacd83be41cf7bf0"},
905 | {file = "pandas-1.5.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9f69c4029613de47816b1bb30ff5ac778686688751a5e9c99ad8c7031f6508e5"},
906 | {file = "pandas-1.5.3-cp39-cp39-win32.whl", hash = "sha256:7cec0bee9f294e5de5bbfc14d0573f65526071029d036b753ee6507d2a21480a"},
907 | {file = "pandas-1.5.3-cp39-cp39-win_amd64.whl", hash = "sha256:dfd681c5dc216037e0b0a2c821f5ed99ba9f03ebcf119c7dac0e9a7b960b9ec9"},
908 | {file = "pandas-1.5.3.tar.gz", hash = "sha256:74a3fd7e5a7ec052f183273dc7b0acd3a863edf7520f5d3a1765c04ffdb3b0b1"},
909 | ]
910 |
911 | [package.dependencies]
912 | numpy = [
913 | {version = ">=1.20.3", markers = "python_version < \"3.10\""},
914 | {version = ">=1.21.0", markers = "python_version >= \"3.10\""},
915 | {version = ">=1.23.2", markers = "python_version >= \"3.11\""},
916 | ]
917 | python-dateutil = ">=2.8.1"
918 | pytz = ">=2020.1"
919 |
920 | [package.extras]
921 | test = ["hypothesis (>=5.5.3)", "pytest (>=6.0)", "pytest-xdist (>=1.31)"]
922 |
923 | [[package]]
924 | name = "pkgutil-resolve-name"
925 | version = "1.3.10"
926 | description = "Resolve a name to an object."
927 | category = "main"
928 | optional = false
929 | python-versions = ">=3.6"
930 | files = [
931 | {file = "pkgutil_resolve_name-1.3.10-py3-none-any.whl", hash = "sha256:ca27cc078d25c5ad71a9de0a7a330146c4e014c2462d9af19c6b828280649c5e"},
932 | {file = "pkgutil_resolve_name-1.3.10.tar.gz", hash = "sha256:357d6c9e6a755653cfd78893817c0853af365dd51ec97f3d358a819373bbd174"},
933 | ]
934 |
935 | [[package]]
936 | name = "plotly"
937 | version = "4.14.3"
938 | description = "An open-source, interactive data visualization library for Python"
939 | category = "main"
940 | optional = false
941 | python-versions = "*"
942 | files = [
943 | {file = "plotly-4.14.3-py2.py3-none-any.whl", hash = "sha256:d68fc15fcb49f88db27ab3e0c87110943e65fee02a47f33a8590f541b3042461"},
944 | {file = "plotly-4.14.3.tar.gz", hash = "sha256:7d8aaeed392e82fb8e0e48899f2d3d957b12327f9d38cdd5802bc574a8a39d91"},
945 | ]
946 |
947 | [package.dependencies]
948 | retrying = ">=1.3.3"
949 | six = "*"
950 |
951 | [[package]]
952 | name = "protobuf"
953 | version = "4.23.0"
954 | description = ""
955 | category = "main"
956 | optional = false
957 | python-versions = ">=3.7"
958 | files = [
959 | {file = "protobuf-4.23.0-cp310-abi3-win32.whl", hash = "sha256:6c16657d6717a0c62d5d740cb354fbad1b0d8cb811669e06fc1caa0ff4799ddd"},
960 | {file = "protobuf-4.23.0-cp310-abi3-win_amd64.whl", hash = "sha256:baca40d067dddd62141a129f244703160d278648b569e90bb0e3753067644711"},
961 | {file = "protobuf-4.23.0-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:2b94bd6df92d71bd1234a2ffe7ce96ddf6d10cf637a18d6b55ad0a89fbb7fc21"},
962 | {file = "protobuf-4.23.0-cp37-abi3-manylinux2014_aarch64.whl", hash = "sha256:9f5a0fbfcdcc364f3986f9ed9f8bb1328fb84114fd790423ff3d7fdb0f85c2d1"},
963 | {file = "protobuf-4.23.0-cp37-abi3-manylinux2014_x86_64.whl", hash = "sha256:ebde3a023b8e11bfa6c890ef34cd6a8b47d586f26135e86c21344fe433daf2e2"},
964 | {file = "protobuf-4.23.0-cp37-cp37m-win32.whl", hash = "sha256:7cb5b9a05ce52c6a782bb97de52679bd3438ff2b7460eff5da348db65650f227"},
965 | {file = "protobuf-4.23.0-cp37-cp37m-win_amd64.whl", hash = "sha256:6fe180b56e1169d72ecc4acbd39186339aed20af5384531b8e8979b02bbee159"},
966 | {file = "protobuf-4.23.0-cp38-cp38-win32.whl", hash = "sha256:d5a35ff54e3f62e8fc7be02bb0d2fbc212bba1a5a9cc2748090690093996f07b"},
967 | {file = "protobuf-4.23.0-cp38-cp38-win_amd64.whl", hash = "sha256:e62fb869762b4ba18666370e2f8a18f17f8ab92dd4467295c6d38be6f8fef60b"},
968 | {file = "protobuf-4.23.0-cp39-cp39-win32.whl", hash = "sha256:03eee35b60317112a72d19c54d0bff7bc58ff12fea4cd7b018232bd99758ffdf"},
969 | {file = "protobuf-4.23.0-cp39-cp39-win_amd64.whl", hash = "sha256:36f5370a930cb77c8ad2f4135590c672d0d2c72d4a707c7d0058dce4b4b4a598"},
970 | {file = "protobuf-4.23.0-py3-none-any.whl", hash = "sha256:9744e934ea5855d12191040ea198eaf704ac78665d365a89d9572e3b627c2688"},
971 | {file = "protobuf-4.23.0.tar.gz", hash = "sha256:5f1eba1da2a2f3f7df469fccddef3cc060b8a16cfe3cc65961ad36b4dbcf59c5"},
972 | ]
973 |
974 | [[package]]
975 | name = "psutil"
976 | version = "5.9.5"
977 | description = "Cross-platform lib for process and system monitoring in Python."
978 | category = "main"
979 | optional = false
980 | python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
981 | files = [
982 | {file = "psutil-5.9.5-cp27-cp27m-macosx_10_9_x86_64.whl", hash = "sha256:be8929ce4313f9f8146caad4272f6abb8bf99fc6cf59344a3167ecd74f4f203f"},
983 | {file = "psutil-5.9.5-cp27-cp27m-manylinux2010_i686.whl", hash = "sha256:ab8ed1a1d77c95453db1ae00a3f9c50227ebd955437bcf2a574ba8adbf6a74d5"},
984 | {file = "psutil-5.9.5-cp27-cp27m-manylinux2010_x86_64.whl", hash = "sha256:4aef137f3345082a3d3232187aeb4ac4ef959ba3d7c10c33dd73763fbc063da4"},
985 | {file = "psutil-5.9.5-cp27-cp27mu-manylinux2010_i686.whl", hash = "sha256:ea8518d152174e1249c4f2a1c89e3e6065941df2fa13a1ab45327716a23c2b48"},
986 | {file = "psutil-5.9.5-cp27-cp27mu-manylinux2010_x86_64.whl", hash = "sha256:acf2aef9391710afded549ff602b5887d7a2349831ae4c26be7c807c0a39fac4"},
987 | {file = "psutil-5.9.5-cp27-none-win32.whl", hash = "sha256:5b9b8cb93f507e8dbaf22af6a2fd0ccbe8244bf30b1baad6b3954e935157ae3f"},
988 | {file = "psutil-5.9.5-cp27-none-win_amd64.whl", hash = "sha256:8c5f7c5a052d1d567db4ddd231a9d27a74e8e4a9c3f44b1032762bd7b9fdcd42"},
989 | {file = "psutil-5.9.5-cp36-abi3-macosx_10_9_x86_64.whl", hash = "sha256:3c6f686f4225553615612f6d9bc21f1c0e305f75d7d8454f9b46e901778e7217"},
990 | {file = "psutil-5.9.5-cp36-abi3-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7a7dd9997128a0d928ed4fb2c2d57e5102bb6089027939f3b722f3a210f9a8da"},
991 | {file = "psutil-5.9.5-cp36-abi3-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:89518112647f1276b03ca97b65cc7f64ca587b1eb0278383017c2a0dcc26cbe4"},
992 | {file = "psutil-5.9.5-cp36-abi3-win32.whl", hash = "sha256:104a5cc0e31baa2bcf67900be36acde157756b9c44017b86b2c049f11957887d"},
993 | {file = "psutil-5.9.5-cp36-abi3-win_amd64.whl", hash = "sha256:b258c0c1c9d145a1d5ceffab1134441c4c5113b2417fafff7315a917a026c3c9"},
994 | {file = "psutil-5.9.5-cp38-abi3-macosx_11_0_arm64.whl", hash = "sha256:c607bb3b57dc779d55e1554846352b4e358c10fff3abf3514a7a6601beebdb30"},
995 | {file = "psutil-5.9.5.tar.gz", hash = "sha256:5410638e4df39c54d957fc51ce03048acd8e6d60abc0f5107af51e5fb566eb3c"},
996 | ]
997 |
998 | [package.extras]
999 | test = ["enum34", "ipaddress", "mock", "pywin32", "wmi"]
1000 |
1001 | [[package]]
1002 | name = "pyasn1"
1003 | version = "0.5.0"
1004 | description = "Pure-Python implementation of ASN.1 types and DER/BER/CER codecs (X.208)"
1005 | category = "main"
1006 | optional = false
1007 | python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7"
1008 | files = [
1009 | {file = "pyasn1-0.5.0-py2.py3-none-any.whl", hash = "sha256:87a2121042a1ac9358cabcaf1d07680ff97ee6404333bacca15f76aa8ad01a57"},
1010 | {file = "pyasn1-0.5.0.tar.gz", hash = "sha256:97b7290ca68e62a832558ec3976f15cbf911bf5d7c7039d8b861c2a0ece69fde"},
1011 | ]
1012 |
1013 | [[package]]
1014 | name = "pyasn1-modules"
1015 | version = "0.3.0"
1016 | description = "A collection of ASN.1-based protocols modules"
1017 | category = "main"
1018 | optional = false
1019 | python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7"
1020 | files = [
1021 | {file = "pyasn1_modules-0.3.0-py2.py3-none-any.whl", hash = "sha256:d3ccd6ed470d9ffbc716be08bd90efbd44d0734bc9303818f7336070984a162d"},
1022 | {file = "pyasn1_modules-0.3.0.tar.gz", hash = "sha256:5bd01446b736eb9d31512a30d46c1ac3395d676c6f3cafa4c03eb54b9925631c"},
1023 | ]
1024 |
1025 | [package.dependencies]
1026 | pyasn1 = ">=0.4.6,<0.6.0"
1027 |
1028 | [[package]]
1029 | name = "pycparser"
1030 | version = "2.21"
1031 | description = "C parser in Python"
1032 | category = "main"
1033 | optional = false
1034 | python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
1035 | files = [
1036 | {file = "pycparser-2.21-py2.py3-none-any.whl", hash = "sha256:8ee45429555515e1f6b185e78100aea234072576aa43ab53aefcae078162fca9"},
1037 | {file = "pycparser-2.21.tar.gz", hash = "sha256:e644fdec12f7872f86c58ff790da456218b10f863970249516d60a5eaca77206"},
1038 | ]
1039 |
1040 | [[package]]
1041 | name = "pyparsing"
1042 | version = "3.0.9"
1043 | description = "pyparsing module - Classes and methods to define and execute parsing grammars"
1044 | category = "main"
1045 | optional = false
1046 | python-versions = ">=3.6.8"
1047 | files = [
1048 | {file = "pyparsing-3.0.9-py3-none-any.whl", hash = "sha256:5026bae9a10eeaefb61dab2f09052b9f4307d44aee4eda64b309723d8d206bbc"},
1049 | {file = "pyparsing-3.0.9.tar.gz", hash = "sha256:2b020ecf7d21b687f219b71ecad3631f644a47f01403fa1d1036b0c6416d70fb"},
1050 | ]
1051 |
1052 | [package.extras]
1053 | diagrams = ["jinja2", "railroad-diagrams"]
1054 |
1055 | [[package]]
1056 | name = "pyrsistent"
1057 | version = "0.19.3"
1058 | description = "Persistent/Functional/Immutable data structures"
1059 | category = "main"
1060 | optional = false
1061 | python-versions = ">=3.7"
1062 | files = [
1063 | {file = "pyrsistent-0.19.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:20460ac0ea439a3e79caa1dbd560344b64ed75e85d8703943e0b66c2a6150e4a"},
1064 | {file = "pyrsistent-0.19.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4c18264cb84b5e68e7085a43723f9e4c1fd1d935ab240ce02c0324a8e01ccb64"},
1065 | {file = "pyrsistent-0.19.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4b774f9288dda8d425adb6544e5903f1fb6c273ab3128a355c6b972b7df39dcf"},
1066 | {file = "pyrsistent-0.19.3-cp310-cp310-win32.whl", hash = "sha256:5a474fb80f5e0d6c9394d8db0fc19e90fa540b82ee52dba7d246a7791712f74a"},
1067 | {file = "pyrsistent-0.19.3-cp310-cp310-win_amd64.whl", hash = "sha256:49c32f216c17148695ca0e02a5c521e28a4ee6c5089f97e34fe24163113722da"},
1068 | {file = "pyrsistent-0.19.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:f0774bf48631f3a20471dd7c5989657b639fd2d285b861237ea9e82c36a415a9"},
1069 | {file = "pyrsistent-0.19.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3ab2204234c0ecd8b9368dbd6a53e83c3d4f3cab10ecaf6d0e772f456c442393"},
1070 | {file = "pyrsistent-0.19.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e42296a09e83028b3476f7073fcb69ffebac0e66dbbfd1bd847d61f74db30f19"},
1071 | {file = "pyrsistent-0.19.3-cp311-cp311-win32.whl", hash = "sha256:64220c429e42a7150f4bfd280f6f4bb2850f95956bde93c6fda1b70507af6ef3"},
1072 | {file = "pyrsistent-0.19.3-cp311-cp311-win_amd64.whl", hash = "sha256:016ad1afadf318eb7911baa24b049909f7f3bb2c5b1ed7b6a8f21db21ea3faa8"},
1073 | {file = "pyrsistent-0.19.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:c4db1bd596fefd66b296a3d5d943c94f4fac5bcd13e99bffe2ba6a759d959a28"},
1074 | {file = "pyrsistent-0.19.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aeda827381f5e5d65cced3024126529ddc4289d944f75e090572c77ceb19adbf"},
1075 | {file = "pyrsistent-0.19.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:42ac0b2f44607eb92ae88609eda931a4f0dfa03038c44c772e07f43e738bcac9"},
1076 | {file = "pyrsistent-0.19.3-cp37-cp37m-win32.whl", hash = "sha256:e8f2b814a3dc6225964fa03d8582c6e0b6650d68a232df41e3cc1b66a5d2f8d1"},
1077 | {file = "pyrsistent-0.19.3-cp37-cp37m-win_amd64.whl", hash = "sha256:c9bb60a40a0ab9aba40a59f68214eed5a29c6274c83b2cc206a359c4a89fa41b"},
1078 | {file = "pyrsistent-0.19.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:a2471f3f8693101975b1ff85ffd19bb7ca7dd7c38f8a81701f67d6b4f97b87d8"},
1079 | {file = "pyrsistent-0.19.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cc5d149f31706762c1f8bda2e8c4f8fead6e80312e3692619a75301d3dbb819a"},
1080 | {file = "pyrsistent-0.19.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3311cb4237a341aa52ab8448c27e3a9931e2ee09561ad150ba94e4cfd3fc888c"},
1081 | {file = "pyrsistent-0.19.3-cp38-cp38-win32.whl", hash = "sha256:f0e7c4b2f77593871e918be000b96c8107da48444d57005b6a6bc61fb4331b2c"},
1082 | {file = "pyrsistent-0.19.3-cp38-cp38-win_amd64.whl", hash = "sha256:c147257a92374fde8498491f53ffa8f4822cd70c0d85037e09028e478cababb7"},
1083 | {file = "pyrsistent-0.19.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:b735e538f74ec31378f5a1e3886a26d2ca6351106b4dfde376a26fc32a044edc"},
1084 | {file = "pyrsistent-0.19.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:99abb85579e2165bd8522f0c0138864da97847875ecbd45f3e7e2af569bfc6f2"},
1085 | {file = "pyrsistent-0.19.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3a8cb235fa6d3fd7aae6a4f1429bbb1fec1577d978098da1252f0489937786f3"},
1086 | {file = "pyrsistent-0.19.3-cp39-cp39-win32.whl", hash = "sha256:c74bed51f9b41c48366a286395c67f4e894374306b197e62810e0fdaf2364da2"},
1087 | {file = "pyrsistent-0.19.3-cp39-cp39-win_amd64.whl", hash = "sha256:878433581fc23e906d947a6814336eee031a00e6defba224234169ae3d3d6a98"},
1088 | {file = "pyrsistent-0.19.3-py3-none-any.whl", hash = "sha256:ccf0d6bd208f8111179f0c26fdf84ed7c3891982f2edaeae7422575f47e66b64"},
1089 | {file = "pyrsistent-0.19.3.tar.gz", hash = "sha256:1a2994773706bbb4995c31a97bc94f1418314923bd1048c6d964837040376440"},
1090 | ]
1091 |
1092 | [[package]]
1093 | name = "python-dateutil"
1094 | version = "2.8.2"
1095 | description = "Extensions to the standard Python datetime module"
1096 | category = "main"
1097 | optional = false
1098 | python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7"
1099 | files = [
1100 | {file = "python-dateutil-2.8.2.tar.gz", hash = "sha256:0123cacc1627ae19ddf3c27a5de5bd67ee4586fbdd6440d9748f8abb483d3e86"},
1101 | {file = "python_dateutil-2.8.2-py2.py3-none-any.whl", hash = "sha256:961d03dc3453ebbc59dbdea9e4e11c5651520a876d0f4db161e8674aae935da9"},
1102 | ]
1103 |
1104 | [package.dependencies]
1105 | six = ">=1.5"
1106 |
1107 | [[package]]
1108 | name = "python3-xlib"
1109 | version = "0.15"
1110 | description = "Python3 X Library"
1111 | category = "main"
1112 | optional = false
1113 | python-versions = "*"
1114 | files = [
1115 | {file = "python3-xlib-0.15.tar.gz", hash = "sha256:dc4245f3ae4aa5949c1d112ee4723901ade37a96721ba9645f2bfa56e5b383f8"},
1116 | ]
1117 |
1118 | [[package]]
1119 | name = "pytz"
1120 | version = "2023.3"
1121 | description = "World timezone definitions, modern and historical"
1122 | category = "main"
1123 | optional = false
1124 | python-versions = "*"
1125 | files = [
1126 | {file = "pytz-2023.3-py2.py3-none-any.whl", hash = "sha256:a151b3abb88eda1d4e34a9814df37de2a80e301e68ba0fd856fb9b46bfbbbffb"},
1127 | {file = "pytz-2023.3.tar.gz", hash = "sha256:1d8ce29db189191fb55338ee6d0387d82ab59f3d00eac103412d64e0ebd0c588"},
1128 | ]
1129 |
1130 | [[package]]
1131 | name = "pywin32-ctypes"
1132 | version = "0.2.0"
1133 | description = ""
1134 | category = "main"
1135 | optional = false
1136 | python-versions = "*"
1137 | files = [
1138 | {file = "pywin32-ctypes-0.2.0.tar.gz", hash = "sha256:24ffc3b341d457d48e8922352130cf2644024a4ff09762a2261fd34c36ee5942"},
1139 | {file = "pywin32_ctypes-0.2.0-py2.py3-none-any.whl", hash = "sha256:9dc2d991b3479cc2df15930958b674a48a227d5361d413827a4cfd0b5876fc98"},
1140 | ]
1141 |
1142 | [[package]]
1143 | name = "requests"
1144 | version = "2.30.0"
1145 | description = "Python HTTP for Humans."
1146 | category = "main"
1147 | optional = false
1148 | python-versions = ">=3.7"
1149 | files = [
1150 | {file = "requests-2.30.0-py3-none-any.whl", hash = "sha256:10e94cc4f3121ee6da529d358cdaeaff2f1c409cd377dbc72b825852f2f7e294"},
1151 | {file = "requests-2.30.0.tar.gz", hash = "sha256:239d7d4458afcb28a692cdd298d87542235f4ca8d36d03a15bfc128a6559a2f4"},
1152 | ]
1153 |
1154 | [package.dependencies]
1155 | certifi = ">=2017.4.17"
1156 | charset-normalizer = ">=2,<4"
1157 | idna = ">=2.5,<4"
1158 | urllib3 = ">=1.21.1,<3"
1159 |
1160 | [package.extras]
1161 | socks = ["PySocks (>=1.5.6,!=1.5.7)"]
1162 | use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
1163 |
1164 | [[package]]
1165 | name = "requests-oauthlib"
1166 | version = "1.3.1"
1167 | description = "OAuthlib authentication support for Requests."
1168 | category = "main"
1169 | optional = false
1170 | python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
1171 | files = [
1172 | {file = "requests-oauthlib-1.3.1.tar.gz", hash = "sha256:75beac4a47881eeb94d5ea5d6ad31ef88856affe2332b9aafb52c6452ccf0d7a"},
1173 | {file = "requests_oauthlib-1.3.1-py2.py3-none-any.whl", hash = "sha256:2577c501a2fb8d05a304c09d090d6e47c306fef15809d102b327cf8364bddab5"},
1174 | ]
1175 |
1176 | [package.dependencies]
1177 | oauthlib = ">=3.0.0"
1178 | requests = ">=2.0.0"
1179 |
1180 | [package.extras]
1181 | rsa = ["oauthlib[signedtoken] (>=3.0.0)"]
1182 |
1183 | [[package]]
1184 | name = "retrying"
1185 | version = "1.3.4"
1186 | description = "Retrying"
1187 | category = "main"
1188 | optional = false
1189 | python-versions = "*"
1190 | files = [
1191 | {file = "retrying-1.3.4-py3-none-any.whl", hash = "sha256:8cc4d43cb8e1125e0ff3344e9de678fefd85db3b750b81b2240dc0183af37b35"},
1192 | {file = "retrying-1.3.4.tar.gz", hash = "sha256:345da8c5765bd982b1d1915deb9102fd3d1f7ad16bd84a9700b85f64d24e8f3e"},
1193 | ]
1194 |
1195 | [package.dependencies]
1196 | six = ">=1.7.0"
1197 |
1198 | [[package]]
1199 | name = "rsa"
1200 | version = "4.9"
1201 | description = "Pure-Python RSA implementation"
1202 | category = "main"
1203 | optional = false
1204 | python-versions = ">=3.6,<4"
1205 | files = [
1206 | {file = "rsa-4.9-py3-none-any.whl", hash = "sha256:90260d9058e514786967344d0ef75fa8727eed8a7d2e43ce9f4bcf1b536174f7"},
1207 | {file = "rsa-4.9.tar.gz", hash = "sha256:e38464a49c6c85d7f1351b0126661487a7e0a14a50f1675ec50eb34d4f20ef21"},
1208 | ]
1209 |
1210 | [package.dependencies]
1211 | pyasn1 = ">=0.1.3"
1212 |
1213 | [[package]]
1214 | name = "secretstorage"
1215 | version = "3.3.3"
1216 | description = "Python bindings to FreeDesktop.org Secret Service API"
1217 | category = "main"
1218 | optional = false
1219 | python-versions = ">=3.6"
1220 | files = [
1221 | {file = "SecretStorage-3.3.3-py3-none-any.whl", hash = "sha256:f356e6628222568e3af06f2eba8df495efa13b3b63081dafd4f7d9a7b7bc9f99"},
1222 | {file = "SecretStorage-3.3.3.tar.gz", hash = "sha256:2403533ef369eca6d2ba81718576c5e0f564d5cca1b58f73a8b23e7d4eeebd77"},
1223 | ]
1224 |
1225 | [package.dependencies]
1226 | cryptography = ">=2.0"
1227 | jeepney = ">=0.6"
1228 |
1229 | [[package]]
1230 | name = "selfspy"
1231 | version = "0.3.0"
1232 | description = "Log everything you do on the computer, for statistics, future reference and all-around fun!"
1233 | category = "main"
1234 | optional = false
1235 | python-versions = "*"
1236 | files = []
1237 | develop = false
1238 |
1239 | [package.dependencies]
1240 | keyring = ">=1.2.2"
1241 | lockfile = ">=0.9.1"
1242 | SQLAlchemy = "0.9.4"
1243 |
1244 | [package.source]
1245 | type = "git"
1246 | url = "https://github.com/kovasap/selfspy.git"
1247 | reference = "master"
1248 | resolved_reference = "12139b966562790834d5126f148a33f637102a7e"
1249 |
1250 | [[package]]
1251 | name = "six"
1252 | version = "1.16.0"
1253 | description = "Python 2 and 3 compatibility utilities"
1254 | category = "main"
1255 | optional = false
1256 | python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*"
1257 | files = [
1258 | {file = "six-1.16.0-py2.py3-none-any.whl", hash = "sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254"},
1259 | {file = "six-1.16.0.tar.gz", hash = "sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926"},
1260 | ]
1261 |
1262 | [[package]]
1263 | name = "sortedcontainers"
1264 | version = "2.4.0"
1265 | description = "Sorted Containers -- Sorted List, Sorted Dict, Sorted Set"
1266 | category = "main"
1267 | optional = false
1268 | python-versions = "*"
1269 | files = [
1270 | {file = "sortedcontainers-2.4.0-py2.py3-none-any.whl", hash = "sha256:a163dcaede0f1c021485e957a39245190e74249897e2ae4b2aa38595db237ee0"},
1271 | {file = "sortedcontainers-2.4.0.tar.gz", hash = "sha256:25caa5a06cc30b6b83d11423433f65d1f9d76c4c6a0c90e3379eaa43b9bfdb88"},
1272 | ]
1273 |
1274 | [[package]]
1275 | name = "soupsieve"
1276 | version = "2.4.1"
1277 | description = "A modern CSS selector implementation for Beautiful Soup."
1278 | category = "main"
1279 | optional = false
1280 | python-versions = ">=3.7"
1281 | files = [
1282 | {file = "soupsieve-2.4.1-py3-none-any.whl", hash = "sha256:1c1bfee6819544a3447586c889157365a27e10d88cde3ad3da0cf0ddf646feb8"},
1283 | {file = "soupsieve-2.4.1.tar.gz", hash = "sha256:89d12b2d5dfcd2c9e8c22326da9d9aa9cb3dfab0a83a024f05704076ee8d35ea"},
1284 | ]
1285 |
1286 | [[package]]
1287 | name = "sqlalchemy"
1288 | version = "0.9.4"
1289 | description = "Database Abstraction Library"
1290 | category = "main"
1291 | optional = false
1292 | python-versions = "*"
1293 | files = [
1294 | {file = "SQLAlchemy-0.9.4.tar.gz", hash = "sha256:bc87674f5ac9962e0efe96f060ba22a07e2c61fbebf531efc383b82db5b291ec"},
1295 | ]
1296 |
1297 | [[package]]
1298 | name = "toolz"
1299 | version = "0.12.0"
1300 | description = "List processing tools and functional utilities"
1301 | category = "main"
1302 | optional = false
1303 | python-versions = ">=3.5"
1304 | files = [
1305 | {file = "toolz-0.12.0-py3-none-any.whl", hash = "sha256:2059bd4148deb1884bb0eb770a3cde70e7f954cfbbdc2285f1f2de01fd21eb6f"},
1306 | {file = "toolz-0.12.0.tar.gz", hash = "sha256:88c570861c440ee3f2f6037c4654613228ff40c93a6c25e0eba70d17282c6194"},
1307 | ]
1308 |
1309 | [[package]]
1310 | name = "typing-extensions"
1311 | version = "4.5.0"
1312 | description = "Backported and Experimental Type Hints for Python 3.7+"
1313 | category = "main"
1314 | optional = false
1315 | python-versions = ">=3.7"
1316 | files = [
1317 | {file = "typing_extensions-4.5.0-py3-none-any.whl", hash = "sha256:fb33085c39dd998ac16d1431ebc293a8b3eedd00fd4a32de0ff79002c19511b4"},
1318 | {file = "typing_extensions-4.5.0.tar.gz", hash = "sha256:5cb5f4a79139d699607b3ef622a1dedafa84e115ab0024e0d9c044a9479ca7cb"},
1319 | ]
1320 |
1321 | [[package]]
1322 | name = "uritemplate"
1323 | version = "4.1.1"
1324 | description = "Implementation of RFC 6570 URI Templates"
1325 | category = "main"
1326 | optional = false
1327 | python-versions = ">=3.6"
1328 | files = [
1329 | {file = "uritemplate-4.1.1-py2.py3-none-any.whl", hash = "sha256:830c08b8d99bdd312ea4ead05994a38e8936266f84b9a7878232db50b044e02e"},
1330 | {file = "uritemplate-4.1.1.tar.gz", hash = "sha256:4346edfc5c3b79f694bccd6d6099a322bbeb628dbf2cd86eea55a456ce5124f0"},
1331 | ]
1332 |
1333 | [[package]]
1334 | name = "urllib3"
1335 | version = "2.0.2"
1336 | description = "HTTP library with thread-safe connection pooling, file post, and more."
1337 | category = "main"
1338 | optional = false
1339 | python-versions = ">=3.7"
1340 | files = [
1341 | {file = "urllib3-2.0.2-py3-none-any.whl", hash = "sha256:d055c2f9d38dc53c808f6fdc8eab7360b6fdbbde02340ed25cfbcd817c62469e"},
1342 | {file = "urllib3-2.0.2.tar.gz", hash = "sha256:61717a1095d7e155cdb737ac7bb2f4324a858a1e2e6466f6d03ff630ca68d3cc"},
1343 | ]
1344 |
1345 | [package.extras]
1346 | brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
1347 | secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
1348 | socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
1349 | zstd = ["zstandard (>=0.18.0)"]
1350 |
1351 | [[package]]
1352 | name = "yapf"
1353 | version = "0.31.0"
1354 | description = "A formatter for Python code."
1355 | category = "main"
1356 | optional = false
1357 | python-versions = "*"
1358 | files = [
1359 | {file = "yapf-0.31.0-py2.py3-none-any.whl", hash = "sha256:e3a234ba8455fe201eaa649cdac872d590089a18b661e39bbac7020978dd9c2e"},
1360 | {file = "yapf-0.31.0.tar.gz", hash = "sha256:408fb9a2b254c302f49db83c59f9aa0b4b0fd0ec25be3a5c51181327922ff63d"},
1361 | ]
1362 |
1363 | [[package]]
1364 | name = "zipp"
1365 | version = "3.15.0"
1366 | description = "Backport of pathlib-compatible object wrapper for zip files"
1367 | category = "main"
1368 | optional = false
1369 | python-versions = ">=3.7"
1370 | files = [
1371 | {file = "zipp-3.15.0-py3-none-any.whl", hash = "sha256:48904fc76a60e542af151aded95726c1a5c34ed43ab4134b597665c86d7ad556"},
1372 | {file = "zipp-3.15.0.tar.gz", hash = "sha256:112929ad649da941c23de50f356a2b5570c954b65150642bccdd66bf194d224b"},
1373 | ]
1374 |
1375 | [package.extras]
1376 | docs = ["furo", "jaraco.packaging (>=9)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-lint"]
1377 | testing = ["big-O", "flake8 (<5)", "jaraco.functools", "jaraco.itertools", "more-itertools", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=1.3)", "pytest-flake8", "pytest-mypy (>=0.9.1)"]
1378 |
1379 | [metadata]
1380 | lock-version = "2.0"
1381 | python-versions = "^3.8"
1382 | content-hash = "0e22874ca885552b432a847e227a703af7196b4c31a891521eef15daeb3a995b"
1383 |
--------------------------------------------------------------------------------
/pyproject.toml:
--------------------------------------------------------------------------------
1 | [tool.poetry]
2 | name = "autojournal"
3 | version = "0.1.0"
4 | description = ""
5 | authors = ["Kovas Palunas "]
6 | packages = [
7 | { include = "autojournal" }
8 | ]
9 |
10 | [tool.poetry.scripts]
11 | report_generator = "autojournal.report_generator:main"
12 | gcal_aggregator = "autojournal.gcal_aggregator:main"
13 |
14 | [tool.poetry.dependencies]
15 | python = "^3.8"
16 | python-dateutil = "^2.8.1"
17 | google-auth-oauthlib = "^0.4.4"
18 | google-api-python-client = "^2.2.0"
19 | sortedcontainers = "^2.3.0"
20 | pandas = "^1.2.4"
21 | plotly = "^4.14.3"
22 | click = "^7.1.2"
23 | bs4 = "^0.0.1"
24 | lxml = "^4.6.3"
25 | psutil = "^5.8.0"
26 | selfspy = {git = "https://github.com/kovasap/selfspy.git", branch="master"}
27 | yapf = "^0.31.0"
28 | python3-xlib = "^0.15"
29 | geopy = "^2.1.0"
30 | altair = "^5.0.0"
31 |
32 | [tool.poetry.dev-dependencies]
33 |
34 | [tool.yapf]
35 | based_on_style = "google"
36 | indent_width = 2
37 |
38 | [tool.pylint]
39 | indent-string = ' '
40 |
41 | [build-system]
42 | requires = ["poetry-core>=1.0.0"]
43 | build-backend = "poetry.core.masonry.api"
44 |
--------------------------------------------------------------------------------
/run_gcal_aggregator.bash:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | # Run gcal_aggregator in the background from the autojournal checkout, logging to ~/autojournal.log.
3 | (cd ~/autojournal; nohup /home/pi/.poetry/bin/poetry run gcal_aggregator --update all &> ~/autojournal.log &)
4 |
--------------------------------------------------------------------------------