├── .gitignore
├── Dockerfile
├── License.txt
├── Makefile
├── Readme.md
├── backup.py
├── config.sample.yml
├── doomsdaymachine
│   ├── __init__.py
│   ├── backup.py
│   ├── backup_log.py
│   ├── datasources
│   │   ├── github.py
│   │   ├── google_contacts.py
│   │   ├── google_drive.py
│   │   ├── imap.py
│   │   └── lastpass.py
│   ├── notification.py
│   ├── sync_map.py
│   ├── sync_map2.py
│   ├── util.py
│   └── webserver.py
├── generate_credentials.py
├── requirements.txt
├── screenshot.png
├── static
│   ├── index.html
│   ├── reset.css
│   ├── script.js
│   └── style.css
└── webserver.py

--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
.DS_Store
config.yml
__pycache__
*.json
data/**

--------------------------------------------------------------------------------
/Dockerfile:
--------------------------------------------------------------------------------
FROM python:3.8

WORKDIR /src/app

COPY . .

RUN pip3 install -r requirements.txt

CMD ["python3", "backup.py"]

--------------------------------------------------------------------------------
/License.txt:
--------------------------------------------------------------------------------
MIT License

Copyright (c) 2020 John E. Jones IV

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

--------------------------------------------------------------------------------
/Makefile:
--------------------------------------------------------------------------------
install:
	pip3 install -r requirements.txt

backup:
	python3 backup.py

webserver:
	python3 webserver.py

# generate_credentials.py requires a job name argument, e.g.
# make authenticate NAME="Google Contacts"
authenticate:
	python3 generate_credentials.py "$(NAME)"

--------------------------------------------------------------------------------
/Readme.md:
--------------------------------------------------------------------------------
# Doomsday Machine

![Screenshot of Doomsday Machine](screenshot.png)

## About

Doomsday Machine is a tool for backing up cloud services to a local machine.

## Installation

To install the main script, run the following:

```sh
$ git clone git@github.com:johnjones4/Doomsday-Machine.git
$ cd Doomsday-Machine
$ make install
```

If you would like to run Doomsday Machine on a schedule, I recommend using something like [Supervisord](http://supervisord.org/) to manage the application. Once you have Supervisord installed for your distro, you can use the following as a template Supervisord configuration for Doomsday Machine.
Note that this configuration expects the project to be checked out at `/usr/local/src/Doomsday-Machine/`, a logging directory at `/var/log/doomsday/`, and a config file at `/var/lib/doomsday/config.yml`.

```
[program:doomsday]
command=/usr/bin/python3 /usr/local/src/Doomsday-Machine/backup.py
directory=/usr/local/src/Doomsday-Machine
autostart=true
autorestart=true
startretries=3
stderr_logfile=/var/log/doomsday/supervisor-backup.err.log
stdout_logfile=/var/log/doomsday/supervisor-backup.out.log
user=root
environment=CONFIG_FILE='/var/lib/doomsday/config.yml'

[program:doomsdaywebserver]
command=/usr/bin/python3 /usr/local/src/Doomsday-Machine/webserver.py
directory=/usr/local/src/Doomsday-Machine
autostart=true
autorestart=true
startretries=3
stderr_logfile=/var/log/doomsday/supervisor-webserver.err.log
stdout_logfile=/var/log/doomsday/supervisor-webserver.out.log
user=root
environment=CONFIG_FILE='/var/lib/doomsday/config.yml'
```

## Setup

The file `config.sample.yml` includes all configuration options for the project. Copy that file to `config.yml` and update it to meet your needs; you may remove or duplicate any job in the list. Specify the absolute path to this file in an environment variable named `CONFIG_FILE`.

### Backup Jobs

#### Dropbox Setup

To set up Dropbox access, go to the [Dropbox App Console](https://www.dropbox.com/developers/apps) and create a new app. After setting up the app (you only need to provide basic information, as you'll be the only one using it), click "Generate" under "Generated access token." Copy and paste the generated token into the configuration option `oauth2_access_token`. In the Dropbox configuration you may also specify a whitelist of paths to download under `include`; omit that option to download all files.
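The `include` whitelist in `config.sample.yml` uses shell-style patterns such as `/folder/pattern/*`. As a minimal sketch of how such a whitelist can be applied with the standard library's `fnmatch` (the `should_download` helper here is illustrative, not part of this codebase):

```python
# Illustrative only: applying an `include` whitelist of shell-style path
# patterns like the one in config.sample.yml. `should_download` is a
# hypothetical helper, not a function in this project.
from fnmatch import fnmatch

def should_download(file_path, include=None):
    # No whitelist configured: download everything.
    if not include:
        return True
    # Download only paths matching at least one pattern.
    return any(fnmatch(file_path, pattern) for pattern in include)

print(should_download("/folder/pattern/notes.txt", ["/folder/pattern/*"]))  # True
print(should_download("/other/file.txt", ["/folder/pattern/*"]))            # False
```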

#### Google Contacts

Setup for this data source is a bit more complex. First, create an application on the [Google API Console](https://console.developers.google.com/), give the application access to the _Google People API_, and create OAuth credentials for a desktop client.

Now, copy and paste the new client credentials into your `config.yml` file as `client_id` and `client_secret` under `options` for the job named `Google Contacts`. Next, run `python3 generate_credentials.py "Google Contacts"`, which will print an authorization URL. Open that URL in a browser, agree to give your new application access to your contacts, and copy and paste the generated code into your `config.yml` as `code`. Run the script one last time; it will print a `token` and a `refresh_token`. Copy and paste those values into `token` and `refresh_token` in your `config.yml`.

#### LastPass

To set up LastPass access, specify your username and password under `options`.

#### GitHub

To set up GitHub access, go to your [Personal access tokens](https://github.com/settings/tokens), click "Generate new token," and select everything under the "repo" scope. Copy and paste the generated token into the configuration option `access_token`.

#### IMAP

To add an email/IMAP account, specify all of the standard IMAP connection details as well as a list of mailboxes to download.

### Other Configuration Options

#### Notifications

Upon completion of each backup job, Doomsday Machine can send an email notification. To enable that, specify SMTP connection details under `email_notification`.

#### Delay

You can specify delay times (in seconds) under `delay`: `job` is the pause between jobs and `cycle` is the pause between run loops.

#### Outputs

Doomsday Machine will rotate through the list of output directories under `outputs`, in case you want to keep multiple redundant backups.
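The rotation over `outputs` works the way `doomsdaymachine/backup.py` advances its index: each full backup cycle writes to the next directory in the list, wrapping back to the first. In miniature:

```python
# Miniature of the output rotation in doomsdaymachine/backup.py:
# each cycle targets the next directory in `outputs`, wrapping around.
outputs = ["./data/1", "./data/2", "./data/3"]

i = 0
schedule = []
for _ in range(5):  # five consecutive backup cycles
    schedule.append(outputs[i])
    i = i + 1 if i + 1 < len(outputs) else 0

print(schedule)  # ['./data/1', './data/2', './data/3', './data/1', './data/2']
```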

#### Active Job

Doomsday Machine records the start and end of each backup job in a SQLite log at the path specified in `backup_log_path`; the bundled web server reads this log to report the currently active job.

#### Logging

To control logging, set a logging level and/or output file under `log`.

--------------------------------------------------------------------------------
/backup.py:
--------------------------------------------------------------------------------
from doomsdaymachine.backup import start

if __name__ == '__main__':
    start()

--------------------------------------------------------------------------------
/config.sample.yml:
--------------------------------------------------------------------------------
jobs:
  - type: dropbox
    name: Dropbox
    options:
      oauth2_access_token:
      include:
        - /folder/pattern/*
  - type: google_contacts
    name: Google Contacts
    options:
      client_id:
      client_secret:
      token:
      refresh_token:
      code:
  - type: lastpass
    name: Lastpass
    options:
      username:
      password:
  - type: github
    name: GitHub
    options:
      access_token:
  - type: imap
    name: Gmail
    options:
      host: imap.gmail.com
      port: 993
      tls: false
      ssl: true
      username:
      password:
      mailboxes:
        - '"[Gmail]/All Mail"'
        - Inbox

email_notification:
  from:
  to:
  host: smtp.gmail.com
  port: 465
  username:
  password:
  ssl: true
  tls: false

delay:
  job: 0
  cycle: 0

outputs:
  - ./data/1
  - ./data/2
  - ./data/3
backup_log_path: ./data/log.sqlite3

log:
  level: DEBUG
  file: ./data/log.log

--------------------------------------------------------------------------------
/doomsdaymachine/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/johnjones4/Doomsday-Machine/63e9aedc1b10c556ec19c7987071f48ffd4e8524/doomsdaymachine/__init__.py

--------------------------------------------------------------------------------
/doomsdaymachine/backup.py:
--------------------------------------------------------------------------------
import time
import os
import os.path as path
import logging
from doomsdaymachine.datasources.google_contacts import execute_google_contacts
from doomsdaymachine.datasources.lastpass import execute_lastpass
from doomsdaymachine.datasources.github import execute_github
from doomsdaymachine.datasources.imap import execute_imap
from doomsdaymachine.datasources.google_drive import execute_google_drive
from doomsdaymachine.util import load_config
from doomsdaymachine.notification import send_notification
from doomsdaymachine.backup_log import BackupLog

def start():
    config = load_config()
    logger = logging.getLogger(__name__)
    handler = logging.FileHandler(config["log"]["file"]) if "log" in config and "file" in config["log"] else logging.StreamHandler()
    handler.setFormatter(logging.Formatter("%(asctime)s:%(levelname)s:%(message)s"))
    logger.addHandler(handler)
    logger.setLevel(logging.getLevelName(config["log"]["level"]) if "log" in config and "level" in config["log"] else logging.INFO)
    i = 0
    backup_log = BackupLog(config)
    while True:
        config["output"] = config["outputs"][i]
        i = i + 1 if i + 1 < len(config["outputs"]) else 0
        for job in config["jobs"]:
            job["output_folder"] = path.join(config["output"], job["id"])
            try:
                (start_time, job_instance_id) = backup_log.start_job(job["id"])
                execute_job(logger, config, job)
                end_time = backup_log.end_job(job["id"], job_instance_id)
                elapsed_time = end_time - start_time
                send_notification(config, f"Completed {job['type']}/{job['name']} to {job['output_folder']} in {elapsed_time} seconds.")
            except Exception as e:
                logger.error(f"Exception during {job['type']}/{job['name']} to {job['output_folder']}", exc_info=True)
                send_notification(config, f"Exception during {job['type']}/{job['name']} to {job['output_folder']}: {str(e)}.")
            if "delay" in config and "job" in config["delay"]:
                time.sleep(config["delay"]["job"])
        if "delay" in config and "cycle" in config["delay"]:
            time.sleep(config["delay"]["cycle"])


def execute_job(logger, config, job):
    logger.info(f"Starting {job['type']}/{job['name']} ({job['id']})")
    # Create the job's output folder (and any missing parents) before running,
    # then dispatch on the job type. Attaching the dispatch chain to the
    # directory check with `elif` would skip the job on its first run.
    if not path.isdir(job["output_folder"]):
        os.makedirs(job["output_folder"])
    if job["type"] == "google_contacts":
        execute_google_contacts(logger, config, job)
    elif job["type"] == "lastpass":
        execute_lastpass(logger, config, job)
    elif job["type"] == "github":
        execute_github(logger, config, job)
    elif job["type"] == "imap":
        execute_imap(logger, config, job)
    elif job["type"] == "google_drive":
        execute_google_drive(logger, config, job)

--------------------------------------------------------------------------------
/doomsdaymachine/backup_log.py:
--------------------------------------------------------------------------------
import sqlite3
import time
from uuid import uuid4

class BackupLog():
    def __init__(self, config):
        self.connection = sqlite3.connect(config["backup_log_path"])
        self.cursor = self.connection.cursor()
        self.cursor.execute("CREATE TABLE IF NOT EXISTS log (job TEXT NOT NULL, job_instance_id TEXT NOT NULL, event TEXT NOT NULL, timestamp NUMBER NOT NULL)")

    def start_job(self, job):
        start_time = time.time()
        job_instance_id = str(uuid4())
        self.cursor.execute("INSERT INTO log (job, job_instance_id, event, timestamp) VALUES (?, ?, 'start', ?)", (job, job_instance_id, start_time))
        self.connection.commit()
        return (start_time, job_instance_id)

    def end_job(self, job, job_instance_id):
        end_time = time.time()
        self.cursor.execute("INSERT INTO log (job, job_instance_id, event, timestamp) VALUES (?, ?, 'end', ?)", (job, job_instance_id, end_time))
        self.connection.commit()
        return end_time

    def get_active_job(self):
        self.cursor.execute("SELECT job, event, timestamp FROM log ORDER BY timestamp DESC LIMIT 1")
        row = self.cursor.fetchone()
        if row and row[1] == "start":
            return dict(
                job=row[0],
                start_time=row[2]
            )
        return dict(
            job=None,
            start_time=None
        )

    def get_last_execution_time(self, job):
        self.cursor.execute("SELECT event, timestamp FROM log WHERE job = ? ORDER BY timestamp DESC LIMIT 2", (job, ))
        rows = self.cursor.fetchall()
        if len(rows) != 2 or rows[0][0] == "start" or rows[1][0] == "end":
            return None
        return rows[0][1] - rows[1][1]

--------------------------------------------------------------------------------
/doomsdaymachine/datasources/github.py:
--------------------------------------------------------------------------------
from github import Github
import os.path as path
import os
import urllib.request
from doomsdaymachine.util import format_filename
from doomsdaymachine.sync_map2 import SyncMap2


def execute_github(logger, config, job):
    gclient = Github(job["options"]["access_token"])
    download_github_repositories(logger, job, gclient)
    download_github_gists(logger, job, gclient)

def download_github_repositories(logger, job, gclient):
    logger.info("Getting repositories")
    projects_folder = path.join(job["output_folder"], "repositories")
    syncer = SyncMap2(job)
    if not path.isdir(projects_folder):
        os.mkdir(projects_folder)
    for repo in gclient.get_user().get_repos():
        project_file = path.join(projects_folder, repo.full_name)
        if not os.path.isdir(project_file):
            os.makedirs(project_file)
        for branch in repo.get_branches():
            key = repo.full_name + "/" + branch.name
            signature = branch.commit.commit.tree.sha
            if syncer.get_signature(key) != signature:
                logger.info(f"Getting repo {repo.full_name} {branch.name}")
                archive_url = repo.get_archive_link("tarball", branch.name)
                archive_file = path.join(project_file, format_filename(branch.name) + ".tar.gz")
                logger.debug(f"Downloading {archive_url} to {archive_file}")
                with urllib.request.urlopen(archive_url) as dl_file:
                    with open(archive_file, "wb") as out_file:
                        out_file.write(dl_file.read())
                syncer.set_signature(key, signature)
    syncer.save()

def download_github_gists(logger, job, gclient):
    logger.info("Getting gists")
    gists_folder = path.join(job["output_folder"], "gists")
    if not path.isdir(gists_folder):
        os.mkdir(gists_folder)
    for gist in gclient.get_user().get_gists():
        logger.info(f"Getting gist {gist.id}")
        gist_folder = path.join(gists_folder, gist.id)
        if not path.isdir(gist_folder):
            os.mkdir(gist_folder)
        files = gist.files
        for file_name in files:
            logger.debug(f"Downloading {file_name}")
            file_path = path.join(gist_folder, file_name)
            with open(file_path, "w") as file_handle:
                logger.debug(f"Saving {file_name}")
                file_handle.write(files[file_name].content)

--------------------------------------------------------------------------------
/doomsdaymachine/datasources/google_contacts.py:
--------------------------------------------------------------------------------
from googleapiclient.discovery import build
from google_auth_oauthlib.flow import InstalledAppFlow
from google.oauth2.credentials import Credentials
from google.auth.transport.urllib3 import Request
import json
import os.path as path
import urllib3

def execute_google_contacts(logger, config, job):
    credentials = Credentials(
        token=job["options"]["token"],
        refresh_token=job["options"]["refresh_token"],
        token_uri="https://oauth2.googleapis.com/token",
        client_id=job["options"]["client_id"],
        client_secret=job["options"]["client_secret"],
        scopes=["https://www.googleapis.com/auth/contacts.readonly"],
    )
    credentials.refresh(Request(urllib3.PoolManager()))
    service = build("people", "v1", credentials=credentials)
    results = service.people().connections().list(
        resourceName='people/me',
        pageSize=2000,
        personFields='names,emailAddresses'
    ).execute()
    connections = results.get('connections', [])
    with open(path.join(job["output_folder"], "contacts.json"), "w") as contacts_file:
        logger.debug("Saving contacts")
        contacts_file.write(json.dumps(connections))

def generate_flow(job):
    return InstalledAppFlow.from_client_config({
        "installed": {
            "client_id": job["options"]["client_id"],
            "client_secret": job["options"]["client_secret"],
            "redirect_uris": ["urn:ietf:wg:oauth:2.0:oob"],
            "auth_uri": "https://accounts.google.com/o/oauth2/auth",
            "token_uri": "https://oauth2.googleapis.com/token",
            "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
        }
    }, ["https://www.googleapis.com/auth/contacts.readonly"], redirect_uri="urn:ietf:wg:oauth:2.0:oob")

def generate_login_url(job):
    flow = generate_flow(job)
    return flow.authorization_url()[0]

def authorize_code(job):
    flow = generate_flow(job)
    flow.fetch_token(code=job["options"]["code"])
    return {
        "token": flow.credentials.token,
        "refresh_token": flow.credentials.refresh_token
    }

--------------------------------------------------------------------------------
/doomsdaymachine/datasources/google_drive.py:
--------------------------------------------------------------------------------
from googleapiclient.discovery import build
from googleapiclient.http import MediaIoBaseDownload
from google_auth_oauthlib.flow import InstalledAppFlow
from google.oauth2.credentials import Credentials
from google.auth.transport.urllib3 import Request
import os.path as path
import urllib3
import os
from doomsdaymachine.sync_map import SyncMap
import datetime

EXPORTABLE = {
    "application/vnd.google-apps.spreadsheet": "application/x-vnd.oasis.opendocument.spreadsheet",
    "application/vnd.google-apps.presentation": "application/vnd.oasis.opendocument.presentation",
    "application/vnd.google-apps.document": "application/vnd.oasis.opendocument.text",
    "application/vnd.google-apps.drawing": "image/svg+xml"
}

IGNORE = [
    "application/vnd.google-apps.folder",
    "application/vnd.google-apps.form",
    "application/vnd.google-apps.fusiontable",
    "application/vnd.google-apps.map",
    "application/vnd.google-apps.script",
    "application/vnd.google-apps.shortcut",
    "application/vnd.google-apps.site",
    "application/vnd.google-apps.unknown"
]

EXTENSIONS = {
    "application/vnd.google-apps.spreadsheet": "ods",
    "application/vnd.google-apps.presentation": "odp",
    "application/vnd.google-apps.document": "odt",
    "application/vnd.google-apps.drawing": "svg"
}

def execute_google_drive(logger, config, job):
    credentials = Credentials(
        token=job["options"]["token"],
        refresh_token=job["options"]["refresh_token"],
        token_uri="https://oauth2.googleapis.com/token",
        client_id=job["options"]["client_id"],
        client_secret=job["options"]["client_secret"],
        scopes=["https://www.googleapis.com/auth/drive.readonly"],
    )
    credentials.refresh(Request(urllib3.PoolManager()))
    service = build('drive', 'v3', credentials=credentials)

    syncer = SyncMap(job)

    file_output_folder = path.join(job["output_folder"], "files")
    if not path.isdir(file_output_folder):
        os.mkdir(file_output_folder)

    results = None
    pageToken = None
    while not results or pageToken:
        results = service.files().list(
            pageSize=100,
            fields="nextPageToken, files(*)",
            pageToken=pageToken
        ).execute()
        pageToken = results.get('nextPageToken', None)
        for item in results.get('files', []):
            try:
                # Drive's modifiedTime is RFC 3339 in UTC, so parse it as UTC.
                modified_time = datetime.datetime.strptime(item["modifiedTime"], "%Y-%m-%dT%H:%M:%S.%fZ").replace(tzinfo=datetime.timezone.utc).timestamp()
                if modified_time > syncer.get_last_modified(item["id"]) and item["capabilities"]["canDownload"] and item["mimeType"] not in IGNORE:
                    if item["mimeType"] in EXPORTABLE:
                        request = service.files().export(
                            fileId=item["id"],
                            mimeType=EXPORTABLE[item["mimeType"]]
                        )
                        output_file = item["id"] + "_" + item["name"].strip().replace("/", "-") + "." + EXTENSIONS[item["mimeType"]]
                    else:
                        request = service.files().get_media(fileId=item["id"])
                        output_file = item["id"] + "_" + item["name"].strip().replace("/", "-")
                    if request:
                        logger.debug(f"Downloading: {item['name']}")
                        with open(path.join(file_output_folder, output_file), "wb") as output_file_handler:
                            downloader = MediaIoBaseDownload(output_file_handler, request)
                            done = False
                            while not done:
                                _, done = downloader.next_chunk()
                        syncer.set_last_modified(item["id"], modified_time)
            except Exception:
                logger.error(f"Exception during {job['type']}/{job['name']} to {job['output_folder']}", exc_info=True)
    syncer.save()
    syncer.close()


def generate_flow(job):
    return InstalledAppFlow.from_client_config({
        "installed": {
            "client_id": job["options"]["client_id"],
            "client_secret": job["options"]["client_secret"],
            "redirect_uris": ["urn:ietf:wg:oauth:2.0:oob"],
            "auth_uri": "https://accounts.google.com/o/oauth2/auth",
            "token_uri": "https://oauth2.googleapis.com/token",
            "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
        }
    }, ["https://www.googleapis.com/auth/drive.readonly"], redirect_uri="urn:ietf:wg:oauth:2.0:oob")

def generate_login_url(job):
    flow = generate_flow(job)
    return flow.authorization_url()[0]

def authorize_code(job):
    flow = generate_flow(job)
    flow.fetch_token(code=job["options"]["code"])
    return {
        "token": flow.credentials.token,
        "refresh_token": flow.credentials.refresh_token
    }

--------------------------------------------------------------------------------
/doomsdaymachine/datasources/imap.py:
--------------------------------------------------------------------------------
import imaplib
from imaplib import IMAP4, IMAP4_SSL
import os
import os.path as path
from doomsdaymachine.sync_map import SyncMap
import time
from datetime import date, timedelta
from doomsdaymachine.util import format_filename

# Raise imaplib's maximum line length so large mailboxes can be read.
imaplib._MAXLINE = 1000000

def execute_imap(logger, config, job):
    syncer = SyncMap(job)
    if job["options"]["ssl"]:
        imap = IMAP4_SSL(job["options"]["host"], port=job["options"]["port"])
    else:
        imap = IMAP4(job["options"]["host"], port=job["options"]["port"])
        if job["options"]["tls"]:
            imap.starttls()
    imap.login(job["options"]["username"], job["options"]["password"])
    for mailbox in job["options"]["mailboxes"]:
        logger.info(f"Downloading {mailbox}")
        mbox_folder = path.join(job["output_folder"], format_filename(mailbox))
        if not path.isdir(mbox_folder):
            os.mkdir(mbox_folder)
        imap.select(mailbox)
        since_timestamp = syncer.get_last_modified(mailbox)
        if since_timestamp == 0:
            logger.debug("Doing deep search of mailbox")
            downloaded = None
            start = date.today()
            end = start - timedelta(days=30)
            while downloaded != 0:
                logger.debug(f"Searching {str(start)} to {str(end)}")
                criteria = f"(BEFORE {start.strftime('%d-%b-%Y')}) (SINCE {end.strftime('%d-%b-%Y')})"
                downloaded = search_mail(logger, imap, mbox_folder, criteria)
                start = end
                end = start - timedelta(days=30)
        else:
            since = date.fromtimestamp(since_timestamp)
            logger.debug(f"Searching since {str(since)}")
            criteria = "(SINCE " + since.strftime("%d-%b-%Y") + ")"
            search_mail(logger, imap, mbox_folder, criteria)
        syncer.set_last_modified(mailbox, time.time())
    syncer.save()
    syncer.close()

def search_mail(logger, imap, mbox_folder, criteria):
    downloaded = 0
    _, data = imap.search(None, criteria)
    for num in data[0].split():
        downloaded += 1
        logger.debug(f"Getting message {num.decode('utf-8')}")
        _, mdata = imap.fetch(num, "(RFC822)")
        msg_file_path = path.join(mbox_folder, f"{num.decode('utf-8')}.msg")
        with open(msg_file_path, "wb") as msg_file:
            logger.debug(f"Saving {num.decode('utf-8')}")
            msg_file.write(mdata[0][1])
    return downloaded

--------------------------------------------------------------------------------
/doomsdaymachine/datasources/lastpass.py:
--------------------------------------------------------------------------------
import lastpass
import json
import os.path as path

def execute_lastpass(logger, config, job):
    vault = lastpass.Vault.open_remote(job["options"]["username"], job["options"]["password"])
    output = []
    for account in vault.accounts:
        logger.debug(f"Saving account {account.id.decode('utf-8')}")
        output.append({
            "id": account.id.decode("utf-8"),
            "username": account.username.decode("utf-8"),
            "password": account.password.decode("utf-8"),
            "url": account.url.decode("utf-8")
        })
    with open(path.join(job["output_folder"], "passwords.json"), "w") as passwords_file:
        logger.debug("Saving accounts")
        passwords_file.write(json.dumps(output))
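Each datasource module above follows the same convention: an `execute_<type>(logger, config, job)` entry point that writes its results under `job["output_folder"]`, dispatched by job type from `doomsdaymachine/backup.py`. As a hedged sketch of what a new datasource would look like under that convention (the "notes" type and its payload are hypothetical, not part of this project):

```python
# Hedged sketch of a new datasource following this package's convention;
# the "notes" type and its payload are hypothetical.
import json
import os.path as path

def execute_notes(logger, config, job):
    # A real datasource would fetch this payload from a remote service
    # using credentials found in job["options"].
    payload = [{"id": 1, "body": "example note"}]
    # Like the other datasources, results land in the job's output folder.
    with open(path.join(job["output_folder"], "notes.json"), "w") as notes_file:
        notes_file.write(json.dumps(payload))
```

Wiring it up would then only require an additional `elif job["type"] == "notes":` branch in `execute_job`.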
--------------------------------------------------------------------------------
/doomsdaymachine/notification.py:
--------------------------------------------------------------------------------
import smtplib
import datetime
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

def send_notification(config, notification):
    if "email_notification" not in config:
        return
    # Honor the configured SMTP port (e.g. 465 for SSL in config.sample.yml).
    if config["email_notification"]["ssl"]:
        smtp = smtplib.SMTP_SSL(config["email_notification"]["host"], config["email_notification"]["port"])
    else:
        smtp = smtplib.SMTP(config["email_notification"]["host"], config["email_notification"]["port"])
        if config["email_notification"]["tls"]:
            smtp.starttls()
    smtp.login(config["email_notification"]["username"], config["email_notification"]["password"])
    message = MIMEMultipart()
    message["From"] = config["email_notification"]["from"]
    message["To"] = config["email_notification"]["to"]
    message["Subject"] = f"Doomsday Machine Notification {str(datetime.datetime.now())}"
    message.attach(MIMEText(notification, "plain"))
    smtp.sendmail(config["email_notification"]["from"], config["email_notification"]["to"], message.as_string())
    smtp.quit()

--------------------------------------------------------------------------------
/doomsdaymachine/sync_map.py:
--------------------------------------------------------------------------------
import sqlite3
import os.path as path

class SyncMap:
    def __init__(self, job):
        self.connection = sqlite3.connect(path.join(job["output_folder"], "syncmap.sqlite3"))
        self.cursor = self.connection.cursor()
        self.cursor.execute("CREATE TABLE IF NOT EXISTS resources (resource TEXT UNIQUE NOT NULL, last_modified NUMBER NOT NULL)")

    def get_last_modified(self, resource):
        self.cursor.execute("SELECT last_modified FROM resources WHERE resource=?", (resource, ))
        row = self.cursor.fetchone()
        if row:
            return row[0]
        else:
            return 0

    def set_last_modified(self, resource, last_modified):
        self.cursor.execute("SELECT last_modified FROM resources WHERE resource=?", (resource, ))
        row = self.cursor.fetchone()
        if row:
            self.cursor.execute("UPDATE resources SET last_modified=? WHERE resource=?", (last_modified, resource))
        else:
            self.cursor.execute("INSERT INTO resources (last_modified, resource) VALUES (?, ?)", (last_modified, resource))

    def save(self):
        self.connection.commit()

    def close(self):
        self.connection.close()

--------------------------------------------------------------------------------
/doomsdaymachine/sync_map2.py:
--------------------------------------------------------------------------------
import sqlite3
import os.path as path

class SyncMap2:
    def __init__(self, job):
        self.connection = sqlite3.connect(path.join(job["output_folder"], "syncmap.sqlite3"))
        self.cursor = self.connection.cursor()
        self.cursor.execute("CREATE TABLE IF NOT EXISTS resources (resource TEXT UNIQUE NOT NULL, signature TEXT NOT NULL)")

    def get_signature(self, resource):
        self.cursor.execute("SELECT signature FROM resources WHERE resource=?", (resource, ))
        row = self.cursor.fetchone()
        if row:
            return row[0]
        else:
            return None

    def set_signature(self, resource, signature):
        self.cursor.execute("SELECT signature FROM resources WHERE resource=?", (resource, ))
        row = self.cursor.fetchone()
        if row:
            self.cursor.execute("UPDATE resources SET signature=? WHERE resource=?", (signature, resource))
        else:
            self.cursor.execute("INSERT INTO resources (signature, resource) VALUES (?, ?)", (signature, resource))

    def save(self):
        self.connection.commit()

    def close(self):
        self.connection.close()

--------------------------------------------------------------------------------
/doomsdaymachine/util.py:
--------------------------------------------------------------------------------
import hashlib
import yaml
import os
import os.path as path
import string

def load_config():
    with open(os.getenv("CONFIG_FILE", "./config.yml"), "r") as config_file:
        config = yaml.full_load(config_file)
        if "jobs" not in config:
            raise Exception("No config jobs provided")
        id_map = {}
        for i, job in enumerate(config["jobs"]):
            if "name" not in job:
                raise Exception(f"No name provided for job #{i}")
            if "type" not in job:
                raise Exception(f"No type provided for job #{i}")
            job_id = generate_job_id(job)
            if job_id in id_map and id_map[job_id] != i:
                raise Exception(f"Job name and type must be unique (Error on job #{i})")
            else:
                id_map[job_id] = i
            job["id"] = job_id
        return config


def generate_job_id(job):
    return hashlib.sha256(f"{job['name']}-{job['type']}".encode()).hexdigest()


def format_filename(s):
    valid_chars = "-_.() %s%s" % (string.ascii_letters, string.digits)
    filename = ''.join(c for c in s if c in valid_chars)
    filename = filename.replace(' ', '_')  # I don't like spaces in filenames.
35 | return filename 36 | -------------------------------------------------------------------------------- /doomsdaymachine/webserver.py: -------------------------------------------------------------------------------- 1 | from flask import Flask, jsonify, send_from_directory, abort 2 | from doomsdaymachine.util import load_config 3 | from doomsdaymachine.backup_log import BackupLog 4 | 5 | APP = Flask(__name__) 6 | 7 | config = load_config() 8 | 9 | ALLOWED_STATIC_FILES = [ 10 | "index.html", 11 | "script.js", 12 | "style.css", 13 | "reset.css", 14 | ] 15 | 16 | @APP.route("/") 17 | def home_file(): 18 | return send_from_directory("../static", "index.html") 19 | 20 | @APP.route("/<file>") 21 | def static_file(file): 22 | if file in ALLOWED_STATIC_FILES: 23 | return send_from_directory("../static", file) 24 | abort(404) 25 | @APP.route("/api/status") 26 | def status(): 27 | backup_log = BackupLog(config) 28 | active_job = backup_log.get_active_job() 29 | jobs = list(map(lambda job: dict( 30 | name=job["name"], 31 | type=job["type"], 32 | id=job["id"], 33 | start_time=active_job["start_time"] if active_job["job"] == job["id"] else None, 34 | last_execution_time=backup_log.get_last_execution_time(job["id"]) 35 | ), config["jobs"])) 36 | return jsonify(dict(jobs=jobs)) 37 | -------------------------------------------------------------------------------- /generate_credentials.py: -------------------------------------------------------------------------------- 1 | from doomsdaymachine.datasources import google_drive, google_contacts 2 | from doomsdaymachine.util import load_config 3 | import sys 4 | 5 | def generate(name): 6 | config = load_config() 7 | for job in config["jobs"]: 8 | if job["name"] == name: 9 | if job["type"] == "google_contacts": 10 | if "code" in job["options"]: 11 | print(google_contacts.authorize_code(job)) 12 | else: 13 | print(google_contacts.generate_login_url(job)) 14 | elif job["type"] == "google_drive": 15 | if "code" in job["options"]: 16 |
print(google_drive.authorize_code(job)) 17 | else: 18 | print(google_drive.generate_login_url(job)) 19 | 20 | generate(sys.argv[1]) 21 | -------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- 1 | PyGithub 2 | lastpass-python 3 | PyYAML 4 | google-api-python-client 5 | google-auth-oauthlib 6 | google-auth-httplib2 7 | Flask 8 | -------------------------------------------------------------------------------- /screenshot.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/johnjones4/Doomsday-Machine/63e9aedc1b10c556ec19c7987071f48ffd4e8524/screenshot.png -------------------------------------------------------------------------------- /static/index.html: -------------------------------------------------------------------------------- (markup lost in extraction; recoverable content: the page is titled "Doomsday Machine", shows a "Doomsday Machine" heading, provides the "#chart" container that script.js renders into, and references reset.css, style.css, the d3 library, and script.js) -------------------------------------------------------------------------------- /static/reset.css: -------------------------------------------------------------------------------- 1 | /* http://meyerweb.com/eric/tools/css/reset/ 2 | v2.0 | 20110126 3 | License: none (public domain) 4 | */ 5 | 6 | html, body, div, span, applet, object, iframe, 7 | h1, h2, h3, h4, h5, h6, p, blockquote, pre, 8 | a, abbr, acronym, address, big, cite, code, 9 | del, dfn, em, img, ins, kbd, q, s, samp, 10 | small, strike, strong, sub, sup, tt, var, 11 | b, u, i, center, 12 | dl, dt, dd, ol, ul, li, 13 | fieldset, form, label, legend, 14 | table, caption, tbody, tfoot, thead, tr, th, td, 15 | article, aside, canvas, details, embed, 16 | figure, figcaption, footer, header, hgroup, 17 | menu, nav, output, ruby, section, summary, 18 | time, mark, audio, video { 19 | margin: 0; 20 | padding: 0; 21 | border: 0; 22 | font-size: 100%; 23 | font: inherit; 24 | vertical-align: baseline; 25 | } 26 | /* HTML5 display-role reset for older browsers */ 27 | article, aside, details, figcaption, figure, 28 | footer, header, hgroup, menu, nav, section { 29 | display: block; 30 | } 31 | body { 32 | line-height: 1; 33 | } 34 | ol, ul { 35 | list-style: none; 36 | } 37 | blockquote, q { 38 | quotes: none; 39 | } 40 | blockquote:before, blockquote:after, 41 | q:before, q:after { 42 | content: ''; 43 | content: none; 44 | } 45 | table { 46 | border-collapse: collapse; 47 | border-spacing: 0; 48 | } 49 | -------------------------------------------------------------------------------- /static/script.js: -------------------------------------------------------------------------------- 1 | (async () => { 2 | const chartWidth = window.innerWidth 3 | const chartHeight = window.innerHeight * 0.7 4 | const chartMargin = 0 5 | const chartRadius = Math.min(chartWidth, chartHeight) / 2 - chartMargin 6 | 7 | const loadData = async () => { 8 | const response = await fetch('/api/status') 9 | return
await response.json() 10 | } 11 | 12 | const drawChart = (jobs) => { 13 | d3.select("#chart").selectAll("*").remove(); 14 | 15 | const svg = d3.select("#chart") 16 | .append("svg") 17 | .attr("width", chartWidth) 18 | .attr("height", chartHeight) 19 | .append("g") 20 | .attr("transform", "translate(" + chartWidth / 2 + "," + chartHeight / 2 + ")") 21 | 22 | const color = d3.scaleLinear() 23 | .domain([0,jobs.length - 1]) 24 | .range([100,200]) 25 | 26 | const blueColor = d3.scaleLinear() 27 | .domain([0,jobs.length - 1]) 28 | .range([150,250]) 29 | 30 | const jobSeconds = d => { 31 | if (d.last_execution_time) { 32 | return d.last_execution_time 33 | } else if (d.start_time) { 34 | return (new Date().getTime() / 1000) - d.start_time 35 | } else { 36 | return 0 37 | } 38 | } 39 | 40 | const formatTime = t => { 41 | if (t < 60) { 42 | return `${Math.round(t)} Seconds` 43 | } else if (t < 3600) { 44 | return `${Math.round(t / 60)} Minutes` 45 | } else if (t < 86400) { // 24 hours 46 | return `${Math.round(t / 60 / 60)} Hours` 47 | } else { 48 | return `${Math.round(t / 60 / 60 / 24)} Days` 49 | } 50 | } 51 | 52 | const pie = d3.pie() 53 | .value(d => Math.log(jobSeconds(d))) 54 | 55 | const dataReady = pie(jobs) 56 | 57 | const arc = d3.arc() 58 | .innerRadius(chartRadius * 0.4) 59 | .outerRadius(chartRadius * 0.8) 60 | 61 | const outerArc = d3.arc() 62 | .innerRadius(chartRadius * 0.9) 63 | .outerRadius(chartRadius * 0.9) 64 | 65 | svg 66 | .selectAll('slices') 67 | .data(dataReady) 68 | .enter() 69 | .append('path') 70 | .attr('d', arc) 71 | .attr('class', (d) => ['pie-segment', 'pie-segment-' + (!isNaN(d.data.start_time) && d.data.start_time > 0 ?
'active' : 'inactive')].join(' ')) 72 | .attr('fill',(d,i) => `rgb(${color(i)},${color(i)},${blueColor(i)})`) 73 | 74 | svg 75 | .selectAll('lines') 76 | .data(dataReady) 77 | .enter() 78 | .append('polyline') 79 | .attr('class', 'label-line') 80 | .attr('points', (d) => { 81 | const posA = arc.centroid(d) 82 | const posB = outerArc.centroid(d) 83 | const posC = outerArc.centroid(d) 84 | const midangle = d.startAngle + (d.endAngle - d.startAngle) / 2 85 | posC[0] = chartRadius * 0.95 * (midangle < Math.PI ? 1 : -1) 86 | return [posA, posB, posC] 87 | }) 88 | 89 | svg 90 | .selectAll('labels') 91 | .data(dataReady) 92 | .enter() 93 | .append('text') 94 | .text(d => `${d.data.name} (${formatTime(jobSeconds(d.data))})`) 95 | .attr('class', 'label') 96 | .attr('transform', function(d) { 97 | const pos = outerArc.centroid(d); 98 | const midangle = d.startAngle + (d.endAngle - d.startAngle) / 2 99 | pos[0] = chartRadius * 0.99 * (midangle < Math.PI ? 1 : -1); 100 | return 'translate(' + pos + ')'; 101 | }) 102 | .style('text-anchor', function(d) { 103 | const midangle = d.startAngle + (d.endAngle - d.startAngle) / 2 104 | return (midangle < Math.PI ? 
'start' : 'end') 105 | }) 106 | } 107 | 108 | const { jobs } = await loadData() 109 | drawChart(jobs) 110 | 111 | setInterval(async () => { 112 | const { jobs } = await loadData() 113 | drawChart(jobs) 114 | }, 5000) 115 | })() 116 | -------------------------------------------------------------------------------- /static/style.css: -------------------------------------------------------------------------------- 1 | body { 2 | font-family: Helvetica, Arial, sans-serif; 3 | background: black; 4 | color: #f9f9f9; 5 | margin: 0; 6 | padding: 0; 7 | } 8 | 9 | h1 { 10 | text-align: center; 11 | font-size: 4em; 12 | text-transform: uppercase; 13 | } 14 | 15 | #chart svg { 16 | margin: auto; 17 | display: block; 18 | } 19 | 20 | @keyframes glow { 21 | 0% { 22 | opacity: 1; 23 | } 24 | 100% { 25 | opacity: 0.5; 26 | } 27 | } 28 | #chart svg .pie-segment.pie-segment-active { 29 | animation: glow 1s ease-in-out infinite alternate; 30 | } 31 | 32 | #chart svg .label-line { 33 | fill: none; 34 | stroke: white; 35 | } 36 | 37 | #chart svg .label { 38 | fill: white; 39 | } 40 | -------------------------------------------------------------------------------- /webserver.py: -------------------------------------------------------------------------------- 1 | from doomsdaymachine.webserver import APP 2 | 3 | if __name__ == '__main__': 4 | APP.run(host="0.0.0.0") 5 | --------------------------------------------------------------------------------
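
The SyncMap and SyncMap2 classes in the dump above both implement the same incremental-backup pattern: a small SQLite table mapping each backed-up resource to a marker (a last-modified time or a content signature), so a sync pass can skip resources that have not changed. A minimal self-contained sketch of that pattern follows; the `SignatureStore` name, the in-memory database, and the example resource names are illustrative and not part of the repository (the real class opens `syncmap.sqlite3` inside the job's output folder).

```python
import sqlite3

class SignatureStore:
    """Sketch of the SyncMap2 resource -> signature map (illustrative name)."""

    def __init__(self, db_path=":memory:"):
        # An in-memory database keeps this sketch self-contained;
        # SyncMap2 uses path.join(job["output_folder"], "syncmap.sqlite3").
        self.connection = sqlite3.connect(db_path)
        self.cursor = self.connection.cursor()
        self.cursor.execute(
            "CREATE TABLE IF NOT EXISTS resources "
            "(resource TEXT UNIQUE NOT NULL, signature TEXT NOT NULL)"
        )

    def get_signature(self, resource):
        # Returns None for a resource that has never been synced.
        self.cursor.execute(
            "SELECT signature FROM resources WHERE resource=?", (resource,)
        )
        row = self.cursor.fetchone()
        return row[0] if row else None

    def set_signature(self, resource, signature):
        # Same select-then-update-or-insert flow as SyncMap2.
        if self.get_signature(resource) is None:
            self.cursor.execute(
                "INSERT INTO resources (signature, resource) VALUES (?, ?)",
                (signature, resource),
            )
        else:
            self.cursor.execute(
                "UPDATE resources SET signature=? WHERE resource=?",
                (signature, resource),
            )

    def save(self):
        self.connection.commit()

    def close(self):
        self.connection.close()

# A datasource would compare signatures to decide whether to re-download:
store = SignatureStore()
if store.get_signature("contacts.vcf") != "hash-of-current-contents":
    store.set_signature("contacts.vcf", "hash-of-current-contents")
store.save()
```

Keeping the select-then-update-or-insert flow mirrors the original classes; a single `INSERT OR REPLACE` (or SQLite's `ON CONFLICT` upsert) would be an equivalent one-statement alternative since the `resource` column is declared UNIQUE.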