├── .hgignore ├── Pipfile ├── setup.py ├── LICENSE ├── README.md ├── Pipfile.lock └── photosync.py /.hgignore: -------------------------------------------------------------------------------- 1 | test.sql 2 | clientsecret.json 3 | 4 | syntax: regexp 5 | \d{4} 6 | -------------------------------------------------------------------------------- /Pipfile: -------------------------------------------------------------------------------- 1 | [[source]] 2 | name = "pypi" 3 | url = "https://pypi.org/simple" 4 | verify_ssl = true 5 | 6 | [dev-packages] 7 | 8 | [packages] 9 | google-api-python-client = "==1.11" 10 | google-api-core = "==1.26.3" 11 | google-auth-httplib2 = "*" 12 | google-auth-oauthlib = "*" 13 | google-auth = "==1.29" 14 | python-dateutil = "*" 15 | arguments = "*" 16 | future = "*" 17 | pyyaml = "*" 18 | consoleprinter = "*" 19 | 20 | [requires] 21 | python_version = "3.9" 22 | -------------------------------------------------------------------------------- /setup.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | 3 | from setuptools import setup 4 | 5 | setup( 6 | name='photosync', 7 | version='0.1', 8 | description='Download and organize media from Google Photos', 9 | url='https://github.com/dermesser/photosync', 10 | author='Lewin Bormann ', 11 | author_email='lbo@spheniscida.de', 12 | license='MIT', 13 | scripts=['photosync.py'], 14 | install_requires=[ 15 | 'arguments', 16 | 'future', 17 | 'consoleprinter', 18 | 'google-api-core', 19 | 'google-api-python-client==1.11', 20 | 'google-auth==1.29', 21 | 'google-auth-httplib2', 22 | 'google-auth-oauthlib', 23 | 'python-dateutil', 24 | 'pyyaml', 25 | 'requests-oauthlib', 26 | ]) 27 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Copyright (c) 2019 Lewin Bormann 2 | 3 | Permission is hereby granted, free of charge, to any person obtaining a copy of 4 | this software and associated documentation files (the "Software"), to deal in 5 | the Software without restriction, including without limitation the rights to 6 | use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies 7 | of the Software, and to permit persons to whom the Software is furnished to do 8 | so, subject to the following conditions: 9 | 10 | The above copyright notice and this permission notice shall be included in all 11 | copies or substantial portions of the Software. 12 | 13 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 14 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 15 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 16 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 17 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING 18 | FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER 19 | DEALINGS IN THE SOFTWARE. 20 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # photosync 2 | 3 | Now that Google deprecated the Photos<-\>Drive synchronization, I need another way to back up my photos locally. This 4 | program downloads all photos from your Google Photos account and organizes them locally. It is not very user friendly 5 | yet, but definitely usable. 
6 | 
7 | photosync only ever downloads photos, i.e. the synchronization works from Google Photos as the source of truth to your
8 | local storage. It is safe to delete photos locally, although you will have to use the slow `--resync` option to re-download them.
9 | 
10 | photosync is fast enough for reasonably large libraries. My library of ~50'000 photos was indexed in roughly an hour;
11 | the (resumable) download takes another few hours, depending on latency and photo size.
12 | 
13 | **Pull requests are welcome!**
14 | 
15 | ## Behavior
16 | 
17 | By default, photosync will ask for OAuth2 authorization on the console, and then immediately start downloading metadata
18 | from Google Photos. Once no more new photos are fetched and all metadata is stored in `sync.db`, photosync will look for
19 | photos that are not yet marked as downloaded in the database and fetch the actual image files. By default, it will
20 | organize photos in directories like `year/month/day/` (numerically, 0-padded), but you can write your own method of
21 | mapping photos to directories and override it by setting the `path_mapper` argument in the `Driver` constructor called
22 | from `Main.main()`; a minimal sketch follows below.
23 | 
24 | Note that this (obviously) takes a while for large libraries. But you can always stop photosync and restart it later;
25 | without the `--all` option, it will resume synchronization where it left off.
26 | 
27 | Albums are currently ignored. Videos are downloaded just like photos.
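28 | 
29 | As an example of such a mapper, the following sketch creates one directory per year and month only. The name
30 | `year_month_mapper` is made up; the media item format and the `Driver` constructor are as defined in `photosync.py`:
31 | 
32 | ```python
33 | import dateutil.parser
34 | 
35 | def year_month_mapper(item):
36 |     # 'creationTime' is an ISO 8601 timestamp in the item's mediaMetadata.
37 |     dt = dateutil.parser.isoparser().isoparse(item['mediaMetadata']['creationTime']).date()
38 |     return '{y}/{m:02d}/'.format(y=dt.year, m=dt.month)
39 | 
40 | # in Main.main(): d = Driver(db, s, root=self.dir, path_mapper=year_month_mapper)
41 | ```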
42 | 
43 | ## Install & Use
44 | 
45 | First, acquire a client secret. This is necessary because this is an open source project, and I don't want client
46 | credentials associated with my account floating around in the wild. Also, the daily limit for Photos API calls is
47 | 10'000, so a shared credential wouldn't work for a nontrivial number of users anyway.
48 | 
49 | For this,
50 | 
51 | 1. Go to https://console.developers.google.com.
52 | 1. Ensure you are on the right project, or create a new one.
53 | 1. Go to the APIs page and enable the Google Photos API.
54 | 1. Set up the OAuth consent screen (otherwise Google will nag you during credentials creation to do it).
55 | 1. Then go to the *Credentials* page and create a new client ID (type `other`). Download the JSON file using the
56 |    download button at the right-hand side.
57 | 1. Save the downloaded JSON file somewhere, for example in your photos directory. Pass the path to the file
58 |    to photosync using the `--creds` argument. By default, photosync will look for a file called `clientsecret.json` in
59 |    the current directory.
60 | 1. After the first run, credentials are cached in the internal SQLite database,
61 |    meaning you don't have to explicitly specify them on any further invocations.
62 | 
63 | Once you have gone through the hassle of obtaining the client secret, you can start downloading your photos.
64 | 
65 | 1. Clone this repository to a convenient place: `git clone https://github.com/dermesser/photosync` or `hg clone
66 |    git+https://github.com/dermesser/photosync`.
67 | 1. Go into the `photosync` repository and run `pip[3] install [--user] .`. This
68 |    installs the program together with the dependencies it needs.
69 |    - Or use `pipenv` with the provided `Pipfile`: `pipenv shell && pipenv
70 |      install`, after which you can use `python photosync.py`.
71 |    - Alternatively, build an egg with `python setup.py bdist_egg` and install it
72 |      with `easy_install dist/photosync-[version]-py[version].egg`.
73 | 1. Run it: `python3 photosync.py --help` or, if you installed it with `pip`,
74 |    `photosync.py --help`.
75 | 
76 | Consult the help text printed by the last command. Usually you will need to set
77 | `--dir` so that your photos don't end up in the current directory. Typically you
78 | would initially run
79 | 
80 | ```
81 | $ python3 photosync.py --dir=/target/directory --creds=/path/to/clientsecret.json
82 | ```
83 | 
84 | which also asks you for OAuth authorization. After having uploaded photos
85 | (careful: Google Photos exposes new media only between a few minutes and half an
86 | hour after uploading!) you can run the following command, which looks for photos
87 | and videos that are either older than the oldest known item or newer than the
88 | newest known item (obviously missing any that have been uploaded with a date
89 | between the oldest and newest items: see `--dates` below for how to fix this):
90 | 
91 | ```
92 | $ python3 photosync.py --dir=/target/directory
93 | ```
94 | 
95 | If it turns out you are missing some items locally, you can check Google Photos
96 | again for them:
97 | 
98 | ```
99 | # to check *all* photos available for January 2000:
100 | $ python3 photosync.py --dir=/target/directory --dates=2000-01-01:2000-02-01
101 | 
102 | # to check *all* photos from *ever*:
103 | $ python3 photosync.py --dir=/target/directory --all
104 | ```
105 | 
106 | ## Troubleshooting
107 | 
108 | * I have seen `Invalid media item ID.` errors for valid-looking media item IDs. This happened to a handful of photos,
109 |   all from the same day. The media item IDs all started with the same prefix, which was different from the shared prefix
110 |   of all other media item IDs (all IDs from one account usually start with the same 4-6 characters). I'm not sure why
111 |   the API at one point returned those.
112 |   * To clean this up, remove the invalid IDs from the database (`sqlite3 sync.db "DELETE FROM items WHERE id LIKE
113 |     'wrongprefix%'"`) after checking that only a small number of items have this kind of ID (`sqlite3 sync.db "SELECT *
114 |     FROM items WHERE id LIKE 'wrongprefix%'"`).
115 |   * Re-fetch metadata for the affected days: `python3 photosync.py --dir=.../directory --all --dates=2012-12-12:2012-12-14`
116 |     (for example)
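117 | 
118 | Since all synchronization state lives in `sync.db`, you can also inspect progress directly with `sqlite3`. A small
119 | sketch (the `items` table and its `offline` flag, where 1 means "downloaded", are defined in `photosync.py`):
120 | 
121 | ```
122 | # count items that are indexed but not yet downloaded:
123 | $ sqlite3 sync.db "SELECT COUNT(*) FROM items WHERE offline = 0"
124 | 
125 | # show the newest indexed items:
126 | $ sqlite3 sync.db "SELECT id, filename, path FROM items ORDER BY creationTime DESC LIMIT 10"
127 | ```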
128 | 
--------------------------------------------------------------------------------
/Pipfile.lock:
--------------------------------------------------------------------------------
1 | {
2 |     "_meta": {
3 |         "hash": {
4 |             "sha256": "0cea9d846d98ac0a2022fb3ff7239a98535ba187e96d9f537f393d27fbaa3cd5"
5 |         },
6 |         "pipfile-spec": 6,
7 |         "requires": {
8 |             "python_version": "3.9"
9 |         },
10 |         "sources": [
11 |             {
12 |                 "name": "pypi",
13 |                 "url": "https://pypi.org/simple",
14 |                 "verify_ssl": true
15 |             }
16 |         ]
17 |     },
18 |     "default": {
19 |         "arguments": {
20 |             "hashes": [
21 |                 "sha256:5de390ba2212c227f0b4f43db175b623db1d1a25d4fb001f4e20013827ba829c"
22 |             ],
23 |             "index": "pypi",
24 |             "version": "==76"
25 |         },
26 |         "cachetools": {
27 |             "hashes": [
28 |                 "sha256:1d9d5f567be80f7c07d765e21b814326d78c61eb0c3a637dffc0e5d1796cb2e2",
29 |                 "sha256:f469e29e7aa4cff64d8de4aad95ce76de8ea1125a16c68e0d93f65c3c3dc92e9"
30 |             ],
31 |             "markers": "python_version ~= '3.5'",
32 |             "version": "==4.2.1"
33 |         },
34 |         "certifi": {
35 |             "hashes": [
36 |                 "sha256:1a4995114262bffbc2413b159f2a1a480c969de6e6eb13ee966d470af86af59c",
37 |                 "sha256:719a74fb9e33b9bd44cc7f3a8d94bc35e4049deebe19ba7d8e108280cfd59830"
38 |             ],
39 |             "version": "==2020.12.5"
40 |         },
41 |         "chardet": {
42 |             "hashes": [
43 |                 "sha256:0d6f53a15db4120f2b08c94f11e7d93d2c911ee118b6b30a04ec3ee8310179fa",
44 |                 "sha256:f864054d66fd9118f2e67044ac8981a54775ec5b67aed0441892edb553d21da5"
45 |             ],
46 |             "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'",
47 |             "version": "==4.0.0"
48 |         },
49 |         "consoleprinter": {
50 |             "hashes": [
51 |                 "sha256:a5a91a7b52cd459b38840d9558fa59b19eb94a860cbad5e12551ceaec8068a95"
52 |             ],
53 |             "index": "pypi",
54 |             "version": "==93"
55 |         },
56 |         "future": {
57 |             "hashes": [
58 |                 "sha256:b1bead90b70cf6ec3f0710ae53a525360fa360d306a86583adc6bf83a4db537d"
59 |             ],
60 |             "index": "pypi",
61 |             "version": "==0.18.2"
62 |         },
63 |         "google-api-core": {
64 |             "hashes": [
65 |                 "sha256:099762d4b4018cd536bcf85136bf337957da438807572db52f21dc61251be089",
66 |                 "sha256:b914345c7ea23861162693a27703bab804a55504f7e6e9abcaff174d80df32ac"
67 |             ],
68 |             "index": "pypi",
69 |             "version": "==1.26.3"
70 |         },
71 |         "google-api-python-client": {
72 |             "hashes": [
73 |                 "sha256:4f596894f702736da84cf89490a810b55ca02a81f0cddeacb3022e2900b11ec6",
74 |                 "sha256:caf4015800ef1a18d06d117f47f0219c0c0641f21978f6b1bb5ede7912fab97b"
75 |             ],
76 |             "index": "pypi",
77 |             "version": "==1.11"
78 |         },
79 |         "google-auth": {
80 |             "hashes": [
81 |                 "sha256:010f011c4e27d3d5eb01106fba6aac39d164842dfcd8709955c4638f5b11ccf8",
82 |                 "sha256:f30a672a64d91cc2e3137765d088c5deec26416246f7a9e956eaf69a8d7ed49c"
83 |             ],
84 |             "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4, 3.5'",
85 |             "version": "==1.29.0"
86 |         },
87 |         "google-auth-httplib2": {
88 |             "hashes": [
89 |                 "sha256:31e49c36c6b5643b57e82617cb3e021e3e1d2df9da63af67252c02fa9c1f4a10",
90 |                 "sha256:a07c39fd632becacd3f07718dfd6021bf396978f03ad3ce4321d060015cc30ac"
91 |             ],
92 |             "index": "pypi",
93 |             "version": "==0.1.0"
94 |         },
95 |         "google-auth-oauthlib": {
96 |             "hashes": [
97 |                 "sha256:09832c6e75032f93818edf1affe4746121d640c625a5bef9b5c96af676e98eee",
98 |                 "sha256:0e92aacacfb94978de3b7972cf4b0f204c3cd206f74ddd0dc0b31e91164e6317"
99 |             ],
100 |             "index": "pypi",
101 |             "version": "==0.4.4"
102 |         },
103 |         "googleapis-common-protos": {
104 |             "hashes": [
105 
| "sha256:a88ee8903aa0a81f6c3cec2d5cf62d3c8aa67c06439b0496b49048fb1854ebf4", 106 | "sha256:f6d561ab8fb16b30020b940e2dd01cd80082f4762fa9f3ee670f4419b4b8dbd0" 107 | ], 108 | "markers": "python_version >= '3.6'", 109 | "version": "==1.53.0" 110 | }, 111 | "httplib2": { 112 | "hashes": [ 113 | "sha256:0b12617eeca7433d4c396a100eaecfa4b08ee99aa881e6df6e257a7aad5d533d", 114 | "sha256:2ad195faf9faf079723f6714926e9a9061f694d07724b846658ce08d40f522b4" 115 | ], 116 | "version": "==0.19.1" 117 | }, 118 | "idna": { 119 | "hashes": [ 120 | "sha256:b307872f855b18632ce0c21c5e45be78c0ea7ae4c15c828c20788b26921eb3f6", 121 | "sha256:b97d804b1e9b523befed77c48dacec60e6dcb0b5391d57af6a65a312a90648c0" 122 | ], 123 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", 124 | "version": "==2.10" 125 | }, 126 | "oauthlib": { 127 | "hashes": [ 128 | "sha256:bee41cc35fcca6e988463cacc3bcb8a96224f470ca547e697b604cc697b2f889", 129 | "sha256:df884cd6cbe20e32633f1db1072e9356f53638e4361bef4e8b03c9127c9328ea" 130 | ], 131 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", 132 | "version": "==3.1.0" 133 | }, 134 | "packaging": { 135 | "hashes": [ 136 | "sha256:5b327ac1320dc863dca72f4514ecc086f31186744b84a230374cc1fd776feae5", 137 | "sha256:67714da7f7bc052e064859c05c595155bd1ee9f69f76557e21f051443c20947a" 138 | ], 139 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", 140 | "version": "==20.9" 141 | }, 142 | "protobuf": { 143 | "hashes": [ 144 | "sha256:0277f62b1e42210cafe79a71628c1d553348da81cbd553402a7f7549c50b11d0", 145 | "sha256:07eec4e2ccbc74e95bb9b3afe7da67957947ee95bdac2b2e91b038b832dd71f0", 146 | "sha256:1c0e9e56202b9dccbc094353285a252e2b7940b74fdf75f1b4e1b137833fabd7", 147 | "sha256:1f0b5d156c3df08cc54bc2c8b8b875648ea4cd7ebb2a9a130669f7547ec3488c", 148 | "sha256:2dc0e8a9e4962207bdc46a365b63a3f1aca6f9681a5082a326c5837ef8f4b745", 149 | "sha256:3053f13207e7f13dc7be5e9071b59b02020172f09f648e85dc77e3fcb50d1044", 150 | "sha256:4a054b0b5900b7ea7014099e783fb8c4618e4209fffcd6050857517b3f156e18", 151 | "sha256:510e66491f1a5ac5953c908aa8300ec47f793130097e4557482803b187a8ee05", 152 | "sha256:5ff9fa0e67fcab442af9bc8d4ec3f82cb2ff3be0af62dba047ed4187f0088b7d", 153 | "sha256:90270fe5732c1f1ff664a3bd7123a16456d69b4e66a09a139a00443a32f210b8", 154 | "sha256:a0a08c6b2e6d6c74a6eb5bf6184968eefb1569279e78714e239d33126e753403", 155 | "sha256:c5566f956a26cda3abdfacc0ca2e21db6c9f3d18f47d8d4751f2209d6c1a5297", 156 | "sha256:dab75b56a12b1ceb3e40808b5bd9dfdaef3a1330251956e6744e5b6ed8f8830b", 157 | "sha256:efa4c4d4fc9ba734e5e85eaced70e1b63fb3c8d08482d839eb838566346f1737", 158 | "sha256:f17b352d7ce33c81773cf81d536ca70849de6f73c96413f17309f4b43ae7040b", 159 | "sha256:f42c2f5fb67da5905bfc03733a311f72fa309252bcd77c32d1462a1ad519521e", 160 | "sha256:f6077db37bfa16494dca58a4a02bfdacd87662247ad6bc1f7f8d13ff3f0013e1", 161 | "sha256:f80afc0a0ba13339bbab25ca0409e9e2836b12bb012364c06e97c2df250c3343", 162 | "sha256:f9cadaaa4065d5dd4d15245c3b68b967b3652a3108e77f292b58b8c35114b56c", 163 | "sha256:fad4f971ec38d8df7f4b632c819bf9bbf4f57cfd7312cf526c69ce17ef32436a" 164 | ], 165 | "version": "==3.15.8" 166 | }, 167 | "pyasn1": { 168 | "hashes": [ 169 | "sha256:014c0e9976956a08139dc0712ae195324a75e142284d5f87f1a87ee1b068a359", 170 | "sha256:03840c999ba71680a131cfaee6fab142e1ed9bbd9c693e285cc6aca0d555e576", 171 | "sha256:0458773cfe65b153891ac249bcf1b5f8f320b7c2ce462151f8fa74de8934becf", 172 | "sha256:08c3c53b75eaa48d71cf8c710312316392ed40899cb34710d092e96745a358b7", 
173 | "sha256:39c7e2ec30515947ff4e87fb6f456dfc6e84857d34be479c9d4a4ba4bf46aa5d", 174 | "sha256:5c9414dcfede6e441f7e8f81b43b34e834731003427e5b09e4e00e3172a10f00", 175 | "sha256:6e7545f1a61025a4e58bb336952c5061697da694db1cae97b116e9c46abcf7c8", 176 | "sha256:78fa6da68ed2727915c4767bb386ab32cdba863caa7dbe473eaae45f9959da86", 177 | "sha256:7ab8a544af125fb704feadb008c99a88805126fb525280b2270bb25cc1d78a12", 178 | "sha256:99fcc3c8d804d1bc6d9a099921e39d827026409a58f2a720dcdb89374ea0c776", 179 | "sha256:aef77c9fb94a3ac588e87841208bdec464471d9871bd5050a287cc9a475cd0ba", 180 | "sha256:e89bf84b5437b532b0803ba5c9a5e054d21fec423a89952a74f87fa2c9b7bce2", 181 | "sha256:fec3e9d8e36808a28efb59b489e4528c10ad0f480e57dcc32b4de5c9d8c9fdf3" 182 | ], 183 | "version": "==0.4.8" 184 | }, 185 | "pyasn1-modules": { 186 | "hashes": [ 187 | "sha256:0845a5582f6a02bb3e1bde9ecfc4bfcae6ec3210dd270522fee602365430c3f8", 188 | "sha256:0fe1b68d1e486a1ed5473f1302bd991c1611d319bba158e98b106ff86e1d7199", 189 | "sha256:15b7c67fabc7fc240d87fb9aabf999cf82311a6d6fb2c70d00d3d0604878c811", 190 | "sha256:426edb7a5e8879f1ec54a1864f16b882c2837bfd06eee62f2c982315ee2473ed", 191 | "sha256:65cebbaffc913f4fe9e4808735c95ea22d7a7775646ab690518c056784bc21b4", 192 | "sha256:905f84c712230b2c592c19470d3ca8d552de726050d1d1716282a1f6146be65e", 193 | "sha256:a50b808ffeb97cb3601dd25981f6b016cbb3d31fbf57a8b8a87428e6158d0c74", 194 | "sha256:a99324196732f53093a84c4369c996713eb8c89d360a496b599fb1a9c47fc3eb", 195 | "sha256:b80486a6c77252ea3a3e9b1e360bc9cf28eaac41263d173c032581ad2f20fe45", 196 | "sha256:c29a5e5cc7a3f05926aff34e097e84f8589cd790ce0ed41b67aed6857b26aafd", 197 | "sha256:cbac4bc38d117f2a49aeedec4407d23e8866ea4ac27ff2cf7fb3e5b570df19e0", 198 | "sha256:f39edd8c4ecaa4556e989147ebf219227e2cd2e8a43c7e7fcb1f1c18c5fd6a3d", 199 | "sha256:fe0644d9ab041506b62782e92b06b8c68cca799e1a9636ec398675459e031405" 200 | ], 201 | "version": "==0.2.8" 202 | }, 203 | "pyparsing": { 204 | "hashes": [ 205 | "sha256:c203ec8783bf771a155b207279b9bccb8dea02d8f0c9e5f8ead507bc3246ecc1", 206 | "sha256:ef9d7589ef3c200abe66653d3f1ab1033c3c419ae9b9bdb1240a85b024efc88b" 207 | ], 208 | "markers": "python_version >= '2.6' and python_version not in '3.0, 3.1, 3.2, 3.3'", 209 | "version": "==2.4.7" 210 | }, 211 | "python-dateutil": { 212 | "hashes": [ 213 | "sha256:73ebfe9dbf22e832286dafa60473e4cd239f8592f699aa5adaf10050e6e1823c", 214 | "sha256:75bb3f31ea686f1197762692a9ee6a7550b59fc6ca3a1f4b5d7e32fb98e2da2a" 215 | ], 216 | "index": "pypi", 217 | "version": "==2.8.1" 218 | }, 219 | "pytz": { 220 | "hashes": [ 221 | "sha256:83a4a90894bf38e243cf052c8b58f381bfe9a7a483f6a9cab140bc7f702ac4da", 222 | "sha256:eb10ce3e7736052ed3623d49975ce333bcd712c7bb19a58b9e2089d4057d0798" 223 | ], 224 | "version": "==2021.1" 225 | }, 226 | "pyyaml": { 227 | "hashes": [ 228 | "sha256:08682f6b72c722394747bddaf0aa62277e02557c0fd1c42cb853016a38f8dedf", 229 | "sha256:0f5f5786c0e09baddcd8b4b45f20a7b5d61a7e7e99846e3c799b05c7c53fa696", 230 | "sha256:129def1b7c1bf22faffd67b8f3724645203b79d8f4cc81f674654d9902cb4393", 231 | "sha256:294db365efa064d00b8d1ef65d8ea2c3426ac366c0c4368d930bf1c5fb497f77", 232 | "sha256:3b2b1824fe7112845700f815ff6a489360226a5609b96ec2190a45e62a9fc922", 233 | "sha256:3bd0e463264cf257d1ffd2e40223b197271046d09dadf73a0fe82b9c1fc385a5", 234 | "sha256:4465124ef1b18d9ace298060f4eccc64b0850899ac4ac53294547536533800c8", 235 | "sha256:49d4cdd9065b9b6e206d0595fee27a96b5dd22618e7520c33204a4a3239d5b10", 236 | "sha256:4e0583d24c881e14342eaf4ec5fbc97f934b999a6828693a99157fde912540cc", 237 | 
"sha256:5accb17103e43963b80e6f837831f38d314a0495500067cb25afab2e8d7a4018", 238 | "sha256:607774cbba28732bfa802b54baa7484215f530991055bb562efbed5b2f20a45e", 239 | "sha256:6c78645d400265a062508ae399b60b8c167bf003db364ecb26dcab2bda048253", 240 | "sha256:72a01f726a9c7851ca9bfad6fd09ca4e090a023c00945ea05ba1638c09dc3347", 241 | "sha256:74c1485f7707cf707a7aef42ef6322b8f97921bd89be2ab6317fd782c2d53183", 242 | "sha256:895f61ef02e8fed38159bb70f7e100e00f471eae2bc838cd0f4ebb21e28f8541", 243 | "sha256:8c1be557ee92a20f184922c7b6424e8ab6691788e6d86137c5d93c1a6ec1b8fb", 244 | "sha256:bb4191dfc9306777bc594117aee052446b3fa88737cd13b7188d0e7aa8162185", 245 | "sha256:bfb51918d4ff3d77c1c856a9699f8492c612cde32fd3bcd344af9be34999bfdc", 246 | "sha256:c20cfa2d49991c8b4147af39859b167664f2ad4561704ee74c1de03318e898db", 247 | "sha256:cb333c16912324fd5f769fff6bc5de372e9e7a202247b48870bc251ed40239aa", 248 | "sha256:d2d9808ea7b4af864f35ea216be506ecec180628aced0704e34aca0b040ffe46", 249 | "sha256:d483ad4e639292c90170eb6f7783ad19490e7a8defb3e46f97dfe4bacae89122", 250 | "sha256:dd5de0646207f053eb0d6c74ae45ba98c3395a571a2891858e87df7c9b9bd51b", 251 | "sha256:e1d4970ea66be07ae37a3c2e48b5ec63f7ba6804bdddfdbd3cfd954d25a82e63", 252 | "sha256:e4fac90784481d221a8e4b1162afa7c47ed953be40d31ab4629ae917510051df", 253 | "sha256:fa5ae20527d8e831e8230cbffd9f8fe952815b2b7dae6ffec25318803a7528fc", 254 | "sha256:fd7f6999a8070df521b6384004ef42833b9bd62cfee11a09bda1079b4b704247", 255 | "sha256:fdc842473cd33f45ff6bce46aea678a54e3d21f1b61a7750ce3c498eedfe25d6", 256 | "sha256:fe69978f3f768926cfa37b867e3843918e012cf83f680806599ddce33c2c68b0" 257 | ], 258 | "index": "pypi", 259 | "version": "==5.4.1" 260 | }, 261 | "requests": { 262 | "hashes": [ 263 | "sha256:27973dd4a904a4f13b263a19c866c13b92a39ed1c964655f025f3f8d3d75b804", 264 | "sha256:c210084e36a42ae6b9219e00e48287def368a26d03a048ddad7bfee44f75871e" 265 | ], 266 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'", 267 | "version": "==2.25.1" 268 | }, 269 | "requests-oauthlib": { 270 | "hashes": [ 271 | "sha256:7f71572defaecd16372f9006f33c2ec8c077c3cfa6f5911a9a90202beb513f3d", 272 | "sha256:b4261601a71fd721a8bd6d7aa1cc1d6a8a93b4a9f5e96626f8e4d91e8beeaa6a", 273 | "sha256:fa6c47b933f01060936d87ae9327fead68768b69c6c9ea2109c48be30f2d4dbc" 274 | ], 275 | "version": "==1.3.0" 276 | }, 277 | "rsa": { 278 | "hashes": [ 279 | "sha256:78f9a9bf4e7be0c5ded4583326e7461e3a3c5aae24073648b4bdfa797d78c9d2", 280 | "sha256:9d689e6ca1b3038bc82bf8d23e944b6b6037bc02301a574935b2dd946e0353b9" 281 | ], 282 | "markers": "python_version >= '3.6'", 283 | "version": "==4.7.2" 284 | }, 285 | "six": { 286 | "hashes": [ 287 | "sha256:30639c035cdb23534cd4aa2dd52c3bf48f06e5f4a941509c8bafd8ce11080259", 288 | "sha256:8b74bedcbbbaca38ff6d7491d76f2b06b3592611af620f8426e82dddb04a5ced" 289 | ], 290 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", 291 | "version": "==1.15.0" 292 | }, 293 | "terminaltables": { 294 | "hashes": [ 295 | "sha256:f3eb0eb92e3833972ac36796293ca0906e998dc3be91fbe1f8615b331b853b81" 296 | ], 297 | "version": "==3.1.0" 298 | }, 299 | "ujson": { 300 | "hashes": [ 301 | "sha256:0190d26c0e990c17ad072ec8593647218fe1c675d11089cd3d1440175b568967", 302 | "sha256:0ea07fe57f9157118ca689e7f6db72759395b99121c0ff038d2e38649c626fb1", 303 | "sha256:30962467c36ff6de6161d784cd2a6aac1097f0128b522d6e9291678e34fb2b47", 304 | "sha256:4d6d061563470cac889c0a9fd367013a5dbd8efc36ad01ab3e67a57e56cad720", 305 | 
"sha256:5e1636b94c7f1f59a8ead4c8a7bab1b12cc52d4c21ababa295ffec56b445fd2a", 306 | "sha256:7333e8bc45ea28c74ae26157eacaed5e5629dbada32e0103c23eb368f93af108", 307 | "sha256:84b1dca0d53b0a8d58835f72ea2894e4d6cf7a5dd8f520ab4cbd698c81e49737", 308 | "sha256:91396a585ba51f84dc71c8da60cdc86de6b60ba0272c389b6482020a1fac9394", 309 | "sha256:a214ba5a21dad71a43c0f5aef917cd56a2d70bc974d845be211c66b6742a471c", 310 | "sha256:aad6d92f4d71e37ea70e966500f1951ecd065edca3a70d3861b37b176dd6702c", 311 | "sha256:b3a6dcc660220539aa718bcc9dbd6dedf2a01d19c875d1033f028f212e36d6bb", 312 | "sha256:b5c70704962cf93ec6ea3271a47d952b75ae1980d6c56b8496cec2a722075939", 313 | "sha256:c615a9e9e378a7383b756b7e7a73c38b22aeb8967a8bfbffd4741f7ffd043c4d", 314 | "sha256:d3a87888c40b5bfcf69b4030427cd666893e826e82cc8608d1ba8b4b5e04ea99", 315 | "sha256:e2cadeb0ddc98e3963bea266cc5b884e5d77d73adf807f0bda9eca64d1c509d5", 316 | "sha256:e390df0dcc7897ffb98e17eae1f4c442c39c91814c298ad84d935a3c5c7a32fa", 317 | "sha256:e6e90330670c78e727d6637bb5a215d3e093d8e3570d439fd4922942f88da361", 318 | "sha256:eb6b25a7670c7537a5998e695fa62ff13c7f9c33faf82927adf4daa460d5f62e", 319 | "sha256:f273a875c0b42c2a019c337631bc1907f6fdfbc84210cc0d1fff0e2019bbfaec", 320 | "sha256:f8aded54c2bc554ce20b397f72101737dd61ee7b81c771684a7dd7805e6cca0c", 321 | "sha256:fc51e545d65689c398161f07fd405104956ec27f22453de85898fa088b2cd4bb" 322 | ], 323 | "markers": "python_version >= '3.6'", 324 | "version": "==4.0.2" 325 | }, 326 | "uritemplate": { 327 | "hashes": [ 328 | "sha256:07620c3f3f8eed1f12600845892b0e036a2420acf513c53f7de0abd911a5894f", 329 | "sha256:5af8ad10cec94f215e3f48112de2022e1d5a37ed427fbd88652fa908f2ab7cae" 330 | ], 331 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", 332 | "version": "==3.0.1" 333 | }, 334 | "urllib3": { 335 | "hashes": [ 336 | "sha256:2f4da4594db7e1e110a944bb1b551fdf4e6c136ad42e4234131391e21eb5b0df", 337 | "sha256:e7b021f7241115872f92f43c6508082facffbd1c048e3c6e2bb9c2a157e28937" 338 | ], 339 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4' and python_version < '4'", 340 | "version": "==1.26.4" 341 | } 342 | }, 343 | "develop": {} 344 | } 345 | -------------------------------------------------------------------------------- /photosync.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | 3 | import datetime 4 | import json 5 | import os 6 | import os.path 7 | import pickle 8 | import sqlite3 9 | 10 | import arguments 11 | import dateutil.parser 12 | import httplib2 13 | 14 | from googleapiclient.discovery import build 15 | from google_auth_oauthlib.flow import InstalledAppFlow 16 | from google.auth.transport.requests import Request 17 | 18 | PROD = False 19 | TRACE = True 20 | 21 | 22 | def log(level, msg, *args): 23 | if PROD: 24 | return 25 | if level == 'TRACE' and not TRACE: 26 | return 27 | if args: 28 | msg = msg.format(*args) 29 | print (level, "::", msg) 30 | 31 | def make_date_iso(d): 32 | """Expects a date like 2019-1-4 and preprocesses it for ISO parsing. 33 | """ 34 | return '-'.join('{:02d}'.format(int(p)) for p in d.split('-')) 35 | 36 | class TokenSource: 37 | """Return OAuth token for PhotosService to use. 38 | 39 | Please acquire your own client secret and put it into the clientsecret.json 40 | (or any other file) in the directory you are running this program from. 41 | 42 | On first use, this will prompt you for authorization on stdin. 
On subsequent
43 |     invocations, it will use the token from the database and refresh it when
44 |     needed.
45 |     """
46 |     SCOPES = ['https://www.googleapis.com/auth/photoslibrary']
47 |     CRED_ID = 'installed.main'
48 | 
49 |     def __init__(self, db=None, tokensfile=None, clientsecret=None):
50 |         self._db = db
51 |         self._tokensfile = tokensfile
52 |         self._clientsecret = clientsecret
53 | 
54 |     def creds(self):
55 |         if self._clientsecret is None:
56 |             if self._tokensfile and os.path.exists(self._tokensfile):
57 |                 with open(self._tokensfile, 'rb') as f:
58 |                     creds = pickle.load(f)
59 |                     return creds
60 |             elif self._db:
61 |                 creds = self._db.get_credentials(self.CRED_ID)
62 |                 if creds:
63 |                     creds = pickle.loads(creds)
64 |                     return creds
65 |         assert self._clientsecret is not None, 'Need --creds to proceed with authorization'
66 |         flow = InstalledAppFlow.from_client_secrets_file(self._clientsecret, self.SCOPES)
67 |         creds = flow.run_local_server()
68 |         if creds and self._tokensfile:
69 |             with open(self._tokensfile, 'wb') as f:
70 |                 pickle.dump(creds, f)
71 |         if creds and self._db:
72 |             self._db.store_credentials(self.CRED_ID, pickle.dumps(creds))
73 |         return creds
74 | 
75 | 
76 | class PhotosService:
77 | 
78 |     def __init__(self, tokens=None):
79 |         self._token_source = tokens
80 |         self._service = build('photoslibrary', 'v1', credentials=tokens.creds())
81 |         self._http = httplib2.Http()
82 | 
83 |     def get_item(self, id):
84 |         item = self._service.mediaItems().get(mediaItemId=id).execute()
85 |         return item
86 | 
87 |     def list_library(self, start=None, to=None):
88 |         """Yields items from the library.
89 | 
90 |         Arguments:
91 |             start: datetime.date
92 |             to: datetime.date
93 | 
94 |         Yields:
95 |             mediaItem dicts (newest first).
96 |         """
97 |         filters = {}
98 |         if start or to:
99 |             rng_filter = {'ranges': {}}
100 |             if not start:
101 |                 start = datetime.date(1999, 1, 1)
102 |             if not to:
103 |                 to = datetime.datetime.now().date()
104 |             rng_filter['ranges']['startDate'] = {'year': start.year, 'month': start.month, 'day': start.day}
105 |             rng_filter['ranges']['endDate'] = {'year': to.year, 'month': to.month, 'day': to.day}
106 |             filters['dateFilter'] = rng_filter
107 |         pagetoken = None
108 | 
109 |         # Photos are returned in reversed order of creationTime.
110 |         while True:
111 |             resp = self._service.mediaItems().search(body={'pageSize': 75, 'filters': filters, 'pageToken': pagetoken}).execute()
112 |             pagetoken = resp.get('nextPageToken', None)
113 |             items = resp.get('mediaItems', None)
114 |             if not items:
115 |                 return
116 |             for i in items:
117 |                 log('TRACE', i['mediaMetadata']['creationTime'])
118 |                 yield i
119 |             if pagetoken is None:
120 |                 return
121 | 
122 |     def download_items(self, items):
123 |         """Download multiple items.
124 | 
125 |         Arguments:
126 |             items: List of (id, path, video) tuples.
127 | 
128 |         Returns:
129 |             List of IDs that were successfully downloaded.
130 | """ 131 | ids = [i[0] for i in items] 132 | media_items = self._service.mediaItems().batchGet(mediaItemIds=ids).execute() 133 | ok = [] 134 | i = -1 135 | for result in media_items['mediaItemResults']: 136 | i += 1 137 | if 'status' in result: 138 | log('WARN', 'Could not query info for {}: {}'.format(items[i][0], result['status'])) 139 | continue 140 | item = result['mediaItem'] 141 | rawurl = item['baseUrl'] 142 | if 'video' in item['mediaMetadata']: 143 | rawurl += '=dv' 144 | else: 145 | rawurl += '=d' 146 | os.makedirs(items[i][1], exist_ok=True) 147 | p = os.path.join(items[i][1], item['filename']) 148 | log('INFO', 'Downloading {}', p) 149 | resp, cont = self._http.request(rawurl, 'GET') 150 | if resp.status != 200: 151 | log('WARN', 'HTTP item download failed: {} {}'.format(resp.status, resp.reason)) 152 | continue 153 | with open(p, 'wb') as f: 154 | f.write(cont) 155 | size = len(cont) / (1024. * 1024.) 156 | log('INFO', 'Downloaded {} successfully ({:.2f} MiB)', p, size) 157 | ok.append(item['id']) 158 | return ok 159 | 160 | 161 | class DB: 162 | 163 | def __init__(self, path): 164 | self._db = sqlite3.connect(path) 165 | self.initdb() 166 | self._dtparse = dateutil.parser.isoparser() 167 | 168 | def initdb(self): 169 | cur = self._db.cursor() 170 | cur.execute('CREATE TABLE IF NOT EXISTS items (id TEXT PRIMARY KEY, creationTime TEXT, path TEXT, mimetype \ 171 | TEXT, filename TEXT, video INTEGER, offline INTEGER)') 172 | cur.execute('CREATE TABLE IF NOT EXISTS transactions (id TEXT, type TEXT, time INTEGER)') 173 | cur.execute('CREATE TABLE IF NOT EXISTS oauth (id TEXT PRIMARY KEY, credentials BLOB)') 174 | self._db.commit() 175 | 176 | def store_credentials(self, id, creds): 177 | with self._db as conn: 178 | cur = conn.cursor() 179 | cur.execute('SELECT id FROM oauth WHERE id = ?', (id,)) 180 | if not cur.fetchone(): 181 | cur.execute('INSERT INTO oauth (id, credentials) VALUES (?, ?)', (id, creds)) 182 | return 183 | cur.close() 184 | cur = conn.cursor() 185 | cur.execute('UPDATE oauth SET credentials = ? WHERE id = ?', (creds, id)) 186 | 187 | def get_credentials(self, id): 188 | with self._db as conn: 189 | cur = conn.cursor() 190 | cur.execute('SELECT credentials FROM oauth WHERE id = ?', (id,)) 191 | row = cur.fetchone() 192 | if row: 193 | return row[0] 194 | return None 195 | 196 | def add_online_item(self, media_item, path): 197 | with self._db as conn: 198 | cur = conn.cursor() 199 | cur.execute('SELECT id FROM items WHERE id = "{}"'.format(media_item['id'])) 200 | if cur.fetchone(): 201 | log('INFO', 'Photo already in store.') 202 | cur.close() 203 | return False 204 | log('INFO', 'Inserting item {}'.format(media_item['id'])) 205 | cur.close() 206 | 207 | creation_time = int(self._dtparse.isoparse(media_item['mediaMetadata']['creationTime']).timestamp()) 208 | is_video = 1 if 'video' in media_item['mediaMetadata'] else 0 209 | conn.cursor().execute('INSERT INTO items (id, creationTime, path, mimetype, filename, video, offline) VALUES (?, ?, ?, ?, ?, ?, 0)', (media_item['id'], creation_time, path, media_item['mimeType'], media_item['filename'], is_video)) 210 | self.record_transaction(media_item['id'], 'ADD') 211 | return True 212 | 213 | def get_items_by_downloaded(self, downloaded=False): 214 | """Generate items (as [id, path, filename, is_video]) that are not yet present locally.""" 215 | with self._db as conn: 216 | cur = conn.cursor() 217 | cur.execute('SELECT id, path, filename, video FROM items WHERE offline = ? 
ORDER BY creationTime ASC', (1 if downloaded else 0,))
218 |             while True:
219 |                 row = cur.fetchone()
220 |                 if not row:
221 |                     break
222 |                 yield row
223 | 
224 |     def mark_items_downloaded(self, ids, downloaded=True):
225 |         with self._db as conn:
226 |             for id in ids:
227 |                 conn.cursor().execute('UPDATE items SET offline = ? WHERE id = ?', (1 if downloaded else 0, id))
228 |                 self.record_transaction(id, 'DOWNLOAD')
229 | 
230 |     def existing_items_range(self):
231 |         with self._db as conn:
232 |             cursor = conn.cursor()
233 |             cursor.execute('SELECT creationTime FROM items ORDER BY creationTime DESC LIMIT 1')
234 |             newest = cursor.fetchone()
235 |             cursor.execute('SELECT creationTime FROM items ORDER BY creationTime ASC LIMIT 1')
236 |             oldest = cursor.fetchone()
237 | 
238 |             # Safe defaults that will lead to all items being selected.
239 |             old_default = datetime.datetime.now()
240 |             new_default = datetime.datetime.fromtimestamp(0)
241 |             return (
242 |                 datetime.datetime.fromtimestamp(int(oldest[0])) if oldest else old_default,
243 |                 datetime.datetime.fromtimestamp(int(newest[0])) if newest else new_default
244 |             )
245 | 
246 |     def record_transaction(self, id, typ):
247 |         """Record an event in the transaction log.
248 | 
249 |         typ should be one of 'ADD', 'DOWNLOAD'.
250 |         """
251 |         with self._db as conn:
252 |             cursor = conn.cursor()
253 |             cursor.execute('INSERT INTO transactions (id, type, time) VALUES (?, ?, ?)', (id, typ, int(datetime.datetime.now().timestamp())))
254 | 
255 | 
256 | class Driver:
257 |     """Coordinates synchronization.
258 | 
259 |     1. Fetch item metadata (list_library). This takes a long time on the first run.
260 |     2. Check for items not yet downloaded, download them.
261 |     3. Start again.
262 |     """
263 | 
264 |     def __init__(self, db, photosservice, root='', path_mapper=None):
265 |         self._root = root
266 |         self._db = db
267 |         self._svc = photosservice
268 |         self._path_mapper = path_mapper if path_mapper else Driver.path_from_date
269 | 
270 |     def fetch_metadata(self, date_range=(None, None), window_heuristic=False):
271 |         """Fetch media metadata and write it to the database."""
272 | 
273 |         # First, figure out which ranges we need to fetch.
274 |         ranges = [date_range]
275 |         if not (date_range[0] or date_range[1]):
276 |             if window_heuristic:
277 |                 (oldest, newest) = self._db.existing_items_range()
278 |                 # Special case where no previous items exist.
279 |                 if newest == datetime.datetime.fromtimestamp(0):
280 |                     ranges = [(datetime.datetime.fromtimestamp(0), datetime.datetime.now())]
281 |                 else:
282 |                     # Fetch from the time before the oldest item and after the newest item.
283 |                     # This will fail if items are uploaded with a creation
284 |                     # date in between existing items.
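285 |                     # Hypothetical example: with oldest = 2019-03-01 and newest = 2020-01-01,
286 |                     # the two windows are [epoch, 2019-03-01] and [2020-01-01, now]; an item
287 |                     # uploaded later with a creationTime of 2019-06-15 falls in neither window
288 |                     # and is only picked up by --all or a matching --dates range.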
289 |                     ranges = [
290 |                         (datetime.datetime.fromtimestamp(0), oldest),
291 |                         (newest, datetime.datetime.now())
292 |                     ]
293 |             else:
294 |                 ranges = [(datetime.datetime.fromtimestamp(0), datetime.datetime.now())]
295 | 
296 |         log('INFO', 'Starting run for {}'.format(date_range))
297 | 
298 |         for rng in ranges:
299 |             for item in self._svc.list_library(start=rng[0], to=rng[1]):
300 |                 log('INFO', 'Fetched metadata for {}'.format(item['filename']))
301 |                 if self._db.add_online_item(item, self._path_mapper(item)):
302 |                     log('INFO', 'Added {} to DB'.format(item['filename']))
303 |         return True
304 | 
305 |     def download_items(self):
306 |         """Scans the database for items not yet downloaded and downloads them."""
307 |         retry = []
308 |         chunk = []
309 |         chunksize = 16
310 |         for item in self._db.get_items_by_downloaded(False):
311 |             (id, path, filename, is_video) = item
312 |             path = os.path.join(self._root, path)
313 |             chunk.append((id, path, is_video))
314 | 
315 |             if len(chunk) >= chunksize:
316 |                 ok = self._svc.download_items(chunk)
317 |                 self._db.mark_items_downloaded(ok)
318 |                 wantids = set(i[0] for i in chunk)
319 |                 missing = wantids - set(ok)
320 |                 for item in chunk:
321 |                     if item[0] in missing:
322 |                         retry.append(item)
323 |                 chunk = []
324 | 
325 |         chunk.extend(retry)
326 |         n = chunksize
327 |         smalls = [chunk[i:i + n] for i in range(0, len(chunk), n)]
328 |         for chunk in smalls:
329 |             ok = self._svc.download_items(chunk)
330 |             self._db.mark_items_downloaded(ok)
331 |             if len(ok) < len(chunk):
332 |                 log('WARN', 'Could not download {} items. Please try again later (photosync will automatically retry these)', len(chunk) - len(ok))
333 | 
334 |     def drive(self, date_range=(None, None), window_heuristic=True):
335 |         """First, download all metadata since the most recently fetched item.
336 |         Then, download content."""
337 |         # This possibly takes a long time and it may be that the user aborts in
338 |         # between. It returns fast if most items are already present locally.
339 |         # window_heuristic asks the metadata fetching logic to only fetch
340 |         # items older than the oldest or newer than the newest item, which is
341 |         # what we want for updating the items library.
342 |         if self.fetch_metadata(date_range, window_heuristic):
343 |             self.download_items()
344 | 
345 |     def find_vanished_items(self, dir):
346 |         """Checks if all photos that are supposed to be downloaded are still present.
347 | 
348 |         Marks them for download otherwise, meaning that they will be downloaded later.
349 |         """
350 |         found = 0
351 |         for (id, path, filename, video) in self._db.get_items_by_downloaded(downloaded=True):
352 |             path = os.path.join(dir, path, filename)
353 |             try:
354 |                 os.stat(path)
355 |             except FileNotFoundError:
356 |                 log('INFO', 'Found vanished item at {}; marking for download', path)
357 |                 found += 1
358 |                 self._db.mark_items_downloaded([id], downloaded=False)
359 |         if found > 0:
360 |             log('WARN', 'Found {} vanished items. Reattempting download now...', found)
361 |             return True
362 |         return False
363 | 
364 | 
365 |     def path_from_date(item):
366 |         """By default, map items to a year/month/day directory.
367 | 
368 |         Important: The returned path omits the root directory (--dir, i.e. self._root).
369 |         """
370 |         dt = dateutil.parser.isoparser().isoparse(item['mediaMetadata']['creationTime']).date()
371 |         return '{y}/{m:02d}/{d:02d}/'.format(y=dt.year, m=dt.month, d=dt.day)
372 | 
373 | 
374 | class Main(arguments.BaseArguments):
375 |     def __init__(self):
376 |         doc = '''
377 | Download photos and videos from Google Photos. Without any arguments, photosync will check for
378 | new photos and download all photos that are marked as not yet downloaded, as well as the new ones.
379 | 
380 | In general, photosync works like this:
381 | 
382 | * Download metadata for all items (initial run, or --all), or items in
383 |   a specified date range (--dates), or before the oldest and after the
384 |   newest item (default)
385 |   -> items are marked as "online", i.e. not yet downloaded.
386 | * Check the database for all items that are "online" and start downloading them.
387 | * Exit.
388 | 
389 | This means that if you interrupt photosync during any phase of
390 | synchronization, it will pick up afterwards without re-executing a lot
391 | of work, as long as you don't use the --all option.
392 | 
393 | Usage:
394 |     photosync.py [options]
395 | 
396 | Options:
397 |     -h --help      Show this screen.
398 |     -d --dir=      Root directory; where to download photos and store the database.
399 |     -a --all       Synchronize metadata for *all* photos instead of just before the oldest/after the newest photo. Needed if you have uploaded photos somewhere in the middle. Consider using --dates instead.
400 |     --creds=       Path to the client credentials JSON file. Defaults to none. Specify to force reauth. After the first authorization, tokens are saved in the database.
401 |     --dates=       Similar to --all, but only consider photos in the given date range: yyyy-mm-dd:yyyy-mm-dd or day: yyyy-mm-dd.
402 |     --query=       Query metadata for item and print on console.
403 |     --resync       Check local filesystem for files that should be downloaded but are not there (anymore).
404 | '''
405 |         super(arguments.BaseArguments, self).__init__(doc=doc)
406 |         self.dir = self.dir or '.'
407 |         self.creds = self.creds
408 | 
409 |     def main(self):
410 |         # --resync (handled below) inspects the local filesystem for vanished files.
411 |         db = DB(os.path.join(self.dir, 'sync.db'))
412 |         s = PhotosService(tokens=TokenSource(db=db, clientsecret=self.creds))
413 |         d = Driver(db, s, root=self.dir)
414 | 
415 |         if self.query:
416 |             print(s.get_item(self.query))
417 |             return
418 |         if self.resync:
419 |             if d.find_vanished_items(self.dir):
420 |                 d.download_items()
421 |                 log('WARN', 'Finished downloading missing items.')
422 |             return
423 |         if self.all:
424 |             d.drive(window_heuristic=False)
425 |         elif self.dates:
426 |             parts = self.dates.split(':')
427 |             p = dateutil.parser.isoparser()
428 |             window = None
429 |             if len(parts) == 2:
430 |                 (a, b) = parts
431 |                 (a, b) = (make_date_iso(a), make_date_iso(b))
432 |                 (a, b) = p.isoparse(a), p.isoparse(b)
433 |                 window = (a, b)
434 |             elif len(parts) == 1:
435 |                 date = p.isoparse(make_date_iso(parts[0]))
436 |                 window = (date, date)
437 |             else:
438 |                 print("Please use --dates with argument yyyy-mm-dd:yyyy-mm-dd (from:to) or yyyy-mm-dd.")
439 |                 return
440 |             d.drive(window_heuristic=False, date_range=window)
441 |         else:
442 |             d.drive(window_heuristic=True)
443 | 
444 | 
445 | def main():
446 |     Main().main()
447 | 
448 | 
449 | if __name__ == '__main__':
450 |     main()
451 | 
--------------------------------------------------------------------------------