├── README.md
└── watchlist.py

/README.md:
--------------------------------------------------------------------------------

# plex-watchlist-migration
Script to migrate your "watched" list of media to a new Plex server.

## NOTICE (Sep 18 2022)
Migrating the watchlist is currently hit-or-miss. The guid (unique id) for a watched item no longer appears
consistent between source and destination servers, which breaks the logic for matching and updating watched items on the destination. I'm trying to figure out why some shows have different guids between servers while others are the same (as they should be). Until then, know that you will be missing some items; they will be listed in the output when you run the tool.

*Features*
* Migrate missing users.
* Migrate watch history of MANAGED users.
* Migrate date added/created to preserve on-deck ordering.
* Co-exists with the new Plex watchlist sync feature (this tool only handles MANAGED users).

This utility migrates your users' media viewing history when you move to a new
Plex server instance, or lets you maintain a failover. Copying around the full database folder structure is unreliable
and just trouble, so this tool assumes you are starting new libraries on the new machine.
After copying over all your media, instruct the new Plex server to scan and rebuild your metadata.
Yes, this will take quite a bit of time for large collections. Do this for all your
libraries before proceeding. This script should be your final step.

You may start an export on a running Plex server, but you *must* stop the Plex server on the destination side before loading the export.

Export from your primary (source) Plex database:

    python3 watchlist.py -e <source_db> <export_file>

ex: `python3 watchlist.py -e /mnt/data/Plex/Library/Application\ Support/Plex\ Media\ Server/Plug-in\ Support/Databases/com.plexapp.plugins.library.db /tmp/export.json`

When ready to import into the destination Plex db, *stop the Plex server first* or you may corrupt the database.

    python3 watchlist.py -i <export_file> <dest_db>

ex: `python3 watchlist.py -i export.json /mnt/data/Plex/Library/Application\ Support/Plex\ Media\ Server/Plug-in\ Support/Databases/com.plexapp.plugins.library.db`

This script is rerun-safe. You can run it as many times as you want without creating duplicate data.

### NOTE: If you get a bunch of warnings during import listing media files not found on the destination, it could be:
1. Old media garbage data in the source Plex db.
2. Incompatible _guid_ values between source and destination. When Plex switched to their own scrapers the guid values changed, and guids are used as primary keys in the database. To fix, you can try refreshing full metadata of the affected item(s) and re-exporting.
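If you suspect guid drift between your two servers (see the NOTE above), you can compare the two databases directly before migrating. The helper below is a sketch and not part of watchlist.py; `guid_mismatches` is a hypothetical name, it assumes both `.db` files are readable locally, and it matches items by title, so duplicate titles can produce false positives:

```python
import sqlite3
import sys


def guid_mismatches(source_db, dest_db):
    """Map title -> (source_guid, dest_guid) for items whose guids differ."""

    def title_to_guid(path):
        con = sqlite3.connect(path)
        try:
            # same movie/show sections (section_type 1 and 2) the migration tool uses
            rows = con.execute(
                'select title, guid from metadata_items '
                'inner join library_sections '
                'on library_sections.id = metadata_items.library_section_id '
                'where library_sections.section_type in (1,2)')
            return {title: guid for title, guid in rows}
        finally:
            con.close()

    src = title_to_guid(source_db)
    dst = title_to_guid(dest_db)
    # titles present in both databases but keyed by different guids
    return {t: (src[t], dst[t]) for t in src.keys() & dst.keys() if src[t] != dst[t]}


if __name__ == '__main__' and len(sys.argv) == 3:
    for title, (old, new) in guid_mismatches(sys.argv[1], sys.argv[2]).items():
        print(f'{title}: {old} -> {new}')
```

Items reported here are the ones the import step will list as not found in the destination media list; refreshing their metadata on the destination and re-exporting may realign the guids.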

--------------------------------------------------------------------------------
/watchlist.py:
--------------------------------------------------------------------------------

import sqlite3
import sys
import json


def dict_factory(cursor, row):
    d = {}
    for idx, col in enumerate(cursor.description):
        d[col[0]] = row[idx]
    return d


def do_export(source_db, export_filename):
    db_connection = sqlite3.connect(source_db)
    db_connection.row_factory = dict_factory
    cursor = db_connection.cursor()

    #
    # get users
    #
    rows = cursor.execute('select id,name,hashed_password,salt,created_at from accounts where id != 0 and id < 1000')
    user_list = []
    for user in rows.fetchall():

        print(f"Exporting for user {user['name']}")
        user_list.append(user)

        #
        # get the watchlist for the user
        #
        # Note there is an additional join to metadata_items that seemingly isn't used for anything. It is actually
        # used implicitly to only grab valid watch records, as there tend to be leftovers from removed media or just
        # orphaned due to bugs.
        #
        user["watchlist"] = []
        watchlist = cursor.execute(
            "select account_id,metadata_item_views.guid,metadata_item_views.metadata_type,metadata_item_views.library_section_id,grandparent_title,"
            "parent_index,parent_title,'index',metadata_item_views.title,thumb_url,viewed_at,grandparent_guid,metadata_item_views.originally_available_at "
            "from metadata_item_views "
            "inner join library_sections on library_sections.id = metadata_item_views.library_section_id "
            "inner join metadata_items on metadata_items.guid = metadata_item_views.guid "
            "where account_id=? and library_sections.section_type in (1,2)", (user["id"],))
        for row in watchlist.fetchall():
            user["watchlist"].append(row)
        print(f" {len(user['watchlist'])} watched items")

        user["metadata_item_settings"] = {}
        settings = cursor.execute(
            'select account_id,guid,rating,view_offset,view_count,last_viewed_at,created_at,'
            'skip_count,last_skipped_at,changed_at,extra_data '
            'from metadata_item_settings '
            'where account_id = ?', (user["id"],))
        for row in settings.fetchall():
            guid = row["guid"]
            user["metadata_item_settings"][guid] = row

    #
    # get list of source library sections for mapping to new ones
    #
    source_sections = dict()
    for row in cursor.execute('select id,name from library_sections where section_type in (1,2)').fetchall():
        (_id, name) = row.values()
        source_sections[_id] = {'name': name}

    #
    # get list of media added dates to preserve on-deck ordering in the destination system
    #
    ordering = cursor.execute(
        'select guid,added_at,metadata_items.created_at from metadata_items '
        'inner join library_sections on library_sections.id = metadata_items.library_section_id '
        'where library_sections.section_type in (1,2)').fetchall()

    full_export = {"users": user_list, "oldsections": source_sections, "ordering": ordering}
    with open(export_filename, "w") as fp:
        json.dump(full_export, fp, indent=2)
    print("export complete, results in " + export_filename)


def do_import(export_filename, dest_db):
    db_connection = sqlite3.connect(dest_db)
    db_connection.row_factory = dict_factory
    cursor = db_connection.cursor()

    with open(export_filename, "r") as fp:
        source = json.load(fp)

    orig_userlist = source["users"]
    source_sections = source["oldsections"]
    ordering = source["ordering"]

    #
    # Get list of all guids in the new system for validation.
    #
    dest_guid_list = [row["guid"] for row in cursor.execute(
        'select guid from metadata_items mi inner join library_sections ls on mi.library_section_id = ls.id where ls.section_type in (1,2)')]

    #
    # map source to destination sections
    #
    for values in source_sections.values():
        for row in cursor.execute('select id from library_sections where name = ?', (values['name'],)):
            values['newid'] = row["id"]
        if "newid" not in values:
            name = values["name"]
            print(f'WARNING: library name "{name}" not found in destination database - watchlist for this library cannot be migrated')

    #
    # create a map of source to destination user ids
    #
    for orig_user in orig_userlist:
        print(f"Processing user {orig_user['name']}")
        if orig_user["id"] > 1:  # if PlexPass and have non-admin users, bring them over
            # check if the user already exists
            cursor.execute('select id, name from accounts where name = ?', (orig_user["name"],))
            existing = cursor.fetchone()
            if existing is None:
                # create user
                cursor.execute('insert into accounts (name,hashed_password,salt,created_at) values (?,?,?,?)',
                               (orig_user["name"], orig_user["hashed_password"], orig_user["salt"],
                                orig_user["created_at"]))
                dest_user_id = cursor.lastrowid
            else:
                # map account id to existing user
                dest_user_id = existing["id"]
        else:
            # probably just the main admin user (account id 1)
            dest_user_id = orig_user["id"]

        #
        # create a map of source guid to settings
        #
        source_settings = dict()
        for setting in orig_user["metadata_item_settings"].values():
            guid = setting["guid"]
            source_settings[guid] = setting

        #
        # get list of things the user already watched in the destination so we don't duplicate
        #
        count = 0
        dest_watchlist = cursor.execute(
            "select guid from metadata_item_views inner join library_sections on library_sections.id = metadata_item_views.library_section_id "
            "where account_id=? and library_sections.section_type in (1,2)", (dest_user_id,)).fetchall()

        already_watched = [watched["guid"] for watched in dest_watchlist]

        guid_dedup = []

        #
        # iterate over each watched item from the source system
        #
        for watched in orig_user["watchlist"]:
            guid = watched["guid"]
            gptitle = watched["grandparent_title"]
            ptitle = watched["parent_title"]
            title = watched["title"]

            if guid in guid_dedup:
                # clean up dups in the watch list while we are here
                continue

            if guid in already_watched:
                # already been registered, skip
                continue

            if guid not in dest_guid_list:
                print(f' {gptitle} {ptitle} {title} (guid {guid}) not found in destination media list - skipped')
                continue

            library_section_id = str(watched["library_section_id"])
            if library_section_id in source_sections.keys():
                if source_sections[library_section_id].get("newid") is not None:
                    library_section_id = source_sections[library_section_id]["newid"]
                else:
                    # Just silently ignore. We've already warned the user above that the dest library is missing.
                    continue
            else:
                print(f' Unexpected: unknown library {library_section_id} for guid {guid} ({title}) - skipped')
                continue

            if guid not in source_settings:
                print(
                    ' Unexpected: guid {} not found in metadata_item_settings for user {}'.format(guid, orig_user["id"]))
                continue

            #
            # everything looks good, insert rows
            #

            # TODO: refactor this; it assumes ordering of values
            w = list(watched.values())
            w[0] = dest_user_id
            w[3] = library_section_id
            watched = tuple(w)

            cursor.execute("insert into metadata_item_views ("
                           "account_id,guid,metadata_type,library_section_id,grandparent_title,"
                           "parent_index,parent_title,'index',title,thumb_url,viewed_at,grandparent_guid,"
                           "originally_available_at) values (?,?,?,?,?,?,?,?,?,?,?,?,?)", watched)

            # TODO: refactor this; it assumes ordering of values
            metadata_item_settings = orig_user["metadata_item_settings"]
            s = list(metadata_item_settings[guid].values())
            s[0] = dest_user_id
            settings = tuple(s)
            cursor.execute(
                'insert into metadata_item_settings (account_id,guid,rating,view_offset,view_count,'
                'last_viewed_at,created_at,skip_count,last_skipped_at,changed_at,extra_data) '
                'values (?,?,?,?,?,?,?,?,?,?,?)', settings)

            guid_dedup.append(guid)
            count += 1

        if count:
            print(f"{count} update(s) for {orig_user['name']}")
        else:
            print(f"No updates for {orig_user['name']}")

    #
    # go through all the media and migrate the added and created dates to preserve "on-deck" ordering
    #

    #
    # This is tricky because Plex uses custom C-based tokenizer code which isn't available to developers.
    # This causes a trigger error when attempting to update metadata_items.
    # My workaround: query the triggers, drop the triggers, do the updates, then recreate the triggers.
    # I think this is safe because those triggers just maintain text search indexes of titles, which this
    # code doesn't modify anyhow.
    #

    # grab the triggers
    triggers = cursor.execute("select name,sql from sqlite_master where type='trigger' and tbl_name='metadata_items' and name like '%_update_%'").fetchall()
    # now delete them
    for trigger in triggers:
        cursor.execute(f"drop trigger {trigger['name']}")

    for item in ordering:
        # just do an optimistic update - sqlite will ignore it if the row doesn't exist
        cursor.execute('update metadata_items set added_at=?, created_at=? where guid=?',
                       (item["added_at"], item["created_at"], item["guid"]))

    # recreate the triggers
    for trigger in triggers:
        cursor.execute(trigger["sql"])

    cursor.close()
    db_connection.commit()


if __name__ == '__main__':
    if len(sys.argv) < 4 or sys.argv[1] not in ("-e", "-i"):
        print("usage: watchlist.py -e <source_db> <export_file>")
        print("       watchlist.py -i <export_file> <dest_db>")
        sys.exit(0)
    if sys.argv[1] == "-e":
        do_export(sys.argv[2], sys.argv[3])
    if sys.argv[1] == "-i":
        do_import(sys.argv[2], sys.argv[3])

--------------------------------------------------------------------------------
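The trigger workaround at the end of do_import (save the definitions from sqlite_master, drop, update, recreate) is a general sqlite technique and can be demonstrated in isolation. The sketch below uses a made-up table and trigger; Plex's real triggers on metadata_items maintain full-text search indexes via its custom tokenizer:

```python
import sqlite3

con = sqlite3.connect(':memory:')
con.execute('create table items (guid text, added_at integer)')
con.execute("insert into items values ('guid-1', 0)")
# stand-in for Plex's tokenizer-backed triggers: any update would abort
con.execute("create trigger items_update_index after update on items "
            "begin select raise(abort, 'tokenizer not available'); end")

# 1) save the trigger definitions from sqlite_master
triggers = con.execute(
    "select name, sql from sqlite_master "
    "where type='trigger' and tbl_name='items'").fetchall()
# 2) drop them so updates can proceed
for name, sql in triggers:
    con.execute(f'drop trigger {name}')
# 3) perform the updates that would otherwise trip the trigger
con.execute("update items set added_at = 42 where guid = 'guid-1'")
# 4) restore the triggers from their original CREATE TRIGGER text
for name, sql in triggers:
    con.execute(sql)
```

This is only safe when, as in do_import, the updated columns are not the ones the triggers exist to maintain.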