├── _config.yml ├── plum_output.JPG ├── legacy_snt_output.JPG ├── LICENSE ├── _layouts └── default.html ├── README.md └── stickyparser.py /_config.yml: -------------------------------------------------------------------------------- 1 | theme: jekyll-theme-midnight -------------------------------------------------------------------------------- /plum_output.JPG: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/iamhunggy/StickyParser/HEAD/plum_output.JPG -------------------------------------------------------------------------------- /legacy_snt_output.JPG: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/iamhunggy/StickyParser/HEAD/legacy_snt_output.JPG -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2021 Elaine Hung 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /_layouts/default.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | {% seo %} 8 | 9 | 10 | 11 | 12 | 15 | 18 | 19 | 20 | 21 | 22 | 32 | 33 |
34 | 35 |
36 |
37 |

{{ site.title | default: site.github.repository_name }}

38 |

{{ site.description | default: site.github.project_tagline }}

39 |
40 | Project maintained by {{ site.github.owner_name }} 41 | Hosted on GitHub Pages — Theme by mattgraham 42 |
43 | 44 | {{ content }} 45 | 46 |
47 | 48 |
49 | 
50 | {% if site.google_analytics %}
51 | 
59 | {% endif %}
60 | 
61 | 
62 | 
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # StickyParser
2 | ### A Windows Sticky Notes parser (.snt and plum.sqlite supported) - recovery of deleted SQLite/plum.sqlite content also supported.
3 | For details on how Sticky Notes works, you can also refer to my write-up here: https://dingtoffee.medium.com/windows-sticky-notes-forensics-80ee31ab67ef
4 | 
5 | Sticky Notes is a Windows feature, available since Windows 7, that allows a user to create sticky notes on their desktop/laptop.
6 | ### Legacy Sticky Note Format
7 | * Available Windows versions: Windows 7 to Windows 10 (before version 1607)
8 | * Location: ```%APPDATA%\Microsoft\Sticky Notes\StickyNotes.snt```
9 | The .snt file is an MS OLE/compound file binary format.
10 | The .snt file can be opened and viewed using the MiTEC Structured Storage Viewer, or you can use the parser I created to extract the content.
11 | 
12 | ### Win10 Sticky Note Format
13 | * Available Windows versions: Windows 10 (version 1607 and later)
14 | * Location: ```%LOCALAPPDATA%\Packages\Microsoft.MicrosoftStickyNotes_8wekyb3d8bbwe\LocalState\plum.sqlite```
15 | * Shared-memory and write-ahead log files: ```%LOCALAPPDATA%\Packages\Microsoft.MicrosoftStickyNotes_8wekyb3d8bbwe\LocalState\plum.sqlite-shm``` and
16 | ```%LOCALAPPDATA%\Packages\Microsoft.MicrosoftStickyNotes_8wekyb3d8bbwe\LocalState\plum.sqlite-wal```
17 | 
18 | Starting with Windows 10 version 1607, Microsoft changed the Sticky Notes database from the OLE compound file format to SQLite3. To view the complete data, it is recommended to roll the write-ahead log and shared-memory files (plum.sqlite-wal and plum.sqlite-shm) back into the main database. You can use any SQLite browser or my script to parse the information out.
19 | 
20 | ### Features
21 | For the latest version of Sticky Notes:
22 | * Copy everything under ```%LOCALAPPDATA%\Packages\Microsoft.MicrosoftStickyNotes_8wekyb3d8bbwe\LocalState```.
23 | * Run StickyParser against the copied folder. Make sure the files other than plum.sqlite are all in the same folder. Once run, the WAL/SHM files will be merged into the .sqlite file.
24 | * Execute StickyParser.
25 | * The result CSV will be created under the directory you specified.
26 | * All timestamps are recorded in UTC.
27 | 
28 | For the legacy .snt format of Sticky Notes:
29 | * Copy StickyNotes.snt from ```C:\Users\User\AppData\Roaming\Microsoft\Sticky Notes\```
30 | * Run StickyParser against the StickyNotes.snt file.
31 | * The result CSV will be created under the directory you specified.
32 | * All timestamps are recorded in UTC.
33 | 
34 | Additional features:
35 | * Supports recovery of deleted content from plum.sqlite.
36 | * The result will be created under the directory you specified.
37 | 
38 | Supports Python 3.x only.
39 | 
40 | # Prerequisites
41 | Please install the relevant Python modules before running:
42 | ```pip install pandas olefile```
43 | 
44 | # Usage
45 | ```
46 | usage: stickyparser.py [-h] [-s [snt file]] [-p [sqlite file]] [-d [File Directory]] [-r [sqlite file]]
47 | 
48 | StickyParser: Parses sticky note files in the legacy snt format or the latest sqlite format. It can also be used to recover
49 | deleted content inside sqlite. For the latest version of Sticky Notes, please copy everything under the
50 | %LOCALAPPDATA%\Packages\Microsoft.MicrosoftStickyNotes_8wekyb3d8bbwe\LocalState folder. Run StickyParser against
51 | the copied folder. Make sure the other files apart from plum.sqlite are all in the same folder. Once run, the WAL/SHM
52 | files will be merged into the .sqlite file.
53 | 
54 | optional arguments:
55 |   -h, --help           show this help message and exit
56 |   -s [snt file]        Sticky note .snt file. Example: StickyParser.exe -s C:\Users\User\AppData\Roaming\Microsoft\Sticky
57 |                        Notes\StickyNotes.snt. Choose either -s or -p only.
58 |   -p [sqlite file]     Sticky note plum.sqlite file. Example: StickyParser -p \plum.sqlite. Choose either -s or
59 |                        -p.
60 |   -d [File Directory]  Specify the directory the output should be written to. Example: StickyParser -p -d
61 |                        C:\Users\User\Desktop\
62 |   -r [sqlite file]     To recover deleted content from sqlite.
63 | ```
64 | 
65 | # Example Commands
66 | * Parse snt file
67 | ```
68 | python stickyparser.py -s "C:\Users\User\AppData\Roaming\Microsoft\Sticky Notes\StickyNotes.snt" -d C:\temp
69 | ```
70 | * Parse plum.sqlite file
71 | ```
72 | python stickyparser.py -p %LOCALAPPDATA%\Packages\Microsoft.MicrosoftStickyNotes_8wekyb3d8bbwe\LocalState\plum.sqlite -d C:\temp
73 | ```
74 | * Recover deleted content inside plum.sqlite (it may also work on generic sqlite3 files)
75 | ```
76 | python stickyparser.py -r %LOCALAPPDATA%\Packages\Microsoft.MicrosoftStickyNotes_8wekyb3d8bbwe\LocalState\plum.sqlite -d C:\temp
77 | ```
78 | # Example Outputs
79 | SNT Parser Output
80 | ![SNT Parser Output](https://github.com/dingtoffee/StickyParser/blob/master/legacy_snt_output.JPG)
81 | Plum SQLite Parser Output
82 | ![Plum SQLite Parser Output](https://github.com/dingtoffee/StickyParser/blob/master/plum_output.JPG)
83 | 
84 | # References
85 | * https://github.com/aramosf/recoversqlite
86 | * https://gist.github.com/daddycocoaman/e1a4f31109e17188d5ce8fd0ca15b63e
87 | 
--------------------------------------------------------------------------------
/stickyparser.py:
--------------------------------------------------------------------------------
1 | import json
2 | import sqlite3
3 | import olefile
4 | import pandas as pd
5 | import datetime
6 | import argparse
7 | import sys
8 | import os
9 | import struct
10 | def snt(file):
11 | # https://www.tutorialspoint.com/python_digital_forensics/python_digital_forensics_important_artifacts_in_windows
12 | if not olefile.isOleFile(file):
13 | return "Invalid OLE file"
14 | 
15 | ole = olefile.OleFileIO(file)
16 | df_id = []
17 | df_created = []
18 | df_modified = []
19 | df_text = []
20 | now = datetime.datetime.now().strftime("%Y%m%d%H%M")
21 | 
22 | 
23 | for stream in ole.listdir():
24 | if stream[0].count("-") == 3:
25 | if stream[0] not in df_id:
26 | df_id.append(stream[0])
27 | df_created.append(str(ole.getctime(stream[0])))
28 | df_modified.append(str(ole.getmtime(stream[0])))
29 | 
30 | #if stream[1] == '3':
31 | # Stream 3 truncates the text after reaching a certain length. The snt function was rewritten to use Stream 0 instead.
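# Stream 0 holds the note body as RTF; the code below takes the text between the 'fs22 '
# (font-size) and '\lang9' (language) control words and turns '\par' markers into spaces,
# so notes whose RTF deviates from that layout may need extra handling.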
32 | if stream[1] == '0': 33 | 34 | x = ole.openstream(stream).read().decode("utf-8") 35 | x = x.split('fs22 ') 36 | x = x[1].split('\lang9')[0] 37 | content = x.replace('\par', ' ') 38 | df_text.append(content) 39 | 40 | data_dict = {'id': df_id, 'text': df_text, 'created': df_created, 'modified' : df_modified } 41 | final_df = pd.DataFrame(data_dict) 42 | print("StickyParser: Saving the csv file") 43 | 44 | final_df.to_csv(args.d+ 'stickynoteresultsnt-'+ now + '.csv', index=False) 45 | print("StickyParser: File saved.") 46 | 47 | 48 | def plum(db): 49 | 50 | conn = sqlite3.connect(db, isolation_level=None, 51 | detect_types=sqlite3.PARSE_COLNAMES) 52 | db_df = pd.read_sql_query("SELECT Text, WindowPosition, IsOpen, IsAlwaysOnTop, CreationNoteIdAnchor, Theme, IsFutureNote, \ 53 | RemoteId, ChangeKey, LastServerVersion, RemoteSchemaVersion, IsRemoteDataInvalid, PendingInsightsScan, Type, Id, ParentId, \ 54 | strftime('%Y-%m-%d %H:%M-%S', CreatedAt/10000000 - 62135596800,'unixepoch') AS CreatedAtUTC, strftime('%Y-%m-%d %H:%M-%S', DeletedAt/10000000 - 62135596800,'unixepoch') AS DeletedAtUTC,\ 55 | strftime('%Y-%m-%d %H:%M-%S', UpdatedAt/10000000 - 62135596800,'unixepoch') AS UpdatedAtUTC FROM Note", conn) 56 | now = datetime.datetime.now().strftime("%Y%m%d%H%M") 57 | print("StickyParser: Saving the csv file") 58 | 59 | db_df.to_csv(args.d+ 'stickynoteresultplum-'+ now + '.csv', index=False) 60 | print("StickyParser: File saved.") 61 | 62 | 63 | 64 | def all_same(items): 65 | return all(x == items[0] for x in items) 66 | 67 | def hexdump(src, length=16): 68 | if all_same(src): 69 | strzero = "0000 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................\n" 70 | strzero += "..." 71 | return strzero 72 | result = [] 73 | digits = 4 if isinstance(src, bytes) else 2 74 | for i in range(0, len(src), length): 75 | s = src[i:i+length] 76 | hexa = ' '.join(["%0*X" % (digits, x)for x in s]) 77 | hexa = bytes(hexa, 'UTF-8') 78 | text = b'' 79 | for x in s: 80 | #x = str(x 81 | if(x < 0x20 or x == 0x7F or 0x81 <= x < 0xA0): 82 | 83 | text += b'.' 84 | else: 85 | text += x.to_bytes(1,'big') 86 | result.append( b"%04X %-*s |%s|" % (i, length*(digits + 1),hexa, text) ) 87 | 88 | return b'\n'.join(result) 89 | 90 | 91 | def asciidump(src, length=80): 92 | if all_same(src): 93 | strzero = "................................................................................" 94 | return strzero 95 | result = [] 96 | digits = 4 if isinstance(src, str) else 2 97 | for i in range(0, len(src), length): 98 | s = src[i:i+length] 99 | #print(type(s)) 100 | #s = bytes(s, 'utf-8') 101 | text = '' 102 | text = text.encode() 103 | for x in s: 104 | #x = str(x) 105 | if(x < 0x20 or x == 0x7F or 0x81 <= x < 0xA0): 106 | text += b'.' 
107 | else: 108 | text += x.to_bytes(1,'big') 109 | result.append( b"%s" % (text) ) 110 | 111 | #result = result.decode('utf-8','ignore') 112 | x = b'\n'.join(result) 113 | x.decode('utf-8',errors='ignore') 114 | 115 | return b'\n'.join(result) 116 | 117 | 118 | 119 | def formatlist(list): 120 | i=0 121 | for el in list: 122 | i+=1 123 | if i == 10: 124 | if verbose == 1: print 125 | i=0 126 | else: 127 | if verbose == 1: print ('%5d' % el,) 128 | if verbose == 1: print 129 | 130 | def locatetrunk ( offset ): 131 | nextrunk = struct.unpack(">i", s[offset:offset+4])[0] 132 | if nextrunk != 0: 133 | return nextrunk 134 | else: 135 | return 0 136 | 137 | def locatefreeleafs ( offset ): 138 | numleafpag = struct.unpack(">i", s[offset+4:offset+8])[0] 139 | # -24 -> bug in sqlite avoids writing in last 6 entries 140 | freepage = s[offset+8:offset+pagesize-24] 141 | fmt = ">" + "i" * int((len(freepage)/4)) 142 | # return only numleafpag 143 | return struct.unpack(fmt, freepage)[0:numleafpag] 144 | 145 | 146 | def freepages( ): 147 | offset = (pagenum - 1)*pagesize 148 | freeleaftpages=locatefreeleafs( offset ) 149 | nextrunk = 1 150 | freetrunk.append(pagenum) 151 | while ( nextrunk != 0 ): 152 | nextrunk = locatetrunk( offset ) 153 | if nextrunk != 0: 154 | freetrunk.append(nextrunk) 155 | offset = (nextrunk - 1)*pagesize 156 | freeleaftpages += locatefreeleafs( offset ) 157 | 158 | freeleaf = list(set(freeleaftpages)) 159 | 160 | return freeleaf, freetrunk 161 | 162 | def locatebtree( ): 163 | offset = 100 164 | page = 1 165 | while ( offset < filesize ): 166 | byte = s[offset] 167 | if byte == 13 and page not in freeleaf: 168 | leafpages.append(page) 169 | elif byte == 2 and page not in freeleaf: 170 | interindex.append(page) 171 | elif byte == 5 and page not in freeleaf: 172 | intertable.append(page) 173 | elif byte == 10 and page not in freeleaf: 174 | leafindex.append(page) 175 | if offset == 100: 176 | offset = 0 177 | offset += pagesize 178 | page += 1 179 | 180 | def pagedump(pages): 181 | for page in pages: 182 | offset = (page-1 )*pagesize 183 | end = 0 184 | if page == 0 or offset == 0: 185 | offset += 100 186 | end = 100 187 | if page not in freeleaf and page not in freetrunk: 188 | numcells = struct.unpack(">H", s[offset+3:offset+5])[0] 189 | offcells = struct.unpack(">H", s[offset+5:offset+7])[0] 190 | osfree = struct.unpack(">H", s[offset+1:offset+3])[0] 191 | fragmented = struct.unpack(">b", s[offset+7].to_bytes(1, 'big'))[0] 192 | nextfb = osfree 193 | fbsize = 0 194 | if offcells == 0: 195 | offcells = 65536 196 | head = 8 197 | if s[offset] == 2 or 5: head = 12 198 | unstart = offset + head + ( numcells * 2 ) 199 | unend = offset + offcells - end 200 | freestr = s[unstart:unend] 201 | if ftype != "n": 202 | print('%-25s page: %s offset: %10s-%-10s\n' % ('Unallocated space: ', page, unstart, unend), file = file) 203 | if ftype == "a": 204 | print(asciidump(freestr),file=file) 205 | elif ftype == "h": 206 | print (hexdump(freestr),file=file) 207 | 208 | while ( nextfb != 0 ): 209 | fbsize = struct.unpack(">H", s[offset+nextfb+2:offset+nextfb+4])[0] 210 | fbstart = offset + nextfb + 4 211 | fbend = offset + nextfb + fbsize 212 | fbstr = s[fbstart:fbend] 213 | if ftype != "n": 214 | print ('%-25s page: %s offset: %10s-%-10s\n' % ('Freeblock space: ', page, fbstart,fbend),file=file) 215 | if ftype == "a": 216 | print (asciidump(fbstr), file=file) 217 | elif ftype == "h": 218 | print(hexdump(fbstr),file=file) 219 | nextfb = struct.unpack(">H", s[offset+nextfb:offset+nextfb+2])[0] 
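# Per the SQLite file format, every freeblock begins with a 2-byte big-endian offset to the
# next freeblock (0 ends the chain) followed by a 2-byte size, which is what the loop above
# walks to dump space left behind by deleted records.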
220 | 
221 | elif page in freeleaf:
222 | freestr = s[offset:offset+pagesize]
223 | if ftype != "n":
224 | print('%-25s page: %s offset: %10s\n' % ('Freelist leaf page: ', page, offset), file=file)
225 | if ftype == "a":
226 | print(asciidump(freestr), file=file)
227 | 
228 | elif ftype == "h":
229 | print(hexdump(freestr), file=file)
230 | 
231 | elif page in freetrunk:
232 | ftstart = offset+8+(4*len(freeleaf))
233 | ftend = offset+pagesize
234 | freestr = s[ftstart:ftend]
235 | if ftype != "n":
236 | print('%-25s page: %s offset: %10s-%-10s\n' % ('Freelist trunk space: ', page, ftstart, ftend), file=file)
237 | if ftype == "a":
238 | print(asciidump(freestr), file=file)
239 | elif ftype == "h":
240 | print(hexdump(freestr), file=file)
241 | 
242 | 
243 | 
244 | 
245 | if __name__ == "__main__":
246 | 
247 | parser = argparse.ArgumentParser(description="""StickyParser: Parses sticky note files in the legacy snt format or the latest sqlite format. It can also be used to recover deleted content inside sqlite.
248 | For the latest version of Sticky Notes, please copy everything under the %LOCALAPPDATA%\\Packages\\Microsoft.MicrosoftStickyNotes_8wekyb3d8bbwe\\LocalState folder. Run StickyParser against the copied folder. Make sure the other files apart from plum.sqlite are all in the same folder.
249 | Once run, the WAL/SHM files will be merged into the .sqlite file.""")
250 | parser.add_argument('-s', nargs='?', metavar="snt file", help='Sticky note .snt file. Example: StickyParser.exe -s C:\\Users\\User\\AppData\\Roaming\\Microsoft\\Sticky Notes\\StickyNotes.snt. Choose either -s or -p only.', type=argparse.FileType('r'))
251 | parser.add_argument('-p', nargs='?', metavar="sqlite file", help='Sticky note plum.sqlite file. Example: StickyParser -p \\plum.sqlite. Choose either -s or -p.', type=argparse.FileType('r'))
252 | parser.add_argument('-d', nargs='?', metavar="File Directory", help='Specify the directory the output should be written to. 
Example: StickyParser -p -d C:\\Users\\User\\Desktop\\') 253 | parser.add_argument('-r', nargs='?',metavar="sqlite file", help = 'To recover deleted content from sqlite.') 254 | args = parser.parse_args() 255 | line = "*" * 10 256 | if args.d is not None: 257 | if args.d[-1] != '\\': 258 | args.d = args.d + '\\' 259 | if args.d is None: 260 | print('StickyParser: Output set to the current directory...') 261 | args.d = os.path.abspath(os.getcwd()) 262 | if args.d[-1] != '\\': 263 | args.d = args.d + '\\' 264 | if args.s is not None: 265 | 266 | print('StickyPraser: Parsing the SNT File...') 267 | snt(args.s.name) 268 | 269 | if args.p is not None: 270 | 271 | print('StickyPraser: Parsing the sqlite file ....') 272 | plum(args.p.name) 273 | 274 | if args.r is not None: 275 | print('StickyPraser: Attempt to recover deleted content.') 276 | now = datetime.datetime.now().strftime("%Y%m%d%H%M") 277 | 278 | file = open(args.r + 'stickynotemetadata-' + now + '.txt','w') 279 | ftype = "a" 280 | pages = "all" 281 | filesize = os.path.getsize(args.r) 282 | freeleaf = [ ] 283 | freetrunk = [ ] 284 | btreepages = [ ] 285 | leafpages = [ ] 286 | interindex = [ ] 287 | intertable = [ ] 288 | leafindex = [ ] 289 | 290 | verbose = 0 291 | with open(args.r, 'rb') as f: 292 | s = f.read() 293 | 294 | 295 | # The header string: "SQLite format 3\000" 296 | hs=s[:16].rstrip(b' \t\r\n\0') 297 | if hs == b'SQLite format 3': 298 | r = " (correct)" 299 | else: 300 | r = " (incorrect)" 301 | file.write ('%-45s %-20s\n' % ("Header String:", str(hs) + r)) 302 | 303 | # The database page size in bytes. Must be a power of two between 512 304 | # and 32768 inclusive, or the value 1 representing a page size of 65536 305 | pagesize = struct.unpack(">H", s[16:18])[0] 306 | if pagesize == 1: 307 | pagesize == 65536 308 | file.write ('%-45s %-20s\n' % ("Page Size bytes:", pagesize)) 309 | 310 | # File format write version. 1 for legacy; 2 for WAL 311 | wver = ord(s[18:19]) 312 | if wver == 2: 313 | wrel = str(wver) + " - WAL: yes" 314 | else: 315 | wrel = str(wver) + " - WAL: no" 316 | file.write ('%-45s %-20s\n' % ("File format write version:", wrel)) 317 | 318 | # File format read version. 1 for legacy; 2 for WAL. 319 | rver = ord(s[19:20]) 320 | if rver == 2: 321 | rrel = str(rver) + " - WAL: yes" 322 | else: 323 | rrel = str(rver) + " - WAL: no" 324 | file.write ('%-45s %-20s\n' % ("File format read version:", rrel)) 325 | 326 | # Bytes of unused "reserved" space at the end of each page. Usually 0. 327 | 328 | if verbose == 1: f.write ('%-45s %-20s\n' % ('Bytes of unused reserved space:', s[20])) 329 | 330 | # Maximum embedded payload fraction. Must be 64. 331 | if s[21] == 64: 332 | r = " (correct)" 333 | else: 334 | r = " (incorrect)" 335 | if verbose == 1: f.write ('%-45s %-20s\n' % ('Maximum embedded payload fraction:', str(s[21]) + r)) 336 | 337 | # Minimum embedded payload fraction. Must be 32. 338 | if s[22] == 32: 339 | r = " (correct)" 340 | else: 341 | r = " (incorrect)" 342 | if verbose == 1: f.write ('%-45s %-20s\n' % ('Minimum embedded payload fraction:', str(s[22]) + r)) 343 | 344 | # Leaf payload fraction. Must be 32. 345 | if s[23] == 32: 346 | r = " (correct)" 347 | else: 348 | r = " (incorrect)" 349 | if verbose == 1: f.write ('%-45s %-20s\n' % ('Leaf payload fraction:', str(s[23]) + r)) 350 | 351 | # File change counter. 352 | count = struct.unpack(">i", s[24:28])[0] 353 | file.write ('%-45s %-20s\n' % ('File change counter:', count)) 354 | 355 | # Size of the database file in pages. 
The "in-header database size". 356 | sizepag = struct.unpack(">i", s[28:32])[0] 357 | file.write ('%-45s %-20s\n' % ('Size of the database file in pages:', sizepag)) 358 | 359 | # Page number of the first freelist trunk page. 360 | pagenum = struct.unpack(">i", s[32:36])[0] 361 | file.write ('%-45s %-20s\n' % ('Page number of the first freelist trunk page:', pagenum)) 362 | 363 | # Total number of freelist pages. 364 | freenum = struct.unpack(">i", s[36:40])[0] 365 | file.write ('%-45s %-20s\n' % ('Total number of freelist pages:', freenum)) 366 | 367 | # The schema cookie. 368 | schema = struct.unpack(">i", s[40:44])[0] 369 | if verbose == 1: file.write ('%-45s %-20s\n' % ('The schema cookie:', schema)) 370 | 371 | # The schema format number. Supported schema formats are 1, 2, 3, and 4. 372 | schemav = struct.unpack(">i", s[44:48])[0] 373 | if schemav == 1: 374 | schemavs = " - back to ver 3.0.0" 375 | elif schemav == 2: 376 | schemavs = " - >= 3.1.3 on 2005-02-19" 377 | elif schemav == 3: 378 | schemavs = " - >= 3.1.4 on 2005-03-11" 379 | elif schemav == 4: 380 | schemavs = " - >= 3.3.0 on 2006-01-10" 381 | else: 382 | schemavs = " - schema version error" 383 | if verbose == 1: file.write ('%-45s %-20s\n' % ('The schema format number', str(schemav) + schemavs)) 384 | 385 | # Default page cache size. 386 | pcachesize = struct.unpack(">i", s[44:48])[0] 387 | if verbose == 1: file.write ('%-45s %-20s\n' % ('Default page cache size:', pcachesize)) 388 | 389 | # The page number of the largest root b-tree page when in auto-vacuum or incremental-vacuum modes, or zero otherwise. 390 | vacuum = struct.unpack(">i", s[52:56])[0] 391 | if vacuum == 0: 392 | vacuums = " - not supported" 393 | else: 394 | vacuums = " " 395 | if verbose == 1: file.write ('%-45s %-20s\n' % ('Auto/incremental-vacuum page number:', str(vacuum) + vacuums)) 396 | 397 | # The database text encoding. A value of 1 means UTF-8. A value of 2 means UTF-16le. A value of 3 means UTF-16be. 398 | encod = struct.unpack(">i", s[56:60])[0] 399 | if encod == 1: 400 | encods = " - UTF-8" 401 | elif encod == 2: 402 | encods = " - UTF-16le" 403 | elif encod == 3: 404 | encods = " - UTF-16be" 405 | else: 406 | encods = " - error" 407 | file.write ('%-45s %-20s\n' % ('The database text encoding:', str(encod) + encods)) 408 | 409 | # The "user version" as read and set by the user_version pragma. 410 | userv = struct.unpack(">i", s[60:64])[0] 411 | if verbose == 1: file.write ('%-45s %-20s\n' % ('User version number:', userv)) 412 | 413 | # True (non-zero) for incremental-vacuum mode. False (zero) otherwise. 414 | incvacuum = struct.unpack(">i", s[64:68])[0] 415 | if incvacuum == 0: 416 | sinc = " - false" 417 | else: 418 | sinc = " - true" 419 | file.write('%-45s %-20s\n' % ('Incremental-vacuum mode:', str(incvacuum) + sinc)) 420 | 421 | reserv = struct.unpack(">iiiiii", s[68:92])[0] 422 | if reserv == 0: 423 | strreserv = "0 (correct)" 424 | else: 425 | strreserv = "!= 0 (incorrect)" 426 | if verbose == 1: file.write ('%-45s %-20s\n' % ('Reserved for expansion:', strreserv)) 427 | 428 | # The version-valid-for number. 429 | # The 4-byte big-endian integer at offset 92 is the value of the change counter 430 | # when the version number was stored. The integer at offset 92 indicates which 431 | # transaction the version number is valid for and is sometimes called the 432 | # "version-valid-for number". 
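# The two reads below finish the 100-byte database header; after this the script walks the
# freelist (freepages) and classifies the remaining b-tree pages (locatebtree) before dumping
# unallocated space and freeblocks with pagedump.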
433 | verval = struct.unpack(">i", s[92:96])[0]
434 | if verbose == 1: file.write('%-45s %-20s\n' % ('The version-valid-for number:', verval))
435 | 
436 | # SQLITE_VERSION_NUMBER:
437 | # #define SQLITE_VERSION "3.7.13"
438 | # #define SQLITE_VERSION_NUMBER 3007013
439 | vervalid = struct.unpack(">i", s[96:100])[0]
440 | may = vervalid // 1000000
441 | min = (vervalid - (may * 1000000)) // 1000
442 | rls = vervalid - (may * 1000000) - (min * 1000)
443 | verstr = str(vervalid) + " - " + str(may) + "." + str(min) + "." + str(rls)
444 | file.write('%-45s %-20s\n' % ('SQLITE_VERSION_NUMBER:', verstr))
445 | # Freepages
446 | #freeleaf = [ ]
447 | #freetrunk = [ ]
448 | 
449 | if freenum > 0: freeleaf, freetrunk = freepages()
450 | if verbose == 1 and freenum > 0:
451 | file.write("Freelist leaf pages:")
452 | formatlist(freeleaf)
453 | file.write('%-45s %-20s\n' % ('Number of freelist leaf pages:', len(sorted(set(freeleaf)))))
454 | locatebtree()
455 | 
456 | if verbose == 1 and len(freetrunk) > 0:
457 | file.write("Freelist trunk pages:")
458 | formatlist(freetrunk)
459 | file.write('%-45s %-20s\n' % ('Number of freelist trunk pages:', len(freetrunk)))
460 | 
461 | if verbose == 1 and len(leafpages) > 0:
462 | file.write("B-Tree leaf pages:")
463 | formatlist(leafpages)
464 | file.write('%-45s %-20s\n' % ('Number of b-tree leaf pages:', len(leafpages)))
465 | if verbose == 1 and len(leafindex) > 0:
466 | file.write("B-Tree leaf index pages:")
467 | formatlist(leafindex)
468 | file.write('%-45s %-20s\n' % ('Number of b-tree leaf index pages:', len(leafindex)))
469 | if verbose == 1 and len(interindex) > 0:
470 | file.write("B-Tree interior index pages:")
471 | formatlist(interindex)
472 | file.write('%-45s %-20s\n' % ('Number of b-tree interior index pages:', len(interindex)))
473 | if verbose == 1 and len(intertable) > 0:
474 | file.write("B-Tree interior table pages:")
475 | formatlist(intertable)
476 | file.write('%-45s %-20s\n' % ('Number of b-tree interior table pages:', len(intertable)))
477 | 
478 | 
479 | btreepages = sorted(leafpages + leafindex + interindex + intertable)
480 | allpag = sorted(btreepages + freeleaf + freetrunk)
481 | 
482 | pagedump(allpag)
483 | 
484 | 
485 | if verbose == 1:
486 | file.write('%-45s %-20s\n' % ('Number of detected pages:', len(allpag)))
487 | file.write('%-45s %-20s\n' % ('Missing:', sizepag - len(allpag)))
488 | print('StickyParser: Recovered content saved to the specified directory')
489 | 
490 | 
491 | 
492 | 
493 | 
494 | if len(sys.argv) < 2:
495 | parser.print_usage()
496 | sys.exit(1)
497 | 
498 | 
499 | 
--------------------------------------------------------------------------------