├── .gitignore
├── COPYING
├── README.md
├── backport.py
├── build-for-compare.py
├── check-dnsseeds.py
├── clang-format.py
├── delete_nonreduced_fuzz_inputs.sh
├── fastcopy-chaindata.py
├── ghmeta
│   └── __init__.py
├── ghwatch.py
├── github-merge.py
├── list-pulls.py
├── make-tag.py
├── optimize-pngs.py
├── patches
│   └── stripbuildinfo.patch
├── release-prep.sh
├── signoff.py
├── termlib
│   ├── __init__.py
│   ├── attr.py
│   ├── input.py
│   └── tableprinter.py
├── treehash512.py
├── unittest-statistics.py
└── update-translations.py
--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
1 | __pycache__
2 |
--------------------------------------------------------------------------------
/COPYING:
--------------------------------------------------------------------------------
1 | The MIT License (MIT)
2 |
3 | Copyright (c) The Bitcoin Core developers
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in
13 | all copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
21 | THE SOFTWARE.
22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | External repository for Bitcoin Core related maintenance tools. 2 | 3 | github-merge 4 | ------------ 5 | 6 | A small script to automate merging pull-requests securely and sign them with GPG. 7 | 8 | For example, if the "to" repo is identical to the "from" repo: 9 | 10 | ```bash 11 | ./github-merge.py 1234 12 | ``` 13 | 14 | (in any git repository) will help you merge pull request #1234 for the configured repository. 15 | 16 | Otherwise, for a differing "from" repo: 17 | 18 | ```bash 19 | ./github-merge.py --repo-from=bitcoin-core/gui 1234 20 | ``` 21 | 22 | will fetch the pull request from another monotree repository. Be sure to also set `githubmerge.pushmirrors` (see below). 23 | 24 | What it does: 25 | * Fetch master and the pull request. 26 | * Locally construct a merge commit. 27 | * Show the diff that merge results in. 28 | * Ask you to verify the resulting source tree (so you can do a make check or whatever). 29 | * Ask you whether to GPG sign the merge commit. 30 | * Ask you whether to push the result upstream. 31 | 32 | This means that there are no potential race conditions (where a 33 | pull request gets updated while you're reviewing it, but before you click 34 | merge), and when using GPG signatures, that even a compromised GitHub 35 | couldn't mess with the sources. 
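The steps above can be sketched as a sequence of git invocations. This is a simplified illustration only, not the actual script: the real tool reads its configuration from `git config`, asks for confirmation at each step, and does far more validation. The branch, ref, and commit-message formats below are assumptions made for exposition.

```python
# Illustrative sketch of the github-merge flow (command construction only).
def merge_commands(pull: int, repo: str = "bitcoin/bitcoin", branch: str = "master"):
    """Return git commands mirroring the merge flow described above."""
    local = f"pull/{pull}/head"
    return [
        # Fetch master and the pull request
        ["git", "fetch", "origin", branch],
        ["git", "fetch", "origin", f"refs/pull/{pull}/head:{local}"],
        # Locally construct a (GPG-signed) merge commit on top of master
        ["git", "checkout", f"origin/{branch}", "-b", f"merge-{pull}"],
        ["git", "merge", "--no-ff", "-S", "-m", f"Merge {repo}#{pull}", local],
        # Show the diff that the merge results in, for review
        ["git", "diff", f"origin/{branch}..HEAD"],
    ]

for cmd in merge_commands(1234):
    print(" ".join(cmd))
```

Because the merge is constructed from a locally fetched, fixed ref, later updates to the pull request cannot slip into what you reviewed and signed.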
36 |
37 | ### Setup
38 |
39 | Configuring the github-merge tool for the bitcoin repository is done in the following way:
40 |
41 | git config githubmerge.repository bitcoin/bitcoin
42 | git config githubmerge.pushmirrors "git@github.com:bitcoin-core/gui.git,git@github.com:YourPrivateMirror/bitcoin-core.git"
43 | git config githubmerge.testcmd "make -j4 check" (adapt to whatever you want to use for testing)
44 | git config --global user.signingkey mykeyid
45 |
46 | If you want to use HTTPS instead of SSH for accessing GitHub, you additionally need to set the host:
47 |
48 | git config githubmerge.host "https://github.com" (default is "git@github.com", which implies SSH)
49 |
50 | ### Authentication (optional)
51 |
52 | The API request limit for unauthenticated requests is quite low, but the
53 | limit for authenticated requests is much higher. If you start running
54 | into rate limiting errors it can be useful to set an authentication token
55 | so that the script can authenticate requests.
56 |
57 | - First, go to [Personal access tokens](https://github.com/settings/tokens).
58 | - Click 'Generate new token'.
59 | - Fill in an arbitrary token description. No further privileges are needed.
60 | - Click the `Generate token` button at the bottom of the form.
61 | - Copy the generated token (it should be a hexadecimal string).
62 |
63 | Then do:
64 |
65 | git config --global user.ghtoken "pasted token"
66 |
67 | ### Create and verify timestamps of merge commits
68 |
69 | To create or verify timestamps on the merge commits, install the OpenTimestamps
70 | client via `pip3 install opentimestamps-client`. Then, download the gpg wrapper
71 | `ots-git-gpg-wrapper.sh` and set it as git's `gpg.program`. See
72 | [the ots git integration documentation](https://github.com/opentimestamps/opentimestamps-client/blob/master/doc/git-integration.md#usage)
73 | for further details.
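github-merge also records a `Tree-SHA512` field in the merge commit (the treehash512 tool described below recomputes it). As a toy illustration of why a deterministic hash over the source tree pins down the exact sources, here is a sketch that hashes (path, contents) pairs in a fixed order; this is not the real construction used by the tools, only the general idea:

```python
# Toy tree hash: any change to any file's name or contents changes the digest.
import hashlib

def toy_tree_sha512(files: dict) -> str:
    h = hashlib.sha512()
    for path in sorted(files):               # fixed order => deterministic
        h.update(path.encode() + b"\0")      # separator avoids path/content ambiguity
        h.update(hashlib.sha512(files[path]).digest())
    return h.hexdigest()

a = toy_tree_sha512({"src/main.cpp": b"int main(){}\n"})
b = toy_tree_sha512({"src/main.cpp": b"int main(){return 0;}\n"})
print(a == b)  # False: a one-line change alters the tree hash
```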
74 |
75 | update-translations
76 | -------------------
77 |
78 | Run this script from the root of a repository to update all translations from Transifex.
79 | It will do the following automatically:
80 |
81 | - Fetch all translations
82 | - Post-process them into a valid and committable format
83 | - Add missing translations to the build system (TODO)
84 |
85 | To be able to pull translation files from the Transifex website, it needs
86 | the [Transifex CLI](https://github.com/transifex/cli).
87 |
88 | clang-format
89 | ------------
90 |
91 | A script to format cpp source code according to the .clang-format file in the bitcoin repo.
92 | This should only be applied to new files or files which are not currently under active development.
93 | Also, git subtrees are not subject to formatting.
94 |
95 | Note: The script is currently untested and unmaintained, but kept for archival reasons, in
96 | case it is needed again some day.
97 |
98 | build-for-compare
99 | --------------------
100 |
101 | Build for binary comparison.
102 |
103 | See `build-for-compare.py --help` for more information.
104 |
105 | Builds from the current directory, which is assumed to be a git clone of the bitcoin repository.
106 |
107 | **DO NOT RUN this with the nocopy=1 flag set on a working tree if you have any local additions; it will nuke all
108 | non-repository files, multiple times over. By leaving nocopy off (the default) the git tree is copied to a temporary
109 | directory and all operations are performed there.**
110 |
111 | Example:
112 | ```bash
113 | git clone https://github.com/bitcoin/bitcoin.git bitcoin-compare
114 | cd bitcoin-compare
115 | ../bitcoin-maintainer-tools/build-for-compare.py 4731cab 2f71490
116 | sha256sum /tmp/compare/bitcoind.*.stripped
117 | git diff -W --word-diff /tmp/compare/4731cab /tmp/compare/2f71490
118 | ```
119 |
120 | backport
121 | --------
122 |
123 | Script to backport pull requests in order of merge, to minimize the number of conflicts.
124 | Pull IDs are listed in `to_backport.txt` or given on the command line, and they must be prefixed
125 | with the repository name, e.g.:
126 |
127 | ```bash
128 | ../bitcoin-maintainer-tools/backport.py bitcoin/bitcoin#21907 bitcoin-core/gui#277 bitcoin-core/gui#365
129 |
130 | ```
131 |
132 | Requires `pip3 install gitpython` or similar.
133 |
134 | unittest-statistics
135 | --------------------------
136 |
137 | `unittest-statistics.py` can be used to print a table of the slowest 20 unit tests.
138 |
139 | Usage:
140 | ```bash
141 | unittest-statistics.py <executable> [<filter>]
142 | ```
143 |
144 | For example:
145 | ```bash
146 | unittest-statistics.py src/test/test_bitcoin wallet_tests
147 | ```
148 |
149 | treehash512
150 | --------------
151 |
152 | This script will show the SHA512 tree hash for a certain commit, or HEAD
153 | by default.
154 |
155 | Usage:
156 |
157 | ```bash
158 | treehash512.py [<commit>]
159 | ```
160 |
161 | This should match the Tree-SHA512 commit metadata field added by
162 | github-merge.
163 |
164 | signoff
165 | ----------
166 |
167 | This is a utility to manually add a treehash to the HEAD commit and then
168 | gpg-sign it. This is useful when a commit needs to be created manually.
169 |
170 | Usage:
171 |
172 | ```bash
173 | signoff.py
174 | ```
175 | (no command line arguments)
176 |
177 | When there is already a treehash on the HEAD commit, it is compared against
178 | what is computed. If this matches, it continues. If the treehash mismatches, an
179 | error is thrown. If there is no treehash, it adds the "Tree-SHA512:" header with
180 | the computed hash to the commit message.
181 |
182 | After making sure the treehash is correct, it verifies whether the commit is
183 | signed. If so, it just displays the signature; if not, it is signed.
184 |
185 | subtree updates
186 | ---------------
187 |
188 | Bitcoin Core comes with several subtrees (c.f.
https://github.com/bitcoin/bitcoin/tree/master/test/lint#git-subtree-checksh) 189 | To update the subtree, make sure to fetch the remote of the subtree. 190 | Then a simple call should pull in and squash the changes: 191 | 192 | ```sh 193 | git subtree pull --prefix src/${prefix} ${remote_repo} ${ref} --squash 194 | ``` 195 | 196 | For setting up a subtree, refer to `git help subtree`. 197 | 198 | check-dnsseeds 199 | --------------- 200 | 201 | Sanity-check the DNS seeds used by Bitcoin Core. 202 | 203 | Usage: 204 | 205 | ```bash 206 | check-dnsseeds.py 207 | ``` 208 | 209 | Example output: 210 | 211 | ```bash 212 | * Mainnet 213 | OK seed.bitcoin.sipa.be (40 results) 214 | OK dnsseed.bluematt.me (33 results) 215 | FAIL dnsseed.bitcoin.dashjr.org 216 | OK seed.bitcoinstats.com (50 results) 217 | OK seed.bitcoin.jonasschnelli.ch (38 results) 218 | OK seed.btc.petertodd.org (23 results) 219 | OK seed.bitcoin.sprovoost.nl (35 results) 220 | OK dnsseed.emzy.de (41 results) 221 | 222 | * Testnet 223 | OK testnet-seed.bitcoin.jonasschnelli.ch (36 results) 224 | OK seed.tbtc.petertodd.org (38 results) 225 | OK testnet-seed.bluematt.me (5 results) 226 | ``` 227 | 228 | delete non-reduced fuzz inputs 229 | ------------------------------ 230 | 231 | Refer to the documentation inside the script. 232 | 233 | fastcopy-chaindata 234 | ------------------- 235 | 236 | Fast local copy of Bitcoin Core blockchain state. 237 | 238 | ```bash 239 | fastcopy-chaindata.py ~/.bitcoin /path/to/temp/datadir 240 | ``` 241 | 242 | This utility hardlinks all but the last block data file (rev and blk), 243 | and hardlinks all .ldb files to the destination. The last data files as well 244 | as the other leveldb data files (such as the log) are copied. 245 | 246 | This relies on the fact that block files (except the last) and ldb files 247 | are read-only once they are written. 248 | 249 | Warning: Hardlinking only works within a filesystem, and may not work for all 250 | filesystems. 
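The hardlink-vs-copy rule described above can be sketched as a small classifier: blk/rev files other than the highest-numbered pair, and all .ldb files, are safe to hardlink because they are read-only once written; everything else (the last block files, LOG, MANIFEST, ...) is copied. This is an illustration of the rule, not code from fastcopy-chaindata.py; the function name and file-name patterns are assumptions.

```python
import re

def plan_copy(filenames):
    """Map each file name to 'hardlink' or 'copy' per the rule above."""
    block_re = re.compile(r'^(blk|rev)(\d{5})\.dat$')
    # The highest-numbered blk/rev files may still be appended to.
    last = max((int(m.group(2)) for f in filenames
                if (m := block_re.match(f))), default=None)
    plan = {}
    for f in filenames:
        m = block_re.match(f)
        if m:
            plan[f] = 'copy' if int(m.group(2)) == last else 'hardlink'
        elif f.endswith('.ldb'):
            plan[f] = 'hardlink'   # leveldb table files are immutable
        else:
            plan[f] = 'copy'       # LOG, MANIFEST, CURRENT, ...
    return plan

print(plan_copy(['blk00000.dat', 'blk00001.dat', '000005.ldb', 'LOG']))
```

Since hardlinks share the underlying inode, this only works when source and destination are on the same filesystem, as the warning above notes.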
251 |
252 | list-pulls
253 | ----------
254 |
255 | Script to parse the git commit list and extract GitHub issues, to create a changelog in
256 | text and JSON format.
257 |
258 | Run this in the root directory of the repository.
259 |
260 | This requires an up-to-date checkout of https://github.com/zw/bitcoin-gh-meta.git
261 | in the parent directory, or the environment variable `GHMETA`.
262 |
263 | It takes a range of commits and a .json file of PRs to exclude, for
264 | example if these are already backported in a minor release. This can be the pulls.json
265 | generated from a previous release.
266 |
267 | Example usage:
268 |
269 | ../maintainer-tools/list-pulls.py v0.18.0 0.19 relnot/pulls-exclude.json > relnot/pulls.md
270 |
271 | The output of this script is a first draft based on rough heuristics, and
272 | likely needs to be extensively manually edited before ending up in the release
273 | notes.
274 |
275 | make-tag
276 | --------
277 |
278 | Make a new release tag, performing a few checks.
279 |
280 | Usage: `make-tag.py <tag>`.
281 |
282 | ghwatch
283 | -------
284 |
285 | This is a script to watch your GitHub notifications in the terminal. It will show a table that is refreshed every 10 minutes (configurable). It can be exited by pressing ESC or Ctrl-C.
286 |
287 | ### Dependencies
288 |
289 | The `github` Python module is a required dependency for GitHub API access. It can be installed for your user with `pip3 install --user PyGithub`, or globally using your distribution's package manager, e.g. `apt-get install python3-github`.
290 |
291 | ### Configuration
292 |
293 | To generate a default configuration file in `~/.config/ghwatch/ghwatch.conf`, do
294 |
295 | ```
296 | ./ghwatch.py --default-config
297 | ```
298 |
299 | Then, edit the configuration file. The only setting that must be changed is `ghtoken`.
You will need to create a github [authentication token](https://github.com/settings/tokens) then insert it here: 300 | 301 | ``` 302 | "ghtoken": "", 303 | ``` 304 | 305 | Depending on your browser preference you might want to change `browser`, this is the command that will be invoked when clicking on an issue number. It defaults to `null` which indicates to use the system web browser. 306 | 307 | If you want to see PR status (and other issue details like labels), point `meta` for the `bitcoin/bitcoin` repository to an up-to-date checkout of [bitcoin-gh-meta](https://github.com/zw/bitcoin-gh-meta). 308 | ``` 309 | "meta": { 310 | "bitcoin/bitcoin": "/path/to/bitcoin-gh-meta" 311 | }, 312 | ``` 313 | 314 | To keep this repository up to date you can set the interval in seconds in 'auto_update', default is 0 (i.e. no automatic update). Be aware that [bitcoin-gh-meta](https://github.com/zw/bitcoin-gh-meta) is being refreshed every two hours (7200 seconds). 315 | 316 | Sorting the notifications by {reason, time} can be enabled with the 'sort_notifications' boolean field (default=false). 317 | 318 | By editing the `label_prio` structure it is possible to affect what labels will be shown. The first label encountered in this list for an issue in the associated repository will be shown as the label in the table. 319 | 320 | ### Command-line options 321 | 322 | Some other settings can be set through command line options. See `./ghwatch.py --help` for the list of command line options and their descriptions. 323 | 324 | ### Display 325 | 326 | Most of the columns are self-explanatory, except for: 327 | 328 | - `r` this [notification reason](https://docs.github.com/en/rest/reference/activity#notification-reasons) from the GH API as a two-letter code. 
This can be: 329 | - `as` assign 330 | - `au` author 331 | - `co` comment 332 | - `in` invitation 333 | - `ma` manual 334 | - `me` mention 335 | - `rr` review requested 336 | - `sa` security alert 337 | - `sc` state change 338 | - `su` subscribed 339 | - `tm` team mention 340 | - `k` the kind of notification as a letter. This can be: 341 | - `P` pull request 342 | - `I` issue 343 | - `C` commit 344 | 345 | ### Controls 346 | 347 | Left-click on a PR number to show details in a web browser. 348 | 349 | The program can be exited by pressing ESC or Ctrl-C. 350 | -------------------------------------------------------------------------------- /backport.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | ''' 3 | Script to do backports (pull ids listed in to_backport.txt or command line) in order of merge, 4 | to minimize number of conflicts. 5 | ''' 6 | import git 7 | import re 8 | import shlex 9 | import subprocess 10 | import os, sys 11 | 12 | # External tools (can be overridden using environment) 13 | GIT = os.getenv('GIT','git') 14 | BASH = os.getenv('BASH','bash') 15 | # Other configuration 16 | SRCREPO = os.getenv('SRCREPO', '../bitcoin') 17 | 18 | def ask_prompt(text): 19 | print(text,end=" ",file=sys.stderr) 20 | sys.stderr.flush() 21 | reply = sys.stdin.readline().rstrip() 22 | print("",file=sys.stderr) 23 | return reply 24 | 25 | merge_re = re.compile('^Merge (\w+(?:-\w+)*/\w+(?:-\w+)*#[0-9]+)') 26 | if len(sys.argv) > 1: 27 | pulls = [x.strip() for x in sys.argv[1:]] 28 | else: 29 | with open('to_backport.txt','r') as f: 30 | pulls = [x.strip() for x in f if x.strip()] 31 | 32 | execute = True 33 | 34 | pulls = set(pulls) 35 | repo = git.Repo(SRCREPO) 36 | head = repo.heads['master'] 37 | 38 | commit = head.commit 39 | to_backport = [] 40 | while True: 41 | match = merge_re.match(commit.message) 42 | if match: 43 | prid = match.group(1) 44 | if prid in pulls: 45 | pulls.remove(prid) 46 | 
to_backport.append((prid, commit)) 47 | if not pulls: 48 | break 49 | if not commit.parents: 50 | break 51 | commit = commit.parents[0] 52 | 53 | if pulls: 54 | print('Missing: %s' % list(pulls)) 55 | exit(1) 56 | 57 | # Backport in reverse order 58 | to_backport.reverse() 59 | 60 | if execute: 61 | class Attr: 62 | hsh = "\x1b[90m" 63 | head = "\x1b[1;96m" 64 | head2 = "\x1b[94m" 65 | reset = "\x1b[0m" 66 | else: # no colors if we're printing a bash script 67 | class Attr: 68 | head = "" 69 | head2 = "" 70 | reset = "" 71 | 72 | if not execute: 73 | print('set -e') 74 | for t in to_backport: 75 | msg = t[1].message.rstrip().splitlines() 76 | assert(msg[1] == '') 77 | print('{a.hsh}# {a.head}{}{a.reset}'.format(msg[0],a=Attr)) 78 | # XXX get the commits in the merge from the actual commit data instead of from the commit message 79 | commits = [] 80 | for line in msg[2:]: 81 | if not line: # stop at first empty line 82 | break 83 | cid,_,message = line.partition(' ') 84 | commits.append((cid,message)) 85 | 86 | for (cid, message) in reversed(commits): 87 | print('{a.hsh}# {a.head2}{}{a.reset}'.format(cid + ' '+ message,a=Attr)) 88 | commit = repo.commit(cid) 89 | cmsg = commit.message 90 | cmsg += '\n' 91 | cmsg += 'Github-Pull: %s\n' % t[0] 92 | cmsg += 'Rebased-From: %s\n' % commit.hexsha 93 | if execute: 94 | if subprocess.call([GIT,'cherry-pick', commit.hexsha]): 95 | # fail - drop to shell 96 | print('Dropping to shell - run git cherry-pick --continue after fixing issues, or exit and choose abort/skip') 97 | if os.path.isfile('/etc/debian_version'): # Show pull number on Debian default prompt 98 | os.putenv('debian_chroot',t[0]) 99 | subprocess.call([BASH,'-i']) 100 | reply = ask_prompt("Type 'c' to continue, 'a' to abort, 's' to skip pull.") 101 | if reply == 'c': 102 | pass 103 | elif reply == 'a': 104 | exit(1) 105 | elif reply == 's': 106 | subprocess.check_call([GIT,'cherry-pick', '--abort']) 107 | continue 108 | 109 | # Sign 110 | 
subprocess.check_call([GIT,'commit','--amend','--gpg-sign','-q','-m',cmsg]) 111 | else: 112 | print('git cherry-pick %s' % (commit.hexsha)) 113 | print('git commit -q --amend -m %s' % (shlex.quote(cmsg))) 114 | -------------------------------------------------------------------------------- /build-for-compare.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | # Written by W.J. van der Laan, provided under MIT license. 3 | # 4 | # Usage: ../do_build.py [ ...] 5 | # Will produce a ../bitcoind.$1.stripped for binary comparison 6 | import os,subprocess,sys,argparse,logging,shutil,re,hashlib,shlex,tempfile 7 | from collections import defaultdict 8 | from typing import List 9 | 10 | logger = logging.getLogger('do_build') 11 | # Use this command to compare resulting directories 12 | # git diff -W --word-diff /tmp/compare/4b5b263 /tmp/compare/d1bc5bf 13 | 14 | # WARNING WARNING WARNING 15 | # DO NOT RUN this with --nocopy=1 on working tree if you have any local additions. 16 | # It will nuke all non-repository files, multiple times over. 
17 | # WARNING WARNING WARNING 18 | 19 | CONFIGURE_EXTRA=[ 20 | 'EVENT_CFLAGS=-I/opt/libevent/include', 21 | 'EVENT_LIBS=-L/opt/libevent/lib -levent', 22 | 'EVENT_PTHREADS_CFLAGS=-I/opt/libevent/include', 23 | 'EVENT_PTHREADS_LIBS=-L/opt/libevent/lib -levent_pthreads' 24 | ] 25 | DEFAULT_PARALLELISM=4 26 | DEFAULT_ASSERTIONS=0 27 | DEFAULT_NOCOPY=0 28 | DEFAULT_PATCH='stripbuildinfo.patch' 29 | TMPDIR=tempfile.gettempdir() 30 | DEFAULT_TGTDIR=os.path.join(TMPDIR, 'compare') 31 | DEFAULT_REPODIR=os.path.join(TMPDIR, 'repo') 32 | 33 | # No debugging information (not used by analysis at the moment, saves on I/O) 34 | OPTFLAGS=["-O0","-g0"] 35 | # Some options from -O to reduce code size 36 | # can't use -O or -Os as it does some weird cross-contamination between unchanged functions in compilation unit 37 | # Selectively enable opts that don't interfere or cause excessive sensitivity to changes 38 | # 39 | OPTFLAGS+=["-fcombine-stack-adjustments","-fcompare-elim","-fcprop-registers","-fdefer-pop","-fforward-propagate","-fif-conversion","-fif-conversion2", 40 | "-finline-functions-called-once","-fshrink-wrap","-fsplit-wide-types","-ftree-bit-ccp","-ftree-ccp","-ftree-ch","-ftree-copy-prop","-ftree-copyrename", 41 | "-ftree-dce","-ftree-dominator-opts","-ftree-dse","-ftree-fre","-ftree-sink","-ftree-slsr","-ftree-sra","-ftree-ter" 42 | ] 43 | # 44 | # -ffunctions-sections/-fdata-sections put every element in its own section. This is essential. 
45 | OPTFLAGS+=['-ffunction-sections', '-fdata-sections']
46 | # Fix the random seed
47 | OPTFLAGS+=['-frandom-seed=notsorandom']
48 | # OFF: -fmerge-constants don't attempt to merge constants: this causes global interaction between sections/functions
49 | # this was reenabled because it doesn't matter, the numbered section names are annoying merged or unmerged
50 | OPTFLAGS+=['-fmerge-all-constants']
51 | # -fipa-sra semi-randomly renames functions (or creates variants of functions with different names)
52 | OPTFLAGS+=['-fno-ipa-sra']
53 | # -freorder-functions moves functions to .unlikely .hot sections
54 | OPTFLAGS+=['-fno-reorder-functions']
55 | # no interprocedural optimizations
56 | # -fno-ipa-profile -fno-ipa-pure-const -fno-ipa-reference -fno-guess-branch-probability -fno-ipa-cp
57 |
58 | CPPFLAGS=[]
59 | # Prevent __LINE__ from messing with things
60 | #CPPFLAGS+=["-D__LINE__=0","-D__DATE__=\"\""] #-D__COUNTER__=0"
61 | # XXX unfortunately this approach does not work thanks to boost.
62 | 63 | # objcopy: strip all symbols, debug info, and the hash header section 64 | OBJCOPY_ARGS=['-R.note.gnu.build-id','-g','-S'] 65 | OBJDUMP_ARGS=['-C','--no-show-raw-insn','-d','-r'] 66 | 67 | # Set QT_RCC_SOURCE_DATE_OVERRIDE so that bitcoin-qt is deterministic 68 | os.environ['QT_RCC_SOURCE_DATE_OVERRIDE'] = '1' 69 | 70 | # These can be overridden from the environment 71 | GIT=os.getenv('GIT', 'git') 72 | MAKE=os.getenv('MAKE', 'make') 73 | RSYNC=os.getenv('RSYNC', 'rsync') 74 | OBJCOPY=os.getenv('OBJCOPY', 'objcopy') 75 | OBJDUMP=os.getenv('OBJDUMP', 'objdump') 76 | OBJEXT=os.getenv('OBJEXT', '.o') # object file extension 77 | 78 | PYDIR=os.path.dirname(os.path.abspath(__file__)) 79 | PATCHDIR=os.path.join(PYDIR,'patches') 80 | 81 | def init_logging(): 82 | LOG_PREFMT = { 83 | (logging.DEBUG, '\x1b[38;5;239m[%(name)-8s]\x1b[0m %(message)s\x1b[0m'), 84 | (logging.INFO, '\x1b[38;5;19m>\x1b[38;5;18m>\x1b[38;5;17m> \x1b[38;5;239m[%(name)-8s]\x1b[0m %(message)s\x1b[0m'), 85 | (logging.WARNING, '\x1b[38;5;228m>\x1b[38;5;227m>\x1b[38;5;226m> \x1b[38;5;239m[%(name)-8s]\x1b[38;5;226m %(message)s\x1b[0m'), 86 | (logging.ERROR, '\x1b[38;5;208m>\x1b[38;5;202m>\x1b[38;5;196m> \x1b[38;5;239m[%(name)-8s]\x1b[38;5;196m %(message)s\x1b[0m'), 87 | (logging.CRITICAL, '\x1b[48;5;196;38;5;16m>>> [%(name)-8s] %(message)s\x1b[0m'), 88 | } 89 | 90 | class MyStreamHandler(logging.StreamHandler): 91 | def __init__(self, stream, formatters): 92 | logging.StreamHandler.__init__(self, stream) 93 | self.formatters = formatters 94 | def format(self, record): 95 | return self.formatters[record.levelno].format(record) 96 | 97 | formatters = {} 98 | for (level, fmtstr) in LOG_PREFMT: 99 | formatters[level] = logging.Formatter(fmtstr) 100 | handler = MyStreamHandler(sys.stdout, formatters) 101 | logging.basicConfig(level=logging.DEBUG, handlers=[handler]) 102 | 103 | def safe_path(path: str) -> bool: 104 | ''' 105 | Ensure dir is a path we can nuke without consequences. 
106 | This is currently restricted to /tmp/. 107 | ''' 108 | abspath = os.path.abspath(path) 109 | if abspath[0] != '/': return False # ??? 110 | comps = abspath[1:].split('/') # skip leading slash to avoid relying on empty first component 111 | return len(comps) > 1 and abspath.startswith(TMPDIR) 112 | 113 | def shell_split(s: str) -> List[str]: 114 | return shlex.split(s) 115 | def shell_join(s) -> str: 116 | return ' '.join(shlex.quote(x) for x in s) 117 | 118 | def check_call(args) -> int: 119 | '''Wrapper for subprocess.check_call that logs what command failed''' 120 | try: 121 | subprocess.check_call(args) 122 | except Exception: 123 | logger.error('Command failed: {}'.format(shell_join(args))) 124 | raise 125 | 126 | def cmd_exists(cmd) -> bool: 127 | '''Determine if a given command is available. Requires "which".''' 128 | try: 129 | with open(os.devnull, 'w') as FNULL: 130 | subprocess.check_call(['which', cmd], stdout=FNULL) 131 | except: 132 | return False 133 | return True 134 | 135 | def iterate_objs(srcdir) -> str: 136 | '''Iterate over all object files in srcdir''' 137 | for (root, dirs, files) in os.walk(srcdir): 138 | if not root.startswith(srcdir): 139 | raise ValueError 140 | root = root[len(srcdir)+1:] 141 | for filename in files: 142 | if filename.endswith(OBJEXT): 143 | yield os.path.join(root, filename) 144 | 145 | def copy_o_files(srcdir: str, tgtdir: str): 146 | '''Copy all object files from srcdir to dstdir, keeping the same directory hierarchy''' 147 | for objname in iterate_objs(srcdir): 148 | outname = os.path.join(tgtdir, objname) 149 | os.makedirs(os.path.dirname(outname), exist_ok=True) 150 | shutil.copy(os.path.join(srcdir, objname), outname) 151 | 152 | def objdump_all(srcdir: str, tgtdir: str): 153 | ''' 154 | Object analysis pass using objdump. 
155 | ''' 156 | for objname in iterate_objs(srcdir): 157 | objname = os.path.join(srcdir, objname) 158 | p = subprocess.Popen([OBJDUMP] + OBJDUMP_ARGS + [objname], stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE) 159 | (out,err) = p.communicate() 160 | if p.returncode != 0: 161 | raise Exception('objdump failed') 162 | (out,err) = (out.decode(),err.decode()) 163 | 164 | # postprocess- break into sections separated by 'Disassembly of section...' 165 | sections = defaultdict(list) 166 | funcname = '' 167 | for line in out.splitlines(): 168 | match = re.match('^Disassembly of section (.*):$', line) 169 | if match: 170 | funcname = match.group(1) 171 | if not '.rodata' in line: # filter out 'ebc: R_X86_64_32 .rodata+0x1944' 172 | sections[funcname].append(line) 173 | 174 | ''' 175 | lines = [] 176 | for section in sorted(sections.keys()): # '' header section automatically comes first 177 | #lines.extend(sections[section]) 178 | lines.append(sections[section][0]) 179 | out = '\n'.join(lines) 180 | 181 | outname = os.path.join(tgtdir, objname[:-len(OBJEXT)] + '.dis') 182 | make_parent_dirs(outname) 183 | with open(outname, 'w') as f: 184 | f.write(out) 185 | ''' 186 | for section in sections.keys(): 187 | if not section: 188 | continue 189 | name = hashlib.sha1(section.encode()).hexdigest() 190 | outname = os.path.join(tgtdir, name + '.dis') 191 | os.makedirs(os.path.dirname(outname), exist_ok=True) 192 | with open(outname, 'w') as f: 193 | f.write('\n'.join(sections[section])) 194 | 195 | # some TODO s, learning about the objdump output: 196 | # - demangle section names 197 | # - remove/make relative addresses 198 | # - sort/combine sections 199 | # - remove duplicate sections? (sounds like linker's work - can we do a partial link that preserves sections, such as for inlines?) 
200 | # - resolve callq's relocations - these are ugly right now - integrate reloc result into instruction by substituting argument 201 | # - [- 17: R_X86_64_32S vtable for boost::exception_detail::bad_exception_+0x30-] 202 | # (at the very least delete callq's arguments) 203 | # - for data (mov etc): fill in data? pointers change arbitrarily especially in combined string tables (.rodata.str1...) 204 | # and these entries don't have names/symbols 205 | # - or could use a different disassembler completely, such as capstone. Parsing objdump output is a hack. 206 | 207 | def parse_arguments(): 208 | parser = argparse.ArgumentParser(description='Build to compare binaries. Execute this from a repository directory.') 209 | parser.add_argument('commitids', metavar='COMMITID', nargs='+') 210 | parser.add_argument('--executables', default='src/bitcoind', help='Comma-separated list of executables to build, default is "src/bitcoind"') 211 | parser.add_argument('--tgtdir', default=DEFAULT_TGTDIR, help='Target directory, default is "{}"'.format(DEFAULT_TGTDIR)) 212 | parser.add_argument('--repodir', default=DEFAULT_REPODIR, help='Temp repository directory, default is "{}"'.format(DEFAULT_REPODIR)) 213 | parser.add_argument('--parallelism', '-j', default=DEFAULT_PARALLELISM, type=int, help='Make parallelism, default is {}'.format(DEFAULT_PARALLELISM)) 214 | parser.add_argument('--assertions', default=DEFAULT_ASSERTIONS, type=int, help='Build with assertions, default is {}'.format(DEFAULT_ASSERTIONS)) 215 | parser.add_argument('--opt', default=None, type=str, help='Override C/C++ optimization flags. Prepend + to avoid collisions with arguments, e.g. 
"+-O2 -g"') 216 | parser.add_argument('--patches', '-P', default=None, type=str, help='Comma separated list of stripbuildinfo patches to apply, one per hash (in order).') 217 | parser.add_argument('--prefix', default=None, type=str, help='A depends prefix that will be passed to configure') 218 | parser.add_argument('--nocopy', default=DEFAULT_NOCOPY, type=int, help='Build directly in the repository. If unset, will rsync or copy the repository to a temporary directory first, default is {}'.format(DEFAULT_NOCOPY)) 219 | args = parser.parse_args() 220 | args.patches = dict(zip(args.commitids, [v.strip() for v in args.patches.split(',')])) if args.patches is not None else {} 221 | args.executables = args.executables.split(',') 222 | if args.opt is not None: 223 | if not args.opt.startswith('+'): 224 | print('"opt" argument must start with +', file=sys.stderr) 225 | exit(1) 226 | args.opt = shell_split(args.opt[1:]) 227 | else: 228 | args.opt = OPTFLAGS 229 | # Safety checks 230 | if not args.nocopy and not safe_path(args.repodir): 231 | logger.error('Temp repository directory {} may not be used. Please use {}, e.g. "{}/{}"'.format(args.repodir, TMPDIR, TMPDIR, args.repodir)) 232 | exit(1) 233 | 234 | return args 235 | 236 | def main(): 237 | args = parse_arguments() 238 | init_logging() 239 | try: 240 | try: 241 | os.makedirs(args.tgtdir) 242 | except FileExistsError: 243 | logger.warning("{} already exists, remove it if you don't want to continue a current comparison session".format(args.tgtdir)) 244 | if safe_path(args.tgtdir): 245 | dodelete = input("Delete {}? [y/n] ".format(args.tgtdir)) 246 | if dodelete == 'y' or dodelete == 'Y': 247 | # Remove target dir 248 | logger.info('Removing {}'.format(args.tgtdir)) 249 | check_call(['rm', '-rf', args.tgtdir]) 250 | 251 | for commit in args.commitids: 252 | try: 253 | int(commit,16) 254 | except ValueError: 255 | logger.error('{} is not a hexadecimal commit id. 
It\'s the only thing we know.'.format(commit)) 256 | exit(1) 257 | 258 | # Copy repo, unless nocopy is set 259 | if not args.nocopy and safe_path(args.repodir): 260 | if cmd_exists(RSYNC.split(' ')[0]): 261 | logger.info('RSyncing repository ...') 262 | check_call([RSYNC, 263 | '-r', # recursive 264 | '--delete', # delete extraneous files on dst 265 | '.git', # from .git in CWD 266 | args.repodir]) # to repodir 267 | else: 268 | gitdir = os.path.join(args.repodir, '.git') 269 | logger.warning('Command "rsync" not found; resorting to cp, which tends to be slower.') 270 | logger.info('Copying repository ...') 271 | # Touch (to avoid file not found) and remove repodir/.git so we don't end up with repodir/.git/.git 272 | check_call(['mkdir','-p',args.repodir]) 273 | check_call(['touch',gitdir]) 274 | check_call(['rm','-rf',gitdir]) 275 | check_call(['cp','-r','.git',args.repodir]) 276 | # Go to repo 277 | os.chdir(args.repodir) 278 | 279 | # Determine (g)make arguments 280 | make_args = [] 281 | if args.parallelism is not None: 282 | make_args += ['-j{}'.format(args.parallelism)] 283 | # Disable assertions if requested 284 | cppflags = CPPFLAGS 285 | if not args.assertions: 286 | cppflags+=['-DNDEBUG'] 287 | 288 | for commit in args.commitids: 289 | logger.info("Building {}...".format(commit)) 290 | stripbuildinfopatch = args.patches[commit] if commit in args.patches else DEFAULT_PATCH 291 | commitdir = os.path.join(args.tgtdir, commit) 292 | commitdir_obj = os.path.join(args.tgtdir, commit+'.o') 293 | 294 | try: 295 | os.makedirs(commitdir) 296 | except FileExistsError: 297 | logger.error("{} already exists; skipping".format(commitdir)) 298 | continue 299 | check_call([GIT,'reset','--hard']) 300 | check_call([GIT,'clean','-f','-x','-d']) 301 | check_call([GIT,'checkout',commit]) 302 | try: 303 | if commit in args.patches: 304 | logger.info('User-defined patch: {}'.format(stripbuildinfopatch)) 305 | check_call([GIT,'apply', os.path.join(PATCHDIR,stripbuildinfopatch)]) 
306 | except subprocess.CalledProcessError: 307 | logger.error('Could not apply patch to strip build info. Probably it needs to be updated') 308 | exit(1) 309 | 310 | check_call(['./autogen.sh']) 311 | logger.info('Running configure script') 312 | opt = shell_join(args.opt) 313 | check_call(['./configure', '--disable-hardening', '--without-cli', '--disable-tests', '--disable-bench', '--disable-ccache', 314 | '--prefix={}'.format(args.prefix) if args.prefix else '--with-incompatible-bdb', 315 | 'CPPFLAGS='+(' '.join(cppflags)), 316 | 'CFLAGS='+opt, 'CXXFLAGS='+opt, 'LDFLAGS='+opt] + CONFIGURE_EXTRA) 317 | 318 | for name in args.executables: 319 | logger.info('Building executable {}'.format(name)) 320 | target_name = os.path.join(args.tgtdir, os.path.basename(name) + '.' + commit) 321 | check_call([MAKE] + make_args + [name]) 322 | shutil.copy(name, target_name) 323 | check_call([OBJCOPY] + OBJCOPY_ARGS + [name, target_name + '.stripped']) 324 | 325 | logger.info('Copying object files...') 326 | copy_o_files('.', commitdir_obj) 327 | 328 | logger.info('Performing basic analysis pass...') 329 | objdump_all(commitdir_obj, commitdir) 330 | 331 | if len(args.commitids)>1: 332 | logger.info('Use these commands to compare results:') 333 | logger.info('$ sha256sum {}/*.stripped'.format(args.tgtdir)) 334 | logger.info('$ git diff -W --word-diff {} {}'.format(os.path.join(args.tgtdir,args.commitids[0]), os.path.join(args.tgtdir,args.commitids[1]))) 335 | except Exception: 336 | logger.exception('Error:') 337 | 338 | if __name__ == '__main__': 339 | main() 340 | 341 | -------------------------------------------------------------------------------- /check-dnsseeds.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | ''' 3 | Simple script to check the status of all Bitcoin Core DNS seeds. 
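A check amounts to resolving the seed hostname with the `host` utility and counting the address records in its output; the filtering step used by check_seed() below can be sketched standalone (the sample output is illustrative, using documentation-reserved names and addresses):

```python
# Keep only lines announcing A ("has address") or AAAA ("has IPv6 address")
# records; MX and other record types are ignored, mirroring check_seed().
def parse_host_output(out):
    return [line for line in out.splitlines()
            if "has address" in line or "has IPv6 address" in line]

sample = (
    "seed.example.org has address 192.0.2.1\n"
    "seed.example.org has IPv6 address 2001:db8::1\n"
    "seed.example.org mail is handled by 10 mx.example.org.\n"
)
print(len(parse_host_output(sample)))  # 2 address records
```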
4 | Seeds are available from https://github.com/bitcoin/bitcoin/blob/master/src/kernel/chainparams.cpp 5 | ''' 6 | import subprocess 7 | 8 | SEEDS_PER_NETWORK={ 9 | 'mainnet': [ 10 | "seed.bitcoin.sipa.be", 11 | "dnsseed.bluematt.me", 12 | "dnsseed.bitcoin.dashjr-list-of-p2p-nodes.us", 13 | "seed.bitcoinstats.com", 14 | "seed.bitcoin.jonasschnelli.ch", 15 | "seed.btc.petertodd.net", 16 | "seed.bitcoin.sprovoost.nl", 17 | "dnsseed.emzy.de", 18 | "seed.bitcoin.wiz.biz", 19 | ], 20 | 'testnet': [ 21 | "testnet-seed.bitcoin.jonasschnelli.ch", 22 | "seed.tbtc.petertodd.net", 23 | "testnet-seed.bluematt.me", 24 | "seed.testnet.bitcoin.sprovoost.nl", 25 | ], 26 | 'signet': [ 27 | "seed.signet.bitcoin.sprovoost.nl", 28 | ], 29 | } 30 | 31 | def check_seed(x): 32 | p = subprocess.run(["host",x], capture_output=True, universal_newlines=True) 33 | out = p.stdout 34 | 35 | # Parse matching lines 36 | addresses = [] 37 | for line in out.splitlines(): 38 | if "has address" in line or "has IPv6 address" in line: 39 | addresses.append(line) 40 | 41 | if addresses: 42 | print(f"\x1b[94mOK\x1b[0m {x} ({len(addresses)} results)") 43 | else: 44 | print(f"\x1b[91mFAIL\x1b[0m {x}") 45 | 46 | if __name__ == '__main__': 47 | for (network, seeds) in SEEDS_PER_NETWORK.items(): 48 | print(f"\x1b[90m* \x1b[97m{network}\x1b[0m") 49 | 50 | for hostname in seeds: 51 | check_seed(hostname) 52 | 53 | print() 54 | -------------------------------------------------------------------------------- /clang-format.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | ''' 3 | Wrapper script for clang-format 4 | 5 | Copyright (c) 2015 MarcoFalke 6 | Copyright (c) 2015 The Bitcoin Core developers 7 | Distributed under the MIT software license, see the accompanying 8 | file COPYING or http://www.opensource.org/licenses/mit-license.php. 
9 | '''
10 | 
11 | import os
12 | import sys
13 | import subprocess
14 | 
15 | tested_versions = ['3.6.0', '3.6.1', '3.6.2'] # A set of versions known to produce the same output
16 | accepted_file_extensions = ('.h', '.cpp') # Files to format
17 | 
18 | def check_clang_format_version(clang_format_exe):
19 |     try:
20 |         output = subprocess.check_output([clang_format_exe, '-version']).decode('utf-8')
21 |         for ver in tested_versions:
22 |             if ver in output:
23 |                 print("Detected clang-format version " + ver)
24 |                 return
25 |         raise RuntimeError("Untested version: " + output)
26 |     except Exception as e:
27 |         print('Could not verify version of ' + clang_format_exe + '.')
28 |         raise e
29 | 
30 | def check_command_line_args(argv):
31 |     required_args = ['{clang-format-exe}', '{files}']
32 |     example_args = ['clang-format-3.x', 'src/main.cpp', 'src/wallet/*']
33 | 
34 |     if(len(argv) < len(required_args) + 1):
35 |         for word in (['Usage:', argv[0]] + required_args):
36 |             print(word, end=' ')
37 |         print('')
38 |         for word in (['E.g:', argv[0]] + example_args):
39 |             print(word, end=' ')
40 |         print('')
41 |         sys.exit(1)
42 | 
43 | def run_clang_format(clang_format_exe, files):
44 |     for target in files:
45 |         if os.path.isdir(target):
46 |             for path, dirs, files in os.walk(target):
47 |                 run_clang_format(clang_format_exe, (os.path.join(path, f) for f in files))
48 |         elif target.endswith(accepted_file_extensions):
49 |             print("Format " + target)
50 |             subprocess.check_call([clang_format_exe, '-i', '-style=file', target], stdout=open(os.devnull, 'wb'), stderr=subprocess.STDOUT)
51 |         else:
52 |             print("Skip " + target)
53 | 
54 | def main(argv):
55 |     check_command_line_args(argv)
56 |     clang_format_exe = argv[1]
57 |     files = argv[2:]
58 |     check_clang_format_version(clang_format_exe)
59 |     run_clang_format(clang_format_exe, files)
60 | 
61 | if __name__ == "__main__":
62 |     main(sys.argv)
63 | 
--------------------------------------------------------------------------------
/delete_nonreduced_fuzz_inputs.sh:
-------------------------------------------------------------------------------- 1 | # Over time the fuzz engine will reduce inputs (produce a smaller input that 2 | # yields the same coverage statistics). With a growing set of inputs, it could 3 | # be useful to occasionally delete the "old" non-reduced inputs. 4 | # 5 | # This script tries to do so in a way that is as deterministic as possible. 6 | # 7 | # The script should be run on an x86_64 virtual machine with only a minimal 8 | # vanilla Ubuntu Noble 24.04 installed. Ideally, the script was run on 9 | # different architectures or even different OS versions, which come with 10 | # different library packages, but this is left as a future improvement. 11 | 12 | export FUZZ_CORPORA_DIR="fuzz_corpora" 13 | 14 | set -e 15 | 16 | echo "Installing Bitcoin Core build deps" 17 | export DEBIAN_FRONTEND=noninteractive 18 | apt update 19 | apt install -y \ 20 | git \ 21 | build-essential pkg-config bsdmainutils python3 cmake \ 22 | libsqlite3-dev libevent-dev libboost-dev \ 23 | lsb-release wget software-properties-common gnupg 24 | 25 | export LLVM_VERSION=18 26 | wget https://apt.llvm.org/llvm.sh && chmod +x ./llvm.sh 27 | ./llvm.sh $LLVM_VERSION all 28 | ln -s $(which llvm-symbolizer-$LLVM_VERSION) /usr/bin/llvm-symbolizer 29 | 30 | git clone --branch stable https://github.com/AFLplusplus/AFLplusplus 31 | make -C AFLplusplus LLVM_CONFIG=llvm-config-$LLVM_VERSION PERFORMANCE=1 install -j$(nproc) 32 | 33 | git clone --depth=1 https://github.com/bitcoin-core/qa-assets.git 34 | ( 35 | cd qa-assets 36 | mv ./"${FUZZ_CORPORA_DIR}" ../all_inputs 37 | git config user.name "delete_nonreduced_inputs script" 38 | git config user.email "noreply@noreply.noreply" 39 | git commit -a -m "Delete fuzz inputs" 40 | ) 41 | 42 | git clone --depth=1 https://github.com/bitcoin/bitcoin.git 43 | ( 44 | cd bitcoin 45 | 46 | echo "Adding reduced seeds with afl-cmin" 47 | 48 | rm -rf build_fuzz/ 49 | export LDFLAGS="-fuse-ld=lld" 50 | cmake -B 
build_fuzz \ 51 | -DCMAKE_C_COMPILER=afl-clang-fast -DCMAKE_CXX_COMPILER=afl-clang-fast++ \ 52 | -DBUILD_FOR_FUZZING=ON 53 | cmake --build build_fuzz -j$(nproc) 54 | 55 | WRITE_ALL_FUZZ_TARGETS_AND_ABORT="/tmp/a" "./build_fuzz/bin/fuzz" || true 56 | readarray FUZZ_TARGETS < "/tmp/a" 57 | for fuzz_target in ${FUZZ_TARGETS[@]}; do 58 | if [ -d "../all_inputs/$fuzz_target" ]; then 59 | mkdir --parents ../qa-assets/"${FUZZ_CORPORA_DIR}"/$fuzz_target 60 | # Allow timeouts and crashes with "-A", "-T all" to use all available cores 61 | FUZZ=$fuzz_target afl-cmin -T all -A -i ../all_inputs/$fuzz_target -o ../qa-assets/"${FUZZ_CORPORA_DIR}"/$fuzz_target -- ./build_fuzz/bin/fuzz 62 | else 63 | echo "No input corpus for $fuzz_target (ignoring)" 64 | fi 65 | done 66 | 67 | ( 68 | cd ../qa-assets 69 | git add "${FUZZ_CORPORA_DIR}" 70 | git commit -m "Reduced inputs for afl-cmin" 71 | ) 72 | 73 | for sanitizer in {"fuzzer","fuzzer,address,undefined,integer"}; do 74 | echo "Adding reduced seeds for sanitizer=${sanitizer}" 75 | 76 | rm -rf build_fuzz/ 77 | cmake -B build_fuzz \ 78 | -DCMAKE_C_COMPILER=clang-$LLVM_VERSION -DCMAKE_CXX_COMPILER=clang++-$LLVM_VERSION \ 79 | -DBUILD_FOR_FUZZING=ON -DSANITIZERS="$sanitizer" 80 | cmake --build build_fuzz -j$(nproc) 81 | 82 | ( cd build_fuzz; ./test/fuzz/test_runner.py -l DEBUG --par=$(nproc) --m_dir=../../all_inputs ../../qa-assets/"${FUZZ_CORPORA_DIR}" ) 83 | 84 | ( 85 | cd ../qa-assets 86 | git add "${FUZZ_CORPORA_DIR}" 87 | git commit -m "Reduced inputs for ${sanitizer}" 88 | ) 89 | done 90 | ) 91 | -------------------------------------------------------------------------------- /fastcopy-chaindata.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | # Copyright (c) 2017 Wladimir J. van der Laan 3 | # Distributed under the MIT software license, see the accompanying 4 | # file COPYING or http://www.opensource.org/licenses/mit-license.php. 
5 | ''' 6 | Fast local copy of Bitcoin Core blockchain state. 7 | 8 | This utility hardlinks all but the last block data file (rev and blk), 9 | and hardlinks all .ldb files to the destination. The last data files as well 10 | as the other leveldb data files (such as the log) are copied. 11 | 12 | This relies on the fact that block files (except the last) and ldb files 13 | are read-only once they are written. 14 | 15 | Warning: Hardlinking only works within a filesystem, and may not work for all 16 | filesystems. 17 | ''' 18 | import os,re,shutil,sys 19 | from os import path 20 | 21 | def dat_name(type_, num) -> str: 22 | return '{}{:05d}.dat'.format(type_, num) 23 | 24 | def link_blocks(src: str, dst: str): 25 | rev_max = -1 26 | blk_max = -1 27 | for fname in os.listdir(src): 28 | match = re.match('^rev([0-9]{5}).dat$', fname) 29 | if match: 30 | rev_max = max(rev_max, int(match.group(1))) 31 | match = re.match('^blk([0-9]{5}).dat$', fname) 32 | if match: 33 | blk_max = max(blk_max, int(match.group(1))) 34 | if blk_max != rev_max: 35 | raise ValueError("Maximum block file {:05d} doesn't match maximum undo file {:05d}".format(blk_max, rev_max)) 36 | print('Hard-linking all rev and blk files up to {:05d}'.format(blk_max)) 37 | for i in range(blk_max): 38 | for type_ in ['rev','blk']: 39 | name = dat_name(type_, i) 40 | os.link(path.join(src, name), path.join(dst, name)) 41 | print('Copying rev and blk files {:05d}'.format(blk_max)) 42 | for type_ in ['rev','blk']: 43 | name = dat_name(type_, blk_max) 44 | shutil.copyfile(path.join(src, name), path.join(dst, name)) 45 | 46 | def link_leveldb(src: str, dst: str): 47 | ldb_files = [] 48 | other_files = [] 49 | for fname in os.listdir(src): 50 | if re.match('^[0-9]{6,}.ldb$', fname): 51 | ldb_files.append(fname) 52 | else: 53 | other_files.append(fname) 54 | print('Hard-linking {:d} leveldb files'.format(len(ldb_files))) 55 | for name in ldb_files: 56 | os.link(path.join(src, name), path.join(dst, name)) 57 | 
print('Copying {:d} other files in leveldb dir'.format(len(other_files))) 58 | for name in other_files: 59 | shutil.copyfile(path.join(src, name), path.join(dst, name)) 60 | 61 | if len(sys.argv) != 3: 62 | print('Usage: {} reference_datadir destination_datadir'.format(path.basename(sys.argv[0]))) 63 | exit(1) 64 | srcdir = sys.argv[1] # '/store2/tmp/testbtc' 65 | dstdir = sys.argv[2] # '/store2/tmp/testbtc2' 66 | 67 | src_blocks = path.join(srcdir, 'blocks') 68 | dst_blocks = path.join(dstdir, 'blocks') 69 | src_blocks_index = path.join(srcdir, 'blocks/index') 70 | dst_blocks_index = path.join(dstdir, 'blocks/index') 71 | src_chainstate = path.join(srcdir, 'chainstate') 72 | dst_chainstate = path.join(dstdir, 'chainstate') 73 | 74 | try: 75 | os.makedirs(dstdir) 76 | except FileExistsError: 77 | print('warning: destination directory {} already exists'.format(dstdir)) 78 | os.makedirs(dst_blocks_index) 79 | os.makedirs(dst_chainstate) 80 | 81 | print('Copying blocks') 82 | link_blocks(src_blocks, dst_blocks) 83 | print('Copying block index') 84 | link_leveldb(src_blocks_index, dst_blocks_index) 85 | print('Copying chainstate') 86 | link_leveldb(src_chainstate, dst_chainstate) 87 | -------------------------------------------------------------------------------- /ghmeta/__init__.py: -------------------------------------------------------------------------------- 1 | ''' 2 | Access to github metadata as exported by ghrip (https://github.com/zw/ghrip). 
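The ghrip export shards issues into directories of one hundred; a minimal sketch of the path scheme GhMeta reads (the base path is a placeholder, matching the default ghwatch configuration):

```python
# Issue N lives at <base>/issues/<N//100>xx/<N>.json; pull requests carry an
# additional <N>-PR.json file next to it.
def issue_path(base, pull):
    return f'{base}/issues/{pull//100}xx/{pull}.json'

print(issue_path('/path/to/bitcoin-gh-meta', 20935))
```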
3 | ''' 4 | import json 5 | 6 | class GhMeta: 7 | def __init__(self, paths): 8 | self.paths = paths 9 | 10 | def __getitem__(self, idx): 11 | (repo, pull) = idx 12 | base = self.paths[repo] 13 | 14 | filename = f'{base}/issues/{pull//100}xx/{pull}.json' 15 | try: 16 | with open(filename, 'r') as f: 17 | data0 = json.load(f) 18 | except IOError as e: 19 | raise KeyError 20 | 21 | filename = f'{base}/issues/{pull//100}xx/{pull}-PR.json' 22 | try: 23 | with open(filename, 'r') as f: 24 | data1 = json.load(f) 25 | except IOError as e: 26 | data1 = None 27 | 28 | data0['pr'] = data1 29 | return data0 30 | 31 | def get(self, pull, default=None): 32 | try: 33 | return self[pull] 34 | except KeyError: 35 | return default 36 | -------------------------------------------------------------------------------- /ghwatch.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | ''' 3 | Watch sorted github notifications from the terminal. 4 | ''' 5 | import argparse 6 | from collections import namedtuple 7 | import datetime 8 | import json 9 | import os 10 | import re 11 | import shutil 12 | import sys 13 | import subprocess 14 | import webbrowser 15 | 16 | from github import Github, GithubObject 17 | 18 | from ghmeta import GhMeta 19 | from termlib.input import Key 20 | from termlib.tableprinter import Column, TablePrinter 21 | from termlib.attr import Attr 22 | 23 | DEFAULT_CONFIG = { 24 | 'ghbase': 'https://github.com/', 25 | 'ghtoken': '', 26 | 27 | # This specifies the action to invoke when clicking an issue number. 28 | # the default setting will invoke the operating system default browser. 
29 |     'browser': None,
30 |     # Alternatively, it is possible to specify a command
31 |     #'browser': ['firefox', '--new-tab'],
32 | 
33 |     # Repository with github metadata mirror (to get label data)
34 |     'meta': {'bitcoin/bitcoin': '/path/to/bitcoin-gh-meta'},
35 | 
36 |     # Interval in seconds for an automatic update (git pull) of github metadata mirror, if greater than 0.
37 |     'auto_update': 0,
38 | 
39 |     # Whether to enable ordering of notifications by {reason, time}.
40 |     'sort_notifications': False,
41 | 
42 |     # Label priorities; the higher in this list, the higher the priority.
43 |     # When a PR or issue has multiple labels, the one with the highest priority will be
44 |     # shown. This is pretty arbitrary, roughly going from specific to generic,
45 |     # and not a value judgement with regard to importance of components.
46 |     'label_prio': {'bitcoin/bitcoin': [
47 |         'Consensus',
48 |         'Mining',
49 |         'Mempool',
50 |         'TX fees and policy',
51 |         'UTXO Db and Indexes',
52 |         'Validation',
53 |         'P2P',
54 |         'Wallet',
55 |         'RPC/REST/ZMQ',
56 |         'Build system',
57 |         'Scripts and tools',
58 |         'Settings',
59 |         'Utils/log/libs',
60 |         'Tests',
61 |         'GUI',
62 |         'Docs',
63 |         'Descriptors',
64 |         'PSBT',
65 |         'Privacy',
66 |         'Resource usage',
67 |         'Block storage',
68 |         'Data corruption',
69 |         'Interfaces',
70 |         'Refactoring',
71 |     ]},
72 | }
73 | 
74 | # Priority list of notification reasons, from highest to lowest
75 | REASON_PRIO = ["assign", "review_requested", "mention", "author", "comment", "invitation",
76 |                "manual", "team_mention", "security_alert", "state_change", "subscribed"]
77 | 
78 | # A clickable link UI element
79 | ButtonInfo = namedtuple('ButtonInfo', ['x0', 'y0', 'x1', 'y1', 'url'])
80 | 
81 | class Theme:
82 |     '''
83 |     Application theming.
84 | ''' 85 | # Default attribute for row 86 | HEADER = Attr.BOLD + Attr.REVERSE 87 | ROW = '' 88 | # Attribute for timestamp 89 | DATETIME = '' # Attr.fg_hex('#ffffff') 90 | 91 | # Attributes for PR/issue states 92 | REF = { 93 | 'unknown': '', 94 | 'open': Attr.fg_hex('#3fb950') + Attr.bg_hex('#12221d'), 95 | 'closed': Attr.fg_hex('#f85149') + Attr.bg_hex('#22141a'), 96 | 'merged': Attr.fg_hex('#a371f7') + Attr.bg_hex('#1f1d2f'), 97 | } 98 | 99 | # Attributes for notification reasons 100 | # see https://docs.github.com/en/rest/reference/activity#notification-reasons 101 | REASON_GLYPHS = { 102 | 'assign': (Attr.fg_hex('#808080'), 'as'), 103 | 'author': (Attr.fg_hex('#c000ff'), 'au'), 104 | 'comment': (Attr.fg_hex('#808080'), 'co'), 105 | 'invitation': (Attr.fg_hex('#808080'), 'in'), 106 | 'manual': (Attr.fg_hex('#808080'), 'ma'), 107 | 'mention': (Attr.fg_hex('#ff00ff'), 'me'), 108 | 'review_requested': (Attr.fg_hex('#808080'), 'rr'), 109 | 'security_alert': (Attr.fg_hex('#808080'), 'sa'), 110 | 'state_change': (Attr.fg_hex('#808080'), 'sc'), 111 | 'subscribed': (Attr.fg_hex('#3c3c3c'), 'su'), 112 | 'team_mention': (Attr.fg_hex('#808080'), 'tm'), 113 | } 114 | UNK_REASON = (ROW, '??') 115 | 116 | def pick_label(label_prio, repo, labels): 117 | ''' 118 | Pick the most appropriate (highest priority) label to show. 
119 |     '''
120 |     try:
121 |         label_prio = label_prio[repo]
122 |     except KeyError: # if no specific prioritization for this repo, return the first label
123 |         if len(labels) > 0:
124 |             return labels[0]
125 |         else:
126 |             return None
127 | 
128 |     res = None
129 |     res_prio = len(label_prio) + 1
130 |     for label in labels:
131 |         try:
132 |             prio = -label_prio.index(label['name'])
133 |         except ValueError:
134 |             prio = -len(label_prio)
135 | 
136 |         if res is None or prio > res_prio:
137 |             res = label
138 |             res_prio = prio
139 | 
140 |     return res
141 | 
142 | def parse_args() -> argparse.Namespace:
143 |     '''Parse command line arguments.'''
144 |     parser = argparse.ArgumentParser(description='Display github notifications')
145 | 
146 |     parser.add_argument('--exclude-reasons', '-x', help='Reasons to exclude (comma-separated) from: assign, author, comment, invitation, manual, mention, review_requested, security_alert, state_change, subscribed, team_mention')
147 |     parser.add_argument('--all', '-a', action='store_const', const=True, default=False, help='Show all notifications, also those that are read')
148 |     parser.add_argument('--days', '-d', type=int, default=7, help='Number of days to look back (default: 7)')
149 |     parser.add_argument('--refresh-time', '-r', type=int, default=600, help='Refresh time in seconds in interactive mode (default: 600)')
150 |     parser.add_argument('--default-config', action='store_const', const=True, default=False, help='Generate a default configuration file in ~/.config/ghwatch')
151 |     parser.add_argument('--sort', '-s', action='store_true', default=None, help="Sort notifications by reasons (and then time). Overrides 'sort_notifications' in the configuration file")
152 |     parser.add_argument('--no-sort', dest='sort', action='store_false', help="Don't sort notifications. Overrides 'sort_notifications' in the configuration file")
153 | 
154 |     return parser.parse_args()
155 | 
156 | config_dir: str = f'{os.path.expanduser("~")}/.config/ghwatch'
157 | config_file: str = f'{config_dir}/ghwatch.conf'
158 | 
159 | def parse_config_file(generate=False):
160 |     config = DEFAULT_CONFIG
161 |     if generate:
162 |         os.makedirs(config_dir, exist_ok=True)
163 |         with open(config_file, 'w') as f:
164 |             json.dump(config, f, indent=4)
165 |     if os.path.exists(config_file):
166 |         with open(config_file, 'r') as f:
167 |             # TODO: merge with default config instead of overwrite here
168 |             config = json.load(f)
169 |     else:
170 |         print(f'No configuration file {config_file}, use --default-config to generate a default one.', file=sys.stderr)
171 |         sys.exit(1)
172 |     return config
173 | 
174 | def get_html_url(ghbase, rec):
175 |     '''
176 |     Get the browser URL for a notification object.
177 | 
178 |     I think the "proper" way to do this would be to fetch rec.subject.url and get
179 |     'html_url' from the returned object. But to avoid another roundtrip to github,
180 |     this implements the logic locally.
181 | ''' 182 | m = re.match('.*\/([0-9a-f]+)$', rec.subject.url) 183 | if not m: 184 | return None 185 | idx = m.group(1) 186 | 187 | comment_n = '' 188 | if rec.subject.latest_comment_url: 189 | m = re.match('.*\/comments\/([0-9a-f]+)$', rec.subject.latest_comment_url) 190 | if m: 191 | comment_n = '#issuecomment-' + m.group(1) 192 | 193 | if rec.subject.type == 'PullRequest': 194 | return f'{ghbase}{rec.repository.full_name}/pull/{idx}{comment_n}' 195 | if rec.subject.type == 'Issue': 196 | return f'{ghbase}{rec.repository.full_name}/issues/{idx}{comment_n}' 197 | elif rec.subject.type == 'Commit': 198 | return f'{ghbase}{rec.repository.full_name}/commit/{idx}{comment_n}' 199 | else: # TODO: releases and other things 200 | return None 201 | 202 | def priority_sort_key(item): 203 | return (-REASON_PRIO.index(item.reason), item.updated_at) 204 | 205 | def github_load(user): 206 | ''' 207 | Load the notifications from github and return them 208 | ''' 209 | since = datetime.datetime.utcnow() - datetime.timedelta(days=args.days) 210 | get_all = True if args.all else GithubObject.NotSet 211 | notifications = list(user.get_notifications(all=get_all, since=since)) 212 | 213 | if sort_notifications: 214 | notifications.sort(key=priority_sort_key, reverse=True) 215 | 216 | return notifications 217 | 218 | def draw(notifications): 219 | sys.stdout.write(Attr.CLEAR) 220 | pr.print_header(Theme.HEADER) 221 | issue_column = pr.column_info(4) 222 | 223 | # TODO: 224 | # "Notifications are optimized for polling with the Last-Modified header. If 225 | # there are no new notifications, you will see a 304 Not Modified response, 226 | # leaving your current rate limit untouched. There is an X-Poll-Interval header 227 | # that specifies how often (in seconds) you are allowed to poll. In times of 228 | # high server load, the time may increase. Please obey the header." 
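The mapping get_html_url() performs can be shown in isolation; the sample API URL below is taken from the comments in draw() further down:

```python
import re

# Pull the trailing id off the API URL and re-attach it to the web path
# matching the subject type; unknown types yield None.
def api_to_html(ghbase, repo, subject_type, api_url):
    m = re.match(r'.*/([0-9a-f]+)$', api_url)
    if not m:
        return None
    path = {'PullRequest': 'pull', 'Issue': 'issues', 'Commit': 'commit'}.get(subject_type)
    return f'{ghbase}{repo}/{path}/{m.group(1)}' if path else None

print(api_to_html('https://github.com/', 'bitcoin/bitcoin', 'Issue',
                  'https://api.github.com/repos/bitcoin/bitcoin/issues/20935'))
```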
229 | 230 | row = 0 231 | buttons = [] 232 | 233 | for rec in notifications: 234 | if rec.reason in exclude_reasons: 235 | continue 236 | # rec.subject.type : PullRequest, Issue, Commit 237 | # rec.reason : comment, subscribed, mention, author, state_change, review_requested see https://docs.github.com/en/rest/reference/activity#notification-reasons 238 | # state_change is only for self-initiated state changed, not any monitored issue/PR 239 | if rec.subject.type in {'PullRequest', 'Issue'}: 240 | # PullRequest: https://api.github.com/repos/bitcoin-core/secp256k1/pulls/875 241 | # Issue: https://api.github.com/repos/bitcoin/bitcoin/issues/20935 242 | m = re.match('.*\/([0-9]+)$', rec.subject.url) 243 | issue = int(m.group(1)) 244 | meta = ghmeta.get((rec.repository.full_name, issue)) 245 | ref_str = str(issue) 246 | elif rec.subject.type == 'Commit': 247 | # Commit: https://api.github.com/repos/bitcoin/bitcoin/commits/54ce4fac80689621dcbcc76169b2b00b179ee743 248 | m = re.match('.*\/([0-9a-f]+)$', rec.subject.url) 249 | ref_str = m.group(1) 250 | issue = None 251 | meta = None 252 | else: 253 | # Release: https://api.github.com/repos/bitcoin-core/HWI/releases/34442950 254 | # RepositoryInvitation: ? 
255 | if rec.subject.type not in {'Release', 'RepositoryInvitation'}: # Huh 256 | print(rec.subject.type, rec.subject.url) 257 | assert(False) 258 | issue = None 259 | meta = None 260 | 261 | label_t = (Theme.ROW, '') 262 | state = 'unknown' 263 | if meta is not None: 264 | label = pick_label(config['label_prio'], rec.repository.full_name, meta['labels']) 265 | if label is not None: 266 | label_t = (Attr.bg_hex(label['color']) + Attr.fg(0, 0, 0), label['name']) 267 | 268 | state = meta['state'] 269 | if meta['pr'] is not None and meta['pr']['merged']: 270 | state = 'merged' 271 | 272 | pr.print_row([ 273 | (Theme.DATETIME, rec.updated_at), 274 | Theme.REASON_GLYPHS.get(rec.reason, Theme.UNK_REASON), 275 | (Theme.ROW, rec.repository.full_name), 276 | (Theme.ROW, rec.subject.type), 277 | (Theme.REF.get(state, ''), ref_str), 278 | label_t, 279 | (Theme.ROW, rec.subject.title), 280 | ]) 281 | 282 | buttons.append(ButtonInfo( 283 | x0 = issue_column.x, 284 | y0 = row + 1, 285 | x1 = issue_column.x + issue_column.width, 286 | y1 = row + 2, 287 | url = get_html_url(config['ghbase'], rec), 288 | )) 289 | 290 | row += 1 291 | if row == N: 292 | break 293 | 294 | return buttons 295 | 296 | def handle_mouse_click(b, config): 297 | ''' 298 | Handle click action on button element. 
299 | ''' 300 | if b.url is not None: 301 | if config['browser'] is None: 302 | webbrowser.open(b.url) 303 | else: 304 | subprocess.call(config['browser'] + [b.url], stdout = subprocess.PIPE, stderr = subprocess.PIPE) 305 | 306 | def set_window_size(): 307 | global pr, N 308 | 309 | (cols, rows) = shutil.get_terminal_size((80, 25)) 310 | W = cols - 70 311 | N = rows - 2 312 | if W < 10 or N < 5: 313 | print('Terminal size too small') 314 | sys.exit(1) 315 | 316 | pr = TablePrinter(sys.stdout, Attr, [ 317 | Column('date', 19), 318 | Column('r', 2), 319 | Column('repository', 24), 320 | Column('k', 1), 321 | Column('#', 5), 322 | Column('label', 12), 323 | Column('title', W), 324 | ]) 325 | 326 | def pull_repositories(config): 327 | ''' 328 | Use subprocess to "git pull" the configured metadata-repositories. 329 | ''' 330 | try: 331 | for repo, repo_path in config['meta'].items(): 332 | subprocess.run(['git','pull'], check=True, cwd=repo_path, capture_output=True) 333 | except subprocess.CalledProcessError as e: 334 | print(e.stderr.decode()) 335 | raise 336 | 337 | def main(): 338 | global args, config, exclude_reasons, ghmeta, sort_notifications 339 | 340 | args = parse_args() 341 | config = parse_config_file(args.default_config) 342 | if not config['ghtoken']: 343 | print(f'A github token is required to be set as "ghtoken" in {config_file}', file=sys.stderr) 344 | exit(1) 345 | auto_update = config.get('auto_update', 0) 346 | ghmeta = GhMeta(config['meta']) 347 | gh = Github(config['ghtoken']) 348 | user = gh.get_user() 349 | 350 | exclude_reasons = set() 351 | if args.exclude_reasons: 352 | exclude_reasons.update(args.exclude_reasons.split(',')) 353 | 354 | if args.sort is None: 355 | sort_notifications = config.get('sort_notifications', False) 356 | else: 357 | sort_notifications = args.sort 358 | 359 | if auto_update: 360 | pull_repositories(config) 361 | 362 | set_window_size() 363 | notifications = github_load(user) 364 | buttons = draw(notifications) 365 | 
366 | Key.start(hide_cursor=True) 367 | 368 | refr_t = 0 369 | meta_t = 0 370 | try: 371 | while True: 372 | # auto-refresh periodically 373 | if refr_t >= args.refresh_time: 374 | notifications = github_load(user) 375 | buttons = draw(notifications) 376 | refr_t = 0 377 | 378 | # auto-update (git pull) metadata-repo(s) 379 | if auto_update and meta_t >= auto_update: 380 | meta_t = 0 381 | pull_repositories(config) 382 | 383 | # handle key input 384 | k = Key.get() 385 | if not k: 386 | Key.input_wait(1.0) 387 | refr_t += 1 388 | meta_t += 1 389 | continue 390 | if k == 'escape': 391 | break 392 | if k == 'mouse_click': 393 | # TODO: highlight button when clicked for a bit of feedback? 394 | for b in buttons: 395 | if b.x0 <= Key.mouse_pos[0] < b.x1 and b.y0 <= Key.mouse_pos[1] < b.y1: 396 | handle_mouse_click(b, config) 397 | if k == 'resize': 398 | set_window_size() 399 | buttons = draw(notifications) 400 | finally: 401 | Key.stop() 402 | 403 | if __name__ == "__main__": 404 | main() 405 | -------------------------------------------------------------------------------- /github-merge.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | # Copyright (c) 2016-2017 The Bitcoin Core developers 3 | # Distributed under the MIT software license, see the accompanying 4 | # file COPYING or http://www.opensource.org/licenses/mit-license.php. 5 | 6 | # This script will locally construct a merge commit for a pull request on a 7 | # github repository, inspect it, sign it and optionally push it. 
8 | 
9 | # The following temporary branches are created/overwritten and deleted:
10 | # * pull/$PULL/base (the current master we're merging onto)
11 | # * pull/$PULL/head (the current state of the remote pull request)
12 | # * pull/$PULL/merge (github's merge)
13 | # * pull/$PULL/local-merge (our merge)
14 | 
15 | # In case of a clean merge that is accepted by the user, the local branch with
16 | # name $BRANCH is overwritten with the merged result, and optionally pushed.
17 | import os
18 | from sys import stdin,stdout,stderr
19 | import argparse
20 | import re
21 | import hashlib
22 | import subprocess
23 | import sys
24 | import json
25 | import codecs
26 | import unicodedata
27 | from urllib.request import Request, urlopen
28 | from urllib.error import HTTPError
29 | 
30 | # External tools (can be overridden using environment)
31 | GIT = os.getenv('GIT','git')
32 | SHELL = os.getenv('SHELL','bash')
33 | 
34 | # OS specific configuration for terminal attributes
35 | ATTR_RESET = ''
36 | ATTR_PR = ''
37 | ATTR_NAME = ''
38 | ATTR_WARN = ''
39 | ATTR_HL = ''
40 | COMMIT_FORMAT = '%H %s (%an)%d'
41 | if os.name == 'posix': # if posix, assume we can use basic terminal escapes
42 |     ATTR_RESET = '\033[0m'
43 |     ATTR_PR = '\033[1;36m'
44 |     ATTR_NAME = '\033[0;36m'
45 |     ATTR_WARN = '\033[1;31m'
46 |     ATTR_HL = '\033[95m'
47 |     COMMIT_FORMAT = '%C(bold blue)%H%Creset %s %C(cyan)(%an)%Creset%C(green)%d%Creset'
48 | 
49 | def sanitize(s, newlines=False):
50 |     '''
51 |     Strip control characters (optionally except for newlines) from a string.
52 |     This prevents text data from doing potentially confusing or harmful things
53 |     with ANSI formatting, linefeeds, bells, etc.
54 |     '''
55 |     return ''.join(ch for ch in s if unicodedata.category(ch)[0] != "C" or (ch == '\n' and newlines))
56 | 
57 | def git_config_get(option, default=None):
58 |     '''
59 |     Get named configuration option from git repository.
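The effect of sanitize() above on a hostile string, as a self-contained sketch:

```python
import unicodedata

# Drop every character in Unicode category "C*" (control, format, ...), so
# embedded ANSI escapes or bells cannot drive the terminal; '\n' survives
# only when newlines=True, just like sanitize() above.
def strip_controls(s, newlines=False):
    return ''.join(ch for ch in s
                   if unicodedata.category(ch)[0] != "C" or (ch == '\n' and newlines))

print(strip_controls('title\x1b[31m red\x07'))  # title[31m red
```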
60 |     '''
61 |     try:
62 |         return subprocess.check_output([GIT,'config','--get',option]).rstrip().decode('utf-8')
63 |     except subprocess.CalledProcessError:
64 |         return default
65 | 
66 | def get_response(req_url, ghtoken):
67 |     req = Request(req_url)
68 |     if ghtoken is not None:
69 |         req.add_header('Authorization', 'token ' + ghtoken)
70 |     return urlopen(req)
71 | 
72 | def sanitize_ghdata(rec):
73 |     '''
74 |     Sanitize comment/review record coming from github API in-place.
75 |     This currently sanitizes the following:
76 |     - ['title'] PR title (optional, may not have newlines)
77 |     - ['body'] Comment body (required, may have newlines)
78 |     It also checks rec['user']['login'] (required) to be a valid github username.
79 | 
80 |     When anything more is used, update this function!
81 |     '''
82 |     if 'title' in rec: # only for PRs
83 |         rec['title'] = sanitize(rec['title'], newlines=False)
84 |     if rec['body'] is None:
85 |         rec['body'] = ''
86 |     rec['body'] = sanitize(rec['body'], newlines=True)
87 | 
88 |     if rec['user'] is None: # User deleted account
89 |         rec['user'] = {'login': '[deleted]'}
90 |     else:
91 |         # "Github username may only contain alphanumeric characters or hyphens".
92 |         # Sometimes bots have a "[bot]" suffix in the login, so we also match for that.
93 |         # Use \Z instead of $ to not match the final newline, only the end of the string.
94 |         if not re.match(r'[a-zA-Z0-9-]+(\[bot\])?\Z', rec['user']['login'], re.DOTALL):
95 |             raise ValueError('Github username contains invalid characters: {}'.format(sanitize(rec['user']['login'])))
96 |     return rec
97 | 
98 | def retrieve_json(req_url, ghtoken, use_pagination=False):
99 |     '''
100 |     Retrieve json from github.
101 |     Return None if an error happens.
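retrieve_json's paginated mode walks GitHub's Link response header; the step that extracts the next page number can be sketched standalone (the sample header is illustrative):

```python
# Find the rel="next" entry in a Link header and cut the page number out of
# its URL, mirroring the pagination loop in retrieve_json() below.
def next_page(link_header):
    for part in link_header.split(','):
        if 'rel="next"' in part:
            return int(part[part.find("page=") + 5:part.find(">")])
    return None

print(next_page('<https://api.github.com/repos/x/y/issues/1/comments?page=2>; rel="next", '
                '<https://api.github.com/repos/x/y/issues/1/comments?page=5>; rel="last"'))  # 2
```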
102 | ''' 103 | try: 104 | reader = codecs.getreader('utf-8') 105 | if not use_pagination: 106 | return sanitize_ghdata(json.load(reader(get_response(req_url, ghtoken)))) 107 | 108 | obj = [] 109 | page_num = 1 110 | while True: 111 | req_url_page = '{}?page={}'.format(req_url, page_num) 112 | result = get_response(req_url_page, ghtoken) 113 | obj.extend(json.load(reader(result))) 114 | 115 | link = result.headers.get('link', None) 116 | if link is not None: 117 | link_next = [l for l in link.split(',') if 'rel="next"' in l] 118 | if len(link_next) > 0: 119 | page_num = int(link_next[0][link_next[0].find("page=")+5:link_next[0].find(">")]) 120 | continue 121 | break 122 | return [sanitize_ghdata(d) for d in obj] 123 | except HTTPError as e: 124 | error_message = e.read() 125 | print('Warning: unable to retrieve pull information from github: %s' % e) 126 | print('Detailed error: %s' % error_message) 127 | return None 128 | except Exception as e: 129 | print('Warning: unable to retrieve pull information from github: %s' % e) 130 | return None 131 | 132 | def retrieve_pr_info(repo,pull,ghtoken): 133 | req_url = "https://api.github.com/repos/"+repo+"/pulls/"+pull 134 | return retrieve_json(req_url,ghtoken) 135 | 136 | def retrieve_pr_comments(repo,pull,ghtoken): 137 | req_url = "https://api.github.com/repos/"+repo+"/issues/"+pull+"/comments" 138 | return retrieve_json(req_url,ghtoken,use_pagination=True) 139 | 140 | def retrieve_pr_reviews(repo,pull,ghtoken): 141 | req_url = "https://api.github.com/repos/"+repo+"/pulls/"+pull+"/reviews" 142 | return retrieve_json(req_url,ghtoken,use_pagination=True) 143 | 144 | def ask_prompt(text): 145 | print(text,end=" ",file=stderr) 146 | stderr.flush() 147 | reply = stdin.readline().rstrip() 148 | print("",file=stderr) 149 | return reply 150 | 151 | def get_symlink_files(): 152 | files = sorted(subprocess.check_output([GIT, 'ls-tree', '--full-tree', '-r', 'HEAD']).splitlines()) 153 | ret = [] 154 | for f in files: 155 | if 
(int(f.decode('utf-8').split(" ")[0], 8) & 0o170000) == 0o120000: 156 | ret.append(f.decode('utf-8').split("\t")[1]) 157 | return ret 158 | 159 | def tree_sha512sum(commit='HEAD'): 160 | # request metadata for entire tree, recursively 161 | files = [] 162 | blob_by_name = {} 163 | for line in subprocess.check_output([GIT, 'ls-tree', '--full-tree', '-r', commit]).splitlines(): 164 | name_sep = line.index(b'\t') 165 | metadata = line[:name_sep].split() # perms, 'blob', blobid 166 | assert(metadata[1] == b'blob') 167 | name = line[name_sep+1:] 168 | files.append(name) 169 | blob_by_name[name] = metadata[2] 170 | 171 | files.sort() 172 | # open connection to git-cat-file in batch mode to request data for all blobs 173 | # this is much faster than launching it per file 174 | p = subprocess.Popen([GIT, 'cat-file', '--batch'], stdout=subprocess.PIPE, stdin=subprocess.PIPE) 175 | overall = hashlib.sha512() 176 | for f in files: 177 | blob = blob_by_name[f] 178 | # request blob 179 | p.stdin.write(blob + b'\n') 180 | p.stdin.flush() 181 | # read header: blob, "blob", size 182 | reply = p.stdout.readline().split() 183 | assert(reply[0] == blob and reply[1] == b'blob') 184 | size = int(reply[2]) 185 | # hash the blob data 186 | intern = hashlib.sha512() 187 | ptr = 0 188 | while ptr < size: 189 | bs = min(65536, size - ptr) 190 | piece = p.stdout.read(bs) 191 | if len(piece) == bs: 192 | intern.update(piece) 193 | else: 194 | raise IOError('Premature EOF reading git cat-file output') 195 | ptr += bs 196 | dig = intern.hexdigest() 197 | assert(p.stdout.read(1) == b'\n') # ignore LF that follows blob data 198 | # update overall hash with file hash 199 | overall.update(dig.encode("utf-8")) 200 | overall.update(" ".encode("utf-8")) 201 | overall.update(f) 202 | overall.update("\n".encode("utf-8")) 203 | p.stdin.close() 204 | if p.wait(): 205 | raise IOError('Non-zero return value executing git cat-file') 206 | return overall.hexdigest() 207 | 208 | def 
get_acks_from_comments(head_commit, comments) -> dict: 209 | # Look for abbreviated commit id, because not everyone wants to type/paste 210 | # the whole thing and the chance of collisions within a PR is small enough 211 | head_abbrev = head_commit[0:6] 212 | acks = {} 213 | for c in comments: 214 | review = [ 215 | l for l in c["body"].splitlines() 216 | if "ACK" in l 217 | and head_abbrev in l 218 | and not l.startswith("> ") # omit if quoted comment 219 | and not l.startswith(" ") # omit if markdown indentation 220 | ] 221 | if review: 222 | acks[c['user']['login']] = review[0] 223 | return acks 224 | 225 | def make_acks_message(head_commit, acks) -> str: 226 | if acks: 227 | ack_str = '\n\nACKs for top commit:\n' 228 | for name, msg in acks.items(): 229 | ack_str += ' {}:\n'.format(name) 230 | ack_str += ' {}\n'.format(msg) 231 | else: 232 | ack_str = '\n\nTop commit has no ACKs.\n' 233 | return ack_str 234 | 235 | def print_merge_details(pull_reference, title, branch, base_branch, head_branch, acks, message): 236 | print('{}{}{} {} {}into {}{}'.format(ATTR_RESET+ATTR_PR,pull_reference,ATTR_RESET,title,ATTR_RESET+ATTR_PR,branch,ATTR_RESET)) 237 | subprocess.check_call([GIT,'--no-pager','log','--graph','--topo-order','--pretty=tformat:'+COMMIT_FORMAT,base_branch+'..'+head_branch]) 238 | if acks is not None: 239 | if acks: 240 | print('{}ACKs:{}'.format(ATTR_PR, ATTR_RESET)) 241 | for ack_name, ack_msg in acks.items(): 242 | print('* {} {}({}){}'.format(ack_msg, ATTR_NAME, ack_name, ATTR_RESET)) 243 | else: 244 | print('{}Top commit has no ACKs!{}'.format(ATTR_WARN, ATTR_RESET)) 245 | show_message = False 246 | if message is not None and '@' in message: 247 | print('{}Merge message contains an @!{}'.format(ATTR_WARN, ATTR_RESET)) 248 | show_message = True 249 | if message is not None and '<!--' in message: 250 | print('{}Merge message contains an html comment!{}'.format(ATTR_WARN, ATTR_RESET)) 251 | show_message = True 252 | if show_message: 253 | print('-' * 75) 254 | print(message) 255 | print('-' * 75) 256 | 257 | def parse_arguments(): 258 | epilog = ''' 259 | In addition, you can set the following git configuration variables: 260 | 261 | githubmerge.repository (mandatory, the repository to merge to, 262 | e.g. '<owner>/<repo>'), 263 | 264 | githubmerge.pushmirrors (default: none, comma-separated list of mirrors to push merges of the master development branch to, e.g.
`git@gitlab.com:<owner>/<repo>.git,git@github.com:<owner>/<repo>.git`), 265 | user.signingkey (mandatory), 266 | user.ghtoken (default: none). 267 | githubmerge.merge-author-email (default: Email from git config), 268 | githubmerge.host (default: git@github.com), 269 | githubmerge.branch (no default), 270 | githubmerge.testcmd (default: none). 271 | ''' 272 | parser = argparse.ArgumentParser(description='Utility to merge, sign and push github pull requests', 273 | epilog=epilog) 274 | parser.add_argument('--repo-from', '-r', metavar='repo_from', type=str, nargs='?', 275 | help='The repo to fetch the pull request from. Useful for monotree repositories. Can only be specified when branch==master. (default: githubmerge.repository setting)') 276 | parser.add_argument('pull', metavar='PULL', type=int, nargs=1, 277 | help='Pull request ID to merge') 278 | parser.add_argument('branch', metavar='BRANCH', type=str, nargs='?', 279 | default=None, help='Branch to merge against (default: githubmerge.branch setting, or base branch for pull, or \'master\')') 280 | return parser.parse_args() 281 | 282 | def main(): 283 | # Extract settings from git repo 284 | repo = git_config_get('githubmerge.repository') 285 | host = git_config_get('githubmerge.host','git@github.com') 286 | opt_branch = git_config_get('githubmerge.branch',None) 287 | merge_author_email = git_config_get('githubmerge.merge-author-email',None) 288 | testcmd = git_config_get('githubmerge.testcmd') 289 | ghtoken = git_config_get('user.ghtoken') 290 | signingkey = git_config_get('user.signingkey') 291 | if repo is None: 292 | print("ERROR: No repository configured. Use this command to set:", file=stderr) 293 | print("git config githubmerge.repository <owner>/<repo>", file=stderr) 294 | sys.exit(1) 295 | if signingkey is None: 296 | print("ERROR: No GPG signing key set.
Set one using:",file=stderr) 297 | print("git config --global user.signingkey <key>",file=stderr) 298 | sys.exit(1) 299 | 300 | # Extract settings from command line 301 | args = parse_arguments() 302 | repo_from = args.repo_from or repo 303 | is_other_fetch_repo = repo_from != repo 304 | pull = str(args.pull[0]) 305 | 306 | if host.startswith(('https:','http:')): 307 | host_repo = host+"/"+repo+".git" 308 | host_repo_from = host+"/"+repo_from+".git" 309 | else: 310 | host_repo = host+":"+repo 311 | host_repo_from = host+":"+repo_from 312 | 313 | # Receive pull information from github 314 | info = retrieve_pr_info(repo_from,pull,ghtoken) 315 | if info is None: 316 | sys.exit(1) 317 | title = info['title'].strip() 318 | body = info['body'].strip() 319 | pull_reference = repo_from + '#' + pull 320 | # precedence order for destination branch argument: 321 | # - command line argument 322 | # - githubmerge.branch setting 323 | # - base branch for pull (as retrieved from github) 324 | # - 'master' 325 | branch = args.branch or opt_branch or info['base']['ref'] or 'master' 326 | 327 | if branch == 'master': 328 | push_mirrors = git_config_get('githubmerge.pushmirrors', default='').split(',') 329 | push_mirrors = [p for p in push_mirrors if p] # Filter empty string 330 | else: 331 | push_mirrors = [] 332 | if is_other_fetch_repo: 333 | print('ERROR: --repo-from is only supported for the master development branch') 334 | sys.exit(1) 335 | 336 | # Initialize source branches 337 | head_branch = 'pull/'+pull+'/head' 338 | base_branch = 'pull/'+pull+'/base' 339 | merge_branch = 'pull/'+pull+'/merge' 340 | local_merge_branch = 'pull/'+pull+'/local-merge' 341 | 342 | devnull = open(os.devnull, 'w', encoding="utf8") 343 | try: 344 | subprocess.check_call([GIT,'checkout','-q',branch]) 345 | except subprocess.CalledProcessError: 346 | print(f"ERROR: Cannot check out branch {branch}.", file=stderr) 347 | sys.exit(3) 348 | try: 349 |
subprocess.check_call([GIT,'fetch','-q',host_repo_from,'+refs/pull/'+pull+'/*:refs/heads/pull/'+pull+'/*', 350 | '+refs/heads/'+branch+':refs/heads/'+base_branch]) 351 | except subprocess.CalledProcessError: 352 | print(f"ERROR: Cannot find pull request {pull_reference} or branch {branch} on {host_repo_from}.", file=stderr) 353 | sys.exit(3) 354 | try: 355 | subprocess.check_call([GIT,'--no-pager','log','-q','-1','refs/heads/'+head_branch], stdout=devnull, stderr=stdout) 356 | head_commit = subprocess.check_output([GIT,'--no-pager','log','-1','--pretty=format:%H',head_branch]).decode('utf-8') 357 | assert len(head_commit) == 40 358 | except subprocess.CalledProcessError: 359 | print(f"ERROR: Cannot find head of pull request {pull_reference} on {host_repo_from}.", file=stderr) 360 | sys.exit(3) 361 | try: 362 | subprocess.check_call([GIT,'--no-pager','log','-q','-1','refs/heads/'+merge_branch], stdout=devnull, stderr=stdout) 363 | except subprocess.CalledProcessError: 364 | print(f"ERROR: Cannot find merge of pull request {pull_reference} on {host_repo_from}.", file=stderr) 365 | sys.exit(3) 366 | subprocess.check_call([GIT,'checkout','-q',base_branch]) 367 | subprocess.call([GIT,'branch','-q','-D',local_merge_branch], stderr=devnull) 368 | subprocess.check_call([GIT,'checkout','-q','-b',local_merge_branch]) 369 | 370 | try: 371 | # Go up to the repository's root. 372 | toplevel = subprocess.check_output([GIT,'rev-parse','--show-toplevel']).strip() 373 | os.chdir(toplevel) 374 | # Create unsigned merge commit. 
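An editorial aside before the merge-commit construction below: the PR title and body retrieved earlier were passed through `sanitize()`, which drops Unicode control characters so that attacker-controlled comment text cannot inject terminal escapes into the reviewer's console. A minimal self-contained illustration of the same scheme (the input string is made up):

```python
import unicodedata

def sanitize(s, newlines=False):
    # Same idea as the helper defined earlier in this file: drop characters in
    # Unicode category 'C*' (control), optionally keeping '\n'.
    return ''.join(ch for ch in s
                   if unicodedata.category(ch)[0] != 'C' or (ch == '\n' and newlines))

evil = 'Fix bug\x1b[31m\x07\ninjected'
assert sanitize(evil) == 'Fix bug[31minjected'            # ESC, BEL and LF stripped
assert sanitize(evil, newlines=True) == 'Fix bug[31m\ninjected'
```

Note that the escape sequence's printable tail (`[31m`) survives; without the leading ESC byte it is harmless text.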
375 | if title: 376 | firstline = 'Merge {}: {}'.format(pull_reference,title) 377 | else: 378 | firstline = 'Merge {}'.format(pull_reference) 379 | message = firstline + '\n\n' 380 | message += subprocess.check_output([GIT,'--no-pager','log','--no-merges','--topo-order','--pretty=format:%H %s (%an)',base_branch+'..'+head_branch]).decode('utf-8') 381 | message += '\n\nPull request description:\n\n ' + body.replace('\n', '\n ') + '\n' 382 | try: 383 | subprocess.check_call([GIT,'merge','-q','--commit','--no-edit','--no-ff','--no-gpg-sign','-m',message.encode('utf-8'),head_branch]) 384 | except subprocess.CalledProcessError: 385 | print("ERROR: Cannot be merged cleanly.",file=stderr) 386 | subprocess.check_call([GIT,'merge','--abort']) 387 | sys.exit(4) 388 | logmsg = subprocess.check_output([GIT,'--no-pager','log','--pretty=format:%s','-n','1']).decode('utf-8') 389 | if logmsg.rstrip() != firstline.rstrip(): 390 | print("ERROR: Creating merge failed (already merged?).",file=stderr) 391 | sys.exit(4) 392 | 393 | symlink_files = get_symlink_files() 394 | for f in symlink_files: 395 | print(f"ERROR: File '{f}' was a symlink") 396 | if len(symlink_files) > 0: 397 | sys.exit(4) 398 | 399 | # Compute SHA512 of git tree (to be able to detect changes before sign-off) 400 | try: 401 | first_sha512 = tree_sha512sum() 402 | except subprocess.CalledProcessError: 403 | print("ERROR: Unable to compute tree hash") 404 | sys.exit(4) 405 | 406 | print_merge_details(pull_reference, title, branch, base_branch, head_branch, acks=None, message=None) 407 | print() 408 | 409 | # Run test command if configured. 410 | if testcmd: 411 | if subprocess.call(testcmd,shell=True): 412 | print(f"ERROR: Running '{testcmd}' failed.",file=stderr) 413 | sys.exit(5) 414 | 415 | # Show the created merge. 
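The `first_sha512` snapshot above comes from `tree_sha512sum()`, which hashes, per file in sorted name order, the string `hex(sha512(blob)) + ' ' + filename + '\n'` into one overall SHA512. A toy in-memory version of that scheme (file names and contents are made up), useful for seeing why the result is order-independent but content-sensitive:

```python
import hashlib

def toy_tree_sha512(files):
    # files: dict mapping filename (bytes) -> content (bytes).
    # Mirrors the aggregation in tree_sha512sum(): per-file sha512 hex digest,
    # a space, the name, a newline -- folded into one overall sha512.
    overall = hashlib.sha512()
    for name in sorted(files):
        dig = hashlib.sha512(files[name]).hexdigest()
        overall.update(dig.encode('utf-8'))
        overall.update(b' ')
        overall.update(name)
        overall.update(b'\n')
    return overall.hexdigest()

a = toy_tree_sha512({b'src/main.py': b'print(1)\n', b'README': b'hi\n'})
b = toy_tree_sha512({b'README': b'hi\n', b'src/main.py': b'print(1)\n'})
assert a == b            # insertion order does not matter, names are sorted
assert a != toy_tree_sha512({b'README': b'hi!\n', b'src/main.py': b'print(1)\n'})
```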
416 | diff = subprocess.check_output([GIT,'diff',merge_branch+'..'+local_merge_branch]) 417 | subprocess.check_call([GIT,'diff',base_branch+'..'+local_merge_branch]) 418 | if diff: 419 | print("WARNING: merge differs from github!",file=stderr) 420 | reply = ask_prompt("Type 'ignore' to continue.") 421 | if reply.lower() == 'ignore': 422 | print("Difference with github ignored.",file=stderr) 423 | else: 424 | sys.exit(6) 425 | else: 426 | # Verify the result manually. 427 | print("Dropping you on a shell so you can try building/testing the merged source.",file=stderr) 428 | print("Run 'git diff HEAD~' to show the changes being merged.",file=stderr) 429 | print("Type 'exit' when done.",file=stderr) 430 | if os.path.isfile('/etc/debian_version'): # Show pull number on Debian default prompt 431 | os.putenv('debian_chroot',pull) 432 | subprocess.call([SHELL,'-i']) 433 | 434 | second_sha512 = tree_sha512sum() 435 | if first_sha512 != second_sha512: 436 | print("ERROR: Tree hash changed unexpectedly",file=stderr) 437 | sys.exit(8) 438 | 439 | # Retrieve PR comments and ACKs and add to commit message, store ACKs to print them with commit 440 | # description 441 | comments = retrieve_pr_comments(repo_from,pull,ghtoken) + retrieve_pr_reviews(repo_from,pull,ghtoken) 442 | if comments is None: 443 | print("ERROR: Could not fetch PR comments and reviews",file=stderr) 444 | sys.exit(1) 445 | acks = get_acks_from_comments(head_commit=head_commit, comments=comments) 446 | message += make_acks_message(head_commit=head_commit, acks=acks) 447 | # end message with SHA512 tree hash, then update message 448 | message += '\n\nTree-SHA512: ' + first_sha512 449 | try: 450 | subprocess.check_call([GIT,'commit','--amend','--no-gpg-sign','-m',message.encode('utf-8')]) 451 | except subprocess.CalledProcessError: 452 | print("ERROR: Cannot update message.", file=stderr) 453 | sys.exit(4) 454 | 455 | # Sign the merge commit. 
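For intuition on the ACK collection step above: `get_acks_from_comments()` keeps only comment lines that contain both "ACK" and the abbreviated head commit id, and skips quoted (`> `) or indented lines so replies quoting someone else's ACK are not counted. A toy run of the same filter, with made-up usernames and commit id:

```python
head_abbrev = '0d1f2e'  # hypothetical abbreviated head commit id
comments = [
    {'user': {'login': 'alice'}, 'body': 'Code review ACK 0d1f2e\nnice cleanup'},
    {'user': {'login': 'bob'},   'body': '> Code review ACK 0d1f2e'},  # quoted: ignored
    {'user': {'login': 'carol'}, 'body': 'ACK deadbe'},                # other commit: ignored
]
acks = {}
for c in comments:
    review = [l for l in c['body'].splitlines()
              if 'ACK' in l and head_abbrev in l
              and not l.startswith('> ') and not l.startswith('    ')]
    if review:
        acks[c['user']['login']] = review[0]

assert acks == {'alice': 'Code review ACK 0d1f2e'}
```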
456 | print_merge_details(pull_reference, title, branch, base_branch, head_branch, acks, message) 457 | while True: 458 | reply = ask_prompt("Type 's' to sign off on the above merge, or 'x' to reject and exit.").lower() 459 | if reply == 's': 460 | try: 461 | config = ['-c', 'user.name=merge-script'] 462 | if merge_author_email: 463 | config += ['-c', f'user.email={merge_author_email}'] 464 | subprocess.check_call([GIT] + config + ['commit','-q','--gpg-sign','--amend','--no-edit','--reset-author']) 465 | break 466 | except subprocess.CalledProcessError: 467 | print("Error while signing, asking again.",file=stderr) 468 | elif reply == 'x': 469 | print("Not signing off on merge, exiting.",file=stderr) 470 | sys.exit(1) 471 | 472 | # Put the result in branch. 473 | subprocess.check_call([GIT,'checkout','-q',branch]) 474 | subprocess.check_call([GIT,'reset','-q','--hard',local_merge_branch]) 475 | finally: 476 | # Clean up temporary branches. 477 | subprocess.call([GIT,'checkout','-q',branch]) 478 | subprocess.call([GIT,'branch','-q','-D',head_branch],stderr=devnull) 479 | subprocess.call([GIT,'branch','-q','-D',base_branch],stderr=devnull) 480 | subprocess.call([GIT,'branch','-q','-D',merge_branch],stderr=devnull) 481 | subprocess.call([GIT,'branch','-q','-D',local_merge_branch],stderr=devnull) 482 | 483 | # Push the result. 
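One detail relevant to the push step below: `githubmerge.pushmirrors` is read as a single comma-separated git config value, and empty entries are filtered out so that an unset or empty option yields no mirrors. A sketch of that parsing (the mirror URLs are made up):

```python
def parse_mirrors(raw):
    # Mirrors the pushmirrors handling in main(): split on ',' and drop empties.
    return [p for p in (raw or '').split(',') if p]

assert parse_mirrors('') == []
assert parse_mirrors(None) == []
assert parse_mirrors('git@example.com:a/b.git,git@example.org:a/b.git') == \
    ['git@example.com:a/b.git', 'git@example.org:a/b.git']
```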
484 | while True: 485 | reply = ask_prompt("Type 'push' to push the result to {}, branch {}, or 'x' to exit without pushing.".format(', '.join([host_repo] + push_mirrors), branch)).lower() 486 | if reply == 'push': 487 | subprocess.check_call([GIT,'push',host_repo,'refs/heads/'+branch]) 488 | for p_mirror in push_mirrors: 489 | subprocess.check_call([GIT,'push',p_mirror,'refs/heads/'+branch]) 490 | break 491 | elif reply == 'x': 492 | sys.exit(1) 493 | 494 | if __name__ == '__main__': 495 | main() 496 | -------------------------------------------------------------------------------- /list-pulls.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | ''' 3 | Script to parse git commit list, extract github issues to create a changelog in 4 | text and JSON format. 5 | 6 | Run this in the root directory of the repository. 7 | 8 | This requires an up-to-date checkout of https://github.com/zw/bitcoin-gh-meta.git 9 | in the parent directory, or environment variable `GHMETA`. 10 | 11 | It takes a range of commits and a .json file of PRs to exclude, for 12 | example if these are already backported in a minor release. This can be the pulls.json 13 | generated from a previous release. 14 | 15 | Example usage: 16 | 17 | ../maintainer-tools/list-pulls.py v0.18.0 0.19 relnot/pulls-exclude.json > relnot/pulls.md 18 | 19 | The output of this script is a first draft based on rough heuristics, and 20 | likely needs to be extensively manually edited before ending up in the release 21 | notes. 22 | ''' 23 | # W.J. 
van der Laan 2017-2021 24 | # SPDX-License-Identifier: MIT 25 | import subprocess 26 | import re 27 | import json 28 | import time 29 | import sys, os 30 | from collections import namedtuple, defaultdict 31 | 32 | # == Global environment == 33 | GIT = os.getenv('GIT', 'git') 34 | GHMETA = os.getenv('GHMETA', '../bitcoin-gh-meta') 35 | DEFAULT_REPO = os.getenv('DEFAULT_REPO', 'bitcoin/bitcoin') 36 | 37 | # == Label to category mapping == 38 | # See: https://github.com/bitcoin/bitcoin/labels 39 | # this is priority ordering: the first label to be matched determines the 40 | # category it is slotted to 41 | # TODO: simply create titles for combinations of mappings, and leave it up to release note writer 42 | # which one to choose? this automatic choosing based on "priority" kind of sucks. 43 | LABEL_MAPPING = ( 44 | # Consensus, mining and policy changes should come first 45 | ({'consensus'}, 46 | 'Consensus'), 47 | ({'tx fees and policy'}, 48 | 'Policy'), 49 | ({'mining'}, 50 | 'Mining'), 51 | # Privacy changes 52 | ({'privacy'}, 53 | 'Privacy'), 54 | # Backends 55 | ({'mempool', 'block storage', 'utxo db and indexes', 'validation'}, 56 | 'Block and transaction handling'), 57 | ({'p2p'}, 58 | 'P2P protocol and network code'), 59 | ({'wallet', 'descriptors'}, 60 | 'Wallet'), 61 | # Frontends 62 | ({'rpc/rest/zmq'}, 63 | 'RPC and other APIs'), 64 | ({'gui'}, 65 | 'GUI'), 66 | # Frameworks, infrastructure, building etcetera 67 | ({'build system'}, 68 | 'Build system'), 69 | ({'tests'}, 70 | 'Tests and QA'), 71 | ({'utils/log/libs', 'scripts and tools', 'upstream', 'utils and libraries'}, 72 | 'Miscellaneous'), 73 | # Documentation-only 74 | ({'docs and output', 'docs'}, 75 | 'Documentation'), 76 | ({'windows', 'unix', 'macos'}, 77 | 'Platform support'), 78 | # Ignore everything below this for pull list 79 | ({'refactoring'}, 80 | 'Refactoring'), # Ignore pure refactoring for pull list 81 | ({'backport'}, 82 | 'Backports'), # Ignore pure backports for pull list 83 | ) 84 
| UNCATEGORIZED = 'Uncategorized' 85 | 86 | # == PR title prefix to category mapping == 87 | # this takes precedence over the above label mapping 88 | # handle (in all cases, ignoring case and leading and trailing ' ') 89 | # SPECIFY IN LOWERCASE 90 | # set do_strip as False if the prefix adds information beyond what the category provides! 91 | # '[prefix]:' '[prefix]' 'prefix:' 92 | PREFIXES = [ 93 | # (prefix, category, do_strip) 94 | ('bench', 'Tests and QA', False), 95 | ('build', 'Build system', True), 96 | ('ci', 'Tests and QA', False), 97 | ('cli', 'RPC and other APIs', False), 98 | ('consensus', 'Consensus', True), 99 | ('contrib', 'Miscellaneous', False), 100 | ('depends', 'Build system', True), 101 | ('doc', 'Documentation', True), 102 | ('docs', 'Documentation', True), 103 | ('gitian', 'Build system', False), 104 | ('gui', 'GUI', True), 105 | ('lint', 'Miscellaneous', False), 106 | ('logging', 'Miscellaneous', False), 107 | ('mempool', 'Block and transaction handling', True), 108 | ('txmempool', 'Block and transaction handling', True), 109 | ('moveonly', 'Refactoring', False), 110 | ('net', 'P2P protocol and network code', True), 111 | ('nit', 'Refactoring', True), 112 | ('p2p', 'P2P protocol and network code', True), 113 | ('policy', 'Policy', True), 114 | ('qa', 'Tests and QA', True), 115 | ('qt', 'GUI', True), 116 | ('refactor', 'Refactoring', True), 117 | ('release', 'Build system', False), 118 | ('rest', 'RPC and other APIs', False), 119 | ('rpc', 'RPC and other APIs', True), 120 | ('scripted-diff', 'Refactoring', False), 121 | ('script', 'Miscellaneous', False), # !!!
this is unclear, 'script' could also be block/tx handling or even consensus 122 | ('scripts', 'Miscellaneous', False), 123 | ('shutdown', 'Miscellaneous', False), 124 | ('tests', 'Tests and QA', True), 125 | ('test', 'Tests and QA', True), 126 | ('travis', 'Tests and QA', False), 127 | ('trivial', 'Refactoring', True), 128 | ('ui', 'GUI', True), 129 | ('util', 'Miscellaneous', False), 130 | ('utils', 'Miscellaneous', False), 131 | ('validation', 'Block and transaction handling', True), 132 | ('wallet', 'Wallet', True), 133 | ] 134 | 135 | # Per-repository information 136 | REPO_INFO = { 137 | 'bitcoin/bitcoin': { 138 | 'label_mapping': LABEL_MAPPING, 139 | 'prefixes': PREFIXES, 140 | 'default_category': UNCATEGORIZED, 141 | 'ghmeta': GHMETA, 142 | }, 143 | # For now, GUI repository pulls are automatically categorized into the GUI category. 144 | 'bitcoin-core/gui': { 145 | 'label_mapping': (), 146 | 'prefixes': [], 147 | 'default_category': 'GUI', 148 | 'ghmeta': None, 149 | }, 150 | } 151 | 152 | # == Utilities == 153 | 154 | def remove_last_if_empty(l): 155 | '''Remove empty last member of list''' 156 | if l[-1]==b'' or l[-1]=='': 157 | return l[0:-1] 158 | else: 159 | return l 160 | 161 | # Valid chars in github names 162 | VALIDNAMECHARS = '[0-9a-zA-Z\-_]' 163 | # For parsing owner/repo#id 164 | FQID_RE = re.compile('^(' + VALIDNAMECHARS + '+)/(' + VALIDNAMECHARS + '+)#([0-9]+)$') 165 | # For parsing non-qualified #id 166 | PR_RE = re.compile('^#?([0-9]+)$') 167 | 168 | class FQId: 169 | '''Fully qualified PR id.''' 170 | def __init__(self, owner: str, repo: str, pr: int): 171 | self.owner = owner 172 | self.repo = repo 173 | self.pr = pr 174 | 175 | @property 176 | def _key(self): 177 | return (self.owner, self.repo, self.pr) 178 | 179 | def __eq__(self, o): 180 | return self._key == o._key 181 | 182 | def __lt__(self, o): 183 | return self._key < o._key 184 | 185 | def __hash__(self): 186 | return hash(self._key) 187 | 188 | def __str__(self): 189 | return 
f'{self.owner}/{self.repo}#{self.pr}' 190 | 191 | def __repr__(self): 192 | return f'FQId({repr(self.owner)}, {repr(self.repo)}, {repr(self.pr)})' 193 | 194 | @classmethod 195 | def parse(cls, pull, default_repo): 196 | '''Return FQId from 'owner/repo#id' or '#id' or 'id' string.''' 197 | m = FQID_RE.match(pull) 198 | if m: 199 | return cls(m.group(1), m.group(2), int(m.group(3))) 200 | m = PR_RE.match(pull) 201 | if m: 202 | (owner, repo) = default_repo.split('/') 203 | return cls(owner, repo, int(m.group(1))) 204 | raise ValueError(f'Cannot parse {pull} as PR specification.') 205 | 206 | def tests(): 207 | '''Quick internal sanity tests.''' 208 | assert(FQId.parse('bitcoin/bitcoin#1234', 'bitcoin/bitcoin') == FQId('bitcoin', 'bitcoin', 1234)) 209 | assert(FQId.parse('bitcoin-core/gui#1235', 'bitcoin/bitcoin') == FQId('bitcoin-core', 'gui', 1235)) 210 | assert(FQId.parse('#1236', 'bitcoin/bitcoin') == FQId('bitcoin', 'bitcoin', 1236)) 211 | assert(FQId.parse('1237', 'bitcoin/bitcoin') == FQId('bitcoin', 'bitcoin', 1237)) 212 | assert(str(FQId('bitcoin', 'bitcoin', 1239)) == 'bitcoin/bitcoin#1239') 213 | assert(FQId('bitcoin', 'bitcoin', 1239) < FQId('bitcoin', 'bitcoin', 1240)) 214 | assert(not (FQId('bitcoin', 'bitcoin', 1240) < FQId('bitcoin', 'bitcoin', 1239))) 215 | assert(FQId('bitcoin', 'bitcoin', 1240) < FQId('bitcoin-core', 'gui', 1239)) 216 | assert(not (FQId('bitcoin-core', 'gui', 1239) < FQId('bitcoin', 'bitcoin', 1240))) 217 | 218 | # == Main program == 219 | tests() 220 | ref_from = sys.argv[1] # 'v0.10.0rc1' 221 | ref_to = sys.argv[2] # 'master' 222 | 223 | # read exclude file 224 | exclude_pulls = set() 225 | 226 | if len(sys.argv) >= 4: 227 | exclude_file = sys.argv[3] 228 | try: 229 | with open(exclude_file, 'r') as f: 230 | d = json.load(f) 231 | exclude_pulls = set(FQId.parse(str(p['id']), DEFAULT_REPO) for p in d['pulls']) 232 | print(f'Excluding {", ".join(str(p) for p in exclude_pulls)}') 233 | print() 234 | except IOError as e: 235 | 
print(f'Unable to read exclude file {exclude_file}', file=sys.stderr) 236 | exit(1) 237 | 238 | # set of all commits 239 | commits = subprocess.check_output([GIT, 'rev-list', '--reverse', '--topo-order', ref_from+'..'+ref_to]) 240 | commits = commits.decode() 241 | commits = remove_last_if_empty(commits.splitlines()) 242 | commits_list = commits 243 | commits = set(commits) 244 | 245 | CommitData = namedtuple('CommitData', ['sha', 'message', 'title', 'parents']) 246 | commit_data = {} 247 | 248 | # collect data 249 | for commit in commits: 250 | info = subprocess.check_output([GIT, 'show', '-s', '--format=%B%x00%P', commit]) 251 | info = info.decode() 252 | (message, parents) = info.split('\0') 253 | title = message.rstrip().splitlines()[0] 254 | parents = parents.rstrip().split(' ') 255 | commit_data[commit] = CommitData(commit, message, title, parents) 256 | 257 | class CommitMetaData: 258 | pull = None 259 | rebased_from = None 260 | 261 | def __repr__(self): 262 | return 'CommitMetadata(pull=%s,rebased_from=%s)' % (self.pull,self.rebased_from) 263 | 264 | def parse_commit_message(msg): 265 | ''' 266 | Parse backport commit message. 
267 | ''' 268 | retval = CommitMetaData() 269 | for line in msg.splitlines(): 270 | if line.startswith('Github-Pull:'): 271 | param = line[12:].strip() 272 | if param.startswith('#'): # compensate for incorrect #bitcoin-core/gui#148 273 | param = param[1:] 274 | retval.pull = FQId.parse(param, DEFAULT_REPO) 275 | if line.startswith('Rebased-From:'): 276 | retval.rebased_from = line[13:].strip().split() 277 | if retval.pull is not None: 278 | return retval 279 | else: 280 | return None 281 | 282 | # traverse merge commits 283 | pulls = {} 284 | PullData = namedtuple('PullData', ['id', 'merge', 'commits', 'index']) 285 | orphans = set(commits) 286 | MERGE_RE = re.compile('Merge (.*?):') 287 | for c in commit_data.values(): 288 | # is merge commit 289 | if len(c.parents)>1: 290 | assert(len(c.parents)==2) 291 | match = MERGE_RE.match(c.title) 292 | if match: # merges a pull request 293 | if c.sha in orphans: 294 | orphans.remove(c.sha) 295 | #print('removing ', c.sha) 296 | sub_commits = subprocess.check_output([GIT, 'rev-list', c.parents[0]+'..'+c.parents[1]]) 297 | sub_commits = sub_commits.decode() 298 | sub_commits = set(sub_commits.rstrip().splitlines()) 299 | pull = FQId.parse(match.group(1), DEFAULT_REPO) 300 | 301 | # remove commits that are not in the global list 302 | sub_commits = sub_commits.intersection(commits) 303 | for cs in sub_commits: 304 | if cs in orphans: 305 | orphans.remove(cs) 306 | 307 | if not pull in exclude_pulls: 308 | # if any sub-commits left, report them 309 | if sub_commits: 310 | # only report pull if any new commit went into the release 311 | index = commits_list.index(c.sha) 312 | pulls[pull] = PullData(pull, c.sha, sub_commits, index) 313 | 314 | # look up commits and see if they point to master pulls 315 | # (=backport pull) 316 | # add those too 317 | sub_pulls = defaultdict(list) 318 | for cid in sub_commits: 319 | md = parse_commit_message(commit_data[cid].message) 320 | if md: 321 | sub_pulls[md.pull].append(cid) 322 | 323 | 
if not sub_pulls and 'backport' in c.title.lower(): 324 | # just information for manual checking 325 | print(f'{pull}: Merge PR title {repr(c.title)} contains \'backport\' but there are no sub-pulls') 326 | 327 | for (sub_pull, sub_pull_commits) in sub_pulls.items(): 328 | pulls[sub_pull] = PullData(sub_pull, sub_pull_commits[0], sub_pull_commits, index) 329 | else: 330 | print(f'{c.sha}: Merge commit does not merge a PR: {c.title}') 331 | 332 | # Extract remaining pull numbers from orphans, if they're backports 333 | for o in set(orphans): 334 | c = commit_data[o] 335 | md = parse_commit_message(commit_data[o].message) 336 | if md: 337 | pulls[md.pull] = PullData(md.pull, c.sha, [], commits_list.index(c.sha)) 338 | orphans.remove(o) 339 | 340 | # Sort by index in commits list 341 | # This results in approximately chronological order 342 | pulls_order = list(pulls.values()) 343 | pulls_order.sort(key=lambda p:p.index) 344 | pulls_order = [p.id for p in pulls_order] 345 | # pulls_order = sorted(pulls.keys()) 346 | 347 | def guess_category_from_labels(repo_info, labels): 348 | ''' 349 | Guess category for a PR from github labels. 350 | ''' 351 | labels = [l.lower() for l in labels] 352 | for (label_list, category) in repo_info['label_mapping']: 353 | for l in labels: 354 | if l in label_list: 355 | return category 356 | return repo_info['default_category'] 357 | 358 | def get_category(repo_info, labels, message): 359 | ''' 360 | Guess category for a PR from repository, labels and message prefixes. 361 | Strip category from message. 
362 | ''' 363 | category = guess_category_from_labels(repo_info, labels) 364 | message = message.strip() 365 | 366 | for (prefix, p_category, do_strip) in repo_info['prefixes']: 367 | for variant in [('[' + prefix + ']:'), ('[' + prefix + ']'), (prefix + ':')]: 368 | if message.lower().startswith(variant): 369 | category = p_category 370 | message = message[len(variant):].lstrip() 371 | if not do_strip: # if strip is not requested, re-add prefix in sanitized way 372 | message = prefix + ': ' + message.capitalize() 373 | 374 | return (category, message) 375 | 376 | pull_meta = {} 377 | pull_labels = {} 378 | per_category = defaultdict(list) 379 | for pull in pulls_order: 380 | repo_info = REPO_INFO[f'{pull.owner}/{pull.repo}'] 381 | 382 | # Find github metadata for PR, if available 383 | data0 = None 384 | data1 = {'title': '{Not found}', 'user': {'login':'unknown'}} 385 | if repo_info['ghmeta'] is not None: 386 | filename = f'{repo_info["ghmeta"]}/issues/{pull.pr//100}xx/{pull.pr}.json' 387 | try: 388 | with open(filename, 'r') as f: 389 | data0 = json.load(f) 390 | except IOError as e: 391 | pass 392 | 393 | filename = f'{repo_info["ghmeta"]}/issues/{pull.pr//100}xx/{pull.pr}-PR.json' 394 | try: 395 | with open(filename, 'r') as f: 396 | data1 = json.load(f) 397 | except IOError as e: 398 | pass 399 | 400 | message = data1['title'] 401 | author = data1['user']['login'] 402 | if data0 is not None: 403 | labels = [l['name'] for l in data0['labels']] 404 | else: 405 | labels = ['Missing'] 406 | 407 | # nightmarish UTF tweaking to fix broken output of export script 408 | message = message.encode('ISO-8859-1', errors='replace').decode(errors='replace') 409 | 410 | # consistent ellipsis 411 | message = message.replace('...', '…') 412 | # no '.' 
at end 413 | if message.endswith('.'): 414 | message = message[0:-1] 415 | 416 | # determine category and new message from message 417 | category, message = get_category(repo_info, labels, message) 418 | data1['title'] = message 419 | 420 | per_category[category].append((pull, message, author)) 421 | pull_labels[pull] = labels 422 | pull_meta[pull] = data1 423 | 424 | for _,category in LABEL_MAPPING: 425 | if not per_category[category]: 426 | continue 427 | print('### %s' % category) 428 | for dd in per_category[category]: 429 | print(f'- {dd[0]} {dd[1]} ({dd[2]})') 430 | print() 431 | 432 | if per_category[UNCATEGORIZED]: 433 | print('### %s' % UNCATEGORIZED) 434 | for dd in per_category[UNCATEGORIZED]: 435 | print(f'- {dd[0]} {dd[1]} ({dd[2]}) (labels: {pull_labels[dd[0]]})') 436 | print() 437 | 438 | print('### Orphan commits') 439 | for o in orphans: 440 | c = commit_data[o] 441 | print('- `%s` %s' % (o[0:7], c.title)) 442 | 443 | # write to json structure for postprocessing 444 | commits_d = [] 445 | for c in commits_list: 446 | commits_d.append(commit_data[c]) 447 | 448 | pulls_d = [] 449 | for pull in sorted(pulls.keys()): 450 | pd = pulls[pull] 451 | pulls_d.append( 452 | {'id': str(pd.id), 453 | 'merge': pd.merge, 454 | 'commits': list(pd.commits), 455 | 'meta': pull_meta[pd.id]}) 456 | 457 | data_out = { 458 | 'commits': commits_d, 459 | 'pulls': pulls_d, 460 | 'orphans': list(orphans), 461 | } 462 | 463 | with open('pulls.json','w') as f: 464 | json.dump(data_out, f, sort_keys=True, 465 | indent=4, separators=(',', ': ')) 466 | 467 | -------------------------------------------------------------------------------- /make-tag.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | ''' 3 | Make a new release tag, performing a few checks. 
4 | 5 | Usage: make-tag.py <tag> 6 | ''' 7 | import os 8 | import subprocess 9 | import re 10 | import sys 11 | import collections 12 | 13 | import treehash512 14 | 15 | GIT = os.getenv("GIT", "git") 16 | 17 | # Full version specification 18 | VersionSpec = collections.namedtuple('VersionSpec', ['major', 'minor', 'build', 'rc']) 19 | 20 | def version_name(spec): 21 | ''' 22 | Short version name for comparison. 23 | ''' 24 | if not spec.build: 25 | version = f"{spec.major}.{spec.minor}" 26 | else: 27 | version = f"{spec.major}.{spec.minor}.{spec.build}" 28 | if spec.rc: 29 | version += f"rc{spec.rc}" 30 | return version 31 | 32 | def parse_tag(tag): 33 | ''' 34 | Parse a version tag. Valid version tags are 35 | 36 | - v1.2 37 | - v1.2.3 38 | - v1.2rc3 39 | - v1.2.3rc4 40 | ''' 41 | m = re.match(r"^v([0-9]+)\.([0-9]+)(?:\.([0-9]+))?(?:rc([0-9]+))?$", tag) 42 | 43 | if m is None: 44 | print(f"Invalid tag {tag}", file=sys.stderr) 45 | sys.exit(1) 46 | 47 | major = m.group(1) 48 | minor = m.group(2) 49 | build = m.group(3) 50 | rc = m.group(4) 51 | 52 | # Check for x.y.z.0 or x.y.zrc0 53 | if build == '0' or rc == '0': 54 | print('rc or build cannot be specified as 0 (leave them out instead)', file=sys.stderr) 55 | sys.exit(1) 56 | 57 | # Implicitly, treat no rc as rc0 and no build as build 0 58 | if build is None: 59 | build = 0 60 | if rc is None: 61 | rc = 0 62 | 63 | return VersionSpec(int(major), int(minor), int(build), int(rc)) 64 | 65 | def check_buildsystem(spec): 66 | ''' 67 | Parse configure.ac or CMakeLists.txt and check that the version 68 | (major, minor, build, rc) it specifies matches the tag. 69 | ''' 70 | info = {} 71 | filename = 'configure.ac' 72 | if os.path.exists(filename): 73 | pattern = r"define\(_CLIENT_VERSION_([A-Z_]+), ([0-9a-z]+)\)" 74 | else: 75 | filename = 'CMakeLists.txt' 76 | if not os.path.exists(filename): 77 | print("No buildsystem (configure.ac or CMakeLists.txt) found", file=sys.stderr) 78 | sys.exit(1) 79 | pattern = r'set\(CLIENT_VERSION_([A-Z_]+)\s+"?([0-9a-z]+)"?\)' 80 | 81 |
with open(filename) as f: 82 | for line in f: 83 | m = re.match(pattern, line) 84 | if m: 85 | info[m.group(1)] = m.group(2) 86 | # check if IS_RELEASE is set 87 | if info["IS_RELEASE"] != "true": 88 | print(f'{filename}: IS_RELEASE is not set to true', file=sys.stderr) 89 | sys.exit(1) 90 | 91 | cfg_spec = VersionSpec( 92 | int(info['MAJOR']), 93 | int(info['MINOR']), 94 | int(info['BUILD']), 95 | int(info['RC']), 96 | ) 97 | 98 | if cfg_spec != spec: 99 | print(f"{filename}: Version from tag {version_name(spec)} doesn't match version {version_name(cfg_spec)} from the build system", file=sys.stderr) 100 | sys.exit(1) 101 | 102 | def main(): 103 | try: 104 | tag = sys.argv[1] 105 | except IndexError: 106 | print("Usage: make-tag.py <tag>, e.g. v0.19.0 or v0.19.0rc3", file=sys.stderr) 107 | sys.exit(1) 108 | 109 | spec = parse_tag(tag) 110 | 111 | # Check that the script is called from repo root 112 | if not os.path.exists('.git'): 113 | print('Execute this script at the root of the repository', file=sys.stderr) 114 | sys.exit(1) 115 | 116 | # Check if working directory clean 117 | if subprocess.call([GIT, 'diff-index', '--quiet', 'HEAD']): 118 | print('Git working directory is not clean.
Commit changes first.', file=sys.stderr) 119 | sys.exit(1) 120 | 121 | # Check version components against configure.ac in git tree 122 | check_buildsystem(spec) 123 | 124 | # Generate base message 125 | if not spec.build: 126 | version = f"{spec.major}.{spec.minor}" 127 | else: 128 | version = f"{spec.major}.{spec.minor}.{spec.build}" 129 | if spec.rc: 130 | version += f" release candidate {spec.rc}" 131 | else: 132 | version += " final" 133 | msg = 'Bitcoin Core ' + version + '\n' 134 | 135 | # Add treehash header 136 | msg += "\n" 137 | msg += 'Tree-SHA512: ' + treehash512.tree_sha512sum() + '\n' 138 | 139 | # Finally, make the tag 140 | print(msg) 141 | return subprocess.call([GIT, "tag", "-s", tag, "-m", msg]) 142 | 143 | if __name__ == '__main__': 144 | sys.exit(main()) 145 | -------------------------------------------------------------------------------- /optimize-pngs.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | # Copyright (c) 2014-2018 The Bitcoin Core developers 3 | # Distributed under the MIT software license, see the accompanying 4 | # file COPYING or http://www.opensource.org/licenses/mit-license.php. 5 | ''' 6 | Run this script every time you change one of the png files. Using pngcrush, it will optimize the png files, remove various color profiles, remove ancillary chunks (alla) and text chunks (text). 
7 | #pngcrush -brute -ow -rem gAMA -rem cHRM -rem iCCP -rem sRGB -rem alla -rem text 8 | 9 | 10 | Install instructions 11 | ==== 12 | 13 | virtualenv ./optimize-pngs-env 14 | source ./optimize-pngs-env/bin/activate 15 | pip install Pillow 16 | 17 | Running 18 | ==== 19 | 20 | ./optimize-pngs.py ../bitcoin_core/src/qt/res/movies/ ../bitcoin_core/src/qt/res/icons/ ../bitcoin_core/share/pixmaps/ 21 | 22 | ''' 23 | import argparse 24 | import os 25 | import sys 26 | import subprocess 27 | import hashlib 28 | from PIL import Image 29 | 30 | def file_hash(filename): 31 | '''Return hash of raw file contents''' 32 | with open(filename, 'rb') as f: 33 | return hashlib.sha256(f.read()).hexdigest() 34 | 35 | def content_hash(filename): 36 | '''Return hash of RGBA contents of image''' 37 | i = Image.open(filename) 38 | i = i.convert('RGBA') 39 | data = i.tobytes() 40 | return hashlib.sha256(data).hexdigest() 41 | 42 | zopflipng = 'zopflipng' 43 | pngcrush = 'pngcrush' 44 | git = 'git' 45 | 46 | parser = argparse.ArgumentParser() 47 | parser.add_argument('folder', nargs='+') 48 | folders = parser.parse_args().folder 49 | folders = [os.path.abspath(f) for f in folders] 50 | 51 | totalSaveBytes = 0 52 | noHashChange = True 53 | 54 | outputArray = [] 55 | for absFolder in folders: 56 | for file in os.listdir(absFolder): 57 | extension = os.path.splitext(file)[1] 58 | if extension.lower() == '.png': 59 | print("optimizing {}...".format(file), end =' ') 60 | file_path = os.path.join(absFolder, file) 61 | fileMetaMap = {'file' : file, 'osize': os.path.getsize(file_path), 'sha256Old' : file_hash(file_path)} 62 | contentHashPre = content_hash(file_path) 63 | 64 | try: 65 | subprocess.call([pngcrush, "-brute", "-ow", "-rem", "gAMA", "-rem", "cHRM", "-rem", "iCCP", "-rem", "sRGB", "-rem", "alla", "-rem", "text", file_path], 66 | stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL) 67 | subprocess.call([zopflipng, '-m', '-y', file_path, file_path], 68 | stdout=subprocess.DEVNULL, 
stderr=subprocess.DEVNULL) 69 | except FileNotFoundError: 70 | print("pngcrush or zopflipng is not installed, aborting...") 71 | sys.exit(1) 72 | 73 | # verify 74 | if "Not a PNG file" in subprocess.check_output([pngcrush, "-n", "-v", file_path], stderr=subprocess.STDOUT, universal_newlines=True, encoding='utf8'): 75 | print("PNG file "+file+" is corrupted after crushing, check your pngcrush version") 76 | sys.exit(1) 77 | 78 | fileMetaMap['sha256New'] = file_hash(file_path) 79 | contentHashPost = content_hash(file_path) 80 | 81 | if contentHashPre != contentHashPost: 82 | print("Image contents of PNG file {} before and after crushing don't match".format(file)) 83 | sys.exit(1) 84 | 85 | fileMetaMap['psize'] = os.path.getsize(file_path) 86 | outputArray.append(fileMetaMap) 87 | print("done") 88 | 89 | print("summary:\n+++++++++++++++++") 90 | for fileDict in outputArray: 91 | oldHash = fileDict['sha256Old'] 92 | newHash = fileDict['sha256New'] 93 | totalSaveBytes += fileDict['osize'] - fileDict['psize'] 94 | noHashChange = noHashChange and (oldHash == newHash) 95 | print(fileDict['file']+"\n size diff from: "+str(fileDict['osize'])+" to: "+str(fileDict['psize'])+"\n old sha256: "+oldHash+"\n new sha256: "+newHash+"\n") 96 | 97 | print("completed. Checksum stable: "+str(noHashChange)+"."
Total reduction: "+str(totalSaveBytes)+" bytes") 98 | -------------------------------------------------------------------------------- /patches/stripbuildinfo.patch: -------------------------------------------------------------------------------- 1 | diff --git a/src/bench/bench.h b/src/bench/bench.h 2 | index 63e1bf67e2..358dd663f8 100644 3 | --- a/src/bench/bench.h 4 | +++ b/src/bench/bench.h 5 | @@ -77,6 +77,6 @@ public: 6 | 7 | // BENCHMARK(foo) expands to: benchmark::BenchRunner bench_11foo("foo", foo, priority_level); 8 | #define BENCHMARK(n, priority_level) \ 9 | - benchmark::BenchRunner PASTE2(bench_, PASTE2(__LINE__, n))(STRINGIZE(n), n, priority_level); 10 | + benchmark::BenchRunner PASTE2(bench_, PASTE2(0, n))(STRINGIZE(n), n, priority_level); 11 | 12 | #endif // BITCOIN_BENCH_BENCH_H 13 | diff --git a/src/clientversion.cpp b/src/clientversion.cpp 14 | index 192e9c52bc..e257059624 100644 15 | --- a/src/clientversion.cpp 16 | +++ b/src/clientversion.cpp 17 | @@ -53,7 +53,7 @@ static std::string FormatVersion(int nVersion) 18 | 19 | std::string FormatFullVersion() 20 | { 21 | - static const std::string CLIENT_BUILD(BUILD_DESC BUILD_SUFFIX); 22 | + static const std::string CLIENT_BUILD(""); 23 | return CLIENT_BUILD; 24 | } 25 | 26 | diff --git a/src/compat/assumptions.h b/src/compat/assumptions.h 27 | index 92615b582a..611894be19 100644 28 | --- a/src/compat/assumptions.h 29 | +++ b/src/compat/assumptions.h 30 | @@ -13,9 +13,6 @@ 31 | // Assumption: We assume that the macro NDEBUG is not defined. 32 | // Example(s): We use assert(...) extensively with the assumption of it never 33 | // being a noop at runtime. 34 | -#if defined(NDEBUG) 35 | -# error "Bitcoin cannot be compiled without assertions." 36 | -#endif 37 | 38 | // Assumption: We assume a C++17 (ISO/IEC 14882:2017) compiler (minimum requirement). 
39 | // Example(s): We assume the presence of C++17 features everywhere :-) 40 | diff --git a/src/logging.h b/src/logging.h 41 | index 14a0f08f8d..32d4b7c708 100644 42 | --- a/src/logging.h 43 | +++ b/src/logging.h 44 | @@ -230,7 +230,7 @@ static inline void LogPrintf_(const std::string& logging_function, const std::st 45 | } 46 | } 47 | 48 | -#define LogPrintLevel_(category, level, ...) LogPrintf_(__func__, __FILE__, __LINE__, category, level, __VA_ARGS__) 49 | +#define LogPrintLevel_(category, level, ...) LogPrintf_(__func__, __FILE__, 0, category, level, __VA_ARGS__) 50 | 51 | // Log unconditionally. 52 | #define LogPrintf(...) LogPrintLevel_(BCLog::LogFlags::NONE, BCLog::Level::None, __VA_ARGS__) 53 | diff --git a/src/rest.cpp b/src/rest.cpp 54 | index a10d8a433f..48736c9b9d 100644 55 | --- a/src/rest.cpp 56 | +++ b/src/rest.cpp 57 | @@ -87,7 +87,7 @@ static NodeContext* GetNodeContext(const std::any& context, HTTPRequest* req) 58 | strprintf("%s:%d (%s)\n" 59 | "Internal bug detected: Node context not found!\n" 60 | "You may report this issue here: %s\n", 61 | - __FILE__, __LINE__, __func__, PACKAGE_BUGREPORT)); 62 | + __FILE__, 0, __func__, PACKAGE_BUGREPORT)); 63 | return nullptr; 64 | } 65 | return node_context; 66 | @@ -125,7 +125,7 @@ static ChainstateManager* GetChainman(const std::any& context, HTTPRequest* req) 67 | strprintf("%s:%d (%s)\n" 68 | "Internal bug detected: Chainman disabled or instance not found!\n" 69 | "You may report this issue here: %s\n", 70 | - __FILE__, __LINE__, __func__, PACKAGE_BUGREPORT)); 71 | + __FILE__, 0, __func__, PACKAGE_BUGREPORT)); 72 | return nullptr; 73 | } 74 | return node_context->chainman.get(); 75 | diff --git a/src/sync.h b/src/sync.h 76 | index 1f4e191214..88d4ade0a0 100644 77 | --- a/src/sync.h 78 | +++ b/src/sync.h 79 | @@ -140,12 +140,12 @@ using Mutex = AnnotatedMixin; 80 | */ 81 | class GlobalMutex : public Mutex { }; 82 | 83 | -#define AssertLockHeld(cs) AssertLockHeldInternal(#cs, __FILE__, __LINE__, &cs) 
84 | +#define AssertLockHeld(cs) AssertLockHeldInternal(#cs, __FILE__, 0, &cs) 85 | 86 | inline void AssertLockNotHeldInline(const char* name, const char* file, int line, Mutex* cs) EXCLUSIVE_LOCKS_REQUIRED(!cs) { AssertLockNotHeldInternal(name, file, line, cs); } 87 | inline void AssertLockNotHeldInline(const char* name, const char* file, int line, RecursiveMutex* cs) LOCKS_EXCLUDED(cs) { AssertLockNotHeldInternal(name, file, line, cs); } 88 | inline void AssertLockNotHeldInline(const char* name, const char* file, int line, GlobalMutex* cs) LOCKS_EXCLUDED(cs) { AssertLockNotHeldInternal(name, file, line, cs); } 89 | -#define AssertLockNotHeld(cs) AssertLockNotHeldInline(#cs, __FILE__, __LINE__, &cs) 90 | +#define AssertLockNotHeld(cs) AssertLockNotHeldInline(#cs, __FILE__, 0, &cs) 91 | 92 | /** Wrapper around std::unique_lock style lock for MutexType. */ 93 | template 94 | @@ -241,7 +241,7 @@ public: 95 | friend class reverse_lock; 96 | }; 97 | 98 | -#define REVERSE_LOCK(g) typename std::decay::type::reverse_lock UNIQUE_NAME(revlock)(g, #g, __FILE__, __LINE__) 99 | +#define REVERSE_LOCK(g) typename std::decay::type::reverse_lock UNIQUE_NAME(revlock)(g, #g, __FILE__, 0) 100 | 101 | // When locking a Mutex, require negative capability to ensure the lock 102 | // is not already held 103 | @@ -255,23 +255,23 @@ inline MutexType& MaybeCheckNotHeld(MutexType& m) LOCKS_EXCLUDED(m) LOCK_RETURNE 104 | template 105 | inline MutexType* MaybeCheckNotHeld(MutexType* m) LOCKS_EXCLUDED(m) LOCK_RETURNED(m) { return m; } 106 | 107 | -#define LOCK(cs) UniqueLock UNIQUE_NAME(criticalblock)(MaybeCheckNotHeld(cs), #cs, __FILE__, __LINE__) 108 | +#define LOCK(cs) UniqueLock UNIQUE_NAME(criticalblock)(MaybeCheckNotHeld(cs), #cs, __FILE__, 0) 109 | #define LOCK2(cs1, cs2) \ 110 | - UniqueLock criticalblock1(MaybeCheckNotHeld(cs1), #cs1, __FILE__, __LINE__); \ 111 | - UniqueLock criticalblock2(MaybeCheckNotHeld(cs2), #cs2, __FILE__, __LINE__) 112 | -#define TRY_LOCK(cs, name) UniqueLock 
name(MaybeCheckNotHeld(cs), #cs, __FILE__, __LINE__, true) 113 | -#define WAIT_LOCK(cs, name) UniqueLock name(MaybeCheckNotHeld(cs), #cs, __FILE__, __LINE__) 114 | + UniqueLock criticalblock1(MaybeCheckNotHeld(cs1), #cs1, __FILE__, 0); \ 115 | + UniqueLock criticalblock2(MaybeCheckNotHeld(cs2), #cs2, __FILE__, 0) 116 | +#define TRY_LOCK(cs, name) UniqueLock name(MaybeCheckNotHeld(cs), #cs, __FILE__, 0, true) 117 | +#define WAIT_LOCK(cs, name) UniqueLock name(MaybeCheckNotHeld(cs), #cs, __FILE__, 0) 118 | 119 | #define ENTER_CRITICAL_SECTION(cs) \ 120 | { \ 121 | - EnterCritical(#cs, __FILE__, __LINE__, &cs); \ 122 | + EnterCritical(#cs, __FILE__, 0, &cs); \ 123 | (cs).lock(); \ 124 | } 125 | 126 | #define LEAVE_CRITICAL_SECTION(cs) \ 127 | { \ 128 | std::string lockname; \ 129 | - CheckLastCritical((void*)(&cs), lockname, #cs, __FILE__, __LINE__); \ 130 | + CheckLastCritical((void*)(&cs), lockname, #cs, __FILE__, 0); \ 131 | (cs).unlock(); \ 132 | LeaveCritical(); \ 133 | } 134 | diff --git a/src/util/check.h b/src/util/check.h 135 | index b6c03bed2a..b4a1380061 100644 136 | --- a/src/util/check.h 137 | +++ b/src/util/check.h 138 | @@ -38,11 +38,7 @@ T&& inline_check_non_fatal(LIFETIMEBOUND T&& val, const char* file, int line, co 139 | * caller, which can then report the issue to the developers. 140 | */ 141 | #define CHECK_NONFATAL(condition) \ 142 | - inline_check_non_fatal(condition, __FILE__, __LINE__, __func__, #condition) 143 | - 144 | -#if defined(NDEBUG) 145 | -#error "Cannot compile without assertions!" 146 | -#endif 147 | + inline_check_non_fatal(condition, __FILE__, 0, __func__, #condition) 148 | 149 | /** Helper for Assert() */ 150 | void assertion_fail(const char* file, int line, const char* func, const char* assertion); 151 | @@ -64,7 +60,7 @@ T&& inline_assertion_check(LIFETIMEBOUND T&& val, [[maybe_unused]] const char* f 152 | } 153 | 154 | /** Identity function. 
Abort if the value compares equal to zero */ 155 | -#define Assert(val) inline_assertion_check(val, __FILE__, __LINE__, __func__, #val) 156 | +#define Assert(val) inline_assertion_check(val, __FILE__, 0, __func__, #val) 157 | 158 | /** 159 | * Assume is the identity function. 160 | @@ -76,13 +72,13 @@ T&& inline_assertion_check(LIFETIMEBOUND T&& val, [[maybe_unused]] const char* f 161 | * - For non-fatal errors in interactive sessions (e.g. RPC or command line 162 | * interfaces), CHECK_NONFATAL() might be more appropriate. 163 | */ 164 | -#define Assume(val) inline_assertion_check(val, __FILE__, __LINE__, __func__, #val) 165 | +#define Assume(val) inline_assertion_check(val, __FILE__, 0, __func__, #val) 166 | 167 | /** 168 | * NONFATAL_UNREACHABLE() is a macro that is used to mark unreachable code. It throws a NonFatalCheckError. 169 | */ 170 | #define NONFATAL_UNREACHABLE() \ 171 | throw NonFatalCheckError( \ 172 | - "Unreachable code reached (non-fatal)", __FILE__, __LINE__, __func__) 173 | + "Unreachable code reached (non-fatal)", __FILE__, 0, __func__) 174 | 175 | #endif // BITCOIN_UTIL_CHECK_H 176 | -------------------------------------------------------------------------------- /release-prep.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | # This script will do code related release preparation stuff for Bitcoin Core as specified in 4 | # the release-process.md file. 5 | # This should be run from the folder containing the Source tree 6 | # The following actions will be done: 7 | # 1. Set the version numbers as specified in the arguments 8 | # 2. Update src/chainparams.cpp nMinimumChainWork and defaultAssumeValid with information from the getblockchaininfo rpc. 9 | # 3. Update Hard coded seeds 10 | # 4. Set BLOCK_CHAIN_SIZE 11 | # 5. Update translations 12 | # 6. 
Generate updated manpages 13 | # Note: Step 2 assumes that an up-to-date Bitcoin Core is running and has been built in the 14 | # directory in which this script is being run. 15 | scriptName=$(basename "$0") 16 | # Variables 17 | VERSION= 18 | BLOCK_CHAIN_SIZE= 19 | DATADIR="" 20 | verBump=true 21 | chainparamsUpdate=true 22 | seedUpdate=true 23 | blockchainsizeBump=true 24 | translationsUpdate=true 25 | genManpages=true 26 | 27 | # Help Message 28 | read -d '' usage <<- EOF 29 | Usage: $scriptName version block_chain_size 30 | 31 | Run this script from the Bitcoin Core Source root directory. This requires a current version of Bitcoin Core 32 | to be running at the time that this script is run. 33 | 34 | Arguments: 35 | version Version number to set following the MAJOR.MINOR.REVISION format. Only required if 36 | a version bump will be done. e.g. 0.14.0 37 | block_chain_size The size of the blockchain for the intro display. Should contain a little bit of 38 | overhead. Only required if BLOCK_CHAIN_SIZE will be updated. e.g. 120 39 | 40 | Options: 41 | --datadir The path to the data directory of the running Bitcoin Core node. Note that this is 42 | different from Bitcoin Core's -datadir option syntax. There is no equals, simply a space 43 | followed by the path 44 | 45 | --skip [v|c|s|b|t|m] Skip the specified steps. v=version bump; c=update nMinimumChainwork and defaultAssumeValid 46 | s=hard coded seed update; b=blockchain size bump; t=translations update; m=generate manpages. 47 | The steps will be done in the order listed above.
48 | 49 | EOF 50 | 51 | # Get options and arguments 52 | while :; do 53 | case $1 in 54 | # datadir 55 | --datadir) 56 | if [ -n "$2" ] 57 | then 58 | DATADIR="-datadir=$2" 59 | shift 60 | else 61 | echo 'Error: "--datadir" requires an argument' 62 | exit 1 63 | fi 64 | ;; 65 | # skips 66 | --skip) 67 | if [ -n "$2" ] 68 | then 69 | if [[ "$2" = *"v"* ]] 70 | then 71 | verBump=false 72 | fi 73 | if [[ "$2" = *"c"* ]] 74 | then 75 | chainparamsUpdate=false 76 | fi 77 | if [[ "$2" = *"s"* ]] 78 | then 79 | seedUpdate=false 80 | fi 81 | if [[ "$2" = *"b"* ]] 82 | then 83 | blockchainsizeBump=false 84 | fi 85 | if [[ "$2" = *"t"* ]] 86 | then 87 | translationsUpdate=false 88 | fi 89 | if [[ "$2" = *"m"* ]] 90 | then 91 | genManpages=false 92 | fi 93 | shift 94 | else 95 | echo 'Error: "--skip" requires an argument' 96 | exit 1 97 | fi 98 | ;; 99 | *) # Default case: If no more options then break out of the loop. 100 | break 101 | esac 102 | shift 103 | done 104 | 105 | if [[ $verBump = true ]] 106 | then 107 | # Bump Version numbers 108 | # Get version 109 | if [[ -n "$1" ]] 110 | then 111 | VERSION=$1 112 | shift 113 | fi 114 | 115 | # Check that a version is specified 116 | if [[ $VERSION == "" ]] 117 | then 118 | echo "$scriptName: Missing version." 119 | echo "Try $scriptName --help for more information" 120 | exit 1 121 | fi 122 | 123 | echo "Setting Version number" 124 | major=$(echo $VERSION | cut -d. -f1) 125 | minor=$(echo $VERSION | cut -d. -f2) 126 | rev=$(echo $VERSION | cut -d. 
-f3) 127 | 128 | # configure.ac 129 | sed -i "/define(_CLIENT_VERSION_MAJOR, /c\define(_CLIENT_VERSION_MAJOR, $major)" ./configure.ac 130 | sed -i "/define(_CLIENT_VERSION_MINOR, /c\define(_CLIENT_VERSION_MINOR, $minor)" ./configure.ac 131 | sed -i "/define(_CLIENT_VERSION_REVISION, /c\define(_CLIENT_VERSION_REVISION, $rev)" ./configure.ac 132 | sed -i "/define(_CLIENT_VERSION_IS_RELEASE, /c\define(_CLIENT_VERSION_IS_RELEASE, true)" ./configure.ac 133 | 134 | # src/clientversion.h 135 | sed -i "/#define CLIENT_VERSION_MAJOR /c\#define CLIENT_VERSION_MAJOR $major" ./src/clientversion.h 136 | sed -i "/#define CLIENT_VERSION_MINOR /c\#define CLIENT_VERSION_MINOR $minor" ./src/clientversion.h 137 | sed -i "/#define CLIENT_VERSION_REVISION /c\#define CLIENT_VERSION_REVISION $rev" ./src/clientversion.h 138 | sed -i "/#define CLIENT_VERSION_IS_RELEASE /c\#define CLIENT_VERSION_IS_RELEASE true" ./src/clientversion.h 139 | 140 | # docs 141 | sed -i "/PROJECT_NUMBER = /c\PROJECT_NUMBER = $VERSION" ./doc/Doxyfile 142 | sed -i "1s/.*/Bitcoin Core $VERSION/" ./doc/README.md 143 | sed -i "1s/.*/Bitcoin Core $VERSION/" ./doc/README_windows.txt 144 | 145 | # gitian descriptors 146 | sed -i "2s/.*/name: \"bitcoin-win-$major.$minor\"/" ./contrib/gitian-descriptors/gitian-win.yml 147 | sed -i "2s/.*/name: \"bitcoin-linux-$major.$minor\"/" ./contrib/gitian-descriptors/gitian-linux.yml 148 | sed -i "2s/.*/name: \"bitcoin-osx-$major.$minor\"/" ./contrib/gitian-descriptors/gitian-osx.yml 149 | fi 150 | 151 | if [[ $chainparamsUpdate = true ]] 152 | then 153 | # Update nMinimumChainWork and defaultAssumeValid 154 | echo "Updating nMinimumChainWork and defaultAssumeValid" 155 | blockchaininfo=`src/bitcoin-cli ${DATADIR} getblockchaininfo` 156 | chainwork=`echo "$blockchaininfo" | jq -r '.chainwork'` 157 | bestblockhash=`echo "$blockchaininfo" | jq -r '.bestblockhash'` 158 | sed -i "0,/ consensus.nMinimumChainWork = uint256S(.*/s// consensus.nMinimumChainWork = uint256S(\"0x$chainwork\");/" 
./src/chainparams.cpp 159 | sed -i "0,/ consensus.defaultAssumeValid = uint256S(.*/s// consensus.defaultAssumeValid = uint256S(\"0x$bestblockhash\");/" ./src/chainparams.cpp 160 | fi 161 | 162 | if [[ $seedUpdate = true ]] 163 | then 164 | # Update Seeds 165 | echo "Updating hard coded seeds" 166 | pushd ./contrib/seeds 167 | curl -s http://bitcoin.sipa.be/seeds.txt > seeds_main.txt 168 | python makeseeds.py < seeds_main.txt > nodes_main.txt 169 | python generate-seeds.py . > ../../src/chainparamsseeds.h 170 | popd 171 | fi 172 | 173 | if [[ $blockchainsizeBump = true ]] 174 | then 175 | # Set blockchain size 176 | # Get block_chain_size 177 | if [[ -n "$1" ]] 178 | then 179 | BLOCK_CHAIN_SIZE=$1 180 | shift 181 | fi 182 | 183 | # Check that a block_chain_size is specified 184 | if [[ $BLOCK_CHAIN_SIZE == "" ]] 185 | then 186 | echo "$scriptName: Missing block_chain_size." 187 | echo "Try $scriptName --help for more information" 188 | exit 1 189 | fi 190 | echo "Setting BLOCK_CHAIN_SIZE" 191 | sed -i "/static const uint64_t BLOCK_CHAIN_SIZE = /c\static const uint64_t BLOCK_CHAIN_SIZE = $BLOCK_CHAIN_SIZE;" ./src/qt/intro.cpp 192 | fi 193 | 194 | if [[ $translationsUpdate = true ]] 195 | then 196 | # Update translations 197 | echo "Updating translations" 198 | python contrib/devtools/update-translations.py 199 | ls src/qt/locale/*ts|xargs -n1 basename|sed 's/\(bitcoin_\(.*\)\).ts/locale\/\1.qm<\/file>/' 200 | ls src/qt/locale/*ts|xargs -n1 basename|sed 's/\(bitcoin_\(.*\)\).ts/ qt\/locale\/\1.ts \\/' 201 | fi 202 | 203 | if [[ $genManpages = true ]] 204 | then 205 | # Generate manpages 206 | echo "Generating manpages" 207 | bash ./contrib/devtools/gen-manpages.sh 208 | fi 209 | 210 | # Complete 211 | echo "Release preparation complete. 
Please use git to commit these changes" 212 | -------------------------------------------------------------------------------- /signoff.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/python3 2 | ''' 3 | This is a utility to manually add a treehash to the top commit and 4 | then gpg-sign it. 5 | ''' 6 | from treehash512 import tree_sha512sum, GIT 7 | import subprocess, sys 8 | HDR = b'Tree-SHA512' 9 | def main(): 10 | commit = 'HEAD' 11 | h = tree_sha512sum(commit).encode() 12 | msg = subprocess.check_output([GIT, 'show', '-s', '--format=%B', commit]).rstrip() 13 | 14 | curh = None 15 | for line in msg.splitlines(): 16 | if line.startswith(HDR+b':'): 17 | assert(curh is None) # no multiple treehashes 18 | curh = line[len(HDR)+1:].strip() 19 | 20 | if curh == h: 21 | print('Warning: already has a (valid) treehash') 22 | if subprocess.call([GIT, 'verify-commit', commit]) == 0: 23 | # already signed, too, just exit 24 | sys.exit(0) 25 | elif curh is not None: 26 | print('Error: already has a treehash, which mismatches. Something is wrong.
Remove it first.') 27 | sys.exit(1) 28 | else: # no treehash 29 | msg += b'\n\n' + HDR + b': ' + h 30 | 31 | print(msg.decode()) 32 | 33 | subprocess.check_call([GIT, 'commit', '--amend', '--gpg-sign', '-m', msg, '--no-edit']) 34 | rv = subprocess.call([GIT, 'verify-commit', commit]) 35 | sys.exit(rv) 36 | 37 | if __name__ == '__main__': 38 | main() 39 | -------------------------------------------------------------------------------- /termlib/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/bitcoin-core/bitcoin-maintainer-tools/e626b57997e539254185193e4bab0d28ee24d6bb/termlib/__init__.py -------------------------------------------------------------------------------- /termlib/attr.py: -------------------------------------------------------------------------------- 1 | class Attr: 2 | ''' 3 | Terminal output attributes. 4 | ''' 5 | def fg(r, g, b): 6 | return f'\x1b[38;2;{r};{g};{b}m' 7 | 8 | def bg(r, g, b): 9 | return f'\x1b[48;2;{r};{g};{b}m' 10 | 11 | def fg_hex(color): 12 | if color[0] == '#': 13 | color = color[1:] 14 | assert(len(color) == 6) 15 | return Attr.fg(int(color[0:2], 16), int(color[2:4], 16), int(color[4:6], 16)) 16 | 17 | def bg_hex(color): 18 | if color[0] == '#': 19 | color = color[1:] 20 | assert(len(color) == 6) 21 | return Attr.bg(int(color[0:2], 16), int(color[2:4], 16), int(color[4:6], 16)) 22 | 23 | def close(attr): 24 | # Close an attribute (restore terminal to "neutral") 25 | return Attr.RESET 26 | 27 | RESET = '\x1b[0m' 28 | BOLD = '\033[1m' 29 | CLEAR = '\x1b[H\x1b[J\x1b[3J' 30 | UNDERLINE = '\x1b[4m' 31 | REVERSE = '\033[7m' 32 | -------------------------------------------------------------------------------- /termlib/input.py: -------------------------------------------------------------------------------- 1 | ''' 2 | Terminal input handling (based on bpytop). 
3 | ''' 4 | import fcntl 5 | import logging 6 | import os 7 | from select import select 8 | import signal 9 | import sys 10 | import termios 11 | import threading 12 | import tty 13 | from typing import List, Set, Dict, Tuple, Optional, Union, Any, Callable, ContextManager, Iterable, Type, NamedTuple 14 | from time import sleep 15 | errlog = logging.getLogger("ErrorLogger") 16 | errlog.setLevel(logging.DEBUG) 17 | 18 | class Term: 19 | """Terminal info and commands""" 20 | hide_cursor = "\033[?25l" #* Hide terminal cursor 21 | show_cursor = "\033[?25h" #* Show terminal cursor 22 | alt_screen = "\033[?1049h" #* Switch to alternate screen 23 | normal_screen = "\033[?1049l" #* Switch to normal screen 24 | clear = "\033[2J\033[0;0f" #* Clear screen and set cursor to position 0,0 25 | mouse_on = "\033[?1002h\033[?1015h\033[?1006h" #* Enable reporting of mouse position on click and release 26 | mouse_off = "\033[?1002l" #* Disable mouse reporting 27 | mouse_direct_on = "\033[?1003h" #* Enable reporting of mouse position at any movement 28 | mouse_direct_off = "\033[?1003l" #* Disable direct mouse reporting 29 | 30 | class Raw(object): 31 | """Set raw input mode for device""" 32 | def __init__(self, stream): 33 | self.stream = stream 34 | self.fd = self.stream.fileno() 35 | def __enter__(self): 36 | self.original_stty = termios.tcgetattr(self.stream) 37 | tty.setcbreak(self.stream) 38 | def __exit__(self, type, value, traceback): 39 | termios.tcsetattr(self.stream, termios.TCSANOW, self.original_stty) 40 | 41 | class Nonblocking(object): 42 | """Set nonblocking mode for device""" 43 | def __init__(self, stream): 44 | self.stream = stream 45 | self.fd = self.stream.fileno() 46 | def __enter__(self): 47 | self.orig_fl = fcntl.fcntl(self.fd, fcntl.F_GETFL) 48 | fcntl.fcntl(self.fd, fcntl.F_SETFL, self.orig_fl | os.O_NONBLOCK) 49 | def __exit__(self, *args): 50 | fcntl.fcntl(self.fd, fcntl.F_SETFL, self.orig_fl) 51 | 52 | class Key: 53 | """Handles the threaded input reader for keypresses and mouse
events""" 54 | list: List[str] = [] 55 | mouse: Dict[str, List[List[int]]] = {} 56 | mouse_pos: Tuple[int, int] = (0, 0) 57 | escape: Dict[Union[str, Tuple[str, str]], str] = { 58 | "\n" : "enter", 59 | ("\x7f", "\x08") : "backspace", 60 | ("[A", "OA") : "up", 61 | ("[B", "OB") : "down", 62 | ("[D", "OD") : "left", 63 | ("[C", "OC") : "right", 64 | "[2~" : "insert", 65 | "[3~" : "delete", 66 | "[H" : "home", 67 | "[F" : "end", 68 | "[5~" : "page_up", 69 | "[6~" : "page_down", 70 | "\t" : "tab", 71 | "[Z" : "shift_tab", 72 | "OP" : "f1", 73 | "OQ" : "f2", 74 | "OR" : "f3", 75 | "OS" : "f4", 76 | "[15" : "f5", 77 | "[17" : "f6", 78 | "[18" : "f7", 79 | "[19" : "f8", 80 | "[20" : "f9", 81 | "[21" : "f10", 82 | "[23" : "f11", 83 | "[24" : "f12" 84 | } 85 | new = threading.Event() 86 | idle = threading.Event() 87 | mouse_move = threading.Event() 88 | mouse_report: bool = False 89 | idle.set() 90 | stopping: bool = False 91 | started: bool = False 92 | reader: threading.Thread 93 | 94 | @classmethod 95 | def start(cls, hide_cursor=False): 96 | signal.signal(signal.SIGWINCH, cls._resize_handler) 97 | cls.stopping = False 98 | cls.reader = threading.Thread(target=cls._get_key) 99 | cls.reader.start() 100 | cls.started = True 101 | sys.stdout.write(Term.mouse_on) 102 | if hide_cursor: 103 | sys.stdout.write(Term.hide_cursor) 104 | sys.stdout.flush() 105 | 106 | @classmethod 107 | def stop(cls): 108 | sys.stdout.write(Term.mouse_off + Term.show_cursor) 109 | sys.stdout.flush() 110 | if cls.started and cls.reader.is_alive(): 111 | cls.stopping = True 112 | try: 113 | cls.reader.join() 114 | except: 115 | pass 116 | 117 | @classmethod 118 | def last(cls) -> str: 119 | if cls.list: return cls.list.pop() 120 | else: return "" 121 | 122 | @classmethod 123 | def get(cls) -> str: 124 | if cls.list: return cls.list.pop(0) 125 | else: return "" 126 | 127 | @classmethod 128 | def get_mouse(cls) -> Tuple[int, int]: 129 | if cls.new.is_set(): 130 | cls.new.clear() 131 | return 
cls.mouse_pos 132 | 133 | @classmethod 134 | def mouse_moved(cls) -> bool: 135 | if cls.mouse_move.is_set(): 136 | cls.mouse_move.clear() 137 | return True 138 | else: 139 | return False 140 | 141 | @classmethod 142 | def has_key(cls) -> bool: 143 | return bool(cls.list) 144 | 145 | @classmethod 146 | def clear(cls): 147 | cls.list = [] 148 | 149 | @classmethod 150 | def input_wait(cls, sec: float = 0.0, mouse: bool = False) -> bool: 151 | '''Returns True if key is detected else waits out timer and returns False''' 152 | if cls.list: return True 153 | if mouse: Draw.now(Term.mouse_direct_on) 154 | cls.new.wait(sec if sec > 0 else 0.0) 155 | if mouse: Draw.now(Term.mouse_direct_off, Term.mouse_on) 156 | 157 | if cls.new.is_set(): 158 | cls.new.clear() 159 | return True 160 | else: 161 | return False 162 | 163 | @classmethod 164 | def break_wait(cls): 165 | cls.list.append("_null") 166 | cls.new.set() 167 | sleep(0.01) 168 | cls.new.clear() 169 | 170 | @classmethod 171 | def _resize_handler(cls, signum, frame): 172 | cls.list.append('resize') 173 | cls.new.set() 174 | 175 | @classmethod 176 | def _get_key(cls): 177 | """Get a key or escape sequence from stdin, convert to readable format and save to keys list. 
Meant to be run in its own thread.""" 178 | input_key: str = "" 179 | clean_key: str = "" 180 | try: 181 | while not cls.stopping: 182 | with Raw(sys.stdin): 183 | if not select([sys.stdin], [], [], 0.1)[0]: #* Wait 100ms for input on stdin then restart loop to check for stop flag 184 | continue 185 | input_key += sys.stdin.read(1) #* Read 1 key safely with blocking on 186 | if input_key == "\033": #* If first character is an escape sequence, keep reading 187 | cls.idle.clear() #* Report IO block in progress to prevent Draw functions from getting an IO Block error 188 | # Draw.idle.wait() #* Wait for Draw function to finish if busy 189 | with Nonblocking(sys.stdin): #* Set non-blocking to prevent read stall 190 | input_key += sys.stdin.read(20) 191 | if input_key.startswith("\033[<"): 192 | _ = sys.stdin.read(1000) 193 | cls.idle.set() #* Report IO blocking done 194 | #errlog.debug(f'{repr(input_key)}') 195 | if input_key == "\033": clean_key = "escape" #* Key is "escape" key if only containing \033 196 | elif input_key.startswith(("\033[<0;", "\033[<35;", "\033[<64;", "\033[<65;")): #* Detected mouse event 197 | try: 198 | cls.mouse_pos = (int(input_key.split(";")[1]) - 1, int(input_key.split(";")[2].rstrip("mM")) - 1) 199 | except: 200 | pass 201 | else: 202 | if input_key.startswith("\033[<35;"): #* Detected mouse move in mouse direct mode 203 | cls.mouse_move.set() 204 | cls.new.set() 205 | elif input_key.startswith("\033[<64;"): #* Detected mouse scroll up 206 | clean_key = "mouse_scroll_up" 207 | elif input_key.startswith("\033[<65;"): #* Detected mouse scroll down 208 | clean_key = "mouse_scroll_down" 209 | elif input_key.startswith("\033[<0;") and input_key.endswith("m"): #* Detected mouse click release 210 | clean_key = "mouse_click" 211 | elif input_key == "\\": clean_key = "\\" #* Clean up "\" to not return escaped 212 | else: 213 | for code in cls.escape.keys(): #* Go through dict of escape codes to get the cleaned key name 214 | if 
input_key.lstrip("\033").startswith(code): 215 | clean_key = cls.escape[code] 216 | break 217 | else: #* If not found in escape dict and length of key is 1, assume regular character 218 | if len(input_key) == 1: 219 | clean_key = input_key 220 | if clean_key: 221 | cls.list.append(clean_key) #* Store up to 10 keys in input queue for later processing 222 | if len(cls.list) > 10: del cls.list[0] 223 | clean_key = "" 224 | cls.new.set() #* Set threading event to interrupt main thread sleep 225 | input_key = "" 226 | 227 | 228 | except Exception as e: 229 | errlog.exception(f'Input thread failed with exception: {e}') 230 | cls.idle.set() 231 | cls.list.clear() 232 | clean_quit(1, thread=True) 233 | 234 | def clean_quit(errcode: int = 0, errmsg: str = "", thread: bool = False): 235 | sys.exit(errcode) 236 | -------------------------------------------------------------------------------- /termlib/tableprinter.py: -------------------------------------------------------------------------------- 1 | from collections import namedtuple 2 | import enum 3 | import unicodedata 4 | 5 | class Align(enum.Enum): 6 | LEFT = 0 7 | CENTER = 1 8 | RIGHT = 2 9 | 10 | # from https://bugs.python.org/msg145523 11 | W = { 12 | 'F': 2, # full-width, width 2, compatibility character for a narrow char 13 | 'H': 1, # half-width, width 1, compatibility character for a narrow char 14 | 'W': 2, # wide, width 2 15 | 'Na': 1, # narrow, width 1 16 | 'A': 1, # ambiguous; width 2 in Asian context, width 1 in non-Asian context 17 | 'N': 1, # neutral; not used in Asian text, so has no width. Practically, width can be considered as 1 18 | } 19 | 20 | def get_width(s): 21 | # TODO: 22 | # - Handle embedded attributes 23 | return sum(W[unicodedata.east_asian_width(ch)] for ch in s) 24 | 25 | def crop(s, width): 26 | '''Crop a string to a certain visual length.''' 27 | # TODO: 28 | # - Handle embedded attributes 29 | # - Ellipsis … ? 
30 | o = 0 31 | l = 0 32 | for ch in s: 33 | w = W[unicodedata.east_asian_width(ch)] 34 | if l + w > width: 35 | break 36 | l += w 37 | o += 1 38 | 39 | return (s[0:o], l) 40 | 41 | def pad(s, width, align): 42 | # TODO: 43 | # - Handle embedded attributes 44 | s, swidth = crop(s, width) 45 | padding = width - swidth 46 | if align == Align.LEFT: 47 | return s + (' ' * padding) 48 | elif align == Align.RIGHT: 49 | return (' ' * padding) + s 50 | elif align == Align.CENTER: 51 | return (' ' * (padding // 2)) + s + (' ' * ((padding + 1) // 2)) 52 | 53 | class Column: 54 | def __init__(self, title, width, align=Align.LEFT): 55 | self.title = title 56 | self.width = width 57 | self.align = align 58 | 59 | ColumnInfo = namedtuple('ColumnInfo', ['x', 'width']) 60 | 61 | class TablePrinter: 62 | def __init__(self, out, attr, columns): 63 | self.out = out 64 | self.attr = attr 65 | self.columns = columns 66 | 67 | def format_row(self, rec): 68 | return ' '.join((entry[0] + pad(str(entry[1]), col.width, col.align) + self.attr.close(entry[0]) for entry, col in zip(rec, self.columns))) 69 | 70 | def print_row(self, rec): 71 | self.out.write(f'{self.format_row(rec)}\n') 72 | 73 | def print_header(self, hdr_attr): 74 | titles = [(hdr_attr, t.title) for t in self.columns] 75 | self.out.write(f'{self.format_row(titles)}\n') 76 | 77 | def column_info(self, idx): 78 | x = 0 79 | for i in range(0, idx): 80 | x += self.columns[i].width + 1 81 | return ColumnInfo(x=x, width=self.columns[idx].width) 82 | -------------------------------------------------------------------------------- /treehash512.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | # Copyright (c) 2017 The Bitcoin Core developers 3 | # Distributed under the MIT software license, see the accompanying 4 | # file COPYING or http://www.opensource.org/licenses/mit-license.php. 
5 | ''' 6 | This script will show the SHA512 tree hash for a certain commit, or HEAD 7 | by default. 8 | 9 | Usage: 10 | 11 | treehash512.py [<commithash>] 12 | ''' 13 | import os 14 | from sys import stdin,stdout,stderr 15 | import sys 16 | import argparse 17 | import hashlib 18 | import subprocess 19 | 20 | # External tools (can be overridden using environment) 21 | GIT = os.getenv('GIT','git') 22 | 23 | def tree_sha512sum(commit='HEAD'): 24 | # request metadata for entire tree, recursively 25 | files = [] 26 | blob_by_name = {} 27 | for line in subprocess.check_output([GIT, 'ls-tree', '--full-tree', '-r', commit]).splitlines(): 28 | name_sep = line.index(b'\t') 29 | metadata = line[:name_sep].split() # perms, 'blob', blobid 30 | assert(metadata[1] == b'blob') 31 | name = line[name_sep+1:] 32 | files.append(name) 33 | blob_by_name[name] = metadata[2] 34 | 35 | files.sort() 36 | # open connection to git-cat-file in batch mode to request data for all blobs 37 | # this is much faster than launching it per file 38 | p = subprocess.Popen([GIT, 'cat-file', '--batch'], stdout=subprocess.PIPE, stdin=subprocess.PIPE) 39 | overall = hashlib.sha512() 40 | for f in files: 41 | blob = blob_by_name[f] 42 | # request blob 43 | p.stdin.write(blob + b'\n') 44 | p.stdin.flush() 45 | # read header: blob, "blob", size 46 | reply = p.stdout.readline().split() 47 | assert(reply[0] == blob and reply[1] == b'blob') 48 | size = int(reply[2]) 49 | # hash the blob data 50 | intern = hashlib.sha512() 51 | ptr = 0 52 | while ptr < size: 53 | bs = min(65536, size - ptr) 54 | piece = p.stdout.read(bs) 55 | if len(piece) == bs: 56 | intern.update(piece) 57 | else: 58 | raise IOError('Premature EOF reading git cat-file output') 59 | ptr += bs 60 | dig = intern.hexdigest() 61 | assert(p.stdout.read(1) == b'\n') # ignore LF that follows blob data 62 | # update overall hash with file hash 63 | overall.update(dig.encode("utf-8")) 64 | overall.update(" ".encode("utf-8")) 65 | overall.update(f) 66 | 
overall.update("\n".encode("utf-8")) 67 | p.stdin.close() 68 | if p.wait(): 69 | raise IOError('Non-zero return value executing git cat-file') 70 | return overall.hexdigest() 71 | 72 | def main(): 73 | if len(sys.argv)>1: 74 | commit = sys.argv[1] 75 | else: 76 | commit = 'HEAD' 77 | print(tree_sha512sum(commit)) 78 | 79 | if __name__ == '__main__': 80 | main() 81 | 82 | -------------------------------------------------------------------------------- /unittest-statistics.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | import subprocess 3 | import sys 4 | import re 5 | import os 6 | 7 | def main(): 8 | if len(sys.argv) < 2: 9 | tool = os.path.basename(sys.argv[0]) 10 | print('Usage: {} <test_bitcoin> [<test_suite>]'.format(tool)) 11 | print('For example: {} src/test/test_bitcoin wallet_tests'.format(tool)) 12 | exit(1) 13 | test_bitcoin = sys.argv[1] 14 | args = [test_bitcoin, '--log_level=test_suite'] 15 | if len(sys.argv) > 2: 16 | args += ['--run_test=' + sys.argv[2]] 17 | p = subprocess.Popen(args, stdout=subprocess.PIPE) 18 | results = [] 19 | for line in p.stdout: 20 | if not line: 21 | break 22 | line = line.decode() 23 | m = re.match('.*Leaving test case "(.*)".*: ([0-9]+)(us|mks|ms)', line) 24 | if m: 25 | if m.group(3) == 'ms': 26 | elapsed = int(m.group(2)) * 1000 27 | else: 28 | elapsed = int(m.group(2)) 29 | results.append((m.group(1), elapsed)) 30 | sys.stderr.write('.') 31 | sys.stderr.flush() 32 | sys.stderr.write('\n') 33 | sys.stderr.flush() 34 | rv = p.wait() 35 | 36 | if rv == 0: 37 | print('| {:<55} | {:^9} |'.format('Test', 'Time (μs)')) 38 | print('| {} | {}:|'.format('-'*55, '-'*9)) 39 | results.sort(key=lambda a:-a[1]) 40 | for a in results: 41 | print('| {:<55} | {:>9} |'.format('`'+a[0]+'`', a[1])) 42 | 43 | if __name__ == '__main__': 44 | main() 45 | 46 | -------------------------------------------------------------------------------- /update-translations.py: 
-------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | # Copyright (c) 2014 Wladimir J. van der Laan 3 | # Distributed under the MIT software license, see the accompanying 4 | # file COPYING or http://www.opensource.org/licenses/mit-license.php. 5 | ''' 6 | Run this script from the root of the repository to update all translations from 7 | transifex. 8 | It will do the following automatically: 9 | 10 | - remove current translations 11 | - fetch all translations using the tx tool 12 | - post-process them into valid and committable format 13 | - remove invalid control characters 14 | - remove location tags (makes diffs less noisy) 15 | - drop untranslated languages 16 | - update git for added translations 17 | - update build system 18 | ''' 19 | import subprocess 20 | import re 21 | import sys 22 | import os 23 | import io 24 | import xml.etree.ElementTree as ET 25 | 26 | # Name of Qt lconvert tool 27 | LCONVERT = 'lconvert' 28 | # Name of transifex tool 29 | TX = 'tx' 30 | # Name of source language file without extension 31 | SOURCE_LANG = 'bitcoin_en' 32 | # Directory with locale files 33 | LOCALE_DIR = 'src/qt/locale' 34 | # Minimum number of non-numerus messages for translation to be considered at all 35 | MIN_NUM_NONNUMERUS_MESSAGES = 10 36 | # Regexp to check for Bitcoin addresses 37 | ADDRESS_REGEXP = re.compile('([13]|bc1)[a-zA-Z0-9]{30,}') 38 | # Path to git 39 | GIT = os.getenv("GIT", "git") 40 | # Original content file suffix 41 | ORIGINAL_SUFFIX = '.orig' 42 | # Native Qt translation file (TS) format 43 | FORMAT_TS = '.ts' 44 | # XLIFF file format 45 | FORMAT_XLIFF = '.xlf' 46 | 47 | def check_at_repository_root(): 48 | if not os.path.exists('.git'): 49 | print('No .git directory found') 50 | print('Execute this script at the root of the repository', file=sys.stderr) 51 | sys.exit(1) 52 | 53 | def remove_current_translations(): 54 | ''' 55 | Remove current translations. 
56 | We only want the active translations that are currently on transifex. 57 | This leaves bitcoin_en.ts untouched. 58 | ''' 59 | for (_,name) in all_ts_files(): 60 | os.remove(name) 61 | 62 | def remove_orig_files(): 63 | ''' 64 | Remove temporary files that might be left behind. 65 | ''' 66 | for (_,name) in all_ts_files(suffix=ORIGINAL_SUFFIX): 67 | os.remove(name + ORIGINAL_SUFFIX) 68 | 69 | def fetch_all_translations(): 70 | if subprocess.call([TX, 'pull', '--translations', '--force', '--all']): 71 | print('Error while fetching translations', file=sys.stderr) 72 | sys.exit(1) 73 | 74 | def convert_xlf_to_ts(): 75 | xliff_files = list(all_ts_files(file_format=FORMAT_XLIFF)) # materialize generator so the emptiness check below works 76 | if not xliff_files: 77 | return False 78 | 79 | for (_, name) in xliff_files: 80 | outname = name.replace(FORMAT_XLIFF, FORMAT_TS) 81 | print('Converting %s to %s...' % (name, outname)) 82 | subprocess.check_call([LCONVERT, '-o', outname, '-i', name]) 83 | os.remove(name) 84 | return True 85 | 86 | def find_format_specifiers(s): 87 | '''Find all format specifiers in a string.''' 88 | pos = 0 89 | specifiers = [] 90 | while True: 91 | percent = s.find('%', pos) 92 | if percent < 0: 93 | break 94 | specifiers.append(s[percent+1]) 95 | pos = percent+2 96 | return specifiers 97 | 98 | def split_format_specifiers(specifiers): 99 | '''Split format specifiers between numeric (Qt) and others (strprintf)''' 100 | numeric = [] 101 | other = [] 102 | for s in specifiers: 103 | if s in {'1','2','3','4','5','6','7','8','9'}: 104 | numeric.append(s) 105 | else: 106 | other.append(s) 107 | 108 | # If both numeric format specifiers and "others" are used, assume we're dealing 109 | # with a Qt-formatted message. In the case of Qt formatting (see https://doc.qt.io/qt-5/qstring.html#arg) 110 | # only numeric formats are replaced at all. This means "(percentage: %1%)" is valid, without needing 111 | # any kind of escaping that would be necessary for strprintf. 
Without this, this function 112 | # would wrongly detect '%)' as a printf format specifier. 113 | if numeric: 114 | other = [] 115 | 116 | # numeric (Qt) can be present in any order, others (strprintf) must be in specified order 117 | return set(numeric),other 118 | 119 | def sanitize_string(s): 120 | '''Sanitize string for printing''' 121 | return s.replace('\n',' ') 122 | 123 | def check_format_specifiers(source, translation, errors, numerus): 124 | source_f = split_format_specifiers(find_format_specifiers(source)) 125 | # assert that no source messages contain both Qt and strprintf format specifiers 126 | # if this fails, go change the source as this is hacky and confusing! 127 | assert(not(source_f[0] and source_f[1])) 128 | try: 129 | translation_f = split_format_specifiers(find_format_specifiers(translation)) 130 | except IndexError: 131 | errors.append("Parse error in translation for '%s': '%s'" % (sanitize_string(source), sanitize_string(translation))) 132 | return False 133 | else: 134 | if source_f != translation_f: 135 | if numerus and source_f == (set(), ['n']) and translation_f == (set(), []) and translation.find('%') == -1: 136 | # Allow numerus translations to omit %n specifier (usually when it only has one possible value) 137 | return True 138 | errors.append("Mismatch between '%s' and '%s'" % (sanitize_string(source), sanitize_string(translation))) 139 | return False 140 | return True 141 | 142 | def all_ts_files(file_format=FORMAT_TS, suffix='', include_source=False): 143 | for filename in os.listdir(LOCALE_DIR): 144 | # process only language files, and do not process source language 145 | if not filename.endswith(file_format + suffix) or (not include_source and filename == SOURCE_LANG + file_format + suffix): 146 | continue 147 | if suffix: # remove provided suffix 148 | filename = filename[0:-len(suffix)] 149 | filepath = os.path.join(LOCALE_DIR, filename) 150 | yield(filename, filepath) 151 | 152 | FIX_RE = 
re.compile(b'[\x00-\x09\x0b\x0c\x0e-\x1f]') 153 | def remove_invalid_characters(s): 154 | '''Remove invalid characters from translation string''' 155 | return FIX_RE.sub(b'', s) 156 | 157 | # Override cdata escape function to make our output match Qt's (optional, just for cleaner diffs for 158 | # comparison, disabled by default) 159 | _orig_escape_cdata = None 160 | def escape_cdata(text): 161 | text = _orig_escape_cdata(text) 162 | text = text.replace("'", '&apos;') 163 | text = text.replace('"', '&quot;') 164 | return text 165 | 166 | def contains_bitcoin_addr(text, errors): 167 | if text is not None and ADDRESS_REGEXP.search(text) is not None: 168 | errors.append('Translation "%s" contains a bitcoin address. This will be removed.' % (text)) 169 | return True 170 | return False 171 | 172 | def postprocess_message(filename, message, xliff_compatible_mode): 173 | translation_node = message.find('translation') 174 | if not xliff_compatible_mode and translation_node.get('type') == 'unfinished': 175 | return False 176 | 177 | numerus = message.get('numerus') == 'yes' 178 | source = message.find('source').text 179 | # pick all numerusforms 180 | if numerus: 181 | translations = [i.text for i in translation_node.findall('numerusform')] 182 | else: 183 | if translation_node.text is None or translation_node.text == source: 184 | return False 185 | 186 | translations = [translation_node.text] 187 | 188 | for translation in translations: 189 | if translation is None: 190 | continue 191 | errors = [] 192 | valid = check_format_specifiers(source, translation, errors, numerus) and not contains_bitcoin_addr(translation, errors) 193 | 194 | for error in errors: 195 | print('%s: %s' % (filename, error)) 196 | 197 | if not valid: 198 | return False 199 | 200 | # Remove location tags 201 | for location in message.findall('location'): 202 | message.remove(location) 203 | 204 | return True 205 | 206 | def postprocess_translations(xliff_compatible_mode, reduce_diff_hacks=False): 
print('Checking and postprocessing...') 208 | 209 | if reduce_diff_hacks: 210 | global _orig_escape_cdata 211 | _orig_escape_cdata = ET._escape_cdata 212 | ET._escape_cdata = escape_cdata 213 | 214 | for (filename,filepath) in all_ts_files(): 215 | os.rename(filepath, filepath + ORIGINAL_SUFFIX) 216 | 217 | for (filename,filepath) in all_ts_files(suffix=ORIGINAL_SUFFIX): 218 | # pre-fixups to cope with transifex output 219 | parser = ET.XMLParser(encoding='utf-8') # need to override encoding because 'utf8' is not understood, only 'utf-8' 220 | with open(filepath + ORIGINAL_SUFFIX, 'rb') as f: 221 | data = f.read() 222 | # remove control characters; this must be done over the entire file otherwise the XML parser will fail 223 | data = remove_invalid_characters(data) 224 | tree = ET.parse(io.BytesIO(data), parser=parser) 225 | 226 | # iterate over all messages in file 227 | root = tree.getroot() 228 | for context in root.findall('context'): 229 | for message in context.findall('message'): 230 | if not postprocess_message(filename, message, xliff_compatible_mode): 231 | context.remove(message) 232 | 233 | if not context.findall('message'): 234 | root.remove(context) 235 | 236 | # check if document is (virtually) empty, and remove it if so 237 | num_nonnumerus_messages = 0 238 | for context in root.findall('context'): 239 | for message in context.findall('message'): 240 | if message.get('numerus') != 'yes': 241 | num_nonnumerus_messages += 1 242 | 243 | if num_nonnumerus_messages < MIN_NUM_NONNUMERUS_MESSAGES: 244 | print('Removing %s, as it contains only %i non-numerus messages' % (filepath, num_nonnumerus_messages)) 245 | continue 246 | 247 | # write fixed-up tree 248 | # if diff reduction requested, replace some XML to 'sanitize' to Qt formatting 249 | if reduce_diff_hacks: 250 | out = io.BytesIO() 251 | tree.write(out, encoding='utf-8') 252 | out = out.getvalue() 253 | out = out.replace(b' />', b'/>') 254 | with open(filepath, 'wb') as f: 255 | f.write(out) 256 | 
else: 257 | tree.write(filepath, encoding='utf-8') 258 | 259 | def update_git(): 260 | ''' 261 | Add new files to git repository. 262 | (Removing files isn't necessary here, as `git commit -a` will take care of removing files that are gone) 263 | ''' 264 | file_paths = [filepath for (filename, filepath) in all_ts_files()] 265 | subprocess.check_call([GIT, 'add'] + file_paths) 266 | 267 | def update_build_systems(): 268 | ''' 269 | Update build system and Qt resource descriptors. 270 | ''' 271 | filename_lang = [re.match(r'((bitcoin_(.*))\.ts)$', filename).groups() for (filename, filepath) in all_ts_files(include_source=True)] 272 | filename_lang.sort(key=lambda x: x[0]) 273 | 274 | # update qrc locales 275 | with open('src/qt/bitcoin_locale.qrc', 'w') as f: 276 | f.write('<RCC>\n') 277 | f.write('    <qresource prefix="/translations">\n') 278 | for (filename, basename, lang) in filename_lang: 279 | f.write(f'        <file alias="{lang}">locale/{basename}.qm</file>\n') 280 | f.write('    </qresource>\n') 281 | f.write('</RCC>\n') 282 | 283 | # update Makefile include 284 | with open('src/Makefile.qt_locale.include', 'w') as f: 285 | f.write('QT_TS = \\\n') 286 | f.write(' \\\n'.join(f'  qt/locale/{filename}' for (filename, basename, lang) in filename_lang)) 287 | f.write('\n') # make sure last line doesn't end with a backslash 288 | 289 | if __name__ == '__main__': 290 | check_at_repository_root() 291 | remove_current_translations() 292 | fetch_all_translations() 293 | xliff_compatible_mode = convert_xlf_to_ts() 294 | postprocess_translations(xliff_compatible_mode) 295 | update_git() 296 | update_build_systems() 297 | remove_orig_files() 298 | 299 | --------------------------------------------------------------------------------