├── .dockerignore ├── .gitignore ├── Dockerfile ├── LICENSE ├── MANUAL_INSTRUCTIONS.md ├── README.md ├── install.sh ├── proof-sh └── proof.sh ├── python ├── _test_.py ├── convert.py ├── generate.py ├── helpers │ ├── common.py │ └── merkle.py ├── proof.py ├── requirements.txt └── tests │ └── merkle_test.py └── web └── index.html /.dockerignore: -------------------------------------------------------------------------------- 1 | docker/py-proof.Dockerfile 2 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | .idea 2 | devreward.csv 3 | metadata.bin 4 | metadata.json 5 | python/__pycache__ 6 | python/output 7 | test.csv 8 | __pycache__ 9 | __test_cache__ 10 | __sh_cache__ 11 | 12 | age.stderr 13 | openssl.stderr 14 | test_data 15 | test-keys 16 | github-accounts.txt 17 | 18 | bin 19 | claim-venv 20 | -------------------------------------------------------------------------------- /Dockerfile: -------------------------------------------------------------------------------- 1 | FROM python:3.10 2 | COPY . /claim 3 | WORKDIR /claim 4 | RUN apt update && apt install age && apt clean 5 | RUN curl -o metadata.json https://fluence-dao.s3.eu-west-1.amazonaws.com/metadata.json 6 | RUN pip install -r python/requirements.txt 7 | 8 | ENTRYPOINT ["python", "python/proof.py"] 9 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Apache License 2 | Version 2.0, January 2004 3 | http://www.apache.org/licenses/ 4 | 5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 6 | 7 | 1. Definitions. 8 | 9 | "License" shall mean the terms and conditions for use, reproduction, 10 | and distribution as defined by Sections 1 through 9 of this document. 11 | 12 | "Licensor" shall mean the copyright owner or entity authorized by 13 | the copyright owner that is granting the License. 14 | 15 | "Legal Entity" shall mean the union of the acting entity and all 16 | other entities that control, are controlled by, or are under common 17 | control with that entity. For the purposes of this definition, 18 | "control" means (i) the power, direct or indirect, to cause the 19 | direction or management of such entity, whether by contract or 20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the 21 | outstanding shares, or (iii) beneficial ownership of such entity. 22 | 23 | "You" (or "Your") shall mean an individual or Legal Entity 24 | exercising permissions granted by this License. 25 | 26 | "Source" form shall mean the preferred form for making modifications, 27 | including but not limited to software source code, documentation 28 | source, and configuration files. 29 | 30 | "Object" form shall mean any form resulting from mechanical 31 | transformation or translation of a Source form, including but 32 | not limited to compiled object code, generated documentation, 33 | and conversions to other media types. 34 | 35 | "Work" shall mean the work of authorship, whether in Source or 36 | Object form, made available under the License, as indicated by a 37 | copyright notice that is included in or attached to the work 38 | (an example is provided in the Appendix below). 
39 | 40 | "Derivative Works" shall mean any work, whether in Source or Object 41 | form, that is based on (or derived from) the Work and for which the 42 | editorial revisions, annotations, elaborations, or other modifications 43 | represent, as a whole, an original work of authorship. For the purposes 44 | of this License, Derivative Works shall not include works that remain 45 | separable from, or merely link (or bind by name) to the interfaces of, 46 | the Work and Derivative Works thereof. 47 | 48 | "Contribution" shall mean any work of authorship, including 49 | the original version of the Work and any modifications or additions 50 | to that Work or Derivative Works thereof, that is intentionally 51 | submitted to Licensor for inclusion in the Work by the copyright owner 52 | or by an individual or Legal Entity authorized to submit on behalf of 53 | the copyright owner. For the purposes of this definition, "submitted" 54 | means any form of electronic, verbal, or written communication sent 55 | to the Licensor or its representatives, including but not limited to 56 | communication on electronic mailing lists, source code control systems, 57 | and issue tracking systems that are managed by, or on behalf of, the 58 | Licensor for the purpose of discussing and improving the Work, but 59 | excluding communication that is conspicuously marked or otherwise 60 | designated in writing by the copyright owner as "Not a Contribution." 61 | 62 | "Contributor" shall mean Licensor and any individual or Legal Entity 63 | on behalf of whom a Contribution has been received by Licensor and 64 | subsequently incorporated within the Work. 65 | 66 | 2. Grant of Copyright License. Subject to the terms and conditions of 67 | this License, each Contributor hereby grants to You a perpetual, 68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 69 | copyright license to reproduce, prepare Derivative Works of, 70 | publicly display, publicly perform, sublicense, and distribute the 71 | Work and such Derivative Works in Source or Object form. 72 | 73 | 3. Grant of Patent License. Subject to the terms and conditions of 74 | this License, each Contributor hereby grants to You a perpetual, 75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 76 | (except as stated in this section) patent license to make, have made, 77 | use, offer to sell, sell, import, and otherwise transfer the Work, 78 | where such license applies only to those patent claims licensable 79 | by such Contributor that are necessarily infringed by their 80 | Contribution(s) alone or by combination of their Contribution(s) 81 | with the Work to which such Contribution(s) was submitted. If You 82 | institute patent litigation against any entity (including a 83 | cross-claim or counterclaim in a lawsuit) alleging that the Work 84 | or a Contribution incorporated within the Work constitutes direct 85 | or contributory patent infringement, then any patent licenses 86 | granted to You under this License for that Work shall terminate 87 | as of the date such litigation is filed. 88 | 89 | 4. Redistribution. 
You may reproduce and distribute copies of the 90 | Work or Derivative Works thereof in any medium, with or without 91 | modifications, and in Source or Object form, provided that You 92 | meet the following conditions: 93 | 94 | (a) You must give any other recipients of the Work or 95 | Derivative Works a copy of this License; and 96 | 97 | (b) You must cause any modified files to carry prominent notices 98 | stating that You changed the files; and 99 | 100 | (c) You must retain, in the Source form of any Derivative Works 101 | that You distribute, all copyright, patent, trademark, and 102 | attribution notices from the Source form of the Work, 103 | excluding those notices that do not pertain to any part of 104 | the Derivative Works; and 105 | 106 | (d) If the Work includes a "NOTICE" text file as part of its 107 | distribution, then any Derivative Works that You distribute must 108 | include a readable copy of the attribution notices contained 109 | within such NOTICE file, excluding those notices that do not 110 | pertain to any part of the Derivative Works, in at least one 111 | of the following places: within a NOTICE text file distributed 112 | as part of the Derivative Works; within the Source form or 113 | documentation, if provided along with the Derivative Works; or, 114 | within a display generated by the Derivative Works, if and 115 | wherever such third-party notices normally appear. The contents 116 | of the NOTICE file are for informational purposes only and 117 | do not modify the License. You may add Your own attribution 118 | notices within Derivative Works that You distribute, alongside 119 | or as an addendum to the NOTICE text from the Work, provided 120 | that such additional attribution notices cannot be construed 121 | as modifying the License. 122 | 123 | You may add Your own copyright statement to Your modifications and 124 | may provide additional or different license terms and conditions 125 | for use, reproduction, or distribution of Your modifications, or 126 | for any such Derivative Works as a whole, provided Your use, 127 | reproduction, and distribution of the Work otherwise complies with 128 | the conditions stated in this License. 129 | 130 | 5. Submission of Contributions. Unless You explicitly state otherwise, 131 | any Contribution intentionally submitted for inclusion in the Work 132 | by You to the Licensor shall be under the terms and conditions of 133 | this License, without any additional terms or conditions. 134 | Notwithstanding the above, nothing herein shall supersede or modify 135 | the terms of any separate license agreement you may have executed 136 | with Licensor regarding such Contributions. 137 | 138 | 6. Trademarks. This License does not grant permission to use the trade 139 | names, trademarks, service marks, or product names of the Licensor, 140 | except as required for reasonable and customary use in describing the 141 | origin of the Work and reproducing the content of the NOTICE file. 142 | 143 | 7. Disclaimer of Warranty. Unless required by applicable law or 144 | agreed to in writing, Licensor provides the Work (and each 145 | Contributor provides its Contributions) on an "AS IS" BASIS, 146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 147 | implied, including, without limitation, any warranties or conditions 148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 149 | PARTICULAR PURPOSE. 
You are solely responsible for determining the 150 | appropriateness of using or redistributing the Work and assume any 151 | risks associated with Your exercise of permissions under this License. 152 | 153 | 8. Limitation of Liability. In no event and under no legal theory, 154 | whether in tort (including negligence), contract, or otherwise, 155 | unless required by applicable law (such as deliberate and grossly 156 | negligent acts) or agreed to in writing, shall any Contributor be 157 | liable to You for damages, including any direct, indirect, special, 158 | incidental, or consequential damages of any character arising as a 159 | result of this License or out of the use or inability to use the 160 | Work (including but not limited to damages for loss of goodwill, 161 | work stoppage, computer failure or malfunction, or any and all 162 | other commercial damages or losses), even if such Contributor 163 | has been advised of the possibility of such damages. 164 | 165 | 9. Accepting Warranty or Additional Liability. While redistributing 166 | the Work or Derivative Works thereof, You may choose to offer, 167 | and charge a fee for, acceptance of support, warranty, indemnity, 168 | or other liability obligations and/or rights consistent with this 169 | License. However, in accepting such obligations, You may act only 170 | on Your own behalf and on Your sole responsibility, not on behalf 171 | of any other Contributor, and only if You agree to indemnify, 172 | defend, and hold each Contributor harmless for any liability 173 | incurred by, or claims asserted against, such Contributor by reason 174 | of your accepting any such warranty or additional liability. 175 | 176 | END OF TERMS AND CONDITIONS 177 | 178 | APPENDIX: How to apply the Apache License to your work. 179 | 180 | To apply the Apache License to your work, attach the following 181 | boilerplate notice, with the fields enclosed by brackets "[]" 182 | replaced with your own identifying information. (Don't include 183 | the brackets!) The text should be enclosed in the appropriate 184 | comment syntax for the file format. We also recommend that a 185 | file or class name and description of purpose be included on the 186 | same "printed page" as the copyright notice for easier 187 | identification within third-party archives. 188 | 189 | Copyright [yyyy] [name of copyright owner] 190 | 191 | Licensed under the Apache License, Version 2.0 (the "License"); 192 | you may not use this file except in compliance with the License. 193 | You may obtain a copy of the License at 194 | 195 | http://www.apache.org/licenses/LICENSE-2.0 196 | 197 | Unless required by applicable law or agreed to in writing, software 198 | distributed under the License is distributed on an "AS IS" BASIS, 199 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 200 | See the License for the specific language governing permissions and 201 | limitations under the License. 202 | -------------------------------------------------------------------------------- /MANUAL_INSTRUCTIONS.md: -------------------------------------------------------------------------------- 1 | # Manual instructions for the paranoid 2 | 3 | The following can be done in a Docker container for additional isolation, including disconnecting its network once required components are installed, but is not necessary. 4 | 5 | 1. 
Set up and run in Docker (optional) 6 | - `docker run -ti --rm ubuntu:22.04 bash` to start a very basic docker container 7 | - `apt-get update && apt-get dist-upgrade -y` 8 | - `apt-get install -y age curl xxd gcc` to install age, curl, xxd and the gcc compiler (for `sha3sum` compilation below) 9 | - `curl https://sh.rustup.rs -sSf | sh` to install Rust 10 | 2. `~/.cargo/bin/cargo install sha3sum` to install the sha3sum tool with keccak256 11 | 3. `curl -LO https://fluence-dao.s3.eu-west-1.amazonaws.com/metadata.bin` to download the metadata file to your current directory 12 | 4. Optional: if running in Docker, disconnect it from the network, on the host: `docker network disconnect bridge <container_id>` (use `docker ps` to find the container id) 13 | 5. `grep '^githubusername,' metadata.bin` (if no output, you are not eligible) 14 | 6. Take one of the output lines, minus 'githubusername,' at the beginning; this is your encrypted blob in hex 15 | 7. `echo <your encrypted blob in hex> | xxd -r -p -c 1000 > enc.bin` to convert the hex to binary 16 | 8. If running in Docker, copy your private ssh key inside the docker (e.g. `docker cp path/to/local/private/key/file <container_id>:/path/to/private/key/file`, or just use copy/paste in your terminal to a file) 17 | 9. `age --decrypt --identity <path to your private ssh key> --output dec.bin enc.bin` to decrypt the binary file 18 | 10. The file `dec.bin` has one line with 4 comma-separated entries; extract each of them into a variable: 19 | - `USER_ID=$(cat dec.bin | cut -d, -f1)` 20 | - `ETH_ADDR=$(cat dec.bin | cut -d, -f2)` 21 | - `ETH_KEY_SRC=$(cat dec.bin | cut -d, -f3)` 22 | - `MERKLE_PROOF=$(cat dec.bin | cut -d, -f4)` 23 | 11. Create a DER form of the Eth key: `echo -n $ETH_KEY_SRC | xxd -r -p -c 118 > eth_key.der` 24 | 12. `openssl ec -inform der -in eth_key.der > eth.key` 25 | 13. Prepare a message to sign: 26 | - Take your ethereum address _without_ the '0x' prefix. 27 | - `UNSIGNED_MSG="$(echo -n $'\x19Ethereum Signed Message:\n'20 | xxd -p)<your address without 0x>"` 28 | - `echo -n "$UNSIGNED_MSG" | xxd -r -p | ~/.cargo/bin/sha3sum -a Keccak256` to get your signed message hash. The first part of this output is your hash; set it as `$HASH`. The short-form of this is (the nbsp separator makes this look odd): `HASH=$(echo -n "$UNSIGNED_MSG" | xxd -r -p | ~/.cargo/bin/sha3sum -a Keccak256 | awk -F $'\xC2\xA0' '{ print $1 }')` 29 | 14. With HASH as the hex output of the signed message hash above, `SIGNATURE_HEX=$(echo $HASH | xxd -r -p | openssl pkeyutl -sign -inkey eth.key | xxd -p -c 72)` 30 | 15. You now have all elements of your proof; make a CSV form of them: `echo "${USER_ID},${ETH_ADDR},${SIGNATURE_HEX},${MERKLE_PROOF}"` 31 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Fluence Developer Rewards 2 | 3 | This repo allows one to generate a proof signature for Fluence dev reward claiming. 4 | 5 | > [!CAUTION] 6 | > Beware of scam emails asking you to generate proofs for someone! See [#98](https://github.com/fluencelabs/dev-rewards/pull/98). 7 | 8 | The methods for generating a signature are described below: 9 | 10 | ## Generate proof in docker 11 | 12 | 1. Build docker image 13 | 14 | > `docker build -t dev-reward-script .` 15 | 16 | 2.
If your ssh keys are in ~/.ssh, run the script: 17 | 18 | > `docker run -it --rm --network none -v ~/.ssh:/root/.ssh:ro dev-reward-script` 19 | 20 | If your ssh keys are in other directories, replace 21 | {dir_path_for_your_ssh_keys} with your directory path: 22 | 23 | > `docker run -it --rm --network none -v /{dir_path_for_your_ssh_keys}:/root/.ssh:ro dev-reward-script` 24 | 25 | ## Generate proof via local sh script 26 | 27 | 1. Install dependencies 28 | 29 | > `./install.sh` 30 | 31 | 2. Run the script 32 | 33 | > `./proof-sh/proof.sh` 34 | 35 | ## Generate proof via local python script 36 | 37 | 1. Install python 38 | 39 | > https://www.python.org/downloads/ 40 | 41 | 2. Install dependencies 42 | 43 | > `./install.sh` 44 | 45 | > `python3 -m venv claim-venv` 46 | 47 | > `source claim-venv/bin/activate` 48 | 49 | > `pip3 install -r python/requirements.txt` 50 | 51 | 3. Run the script 52 | 53 | > `python3 python/proof.py` 54 | 55 | ## Generate proof through a website 56 | 57 | 1. Enter the `web` directory 58 | 59 | > cd web 60 | 61 | 2. Download the metadata.json file 62 | 63 | > curl https://fluence-dao.s3.eu-west-1.amazonaws.com/metadata.json > metadata.json 64 | 65 | 3. Spin up an HTTP server 66 | 67 | > python3 -m http.server 68 | 69 | 4. Open `http://127.0.0.1:8000` in your browser and follow the instructions 70 | 71 | ## Notes: 72 | 73 | Also check out [paranoid](./MANUAL_INSTRUCTIONS.md) instruction 74 | in case you have any security concerns regarding the methods above. 75 | -------------------------------------------------------------------------------- /install.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | set -o errexit -o nounset -o pipefail 3 | 4 | BIN_DIR="./bin" 5 | TEMP_DIR="$(mktemp -d)" 6 | OS="$(uname -s)-$(uname -m)" 7 | 8 | # cleanup on exit 9 | trap 'cleanup' EXIT INT TERM 10 | 11 | # check_program_in_path "program" 12 | check_program_in_path() { 13 | program="${1}" 14 | if ! type -p "${program}" &>/dev/null; then 15 | printf '%s\n' "error: ${program} is not installed." 16 | printf '%s\n' "Use your package manager to install it." 17 | exit 1 18 | fi 19 | } 20 | 21 | # check that everything installed 22 | PATH="${PATH}:./bin" 23 | for i in curl tar; do 24 | check_program_in_path $i 25 | done 26 | 27 | cleanup() { 28 | rm -r ${TEMP_DIR} 29 | } 30 | 31 | # metadata "file" 32 | metadata() { 33 | file="${1}" 34 | echo "Downloading ${file}" 35 | if [[ -f ${file} ]]; then 36 | echo "${file} already exists" 37 | else 38 | curl --progress-bar -o "${file}" "https://fluence-dao.s3.eu-west-1.amazonaws.com/${file}" 39 | fi 40 | } 41 | 42 | # setup "name" "url" 43 | setup() { 44 | name="$1" 45 | url="$2" 46 | echo "Downloading ${name} from ${url}" 47 | curl --progress-bar -L -S -o "${TEMP_DIR}/${name}.tar.gz" "${url}" 48 | tar xf "${TEMP_DIR}/${name}.tar.gz" -C "${TEMP_DIR}" 49 | 50 | # move all executables to BIN_DIR 51 | [[ ! 
-d "${BIN_DIR}" ]] && mkdir "${BIN_DIR}" -p 52 | find "${TEMP_DIR}" -type f -exec file {} + | grep 'executable' | grep -v 'shell script' | cut -d: -f1 | xargs -I {} mv {} "${BIN_DIR}" 53 | chmod +x "${BIN_DIR}"/* 54 | } 55 | 56 | case "$OS" in 57 | Linux-x86_64) 58 | SHA3SUM_URL="https://gitlab.com/kurdy/sha3sum/uploads/95b6ec553428e3940b3841fc259d02d4/sha3sum-x86_64_Linux-1.1.0.tar.gz" 59 | AGE_URL="https://github.com/FiloSottile/age/releases/download/v1.1.1/age-v1.1.1-linux-amd64.tar.gz" 60 | ;; 61 | Darwin-x86_64) 62 | SHA3SUM_URL="https://gitlab.com/kurdy/sha3sum/uploads/47a60658d30743fba6ea6dd99c48da98/sha3sum-x86_64-AppleDarwin-1.1.0.tar.gz" 63 | AGE_URL="https://github.com/FiloSottile/age/releases/download/v1.1.1/age-v1.1.1-darwin-amd64.tar.gz" 64 | ;; 65 | Darwin-arm64) 66 | SHA3SUM_URL="https://gitlab.com/kurdy/sha3sum/uploads/47a60658d30743fba6ea6dd99c48da98/sha3sum-x86_64-AppleDarwin-1.1.0.tar.gz" 67 | AGE_URL="https://github.com/FiloSottile/age/releases/download/v1.1.1/age-v1.1.1-darwin-arm64.tar.gz" 68 | ;; 69 | *) 70 | echo "Error: Unsupported OS ${OS}" 71 | exit 1 72 | ;; 73 | esac 74 | 75 | setup age "${AGE_URL}" 76 | setup sha3sum "${SHA3SUM_URL}" 77 | 78 | metadata metadata.bin 79 | metadata metadata.json 80 | -------------------------------------------------------------------------------- /proof-sh/proof.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | set -o errexit -o nounset -o pipefail 3 | 4 | # TODO: what about openssl versions? maybe use python for signing? 5 | # TODO: tell user how to install utilities 6 | 7 | # This script does the following: 8 | # 1. Ask user for her GitHub username and Ethereum address (eth_addr) 9 | # 2. Negotiate with user which SSH key to use 10 | # 3. Find at least one match of key and encrypted_key that decrypts succesfully 11 | # 4. Decrypt encrypted_key to tmp_eth_key 12 | # 5. Sign sender ethereum address: sign[tmp_eth_key](eth_addr) 13 | # 6. Encode signature (#5) and merkle proof and output result 14 | 15 | # keys.bin format (CSV): 16 | # GH UserName,Encrypted[userId,tmp_eth_addr,tmp_eth_key,merkle proof] 17 | 18 | trap 'echo GOT IT ; exit 0' SIGTERM 19 | 20 | # check_program_in_path "program" 21 | check_program_in_path() { 22 | program="${1}" 23 | if ! type -p "${program}" &>/dev/null; then 24 | printf '%s\n' "error: ${program} is not installed." 25 | printf '%s\n' "You should run install script first" 26 | printf '%s\n' "or use your package manager to install it." 27 | exit 1 28 | fi 29 | } 30 | 31 | # while true; do :; done 32 | 33 | # check that everything installed 34 | PATH="./bin:${PATH}" 35 | for i in age base64 sha3sum; do 36 | check_program_in_path $i 37 | done 38 | 39 | SSH_KEYS_DIR="$HOME/.ssh" 40 | 41 | ask_ssh_key() { 42 | SSH_KEYS=() 43 | # list all files from ~/.ssh, except for *.pub, known_hosts, config and log files (tmux sometimes puts logs there) 44 | while IFS= read -r -d $'\0'; do 45 | SSH_KEYS+=("$REPLY") 46 | done < <(find "$SSH_KEYS_DIR" -mindepth 1 -maxdepth 1 ! -name "*.pub" ! -name "known_hosts*" ! -name "config" ! 
-name "*.log" -print0) 47 | 48 | select fname in "${SSH_KEYS[@]}"; do 49 | echo "$fname" 50 | break 51 | done 52 | } 53 | 54 | WORK_DIR="$(pwd)/__sh_cache__" 55 | DECRYPTED_DATA="$WORK_DIR/decrypted.data" 56 | ETH_KEY_DER="$WORK_DIR/tmp_eth.key.der" 57 | ETH_KEY="$WORK_DIR/tmp_eth.key" 58 | OPENSSL_STDERR="$WORK_DIR/openssl.stderr" 59 | AGE_STDERR="$WORK_DIR/age.stderr" 60 | 61 | mkdir -p $WORK_DIR 62 | 63 | METADATA_BIN="metadata.bin" 64 | # $# is the number of arguments 65 | if [ $# -gt 1 ]; then 66 | GITHUB_USERNAME="$1" 67 | ETHEREUM_ADDRESS="$2" 68 | else 69 | if [ ! -f "$METADATA_BIN" ]; then 70 | echo "$METADATA_BIN doesn't exist" 71 | exit 1 72 | fi 73 | 74 | printf "\nWelcome to the proof generation script for Fluence Developer Reward Airdrop." 75 | printf "\n5%% of the FLT supply is allocated to ~110,000 developers who contributed into open source web3 repositories during last year." 76 | printf "\nPublic keys of selected Github accounts were added into a smart contract on Ethereum. Claim your allocation and help us build the decentralized internet together!" 77 | printf "\n" 78 | printf "\nCheck if you are eligible and proceed with claiming" 79 | 80 | read -r -p "Enter your github username so we can check if you are participating in the airdrop: " GITHUB_USERNAME 81 | 82 | printf "\nEthereum wallet address is necessary to generate a proof that you will send through our web page." 83 | printf "\n\033[33mImportant notice: you need to make a claim transaction from the entered address!\033[0m\n\n" 84 | 85 | read -r -p "Enter the ethereum address to which you plan to receive the airdrop: " ETHEREUM_ADDRESS 86 | 87 | STR_LENGTH=$(echo "$ETHEREUM_ADDRESS" | sed -e 's/^0x//' | awk '{ print length }') 88 | if [ "$STR_LENGTH" -ne 40 ]; then 89 | echo "$ETHEREUM_ADDRESS is not an Ethereum address. Must be of 40 or 42 (with 0x) characters, was $STR_LENGTH chars" 90 | exit 1 91 | fi 92 | NON_HEX_BYTES_LENGTH=$(echo "$ETHEREUM_ADDRESS" | sed -e 's/^0x//' | tr -d "[:xdigit:]" | awk '{ print length }') 93 | if [ "$NON_HEX_BYTES_LENGTH" -ne 0 ]; then 94 | echo "$ETHEREUM_ADDRESS is not an Ethereum address. Must be hexadecimal, has non-hexadecimal symbols." 95 | exit 1 96 | fi 97 | fi 98 | 99 | KEY_ARG_PATH='' 100 | if [ $# -gt 2 ]; then 101 | KEY_ARG_PATH="$3" 102 | fi 103 | 104 | ENCRYPTED_KEYS=() 105 | while IFS='' read -r line; do ENCRYPTED_KEYS+=("$line"); done < <(grep -i "^${GITHUB_USERNAME}," "${METADATA_BIN}" || true) 106 | 107 | # ${#ENCRYPTED_KEYS[@]} -- calculates number of elements in the array 108 | if [ ${#ENCRYPTED_KEYS[@]} -gt 1 ]; then 109 | echo "Found ${#ENCRYPTED_KEYS[@]} encrypted keys for your GitHub username. That means you have several SSH keys published on GitHub" 110 | # echo "Any of your keys would work. We have encrypted a temporary keypair for each of your SSH keys." 111 | elif [ ${#ENCRYPTED_KEYS[@]} -gt 0 ]; then 112 | echo "Found an encrypted key for your GitHub username" 113 | else 114 | echo "This'$GITHUB_USERNAME' Github account is not eligible for claiming" 115 | exit 1 116 | fi 117 | 118 | printf "\n\tNOTE: your SSH key is used ONLY LOCALLY to decrypt a message and generate Token Claim Proof." 119 | printf "\n\tScript will explicitly ask your consent before using the key." 120 | printf "\n\tIf you have any technical issues, take a look at the following logs:\n\t\t$OPENSSL_STDERR\n\t\t$AGE_STDERR\n\tReport any issues to https://fluence.chat \n\n" 121 | 122 | printf "Now the script needs your ssh key to generate proof. 
\n" 123 | 124 | while true; do 125 | if [ -n "$KEY_ARG_PATH" ] && [ -f "$KEY_ARG_PATH" ]; then 126 | KEY_PATH=$KEY_ARG_PATH 127 | else 128 | if [ -d "$SSH_KEYS_DIR" ]; then 129 | # shellcheck disable=SC2162 # user can have spaces in the path to ssh key and use backslashes to escape them 130 | read -p "Enter path to the private SSH key to use or just press Enter to show existing keys: " KEY_PATH 131 | if [ -z "$KEY_PATH" ]; then 132 | KEY_PATH=$(ask_ssh_key) 133 | fi 134 | else 135 | # shellcheck disable=SC2162 # user can have spaces in the path to ssh key and use backslashes to escape them 136 | read -p "Enter path to the private SSH key to use: " KEY_PATH 137 | if [ -z "$KEY_PATH" ]; then 138 | continue 139 | fi 140 | fi 141 | 142 | if ! [ -f "$KEY_PATH" ]; then 143 | echo "Specified $KEY_PATH file does not exits or not a SSH private key" 144 | continue 145 | fi 146 | 147 | read -p "Will use SSH key to generate proof data. Press enter to proceed. " 148 | printf "\n" 149 | fi 150 | 151 | rm -f "$DECRYPTED_DATA" 152 | printf "\n" 153 | 154 | for encrypted in "${ENCRYPTED_KEYS[@]}"; do 155 | # contains encrypted (user_id, eth_tmp_key, merkle proof) 156 | ENCRYPTED_DATA=$(echo "$encrypted" | cut -d',' -f2) 157 | 158 | set +o errexit 159 | echo "$ENCRYPTED_DATA" | xxd -r -p -c 1000 | age --decrypt --identity "$KEY_PATH" --output "$DECRYPTED_DATA" 2>$AGE_STDERR 160 | exit_code=$? 161 | set -o errexit 162 | 163 | if [ $exit_code -ne 0 ]; then 164 | continue 165 | else 166 | break 167 | fi 168 | done 169 | 170 | if [ -e "$DECRYPTED_DATA" ]; then 171 | # echo "Decrypted succesfully! Decrypted data is at $DECRYPTED_DATA" 172 | break 173 | else 174 | echo "Couldn't decrypt with that SSH key, please choose another one." 175 | echo "Possible causes are:" 176 | echo "You have specified the file which doesn't contain valid private key." 177 | echo "Your private key doesn't match your public key in GitHub. It could happen if you've changed local ssh key recently." 178 | echo "Internal error:" 179 | 180 | # replace report URL in $AGE_STDERR 181 | STDERR_TMP="$(mktemp)" 182 | cat "$AGE_STDERR" | sed -e 's#https://filippo.io/age/report#https://fluence.chat#g' > "$STDERR_TMP" 183 | cat "$STDERR_TMP" > "$AGE_STDERR" 184 | 185 | # print Age error with replaced report URL 186 | cat "$AGE_STDERR" 187 | fi 188 | done 189 | 190 | ## Prepare real ethereum address to be hashed and signed 191 | ETH_ADDR_HEX_ONLY=$(echo -n "$ETHEREUM_ADDRESS" | sed -e 's/^0x//') 192 | # length of ETH key is always 20 bytes 193 | LENGTH="20" 194 | PREFIX_HEX=$(echo -n $'\x19Ethereum Signed Message:\n'${LENGTH} | xxd -p) 195 | DATA_HEX="${PREFIX_HEX}${ETH_ADDR_HEX_ONLY}" 196 | 197 | ## '|| true' is needed to work around this bug https://gitlab.com/kurdy/sha3sum/-/issues/2 198 | HASH=$(echo -n "$DATA_HEX" | xxd -r -p | (sha3sum -a Keccak256 -t || true) | sed 's/[^[:xdigit:]].*//') 199 | 200 | ## Write temporary eth key to file in binary format (DER) 201 | cat "$DECRYPTED_DATA" | cut -d',' -f3 | xxd -r -p -c 118 >"$ETH_KEY_DER" 202 | 203 | ## Convert secp256k1 key from DER (binary) to textual representation 204 | 205 | set +o errexit 206 | openssl ec -inform der -in "$ETH_KEY_DER" 2>$OPENSSL_STDERR >"$ETH_KEY" 207 | exit_code=$? 208 | set -o errexit 209 | 210 | if [ $exit_code -ne 0 ]; then 211 | echo "Failed to parse $ETH_KEY_DER with OpenSSL. Errors below may be relevant." 
212 | echo "===" 213 | cat $OPENSSL_STDERR 214 | echo "===" 215 | exit 1 216 | fi 217 | 218 | ## Sign hash of the real ethereum address with the temporary one 219 | SIGNATURE_HEX=$(echo "$HASH" | xxd -r -p | openssl pkeyutl -sign -inkey "$ETH_KEY" | xxd -p -c 72) 220 | 221 | USER_ID=$(cat "$DECRYPTED_DATA" | cut -d',' -f1) 222 | TMP_ETH_ADDR=$(cat "$DECRYPTED_DATA" | cut -d',' -f2) 223 | MERKLE_PROOF=$(cat "$DECRYPTED_DATA" | cut -d',' -f4) 224 | 225 | echo -e "Success! Copy the line below and paste it in the browser.\n" 226 | 227 | # userId, tmpEthAddr, signatureHex, merkleProofHex 228 | echo "${USER_ID},${TMP_ETH_ADDR},${SIGNATURE_HEX},${MERKLE_PROOF}" 229 | -------------------------------------------------------------------------------- /python/_test_.py: -------------------------------------------------------------------------------- 1 | import unittest 2 | import hashlib 3 | import csv 4 | import os 5 | import asn1 6 | import shutil 7 | import python.proof as proof 8 | import sys 9 | import json 10 | import secrets 11 | import subprocess 12 | import python.generate as generate 13 | import base64 14 | from unittest import mock 15 | from web3.auto import w3 16 | from eth_account.messages import encode_defunct 17 | from hexbytes import HexBytes 18 | from helpers.common import Metadata 19 | from helpers.merkle import MerkleTree 20 | from cryptography.hazmat.primitives import serialization as crypto_serialization 21 | from cryptography.hazmat.primitives.asymmetric import rsa 22 | from cryptography.hazmat.backends import default_backend as crypto_default_backend 23 | 24 | TEST_FILES_PATH = os.path.join(os. getcwd(), "__test_cache__") 25 | CSV_TEST_PATH = os.path.join(TEST_FILES_PATH, "test.csv") 26 | USERS = ["user_1", "user_2", "user_3"] 27 | USERS_WITH_DOUBLE_KEYS = ["user_4"] 28 | SECOND_KEY_POSTFIX = "_second" 29 | 30 | # https://github.com/ethereum/EIPs/blob/master/EIPS/eip-2.md 31 | SECP256K1_N = int( 32 | 0xfffffffffffffffffffffffffffffffebaaedce6af48a03bbfd25e8cd0364141 33 | ) 34 | SECP256K1_HALF_N = int( 35 | SECP256K1_N/2 36 | ) 37 | 38 | 39 | def init(): 40 | if (os.path.exists(TEST_FILES_PATH)): 41 | shutil.rmtree(TEST_FILES_PATH, ignore_errors=False, onerror=None) 42 | 43 | os.mkdir(TEST_FILES_PATH) 44 | 45 | with open(CSV_TEST_PATH, 'w') as f: 46 | writer = csv.writer(f, delimiter=",") 47 | for user in USERS: 48 | pubKey = create_key(user) 49 | writer.writerow([user, pubKey]) 50 | 51 | for user in USERS_WITH_DOUBLE_KEYS: 52 | pubKey = create_key(user) 53 | writer.writerow([user, pubKey]) 54 | 55 | pubKey = create_key(user + SECOND_KEY_POSTFIX) 56 | writer.writerow([user, pubKey]) 57 | return 58 | 59 | 60 | def create_key(name): 61 | key = rsa.generate_private_key( 62 | backend=crypto_default_backend(), 63 | public_exponent=65537, 64 | key_size=2048 65 | ) 66 | privateKey = key.private_bytes( 67 | crypto_serialization.Encoding.PEM, 68 | crypto_serialization.PrivateFormat.PKCS8, 69 | crypto_serialization.NoEncryption() 70 | ) 71 | 72 | publicKey = key.public_key().public_bytes( 73 | crypto_serialization.Encoding.OpenSSH, 74 | crypto_serialization.PublicFormat.OpenSSH 75 | ) 76 | 77 | with open(os.path.join(TEST_FILES_PATH, name), 'w') as f: 78 | f.write(privateKey.decode()) 79 | 80 | with open(os.path.join(TEST_FILES_PATH, name + ".pub"), 'w') as f: 81 | f.write(publicKey.decode()) 82 | 83 | return publicKey.decode() 84 | 85 | 86 | def call_generate(): 87 | sys.argv = ["", "--output", f"{TEST_FILES_PATH}", 88 | os.path.join(TEST_FILES_PATH, "test.csv")] 89 | 90 | generate.main() 
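    # generate.main() has just rewritten metadata.json with freshly generated
    # random keys, so hashing the file below lets test_generate assert that two
    # consecutive runs never produce identical metadata.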
91 | 92 | with open(os.path.join(TEST_FILES_PATH, "metadata.json"), "r") as f: 93 | return hashlib.sha256(f.read().encode('utf-8')).hexdigest() 94 | 95 | 96 | def call_proof(): 97 | sys.argv = ["", os.path.join(TEST_FILES_PATH, "metadata.json")] 98 | proof.main() 99 | 100 | 101 | def call_proof_sh(user, address, keyPath): 102 | result = subprocess.run([ 103 | "./proof-sh/proof.sh", 104 | user, 105 | address, 106 | os.path.join(TEST_FILES_PATH, "metadata.bin"), keyPath 107 | ], capture_output=True) 108 | if result.returncode != 0: 109 | raise OSError(result) 110 | 111 | return result.stdout.decode() 112 | 113 | 114 | def assert_error(test, fn, err, msg): 115 | try: 116 | fn() 117 | except err as e: 118 | test.assertEqual(msg, str(e)) 119 | return 120 | 121 | test.fail("Did not raise the exception") 122 | 123 | 124 | def parse_print_out(address, print_mock): 125 | printList = print_mock.call_args_list 126 | out = printList[len(printList)-1].args[0].split(",") 127 | 128 | index = int(out[0]) 129 | tempAddress = out[1] 130 | signature = out[2] 131 | merkleProof = json.loads(str(base64.b64decode(out[3]).decode())) 132 | 133 | return index, tempAddress, signature, merkleProof 134 | 135 | 136 | def parse_sh_out(out): 137 | result = out.split('\n') 138 | result = result[len(result)-2].split(",") 139 | 140 | index = int(result[0]) 141 | tempAddress = result[1] 142 | asn1Signature = result[2] 143 | merkleProof = json.loads(str(base64.b64decode(result[3]).decode())) 144 | 145 | return index, tempAddress, asn1Signature, merkleProof 146 | 147 | 148 | def is_valid_asn1_sign(asn1Sign, tempAddress, address): 149 | decoder = asn1.Decoder() 150 | decoder.start(bytes.fromhex(asn1Sign)) 151 | _, value = decoder.read() 152 | decoder.start(value) 153 | _, r = decoder.read() 154 | _, s = decoder.read() 155 | 156 | if (s > SECP256K1_HALF_N): 157 | s = SECP256K1_N - s 158 | 159 | rs = r.to_bytes(32, "big") + s.to_bytes(32, "big") 160 | sign = rs + int(27).to_bytes(1, "big") 161 | 162 | one = w3.eth.account.recover_message( 163 | encode_defunct(hexstr=address), signature=HexBytes(sign) 164 | ).lower() == tempAddress 165 | 166 | sign = rs + int(28).to_bytes(1, "big") 167 | two = w3.eth.account.recover_message( 168 | encode_defunct(hexstr=address), signature=HexBytes(sign) 169 | ).lower() == tempAddress 170 | 171 | return one or two 172 | 173 | 174 | def verify_proof(test, index, tempAddress, signature, merkleProof, address, metadata): 175 | tree = MerkleTree(metadata.addresses) 176 | proof = tree.get_proof(index) 177 | test.assertEqual(proof, merkleProof) 178 | 179 | test.assertEqual(proof, merkleProof) 180 | test.assertEqual( 181 | w3.eth.account.recover_message( 182 | encode_defunct(hexstr=address), signature=HexBytes(signature) 183 | ).lower(), 184 | tempAddress 185 | ) 186 | 187 | 188 | def verify_proof_by_user(test, user, address, keyFileName, print_mock, metadata): 189 | path = os.path.join(TEST_FILES_PATH, keyFileName) 190 | with mock.patch( 191 | 'builtins.input', 192 | side_effect=[user, address, path] 193 | ): 194 | call_proof() 195 | 196 | index1, tempAddress1, signature1, merkleProof1 = parse_print_out( 197 | address, print_mock 198 | ) 199 | verify_proof( 200 | test, 201 | index1, 202 | tempAddress1, 203 | signature1, 204 | merkleProof1, 205 | address, 206 | metadata 207 | ) 208 | 209 | result = call_proof_sh(user, address, path) 210 | index2, tempAddress2, asn1Signatrue, merkleProof2 = parse_sh_out( 211 | result 212 | ) 213 | 214 | test.assertEqual(index1, index2) 215 | test.assertEqual(tempAddress1, 
tempAddress2) 216 | test.assertTrue( 217 | is_valid_asn1_sign( 218 | asn1Signatrue, tempAddress1, address 219 | ) 220 | ) 221 | test.assertEqual(merkleProof1, merkleProof2) 222 | 223 | 224 | @mock.patch('builtins.print') 225 | class Test(unittest.TestCase): 226 | @mock.patch('subprocess.run', side_effect=[subprocess.CompletedProcess([], 1, "", "age test error")]) 227 | def test_generate_error(self, print_mock, subprocess_mock): 228 | assert_error(self, call_generate, OSError, "age test error") 229 | return 230 | 231 | def test_generate(self, print_mock): 232 | self.assertNotEqual(call_generate(), call_generate()) 233 | 234 | @mock.patch('builtins.input', side_effect=["user_999"]) 235 | def test_proof_user_not_found(self, print_mock, inputMock): 236 | assert_error(self, call_proof, ValueError, "User not found") 237 | 238 | @mock.patch('builtins.input', side_effect=[USERS[0], "invalidTestAddress"]) 239 | def test_proof_invalid_eth_address(self, print_mock, inputMock): 240 | assert_error(self, call_proof, ValueError, "Invalid Ethereum address") 241 | 242 | @mock.patch('builtins.input', side_effect=[USERS[0], "0x0000000000000000000000000000000000000006", "key-path"]) 243 | @mock.patch('os.path.exists', side_effect=lambda x: mock.DEFAULT if (x != "key-path") else False) 244 | def test_proof_file_is_not_exist(self, print_mock, inputMock, pathExistsMock): 245 | assert_error(self, call_proof, FileNotFoundError, "File is not exist") 246 | 247 | @mock.patch('builtins.input', side_effect=[USERS[0], "0x0000000000000000000000000000000000000006", "key-path"]) 248 | @mock.patch('os.path.exists', side_effect=lambda x: mock.DEFAULT if (x != "key-path") else True) 249 | @mock.patch('os.path.isfile', side_effect=lambda x: mock.DEFAULT if (x != "key-path") else False) 250 | def test_proof_file_is_not_ssh_key(self, print_mock, inputMock, pathExistsMock, isFileMock): 251 | assert_error(self, call_proof, ValueError, "File is not a SSH key") 252 | 253 | def test_proof(self, print_mock): 254 | metadata = {} 255 | with open(os.path.join(TEST_FILES_PATH, "metadata.json"), "r") as f: 256 | metadata = Metadata.from_json(f.read()) 257 | 258 | for user in USERS: 259 | verify_proof_by_user( 260 | self, 261 | user, 262 | secrets.token_bytes(20).hex(), 263 | user, 264 | print_mock, 265 | metadata 266 | ) 267 | 268 | for user in USERS_WITH_DOUBLE_KEYS: 269 | verify_proof_by_user( 270 | self, 271 | user, 272 | secrets.token_bytes(20).hex(), 273 | user, 274 | print_mock, 275 | metadata 276 | ) 277 | verify_proof_by_user( 278 | self, 279 | user, 280 | secrets.token_bytes(20).hex(), 281 | user + SECOND_KEY_POSTFIX, 282 | print_mock, 283 | metadata 284 | ) 285 | 286 | 287 | if __name__ == '__main__': 288 | init() 289 | unittest.main() 290 | -------------------------------------------------------------------------------- /python/convert.py: -------------------------------------------------------------------------------- 1 | def create_unique_user_list(input_filename, output_filename): 2 | # Множество для хранения уникальных имен пользователей 3 | unique_users = set() 4 | 5 | with open(input_filename, 'r') as input_file: 6 | for line in input_file: 7 | username, _ = line.strip().split(',') 8 | unique_users.add(username) 9 | 10 | with open(output_filename, 'w') as output_file: 11 | for username in sorted(unique_users): 12 | output_file.write(f"{username},") 13 | 14 | input_filename = "metadata.bin" 15 | output_filename = 'github-accounts.txt' 16 | 17 | create_unique_user_list(input_filename, output_filename) 
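Neither `python/helpers/merkle.py` nor the tests fold a Merkle proof back into the published root: the tests rebuild the whole tree and compare proofs instead. The following is a minimal verification sketch, not part of the repository; the file name and `verify_merkle_proof` are illustrative, it assumes the `web3` package already pinned in python/requirements.txt, and it is run from the `python/` directory so that `helpers.merkle` is importable. It mirrors the leaf and pair hashing used in `helpers/merkle.py`.

```python
# verify_claim_proof.py -- illustrative sketch, not part of this repository.
# Assumes the web3 package from python/requirements.txt and that the script is
# run from the python/ directory so that helpers.merkle is importable.
from web3 import Web3

from helpers.merkle import MerkleTree


def verify_merkle_proof(index: int, address: str, proof: list[str], root: str) -> bool:
    """Fold a proof produced by helpers/merkle.py back into the published root.

    Leaf = keccak256(abi.encodePacked(uint32 index, bytes20 address)); each
    parent = keccak256 of the two children concatenated smaller-digest-first,
    exactly as in MerkleTree._create_leafs and MerkleTree._gen_prev_nodes.
    """
    node = bytes(Web3.solidity_keccak(["uint32", "bytes20"], [index, address]))
    for sibling_hex in proof:
        sibling = bytes.fromhex(sibling_hex.removeprefix("0x"))
        if int.from_bytes(node, "big") < int.from_bytes(sibling, "big"):
            node = bytes(Web3.keccak(node + sibling))
        else:
            node = bytes(Web3.keccak(sibling + node))
    return "0x" + node.hex() == root.lower()


if __name__ == "__main__":
    # Addresses and root taken from python/tests/merkle_test.py.
    addresses = [
        "0xbb9c914731A9AA3E222e28d703cf4A4165747572",
        "0x30fc86c1802f67610DA9180D55426c314ae33DF1",
    ]
    root = "0x65a7482ec1e10d14aabb46dce0681c14693fd27e99496b90b88d18d942e69827"
    tree = MerkleTree(addresses)
    print(verify_merkle_proof(0, addresses[0], tree.get_proof(0), root))  # True
```

Because `_gen_prev_nodes` always hashes the smaller digest first, the verifier never needs to know whether a sibling sits on the left or the right; this is also why `get_proof` can simply drop the levels where an odd last node is promoted unchanged.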
-------------------------------------------------------------------------------- /python/generate.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | 3 | import base64 4 | import csv 5 | import argparse 6 | import dataclasses 7 | import os 8 | import subprocess 9 | import secrets 10 | import json 11 | from turtle import st 12 | from helpers.merkle import MerkleTree 13 | from eth_account import Account 14 | from helpers.common import Metadata 15 | from cryptography.hazmat.primitives.asymmetric import ec 16 | from cryptography.hazmat.primitives import serialization as crypto_serialization 17 | from cryptography.hazmat.primitives import serialization 18 | from cryptography.hazmat.primitives.asymmetric import rsa, ed25519 19 | from cryptography.hazmat.backends import default_backend 20 | import csv 21 | import cryptography.exceptions as exceptions 22 | 23 | DEFAULT_OUTPUT_DIR = os.path.join(os. getcwd(), "output") 24 | DEFAULT_FRACTION = 1000 25 | 26 | 27 | @dataclasses.dataclass 28 | class User: 29 | name: str 30 | pubKey: str 31 | 32 | 33 | def validate_key(key: str): 34 | try: 35 | public_key = serialization.load_ssh_public_key( 36 | key.encode(), 37 | backend=default_backend() 38 | ) 39 | except exceptions.UnsupportedAlgorithm as e: 40 | return False 41 | except ValueError as e: 42 | return False 43 | 44 | if isinstance(public_key, rsa.RSAPublicKey): 45 | if public_key.key_size < 2048: 46 | return False 47 | return True 48 | elif isinstance(public_key, ed25519.Ed25519PublicKey): 49 | return True 50 | else: 51 | return False 52 | 53 | 54 | def read_csv(filename): 55 | users = [] 56 | with open(filename, 'r') as csvfile: 57 | reader = csv.reader(csvfile, delimiter=',') 58 | i = 1 59 | totalInvalidCount = 0 60 | totalSkippedCount = 0 61 | for row in reader: 62 | print( 63 | f"Progress: {i} valid, {totalSkippedCount} skipped, {totalInvalidCount} invalid keys", 64 | end="\r") 65 | 66 | if len(row) != 2: 67 | totalSkippedCount += 1 68 | continue 69 | 70 | is_valid = validate_key(row[1]) 71 | if not is_valid: 72 | totalInvalidCount += 1 73 | continue 74 | 75 | i += 1 76 | users.append(User(name=row[0], pubKey=row[1])) 77 | 78 | print(f"Total skipped keys: {totalSkippedCount}") 79 | print(f"Total invalid keys: {totalInvalidCount}") 80 | return users 81 | 82 | 83 | def gen_eth_keys(users): 84 | addresses = [] 85 | privateKeys = {} 86 | 87 | i = 0 88 | fraction001 = int(len(users) / DEFAULT_FRACTION) 89 | for user in users: 90 | username = user.name 91 | privateKey = privateKeys.get(username) 92 | 93 | print(f"Progress: {(i / fraction001 / 10):.2f}%", end="\r") 94 | 95 | i += 1 96 | if privateKey is not None: 97 | continue 98 | 99 | privateKey = "0x" + secrets.token_hex(32) 100 | privateKeys[username] = privateKey 101 | addresses.append(Account.from_key(privateKey).address.lower()) 102 | 103 | return addresses, privateKeys 104 | 105 | 106 | def encrypt_data_with_ssh(data, sshPubKey): 107 | result = subprocess.run(["age", 108 | "--encrypt", 109 | "--recipient", 110 | sshPubKey, 111 | "-o", 112 | "-", 113 | "--armor"], 114 | capture_output=True, 115 | input=data.encode(), 116 | env=env) 117 | if result.returncode != 0: 118 | raise OSError(result.stderr) 119 | return result.stdout.decode() 120 | 121 | 122 | def random_sort(addresses): 123 | length = len(addresses) 124 | 125 | for i in range(length): 126 | j = secrets.randbelow(length) 127 | addresses[i], addresses[j] = addresses[j], addresses[i] 128 | 129 | 130 | def 
encrypt_for_standart_output(users, privateKeys): 131 | encryptedKeys: dict[str, dict[str, str]] = {} 132 | 133 | fraction001 = int(len(users) / DEFAULT_FRACTION) 134 | i = 0 135 | totalFailedCount = 0 136 | for user in users: 137 | username = user.name 138 | sshPubKey = user.pubKey 139 | privateKey = privateKeys[username] 140 | 141 | print(f"Progress: {(i / fraction001 / 10):.2f}%", end="\r") 142 | i += 1 143 | 144 | try: 145 | encrypted_data = encrypt_data_with_ssh( 146 | privateKey, sshPubKey 147 | ) 148 | if username not in encryptedKeys: 149 | encryptedKeys[username] = {} 150 | 151 | encryptedKeys[username][sshPubKey] = encrypted_data 152 | except OSError as e: 153 | totalFailedCount += 1 154 | print(f"Failed to encrypt key for {username} with {sshPubKey}") 155 | print(e) 156 | 157 | return encryptedKeys 158 | 159 | 160 | def encrypt_for_sh_output(tree, users, addresses, privateKeys): 161 | encryptedKeys = {} 162 | indexes = {} 163 | 164 | for i in range(len(addresses)): 165 | indexes[addresses[i]] = i 166 | 167 | fraction001 = int(len(addresses) / DEFAULT_FRACTION) 168 | i = 0 169 | for user in users: 170 | username = user.name 171 | sshPubKey = user.pubKey 172 | privateKey = privateKeys[username] 173 | address = Account.from_key(privateKeys[username]).address.lower() 174 | 175 | print(f"Progress: {(i / fraction001 / 10):.2f}%", end="\r") 176 | i += 1 177 | 178 | if username not in encryptedKeys: 179 | encryptedKeys[username] = [] 180 | 181 | index = indexes[address] 182 | 183 | proof = base64.b64encode(json.dumps( 184 | tree.get_proof(index)).encode() 185 | ).decode() 186 | 187 | key = ec.derive_private_key(int(privateKey, base=16), ec.SECP256K1()) 188 | openSSLPrivKey = "0x" + key.private_bytes( 189 | crypto_serialization.Encoding.DER, 190 | crypto_serialization.PrivateFormat.TraditionalOpenSSL, 191 | crypto_serialization.NoEncryption() 192 | ).hex() 193 | 194 | try: 195 | encryptedData = encrypt_data_with_ssh( 196 | f"{index},{address},{openSSLPrivKey},{proof}", sshPubKey)\ 197 | .replace("-----BEGIN AGE ENCRYPTED FILE-----", "")\ 198 | .replace("-----END AGE ENCRYPTED FILE-----", "") 199 | encryptedKeys[username].append( 200 | base64.b64decode(encryptedData).hex()) 201 | except OSError as e: 202 | print(f"Failed to encrypt key for {username} with {sshPubKey}") 203 | print(e) 204 | 205 | return encryptedKeys 206 | 207 | 208 | def write_output(filePath, root, addresses, encryptedKeys): 209 | metadata = Metadata(root=root, addresses=addresses, 210 | encryptedKeys=encryptedKeys) 211 | 212 | with open(filePath, 'w') as f: 213 | json.dump(metadata.to_dict(), f, ensure_ascii=False, indent=4) 214 | 215 | 216 | def write_output_for_sh_script(filePath, encryptedKeys): 217 | with open(filePath, 'w') as f: 218 | writer = csv.writer(f, delimiter=",") 219 | for username in encryptedKeys: 220 | for key in encryptedKeys[username]: 221 | writer.writerow([username, key]) 222 | 223 | 224 | def main(): 225 | parser = argparse.ArgumentParser() 226 | parser.add_argument('input', type=str) 227 | parser.add_argument('-o', '--output', type=str, default=DEFAULT_OUTPUT_DIR) 228 | 229 | args = parser.parse_args() 230 | inputFilePath = args.input 231 | outputFilePath = args.output 232 | 233 | metadataFilePath = os.path.join(outputFilePath, "metadata.json") 234 | shOutputFILEPath = os.path.join(outputFilePath, "metadata.bin") 235 | 236 | if not os.path.exists(outputFilePath): 237 | os.mkdir(outputFilePath) 238 | 239 | print(f"Reading from {inputFilePath}") 240 | users = read_csv(inputFilePath) 241 | 242 | 
print(f"Generating keys for {len(users)} users") 243 | addresses, privateKeys = gen_eth_keys(users) 244 | 245 | print(f"Suffling addresses") 246 | random_sort(addresses) 247 | 248 | print(f"Generating Merkle tree") 249 | tree = MerkleTree(addresses) 250 | 251 | write_output(metadataFilePath, tree.get_root(), addresses, 252 | encrypt_for_standart_output(users, privateKeys)) 253 | write_output_for_sh_script(shOutputFILEPath, encrypt_for_sh_output( 254 | tree, users, addresses, privateKeys)) 255 | 256 | print(f"Metadata file written to {metadataFilePath}") 257 | 258 | 259 | if __name__ == '__main__': 260 | main() 261 | -------------------------------------------------------------------------------- /python/helpers/common.py: -------------------------------------------------------------------------------- 1 | from dataclasses import dataclass 2 | from dataclasses_json import dataclass_json 3 | 4 | 5 | @dataclass_json 6 | @dataclass 7 | class Metadata: 8 | root: str 9 | addresses: list[str] 10 | encryptedKeys: dict[str, dict[str, str]] 11 | -------------------------------------------------------------------------------- /python/helpers/merkle.py: -------------------------------------------------------------------------------- 1 | import math 2 | from web3 import Web3 3 | 4 | 5 | class MerkleTree: 6 | def __init__(self, accounts): 7 | self._tree = [] 8 | self._total_depth = 0 9 | nodes = self._create_leafs(accounts) 10 | self._gen_tree(nodes) 11 | 12 | def _gen_tree(self, nodes): 13 | self._tree.append(nodes) 14 | self._total_depth = math.ceil(math.log(len(nodes), 2)) 15 | 16 | while (len(nodes) > 1): 17 | nodes = self._gen_prev_nodes(nodes) 18 | self._tree.append(nodes) 19 | 20 | def _gen_prev_nodes(self, nodes): 21 | newNodes = [] 22 | length = len(nodes) 23 | for i in range(0, length, 2): 24 | if length % 2 != 0 and i+1 >= length: 25 | newNodes.append(nodes[i]) 26 | break 27 | 28 | a = nodes[i] 29 | b = nodes[i+1] 30 | 31 | if (int.from_bytes(a, byteorder='big') < int.from_bytes(b, byteorder='big')): 32 | newNodes.append(Web3.keccak(a + b)) 33 | else: 34 | newNodes.append(Web3.keccak(b + a)) 35 | 36 | return newNodes 37 | 38 | def _create_leafs(self, accounts): 39 | leafs = [] 40 | for i, account in enumerate(accounts): 41 | leaf = Web3.solidity_keccak( 42 | ["uint32", "bytes20"], 43 | [ 44 | i, 45 | account 46 | ] 47 | ) 48 | leafs.append(leaf) 49 | 50 | return leafs 51 | 52 | def get_proof(self, index): 53 | proof = [] 54 | for nodes in self._tree: 55 | length = len(nodes) 56 | 57 | if length == 1: 58 | break 59 | 60 | if length % 2 != 0 and index == length-1: 61 | index = index // 2 62 | continue 63 | 64 | if (index % 2 == 0): 65 | proof.append(nodes[index+1].hex()) 66 | else: 67 | proof.append(nodes[index-1].hex()) 68 | 69 | index = index // 2 70 | 71 | return proof 72 | 73 | def get_root(self): 74 | return (self._tree[self._total_depth][0]).hex() 75 | -------------------------------------------------------------------------------- /python/proof.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | import base64 3 | import json 4 | import os 5 | import sys 6 | import subprocess 7 | from web3 import Web3 8 | from pathlib import Path 9 | from eth_account.messages import encode_defunct 10 | from web3.auto import w3 11 | from helpers.common import Metadata 12 | from helpers.merkle import MerkleTree 13 | 14 | SSH_KEYS_DIR = os.path.join(Path.home(), ".ssh") 15 | BIN_DIR= os.path.join(os.getcwd(), "./bin") 16 | IGNORED_SSH_DIR_FILES = 
['.DS_Store'] 17 | 18 | # Set PATH for python to find age 19 | env = os.environ.copy() 20 | env["PATH"] = BIN_DIR + os.pathsep + env["PATH"] 21 | 22 | def error(msg): 23 | print(f"\033[31m{msg}\033[0m") 24 | sys.exit(1) 25 | 26 | 27 | def parse_metadata(filename): 28 | with open(filename, 'r') as f: 29 | return Metadata.from_json(f.read()) 30 | 31 | 32 | def ask_user_info(metadata): 33 | username = input( 34 | "Enter your github username so we can check if you are participating in the airdrop:\n") 35 | if username not in metadata.encryptedKeys: 36 | error("This Github account is not eligible for claiming") 37 | 38 | ethereumAddress = input( 39 | ''' 40 | Ethereum wallet address is necessary to generate a proof that you will send through our web page. 41 | \033[33mImportant notice: you need to make a claim transaction from the entered address!\033[0m 42 | 43 | Enter the ethereum address to which you plan to receive the airdrop: 44 | ''') 45 | 46 | if not Web3.is_address(ethereumAddress): 47 | error("You entered an incorrect Ethereum address") 48 | 49 | return (username, ethereumAddress) 50 | 51 | 52 | def choose_ssh_key(): 53 | files = [] 54 | try: 55 | files = os.listdir(SSH_KEYS_DIR) 56 | except FileNotFoundError: 57 | pass 58 | 59 | sshKeys = [] 60 | for f in files: 61 | if f in IGNORED_SSH_DIR_FILES: 62 | continue 63 | 64 | path = os.path.join(SSH_KEYS_DIR, f) 65 | if not is_ssh_key(path): 66 | continue 67 | sshKeys.append(path) 68 | 69 | if len(sshKeys) > 0: 70 | print(f"\nYour ssh keys in ~/.ssh:") 71 | for key in sshKeys: 72 | print(key) 73 | 74 | sshKeyPath = input( 75 | ''' 76 | Now the script needs your ssh key to generate proof. Please, enter path for github SSH key: 77 | ''') 78 | pubKeyPath = sshKeyPath + ".pub" 79 | 80 | if not os.path.exists(sshKeyPath): 81 | error("Specified file does not exits") 82 | elif not is_ssh_key(sshKeyPath): 83 | error("Specified file is not a SSH private key") 84 | elif not os.path.isfile(pubKeyPath) or not os.path.exists(pubKeyPath): 85 | error( 86 | f"SSH public key ({pubKeyPath}) does not exist in current directory") 87 | 88 | pubKey = "" 89 | with open(pubKeyPath, 'r') as pubKeyFile: 90 | pubKey = " ".join(pubKeyFile.read().split(" ")[0:2]) 91 | 92 | return pubKey.strip(), sshKeyPath 93 | 94 | 95 | def is_ssh_key(path): 96 | if not os.path.isfile(path): 97 | return False 98 | 99 | with open(path, 'r') as file: 100 | try: 101 | key = file.read() 102 | return key.startswith("-----BEGIN") 103 | except Exception as e: 104 | print("Couldn't read the file:", file, "Reason:", e) 105 | return False 106 | 107 | 108 | def decrypt_temp_eth_account(sshPubKey, sshPrivKey, username, metadata): 109 | if sshPubKey not in metadata.encryptedKeys[username]: 110 | error("Specified SSH key is not eligible for claiming. 
Only RSA and Ed25519 keys added before our Github snapshot are supported for proof generation.") 111 | 112 | data = metadata.encryptedKeys[username][sshPubKey] 113 | result = subprocess.run(["age", 114 | "--decrypt", 115 | "--identity", 116 | sshPrivKey], 117 | capture_output=True, 118 | input=data.encode(), 119 | env=env) 120 | if result.returncode != 0: 121 | age_stderr = result.stderr.replace('https://filippo.io/age/report', 'https://fluence.chat') 122 | raise OSError(age_stderr) 123 | 124 | return w3.eth.account.from_key(result.stdout.decode()) 125 | 126 | 127 | def get_merkle_proof(metadata, tempETHAccount): 128 | address = tempETHAccount.address.lower() 129 | if address not in metadata.addresses: 130 | raise ValueError("Invalid temp address") 131 | 132 | tree = MerkleTree(metadata.addresses) 133 | index = metadata.addresses.index(address) 134 | return index, tree.get_proof(index) 135 | 136 | 137 | def main(): 138 | metadataPath = "metadata.json" 139 | metadata = parse_metadata(metadataPath) 140 | 141 | print(''' 142 | Welcome to the proof generation script for Fluence Developer Reward Airdrop. 143 | 5% of the FLT supply is allocated to ~110,000 developers who contributed into open source web3 repositories during last year. 144 | Public keys of selected Github accounts were added into a smart contract on Ethereum. Claim your allocation and help us build the decentralized internet together! 145 | 146 | Check if you are eligible and proceed with claiming 147 | ''') 148 | 149 | username, receiverAddress = ask_user_info(metadata) 150 | sshPubKey, sshKeyPath = choose_ssh_key() 151 | tempETHAccount = decrypt_temp_eth_account( 152 | sshPubKey, sshKeyPath, username, metadata 153 | ) 154 | index, merkleProof = get_merkle_proof(metadata, tempETHAccount) 155 | base64MerkleProof = base64.b64encode( 156 | json.dumps(merkleProof).encode() 157 | ).decode() 158 | 159 | sign = tempETHAccount.sign_message(encode_defunct(hexstr=receiverAddress)) 160 | 161 | print("\nSuccess! 
Copy the line below and paste it in the fluence airdrop website.") 162 | print(f"{index},{tempETHAccount.address.lower()},{sign.signature.hex()},{base64MerkleProof}") 163 | 164 | 165 | if __name__ == '__main__': 166 | main() 167 | -------------------------------------------------------------------------------- /python/requirements.txt: -------------------------------------------------------------------------------- 1 | eth-account==0.11.0 2 | web3==6.15.1 3 | dataclasses-json==0.6.4 -------------------------------------------------------------------------------- /python/tests/merkle_test.py: -------------------------------------------------------------------------------- 1 | import unittest 2 | from hexbytes import HexBytes 3 | from helpers.merkle import MerkleTree 4 | 5 | 6 | class Test(unittest.TestCase): 7 | def test_gen_merkle_root_one(self): 8 | tree = MerkleTree([ 9 | "0xbb9c914731A9AA3E222e28d703cf4A4165747572", 10 | "0x30fc86c1802f67610DA9180D55426c314ae33DF1" 11 | ]) 12 | 13 | self.assertEqual( 14 | tree.get_root(), 15 | "0x65a7482ec1e10d14aabb46dce0681c14693fd27e99496b90b88d18d942e69827" 16 | ) 17 | 18 | def test_gen_merkle_root_two(self): 19 | tree = MerkleTree([ 20 | "0xbb9c914731A9AA3E222e28d703cf4A4165747572", 21 | "0x30fc86c1802f67610DA9180D55426c314ae33DF1", 22 | "0x1bF09865467eE989b25b8177801bb14b758CA1EA", 23 | "0xFCED23957E60AcCc949BF021Ee4e98941c3A6EeA" 24 | ]) 25 | 26 | self.assertEqual( 27 | tree.get_root(), 28 | "0x887d282fcf9b1b18f5d5ad073add82bb1f9b5947074f17b049423c43c2edc5eb" 29 | ) 30 | 31 | def test_gen_merkle_root_three(self): 32 | tree = MerkleTree([ 33 | "0xbb9c914731A9AA3E222e28d703cf4A4165747572", 34 | "0x30fc86c1802f67610DA9180D55426c314ae33DF1", 35 | "0x1bF09865467eE989b25b8177801bb14b758CA1EA", 36 | "0xFCED23957E60AcCc949BF021Ee4e98941c3A6EeA", 37 | "0xbE51f9928E498D018C18940FD1EbDBB04F493a45", 38 | "0x4Ee1279a8B1b553c25E008222A25e7FAb6feD1Df" 39 | ]) 40 | self.assertEqual( 41 | tree.get_root(), 42 | "0x4c7b502da67f50d75e1ffdc8e9230f0766908a4f4bec41425c476992e175e539" 43 | ) 44 | 45 | def test_gen_merkle_root_tour(self): 46 | tree = MerkleTree([ 47 | "0xbb9c914731A9AA3E222e28d703cf4A4165747572", 48 | "0x30fc86c1802f67610DA9180D55426c314ae33DF1", 49 | "0x1bF09865467eE989b25b8177801bb14b758CA1EA", 50 | "0xFCED23957E60AcCc949BF021Ee4e98941c3A6EeA", 51 | "0xbE51f9928E498D018C18940FD1EbDBB04F493a45", 52 | "0x4Ee1279a8B1b553c25E008222A25e7FAb6feD1Df", 53 | "0x7d95dc520960c65cfd484c2b8a1Ca98f358dEe97" 54 | ]) 55 | self.assertEqual( 56 | tree.get_root(), 57 | "0x3271eb7121b96c005fe3235f211b442a92f491c4c067e249e0ae0af655c96937" 58 | ) 59 | 60 | def test_gen_merkle_root_tour(self): 61 | v = [ 62 | "0xbb9c914731A9AA3E222e28d703cf4A4165747572", 63 | "0x30fc86c1802f67610DA9180D55426c314ae33DF1", 64 | "0x1bF09865467eE989b25b8177801bb14b758CA1EA", 65 | "0xFCED23957E60AcCc949BF021Ee4e98941c3A6EeA", 66 | "0xbE51f9928E498D018C18940FD1EbDBB04F493a45", 67 | "0x4Ee1279a8B1b553c25E008222A25e7FAb6feD1Df", 68 | "0x7d95dc520960c65cfd484c2b8a1Ca98f358dEe97" 69 | ] 70 | 71 | tree = MerkleTree(v) 72 | self.assertEqual( 73 | tree.get_root(), 74 | "0x3271eb7121b96c005fe3235f211b442a92f491c4c067e249e0ae0af655c96937" 75 | ) 76 | 77 | proof = [ 78 | [ 79 | '0x12c17dcd8b8eac48894f491746f44bf8c8bdeab97b0fb42780d032af6a9bae74', 80 | '0x17a23e9275f7628e4fca74944437191dbe7a95cc9cefa865bfb3d54321ec7e00', 81 | '0xf8cd45cad8572d8dd0300ec8dee54aa2a3dcf7565c0c875234c779a04370c686' 82 | ], 83 | [ 84 | '0x827ab251e8cf1433dd27f54b8a3d98174056749dd3ff67ecac0837e5f6cb3cd0', 85 | 
'0x17a23e9275f7628e4fca74944437191dbe7a95cc9cefa865bfb3d54321ec7e00', 86 | '0xf8cd45cad8572d8dd0300ec8dee54aa2a3dcf7565c0c875234c779a04370c686' 87 | ], 88 | [ 89 | '0x64c3d348b53464e4cd091fb28a52c2a9d9026fd7be0177e8a58c18e9a1dd2698', 90 | '0x65a7482ec1e10d14aabb46dce0681c14693fd27e99496b90b88d18d942e69827', 91 | '0xf8cd45cad8572d8dd0300ec8dee54aa2a3dcf7565c0c875234c779a04370c686' 92 | ], 93 | [ 94 | '0xe4207a74cd8eafd52a57d95be9c2397a1b33bcecba12442195705d441444b3a9', 95 | '0x65a7482ec1e10d14aabb46dce0681c14693fd27e99496b90b88d18d942e69827', 96 | '0xf8cd45cad8572d8dd0300ec8dee54aa2a3dcf7565c0c875234c779a04370c686' 97 | ], 98 | [ 99 | '0xed2f78721594402e6757f4d5d8b491cc6f06c10d6cd3a99803db3f8932d36c4d', 100 | '0x8ac28980a9a848e172c74c57e2f7fff64e73c75799da7e619372ff3173c3be69', 101 | '0x887d282fcf9b1b18f5d5ad073add82bb1f9b5947074f17b049423c43c2edc5eb' 102 | ], 103 | [ 104 | '0x2a06ae5e04fb1513e75d6fe541e3a39bfedca64b209a6c1016bc4af2cc3fa883', 105 | '0x8ac28980a9a848e172c74c57e2f7fff64e73c75799da7e619372ff3173c3be69', 106 | '0x887d282fcf9b1b18f5d5ad073add82bb1f9b5947074f17b049423c43c2edc5eb' 107 | ], 108 | [ 109 | '0xfc4778b67ab8597a9c5d25e5e0062db02ed226352720db0d70b19c06a333ddf7', 110 | '0x887d282fcf9b1b18f5d5ad073add82bb1f9b5947074f17b049423c43c2edc5eb' 111 | ] 112 | ] 113 | 114 | for i, _ in enumerate(v): 115 | p = tree.get_proof(i) 116 | self.assertEqual(p, proof[i]) 117 | 118 | 119 | if __name__ == '__main__': 120 | unittest.main() 121 | -------------------------------------------------------------------------------- /web/index.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | 10 | 11 | 12 | Fluence Token Proof Generator 13 | 152 | 153 | 154 | 155 |
--------------------------------------------------------------------------------
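The line printed by python/proof.py has the form `index,tempEthAddr,signatureHex,base64MerkleProof`. Below is a minimal sketch of how that line can be checked locally before pasting it into the claim website; the file name and prompts are illustrative, and it assumes the `eth_account` package already pinned in python/requirements.txt. It covers only the python-script output: the shell script emits an ASN.1/DER signature instead, which first needs the r/s/v normalization exercised by `is_valid_asn1_sign` in python/_test_.py.

```python
# decode_claim_line.py -- illustrative sketch, not part of this repository.
# Checks a proof line produced by python/proof.py before it is submitted.
import base64
import json
import sys

from eth_account import Account
from eth_account.messages import encode_defunct

line = sys.argv[1] if len(sys.argv) > 1 else input("Paste the proof line: ")
index, temp_address, signature_hex, proof_b64 = line.strip().split(",")

# The fourth field is a base64-encoded JSON array of sibling hashes.
merkle_proof = json.loads(base64.b64decode(proof_b64))

# The wallet address that was entered when the proof was generated.
receiver_address = input("Receiver address used during generation (0x...): ")

# proof.py signs encode_defunct(hexstr=receiverAddress) with the temporary key,
# so recovering the signer must give back the temporary address from field 2.
recovered = Account.recover_message(
    encode_defunct(hexstr=receiver_address),
    signature=bytes.fromhex(signature_hex.removeprefix("0x")),
)

print("leaf index:     ", int(index))
print("temp address:   ", temp_address)
print("merkle proof:   ", len(merkle_proof), "hashes")
print("signature valid:", recovered.lower() == temp_address.lower())
```

If the recovered address does not match the second field, the receiver address entered here differs from the one used when the proof was generated.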