├── .editorconfig ├── .github └── ISSUE_TEMPLATE.md ├── .gitignore ├── .readthedocs.yml ├── .travis.yml ├── AUTHORS.rst ├── CONTRIBUTING.rst ├── HISTORY.rst ├── LICENSE ├── MANIFEST.in ├── Makefile ├── README.rst ├── docs ├── Makefile ├── api_reference.rst ├── authors.rst ├── conf.py ├── contributing.rst ├── history.rst ├── index.rst ├── installation.rst ├── make.bat ├── readme.rst └── usage.rst ├── multihash ├── __init__.py ├── constants.py └── multihash.py ├── requirements_dev.txt ├── setup.cfg ├── setup.py ├── tests └── test_multihash.py └── tox.ini /.editorconfig: -------------------------------------------------------------------------------- 1 | # http://editorconfig.org 2 | 3 | root = true 4 | 5 | [*] 6 | indent_style = space 7 | indent_size = 4 8 | trim_trailing_whitespace = true 9 | insert_final_newline = true 10 | charset = utf-8 11 | end_of_line = lf 12 | 13 | [*.bat] 14 | indent_style = tab 15 | end_of_line = crlf 16 | 17 | [LICENSE] 18 | insert_final_newline = false 19 | 20 | [Makefile] 21 | indent_style = tab 22 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE.md: -------------------------------------------------------------------------------- 1 | * py-multihash version: 2 | * Python version: 3 | * Operating System: 4 | 5 | ### Description 6 | 7 | Describe what you were trying to get done. 8 | Tell us what happened, what went wrong, and what you expected to happen. 9 | 10 | ### What I Did 11 | 12 | ``` 13 | Paste the command(s) you ran and the output. 14 | If there was a crash, please include the traceback here. 
15 | ``` 16 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | # Byte-compiled / optimized / DLL files 2 | __pycache__/ 3 | *.py[cod] 4 | *$py.class 5 | 6 | # C extensions 7 | *.so 8 | 9 | # Distribution / packaging 10 | .Python 11 | env/ 12 | build/ 13 | develop-eggs/ 14 | dist/ 15 | downloads/ 16 | eggs/ 17 | .eggs/ 18 | lib/ 19 | lib64/ 20 | parts/ 21 | sdist/ 22 | var/ 23 | wheels/ 24 | *.egg-info/ 25 | .installed.cfg 26 | *.egg 27 | 28 | # PyInstaller 29 | # Usually these files are written by a python script from a template 30 | # before PyInstaller builds the exe, so as to inject date/other infos into it. 31 | *.manifest 32 | *.spec 33 | 34 | # Installer logs 35 | pip-log.txt 36 | pip-delete-this-directory.txt 37 | 38 | # Unit test / coverage reports 39 | htmlcov/ 40 | .tox/ 41 | .coverage 42 | .coverage.* 43 | .cache 44 | nosetests.xml 45 | coverage.xml 46 | *.cover 47 | .hypothesis/ 48 | .pytest_cache/ 49 | 50 | # Translations 51 | *.mo 52 | *.pot 53 | 54 | # Django stuff: 55 | *.log 56 | local_settings.py 57 | 58 | # Flask stuff: 59 | instance/ 60 | .webassets-cache 61 | 62 | # Scrapy stuff: 63 | .scrapy 64 | 65 | # Sphinx documentation 66 | docs/_build/ 67 | 68 | # PyBuilder 69 | target/ 70 | 71 | # Jupyter Notebook 72 | .ipynb_checkpoints 73 | 74 | # pyenv 75 | .python-version 76 | 77 | # celery beat schedule file 78 | celerybeat-schedule 79 | 80 | # SageMath parsed files 81 | *.sage.py 82 | 83 | # dotenv 84 | .env 85 | 86 | # virtualenv 87 | .venv 88 | venv/ 89 | ENV/ 90 | 91 | # Spyder project settings 92 | .spyderproject 93 | .spyproject 94 | 95 | # Rope project settings 96 | .ropeproject 97 | 98 | # mkdocs documentation 99 | /site 100 | 101 | # mypy 102 | .mypy_cache/ 103 | -------------------------------------------------------------------------------- /.readthedocs.yml: 
-------------------------------------------------------------------------------- 1 | requirements_file: requirements_dev.txt 2 | python: 3 | version: 3.5 4 | pip_install: true 5 | -------------------------------------------------------------------------------- /.travis.yml: -------------------------------------------------------------------------------- 1 | # Config file for automatic testing at travis-ci.org 2 | 3 | language: python 4 | python: 5 | - 3.8 6 | - 3.7 7 | - 3.6 8 | - 3.5 9 | 10 | cache: pip 11 | # Command to install dependencies, e.g. pip install -r requirements.txt --use-mirrors 12 | install: pip install -U tox-travis 13 | 14 | # Command to run tests, e.g. python setup.py test 15 | script: tox 16 | -------------------------------------------------------------------------------- /AUTHORS.rst: -------------------------------------------------------------------------------- 1 | Authors 2 | ------- 3 | 4 | * Dhruv Baldawa <@dhruvbaldawa> 5 | -------------------------------------------------------------------------------- /CONTRIBUTING.rst: -------------------------------------------------------------------------------- 1 | .. highlight:: shell 2 | 3 | ============ 4 | Contributing 5 | ============ 6 | 7 | Contributions are welcome, and they are greatly appreciated! Every little bit 8 | helps, and credit will always be given. 9 | 10 | You can contribute in many ways: 11 | 12 | Types of Contributions 13 | ---------------------- 14 | 15 | Report Bugs 16 | ~~~~~~~~~~~ 17 | 18 | Report bugs at https://github.com/multiformats/py-multihash/issues. 19 | 20 | If you are reporting a bug, please include: 21 | 22 | * Your operating system name and version. 23 | * Any details about your local setup that might be helpful in troubleshooting. 24 | * Detailed steps to reproduce the bug. 25 | 26 | Fix Bugs 27 | ~~~~~~~~ 28 | 29 | Look through the GitHub issues for bugs. Anything tagged with "bug" and "help 30 | wanted" is open to whoever wants to implement it. 
31 | 32 | Implement Features 33 | ~~~~~~~~~~~~~~~~~~ 34 | 35 | Look through the GitHub issues for features. Anything tagged with "enhancement" 36 | and "help wanted" is open to whoever wants to implement it. 37 | 38 | Write Documentation 39 | ~~~~~~~~~~~~~~~~~~~ 40 | 41 | py-multihash could always use more documentation, whether as part of the 42 | official py-multihash docs, in docstrings, or even on the web in blog posts, 43 | articles, and such. 44 | 45 | Submit Feedback 46 | ~~~~~~~~~~~~~~~ 47 | 48 | The best way to send feedback is to file an issue at https://github.com/multiformats/py-multihash/issues. 49 | 50 | If you are proposing a feature: 51 | 52 | * Explain in detail how it would work. 53 | * Keep the scope as narrow as possible, to make it easier to implement. 54 | * Remember that this is a volunteer-driven project, and that contributions 55 | are welcome :) 56 | 57 | Get Started! 58 | ------------ 59 | 60 | Ready to contribute? Here's how to set up `multihash` for local development. 61 | 62 | 1. Fork the `py-multihash` repo on GitHub. 63 | 2. Clone your fork locally:: 64 | 65 | $ git clone git@github.com:your_name_here/py-multihash.git 66 | 67 | 3. Install your local copy into a virtualenv. Assuming you have virtualenvwrapper installed, this is how you set up your fork for local development:: 68 | 69 | $ mkvirtualenv multihash 70 | $ cd multihash/ 71 | $ python setup.py develop 72 | 73 | 4. Create a branch for local development:: 74 | 75 | $ git checkout -b name-of-your-bugfix-or-feature 76 | 77 | Now you can make your changes locally. 78 | 79 | 5. When you're done making changes, check that your changes pass flake8 and the 80 | tests, including testing other Python versions with tox:: 81 | 82 | $ flake8 multihash tests 83 | $ python setup.py test  # or: py.test 84 | $ tox 85 | 86 | To get flake8 and tox, just pip install them into your virtualenv. 87 | 88 | 6. Commit your changes and push your branch to GitHub:: 89 | 90 | $ git add .
91 | $ git commit -m "Your detailed description of your changes." 92 | $ git push origin name-of-your-bugfix-or-feature 93 | 94 | 7. Submit a pull request through the GitHub website. 95 | 96 | Pull Request Guidelines 97 | ----------------------- 98 | 99 | Before you submit a pull request, check that it meets these guidelines: 100 | 101 | 1. The pull request should include tests. 102 | 2. If the pull request adds functionality, the docs should be updated. Put 103 | your new functionality into a function with a docstring, and add the 104 | feature to the list in README.rst. 105 | 3. The pull request should work for Python 3.5, 3.6, 3.7, and 3.8. Check 106 | https://travis-ci.org/multiformats/py-multihash/pull_requests 107 | and make sure that the tests pass for all supported Python versions. 108 | 109 | Tips 110 | ---- 111 | 112 | To run a subset of tests:: 113 | 114 | $ py.test tests/test_multihash.py 115 | 116 | 117 | Deploying 118 | --------- 119 | 120 | A reminder for the maintainers on how to deploy. 121 | Make sure all your changes are committed (including an entry in HISTORY.rst). 122 | Then run:: 123 | 124 | $ bumpversion patch # possible: major / minor / patch 125 | $ git push 126 | $ git push --tags 127 | 128 | Travis will then deploy to PyPI if tests pass. 129 | -------------------------------------------------------------------------------- /HISTORY.rst: -------------------------------------------------------------------------------- 1 | History 2 | ------- 3 | 4 | 0.2.3 (2018-10-20) 5 | ======================= 6 | * Fix issue with decoding breaking with app codes 7 | * Fix issue with invalid varint decoding 8 | 9 | 0.1.0 (2018-10-19) 10 | ======================= 11 | 12 | * First release on PyPI.
13 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2018, Dhruv Baldawa 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 
22 | 23 | -------------------------------------------------------------------------------- /MANIFEST.in: -------------------------------------------------------------------------------- 1 | include AUTHORS.rst 2 | include CONTRIBUTING.rst 3 | include HISTORY.rst 4 | include LICENSE 5 | include README.rst 6 | 7 | recursive-include tests * 8 | recursive-exclude * __pycache__ 9 | recursive-exclude * *.py[co] 10 | 11 | recursive-include docs *.rst conf.py Makefile make.bat *.jpg *.png *.gif 12 | -------------------------------------------------------------------------------- /Makefile: -------------------------------------------------------------------------------- 1 | .PHONY: clean clean-test clean-pyc clean-build docs help 2 | .DEFAULT_GOAL := help 3 | 4 | define BROWSER_PYSCRIPT 5 | import os, webbrowser, sys 6 | 7 | try: 8 | from urllib import pathname2url 9 | except: 10 | from urllib.request import pathname2url 11 | 12 | webbrowser.open("file://" + pathname2url(os.path.abspath(sys.argv[1]))) 13 | endef 14 | export BROWSER_PYSCRIPT 15 | 16 | define PRINT_HELP_PYSCRIPT 17 | import re, sys 18 | 19 | for line in sys.stdin: 20 | match = re.match(r'^([a-zA-Z_-]+):.*?## (.*)$$', line) 21 | if match: 22 | target, help = match.groups() 23 | print("%-20s %s" % (target, help)) 24 | endef 25 | export PRINT_HELP_PYSCRIPT 26 | 27 | BROWSER := python -c "$$BROWSER_PYSCRIPT" 28 | 29 | help: 30 | @python -c "$$PRINT_HELP_PYSCRIPT" < $(MAKEFILE_LIST) 31 | 32 | clean: clean-build clean-pyc clean-test ## remove all build, test, coverage and Python artifacts 33 | 34 | clean-build: ## remove build artifacts 35 | rm -fr build/ 36 | rm -fr dist/ 37 | rm -fr .eggs/ 38 | find . -name '*.egg-info' -exec rm -fr {} + 39 | find . -name '*.egg' -exec rm -f {} + 40 | 41 | clean-pyc: ## remove Python file artifacts 42 | find . -name '*.pyc' -exec rm -f {} + 43 | find . -name '*.pyo' -exec rm -f {} + 44 | find . -name '*~' -exec rm -f {} + 45 | find . 
-name '__pycache__' -exec rm -fr {} + 46 | 47 | clean-test: ## remove test and coverage artifacts 48 | rm -f .coverage 49 | rm -fr htmlcov/ 50 | rm -fr .pytest_cache 51 | 52 | lint: ## check style with flake8 53 | flake8 multihash tests 54 | 55 | test: ## run tests quickly with the default Python 56 | py.test --cov=multihash/ --cov-report=html --cov-report=term-missing --cov-branch 57 | 58 | 59 | test-all: ## run tests on every Python version with tox 60 | tox 61 | 62 | coverage: ## check code coverage quickly with the default Python 63 | coverage run --source multihash -m pytest 64 | coverage report -m 65 | coverage html 66 | $(BROWSER) htmlcov/index.html 67 | 68 | docs: ## generate Sphinx HTML documentation, including API docs 69 | rm -f docs/multihash.rst 70 | rm -f docs/modules.rst 71 | $(MAKE) -C docs clean 72 | $(MAKE) -C docs html 73 | $(BROWSER) docs/_build/html/index.html 74 | 75 | servedocs: docs ## compile the docs watching for changes 76 | watchmedo shell-command -p '*.rst;*.py' -c '$(MAKE) -C docs html' -R -D . 77 | 78 | verify_description: 79 | python setup.py --long-description | rst2html.py > /dev/null 80 | 81 | release: clean verify_description ## package and upload a release 82 | python setup.py sdist bdist_wheel 83 | twine upload dist/* 84 | 85 | dist: clean ## builds source and wheel package 86 | python setup.py sdist 87 | python setup.py bdist_wheel 88 | ls -l dist 89 | 90 | install: clean ## install the package to the active Python's site-packages 91 | python setup.py install 92 | -------------------------------------------------------------------------------- /README.rst: -------------------------------------------------------------------------------- 1 | **This project is no longer maintained and has been archived.** 2 | 3 | ============ 4 | py-multihash 5 | ============ 6 | 7 | 8 | .. image:: https://img.shields.io/pypi/v/py-multihash.svg 9 | :target: https://pypi.python.org/pypi/py-multihash 10 | 11 | .. 
image:: https://img.shields.io/travis/multiformats/py-multihash.svg?branch=master 12 | :target: https://travis-ci.org/multiformats/py-multihash?branch=master 13 | 14 | .. image:: https://codecov.io/gh/multiformats/py-multihash/branch/master/graph/badge.svg 15 | :target: https://codecov.io/gh/multiformats/py-multihash 16 | 17 | .. image:: https://readthedocs.org/projects/multihash/badge/?version=stable 18 | :target: https://multihash.readthedocs.io/en/stable/?badge=stable 19 | :alt: Documentation Status 20 | 21 | 22 | 23 | Multihash implementation in Python 24 | 25 | 26 | * Free software: MIT license 27 | * Documentation: https://multihash.readthedocs.io. 28 | * Python versions: Python 3.5, 3.6, 3.7, 3.8 29 | 30 | -------------------------------------------------------------------------------- /docs/Makefile: -------------------------------------------------------------------------------- 1 | # Minimal makefile for Sphinx documentation 2 | # 3 | 4 | # You can set these variables from the command line. 5 | SPHINXOPTS = 6 | SPHINXBUILD = python -msphinx 7 | SPHINXPROJ = multihash 8 | SOURCEDIR = . 9 | BUILDDIR = _build 10 | 11 | # Put it first so that "make" without argument is like "make help". 12 | help: 13 | @$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O) 14 | 15 | .PHONY: help Makefile 16 | 17 | # Catch-all target: route all unknown targets to Sphinx using the new 18 | # "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS). 19 | %: Makefile 20 | @$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O) 21 | -------------------------------------------------------------------------------- /docs/api_reference.rst: -------------------------------------------------------------------------------- 1 | API Reference 2 | ============= 3 | 4 | .. py:currentmodule:: multihash 5 | 6 | .. autofunction:: encode 7 | 8 | .. autofunction:: decode 9 | 10 | .. autofunction:: is_valid 11 | 12 | .. autofunction:: is_valid_code 13 | 14 | ..
autofunction:: get_prefix 15 | 16 | .. autofunction:: coerce_code 17 | 18 | .. autofunction:: is_app_code 19 | 20 | .. autofunction:: to_hex_string 21 | 22 | .. autofunction:: from_hex_string 23 | 24 | .. autofunction:: to_b58_string 25 | 26 | .. autofunction:: from_b58_string 27 | 28 | .. autoclass:: Multihash 29 | -------------------------------------------------------------------------------- /docs/authors.rst: -------------------------------------------------------------------------------- 1 | .. include:: ../AUTHORS.rst 2 | -------------------------------------------------------------------------------- /docs/conf.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | # 4 | # multihash documentation build configuration file, created by 5 | # sphinx-quickstart on Fri Jun 9 13:47:02 2017. 6 | # 7 | # This file is execfile()d with the current directory set to its 8 | # containing dir. 9 | # 10 | # Note that not all possible configuration values are present in this 11 | # autogenerated file. 12 | # 13 | # All configuration values have a default; values that are commented out 14 | # serve to show the default. 15 | 16 | # If extensions (or modules to document with autodoc) are in another 17 | # directory, add these directories to sys.path here. If the directory is 18 | # relative to the documentation root, use os.path.abspath to make it 19 | # absolute, like shown here. 20 | # 21 | import os 22 | import sys 23 | sys.path.insert(0, os.path.abspath('..')) 24 | 25 | import multihash 26 | 27 | # -- General configuration --------------------------------------------- 28 | 29 | # If your documentation needs a minimal Sphinx version, state it here. 30 | # 31 | # needs_sphinx = '1.0' 32 | 33 | # Add any Sphinx extension module names here, as strings. They can be 34 | # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom ones. 
35 | extensions = ['sphinx.ext.autodoc', 'sphinx.ext.viewcode'] 36 | 37 | # Add any paths that contain templates here, relative to this directory. 38 | templates_path = ['_templates'] 39 | 40 | # The suffix(es) of source filenames. 41 | # You can specify multiple suffixes as a list of strings: 42 | # 43 | # source_suffix = ['.rst', '.md'] 44 | source_suffix = '.rst' 45 | 46 | # The master toctree document. 47 | master_doc = 'index' 48 | 49 | # General information about the project. 50 | project = u'py-multihash' 51 | copyright = u"2018, Dhruv Baldawa" 52 | author = u"Dhruv Baldawa" 53 | 54 | # The version info for the project you're documenting, acts as replacement 55 | # for |version| and |release|, also used in various other places throughout 56 | # the built documents. 57 | # 58 | # The short X.Y version. 59 | version = multihash.__version__ 60 | # The full version, including alpha/beta/rc tags. 61 | release = multihash.__version__ 62 | 63 | # The language for content autogenerated by Sphinx. Refer to documentation 64 | # for a list of supported languages. 65 | # 66 | # This is also used if you do content translation via gettext catalogs. 67 | # Usually you set "language" from the command line for these cases. 68 | language = None 69 | 70 | # List of patterns, relative to source directory, that match files and 71 | # directories to ignore when looking for source files. 72 | # These patterns also affect html_static_path and html_extra_path 73 | exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store'] 74 | 75 | # The name of the Pygments (syntax highlighting) style to use. 76 | pygments_style = 'sphinx' 77 | 78 | # If true, `todo` and `todoList` produce output, else they produce nothing. 79 | todo_include_todos = False 80 | 81 | 82 | # -- Options for HTML output ------------------------------------------- 83 | 84 | # The theme to use for HTML and HTML Help pages. See the documentation for 85 | # a list of builtin themes.
86 | html_theme = 'alabaster' 87 | 88 | # Theme options are theme-specific and customize the look and feel of a 89 | # theme further. For a list of options available for each theme, see the 90 | # documentation. 91 | # 92 | # html_theme_options = {} 93 | 94 | # Add any paths that contain custom static files (such as style sheets) here, 95 | # relative to this directory. They are copied after the builtin static files, 96 | # so a file named "default.css" will overwrite the builtin "default.css". 97 | html_static_path = ['_static'] 98 | 99 | 100 | # -- Options for HTMLHelp output --------------------------------------- 101 | 102 | # Output file base name for HTML help builder. 103 | htmlhelp_basename = 'multihashdoc' 104 | 105 | html_sidebars = { 106 | '**': [ 107 | 'about.html', 108 | 'navigation.html', 109 | 'relations.html', 110 | 'searchbox.html', 111 | ], 112 | } 113 | 114 | html_theme_options = { 115 | 'github_user': 'multiformats', 116 | 'github_repo': 'py-multihash', 117 | 'github_button': True, 118 | 'github_banner': True, 119 | 'code_font_size': '0.8em', 120 | } 121 | 122 | # -- Options for LaTeX output ------------------------------------------ 123 | 124 | latex_elements = { 125 | # The paper size ('letterpaper' or 'a4paper'). 126 | # 127 | # 'papersize': 'letterpaper', 128 | 129 | # The font size ('10pt', '11pt' or '12pt'). 130 | # 131 | # 'pointsize': '10pt', 132 | 133 | # Additional stuff for the LaTeX preamble. 134 | # 135 | # 'preamble': '', 136 | 137 | # Latex figure (float) alignment 138 | # 139 | # 'figure_align': 'htbp', 140 | } 141 | 142 | # Grouping the document tree into LaTeX files. List of tuples 143 | # (source start file, target name, title, author, documentclass 144 | # [howto, manual, or own class]). 
145 | latex_documents = [ 146 | (master_doc, 'multihash.tex', 147 | u'py-multihash Documentation', 148 | u'Dhruv Baldawa', 'manual'), 149 | ] 150 | 151 | 152 | # -- Options for manual page output ------------------------------------ 153 | 154 | # One entry per manual page. List of tuples 155 | # (source start file, name, description, authors, manual section). 156 | man_pages = [ 157 | (master_doc, 'multihash', 158 | u'py-multihash Documentation', 159 | [author], 1) 160 | ] 161 | 162 | 163 | # -- Options for Texinfo output ---------------------------------------- 164 | 165 | # Grouping the document tree into Texinfo files. List of tuples 166 | # (source start file, target name, title, author, 167 | # dir menu entry, description, category) 168 | texinfo_documents = [ 169 | (master_doc, 'multihash', 170 | u'py-multihash Documentation', 171 | author, 172 | 'multihash', 173 | 'Multihash implementation in Python.', 174 | 'Miscellaneous'), 175 | ] 176 | 177 | 178 | 179 | -------------------------------------------------------------------------------- /docs/contributing.rst: -------------------------------------------------------------------------------- 1 | .. include:: ../CONTRIBUTING.rst 2 | -------------------------------------------------------------------------------- /docs/history.rst: -------------------------------------------------------------------------------- 1 | .. include:: ../HISTORY.rst 2 | -------------------------------------------------------------------------------- /docs/index.rst: -------------------------------------------------------------------------------- 1 | .. include:: ../README.rst 2 | 3 | Table of contents 4 | ================= 5 | 6 | ..
toctree:: 7 | :maxdepth: 2 8 | :caption: Contents: 9 | 10 | installation 11 | usage 12 | api_reference 13 | contributing 14 | authors 15 | history 16 | 17 | Indices and tables 18 | ================== 19 | * :ref:`genindex` 20 | * :ref:`modindex` 21 | * :ref:`search` 22 | -------------------------------------------------------------------------------- /docs/installation.rst: -------------------------------------------------------------------------------- 1 | .. highlight:: shell 2 | 3 | ============ 4 | Installation 5 | ============ 6 | 7 | 8 | Stable release 9 | -------------- 10 | 11 | To install py-multihash, run this command in your terminal: 12 | 13 | .. code-block:: console 14 | 15 | $ pip install py-multihash 16 | 17 | This is the preferred method to install py-multihash, as it will always install the most recent stable release. 18 | 19 | If you don't have `pip`_ installed, this `Python installation guide`_ can guide 20 | you through the process. 21 | 22 | .. _pip: https://pip.pypa.io 23 | .. _Python installation guide: http://docs.python-guide.org/en/latest/starting/installation/ 24 | 25 | 26 | From sources 27 | ------------ 28 | 29 | The sources for py-multihash can be downloaded from the `Github repo`_. 30 | 31 | You can either clone the public repository: 32 | 33 | .. code-block:: console 34 | 35 | $ git clone git://github.com/multiformats/py-multihash 36 | 37 | Or download the `tarball`_: 38 | 39 | .. code-block:: console 40 | 41 | $ curl -OL https://github.com/multiformats/py-multihash/tarball/master 42 | 43 | Once you have a copy of the source, you can install it with: 44 | 45 | .. code-block:: console 46 | 47 | $ python setup.py install 48 | 49 | 50 | .. _Github repo: https://github.com/multiformats/py-multihash 51 | ..
_tarball: https://github.com/multiformats/py-multihash/tarball/master 52 | -------------------------------------------------------------------------------- /docs/make.bat: -------------------------------------------------------------------------------- 1 | @ECHO OFF 2 | 3 | pushd %~dp0 4 | 5 | REM Command file for Sphinx documentation 6 | 7 | if "%SPHINXBUILD%" == "" ( 8 | set SPHINXBUILD=python -msphinx 9 | ) 10 | set SOURCEDIR=. 11 | set BUILDDIR=_build 12 | set SPHINXPROJ=multihash 13 | 14 | if "%1" == "" goto help 15 | 16 | %SPHINXBUILD% >NUL 2>NUL 17 | if errorlevel 9009 ( 18 | echo. 19 | echo.The Sphinx module was not found. Make sure you have Sphinx installed, 20 | echo.then set the SPHINXBUILD environment variable to point to the full 21 | echo.path of the 'sphinx-build' executable. Alternatively you may add the 22 | echo.Sphinx directory to PATH. 23 | echo. 24 | echo.If you don't have Sphinx installed, grab it from 25 | echo.http://sphinx-doc.org/ 26 | exit /b 1 27 | ) 28 | 29 | %SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% 30 | goto end 31 | 32 | :help 33 | %SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% 34 | 35 | :end 36 | popd 37 | -------------------------------------------------------------------------------- /docs/readme.rst: -------------------------------------------------------------------------------- 1 | .. 
include:: ../README.rst 2 | -------------------------------------------------------------------------------- /docs/usage.rst: -------------------------------------------------------------------------------- 1 | ===== 2 | Usage 3 | ===== 4 | 5 | To use py-multihash in a project:: 6 | 7 | import hashlib 8 | import multihash 9 | 10 | # hash your data 11 | m = hashlib.sha256() 12 | m.update(b'hello world') 13 | raw_digest = m.digest() 14 | 15 | # add multihash header 16 | multihash_digest = multihash.encode(raw_digest, "sha2-256") 17 | 18 | # encode it to a string 19 | multihashed_str = multihash.to_b58_string(multihash_digest) 20 | 21 | print(multihashed_str) 22 | # QmaozNR7DZHQK1ZcU9p7QdrshMvXqWK6gpu5rmrkPdT3L4 23 | 24 | To see that the multihash is simply your raw digest with the multihash header prepended:: 25 | 26 | print(' ', m.hexdigest()) 27 | print(multihash.to_hex_string(multihash_digest)) 28 | 29 | # b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9 30 | # 1220b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9 31 | -------------------------------------------------------------------------------- /multihash/__init__.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | 3 | """Top-level package for py-multihash.""" 4 | 5 | __author__ = """Dhruv Baldawa""" 6 | __email__ = 'dhruv@dhruvb.com' 7 | __version__ = '2.0.0' 8 | 9 | from .multihash import ( 10 | Multihash, to_hex_string, from_hex_string, to_b58_string, from_b58_string, is_app_code, 11 | coerce_code, is_valid_code, decode, encode, is_valid, get_prefix, 12 | ) 13 | -------------------------------------------------------------------------------- /multihash/constants.py: -------------------------------------------------------------------------------- 1 | HASH_TABLE = ( 2 | {'code': 0x0, 'hash': 'id'}, 3 | {'code': 0x11, 'length': 0x14, 'hash': 'sha1'}, 4 | {'code': 0x12, 'length': 0x20, 'hash': 'sha2-256'}, 5 | {'code': 0x13, 'length': 0x40, 'hash': 'sha2-512'}, 6
| {'code': 0x14, 'length': 0x40, 'hash': 'sha3-512'}, 7 | {'code': 0x15, 'length': 0x30, 'hash': 'sha3-384'}, 8 | {'code': 0x16, 'length': 0x20, 'hash': 'sha3-256'}, 9 | {'code': 0x17, 'length': 0x1c, 'hash': 'sha3-224'}, 10 | {'code': 0x18, 'length': 0x20, 'hash': 'shake-128'}, 11 | {'code': 0x19, 'length': 0x40, 'hash': 'shake-256'}, 12 | {'code': 0x1a, 'length': 0x1c, 'hash': 'keccak-224'}, 13 | {'code': 0x1b, 'length': 0x20, 'hash': 'keccak-256'}, 14 | {'code': 0x1c, 'length': 0x30, 'hash': 'keccak-384'}, 15 | {'code': 0x1d, 'length': 0x40, 'hash': 'keccak-512'}, 16 | {'code': 0x22, 'length': 0x20, 'hash': 'murmur3-128'}, 17 | {'code': 0x23, 'hash': 'murmur3-32'}, 18 | {'code': 0x56, 'length': 0x20, 'hash': 'dbl-sha2-256'}, 19 | {'code': 0xb201, 'length': 0x1, 'hash': 'blake2b-8'}, 20 | {'code': 0xb202, 'length': 0x2, 'hash': 'blake2b-16'}, 21 | {'code': 0xb203, 'length': 0x3, 'hash': 'blake2b-24'}, 22 | {'code': 0xb204, 'length': 0x4, 'hash': 'blake2b-32'}, 23 | {'code': 0xb205, 'length': 0x5, 'hash': 'blake2b-40'}, 24 | {'code': 0xb206, 'length': 0x6, 'hash': 'blake2b-48'}, 25 | {'code': 0xb207, 'length': 0x7, 'hash': 'blake2b-56'}, 26 | {'code': 0xb208, 'length': 0x8, 'hash': 'blake2b-64'}, 27 | {'code': 0xb209, 'length': 0x9, 'hash': 'blake2b-72'}, 28 | {'code': 0xb20a, 'length': 0xa, 'hash': 'blake2b-80'}, 29 | {'code': 0xb20b, 'length': 0xb, 'hash': 'blake2b-88'}, 30 | {'code': 0xb20c, 'length': 0xc, 'hash': 'blake2b-96'}, 31 | {'code': 0xb20d, 'length': 0xd, 'hash': 'blake2b-104'}, 32 | {'code': 0xb20e, 'length': 0xe, 'hash': 'blake2b-112'}, 33 | {'code': 0xb20f, 'length': 0xf, 'hash': 'blake2b-120'}, 34 | {'code': 0xb210, 'length': 0x10, 'hash': 'blake2b-128'}, 35 | {'code': 0xb211, 'length': 0x11, 'hash': 'blake2b-136'}, 36 | {'code': 0xb212, 'length': 0x12, 'hash': 'blake2b-144'}, 37 | {'code': 0xb213, 'length': 0x13, 'hash': 'blake2b-152'}, 38 | {'code': 0xb214, 'length': 0x14, 'hash': 'blake2b-160'}, 39 | {'code': 0xb215, 'length': 0x15, 'hash': 
'blake2b-168'}, 40 | {'code': 0xb216, 'length': 0x16, 'hash': 'blake2b-176'}, 41 | {'code': 0xb217, 'length': 0x17, 'hash': 'blake2b-184'}, 42 | {'code': 0xb218, 'length': 0x18, 'hash': 'blake2b-192'}, 43 | {'code': 0xb219, 'length': 0x19, 'hash': 'blake2b-200'}, 44 | {'code': 0xb21a, 'length': 0x1a, 'hash': 'blake2b-208'}, 45 | {'code': 0xb21b, 'length': 0x1b, 'hash': 'blake2b-216'}, 46 | {'code': 0xb21c, 'length': 0x1c, 'hash': 'blake2b-224'}, 47 | {'code': 0xb21d, 'length': 0x1d, 'hash': 'blake2b-232'}, 48 | {'code': 0xb21e, 'length': 0x1e, 'hash': 'blake2b-240'}, 49 | {'code': 0xb21f, 'length': 0x1f, 'hash': 'blake2b-248'}, 50 | {'code': 0xb220, 'length': 0x20, 'hash': 'blake2b-256'}, 51 | {'code': 0xb221, 'length': 0x21, 'hash': 'blake2b-264'}, 52 | {'code': 0xb222, 'length': 0x22, 'hash': 'blake2b-272'}, 53 | {'code': 0xb223, 'length': 0x23, 'hash': 'blake2b-280'}, 54 | {'code': 0xb224, 'length': 0x24, 'hash': 'blake2b-288'}, 55 | {'code': 0xb225, 'length': 0x25, 'hash': 'blake2b-296'}, 56 | {'code': 0xb226, 'length': 0x26, 'hash': 'blake2b-304'}, 57 | {'code': 0xb227, 'length': 0x27, 'hash': 'blake2b-312'}, 58 | {'code': 0xb228, 'length': 0x28, 'hash': 'blake2b-320'}, 59 | {'code': 0xb229, 'length': 0x29, 'hash': 'blake2b-328'}, 60 | {'code': 0xb22a, 'length': 0x2a, 'hash': 'blake2b-336'}, 61 | {'code': 0xb22b, 'length': 0x2b, 'hash': 'blake2b-344'}, 62 | {'code': 0xb22c, 'length': 0x2c, 'hash': 'blake2b-352'}, 63 | {'code': 0xb22d, 'length': 0x2d, 'hash': 'blake2b-360'}, 64 | {'code': 0xb22e, 'length': 0x2e, 'hash': 'blake2b-368'}, 65 | {'code': 0xb22f, 'length': 0x2f, 'hash': 'blake2b-376'}, 66 | {'code': 0xb230, 'length': 0x30, 'hash': 'blake2b-384'}, 67 | {'code': 0xb231, 'length': 0x31, 'hash': 'blake2b-392'}, 68 | {'code': 0xb232, 'length': 0x32, 'hash': 'blake2b-400'}, 69 | {'code': 0xb233, 'length': 0x33, 'hash': 'blake2b-408'}, 70 | {'code': 0xb234, 'length': 0x34, 'hash': 'blake2b-416'}, 71 | {'code': 0xb235, 'length': 0x35, 'hash': 'blake2b-424'}, 
72 | {'code': 0xb236, 'length': 0x36, 'hash': 'blake2b-432'}, 73 | {'code': 0xb237, 'length': 0x37, 'hash': 'blake2b-440'}, 74 | {'code': 0xb238, 'length': 0x38, 'hash': 'blake2b-448'}, 75 | {'code': 0xb239, 'length': 0x39, 'hash': 'blake2b-456'}, 76 | {'code': 0xb23a, 'length': 0x3a, 'hash': 'blake2b-464'}, 77 | {'code': 0xb23b, 'length': 0x3b, 'hash': 'blake2b-472'}, 78 | {'code': 0xb23c, 'length': 0x3c, 'hash': 'blake2b-480'}, 79 | {'code': 0xb23d, 'length': 0x3d, 'hash': 'blake2b-488'}, 80 | {'code': 0xb23e, 'length': 0x3e, 'hash': 'blake2b-496'}, 81 | {'code': 0xb23f, 'length': 0x3f, 'hash': 'blake2b-504'}, 82 | {'code': 0xb240, 'length': 0x40, 'hash': 'blake2b-512'}, 83 | {'code': 0xb241, 'length': 0x1, 'hash': 'blake2s-8'}, 84 | {'code': 0xb242, 'length': 0x2, 'hash': 'blake2s-16'}, 85 | {'code': 0xb243, 'length': 0x3, 'hash': 'blake2s-24'}, 86 | {'code': 0xb244, 'length': 0x4, 'hash': 'blake2s-32'}, 87 | {'code': 0xb245, 'length': 0x5, 'hash': 'blake2s-40'}, 88 | {'code': 0xb246, 'length': 0x6, 'hash': 'blake2s-48'}, 89 | {'code': 0xb247, 'length': 0x7, 'hash': 'blake2s-56'}, 90 | {'code': 0xb248, 'length': 0x8, 'hash': 'blake2s-64'}, 91 | {'code': 0xb249, 'length': 0x9, 'hash': 'blake2s-72'}, 92 | {'code': 0xb24a, 'length': 0xa, 'hash': 'blake2s-80'}, 93 | {'code': 0xb24b, 'length': 0xb, 'hash': 'blake2s-88'}, 94 | {'code': 0xb24c, 'length': 0xc, 'hash': 'blake2s-96'}, 95 | {'code': 0xb24d, 'length': 0xd, 'hash': 'blake2s-104'}, 96 | {'code': 0xb24e, 'length': 0xe, 'hash': 'blake2s-112'}, 97 | {'code': 0xb24f, 'length': 0xf, 'hash': 'blake2s-120'}, 98 | {'code': 0xb250, 'length': 0x10, 'hash': 'blake2s-128'}, 99 | {'code': 0xb251, 'length': 0x11, 'hash': 'blake2s-136'}, 100 | {'code': 0xb252, 'length': 0x12, 'hash': 'blake2s-144'}, 101 | {'code': 0xb253, 'length': 0x13, 'hash': 'blake2s-152'}, 102 | {'code': 0xb254, 'length': 0x14, 'hash': 'blake2s-160'}, 103 | {'code': 0xb255, 'length': 0x15, 'hash': 'blake2s-168'}, 104 | {'code': 0xb256, 'length': 0x16, 
'hash': 'blake2s-176'}, 105 | {'code': 0xb257, 'length': 0x17, 'hash': 'blake2s-184'}, 106 | {'code': 0xb258, 'length': 0x18, 'hash': 'blake2s-192'}, 107 | {'code': 0xb259, 'length': 0x19, 'hash': 'blake2s-200'}, 108 | {'code': 0xb25a, 'length': 0x1a, 'hash': 'blake2s-208'}, 109 | {'code': 0xb25b, 'length': 0x1b, 'hash': 'blake2s-216'}, 110 | {'code': 0xb25c, 'length': 0x1c, 'hash': 'blake2s-224'}, 111 | {'code': 0xb25d, 'length': 0x1d, 'hash': 'blake2s-232'}, 112 | {'code': 0xb25e, 'length': 0x1e, 'hash': 'blake2s-240'}, 113 | {'code': 0xb25f, 'length': 0x1f, 'hash': 'blake2s-248'}, 114 | {'code': 0xb260, 'length': 0x20, 'hash': 'blake2s-256'}, 115 | {'code': 0xb301, 'length': 0x1, 'hash': 'Skein256-8'}, 116 | {'code': 0xb302, 'length': 0x2, 'hash': 'Skein256-16'}, 117 | {'code': 0xb303, 'length': 0x3, 'hash': 'Skein256-24'}, 118 | {'code': 0xb304, 'length': 0x4, 'hash': 'Skein256-32'}, 119 | {'code': 0xb305, 'length': 0x5, 'hash': 'Skein256-40'}, 120 | {'code': 0xb306, 'length': 0x6, 'hash': 'Skein256-48'}, 121 | {'code': 0xb307, 'length': 0x7, 'hash': 'Skein256-56'}, 122 | {'code': 0xb308, 'length': 0x8, 'hash': 'Skein256-64'}, 123 | {'code': 0xb309, 'length': 0x9, 'hash': 'Skein256-72'}, 124 | {'code': 0xb30a, 'length': 0xa, 'hash': 'Skein256-80'}, 125 | {'code': 0xb30b, 'length': 0xb, 'hash': 'Skein256-88'}, 126 | {'code': 0xb30c, 'length': 0xc, 'hash': 'Skein256-96'}, 127 | {'code': 0xb30d, 'length': 0xd, 'hash': 'Skein256-104'}, 128 | {'code': 0xb30e, 'length': 0xe, 'hash': 'Skein256-112'}, 129 | {'code': 0xb30f, 'length': 0xf, 'hash': 'Skein256-120'}, 130 | {'code': 0xb310, 'length': 0x10, 'hash': 'Skein256-128'}, 131 | {'code': 0xb311, 'length': 0x11, 'hash': 'Skein256-136'}, 132 | {'code': 0xb312, 'length': 0x12, 'hash': 'Skein256-144'}, 133 | {'code': 0xb313, 'length': 0x13, 'hash': 'Skein256-152'}, 134 | {'code': 0xb314, 'length': 0x14, 'hash': 'Skein256-160'}, 135 | {'code': 0xb315, 'length': 0x15, 'hash': 'Skein256-168'}, 136 | {'code': 0xb316, 
'length': 0x16, 'hash': 'Skein256-176'}, 137 | {'code': 0xb317, 'length': 0x17, 'hash': 'Skein256-184'}, 138 | {'code': 0xb318, 'length': 0x18, 'hash': 'Skein256-192'}, 139 | {'code': 0xb319, 'length': 0x19, 'hash': 'Skein256-200'}, 140 | {'code': 0xb31a, 'length': 0x1a, 'hash': 'Skein256-208'}, 141 | {'code': 0xb31b, 'length': 0x1b, 'hash': 'Skein256-216'}, 142 | {'code': 0xb31c, 'length': 0x1c, 'hash': 'Skein256-224'}, 143 | {'code': 0xb31d, 'length': 0x1d, 'hash': 'Skein256-232'}, 144 | {'code': 0xb31e, 'length': 0x1e, 'hash': 'Skein256-240'}, 145 | {'code': 0xb31f, 'length': 0x1f, 'hash': 'Skein256-248'}, 146 | {'code': 0xb320, 'length': 0x20, 'hash': 'Skein256-256'}, 147 | {'code': 0xb321, 'length': 0x1, 'hash': 'Skein512-8'}, 148 | {'code': 0xb322, 'length': 0x2, 'hash': 'Skein512-16'}, 149 | {'code': 0xb323, 'length': 0x3, 'hash': 'Skein512-24'}, 150 | {'code': 0xb324, 'length': 0x4, 'hash': 'Skein512-32'}, 151 | {'code': 0xb325, 'length': 0x5, 'hash': 'Skein512-40'}, 152 | {'code': 0xb326, 'length': 0x6, 'hash': 'Skein512-48'}, 153 | {'code': 0xb327, 'length': 0x7, 'hash': 'Skein512-56'}, 154 | {'code': 0xb328, 'length': 0x8, 'hash': 'Skein512-64'}, 155 | {'code': 0xb329, 'length': 0x9, 'hash': 'Skein512-72'}, 156 | {'code': 0xb32a, 'length': 0xa, 'hash': 'Skein512-80'}, 157 | {'code': 0xb32b, 'length': 0xb, 'hash': 'Skein512-88'}, 158 | {'code': 0xb32c, 'length': 0xc, 'hash': 'Skein512-96'}, 159 | {'code': 0xb32d, 'length': 0xd, 'hash': 'Skein512-104'}, 160 | {'code': 0xb32e, 'length': 0xe, 'hash': 'Skein512-112'}, 161 | {'code': 0xb32f, 'length': 0xf, 'hash': 'Skein512-120'}, 162 | {'code': 0xb330, 'length': 0x10, 'hash': 'Skein512-128'}, 163 | {'code': 0xb331, 'length': 0x11, 'hash': 'Skein512-136'}, 164 | {'code': 0xb332, 'length': 0x12, 'hash': 'Skein512-144'}, 165 | {'code': 0xb333, 'length': 0x13, 'hash': 'Skein512-152'}, 166 | {'code': 0xb334, 'length': 0x14, 'hash': 'Skein512-160'}, 167 | {'code': 0xb335, 'length': 0x15, 'hash': 'Skein512-168'}, 
168 | {'code': 0xb336, 'length': 0x16, 'hash': 'Skein512-176'}, 169 | {'code': 0xb337, 'length': 0x17, 'hash': 'Skein512-184'}, 170 | {'code': 0xb338, 'length': 0x18, 'hash': 'Skein512-192'}, 171 | {'code': 0xb339, 'length': 0x19, 'hash': 'Skein512-200'}, 172 | {'code': 0xb33a, 'length': 0x1a, 'hash': 'Skein512-208'}, 173 | {'code': 0xb33b, 'length': 0x1b, 'hash': 'Skein512-216'}, 174 | {'code': 0xb33c, 'length': 0x1c, 'hash': 'Skein512-224'}, 175 | {'code': 0xb33d, 'length': 0x1d, 'hash': 'Skein512-232'}, 176 | {'code': 0xb33e, 'length': 0x1e, 'hash': 'Skein512-240'}, 177 | {'code': 0xb33f, 'length': 0x1f, 'hash': 'Skein512-248'}, 178 | {'code': 0xb340, 'length': 0x20, 'hash': 'Skein512-256'}, 179 | {'code': 0xb341, 'length': 0x21, 'hash': 'Skein512-264'}, 180 | {'code': 0xb342, 'length': 0x22, 'hash': 'Skein512-272'}, 181 | {'code': 0xb343, 'length': 0x23, 'hash': 'Skein512-280'}, 182 | {'code': 0xb344, 'length': 0x24, 'hash': 'Skein512-288'}, 183 | {'code': 0xb345, 'length': 0x25, 'hash': 'Skein512-296'}, 184 | {'code': 0xb346, 'length': 0x26, 'hash': 'Skein512-304'}, 185 | {'code': 0xb347, 'length': 0x27, 'hash': 'Skein512-312'}, 186 | {'code': 0xb348, 'length': 0x28, 'hash': 'Skein512-320'}, 187 | {'code': 0xb349, 'length': 0x29, 'hash': 'Skein512-328'}, 188 | {'code': 0xb34a, 'length': 0x2a, 'hash': 'Skein512-336'}, 189 | {'code': 0xb34b, 'length': 0x2b, 'hash': 'Skein512-344'}, 190 | {'code': 0xb34c, 'length': 0x2c, 'hash': 'Skein512-352'}, 191 | {'code': 0xb34d, 'length': 0x2d, 'hash': 'Skein512-360'}, 192 | {'code': 0xb34e, 'length': 0x2e, 'hash': 'Skein512-368'}, 193 | {'code': 0xb34f, 'length': 0x2f, 'hash': 'Skein512-376'}, 194 | {'code': 0xb350, 'length': 0x30, 'hash': 'Skein512-384'}, 195 | {'code': 0xb351, 'length': 0x31, 'hash': 'Skein512-392'}, 196 | {'code': 0xb352, 'length': 0x32, 'hash': 'Skein512-400'}, 197 | {'code': 0xb353, 'length': 0x33, 'hash': 'Skein512-408'}, 198 | {'code': 0xb354, 'length': 0x34, 'hash': 'Skein512-416'}, 199 | {'code': 
0xb355, 'length': 0x35, 'hash': 'Skein512-424'}, 200 | {'code': 0xb356, 'length': 0x36, 'hash': 'Skein512-432'}, 201 | {'code': 0xb357, 'length': 0x37, 'hash': 'Skein512-440'}, 202 | {'code': 0xb358, 'length': 0x38, 'hash': 'Skein512-448'}, 203 | {'code': 0xb359, 'length': 0x39, 'hash': 'Skein512-456'}, 204 | {'code': 0xb35a, 'length': 0x3a, 'hash': 'Skein512-464'}, 205 | {'code': 0xb35b, 'length': 0x3b, 'hash': 'Skein512-472'}, 206 | {'code': 0xb35c, 'length': 0x3c, 'hash': 'Skein512-480'}, 207 | {'code': 0xb35d, 'length': 0x3d, 'hash': 'Skein512-488'}, 208 | {'code': 0xb35e, 'length': 0x3e, 'hash': 'Skein512-496'}, 209 | {'code': 0xb35f, 'length': 0x3f, 'hash': 'Skein512-504'}, 210 | {'code': 0xb360, 'length': 0x40, 'hash': 'Skein512-512'}, 211 | {'code': 0xb361, 'length': 0x1, 'hash': 'Skein1024-8'}, 212 | {'code': 0xb362, 'length': 0x2, 'hash': 'Skein1024-16'}, 213 | {'code': 0xb363, 'length': 0x3, 'hash': 'Skein1024-24'}, 214 | {'code': 0xb364, 'length': 0x4, 'hash': 'Skein1024-32'}, 215 | {'code': 0xb365, 'length': 0x5, 'hash': 'Skein1024-40'}, 216 | {'code': 0xb366, 'length': 0x6, 'hash': 'Skein1024-48'}, 217 | {'code': 0xb367, 'length': 0x7, 'hash': 'Skein1024-56'}, 218 | {'code': 0xb368, 'length': 0x8, 'hash': 'Skein1024-64'}, 219 | {'code': 0xb369, 'length': 0x9, 'hash': 'Skein1024-72'}, 220 | {'code': 0xb36a, 'length': 0xa, 'hash': 'Skein1024-80'}, 221 | {'code': 0xb36b, 'length': 0xb, 'hash': 'Skein1024-88'}, 222 | {'code': 0xb36c, 'length': 0xc, 'hash': 'Skein1024-96'}, 223 | {'code': 0xb36d, 'length': 0xd, 'hash': 'Skein1024-104'}, 224 | {'code': 0xb36e, 'length': 0xe, 'hash': 'Skein1024-112'}, 225 | {'code': 0xb36f, 'length': 0xf, 'hash': 'Skein1024-120'}, 226 | {'code': 0xb370, 'length': 0x10, 'hash': 'Skein1024-128'}, 227 | {'code': 0xb371, 'length': 0x11, 'hash': 'Skein1024-136'}, 228 | {'code': 0xb372, 'length': 0x12, 'hash': 'Skein1024-144'}, 229 | {'code': 0xb373, 'length': 0x13, 'hash': 'Skein1024-152'}, 230 | {'code': 0xb374, 'length': 0x14, 
'hash': 'Skein1024-160'}, 231 | {'code': 0xb375, 'length': 0x15, 'hash': 'Skein1024-168'}, 232 | {'code': 0xb376, 'length': 0x16, 'hash': 'Skein1024-176'}, 233 | {'code': 0xb377, 'length': 0x17, 'hash': 'Skein1024-184'}, 234 | {'code': 0xb378, 'length': 0x18, 'hash': 'Skein1024-192'}, 235 | {'code': 0xb379, 'length': 0x19, 'hash': 'Skein1024-200'}, 236 | {'code': 0xb37a, 'length': 0x1a, 'hash': 'Skein1024-208'}, 237 | {'code': 0xb37b, 'length': 0x1b, 'hash': 'Skein1024-216'}, 238 | {'code': 0xb37c, 'length': 0x1c, 'hash': 'Skein1024-224'}, 239 | {'code': 0xb37d, 'length': 0x1d, 'hash': 'Skein1024-232'}, 240 | {'code': 0xb37e, 'length': 0x1e, 'hash': 'Skein1024-240'}, 241 | {'code': 0xb37f, 'length': 0x1f, 'hash': 'Skein1024-248'}, 242 | {'code': 0xb380, 'length': 0x20, 'hash': 'Skein1024-256'}, 243 | {'code': 0xb381, 'length': 0x21, 'hash': 'Skein1024-264'}, 244 | {'code': 0xb382, 'length': 0x22, 'hash': 'Skein1024-272'}, 245 | {'code': 0xb383, 'length': 0x23, 'hash': 'Skein1024-280'}, 246 | {'code': 0xb384, 'length': 0x24, 'hash': 'Skein1024-288'}, 247 | {'code': 0xb385, 'length': 0x25, 'hash': 'Skein1024-296'}, 248 | {'code': 0xb386, 'length': 0x26, 'hash': 'Skein1024-304'}, 249 | {'code': 0xb387, 'length': 0x27, 'hash': 'Skein1024-312'}, 250 | {'code': 0xb388, 'length': 0x28, 'hash': 'Skein1024-320'}, 251 | {'code': 0xb389, 'length': 0x29, 'hash': 'Skein1024-328'}, 252 | {'code': 0xb38a, 'length': 0x2a, 'hash': 'Skein1024-336'}, 253 | {'code': 0xb38b, 'length': 0x2b, 'hash': 'Skein1024-344'}, 254 | {'code': 0xb38c, 'length': 0x2c, 'hash': 'Skein1024-352'}, 255 | {'code': 0xb38d, 'length': 0x2d, 'hash': 'Skein1024-360'}, 256 | {'code': 0xb38e, 'length': 0x2e, 'hash': 'Skein1024-368'}, 257 | {'code': 0xb38f, 'length': 0x2f, 'hash': 'Skein1024-376'}, 258 | {'code': 0xb390, 'length': 0x30, 'hash': 'Skein1024-384'}, 259 | {'code': 0xb391, 'length': 0x31, 'hash': 'Skein1024-392'}, 260 | {'code': 0xb392, 'length': 0x32, 'hash': 'Skein1024-400'}, 261 | {'code': 0xb393, 
'length': 0x33, 'hash': 'Skein1024-408'}, 262 | {'code': 0xb394, 'length': 0x34, 'hash': 'Skein1024-416'}, 263 | {'code': 0xb395, 'length': 0x35, 'hash': 'Skein1024-424'}, 264 | {'code': 0xb396, 'length': 0x36, 'hash': 'Skein1024-432'}, 265 | {'code': 0xb397, 'length': 0x37, 'hash': 'Skein1024-440'}, 266 | {'code': 0xb398, 'length': 0x38, 'hash': 'Skein1024-448'}, 267 | {'code': 0xb399, 'length': 0x39, 'hash': 'Skein1024-456'}, 268 | {'code': 0xb39a, 'length': 0x3a, 'hash': 'Skein1024-464'}, 269 | {'code': 0xb39b, 'length': 0x3b, 'hash': 'Skein1024-472'}, 270 | {'code': 0xb39c, 'length': 0x3c, 'hash': 'Skein1024-480'}, 271 | {'code': 0xb39d, 'length': 0x3d, 'hash': 'Skein1024-488'}, 272 | {'code': 0xb39e, 'length': 0x3e, 'hash': 'Skein1024-496'}, 273 | {'code': 0xb39f, 'length': 0x3f, 'hash': 'Skein1024-504'}, 274 | {'code': 0xb3a0, 'length': 0x40, 'hash': 'Skein1024-512'}, 275 | {'code': 0xb3a1, 'length': 0x41, 'hash': 'Skein1024-520'}, 276 | {'code': 0xb3a2, 'length': 0x42, 'hash': 'Skein1024-528'}, 277 | {'code': 0xb3a3, 'length': 0x43, 'hash': 'Skein1024-536'}, 278 | {'code': 0xb3a4, 'length': 0x44, 'hash': 'Skein1024-544'}, 279 | {'code': 0xb3a5, 'length': 0x45, 'hash': 'Skein1024-552'}, 280 | {'code': 0xb3a6, 'length': 0x46, 'hash': 'Skein1024-560'}, 281 | {'code': 0xb3a7, 'length': 0x47, 'hash': 'Skein1024-568'}, 282 | {'code': 0xb3a8, 'length': 0x48, 'hash': 'Skein1024-576'}, 283 | {'code': 0xb3a9, 'length': 0x49, 'hash': 'Skein1024-584'}, 284 | {'code': 0xb3aa, 'length': 0x4a, 'hash': 'Skein1024-592'}, 285 | {'code': 0xb3ab, 'length': 0x4b, 'hash': 'Skein1024-600'}, 286 | {'code': 0xb3ac, 'length': 0x4c, 'hash': 'Skein1024-608'}, 287 | {'code': 0xb3ad, 'length': 0x4d, 'hash': 'Skein1024-616'}, 288 | {'code': 0xb3ae, 'length': 0x4e, 'hash': 'Skein1024-624'}, 289 | {'code': 0xb3af, 'length': 0x4f, 'hash': 'Skein1024-632'}, 290 | {'code': 0xb3b0, 'length': 0x50, 'hash': 'Skein1024-640'}, 291 | {'code': 0xb3b1, 'length': 0x51, 'hash': 'Skein1024-648'}, 292 | 
{'code': 0xb3b2, 'length': 0x52, 'hash': 'Skein1024-656'}, 293 | {'code': 0xb3b3, 'length': 0x53, 'hash': 'Skein1024-664'}, 294 | {'code': 0xb3b4, 'length': 0x54, 'hash': 'Skein1024-672'}, 295 | {'code': 0xb3b5, 'length': 0x55, 'hash': 'Skein1024-680'}, 296 | {'code': 0xb3b6, 'length': 0x56, 'hash': 'Skein1024-688'}, 297 | {'code': 0xb3b7, 'length': 0x57, 'hash': 'Skein1024-696'}, 298 | {'code': 0xb3b8, 'length': 0x58, 'hash': 'Skein1024-704'}, 299 | {'code': 0xb3b9, 'length': 0x59, 'hash': 'Skein1024-712'}, 300 | {'code': 0xb3ba, 'length': 0x5a, 'hash': 'Skein1024-720'}, 301 | {'code': 0xb3bb, 'length': 0x5b, 'hash': 'Skein1024-728'}, 302 | {'code': 0xb3bc, 'length': 0x5c, 'hash': 'Skein1024-736'}, 303 | {'code': 0xb3bd, 'length': 0x5d, 'hash': 'Skein1024-744'}, 304 | {'code': 0xb3be, 'length': 0x5e, 'hash': 'Skein1024-752'}, 305 | {'code': 0xb3bf, 'length': 0x5f, 'hash': 'Skein1024-760'}, 306 | {'code': 0xb3c0, 'length': 0x60, 'hash': 'Skein1024-768'}, 307 | {'code': 0xb3c1, 'length': 0x61, 'hash': 'Skein1024-776'}, 308 | {'code': 0xb3c2, 'length': 0x62, 'hash': 'Skein1024-784'}, 309 | {'code': 0xb3c3, 'length': 0x63, 'hash': 'Skein1024-792'}, 310 | {'code': 0xb3c4, 'length': 0x64, 'hash': 'Skein1024-800'}, 311 | {'code': 0xb3c5, 'length': 0x65, 'hash': 'Skein1024-808'}, 312 | {'code': 0xb3c6, 'length': 0x66, 'hash': 'Skein1024-816'}, 313 | {'code': 0xb3c7, 'length': 0x67, 'hash': 'Skein1024-824'}, 314 | {'code': 0xb3c8, 'length': 0x68, 'hash': 'Skein1024-832'}, 315 | {'code': 0xb3c9, 'length': 0x69, 'hash': 'Skein1024-840'}, 316 | {'code': 0xb3ca, 'length': 0x6a, 'hash': 'Skein1024-848'}, 317 | {'code': 0xb3cb, 'length': 0x6b, 'hash': 'Skein1024-856'}, 318 | {'code': 0xb3cc, 'length': 0x6c, 'hash': 'Skein1024-864'}, 319 | {'code': 0xb3cd, 'length': 0x6d, 'hash': 'Skein1024-872'}, 320 | {'code': 0xb3ce, 'length': 0x6e, 'hash': 'Skein1024-880'}, 321 | {'code': 0xb3cf, 'length': 0x6f, 'hash': 'Skein1024-888'}, 322 | {'code': 0xb3d0, 'length': 0x70, 'hash': 
'Skein1024-896'}, 323 | {'code': 0xb3d1, 'length': 0x71, 'hash': 'Skein1024-904'}, 324 | {'code': 0xb3d2, 'length': 0x72, 'hash': 'Skein1024-912'}, 325 | {'code': 0xb3d3, 'length': 0x73, 'hash': 'Skein1024-920'}, 326 | {'code': 0xb3d4, 'length': 0x74, 'hash': 'Skein1024-928'}, 327 | {'code': 0xb3d5, 'length': 0x75, 'hash': 'Skein1024-936'}, 328 | {'code': 0xb3d6, 'length': 0x76, 'hash': 'Skein1024-944'}, 329 | {'code': 0xb3d7, 'length': 0x77, 'hash': 'Skein1024-952'}, 330 | {'code': 0xb3d8, 'length': 0x78, 'hash': 'Skein1024-960'}, 331 | {'code': 0xb3d9, 'length': 0x79, 'hash': 'Skein1024-968'}, 332 | {'code': 0xb3da, 'length': 0x7a, 'hash': 'Skein1024-976'}, 333 | {'code': 0xb3db, 'length': 0x7b, 'hash': 'Skein1024-984'}, 334 | {'code': 0xb3dc, 'length': 0x7c, 'hash': 'Skein1024-992'}, 335 | {'code': 0xb3dd, 'length': 0x7d, 'hash': 'Skein1024-1000'}, 336 | {'code': 0xb3de, 'length': 0x7e, 'hash': 'Skein1024-1008'}, 337 | {'code': 0xb3df, 'length': 0x7f, 'hash': 'Skein1024-1016'}, 338 | {'code': 0xb3e0, 'length': 0x80, 'hash': 'Skein1024-1024'}, 339 | ) 340 | 341 | HASH_CODES = {x['hash']: x['code'] for x in HASH_TABLE} 342 | CODE_HASHES = {x['code']: x['hash'] for x in HASH_TABLE} 343 | HASH_LENGTHS = {x['code']: x.get('length') for x in HASH_TABLE} 344 | -------------------------------------------------------------------------------- /multihash/multihash.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | from binascii import hexlify 3 | from collections import namedtuple 4 | from io import BytesIO 5 | 6 | import base58 7 | import varint 8 | 9 | import multihash.constants as constants 10 | 11 | 12 | Multihash = namedtuple('Multihash', 'code,name,length,digest') 13 | 14 | 15 | def to_hex_string(multihash): 16 | """ 17 | Convert the given multihash to a hex encoded string 18 | 19 | :param bytes multihash: the multihash to be converted to a hex string 20 | :return: hex encoded multihash string 21 | :rtype: str 22 | 
:raises TypeError: if `multihash` has an incorrect type 23 | """ 24 | if not isinstance(multihash, bytes): 25 | raise TypeError('multihash should be bytes, not {}'.format(type(multihash))) 26 | 27 | return hexlify(multihash).decode() 28 | 29 | 30 | def from_hex_string(multihash): 31 | """ 32 | Convert the given hex encoded string to a multihash 33 | 34 | :param str multihash: hex encoded multihash string 35 | :return: decoded multihash 36 | :rtype: bytes 37 | :raises TypeError: if `multihash` has an incorrect type 38 | """ 39 | if not isinstance(multihash, str): 40 | raise TypeError('multihash should be str, not {}'.format(type(multihash))) 41 | 42 | return bytes.fromhex(multihash) 43 | 44 | 45 | def to_b58_string(multihash): 46 | """ 47 | Convert the given multihash to a base58 encoded string 48 | 49 | :param bytes multihash: multihash to base58 encode 50 | :return: base58 encoded multihash string 51 | :rtype: str 52 | :raises TypeError: if `multihash` has an incorrect type 53 | """ 54 | if not isinstance(multihash, bytes): 55 | raise TypeError('multihash should be bytes, not {}'.format(type(multihash))) 56 | 57 | return base58.b58encode(multihash).decode() 58 | 59 | 60 | def from_b58_string(multihash): 61 | """ 62 | Convert the given base58 encoded string to a multihash 63 | 64 | :param str multihash: base58 encoded multihash string 65 | :return: decoded multihash 66 | :rtype: bytes 67 | :raises TypeError: if `multihash` has an incorrect type 68 | """ 69 | if not isinstance(multihash, str): 70 | raise TypeError('multihash should be str, not {}'.format(type(multihash))) 71 | 72 | return base58.b58decode(multihash) 73 | 74 | 75 | def is_app_code(code): 76 | """ 77 | Checks whether a code is part of the app range 78 | 79 | :param int code: input code 80 | :return: whether `code` is in the app range 81 | :rtype: bool 82 | """ 83 | return 0 < code < 0x10 84 | 85 | 86 | def coerce_code(hash_fn): 87 | """ 88 | Converts a hash function name into 
its code 89 | 90 | If passed a number it will return the number if it's a valid code 91 | 92 | :param hash_fn: The input hash function can be 93 | - str, the name of the hash function 94 | - int, the code of the hash function 95 | :return: hash function code 96 | :rtype: int 97 | :raises ValueError: if the hash function is not supported 98 | :raises ValueError: if the hash code is not supported 99 | :raises TypeError: if the hash function is neither a string nor an int 100 | """ 101 | if isinstance(hash_fn, str): 102 | try: 103 | return constants.HASH_CODES[hash_fn] 104 | except KeyError: 105 | raise ValueError('Unsupported hash function {}'.format(hash_fn)) 106 | 107 | elif isinstance(hash_fn, int): 108 | if hash_fn in constants.CODE_HASHES or is_app_code(hash_fn): 109 | return hash_fn 110 | raise ValueError('Unsupported hash code {}'.format(hash_fn)) 111 | 112 | raise TypeError('hash code should be either an integer or a string') 113 | 114 | 115 | def is_valid_code(code): 116 | """ 117 | Checks whether a multihash code is valid or not 118 | 119 | :param int code: input code 120 | :return: whether the code is valid 121 | :rtype: bool 122 | """ 123 | return is_app_code(code) or code in constants.CODE_HASHES 124 | 125 | 126 | def decode(multihash): 127 | """ 128 | Decode a hash from the given multihash 129 | 130 | :param bytes multihash: multihash 131 | :return: decoded :py:class:`multihash.Multihash` object 132 | :rtype: :py:class:`multihash.Multihash` 133 | :raises TypeError: if `multihash` is not of type `bytes` 134 | :raises ValueError: if the length of multihash is less than 3 bytes 135 | :raises ValueError: if the code is invalid 136 | :raises ValueError: if the length is invalid 137 | :raises ValueError: if the length does not match the digest length 138 | """ 139 | if not isinstance(multihash, bytes): 140 | raise TypeError('multihash should be bytes, not {}'.format(type(multihash))) 141 | 142 | if len(multihash) < 3: 143 | raise ValueError('multihash must be greater than 3 
bytes.') 144 | 145 | buffer = BytesIO(multihash) 146 | try: 147 | code = varint.decode_stream(buffer) 148 | except TypeError: 149 | raise ValueError('Invalid varint provided') 150 | 151 | if not is_valid_code(code): 152 | raise ValueError('Unsupported hash code {}'.format(code)) 153 | 154 | try: 155 | length = varint.decode_stream(buffer) 156 | except TypeError: 157 | raise ValueError('Invalid length provided') 158 | 159 | buf = buffer.read() 160 | 161 | if len(buf) != length: 162 | raise ValueError('Inconsistent multihash length {} != {}'.format(len(buf), length)) 163 | 164 | return Multihash(code=code, name=constants.CODE_HASHES.get(code, code), length=length, digest=buf) 165 | 166 | 167 | def encode(digest, code, length=None): 168 | """ 169 | Encode a hash digest along with the specified function code 170 | 171 | :param bytes digest: hash digest 172 | :param (int or str) code: hash function code 173 | :param int length: hash digest length 174 | :return: encoded multihash 175 | :rtype: bytes 176 | :raises TypeError: when the digest is not a bytes object 177 | :raises ValueError: when the digest length is not correct 178 | """ 179 | hash_code = coerce_code(code) 180 | 181 | if not isinstance(digest, bytes): 182 | raise TypeError('digest must be a bytes object, not {}'.format(type(digest))) 183 | 184 | if length is None: 185 | length = len(digest) 186 | 187 | elif length != len(digest): 188 | raise ValueError('digest length should be equal to specified length') 189 | 190 | return varint.encode(hash_code) + varint.encode(length) + digest 191 | 192 | 193 | def is_valid(multihash): 194 | """ 195 | Check if the given buffer is a valid multihash 196 | 197 | :param bytes multihash: input multihash 198 | :return: if the input is a valid multihash or not 199 | :rtype: bool 200 | """ 201 | try: 202 | decode(multihash) 203 | return True 204 | except ValueError: 205 | return False 206 | 207 | 208 | def get_prefix(multihash): 209 | """ 210 | Return the prefix from the 
multihash 211 | 212 | :param bytes multihash: input multihash 213 | :return: multihash prefix (the code and length varints) 214 | :rtype: bytes 215 | :raises ValueError: when the multihash is invalid 216 | """ 217 | if is_valid(multihash): 218 | return multihash[:len(multihash) - decode(multihash).length] 219 | 220 | raise ValueError('invalid multihash') 221 | -------------------------------------------------------------------------------- /requirements_dev.txt: -------------------------------------------------------------------------------- 1 | pip<20 2 | bump2version==0.5.11 3 | coverage==4.5.4 4 | flake8==3.7.8 5 | pytest-cov==2.10.1 6 | pytest-runner==5.1 7 | pytest==4.6.5 8 | Sphinx==1.8.5 9 | tox==3.14.0 10 | twine==1.14.0 11 | watchdog[watchmedo]==0.10.3 12 | wheel 13 | codecov 14 | -------------------------------------------------------------------------------- /setup.cfg: -------------------------------------------------------------------------------- 1 | [bumpversion] 2 | current_version = 2.0.0 3 | commit = True 4 | tag = True 5 | 6 | [bumpversion:file:setup.py] 7 | search = version='{current_version}' 8 | replace = version='{new_version}' 9 | 10 | [bumpversion:file:multihash/__init__.py] 11 | search = __version__ = '{current_version}' 12 | replace = __version__ = '{new_version}' 13 | 14 | [bdist_wheel] 15 | universal = 1 16 | 17 | [flake8] 18 | exclude = docs 19 | max_line_length = 120 20 | 21 | [aliases] 22 | test = pytest 23 | 24 | [tool:pytest] 25 | collect_ignore = ['setup.py'] 26 | python_classes = *TestCase 27 | 28 | [coverage:report] 29 | exclude_lines = 30 | pragma: no cover 31 | def __repr__ 32 | if self.debug: 33 | if settings.DEBUG 34 | raise AssertionError 35 | raise NotImplementedError 36 | if 0: 37 | if __name__ == .__main__.: 38 | 39 | -------------------------------------------------------------------------------- /setup.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | """The setup script.""" 5 | 6 | from 
setuptools import setup, find_packages 7 | 8 | with open('README.rst') as readme_file: 9 | readme = readme_file.read() 10 | 11 | with open('HISTORY.rst') as history_file: 12 | history = history_file.read() 13 | 14 | requirements = [ 15 | 'varint>=1.0.2,<2.0', 16 | 'six>=1.10.0,<2.0', 17 | 'morphys>=1.0,<2.0', 18 | 'base58>=1.0.2,<3.0', 19 | ] 20 | 21 | setup_requirements = ['pytest-runner', ] 22 | 23 | test_requirements = [ 24 | 'pytest', 25 | 'pytest-cov', 26 | # TODO: put package test requirements here 27 | ] 28 | 29 | setup( 30 | name='py-multihash', 31 | version='2.0.0', 32 | description="Multihash implementation in Python", 33 | long_description=readme + '\n\n' + history, 34 | author="Dhruv Baldawa", 35 | author_email='dhruv@dhruvb.com', 36 | classifiers=[ 37 | 'Development Status :: 2 - Pre-Alpha', 38 | 'Intended Audience :: Developers', 39 | 'License :: OSI Approved :: MIT License', 40 | 'Natural Language :: English', 41 | 42 | 'Programming Language :: Python :: 3', 43 | 'Programming Language :: Python :: 3.4', 44 | 'Programming Language :: Python :: 3.5', 45 | 'Programming Language :: Python :: 3.6', 46 | 'Programming Language :: Python :: 3.7', 47 | ], 48 | install_requires=requirements, 49 | license="MIT license", 50 | include_package_data=True, 51 | keywords='multihash', 52 | packages=find_packages(include=['multihash']), 53 | setup_requires=setup_requirements, 54 | test_suite='tests', 55 | tests_require=test_requirements, 56 | url='https://github.com/multiformats/multihash', 57 | zip_safe=False, 58 | ) 59 | -------------------------------------------------------------------------------- /tests/test_multihash.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | """Tests for `multihash` package.""" 5 | from binascii import hexlify 6 | 7 | import base58 8 | import pytest 9 | import varint 10 | 11 | from multihash import ( 12 | encode, decode, from_hex_string, 
to_hex_string, to_b58_string, from_b58_string, is_app_code, is_valid, 13 | is_valid_code, get_prefix, coerce_code, 14 | ) 15 | from multihash.constants import HASH_TABLE 16 | 17 | VALID_TABLE = ( 18 | { 19 | 'encoding': { 20 | 'code': 0x11, 21 | 'name': 'sha1', 22 | }, 23 | 'hex': '0beec7b5ea3f0fdbc95d0dd47f3c5bc275da8a33', 24 | 'length': 20, 25 | }, 26 | { 27 | 'encoding': { 28 | 'code': 0x11, 29 | 'name': 'sha1', 30 | }, 31 | 'hex': '0beec7b8', 32 | 'length': 4, 33 | }, 34 | { 35 | 'encoding': { 36 | 'code': 0x12, 37 | 'name': 'sha2-256', 38 | }, 39 | 'hex': '2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae', 40 | 'length': 32, 41 | }, 42 | { 43 | 'encoding': { 44 | 'code': 0x12, 45 | 'name': 'sha2-256', 46 | }, 47 | 'hex': '2c26b46b', 48 | 'length': 4, 49 | }, 50 | ) 51 | 52 | INVALID_TABLE = ( 53 | { 54 | 'code': 0x00, 55 | 'length': 32, 56 | 'hex': '0beec7b5ea3f0fdbc95d0dd47f3c5bc275da8a33' 57 | }, 58 | { 59 | 'code': 0x11, 60 | 'length': 21, 61 | 'hex': '0beec7b5ea3f0fdbc95d0dd47f3c5bc275da8a33' 62 | }, 63 | { 64 | 'code': 0x11, 65 | 'length': 20, 66 | 'hex': '0beec7b5ea3f0fdbc95d0dd47f3c5bc275da8a' 67 | }, 68 | { 69 | 'code': 0x11, 70 | 'length': 20, 71 | 'hex': 'f0' 72 | }, 73 | { 74 | 'code': 0x31, 75 | 'length': 20, 76 | 'hex': '0beec7b5ea3f0fdbc95d0dd47f3c5bc275da8a33' 77 | }, 78 | { 79 | 'code': 0x12, 80 | 'length': 32, 81 | 'hex': '2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7' 82 | }, 83 | { 84 | 'code': 0xb205, 85 | 'length': 5, 86 | 'hex': '2c26b0' 87 | }, 88 | { 89 | 'code': 0xb23f, 90 | 'length': 0x3f, 91 | 'hex': '2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e72c26b46b6f8ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e' 92 | }, 93 | ) 94 | 95 | INVALID_BYTE_TYPES = ('a', 1, 1.0, [], {}) 96 | INVALID_STRING_TYPES = (b'a', 1, 1.0, [], {}) 97 | 98 | 99 | def make_hash(code, size, hex_): 100 | return code.to_bytes((code.bit_length() + 7) // 8, byteorder='big') + \ 101 | 
size.to_bytes((size.bit_length() + 7) // 8, byteorder='big') + \ 102 | bytes.fromhex(hex_) 103 | 104 | 105 | class ToHexStringTestCase(object): 106 | @pytest.mark.parametrize('value', VALID_TABLE) 107 | def test_to_hex_string_valid(self, value): 108 | """ to_hex_string: test if it passes for all valid cases """ 109 | code = value['encoding']['code'] 110 | buffer = encode(bytes.fromhex(value['hex']), code) 111 | assert to_hex_string(buffer) == hexlify(buffer).decode() 112 | 113 | @pytest.mark.parametrize('value', INVALID_BYTE_TYPES) 114 | def test_to_hex_string_invalid_type(self, value): 115 | """ to_hex_string: raises TypeError for invalid types """ 116 | with pytest.raises(TypeError) as excinfo: 117 | to_hex_string(value) 118 | assert 'multihash should be bytes' in str(excinfo.value) 119 | 120 | 121 | class FromHexStringTestCase(object): 122 | @pytest.mark.parametrize('value', VALID_TABLE) 123 | def test_from_hex_string_valid(self, value): 124 | """ from_hex_string: decodes the correct values """ 125 | code = value['encoding']['code'] 126 | buffer = encode(bytes.fromhex(value['hex']), code) 127 | assert from_hex_string(hexlify(buffer).decode()) == buffer 128 | 129 | @pytest.mark.parametrize('value', INVALID_STRING_TYPES) 130 | def test_from_hex_string_invalid_type(self, value): 131 | """ from_hex_string: raises TypeError for invalid types """ 132 | with pytest.raises(TypeError) as excinfo: 133 | from_hex_string(value) 134 | assert 'multihash should be str' in str(excinfo.value) 135 | 136 | 137 | class To58StringTestCase(object): 138 | @pytest.mark.parametrize('value', VALID_TABLE) 139 | def test_to_b58_string_valid(self, value): 140 | """ to_b58_string: test if it passes for all valid cases """ 141 | code = value['encoding']['code'] 142 | buffer = encode(bytes.fromhex(value['hex']), code) 143 | assert to_b58_string(buffer) == base58.b58encode(buffer).decode() 144 | 145 | @pytest.mark.parametrize('value', INVALID_BYTE_TYPES) 146 | def 
test_to_b58_string_invalid(self, value): 147 | """ to_b58_string: raises TypeError for invalid types """ 148 | with pytest.raises(TypeError) as excinfo: 149 | to_b58_string(value) 150 | assert 'multihash should be bytes' in str(excinfo.value) 151 | 152 | 153 | class FromB58StringTestCase(object): 154 | def test_from_b58_string_valid(self): 155 | """ from_b58_string: test if it passes for all valid cases """ 156 | encoded = 'QmPfjpVaf593UQJ9a5ECvdh2x17XuJYG5Yanv5UFnH3jPE' 157 | expected = bytes.fromhex('122013bf801597d74a660453412635edd8c34271e5998f801fac5d700c6ce8d8e461') 158 | assert from_b58_string(encoded) == expected 159 | 160 | @pytest.mark.parametrize('value', INVALID_STRING_TYPES) 161 | def test_from_b58_string_invalid(self, value): 162 | """ from_b58_string: raises TypeError for invalid types """ 163 | with pytest.raises(TypeError) as excinfo: 164 | from_b58_string(value) 165 | assert 'multihash should be str' in str(excinfo.value) 166 | 167 | 168 | class IsAppCodeTestCase(object): 169 | @pytest.mark.parametrize('value', (0x01, 0x0f, 0x04)) 170 | def test_is_app_code_valid(self, value): 171 | """ is_app_code: returns True for all valid cases """ 172 | assert is_app_code(value) 173 | 174 | @pytest.mark.parametrize('value', (0x00, 0x11, 0x10, 0xffff)) 175 | def test_is_app_code_invalid(self, value): 176 | """ is_app_code: returns False for invalid cases """ 177 | assert not is_app_code(value) 178 | 179 | 180 | class CoerceCodeTestCase(object): 181 | @pytest.mark.parametrize('value', HASH_TABLE[:15]) 182 | def test_coerce_code_valid(self, value): 183 | """ coerce_code: returns code for all valid cases """ 184 | assert coerce_code(value['hash']) == value['code'] 185 | assert coerce_code(value['code']) == value['code'] 186 | 187 | @pytest.mark.parametrize('value', ('SHa1', 'SHA256', 0xf0f0ff0)) 188 | def test_coerce_code_invalid(self, value): 189 | """ coerce_code: raises ValueError for invalid cases """ 190 | with pytest.raises(ValueError) as 
excinfo: 191 | coerce_code(value) 192 | assert 'Unsupported hash' in str(excinfo.value) 193 | 194 | @pytest.mark.parametrize('value', (1., [], {})) 195 | def test_coerce_code_invalid_type(self, value): 196 | """ coerce_code: raises TypeError for invalid cases """ 197 | with pytest.raises(TypeError) as excinfo: 198 | coerce_code(value) 199 | assert 'should be either an integer or a string' in str(excinfo.value) 200 | 201 | 202 | class IsValidCodeTestCase(object): 203 | @pytest.mark.parametrize('value', (0x02, 0x0f, 0x13)) 204 | def test_is_valid_code_valid(self, value): 205 | """ is_valid_code: returns True for all valid cases """ 206 | assert is_valid_code(value) 207 | 208 | @pytest.mark.parametrize('value', (0x0ff, 0x10, 0x90)) 209 | def test_is_valid_code_invalid(self, value): 210 | """ is_valid_code: returns False for all invalid cases """ 211 | assert not is_valid_code(value) 212 | 213 | 214 | class DecodeTestCase(object): 215 | @pytest.mark.parametrize('value', VALID_TABLE) 216 | def test_decode_valid(self, value): 217 | """ decode: works for all valid cases """ 218 | code = value['encoding']['code'] 219 | buffer = make_hash(code, value['length'], value['hex']) 220 | name = value['encoding']['name'] 221 | actual = bytes.fromhex(value['hex']) 222 | 223 | r = decode(buffer) 224 | expected = r.digest 225 | 226 | assert r.code == code 227 | assert r.name == name 228 | assert r.length == len(actual) 229 | assert actual == expected 230 | 231 | def test_decode_app_code(self): 232 | """ decode: works for an app code """ 233 | code = 0x08 234 | hex_ = 'fdfdfdfdfd' 235 | buffer = make_hash(code, 5, hex_) 236 | actual = bytes.fromhex(hex_) 237 | 238 | r = decode(buffer) 239 | expected = r.digest 240 | 241 | assert r.code == code 242 | assert r.length == len(actual) 243 | assert actual == expected 244 | 245 | @pytest.mark.parametrize('value', INVALID_BYTE_TYPES) 246 | def test_decode_incorrect_types(self, value): 247 | """ decode: raises TypeError if the type is
incorrect """ 248 | with pytest.raises(TypeError) as excinfo: 249 | decode(value) 250 | assert 'should be bytes' in str(excinfo.value) 251 | 252 | @pytest.mark.parametrize('value', (b'', b'a', b'aa')) 253 | def test_decode_less_length(self, value): 254 | """ decode: raises ValueError if the length is less than 3 """ 255 | with pytest.raises(ValueError) as excinfo: 256 | decode(value) 257 | assert 'greater than 3 bytes' in str(excinfo.value) 258 | 259 | def test_decode_invalid_code(self): 260 | """ decode: raises ValueError if the code is invalid """ 261 | value = make_hash(0xfff0, 10, 'ffffffff') 262 | with pytest.raises(ValueError) as excinfo: 263 | decode(value) 264 | assert 'Unsupported hash code' in str(excinfo.value) 265 | 266 | def test_decode_invalid_length(self): 267 | """ decode: raises ValueError if the length is invalid """ 268 | value = make_hash(0x13, 0, 'ffffffff') 269 | with pytest.raises(ValueError) as excinfo: 270 | decode(value) 271 | assert 'Invalid length' in str(excinfo.value) 272 | 273 | def test_decode_unequal_length(self): 274 | """ decode: raises ValueError if the lengths do not match """ 275 | value = make_hash(0x13, 40, 'ffffffff') 276 | with pytest.raises(ValueError) as excinfo: 277 | decode(value) 278 | assert 'Inconsistent multihash length' in str(excinfo.value) 279 | 280 | def test_decode_invalid_varint(self): 281 | """ decode: raises ValueError if invalid varint is provided """ 282 | value = b'\xff\xff\xff' 283 | with pytest.raises(ValueError) as excinfo: 284 | decode(value) 285 | assert 'Invalid varint provided' in str(excinfo.value) 286 | 287 | class EncodeTestCase(object): 288 | @pytest.mark.parametrize('value', VALID_TABLE) 289 | def test_encode_valid(self, value): 290 | """ encode: encodes digests for all valid cases """ 291 | code = value['encoding']['code'] 292 | actual = make_hash(code, value['length'], value['hex']) 293 | name = value['encoding']['name'] 294 | assert hexlify(encode(bytes.fromhex(value['hex']), code,
value['length'])) == hexlify(actual) 295 | assert hexlify(encode(bytes.fromhex(value['hex']), name, value['length'])) == hexlify(actual) 296 | 297 | @pytest.mark.parametrize('value', VALID_TABLE) 298 | def test_encode_no_length(self, value): 299 | """ encode: encodes digests for all valid cases when the length is not given """ 300 | code = value['encoding']['code'] 301 | actual = make_hash(code, value['length'], value['hex']) 302 | name = value['encoding']['name'] 303 | assert hexlify(encode(bytes.fromhex(value['hex']), code)) == hexlify(actual) 304 | assert hexlify(encode(bytes.fromhex(value['hex']), name)) == hexlify(actual) 305 | 306 | def test_encode_unequal_length(self): 307 | """ encode: raises ValueError when unequal lengths are given """ 308 | value = VALID_TABLE[-1] 309 | code = value['encoding']['code'] 310 | 311 | with pytest.raises(ValueError) as excinfo: 312 | encode(bytes.fromhex(value['hex']), code, value['length'] + 1) 313 | assert 'digest length should be equal' in str(excinfo.value) 314 | 315 | @pytest.mark.parametrize('value', INVALID_BYTE_TYPES) 316 | def test_encode_invalid_type(self, value): 317 | """ encode: raises TypeError when invalid type is given """ 318 | with pytest.raises(TypeError) as excinfo: 319 | encode(value, 0x11) 320 | assert 'must be a bytes object' in str(excinfo.value) 321 | 322 | class IsValidTestCase(object): 323 | @pytest.mark.parametrize('value', VALID_TABLE) 324 | def test_is_valid_valid(self, value): 325 | """ is_valid: returns True for all valid cases """ 326 | assert is_valid(make_hash(value['encoding']['code'], value['length'], value['hex'])) 327 | 328 | @pytest.mark.parametrize('value', INVALID_TABLE) 329 | def test_is_valid_invalid(self, value): 330 | """ is_valid: returns False for all invalid cases """ 331 | assert not is_valid(make_hash(value['code'], value['length'], value['hex'])) 332 | 333 | 334 | class GetPrefixTestCase(object): 335 | def test_get_prefix_valid(self): 336 | """ get_prefix: returns valid
prefix """ 337 | multihash = encode(b'foo', 0x11, 3) 338 | prefix = get_prefix(multihash) 339 | assert hexlify(prefix).decode() == '1103' 340 | 341 | def test_get_prefix_invalid(self): 342 | """ get_prefix: raises ValueError for invalid cases """ 343 | with pytest.raises(ValueError): 344 | get_prefix(b'foobar') 345 | -------------------------------------------------------------------------------- /tox.ini: -------------------------------------------------------------------------------- 1 | [tox] 2 | envlist = py35, py36, py37, py38, flake8 3 | 4 | [travis] 5 | python = 6 | 3.8: py38 7 | 3.7: py37 8 | 3.6: py36 9 | 3.5: py35 10 | 11 | [testenv:flake8] 12 | basepython = python 13 | deps = flake8 14 | commands = flake8 multihash 15 | 16 | [testenv] 17 | passenv = CI TRAVIS TRAVIS_* 18 | setenv = 19 | PYTHONPATH = {toxinidir} 20 | deps = 21 | -r{toxinidir}/requirements_dev.txt 22 | ; If you want to make tox run the tests with the same versions, create a 23 | ; requirements.txt with the pinned versions and uncomment the following line: 24 | ; -r{toxinidir}/requirements.txt 25 | commands = 26 | pip install -U pip 27 | py.test --cov=multihash/ --cov-branch --cov-report=term-missing --cov-report=xml --basetemp={envtmpdir} 28 | codecov 29 | --------------------------------------------------------------------------------
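The wire format these tests exercise is `<varint hash code><varint digest length><digest bytes>` — for example, `get_prefix` above returns `1103` for code 0x11 (sha1) and length 3. A minimal standard-library sketch of that framing (not the library's actual implementation; the helper names are hypothetical, and only the single-byte varint case used by sha2-256 is handled):

```python
import hashlib

def encode_sha256_multihash(data: bytes) -> bytes:
    # sha2-256 has multihash code 0x12 and a 32-byte digest; both the code
    # and the length fit in a single varint byte, so no varint loop is needed.
    digest = hashlib.sha256(data).digest()
    return bytes([0x12, len(digest)]) + digest

def decode_multihash(buffer: bytes):
    # Assumes single-byte varints, matching the sha2-256 case above.
    code, length, digest = buffer[0], buffer[1], buffer[2:]
    if length != len(digest):
        raise ValueError('Inconsistent multihash length')
    return code, length, digest

mh = encode_sha256_multihash(b'foo')
code, length, digest = decode_multihash(mh)
assert (code, length) == (0x12, 32)
assert digest == hashlib.sha256(b'foo').digest()
```

The round-trip mirrors what `encode`/`decode` in multihash/multihash.py do for a sha2-256 input, including the "Inconsistent multihash length" check asserted in `test_decode_unequal_length`.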