├── .github ├── ISSUE_TEMPLATE.md └── workflows │ └── build.yml ├── .gitignore ├── CHANGELOG.rst ├── LICENSE.txt ├── README.rst ├── TODO.rst ├── pkgbuild-extras ├── python-cairo.PKGBUILD_EXTRAS ├── python-gobject.PKGBUILD_EXTRAS ├── python-graphviz.PKGBUILD_EXTRAS ├── python-pillow.PKGBUILD_EXTRAS ├── python-pylibftdi.PKGBUILD_EXTRAS ├── python-supersmoother.PKGBUILD_EXTRAS ├── python-tqdm.PKGBUILD_EXTRAS ├── python-vapory.PKGBUILD_EXTRAS ├── python-wxpython.PKGBUILD_EXTRAS ├── python-yep.PKGBUILD_EXTRAS ├── sftpman-gtk.PKGBUILD_EXTRAS └── sftpman.PKGBUILD_EXTRAS ├── pypi2pkgbuild-compare.sh ├── pypi2pkgbuild.py ├── pyproject.toml └── test_pypi2pkgbuild.py /.github/ISSUE_TEMPLATE.md: -------------------------------------------------------------------------------- 1 | 6 | -------------------------------------------------------------------------------- /.github/workflows/build.yml: -------------------------------------------------------------------------------- 1 | name: build 2 | 3 | on: [push, pull_request] 4 | 5 | jobs: 6 | build: 7 | runs-on: ubuntu-latest 8 | steps: 9 | - uses: actions/checkout@v3 10 | - name: Test 11 | shell: bash 12 | run: | 13 | docker run --interactive --volume="$(pwd)":/io:Z --workdir=/io \ 14 | archlinux/archlinux:base-devel bash <<EOF 15 | echo en_US.UTF-8 UTF-8 >/etc/locale.gen && locale-gen && 16 | unset LANG && source /etc/profile.d/locale.sh && 17 | pacman -Syu --noconfirm && 18 | pacman -S --noconfirm git namcap pkgfile python && 19 | pkgfile --update && 20 | chmod -R a+w . 
&& mkdir -p /.cache/pip && chown nobody /.cache/pip && 21 | sudo -u nobody python -munittest 22 | EOF 23 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | **/.ipynb_checkpoints/ 2 | *.egg-info/ 3 | .eggs/ 4 | .pytest_cache/ 5 | build/ 6 | dist/ 7 | htmlcov/ 8 | oprofile_data/ 9 | .*.swp 10 | *.o 11 | *.pyc 12 | *.so 13 | .coverage 14 | .gdb_history 15 | -------------------------------------------------------------------------------- /CHANGELOG.rst: -------------------------------------------------------------------------------- 1 | 0.3 2 | === 3 | 4 | - Changed the short flags for ``--no-deps`` and ``--no-install`` from ``-d`` 5 | and ``-n`` to ``-D`` and ``-I`` respectively, in order to make room for the 6 | new ``--pkgname``/``-n``. 7 | -------------------------------------------------------------------------------- /LICENSE.txt: -------------------------------------------------------------------------------- 1 | Copyright (c) 2016-2017 Antony Lee 2 | 3 | Permission is hereby granted, free of charge, to any person obtaining a copy of 4 | this software and associated documentation files (the "Software"), to deal in 5 | the Software without restriction, including without limitation the rights to 6 | use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies 7 | of the Software, and to permit persons to whom the Software is furnished to do 8 | so, subject to the following conditions: 9 | 10 | The above copyright notice and this permission notice shall be included in all 11 | copies or substantial portions of the Software. 12 | 13 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 14 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 15 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL 16 | THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 17 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 18 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 19 | SOFTWARE. 20 | -------------------------------------------------------------------------------- /README.rst: -------------------------------------------------------------------------------- 1 | PyPI2PKGBUILD 2 | ============= 3 | 4 | **NOTE**: This package is currently not under active development. 5 | 6 | ----- 7 | 8 | |PyPI| 9 | 10 | .. |PyPI| 11 | image:: https://img.shields.io/pypi/v/pypi2pkgbuild.svg 12 | :target: https://pypi.python.org/pypi/pypi2pkgbuild 13 | 14 | Convert PyPI packages to Arch Linux packages, inspired by pip2arch_. 15 | 16 | Handles packages of all sizes, from the simplest (pure Python, no dependencies) 17 | to the most complex (C-level dependencies, external C libraries, etc., e.g. 18 | most of the scientific Python stack, or GUI toolkits such as PyGObject and 19 | wxPython) [#]_. 20 | 21 | .. [#] ... with a bit of help. 22 | 23 | .. contents:: :local: 24 | 25 | Dependencies and installation 26 | ----------------------------- 27 | 28 | ``pypi2pkgbuild.py`` depends on the Arch Linux packages namcap_, pkgfile_, and 29 | python_ [#]_. 30 | 31 | .. _namcap: https://wiki.archlinux.org/index.php/Namcap 32 | .. _pkgfile: https://wiki.archlinux.org/index.php/Pkgfile 33 | .. _python: https://wiki.archlinux.org/index.php/Python 34 | 35 | .. [#] Officially, only the latest releases packaged by Arch Linux are 36 | supported. In practice, the hard requirements that I am aware of are 37 | pacman≥5.1 (which changed the behavior of ``makepkg --printsrcinfo``) and a 38 | recent enough Python so that ``python -mvenv`` creates a virtual environment 39 | with pip≥10 (which changed the default format of ``pip list``) and 40 | setuptools. 
41 | 42 | The script can be installed with ``pip install [--user] .``, or run 43 | directly. 44 | 45 | One can even run ``pypi2pkgbuild.py`` on itself to create a proper Arch package 46 | (``pypi2pkgbuild.py git+https://github.com/anntzer/pypi2pkgbuild``). 47 | 48 | A minimal test suite (checking that ``pypi2pkgbuild.py`` can indeed package 49 | itself) can be run with unittest (or pytest). 50 | 51 | Usage 52 | ----- 53 | 54 | ``pypi2pkgbuild.py PYPINAME`` creates a PKGBUILD for the latest version of the 55 | given PyPI package and the current version of the Python interpreter (Python 3 56 | only). Prereleases are considered if the ``--pre`` flag is passed. Because 57 | PyPI's dependency information is somewhat unreliable, it installs the package 58 | in a venv to figure out the dependencies. Note that thanks to ``pip``'s wheel 59 | cache, the build is later reused; i.e. the procedure entails very little extra 60 | work. 61 | 62 | A ``-git`` package can be built with ``pypi2pkgbuild.py git+https://...``. 63 | 64 | The package is then built and verified with ``namcap``. 65 | 66 | The goal is to make this tool as automated as possible: if all the information 67 | to build a package is (reasonably) accessible, this tool should be able to 68 | build it. 69 | 70 | In order to provide additional information to ``makepkg``, edit 71 | ``PKGBUILD_EXTRAS`` (which can also be done with the ``--pkgbuild-extras`` 72 | flag). This file is sourced at the *end* of ``PKGBUILD``. For ease of 73 | patching, the ``build``, ``package``, and, where applicable, ``pkgver`` 74 | functions are defined by forwarding to ``_build``, ``_package``, and 75 | ``_pkgver``. A ``_check`` function is also available, but not used (due to the 76 | lack of a standard testing CLI). Some useful examples of ``PKGBUILD_EXTRAS`` 77 | are listed in the ``pkgbuild-extras`` directory. 78 | 79 | Usage notes 80 | ``````````` 81 | 82 | - It is suggested to create an alias with standard options set, e.g. 
83 | 84 | .. code-block:: sh 85 | 86 | alias pypi2pkgbuild.py='PKGEXT=.pkg.tar pypi2pkgbuild.py -g cython -b /tmp/pypi2pkgbuild/ -f' 87 | 88 | - By default, the ``pkgrel`` of (standard) packages is set to ``00``. This 89 | allows automatic upgrading into official packages (and AUR ones, if an AUR 90 | helper is used) whenever the repositories are updated. Additionally, the use 91 | of ``00`` rather than ``0`` serves as a (weak) marker that the package was 92 | automatically generated by this tool. In order to prevent such an upgrade, 93 | one can use the ``--pkgrel`` flag to set ``pkgrel`` to, e.g., ``99``. 94 | 95 | - If one wishes to completely bypass AUR Python packages while maintaining the 96 | use of an AUR helper for non-Python packages, one can define a shell function 97 | that excludes ``pypi2pkgbuild.py``-generated packages that do not appear in 98 | the official repositories, e.g., for ``pacaur``: 99 | 100 | .. code-block:: sh 101 | 102 | pacaur() { 103 | if [[ "$1" = "-Syu" ]]; then 104 | # Update, in case some packages moved in or out of the official repos. 105 | sudo pacman -Sy 106 | # Upgrade everything except python packages with pkgrel=00 or 99. 107 | PKGEXT=.pkg.tar command pacaur -Su --ignore \ 108 | "$(pacman -Qm | grep '^python-.*-\(00\|99\)$' | cut -d' ' -f1 | paste -sd,)" 109 | else 110 | command pacaur "$@" 111 | fi 112 | } 113 | 114 | This function will not bypass Python packages explicitly installed from the 115 | AUR, as the user may have done so to bypass some incorrect packaging by 116 | ``pypi2pkgbuild.py``. It is recommended to pass the ``-i`` flag to calls 117 | to ``pypi2pkgbuild.py`` (e.g. in an alias) to exclude packages that are 118 | mishandled by ``pypi2pkgbuild.py`` (see `mispackaged packages`_). The ``-i`` 119 | flag can be passed multiple times; passing an empty argument to it will clear 120 | the ignore list defined so far. 121 | 122 | .. 
_mispackaged packages: TODO.rst#mispackaged-packages 123 | 124 | - In order to package a locally available git repository, use 125 | 126 | .. code-block:: sh 127 | 128 | $ pypi2pkgbuild.py git+file://$absolute_path_to_repo # (e.g. file:///home/...) 129 | 130 | In order to package a locally available sdist or wheel, use 131 | 132 | .. code-block:: sh 133 | 134 | $ pypi2pkgbuild.py file://$absolute_path_to_file # (e.g. file:///home/...) 135 | 136 | Note that in both cases *absolute* paths are necessary. 137 | 138 | Building packages from local repos or wheels needs to be done in topological 139 | order of the dependencies (so that ``pypi2pkgbuild.py`` can find that 140 | the dependencies are actually present), or by passing the ``-D`` flag 141 | ("do not build dependencies"); if it is used, the Arch package may 142 | not use the correct dependency names (if they are not of the form 143 | ``python-pep503-normalized-name``). 144 | 145 | - By default, ``pypi2pkgbuild.py`` ignores ``pip`` config files such as 146 | ``~/.config/pip/pip.conf``. An explicitly set ``PIP_CONFIG_FILE`` will be 147 | respected, but may cause ``pypi2pkgbuild.py`` to fail as some ``pip`` calls 148 | will be unexpectedly modified. 149 | 150 | Likewise, user-site packages are ignored unless ``PYTHONNOUSERSITE`` is 151 | explicitly set to an empty value. 152 | 153 | Build-time dependencies of packages 154 | ----------------------------------- 155 | 156 | ``pypi2pkgbuild.py`` attempts to guess whether ``Cython`` and ``SWIG`` are 157 | build-time dependencies by checking for the presence of ``.pyx`` and ``.i`` 158 | files, respectively. If this is not desired, set the ``--guess-makedepends`` 159 | option accordingly. 160 | 161 | ``pypi2pkgbuild.py`` guesses whether ``numpy`` is a build-time dependency by 162 | attempting a build without ``numpy``, then, in case of failure, a build with 163 | ``numpy``. 
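
Concretely, the ``.pyx``/``.i`` check amounts to something like the following
(a hypothetical sketch, not the actual implementation; the helper name is made
up):

.. code-block:: sh

   # Sketch of the heuristic: report Cython/SWIG as makedepends when
   # matching source files exist under the source tree passed as $1.
   _guess_makedepends() {
       local guesses=()
       if [ -n "$(find "$1" -name '*.pyx' -print -quit)" ]; then
           guesses+=(cython)
       fi
       if [ -n "$(find "$1" -name '*.i' -print -quit)" ]; then
           guesses+=(swig)
       fi
       echo "${guesses[@]}"
   }
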
164 | 165 | Additional Python build-time dependencies (i.e., ``setup_requires``) can be 166 | specified (as PyPI names) using the ``--setup-requires`` flag, or just as 167 | normal entries using ``--pkgbuild-extras`` (they will be installed into the 168 | build virtualenv). 169 | 170 | Additional non-Python build-time dependencies can be set as ``makedepends`` 171 | using ``--pkgbuild-extras``; they will be installed *before* 172 | ``pypi2pkgbuild.py`` attempts to build a wheel for the package. 173 | 174 | Vendored packages 175 | ----------------- 176 | 177 | Some Arch packages (e.g. ``ipython``) include a number of smaller PyPI 178 | packages. 179 | 180 | Because it is not possible to assign a meaningful version automatically, 181 | ``pypi2pkgbuild.py`` instead creates an independent Arch package for each of 182 | the PyPI packages (with two dashes in the name, to prevent name conflicts) and 183 | a master package that depends on all of them. The ``pkgrel`` of the master 184 | package is set to ``$official_pkgrel.99``, so that the package appears more 185 | recent than the current official version but older than any future official 186 | version. All these packages ``conflict`` with all versions of the official 187 | package (except the newly created package), so upgrading should work fine when 188 | the official package is actually updated. 189 | 190 | However, dependencies are still expressed using the master package (to avoid 191 | breakage on upgrade into an official package), so internal dependencies will 192 | appear to be circular. 193 | 194 | All the packages are placed in a subfolder named ``meta:$pkgname``, so one can 195 | easily install everything by ``cd``'ing there and running 196 | 197 | .. 
code-block:: sh 198 | 199 | $ sudo pacman -U --asdeps **/*.xz 200 | $ sudo pacman -D --asexplicit $pkgname/$pkgname.tar.xz 201 | 202 | Handling Python upgrades 203 | ------------------------ 204 | 205 | When the Python minor version (``x`` in ``3.x``) is upgraded, it is necessary 206 | to regenerate all self-built packages. This can be done e.g. with 207 | 208 | .. code-block:: sh 209 | 210 | $ pypi2pkgbuild.py $( 211 | ls /usr/lib/python3.$oldver/site-packages | 212 | grep -Po '.*(?=-.*.dist-info)' 213 | ) 214 | 215 | Comparison with other tools 216 | --------------------------- 217 | 218 | Other similar tools include pip2arch_, pip2pkgbuild_, and fpm_. To the best 219 | of my knowledge, the features below are unique to PyPI2PKGBUILD; please let me 220 | know if this is incorrect. 221 | 222 | - Supports wheels (the default is to prefer ``any``-platform wheels, then 223 | ``sdist``\s, then ``manylinux1`` wheels, but this can be changed using 224 | ``--pkgtypes``). 225 | - Resolves Python dependencies via installation in a temporary virtualenv, and 226 | also creates PKGBUILDs for those that are not available as official packages. 227 | - Resolves binary dependencies via ``namcap`` and adds them to the ``depends`` 228 | array if they are installed (thus, it is suggested to first install 229 | them as ``--asdeps`` and then let the generated PKGBUILD pick them up as 230 | dependencies). Note that some packages are distributed with a copy of the 231 | required libraries; in this case, ``pypi2pkgbuild.py``’s behavior will depend 232 | on whether the package defaults to using the system-wide library or its own 233 | copy. 234 | - Automatically tries to fetch a missing license file from GitHub, if 235 | applicable. 236 | - Automatically builds the package (with options given in ``--makepkg=...``) 237 | and runs ``namcap``. 238 | - Automatically builds all outdated dependencies via ``-u``. 239 | 240 | .. _pip2arch: https://github.com/bluepeppers/pip2arch 241 | .. 
_pip2pkgbuild: https://github.com/wenLiangcan/pip2pkgbuild 242 | .. _fpm: https://github.com/jordansissel/fpm 243 | -------------------------------------------------------------------------------- /TODO.rst: -------------------------------------------------------------------------------- 1 | Issues 2 | ====== 3 | 4 | - Due to pypa/setuptools#5429 (devendoring issues with pip), combining an 5 | Arch-packaged pip and a ``pypi2pkgbuild.py``-packaged setuptools does not 6 | work. In practice, this means that one should exclude ``setuptools`` from 7 | automatic upgrades. 8 | 9 | - Packages that are manually vendored (... poorly) by Arch (e.g., previously, 10 | ``html5lib`` into ``bleach``) cause some issues. A solution would be to 11 | actually install the dependencies, check whether the last versions were 12 | installed, and error if this is not the case (indicating a requirement on an 13 | earlier version, which necessarily means manual vendoring). 14 | 15 | - VCS fragments cannot be given. 16 | 17 | - PyPI packages that depend on another package's ``extras_require`` are not 18 | supported (needs upstream support from ``pip show``). 19 | 20 | - ``scikit-image`` "depends" (... with fallback) on ``dask[array]``. 21 | 22 | - License support is incomplete. 23 | 24 | - e.g. ``matplotlib`` has a ``LICENSE`` *directory*. 25 | - get licenses from wheels. 26 | 27 | - Meta packages are fully rebuilt even if only a component needs to be built 28 | (although version dependencies -- in particular ``pkgrel``\s -- may have 29 | changed so it may not be possible to avoid this and maintain robustness). 30 | 31 | - ``scipy`` fails to build, probably due to numpy/numpy#7779 (``LDFLAGS`` 32 | set by ``makepkg`` strips defaults). Setting ``LDFLAGS`` to ``"$(. 33 | /etc/makepkg.conf; echo $LDFLAGS) -shared"`` does not seem to help, though. 34 | **NOTE:** This may possibly be fixed using the ``NPY_DISTUTILS_APPEND_FLAG`` 35 | environment variable on numpy≥1.16. 
36 | 37 | - ``fpm`` adds a ``get_metadata`` command to avoid having to install the 38 | package but this can't be done with e.g. wheels. Perhaps we could hook 39 | something else? 40 | 41 | - Move ``numpy`` support to ``--guess-makedepends``. Implement 42 | ``guess-makedepends`` by adding shim files to the environment that check 43 | whether they are accessed. A similar strategy can be used e.g. for swig, 44 | pybind11. 45 | 46 | Arch packaging 47 | ============== 48 | 49 | - Some packages are installed without an ``.egg-info`` (e.g. ``entrypoints``) 50 | and thus not seen by ``pip list --outdated`` (and thus 51 | ``pypi2pkgbuild.py -o``). 52 | 53 | Other mispackaged packages 54 | ========================== 55 | 56 | - Setup-time non-Python dependencies. 57 | 58 | - ``notebook`` *from the git repository* (requires at least ``bower``, 59 | perhaps more). 60 | 61 | - Undetected split packages: 62 | 63 | - Arch splits ``pygments`` into ``python-pygments`` and ``pygmentize``, 64 | and ``pygobject`` into ``python-gobject`` and ``pygobject-devel`` 65 | (because the latter is shared with ``python2-gobject``) but in each 66 | case ``pypi2pkgbuild.py`` only sees the former (and thus does not 67 | declare ``provides``/``conflicts`` for the latter). Due to licenses and 68 | the presence of custom scripts (e.g. shell completion for ``pygmentize``), 69 | we can't rely on strict inclusion. The best solution is therefore either 70 | to declare a ``conflicts``/``replaces``, or to manually remove the 71 | extraneous file (see ``python-gobject.PKGBUILD_EXTRAS``). 72 | 73 | - Packages that install system-level (e.g., systemd) scripts: 74 | 75 | - ``sftpman`` (explicitly unsupported via ``sftpman.PKGBUILD_EXTRAS``). 76 | 77 | - Packages vendored into non-Python packages (could be partially detected from 78 | nonmatching versions): 79 | 80 | - ``lit`` (vendored into ``llvm``). 81 | 82 | Note that fixes for some other packages are provided in the ``pkgbuild-extras`` 83 | directory. 
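
As an illustration of the ``conflicts``/``replaces`` route for undetected
split packages, a ``PKGBUILD_EXTRAS`` along these lines could be used for the
``pygments`` case (hypothetical; the exact package names to declare must be
checked case by case)::

    # Hypothetical PKGBUILD_EXTRAS: absorb the split-off Arch package so
    # that upgrades do not leave a conflicting pygmentize installed.
    conflicts+=('pygmentize')
    replaces+=('pygmentize')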
84 | 85 | Ideas for extracting makedepends 86 | ================================ 87 | 88 | - Intercept pkg-config calls for missing packages (e.g. cairo/pycairo?). 89 | - Extract manylinux wheels' vendored .libs. 90 | -------------------------------------------------------------------------------- /pkgbuild-extras/python-cairo.PKGBUILD_EXTRAS: -------------------------------------------------------------------------------- 1 | # python-cairo 2 | makedepends+=('cairo') 3 | # vim: set ft=PKGBUILD : 4 | -------------------------------------------------------------------------------- /pkgbuild-extras/python-gobject.PKGBUILD_EXTRAS: -------------------------------------------------------------------------------- 1 | # python-gobject 2 | # Avoid conflicts with pygobject-devel 3 | makedepends+=('gobject-introspection' 'python-cairo') 4 | 5 | package() { 6 | _package && 7 | rm -rf "$pkgdir"/usr/{include,lib/pkgconfig} || 8 | return 1 9 | } 10 | # vim: set ft=PKGBUILD : 11 | -------------------------------------------------------------------------------- /pkgbuild-extras/python-graphviz.PKGBUILD_EXTRAS: -------------------------------------------------------------------------------- 1 | # python-graphviz 2 | depends+=('graphviz') 3 | # vim: set ft=PKGBUILD : 4 | -------------------------------------------------------------------------------- /pkgbuild-extras/python-pillow.PKGBUILD_EXTRAS: -------------------------------------------------------------------------------- 1 | # pillow 2 | # Arch installs an invalid version of raqm.h (https://bugs.archlinux.org/task/57492). 
3 | pkgrel=99 4 | # vim: set ft=PKGBUILD : 5 | -------------------------------------------------------------------------------- /pkgbuild-extras/python-pylibftdi.PKGBUILD_EXTRAS: -------------------------------------------------------------------------------- 1 | # python-pylibftdi 2 | depends+=('libftdi') 3 | # vim: set ft=PKGBUILD : 4 | -------------------------------------------------------------------------------- /pkgbuild-extras/python-supersmoother.PKGBUILD_EXTRAS: -------------------------------------------------------------------------------- 1 | # python-supersmoother 2 | # Missing dependency, fixed in master. 3 | depends+=('python-numpy') 4 | # vim: set ft=PKGBUILD : 5 | -------------------------------------------------------------------------------- /pkgbuild-extras/python-tqdm.PKGBUILD_EXTRAS: -------------------------------------------------------------------------------- 1 | # tqdm 2 | # Arch fails to install the man page. 3 | pkgrel=99 4 | 5 | package() { 6 | _package && 7 | mkdir -p "$pkgdir/usr/share/man/man1" && 8 | PYTHONPATH="$(find "$pkgdir" -name site-packages)" \ 9 | "$pkgdir/usr/bin/tqdm" \ 10 | --manpath "$pkgdir/usr/share/man/man1" || 11 | return 1 12 | } 13 | # vim: set ft=PKGBUILD : 14 | -------------------------------------------------------------------------------- /pkgbuild-extras/python-vapory.PKGBUILD_EXTRAS: -------------------------------------------------------------------------------- 1 | # python-vapory 2 | depends+=('povray') 3 | # vim: set ft=PKGBUILD : 4 | -------------------------------------------------------------------------------- /pkgbuild-extras/python-wxpython.PKGBUILD_EXTRAS: -------------------------------------------------------------------------------- 1 | # python-wxpython 2 | makedepends+=('webkit2gtk' 'wxgtk3') 3 | # vim: set ft=PKGBUILD : 4 | -------------------------------------------------------------------------------- /pkgbuild-extras/python-yep.PKGBUILD_EXTRAS: 
-------------------------------------------------------------------------------- 1 | # python-yep 2 | depends+=('gperftools') 3 | # vim: set ft=PKGBUILD : 4 | -------------------------------------------------------------------------------- /pkgbuild-extras/sftpman-gtk.PKGBUILD_EXTRAS: -------------------------------------------------------------------------------- 1 | # sftpman-gtk 2 | depends+=('gtk3' 'python-gobject') 3 | # vim: set ft=PKGBUILD : 4 | -------------------------------------------------------------------------------- /pkgbuild-extras/sftpman.PKGBUILD_EXTRAS: -------------------------------------------------------------------------------- 1 | # sftpman 2 | cat <<'EOF' 3 | sftpman is actively maintained on the AUR; use that package to correctly 4 | install the systemd and pm scripts. 5 | EOF 6 | return 1 7 | # vim: set ft=PKGBUILD : 8 | -------------------------------------------------------------------------------- /pypi2pkgbuild-compare.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | # Compare dependencies between installed and automatic packages. 4 | # First run e.g. 5 | # pypi2pkgbuild.py $(pacman -Qm | grep python- | grep -v 00 | grep -v git | cut -d' ' -f1 | cut -d- -f2-) 6 | # and run this script from the packages-containing directory. 
7 | 8 | for pkgname in *; do 9 | echo "$(tput bold)$pkgname$(tput sgr0)" 10 | found="$( 11 | grep '\bdepends' "$pkgname/.SRCINFO" | 12 | cut -d' ' -f3 | grep -v '^python$' | sort)" 13 | info="$(pacman -Qi "$pkgname")" 14 | depends="$( 15 | grep -Po '(?<=Depends On).*' <<<"$info" | 16 | cut -d: -f2 | tr ' ' '\n' | grep -v '^python$')" 17 | optdepends="$( 18 | grep -Pzo '(?<=Optional Deps)(.|\n)*?\n(?=\S)' <<<"$info" | 19 | tr -d '\0' | sed 's/^[ :]*//' | grep -av '^None$')" 20 | fulldepends="$(grep -o '^[[:alnum:]-]\+' <<<$"$depends\n$optdepends" | sort)" 21 | c1="$(comm -23 <(echo "$found") <(echo "$fulldepends"))" 22 | c2="$(comm -13 <(echo "$found") <(echo "$fulldepends"))" 23 | c3="$(comm -12 <(echo "$found") <(echo "$fulldepends"))" 24 | if [[ "$c3" ]]; then 25 | echo "$(tput smul)Common$(tput sgr0)" 26 | echo "$c3" 27 | fi 28 | if [[ "$c2" ]]; then 29 | echo "$(tput smul)Only in installed package$(tput sgr0)" 30 | echo "$c2" 31 | fi 32 | if [[ "$c1" ]]; then 33 | echo "$(tput smul)Only in automatic package$(tput sgr0)" 34 | echo "$c1" 35 | fi 36 | printf '\n------\n' 37 | done 38 | -------------------------------------------------------------------------------- /pypi2pkgbuild.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | """Convert PyPI entries to Arch Linux packages.""" 3 | 4 | import abc 5 | from abc import ABC 6 | from argparse import (Action, ArgumentParser, ArgumentDefaultsHelpFormatter, 7 | RawDescriptionHelpFormatter) 8 | import ast 9 | from collections import namedtuple 10 | from contextlib import suppress 11 | from functools import lru_cache 12 | import hashlib 13 | import importlib.metadata 14 | from io import StringIO 15 | import json 16 | import logging 17 | import os 18 | from pathlib import Path 19 | import re 20 | import shlex 21 | import shutil 22 | import site 23 | import subprocess 24 | from subprocess import CalledProcessError, PIPE, DEVNULL 25 | import sys 26 | from 
tempfile import NamedTemporaryFile, TemporaryDirectory 27 | import textwrap 28 | import urllib.request 29 | 30 | try: 31 | import setuptools_scm 32 | __version__ = setuptools_scm.get_version( # xref setup.py 33 | root=".", relative_to=__file__, 34 | version_scheme="post-release", local_scheme="node-and-date") 35 | except (ImportError, LookupError): 36 | try: 37 | __version__ = importlib.metadata.version("pypi2pkgbuild") 38 | except ModuleNotFoundError: 39 | __version__ = "(unknown version)" 40 | 41 | 42 | LOGGER = logging.getLogger(Path(__file__).stem) 43 | 44 | PKGTYPES = ["anywheel", "sdist", "manylinuxwheel"] 45 | PY_TAGS = ["py{0.major}".format(sys.version_info), 46 | "cp{0.major}".format(sys.version_info), 47 | "py{0.major}{0.minor}".format(sys.version_info), 48 | "cp{0.major}{0.minor}".format(sys.version_info)] 49 | THIS_ARCH = ["i686", "x86_64"][sys.maxsize > 2 ** 32] 50 | LICENSE_NAMES = ["LICENSE", "LICENSE.txt", "license.txt", 51 | "COPYING", "COPYING.md", "COPYING.rst", "COPYING.txt", 52 | "COPYRIGHT"] 53 | TROVE_COMMON_LICENSES = { # Licenses provided by base `licenses` package. 54 | "GNU Affero General Public License v3": 55 | "AGPL3", 56 | "GNU Affero General Public License v3 or later (AGPLv3+)": 57 | "AGPL3", 58 | "Apache Software License": 59 | "Apache", 60 | "Artistic License": 61 | "Artistic2.0", 62 | "Boost Software License 1.0 (BSL-1.0)": 63 | "Boost", 64 | # "CCPL", 65 | "Common Development and Distribution License 1.0 (CDDL-1.0)": 66 | "CDDL", 67 | "Eclipse Public License 1.0 (EPL-1.0)": 68 | "EPL", 69 | # "FDL1.2", # See FDL1.3. 
70 | "GNU Free Documentation License (FDL)": 71 | "FDL1.3", 72 | "GNU General Public License (GPL)": 73 | "GPL", 74 | "GNU General Public License v2 (GPLv2)": 75 | "GPL2", 76 | "GNU General Public License v2 or later (GPLv2+)": 77 | "GPL2", 78 | "GNU General Public License v3 (GPLv3)": 79 | "GPL3", 80 | "GNU General Public License v3 or later (GPLv3+)": 81 | "GPL3", 82 | "GNU Library or Lesser General Public License (LGPL)": 83 | "LGPL", 84 | "GNU Lesser General Public License v2 (LGPLv2)": 85 | "LGPL2.1", 86 | "GNU Lesser General Public License v2 or later (LGPLv2+)": 87 | "LGPL2.1", 88 | "GNU Lesser General Public License v3 (LGPLv3)": 89 | "LGPL3", 90 | "GNU Lesser General Public License v3 or later (LGPLv3+)": 91 | "LGPL3", 92 | # "LPPL", 93 | "Mozilla Public License 1.1 (MPL 1.1)": 94 | "MPL", 95 | "Mozilla Public License 2.0 (MPL 2.0)": 96 | "MPL2", 97 | # "PerlArtistic", # See Artistic2.0. 98 | # "PHP", 99 | "Python Software Foundation License": 100 | "PSF", 101 | # "RUBY", 102 | "W3C License": 103 | "W3C", 104 | "Zope Public License": 105 | "ZPL", 106 | } 107 | TROVE_SPECIAL_LICENSES = { # Standard licenses with specific line. 
108 | "BSD License": 109 | "BSD", 110 | "MIT License": 111 | "MIT", 112 | "zlib/libpng License": 113 | "ZLIB", 114 | "Python License (CNRI Python License)": 115 | "Python", 116 | } 117 | 118 | PKGBUILD_HEADER = """\ 119 | # Maintainer: {config[PACKAGER]} 120 | 121 | export PIP_CONFIG_FILE=/dev/null 122 | export PIP_DISABLE_PIP_VERSION_CHECK=true 123 | 124 | pkgname={pkg.pkgname} 125 | epoch={pkg.epoch} 126 | pkgver={pkg.pkgver} 127 | pkgrel={pkg.pkgrel} 128 | pkgdesc={pkg.pkgdesc} 129 | arch=({pkg.arch}) 130 | url={pkg.url} 131 | license=({pkg.license}) 132 | depends=(python {pkg.depends:{pkg.__class__.__name__}}) 133 | ## EXTRA_DEPENDS ## 134 | makedepends=({pkg.makedepends:{pkg.__class__.__name__}}) 135 | checkdepends=({pkg.checkdepends:{pkg.__class__.__name__}}) 136 | provides=({pkg.provides}) 137 | conflicts=(${{provides%=*}}) # No quotes, to avoid an empty entry. 138 | source=(PKGBUILD_EXTRAS) 139 | md5sums=(SKIP) 140 | noextract=() 141 | """ 142 | 143 | SDIST_SOURCE = """\ 144 | source+=({url[url]}) 145 | md5sums+=({url[md5_digest]}) 146 | """ 147 | 148 | WHEEL_ANY_SOURCE = """\ 149 | source+=({url[url]}) 150 | md5sums+=({url[md5_digest]}) 151 | noextract+=({name}) 152 | """ 153 | 154 | WHEEL_ARCH_SOURCE = """\ 155 | source_{arch}=({url[url]}) 156 | md5sums_{arch}=({url[md5_digest]}) 157 | noextract+=({name}) 158 | """ 159 | 160 | MORE_SOURCES = """\ 161 | source+=({names}) 162 | md5sums+=({md5s}) 163 | """ 164 | 165 | PKGBUILD_CONTENTS = """\ 166 | 167 | _first_source() { 168 | echo " ${source_i686[@]} ${source_x86_64[@]} ${source[@]}" | 169 | tr ' ' '\\n' | grep -Pv '^(PKGBUILD_EXTRAS)?$' | head -1 170 | } 171 | 172 | _vcs="$(grep -Po '^[a-z]+(?=\\+)' <<< "$(_first_source)")" 173 | if [[ "$_vcs" ]]; then 174 | makedepends+=("$(pkgfile --quiet /usr/bin/$_vcs)") 175 | provides+=("${pkgname%-$_vcs}") 176 | conflicts+=("${pkgname%-$_vcs}") 177 | fi 178 | 179 | _is_wheel() { 180 | [[ $(_first_source) =~ \\.whl$ ]] 181 | } 182 | 183 | if _is_wheel && [[ \\ 
$(basename "$(_first_source)" | rev | cut -d- -f1 | rev) =~ ^manylinux ]]; then 185 | options=(!strip) # https://github.com/pypa/manylinux/issues/119 186 | fi 187 | 188 | _dist_name() { 189 | find "$srcdir" -mindepth 1 -maxdepth 1 -type d -printf '%f\n' | 190 | grep -v '^_tmpenv$' 191 | } 192 | 193 | if [[ $(_first_source) =~ ^git+ ]]; then 194 | _pkgver() { 195 | ( set -o pipefail 196 | cd "$srcdir/$(_dist_name)" 197 | git describe --long --tags 2>/dev/null | 198 | sed 's/^v//;s/\\([^-]*-g\\)/r\\1/;s/-/./g' || 199 | printf "r%s.%s" \\ 200 | "$(git rev-list --count HEAD)" "$(git rev-parse --short HEAD)" 201 | ) 202 | } 203 | 204 | pkgver() { _pkgver; } 205 | fi 206 | 207 | _build() { 208 | if _is_wheel; then return; fi 209 | cd "$srcdir" 210 | # See Arch Wiki/PKGBUILD/license. 211 | # Get the first filename that matches. 212 | local test_name 213 | if [[ ${license[0]} =~ ^(BSD|MIT|ZLIB|Python)$ ]]; then 214 | for test_name in """ + " ".join(LICENSE_NAMES) + """; do 215 | if cp "$srcdir/$(_dist_name)/$test_name" "$srcdir/LICENSE" 2>/dev/null; then 216 | break 217 | fi 218 | done 219 | fi 220 | # Use the latest version of pip, as Arch's version is historically out of 221 | # date(!) and newer versions do fix bugs (sometimes). 222 | python -mvenv --clear --system-site-packages _tmpenv 223 | _tmpenv/bin/pip --quiet install -U pip 224 | # Build the wheel (which we allow to fail) only after fetching the license. 225 | # In order to isolate from ~/.pydistutils.cfg, we need to set $HOME to a 226 | # temporary directory, and thus first $XDG_CACHE_HOME back to its real 227 | # location, so that pip inserts the wheel in the wheel cache. We cannot 228 | # use --global-option=--no-user-cfg instead because that fully disables 229 | # wheels, causing a from-source build of build dependencies such as 230 | # numpy/scipy. 
231 | XDG_CACHE_HOME="${XDG_CACHE_HOME:-"$HOME/.cache"}" HOME=_tmpenv \\ 232 | _tmpenv/bin/pip wheel -v --no-deps --wheel-dir="$srcdir" \\ 233 | "./$(_dist_name)" || true 234 | } 235 | 236 | build() { _build; } 237 | 238 | _check() { 239 | # Define check(), possibly using _check as a helper, to run the tests. 240 | # You may need to call `python setup.py build_ext -i` first. 241 | if _is_wheel; then return; fi 242 | cd "$srcdir/$(_dist_name)" 243 | /usr/bin/python setup.py -q test 244 | } 245 | 246 | _package() { 247 | cd "$srcdir" 248 | # pypa/pip#3063: pip always checks for a globally installed version. 249 | python -mvenv --clear --system-site-packages _tmpenv 250 | _tmpenv/bin/pip install --prefix="$pkgdir/usr" \\ 251 | --no-deps --ignore-installed --no-warn-script-location \\ 252 | "$(ls ./*.whl 2>/dev/null || echo ./"$(_dist_name)")" 253 | if [[ -d "$pkgdir/usr/bin" ]]; then # Fix entry points. 254 | python="#!$(readlink -f _tmpenv)/bin/python" 255 | for f in "$pkgdir/usr/bin/"*; do 256 | # Like [[ "$(head -n1 "$f")" = "#!$(readlink -f _tmpenv)/bin/python" ]] 257 | # but without bash warning on null bytes in "$f" (if it is actually 258 | # a compiled executable, not an entry point). 259 | if python -c 'import os, sys; sys.exit(not open(sys.argv[1], "rb").read().startswith(os.fsencode(sys.argv[2]) + b"\\n"))' "$f" "$python"; then 260 | sed -i '1c#!/usr/bin/python' "$f" 261 | fi 262 | done 263 | fi 264 | if [[ -d "$pkgdir/usr/etc" ]]; then 265 | mv "$pkgdir/usr/etc" "$pkgdir/etc" 266 | fi 267 | if [[ -f LICENSE ]]; then 268 | install -D -m644 LICENSE "$pkgdir/usr/share/licenses/$pkgname/LICENSE" 269 | fi 270 | } 271 | 272 | package() { _package; } 273 | 274 | . "$(dirname "$BASH_SOURCE")/PKGBUILD_EXTRAS" 275 | 276 | # Remove makedepends already in depends (which may have been listed for the 277 | # first build, but autodetected on the second).
278 | makedepends=($(printf '%s\\n' "${makedepends[@]}" | 279 | grep -Pwv "^($(IFS='|'; echo "${depends[*]}"))$")) 280 | : # Apparently ending with makedepends assignment sometimes fails. 281 | """ 282 | 283 | METAPKGBUILD_CONTENTS = """\ 284 | package() { 285 | true 286 | } 287 | """ 288 | 289 | 290 | def _run_shell(args, **kwargs): 291 | """ 292 | Logging wrapper for `subprocess.run`, with useful defaults. 293 | 294 | Log at ``DEBUG`` level except if the *verbose* kwarg is set, in which case 295 | log at ``INFO`` level. 296 | """ 297 | kwargs = {"shell": isinstance(args, str), 298 | "env": {**os.environ, 299 | # This should fallback to C if the locale is not present. 300 | # We'd prefer C.utf8 but that doesn't exist. With other 301 | # locales, outputs cannot be parsed. 302 | "LC_ALL": "en_US.UTF-8", 303 | # LANGUAGE is also needed for e.g. pacman which goes 304 | # through gettext. 305 | "LANGUAGE": "en_US.UTF-8", 306 | "PYTHONNOUSERSITE": "1", 307 | "PIP_CONFIG_FILE": "/dev/null", 308 | "PIP_DISABLE_PIP_VERSION_CHECK": "1", 309 | "COLOREDLOGS_AUTO_INSTALL": "", # Hide pip logging. 310 | **kwargs.pop("env", {})}, 311 | "check": True, 312 | "text": True, 313 | **kwargs} 314 | if "cwd" in kwargs: 315 | kwargs["cwd"] = str(Path(kwargs["cwd"])) 316 | level = logging.INFO if kwargs.pop("verbose", None) else logging.DEBUG 317 | args_s = (args if isinstance(args, str) 318 | else " ".join(shlex.quote(str(arg)) for arg in args)) 319 | if "cwd" in kwargs: 320 | LOGGER.log(level, 321 | "Running subprocess from %s:\n%s", kwargs["cwd"], args_s) 322 | else: 323 | LOGGER.log(level, "Running subprocess:\n%s", args_s) 324 | cproc = subprocess.run(args, **kwargs) 325 | # Stripping final newlines matches the behavior of `a=$(foo)`. 
326 | if isinstance(cproc.stdout, str): 327 | cproc.stdout = cproc.stdout.rstrip("\n") 328 | elif isinstance(cproc.stdout, bytes): 329 | cproc.stdout = cproc.stdout.rstrip(b"\n") 330 | return cproc 331 | 332 | 333 | def _run_shell_stdout(args, **kwargs): 334 | """Run a shell command and return its stdout.""" 335 | return _run_shell(args, **kwargs, stdout=PIPE).stdout 336 | 337 | 338 | @lru_cache() 339 | def _get_readonly_clean_venv(): # "readonly" is an intent, but not enforced. 340 | venv_dir = TemporaryDirectory() 341 | _run_shell(["python", "-mvenv", venv_dir.name]) 342 | _run_shell([ # needed for version parsing 343 | f"{venv_dir.name}/bin/pip", "install", "packaging"], 344 | stdout=DEVNULL) 345 | return venv_dir # Don't let venv_dir get GC'd. 346 | 347 | 348 | def _run_python(args, **kwargs): 349 | """Run python from a temporary venv.""" 350 | venv_dir = _get_readonly_clean_venv().name 351 | return _run_shell( # args must be a list; str is not supported. 352 | [f"{venv_dir}/bin/python"] + args, **kwargs) 353 | 354 | 355 | @lru_cache() 356 | def get_makepkg_conf(): 357 | with TemporaryDirectory() as tmpdir: 358 | mini_pkgbuild = textwrap.dedent(r""" 359 | pkgname=_ 360 | pkgver=0 361 | pkgrel=0 362 | arch=(any) 363 | prepare() { 364 | printf "CFLAGS %s\0CXXFLAGS %s\0PACKAGER %s" \ 365 | "$CFLAGS" "$CXXFLAGS" "$PACKAGER" > log.txt 366 | exit 0 367 | } 368 | """) 369 | Path(tmpdir, "PKGBUILD").write_text(mini_pkgbuild) 370 | try: 371 | _run_shell("makepkg", cwd=tmpdir, stdout=PIPE, stderr=PIPE) 372 | except CalledProcessError as e: 373 | sys.stderr.write(e.stderr) 374 | raise 375 | out = Path(tmpdir, "src/log.txt").read_text() 376 | return dict(pair.split(" ", 1) for pair in out.split("\0")) 377 | 378 | 379 | class ArchVersion(namedtuple("_ArchVersion", "epoch pkgver pkgrel")): 380 | @classmethod 381 | def parse(cls, s): 382 | epoch, pkgver, pkgrel = ( 383 | re.fullmatch(r"(?:(.*):)?(.*)-(.*)", s).groups()) 384 | return cls(epoch or "", pkgver, pkgrel) 385 | 386 
| def __str__(self): 387 | return (f"{self.epoch}:{self.pkgver}-{self.pkgrel}" if self.epoch 388 | else f"{self.pkgver}-{self.pkgrel}") 389 | 390 | 391 | class WheelInfo( 392 | namedtuple("_WheelInfo", "name version build pythons abi platform")): 393 | @classmethod 394 | def parse(cls, url): 395 | parts = Path(urllib.parse.urlparse(url).path).stem.split("-") 396 | if len(parts) == 5: 397 | name, version, pythons, abi, platform = parts 398 | build = "" 399 | elif len(parts) == 6: 400 | name, version, build, pythons, abi, platform = parts 401 | else: 402 | raise ValueError(f"Invalid wheel url: {url}") 403 | return cls( 404 | name, version, build, set(pythons.split(".")), abi, platform) 405 | 406 | def get_arch_platforms(self): 407 | # any -> any 408 | # manylinuxXXX_{i686,x86_64}.manylinuxYYY_{...} -> {i686,x86_64}, {...} 409 | # No other wheel tags (e.g. windows/macos) reach this point because 410 | # they are first filtered away by _filter_and_sort_urls. 411 | platforms = [] 412 | regex = ( 413 | "(any)" 414 | # https://peps.python.org/pep-0600/#package-indexes 415 | "|manylinux1_(x86_64|i686)" 416 | "|manylinux2010_(x86_64|i686)" 417 | "|manylinux2014_(x86_64|i686|aarch64|armv7l|ppc64|ppc64le|s390x)" 418 | "|manylinux_[0-9]+_[0-9]+_(.*)") 419 | for part in self.platform.split("."): 420 | platform, = filter(None, re.fullmatch(regex, part).groups()) 421 | platforms.append(platform) 422 | return platforms 423 | 424 | 425 | # Copy-pasted from PEP503. 426 | def pep503_normalize_name(name): 427 | return re.sub(r"[-_.]+", "-", name).lower() 428 | 429 | 430 | def to_wheel_name(pep503_name): 431 | return pep503_name.replace("-", "_") 432 | 433 | 434 | def gen_ver_cmp_operator(ver): 435 | # Handle cases where only a post-release is available. 
436 | return f"=={ver}.*,<{ver}.1" 437 | 438 | 439 | class PackagingError(Exception): 440 | pass 441 | 442 | 443 | def _get_vcs(name): 444 | match = re.match(r"\A[a-z]+(?=\+)", name) 445 | return match.group(0) if match else None 446 | 447 | 448 | # Vendored from pip._internal.vcs.VersionControl.get_url_rev. 449 | def _vcs_get_url_rev(url): 450 | error_message = ( 451 | "Sorry, '%s' is a malformed VCS url. " 452 | "The format is +://, " 453 | "e.g. svn+http://myrepo/svn/MyApp#egg=MyApp" 454 | ) 455 | assert '+' in url, error_message % url 456 | url = url.split('+', 1)[1] 457 | scheme, netloc, path, query, frag = urllib.parse.urlsplit(url) 458 | rev = None 459 | if '@' in path: 460 | path, rev = path.rsplit('@', 1) 461 | url = urllib.parse.urlunsplit((scheme, netloc, path, query, '')) 462 | return url, rev 463 | 464 | 465 | @lru_cache() 466 | def _get_url_impl(url): 467 | cache_dir = TemporaryDirectory() 468 | parsed = urllib.parse.urlparse(url) 469 | if parsed.scheme.startswith("git+"): 470 | _run_shell(["git", "clone", "--recursive", url[4:]], 471 | cwd=cache_dir.name) 472 | elif parsed.scheme == "pip": 473 | try: 474 | _run_python([ 475 | "-mpip", "download", "--no-deps", "-d", cache_dir.name, 476 | *(parsed.fragment.split() if parsed.fragment else []), 477 | parsed.netloc]) 478 | except CalledProcessError: 479 | # pypa/pip#1884: download can "fail" due to buggy setup.py (e.g. 480 | # astropy 1.3.3). 481 | raise PackagingError(f"Failed to download {parsed.netloc}, " 482 | "possibly due to a buggy setup.py") 483 | else: 484 | Path(cache_dir.name, Path(parsed.path).name).write_bytes( 485 | urllib.request.urlopen(url).read()) 486 | packed_path, = (path for path in Path(cache_dir.name).iterdir()) 487 | return cache_dir, packed_path # Don't let cache_dir get GC'd. 
488 | 489 | 490 | def _get_url_packed_path(url): 491 | cache_dir, packed_path = _get_url_impl(url) 492 | return packed_path 493 | 494 | 495 | @lru_cache() 496 | def _get_url_unpacked_path_or_null(url): 497 | parsed = urllib.parse.urlparse(url) 498 | if parsed.scheme == "file" and parsed.path.endswith(".whl"): 499 | return Path("/dev/null") 500 | try: 501 | cache_dir, packed_path = _get_url_impl(url) 502 | except CalledProcessError: 503 | return Path("/dev/null") 504 | if packed_path.is_file(): # pip:// 505 | shutil.unpack_archive(str(packed_path), cache_dir.name) 506 | unpacked_path, = ( 507 | path for path in Path(cache_dir.name).iterdir() if path.is_dir()) 508 | return unpacked_path 509 | 510 | 511 | @lru_cache() 512 | def _guess_url_makedepends(url, guess_makedepends): 513 | makedepends = [] 514 | if ("swig" in guess_makedepends 515 | and list(_get_url_unpacked_path_or_null(url).glob("**/*.i"))): 516 | makedepends.append(NonPyPackageRef("swig")) 517 | if ("cython" in guess_makedepends 518 | and list(_get_url_unpacked_path_or_null(url).glob("**/*.pyx"))): 519 | makedepends.append(PackageRef("Cython")) 520 | return DependsTuple(makedepends) 521 | 522 | 523 | @lru_cache() 524 | def _get_metadata(name, setup_requires): 525 | # Dependency resolution is done by installing the package in a venv and 526 | # calling `pip show`; otherwise it would be necessary to parse environment 527 | # markers (from "requires_dist"). The package name may get denormalized 528 | # ("_" -> "-") during installation so we just look at whatever got 529 | # installed. 530 | # 531 | # `entry_points` is a generator, thus not json-serializable. 532 | # 533 | # To handle sdists that depend on numpy, we just see whether installing in 534 | # presence of numpy makes things better... 
535 | with TemporaryDirectory() as venv_dir, \ 536 | NamedTemporaryFile("r") as more_requires_log, \ 537 | NamedTemporaryFile("r") as log: 538 | script = textwrap.dedent(r""" 539 | set -e 540 | python -mvenv {venv_dir} 541 | # Leave the source directory, which may contain wheels/sdists/etc. 542 | cd {venv_dir} 543 | . '{venv_dir}/bin/activate' 544 | if [[ -n '{setup_requires}' ]]; then 545 | pip install --upgrade {setup_requires} >/dev/null 546 | fi 547 | install_cmd() {{ 548 | pip list --format=freeze | cut -d= -f1 | sort >'{venv_dir}/a' 549 | if ! pip install --no-deps '{req}'; then 550 | return 1 551 | fi 552 | pip list --format=freeze | cut -d= -f1 | sort >'{venv_dir}/b' 553 | # installed name, or real name if it doesn't appear 554 | # (setuptools, pip, Cython, numpy). 555 | install_name="$(comm -13 '{venv_dir}/a' '{venv_dir}/b')" 556 | # the requirement can be 'req_name==version', or a path name. 557 | if [[ -z "$install_name" ]]; then 558 | if [[ -e '{req}' ]]; then 559 | install_name="$(basename '{req}' .git)" 560 | else 561 | install_name="$(echo '{req}' | cut -d= -f1 -)" 562 | fi 563 | fi 564 | }} 565 | show_cmd() {{ 566 | python - "$(pip show -v "$install_name")" <<EOF 567 | # Dump the RFC822-style "pip show -v" output as JSON. 568 | import email.parser, json, sys 569 | print(json.dumps(dict( 570 | email.parser.Parser().parsestr(sys.argv[1])))) 571 | EOF 572 | }} 573 | if install_cmd >{log.name}; then 574 | show_cmd 575 | else 576 | pip install numpy >/dev/null 577 | echo numpy >>{more_requires_log.name} 578 | install_cmd >{log.name} 579 | show_cmd 580 | fi 581 | """).format( 582 | venv_dir=venv_dir, 583 | setup_requires=" ".join(setup_requires), 584 | req=(_get_url_unpacked_path_or_null(name) 585 | if _get_vcs(name) else name), 586 | more_requires_log=more_requires_log, 587 | log=log) 588 | try: 589 | out = _run_shell_stdout( 590 | script, env={ 591 | # Matters, as a built wheel would get cached. 592 | "CFLAGS": get_makepkg_conf()["CFLAGS"], 593 | # Not actually used, per pypa/setuptools#1192. Still 594 | # relevant for packages that ship their own autoconf-based 595 | # builds, e.g. wxPython.
596 | "CXXFLAGS": get_makepkg_conf()["CXXFLAGS"], 597 | }) 598 | except CalledProcessError: 599 | sys.stderr.write(log.read()) 600 | raise PackagingError(f"Failed to obtain metadata for {name}.") 601 | more_requires = more_requires_log.read().splitlines() 602 | metadata = {k.lower(): v for k, v in json.loads(out).items()} 603 | metadata["requires"] = [ 604 | *(metadata["requires"].split(", ") if metadata["requires"] else []), 605 | *more_requires] 606 | metadata["classifiers"] = metadata["classifiers"].split("\n ")[1:] 607 | return {key.replace("-", "_"): value for key, value in metadata.items()} 608 | 609 | 610 | @lru_cache() 611 | def _get_info(name, *, 612 | pre=False, 613 | guess_makedepends=(), 614 | _sources=("git", "local", "pypi"), 615 | _version=""): 616 | 617 | parsed = urllib.parse.urlparse(name) 618 | 619 | def _get_info_git(): 620 | if not parsed.scheme.startswith("git+"): 621 | return 622 | url, rev = _vcs_get_url_rev(name) 623 | if rev: 624 | # FIXME pip guesses whether a name is a branch, a commit or a tag, 625 | # whereas the fragment type must be specified in the PKGBUILD. 626 | # FIXME fragment support. 627 | raise PackagingError( 628 | "No support for packaging specific revisions.") 629 | metadata = _get_metadata( 630 | name, _guess_url_makedepends(name, guess_makedepends).pep503_names) 631 | try: # Normalize the name if available on PyPI. 
632 | metadata["name"] = _get_info( 633 | metadata["name"], _sources=("pypi",))["info"]["name"] 634 | except PackagingError: 635 | pass 636 | return {"info": {"download_url": url, 637 | "home_page": url, 638 | "package_url": url, 639 | **metadata}, 640 | "urls": [{"packagetype": "sdist", 641 | "path": parsed.path, 642 | "url": name, 643 | "md5_digest": "SKIP"}]} 644 | 645 | def _get_info_local(): 646 | if not parsed.scheme == "file": 647 | return 648 | metadata = _get_metadata( 649 | name, _guess_url_makedepends(name, guess_makedepends).pep503_names) 650 | return {"info": {"download_url": name, 651 | "home_page": name, 652 | "package_url": name, 653 | **metadata}, 654 | "urls": [{"packagetype": 655 | "bdist_wheel" if parsed.path.endswith(".whl") 656 | else "sdist", 657 | "path": parsed.path, 658 | "url": name, 659 | "md5_digest": "SKIP"}]} 660 | 661 | def _get_info_pypi(): 662 | try: 663 | r = urllib.request.urlopen( 664 | f"https://pypi.org/pypi/{name}/{_version}/json" 665 | if _version else f"https://pypi.org/pypi/{name}/json") 666 | except urllib.error.HTTPError: 667 | return 668 | request = json.loads(r.read()) 669 | if not _version: 670 | if not request["releases"]: 671 | raise PackagingError(f"No suitable release found for {name}.") 672 | src = ("from sys import argv; " 673 | "from packaging.version import parse as pv; ") 674 | src += ( 675 | "print(sorted(argv[1:], key=pv))" if pre else 676 | "print(sorted([v for v in argv[1:] if not pv(v).is_prerelease]," 677 | "key=pv))") 678 | versions = ast.literal_eval( 679 | _run_python(["-c", src, *request["releases"]], stdout=PIPE) 680 | .stdout) 681 | if not versions: # request only returned pre-releases. 682 | raise PackagingError( 683 | f"No suitable release found for {name}. 
Pre-releases are " 684 | f"available, use --pre to use the latest one.") 685 | max_version = versions[-1] 686 | if max_version != request["info"]["version"]: 687 | return _get_info(name, pre=pre, _version=max_version) 688 | return request 689 | 690 | for source in _sources: 691 | info = locals()[f"_get_info_{source}"]() 692 | if info: 693 | return info 694 | else: 695 | raise PackagingError("Package {} not found.".format( 696 | " ".join(filter(None, [name, _version])))) 697 | 698 | 699 | # For _find_{installed,arch}_name_version: 700 | # - first check for a matching `.{dist,egg}-info` file, ignoring case to 701 | # handle e.g. `cycler` (pip) / `Cycler` (PyPI); also, there is usually a 702 | # version number (separated by a dash) but not always, e.g. for PySide6 (in 703 | # which case a dot comes next). 704 | # - then check exact lowercase matches, to handle packages without a 705 | # `.{dist,egg}-info`. 706 | 707 | 708 | def _find_installed_name_version(pep503_name, *, ignore_vendored=False): 709 | parts = ( 710 | _run_shell_stdout( 711 | "find . -maxdepth 1 -iname '%s[.-]*-info' " 712 | "-exec pacman -Qo '{}' \\; | rev | cut -d' ' -f1,2 | rev" 713 | % (to_wheel_name(pep503_name) 714 | # https://github.com/pypa/wheel/issues/440 715 | .replace("-", "[-.]").replace("_", "[_.]")), 716 | cwd=site.getsitepackages()[0]).split() 717 | or _run_shell_stdout( 718 | f"pacman -Q python-{pep503_name} 2>/dev/null", 719 | check=False).split()) 720 | if parts: 721 | pkgname, version = parts # This will raise if there is an ambiguity. 
722 | if pkgname.endswith("-git"): 723 | expected_conflict = pkgname[:-len("-git")] 724 | if _run_shell( 725 | f"pacman -Qi {pkgname} 2>/dev/null | " 726 | rf"grep -q 'Conflicts With *:.*\b{expected_conflict}\b'", 727 | check=False).returncode == 0: 728 | pkgname = pkgname[:-len("-git")] 729 | else: 730 | raise PackagingError( 731 | f"Found installed package {pkgname} which does NOT " 732 | f"conflict with {expected_conflict}; please uninstall it " 733 | f"first.") 734 | if ignore_vendored and pkgname.startswith("python--"): 735 | return 736 | else: 737 | return pkgname, ArchVersion.parse(version) 738 | else: 739 | return 740 | 741 | 742 | def _find_arch_name_version(pep503_name): 743 | for standalone in [True, False]: # vendored into another Python package? 744 | *candidates, = map(str.strip, _run_shell_stdout( 745 | "pkgfile -riv " 746 | "'^/usr/lib/python{version.major}\\.{version.minor}/{parent}" 747 | "{wheel_name}-.*py{version.major}\\.{version.minor}\\.egg-info' | " 748 | "cut -f1 | uniq | cut -d/ -f2".format( 749 | parent="site-packages/" if standalone else "", 750 | wheel_name=to_wheel_name(pep503_name), 751 | version=sys.version_info) 752 | ).splitlines()) 753 | if len(candidates) > 1: 754 | message = "Multiple candidates for {}: {}.".format( 755 | pep503_name, ", ".join(candidates)) 756 | try: 757 | canonical, = ( 758 | candidate for candidate in candidates 759 | if candidate.startswith(f"python-{pep503_name} ")) 760 | except ValueError: 761 | raise PackagingError(message) 762 | else: 763 | LOGGER.warning("%s Using canonical name: %s.", 764 | message, canonical.split()[0]) 765 | candidates = [canonical] 766 | if len(candidates) == 1: 767 | pkgname, version = candidates[0].split() 768 | arch_version = ArchVersion.parse(version) 769 | return pkgname, arch_version 770 | 771 | 772 | class NonPyPackageRef: 773 | def __init__(self, pkgname): 774 | self.pkgname = self.depname = pkgname 775 | 776 | 777 | class PackageRef: 778 | def __init__(self, name, *, 779 | 
pre=False, guess_makedepends=(), subpkg_of=None): 780 | # If `subpkg_of` is set, do not attempt to use the Arch Linux name, 781 | # and name the package python--$pkgname to prevent collision. 782 | self.orig_name = name # A name or an URL. 783 | self.info = _get_info( 784 | name, pre=pre, guess_makedepends=guess_makedepends) 785 | self.pypi_name = self.info["info"]["name"] 786 | # pacman -Slq | grep '^python-' | cut -d- -f 2- | 787 | # grep -v '^\([[:alnum:]]\)*$' | grep '_' 788 | # (or '\.', or '-') shows that PEP503 normalization is by far the most 789 | # common, so we use it everywhere... except when downloading, which 790 | # requires the actual PyPI-registered name. 791 | self.pep503_name = pep503_normalize_name(self.pypi_name) 792 | 793 | if subpkg_of: 794 | pkgname = f"python--{self.pep503_name}" 795 | depname = subpkg_of.pkgname 796 | arch_version = None 797 | 798 | else: 799 | # For the name as package: First, check installed packages, 800 | # which may have inherited non-standard names from the AUR (e.g., 801 | # `python-numpy-openblas`, `pipdeptree`). Specifically ignore 802 | # vendored packages (`python--*`). Then, check official packages. 803 | # Then, fallback on the default. 804 | # For the name as dependency, try the official name first, so that 805 | # one can replace the local package (which provides the official 806 | # one anyways) by the official one if desired without breaking 807 | # dependencies. 
808 | 809 | installed = _find_installed_name_version( 810 | self.pep503_name, ignore_vendored=True) 811 | arch = _find_arch_name_version(self.pep503_name) 812 | default = f"python-{self.pep503_name}", None 813 | pkgname, arch_version = installed or arch or default 814 | depname, _ = arch or installed or default 815 | 816 | arch_packaged = sorted({*_run_shell_stdout( 817 | f"pkgfile -l {pkgname} 2>/dev/null | " 818 | # Package name has no dash (per packaging standard) nor slashes 819 | # (which can occur when a subpackage is vendored (depending on how 820 | # it is done), e.g. `.../foo.egg-info` and `.../foo/bar.egg-info` 821 | # both existing). 822 | r"grep -Po '(?<=site-packages/)[^-/]*(?=.*\.egg-info/?$)'", 823 | check=False).splitlines()}) 824 | 825 | # Final values. 826 | vcs = _get_vcs(name) 827 | self.pkgname = f"{pkgname}-{vcs}" if vcs else pkgname 828 | # Packages that depend on a vendored package should list the 829 | # metapackage (which may be otherwise unrelated) as a dependency, so 830 | # that the metapackage can get updated into an official package without 831 | # breaking dependencies. 832 | # However, the owning metapackage should list their vendorees 833 | # explicitly, so that they do not end up unrequired (other metapackages 834 | # don't matter as they only depend on their own components). 835 | # This logic is implemented in `DependsTuple.__fmt__`. 836 | self.depname = depname 837 | self.arch_version = arch_version 838 | self.arch_packaged = arch_packaged 839 | self.exists = arch_version is not None 840 | 841 | 842 | class DependsTuple(tuple): # Keep it hashable. 843 | @property 844 | def pep503_names(self): 845 | # Needs to be hashable. 846 | return tuple(ref.pep503_name for ref in self 847 | if isinstance(ref, PackageRef)) 848 | 849 | def __format__(self, fmt): 850 | # See above re: dependency type. 851 | def _unique(seq): return [*dict.fromkeys(seq)] # Unique, in order. 
852 | if fmt == "Package": 853 | return " ".join(_unique(ref.depname for ref in self)) 854 | elif fmt == "MetaPackage": 855 | return " ".join(_unique(ref.pkgname for ref in self)) 856 | else: 857 | return super().__format__(fmt) # Raise TypeError. 858 | 859 | 860 | BuildCacheEntry = namedtuple( 861 | "BuildCacheEntry", "pkgname path is_dep namcap_report") 862 | 863 | 864 | class _BasePackage(ABC): 865 | build_cache = [] 866 | 867 | def __init__(self): 868 | self._files = {} 869 | # self._pkgbuild = ... 870 | 871 | @abc.abstractmethod 872 | def write_deps(self, options): 873 | pass 874 | 875 | def get_pkgbuild_extras(self, options): 876 | if os.path.isdir(options.pkgbuild_extras): 877 | extras_path = Path(options.pkgbuild_extras, 878 | f"{self.pkgname}.PKGBUILD_EXTRAS") 879 | if extras_path.exists(): 880 | LOGGER.info("Using %s.", extras_path) 881 | return extras_path.read_text() 882 | else: 883 | return "" 884 | else: 885 | return options.pkgbuild_extras 886 | 887 | def write(self, options): 888 | cwd = options.base_path / self.pkgname 889 | cwd.mkdir(parents=True, exist_ok=options.force) 890 | (cwd / "PKGBUILD").write_text(self._pkgbuild) 891 | (cwd / "PKGBUILD_EXTRAS").write_text(self.get_pkgbuild_extras(options)) 892 | for fname, content in self._files.items(): 893 | (cwd / fname).write_bytes(content) 894 | if isinstance(self, Package): 895 | srctree = _get_url_packed_path(self._get_pip_url()) 896 | dest = cwd / srctree.name 897 | with suppress(FileNotFoundError): 898 | if dest.is_dir(): 899 | shutil.rmtree(dest) 900 | else: 901 | dest.unlink() 902 | shutil.move(srctree, dest) 903 | cmd = ["makepkg", 904 | *(["--force"] if options.force else []), 905 | *shlex.split(options.makepkg)] 906 | _run_shell(cmd, cwd=cwd) 907 | 908 | def _get_fullpath(): 909 | # This may be absolute and not in cwd (if PKGDEST is set). 910 | # --packagelist may output multiple lines when debug option is set. 911 | # Only take the first line (the main package).
912 | return Path(_run_shell_stdout( 913 | "makepkg --packagelist", 914 | cwd=cwd).splitlines()[0]) 915 | 916 | fullpath = _get_fullpath() 917 | # Update PKGBUILD. 918 | needs_rebuild = False 919 | # fullpath may not exist if --makepkg=--nobuild. 920 | namcap = (_run_shell_stdout(["namcap", fullpath], cwd=cwd).splitlines() 921 | if fullpath.exists() else []) 922 | # `pkgver()` may update the PKGBUILD, so reread it. 923 | pkgbuild_contents = (cwd / "PKGBUILD").read_text() 924 | # Binary dependencies. 925 | extra_deps_re = (f"(?<=^{self.pkgname} " 926 | "E: Dependency ).*(?= detected and not included)") 927 | extra_deps = [ 928 | match.group(0) 929 | for match in map(re.compile(extra_deps_re).search, namcap) 930 | if match] 931 | pkgbuild_contents = pkgbuild_contents.replace( 932 | "## EXTRA_DEPENDS ##", 933 | "depends+=({})".format(" ".join(extra_deps))) 934 | if extra_deps: 935 | needs_rebuild = True 936 | # Unexpected arch-dependent package (e.g. direct compilation of C 937 | # source). 938 | any_arch_re = (f"^{self.pkgname} " 939 | "E: ELF file .* found in an 'any' package.") 940 | if any(re.search(any_arch_re, line) for line in namcap): 941 | pkgbuild_contents = re.sub( 942 | "(?m)^arch=.*$", f"arch=({THIS_ARCH})", pkgbuild_contents, 1) 943 | needs_rebuild = True 944 | if needs_rebuild: 945 | # Remove previous package, repackage, and get new name (arch may 946 | # have changed). 947 | fullpath.unlink() 948 | (cwd / "PKGBUILD").write_text(pkgbuild_contents) 949 | _run_shell("makepkg --force --repackage --nodeps", cwd=cwd) 950 | fullpath = _get_fullpath() 951 | namcap_pkgbuild_report = _run_shell_stdout( 952 | "namcap PKGBUILD", cwd=cwd, check=False) 953 | # Suppressed namcap warnings (may be better to do this via a namcap 954 | # option?): 955 | # - Python dependencies always get misanalyzed; filter them away. 956 | # - Dependencies match install_requires + whatever namcap wants us to 957 | # add, so suppress warning about redundant transitive dependencies. 
958 | # - Extension modules unconditionally link to `libpthread` (see 959 | # output of `python-config --libs`); filter that away. 960 | # - Extension modules appear to never be PIE? 961 | namcap_package_report = ( 962 | _run_shell_stdout( 963 | f"namcap {shlex.quote(str(fullpath))} | " 964 | f"grep -v \"^{self.pkgname} W: " 965 | r"\(Dependency included and not needed" 966 | r"\|Dependency .* included but already satisfied$" 967 | r"\|Unused shared library '/usr/lib/libpthread\.so\.0' by" 968 | r"\|ELF file .* lacks PIE\.$\)" 969 | "\"", cwd=cwd, check=False) 970 | if fullpath.exists() else "") 971 | namcap_report = [ 972 | line for report in [namcap_pkgbuild_report, namcap_package_report] 973 | for line in report.split("\n") if line] 974 | if re.search(f"^{self.pkgname} E: ", namcap_package_report): 975 | raise PackagingError("namcap found a problem with the package.") 976 | _run_shell("makepkg --printsrcinfo >.SRCINFO", cwd=cwd) 977 | type(self).build_cache.append(BuildCacheEntry( 978 | self.pkgname, fullpath, options.is_dep, namcap_report)) 979 | # FIXME Suppress message about redundancy of 'python' dependency. 980 | 981 | 982 | class Package(_BasePackage): 983 | def __init__(self, ref, options): 984 | super().__init__() 985 | 986 | self._ref = ref 987 | self._pkgrel = options.pkgrel 988 | 989 | stream = StringIO() 990 | 991 | LOGGER.info("Packaging %s %s.", 992 | self.pkgname, ref.info["info"]["version"]) 993 | self._urls = self._filter_and_sort_urls( 994 | ref.info["urls"], options.pkgtypes) 995 | if not self._urls: 996 | raise PackagingError( 997 | f"No URL available for package {self.pkgname}.") 998 | 999 | self._find_makedepends(options) 1000 | for dep in self._makedepends: 1001 | if _run_shell(f"pacman -Q {dep.pkgname} >/dev/null 2>&1", 1002 | check=False).returncode: 1003 | # Only log this as needed, to not spam messages about pip. 
1004 | _run_shell(f"sudo pacman -S --asdeps {dep.pkgname}", 1005 | verbose=True) 1006 | self._extract_setup_requires() 1007 | 1008 | metadata = _get_metadata( 1009 | f"{ref.orig_name}{gen_ver_cmp_operator(self.pkgver)}" 1010 | if urllib.parse.urlparse(ref.orig_name).scheme == "" 1011 | else ref.orig_name, 1012 | self._makedepends.pep503_names) 1013 | self._depends = DependsTuple( 1014 | PackageRef(req, pre=options.pre) 1015 | if options.build_deps else 1016 | # FIXME Could use something slightly better, i.e. still check local 1017 | # packages... 1018 | NonPyPackageRef("python-{}".format(pep503_normalize_name(req))) 1019 | for req in metadata["requires"]) 1020 | self._licenses = self._find_license() 1021 | 1022 | arches = [] 1023 | src_template = None 1024 | sources = [] 1025 | if self._urls[0]["packagetype"] == "bdist_wheel": 1026 | for url in self._urls: 1027 | if url["packagetype"] != "bdist_wheel": 1028 | continue 1029 | wheel_info = WheelInfo.parse(url["url"]) 1030 | if wheel_info.platform == "any": 1031 | # If there is both an any wheel and one or more 1032 | # arch-specific wheels, do not mix them up. 
1033 | if src_template == WHEEL_ARCH_SOURCE: 1034 | continue 1035 | arches.append("any") 1036 | src_template = WHEEL_ANY_SOURCE 1037 | else: 1038 | if src_template == WHEEL_ANY_SOURCE: 1039 | continue 1040 | arches.extend(wheel_info.get_arch_platforms()) 1041 | src_template = WHEEL_ARCH_SOURCE 1042 | sources.extend( 1043 | src_template.format( 1044 | arch=arch, 1045 | url=url, 1046 | name=Path(urllib.parse.urlparse(url["url"]).path).name) 1047 | for arch in wheel_info.get_arch_platforms()) 1048 | else: 1049 | arches.append("any") 1050 | sources.append(SDIST_SOURCE.format(url=self._urls[0])) 1051 | self._arch = sorted({*arches}) 1052 | stream.write( 1053 | PKGBUILD_HEADER.format(pkg=self, config=get_makepkg_conf())) 1054 | stream.write("".join(sources)) 1055 | stream.write(MORE_SOURCES.format( 1056 | names=" ".join(shlex.quote(name) 1057 | for name in self._files), 1058 | md5s=" ".join(hashlib.md5(content).hexdigest() 1059 | for content in self._files.values()))) 1060 | stream.write(PKGBUILD_CONTENTS) 1061 | 1062 | self._pkgbuild = stream.getvalue() 1063 | 1064 | def _filter_and_sort_urls(self, unfiltered_urls, pkgtypes): 1065 | urls = [] 1066 | for url in unfiltered_urls: 1067 | if url["packagetype"] == "bdist_wheel": 1068 | wh_info = WheelInfo.parse(url["url"]) 1069 | if not wh_info.pythons.intersection(PY_TAGS): 1070 | continue 1071 | if wh_info.platform == "any": 1072 | pkgtype = "anywheel" 1073 | elif wh_info.platform.startswith("manylinux"): 1074 | pkgtype = "manylinuxwheel" 1075 | else: 1076 | continue 1077 | try: 1078 | order = pkgtypes.index(pkgtype) 1079 | except ValueError: 1080 | continue 1081 | else: 1082 | # - https://packaging.python.org/en/latest/specifications/binary-distribution-format/#escaping-and-unicode 1083 | # The wheel name is the normalized name, but uppercase 1084 | # should be supported too, and "." can occur instead of 1085 | # "_".
1086 | # - PyPI currently allows uploading of packages with local 1087 | # version identifiers, see pypa/pypi-legacy#486. 1088 | if (wh_info.name.lower().replace(".", "_") 1089 | != to_wheel_name(self._ref.pep503_name)): 1090 | LOGGER.warning( 1091 | "Unexpected wheel info: %s " 1092 | "(expected case-insensitive name: %s)", 1093 | wh_info, to_wheel_name(self._ref.pep503_name)) 1094 | elif wh_info.version != self._ref.info["info"]["version"]: 1095 | LOGGER.warning( 1096 | "Unexpected wheel info: %s (expected version: %s)", 1097 | wh_info, self._ref.info["info"]["version"]) 1098 | else: 1099 | urls.append((url, order)) 1100 | elif url["packagetype"] == "sdist": 1101 | with suppress(ValueError): 1102 | urls.append((url, pkgtypes.index("sdist"))) 1103 | else: # Skip other dists. 1104 | continue 1105 | return [url for url, key in sorted(urls, key=lambda kv: kv[1])] 1106 | 1107 | def _get_first_package_type(self): 1108 | return self._urls[0]["packagetype"] 1109 | 1110 | def _get_sdist_url(self): 1111 | parsed = urllib.parse.urlparse(self._ref.orig_name) 1112 | return (self._ref.orig_name 1113 | if re.match(r"\A(git\+|file\Z)", parsed.scheme) 1114 | # pypa/pip#1884: pip download will actually run egg_info, thus 1115 | # install setup_requires, but we don't want to bother e.g. 1116 | # rebuilding numpy just for getting a sdist. So, be accurate 1117 | # when specifying --no-binary. 
1118 | else "pip://{0}{1}#--no-binary={0}".format( 1119 | self._ref.pypi_name, gen_ver_cmp_operator(self.pkgver))) 1120 | 1121 | def _get_pip_url(self): 1122 | parsed = urllib.parse.urlparse(self._ref.orig_name) 1123 | return ( 1124 | self._ref.orig_name 1125 | if re.match(r"\A(git\+|file\Z)", parsed.scheme) else 1126 | f"pip://{self._ref.pypi_name}{gen_ver_cmp_operator(self.pkgver)}") 1127 | 1128 | def _find_makedepends(self, options): 1129 | self._makedepends = DependsTuple(( 1130 | *map(PackageRef, options.setup_requires), 1131 | *(_guess_url_makedepends(self._get_sdist_url(), 1132 | options.guess_makedepends) 1133 | if self._get_first_package_type() != "bdist_wheel" else ()))) 1134 | with TemporaryDirectory() as tmpdir: 1135 | Path(tmpdir, "PKGBUILD").write_text( 1136 | # makepkg always requires that these three variables are set. 1137 | f"pkgname={self.pkgname}\n" 1138 | f"pkgver={self.pkgver}\n" 1139 | f"pkgrel={self.pkgrel}\n" 1140 | + self.get_pkgbuild_extras(options)) 1141 | extra_makedepends = _run_shell_stdout( 1142 | r"makepkg --printsrcinfo | " 1143 | r"grep -Po '(?<=^\tmakedepends = ).*'", 1144 | cwd=tmpdir, check=False) 1145 | if extra_makedepends: 1146 | self._makedepends = DependsTuple( 1147 | [*self._makedepends, 1148 | # Use NonPyPackageRef even when the extra makedepends is 1149 | # actually a Python package, because we need access to it 1150 | # (as a system package) from within the build venv. 
1151 | *map(NonPyPackageRef, extra_makedepends.split("\n"))]) 1152 | 1153 | def _extract_setup_requires(self): 1154 | makedepends = [] 1155 | for pkg in self._makedepends: 1156 | if isinstance(pkg, PackageRef): 1157 | makedepends.append(pkg) 1158 | elif isinstance(pkg, NonPyPackageRef): 1159 | pep503_name = _run_shell_stdout( 1160 | f"pacman -Qql {pkg.pkgname} | " 1161 | f"grep -Po '(?<=^{site.getsitepackages()[0]}/)" 1162 | r"[^-]*(?=-.*\.(dist|egg)-info/$)'", 1163 | check=False) 1164 | makedepends.append( 1165 | PackageRef(pep503_name) if pep503_name else pkg) 1166 | else: 1167 | raise TypeError("Unexpected makedepends entry") 1168 | self._makedepends = DependsTuple(makedepends) 1169 | 1170 | def _find_license(self): 1171 | # FIXME Support license-in-wheel. 1172 | info = self._ref.info["info"] 1173 | licenses = [] 1174 | license_classes = [ 1175 | classifier for classifier in info["classifiers"] 1176 | if classifier.startswith("License :: ") 1177 | and classifier != "License :: OSI Approved"] # What's that?... 1178 | if license_classes: 1179 | for license_class in license_classes: 1180 | *_, license_class = license_class.split(" :: ") 1181 | try: 1182 | licenses.append( 1183 | {**TROVE_COMMON_LICENSES, 1184 | **TROVE_SPECIAL_LICENSES}[license_class]) 1185 | except KeyError: 1186 | licenses.append(f"LicenseRef-{license_class}") 1187 | # pypa/warehouse#3473: "UNKNOWN" -> "", but not for old pkgs. 1188 | elif info["license"] not in [None, "", "UNKNOWN"]: 1189 | licenses.append("LicenseRef-{}".format(info["license"])) 1190 | else: 1191 | LOGGER.warning("No license information available.") 1192 | licenses.append("LicenseRef-unknown") 1193 | 1194 | _license_found = False 1195 | if any(license not in TROVE_COMMON_LICENSES for license in licenses): 1196 | for url in [info["download_url"], info["home_page"]]: 1197 | parsed = urllib.parse.urlparse(url or "") # Could be None. 
1198 | if len(Path(parsed.path).parts) != 3: # ["/", user, name] 1199 | continue 1200 | # Strip final slash for later manipulations. 1201 | parsed = parsed._replace(path=re.sub("/$", "", parsed.path)) 1202 | # Could instead lookup 1203 | # https://api.github.com/repos/:owner/:repo/contents 1204 | # or see opengh/github-ls. 1205 | if parsed.netloc in ["github.com", "www.github.com"]: 1206 | parsed = parsed._replace( 1207 | netloc="raw.githubusercontent.com") 1208 | elif parsed.netloc in ["bitbucket.org", "www.bitbucket.org"]: 1209 | parsed = parsed._replace( 1210 | path=parsed.path + "/raw") 1211 | else: 1212 | continue 1213 | for license_name in LICENSE_NAMES: 1214 | try: 1215 | r = urllib.request.urlopen( 1216 | urllib.parse.urlunparse( 1217 | parsed._replace(path=parsed.path + "/master/" 1218 | + license_name))) 1219 | except urllib.error.HTTPError: 1220 | pass 1221 | else: 1222 | self._files.update(LICENSE=r.read()) 1223 | _license_found = True 1224 | break 1225 | if _license_found: 1226 | break 1227 | else: 1228 | try: 1229 | sdist_unpacked_path = _get_url_unpacked_path_or_null( 1230 | self._get_sdist_url()) 1231 | # Should really fail with CalledProcessError (e.g. if 1232 | # wheel-only) but that can actually be transformed into a 1233 | # PackagingError; see _get_url_impl for explanation... 1234 | except PackagingError: 1235 | pass 1236 | else: 1237 | for path in map(sdist_unpacked_path.joinpath, 1238 | LICENSE_NAMES): 1239 | if path.is_file(): 1240 | self._files.update(LICENSE=path.read_bytes()) 1241 | _license_found = True 1242 | break 1243 | if not _license_found: 1244 | self._files.update( 1245 | # These entries are mostly from a fixed, ASCII-only list, 1246 | # but can also be read from info["license"] which doesn't 1247 | # have to be ASCII. 
1248 | LICENSE=("LICENSE: " + ", ".join(licenses) + "\n") 1249 | .encode("utf-8")) 1250 | LOGGER.warning("Could not retrieve license file.") 1251 | 1252 | return licenses 1253 | 1254 | pkgname = property( 1255 | lambda self: self._ref.pkgname) 1256 | epoch = property( 1257 | lambda self: 1258 | self._ref.arch_version.epoch if self._ref.arch_version else "") 1259 | # NOTE: some metadata can be corrupted due to pypa/setuptools#1390 :/ 1260 | pkgver = property( 1261 | lambda self: shlex.quote(self._ref.info["info"]["version"])) 1262 | pkgrel = property( 1263 | lambda self: self._pkgrel) 1264 | pkgdesc = property( 1265 | lambda self: shlex.quote(self._ref.info["info"]["summary"])) 1266 | arch = property( 1267 | lambda self: " ".join(self._arch)) 1268 | url = property( 1269 | lambda self: shlex.quote( 1270 | next(url for url in [self._ref.info["info"]["home_page"], 1271 | self._ref.info["info"]["download_url"], 1272 | self._ref.info["info"]["package_url"]] 1273 | # pypa/warehouse#3473: "UNKNOWN" -> "", but not for old pkgs. 1274 | if url not in [None, "", "UNKNOWN"]))) 1275 | license = property( 1276 | lambda self: " ".join(map(shlex.quote, self._licenses))) 1277 | depends = property( 1278 | lambda self: self._depends) 1279 | makedepends = property( 1280 | lambda self: self._makedepends) 1281 | checkdepends = property( 1282 | lambda self: DependsTuple()) 1283 | 1284 | @property 1285 | def provides(self): 1286 | # Packages should provide their official alias (e.g. for dependents of 1287 | # `python-numpy-openblas`)... except for vendored packages (so that 1288 | # `python--pillow` doesn't provide `python-pillow`). 
1289 | if self._ref.pkgname.startswith("python--"): 1290 | return "" 1291 | try: 1292 | name, version = _find_arch_name_version(self._ref.pep503_name) 1293 | except TypeError: # name, version = None 1294 | return "" 1295 | else: 1296 | return f"{name}={self.pkgver}" 1297 | 1298 | def write_deps(self, options): 1299 | for ref in self._depends: 1300 | if not ref.exists: 1301 | # Dependency not found, build it too. 1302 | create_package(ref.pep503_name, options._replace(is_dep=True)) 1303 | 1304 | 1305 | class MetaPackage(_BasePackage): 1306 | def __init__(self, ref, options): 1307 | super().__init__() 1308 | self._ref = ref 1309 | self._arch_version = self._ref.arch_version._replace( 1310 | pkgrel=self._ref.arch_version.pkgrel + ".99") 1311 | self._subpkgrefs = DependsTuple( 1312 | PackageRef(name, subpkg_of=ref, pre=options.pre) 1313 | for name in ref.arch_packaged) 1314 | self._subpkgs = [Package(ref, options) for ref in self._subpkgrefs] 1315 | for pkg in self._subpkgs: 1316 | pkg._pkgbuild = re.sub( 1317 | "(?m)^conflicts=.*$", 1318 | "conflicts=('{0}<{1}' '{0}>{1}')".format( 1319 | ref.pkgname, self._arch_version), 1320 | pkg._pkgbuild, 1321 | 1) 1322 | self._pkgbuild = ( 1323 | PKGBUILD_HEADER.format(pkg=self, config=get_makepkg_conf()) 1324 | + METAPKGBUILD_CONTENTS) 1325 | 1326 | pkgname = property( 1327 | lambda self: self._ref.pkgname) 1328 | epoch = property( 1329 | lambda self: self._arch_version.epoch) 1330 | pkgver = property( 1331 | lambda self: self._arch_version.pkgver) 1332 | pkgrel = property( 1333 | lambda self: self._arch_version.pkgrel) 1334 | pkgdesc = property( 1335 | lambda self: "'A wrapper package.'") 1336 | arch = property( 1337 | lambda self: "any") 1338 | url = property( 1339 | lambda self: "N/A") 1340 | license = property( 1341 | lambda self: "CCPL:by") # Individual components retain their license. 
1342 | depends = property( 1343 | lambda self: self._subpkgrefs) 1344 | makedepends = property( 1345 | lambda self: DependsTuple()) 1346 | checkdepends = property( 1347 | lambda self: DependsTuple()) 1348 | provides = property( 1349 | lambda self: "") 1350 | 1351 | def _get_target_path(self, base_path): 1352 | return base_path / ("meta:" + self._ref.pkgname) 1353 | 1354 | def write_deps(self, options): 1355 | dep_options = options._replace( 1356 | base_path=self._get_target_path(options.base_path), 1357 | is_dep=True) 1358 | for pkg in self._subpkgs: 1359 | pkg.write_deps(dep_options) 1360 | pkg.write(dep_options) 1361 | 1362 | def write(self, options): 1363 | super().write(options._replace( 1364 | base_path=self._get_target_path(options.base_path))) 1365 | 1366 | 1367 | _CREATE_PACKAGE_CACHE = set() 1368 | 1369 | 1370 | def create_package(name, options): 1371 | # This cannot use lru_cache because, in the case of mutually dependent 1372 | # packages, we want to prevent infinite recursion through write_deps (which 1373 | # is called *before* create_package returns, whereas lru_cache would only 1374 | # populate the cache *after* it returns). 
1375 | if (name, options) in _CREATE_PACKAGE_CACHE: 1376 | return 1377 | _CREATE_PACKAGE_CACHE.add((name, options)) 1378 | ref = PackageRef( 1379 | name, pre=options.pre, guess_makedepends=options.guess_makedepends) 1380 | if options.pkgname: 1381 | ref.pkgname = options.pkgname 1382 | cls = Package if len(ref.arch_packaged) <= 1 else MetaPackage 1383 | pkg = cls(ref, options) 1384 | if options.build_deps: 1385 | pkg.write_deps(options) 1386 | pkg.write(options) 1387 | 1388 | 1389 | def find_outdated(): 1390 | outdated = json.loads(_run_python([ 1391 | "-mpip", "list", "--outdated", "--format=json", "--path", 1392 | site.getsitepackages()[0], 1393 | ], stdout=PIPE).stdout) 1394 | owners = {} 1395 | for row in outdated: 1396 | pkgname, arch_version = _find_installed_name_version( 1397 | pep503_normalize_name(row["name"])) 1398 | # Check that pypi's version is indeed newer. Some packages mis-report 1399 | # their version to pip (e.g., slicerator 0.9.7's Github release). 1400 | if arch_version.pkgver == row["latest_version"]: 1401 | LOGGER.warning( 1402 | "pip thinks that %s is outdated, but the installed " 1403 | "version is actually %s, and up-to-date.", 1404 | row["name"], row["latest_version"]) 1405 | continue 1406 | owners.setdefault(f"{pkgname} {arch_version}", []).append(row) 1407 | owners = {k: v for k, v in sorted(owners.items())} 1408 | rows = sum(owners.values(), []) 1409 | name_len, ver_len, lver_len, lft_len = ( 1410 | max(map(len, (row[key] for row in rows)), default=0) 1411 | for key in ["name", "version", "latest_version", "latest_filetype"]) 1412 | for owner, rows in owners.items(): 1413 | print(owner) 1414 | for row in rows: 1415 | print(" " 1416 | f"{row['name']:{name_len}} " 1417 | f"{row['version']:{ver_len}} -> " 1418 | f"{row['latest_version']:{lver_len}} " 1419 | f"({row['latest_filetype']:{lft_len}})") 1420 | return owners 1421 | 1422 | 1423 | Options = namedtuple( 1424 | "Options", 1425 | "base_path force pre pkgname pkgrel guess_makedepends 
setup_requires " 1426 | "pkgtypes build_deps pkgbuild_extras makepkg is_dep") 1427 | 1428 | 1429 | def main(): 1430 | 1431 | class CommaSeparatedList(Action): 1432 | def __call__(self, parser, namespace, values, option_string=None): 1433 | setattr(namespace, self.dest, 1434 | tuple(values.split(",") if values else [])) 1435 | 1436 | class PersistentCommaSeparatedList(Action): 1437 | def __call__(self, parser, namespace, values, option_string=None): 1438 | values = (*getattr(namespace, self.dest), *values.split(",")) 1439 | try: 1440 | idx = values.index("") 1441 | except ValueError: 1442 | pass 1443 | else: 1444 | values = values[idx + 1:] 1445 | setattr(namespace, self.dest, values) 1446 | 1447 | parser = ArgumentParser( 1448 | description="Create a PKGBUILD for a PyPI package and run makepkg.", 1449 | formatter_class=type("", (RawDescriptionHelpFormatter, 1450 | ArgumentDefaultsHelpFormatter), {})) 1451 | parser.add_argument("--version", action="version", 1452 | version=f"%(prog)s {__version__}") 1453 | parser.add_argument( 1454 | "names", metavar="name", nargs="*", 1455 | help="The PyPI package names.") 1456 | parser.add_argument( 1457 | "-v", "--verbose", action="store_true", default=False, 1458 | help="Log at DEBUG level.") 1459 | parser.add_argument( 1460 | "-o", "--outdated", action="store_true", default=False, 1461 | help="Find outdated packages.") 1462 | parser.add_argument( 1463 | "-u", "--upgrade", action="store_true", default=False, 1464 | help="Find and build outdated packages.") 1465 | parser.add_argument( 1466 | "-i", "--ignore", metavar="NAME,...", 1467 | action=PersistentCommaSeparatedList, default=(), 1468 | help="Comma-separated list of packages not to be upgraded. 
This flag " 1469 | "can be passed multiple times; passing an empty flag ('-i=') " 1470 | "can be used to strip out values passed so far.") 1471 | parser.add_argument( 1472 | "-b", "--base-path", type=Path, default=Path(), 1473 | help="Base path where the package directories are created.") 1474 | parser.add_argument( 1475 | "-f", "--force", action="store_true", 1476 | help="Overwrite a previously existing PKGBUILD.") 1477 | parser.add_argument( 1478 | "--pre", action="store_true", 1479 | help="Include pre-releases.") 1480 | parser.add_argument( 1481 | "-n", "--pkgname", 1482 | help="Force $pkgname.") 1483 | parser.add_argument( 1484 | "-r", "--pkgrel", default="00", 1485 | help="Force value of $pkgrel (not applicable to metapackages). " 1486 | "Set e.g. to 99 to override AUR packages.") 1487 | parser.add_argument( 1488 | "-g", "--guess-makedepends", metavar="MAKEDEPENDS,...", 1489 | action=CommaSeparatedList, default=("cython", "swig"), 1490 | help="Comma-separated list of makedepends that will be guessed. " 1491 | "Allowed values: cython, swig.") 1492 | parser.add_argument( 1493 | "-s", "--setup-requires", metavar="PYPI_NAME,...", 1494 | action=CommaSeparatedList, default=(), 1495 | help="Comma-separated list of setup_requires that will be forced.") 1496 | parser.add_argument( 1497 | "-t", "--pkgtypes", action=CommaSeparatedList, default=tuple(PKGTYPES), 1498 | help="Comma-separated preference order for dists.") 1499 | parser.add_argument( 1500 | "-D", "--no-deps", action="store_false", 1501 | dest="build_deps", default=True, 1502 | help="Don't generate PKGBUILD for dependencies.") 1503 | parser.add_argument( 1504 | "-e", "--pkgbuild-extras", default="", 1505 | help="Either contents of PKGBUILD_EXTRAS, or path to a patch " 1506 | "directory (if a valid path). 
A patch directory should contain " 1507 | "files of the form $pkgname.PKGBUILD_EXTRAS, which are used as " 1508 | "PKGBUILD_EXTRAS.") 1509 | parser.add_argument( 1510 | "-m", "--makepkg", metavar="MAKEPKG_OPTS", 1511 | default="--cleanbuild --nodeps", 1512 | help="Additional arguments to pass to `makepkg`.") 1513 | parser.add_argument( 1514 | "-I", "--no-install", action="store_false", 1515 | dest="install", default=True, 1516 | help="Don't install the built packages.") 1517 | parser.add_argument( 1518 | "-p", "--pacman", metavar="PACMAN_OPTS", 1519 | default="", 1520 | help="Additional arguments to pass to `pacman -U`.") 1521 | args = parser.parse_args() 1522 | log_level = logging.DEBUG if vars(args).pop("verbose") else logging.INFO 1523 | LOGGER.setLevel(log_level) 1524 | logging.basicConfig(level=log_level) 1525 | handler_level = logging.getLogger().handlers[0].level 1526 | @LOGGER.addFilter 1527 | def f(record): 1528 | # This hack allows us to run with COLOREDLOGS_AUTO_INSTALL=1 without 1529 | # having to set the root handler level to DEBUG (which would affect 1530 | # other packages as well). We still need to set this package's logger 1531 | # level accordingly as otherwise DEBUG level records will not even 1532 | # reach this filter. 1533 | if record.levelno >= log_level: 1534 | record.levelno = max(handler_level, record.levelno) 1535 | return True 1536 | 1537 | LOGGER.debug(f"This is pypi2pkgbuild {__version__}.") 1538 | 1539 | # Dependency checking needs to happen after logging is configured. 1540 | for cmd in ["namcap", "pkgfile"]: 1541 | if shutil.which(cmd) is None: 1542 | parser.error(f"Missing dependency: {cmd}") 1543 | try: 1544 | _run_shell("pkgfile pkgfile >/dev/null") 1545 | except CalledProcessError: 1546 | # "error: No repo files found. Please run `pkgfile --update'." 
1547 | sys.exit(1) 1548 | 1549 | outdated, upgrade, ignore, install, pacman_opts = map( 1550 | vars(args).pop, ["outdated", "upgrade", "ignore", "install", "pacman"]) 1551 | 1552 | if not {*args.pkgtypes} <= {*PKGTYPES}: 1553 | parser.error("valid --pkgtypes are: {}".format(", ".join(PKGTYPES))) 1554 | 1555 | if outdated: 1556 | if vars(args).pop("names"): 1557 | parser.error("--outdated should be given with no name.") 1558 | find_outdated() 1559 | 1560 | elif upgrade: 1561 | if vars(args).pop("names"): 1562 | parser.error("--upgrade should be given with no name.") 1563 | ignore = {*map(pep503_normalize_name, ignore)} 1564 | names = {pep503_normalize_name(row["name"]) 1565 | for row in sum(find_outdated().values(), [])} 1566 | ignored = ignore & names 1567 | if ignored: 1568 | LOGGER.info("Ignoring upgrade of %s.", ", ".join(sorted(ignored))) 1569 | for name in sorted(names - ignore): 1570 | try: 1571 | create_package(name, Options(**vars(args), is_dep=False)) 1572 | except PackagingError as exc: 1573 | LOGGER.error("%s", exc) 1574 | return 1 1575 | 1576 | else: 1577 | if not args.names: 1578 | parser.error("the following arguments are required: name") 1579 | try: 1580 | for name in vars(args).pop("names"): 1581 | create_package(name, Options(**vars(args), is_dep=False)) 1582 | except PackagingError as exc: 1583 | LOGGER.error("%s", exc) 1584 | return 1 1585 | 1586 | print("\n".join(line for cache_entry in Package.build_cache 1587 | for line in cache_entry.namcap_report)) 1588 | 1589 | if install and Package.build_cache: 1590 | cmd = "pacman -U{} {} {}".format( 1591 | "" if args.build_deps else "dd", 1592 | pacman_opts, 1593 | " ".join(shlex.quote(str(cache_entry.path)) 1594 | for cache_entry in Package.build_cache)) 1595 | deps = [cache_entry.pkgname for cache_entry in Package.build_cache 1596 | if cache_entry.is_dep] 1597 | if deps: 1598 | cmd += "; pacman -D --asdeps {}".format(" ".join(deps)) 1599 | cmd = "sudo sh -c {}".format(shlex.quote(cmd)) 1600 | 
_run_shell(cmd, check=False, verbose=True) 1601 | 1602 | 1603 | if __name__ == "__main__": 1604 | sys.exit(main()) 1605 | -------------------------------------------------------------------------------- /pyproject.toml: -------------------------------------------------------------------------------- 1 | [build-system] 2 | requires = ["setuptools>=61", "setuptools_scm[toml]>=6.2"] 3 | build-backend = "setuptools.build_meta" 4 | 5 | [project] 6 | name = "pypi2pkgbuild" 7 | description = "A PyPI to PKGBUILD converter." 8 | readme = "README.rst" 9 | authors = [{name = "Antony Lee"}] 10 | urls = {Repository = "https://github.com/anntzer/pypi2pkgbuild"} 11 | license = {text = "MIT"} 12 | classifiers = [ 13 | "Development Status :: 4 - Beta", 14 | "Environment :: Console", 15 | "Intended Audience :: System Administrators", 16 | "License :: OSI Approved :: MIT License", 17 | "Operating System :: POSIX :: Linux", 18 | "Programming Language :: Python :: 3", 19 | "Topic :: System :: Software Distribution", 20 | ] 21 | requires-python = ">=3.8" 22 | dynamic = ["version"] 23 | 24 | [tool.setuptools] 25 | packages = [] 26 | script-files = ["pypi2pkgbuild.py"] 27 | 28 | [tool.setuptools_scm] 29 | version_scheme = "post-release" 30 | local_scheme = "node-and-date" 31 | fallback_version = "0+unknown" 32 | 33 | [tool.coverage.run] 34 | branch = true 35 | include = ["pypi2pkgbuild.py"] 36 | -------------------------------------------------------------------------------- /test_pypi2pkgbuild.py: -------------------------------------------------------------------------------- 1 | import functools 2 | import os 3 | from pathlib import Path 4 | import subprocess 5 | import sys 6 | from tempfile import TemporaryDirectory 7 | from unittest import TestCase 8 | 9 | 10 | _local_path = Path(__file__).parent 11 | _run = functools.partial(subprocess.run, check=True) 12 | 13 | 14 | class TestPyPI2PKGBUILD(TestCase): 15 | 16 | def test_build_git(self): 17 | for makepkg_opts in ["", "--nobuild"]: 18 
| with self.subTest(makepkg_opts=makepkg_opts), \ 19 | TemporaryDirectory() as tmp_dir: 20 | _run([sys.executable, _local_path / "pypi2pkgbuild.py", 21 | "-v", "-I", f"-m={makepkg_opts}", "-b", tmp_dir, 22 | f"git+file://{_local_path}"]) 23 | 24 | def test_build_sdist_wheel(self): 25 | env = {"PIP_CONFIG_FILE": "/dev/null", **os.environ} 26 | for makepkg_opts in ["", "--nobuild"]: 27 | with self.subTest(makepkg_opts=makepkg_opts), \ 28 | TemporaryDirectory() as tmp_dir: 29 | tmp_path = Path(tmp_dir) 30 | _run([sys.executable, "-mvenv", tmp_path]) 31 | _run([tmp_path / "bin/pip", "install", "build"], env=env) 32 | _run([tmp_path / "bin/pyproject-build", _local_path, 33 | "-o", tmp_path / "dist"], env=env) 34 | sdist_path, = tmp_path.glob("dist/*.tar.gz") 35 | wheel_path, = tmp_path.glob("dist/*.whl") 36 | _run([sys.executable, _local_path / "pypi2pkgbuild.py", 37 | "-v", "-I", f"-m={makepkg_opts}", "-b", tmp_path / "s", 38 | f"file://{sdist_path}"]) 39 | _run([sys.executable, _local_path / "pypi2pkgbuild.py", 40 | "-v", "-I", f"-m={makepkg_opts}", "-b", tmp_path / "w", 41 | f"file://{wheel_path}"]) 42 | --------------------------------------------------------------------------------