# Tips and tricks

## Tips

### Building Linux wheels for non-native archs using emulation
cibuildwheel supports building non-native architectures on Linux, via emulation through the binfmt_misc kernel feature. The easiest way to use this is via the docker/setup-qemu-action on GitHub Actions, or tonistiigi/binfmt.

Check out the following config for an example of how to set it up on GitHub Actions. Once QEMU is set up and registered, you just need to set the `CIBW_ARCHS_LINUX` environment variable (or use the `--archs` option on Linux), and the other architectures are emulated automatically.
.github/workflows/build.yml
```yaml
name: Build

on: [push, pull_request]

jobs:
  build_wheels:
    name: Build wheels on ${{ matrix.os }}
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        # macos-13 is an intel runner, macos-14 is apple silicon
        os: [ubuntu-latest, ubuntu-24.04-arm, windows-latest, windows-11-arm, macos-13, macos-14]

    steps:
      - uses: actions/checkout@v4

      - name: Set up QEMU
        if: runner.os == 'Linux' && runner.arch == 'X64'
        uses: docker/setup-qemu-action@v3
        with:
          platforms: all

      - name: Build wheels
        uses: pypa/cibuildwheel@v3.0.1
        env:
          # configure cibuildwheel on Linux to build native archs ('auto'),
          # and to split the remaining architectures between the x86_64 and
          # ARM runners
          CIBW_ARCHS_LINUX: ${{ runner.arch == 'X64' && 'auto ppc64le s390x' || 'auto' }}

      - uses: actions/upload-artifact@v4
        with:
          name: cibw-wheels-${{ matrix.os }}-${{ strategy.job-index }}
          path: ./wheelhouse/*.whl
```
### Building CPython ABI3 wheels (Limited API)
The CPython Limited API is a subset of the Python C Extension API that's declared to be forward-compatible, meaning you can compile wheels for one version of Python, and they'll be compatible with future versions. Wheels that use the Limited API are known as ABI3 wheels.
To create a package that builds ABI3 wheels, you'll need to configure your build backend to compile libraries correctly and create wheels with the right tags. Check this repo for an example of how to do this with setuptools.

You could also consider running abi3audit against the produced wheels in order to check for abi3 violations or inconsistencies. You can run it alongside the default in your `repair-wheel-command`.
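As a sketch (assuming a Linux build, where the default repair command is `auditwheel repair`), you could chain abi3audit onto the repair step; the exact flags are up to you:

```toml
# Sketch only: run abi3audit on each wheel right after the default Linux
# repair command. {wheel} and {dest_dir} are placeholders that cibuildwheel
# substitutes for you.
[tool.cibuildwheel.linux]
repair-wheel-command = "auditwheel repair -w {dest_dir} {wheel} && pipx run abi3audit --strict --report {wheel}"
```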
### Packages with optional C extensions

cibuildwheel defines the environment variable `CIBUILDWHEEL` with the value `1`, allowing projects for which the C extension is optional to make it mandatory when building wheels.

An easy way to do it in Python 3 is through the `optional` named argument of the `Extension` constructor in your `setup.py`:
```python
import os

from setuptools import Extension

myextension = Extension(
    "myextension",
    ["myextension.c"],
    # the extension is only optional when not running under cibuildwheel
    optional=os.environ.get('CIBUILDWHEEL', '0') != '1',
)
```
### Building with NumPy
If using NumPy, there are a couple of things that can help.
First, if you require the `numpy` package at build-time (some binding tools, like `pybind11` and `nanobind`, do not), then the backward compatibility of your `build-system.requires` entry is a little complicated for Python <3.9:
- NumPy <1.25: You must build with the oldest version of NumPy you want to support at runtime.
- NumPy 1.25 and 1.26: Anything you build will be compatible with 1.19+ by default, and you can set the minimum target to, for example, 1.22 with `#define NPY_TARGET_VERSION NPY_1_22_API_VERSION`.
- NumPy 2.x: You must build with NumPy 2 to support NumPy 2; otherwise the same as 1.25+.
So the rule is:

- Python <3.9: Use the oldest supported NumPy (via the helper package `oldest-supported-numpy` if you want).
- Python 3.9+: Use the latest supported NumPy (2.x).
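Expressed as `build-system.requires` entries, the rule above might look something like this (a sketch assuming a setuptools backend; adjust the pins to the versions you actually support):

```toml
[build-system]
requires = [
    "setuptools>=42",
    # Python 3.8 and older: build against the oldest NumPy that supports each Python
    "oldest-supported-numpy; python_version < '3.9'",
    # Python 3.9+: build against NumPy 2.x, which remains compatible with
    # NumPy 1.x at runtime (1.19+ by default)
    "numpy>=2.0; python_version >= '3.9'",
]
build-backend = "setuptools.build_meta"
```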
Second, there might be platforms you want to ship for that NumPy (or some other scientific Python library) isn't shipping wheels for yet. This is often true for beta releases of new Python versions, for example. To work around this, you can use the Scientific Python Nightly wheels. Here's an example, depending on which frontend you use:
pip based
For frontends like `build` (the default) and `pip`:

```toml
[tool.cibuildwheel]
environment.PIP_ONLY_BINARY = "numpy"
environment.PIP_PREFER_BINARY = "1"

[[tool.cibuildwheel.overrides]]
select = ["cp314*"]
inherit.environment = "append"
environment.PIP_EXTRA_INDEX_URL = "https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/"
environment.PIP_PRERELEASE = "allow"
```
uv based
For frontends like `build[uv]`:

```toml
[tool.cibuildwheel]
environment.UV_ONLY_BINARY = "numpy"
environment.UV_PREFER_BINARY = "1"

[[tool.cibuildwheel.overrides]]
select = ["cp314*"]
inherit.environment = "append"
environment.UV_INDEX = "https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/"
environment.UV_INDEX_STRATEGY = "unsafe-best-match"
environment.UV_PRERELEASE = "allow"
```
(Note that the `*_ONLY_BINARY` variable also supports `":all:"`, and you don't need both that and `*_PREFER_BINARY`; you can use either one, depending on whether you want a missing wheel to be a failure or an attempt to build it in CI.)
### Automatic updates using Dependabot
Selecting a moving target (like the latest release) is generally a bad idea in CI. If something breaks, you can't tell whether it was your code or an upstream update that caused the breakage, and in a worst-case scenario, it could occur during a release.
There are two suggested methods for keeping cibuildwheel up to date that instead involve scheduled pull requests using GitHub's Dependabot.
#### Option 1: GitHub Action
If you use GitHub Actions for builds, you can use cibuildwheel as an action:
```yaml
uses: pypa/cibuildwheel@v3.0.1
```
This is a composite step that just runs cibuildwheel using pipx. You can set command-line options as `with:` parameters, and use `env:` as normal.

Then, your `.github/dependabot.yml` file could look like this:
```yaml
version: 2
updates:
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "weekly"
```
#### Option 2: Requirement files

The second option, and the only one that supports other CI systems, is using a `requirements-*.txt` file. The file should have a distinct name and have only one entry:
```
# requirements-cibw.txt
cibuildwheel==3.0.1
```
Then your install step would have `python -m pip install -r requirements-cibw.txt` in it. Your `.github/dependabot.yml` file could look like this:
```yaml
version: 2
updates:
  - package-ecosystem: "pip"
    directory: "/"
    schedule:
      interval: "daily"
```
This will also try to update other pins in all requirement files, so be sure you want to do that. The only control you have over the files used is via the directory option.
### Alternatives to cibuildwheel options

cibuildwheel provides lots of opportunities to configure the build environment. However, you might consider adding this build configuration into the package itself - in general, this is preferred, because users of your package's sdist will also benefit.
#### Missing build dependencies

If your build needs Python dependencies, rather than using `before-build`, it's best to add these to the `build-system.requires` section of your pyproject.toml. For example, if your project requires Cython to build, your pyproject.toml might include a section like this:
```toml
[build-system]
requires = [
    "setuptools>=42",
    "Cython",
]

build-backend = "setuptools.build_meta"
```
#### Actions you need to perform before building

You might need to run some other commands before building, like running a script that performs codegen or downloading some data that's not stored in your source tree.

Rather than using `before-all` or `before-build`, you could incorporate these steps into your package's build process. For example, if you're using setuptools, you can add steps to your package's `setup.py` using a structure like this:
```python
import subprocess

import setuptools
import setuptools.command.build_py


class BuildPyCommand(setuptools.command.build_py.build_py):
    """Custom build command."""

    def run(self):
        # your custom build steps here
        # e.g.
        #   subprocess.run(['python', 'scripts/my_custom_script.py'], check=True)
        setuptools.command.build_py.build_py.run(self)


setuptools.setup(
    cmdclass={
        'build_py': BuildPyCommand,
    },
    # Usual setup() args.
    # ...
)
```
#### Compiler flags

Your build might need some compiler flags to be set through environment variables. Consider incorporating these into your package, for example, in `setup.py` using `extra_compile_args` or `extra_link_args`.
## Troubleshooting
If your wheel didn't compile, you might have a mistake in your config.
To quickly test your config without doing a git push and waiting for your code to build on CI, you can test the Linux build in a local Docker container.
### Missing dependencies
Sometimes a build will fail due to a missing dependency.
If the build is missing a Python package, you should add it to pyproject.toml.
If you need a build tool (e.g. cmake, automake, ninja), you can install it through a package manager like apt/yum, brew or choco, using the `before-all` option.
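For example, a rough sketch of installing ninja via `before-all`; the package names and managers depend on the images and runners you're actually using:

```toml
# Sketch only: package availability varies between manylinux/musllinux
# images, macOS runners and Windows runners.
[tool.cibuildwheel.linux]
before-all = "yum install -y ninja-build"   # musllinux images would use apk instead

[tool.cibuildwheel.macos]
before-all = "brew install ninja"

[tool.cibuildwheel.windows]
before-all = "choco install -y ninja"
```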
If your build is linking into a native library dependency, you can build/install that in `before-all`. However, on Linux, Mac (and Windows if you're using delvewheel), the library that you install will be bundled into the wheel in the repair step. So take care to ensure that:
- the bundled library doesn't accidentally increase the minimum system requirements (such as the minimum macOS version)
- the bundled library matches the architecture of the wheel you're building when cross-compiling
This is particularly an issue on macOS, where de facto package manager Homebrew will install libraries that are compiled for the specific version of macOS that the build machine is running, rendering the wheels useless for any previous version. And brew will not install the right arch for cross compilation of Apple Silicon wheels.
For these reasons, it's strongly recommended not to use brew for native library dependencies. Instead, we recommend compiling the library yourself. If you compile in the `before-all` step, cibuildwheel will have already set the appropriate `MACOSX_DEPLOYMENT_TARGET` env var, so the library will target the correct version of macOS.
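A sketch of that approach, using a made-up `libfoo` tarball and a standard autotools-style build; because this runs in `before-all`, the `MACOSX_DEPLOYMENT_TARGET` that cibuildwheel exports is already in effect:

```toml
[tool.cibuildwheel.macos]
# libfoo and its URL are placeholders for your real dependency; list items
# are run as one shell command joined with '&&'.
before-all = [
    "curl -L -O https://example.com/libfoo-1.0.tar.gz",
    "tar xzf libfoo-1.0.tar.gz",
    "cd libfoo-1.0 && ./configure && make && sudo make install",
]
```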
Tip: For build steps, Homebrew is still a great resource - you can look up the build formula and use that as a starting point.
### Building Rust wheels

If you build Rust wheels, you need to download the Rust compilers in manylinux. If you support 32-bit Windows, you need to add this as a potential target. You can do this on GitHub Actions, for example, with:
```yaml
CIBW_BEFORE_ALL_LINUX: curl -sSf https://sh.rustup.rs | sh -s -- -y
CIBW_BEFORE_ALL_WINDOWS: rustup target add i686-pc-windows-msvc
CIBW_ENVIRONMENT_LINUX: "PATH=$HOME/.cargo/bin:$PATH"
```
Rust's minimum macOS target is 10.12, while CPython supports 10.9 before Python 3.12, so you'll need to raise the minimum:
```toml
[tool.cibuildwheel.macos.environment]
MACOSX_DEPLOYMENT_TARGET = "10.12"
```
And Rust does not provide Cargo for musllinux 32-bit, so that needs to be skipped:
```toml
[tool.cibuildwheel]
skip = ["*-musllinux_i686"]
```
Also see maturin-action, which is optimized for Rust wheels, builds the non-Python Rust modules once, and can cross-compile (and can build 32-bit musl, for example).
### macOS: 'No module named XYZ' errors after running cibuildwheel

`cibuildwheel` on Mac installs the distributions from Python.org system-wide during its operation. This is necessary, but it can cause some confusing errors after cibuildwheel has finished.
Consider the build script:
```bash
python3 -m pip install twine cibuildwheel
python3 -m cibuildwheel --output-dir wheelhouse
python3 -m twine upload wheelhouse/*.whl
# error: no module named 'twine'
```
This doesn't work because while `cibuildwheel` was running, it installed a few new versions of `python3`, so the `python3` run on line 3 isn't the same as the `python3` that ran on line 1.
Solutions to this vary, but the simplest is to use pipx:
```bash
# most runners have pipx preinstalled, but in case you don't
python3 -m pip install pipx
pipx run cibuildwheel==3.0.1 --output-dir wheelhouse
pipx run twine upload wheelhouse/*.whl
```
### macOS: Passing DYLD_LIBRARY_PATH to delocate

macOS has built-in System Integrity Protection, which limits the use of `DYLD_LIBRARY_PATH` and `LD_LIBRARY_PATH` so that they are not automatically passed to child processes. This means that if you set `DYLD_LIBRARY_PATH` before running cibuildwheel, or even set it in `environment`, it will be stripped out of the environment before delocate is called.
To work around this, use a different environment variable such as `REPAIR_LIBRARY_PATH` to store the library path, and set `DYLD_LIBRARY_PATH` in `macos.repair-wheel-command`, like this:
Environment variables
```yaml
CIBW_REPAIR_WHEEL_COMMAND_MACOS: >
  DYLD_LIBRARY_PATH=$REPAIR_LIBRARY_PATH delocate-wheel
  --require-archs {delocate_archs} -w {dest_dir} -v {wheel}
```
pyproject.toml
```toml
[tool.cibuildwheel.macos]
repair-wheel-command = """\
DYLD_LIBRARY_PATH=$REPAIR_LIBRARY_PATH delocate-wheel \
--require-archs {delocate_archs} -w {dest_dir} -v {wheel}\
"""
```
See #816, thanks to @phoerious for reporting.
### macOS: Building CPython 3.8 wheels on arm64
If you're building on an arm64 runner, you might notice something strange about CPython 3.8 - unlike Python 3.9+, it's cross-compiled to arm64 from an x86_64 version of Python running under Rosetta emulation. This is because (despite the prevalence of arm64 versions of Python 3.8 from Apple and Homebrew) there is no officially supported Python.org installer of Python 3.8 for arm64.
This is fine for simple C extensions, but for more complicated builds on arm64 it becomes an issue.
So, if you want to build macOS arm64 wheels on an arm64 runner (e.g., `macos-14`) on Python 3.8, before invoking cibuildwheel, you should install a native arm64 Python 3.8 interpreter on the runner:
GitHub Actions
```yaml
- uses: actions/setup-python@v5
  with:
    python-version: 3.8
  if: runner.os == 'macOS' && runner.arch == 'ARM64'
```
Generic
```bash
curl -o /tmp/Python38.pkg https://www.python.org/ftp/python/3.8.10/python-3.8.10-macos11.pkg
sudo installer -pkg /tmp/Python38.pkg -target /
sh "/Applications/Python 3.8/Install Certificates.command"
```
Then cibuildwheel will detect that it's installed and use it instead. However, you probably don't want to build x86_64 wheels on this Python, unless you're happy with them only supporting macOS 11+.
### macOS: Library dependencies do not satisfy target macOS

Since delocate 0.11.0, there is verification that the library's binary dependencies match the target macOS version. This is to prevent the situation where a wheel's platform tag is lower than the actual minimum macOS version required by the library. To resolve this error, you need to build the library for the same macOS version as the target wheel (for example, using the `MACOSX_DEPLOYMENT_TARGET` environment variable). Alternatively, you could set `MACOSX_DEPLOYMENT_TARGET` in `environment` to correctly label the wheel as incompatible with older macOS versions.
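As a sketch of the second approach (the value here is illustrative; use whatever macOS version the prebuilt library actually requires):

```toml
[tool.cibuildwheel.macos.environment]
MACOSX_DEPLOYMENT_TARGET = "11.0"  # label wheels as requiring macOS 11+
```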
This error may happen when you install a library using a package manager like Homebrew, which compiles the library for the macOS version of the build machine. This is not suitable for wheels, as the library will only work on the same macOS version as the build machine. You should compile the library yourself, or use a precompiled binary that matches the target macOS version.
### Windows: 'ImportError: DLL load failed: The specific module could not be found'

Visual Studio and MSVC link the compiled binary wheels to the Microsoft Visual C++ Runtime. Normally, the C parts of the runtime are included with Python, but the C++ components are not. When compiling modules using C++, it is possible users will run into problems on systems that do not have the full set of runtime libraries installed. The solution is to ask users to download the corresponding Visual C++ Redistributable from the Microsoft website and install it.
Additionally, Visual Studio 2019 started linking to an even newer DLL, `VCRUNTIME140_1.dll`, besides the `VCRUNTIME140.dll` that is included with recent Python versions (starting from Python 3.5; see here for more details on the corresponding Visual Studio & MSVC versions used to compile the different Python versions). To avoid this extra dependency on `VCRUNTIME140_1.dll`, the `/d2FH4-` flag can be added to the MSVC invocations (check out this issue for details and references). CPython 3.8.3 and all versions after it have this extra DLL, so the flag is only needed for 3.8.2 and earlier.
To add the `/d2FH4-` flag to a standard `setup.py` using `setuptools`, the `extra_compile_args` option can be used:
```python
    ext_modules=[
        Extension(
            'c_module',
            sources=['extension.c'],
            extra_compile_args=['/d2FH4-'] if sys.platform == 'win32' else []
        )
    ],
```
To investigate the dependencies of a C extension (i.e., the `.pyd` file, a DLL in disguise) on Windows, Dependency Walker is a great tool. For diagnosing a failing import, the dlltracer tool may also provide additional details.