
pip-tools = pip-compile + pip-sync

A set of command line tools to help you keep your pip-based packages fresh, even when you've pinned them. You do pin them, right? (In building your Python application and its dependencies for production, you want to make sure that your builds are predictable and deterministic.)

(Diagram: pip-tools overview for phase II)

Installation

Similar to pip, pip-tools must be installed in each of your project's virtual environments:

$ source /path/to/venv/bin/activate
(venv) $ python -m pip install pip-tools

Note: all of the remaining example commands assume you've activated your project's virtual environment.

Example usage for pip-compile

The pip-compile command lets you compile a requirements.txt file from your dependencies, specified in either pyproject.toml, setup.cfg, setup.py, or requirements.in.

Run it with pip-compile or python -m piptools compile (or pipx run --spec pip-tools pip-compile if pipx was installed with the appropriate Python version). If you use multiple Python versions, you can also run py -X.Y -m piptools compile on Windows and pythonX.Y -m piptools compile on other systems.
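For instance, the following invocations are equivalent ways to compile the same file (a minimal sketch, assuming a requirements.in input and that pipx is available for the last form):

$ pip-compile requirements.in
$ python -m piptools compile requirements.in
$ pipx run --spec pip-tools pip-compile requirements.in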

pip-compile should be run from the same virtual environment as your project so conditional dependencies that require a specific Python version, or other environment markers, resolve relative to your project's environment.

Note: If pip-compile finds an existing requirements.txt file that fulfils the dependencies then no changes will be made, even if updates are available. To compile from scratch, first delete the existing requirements.txt file, or see Updating requirements for alternative approaches.

Requirements from pyproject.toml

The pyproject.toml file is the latest standard for configuring packages and applications, and is recommended for new projects. pip-compile supports both your project.dependencies and your project.optional-dependencies. Because this is an official standard, you can use pip-compile to pin the dependencies of projects that use modern standards-adhering packaging tools like Setuptools, Hatch, or flit.

Suppose you have a 'foobar' Python application that is packaged using Setuptools, and you want to pin it for production. You can declare the project metadata as:

[build-system]
requires = ["setuptools", "setuptools-scm"]
build-backend = "setuptools.build_meta"

[project]
requires-python = ">=3.9"
name = "foobar"
dynamic = ["dependencies", "optional-dependencies"]

[tool.setuptools.dynamic]
dependencies = { file = ["requirements.in"] }
optional-dependencies.test = { file = ["requirements-test.txt"] }

Now suppose you have a Django application that is packaged using Hatch, you want to pin it for production, and you also want to pin your development tools in a separate pin file. You declare django as a dependency and create an optional dependency dev that includes pytest:

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
name = "my-cool-django-app"
version = "42"
dependencies = ["django"]

[project.optional-dependencies]
dev = ["pytest"]

You can produce your pin files as easily as:

$ pip-compile -o requirements.txt pyproject.toml
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile --output-file=requirements.txt pyproject.toml
#
asgiref==3.6.0
    # via django
django==4.1.7
    # via my-cool-django-app (pyproject.toml)
sqlparse==0.4.3
    # via django

$ pip-compile --extra dev -o dev-requirements.txt pyproject.toml
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile --extra=dev --output-file=dev-requirements.txt pyproject.toml
#
asgiref==3.6.0
    # via django
attrs==22.2.0
    # via pytest
django==4.1.7
    # via my-cool-django-app (pyproject.toml)
exceptiongroup==1.1.1
    # via pytest
iniconfig==2.0.0
    # via pytest
packaging==23.0
    # via pytest
pluggy==1.0.0
    # via pytest
pytest==7.2.2
    # via my-cool-django-app (pyproject.toml)
sqlparse==0.4.3
    # via django
tomli==2.0.1
    # via pytest

This is great both for pinning your applications and for keeping the CI of your open-source Python package stable.

Requirements from setup.py and setup.cfg

pip-compile also has full support for setup.py- and setup.cfg-based projects that use setuptools.

Just define your dependencies and extras as usual and run pip-compile as above.
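As a minimal sketch (the "dev" extra below is a hypothetical example, not something your project necessarily defines), compiling works the same way as for pyproject.toml:

# pin the dependencies declared in setup.cfg (or setup.py)
$ pip-compile -o requirements.txt setup.cfg

# pin a hypothetical "dev" extra into a separate file
$ pip-compile --extra dev -o dev-requirements.txt setup.cfg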

Requirements from requirements.in

You can also use plain text files for your requirements (e.g. if you don't want your application to be a package). To use a requirements.in file to declare the Django dependency:

# requirements.in
django

Now, run pip-compile requirements.in:

$ pip-compile requirements.in
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile requirements.in
#
asgiref==3.6.0
    # via django
django==4.1.7
    # via -r requirements.in
sqlparse==0.4.3
    # via django

And it will produce your requirements.txt, with all the Django dependencies (and all underlying dependencies) pinned.


Updating requirements

pip-compile generates a requirements.txt file using the latest versions that fulfil the dependencies you specify in the supported files.

If pip-compile finds an existing requirements.txt file that fulfils the dependencies then no changes will be made, even if updates are available.

To force pip-compile to update all packages in an existing requirements.txt, run pip-compile --upgrade.

To update a specific package to the latest or a specific version, use the --upgrade-package or -P flag:

# only update the django package
$ pip-compile --upgrade-package django

# update both the django and requests packages
$ pip-compile --upgrade-package django --upgrade-package requests

# update the django package to the latest, and requests to v2.0.0
$ pip-compile --upgrade-package django --upgrade-package requests==2.0.0

You can combine --upgrade and --upgrade-package in one command, to provide constraints on the allowed upgrades. For example, to upgrade all packages whilst constraining requests to the latest version less than 3.0:

$ pip-compile --upgrade --upgrade-package 'requests<3.0'

Using hashes

If you would like to use the Hash-Checking Mode available in pip since version 8.0, pip-compile offers the --generate-hashes flag:

$ pip-compile --generate-hashes requirements.in
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile --generate-hashes requirements.in
#
asgiref==3.6.0 \
    --hash=sha256:71e68008da809b957b7ee4b43dbccff33d1b23519fb8344e33f049897077afac \
    --hash=sha256:9567dfe7bd8d3c8c892227827c41cce860b368104c3431da67a0c5a65a949506
    # via django
django==4.1.7 \
    --hash=sha256:44f714b81c5f190d9d2ddad01a532fe502fa01c4cb8faf1d081f4264ed15dcd8 \
    --hash=sha256:f2f431e75adc40039ace496ad3b9f17227022e8b11566f4b363da44c7e44761e
    # via -r requirements.in
sqlparse==0.4.3 \
    --hash=sha256:0323c0ec29cd52bceabc1b4d9d579e311f3e4961b98d174201d5622a23b85e34 \
    --hash=sha256:69ca804846bb114d2ec380e4360a8a340db83f0ccf3afceeb1404df028f57268
    # via django

Output File

To output the pinned requirements in a filename other than requirements.txt, use --output-file. This might be useful for compiling multiple files, for example with different constraints on django to test a library with both versions using tox:

$ pip-compile --upgrade-package 'django<1.0' --output-file requirements-django0x.txt
$ pip-compile --upgrade-package 'django<2.0' --output-file requirements-django1x.txt

Or to output to standard output, use --output-file=-:

$ pip-compile --output-file=- > requirements.txt
$ pip-compile - --output-file=- < requirements.in > requirements.txt

Forwarding options to pip

Any valid pip flags or arguments may be passed on with pip-compile's --pip-args option, e.g.

$ pip-compile requirements.in --pip-args "--retries 10 --timeout 30"

Configuration

You can define project-level defaults for pip-compile and pip-sync by writing them to a configuration file in the same directory as your requirements input files (or the current working directory if piping input from stdin). By default, both pip-compile and pip-sync will look first for a .pip-tools.toml file and then in your pyproject.toml. You can also specify an alternate TOML configuration file with the --config option.
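For example, to point pip-compile at an alternate configuration file (the file name pip-tools.ci.toml below is purely illustrative):

$ pip-compile --config pip-tools.ci.toml requirements.in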

For example, to generate pip hashes by default in the resulting requirements file output, you can specify in a configuration file:

[tool.pip-tools]
generate-hashes = true

Options to pip-compile and pip-sync that may be used more than once must be defined as lists in a configuration file, even if they only have one value.
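For instance, the repeatable --extra option would be written as a list even with a single value (a sketch; it assumes the configuration key simply mirrors the long option name):

[tool.pip-tools]
extra = ["dev"]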

pip-tools supports default values for all valid command-line flags of its subcommands. Configuration keys may contain underscores instead of dashes, so the above could also be specified in this format:

[tool.pip-tools]
generate_hashes = true

You might be wrapping the pip-compile command in another script. To avoid confusing consumers of your custom script, you can override the update command generated at the top of requirements files by setting the CUSTOM_COMPILE_COMMAND environment variable.

$ CUSTOM_COMPILE_COMMAND="./pipcompilewrapper" pip-compile requirements.in
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    ./pipcompilewrapper
#
asgiref==3.6.0
    # via django
django==4.1.7
    # via -r requirements.in
sqlparse==0.4.3
    # via django

Workflow for layered requirements

If you have different environments that need different but compatible packages installed, you can create layered requirements files and use one layer to constrain the other.

For example, if you have a Django project where you want the newest 2.1 release in production and when developing you want to use the Django debug toolbar, then you can create two *.in files, one for each layer:

# requirements.in
django<2.2

At the top of the development requirements dev-requirements.in you use -c requirements.txt to constrain the dev requirements to packages already selected for production in requirements.txt.

# dev-requirements.in
-c requirements.txt
django-debug-toolbar<2.2

First, compile requirements.txt as usual:

$ pip-compile
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile
#
django==2.1.15
    # via -r requirements.in
pytz==2023.3
    # via django

Now compile the dev requirements; the requirements.txt file is used as a constraint:

$ pip-compile dev-requirements.in
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile dev-requirements.in
#
django==2.1.15
    # via
    #   -c requirements.txt
    #   django-debug-toolbar
django-debug-toolbar==2.1
    # via -r dev-requirements.in
pytz==2023.3
    # via
    #   -c requirements.txt
    #   django
sqlparse==0.4.3
    # via django-debug-toolbar

As you can see above, even though a 2.2 release of Django is available, the dev requirements only include a 2.1 version of Django because they were constrained. Now both compiled requirements files can be installed safely in the dev environment.

To install requirements in the production environment, use:

$ pip-sync

To install requirements in the development environment, use:

$ pip-sync requirements.txt dev-requirements.txt

Version control integration

You might use pip-compile as a pre-commit hook. See the pre-commit docs for instructions. Sample .pre-commit-config.yaml:

repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 7.3.0
    hooks:
      - id: pip-compile

You might want to customize pip-compile args by configuring args and/or files, for example:

repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 7.3.0
    hooks:
      - id: pip-compile
        files: ^requirements/production\.(in|txt)$
        args: [--index-url=https://example.com, requirements/production.in]

If you have multiple requirements files, make sure you create a hook for each file:

repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 7.3.0
    hooks:
      - id: pip-compile
        name: pip-compile setup.py
        files: ^(setup\.py|requirements\.txt)$
      - id: pip-compile
        name: pip-compile requirements-dev.in
        args: [requirements-dev.in]
        files: ^requirements-dev\.(in|txt)$
      - id: pip-compile
        name: pip-compile requirements-lint.in
        args: [requirements-lint.in]
        files: ^requirements-lint\.(in|txt)$
      - id: pip-compile
        name: pip-compile requirements.in
        args: [requirements.in]
        files: ^requirements\.(in|txt)$

Example usage for pip-sync

Now that you have a requirements.txt, you can use pip-sync to update your virtual environment to reflect exactly what's in there. This will install/upgrade/uninstall everything necessary to match the requirements.txt contents.

Run it with pip-sync or python -m piptools sync. If you use multiple Python versions, you can also run py -X.Y -m piptools sync on Windows and pythonX.Y -m piptools sync on other systems.

pip-sync must be installed into and run from the same virtual environment as your project to identify which packages to install or upgrade.

Be careful: pip-sync is meant to be used only with a requirements.txt generated by pip-compile.

$ pip-sync
Uninstalling flake8-2.4.1:
    Successfully uninstalled flake8-2.4.1
Collecting click==4.1
    Downloading click-4.1-py2.py3-none-any.whl (62kB)
    100% |................................| 65kB 1.8MB/s
    Found existing installation: click 4.0
    Uninstalling click-4.0:
        Successfully uninstalled click-4.0
Successfully installed click-4.1

To sync multiple *.txt dependency lists, just pass them in via command line arguments, e.g.

$ pip-sync dev-requirements.txt requirements.txt

If no arguments are passed in, it defaults to requirements.txt.

Any valid pip install flags or arguments may be passed with pip-sync's --pip-args option, e.g.

$ pip-sync requirements.txt --pip-args "--no-cache-dir --no-deps"

Note: pip-sync will not upgrade or uninstall packaging tools like setuptools, pip, or pip-tools itself. Use python -m pip install --upgrade to upgrade those packages.
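For example, run inside the project's virtual environment (a sketch; list whichever of those tools you actually want to upgrade):

(venv) $ python -m pip install --upgrade pip setuptools pip-tools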

Should I commit requirements.in and requirements.txt to source control?

Generally, yes. If you want a reproducible environment installation available from your source control, you should commit both requirements.in and requirements.txt.

Note that if you are deploying on multiple Python environments (read the section below), then you must commit a separate output file for each Python environment. We suggest using the {env}-requirements.txt format (e.g. win32-py3.7-requirements.txt, macos-py3.10-requirements.txt, etc.).

Cross-environment usage of requirements.in/requirements.txt and pip-compile

The dependencies of a package can change depending on the Python environment in which it is installed. Here, we define a Python environment as the combination of Operating System, Python version (3.7, 3.8, etc.), and Python implementation (CPython, PyPy, etc.). For an exact definition, refer to the possible combinations of PEP 508 environment markers.

As the resulting requirements.txt can differ between environments, users must execute pip-compile on each Python environment separately to generate a requirements.txt valid for each of them. The same requirements.in can be used as the source file for all environments, using PEP 508 environment markers as needed, the same way it would be done for regular pip cross-environment usage.
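A sketch of that workflow, reusing the {env}-requirements.txt naming suggested above (the environment names are illustrative):

# on a Linux / CPython 3.10 environment
$ pip-compile --output-file linux-py3.10-requirements.txt requirements.in

# on a Windows / CPython 3.7 environment
$ pip-compile --output-file win32-py3.7-requirements.txt requirements.in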

If the generated requirements.txt remains exactly the same for all Python environments, then it can be used across Python environments safely. But users should be careful, as any package update can introduce environment-dependent dependencies, making any newly generated requirements.txt environment-dependent too. As a general rule, it's advised that users still always execute pip-compile on each targeted Python environment to avoid issues.

Other useful tools

Deprecations

This section lists pip-tools features that are currently deprecated.

  • In the next major release, the --allow-unsafe behavior will be enabled by default (jazzband#989). Use --no-allow-unsafe to keep the old behavior. It is recommended to pass --allow-unsafe now to adapt to the upcoming change.
  • The legacy resolver is deprecated and will be removed in future versions. The new default is --resolver=backtracking.
  • In the next major release, the --strip-extras behavior will be enabled by default (jazzband#1613). Use --no-strip-extras to keep the old behavior.

A Note on Resolvers

You can choose either the default backtracking resolver or the deprecated legacy resolver.

The legacy resolver will occasionally fail to resolve dependencies. The backtracking resolver is more robust, but can take longer to run in general.

You can continue using the legacy resolver with --resolver=legacy, although note that it is deprecated and will be removed in a future release.
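For example (a sketch; both flags are accepted by pip-compile as described above):

# the current default backtracking resolver
$ pip-compile --resolver=backtracking requirements.in

# the deprecated legacy resolver
$ pip-compile --resolver=legacy requirements.in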
