A set of tools to keep your pinned Python dependencies fresh.

Overview

pip-tools = pip-compile + pip-sync

A set of command line tools to help you keep your pip-based packages fresh, even when you've pinned them. You do pin them, right? (In building your Python application and its dependencies for production, you want to make sure that your builds are predictable and deterministic.)

[Diagram: pip-tools overview]

Installation

Similar to pip, pip-tools must be installed in each of your project's virtual environments:

$ source /path/to/venv/bin/activate
(venv)$ python -m pip install pip-tools

Note: all of the remaining example commands assume you've activated your project's virtual environment.

Example usage for pip-compile

The pip-compile command lets you compile a requirements.txt file from your dependencies, specified in either setup.py or requirements.in.

Run it with pip-compile or python -m piptools compile. If you use multiple Python versions, you can run pip-compile as py -X.Y -m piptools compile on Windows and pythonX.Y -m piptools compile on other systems.
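
For example, assuming Python 3.8 is one of your installed versions (an illustrative choice; substitute your own X.Y):

# on Windows
$ py -3.8 -m piptools compile

# on other systems
$ python3.8 -m piptools compile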

pip-compile should be run from the same virtual environment as your project so conditional dependencies that require a specific Python version, or other environment markers, resolve relative to your project's environment.

Note: ensure you don't have an existing requirements.txt when you compile setup.py or requirements.in from scratch; otherwise, it might interfere with the result.

Requirements from setup.py

Suppose you have a Django project, and want to pin it for production. If you have a setup.py with install_requires=['django'], then run pip-compile without any arguments:

$ pip-compile
#
# This file is autogenerated by pip-compile
# To update, run:
#
#    pip-compile
#
asgiref==3.2.3
    # via django
django==3.0.3
    # via my_django_project (setup.py)
pytz==2019.3
    # via django
sqlparse==0.3.0
    # via django

pip-compile will produce your requirements.txt, with all the Django dependencies (and all underlying dependencies) pinned.

Without setup.py

If you don't use setup.py (it's easy to write one), you can create a requirements.in file to declare the Django dependency:

# requirements.in
django

Now, run pip-compile requirements.in:

$ pip-compile requirements.in
#
# This file is autogenerated by pip-compile
# To update, run:
#
#    pip-compile requirements.in
#
asgiref==3.2.3
    # via django
django==3.0.3
    # via -r requirements.in
pytz==2019.3
    # via django
sqlparse==0.3.0
    # via django

And it will produce your requirements.txt, with all the Django dependencies (and all underlying dependencies) pinned.

Using hashes

If you would like to use the Hash-Checking Mode available in pip since version 8.0, pip-compile offers the --generate-hashes flag:

$ pip-compile --generate-hashes requirements.in
#
# This file is autogenerated by pip-compile
# To update, run:
#
#    pip-compile --generate-hashes requirements.in
#
asgiref==3.2.3 \
    --hash=sha256:7e06d934a7718bf3975acbf87780ba678957b87c7adc056f13b6215d610695a0 \
    --hash=sha256:ea448f92fc35a0ef4b1508f53a04c4670255a3f33d22a81c8fc9c872036adbe5 \
    # via django
django==3.0.3 \
    --hash=sha256:2f1ba1db8648484dd5c238fb62504777b7ad090c81c5f1fd8d5eb5ec21b5f283 \
    --hash=sha256:c91c91a7ad6ef67a874a4f76f58ba534f9208412692a840e1d125eb5c279cb0a \
    # via -r requirements.in
pytz==2019.3 \
    --hash=sha256:1c557d7d0e871de1f5ccd5833f60fb2550652da6be2693c1e02300743d21500d \
    --hash=sha256:b02c06db6cf09c12dd25137e563b31700d3b80fcc4ad23abb7a315f2789819be \
    # via django
sqlparse==0.3.0 \
    --hash=sha256:40afe6b8d4b1117e7dff5504d7a8ce07d9a1b15aeeade8a2d10f130a834f8177 \
    --hash=sha256:7c3dca29c022744e95b547e867cee89f4fce4373f3549ccd8797d8eb52cdb873 \
    # via django

Updating requirements

To update all packages, periodically re-run pip-compile --upgrade.
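
For example:

# bump every pinned package to its latest allowed version
$ pip-compile --upgrade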

To update a specific package to the latest or a specific version, use the --upgrade-package or -P flag:

# only update the django package
$ pip-compile --upgrade-package django

# update both the django and requests packages
$ pip-compile --upgrade-package django --upgrade-package requests

# update the django package to the latest, and requests to v2.0.0
$ pip-compile --upgrade-package django --upgrade-package requests==2.0.0

You can combine --upgrade and --upgrade-package in one command to provide constraints on the allowed upgrades. For example, to upgrade all packages while constraining requests to the latest version less than 3.0:

$ pip-compile --upgrade --upgrade-package 'requests<3.0'

Output File

To output the pinned requirements to a file other than requirements.txt, use --output-file. This might be useful for compiling multiple files, for example with different constraints on django, to test a library against both versions using tox:

$ pip-compile --upgrade-package 'django<1.0' --output-file requirements-django0x.txt
$ pip-compile --upgrade-package 'django<2.0' --output-file requirements-django1x.txt

Or to output to standard output, use --output-file=-:

$ pip-compile --output-file=- > requirements.txt
$ pip-compile - --output-file=- < requirements.in > requirements.txt

Forwarding options to pip

Any valid pip flags or arguments may be passed on with pip-compile's --pip-args option, e.g.

$ pip-compile requirements.in --pip-args '--retries 10 --timeout 30'

Configuration

You might be wrapping the pip-compile command in another script. To avoid confusing consumers of your custom script you can override the update command generated at the top of requirements files by setting the CUSTOM_COMPILE_COMMAND environment variable.

$ CUSTOM_COMPILE_COMMAND="./pipcompilewrapper" pip-compile requirements.in
#
# This file is autogenerated by pip-compile
# To update, run:
#
#    ./pipcompilewrapper
#
asgiref==3.2.3
    # via django
django==3.0.3
    # via -r requirements.in
pytz==2019.3
    # via django
sqlparse==0.3.0
    # via django

Workflow for layered requirements

If you have different environments that you need to install different but compatible packages for, then you can create layered requirements files and use one layer to constrain the other.

For example, if you have a Django project where you want the newest 2.1 release in production, and you want the Django debug toolbar while developing, then you can create two *.in files, one for each layer:

# requirements.in
django<2.2

At the top of the development requirements file dev-requirements.in, use -c requirements.txt to constrain the dev requirements to the packages already selected for production in requirements.txt.

# dev-requirements.in
-c requirements.txt
django-debug-toolbar

First, compile requirements.txt as usual:

$ pip-compile
#
# This file is autogenerated by pip-compile
# To update, run:
#
#    pip-compile
#
django==2.1.15
    # via -r requirements.in
pytz==2019.3
    # via django

Now compile the dev requirements; the requirements.txt file is used as a constraint:

$ pip-compile dev-requirements.in
#
# This file is autogenerated by pip-compile
# To update, run:
#
#    pip-compile dev-requirements.in
#
django-debug-toolbar==2.2
    # via -r dev-requirements.in
django==2.1.15
    # via
    #   -c requirements.txt
    #   django-debug-toolbar
pytz==2019.3
    # via
    #   -c requirements.txt
    #   django
sqlparse==0.3.0
    # via django-debug-toolbar

As you can see above, even though a 2.2 release of Django is available, the dev requirements only include a 2.1 version of Django because they were constrained. Now both compiled requirements files can be installed safely in the dev environment.

To install requirements in production, use:

$ pip-sync

To install requirements in development, use:

$ pip-sync requirements.txt dev-requirements.txt

Version control integration

You might use pip-compile as a hook for pre-commit. See the pre-commit docs for instructions. Sample .pre-commit-config.yaml:

repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 5.0.0
    hooks:
      - id: pip-compile

You might want to customize pip-compile args by configuring args and/or files, for example:

repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 5.0.0
    hooks:
      - id: pip-compile
        files: ^requirements/production\.(in|txt)$
        args: [--index-url=https://example.com, requirements/production.in]

Example usage for pip-sync

Now that you have a requirements.txt, you can use pip-sync to update your virtual environment to reflect exactly what's in there. This will install/upgrade/uninstall everything necessary to match the requirements.txt contents.

Run it with pip-sync or python -m piptools sync. If you use multiple Python versions, you can also run py -X.Y -m piptools sync on Windows and pythonX.Y -m piptools sync on other systems.

pip-sync must be installed into and run from the same virtual environment as your project to identify which packages to install or upgrade.

Be careful: pip-sync is meant to be used only with a requirements.txt generated by pip-compile.

$ pip-sync
Uninstalling flake8-2.4.1:
  Successfully uninstalled flake8-2.4.1
Collecting click==4.1
  Downloading click-4.1-py2.py3-none-any.whl (62kB)
    100% |................................| 65kB 1.8MB/s
  Found existing installation: click 4.0
    Uninstalling click-4.0:
      Successfully uninstalled click-4.0
Successfully installed click-4.1

To sync multiple *.txt dependency lists, just pass them in via command line arguments, e.g.

$ pip-sync dev-requirements.txt requirements.txt

If no arguments are passed, pip-sync defaults to requirements.txt.

Any valid pip install flags or arguments may be passed with pip-sync's --pip-args option, e.g.

$ pip-sync requirements.txt --pip-args '--no-cache-dir --no-deps'

Note: pip-sync will not upgrade or uninstall packaging tools like setuptools, pip, or pip-tools itself. Use python -m pip install --upgrade to upgrade those packages.
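
For example, to upgrade those tools yourself inside the active virtual environment:

$ python -m pip install --upgrade pip setuptools pip-tools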

Should I commit requirements.in and requirements.txt to source control?

Generally, yes. If you want a reproducible environment installation available from your source control, then yes, you should commit both requirements.in and requirements.txt to source control.

Note that if you are deploying on multiple Python environments (read the section below), then you must commit a separate output file for each Python environment. We suggest using the {env}-requirements.txt format (e.g. win32-py3.7-requirements.txt, macos-py3.6-requirements.txt, etc.).

Cross-environment usage of requirements.in/requirements.txt and pip-compile

The dependencies of a package can change depending on the Python environment in which it is installed. Here, we define a Python environment as the combination of Operating System, Python version (3.6, 3.7, etc.), and Python implementation (CPython, PyPy, etc.). For an exact definition, refer to the possible combinations of PEP 508 environment markers.

As the resulting requirements.txt can differ for each environment, users must execute pip-compile on each Python environment separately to generate a requirements.txt valid for each said environment. The same requirements.in can be used as the source file for all environments, using PEP 508 environment markers as needed, the same way it would be done for regular pip cross-environment usage.
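
For illustration only (the package names and markers below are arbitrary examples, not part of the walkthrough above), a requirements.in using PEP 508 environment markers might look like this:

# requirements.in
django
colorama ; sys_platform == "win32"
dataclasses ; python_version < "3.7"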

If the generated requirements.txt remains exactly the same for all Python environments, then it can be used across Python environments safely. But users should be careful, as any package update can introduce environment-dependent dependencies, making any newly generated requirements.txt environment-dependent too. As a general rule, it's advised that users still always execute pip-compile on each targeted Python environment to avoid issues.
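
For example, following the naming convention suggested earlier (the file names are illustrative), each target environment compiles its own output file:

# run on the Windows / Python 3.7 environment
$ pip-compile --output-file win32-py3.7-requirements.txt requirements.in

# run on the macOS / Python 3.6 environment
$ pip-compile --output-file macos-py3.6-requirements.txt requirements.in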

Deprecations

This section lists pip-tools features that are currently deprecated.

  • The --index/--no-index command-line options are deprecated; use --emit-index-url/--no-emit-index-url instead (since 5.2.0).
  • In future versions, the --allow-unsafe behavior will be enabled by default. Use --no-allow-unsafe to keep the old behavior. It is recommended to pass --allow-unsafe now to adapt to the upcoming change (see the example after this list).
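
For example, to opt in to the upcoming default behavior today:

$ pip-compile --allow-unsafe requirements.in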

Versions and compatibility

The table below summarizes the latest pip-tools versions with the required pip and Python versions. Generally, pip-tools supports the same Python versions as the required pip versions.

pip-tools      pip              Python
4.5.*          8.1.3 - 20.0.2   2.7, 3.5 - 3.8
5.0.0 - 5.3.0  20.0 - 20.1.1    2.7, 3.5 - 3.8
5.4.0          20.1 - 20.3.*    2.7, 3.5 - 3.8
5.5.0          20.1 - 20.3.*    2.7, 3.5 - 3.9
6.0.0          20.1 - 20.3.*    3.6 - 3.9
Comments
  • Fixup relative and absolute path handling

    Initial Summary (Outdated)
    
    Rewrite input file paths as relative to the output file, or as absolutes if using stdout:
    
    - Change click arg output_file from click.File to click.Path,
      so we can get its absolute path
    - Change to output file's parent folder before opening it,
      helping upstream pip code to use correct relative paths
    - Return to the starting dir after compilation,
      for test or other calls from code which don't expect dir changes
    - Rewrite src_files as absolute paths as soon as we have them,
      to resolve relative paths properly (CLI args are relative to CWD)
    - When deriving the output path from a single input path, stay safer and more predictable,
      particularly if the basename has no dot, and the path does (see example in #1067)
    - Rewrite src_files as relative to the output path once that's known,
      unless output is stdout
    - Don't overwrite an input file ending in '.txt' when deriving the output file path
    - Add tests:
        - test_annotation_relative_paths
        - test_input_file_with_txt_extension
        - test_input_file_without_extension_and_dotted_path
    
    Fixes #1107
    
    Contributes to #1067
    
    Minor tag-alongs:
      - fix some comment typos
      - git-ignore sublime text project and workspace files
    
    QUESTIONS:
    1. If output is stdout, should paths (in annotations) be absolute,
      or relative to current working folder? 
      With this PR, they're absolute, and I think that is appropriate.
    2. With this PR, `output_file` changes its type, early on. 
      Does that bother anyone? It starts as a `click.Path`, then 
      after paths are resolved by our logic, it's replaced by a `click.File`. 
      It's a small window of `click.Path`-ness. An example consequence
      is seen in this PR's change to `test_writer.py`.
    3. What is the best result of using `name.txt` as an input file, without specifying the output file?
      With this PR, it outputs to `name.txt.txt`, which is the best I can think of.
    4. What additional tests would be good to have, if any? 
      More annotation variants, like with `-c`? More complicated relative paths?
    
    **Changelog-friendly one-liners**: 
    - Rewrite input file paths as relative to the output file, or as absolutes if using stdout
    - Don't overwrite an input file ending in '.txt' when deriving the output file path
    - Don't confuse dots in folder names with file extensions when deriving the output file path
    
    
    Summary, Take 2 (also outdated)

    Fixup relative and absolute path handling:

    These changes have been made with the general guideline of storing paths as absolute as soon as we can, and rendering them as relative or absolute as needed.

    | Path | Initial Interpretation | Output Format (file) | Output Format (stdout) |
    | --- | --- | --- | --- |
    | source file | relative to invocation dir | (annotation) relative to output file | absolute |
    | ireq from source file | relative to its source file | relative to output file, unless initially absolute | absolute |
    | ireq from --upgrade-package | relative to invocation dir | ~relative to output file~ I think: relative to output file if passed as relative, absolute if passed as absolute, pathless if passed as pathless | absolute |
    | git+file: ireq from source file | relative to its source file | absolute (pip doesn't support relative paths in that form) | absolute |

    Itemized Changes by File

    utils.py

    • Changed:
      • format_requirement:
        • Add optional str kwarg from_dir:
          • If used, it'll rewrite local paths as relative (to from_dir).
        • Replace alternative path separators in relpaths with forward slashes.
        • Use pip's path_to_url for abs paths.
        • Ensure fragment is attached if present originally.
    • Added:
      • Function abs_ireq:

        def abs_ireq(ireq: InstallRequirement, from_dir: str) -> InstallRequirement:
            """
            Return the given InstallRequirement if its source isn't a relative path;
            Otherwise, return a new one with the relative path rewritten as absolute.
        
            In this case, an extra attribute is added: _was_relative,
            which is always True when present at all.
            """
        
      • Context manager working_dir:

        @contextmanager
        def working_dir(folder: Optional[str]) -> Iterator[None]:
            """Change the current directory within the context, then change it back."""
        
      • Function fragment_string:

        def fragment_string(ireq: InstallRequirement) -> str:
            """
            Return a string like "#egg=pkgname&subdirectory=folder", or "".
            """
        

    pip_compat.py

    • Changed:
      • parse_requirements:
        • Add optional str kwarg from_dir to parse_requirements. If left to its default, None, the parent of the source file is used. Either way it's passed to abs_ireq, so any yielded local ireqs have absolute .links, and some have ._was_relative.
        • Ensure pip's install_req_from_parsed_requirement is called from a sensible folder, to better resolve relative paths; and try to detect if each ireq was initially relative, to "manually" mark the resulting (absolute) ireq with _was_relative.

    compile.py

    • Change Click argument type for the output file from File to Path. When Click's File object is initialized with the absolute path, that full path is preserved as the .name attribute. So we now instantiate the output File ourselves after resolving its absolute path.
    • Resolve src_files to their absolute paths.
    • When deriving the output path from a single input path, ensure it's properly adjacent to the input, and stay safer and more predictable when the basename has no dot and the path does, or the input file ends in .txt (see #1067, #1107, and tests below).
    • Use abs_ireq when collecting upgrade_install_reqs (--upgrade-package), passing the invocation dir as from_dir.
    • No support for relative paths is introduced for setup.py install_requires, given the discussion @ https://discuss.python.org/t/what-is-the-correct-interpretation-of-path-based-pep-508-uri-reference/2815/18
    • Ensure a suitable from_dir is passed to parse_requirements when parsing from setup.py or stdin, which really parses a temp file. This means setup.py's parent folder, or the invocation dir if the source is stdin.

    writer.py

    • Added:
      • comes_from_line_project_re pattern for parsing and rewriting comes_from strs that point to setup.pys and pyproject.tomls.
    • Changed:
      • strip_comes_from_line_re:
        • Extend/replace the pattern as comes_from_line_re, with named groups for opts (-r/-c), path, and line_num.
      • _comes_from_as_string:
        • Add optional str kwarg from_dir. If the ireq.comes_from is already a str and from_dir is passed, in addition to stripping the line number as before, rewrite the path as relative.
        • Add handling for comes_from_line_project_re matches.
      • _format_requirement:
        • If the ireq has ._was_relative and the output is a file, pass the output file's parent as from_dir to format_requirement, ensuring the written path for the ireq is relative in that case.
        • Pass the parent of the output file, if any, as from_dir to _comes_from_as_string.

    test_cli_compile.py

    • Added:
      • test_relative_local_package
        • Relative paths are properly resolved between input, output, and local packages.
        • Input file paths/URIs can be relative, as long as they start with file: or ..
      • test_input_file_with_txt_extension
        • Compile an input file ending in .txt to a separate output file (*.txt.txt), without overwriting the input file.
      • test_input_file_without_extension_and_dotted_path
        • Compile a file without an extension, in a subdir with a dot, into an input-adjacent file with .txt as the extension.
      • test_annotation_relative_paths
        • Annotations referencing reqs.in files use paths relative to the reqs.txt.
      • test_local_vcs_package
        • git+file urls are rewritten to use absolute paths, and otherwise remain intact.
    • Changed:
      • test_duplicate_reqs_combined
        • Use pip's path_to_url to detect the normalized package path in URL form.

    test_writer.py

    • Changed:
      • writer fixture:
        • Open an output file object to pass to OutputWriter, rather than passing the click ctx entry (now just a Path).
      • test_write_header:
        • Access the user-supplied output file path via writer.click_ctx.params["output_file"] (now just a Path), rather than checking that for a .name.
      • test_iter_lines__hash_missing:
        • Use regex match to recognize Windows drive names.
      • test_iter_lines__no_warn_if_only_unhashable_packages:
        • Use regex match to recognize Windows drive names.
    • Added:
      • test_format_requirement_annotation_source_ireqs

    test_utils.py

    • Changed:
      • test_format_requirement_editable_local_path
        • Use regex match to recognize Windows drive names.
    • Added:
      • test_working_dir
      • test_local_abs_ireq_preserves_source_ireqs

    Changelog-friendly one-liners:

    • Support relative paths in input files, as long as they lead with file:, <vcs>+file:, or ..
    • If a local requirement path is relative in the input, interpret it as relative to that input file, and write it as relative to the output file, if any. Otherwise, write the absolute path.
    • Rewrite input file paths (in annotations) as relative to the output file, or as absolute if using stdout.
    • Don't overwrite an input file ending in '.txt' when deriving the output file path.
    • Don't confuse dots in folder names with file extensions when deriving the output file path.
    • Write requirement paths using forward slashes rather than backslashes, on Windows.

    Changelog-friendly one-liners:

    • Support relative paths in input files, as long as they lead with file:, <vcs>+file:, or ..
    • Relative paths in input files become relative paths in output files.
    • pip-compile will interpret relative paths in an input file as relative to that input file, rather than the current folder, if --read-relative-to-input is passed.
    • pip-compile will reconstruct relative req paths as relative to the output file, rather than the current folder, if --write-relative-to-output is passed.
    • pip-sync will interpret relative paths in an input file as relative to that input file, rather than the current folder, if --read-relative-to-input is passed.
    • Annotation paths are now relative to the output file.
    • Don't overwrite an input file ending in '.txt' when deriving the output file path.
    • Don't confuse dots in folder names with file extensions when deriving the output file path.
    • Include extras more reliably in output lines, like pkg[extra1,extra2].

    • Fixes #1107
    • Fixes #204
    • Fixes #1165
    • Related #1067
    • Related #453 (Not addressed in this PR)
    • Related #673
    • Related #702
    Contributor checklist
    • [x] Provided the tests for the changes.
    • [ ] Gave a clear one-line description in the PR (that the maintainers can add to CHANGELOG.md on release).
    • [ ] Assign the PR to an existing or new milestone for the target version (following Semantic Versioning).
    enhancement 
    opened by AndydeCleyre 103
  • Add support for pip's 2020 dependency resolver

    What's new?

    Added new option --resolver [backtracking|legacy] to pip-compile (default is legacy).

    How to use?

    To enable 2020 dependency resolver run pip-compile --resolver=backtracking.

    Backtracking resolver example

    $ echo "oslo.utils==1.4.0" | pip-compile - --resolver=backtracking --allow-unsafe --annotation-style=line -qo-
    #
    # This file is autogenerated by pip-compile with python 3.8
    # To update, run:
    #
    #    pip-compile --allow-unsafe --annotation-style=line --output-file=- --resolver=backtracking -
    #
    babel==2.10.3             # via oslo-i18n, oslo-utils
    iso8601==1.0.2            # via oslo-utils
    netaddr==0.8.0            # via oslo-utils
    netifaces==0.11.0         # via oslo-utils
    oslo-i18n==2.1.0          # via oslo-utils
    oslo-utils==1.4.0         # via -r -
    pbr==0.11.1               # via oslo-i18n, oslo-utils
    pytz==2022.1              # via babel
    six==1.16.0               # via oslo-i18n, oslo-utils
    
    # The following packages are considered to be unsafe in a requirements file:
    pip==22.1.2               # via pbr
    

    Legacy resolver example

    $ echo "oslo.utils==1.4.0" | pip-compile - --resolver=legacy --allow-unsafe -qo-
    Could not find a version that matches pbr!=0.7,!=2.1.0,<1.0,>=0.6,>=2.0.0 (from oslo.utils==1.4.0->-r -)
    Tried: 0.5.2.5.g5b3e942, 0.5.0, 0.5.1, 0.5.2, 0.5.4, 0.5.5, 0.5.6, 0.5.7, 0.5.8, 0.5.10, 0.5.11, 0.5.12, 0.5.13, 0.5.14, 0.5.15, 0.5.16, 0.5.17, 0.5.18, 0.5.19, 0.5.20, 0.5.21, 0.5.22, 0.5.23, 0.6, 0.7.0, 0.8.0, 0.8.1, 0.8.2, 0.9.0, 0.9.0, 0.10.0, 0.10.0, 0.10.1, 0.10.1, 0.10.2, 0.10.2, 0.10.3, 0.10.3, 0.10.4, 0.10.4, 0.10.5, 0.10.5, 0.10.6, 0.10.6, 0.10.7, 0.10.7, 0.10.8, 0.10.8, 0.11.0, 0.11.0, 0.11.1, 0.11.1, 1.0.0, 1.0.0, 1.0.1, 1.0.1, 1.1.0, 1.1.0, 1.1.1, 1.1.1, 1.2.0, 1.2.0, 1.3.0, 1.3.0, 1.4.0, 1.4.0, 1.5.0, 1.5.0, 1.6.0, 1.6.0, 1.7.0, 1.7.0, 1.8.0, 1.8.0, 1.8.1, 1.8.1, 1.9.0, 1.9.0, 1.9.1, 1.9.1, 1.10.0, 1.10.0, 2.0.0, 2.0.0, 2.1.0, 2.1.0, 3.0.0, 3.0.0, 3.0.1, 3.0.1, 3.1.0, 3.1.0, 3.1.1, 3.1.1, 4.0.0, 4.0.0, 4.0.1, 4.0.1, 4.0.2, 4.0.2, 4.0.3, 4.0.3, 4.0.4, 4.0.4, 4.1.0, 4.1.0, 4.1.1, 4.1.1, 4.2.0, 4.2.0, 4.3.0, 4.3.0, 5.0.0, 5.0.0, 5.1.0, 5.1.0, 5.1.1, 5.1.1, 5.1.2, 5.1.2, 5.1.3, 5.1.3, 5.2.0, 5.2.0, 5.2.1, 5.2.1, 5.3.0, 5.3.0, 5.3.1, 5.3.1, 5.4.0, 5.4.0, 5.4.1, 5.4.1, 5.4.2, 5.4.2, 5.4.3, 5.4.3, 5.4.4, 5.4.4, 5.4.5, 5.4.5, 5.5.0, 5.5.0, 5.5.1, 5.5.1, 5.6.0, 5.6.0, 5.7.0, 5.7.0, 5.8.0, 5.8.0
    There are incompatible versions in the resolved dependencies:
      pbr!=0.7,<1.0,>=0.6 (from oslo.utils==1.4.0->-r -)
      pbr!=2.1.0,>=2.0.0 (from oslo.i18n==5.1.0->oslo.utils==1.4.0->-r -)
    
    Contributor checklist
    • [x] Provided the tests for the changes.
    • [x] Assure PR title is short, clear, and good to be included in the user-oriented changelog
    Maintainer checklist
    • [x] Assure one of these labels is present: backwards incompatible, feature, enhancement, deprecation, bug, dependency, docs or skip-changelog as they determine changelog listing.
    • [x] Assign the PR to an existing or new milestone for the target version (following Semantic Versioning).
    enhancement resolver 
    opened by atugushev 65
  • Annotate primary requirements and VCS dependencies

    Resolves #881 Resolves #293

    This change brings annotations to primary requirements in the compilation output.

    The annotation may be merely a reqs-in source:

    django-debug-toolbar==2.2  # via -r requirements.in (line 2)
    

    or it may additionally include reverse dependencies:

    django==3.0.3             # via -r requirements.in (line 1), django-debug-toolbar
    

    Existing tests are modified to either adjust their expectations, or compile with --no-annotations if annotations are irrelevant. Two tests have been inverted and renamed:

    -test_format_requirement_not_for_primary
    +test_format_requirement_for_primary
    -test_format_requirement_not_for_primary_lower_case
    +test_format_requirement_for_primary_lower_case
    

    Changelog-friendly one-liner: Primary requirements and VCS dependencies now get annotated with any source .in files and reverse dependencies

    Contributor checklist
    • [x] Provided the tests for the changes.
    • [x] Gave a clear one-line description in the PR (that the maintainers can add to CHANGELOG.md on release).
    • [x] Assign the PR to an existing or new milestone for the target version (following Semantic Versioning).

    Please review these changes with the following questions in mind:

    1. Is the new annotation string as desired?
    2. Is it desirable to add an option to disable this new behavior?
    3. Should the non-annotation-focused tests needing modification be, generally, expecting an exact annotation? Or should we simply use --no-annotate?
    4. In which cases might an InstallRequirement's comes_from attribute be a str, or an InstallRequirement? The following alternatives seem to result in the same output. Which is saner or preferred, or handles potential edge cases better?
     required_by |= {
         src_ireq.comes_from
    -    for src_ireq in ireq._source_ireqs
         if isinstance(src_ireq.comes_from, str)
    +    else src_ireq.comes_from.name.lower()
    +    for src_ireq in ireq._source_ireqs
    +    if src_ireq.comes_from
     }
    

    code in context

    1. If/when all looks good, want it squashed?
    enhancement 
    opened by AndydeCleyre 40
  • pip-compile failure pep517/in_process/_in_process.py get_requires_for_build_wheel

    My pip-compile was working fine yesterday and is now failing.

    I don't know what changed. I know that my requirements.txt did not change and my pip-tools version did not change either.

    I have seen issues #1535 and #1390 but no workaround works for me.

    Environment Versions

    1. OS Type: Linux
    2. Python version: Python 3.8.10
    3. pip version: pip 21.3.1
    4. pip-tools version: pip-compile, version 6.4.0

    Steps to replicate

    $ cat setup.py
    from setuptools import setup
    
    
    setup(
        name='apollo',
        install_requires=['conan==1.45.0']
        )
    
    $ cat build/requirements.txt
    bottle==0.12.19
        # via conan
    certifi==2021.10.8
        # via requests
    charset-normalizer==2.0.12
        # via requests
    colorama==0.4.4
        # via conan
    conan==1.45.0
        # via apollo (setup.py)
    distro==1.6.0
        # via conan
    fasteners==0.17.3
        # via conan
    idna==3.3
        # via requests
    jinja2==3.0.3
        # via conan
    markupsafe==2.1.0
        # via jinja2
    node-semver==0.6.1
        # via conan
    patch-ng==1.17.4
        # via conan
    pluginbase==1.0.1
        # via conan
    pygments==2.11.2
        # via conan
    pyjwt==1.7.1
        # via conan
    python-dateutil==2.8.2
        # via conan
    pyyaml==5.4.1
        # via conan
    requests==2.27.1
        # via conan
    six==1.16.0
        # via
        #   conan
        #   python-dateutil
    tqdm==4.62.3
        # via conan
    urllib3==1.26.8
        # via
        #   conan
        #   requests
    

    Expected result

    success

    Actual result

    $ pip-compile --output-file build/requirements.txt
    ERROR: WARNING: You are using pip version 21.3.1; however, version 22.0.4 is available.
    ERROR: You should consider upgrading via the '/data/homes/jcoulon/.gradle/pyvenv-apollo/bin/python -m pip install --upgrade pip' command.
    Traceback (most recent call last):
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/bin/pip-compile", line 8, in <module>
        sys.exit(cli())
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/click/core.py", line 1128, in __call__
        return self.main(*args, **kwargs)
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/click/core.py", line 1053, in main
        rv = self.invoke(ctx)
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/click/core.py", line 1395, in invoke
        return ctx.invoke(self.callback, **ctx.params)
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/click/core.py", line 754, in invoke
        return __callback(*args, **kwargs)
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/click/decorators.py", line 26, in new_func
        return f(get_current_context(), *args, **kwargs)
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/piptools/scripts/compile.py", line 408, in cli
        dist = meta.load(os.path.dirname(os.path.abspath(src_file)))
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/pep517/meta.py", line 71, in load
        path = Path(build_as_zip(builder))
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/pep517/meta.py", line 58, in build_as_zip
        builder(dest=out_dir)
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/pep517/meta.py", line 53, in build
        _prep_meta(hooks, env, dest)
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/pep517/meta.py", line 28, in _prep_meta
        reqs = hooks.get_requires_for_build_wheel({})
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/pep517/wrappers.py", line 172, in get_requires_for_build_wheel
        return self._call_hook('get_requires_for_build_wheel', {
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/pep517/wrappers.py", line 322, in _call_hook
        self._subprocess_runner(
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/pep517/wrappers.py", line 75, in quiet_subprocess_runner
        check_output(cmd, cwd=cwd, env=env, stderr=STDOUT)
      File "/usr/lib/python3.8/subprocess.py", line 415, in check_output
        return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
      File "/usr/lib/python3.8/subprocess.py", line 516, in run
        raise CalledProcessError(retcode, process.args,
    subprocess.CalledProcessError: Command '['/data/homes/jcoulon/.gradle/pyvenv-apollo/bin/python', '/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/pep517/in_process/_in_process.py', 'get_requires_for_build_wheel', '/tmp/tmpr0u030tn']' returned non-zero exit status 1.
    
    opened by jeremy-coulon 38
  • Pip10 update

    Update pip-tools for pip10 compatibility (and backwards compatibility with pip9)

    Contributor checklist
    • [x] Provided the tests for the changes
    • [x] Requested (or received) a review from another contributor
    • [x] Gave a clear one-line description in the PR (that the maintainers can add to CHANGELOG.md afterwards).

    /cc @vphilippon

    I didn't add tests, since I just rebuilt existing functionality using pip10 as well. This should work with both versions of pip and is all green locally

    opened by techalchemy 38
  • Dependency handling in requirements when updating packages

    So, here is the promised brain dump, sorry for the length.

    Right now naively updating requirements can lead to dependency conflicts. For instance, let's say I want to add raven to my project but pinned to a specific version:

    $ pip install raven==1.9.4
    …
    Successfully installed raven simplejson
    

    So raven needs simplejson. Now I run pip freeze and get in my requirements.txt:

    raven==1.9.4
    simplejson==2.4.0
    

    Some time later I run pip-review and get (this is not what you'd get right now):

    raven==2.0.2 is available (you have 1.9.4)
    simplejson==2.6.2 is available (you have 2.4.0)
    

    Note that the newer simplejson was already available when I initially installed raven, but raven needed simplejson>=2.3.0,<2.5.0. Raven 2.0.2 does as well, but this still encourages me to upgrade simplejson when I shouldn't.

    The current version of raven dropped the >=2.3.0,<2.5.0 part so now we can get the latest and greatest raven and simplejson safely.

    My point is that when updating dependencies, checking for conflicts is very hard to do by hand. This needs to be automated with a tool that yells at the developer when an update leads to a version conflict.

    Ruby gets this right with Bundler. gem install bundle, create a Gemfile with the following content:

    source :rubygems
    gem 'compass-less-plugin'
    

    And run bundle install. This installs the required package and its dependencies and creates a Gemfile.lock file:

    GEM
      remote: http://rubygems.org/
      specs:
        chunky_png (1.2.6)
        compass (0.12.2)
          chunky_png (~> 1.2)
          fssm (>= 0.2.7)
          sass (~> 3.1)
        compass-less-plugin (1.0)
          compass (>= 0.10)
        fssm (0.2.9)
        sass (3.2.1)
    
    PLATFORMS
      ruby
    
    DEPENDENCIES
      compass-less-plugin
    

    Gemfile.lock is like requirements.txt with pinned versions (not everything is pinned here but should probably be): when creating a new environment and running bundle install, bundler looks at the .lock file to install what's specified.

    Then there is a bunch of commands that bundle provides. For instance, to list available updates (running this on a bundle created months ago):

    $ bundle outdated
    Fetching gem metadata from http://rubygems.org/.....
    
    Outdated gems included in the bundle:
      * chunky_png (1.2.6 > 1.2.5)
      * fssm (0.2.9 > 0.2.8.1)
      * sass (3.2.1 > 3.1.12)
      * compass (0.12.2 > 0.11.7)
    

    Updating compass-less-plugin and its dependencies can be done in one command (bundle update compass-less-plugin) and does so while checking for version conflicts.

    Sorry if you're already familiar with all this. Now I'll try to explain how we can improve requirements.txt by using this approach.

    First, instead of putting all the requirements in requirements.txt, people would only list first-level deps, pinned. So for raven:

    raven==1.9.4
    

    Then some tool provided by pip-tools compiles this into the full requirements list, into an other file (like Gemfile and Gemfile.lock but with less noise):

    raven==1.9.4
    simplejson==2.4.0
    

    The key point is that this tool builds the whole dependency tree for all the top-level requirements and dumps it as a safely-installable-with-no-conflicts requirements file, which pip can just use.

    So next time raven is updated and doesn't require an old simplejson, the tool can update the simplejson requirement. When raven drops simplejson to use python's built-in json implementation, the 2nd-level requirement can be dropped as well, automatically.

    Other use case: requests which used to have dependencies on oauthlib, certifi, chardet and doesn't anymore (and oauthlib needed rsa or pyasn1 or whatever). If I just need requests I'll list in my top-level requirements and the tool will pin or drop the dependencies if they're not needed when I upgrade requests itself.

    And finally, this tool could prevent me from installing package X and Y which need Z<1.0 and Z>1.1.

    That's the theory and I think pip already does some version conflict checks but that's not enough to guarantee safe updates. Now in practice, I think the dependency information is not provided by the PyPI API and requires the whole package to be fetched to actually extract it (or maybe create.io provides that info). So that's annoying but doable, and pip-tools seems like a nice place to experiment with such things.

    I think buildout does check for dependency conflicts but I never managed to wrap my head around it.

    What do you think? I'm happy to start a proof-of-concept that could be integrated in this project.

    opened by brutasse 38
  • Workflow for layered requirements (e.g. prod<-test<-dev requirements)?

    Say I have

    requirements.in:

    Django~=1.8.0
    

    And also

    requirements-dev.in:

    django-debug-toolbar
    

    How can I run pip-compile on requirements-dev.in, where it will also take into account the requirements in requirements.in when figuring out which versions to use?

    For now I have an ad-hoc script that compiles requirements.in first, then requirements-dev.in has -r requirements.txt as its first line. Is this an okay workflow? I'm worried that in the future if I add a dependency it will try and update a bunch of stuff I don't want it to update, but I haven't actually used this tool long enough to determine whether that's truly a problem. Wondering if anyone else has used pip-tools in this fashion and has any advice?

    PR wanted docs 
    opened by dan-passaro 37
  • Periods get converted to dashes in package name

    I have a requirements.txt file with the following:

    My.Package~=1.0
    My.Sub.Package~=1.1
    

    When I run python -m piptools compile, the periods in the package name get converted to dashes:

    my-package==1.0.0
        # via -r requirements.txt
    my-sub-package==1.1.1
        # via
        #   -r requirements.txt
        #   my.package
    

    I need the periods to stay periods. I have no control over the names of the packages. Some package names may have both periods and dashes, such as My.More-Complex.Package, and I don't want that to change to my-more-complex-package.

    Yes, the output file above functions correctly, but we're doing extra parsing on it that's breaking because the periods are now dashes.

    Edit: Per the comment below, if the periods are converted to dashes to be consistent with pip, then I'd prefer that all periods be converted to dashes, including # my.package to # my-package in the example above.

    writer 
    opened by sawatsky 36
  • Add --newline=[LF|CRLF|native|preserve] option to compile, to override the line separator characters used

    pip-compile gains an option with ~two~ ~three~ four valid choices: --newline=[LF|CRLF|native|preserve], which can be used to override the guessed newline character used in the output file. The default is ~native~ preserve, which ~uses os.linesep~ tries to be consistent with an existing output file, or input file, or ~FALLBACK_VALUE (native, or LF? TBD)~ falls back to LF, in that order.

    This aims to address #1448.

    ~Note: poll for fallback value~

    Contributor checklist
    • [ ] Provided the tests for the changes.
    • [ ] Assure PR title is short, clear, and good to be included in the user-oriented changelog
    Maintainer checklist
    • [ ] Assure one of these labels is present: backwards incompatible, feature, enhancement, deprecation, bug, dependency, docs or skip-changelog as they determine changelog listing.
    • [ ] Assign the PR to an existing or new milestone for the target version (following Semantic Versioning).
    opened by AndydeCleyre 36
  • virtualenv issue

    I'm not exactly sure what's going on, but with a barebones requirements.txt file within a virtualenv, pip-sync is failing.

    (venv)➜  pip-tools  pip list
    pip (7.1.2)
    setuptools (18.2)
    wheel (0.24.0)
    (venv)➜  pip-tools  pip-sync
    Cannot uninstall requirement appnope, not installed
    Traceback (most recent call last):
      File "/usr/local/bin/pip-sync", line 11, in <module>
        sys.exit(cli())
      File "/usr/local/lib/python2.7/site-packages/click/core.py", line 716, in __call__
        return self.main(*args, **kwargs)
      File "/usr/local/lib/python2.7/site-packages/click/core.py", line 696, in main
        rv = self.invoke(ctx)
      File "/usr/local/lib/python2.7/site-packages/click/core.py", line 889, in invoke
        return ctx.invoke(self.callback, **ctx.params)
      File "/usr/local/lib/python2.7/site-packages/click/core.py", line 534, in invoke
        return callback(*args, **kwargs)
      File "/usr/local/lib/python2.7/site-packages/piptools/scripts/sync.py", line 68, in cli
        pip_flags=pip_flags))
      File "/usr/local/lib/python2.7/site-packages/piptools/sync.py", line 137, in sync
        check_call(['pip', 'uninstall', '-y'] + pip_flags + sorted(to_uninstall))
      File "/usr/local/Cellar/python/2.7.10_2/Frameworks/Python.framework/Versions/2.7/lib/python2.7/subprocess.py", line 540, in check_call
        raise CalledProcessError(retcode, cmd)
    subprocess.CalledProcessError: Command '['pip', 'uninstall', '-y', 'appnope', 'aws-shell', 'awscli', 'boto3', 'botocore', 'colorama', 'configobj', 'decorator', 'docutils', 'flake8', 'futures', 'gnureadline', 'ipython', 'ipython-genutils', 'isort', 'jmespath', 'mccabe', 'path.py', 'pep8', 'pexpect', 'pickleshare', 'prompt-toolkit', 'ptyprocess', 'pyasn1', 'pyflakes', 'pygments', 'python-dateutil', 'requests', 'rsa', 'simplegeneric', 'speedtest-cli', 'traitlets', 'virtualenv', 'wcwidth']' returned non-zero exit status 1
    

    In my current directory and virtual environment, pip-sync is trying to uninstall globally installed packages.

    Possibly related to #277.

    opened by zackhsi 35
  • pip-review?

    Hello,

    Thanks for all the great work on pip-tools! I like the way that the project is heading (pip-compile and pip-sync look pretty cool).

    I just noticed that the latest release has removed pip-review. I was wondering what the new equivalent is (as I found this tool very useful)?

    eg. every night I build a new pyvenv, install required pip libraries and run pip-review to send an email out to the devs letting them know if any libraries require upgrading.

    opened by gavinjackson 33
  • Resolve recursive extras

    Closes #1685

    Contributor checklist
    • [x] Provided the tests for the changes.
    • [x] Assure PR title is short, clear, and good to be included in the user-oriented changelog
    Maintainer checklist
    • [x] Assure one of these labels is present: backwards incompatible, feature, enhancement, deprecation, bug, dependency, docs or skip-changelog as they determine changelog listing.
    • [ ] Assign the PR to an existing or new milestone for the target version (following Semantic Versioning).
    bug 
    opened by q0w 0
  • --pip-args not passed to pip installing setuptools

    I have to use custom/private pypi index as our network setup blocks access to https://pypi.org. Passing --pip-args="--index_url https://pypi.private.url" (aka pip -i) is not applied when setuptools and wheel are being installed. The issue is also visible with other args like --timeout 1, etc...

    As this seems to be an issue out of scope of pip-tools I'm not sure it can be fixed/handled. I wanted to provide a writeup with observed behavior and workarounds I've tested for devs facing the same issues in future, because I've failed to find a simple answer/solution when searching through issues myself.

    Environment Versions

    1. OS Type: Windows 10.0.19044
    2. Python version: $ python -V: Python 3.10.6
    3. pip version: $ pip --version: pip 22.3.1
    4. pip-tools version: $ pip-compile --version: pip-compile, version 6.12.0

    Steps to replicate

    1. create venv
    2. update pip, install piptools
    3. create pyproject.toml. nothing fancy, barebones setup
    # pyproject.toml
    
    [project]
    name = "modulename"
    version = "1.0.0"
    
    dependencies = []
    requires-python = ">=3.8"
    
    [build-system]
    requires      = ["setuptools>=61.0.0", "wheel"]
    build-backend = "setuptools.build_meta"
    
    4. try to compile pyproject.toml:
      pip-compile --verbose pyproject.toml --resolver=backtracking --pip-args="-i https://pypi.private.url"

    My findings so far

    1. click loads --pip-args successfully -> no issues with CLI parsing
    2. the variable pip_args is available until piptools\scripts\compile.py", line 483, metadata = project_wheel_metadata()
    3. It seems the -i is not passed to \build\util.py", line 53, project_wheel_metadata() in any way and there is no interface to make it happen.
    4. and that's why build\env.py", line 211, in install; _subprocess(cmd) executes without -i and fails

    Expected result

    Successful result obtained via Workaround

    Creating venv isolated environment...
    Installing packages in isolated environment... (setuptools>=61.0.0, wheel)
    Getting build dependencies for wheel...
    Installing packages in isolated environment... (wheel)
    Getting metadata for wheel...
    Using indexes:
      https://pypi.private.url
      Looking in indexes: https://pypi.private.url
    

    Actual result

    subprocess executes pip without -i, tries to reach pypi.org despite --pip-args="-i https://pypi.private.url" and obviously fails.

    output + traceback:
    Creating venv isolated environment...
    Installing packages in isolated environment... (setuptools>=61.0.0, wheel)
    WARNING: Retrying (Retry(total=4, connect=None, read=None, redirect=None, status=None)) after connection broken by 'ReadTimeoutError("HTTPSConnectionPool(host='pypi.org', port=443): Read timed out. (read timeout=15)")': /simple/setuptools/
    WARNING: Retrying (Retry(total=3, connect=None, read=None, redirect=None, status=None)) after connection broken by 'ReadTimeoutError("HTTPSConnectionPool(host='pypi.org', port=443): Read timed out. (read timeout=15)")': /simple/setuptools/
    ...
    
    Traceback (most recent call last):
      File "C:\Python38\lib\runpy.py", line 196, in _run_module_as_main
        return _run_code(code, main_globals, None,
      File "C:\Python38\lib\runpy.py", line 86, in _run_code
        exec(code, run_globals)
      File "C:\Python38\Scripts\pip-compile.exe\__main__.py", line 7, in <module>
      File "C:\Python38\lib\site-packages\click\core.py", line 1130, in __call__
        return self.main(*args, **kwargs)
      File "C:\Python38\lib\site-packages\click\core.py", line 1055, in main
        rv = self.invoke(ctx)
      File "C:\Python38\lib\site-packages\click\core.py", line 1404, in invoke
        return ctx.invoke(self.callback, **ctx.params)
      File "C:\Python38\lib\site-packages\click\core.py", line 760, in invoke
        return __callback(*args, **kwargs)
      File "C:\Python38\lib\site-packages\click\decorators.py", line 26, in new_func
        return f(get_current_context(), *args, **kwargs)
      File "C:\Python38\lib\site-packages\piptools\scripts\compile.py", line 483, in cli
        metadata = project_wheel_metadata(
      File "C:\Python38\lib\site-packages\build\util.py", line 53, in project_wheel_metadata
        env.install(builder.build_system_requires)
      File "C:\Python38\lib\site-packages\build\env.py", line 211, in install
        _subprocess(cmd)
      File "C:\Python38\lib\site-packages\build\env.py", line 76, in _subprocess
        raise e
      File "C:\Python38\lib\site-packages\build\env.py", line 73, in _subprocess
        subprocess.run(cmd, check=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
      File "C:\Python38\lib\subprocess.py", line 524, in run
        raise CalledProcessError(retcode, process.args,
    subprocess.CalledProcessError: Command '['C:\\Users\\USERNAME\\AppData\\Local\\Temp\\build-env-nltqvjdv\\Scripts\\python.exe', '-Im', 'pip', 'install', '--use-pep517', '--no-warn-script-location', '-r', 'C:\\Users\\USERNAME\\AppData\\Local\\Temp\\build-reqs-wthveb8h.txt']' returned non-zero exit status 1.
    

    Workarounds

    1. Use the PIP_INDEX_URL environment variable (see the sketch after this list)

    2. Force system global pip index in:
      Win: %HOME%/pip/pip.ini
      nix: $HOME/.config/pip/pip.conf

    [global]
    index_url = https://pypi.private.url
    
    [search]
    index_url = https://pypi.private.url
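
    A hedged sketch of the first workaround, assuming a POSIX shell and the same private index URL:

    $ export PIP_INDEX_URL=https://pypi.private.url
    $ pip-compile --resolver=backtracking pyproject.toml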
    
    bug pep-517 
    opened by neaxi 2
  • editable requirements are not allowed as constraints with the new backtracking resolver

    I'm using a pattern of 3 requirement files:

    • global constraints (base.txt)
    • setup
    • src
    • dev

    Setup includes a local package that is used by later src packages as a custom setuptools build class.

    Since things are ordered, we use the pins from the previous file as constraints for the next file to ensure that all the files are pinned to the same versions. Thus:

    # setup.in
    -c base.txt
    -e file:src/my-setup-package#egg=my-setup-package
    
    # src.in
    -c base.txt
    -c setup.txt  # <------------ note this is the output pin from setup.in
    -e file:src/my-web-app#egg=my-web-app
    

    With the pip-tools workflow we then update each .txt file in order:

    $ pip-compile --find-links wheels --no-emit-find-links --resolver=backtracking setup.in
    $ pip-compile --find-links wheels --no-emit-find-links --resolver=backtracking src.in
    $ pip-compile --find-links wheels --no-emit-find-links --resolver=backtracking dev.in
    

    This workflow only fails when using --resolver=backtracking.

    Tested with pip-tools 6.11.0.

    resolver vcs 
    opened by mmerickel 5
  • Using a combination of `--generate-hashes`, `-c constraints.txt`, `--resolver=backtracking` and `--strip-extras` doesn't currently work

    Hi,

    First of all, thanks for the package and all the hard work you put into it.

    I tried looking through the repo for a similar bug, but I haven't found anything. At the same time, I don't think I'm doing anything too much out of the ordinary - I'm using:

    • --generate-hashes, for obvious security reasons
    • -c constraints.txt at the top of the dev requirements files, as that's the recommended layered approach
    • --resolver=backtracking, for better package resolving (and it will be the default at some point)
    • --strip-extras, because otherwise using --resolver=backtracking failed with Constraints cannot have extras

    The above scenario unfortunately produces requirement files that cannot be installed because of this:

    ERROR: In --require-hashes mode, all requirements must have their versions pinned with ==. These do not:
    PyJWT[crypto]<3.0.0,>=1.5.2 from https://files.pythonhosted.org/packages/40/46/505f0dd53c14096f01922bf93a7abb4e40e29a06f858abbaa791e6954324/PyJWT-2.6.0-py3-none-any.whl (from drf-jwt==1.19.2->-r /home/circleci/project/requirements/prod.txt (line 269))
    

    As I understand it, --strip-extras, which is needed for --resolver=backtracking to work with -c constraints.txt, is currently incompatible with --generate-hashes, because it needs all dependencies to have hashes and (correctly?) differentiates between PyJWT[crypto] and PyJWT.

    My current workaround will probably involve dropping --resolver=backtracking and --strip-extras for now, but I wanted to write up this issue while I'm debugging all this.

    Related issues: #398, #1092, #1300

    Environment Versions

    1. macOS 12.6.1
    2. Python 3.8.13
    3. pip 22.3.1
    4. pip-compile, version 6.10.0

    Steps to replicate

    Given requirements files:

    $ cat main.in
    django==3.1.14
    djangorestframework==3.12.4
    drf-jwt==1.19.2
    PyJWT[crypto]==2.1.0
    
    $ cat dev.in
    -c main.txt
    
    black
    
    $ python3 -m piptools compile --generate-hashes --strip-extras --resolver=backtracking main.in
    #
    # This file is autogenerated by pip-compile with python 3.8
    # To update, run:
    #
    #    pip-compile --generate-hashes --resolver=backtracking --strip-extras main.in
    #
    asgiref==3.5.2 \
        --hash=sha256:1d2880b792ae8757289136f1db2b7b99100ce959b2aa57fd69dab783d05afac4 \
        --hash=sha256:4a29362a6acebe09bf1d6640db38c1dc3d9217c68e6f9f6204d72667fc19a424
        # via django
    cffi==1.15.1 \
        --hash=sha256:00a9ed42e88df81ffae7a8ab6d9356b371399b91dbdf0c3cb1e84c03a13aceb5 \
        --hash=sha256:03425bdae262c76aad70202debd780501fabeaca237cdfddc008987c0e0f59ef \
        --hash=sha256:04ed324bda3cda42b9b695d51bb7d54b680b9719cfab04227cdd1e04e5de3104 \
        --hash=sha256:0e2642fe3142e4cc4af0799748233ad6da94c62a8bec3a6648bf8ee68b1c7426 \
        --hash=sha256:173379135477dc8cac4bc58f45db08ab45d228b3363adb7af79436135d028405 \
        --hash=sha256:198caafb44239b60e252492445da556afafc7d1e3ab7a1fb3f0584ef6d742375 \
        --hash=sha256:1e74c6b51a9ed6589199c787bf5f9875612ca4a8a0785fb2d4a84429badaf22a \
        --hash=sha256:2012c72d854c2d03e45d06ae57f40d78e5770d252f195b93f581acf3ba44496e \
        --hash=sha256:21157295583fe8943475029ed5abdcf71eb3911894724e360acff1d61c1d54bc \
        --hash=sha256:2470043b93ff09bf8fb1d46d1cb756ce6132c54826661a32d4e4d132e1977adf \
        --hash=sha256:285d29981935eb726a4399badae8f0ffdff4f5050eaa6d0cfc3f64b857b77185 \
        --hash=sha256:30d78fbc8ebf9c92c9b7823ee18eb92f2e6ef79b45ac84db507f52fbe3ec4497 \
        --hash=sha256:320dab6e7cb2eacdf0e658569d2575c4dad258c0fcc794f46215e1e39f90f2c3 \
        --hash=sha256:33ab79603146aace82c2427da5ca6e58f2b3f2fb5da893ceac0c42218a40be35 \
        --hash=sha256:3548db281cd7d2561c9ad9984681c95f7b0e38881201e157833a2342c30d5e8c \
        --hash=sha256:3799aecf2e17cf585d977b780ce79ff0dc9b78d799fc694221ce814c2c19db83 \
        --hash=sha256:39d39875251ca8f612b6f33e6b1195af86d1b3e60086068be9cc053aa4376e21 \
        --hash=sha256:3b926aa83d1edb5aa5b427b4053dc420ec295a08e40911296b9eb1b6170f6cca \
        --hash=sha256:3bcde07039e586f91b45c88f8583ea7cf7a0770df3a1649627bf598332cb6984 \
        --hash=sha256:3d08afd128ddaa624a48cf2b859afef385b720bb4b43df214f85616922e6a5ac \
        --hash=sha256:3eb6971dcff08619f8d91607cfc726518b6fa2a9eba42856be181c6d0d9515fd \
        --hash=sha256:40f4774f5a9d4f5e344f31a32b5096977b5d48560c5592e2f3d2c4374bd543ee \
        --hash=sha256:4289fc34b2f5316fbb762d75362931e351941fa95fa18789191b33fc4cf9504a \
        --hash=sha256:470c103ae716238bbe698d67ad020e1db9d9dba34fa5a899b5e21577e6d52ed2 \
        --hash=sha256:4f2c9f67e9821cad2e5f480bc8d83b8742896f1242dba247911072d4fa94c192 \
        --hash=sha256:50a74364d85fd319352182ef59c5c790484a336f6db772c1a9231f1c3ed0cbd7 \
        --hash=sha256:54a2db7b78338edd780e7ef7f9f6c442500fb0d41a5a4ea24fff1c929d5af585 \
        --hash=sha256:5635bd9cb9731e6d4a1132a498dd34f764034a8ce60cef4f5319c0541159392f \
        --hash=sha256:59c0b02d0a6c384d453fece7566d1c7e6b7bae4fc5874ef2ef46d56776d61c9e \
        --hash=sha256:5d598b938678ebf3c67377cdd45e09d431369c3b1a5b331058c338e201f12b27 \
        --hash=sha256:5df2768244d19ab7f60546d0c7c63ce1581f7af8b5de3eb3004b9b6fc8a9f84b \
        --hash=sha256:5ef34d190326c3b1f822a5b7a45f6c4535e2f47ed06fec77d3d799c450b2651e \
        --hash=sha256:6975a3fac6bc83c4a65c9f9fcab9e47019a11d3d2cf7f3c0d03431bf145a941e \
        --hash=sha256:6c9a799e985904922a4d207a94eae35c78ebae90e128f0c4e521ce339396be9d \
        --hash=sha256:70df4e3b545a17496c9b3f41f5115e69a4f2e77e94e1d2a8e1070bc0c38c8a3c \
        --hash=sha256:7473e861101c9e72452f9bf8acb984947aa1661a7704553a9f6e4baa5ba64415 \
        --hash=sha256:8102eaf27e1e448db915d08afa8b41d6c7ca7a04b7d73af6514df10a3e74bd82 \
        --hash=sha256:87c450779d0914f2861b8526e035c5e6da0a3199d8f1add1a665e1cbc6fc6d02 \
        --hash=sha256:8b7ee99e510d7b66cdb6c593f21c043c248537a32e0bedf02e01e9553a172314 \
        --hash=sha256:91fc98adde3d7881af9b59ed0294046f3806221863722ba7d8d120c575314325 \
        --hash=sha256:94411f22c3985acaec6f83c6df553f2dbe17b698cc7f8ae751ff2237d96b9e3c \
        --hash=sha256:98d85c6a2bef81588d9227dde12db8a7f47f639f4a17c9ae08e773aa9c697bf3 \
        --hash=sha256:9ad5db27f9cabae298d151c85cf2bad1d359a1b9c686a275df03385758e2f914 \
        --hash=sha256:a0b71b1b8fbf2b96e41c4d990244165e2c9be83d54962a9a1d118fd8657d2045 \
        --hash=sha256:a0f100c8912c114ff53e1202d0078b425bee3649ae34d7b070e9697f93c5d52d \
        --hash=sha256:a591fe9e525846e4d154205572a029f653ada1a78b93697f3b5a8f1f2bc055b9 \
        --hash=sha256:a5c84c68147988265e60416b57fc83425a78058853509c1b0629c180094904a5 \
        --hash=sha256:a66d3508133af6e8548451b25058d5812812ec3798c886bf38ed24a98216fab2 \
        --hash=sha256:a8c4917bd7ad33e8eb21e9a5bbba979b49d9a97acb3a803092cbc1133e20343c \
        --hash=sha256:b3bbeb01c2b273cca1e1e0c5df57f12dce9a4dd331b4fa1635b8bec26350bde3 \
        --hash=sha256:cba9d6b9a7d64d4bd46167096fc9d2f835e25d7e4c121fb2ddfc6528fb0413b2 \
        --hash=sha256:cc4d65aeeaa04136a12677d3dd0b1c0c94dc43abac5860ab33cceb42b801c1e8 \
        --hash=sha256:ce4bcc037df4fc5e3d184794f27bdaab018943698f4ca31630bc7f84a7b69c6d \
        --hash=sha256:cec7d9412a9102bdc577382c3929b337320c4c4c4849f2c5cdd14d7368c5562d \
        --hash=sha256:d400bfb9a37b1351253cb402671cea7e89bdecc294e8016a707f6d1d8ac934f9 \
        --hash=sha256:d61f4695e6c866a23a21acab0509af1cdfd2c013cf256bbf5b6b5e2695827162 \
        --hash=sha256:db0fbb9c62743ce59a9ff687eb5f4afbe77e5e8403d6697f7446e5f609976f76 \
        --hash=sha256:dd86c085fae2efd48ac91dd7ccffcfc0571387fe1193d33b6394db7ef31fe2a4 \
        --hash=sha256:e00b098126fd45523dd056d2efba6c5a63b71ffe9f2bbe1a4fe1716e1d0c331e \
        --hash=sha256:e229a521186c75c8ad9490854fd8bbdd9a0c9aa3a524326b55be83b54d4e0ad9 \
        --hash=sha256:e263d77ee3dd201c3a142934a086a4450861778baaeeb45db4591ef65550b0a6 \
        --hash=sha256:ed9cb427ba5504c1dc15ede7d516b84757c3e3d7868ccc85121d9310d27eed0b \
        --hash=sha256:fa6693661a4c91757f4412306191b6dc88c1703f780c8234035eac011922bc01 \
        --hash=sha256:fcd131dd944808b5bdb38e6f5b53013c5aa4f334c5cad0c72742f6eba4b73db0
        # via cryptography
    cryptography==3.4.8 \
        --hash=sha256:0a7dcbcd3f1913f664aca35d47c1331fce738d44ec34b7be8b9d332151b0b01e \
        --hash=sha256:1eb7bb0df6f6f583dd8e054689def236255161ebbcf62b226454ab9ec663746b \
        --hash=sha256:21ca464b3a4b8d8e86ba0ee5045e103a1fcfac3b39319727bc0fc58c09c6aff7 \
        --hash=sha256:34dae04a0dce5730d8eb7894eab617d8a70d0c97da76b905de9efb7128ad7085 \
        --hash=sha256:3520667fda779eb788ea00080124875be18f2d8f0848ec00733c0ec3bb8219fc \
        --hash=sha256:3c4129fc3fdc0fa8e40861b5ac0c673315b3c902bbdc05fc176764815b43dd1d \
        --hash=sha256:3fa3a7ccf96e826affdf1a0a9432be74dc73423125c8f96a909e3835a5ef194a \
        --hash=sha256:5b0fbfae7ff7febdb74b574055c7466da334a5371f253732d7e2e7525d570498 \
        --hash=sha256:695104a9223a7239d155d7627ad912953b540929ef97ae0c34c7b8bf30857e89 \
        --hash=sha256:8695456444f277af73a4877db9fc979849cd3ee74c198d04fc0776ebc3db52b9 \
        --hash=sha256:94cc5ed4ceaefcbe5bf38c8fba6a21fc1d365bb8fb826ea1688e3370b2e24a1c \
        --hash=sha256:94fff993ee9bc1b2440d3b7243d488c6a3d9724cc2b09cdb297f6a886d040ef7 \
        --hash=sha256:9965c46c674ba8cc572bc09a03f4c649292ee73e1b683adb1ce81e82e9a6a0fb \
        --hash=sha256:a00cf305f07b26c351d8d4e1af84ad7501eca8a342dedf24a7acb0e7b7406e14 \
        --hash=sha256:a305600e7a6b7b855cd798e00278161b681ad6e9b7eca94c721d5f588ab212af \
        --hash=sha256:cd65b60cfe004790c795cc35f272e41a3df4631e2fb6b35aa7ac6ef2859d554e \
        --hash=sha256:d2a6e5ef66503da51d2110edf6c403dc6b494cc0082f85db12f54e9c5d4c3ec5 \
        --hash=sha256:d9ec0e67a14f9d1d48dd87a2531009a9b251c02ea42851c060b25c782516ff06 \
        --hash=sha256:f44d141b8c4ea5eb4dbc9b3ad992d45580c1d22bf5e24363f2fbf50c2d7ae8a7
        # via pyjwt
    django==3.1.14 \
        --hash=sha256:0fabc786489af16ad87a8c170ba9d42bfd23f7b699bd5ef05675864e8d012859 \
        --hash=sha256:72a4a5a136a214c39cf016ccdd6b69e2aa08c7479c66d93f3a9b5e4bb9d8a347
        # via
        #   -r main.in
        #   djangorestframework
        #   drf-jwt
    djangorestframework==3.12.4 \
        --hash=sha256:6d1d59f623a5ad0509fe0d6bfe93cbdfe17b8116ebc8eda86d45f6e16e819aaf \
        --hash=sha256:f747949a8ddac876e879190df194b925c177cdeb725a099db1460872f7c0a7f2
        # via
        #   -r main.in
        #   drf-jwt
    drf-jwt==1.19.2 \
        --hash=sha256:63c3d4ed61a1013958cd63416e2d5c84467d8ae3e6e1be44b1fb58743dbd1582 \
        --hash=sha256:660bc66f992065cef59832adcbbdf871847e9738671c19e5121971e773768235
        # via -r main.in
    pycparser==2.21 \
        --hash=sha256:8ee45429555515e1f6b185e78100aea234072576aa43ab53aefcae078162fca9 \
        --hash=sha256:e644fdec12f7872f86c58ff790da456218b10f863970249516d60a5eaca77206
        # via cffi
    pyjwt==2.1.0 \
        --hash=sha256:934d73fbba91b0483d3857d1aff50e96b2a892384ee2c17417ed3203f173fca1 \
        --hash=sha256:fba44e7898bbca160a2b2b501f492824fc8382485d3a6f11ba5d0c1937ce6130
        # via
        #   -r main.in
        #   drf-jwt
    pytz==2022.6 \
        --hash=sha256:222439474e9c98fced559f1709d89e6c9cbf8d79c794ff3eb9f8800064291427 \
        --hash=sha256:e89512406b793ca39f5971bc999cc538ce125c0e51c27941bef4568b460095e2
        # via django
    sqlparse==0.4.3 \
        --hash=sha256:0323c0ec29cd52bceabc1b4d9d579e311f3e4961b98d174201d5622a23b85e34 \
        --hash=sha256:69ca804846bb114d2ec380e4360a8a340db83f0ccf3afceeb1404df028f57268
        # via django
    
    $ python3 -m piptools compile --generate-hashes --strip-extras --resolver=backtracking dev.in
    #
    # This file is autogenerated by pip-compile with python 3.8
    # To update, run:
    #
    #    pip-compile --generate-hashes --resolver=backtracking --strip-extras dev.in
    #
    black==22.10.0 \
        --hash=sha256:14ff67aec0a47c424bc99b71005202045dc09270da44a27848d534600ac64fc7 \
        --hash=sha256:197df8509263b0b8614e1df1756b1dd41be6738eed2ba9e9769f3880c2b9d7b6 \
        --hash=sha256:1e464456d24e23d11fced2bc8c47ef66d471f845c7b7a42f3bd77bf3d1789650 \
        --hash=sha256:2039230db3c6c639bd84efe3292ec7b06e9214a2992cd9beb293d639c6402edb \
        --hash=sha256:21199526696b8f09c3997e2b4db8d0b108d801a348414264d2eb8eb2532e540d \
        --hash=sha256:2644b5d63633702bc2c5f3754b1b475378fbbfb481f62319388235d0cd104c2d \
        --hash=sha256:432247333090c8c5366e69627ccb363bc58514ae3e63f7fc75c54b1ea80fa7de \
        --hash=sha256:444ebfb4e441254e87bad00c661fe32df9969b2bf224373a448d8aca2132b395 \
        --hash=sha256:5b9b29da4f564ba8787c119f37d174f2b69cdfdf9015b7d8c5c16121ddc054ae \
        --hash=sha256:5cc42ca67989e9c3cf859e84c2bf014f6633db63d1cbdf8fdb666dcd9e77e3fa \
        --hash=sha256:5d8f74030e67087b219b032aa33a919fae8806d49c867846bfacde57f43972ef \
        --hash=sha256:72ef3925f30e12a184889aac03d77d031056860ccae8a1e519f6cbb742736383 \
        --hash=sha256:819dc789f4498ecc91438a7de64427c73b45035e2e3680c92e18795a839ebb66 \
        --hash=sha256:915ace4ff03fdfff953962fa672d44be269deb2eaf88499a0f8805221bc68c87 \
        --hash=sha256:9311e99228ae10023300ecac05be5a296f60d2fd10fff31cf5c1fa4ca4b1988d \
        --hash=sha256:974308c58d057a651d182208a484ce80a26dac0caef2895836a92dd6ebd725e0 \
        --hash=sha256:b8b49776299fece66bffaafe357d929ca9451450f5466e997a7285ab0fe28e3b \
        --hash=sha256:c957b2b4ea88587b46cf49d1dc17681c1e672864fd7af32fc1e9664d572b3458 \
        --hash=sha256:e41a86c6c650bcecc6633ee3180d80a025db041a8e2398dcc059b3afa8382cd4 \
        --hash=sha256:f513588da599943e0cde4e32cc9879e825d58720d6557062d1098c5ad80080e1 \
        --hash=sha256:fba8a281e570adafb79f7755ac8721b6cf1bbf691186a287e990c7929c7692ff
        # via -r dev.in
    click==8.1.3 \
        --hash=sha256:7682dc8afb30297001674575ea00d1814d808d6a36af415a82bd481d37ba7b8e \
        --hash=sha256:bb4d8133cb15a609f44e8213d9b391b0809795062913b383c62be0ee95b1db48
        # via black
    mypy-extensions==0.4.3 \
        --hash=sha256:090fedd75945a69ae91ce1303b5824f428daf5a028d2f6ab8a299250a846f15d \
        --hash=sha256:2d82818f5bb3e369420cb3c4060a7970edba416647068eb4c5343488a6c604a8
        # via black
    pathspec==0.10.2 \
        --hash=sha256:88c2606f2c1e818b978540f73ecc908e13999c6c3a383daf3705652ae79807a5 \
        --hash=sha256:8f6bf73e5758fd365ef5d58ce09ac7c27d2833a8d7da51712eac6e27e35141b0
        # via black
    platformdirs==2.5.4 \
        --hash=sha256:1006647646d80f16130f052404c6b901e80ee4ed6bef6792e1f238a8969106f7 \
        --hash=sha256:af0276409f9a02373d540bf8480021a048711d572745aef4b7842dad245eba10
        # via black
    tomli==2.0.1 \
        --hash=sha256:939de3e7a6161af0c887ef91b7d41a53e7c5a1ca976325f429cb46ea9bc30ecc \
        --hash=sha256:de526c12914f0c550d15924c62d72abc48d6fe7364aa87328337a31007fe8a4f
        # via black
    typing-extensions==4.4.0 \
        --hash=sha256:1511434bb92bf8dd198c12b1cc812e800d4181cfcb867674e0f8279cc93087aa \
        --hash=sha256:16fa4864408f655d35ec496218b85f79b3437c829e93320c7c9215ccfd92489e
        # via black
    

    Expected result

    I'd like to be able to use generated requirements files to install the packages.

    Actual result

    I can't install the packages with either pip-sync or pip install -r:

    $ python -m piptools sync main.txt dev.txt
    Collecting asgiref==3.5.2
      Using cached asgiref-3.5.2-py3-none-any.whl (22 kB)
    Collecting black==22.10.0
      Using cached black-22.10.0-cp38-cp38-macosx_11_0_arm64.whl (1.2 MB)
    Collecting cffi==1.15.1
      Using cached cffi-1.15.1.tar.gz (508 kB)
      Preparing metadata (setup.py) ... done
    Collecting cryptography==3.4.8
      Using cached cryptography-3.4.8-cp36-abi3-macosx_11_0_arm64.whl (1.9 MB)
    Collecting django==3.1.14
      Using cached Django-3.1.14-py3-none-any.whl (7.8 MB)
    Collecting djangorestframework==3.12.4
      Using cached djangorestframework-3.12.4-py3-none-any.whl (957 kB)
    Collecting drf-jwt==1.19.2
      Using cached drf_jwt-1.19.2-py2.py3-none-any.whl (21 kB)
    Collecting mypy-extensions==0.4.3
      Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
    Collecting pathspec==0.10.2
      Using cached pathspec-0.10.2-py3-none-any.whl (28 kB)
    Collecting platformdirs==2.5.4
      Using cached platformdirs-2.5.4-py3-none-any.whl (14 kB)
    Collecting pycparser==2.21
      Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
    Collecting pyjwt==2.1.0
      Using cached PyJWT-2.1.0-py3-none-any.whl (16 kB)
    Collecting pytz==2022.6
      Using cached pytz-2022.6-py2.py3-none-any.whl (498 kB)
    Collecting sqlparse==0.4.3
      Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
    Collecting typing-extensions==4.4.0
      Using cached typing_extensions-4.4.0-py3-none-any.whl (26 kB)
    Requirement already satisfied: tomli>=1.1.0 in /Users/pawelad/.pyenv/versions/3.8.13/envs/tmp/lib/python3.8/site-packages (from black==22.10.0->-r /var/folders/zd/bh5ny2dj5hv8x_5vf92tnfz40000gn/T/tmp5g4e645s (line 4)) (2.0.1)
    Requirement already satisfied: click>=8.0.0 in /Users/pawelad/.pyenv/versions/3.8.13/envs/tmp/lib/python3.8/site-packages (from black==22.10.0->-r /var/folders/zd/bh5ny2dj5hv8x_5vf92tnfz40000gn/T/tmp5g4e645s (line 4)) (8.1.3)
    Collecting PyJWT[crypto]<3.0.0,>=1.5.2
    ERROR: In --require-hashes mode, all requirements must have their versions pinned with ==. These do not:
        PyJWT[crypto]<3.0.0,>=1.5.2 from https://files.pythonhosted.org/packages/40/46/505f0dd53c14096f01922bf93a7abb4e40e29a06f858abbaa791e6954324/PyJWT-2.6.0-py3-none-any.whl (from drf-jwt==1.19.2->-r /var/folders/zd/bh5ny2dj5hv8x_5vf92tnfz40000gn/T/tmp5g4e645s (line 117))
    Traceback (most recent call last):
      File "/Users/pawelad/.pyenv/versions/3.8.13/lib/python3.8/runpy.py", line 194, in _run_module_as_main
        return _run_code(code, main_globals, None,
      File "/Users/pawelad/.pyenv/versions/3.8.13/lib/python3.8/runpy.py", line 87, in _run_code
        exec(code, run_globals)
      File "/Users/pawelad/.pyenv/versions/tmp/lib/python3.8/site-packages/piptools/__main__.py", line 19, in <module>
        cli()
      File "/Users/pawelad/.pyenv/versions/tmp/lib/python3.8/site-packages/click/core.py", line 1130, in __call__
        return self.main(*args, **kwargs)
      File "/Users/pawelad/.pyenv/versions/tmp/lib/python3.8/site-packages/click/core.py", line 1055, in main
        rv = self.invoke(ctx)
      File "/Users/pawelad/.pyenv/versions/tmp/lib/python3.8/site-packages/click/core.py", line 1657, in invoke
        return _process_result(sub_ctx.command.invoke(sub_ctx))
      File "/Users/pawelad/.pyenv/versions/tmp/lib/python3.8/site-packages/click/core.py", line 1404, in invoke
        return ctx.invoke(self.callback, **ctx.params)
      File "/Users/pawelad/.pyenv/versions/tmp/lib/python3.8/site-packages/click/core.py", line 760, in invoke
        return __callback(*args, **kwargs)
      File "/Users/pawelad/.pyenv/versions/tmp/lib/python3.8/site-packages/piptools/scripts/sync.py", line 177, in cli
        sync.sync(
      File "/Users/pawelad/.pyenv/versions/tmp/lib/python3.8/site-packages/piptools/sync.py", line 240, in sync
        run(  # nosec
      File "/Users/pawelad/.pyenv/versions/3.8.13/lib/python3.8/subprocess.py", line 516, in run
        raise CalledProcessError(retcode, process.args,
    subprocess.CalledProcessError: Command '['/Users/pawelad/.pyenv/versions/tmp/bin/python', '-m', 'pip', 'install', '-r', '/var/folders/zd/bh5ny2dj5hv8x_5vf92tnfz40000gn/T/tmp5g4e645s']' returned non-zero exit status 1.
    
    $ pip install -r main.txt
    Collecting asgiref==3.5.2
      Using cached asgiref-3.5.2-py3-none-any.whl (22 kB)
    Collecting cffi==1.15.1
      Using cached cffi-1.15.1.tar.gz (508 kB)
      Preparing metadata (setup.py) ... done
    Collecting cryptography==3.4.8
      Using cached cryptography-3.4.8-cp36-abi3-macosx_11_0_arm64.whl (1.9 MB)
    Collecting django==3.1.14
      Using cached Django-3.1.14-py3-none-any.whl (7.8 MB)
    Collecting djangorestframework==3.12.4
      Using cached djangorestframework-3.12.4-py3-none-any.whl (957 kB)
    Collecting drf-jwt==1.19.2
      Using cached drf_jwt-1.19.2-py2.py3-none-any.whl (21 kB)
    Collecting pycparser==2.21
      Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
    Collecting pyjwt==2.1.0
      Using cached PyJWT-2.1.0-py3-none-any.whl (16 kB)
    Collecting pytz==2022.6
      Using cached pytz-2022.6-py2.py3-none-any.whl (498 kB)
    Collecting sqlparse==0.4.3
      Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
    Collecting PyJWT[crypto]<3.0.0,>=1.5.2
    ERROR: In --require-hashes mode, all requirements must have their versions pinned with ==. These do not:
        PyJWT[crypto]<3.0.0,>=1.5.2 from https://files.pythonhosted.org/packages/40/46/505f0dd53c14096f01922bf93a7abb4e40e29a06f858abbaa791e6954324/PyJWT-2.6.0-py3-none-any.whl (from drf-jwt==1.19.2->-r main.txt (line 111))
    
    dependency resolver 
    opened by pawelad 5
  • Is 22.0 possible?


    opened by daveisfera 4
Releases(6.12.1)
  • 6.12.1(Dec 17, 2022)

  • 6.12.0(Dec 14, 2022)

    Features:

    • Add --no-index flag to pip-compile (#1745). Thanks @atugushev
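
    A hedged usage sketch, assuming the new flag behaves like pip's --no-index and resolves without contacting a package index (e.g. against local wheels only):

    $ pip-compile --no-index --find-links ./wheels requirements.in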

    Bug Fixes:

    • Treat --upgrade-packages PKGSPECs as constraints (not just minimums), consistently (#1578). Thanks @AndydeCleyre
    • Filter out the user provided unsafe packages (#1766). Thanks @q0w
    • Adopt PEP-621 for packaging (#1763). Thanks @ssbarnea
  • 6.11.0(Dec 1, 2022)

    Features:

    • Add pyproject.toml file (#1643). Thanks @otherJL0
    • Support build isolation using setuptools/pyproject.toml requirement files (#1727). Thanks @atugushev

    Bug Fixes:

    • Improve punctuation/grammar with pip-compile header (#1547). Thanks @blueyed
    • Generate hashes for all available candidates (#1723). Thanks @neykov

    Other Changes:

    • Bump click minimum version to >= 8 (#1733). Thanks @atugushev
    • Bump pip minimum version to >= 22.2 (#1729). Thanks @atugushev
  • 6.10.0(Nov 14, 2022)

    Features:

    • Deprecate pip-compile --resolver=legacy (#1724). Thanks @atugushev
    • Prompt user to use the backtracking resolver on errors (#1719). Thanks @maxfenv
    • Add support for Python 3.11 final (#1708). Thanks @hugovk
    • Add --newline=[LF|CRLF|native|preserve] option to pip-compile (#1652). Thanks @AndydeCleyre
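
    A minimal usage sketch of the new --newline option (requirements.in is just an example input file):

    $ pip-compile --newline=LF requirements.in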

    Bug Fixes:

    • Fix inconsistent handling of constraints comments with backtracking resolver (#1713). Thanks @mkniewallner
    • Fix some encoding warnings in Python 3.10 (PEP 597) (#1614). Thanks @GalaxySnail

    Other Changes:

    • Update pip-tools version in the README's pre-commit examples (#1701). Thanks @Kludex
    • Document use of the backtracking resolver (#1718). Thanks @maxfenv
    • Use HTTPS in a readme link (#1716). Thanks @Arhell
  • 6.9.0(Oct 5, 2022)

    Features:

    • Add --all-extras flag to pip-compile (#1630); see the sketch after this list. Thanks @apljungquist
    • Support Exclude Package with custom unsafe packages (#1509). Thanks @hmc-cs-mdrissi
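
    A minimal sketch of the --all-extras flag above, assuming a project that declares its dependencies and extras in pyproject.toml:

    $ pip-compile --all-extras pyproject.toml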

    Bug Fixes:

    • Fix compile cached vcs packages (#1649). Thanks @atugushev
    • Include py.typed in wheel file (#1648). Thanks @FlorentJeannot

    Other Changes:

    • Add pyproject.toml & modern packaging to introduction. (#1668). Thanks @hynek
  • 6.8.0(Jun 30, 2022)

  • 6.7.0(Jun 28, 2022)

    Features:

    • Support for the importlib.metadata metadata implementation (#1632). Thanks @richafrank

    Bug Fixes:

    • Instantiate a new accumulator InstallRequirement for combine_install_requirements output (#1519). Thanks @richafrank

    Other Changes:

    • Replace direct usage of the pep517 module with the build module, for loading project metadata (#1629). Thanks @AndydeCleyre
  • 6.6.2(May 23, 2022)

  • 6.6.1(May 13, 2022)

  • 6.6.0(Apr 6, 2022)

    Features:

    • Add support for pip>=22.1 (#1607). Thanks @atugushev

    Bug Fixes:

    • Ensure pip-compile --dry-run --quiet still shows what would be done, while omitting the dry run message (#1592). Thanks @AndydeCleyre
    • Fix --generate-hashes when hashes are computed from files (#1540). Thanks @RazerM
  • 6.5.1(Feb 8, 2022)

  • 6.5.0(Feb 4, 2022)

  • 6.4.0(Oct 12, 2021)

  • 6.3.1(Oct 8, 2021)

    Bug Fixes:

    • Ensure pip-tools unions dependencies of multiple declarations of a package with different extras (#1486). Thanks @richafrank
    • Allow comma-separated arguments for --extra (#1493). Thanks @AndydeCleyre
    • Improve clarity of help text for options supporting multiple (#1492). Thanks @AndydeCleyre
  • 6.3.0(Sep 21, 2021)

    Features:

    • Enable single-line annotations with pip-compile --annotation-style=line (#1477); see the sketch after this list. Thanks @AndydeCleyre
    • Generate PEP 440 direct reference whenever possible (#1455). Thanks @FlorentJeannot
    • PEP 440 Direct Reference support (#1392). Thanks @FlorentJeannot
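
    A minimal sketch of the single-line annotation style above (requirements.in is just an example input file):

    $ pip-compile --annotation-style=line requirements.in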

    Bug Fixes:

    • Change log level of hash message (#1460). Thanks @plannigan
    • Allow passing --no-upgrade option (#1438). Thanks @ssbarnea
  • 6.2.0(Jun 22, 2021)

    Features:

    • Add --emit-options/--no-emit-options flags to pip-compile (#1123). Thanks @atugushev
    • Add --python-executable option for pip-sync (#1333). Thanks @MaratFM
    • Log which python version was used during compile (#828). Thanks @graingert

    Bug Fixes:

    • Fix pip-compile package ordering (#1419). Thanks @adamsol
    • Add --strip-extras option to pip-compile for producing constraint compatible output (#1404). Thanks @ssbarnea
    • Fix click v7 version_option compatibility (#1410). Thanks @FuegoFro
    • Pass package_name explicitly in click.version_option decorators for compatibility with click>=8.0 (#1400). Thanks @nicoa

    Other Changes:

    • Document updating requirements with pre-commit hooks (#1387). Thanks @microcat49
    • Add setuptools and wheel dependencies to the setup.cfg (#889). Thanks @jayvdb
    • Improve instructions for new contributors (#1394). Thanks @FlorentJeannot
    • Better explain role of existing requirements.txt (#1369). Thanks @mikepqr
  • 6.1.0(Apr 14, 2021)

    Features:

    • Add support for pyproject.toml or setup.cfg as input dependency file (PEP-517) for pip-compile (#1356). Thanks @orsinium
    • Add pip-compile --extra option to specify extras_require dependencies (#1363). Thanks @orsinium
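
    A minimal sketch of the --extra option; the extra names dev and docs are hypothetical:

    $ pip-compile --extra dev --extra docs pyproject.toml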

    Bug Fixes:

    • Restore ability to set compile cache with env var PIP_TOOLS_CACHE_DIR (#1368). Thanks @AndydeCleyre
  • 6.0.1(Mar 15, 2021)

  • 6.0.0(Mar 13, 2021)

    Backwards Incompatible Changes:

    • Remove support for EOL Python 3.5 and 2.7 (#1243). Thanks @jdufresne
    • Remove deprecated --index/--no-index option from pip-compile (#1234). Thanks @jdufresne

    Features:

    • Use pep517 to parse dependencies metadata from setup.py (#1311). Thanks @astrojuanlu

    Bug Fixes:

    • Fix a bug where pip-compile with setup.py would not include dependencies with environment markers (#1311). Thanks @astrojuanlu
    • Prefer === over == when generating requirements.txt if a dependency was pinned with === (#1323). Thanks @IceTDrinker
    • Fix a bug where pip-compile with setup.py in nested folder would generate setup.txt output file (#1324). Thanks @peymanslh
    • Write out default index when it is provided as --extra-index-url (#1325). Thanks @fahrradflucht

    Dependencies:

    • Bump pip minimum version to >= 20.3 (#1340). Thanks @atugushev
  • 5.5.0(Dec 30, 2020)

    Features:

    • Add Python 3.9 support (1222). Thanks @jdufresne
    • Improve formatting of long "via" annotations (1237). Thanks @jdufresne
    • Add --verbose and --quiet options to pip-sync (1241). Thanks @jdufresne
    • Add --no-allow-unsafe option to pip-compile (1265). Thanks @jdufresne

    Bug Fixes:

    • Restore PIP_EXISTS_ACTION environment variable to its previous state when resolving dependencies in pip-compile (1255). Thanks @jdufresne

    Dependencies:

    • Remove six dependency in favor of pip's vendored six (1240). Thanks @jdufresne

    Improved Documentation:

    • Add pip-requirements.el (for Emacs) to useful tools to README (#1244). Thanks @jdufresne
    • Add supported Python versions to README (#1246). Thanks @jdufresne
  • 5.4.0(Nov 21, 2020)

    Features:

    • Add pip>=20.3 support (1216). Thanks @atugushev and @AndydeCleyre
    • Exclude --no-reuse-hashes option from «command to run» header (1197). Thanks @graingert

    Dependencies:

    • Bump pip minimum version to >= 20.1 (1191). Thanks @atugushev and @AndydeCleyre
  • 5.3.1(Jul 31, 2020)

  • 5.3.0(Jul 26, 2020)

    Features:

    • Add -h alias for --help option to pip-sync and pip-compile (1163). Thanks @jan25
    • Add pip>=20.2 support (1168). Thanks @atugushev
    • pip-sync now exits with code 1 on --dry-run (1172). Thanks @francisbrito
    • pip-compile now doesn't resolve constraints from -c constraints.txt that are not (yet) requirements (1175). Thanks @clslgrnc
    • Add --reuse-hashes/--no-reuse-hashes options to pip-compile (1177). Thanks @graingert
  • 5.2.1(Jun 9, 2020)

  • 5.2.0(May 27, 2020)

    Features:

    • Show basename of URLs when pip-compile generates hashes in verbose mode (1113). Thanks @atugushev
    • Add --emit-index-url/--no-emit-index-url options to pip-compile (1130). Thanks @atugushev

    Bug Fixes:

    • Fix a bug where pip-compile would ignore some package versions when PIP_PREFER_BINARY is set (1119). Thanks @atugushev
    • Fix leaked URLs with credentials in the debug output of pip-compile (1146). Thanks @atugushev
    • Fix a bug where URL requirements would have name collisions (1149). Thanks @geokala

    Deprecations:

    • Deprecate --index/--no-index in favor of --emit-index-url/--no-emit-index-url options in pip-compile (1130). Thanks @atugushev

    Other Changes:

    • Switch to setuptools declarative syntax through setup.cfg (1141). Thanks @jdufresne
  • 5.1.2(May 5, 2020)

  • 5.1.1(May 1, 2020)

  • 5.1.0(Apr 27, 2020)

    Features:

    • Show progress bar when downloading packages in pip-compile verbose mode (#949). Thanks @atugushev
    • pip-compile now gets hashes from PyPI JSON API (if available) which significantly increases the speed of hashes generation (#1109). Thanks @atugushev
  • 5.0.0(Apr 16, 2020)

    Backwards Incompatible Changes:

    • pip-tools now requires pip>=20.0 (previously 8.1.x - 20.0.x). Windows users, make sure to use python -m pip install pip-tools to avoid issues with pip self-update from now on (#1055). Thanks @atugushev
    • --build-isolation option now set on by default for pip-compile (#1060). Thanks @hramezani

    Features:

    • Exclude requirements with non-matching markers from pip-sync (#927). Thanks @AndydeCleyre
    • Add pre-commit hook for pip-compile (#976). Thanks @atugushev
    • pip-compile and pip-sync now pass anything provided to the new --pip-args option on to pip (#1080). Thanks @AndydeCleyre
    • pip-compile output headers are now more accurate when -- is used to escape filenames (#1080). Thanks @AndydeCleyre
    • Add pip>=20.1 support (#1088). Thanks @atugushev

    Bug Fixes:

    • Fix a bug where editables that are both direct requirements and constraints wouldn't appear in pip-compile output (#1093). Thanks @richafrank
    • pip-compile now sorts format controls (--no-binary/--only-binary) to ensure consistent results (#1098). Thanks @richafrank

    Improved Documentation:

    • Add cross-environment usage documentation to README (#651). Thanks @vphilippon
    • Add versions compatibility table to README (#1106). Thanks @atugushev
  • 4.5.1(Feb 26, 2020)

    Bug Fixes:

    • Strip line number annotations such as "(line XX)" from file requirements, to prevent diff noise when modifying input requirement files (#1075). Thanks @adamchainz

    Improved Documentation:

    • Updated README example outputs for primary requirement annotations (#1072). Thanks @richafrank