pip-compile --extra doesn't take into account already pinned versions #1891
Comments
Is it correct that you're pinning versions in […]? Currently […]. If I understand right, then at least a workaround would be to […]. If you were using […]
Related: #1364
Yes. Essentially, I was trying to achieve the same "Layered workflow" available with […]
Digging into the issue, I found that pip parses requirements from […]. I suppose this can be fixed by calling […]
I'm trying to move away from `*.in` files and manage everything from `pyproject.toml`, but I'm facing a problem where `pip-compile --extra` doesn't take into account the already pinned versions from `pip-compile` (without `--extra`).

Environment Versions

Steps to replicate

1. Given this `pyproject.toml`:
2. Run `pip-compile --no-header --no-annotate --resolver backtracking` to obtain:
3. Change the `django` dependency in the `pyproject.toml`, and verify that `pip-compile` doesn't update any package.
4. Run `pip-compile --no-header --no-annotate --resolver backtracking --extra dev -o dev-requirements.txt`, and see that the `django` dependency has been updated in the `dev-requirements.txt`:
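For concreteness, the `pyproject.toml` referenced in step 1 (whose contents were lost from this report) might look like the following minimal sketch; the package names and layout here are illustrative assumptions, not taken from the original issue:

```toml
# Hypothetical example, not the reporter's actual file.
[project]
name = "example"
version = "0.1.0"
# Main dependency; `pip-compile` pins this into requirements.txt.
dependencies = [
    "django",
]

# Dev-only dependencies, compiled with `pip-compile --extra dev`.
[project.optional-dependencies]
dev = [
    "pytest",
]

[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"
```

With a layout like this, the issue is that compiling the `dev` extra re-resolves `django` instead of keeping the version already pinned in `requirements.txt`.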
I also tried to pass `--pip-args "-c requirements.txt"`, but nothing changed.

Expected result

Actually, I would expect a way to pin only the dev dependencies, since the main dependencies were already pinned: some workflow similar to the "Workflow for layered requirements" but with `pyproject.toml` instead of `*.in` files.
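The expectation above can be stated concretely: a layered `dev-requirements.txt` should re-use, not re-resolve, every pin already present in `requirements.txt`. A small sketch of that invariant, using hypothetical helper functions that are not part of pip-tools:

```python
# Hypothetical helpers illustrating the expectation in this issue: every
# `name==version` pin in requirements.txt should reappear unchanged in
# dev-requirements.txt, which should only add the dev-only packages.
# Only exact `==` pins are handled; other specifiers are out of scope here.

def parse_pins(requirements_text):
    """Map package name -> pinned version from 'name==version' lines."""
    pins = {}
    for line in requirements_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        name, _, version = line.partition("==")
        pins[name.strip().lower()] = version.strip()
    return pins

def conflicting_pins(base_text, layered_text):
    """Return packages whose pinned version differs between the two layers."""
    base = parse_pins(base_text)
    layered = parse_pins(layered_text)
    return {
        name: (base[name], layered[name])
        for name in base
        if name in layered and layered[name] != base[name]
    }
```

For example, `conflicting_pins("django==4.1.3", "django==4.1.7\npytest==7.2.0")` flags `django` as re-pinned, which is exactly the behavior this issue reports when running `pip-compile --extra dev`.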