
support multiple requirements in / txt #532

Open
graingert opened this issue Jun 14, 2017 · 17 comments
Labels
feature Request for a new feature needs discussion Need some more discussion

Comments

@graingert
Member

graingert commented Jun 14, 2017

I've got a directory of various requirements(?P<category>-[^.]*)?.in files, and I want to generate the corresponding f'requirements{category}.txt' files in one command.

@graingert
Member Author

Also, as it happens, I've got a bunch of these directories nested. I'd like to update all the f'requirements{category}.txt' files in one shot.
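The one-shot, nested update described above could be sketched roughly like this (an illustrative sketch only — `find_requirement_sources` is a hypothetical helper, not part of pip-tools):

```python
from pathlib import Path

def find_requirement_sources(root="."):
    """Recursively map each requirements*.in file to its .txt output.

    Covers both the flat case (requirements-dev.in next to requirements.in)
    and nested directories. Sketch only; pip-compile itself processes one
    input at a time.
    """
    return {
        src: src.with_suffix(".txt")
        for src in sorted(Path(root).rglob("requirements*.in"))
    }
```

Feeding each pair to `pip-compile -o <txt> <in>` (e.g. via `subprocess.run`) would then cover both requests in one command.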

@merwok

merwok commented Jul 11, 2017

+1! I find it’s very easy to have requirements split into runtime / test / deployment / CI files, and (sometimes) cpython vs pypy. I’ve been using a makefile to generate files but it’s not terribly user-friendly.

@tysonclugg
Contributor

How about using GNU make with the following Makefile:

requirements%.txt: requirements%.in
	pip-compile -o "$@" "$<"

.PHONY: requirements
requirements: $(patsubst %.in,%.txt,$(wildcard requirements*.in))

Then you run make requirements, job done.

@merwok

merwok commented May 3, 2018

Well makefiles are somewhat awful to write and don’t work on all OSes.

I used to have them then switched to small shell scripts.

@tysonclugg
Contributor

@merwok I guess my point was that your workflow may be better supported by existing build tools, rather than expecting each tool to be a swiss army knife supporting all possible use cases. I'm -1 on this since you could indeed write a shell script to handle your use case.

@vphilippon - what are your thoughts?

@rpkilby

rpkilby commented May 3, 2018

Just my two cents, but I'm -1 on this in its current form. This issue assumes that the .in file paths directly relate to the output requirements .txt file paths, which isn't always going to be the case. E.g., the output requirements.txt may be located in the package's root directory, with the .in files in some subdirectory.

Instead I'd rather see a pip-recompile (or a --recompile flag) that does the following:

  • walk a directory tree to find pip-compiled requirements .txt files
  • parse each requirements file for the original pip-compile command that's embedded at the top
  • execute those pip-compile commands.

Basically, we don't need to assume what the desired output is, this information is already present in the output requirements files.
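The recompile idea above could be sketched as follows. Note this is a hedged sketch, not an existing pip-tools feature: the header wording varies between pip-tools versions, files compiled with `--no-header` carry no command at all, and `recompile_all` assumes `pip-compile` is on PATH.

```python
import re
import subprocess
from pathlib import Path

# pip-compile embeds the command that produced the file in a header, e.g.:
#    # This file is autogenerated by pip-compile ...
#    #    pip-compile --output-file=requirements.txt requirements.in
HEADER_CMD = re.compile(r"^#\s+(pip-compile\b.*)$")

def find_compile_commands(root="."):
    """Yield (txt_path, command) for compiled files whose header still
    contains the original pip-compile invocation."""
    for txt in sorted(Path(root).rglob("*.txt")):
        for line in txt.read_text().splitlines()[:10]:  # header lines only
            match = HEADER_CMD.match(line)
            if match:
                yield txt, match.group(1)
                break

def recompile_all(root="."):
    """Re-run every embedded pip-compile command from its file's directory."""
    for txt, command in find_compile_commands(root):
        subprocess.run(command.split(), cwd=txt.parent, check=True)
```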

@IvanAnishchuk
Member

IvanAnishchuk commented Dec 6, 2018

I'd also add some support for multiple sections in setup.py/setup.cfg (install_requires, tests_require, multiple extras_require) translating them (loosely) into in-files with additional constraints added for files already compiled (so that resolved versions are not conflicting). I guess the recompile approach with parsing output files should work for that as well as long as the order of compilation is preserved. Perhaps we could add some information on that into the same file header?

I originally posted a longer comment in #492 but that was before I read this discussion. I would like to somehow achieve both better setup.py parsing and multiple-in-files compilation as parts of the same generalized workflow.

UPD: I'm against using a makefile, I think this shouldn't depend on anything other than python and pip-tools. It could be a separate package, of course, but I think dependency layering is a common-enough practice that could benefit many pip-tools users if not most of them.

UPD2: We probably don't even need to add any custom header. If we have explicit -c lines, we can use them to establish the dependency order of the files. That's it: we just need the right way to add -c lines to inter-dependent in-files (which would be automated for setup.cfg) and a recompile script to go over the txt files and recompile them in the correct order. And if the txt files are not yet present, we would just make some safe-enough default assumptions and allow the user to run pip-compile manually for more granular control. Easy peasy, I guess I'll just start implementing it and stop bothering you guys :)
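Deriving the compile order from -c lines, as suggested above, might look like this (an illustrative sketch; `compile_order` is hypothetical and assumes each `-c somefile.txt` constraint corresponds to a sibling `somefile.in`):

```python
import re
from graphlib import TopologicalSorter  # stdlib since Python 3.9
from pathlib import Path

CONSTRAINT = re.compile(r"^-c\s+(\S+)")

def compile_order(in_files):
    """Order .in files so that each one is compiled after any file whose
    compiled .txt output it pulls in via a '-c' constraint line."""
    paths = {Path(f) for f in in_files}
    graph = {path: set() for path in paths}
    for path in paths:
        for line in path.read_text().splitlines():
            match = CONSTRAINT.match(line.strip())
            if match:
                # '-c requirements.txt' implies requirements.in compiles first
                dep = (path.parent / match.group(1)).with_suffix(".in")
                if dep in paths:
                    graph[path].add(dep)
    return list(TopologicalSorter(graph).static_order())
```

`TopologicalSorter` also raises `CycleError` for free if two files constrain each other, which is exactly the situation a recompile script should refuse to guess about.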

@merwok

merwok commented Dec 11, 2018

FTR I have found https://pypi.org/project/pip-compile-multi/

@IvanAnishchuk
Member

IvanAnishchuk commented Dec 27, 2018

Thank you @merwok, that seems like exactly the tool I need. Except, like many others, it isn't aware of constraint files. But maybe I can at least modify it instead of implementing my own from scratch.

@chaoflow
Contributor

chaoflow commented Apr 3, 2019

@IvanAnishchuk Have you found a solution? Please see also peterdemin/pip-compile-multi#128 (comment)

@IvanAnishchuk
Member

@chaoflow My current approach is to use two files: requirements.in and requirements-dev.in, with -c requirements.txt in the second file. They have to be compiled in that order (I use a helper script for that), and after compiling you can install everything with pip install -r requirements.txt -r requirements-dev.txt (or just skip the dev file for production environments). A setup.cfg-based solution would probably be somewhat nicer, but this works fine for my needs and doesn't require anything other than pip-tools. You can also compile it with pip-compile -o requirements.txt requirements.in requirements-dev.in to get a single txt file.

You can read more about constraint files in pip documentation.

@atugushev atugushev added feature Request for a new feature needs discussion Need some more discussion labels Sep 20, 2019
@AndydeCleyre
Contributor

AndydeCleyre commented Nov 9, 2019

FWIW my helper Zsh scripts for pip-tools work on all <category>-requirements.* files in the current directory when no individual files are specified, but do not currently do any recursive searching (except for the pypc function, which populates pyproject.toml dependencies). For example:

# compile requirements.txt files from all found or specified requirements.in files (compile)
pipc () {  # [reqs-in...]
    for reqsin in ${@:-*requirements.in(N)}; do
        print -rP "%F{cyan}> compiling $reqsin -> ${reqsin:r}.txt . . .%f"
        pip-compile --no-header $reqsin 2>&1
    done
}

# install packages according to all found or specified requirements.txt files (sync)
pips () {  # [reqs-txt...]
    local reqstxts=(${@:-*requirements.txt(N)})
    if [[ $reqstxts ]]; then
        print -rP "%F{cyan}> syncing env <- $reqstxts . . .%f"
        pip-sync $reqstxts
        for reqstxt in $reqstxts; do  # can remove if https://github.com/jazzband/pip-tools/issues/896 is resolved (by merging https://github.com/jazzband/pip-tools/pull/907)
            pip install -qr $reqstxt  # AND
        done                          # https://github.com/jazzband/pip-tools/issues/925 is resolved (by merging https://github.com/jazzband/pip-tools/pull/927)
    fi
}

@nolar

nolar commented Jun 3, 2020

A little addition to @IvanAnishchuk's solution: atomic compilation is still needed. Here is why:

# requirements.in
requests
# requirements-dev.in
-c requirements.txt
moto

When we compile the first file (which is the runtime dependencies), we get:

# requirements.txt
………
idna==2.9                 # via requests
requests==2.23.0          # via -r requirements.in
………

Then we compile the 2nd file (dev/CI dependencies):

# requirements-dev.txt
………
idna==2.8                 # via moto, requests
moto==1.3.14              # via -r requirements-dev.in
requests==2.23.0          # via docker, moto, responses
………

Here comes the problem: moto is a library that has had no releases since November 2019. Its latest release restricts idna<2.9,>=2.5 (link).

So, the latest idna suitable for moto is 2.8. However, we have 2.9 in the compiled runtime list, which was unaware of the dev dependencies.

Now, when we do

pip install -r requirements.txt -r requirements-dev.txt

We fail:

ERROR: Double requirement given: idna==2.8 (from -r requirements-dev.txt (line 23)) (already in idna==2.9 (from -r requirements.txt (line 9)), name='idna')

The 1st file should have been compiled with the 2nd file in mind somehow. But adding moto and all its sub-dependencies (a lot!) to the runtime file is undesired. Adding -c requirements-dev.in to requirements.in does not work (no effect).

@AndydeCleyre
Contributor

@nolar I think you want -r requirements.in in your dev file, and -c dev-requirements.txt (not .in) in your main file.

@nolar
Copy link

nolar commented Jun 3, 2020

@AndydeCleyre That would be some kind of circular dependency, which is hard to compile. Effectively, requirements.in would be forever pinned to the dev versions. I'm not sure the dev txt file would be upgradeable in that case. But I will try.

@AndydeCleyre
Contributor

My Zsh frontend functions for pip-tools (linked above) have come a long way, and the interface seems pretty stable now. Three relevant functions from the project:

# Compile, then sync.
# Use -h to include hashes, -u dep1,dep2... to upgrade specific dependencies, and -U to upgrade all.
pipcs [-h] [-U|-u <pkgspec>[,<pkgspec>...]] [--only-sync-if-changed] [<reqs-in>...] [-- <pip-compile-arg>...]

# 'pipcs -U' (upgrade-compile, sync) in a venv-activated subshell for the current or specified folders.
# Use --all to instead act on all known projects, or -i to interactively choose.
pipup [--py 2|pypy|current] [--only-sync-if-changed] [--all|-i|<proj-dir>...]

# Compile requirements.txt files from all found or specified requirements.in files (compile).
# Use -h to include hashes, -u dep1,dep2... to upgrade specific dependencies, and -U to upgrade all.
pipc [-h] [-U|-u <pkgspec>[,<pkgspec>...]] [<reqs-in>...] [-- <pip-compile-arg>...]

@webknjaz
Member

webknjaz commented Jul 8, 2023

Just realized we've been discussing something similar @ #826 (comment). Though, I feel like it's hard to come up with a good UX for multiple source-to-output mappings. Maybe with the config (#604, #1863), it could be bearable.


10 participants