app: Update Forall command to allow multiple concurrent processes #755

Open: wants to merge 3 commits into main from async-command

Conversation

@pdgendt (Collaborator) commented Oct 16, 2024

Demonstrate asynchronous behavior for the Forall command and add an argument to select the number of jobs.

The same idea can be applied to the Update command.

@pdgendt (Collaborator, Author) commented Oct 16, 2024

For comparison, here are timings for counting lines of code with cloc on a machine with 20 logical cores:

west forall -c "cloc --quiet ."  103.21s user 5.00s system 70% cpu 2:34.15 total
west forall -j -c "cloc --quiet ."  148.83s user 5.32s system 472% cpu 32.600 total

@pdgendt pdgendt force-pushed the async-command branch 2 times, most recently from e120eb3 to 9592b45 Compare October 16, 2024 11:27
@pdgendt pdgendt requested a review from marc-hb October 16, 2024 11:31
@pdgendt pdgendt marked this pull request as ready for review October 17, 2024 07:30
@marc-hb (Collaborator) left a comment

This looks really cool, thanks for designing and testing this! I'm afraid we still have the issue of concurrent outputs though, see below.

self.banner(f'running "{args.subcommand}" in {project.name_and_path}:')
proc = await asyncio.create_subprocess_shell(args.subcommand,
                                             cwd=cwd, env=env, shell=True)
return await proc.wait()
@marc-hb (Collaborator) commented Oct 17, 2024

I think you need to capture the standard output of each subprocess here, so the outputs don't interleave randomly and become really hard to read. This could even turn into a terminal disaster if they use --color ANSI codes. I think we already touched on this question in #713 and before.

You probably tested this with relatively "quiet" and easy to read output and a reasonable number of threads... can you try again after cranking it all up? This code must be prepared to handle not just the "common" cases but all cases.

You could use the usual (out, err) = await proc.communicate(); that avoids concurrent terminal output from different subprocesses, but it assumes the process output is already line-based:
https://docs.python.org/3/library/asyncio-subprocess.html

So it's probably safer to use some readline() variation instead. I found a couple of examples that seem relevant:
https://kevinmccarthy.org/2016/07/25/streaming-subprocess-stdin-and-stdout-with-asyncio-in-python/
https://stackoverflow.com/questions/2804543/read-subprocess-stdout-line-by-line

The real icing on the cake would be an option that prefixes each line of the output with the project name! BTW:

Eventually, that parallelization and output-capture code should be generic enough to be reused by all commands, not just forall! And especially west update, where it is ... awaited (pun intended) the most (#713 etc.)
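
For illustration, a minimal self-contained sketch of the readline()-style capture with a per-line project-name prefix discussed above; the helper name and toy usage are assumptions for this sketch, not code from the PR:

import asyncio

async def run_prefixed(project_name, command, cwd=None, env=None):
    # Spawn the command with its output captured, merging stderr into stdout.
    proc = await asyncio.create_subprocess_shell(
        command, cwd=cwd, env=env,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.STDOUT)
    # The StreamReader yields complete lines; because the event loop runs in
    # a single thread, each print() below emits a whole line at once, so
    # lines from different projects cannot interleave mid-line.
    async for raw in proc.stdout:
        print(f'{project_name}: {raw.decode(errors="replace").rstrip()}')
    return await proc.wait()

# Toy usage (POSIX shell commands): two chatty commands running concurrently.
async def main():
    await asyncio.gather(run_prefixed('proj-a', 'echo hello; echo world'),
                         run_prefixed('proj-b', 'echo bonjour'))

asyncio.run(main())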

@pdgendt (Collaborator, Author)

Yes, I didn't want to dive right in here, but it would be nice to get it right from the get-go.

I was dabbling with the idea of having the output behave like ninja, where the output line is replaced with the banner and maybe a counter indicating progress. And when a subprocess is done, print its output as-is.

Not sure whether this would require something like curses; I think it could be done with something more lightweight.
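
A rough sketch of just the "print its output as-is when the subprocess is done" part, leaving the ninja-style progress line aside; the helper below is hypothetical and not part of this PR:

import asyncio

async def run_buffered(project_name, command, cwd=None, env=None):
    # Capture everything; nothing is printed while the subprocess runs.
    proc = await asyncio.create_subprocess_shell(
        command, cwd=cwd, env=env,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.STDOUT)
    out, _ = await proc.communicate()
    # Print the banner and the captured output in one go once the subprocess
    # has finished; there is no await between the prints, so output from
    # different projects cannot interleave here.
    print(f'=== {project_name} (exit code {proc.returncode}) ===')
    print(out.decode(errors='replace'), end='')
    return proc.returncode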

@marc-hb (Collaborator)

I was dabbling with the idea of having the output behave like ninja, where the output line is replaced with the banner and maybe a counter indicating progress. And when a subprocess is done, print its output as-is

That would be awesome, but I think just 1) making sure the output is readable and 2) having all commands use the same output "framework" would already be a major milestone and a great stepping stone towards something better. And it would give a lot of users what they have been waiting for: concurrency at last.

@marc-hb (Collaborator) left a comment

Important process output questions to address.

continue

async def run_for_project(self, project, args, sem):
    async with sem:
@marc-hb (Collaborator)

I think sem is really too short. Also, it's only used twice, so the abbreviation does not save much.

Suggested change:
-    async with sem:
+    async with semaphore:
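
For context, a minimal self-contained sketch of how an asyncio.Semaphore typically bounds the number of concurrent jobs in this kind of code; the names and the toy commands are illustrative only, not the PR's:

import asyncio

async def run_all(commands, jobs):
    # At most `jobs` subprocesses run concurrently; the rest wait on the
    # semaphore before being spawned.
    semaphore = asyncio.Semaphore(jobs)

    async def run_one(command):
        async with semaphore:
            proc = await asyncio.create_subprocess_shell(command)
            return await proc.wait()

    return await asyncio.gather(*(run_one(c) for c in commands))

# Example: run three commands, at most two at a time.
asyncio.run(run_all(['echo a', 'echo b', 'echo c'], jobs=2))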


cmd('update net-tools Kconfiglib')

# print order is no longer guaranteed when there are multiple projects
@marc-hb (Collaborator) commented Oct 17, 2024

Then the banners don't make sense anymore and should not be printed when -j > 1.

It's more complicated...

@@ -1670,16 +1671,15 @@ def do_add_parser(self, parser_adder):
        parser.add_argument('projects', metavar='PROJECT', nargs='*',
                            help='''projects (by name or path) to operate on;
                            defaults to active cloned projects''')
        parser.add_argument('-j', '--jobs', nargs='?', const=-1,
@marc-hb (Collaborator)

Suggested change:
-        parser.add_argument('-j', '--jobs', nargs='?', const=-1,
+        # Default to 1 when `-j` is not given because there is no way to
+        # know whether the user's commands can safely run at the same time.
+        parser.add_argument('-j', '--jobs', nargs='?', const=-1,

Eventually, west grep, west update and others could default to cpu_count() if everything goes well, but I think forall should always default to 1.

(such a comment also helps a bit with the peculiar default+const argparse idiom)
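
For reference, a small standalone sketch of that default+const idiom, and of how -1 might later be mapped to the CPU count; the default=1 shown here follows the suggestion above and is an assumption, not necessarily what the PR does:

import argparse
import os

parser = argparse.ArgumentParser()
# Three cases:
#   (no -j)  -> args.jobs == 1   (default: sequential, the safe choice)
#   -j       -> args.jobs == -1  (const: "as many as the machine allows")
#   -j 4     -> args.jobs == 4
parser.add_argument('-j', '--jobs', nargs='?', const=-1, default=1, type=int)

args = parser.parse_args(['-j'])
jobs = (os.cpu_count() or 1) if args.jobs == -1 else args.jobs
print(jobs)  # number of logical CPUs on this machine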

Allow passing a custom end to banner methods. This is useful for example
to print a carriage return.

Signed-off-by: Pieter De Gendt <[email protected]>
Demonstrate asynchronous behavior for the Forall command and add an
argument to select the number of jobs.

Signed-off-by: Pieter De Gendt <[email protected]>
Add test cases for running the forall command with multiple processes.

Signed-off-by: Pieter De Gendt <[email protected]>
@marc-hb (Collaborator) left a comment

You probably tested this with relatively "quiet" and easy to read output and a reasonable number of threads... can you try again after cranking it all up? This code must be prepared to handle not just the "common" cases but all cases.

OK, I just had a test idea, and this PR was pretty easy to "break" after all. The following one-liner prints readable output with -j 1 and totally jumbled output with -j > 1:

west forall -j 1 -c 'i=8; while test $((i--)) -ge 0;
 do printf " $WEST_PROJECT_NAME"; sleep 0.1; done; printf "\n"'

It would be great to "upgrade" a test like this and make it part of the actual test suite. I don't know how we could make it portable to Windows... by converting it to Python maybe? Or maybe we don't need to? I think it would be OK to have some tests skipped on some platforms; there would still be value in that.

Don't get me wrong: we DO need some tests that exercise the terminal on Windows. Windows terminal issues were the reason for the 7223431 revert and that whole saga. But maybe not all tests need to run on all platforms.
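
One possible shape for a portable version, sketched here as an assumption rather than an actual test from the suite: put the chatty payload in a small Python script so nothing shell-specific is involved, then let west forall run it. WEST_PROJECT_NAME is the environment variable forall already exports to each subprocess.

import os
import subprocess
import sys
import textwrap

# Hypothetical payload: slow, chatty output that exposes interleaving when
# several projects run at once.
payload = textwrap.dedent('''\
    import os, sys, time
    name = os.environ.get('WEST_PROJECT_NAME', '?')
    for _ in range(8):
        sys.stdout.write(' ' + name)
        sys.stdout.flush()
        time.sleep(0.1)
    print()
''')

script = os.path.abspath('chatty.py')
with open(script, 'w') as f:
    f.write(payload)

# With -j 1 each project's names come out grouped; with -j > 1 an
# unsynchronized implementation interleaves them mid-line.
subprocess.run(['west', 'forall', '-j', '10', '-c',
                f'"{sys.executable}" "{script}"'], check=True)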

@pdgendt (Collaborator, Author) commented Oct 18, 2024

The updated version is better, as all printing is done from a single thread. I tried to overwrite the "running" line, but it doesn't clear it.

@marc-hb (Collaborator) commented Oct 18, 2024

Colorization was surprisingly more robust than I thought (mainly due to a lack of "toggles"), but I managed to craft a test that broke your earlier version. You just fixed it before I shared the test :-) I'm sharing the test code anyway because I think it's still useful:

west forall -j 10 -c 'i=7; while test $((--i)) -ge 0;
 do printf "\e[$((30+i))m"; sleep 0.0${RANDOM}; printf " $WEST_PROJECT_NAME-$i"; done; printf "\e[0m\n"'
[Screenshot 2024-10-18 at 10:14:13]

@marc-hb marc-hb dismissed their stale review October 18, 2024 17:37

need time to review the new version

@pdgendt (Collaborator, Author) commented Oct 21, 2024

Colorization was surprisingly more robust than I thought (mainly due to a lack of "toggles"), but I managed to craft a test that broke your earlier version. You just fixed it before I shared the test :-) I'm sharing the test code anyway because I think it's still useful:

west forall -j 10 -c 'i=7; while test $((--i)) -ge 0;
 do printf "\e[$((30+i))m"; sleep 0.0${RANDOM}; printf " $WEST_PROJECT_NAME-$i"; done; printf "\e[0m\n"'

Which shell do you use? Doesn't work properly with zsh.

@marc-hb (Collaborator) commented Oct 21, 2024

Which shell do you use? Doesn't work properly with zsh.

The default/standard one: bash. Just type bash to switch temporarily; this likely won't affect the Python side.

Should be compatible with all Bourne shells: ash, ksh, etc.

Strange that zsh went its own way...

@marc-hb (Collaborator) commented Oct 21, 2024

Which shell do you use? Doesn't work properly with zsh.

In zsh, try emulate ksh; this should get close enough to POSIX.
https://zsh.sourceforge.io/Doc/Release/Invocation.html#Compatibility
https://pubs.opengroup.org/onlinepubs/9699919799/utilities/V3_chap02.html

What does not work?

I just checked, and $(( )) is POSIX: https://mywiki.wooledge.org/ArithmeticExpression
On the other hand, ${RANDOM} is most likely a bashism; replace it with 2 or 3.

A must-have for any shell script: https://www.shellcheck.net/ (apt install shellcheck)

@marc-hb (Collaborator) commented Oct 21, 2024

You just made me realize something important...

west forall -h

Runs a shell (on a Unix OS) or batch (on Windows) command
within the repository of each of the specified PROJECTs.

"A shell" is vague. On Linux this is apparently the default shell. Why not but this must be more explicit. Can you please update this help text? Also, is this a login shell or not? It matters: https://superuser.com/questions/183870/difference-between-bashrc-and-bash-profile/

Similar problem with Windows: will west forall ever use a "real" PowerShell depending on the user's configuration, or only the obsolete and horrible .BAT?

In the longer term, there should be a new west forall -i zsh option to choose which interpreter gets used. Of course, for anything longer than 2-3 lines the user should use a separate script / layer of indirection anyway and invoke west forall -c myscript.ps1; quoting becomes impossible otherwise. So that new option is not high priority, but it would be nice to have for interactive use, as just seen above.
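
A rough sketch of what such a hypothetical --interpreter option could boil down to; neither the flag nor the helper exists in west, and the '-c' convention assumed here only fits POSIX-style interpreters:

import asyncio

async def run_command(command, interpreter=None, cwd=None, env=None):
    if interpreter:
        # Bypass the platform shell and hand the command string directly to
        # the chosen interpreter (e.g. 'zsh', 'bash').
        proc = await asyncio.create_subprocess_exec(
            interpreter, '-c', command, cwd=cwd, env=env)
    else:
        # Current behavior: whatever shell the platform and Python pick.
        proc = await asyncio.create_subprocess_shell(
            command, cwd=cwd, env=env)
    return await proc.wait()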

@marc-hb (Collaborator) commented Oct 21, 2024

"A shell" is vague. On Linux this is apparently the default shell. Why not but this must be more explicit. Can you please update this help text?

So the reason this is "vague" is that 1) it's a mess and 2) accordingly, the Python documentation is intentionally vague too: python/cpython#114539

So I think we only need a warning here that defers to the Python documentation. But we do need that warning. Something like: "it's a wildly non-portable and insecure mess; check the Python documentation and don't use this in automation". Something like that.
