tests: add cram wrapper parallel_cram.sh that runs cram tests in parallel #1667

Open · wants to merge 5 commits into base: master

Conversation

@corneliusroemer (Member) commented Nov 9, 2024

Description of proposed changes

Cram tests are perfect for parallelization. Each test is independent and
we have 178 of them. The wrapper script allows running the tests in parallel.

In my experiments on an M1 Pro, total test time was reduced
from 7m30s to 1m23s, a more than 5x speedup. I got the best results
with -j8 (rather than using all 10 cores).

The wrapper accepts many of cram's options, and one can still run the tests of
a single directory, for example `./parallel_cram.sh tests/functional/tree`.
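
Both can be combined, e.g.:

```bash
# 8 parallel jobs, restricted to one test directory (flag and path from the text above).
./parallel_cram.sh -j8 tests/functional/tree
```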

One caveat: it seems that iqtree creates files in the input file directory. As multiple tests use the same input files, they might conflict. So we should copy the input files to a temporary directory before running. This is done in commit 687bd04.

We seem to be getting almost a 2x speedup in CI as well, as GitHub runners come with 2 cores by default. The bottleneck is now the RSV pathogen CI, which takes around 8 minutes (previously the cram tests took 13 min; now they are faster than RSV). Overall GitHub runner time is reduced from around 2h5min to 1h15min, a saving of roughly a third.

I tested the script to ensure that it correctly reports test failures. I've also been using it in my regular work on various PRs and it's worked exactly as expected, saving me waiting time.

Checklist

  • Automated checks pass
  • Check if you need to add a changelog message
  • Check if you need to add tests
  • Check if you need to update docs

@corneliusroemer marked this pull request as ready for review November 9, 2024 21:54
@corneliusroemer (Member, Author) commented:

TODO:

  • Handle the case where a file is passed rather than a directory; currently this results in (a possible fix is sketched below):
    ./parallel_cram.sh -j8 tests/functional/parse.t
    Error: Directory tests/functional/parse.t does not exist
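
A minimal sketch of one way to handle this, assuming the positional arguments are collected into a `TEST_PATHS` array (a hypothetical name, not the script's actual variable):

```bash
# Accept both files and directories as positional arguments (sketch only).
ALL_FILES=()
for path in "${TEST_PATHS[@]}"; do
    if [ -f "$path" ]; then
        ALL_FILES+=("$path")              # a single .t file: run it as-is
    elif [ -d "$path" ]; then
        while IFS= read -r -d '' f; do
            ALL_FILES+=("$f")             # every .t file under the directory
        done < <(find "$path" -name '*.t' -print0)
    else
        echo "Error: $path is neither a file nor a directory" >&2
        exit 1
    fi
done
```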
    

@tsibley (Member) commented Nov 13, 2024

> One caveat: it seems that iqtree creates files in the input file directory. As multiple tests use the same input files, they might conflict. So we should copy the input files to a temporary directory before running. This is done in commit 687bd04.

Our tests are revealing a real user interface issue here. Instead of working around it in tests, perhaps we can fix this input directory pollution for good by relocating the temporary alignment file we're already using

tmp_aln_file = str(Path(aln_file).with_name(Path(aln_file).stem + "-delim.fasta"))

into a temporary directory and thus also avoid having to do this junk

augur/augur/tree.py, lines 304 to 310 in 3f72c40:

if clean_up:
    if os.path.isfile(tmp_aln_file):
        os.remove(tmp_aln_file)
    for ext in [".bionj",".ckp.gz",".iqtree",".mldist",".model.gz",".treefile",".uniqueseq.phy",".model"]:
        if os.path.isfile(tmp_aln_file + ext):
            os.remove(tmp_aln_file + ext)

@tsibley (Member) left a comment:

Parallel testing is great and we should absolutely do it for our Cram tests. Thank you for tackling it!

OTOH, I don't love the additional overhead of maintaining parallel_cram.sh ourselves. It's re-implementing bits of common tooling like parallel and even prove. My preference would be to write a much smaller bit of code to translate cram's output to TAP and then use prove to run the files in parallel. This would not only get us things like slow test priority out-of-the-box, but also recording of statistics and keeping track of that (i.e. "what's slow") over time if we wanted.
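
Not an actual implementation, just a sketch of the shape that could take: a tiny per-file wrapper (call it `cram-tap`, a made-up name) that turns one cram run into a one-test TAP stream, which `prove` can then schedule in parallel:

```bash
#!/bin/bash
# cram-tap (hypothetical): run cram on a single .t file and emit TAP.
set -euo pipefail

file="$1"
echo "1..1"
if output="$(cram "$file" 2>&1)"; then
    echo "ok 1 - $file"
else
    echo "not ok 1 - $file"
    # TAP treats '#'-prefixed lines as diagnostics, so replay cram's output there.
    printf '%s\n' "$output" | sed 's/^/# /'
fi
```

Something like `prove -j8 --exec ./cram-tap tests/functional/cram/*.t` would then provide the parallelism, and `--state=slow,save` the run-slowest-first scheduling mentioned above.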

@@ -0,0 +1,185 @@
#!/bin/bash

No set -euo pipefail, which is a red flag to me.
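
For reference, the preamble being asked for here:

```bash
#!/bin/bash
# Bash "strict mode": exit on any error, on use of unset variables,
# and on failures anywhere in a pipeline.
set -euo pipefail
```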

Comment on lines +4 to +8
VERBOSE=0
KEEP_TMPDIR=0
SHELL_PATH="/bin/bash"
SHELL_OPTS=""
INDENT=2

This is hardcoding things that should be left at defaults, either cram's builtins or our project-level defaults from .cramrc.

Comment on lines +80 to +91
--shell=*)
SHELL_PATH="${1#*=}"
shift
;;
--shell-opts=*)
SHELL_OPTS="${1#*=}"
shift
;;
--indent=*)
INDENT="${1#*=}"
shift
;;

These don't take the common `--opt value` form, only `--opt=value`.
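
A sketch of how the option loop could accept both spellings (same variable names as the quoted snippet):

```bash
# Handle both "--shell=/bin/sh" and "--shell /bin/sh" (sketch only).
while [ $# -gt 0 ]; do
    case "$1" in
        --shell=*)  SHELL_PATH="${1#*=}"; shift ;;
        --shell)    SHELL_PATH="$2";      shift 2 ;;
        --indent=*) INDENT="${1#*=}";     shift ;;
        --indent)   INDENT="$2";          shift 2 ;;
        *) break ;;
    esac
done
```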

Comment on lines +96 to +99
*)
TEST_DIR="$1"
shift
;;

If more than one positional is given, only the last will be used. e.g. you can't do

./parallel_cram.sh tests/functional/cram/{tree,refine}
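
The usual fix is to accumulate positionals into an array instead of a single variable (`TEST_PATHS` is a name made up for this sketch):

```bash
# Keep every positional argument rather than only the last one.
TEST_PATHS=()
while [ $# -gt 0 ]; do
    case "$1" in
        -*) shift ;;                  # option handling elided in this sketch
        *)  TEST_PATHS+=("$1"); shift ;;
    esac
done
# ./parallel_cram.sh tests/functional/cram/{tree,refine} would now keep both paths.
```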

Comment on lines +109 to +116
# Build cram command with options
CRAM_CMD="cram"
[ $QUIET -eq 1 ] && CRAM_CMD="$CRAM_CMD -q"
[ $VERBOSE -eq 1 ] && CRAM_CMD="$CRAM_CMD -v"
[ $KEEP_TMPDIR -eq 1 ] && CRAM_CMD="$CRAM_CMD --keep-tmpdir"
[ -n "$SHELL_PATH" ] && CRAM_CMD="$CRAM_CMD --shell=$SHELL_PATH"
[ -n "$SHELL_OPTS" ] && CRAM_CMD="$CRAM_CMD --shell-opts=$SHELL_OPTS"
[ -n "$INDENT" ] && CRAM_CMD="$CRAM_CMD --indent=$INDENT"

Instead of a string, this should build up an array and then be used as "${CRAM_CMD[@]}" so that it's robust to the multiple applications of word splitting.
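
Roughly, mirroring the flags in the quoted snippet:

```bash
# Build the cram invocation as an array so each flag stays a single argument,
# immune to word splitting. (QUIET, VERBOSE, etc. as set by the option parsing.)
CRAM_CMD=(cram)
[ "$QUIET" -eq 1 ]       && CRAM_CMD+=(-q)
[ "$VERBOSE" -eq 1 ]     && CRAM_CMD+=(-v)
[ "$KEEP_TMPDIR" -eq 1 ] && CRAM_CMD+=(--keep-tmpdir)
[ -n "$SHELL_OPTS" ]     && CRAM_CMD+=(--shell-opts="$SHELL_OPTS")

"${CRAM_CMD[@]}" "$file"
```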

Comment on lines +125 to +126
LOCK_DIR="/tmp/cram_lock_$$"
mkdir "$LOCK_DIR"

This should be a mktemp -d call.
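
i.e. something along the lines of (the trap is an extra suggestion, not part of the original comment):

```bash
# Let mktemp pick a unique, race-free directory; clean it up when the script exits.
LOCK_DIR="$(mktemp -d)"
trap 'rm -rf "$LOCK_DIR"' EXIT
```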

Comment on lines +137 to +163
# Run tests in parallel with specified number of jobs
echo "$ALL_FILES" | xargs -P "$PARALLEL_JOBS" -I {} sh -c '
    file="$1"
    lock_dir="$LOCK_DIR"
    output=$($CRAM_CMD "$file" 2>&1)
    status=$?

    while ! mkdir "$lock_dir/lock" 2>/dev/null; do
        sleep 0.05
    done

    # Record test status
    echo "$status" >> "$RESULTS_FILE"

    if [ $status -eq 0 ]; then
        printf "."
    else
        echo "$output" | grep -v "^# Ran "
    fi

    # If --keep-tmpdir is enabled, capture the temporary directory
    if [ $KEEP_TMPDIR -eq 1 ]; then
        echo "$output" | grep "Kept temporary directory:" >> "$TMPDIR_FILE"
    fi

    rm -rf "$lock_dir/lock"
' - {}

FWIW, this reimplements a bit of functionality in parallel (which is typically more useful than xargs).
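
For comparison, not a drop-in replacement for the whole script, but the core fan-out with GNU parallel (assuming `ALL_FILES` holds one test path per line, as in the quoted code):

```bash
# One cram invocation per test file, 8 at a time; parallel serialises each
# job's output, so no hand-rolled lock directory is needed, and its exit
# status is non-zero if any job failed.
printf '%s\n' "$ALL_FILES" | parallel --jobs 8 --keep-order cram {}
```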

@tsibley (Member) commented Nov 13, 2024

> My preference would be to write a much smaller bit of code to translate cram's output to TAP and then use prove to run the files in parallel.

Or, alternatively, if we don't want to switch to TAP, modify Cram to support parallel testing. This doesn't seem difficult, from a look at the codebase.

@corneliusroemer (Member, Author) commented:

I initially tried parallel but it's less portable. In fact, the brew-installed parallel I have on my system is the moreutils parallel, not GNU parallel. We could work around this by installing GNU parallel in the conda environment.
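
(If we did go that route, something like the line below should pin GNU parallel in the environment; the conda-forge package name `parallel` is an assumption here.)

```bash
# Install GNU parallel via conda-forge (package name assumed).
conda install -c conda-forge parallel
```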

What you suggest @tsibley sounds good but realistically I won't do it.

I'm not sure how much maintenance there will have to be. We could go with this for now, until we have one of the other alternatives you mention, if you feel like you want to do this.

@tsibley (Member) commented Nov 14, 2024

> I initially tried parallel but it's less portable.

I mean, GNU parallel is a single-file (15k line!) Perl program using only the Perl stdlib, compatible with a wide range of Perl versions. It's very easy to ship a copy (and I've done so before ;-). But yes, I get your point.

> What you suggest @tsibley sounds good but realistically I won't do it.

Yep, I'm not surprised.

> I'm not sure how much maintenance there will have to be. We could go with this for now, until we have one of the other alternatives you mention, if you feel like you want to do this.

Going with this "for now" (really, indefinitely) is fine. Don't take my comments to mean this can't merge! There won't be much maintenance, probably, but there will be little bugs here and there (e.g. in the current pass, -v / --verbose doesn't actually work) which will involve dealing with its approach. That's kinda my point: there's more code here than is necessary, and more code you don't need is always a liability.

Out of curiosity, this afternoon I wrote an alternative because I wanted to see what it'd look like with parallel.

Upon writing the above comment this evening, I got curious again and decided to quickly implement the same output-filter approach to produce TAP instead so I could use it with prove. It would still be better (and not difficult, I think) to add TAP output to Cram directly, but as-is even this little implementation works great.
