author: kwryankrattiger <80296582+kwryankrattiger@users.noreply.github.com> 2023-09-07 15:41:31 -0500
committer: GitHub <noreply@github.com> 2023-09-07 15:41:31 -0500
commit: 8ec16571361415767252836e4ce3026ce244315e (patch)
tree: dbbde3ffbdf85afb1b00300c7648eb8b52761fe8 /share
parent: c5fc794d772d306b0ba28ecf65a61ca784981359 (diff)
CI Timing Statistics (#38598)
* Write timing information for installs from cache
* CI: aggregate and upload install_times.json to artifacts
* CI: Don't change root directory for artifact generation
* Flat event based timer variation

  Event based timer allows for easily starting and stopping timers without
  wiping sub-timer data. It also requires less branching logic when tracking
  time. The json output is non-hierarchical in this version and hierarchy is
  less rigidly enforced between starting and stopping.

* Add and write timers for top level install
* Update completion
* Remove unused subtimer api
* Fix unit tests
* Suppress timing summary option
* Save timer summaries to user_data artifacts
* Remove completion from fish
* Move spack python to script section
* Write timer correctly for non-cache installs
* Re-add hash to timer file
* Fish completion updates
* Fix null timer yield value
* Fix type hints
* Remove timer-summary-file option
* Add "." in front of non-package timer name

---------

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
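The "flat event based timer" described above can be sketched roughly as follows. This is a hypothetical illustration of the idea, not Spack's actual `Timer` implementation: start/stop events accumulate per-name durations independently, so stopping one timer never wipes another's data, and the JSON output is a flat list rather than a hierarchy.

```python
import json
import time


class EventTimer:
    """Flat, event-based timer sketch: each named timer accumulates
    its own duration, with no enforced parent/child hierarchy."""

    def __init__(self):
        self._starts = {}     # name -> pending start timestamp
        self._durations = {}  # name -> accumulated seconds

    def start(self, name):
        self._starts[name] = time.monotonic()

    def stop(self, name):
        begin = self._starts.pop(name, None)
        if begin is not None:
            elapsed = time.monotonic() - begin
            self._durations[name] = self._durations.get(name, 0.0) + elapsed

    def write_json(self, fd):
        # Non-hierarchical output: a flat list of {name, seconds} records
        records = [{"name": n, "seconds": s} for n, s in self._durations.items()]
        json.dump(records, fd)
```

Note how stopping `".inner"` (a non-package timer, named with a leading ".") leaves the still-running `"outer"` timer untouched, which is the branching-logic simplification the commit message refers to.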
Diffstat (limited to 'share')
-rw-r--r--share/spack/gitlab/cloud_pipelines/configs/ci.yaml6
-rw-r--r--share/spack/gitlab/cloud_pipelines/scripts/common/aggregate_package_logs.spack.py38
2 files changed, 43 insertions, 1 deletion
diff --git a/share/spack/gitlab/cloud_pipelines/configs/ci.yaml b/share/spack/gitlab/cloud_pipelines/configs/ci.yaml
index 1b3e723d5a..da0abb65d5 100644
--- a/share/spack/gitlab/cloud_pipelines/configs/ci.yaml
+++ b/share/spack/gitlab/cloud_pipelines/configs/ci.yaml
@@ -20,7 +20,11 @@ ci:
- k=$CI_GPG_KEY_ROOT/intermediate_ci_signing_key.gpg; [[ -r $k ]] && spack gpg trust $k
- k=$CI_GPG_KEY_ROOT/spack_public_key.gpg; [[ -r $k ]] && spack gpg trust $k
script::
- - spack --color=always --backtrace ci rebuild --tests > >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_out.txt) 2> >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_err.txt >&2)
+ - - spack --color=always --backtrace ci rebuild --tests > >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_out.txt) 2> >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_err.txt >&2)
+ - - spack python ${CI_PROJECT_DIR}/share/spack/gitlab/cloud_pipelines/scripts/common/aggregate_package_logs.spack.py
+ --prefix /home/software/spack:${CI_PROJECT_DIR}
+ --log install_times.json
+ ${SPACK_ARTIFACTS_ROOT}/user_data/install_times.json
after_script:
- - cat /proc/loadavg || true
variables:
diff --git a/share/spack/gitlab/cloud_pipelines/scripts/common/aggregate_package_logs.spack.py b/share/spack/gitlab/cloud_pipelines/scripts/common/aggregate_package_logs.spack.py
new file mode 100644
index 0000000000..9adab64e57
--- /dev/null
+++ b/share/spack/gitlab/cloud_pipelines/scripts/common/aggregate_package_logs.spack.py
@@ -0,0 +1,38 @@
+#!/usr/bin/env spack-python
+"""
+This script is meant to be run using:
+ `spack python aggregate_package_logs.spack.py`
+"""
+
+import os
+
+
+def find_logs(prefix, filename):
+ for root, _, files in os.walk(prefix):
+ if filename in files:
+ yield os.path.join(root, filename)
+
+
+if __name__ == "__main__":
+ import json
+ from argparse import ArgumentParser
+
+ parser = ArgumentParser("aggregate_logs")
+ parser.add_argument("output_file")
+ parser.add_argument("--log", default="install_times.json")
+ parser.add_argument("--prefix", required=True)
+
+ args = parser.parse_args()
+
+ prefixes = [p for p in args.prefix.split(":") if os.path.exists(p)]
+
+ # Aggregate the install timers into a single json
+ data = []
+ for prefix in prefixes:
+ time_logs = find_logs(prefix, args.log)
+ for log in time_logs:
+ with open(log) as fd:
+ data.append(json.load(fd))
+
+ with open(args.output_file, "w") as fd:
+ json.dump(data, fd)