-rw-r--r--  lib/spack/docs/pipelines.rst            | 108
-rw-r--r--  lib/spack/spack/binary_distribution.py  |  26
-rw-r--r--  lib/spack/spack/ci.py                   | 315
-rw-r--r--  lib/spack/spack/cmd/buildcache.py       |  20
-rw-r--r--  lib/spack/spack/cmd/ci.py               |  50
-rw-r--r--  lib/spack/spack/installer.py            |   2
-rw-r--r--  lib/spack/spack/schema/gitlab_ci.py     | 185
-rw-r--r--  lib/spack/spack/test/cmd/ci.py          | 376
-rw-r--r--  lib/spack/spack/test/installer.py       |   2
-rwxr-xr-x  share/spack/spack-completion.bash       |   8
10 files changed, 833 insertions(+), 259 deletions(-)
diff --git a/lib/spack/docs/pipelines.rst b/lib/spack/docs/pipelines.rst
index b0913f9095..c3c86e75c9 100644
--- a/lib/spack/docs/pipelines.rst
+++ b/lib/spack/docs/pipelines.rst
@@ -122,9 +122,26 @@ pipeline jobs.
Concretizes the specs in the active environment, stages them (as described in
:ref:`staging_algorithm`), and writes the resulting ``.gitlab-ci.yml`` to disk.
-This sub-command takes two arguments, but the most useful is ``--output-file``,
-which should be an absolute path (including file name) to the generated
-pipeline, if the default (``./.gitlab-ci.yml``) is not desired.
+The ``--prune-dag`` and ``--no-prune-dag`` arguments control whether jobs
+are generated for specs that are already up to date on the mirror. If you
+enable DAG pruning with ``--prune-dag``, your ``spack.yaml`` may need
+additional information; see the :ref:`noop_jobs` section below regarding
+``service-job-attributes``.
+
+The ``--optimize`` argument is experimental and runs the generated pipeline
+document through a series of optimization passes designed to reduce the size
+of the generated file.
+
+The ``--dependencies`` argument is also experimental and disables what
+GitLab refers to as DAG scheduling, internally using the ``dependencies``
+keyword rather than ``needs`` to list dependency jobs. The drawback of this
+option is that before any job can begin, all jobs in previous stages must
+first complete. The benefit is that GitLab allows more dependencies to be
+listed when using ``dependencies`` instead of ``needs``.
+
+The optional ``--output-file`` argument should be an absolute path
+(including file name) where the generated pipeline should be written; if
+not given, the default is ``./.gitlab-ci.yml``.
.. _cmd-spack-ci-rebuild:
@@ -223,21 +240,6 @@ takes a boolean and determines whether the pipeline uses artifacts to store and
pass along the buildcaches from one stage to the next (the default if you don't
provide this option is ``False``).
-The
-``final-stage-rebuild-index`` section controls whether an extra job is added to the
-end of your pipeline (in a stage by itself) which will regenerate the mirror's
-buildcache index. Under normal operation, each pipeline job that rebuilds a package
-will re-generate the mirror's buildcache index after the buildcache entry for that
-job has been created and pushed to the mirror. Since jobs in the same stage can run in
-parallel, there is the possibility that at the end of some stage, the index may not
-reflect all the binaries in the buildcache. Adding the ``final-stage-rebuild-index``
-section ensures that at the end of the pipeline, the index will be in sync with the
-binaries on the mirror. If the mirror lives in an S3 bucket, this job will need to
-run on a machine with the Python ``boto3`` module installed, and consequently the
-``final-stage-rebuild-index`` needs to specify a list of ``tags`` to pick a runner
-satisfying that condition. It can also take an ``image`` key so Docker executor type
-runners can pick the right image for the index regeneration job.
-
The optional ``cdash`` section provides information that will be used by the
``spack ci generate`` command (invoked by ``spack ci start``) for reporting
to CDash. All the jobs generated from this environment will belong to a
@@ -251,6 +253,76 @@ Take a look at the
for the gitlab-ci section of the spack environment file, to see precisely what
syntax is allowed there.
+.. _rebuild_index:
+
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+Note about rebuilding buildcache index
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+By default, while a pipeline job may rebuild a package, create a buildcache
+entry, and push it to the mirror, it does not automatically re-generate the
+mirror's buildcache index afterward. Because the index is not needed by the
+default rebuild jobs in the pipeline, not updating the index at the end of
+each job avoids possible race conditions between simultaneous jobs, and it
+avoids the computational expense of regenerating the index. This potentially
+saves minutes per job, depending on the number of binary packages in the
+mirror. As a result, the mirror's buildcache index may not accurately
+reflect the mirror's contents while a pipeline is running.
+
+To make sure the index is in sync by the time your pipeline finishes, spack
+by default generates a final job that updates the buildcache index of the
+target mirror. You can disable this behavior by adding
+``rebuild-index: False`` inside the ``gitlab-ci`` section of your spack
+environment. Spack will assign this job any runner attributes found in the
+``service-job-attributes`` section, if you have provided one in your
+``spack.yaml``.
+
+.. _noop_jobs:
+
+^^^^^^^^^^^^^^^^^^^^^^^
+Note about "no-op" jobs
+^^^^^^^^^^^^^^^^^^^^^^^
+
+If no specs in an environment need to be rebuilt during a given pipeline run
+(meaning all are already up to date on the mirror), a single successful
+no-op job is still generated to avoid an empty pipeline (which GitLab
+considers to be an error). An optional ``service-job-attributes`` section
+can be added to your ``spack.yaml`` to provide ``tags``, ``image``, or
+``variables`` for the generated no-op job. This section also supports
+``before_script``, ``script``, and ``after_script``, in case you want to
+take custom actions when a pipeline is empty.
+
+Following is an example of this section added to a ``spack.yaml``:
+
+.. code-block:: yaml
+
+   spack:
+     specs:
+       - openmpi
+     mirrors:
+       cloud_gitlab: https://mirror.spack.io
+     gitlab-ci:
+       mappings:
+         - match:
+             - os=centos8
+           runner-attributes:
+             tags:
+               - custom
+               - tag
+             image: spack/centos7
+       service-job-attributes:
+         tags: ['custom', 'tag']
+         image:
+           name: 'some.image.registry/custom-image:latest'
+           entrypoint: ['/bin/bash']
+         script:
+           - echo "Custom message in a custom script"
+
+The example above illustrates how you can provide the attributes used to run
+the no-op job generated for an empty pipeline. The only field spack fills in
+for you is ``script``, and then only if you do not provide one yourself.
+
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Assignment of specs to runners
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
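Stepping back from the docs changes above: the pruning behavior they
describe reduces to a single check in the job generation loop (see
``lib/spack/spack/ci.py`` below). A minimal sketch, assuming each
``spec_labels`` record carries the ``needs_rebuild`` flag introduced in
this change:

.. code-block:: python

   def jobs_to_generate(spec_labels, prune_dag):
       """Yield labels of specs that should get a rebuild job."""
       for label, record in spec_labels.items():
           # With pruning enabled, skip specs already up to date on a mirror.
           if prune_dag and not record['needs_rebuild']:
               continue
           yield label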
diff --git a/lib/spack/spack/binary_distribution.py b/lib/spack/spack/binary_distribution.py
index 694b323716..9e8ca3a204 100644
--- a/lib/spack/spack/binary_distribution.py
+++ b/lib/spack/spack/binary_distribution.py
@@ -1322,7 +1322,7 @@ def extract_tarball(spec, filename, allow_root=False, unsigned=False,
os.remove(filename)
-def try_direct_fetch(spec, force=False, full_hash_match=False, mirrors=None):
+def try_direct_fetch(spec, full_hash_match=False, mirrors=None):
"""
Try to find the spec directly on the configured mirrors
"""
@@ -1360,11 +1360,26 @@ def try_direct_fetch(spec, force=False, full_hash_match=False, mirrors=None):
return found_specs
-def get_mirrors_for_spec(spec=None, force=False, full_hash_match=False,
- mirrors_to_check=None):
+def get_mirrors_for_spec(spec=None, full_hash_match=False,
+ mirrors_to_check=None, index_only=False):
"""
Check if concrete spec exists on mirrors and return a list
indicating the mirrors on which it can be found
+
+ Args:
+ spec (Spec): The spec to look for in binary mirrors
+ full_hash_match (bool): If True, only includes mirrors where the spec
+ full hash matches the locally computed full hash of the ``spec``
+ argument. If False, any mirror which has a matching DAG hash
+ is included in the results.
+ mirrors_to_check (dict): Optionally override the configured mirrors
+ with the mirrors in this dictionary.
+ index_only (bool): Do not attempt direct fetching of ``spec.yaml``
+ files from remote mirrors; only consider the indices.
+
+ Returns:
+ A list of objects, each containing a ``mirror_url`` and ``spec`` key
+ indicating all mirrors where the spec can be found.
"""
if spec is None:
return []
@@ -1390,10 +1405,9 @@ def get_mirrors_for_spec(spec=None, force=False, full_hash_match=False,
results = filter_candidates(candidates)
# Maybe we just didn't have the latest information from the mirror, so
- # try to fetch directly.
- if not results:
+ # try to fetch directly, unless we are only considering the indices.
+ if not results and not index_only:
results = try_direct_fetch(spec,
- force=force,
full_hash_match=full_hash_match,
mirrors=mirrors_to_check)
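The new ``index_only`` flag makes the speed/accuracy trade-off explicit:
trust the (possibly stale) buildcache indices, or fall back to fetching
``spec.yaml`` files directly from each mirror. A hedged usage sketch (the
``zlib`` spec is purely illustrative):

.. code-block:: python

   import spack.binary_distribution as bindist
   from spack.spec import Spec

   # A concrete spec to look for (any concretized Spec works here).
   concrete_spec = Spec('zlib').concretized()

   # Consult only the buildcache indices (fast, but possibly stale) rather
   # than also fetching spec.yaml files directly from each mirror.
   matches = bindist.get_mirrors_for_spec(
       spec=concrete_spec, full_hash_match=True, index_only=True)

   # Per the docstring above, each result carries 'mirror_url' and 'spec'.
   for match in matches:
       print(match['mirror_url'])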
diff --git a/lib/spack/spack/ci.py b/lib/spack/spack/ci.py
index 388d3528b4..6f48c90e15 100644
--- a/lib/spack/spack/ci.py
+++ b/lib/spack/spack/ci.py
@@ -202,8 +202,8 @@ def format_root_spec(spec, main_phase, strip_compiler):
# spec.name, spec.version, spec.compiler, spec.architecture)
-def spec_deps_key_label(s):
- return s.dag_hash(), "%s/%s" % (s.name, s.dag_hash(7))
+def spec_deps_key(s):
+ return '{0}/{1}'.format(s.name, s.dag_hash(7))
def _add_dependency(spec_label, dep_label, deps):
@@ -214,8 +214,8 @@ def _add_dependency(spec_label, dep_label, deps):
deps[spec_label].add(dep_label)
-def get_spec_dependencies(specs, deps, spec_labels):
- spec_deps_obj = compute_spec_deps(specs)
+def get_spec_dependencies(specs, deps, spec_labels, check_index_only=False):
+ spec_deps_obj = compute_spec_deps(specs, check_index_only=check_index_only)
if spec_deps_obj:
dependencies = spec_deps_obj['dependencies']
@@ -225,19 +225,25 @@ def get_spec_dependencies(specs, deps, spec_labels):
spec_labels[entry['label']] = {
'spec': Spec(entry['spec']),
'rootSpec': entry['root_spec'],
+ 'needs_rebuild': entry['needs_rebuild'],
}
for entry in dependencies:
_add_dependency(entry['spec'], entry['depends'], deps)
-def stage_spec_jobs(specs):
+def stage_spec_jobs(specs, check_index_only=False):
"""Take a set of release specs and generate a list of "stages", where the
jobs in any stage are dependent only on jobs in previous stages. This
allows us to maximize build parallelism within the gitlab-ci framework.
Arguments:
specs (Iterable): Specs to build
+ check_index_only (bool): Regardless of whether DAG pruning is enabled,
+ all configured mirrors are searched to see if binaries for specs
+ are up to date on those mirrors. This flag limits that search to
+ the binary cache indices on those mirrors to speed the process up,
+ even though there is no guarantee the index is up to date.
Returns: A tuple of information objects describing the specs, dependencies
and stages:
@@ -274,7 +280,8 @@ def stage_spec_jobs(specs):
deps = {}
spec_labels = {}
- get_spec_dependencies(specs, deps, spec_labels)
+ get_spec_dependencies(
+ specs, deps, spec_labels, check_index_only=check_index_only)
# Save the original deps, as we need to return them at the end of the
# function. In the while loop below, the "dependencies" variable is
@@ -311,12 +318,15 @@ def print_staging_summary(spec_labels, dependencies, stages):
for job in sorted(stage):
s = spec_labels[job]['spec']
- tty.msg(' {0} -> {1}'.format(job, get_spec_string(s)))
+ tty.msg(' [{1}] {0} -> {2}'.format(
+ job,
+ 'x' if spec_labels[job]['needs_rebuild'] else ' ',
+ get_spec_string(s)))
stage_index += 1
-def compute_spec_deps(spec_list):
+def compute_spec_deps(spec_list, check_index_only=False):
"""
Computes all the dependencies for the spec(s) and generates a JSON
object which provides both a list of unique spec names as well as a
@@ -386,33 +396,39 @@ def compute_spec_deps(spec_list):
# root_spec = get_spec_string(spec)
root_spec = spec
- rkey, rlabel = spec_deps_key_label(spec)
+ rkey = spec_deps_key(spec)
for s in spec.traverse(deptype=all):
if s.external:
tty.msg('Will not stage external pkg: {0}'.format(s))
continue
- skey, slabel = spec_deps_key_label(s)
- spec_labels[slabel] = {
+ up_to_date_mirrors = bindist.get_mirrors_for_spec(
+ spec=s, full_hash_match=True, index_only=check_index_only)
+
+ skey = spec_deps_key(s)
+ spec_labels[skey] = {
'spec': get_spec_string(s),
'root': root_spec,
+ 'needs_rebuild': not up_to_date_mirrors,
}
- append_dep(rlabel, slabel)
+
+ append_dep(rkey, skey)
for d in s.dependencies(deptype=all):
- dkey, dlabel = spec_deps_key_label(d)
+ dkey = spec_deps_key(d)
if d.external:
tty.msg('Will not stage external dep: {0}'.format(d))
continue
- append_dep(slabel, dlabel)
+ append_dep(skey, dkey)
for spec_label, spec_holder in spec_labels.items():
specs.append({
'label': spec_label,
'spec': spec_holder['spec'],
'root_spec': spec_holder['root'],
+ 'needs_rebuild': spec_holder['needs_rebuild'],
})
deps_json_obj = {
@@ -481,22 +497,43 @@ def pkg_name_from_spec_label(spec_label):
def format_job_needs(phase_name, strip_compilers, dep_jobs,
- osname, build_group, enable_artifacts_buildcache):
+ osname, build_group, prune_dag, stage_spec_dict,
+ enable_artifacts_buildcache):
needs_list = []
for dep_job in dep_jobs:
- needs_list.append({
- 'job': get_job_name(phase_name,
- strip_compilers,
- dep_job,
- osname,
- build_group),
- 'artifacts': enable_artifacts_buildcache,
- })
+ dep_spec_key = spec_deps_key(dep_job)
+ dep_spec_info = stage_spec_dict[dep_spec_key]
+
+ if not prune_dag or dep_spec_info['needs_rebuild']:
+ needs_list.append({
+ 'job': get_job_name(phase_name,
+ strip_compilers,
+ dep_job,
+ osname,
+ build_group),
+ 'artifacts': enable_artifacts_buildcache,
+ })
return needs_list
-def generate_gitlab_ci_yaml(env, print_summary, output_file,
- run_optimizer=False, use_dependencies=False):
+def add_pr_mirror(url):
+ cfg_scope = cfg.default_modify_scope()
+ mirrors = cfg.get('mirrors', scope=cfg_scope)
+ items = [(n, u) for n, u in mirrors.items()]
+ items.insert(0, ('ci_pr_mirror', url))
+ cfg.set('mirrors', syaml.syaml_dict(items), scope=cfg_scope)
+
+
+def remove_pr_mirror():
+ cfg_scope = cfg.default_modify_scope()
+ mirrors = cfg.get('mirrors', scope=cfg_scope)
+ mirrors.pop('ci_pr_mirror')
+ cfg.set('mirrors', mirrors, scope=cfg_scope)
+
+
+def generate_gitlab_ci_yaml(env, print_summary, output_file, prune_dag=False,
+ check_index_only=False, run_optimizer=False,
+ use_dependencies=False):
# FIXME: What's the difference between one that opens with 'spack'
# and one that opens with 'env'? This will only handle the former.
with spack.concretize.disable_compiler_existence_check():
@@ -509,10 +546,6 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
gitlab_ci = yaml_root['gitlab-ci']
- final_job_config = None
- if 'final-stage-rebuild-index' in gitlab_ci:
- final_job_config = gitlab_ci['final-stage-rebuild-index']
-
build_group = None
enable_cdash_reporting = False
cdash_auth_token = None
@@ -539,6 +572,9 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
pr_mirror_url = url_util.join(SPACK_PR_MIRRORS_ROOT_URL,
spack_pr_branch)
+ if 'mirrors' not in yaml_root or len(yaml_root['mirrors'].values()) < 1:
+ tty.die('spack ci generate requires an env containing a mirror')
+
ci_mirrors = yaml_root['mirrors']
mirror_urls = [url for url in ci_mirrors.values()]
@@ -546,6 +582,10 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
if 'enable-artifacts-buildcache' in gitlab_ci:
enable_artifacts_buildcache = gitlab_ci['enable-artifacts-buildcache']
+ rebuild_index_enabled = True
+ if 'rebuild-index' in gitlab_ci and gitlab_ci['rebuild-index'] is False:
+ rebuild_index_enabled = False
+
bootstrap_specs = []
phases = []
if 'bootstrap' in gitlab_ci:
@@ -573,19 +613,27 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
'strip-compilers': False,
})
- staged_phases = {}
- for phase in phases:
- phase_name = phase['name']
- with spack.concretize.disable_compiler_existence_check():
- staged_phases[phase_name] = stage_spec_jobs(
- env.spec_lists[phase_name])
+ # Add the per-PR mirror if enabled, as some specs might be up to date
+ # there and thus not need to be rebuilt.
+ if pr_mirror_url:
+ add_pr_mirror(pr_mirror_url)
- if print_summary:
+ # Speed up staging by first fetching binary indices from all mirrors
+ # (including the per-PR mirror we may have just added above).
+ bindist.binary_index.update()
+
+ staged_phases = {}
+ try:
for phase in phases:
phase_name = phase['name']
- tty.msg('Stages for phase "{0}"'.format(phase_name))
- phase_stages = staged_phases[phase_name]
- print_staging_summary(*phase_stages)
+ with spack.concretize.disable_compiler_existence_check():
+ staged_phases[phase_name] = stage_spec_jobs(
+ env.spec_lists[phase_name],
+ check_index_only=check_index_only)
+ finally:
+ # Clean up PR mirror if enabled
+ if pr_mirror_url:
+ remove_pr_mirror()
all_job_names = []
output_object = {}
@@ -611,7 +659,8 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
stage_id += 1
for spec_label in stage_jobs:
- root_spec = spec_labels[spec_label]['rootSpec']
+ spec_record = spec_labels[spec_label]
+ root_spec = spec_record['rootSpec']
pkg_name = pkg_name_from_spec_label(spec_label)
release_spec = root_spec[pkg_name]
@@ -678,11 +727,15 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
job_dependencies = []
if spec_label in dependencies:
if enable_artifacts_buildcache:
+ # Get dependencies transitively, so they're all
+ # available in the artifacts buildcache.
dep_jobs = [
d for d in release_spec.traverse(deptype=all,
root=False)
]
else:
+ # In this case, "needs" is only used for scheduling
+ # purposes, so we only get the direct dependencies.
dep_jobs = []
for dep_label in dependencies[spec_label]:
dep_pkg = pkg_name_from_spec_label(dep_label)
@@ -690,10 +743,13 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
dep_jobs.append(dep_root[dep_pkg])
job_dependencies.extend(
- format_job_needs(phase_name, strip_compilers, dep_jobs,
- osname, build_group,
+ format_job_needs(phase_name, strip_compilers,
+ dep_jobs, osname, build_group,
+ prune_dag, spec_labels,
enable_artifacts_buildcache))
+ rebuild_spec = spec_record['needs_rebuild']
+
# This next section helps gitlab make sure the right
# bootstrapped compiler exists in the artifacts buildcache by
# creating an artificial dependency between this spec and its
@@ -709,11 +765,12 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
compiler_pkg_spec = compilers.pkg_spec_for_compiler(
release_spec.compiler)
for bs in bootstrap_specs:
- bs_arch = bs['spec'].architecture
+ c_spec = bs['spec']
+ bs_arch = c_spec.architecture
bs_arch_family = (bs_arch.target
.microarchitecture
.family)
- if (bs['spec'].satisfies(compiler_pkg_spec) and
+ if (c_spec.satisfies(compiler_pkg_spec) and
bs_arch_family == spec_arch_family):
# We found the bootstrap compiler this release spec
# should be built with, so for DAG scheduling
@@ -721,10 +778,24 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
# to the jobs "needs". But if artifact buildcache
# is enabled, we'll have to add all transitive deps
# of the compiler as well.
- dep_jobs = [bs['spec']]
+
+ # Here we check whether the bootstrapped compiler
+ # needs to be rebuilt. Until compilers are proper
+ # dependencies, we artificially force the spec to
+ # be rebuilt if the compiler targeted to build it
+ # needs to be rebuilt.
+ bs_specs, _, _ = staged_phases[bs['phase-name']]
+ c_spec_key = spec_deps_key(c_spec)
+ rbld_comp = bs_specs[c_spec_key]['needs_rebuild']
+ rebuild_spec = rebuild_spec or rbld_comp
+ # Also update record so dependents do not fail to
+ # add this spec to their "needs"
+ spec_record['needs_rebuild'] = rebuild_spec
+
+ dep_jobs = [c_spec]
if enable_artifacts_buildcache:
dep_jobs = [
- d for d in bs['spec'].traverse(deptype=all)
+ d for d in c_spec.traverse(deptype=all)
]
job_dependencies.extend(
@@ -733,6 +804,8 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
dep_jobs,
str(bs_arch),
build_group,
+ prune_dag,
+ bs_specs,
enable_artifacts_buildcache))
else:
debug_msg = ''.join([
@@ -741,9 +814,14 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
'not the compiler required by the spec, or ',
'because the target arch families of the ',
'spec and the compiler did not match'
- ]).format(bs['spec'], release_spec)
+ ]).format(c_spec, release_spec)
tty.debug(debug_msg)
+ if prune_dag and not rebuild_spec:
+ continue
+
+ job_vars['SPACK_SPEC_NEEDS_REBUILD'] = str(rebuild_spec)
+
if enable_cdash_reporting:
cdash_build_name = get_cdash_build_name(
release_spec, build_group)
@@ -812,11 +890,19 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
output_object[job_name] = job_object
job_id += 1
+ if print_summary:
+ for phase in phases:
+ phase_name = phase['name']
+ tty.msg('Stages for phase "{0}"'.format(phase_name))
+ phase_stages = staged_phases[phase_name]
+ print_staging_summary(*phase_stages)
+
tty.debug('{0} build jobs generated in {1} stages'.format(
job_id, stage_id))
- tty.debug('The max_needs_job is {0}, with {1} needs'.format(
- max_needs_job, max_length_needs))
+ if job_id > 0:
+ tty.debug('The max_needs_job is {0}, with {1} needs'.format(
+ max_needs_job, max_length_needs))
# Use "all_job_names" to populate the build group for this set
if enable_cdash_reporting and cdash_auth_token:
@@ -828,63 +914,94 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
else:
tty.warn('Unable to populate buildgroup without CDash credentials')
- if final_job_config and not is_pr_pipeline:
- # Add an extra, final job to regenerate the index
- final_stage = 'stage-rebuild-index'
- final_job = {
- 'stage': final_stage,
- 'script': 'spack buildcache update-index --keys -d {0}'.format(
- mirror_urls[0]),
- 'tags': final_job_config['tags'],
- 'when': 'always'
- }
- if 'image' in final_job_config:
- final_job['image'] = final_job_config['image']
- if before_script:
- final_job['before_script'] = before_script
- if after_script:
- final_job['after_script'] = after_script
- output_object['rebuild-index'] = final_job
- stage_names.append(final_stage)
-
- output_object['stages'] = stage_names
-
- # Capture the version of spack used to generate the pipeline, transform it
- # into a value that can be passed to "git checkout", and save it in a
- # global yaml variable
- spack_version = spack.main.get_version()
- version_to_clone = None
- v_match = re.match(r"^\d+\.\d+\.\d+$", spack_version)
- if v_match:
- version_to_clone = 'v{0}'.format(v_match.group(0))
- else:
- v_match = re.match(r"^[^-]+-[^-]+-([a-f\d]+)$", spack_version)
+ service_job_config = None
+ if 'service-job-attributes' in gitlab_ci:
+ service_job_config = gitlab_ci['service-job-attributes']
+
+ default_attrs = [
+ 'image',
+ 'tags',
+ 'variables',
+ 'before_script',
+ # 'script',
+ 'after_script',
+ ]
+
+ if job_id > 0:
+ if rebuild_index_enabled and not is_pr_pipeline:
+ # Add a final job to regenerate the index
+ final_stage = 'stage-rebuild-index'
+ final_job = {}
+
+ if service_job_config:
+ copy_attributes(default_attrs,
+ service_job_config,
+ final_job)
+
+ final_script = 'spack buildcache update-index --keys'
+ final_script = '{0} -d {1}'.format(final_script, mirror_urls[0])
+
+ final_job['stage'] = final_stage
+ final_job['script'] = [final_script]
+ final_job['when'] = 'always'
+
+ output_object['rebuild-index'] = final_job
+ stage_names.append(final_stage)
+
+ output_object['stages'] = stage_names
+
+ # Capture the version of spack used to generate the pipeline, transform it
+ # into a value that can be passed to "git checkout", and save it in a
+ # global yaml variable
+ spack_version = spack.main.get_version()
+ version_to_clone = None
+ v_match = re.match(r"^\d+\.\d+\.\d+$", spack_version)
if v_match:
- version_to_clone = v_match.group(1)
+ version_to_clone = 'v{0}'.format(v_match.group(0))
else:
- version_to_clone = spack_version
+ v_match = re.match(r"^[^-]+-[^-]+-([a-f\d]+)$", spack_version)
+ if v_match:
+ version_to_clone = v_match.group(1)
+ else:
+ version_to_clone = spack_version
- output_object['variables'] = {
- 'SPACK_VERSION': spack_version,
- 'SPACK_CHECKOUT_VERSION': version_to_clone,
- }
+ output_object['variables'] = {
+ 'SPACK_VERSION': spack_version,
+ 'SPACK_CHECKOUT_VERSION': version_to_clone,
+ }
- if pr_mirror_url:
- output_object['variables']['SPACK_PR_MIRROR_URL'] = pr_mirror_url
+ if pr_mirror_url:
+ output_object['variables']['SPACK_PR_MIRROR_URL'] = pr_mirror_url
+
+ sorted_output = {}
+ for output_key, output_value in sorted(output_object.items()):
+ sorted_output[output_key] = output_value
+
+ # TODO(opadron): remove this or refactor
+ if run_optimizer:
+ import spack.ci_optimization as ci_opt
+ sorted_output = ci_opt.optimizer(sorted_output)
+
+ # TODO(opadron): remove this or refactor
+ if use_dependencies:
+ import spack.ci_needs_workaround as cinw
+ sorted_output = cinw.needs_to_dependencies(sorted_output)
+ else:
+ # No jobs were generated
+ tty.debug('No specs to rebuild, generating no-op job')
+ noop_job = {}
- sorted_output = {}
- for output_key, output_value in sorted(output_object.items()):
- sorted_output[output_key] = output_value
+ if service_job_config:
+ copy_attributes(default_attrs,
+ service_job_config,
+ noop_job)
- # TODO(opadron): remove this or refactor
- if run_optimizer:
- import spack.ci_optimization as ci_opt
- sorted_output = ci_opt.optimizer(sorted_output)
+ if 'script' not in noop_job:
+ noop_job['script'] = [
+ 'echo "All specs already up to date, nothing to rebuild."',
+ ]
- # TODO(opadron): remove this or refactor
- if use_dependencies:
- import spack.ci_needs_workaround as cinw
- sorted_output = cinw.needs_to_dependencies(sorted_output)
+ sorted_output = {'no-specs-to-rebuild': noop_job}
with open(output_file, 'w') as outf:
outf.write(syaml.dump_config(sorted_output, default_flow_style=True))
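``add_pr_mirror`` and ``remove_pr_mirror`` mutate the active configuration
scope, so the staging loop above is wrapped in ``try``/``finally`` to
guarantee the temporary ``ci_pr_mirror`` entry is removed even if
concretization or staging raises. The shape of that pairing, with
``stage_all_phases`` as a hypothetical stand-in for the loop body:

.. code-block:: python

   def stage_all_phases():
       """Hypothetical stand-in for the staging loop above."""

   if pr_mirror_url:
       add_pr_mirror(pr_mirror_url)   # prepend 'ci_pr_mirror' to the config
   try:
       stage_all_phases()
   finally:
       if pr_mirror_url:
           remove_pr_mirror()         # config is restored even on failure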
diff --git a/lib/spack/spack/cmd/buildcache.py b/lib/spack/spack/cmd/buildcache.py
index 841370127f..f5cb94e7c7 100644
--- a/lib/spack/spack/cmd/buildcache.py
+++ b/lib/spack/spack/cmd/buildcache.py
@@ -769,19 +769,14 @@ def buildcache_copy(args):
shutil.copyfile(cdashid_src_path, cdashid_dest_path)
-def buildcache_update_index(args):
- """Update a buildcache index."""
- outdir = '.'
- if args.mirror_url:
- outdir = args.mirror_url
-
- mirror = spack.mirror.MirrorCollection().lookup(outdir)
+def update_index(mirror_url, update_keys=False):
+ mirror = spack.mirror.MirrorCollection().lookup(mirror_url)
outdir = url_util.format(mirror.push_url)
bindist.generate_package_index(
url_util.join(outdir, bindist.build_cache_relative_path()))
- if args.keys:
+ if update_keys:
keys_url = url_util.join(outdir,
bindist.build_cache_relative_path(),
bindist.build_cache_keys_relative_path())
@@ -789,6 +784,15 @@ def buildcache_update_index(args):
bindist.generate_key_index(keys_url)
+def buildcache_update_index(args):
+ """Update a buildcache index."""
+ outdir = '.'
+ if args.mirror_url:
+ outdir = args.mirror_url
+
+ update_index(outdir, update_keys=args.keys)
+
+
def buildcache(parser, args):
if args.func:
args.func(args)
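Extracting ``update_index`` from the argparse handler lets other commands
reuse it; the new ``spack ci rebuild-index`` subcommand below calls it
directly. A usage sketch with an illustrative mirror URL:

.. code-block:: python

   import spack.cmd.buildcache as buildcache

   # Regenerate the package index (and, with update_keys=True, the key
   # index) for a mirror without going through the command-line entry point.
   buildcache.update_index('file:///tmp/my-mirror', update_keys=True)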
diff --git a/lib/spack/spack/cmd/ci.py b/lib/spack/spack/cmd/ci.py
index 6cac26ef88..9adb7e11c7 100644
--- a/lib/spack/spack/cmd/ci.py
+++ b/lib/spack/spack/cmd/ci.py
@@ -54,6 +54,26 @@ def setup_parser(subparser):
'--dependencies', action='store_true', default=False,
help="(Experimental) disable DAG scheduling; use "
' "plain" dependencies.')
+ prune_group = generate.add_mutually_exclusive_group()
+ prune_group.add_argument(
+ '--prune-dag', action='store_true', dest='prune_dag',
+ default=True, help="""Do not generate jobs for specs already up to
+date on the mirror""")
+ prune_group.add_argument(
+ '--no-prune-dag', action='store_false', dest='prune_dag',
+ default=True, help="""Generate jobs for specs already up to date
+on the mirror""")
+ generate.add_argument(
+ '--check-index-only', action='store_true', dest='index_only',
+ default=False, help="""Spack always checks specs against configured
+binary mirrors when generating the pipeline, regardless of whether or not
+DAG pruning is enabled. This flag controls whether it might attempt to
+fetch remote spec.yaml files directly (ensuring no spec is rebuilt if it is
+present on the mirror), or whether it should reduce pipeline generation time
+by assuming all remote buildcache indices are up to date and only use those
+to determine whether a given spec is up to date on mirrors. In the latter
+case, specs might be needlessly rebuilt if remote buildcache indices are out
+of date.""")
generate.set_defaults(func=ci_generate)
# Check a spec against mirror. Rebuild, create buildcache and push to
@@ -61,6 +81,11 @@ def setup_parser(subparser):
rebuild = subparsers.add_parser('rebuild', help=ci_rebuild.__doc__)
rebuild.set_defaults(func=ci_rebuild)
+ # Rebuild the buildcache index associated with the mirror in the
+ # active, gitlab-enabled environment.
+ index = subparsers.add_parser('rebuild-index', help=ci_reindex.__doc__)
+ index.set_defaults(func=ci_reindex)
+
def ci_generate(args):
"""Generate jobs file from a spack environment file containing CI info.
@@ -75,6 +100,8 @@ def ci_generate(args):
copy_yaml_to = args.copy_to
run_optimizer = args.optimize
use_dependencies = args.dependencies
+ prune_dag = args.prune_dag
+ index_only = args.index_only
if not output_file:
output_file = os.path.abspath(".gitlab-ci.yml")
@@ -86,7 +113,8 @@ def ci_generate(args):
# Generate the jobs
spack_ci.generate_gitlab_ci_yaml(
- env, True, output_file, run_optimizer=run_optimizer,
+ env, True, output_file, prune_dag=prune_dag,
+ check_index_only=index_only, run_optimizer=run_optimizer,
use_dependencies=use_dependencies)
if copy_yaml_to:
@@ -306,8 +334,8 @@ def ci_rebuild(args):
# Checks all mirrors for a built spec with a matching full hash
matches = bindist.get_mirrors_for_spec(
- job_spec, force=False, full_hash_match=True,
- mirrors_to_check=mirrors_to_check)
+ job_spec, full_hash_match=True, mirrors_to_check=mirrors_to_check,
+ index_only=False)
if matches:
# Got a full hash match on at least one configured mirror. All
@@ -408,6 +436,22 @@ def ci_rebuild(args):
artifact_mirror_url or pr_mirror_url or remote_mirror_url)
+def ci_reindex(args):
+ """Rebuild the buildcache index associated with the mirror in the
+ active, gitlab-enabled environment. """
+ env = ev.get_env(args, 'ci rebuild-index', required=True)
+ yaml_root = ev.config_dict(env.yaml)
+
+ if 'mirrors' not in yaml_root or len(yaml_root['mirrors'].values()) < 1:
+ tty.die('spack ci rebuild-index requires an env containing a mirror')
+
+ ci_mirrors = yaml_root['mirrors']
+ mirror_urls = [url for url in ci_mirrors.values()]
+ remote_mirror_url = mirror_urls[0]
+
+ buildcache.update_index(remote_mirror_url, update_keys=True)
+
+
def ci(parser, args):
if args.func:
args.func(args)
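The paired ``--prune-dag``/``--no-prune-dag`` options above share a single
``dest`` that defaults to ``True``, so pruning is on unless explicitly
disabled. The pattern in isolation:

.. code-block:: python

   import argparse

   parser = argparse.ArgumentParser()
   group = parser.add_mutually_exclusive_group()
   group.add_argument('--prune-dag', action='store_true',
                      dest='prune_dag', default=True)
   group.add_argument('--no-prune-dag', action='store_false',
                      dest='prune_dag', default=True)

   assert parser.parse_args([]).prune_dag is True
   assert parser.parse_args(['--no-prune-dag']).prune_dag is False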
diff --git a/lib/spack/spack/installer.py b/lib/spack/spack/installer.py
index d36e650378..a9c0d7e85c 100644
--- a/lib/spack/spack/installer.py
+++ b/lib/spack/spack/installer.py
@@ -392,7 +392,7 @@ def _try_install_from_binary_cache(pkg, explicit, unsigned=False,
pkg_id = package_id(pkg)
tty.debug('Searching for binary cache of {0}'.format(pkg_id))
matches = binary_distribution.get_mirrors_for_spec(
- pkg.spec, force=False, full_hash_match=full_hash_match)
+ pkg.spec, full_hash_match=full_hash_match)
if not matches:
return False
diff --git a/lib/spack/spack/schema/gitlab_ci.py b/lib/spack/spack/schema/gitlab_ci.py
index f3cbc24bb8..31af841cca 100644
--- a/lib/spack/spack/schema/gitlab_ci.py
+++ b/lib/spack/spack/schema/gitlab_ci.py
@@ -9,6 +9,8 @@
:lines: 13-
"""
+from llnl.util.lang import union_dicts
+
image_schema = {
'oneOf': [
{
@@ -28,127 +30,98 @@ image_schema = {
],
}
+runner_attributes_schema_items = {
+ 'image': image_schema,
+ 'tags': {
+ 'type': 'array',
+ 'items': {'type': 'string'}
+ },
+ 'variables': {
+ 'type': 'object',
+ 'patternProperties': {
+ r'[\w\d\-_\.]+': {
+ 'type': 'string',
+ },
+ },
+ },
+ 'before_script': {
+ 'type': 'array',
+ 'items': {'type': 'string'}
+ },
+ 'script': {
+ 'type': 'array',
+ 'items': {'type': 'string'}
+ },
+ 'after_script': {
+ 'type': 'array',
+ 'items': {'type': 'string'}
+ },
+}
+
+runner_selector_schema = {
+ 'type': 'object',
+ 'additionalProperties': False,
+ 'required': ['tags'],
+ 'properties': runner_attributes_schema_items,
+}
+
#: Properties for inclusion in other schemas
properties = {
'gitlab-ci': {
'type': 'object',
'additionalProperties': False,
'required': ['mappings'],
- 'patternProperties': {
- 'bootstrap': {
- 'type': 'array',
- 'items': {
- 'anyOf': [
- {
- 'type': 'string',
- }, {
- 'type': 'object',
- 'additionalProperties': False,
- 'required': ['name'],
- 'properties': {
- 'name': {
- 'type': 'string',
- },
- 'compiler-agnostic': {
- 'type': 'boolean',
- 'default': False,
- },
- },
- },
- ],
- },
- },
- 'mappings': {
- 'type': 'array',
- 'items': {
- 'type': 'object',
- 'additionalProperties': False,
- 'required': ['match'],
- 'properties': {
- 'match': {
- 'type': 'array',
- 'items': {
+ 'patternProperties': union_dicts(
+ runner_attributes_schema_items,
+ {
+ 'bootstrap': {
+ 'type': 'array',
+ 'items': {
+ 'anyOf': [
+ {
'type': 'string',
- },
- },
- 'runner-attributes': {
- 'type': 'object',
- 'additionalProperties': True,
- 'required': ['tags'],
- 'properties': {
- 'image': image_schema,
- 'tags': {
- 'type': 'array',
- 'items': {'type': 'string'}
- },
- 'variables': {
- 'type': 'object',
- 'patternProperties': {
- r'[\w\d\-_\.]+': {
- 'type': 'string',
- },
+ }, {
+ 'type': 'object',
+ 'additionalProperties': False,
+ 'required': ['name'],
+ 'properties': {
+ 'name': {
+ 'type': 'string',
+ },
+ 'compiler-agnostic': {
+ 'type': 'boolean',
+ 'default': False,
},
- },
- 'before_script': {
- 'type': 'array',
- 'items': {'type': 'string'}
- },
- 'script': {
- 'type': 'array',
- 'items': {'type': 'string'}
- },
- 'after_script': {
- 'type': 'array',
- 'items': {'type': 'string'}
},
},
- },
+ ],
},
},
- },
- 'image': image_schema,
- 'tags': {
- 'type': 'array',
- 'items': {'type': 'string'}
- },
- 'variables': {
- 'type': 'object',
- 'patternProperties': {
- r'[\w\d\-_\.]+': {
- 'type': 'string',
+ 'mappings': {
+ 'type': 'array',
+ 'items': {
+ 'type': 'object',
+ 'additionalProperties': False,
+ 'required': ['match'],
+ 'properties': {
+ 'match': {
+ 'type': 'array',
+ 'items': {
+ 'type': 'string',
+ },
+ },
+ 'runner-attributes': runner_selector_schema,
+ },
},
},
- },
- 'before_script': {
- 'type': 'array',
- 'items': {'type': 'string'}
- },
- 'script': {
- 'type': 'array',
- 'items': {'type': 'string'}
- },
- 'after_script': {
- 'type': 'array',
- 'items': {'type': 'string'}
- },
- 'enable-artifacts-buildcache': {
- 'type': 'boolean',
- 'default': False,
- },
- 'final-stage-rebuild-index': {
- 'type': 'object',
- 'additionalProperties': False,
- 'required': ['tags'],
- 'properties': {
- 'image': image_schema,
- 'tags': {
- 'type': 'array',
- 'default': [],
- 'items': {'type': 'string'}
- },
+ 'enable-artifacts-buildcache': {
+ 'type': 'boolean',
+ 'default': False,
},
- },
- },
+ 'service-job-attributes': runner_selector_schema,
+ 'rebuild-index': {'type': 'boolean'},
+ }
+ ),
},
}
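With ``union_dicts``, the top level of the ``gitlab-ci`` section now accepts
the shared runner-attribute keys alongside ``service-job-attributes`` and
``rebuild-index``. A configuration fragment (expressed as a Python dict)
that the revised schema should accept; a sketch, not an exhaustive example:

.. code-block:: python

   gitlab_ci = {
       'mappings': [{
           'match': ['os=centos8'],
           'runner-attributes': {'tags': ['custom']},
       }],
       'service-job-attributes': {
           'tags': ['custom'],
           'script': ['echo "no-op"'],
       },
       'rebuild-index': False,
   }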
diff --git a/lib/spack/spack/test/cmd/ci.py b/lib/spack/spack/test/cmd/ci.py
index 5266dbf846..69e4aa7410 100644
--- a/lib/spack/spack/test/cmd/ci.py
+++ b/lib/spack/spack/test/cmd/ci.py
@@ -11,6 +11,7 @@ from jsonschema import validate
import spack
import spack.ci as ci
+import spack.compilers as compilers
import spack.config
import spack.environment as ev
import spack.hash_types as ht
@@ -19,7 +20,7 @@ import spack.paths as spack_paths
import spack.repo as repo
from spack.schema.buildcache_spec import schema as spec_yaml_schema
from spack.schema.database_index import schema as db_idx_schema
-from spack.spec import Spec
+from spack.spec import Spec, CompilerSpec
from spack.util.mock_package import MockPackageMultiRepo
import spack.util.executable as exe
import spack.util.spack_yaml as syaml
@@ -31,6 +32,7 @@ env_cmd = spack.main.SpackCommand('env')
mirror_cmd = spack.main.SpackCommand('mirror')
gpg_cmd = spack.main.SpackCommand('gpg')
install_cmd = spack.main.SpackCommand('install')
+uninstall_cmd = spack.main.SpackCommand('uninstall')
buildcache_cmd = spack.main.SpackCommand('buildcache')
git = exe.which('git', required=True)
@@ -77,13 +79,13 @@ and then 'd', 'b', and 'a' to be put in the next three stages, respectively.
spec_a = Spec('a')
spec_a.concretize()
- spec_a_label = ci.spec_deps_key_label(spec_a)[1]
- spec_b_label = ci.spec_deps_key_label(spec_a['b'])[1]
- spec_c_label = ci.spec_deps_key_label(spec_a['c'])[1]
- spec_d_label = ci.spec_deps_key_label(spec_a['d'])[1]
- spec_e_label = ci.spec_deps_key_label(spec_a['e'])[1]
- spec_f_label = ci.spec_deps_key_label(spec_a['f'])[1]
- spec_g_label = ci.spec_deps_key_label(spec_a['g'])[1]
+ spec_a_label = ci.spec_deps_key(spec_a)
+ spec_b_label = ci.spec_deps_key(spec_a['b'])
+ spec_c_label = ci.spec_deps_key(spec_a['c'])
+ spec_d_label = ci.spec_deps_key(spec_a['d'])
+ spec_e_label = ci.spec_deps_key(spec_a['e'])
+ spec_f_label = ci.spec_deps_key(spec_a['f'])
+ spec_g_label = ci.spec_deps_key(spec_a['g'])
spec_labels, dependencies, stages = ci.stage_spec_jobs([spec_a])
@@ -109,6 +111,7 @@ def test_ci_generate_with_env(tmpdir, mutable_mock_env_path, env_deactivate,
install_mockery, mock_packages):
"""Make sure we can get a .gitlab-ci.yml from an environment file
which has the gitlab-ci, cdash, and mirrors sections."""
+ mirror_url = 'https://my.fake.mirror'
filename = str(tmpdir.join('spack.yaml'))
with open(filename, 'w') as f:
f.write("""\
@@ -128,7 +131,7 @@ spack:
- matrix:
- [$old-gcc-pkgs]
mirrors:
- some-mirror: https://my.fake.mirror
+ some-mirror: {0}
gitlab-ci:
bootstrap:
- name: bootstrap
@@ -140,7 +143,7 @@ spack:
tags:
- donotcare
image: donotcare
- final-stage-rebuild-index:
+ service-job-attributes:
image: donotcare
tags: [donotcare]
cdash:
@@ -148,7 +151,7 @@ spack:
url: https://my.fake.cdash
project: Not used
site: Nothing
-""")
+""".format(mirror_url))
with tmpdir.as_cwd():
env_cmd('create', 'test', './spack.yaml')
outputfile = str(tmpdir.join('.gitlab-ci.yml'))
@@ -170,6 +173,12 @@ spack:
assert(yaml_contents['stages'][0] == 'stage-0')
assert(yaml_contents['stages'][5] == 'stage-rebuild-index')
+ assert('rebuild-index' in yaml_contents)
+ rebuild_job = yaml_contents['rebuild-index']
+ expected = 'spack buildcache update-index --keys -d {0}'.format(
+ mirror_url)
+ assert(rebuild_job['script'][0] == expected)
+
def _validate_needs_graph(yaml_contents, needs_graph, artifacts):
for job_name, job_def in yaml_contents.items():
@@ -533,7 +542,7 @@ spack:
def test_ci_generate_for_pr_pipeline(tmpdir, mutable_mock_env_path,
env_deactivate, install_mockery,
- mock_packages):
+ mock_packages, monkeypatch):
"""Test that PR pipelines do not include a final stage job for
rebuilding the mirror index, even if that job is specifically
configured"""
@@ -558,9 +567,10 @@ spack:
runner-attributes:
tags:
- donotcare
- final-stage-rebuild-index:
+ service-job-attributes:
image: donotcare
tags: [donotcare]
+ rebuild-index: False
""")
with tmpdir.as_cwd():
@@ -569,6 +579,9 @@ spack:
with ev.read('test'):
os.environ['SPACK_IS_PR_PIPELINE'] = 'True'
+ os.environ['SPACK_PR_BRANCH'] = 'fake-test-branch'
+ monkeypatch.setattr(
+ ci, 'SPACK_PR_MIRRORS_ROOT_URL', r"file:///fake/mirror")
ci_cmd('generate', '--output-file', outputfile)
with open(outputfile) as f:
@@ -579,10 +592,17 @@ spack:
assert('rebuild-index' not in yaml_contents)
+ for ci_key in yaml_contents.keys():
+ if ci_key.startswith('(specs) '):
+ job_object = yaml_contents[ci_key]
+ job_vars = job_object['variables']
+ assert('SPACK_IS_PR_PIPELINE' in job_vars)
+ assert(job_vars['SPACK_IS_PR_PIPELINE'] == 'True')
+
def test_ci_generate_with_external_pkg(tmpdir, mutable_mock_env_path,
env_deactivate, install_mockery,
- mock_packages):
+ mock_packages, monkeypatch):
"""Make sure we do not generate jobs for external pkgs"""
filename = str(tmpdir.join('spack.yaml'))
with open(filename, 'w') as f:
@@ -609,6 +629,8 @@ spack:
outputfile = str(tmpdir.join('.gitlab-ci.yml'))
with ev.read('test'):
+ monkeypatch.setattr(
+ ci, 'SPACK_PR_MIRRORS_ROOT_URL', r"file:///fake/mirror")
ci_cmd('generate', '--output-file', outputfile)
with open(outputfile) as f:
@@ -712,6 +734,19 @@ spack:
- $packages
mirrors:
test-mirror: {0}
+  gitlab-ci:
+    enable-artifacts-buildcache: True
+    mappings:
+      - match:
+          - patchelf
+        runner-attributes:
+          tags:
+            - donotcare
+          image: donotcare
+    service-job-attributes:
+      tags:
+        - nonbuildtag
+      image: basicimage
""".format(mirror_url)
print('spack.yaml:\n{0}\n'.format(spack_yaml_contents))
@@ -739,6 +774,48 @@ spack:
buildcache_path = os.path.join(mirror_dir.strpath, 'build_cache')
+ # Now test the --prune-dag (default) option of spack ci generate
+ mirror_cmd('add', 'test-ci', mirror_url)
+
+ outputfile_pruned = str(tmpdir.join('pruned_pipeline.yml'))
+ ci_cmd('generate', '--output-file', outputfile_pruned)
+
+ with open(outputfile_pruned) as f:
+ contents = f.read()
+ yaml_contents = syaml.load(contents)
+ assert('no-specs-to-rebuild' in yaml_contents)
+ # Make sure there are no other spec jobs or rebuild-index
+ assert(len(yaml_contents.keys()) == 1)
+ the_elt = yaml_contents['no-specs-to-rebuild']
+ assert('tags' in the_elt)
+ assert('nonbuildtag' in the_elt['tags'])
+ assert('image' in the_elt)
+ assert(the_elt['image'] == 'basicimage')
+
+ outputfile_not_pruned = str(tmpdir.join('unpruned_pipeline.yml'))
+ ci_cmd('generate', '--no-prune-dag', '--output-file',
+ outputfile_not_pruned)
+
+ # Test the --no-prune-dag option of spack ci generate
+ with open(outputfile_not_pruned) as f:
+ contents = f.read()
+ yaml_contents = syaml.load(contents)
+
+ found_spec_job = False
+
+ for ci_key in yaml_contents.keys():
+ if '(specs) patchelf' in ci_key:
+ the_elt = yaml_contents[ci_key]
+ assert('variables' in the_elt)
+ job_vars = the_elt['variables']
+ assert('SPACK_SPEC_NEEDS_REBUILD' in job_vars)
+ assert(job_vars['SPACK_SPEC_NEEDS_REBUILD'] == 'False')
+ found_spec_job = True
+
+ assert(found_spec_job)
+
+ mirror_cmd('rm', 'test-ci')
+
# Test generating buildcache index while we have bin mirror
buildcache_cmd('update-index', '--mirror-url', mirror_url)
index_path = os.path.join(buildcache_path, 'index.json')
@@ -839,7 +916,7 @@ spack:
- custom main step
after_script:
- custom post step one
- final-stage-rebuild-index:
+ service-job-attributes:
image: donotcare
tags: [donotcare]
""")
@@ -851,6 +928,8 @@ spack:
with ev.read('test'):
monkeypatch.setattr(
spack.main, 'get_version', lambda: '0.15.3-416-12ad69eb1')
+ monkeypatch.setattr(
+ ci, 'SPACK_PR_MIRRORS_ROOT_URL', r"file:///fake/mirror")
ci_cmd('generate', '--output-file', outputfile)
with open(outputfile) as f:
@@ -925,3 +1004,270 @@ spack:
assert(the_elt['script'][0] == 'main step')
assert(len(the_elt['after_script']) == 1)
assert(the_elt['after_script'][0] == 'post step one')
+
+
+def test_ci_generate_with_workarounds(tmpdir, mutable_mock_env_path,
+ env_deactivate, install_mockery,
+ mock_packages, monkeypatch):
+ """Make sure the post-processing cli workarounds do what they should"""
+ filename = str(tmpdir.join('spack.yaml'))
+ with open(filename, 'w') as f:
+ f.write("""\
+spack:
+  specs:
+    - callpath%gcc@3.0
+  mirrors:
+    some-mirror: https://my.fake.mirror
+  gitlab-ci:
+    mappings:
+      - match: ['%gcc@3.0']
+        runner-attributes:
+          tags:
+            - donotcare
+          image: donotcare
+    enable-artifacts-buildcache: true
+""")
+
+ with tmpdir.as_cwd():
+ env_cmd('create', 'test', './spack.yaml')
+ outputfile = str(tmpdir.join('.gitlab-ci.yml'))
+
+ with ev.read('test'):
+ monkeypatch.setattr(
+ ci, 'SPACK_PR_MIRRORS_ROOT_URL', r"file:///fake/mirror")
+ ci_cmd('generate', '--output-file', outputfile, '--dependencies')
+
+ with open(outputfile) as f:
+ contents = f.read()
+ yaml_contents = syaml.load(contents)
+
+ found_one = False
+
+ for ci_key in yaml_contents.keys():
+ if ci_key.startswith('(specs) '):
+ found_one = True
+ job_obj = yaml_contents[ci_key]
+ assert('needs' not in job_obj)
+ assert('dependencies' in job_obj)
+
+ assert(found_one is True)
+
+
+@pytest.mark.disable_clean_stage_check
+def test_ci_rebuild_index(tmpdir, mutable_mock_env_path, env_deactivate,
+ install_mockery, mock_packages, mock_fetch,
+ mock_stage):
+ working_dir = tmpdir.join('working_dir')
+
+ mirror_dir = working_dir.join('mirror')
+ mirror_url = 'file://{0}'.format(mirror_dir.strpath)
+
+ spack_yaml_contents = """
+spack:
+  specs:
+    - callpath
+  mirrors:
+    test-mirror: {0}
+  gitlab-ci:
+    mappings:
+      - match:
+          - patchelf
+        runner-attributes:
+          tags:
+            - donotcare
+          image: donotcare
+""".format(mirror_url)
+
+ filename = str(tmpdir.join('spack.yaml'))
+ with open(filename, 'w') as f:
+ f.write(spack_yaml_contents)
+
+ with tmpdir.as_cwd():
+ env_cmd('create', 'test', './spack.yaml')
+ with ev.read('test'):
+ spec_map = ci.get_concrete_specs(
+ 'callpath', 'callpath', '', 'FIND_ANY')
+ concrete_spec = spec_map['callpath']
+ spec_yaml = concrete_spec.to_yaml(hash=ht.build_hash)
+ yaml_path = str(tmpdir.join('spec.yaml'))
+ with open(yaml_path, 'w') as ypfd:
+ ypfd.write(spec_yaml)
+
+ install_cmd('--keep-stage', '-f', yaml_path)
+ buildcache_cmd('create', '-u', '-a', '-f', '--mirror-url',
+ mirror_url, 'callpath')
+ ci_cmd('rebuild-index')
+
+ buildcache_path = os.path.join(mirror_dir.strpath, 'build_cache')
+ index_path = os.path.join(buildcache_path, 'index.json')
+ with open(index_path) as idx_fd:
+ index_object = json.load(idx_fd)
+ validate(index_object, db_idx_schema)
+
+
+def test_ci_generate_bootstrap_prune_dag(
+ install_mockery_mutable_config, mock_packages, mock_fetch,
+ mock_archive, mutable_config, monkeypatch, tmpdir,
+ mutable_mock_env_path, env_deactivate):
+ """Test compiler bootstrapping with DAG pruning. Specifically, make
+ sure that if we detect the bootstrapped compiler needs to be rebuilt,
+ we ensure the spec we want to build with that compiler is scheduled
+ for rebuild as well."""
+
+ # Create a temp mirror directory for buildcache usage
+ mirror_dir = tmpdir.join('mirror_dir')
+ mirror_url = 'file://{0}'.format(mirror_dir.strpath)
+
+ # Install a compiler, because we want to put it in a buildcache
+ install_cmd('gcc@10.1.0%gcc@4.5.0')
+
+ # Put installed compiler in the buildcache
+ buildcache_cmd('create', '-u', '-a', '-f', '-d', mirror_dir.strpath,
+ 'gcc@10.1.0%gcc@4.5.0')
+
+ # Now uninstall the compiler
+ uninstall_cmd('-y', 'gcc@10.1.0%gcc@4.5.0')
+
+ monkeypatch.setattr(spack.concretize.Concretizer,
+ 'check_for_compiler_existence', False)
+ spack.config.set('config:install_missing_compilers', True)
+ assert CompilerSpec('gcc@10.1.0') not in compilers.all_compiler_specs()
+
+ # Configure the mirror where we put that buildcache w/ the compiler
+ mirror_cmd('add', 'test-mirror', mirror_url)
+
+ install_cmd('--no-check-signature', 'a%gcc@10.1.0')
+
+ # Put spec built with installed compiler in the buildcache
+ buildcache_cmd('create', '-u', '-a', '-f', '-d', mirror_dir.strpath,
+ 'a%gcc@10.1.0')
+
+ # Now uninstall the spec
+ uninstall_cmd('-y', 'a%gcc@10.1.0')
+
+ filename = str(tmpdir.join('spack.yaml'))
+ with open(filename, 'w') as f:
+ f.write("""\
+spack:
+  definitions:
+    - bootstrap:
+        - gcc@10.1.0%gcc@4.5.0
+  specs:
+    - a%gcc@10.1.0
+  mirrors:
+    atestm: {0}
+  gitlab-ci:
+    bootstrap:
+      - name: bootstrap
+        compiler-agnostic: true
+    mappings:
+      - match:
+          - arch=test-debian6-x86_64
+        runner-attributes:
+          tags:
+            - donotcare
+      - match:
+          - arch=test-debian6-core2
+        runner-attributes:
+          tags:
+            - meh
+""".format(mirror_url))
+
+ # Without this monkeypatch, pipeline generation process would think that
+ # nothing in the environment needs rebuilding. With the monkeypatch, the
+ # process sees the compiler as needing a rebuild, which should then result
+ # in the specs built with that compiler needing a rebuild too.
+ def fake_get_mirrors_for_spec(spec=None, full_hash_match=False,
+ mirrors_to_check=None, index_only=False):
+ if spec.name == 'gcc':
+ return []
+ else:
+ return [{
+ 'spec': spec,
+ 'mirror_url': mirror_url,
+ }]
+
+ with tmpdir.as_cwd():
+ env_cmd('create', 'test', './spack.yaml')
+ outputfile = str(tmpdir.join('.gitlab-ci.yml'))
+
+ with ev.read('test'):
+ monkeypatch.setattr(
+ ci, 'SPACK_PR_MIRRORS_ROOT_URL', r"file:///fake/mirror")
+
+ ci_cmd('generate', '--output-file', outputfile)
+
+ with open(outputfile) as of:
+ yaml_contents = of.read()
+ original_yaml_contents = syaml.load(yaml_contents)
+
+ # without the monkeypatch, everything appears up to date and no
+ # rebuild jobs are generated.
+ assert(original_yaml_contents)
+ assert('no-specs-to-rebuild' in original_yaml_contents)
+
+ monkeypatch.setattr(spack.binary_distribution,
+ 'get_mirrors_for_spec',
+ fake_get_mirrors_for_spec)
+
+ ci_cmd('generate', '--output-file', outputfile)
+
+ with open(outputfile) as of:
+ yaml_contents = of.read()
+ new_yaml_contents = syaml.load(yaml_contents)
+
+ assert(new_yaml_contents)
+
+ # This 'needs' graph reflects that even though specs 'a' and 'b' do
+ # not otherwise need to be rebuilt (thanks to DAG pruning), they
+ # both end up in the generated pipeline because the compiler they
+ # depend on is bootstrapped, and *does* need to be rebuilt.
+ needs_graph = {
+ '(bootstrap) gcc': [],
+ '(specs) b': [
+ '(bootstrap) gcc',
+ ],
+ '(specs) a': [
+ '(bootstrap) gcc',
+ '(specs) b',
+ ],
+ }
+
+ _validate_needs_graph(new_yaml_contents, needs_graph, False)
+
+
+def test_ci_subcommands_without_mirror(tmpdir, mutable_mock_env_path,
+ env_deactivate, mock_packages,
+ install_mockery):
+ """Make sure we catch if there is not a mirror and report an error"""
+ filename = str(tmpdir.join('spack.yaml'))
+ with open(filename, 'w') as f:
+ f.write("""\
+spack:
+  specs:
+    - archive-files
+  gitlab-ci:
+    mappings:
+      - match:
+          - archive-files
+        runner-attributes:
+          tags:
+            - donotcare
+          image: donotcare
+""")
+
+ with tmpdir.as_cwd():
+ env_cmd('create', 'test', './spack.yaml')
+ outputfile = str(tmpdir.join('.gitlab-ci.yml'))
+
+ with ev.read('test'):
+ # Check the 'generate' subcommand
+ output = ci_cmd('generate', '--output-file', outputfile,
+ output=str, fail_on_error=False)
+ ex = 'spack ci generate requires an env containing a mirror'
+ assert(ex in output)
+
+ # Also check the 'rebuild-index' subcommand
+ output = ci_cmd('rebuild-index', output=str, fail_on_error=False)
+ ex = 'spack ci rebuild-index requires an env containing a mirror'
+ assert(ex in output)
diff --git a/lib/spack/spack/test/installer.py b/lib/spack/spack/test/installer.py
index 725a4e0f71..fb00081931 100644
--- a/lib/spack/spack/test/installer.py
+++ b/lib/spack/spack/test/installer.py
@@ -231,7 +231,7 @@ def test_process_binary_cache_tarball_tar(install_mockery, monkeypatch, capfd):
def test_try_install_from_binary_cache(install_mockery, mock_packages,
monkeypatch, capsys):
"""Tests SystemExit path for_try_install_from_binary_cache."""
- def _mirrors_for_spec(spec, force, full_hash_match=False):
+ def _mirrors_for_spec(spec, full_hash_match=False):
spec = spack.spec.Spec('mpi').concretized()
return [{
'mirror_url': 'notused',
diff --git a/share/spack/spack-completion.bash b/share/spack/spack-completion.bash
index ea4a630691..cf0e79cdac 100755
--- a/share/spack/spack-completion.bash
+++ b/share/spack/spack-completion.bash
@@ -473,18 +473,22 @@ _spack_ci() {
then
SPACK_COMPREPLY="-h --help"
else
- SPACK_COMPREPLY="generate rebuild"
+ SPACK_COMPREPLY="generate rebuild rebuild-index"
fi
}
_spack_ci_generate() {
- SPACK_COMPREPLY="-h --help --output-file --copy-to --optimize --dependencies"
+ SPACK_COMPREPLY="-h --help --output-file --copy-to --optimize --dependencies --prune-dag --no-prune-dag --check-index-only"
}
_spack_ci_rebuild() {
SPACK_COMPREPLY="-h --help"
}
+_spack_ci_rebuild_index() {
+ SPACK_COMPREPLY="-h --help"
+}
+
_spack_clean() {
if $list_options
then