author     Massimiliano Culpo <massimiliano.culpo@gmail.com>   2022-05-24 21:13:28 +0200
committer  GitHub <noreply@github.com>                         2022-05-24 12:13:28 -0700
commit     f2a81af70e1d70828a579ea325a0699d89c7a420 (patch)
tree       ca1c33b18e129076012b79e6d16da7219fde0ad0
parent     494e567fe5826dd8be28d95aefafc8aebb05fbbf (diff)
Best effort co-concretization (iterative algorithm) (#28941)
Currently, environments can either be concretized fully together or fully separately. This works well for users who create environments for interoperable software and can use `concretizer:unify:true`. It does not allow environments with conflicting software to be concretized for maximal interoperability.

The primary use case for this is facilities providing system software. Facilities provide multiple MPI implementations, but all software built against a given MPI ought to be interoperable.

This PR adds a concretization option `concretizer:unify:when_possible`. When this option is used, Spack will concretize specs in the environment separately, but will optimize for minimal differences in overlapping packages.

* Add a level of indirection to root specs

  This commit introduces the "literal" atom, which comes in a few different arities. The unary "literal" contains an integer that is the ID of a spec literal. The other "literal" atoms carry information on the requests made by that literal ID. For instance, zlib@1.2.11 generates the following facts:

  literal(0,"root","zlib").
  literal(0,"node","zlib").
  literal(0,"node_version_satisfies","zlib","1.2.11").

  This should help with solving large environments "together where possible", since the literals can now be solved together in batches.

* Add a mechanism to relax the number of literals being solved

* Modify spack solve to display the new criterion

  Since the new criterion ranks above all the build criteria, we need to modify the way we display the output. Originally done by Greg in #27964 and cherry-picked to this branch by the co-author of the commit.

  Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* Inject reusable specs into the solve

  Instead of coupling the PyclingoDriver() object with spack.config, inject the concrete specs that can be reused. A Solver method takes care of reading from the store and the buildcache.

* spack solve: show output of multiple rounds

* Add tests for best-effort co-concretization

* Enforce having at least one literal being solved

Co-authored-by: Greg Becker <becker33@llnl.gov>
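For a quick feel of the new mode from the Python API, here is a sketch mirroring the test added in lib/spack/spack/test/cmd/env.py below (it relies on the mock packages used by the test suite, so it is illustrative rather than something to run against a production Spack):

    import spack.environment as ev

    e = ev.create('coconcretization')
    e.unify = 'when_possible'

    # These two roots conflict, so they cannot be unified into a single spec...
    e.add('mpileaks+opt')
    e.add('mpileaks~opt')
    # ...but anything that can be shared (e.g. the MPI provider) is reused
    e.add('mpich')

    e.concretize()

    # Two mpileaks configurations, one shared mpich
    assert len([s for s in e.all_specs() if s.satisfies('mpileaks')]) == 2
    assert len([s for s in e.all_specs() if s.satisfies('mpich')]) == 1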
-rw-r--r--  lib/spack/docs/environments.rst  73
-rw-r--r--  lib/spack/spack/cmd/solve.py  133
-rw-r--r--  lib/spack/spack/environment/environment.py  52
-rw-r--r--  lib/spack/spack/schema/concretizer.py  10
-rw-r--r--  lib/spack/spack/solver/asp.py  307
-rw-r--r--  lib/spack/spack/solver/concretize.lp  46
-rw-r--r--  lib/spack/spack/test/cmd/concretize.py  21
-rw-r--r--  lib/spack/spack/test/cmd/env.py  26
-rw-r--r--  lib/spack/spack/test/concretize.py  64
-rw-r--r--  lib/spack/spack/test/spec_dag.py  2
-rw-r--r--  share/spack/gitlab/cloud_pipelines/stacks/build_systems/spack.yaml  2
-rw-r--r--  var/spack/repos/builtin.mock/packages/installed-deps-b/package.py  2
12 files changed, 532 insertions, 206 deletions
diff --git a/lib/spack/docs/environments.rst b/lib/spack/docs/environments.rst
index fca70d8af9..72882c6c71 100644
--- a/lib/spack/docs/environments.rst
+++ b/lib/spack/docs/environments.rst
@@ -273,19 +273,9 @@ or
Concretizing
^^^^^^^^^^^^
-Once some user specs have been added to an environment, they can be
-concretized. *By default specs are concretized separately*, one after
-the other. This mode of operation permits to deploy a full
-software stack where multiple configurations of the same package
-need to be installed alongside each other. Central installations done
-at HPC centers by system administrators or user support groups
-are a common case that fits in this behavior.
-Environments *can also be configured to concretize all
-the root specs in a unified way* to ensure that
-each package in the environment corresponds to a single concrete spec. This
-mode of operation is usually what is required by software developers that
-want to deploy their development environment.
-
+Once some user specs have been added to an environment, they can be concretized.
+There are currently three different modes of operation to concretize an environment,
+which are explained in detail in :ref:`environments_concretization_config`.
Regardless of which mode of operation has been chosen, the following
command will ensure all the root specs are concretized according to the
constraints that are prescribed in the configuration:
@@ -493,25 +483,64 @@ Appending to this list in the yaml is identical to using the ``spack
add`` command from the command line. However, there is more power
available from the yaml file.
+.. _environments_concretization_config:
+
^^^^^^^^^^^^^^^^^^^
Spec concretization
^^^^^^^^^^^^^^^^^^^
+An environment can be concretized in three different modes; the active behavior for any environment
+is determined by the ``concretizer:unify`` property. By default, specs are concretized *separately*, one after the other:
-Specs can be concretized separately or together, as already
-explained in :ref:`environments_concretization`. The behavior active
-under any environment is determined by the ``concretizer:unify`` property:
+.. code-block:: yaml
+
+ spack:
+ specs:
+ - hdf5~mpi
+ - hdf5+mpi
+ - zlib@1.2.8
+ concretizer:
+ unify: false
+
+This mode of operation makes it possible to deploy a full software stack where multiple configurations of the same package
+need to be installed alongside each other using the best possible selection of transitive dependencies. The downside
+is that redundancy of installations is disregarded completely, and thus environments might be more bloated than
+strictly needed. In the example above, for instance, if a version of ``zlib`` newer than ``1.2.8`` is known to Spack,
+then it will be used for both ``hdf5`` installations.
+
+If redundancy of the environment is a concern, Spack provides a way to concretize it *together where possible*,
+i.e. trying to maximize the reuse of dependencies across different specs:
.. code-block:: yaml
spack:
specs:
- - ncview
- - netcdf
- - nco
- - py-sphinx
+ - hdf5~mpi
+ - hdf5+mpi
+ - zlib@1.2.8
+ concretizer:
+ unify: when_possible
+
+In this case, too, Spack allows multiple configurations of the same package, but prioritizes the reuse of
+specs over other factors. Going back to our example, this means that both ``hdf5`` installations will use
+``zlib@1.2.8`` as a dependency even if newer versions of that library are available.
+Central installations done at HPC centers by system administrators or user support groups are a common case
+that fits either of these two modes.
+
+Environments can also be configured to concretize all the root specs *together*, in a self-consistent way, to
+ensure that each package in the environment comes with a single configuration:
+
+.. code-block:: yaml
+
+ spack:
+ specs:
+ - hdf5+mpi
+ - zlib@1.2.8
concretizer:
unify: true
+This mode of operation is usually what is required by software developers who want to deploy their development
+environment and have a single view of it in the filesystem.
+
.. note::
The ``concretizer:unify`` config option was introduced in Spack 0.18 to
@@ -521,9 +550,9 @@ under any environment is determined by the ``concretizer:unify`` property:
.. admonition:: Re-concretization of user specs
- When concretizing specs together the entire set of specs will be
+ When concretizing specs *together* or *together where possible* the entire set of specs will be
re-concretized after any addition of new user specs, to ensure that
- the environment remains consistent. When instead the specs are concretized
+ the environment remains consistent / minimal. When instead the specs are concretized
separately only the new specs will be re-concretized after any addition.
^^^^^^^^^^^^^
diff --git a/lib/spack/spack/cmd/solve.py b/lib/spack/spack/cmd/solve.py
index f329bfd829..ba5cc218cb 100644
--- a/lib/spack/spack/cmd/solve.py
+++ b/lib/spack/spack/cmd/solve.py
@@ -15,6 +15,8 @@ import llnl.util.tty.color as color
import spack
import spack.cmd
import spack.cmd.common.arguments as arguments
+import spack.config
+import spack.environment
import spack.hash_types as ht
import spack.package
import spack.solver.asp as asp
@@ -74,6 +76,51 @@ def setup_parser(subparser):
spack.cmd.common.arguments.add_concretizer_args(subparser)
+def _process_result(result, show, required_format, kwargs):
+ result.raise_if_unsat()
+ opt, _, _ = min(result.answers)
+ if ("opt" in show) and (not required_format):
+ tty.msg("Best of %d considered solutions." % result.nmodels)
+ tty.msg("Optimization Criteria:")
+
+ maxlen = max(len(s[2]) for s in result.criteria)
+ color.cprint(
+ "@*{ Priority Criterion %sInstalled ToBuild}" % ((maxlen - 10) * " ")
+ )
+
+ fmt = " @K{%%-8d} %%-%ds%%9s %%7s" % maxlen
+ for i, (installed_cost, build_cost, name) in enumerate(result.criteria, 1):
+ color.cprint(
+ fmt % (
+ i,
+ name,
+ "-" if build_cost is None else installed_cost,
+ installed_cost if build_cost is None else build_cost,
+ )
+ )
+ print()
+
+ # dump the solutions as concretized specs
+ if 'solutions' in show:
+ for spec in result.specs:
+ # With -y, just print YAML to output.
+ if required_format == 'yaml':
+ # use write because to_yaml already has a newline.
+ sys.stdout.write(spec.to_yaml(hash=ht.dag_hash))
+ elif required_format == 'json':
+ sys.stdout.write(spec.to_json(hash=ht.dag_hash))
+ else:
+ sys.stdout.write(
+ spec.tree(color=sys.stdout.isatty(), **kwargs))
+ print()
+
+ if result.unsolved_specs and "solutions" in show:
+ tty.msg("Unsolved specs")
+ for spec in result.unsolved_specs:
+ print(spec)
+ print()
+
+
def solve(parser, args):
# these are the same options as `spack spec`
name_fmt = '{namespace}.{name}' if args.namespaces else '{name}'
@@ -102,58 +149,42 @@ def solve(parser, args):
if models < 0:
tty.die("model count must be non-negative: %d")
- specs = spack.cmd.parse_specs(args.specs)
-
- # set up solver parameters
- # Note: reuse and other concretizer prefs are passed as configuration
- solver = asp.Solver()
- output = sys.stdout if "asp" in show else None
- result = solver.solve(
- specs,
- out=output,
- models=models,
- timers=args.timers,
- stats=args.stats,
- setup_only=(set(show) == {'asp'})
- )
- if 'solutions' not in show:
- return
-
- # die if no solution was found
- result.raise_if_unsat()
-
- # show the solutions as concretized specs
- if 'solutions' in show:
- opt, _, _ = min(result.answers)
+ # Format required for the output (JSON, YAML or None)
+ required_format = args.format
- if ("opt" in show) and (not args.format):
- tty.msg("Best of %d considered solutions." % result.nmodels)
- tty.msg("Optimization Criteria:")
+ # If we have an active environment, pick the specs from there
+ env = spack.environment.active_environment()
+ if env and args.specs:
+ msg = "cannot give explicit specs when an environment is active"
+ raise RuntimeError(msg)
- maxlen = max(len(s[2]) for s in result.criteria)
- color.cprint(
- "@*{ Priority Criterion %sInstalled ToBuild}" % ((maxlen - 10) * " ")
- )
-
- fmt = " @K{%%-8d} %%-%ds%%9s %%7s" % maxlen
- for i, (installed_cost, build_cost, name) in enumerate(result.criteria, 1):
- color.cprint(
- fmt % (
- i,
- name,
- "-" if build_cost is None else installed_cost,
- installed_cost if build_cost is None else build_cost,
- )
- )
- print()
+ specs = list(env.user_specs) if env else spack.cmd.parse_specs(args.specs)
- for spec in result.specs:
- # With -y, just print YAML to output.
- if args.format == 'yaml':
- # use write because to_yaml already has a newline.
- sys.stdout.write(spec.to_yaml(hash=ht.dag_hash))
- elif args.format == 'json':
- sys.stdout.write(spec.to_json(hash=ht.dag_hash))
+ solver = asp.Solver()
+ output = sys.stdout if "asp" in show else None
+ setup_only = set(show) == {'asp'}
+ unify = spack.config.get('concretizer:unify')
+ if unify != 'when_possible':
+ # set up solver parameters
+ # Note: reuse and other concretizer prefs are passed as configuration
+ result = solver.solve(
+ specs,
+ out=output,
+ models=models,
+ timers=args.timers,
+ stats=args.stats,
+ setup_only=setup_only
+ )
+ if not setup_only:
+ _process_result(result, show, required_format, kwargs)
+ else:
+ for idx, result in enumerate(solver.solve_in_rounds(
+ specs, out=output, models=models, timers=args.timers, stats=args.stats
+ )):
+ if "solutions" in show:
+ tty.msg("ROUND {0}".format(idx))
+ tty.msg("")
else:
- sys.stdout.write(
- spec.tree(color=sys.stdout.isatty(), **kwargs))
+ print("% END ROUND {0}\n".format(idx))
+ if not setup_only:
+ _process_result(result, show, required_format, kwargs)
diff --git a/lib/spack/spack/environment/environment.py b/lib/spack/spack/environment/environment.py
index b275dd02f2..7366468f7a 100644
--- a/lib/spack/spack/environment/environment.py
+++ b/lib/spack/spack/environment/environment.py
@@ -628,7 +628,7 @@ class Environment(object):
# This attribute will be set properly from configuration
# during concretization
- self.concretization = None
+ self.unify = None
self.clear()
if init_file:
@@ -772,10 +772,14 @@ class Environment(object):
# Retrieve the current concretization strategy
configuration = config_dict(self.yaml)
- # Let `concretization` overrule `concretize:unify` config for now.
- unify = spack.config.get('concretizer:unify')
- self.concretization = configuration.get(
- 'concretization', 'together' if unify else 'separately')
+ # Let `concretization` overrule `concretize:unify` config for now,
+ # but use a translation table to have internally a representation
+ # as if we were using the new configuration
+ translation = {'separately': False, 'together': True}
+ try:
+ self.unify = translation[configuration['concretization']]
+ except KeyError:
+ self.unify = spack.config.get('concretizer:unify', False)
# Retrieve dev-build packages:
self.dev_specs = configuration.get('develop', {})
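The net effect of this translation is easiest to see with a few concrete inputs; a small hypothetical sketch mirroring the hunk above (effective_unify and its default argument are illustrative, not part of the patch):

    # Hypothetical helper, only mirrors the lookup performed in the hunk above
    translation = {'separately': False, 'together': True}

    def effective_unify(env_configuration, config_default=False):
        # The legacy 'concretization' key still wins over concretizer:unify
        try:
            return translation[env_configuration['concretization']]
        except KeyError:
            return config_default

    assert effective_unify({'concretization': 'together'}) is True
    assert effective_unify({'concretization': 'separately'}) is False
    assert effective_unify({}, config_default='when_possible') == 'when_possible'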
@@ -1156,14 +1160,44 @@ class Environment(object):
self.specs_by_hash = {}
# Pick the right concretization strategy
- if self.concretization == 'together':
+ if self.unify == 'when_possible':
+ return self._concretize_together_where_possible(tests=tests)
+
+ if self.unify is True:
return self._concretize_together(tests=tests)
- if self.concretization == 'separately':
+ if self.unify is False:
return self._concretize_separately(tests=tests)
msg = 'concretization strategy not implemented [{0}]'
- raise SpackEnvironmentError(msg.format(self.concretization))
+ raise SpackEnvironmentError(msg.format(self.unify))
+
+ def _concretize_together_where_possible(self, tests=False):
+ # Avoid cyclic dependency
+ import spack.solver.asp
+
+ # Exit early if the set of concretized specs is the set of user specs
+ user_specs_did_not_change = not bool(
+ set(self.user_specs) - set(self.concretized_user_specs)
+ )
+ if user_specs_did_not_change:
+ return []
+
+ # Proceed with concretization
+ self.concretized_user_specs = []
+ self.concretized_order = []
+ self.specs_by_hash = {}
+
+ result_by_user_spec = {}
+ solver = spack.solver.asp.Solver()
+ for result in solver.solve_in_rounds(self.user_specs, tests=tests):
+ result_by_user_spec.update(result.specs_by_input)
+
+ result = []
+ for abstract, concrete in sorted(result_by_user_spec.items()):
+ self._add_concrete_spec(abstract, concrete)
+ result.append((abstract, concrete))
+ return result
def _concretize_together(self, tests=False):
"""Concretization strategy that concretizes all the specs
@@ -1316,7 +1350,7 @@ class Environment(object):
concrete_spec: if provided, then it is assumed that it is the
result of concretizing the provided ``user_spec``
"""
- if self.concretization == 'together':
+ if self.unify is True:
msg = 'cannot install a single spec in an environment that is ' \
'configured to be concretized together. Run instead:\n\n' \
' $ spack add <spec>\n' \
diff --git a/lib/spack/spack/schema/concretizer.py b/lib/spack/spack/schema/concretizer.py
index 46d0e9126b..63a1692411 100644
--- a/lib/spack/spack/schema/concretizer.py
+++ b/lib/spack/spack/schema/concretizer.py
@@ -26,12 +26,10 @@ properties = {
}
},
'unify': {
- 'type': 'boolean'
- # Todo: add when_possible.
- # 'oneOf': [
- # {'type': 'boolean'},
- # {'type': 'string', 'enum': ['when_possible']}
- # ]
+ 'oneOf': [
+ {'type': 'boolean'},
+ {'type': 'string', 'enum': ['when_possible']}
+ ]
}
}
}
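To see what the relaxed schema now accepts, here is a rough sketch that validates a few values against an inlined copy of just the ``unify`` sub-schema from the hunk above (Spack's config validation is built on jsonschema, but this standalone snippet is illustrative only):

    import jsonschema

    # Inlined copy of the 'unify' sub-schema from the hunk above
    unify_schema = {
        'oneOf': [
            {'type': 'boolean'},
            {'type': 'string', 'enum': ['when_possible']},
        ]
    }

    for value in (True, False, 'when_possible'):
        jsonschema.validate(value, unify_schema)  # all three values are accepted

    try:
        jsonschema.validate('sometimes', unify_schema)  # any other string is rejected
    except jsonschema.ValidationError:
        print("rejected as expected")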
diff --git a/lib/spack/spack/solver/asp.py b/lib/spack/spack/solver/asp.py
index b4739b616d..7a8117976b 100644
--- a/lib/spack/spack/solver/asp.py
+++ b/lib/spack/spack/solver/asp.py
@@ -98,6 +98,20 @@ DeclaredVersion = collections.namedtuple(
# Below numbers are used to map names of criteria to the order
# they appear in the solution. See concretize.lp
+# The space of possible priorities for optimization targets
+# is partitioned in the following ranges:
+#
+# [0-100) Optimization criteria for software being reused
+# [100-200) Fixed criteria that are higher priority than reuse, but lower than build
+# [200-300) Optimization criteria for software being built
+# [300-1000) High-priority fixed criteria
+# [1000-inf) Error conditions
+#
+# Each optimization target is a minimization with optimal value 0.
+
+#: High fixed priority offset for criteria that supersede all build criteria
+high_fixed_priority_offset = 300
+
#: Priority offset for "build" criteria (regular criteria shifted to
#: higher priority for specs we have to build)
build_priority_offset = 200
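As a worked example of the partitioning described above, a small illustrative helper (not part of the patch) that names the range a given criterion priority falls into; the offsets match the constants and ranges listed in the comment:

    def classify_priority(priority):
        """Illustrative only: name the range a criterion priority falls into."""
        if priority >= 1000:
            return 'error condition'
        if priority >= 300:   # high_fixed_priority_offset
            return 'high-priority fixed criterion'
        if priority >= 200:   # build_priority_offset
            return 'criterion for software being built'
        if priority >= 100:   # fixed_priority_offset
            return 'fixed criterion (above reuse, below build)'
        return 'criterion for software being reused'

    assert classify_priority(300) == 'high-priority fixed criterion'
    assert classify_priority(42) == 'criterion for software being reused'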
@@ -112,6 +126,7 @@ def build_criteria_names(costs, tuples):
priorities_names = []
num_fixed = 0
+ num_high_fixed = 0
for pred, args in tuples:
if pred != "opt_criterion":
continue
@@ -128,6 +143,8 @@ def build_criteria_names(costs, tuples):
if priority < fixed_priority_offset:
build_priority = priority + build_priority_offset
priorities_names.append((build_priority, name))
+ elif priority >= high_fixed_priority_offset:
+ num_high_fixed += 1
else:
num_fixed += 1
@@ -141,19 +158,26 @@ def build_criteria_names(costs, tuples):
# split list into three parts: build criteria, fixed criteria, non-build criteria
num_criteria = len(priorities_names)
- num_build = (num_criteria - num_fixed) // 2
+ num_build = (num_criteria - num_fixed - num_high_fixed) // 2
+
+ build_start_idx = num_high_fixed
+ fixed_start_idx = num_high_fixed + num_build
+ installed_start_idx = num_high_fixed + num_build + num_fixed
- build = priorities_names[:num_build]
- fixed = priorities_names[num_build:num_build + num_fixed]
- installed = priorities_names[num_build + num_fixed:]
+ high_fixed = priorities_names[:build_start_idx]
+ build = priorities_names[build_start_idx:fixed_start_idx]
+ fixed = priorities_names[fixed_start_idx:installed_start_idx]
+ installed = priorities_names[installed_start_idx:]
# mapping from priority to index in cost list
indices = dict((p, i) for i, (p, n) in enumerate(priorities_names))
# make a list that has each name with its build and non-build costs
- criteria = [
- (costs[p - fixed_priority_offset + num_build], None, name) for p, name in fixed
- ]
+ criteria = [(cost, None, name) for cost, (p, name) in
+ zip(costs[:build_start_idx], high_fixed)]
+ criteria += [(cost, None, name) for cost, (p, name) in
+ zip(costs[fixed_start_idx:installed_start_idx], fixed)]
+
for (i, name), (b, _) in zip(installed, build):
criteria.append((costs[indices[i]], costs[indices[b]], name))
@@ -306,7 +330,9 @@ class Result(object):
self.abstract_specs = specs
# Concrete specs
+ self._concrete_specs_by_input = None
self._concrete_specs = None
+ self._unsolved_specs = None
def format_core(self, core):
"""
@@ -403,15 +429,32 @@ class Result(object):
"""List of concretized specs satisfying the initial
abstract request.
"""
- # The specs were already computed, return them
- if self._concrete_specs:
- return self._concrete_specs
+ if self._concrete_specs is None:
+ self._compute_specs_from_answer_set()
+ return self._concrete_specs
+
+ @property
+ def unsolved_specs(self):
+ """List of abstract input specs that were not solved."""
+ if self._unsolved_specs is None:
+ self._compute_specs_from_answer_set()
+ return self._unsolved_specs
- # Assert prerequisite
- msg = 'cannot compute specs ["satisfiable" is not True ]'
- assert self.satisfiable, msg
+ @property
+ def specs_by_input(self):
+ if self._concrete_specs_by_input is None:
+ self._compute_specs_from_answer_set()
+ return self._concrete_specs_by_input
+
+ def _compute_specs_from_answer_set(self):
+ if not self.satisfiable:
+ self._concrete_specs = []
+ self._unsolved_specs = self.abstract_specs
+ self._concrete_specs_by_input = {}
+ return
- self._concrete_specs = []
+ self._concrete_specs, self._unsolved_specs = [], []
+ self._concrete_specs_by_input = {}
best = min(self.answers)
opt, _, answer = best
for input_spec in self.abstract_specs:
@@ -420,10 +463,13 @@ class Result(object):
providers = [spec.name for spec in answer.values()
if spec.package.provides(key)]
key = providers[0]
+ candidate = answer.get(key)
- self._concrete_specs.append(answer[key])
-
- return self._concrete_specs
+ if candidate and candidate.satisfies(input_spec):
+ self._concrete_specs.append(answer[key])
+ self._concrete_specs_by_input[input_spec] = answer[key]
+ else:
+ self._unsolved_specs.append(input_spec)
def _normalize_packages_yaml(packages_yaml):
@@ -520,6 +566,7 @@ class PyclingoDriver(object):
setup,
specs,
nmodels=0,
+ reuse=None,
timers=False,
stats=False,
out=None,
@@ -530,7 +577,8 @@ class PyclingoDriver(object):
Arguments:
setup (SpackSolverSetup): An object to set up the ASP problem.
specs (list): List of ``Spec`` objects to solve for.
- nmodels (list): Number of models to consider (default 0 for unlimited).
+ nmodels (int): Number of models to consider (default 0 for unlimited).
+ reuse (None or list): list of concrete specs that can be reused
timers (bool): Print out coarse timers for different solve phases.
stats (bool): Whether to output Clingo's internal solver statistics.
out: Optional output stream for the generated ASP program.
@@ -554,7 +602,7 @@ class PyclingoDriver(object):
self.assumptions = []
with self.control.backend() as backend:
self.backend = backend
- setup.setup(self, specs)
+ setup.setup(self, specs, reuse=reuse)
timer.phase("setup")
# read in the main ASP program and display logic -- these are
@@ -573,6 +621,7 @@ class PyclingoDriver(object):
arg = ast_sym(ast_sym(term.atom).arguments[0])
self.fact(AspFunction(name)(arg.string))
+ self.h1("Error messages")
path = os.path.join(parent_dir, 'concretize.lp')
parse_files([path], visit)
@@ -622,7 +671,7 @@ class PyclingoDriver(object):
if result.satisfiable:
# build spec from the best model
- builder = SpecBuilder(specs)
+ builder = SpecBuilder(specs, reuse=reuse)
min_cost, best_model = min(models)
tuples = [
(sym.name, [stringify(a) for a in sym.arguments])
@@ -654,7 +703,7 @@ class PyclingoDriver(object):
class SpackSolverSetup(object):
"""Class to set up and run a Spack concretization solve."""
- def __init__(self, reuse=False, tests=False):
+ def __init__(self, tests=False):
self.gen = None # set by setup()
self.declared_versions = {}
@@ -680,11 +729,11 @@ class SpackSolverSetup(object):
self.target_specs_cache = None
# whether to add installed/binary hashes to the solve
- self.reuse = reuse
-
- # whether to add installed/binary hashes to the solve
self.tests = tests
+ # If False, allow some input specs to remain unsolved
+ self.concretize_everything = True
+
def pkg_version_rules(self, pkg):
"""Output declared versions of a package.
@@ -1737,32 +1786,7 @@ class SpackSolverSetup(object):
if spec.concrete:
self._facts_from_concrete_spec(spec, possible)
- def define_installed_packages(self, specs, possible):
- """Add facts about all specs already in the database.
-
- Arguments:
- possible (dict): result of Package.possible_dependencies() for
- specs in this solve.
- """
- # Specs from local store
- with spack.store.db.read_transaction():
- for spec in spack.store.db.query(installed=True):
- if not spec.satisfies('dev_path=*'):
- self._facts_from_concrete_spec(spec, possible)
-
- # Specs from configured buildcaches
- try:
- index = spack.binary_distribution.update_cache_and_get_specs()
- for spec in index:
- if not spec.satisfies('dev_path=*'):
- self._facts_from_concrete_spec(spec, possible)
- except (spack.binary_distribution.FetchCacheError, IndexError):
- # this is raised when no mirrors had indices.
- # TODO: update mirror configuration so it can indicate that the source cache
- # TODO: (or any mirror really) doesn't have binaries.
- pass
-
- def setup(self, driver, specs):
+ def setup(self, driver, specs, reuse=None):
"""Generate an ASP program with relevant constraints for specs.
This calls methods on the solve driver to set up the problem with
@@ -1770,7 +1794,9 @@ class SpackSolverSetup(object):
specs, as well as constraints from the specs themselves.
Arguments:
+ driver (PyclingoDriver): driver instance of this solve
specs (list): list of Specs to solve
+ reuse (None or list): list of concrete specs that can be reused
"""
self._condition_id_counter = itertools.count()
@@ -1809,11 +1835,11 @@ class SpackSolverSetup(object):
self.gen.h1("Concrete input spec definitions")
self.define_concrete_input_specs(specs, possible)
- if self.reuse:
- self.gen.h1("Installed packages")
+ if reuse:
+ self.gen.h1("Reusable specs")
self.gen.fact(fn.optimize_for_reuse())
- self.gen.newline()
- self.define_installed_packages(specs, possible)
+ for reusable_spec in reuse:
+ self._facts_from_concrete_spec(reusable_spec, possible)
self.gen.h1('General Constraints')
self.available_compilers()
@@ -1846,19 +1872,7 @@ class SpackSolverSetup(object):
_develop_specs_from_env(dep, env)
self.gen.h1('Spec Constraints')
- for spec in sorted(specs):
- self.gen.h2('Spec: %s' % str(spec))
- self.gen.fact(
- fn.virtual_root(spec.name) if spec.virtual
- else fn.root(spec.name)
- )
-
- for clause in self.spec_clauses(spec):
- self.gen.fact(clause)
- if clause.name == 'variant_set':
- self.gen.fact(
- fn.variant_default_value_from_cli(*clause.args)
- )
+ self.literal_specs(specs)
self.gen.h1("Variant Values defined in specs")
self.define_variant_values()
@@ -1875,45 +1889,47 @@ class SpackSolverSetup(object):
self.gen.h1("Target Constraints")
self.define_target_constraints()
+ def literal_specs(self, specs):
+ for idx, spec in enumerate(specs):
+ self.gen.h2('Spec: %s' % str(spec))
+ self.gen.fact(fn.literal(idx))
+
+ root_fn = fn.virtual_root(spec.name) if spec.virtual else fn.root(spec.name)
+ self.gen.fact(fn.literal(idx, root_fn.name, *root_fn.args))
+ for clause in self.spec_clauses(spec):
+ self.gen.fact(fn.literal(idx, clause.name, *clause.args))
+ if clause.name == 'variant_set':
+ self.gen.fact(fn.literal(
+ idx, "variant_default_value_from_cli", *clause.args
+ ))
+
+ if self.concretize_everything:
+ self.gen.fact(fn.concretize_everything())
+
class SpecBuilder(object):
"""Class with actions to rebuild a spec from ASP results."""
#: Attributes that don't need actions
ignored_attributes = ["opt_criterion"]
- def __init__(self, specs):
+ def __init__(self, specs, reuse=None):
self._specs = {}
self._result = None
self._command_line_specs = specs
self._flag_sources = collections.defaultdict(lambda: set())
self._flag_compiler_defaults = set()
+ # Pass in as arguments reusable specs and plug them in
+ # from this dictionary during reconstruction
+ self._hash_lookup = {}
+ if reuse is not None:
+ for spec in reuse:
+ for node in spec.traverse():
+ self._hash_lookup.setdefault(node.dag_hash(), node)
+
def hash(self, pkg, h):
if pkg not in self._specs:
- try:
- # try to get the candidate from the store
- concrete_spec = spack.store.db.get_by_hash(h)[0]
- except TypeError:
- # the dag hash was not in the DB, try buildcache
- s = spack.binary_distribution.binary_index.find_by_hash(h)
- if s:
- concrete_spec = s[0]['spec']
- else:
- # last attempt: maybe the hash comes from a particular input spec
- # this only occurs in tests (so far)
- for clspec in self._command_line_specs:
- for spec in clspec.traverse():
- if spec.concrete and spec.dag_hash() == h:
- concrete_spec = spec
-
- assert concrete_spec, "Unable to look up concrete spec with hash %s" % h
- self._specs[pkg] = concrete_spec
- else:
- # TODO: remove this code -- it's dead unless we decide that node() clauses
- # should come before hashes.
- # ensure that if it's already there, it's correct
- spec = self._specs[pkg]
- assert spec.dag_hash() == h
+ self._specs[pkg] = self._hash_lookup[h]
def node(self, pkg):
if pkg not in self._specs:
@@ -2183,7 +2199,7 @@ def _develop_specs_from_env(spec, env):
class Solver(object):
"""This is the main external interface class for solving.
- It manages solver configuration and preferences in once place. It sets up the solve
+ It manages solver configuration and preferences in one place. It sets up the solve
and passes the setup method to the driver, as well.
Properties of interest:
@@ -2199,6 +2215,42 @@ class Solver(object):
# by setting them directly as properties.
self.reuse = spack.config.get("concretizer:reuse", False)
+ @staticmethod
+ def _check_input_and_extract_concrete_specs(specs):
+ reusable = []
+ for root in specs:
+ for s in root.traverse():
+ if s.virtual:
+ continue
+ if s.concrete:
+ reusable.append(s)
+ spack.spec.Spec.ensure_valid_variants(s)
+ return reusable
+
+ def _reusable_specs(self):
+ reusable_specs = []
+ if self.reuse:
+ # Specs from the local Database
+ with spack.store.db.read_transaction():
+ reusable_specs.extend([
+ s for s in spack.store.db.query(installed=True)
+ if not s.satisfies('dev_path=*')
+ ])
+
+ # Specs from buildcaches
+ try:
+ index = spack.binary_distribution.update_cache_and_get_specs()
+ reusable_specs.extend([
+ s for s in index if not s.satisfies('dev_path=*')
+ ])
+
+ except (spack.binary_distribution.FetchCacheError, IndexError):
+ # this is raised when no mirrors had indices.
+ # TODO: update mirror configuration so it can indicate that the
+ # TODO: source cache (or any mirror really) doesn't have binaries.
+ pass
+ return reusable_specs
+
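Reuse candidates are now gathered by this method and injected into the solve, instead of being emitted as facts during setup. A rough usage sketch, following the configuration-override pattern used in the tests below (the 'zlib' spec is just an example input):

    import spack.config
    import spack.solver.asp
    import spack.spec

    with spack.config.override('concretizer:reuse', True):
        solver = spack.solver.asp.Solver()   # reads concretizer:reuse from config
        result = solver.solve([spack.spec.Spec('zlib')])
        result.raise_if_unsat()
        print(result.specs[0].tree())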
def solve(
self,
specs,
@@ -2222,23 +2274,78 @@ class Solver(object):
setup_only (bool): if True, stop after setup and don't solve (default False).
"""
# Check upfront that the variants are admissible
- for root in specs:
- for s in root.traverse():
- if s.virtual:
- continue
- spack.spec.Spec.ensure_valid_variants(s)
-
- setup = SpackSolverSetup(reuse=self.reuse, tests=tests)
+ reusable_specs = self._check_input_and_extract_concrete_specs(specs)
+ reusable_specs.extend(self._reusable_specs())
+ setup = SpackSolverSetup(tests=tests)
return self.driver.solve(
setup,
specs,
nmodels=models,
+ reuse=reusable_specs,
timers=timers,
stats=stats,
out=out,
setup_only=setup_only,
)
+ def solve_in_rounds(
+ self,
+ specs,
+ out=None,
+ models=0,
+ timers=False,
+ stats=False,
+ tests=False,
+ ):
+ """Solve for a stable model of specs in multiple rounds.
+
+ This relaxes the assumption of solve that everything must be consistent and
+ solvable in a single round. Each round tries to maximize the reuse of specs
+ from previous rounds.
+
+ The function is a generator that yields the result of each round.
+
+ Arguments:
+ specs (list): list of Specs to solve.
+ models (int): number of models to search (default: 0)
+ out: Optionally write the generated ASP program to a file-like object.
+ timers (bool): print timing if set to True
+ stats (bool): print internal statistics if set to True
+ tests (bool): add test dependencies to the solve
+ """
+ reusable_specs = self._check_input_and_extract_concrete_specs(specs)
+ reusable_specs.extend(self._reusable_specs())
+ setup = SpackSolverSetup(tests=tests)
+
+ # Tell clingo that we don't have to solve all the inputs at once
+ setup.concretize_everything = False
+
+ input_specs = specs
+ while True:
+ result = self.driver.solve(
+ setup,
+ input_specs,
+ nmodels=models,
+ reuse=reusable_specs,
+ timers=timers,
+ stats=stats,
+ out=out,
+ setup_only=False
+ )
+ yield result
+
+ # If we don't have unsolved specs we are done
+ if not result.unsolved_specs:
+ break
+
+ # This means we cannot progress with solving the input
+ if not result.satisfiable or not result.specs:
+ break
+
+ input_specs = result.unsolved_specs
+ for spec in result.specs:
+ reusable_specs.extend(spec.traverse())
+
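A minimal sketch of driving the generator, following the same pattern as the environment code above and the new unit tests (the spec names are just examples):

    import spack.solver.asp
    import spack.spec

    inputs = [spack.spec.Spec(s) for s in ('hdf5~mpi', 'hdf5+mpi', 'zlib@1.2.8')]

    solver = spack.solver.asp.Solver()
    concrete_by_input = {}
    for round_idx, result in enumerate(solver.solve_in_rounds(inputs)):
        # Each round concretizes the specs that are still unsolved,
        # reusing everything concretized in earlier rounds
        concrete_by_input.update(result.specs_by_input)

    for abstract, concrete in sorted(concrete_by_input.items()):
        print(abstract, '->', concrete.format('{name}{@version}'))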
class UnsatisfiableSpecError(spack.error.UnsatisfiableSpecError):
"""
diff --git a/lib/spack/spack/solver/concretize.lp b/lib/spack/spack/solver/concretize.lp
index 1e7d0f66de..b9b3141499 100644
--- a/lib/spack/spack/solver/concretize.lp
+++ b/lib/spack/spack/solver/concretize.lp
@@ -8,6 +8,52 @@
%=============================================================================
%-----------------------------------------------------------------------------
+% Map literal input specs to facts that drive the solve
+%-----------------------------------------------------------------------------
+
+% Give clingo the choice to solve an input spec or not
+{ literal_solved(ID) } :- literal(ID).
+literal_not_solved(ID) :- not literal_solved(ID), literal(ID).
+
+% If concretize_everything() is a fact, then we cannot have unsolved specs
+:- literal_not_solved(ID), concretize_everything.
+
+% Make a problem with "zero literals solved" unsat. This is to trigger
+% looking for solutions to the ASP problem with "errors", which results
+% in better reporting for users. See #30669 for details.
+1 { literal_solved(ID) : literal(ID) }.
+
+opt_criterion(300, "number of input specs not concretized").
+#minimize{ 0@300: #true }.
+#minimize { 1@300,ID : literal_not_solved(ID) }.
+
+% Map constraints on solved literal IDs to attr() facts that drive the solve
+attr(Name, A1) :- literal(LiteralID, Name, A1), literal_solved(LiteralID).
+attr(Name, A1, A2) :- literal(LiteralID, Name, A1, A2), literal_solved(LiteralID).
+attr(Name, A1, A2, A3) :- literal(LiteralID, Name, A1, A2, A3), literal_solved(LiteralID).
+
+% For these two atoms we only need implications in one direction
+root(Package) :- attr("root", Package).
+virtual_root(Package) :- attr("virtual_root", Package).
+
+node_platform_set(Package, Platform) :- attr("node_platform_set", Package, Platform).
+node_os_set(Package, OS) :- attr("node_os_set", Package, OS).
+node_target_set(Package, Target) :- attr("node_target_set", Package, Target).
+node_flag_set(Package, Flag, Value) :- attr("node_flag_set", Package, Flag, Value).
+
+node_compiler_version_set(Package, Compiler, Version)
+ :- attr("node_compiler_version_set", Package, Compiler, Version).
+
+variant_default_value_from_cli(Package, Variant, Value)
+ :- attr("variant_default_value_from_cli", Package, Variant, Value).
+
+#defined concretize_everything/0.
+#defined literal/1.
+#defined literal/3.
+#defined literal/4.
+#defined literal/5.
+
+%-----------------------------------------------------------------------------
% Version semantics
%-----------------------------------------------------------------------------
diff --git a/lib/spack/spack/test/cmd/concretize.py b/lib/spack/spack/test/cmd/concretize.py
index d357ccc9dc..a92e059464 100644
--- a/lib/spack/spack/test/cmd/concretize.py
+++ b/lib/spack/spack/test/cmd/concretize.py
@@ -18,37 +18,40 @@ add = SpackCommand('add')
concretize = SpackCommand('concretize')
-@pytest.mark.parametrize('concretization', ['separately', 'together'])
-def test_concretize_all_test_dependencies(concretization):
+unification_strategies = [False, True, 'when_possible']
+
+
+@pytest.mark.parametrize('unify', unification_strategies)
+def test_concretize_all_test_dependencies(unify):
"""Check all test dependencies are concretized."""
env('create', 'test')
with ev.read('test') as e:
- e.concretization = concretization
+ e.unify = unify
add('depb')
concretize('--test', 'all')
assert e.matching_spec('test-dependency')
-@pytest.mark.parametrize('concretization', ['separately', 'together'])
-def test_concretize_root_test_dependencies_not_recursive(concretization):
+@pytest.mark.parametrize('unify', unification_strategies)
+def test_concretize_root_test_dependencies_not_recursive(unify):
"""Check that test dependencies are not concretized recursively."""
env('create', 'test')
with ev.read('test') as e:
- e.concretization = concretization
+ e.unify = unify
add('depb')
concretize('--test', 'root')
assert e.matching_spec('test-dependency') is None
-@pytest.mark.parametrize('concretization', ['separately', 'together'])
-def test_concretize_root_test_dependencies_are_concretized(concretization):
+@pytest.mark.parametrize('unify', unification_strategies)
+def test_concretize_root_test_dependencies_are_concretized(unify):
"""Check that root test dependencies are concretized."""
env('create', 'test')
with ev.read('test') as e:
- e.concretization = concretization
+ e.unify = unify
add('a')
add('b')
concretize('--test', 'root')
diff --git a/lib/spack/spack/test/cmd/env.py b/lib/spack/spack/test/cmd/env.py
index 350dbe7ec1..b686d5abb4 100644
--- a/lib/spack/spack/test/cmd/env.py
+++ b/lib/spack/spack/test/cmd/env.py
@@ -2197,7 +2197,7 @@ def test_env_activate_default_view_root_unconditional(mutable_mock_env_path):
def test_concretize_user_specs_together():
e = ev.create('coconcretization')
- e.concretization = 'together'
+ e.unify = True
# Concretize a first time using 'mpich' as the MPI provider
e.add('mpileaks')
@@ -2225,7 +2225,7 @@ def test_concretize_user_specs_together():
def test_cant_install_single_spec_when_concretizing_together():
e = ev.create('coconcretization')
- e.concretization = 'together'
+ e.unify = True
with pytest.raises(ev.SpackEnvironmentError, match=r'cannot install'):
e.concretize_and_add('zlib')
@@ -2234,7 +2234,7 @@ def test_cant_install_single_spec_when_concretizing_together():
def test_duplicate_packages_raise_when_concretizing_together():
e = ev.create('coconcretization')
- e.concretization = 'together'
+ e.unify = True
e.add('mpileaks+opt')
e.add('mpileaks~opt')
@@ -2556,7 +2556,7 @@ def test_custom_version_concretize_together(tmpdir):
# Custom versions should be permitted in specs when
# concretizing together
e = ev.create('custom_version')
- e.concretization = 'together'
+ e.unify = True
# Concretize a first time using 'mpich' as the MPI provider
e.add('hdf5@myversion')
@@ -2647,7 +2647,7 @@ spack:
def test_virtual_spec_concretize_together(tmpdir):
# An environment should permit to concretize "mpi"
e = ev.create('virtual_spec')
- e.concretization = 'together'
+ e.unify = True
e.add('mpi')
e.concretize()
@@ -2989,3 +2989,19 @@ def test_environment_depfile_out(tmpdir, mock_packages):
stdout = env('depfile', '-G', 'make')
with open(makefile_path, 'r') as f:
assert stdout == f.read()
+
+
+def test_unify_when_possible_works_around_conflicts():
+ e = ev.create('coconcretization')
+ e.unify = 'when_possible'
+
+ e.add('mpileaks+opt')
+ e.add('mpileaks~opt')
+ e.add('mpich')
+
+ e.concretize()
+
+ assert len([x for x in e.all_specs() if x.satisfies('mpileaks')]) == 2
+ assert len([x for x in e.all_specs() if x.satisfies('mpileaks+opt')]) == 1
+ assert len([x for x in e.all_specs() if x.satisfies('mpileaks~opt')]) == 1
+ assert len([x for x in e.all_specs() if x.satisfies('mpich')]) == 1
diff --git a/lib/spack/spack/test/concretize.py b/lib/spack/spack/test/concretize.py
index c8ec0700fe..eafea0ad99 100644
--- a/lib/spack/spack/test/concretize.py
+++ b/lib/spack/spack/test/concretize.py
@@ -1668,3 +1668,67 @@ class TestConcretize(object):
with spack.config.override("concretizer:reuse", True):
s = Spec('c').concretized()
assert s.namespace == 'builtin.mock'
+
+ @pytest.mark.parametrize('specs,expected', [
+ (['libelf', 'libelf@0.8.10'], 1),
+ (['libdwarf%gcc', 'libelf%clang'], 2),
+ (['libdwarf%gcc', 'libdwarf%clang'], 4),
+ (['libdwarf^libelf@0.8.12', 'libdwarf^libelf@0.8.13'], 4),
+ (['hdf5', 'zmpi'], 3),
+ (['hdf5', 'mpich'], 2),
+ (['hdf5^zmpi', 'mpich'], 4),
+ (['mpi', 'zmpi'], 2),
+ (['mpi', 'mpich'], 1),
+ ])
+ def test_best_effort_coconcretize(self, specs, expected):
+ import spack.solver.asp
+ if spack.config.get('config:concretizer') == 'original':
+ pytest.skip('Original concretizer cannot concretize in rounds')
+
+ specs = [spack.spec.Spec(s) for s in specs]
+ solver = spack.solver.asp.Solver()
+ solver.reuse = False
+ concrete_specs = set()
+ for result in solver.solve_in_rounds(specs):
+ for s in result.specs:
+ concrete_specs.update(s.traverse())
+
+ assert len(concrete_specs) == expected
+
+ @pytest.mark.parametrize('specs,expected_spec,occurrences', [
+ # The algorithm is greedy, and it might decide to solve the "best"
+ # spec early in which case reuse is suboptimal. In this case the most
+ # recent version of libdwarf is selected and concretized to libelf@0.8.13
+ (['libdwarf@20111030^libelf@0.8.10',
+ 'libdwarf@20130207^libelf@0.8.12',
+ 'libdwarf@20130729'], 'libelf@0.8.12', 1),
+ # Check we reuse the best libelf in the environment
+ (['libdwarf@20130729^libelf@0.8.10',
+ 'libdwarf@20130207^libelf@0.8.12',
+ 'libdwarf@20111030'], 'libelf@0.8.12', 2),
+ (['libdwarf@20130729',
+ 'libdwarf@20130207',
+ 'libdwarf@20111030'], 'libelf@0.8.13', 3),
+ # We need to solve in 2 rounds and we expect mpich to be preferred to zmpi
+ (['hdf5+mpi', 'zmpi', 'mpich'], 'mpich', 2)
+ ])
+ def test_best_effort_coconcretize_preferences(
+ self, specs, expected_spec, occurrences
+ ):
+ """Test package preferences during coconcretization."""
+ import spack.solver.asp
+ if spack.config.get('config:concretizer') == 'original':
+ pytest.skip('Original concretizer cannot concretize in rounds')
+
+ specs = [spack.spec.Spec(s) for s in specs]
+ solver = spack.solver.asp.Solver()
+ solver.reuse = False
+ concrete_specs = {}
+ for result in solver.solve_in_rounds(specs):
+ concrete_specs.update(result.specs_by_input)
+
+ counter = 0
+ for spec in concrete_specs.values():
+ if expected_spec in spec:
+ counter += 1
+ assert counter == occurrences, concrete_specs
diff --git a/lib/spack/spack/test/spec_dag.py b/lib/spack/spack/test/spec_dag.py
index eca87f1f13..8fc137008c 100644
--- a/lib/spack/spack/test/spec_dag.py
+++ b/lib/spack/spack/test/spec_dag.py
@@ -135,8 +135,6 @@ def test_installed_deps(monkeypatch, mock_packages):
assert spack.version.Version('3') == a_spec[b][d].version
assert spack.version.Version('3') == a_spec[d].version
- # TODO: with reuse, this will be different -- verify the reuse case
-
@pytest.mark.usefixtures('config')
def test_specify_preinstalled_dep():
diff --git a/share/spack/gitlab/cloud_pipelines/stacks/build_systems/spack.yaml b/share/spack/gitlab/cloud_pipelines/stacks/build_systems/spack.yaml
index 2e8d31a2e4..5a742718d4 100644
--- a/share/spack/gitlab/cloud_pipelines/stacks/build_systems/spack.yaml
+++ b/share/spack/gitlab/cloud_pipelines/stacks/build_systems/spack.yaml
@@ -3,7 +3,7 @@ spack:
concretizer:
reuse: false
- unify: false
+ unify: when_possible
config:
install_tree:
diff --git a/var/spack/repos/builtin.mock/packages/installed-deps-b/package.py b/var/spack/repos/builtin.mock/packages/installed-deps-b/package.py
index 66c24d9c31..85a593d6d2 100644
--- a/var/spack/repos/builtin.mock/packages/installed-deps-b/package.py
+++ b/var/spack/repos/builtin.mock/packages/installed-deps-b/package.py
@@ -22,5 +22,5 @@ class InstalledDepsB(Package):
version("2", "abcdef0123456789abcdef0123456789")
version("3", "def0123456789abcdef0123456789abc")
- depends_on("installed-deps-d", type=("build", "link"))
+ depends_on("installed-deps-d@3:", type=("build", "link"))
depends_on("installed-deps-e", type=("build", "link"))