Age | Commit message | Author | Files | Lines |
|
update s3 bucket
update tutorial branch
|
Follow-up to #17110
### Before
```bash
CC=/Users/Adam/spack/lib/spack/env/clang/clang; export CC
SPACK_CC=/usr/bin/clang; export SPACK_CC
PATH=...:/Users/Adam/spack/lib/spack/env/apple-clang:/Users/Adam/spack/lib/spack/env/case-insensitive:/Users/Adam/spack/lib/spack/env:...; export PATH
```
### After
```bash
CC=/Users/Adam/spack/lib/spack/env/clang/clang; export CC
SPACK_CC=/usr/bin/clang; export SPACK_CC
PATH=...:/Users/Adam/spack/lib/spack/env/clang:/Users/Adam/spack/lib/spack/env/case-insensitive:/Users/Adam/spack/lib/spack/env:...; export PATH
```
`CC` and `SPACK_CC` were being set correctly, but `PATH` was using the name of the compiler, `apple-clang`, instead of `clang`. For most packages nothing broke, since `CC` was set correctly. But for packages whose Makefiles set `CC` based on `which clang`, the build picked up the system compiler instead of the compiler wrappers. Discovered when working on `py-xgboost@0.90`.
An alternative fix would be to copy the symlinks in `env/clang` to `env/apple-clang`. Let me know if you think there's a better way to do this, or to test this.
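For illustration, a minimal Python sketch of the lookup behavior at issue (hypothetical, not code from the PR):
```python
import shutil

# Directories mirroring the layout above (hypothetical paths).
wrapper_dir = "/Users/Adam/spack/lib/spack/env/clang"  # correctly named "clang"
system_dir = "/usr/bin"

# A Makefile rule like `CC := $(shell which clang)` performs a plain PATH
# lookup: it finds the Spack wrapper only if a directory containing an
# executable named "clang" precedes /usr/bin on PATH. A directory named
# "apple-clang" never matches, so the lookup falls through to the system
# compiler.
print(shutil.which("clang", path=wrapper_dir + ":" + system_dir))
```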
|
Facilitate running intel-oneapi-mpi outside of Spack (set PATH,
LD_LIBRARY_PATH, etc. appropriately).
Co-authored-by: Robert Cohn <rscohn2@gmail.com>
|
(#20717)
* add to LD_LIBRARY_PATH so that it finds libimf.so
* amrex: fix handling of CUDA arch (#20786)
* amrex: fix handling of CUDA arch
* amrex: fix style
* amrex: fix bug
* Update var/spack/repos/builtin/packages/amrex/package.py
* Update var/spack/repos/builtin/packages/amrex/package.py
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
* ecp-data-vis-sdk: Combine the vis and io SDK packages (#20737)
This better enables the collective set to be deployed together, satisfying
each other's dependencies
* r-sf: fix dependency error (#20898)
* improve documentation for Rocm (hip amd builds) (#20812)
* improve documentation
* astyle: Fix makefile for install parameter (#20899)
* llvm-doe: added new package (#20719)
The package contains duplicated code from llvm/package.py,
will supersede solve.
* r-e1071: added v1.7-4 (#20891)
* r-diffusionmap: added v1.2.0 (#20881)
* r-covr: added v3.5.1 (#20868)
* r-class: added v7.3-17 (#20856)
* py-h5py: HDF5_DIR is needed for ~mpi too (#20905)
For the `~mpi` variant, the environment variable `HDF5_DIR` is still required. I moved this command out of the `+mpi` conditional.
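A minimal sketch of the resulting package logic, assuming Spack's standard `setup_build_environment` hook (not the verbatim diff):
```python
from spack import *  # the usual package.py preamble


class PyH5py(PythonPackage):
    """Sketch only: HDF5_DIR is set for both +mpi and ~mpi builds."""

    variant("mpi", default=True, description="Build with MPI support")
    depends_on("hdf5")

    def setup_build_environment(self, env):
        # Previously set only under +mpi; h5py needs it either way.
        env.set("HDF5_DIR", self.spec["hdf5"].prefix)
```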
* py-horovod: fix typo in variant name in conflicts directive (#20906)
* fujitsu-fftw: Add new package (#20824)
* pocl: added v1.6 (#20932)
Made versions 1.5 and lower conflict with a64fx.
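The change presumably boils down to a `conflicts` directive along these lines (a sketch, not the exact diff):
```python
from spack import *


class Pocl(CMakePackage):
    # Sketch: versions 1.5 and older cannot target a64fx.
    conflicts("target=a64fx", when="@:1.5",
              msg="pocl <= 1.5 does not support a64fx")
```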
* PCL: add new package (#20933)
* r-rle: new package (#20916)
Common 'base' and 'stats' methods for 'rle' objects, aiming to make it
possible to treat them transparently as vectors.
* r-ellipsis: added v0.3.1 (#20913)
* libconfig: add build dependency on texinfo (#20930)
* r-flexmix: add v2.3-17 (#20924)
* r-fitdistrplus: add v1.1-3 (#20923)
* r-fit-models: add v0.64 (#20922)
* r-fields: add v11.6 (#20921)
* r-fftwtools: add v0.9-9 (#20920)
* r-farver: add v2.0.3 (#20919)
* r-expm: add v0.999-6 (#20918)
* cln: add build dependency on texinfo (#20928)
* r-expint: add v0.1-6 (#20917)
* r-envstats: add v2.4.0 (#20915)
* r-energy: add v1.7-7 (#20914)
* r-ellipse: add v0.4.2 (#20912)
* py-fiscalyear: add v0.3.0 (#20911)
* r-ecp: add v3.1.3 (#20910)
* r-plotmo: add v3.6.0 (#20909)
* Improve gcc detection in llvm. (#20189)
Co-authored-by: Tom Scogland <tom.scogland@gmail.com>
Co-authored-by: Thomas Green <ca-tgreen@gw4a64fxlogin00.head.gw4.metoffice.gov.uk>
* hatchet: updated urls (#20908)
* py-anuga: add new package (#20782)
* libvips: added v8.10.5 (#20902)
* libzmq: add platform conditions to libbsd dependency (#20893)
* r-dtw: add v1.22-3 (#20890)
* r-dt: add v0.17 (#20889)
* r-dosnow: add v1.0.19 (#20888)
* add version 1.0.16 to r-doparallel (#20886)
* add version 1.3.7 to r-domc (#20885)
* add version 0.9-15 to r-diversitree (#20884)
* add version 1.3-3 to r-dismo (#20883)
* add version 0.6.27 to r-digest (#20882)
* add version 1.5 to r-rngtools (#20887)
* add version 1.5.8 to r-dicekriging (#20877)
* add version 1.4.2 to r-httr (#20876)
* add version 1.28 to r-desolve (#20875)
* add version 2.2-5 to r-deoptim (#20874)
* add version 0.2-3 to r-deldir (#20873)
* add version 1.0.0 to r-crul (#20870)
* add version 1.1.0.1 to r-crosstalk (#20869)
* add version 1.0-1 to r-copula (#20867)
* add version 5.0.2 to r-rcppparallel (#20866)
* add version 2.0-1 to r-compositions (#20865)
* add version 0.4.10 to r-rlang (#20796)
* add version 0.3.6 to r-vctrs (#20878)
* amrex: add ROCm support (#20809)
* add version 2.0-0 to r-colorspace (#20864)
* add version 1.3-1 to r-coin (#20863)
* add version 0.19-4 to r-coda (#20862)
* add version 1.3.7 to r-clustergeneration (#20861)
* add version 0.3-58 to r-clue (#20860)
* add version 0.7.1 to r-clipr (#20859)
* add version 2.2.0 to r-cli (#20858)
* add version 0.4-3 to r-classint (#20857)
* add version 0.1.2 to r-globaloptions (#20855)
* add version 2.3-56 to r-chron (#20854)
* add version 0.4.10 to r-checkpoint (#20853)
* add version 2.0.0 to r-checkmate (#20852)
* add version 1.18.1 to r-catools (#20850)
* add version 1.2.2.2 to r-modelmetrics (#20849)
* add version 3.0-4 to r-cardata (#20847)
* add version 1.0.1 to r-caracas (#20846)
* r-lifecycle: new package at v0.2.0 (#20845)
* add version 3.0-10 to r-car (#20844)
* add version 3.4.5 to r-processx (#20843)
* add version 1.5-12.2 to r-cairo (#20842)
* add version 0.2.3 to r-cubist (#20841)
* add version 2.6 to r-rmarkdown (#20838)
* add version 1.2.1 to r-blob (#20819)
* add version 4.0.4 to r-bit (#20818)
* add version 2.4-1 to r-bio3d (#20816)
* add version 0.4.2.3 to r-bibtex (#20815)
* add version 3.1-4 to r-bayesm (#20807)
* add version 1.2.1 to r-backports (#20806)
* add version 2.0.3 to r-argparse (#20805)
* add version 5.4-1 to r-ape (#20804)
* add version 0.8-18 to r-amap (#20803)
* r-pixmap: added new package (#20795)
* zoltan: source code location change (#20787)
* refactor path logic
* added some paths to make compilers and libs discoverable
* add to LD_LIBRARY_PATH so that it finds libimf.so
and clean up PEP8
* refactor path logic
* adding paths to LIBRARY_PATH so compiler wrappers will find -lmpi
* added vals for CC=icx, CXX=icpx, FC=ifx to generated module
* back out changes to intel-oneapi-mpi, save for separate PR
* Update var/spack/repos/builtin/packages/intel-oneapi-compilers/package.py
path is joined in _ld_library_path()
Co-authored-by: Robert Cohn <rscohn2@gmail.com>
* set absolute paths to icx,icpx,ifx
* dang close parenthesis
Co-authored-by: Robert Cohn <rscohn2@gmail.com>
Co-authored-by: mic84 <mrosso@lbl.gov>
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
Co-authored-by: Chuck Atkins <chuck.atkins@kitware.com>
Co-authored-by: darmac <xiaojun2@hisilicon.com>
Co-authored-by: Danny Taller <66029857+dtaller@users.noreply.github.com>
Co-authored-by: Tomoyasu Nojiri <68096132+t-nojiri@users.noreply.github.com>
Co-authored-by: Shintaro Iwasaki <siwasaki@anl.gov>
Co-authored-by: Glenn Johnson <glenn-johnson@uiowa.edu>
Co-authored-by: Kelly (KT) Thompson <KineticTheory@users.noreply.github.com>
Co-authored-by: Henrique Mendonça <henrique@users.noreply.github.com>
Co-authored-by: h-denpo <57649496+h-denpo@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Thomas Green <tomgreen66@hotmail.com>
Co-authored-by: Tom Scogland <tom.scogland@gmail.com>
Co-authored-by: Thomas Green <ca-tgreen@gw4a64fxlogin00.head.gw4.metoffice.gov.uk>
Co-authored-by: Abhinav Bhatele <bhatele@cs.umd.edu>
Co-authored-by: a-saitoh-fj <63334055+a-saitoh-fj@users.noreply.github.com>
Co-authored-by: QuellynSnead <quellyn@lanl.gov>
|
* sbang pushed back to callers;
star moved to util.lang
* updated unit test
* sbang test moved; local tests pass
Co-authored-by: Nathan Hanford <hanford1@llnl.gov>
|
fixes #20784
Similarly to the previous bug, here we were deducing
conditions to be imposed on nodes that were not part
of the DAG.
|
fixes #20736
Before this one-line fix we were erroneously deducing
that dependency conditions held even if a package
was external.
This may result in answer sets that contain imposed
conditions on a node without the node being present
in the DAG, hence #20736.
|
fixes #20611
The conflict was triggered by an invalid value of the
'scheduler' variant. This caused Spack to error out when libyogrt
facts were validated by the ASP-based concretizer.
|
At some point in the past, the skip_patch argument was removed
from the call to package.do_install(); this broke the --skip-patch
flag on the dev-build command.
|
Set up environment and dependent packages properly when building
with intel-oneapi-mpi as a dependency MPI provider (e.g. point to
mpicc compiler wrapper).
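A sketch of the hook involved, following the convention other MPI providers use (attribute names and paths are illustrative, not the verbatim change):
```python
import os

from spack import *


class IntelOneapiMpi(IntelOneApiLibraryPackage):
    """Sketch only: point dependents at the MPI compiler wrappers."""

    def setup_dependent_package(self, module, dependent_spec):
        # Dependents that build against MPI expect spec["mpi"].mpicc etc.
        bindir = os.path.join(self.prefix, "mpi", "latest", "bin")
        self.spec.mpicc = os.path.join(bindir, "mpicc")
        self.spec.mpicxx = os.path.join(bindir, "mpicxx")
        self.spec.mpif77 = os.path.join(bindir, "mpif77")
        self.spec.mpifc = os.path.join(bindir, "mpifc")
```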
|
This properly sets PATH/CPATH/LIBRARY_PATH etc. to make the
Spack-generated module file for intel-oneapi-compilers useful
(without this, 'icx' would not be found after loading the module
file for intel-oneapi-compilers).
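A minimal sketch of what such environment setup looks like in a Spack package (the directory layout here is an assumption for illustration):
```python
from spack import *


class IntelOneapiCompilers(Package):
    """Sketch only: make `module load` expose icx and friends."""

    def setup_run_environment(self, env):
        # These modifications are captured in the generated module file.
        root = self.prefix.compiler.latest.linux  # assumed install layout
        env.prepend_path("PATH", root.bin)
        env.prepend_path("CPATH", root.include)
        env.prepend_path("LIBRARY_PATH", root.lib)
        env.prepend_path("LD_LIBRARY_PATH", root.lib)
```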
|
llvm package (#19933)
* Restore OS-based Clang default choice of C++ standard library.
* Add LLVM 11.0.1 release
|
fixes #20679
In this refactor we have a single cardinality rule on the
provider, which triggers a rule transforming a dependency
on a virtual package into a dependency on the provider of
the virtual.
|
Every other predicate in the concretizer uses a `_set` suffix to
implement user- or package-supplied settings, but compiler settings use a
`_hard` suffix for this. There's no difference in how they're used, so
make the names the same.
- [x] change `node_compiler_hard` to `node_compiler_set`
- [x] change `node_compiler_version_hard` to `node_compiler_version_set`
|
Previously, the concretizer handled version constraints by comparing all
pairs of constraints and ensuring they satisfied each other. This led to
inconsistent results from clingo, due to ambiguous semantics like:
```
version_constraint_satisfies("mpi", ":1", ":3")
version_constraint_satisfies("mpi", ":3", ":1")
```
To get around this, we introduce possible (fake) versions for virtuals,
based on their constraints. Essentially, we add any Versions,
VersionRange endpoints, and all such Versions and endpoints from
VersionLists to the constraint. Virtuals will have one of these synthetic
versions "picked" by the solver. This also allows us to remove a special
case from handling of `version_satisfies/3` -- virtuals now work just
like regular packages.
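A rough sketch of collecting those synthetic versions from a constraint, using Spack's version types (the helper name is made up; the real logic lives in the solver setup):
```python
from spack.version import Version, VersionList, VersionRange


def synthetic_versions(constraint):
    """Collect concrete versions and range endpoints named by a constraint."""
    versions = set()
    items = constraint if isinstance(constraint, VersionList) else [constraint]
    for v in items:
        if isinstance(v, Version):
            versions.add(v)
        elif isinstance(v, VersionRange):
            # Range endpoints stand in as "pickable" versions for virtuals.
            if v.start is not None:
                versions.add(v.start)
            if v.end is not None:
                versions.add(v.end)
    return versions
```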
|
Our program only generates facts now, so remove all unused code related
to generating cardinality constraints and rules.
|
This converts the virtual handling in the new concretizer from
already-ground rules to facts. This is the last thing that needs to be
refactored, and it converts the entire concretizer to just use facts.
The previous way of handling virtuals hinged on rules involving
`single_provider_for` facts that were tied to the virtual and a version
range. The new method uses the condition pattern we've been using for
dependencies, externals, and conflicts.
To handle virtuals as conditions, we impose constraints on "fake" virtual
specs in the logic program; i.e., `version_satisfies("mpi", "2.0:",
"2.0")` is legal whereas before we wouldn't have seen something like
this. Currently, constraints are only handled on versions -- we don't
handle variants or anything else yet, but the key change here is that we
*could*. For a long time, virtual handling in Spack has only dealt with
versions, and we'd like to be able to handle variants as well. We could
easily add an integrity constraint to handle variants like the one we use
for versions.
One issue with the implementation here is that virtual packages don't
actually declare possible versions like regular packages do. To get
around that, we implement an integrity constraint like this:
```
:- virtual_node(Virtual),
   version_satisfies(Virtual, V1), version_satisfies(Virtual, V2),
   not version_constraint_satisfies(Virtual, V1, V2).
```
This requires us to compare every version constraint to every other, both
in program generation and within the concretizer -- so there's a
potentially quadratic evaluation time on virtual constraints because we
don't have a real version to "anchor" things to. We just say that all the
constraints need to agree for the virtual constraint to hold.
We can investigate adding synthetic versions for virtuals in the future,
to speed this up.
|
This code in `SpecBuilder.build_specs()`, introduced in #20203, can loop
seemingly interminably for very large specs:
```python
set([spec.root for spec in self._specs.values()])
```
It's deceptive, because it seems like there must be an issue with
`spec.root`, but that works fine. It's building the set afterwards that
takes forever, at least on `r-rminer`. Currently if you try running
`spack solve r-rminer`, it loops infinitely and spins up your fan.
The issue (I think) is that the spec is not yet complete when this is
run, and something is going wrong when constructing and comparing so many
values produced by `_cmp_key()`. We can investigate the efficiency of
`_cmp_key()` separately, but for now, the fix is:
```python
roots = [spec.root for spec in self._specs.values()]
roots = dict((id(r), r) for r in roots)
```
We know the specs in `self._specs` are distinct (they just came out of
the solver), so we can just use their `id()` to unique them here. This
gets rid of the infinite loop.
|
Generate only facts for external specs. Substitute the
use of already-ground rules with non-ground rules
in concretize.lp.
|
Environment yaml files should not have default values written to them.
To accomplish this, we change the validator to not add default values to the yaml. We rely on the code to set defaults for all values (using defaulting getters like `dict.get(key, default)`).
Includes a regression test.
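A toy illustration of the pattern (the keys are made up): defaults are applied at read time, so round-tripping the file never injects them:
```python
user_yaml = {"specs": ["zlib"]}  # exactly what the user wrote

# The default lives in code and is applied when reading...
view = user_yaml.get("view", True)

# ...so the loaded data -- and the file on disk -- stay as written.
assert "view" not in user_yaml
```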
|
This creates a set of packages which all use the same script to install
components of Intel oneAPI. This includes:
* An inheritable IntelOneApiPackage which knows how to invoke the
installation script based on which components are requested
* For components which include headers/libraries, an inheritable
IntelOneApiLibraryPackage is provided to locate them
* Individual packages for DAL, DNN, TBB, etc.
* A package for the Intel oneAPI compilers (icx/ifx). This also includes
icc/ifort, but these are not currently detected in this PR
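A condensed sketch of the hierarchy described above (installer flags and method bodies are assumptions for illustration, not the PR's exact code):
```python
from spack import *


class IntelOneApiPackage(Package):
    """Base class: drives the shared oneAPI installer script (sketch)."""

    def install(self, spec, prefix):
        installer = self.stage.archive_file  # the downloaded installer
        # Silent install of just this component into the Spack prefix;
        # the exact flags here are illustrative.
        bash = which("bash")
        bash(installer, "-s", "-a", "--eula", "accept",
             "--install-dir", prefix, "-s")


class IntelOneApiLibraryPackage(IntelOneApiPackage):
    """Adds header/library discovery for components that ship them."""

    @property
    def headers(self):
        return find_headers("*", root=self.prefix.include, recursive=True)

    @property
    def libs(self):
        return find_libraries("*", root=self.prefix.lib, recursive=True)
```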
|
We have to repeat all the spec attributes in a number of places in
`concretize.lp`, and Spack has a fair number of spec attributes. If we
instead add some rules up front that establish equivalencies like this:
```
node(Package) :- attr("node", Package).
attr("node", Package) :- node(Package).
version(Package, Version) :- attr("version", Package, Version).
attr("version", Package, Version) :- version(Package, Version).
```
We can rewrite most of the repetitive conditions with `attr` and repeat
only for each arity (there are only 3 arities for spec attributes so far)
as opposed to each spec attribute. This makes the logic easier to read
and the rules easier to follow.
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
|
Similar to the optimization on platform
|
We can speed up the computation by avoiding a
double loop in a cardinality constraint and
enforcing the rule instead as an integrity
constraint.
|
Continuing to convert everything in `asp.py` into facts, make the
generation of ground rules for conditional dependencies use facts, and
move the semantics into `concretize.lp`.
This is probably the most complex logic in Spack, as dependencies can be
conditional on anything, and we need conditional ASP rules to accumulate
and map all the dependency conditions to spec attributes.
The logic looks complicated, but essentially it accumulates any
constraints associated with particular conditions into a fact associated
with the condition by id. Then, if *any* condition id's fact is True, we
trigger the dependency.
This simplifies the way `declared_dependency()` works -- the dependency
is now declared regardless of whether it is conditional, and the
conditions are handled by `dependency_condition()` facts.
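A hand-wavy, self-contained sketch of the encoding's shape (`declared_dependency` and `dependency_condition` are the fact names from the text; `required_condition` is made up here for illustration):
```python
condition_id = 0


def emit(fact):
    print(fact + ".")


def declare_dependency(pkg, dep, when_constraints):
    """Emit a dependency plus facts tying its 'when' constraints to an id."""
    global condition_id
    condition_id += 1
    # The dependency is declared unconditionally...
    emit('declared_dependency("{0}", "{1}")'.format(pkg, dep))
    # ...and its trigger conditions become separate facts keyed by id, so
    # concretize.lp can fire the dependency if any condition id holds.
    emit('dependency_condition({0}, "{1}", "{2}")'.format(condition_id, pkg, dep))
    for c in when_constraints:
        emit("required_condition({0}, {1})".format(condition_id, c))


declare_dependency("hdf5", "mpi", ['version_satisfies("hdf5", "1.8:")'])
```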
|
There are currently no places where we do not want to traverse
dependencies in `spec_clauses()`, so simplify the logic by consolidating
`spec_traverse_clauses()` with `spec_clauses()`.
|
`version_satisfies/2` and `node_compiler_version_satisfies/3` are
generated but need `#defined` directives (e.g. `#defined version_satisfies/2.`)
to avoid "info: atom does not occur in any rule head" warnings.
|
* asp: memoize the list of all target_specs to speed up the setup phase
* asp: memoize using a cache per solver object
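A self-contained sketch of the per-object memoization pattern (names are assumed, not Spack's exact ones):
```python
class SpackSolverSetup:
    """Sketch: cache an expensive, solve-constant computation."""

    def __init__(self):
        self._target_specs_cache = None  # one cache per solver object

    def target_specs(self):
        # The list of all target specs is costly to build but constant
        # for the lifetime of one solve, so build it lazily, once.
        if self._target_specs_cache is None:
            self._target_specs_cache = self._compute_target_specs()
        return self._target_specs_cache

    def _compute_target_specs(self):
        # Stand-in for the real computation in asp.py.
        return ["x86_64", "broadwell", "skylake", "zen2"]
```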
|
This PR addresses a number of issues related to compiler bootstrapping.
Specifically:
1. Collect compilers to be bootstrapped while queueing in the installer
Compiler tasks currently have an incomplete list in their task.dependents,
making those packages fail to install because they think not all of their
dependencies are installed. This PR collects the dependents and sets them
on compiler tasks.
2. Allow bootstrapped compilers to back off target
Bootstrapped compilers may be built with a compiler that doesn't support
the target used by the rest of the spec. Allow them to build with less
aggressive target optimization settings.
3. Support for target ranges
Backing off the target necessitates computing target ranges, so make Spack
handle those properly. Notably, this adds an intersection method for target
ranges and fixes the way ranges are satisfied and constrained on Spec objects.
This PR also:
- adds testing
- improves concretizer handling of target ranges
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
Co-authored-by: Gregory Becker <becker33@llnl.gov>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
|
refers #20292
Added a unit test that ensures we can reuse installed
packages even if variants have been removed from or
added to the repository.
|
bug only relevant for python3
|
Currently, version range constraints, compiler version range constraints,
and target range constraints are implemented by generating ground rules
from `asp.py`, via `one_of_iff()`. The rules look like this:
```
version_satisfies("python", "2.6:") :- 1 { version("python", "2.4"); ... } 1.
1 { version("python", "2.4"); ... } 1 :- version_satisfies("python", "2.6:").
```
So, `version_satisfies(Package, Constraint)` is true if and only if the
package is assigned a version that satisfies the constraint. We
precompute the set of known versions that satisfy the constraint, and
generate the rule in `SpackSolverSetup`.
We shouldn't need to generate already-ground rules for this. Rather, we
should leave it to the grounder to do the grounding, and generate facts
so that the constraint semantics can be defined in `concretize.lp`.
We can replace rules like the ones above with facts like this:
```
version_satisfies("python", "2.6:", "2.4")
```
And ground them in `concretize.lp` with rules like this:
```
1 { version(Package, Version) : version_satisfies(Package, Constraint, Version) } 1
:- version_satisfies(Package, Constraint).
version_satisfies(Package, Constraint)
:- version(Package, Version), version_satisfies(Package, Constraint, Version).
```
The top rule is the same as before. It makes conditional dependencies and
other places where version constraints are used work properly. Note that
we do not need the cardinality constraint for the second rule -- we
already have rules saying there can be only one version assigned to a
package, so we can just infer `version_satisfies/3` from `version/2`.
This form is also safe for grounding -- if we used the original form we'd
have unsafe variables like `Constraint` and `Package` -- the original
form only really worked when specified as ground to begin with.
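A toy, self-contained sketch of the precomputation side (the real code compares Spack version objects; the `at_least` helper here is a stand-in that only understands "X:" constraints):
```python
def version_satisfies_facts(pkg, constraint, possible_versions, satisfies):
    """Emit a version_satisfies/3 fact for every known matching version."""
    return [
        'version_satisfies("{0}", "{1}", "{2}").'.format(pkg, constraint, v)
        for v in possible_versions
        if satisfies(v, constraint)
    ]


def at_least(version, constraint):
    # Toy semantics for "X:" (at least X) constraints only.
    as_tuple = lambda s: tuple(int(p) for p in s.split("."))
    return as_tuple(version) >= as_tuple(constraint.rstrip(":"))


for fact in version_satisfies_facts("python", "2.6:", ["2.4", "2.6", "2.7"], at_least):
    print(fact)
```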
- [x] use facts instead of generating rules for package version constraints
- [x] use facts instead of generating rules for compiler version constraints
- [x] use facts instead of generating rules for target range constraints
- [x] remove `one_of_iff()` and `iff()` as they're no longer needed
|