(#20367)
* Kluge to get the gfortran linker to work correctly on Big Sur.
* Fixed a formatting error; stetting the other.
* Removed spaces.
* Added comment, mainly to re-trigger Spack CI.
|
Currently, version range constraints, compiler version range constraints,
and target range constraints are implemented by generating ground rules
from `asp.py`, via `one_of_iff()`. The rules look like this:
```
version_satisfies("python", "2.6:") :- 1 { version("python", "2.4"); ... } 1.
1 { version("python", "2.4"); ... } 1. :- version_satisfies("python", "2.6:").
```
So, `version_satisfies(Package, Constraint)` is true if and only if the
package is assigned a version that satisfies the constraint. We
precompute the set of known versions that satisfy the constraint, and
generate the rule in `SpackSolverSetup`.
We shouldn't need to generate already-ground rules for this. Rather, we
should leave it to the grounder to do the grounding, and generate facts
so that the constraint semantics can be defined in `concretize.lp`.
We can replace rules like the ones above with facts like this:
```
version_satisfies("python", "2.6:", "2.4")
```
And ground them in `concretize.lp` with rules like this:
```
1 { version(Package, Version) : version_satisfies(Package, Constraint, Version) } 1
  :- version_satisfies(Package, Constraint).
version_satisfies(Package, Constraint)
  :- version(Package, Version), version_satisfies(Package, Constraint, Version).
```
The top rule is the same as before. It makes conditional dependencies and
other places where version constraints are used work properly. Note that
we do not need the cardinality constraint for the second rule -- we
already have rules saying there can be only one version assigned to a
package, so we can just infer `version_satisfies/3` from `version/2`.
This form is also safe for grounding -- if we used the original form we'd
have unsafe variables like `Constraint` and `Package` -- the original
form only really worked when specified as ground to begin with.
- [x] use facts instead of generating rules for package version constraints
- [x] use facts instead of generating rules for compiler version constraints
- [x] use facts instead of generating rules for target range constraints
- [x] remove `one_of_iff()` and `iff()` as they're no longer needed
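As a rough illustration of the approach described above, here is a minimal standalone Python sketch (not Spack's actual `SpackSolverSetup` code; the helper names and the simplified range check are assumptions) of emitting `version_satisfies/3` facts instead of pre-ground rules:
```python
# Minimal standalone sketch -- not Spack's actual SpackSolverSetup code.
# Emit one version_satisfies/3 fact per known version that satisfies a
# constraint; the grounding rules themselves live in concretize.lp.

def version_satisfies_facts(package, constraint, known_versions, satisfies):
    """Yield an ASP fact string for each version satisfying the constraint."""
    for version in known_versions:
        if satisfies(version, constraint):
            yield 'version_satisfies("{0}", "{1}", "{2}").'.format(
                package, constraint, version
            )

def toy_satisfies(version, constraint):
    """Toy stand-in for Spack's version semantics: "2.6:" means 2.6 or higher."""
    lower = tuple(int(p) for p in constraint.rstrip(":").split("."))
    return tuple(int(p) for p in version.split(".")) >= lower

for fact in version_satisfies_facts(
        "python", "2.6:", ["2.4", "2.6", "2.7", "3.8"], toy_satisfies):
    print(fact)  # e.g. version_satisfies("python", "2.6:", "2.7").
```
Only the facts are generated up front; the two `concretize.lp` rules shown earlier do the actual grounding.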
|
* ParaView: add new ParaView-5.9.0-RC2 release
Signed-off-by: Vicente Adolfo Bolea Sanchez <vicente.bolea@kitware.com>
* Update var/spack/repos/builtin/packages/paraview/package.py
Indeed, I misunderstood the previous review. This looks good to me too.
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
|
Require boost with cxxstd=14 (not 11) when cxxstd=14 is selected
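A minimal sketch of how such a conditional dependency is typically expressed in a Spack package recipe; the class name, variant values, and defaults below are illustrative assumptions, not the actual diff:
```python
from spack import *  # Spack package API: base classes and directives


class Example(CMakePackage):
    """Hypothetical recipe excerpt: propagate this package's selected C++
    standard to the Boost dependency, as the commit message describes."""

    variant("cxxstd", default="11", values=("11", "14"), multi=False,
            description="C++ standard used to build")

    # Require Boost built with the matching C++ standard.
    depends_on("boost cxxstd=11", when="cxxstd=11")
    depends_on("boost cxxstd=14", when="cxxstd=14")
```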
|
Enabling PSATD is no longer mutually exclusive with other runtime options,
so we can always compile with support for it to improve usability.
|
I was keeping the old `clingo` driver code around in case we had to run
using the command line tool instead of through the Python interface.
So far, the command line is faster than running through Python, but I'm
working on fixing that. I found that if I do this:
```python
import clingo

control = clingo.Control()
control.load("concretize.lp")
control.load("hdf5.lp") # code from spack solve --show asp hdf5
control.load("display.lp")
control.ground([("base", [])])
control.solve(...)
```
It's just as fast as the command line tool. So we can always generate the
code and load it manually if we need to -- we don't need two drivers for
clingo. Given that the Python interface is also the only way to get unsat
cores, I think we pretty much have to use it.
So, I'm removing the old command line driver and other unused code. We
can dig it up again from the history if it is needed.
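For reference, here is a small self-contained sketch of pulling an unsat core through the Python interface (a toy program, not Spack's encoding; the `on_core` callback requires a reasonably recent clingo release):
```python
import clingo

# Toy program: the constraint ":- a." makes the assumption "a is true" unsatisfiable.
control = clingo.Control()
control.add("base", [], "a :- not b.  b :- not a.  :- a.")
control.ground([("base", [])])

cores = []
result = control.solve(
    assumptions=[(clingo.Function("a"), True)],  # assume a is true
    on_core=cores.append,                        # collect the unsat core
)
print(result, cores)  # UNSAT; the core lists the literals of the failed assumptions
```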
|
This fixes a logging error observed on macOS 11.0.1 (Big Sur).
When performing a Spack install in debugging mode (e.g.
`spack -d install py-scipy`), Spack is supposed to write a log of
compiler wrapper command line invocations to the current working
directory.
Due to a regression introduced by #18205, these files were no longer
generated, and Spack printed errors such as
"No such file or directory: None/." This is because the log file
directory gets set from `spack.main.spack_working_dir`, but that
variable is not set in the spawned process.
This PR ensures that the working directory (at the time of the
"spack install" invocation) is persisted to the subprocess.
|
Co-authored-by: Christian Kniep <kniec@amazon.com>
|
Fixed a hard tab in the flux-sched edit and, after testing, unbound hwloc in
flux-core to better support modern MPIs in Spack environments.
Verified that flux-core@0.17 is when hwloc@2: became viable.
|
* Add a new NEST version, 2.20.0.
* Scalasca requires Score-P as a 'run' dependency.
Co-authored-by: Itaru Kitayama <itaru.kitayama@riken.jp>
|
1) amdfftw library support
2) opt added in supported packages
|