path: root/lib
Age | Commit message | Author | Files | Lines
2021-01-27 | Add a context wrapper for mtime preservation (#21258) | Sergey Kosukhin | 2 | -1/+44
Sometimes we need to patch a file that is a dependency of some other automatically generated file that comes in a release tarball. As a result, make tries to regenerate the dependent file using additional tools (e.g. help2man), which would not be needed otherwise. In some cases, it's preferable to avoid that (e.g. see #21255). A way to do that is to save the modification timestamps before patching and restore them afterwards. This PR introduces a context wrapper that does that.
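A minimal sketch of that idea, using only the standard library; `preserve_mtime` and the usage shown are illustrative, not the exact helper this PR adds:

```python
import contextlib
import os


@contextlib.contextmanager
def preserve_mtime(*paths):
    """Save (atime, mtime) for each path and restore them on exit."""
    saved = {p: os.stat(p) for p in paths if os.path.exists(p)}
    try:
        yield
    finally:
        for path, st in saved.items():
            if os.path.exists(path):
                os.utime(path, (st.st_atime, st.st_mtime))


# usage: patch a file without making its dependents look out of date
# with preserve_mtime("configure.ac"):
#     apply_patch("configure.ac")   # apply_patch is a placeholder
```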
2021-01-27 | Doc default behavior of install tests (#21309) | Mark C. Miller | 1 | -2/+5
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2021-01-27 | spack setup: remove the command for v0.17.0 (#20277) | Adam J. Stewart | 3 | -204/+0
spack setup was deprecated in 0.16 and will be removed in 0.17. Follow-up to #18240.
2021-01-22 | use module and package flags to get more correct mypy behavior (#21225) | Tom Scogland | 2 | -32/+25
The first of my two upstream patches to mypy landed in the 0.800 tag that was released this morning, which lets us use module and package parameters with a .mypy.ini file that has a files key. This uses those parameters to check all of spack in `spack style`, but leaves the packages out for now since they are still very, very broken. If no package has been modified, the packages are not checked, but if one has been, they are.

Includes some fixes for the log tests since they were not type checking. Should also fix all failures related to "duplicate module named package" errors.

Hopefully the next drop of mypy will include my other patch so we can just specify the modules and packages in the config file to begin with, but for now we'll have to live with a bare mypy doing a check of the libs but not the packages.

* use module and package flags to check packages properly
* stop checking package files, use package flag for libs

  The packages are not type checkable yet; another PR needs to be finished before they can be. The previous commit also didn't check the libraries properly; this one does.
2021-01-22 | Added @property stdcxx_libs to return -lstdc++ for AOCC compiler (#21145) | AMD Toolchain Support | 2 | -0/+6
2021-01-21 | docs: Update the CudaPackage (build system) description (#20742) | Tamara Dahlgren | 1 | -16/+101
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
2021-01-21 | Added ROCmPackage (build system) documentation (#20743) | Tamara Dahlgren | 3 | -2/+127
2021-01-20 | store sbang_install_path in buildinfo, use for subsequent relocation (#20768) | eugeneswalker | 1 | -0/+11
2021-01-20 | [WIP] relocate.py: parallelize test replacement logic (#19690) | Nathan Hanford | 9 | -210/+281
* sbang pushed back to callers; star moved to util.lang
* updated unit test
* sbang test moved; local tests pass

Co-authored-by: Nathan Hanford <hanford1@llnl.gov>
2021-01-18 | add required libs for sycl programs (#20728) | Robert Cohn | 1 | -1/+2
2021-01-14 | improve documentation for Rocm (hip amd builds) (#20812) | Danny Taller | 1 | -1/+6
* improve documentation
2021-01-12 | concretizer: require at least a dependency type to say the dependency holds | Massimiliano Culpo | 2 | -0/+11
fixes #20784

Similar to the previous bug, here we were deducing conditions to be imposed on nodes that were not part of the DAG.
2021-01-12 | concretizer: dependency conditions cannot hold if package is external | Massimiliano Culpo | 3 | -2/+9
fixes #20736

Before this one-line fix we were erroneously deducing that dependency conditions hold even if a package was external. This may result in answer sets that contain imposed conditions on a node without the node being present in the DAG, hence #20736.
2021-01-12 | restore ability of dev-build to skip patches (#20351) | Robert Underwood | 1 | -0/+1
At some point in the past, the skip_patch argument was removed from the call to package.do_install(), which broke the --skip-patch flag on the dev-build command.
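A schematic of the fix, assuming `args.skip_patch` holds the parsed command-line flag (this is not the real dev-build code, just the shape of the change):

```python
def dev_build(package, args):
    # the parsed --skip-patch option must reach the install call again
    package.do_install(
        skip_patch=args.skip_patch,
        # other install options omitted in this sketch
    )
```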
2021-01-11 | Package Repositories docs: num packages has grown (#20735) | Adam J. Stewart | 1 | -1/+1
2021-01-11 | Update the docs footer copyright (#20741) | Tamara Dahlgren | 1 | -1/+1
2021-01-06 | fix gpg user rundir check (#20705) | eugeneswalker | 1 | -1/+1
2021-01-05 | concretizer: make rules on virtual packages more linear | Massimiliano Culpo | 1 | -11/+9
fixes #20679

In this refactor we have a single cardinality rule on the provider, which triggers a rule transforming a dependency on a virtual package into a dependency on the provider of the virtual.
2021-01-05 | spack python: allow use of IPython (#20329) | Vanessasaurus | 2 | -12/+76
This adds a -i option to "spack python" which allows use of the IPython interpreter; it can be used with "spack python -i ipython". This assumes it is available in the Python instance used to run Spack (i.e. that you can "import IPython").
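A rough sketch of the dispatch this enables (illustrative only, not the actual `spack python` implementation; `run_repl` is a made-up name): try IPython when requested and fall back to the built-in console if the import fails.

```python
import code


def run_repl(use_ipython=False):
    if use_ipython:
        try:
            import IPython                    # requires IPython in Spack's own Python
        except ImportError:
            print("IPython is not importable here; falling back to the plain console.")
        else:
            IPython.start_ipython(argv=[])    # hand the session over to IPython
            return
    code.interact(banner="spack python")      # standard interactive console
```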
2021-01-05 | bugfix for target adjustments on target ranges (#20537) | Greg Becker | 3 | -15/+20
2021-01-05 | concretizer: use consistent naming for compiler predicates (#20677) | Todd Gamblin | 2 | -10/+8
Every other predicate in the concretizer uses a `_set` suffix to implement user- or package-supplied settings, but compiler settings use a `_hard` suffix for this. There's no difference in how they're used, so make the names the same.

- [x] change `node_compiler_hard` to `node_compiler_set`
- [x] change `node_compiler_version_hard` to `node_compiler_version_set`
2021-01-04 | concretizer: simplify handling of virtual version constraints | Todd Gamblin | 2 | -18/+29
Previously, the concretizer handled version constraints by comparing all pairs of constraints and ensuring they satisfied each other. This led to inconsistent results from clingo, due to ambiguous semantics like:

    version_constraint_satisfies("mpi", ":1", ":3")
    version_constraint_satisfies("mpi", ":3", ":1")

To get around this, we introduce possible (fake) versions for virtuals, based on their constraints. Essentially, we add any Versions, VersionRange endpoints, and all such Versions and endpoints from VersionLists to the constraint. Virtuals will have one of these synthetic versions "picked" by the solver.

This also allows us to remove a special case from the handling of `version_satisfies/3` -- virtuals now work just like regular packages.
2021-01-04 | concretizer: remove rule generation code from concretizer | Todd Gamblin | 1 | -70/+1
Our program only generates facts now, so remove all unused code related to generating cardinality constraints and rules.
2021-01-04 | concretizer: convert virtuals to facts; move all rules to `concretize.lp` | Todd Gamblin | 2 | -116/+164
This converts the virtual handling in the new concretizer from already-ground rules to facts. This is the last thing that needs to be refactored, and it converts the entire concretizer to just use facts.

The previous way of handling virtuals hinged on rules involving `single_provider_for` facts that were tied to the virtual and a version range. The new method uses the condition pattern we've been using for dependencies, externals, and conflicts.

To handle virtuals as conditions, we impose constraints on "fake" virtual specs in the logic program, i.e., `version_satisfies("mpi", "2.0:", "2.0")` is legal, whereas before we wouldn't have seen something like this. Currently, constraints are only handled on versions -- we don't handle variants or anything else yet, but the key change here is that we *could*. For a long time, virtual handling in Spack has only dealt with versions, and we'd like to be able to handle variants as well. We could easily add an integrity constraint to handle variants like the one we use for versions.

One issue with the implementation here is that virtual packages don't actually declare possible versions like regular packages do. To get around that, we implement an integrity constraint like this:

    :- virtual_node(Virtual),
       version_satisfies(Virtual, V1),
       version_satisfies(Virtual, V2),
       not version_constraint_satisfies(Virtual, V1, V2).

This requires us to compare every version constraint to every other, both in program generation and within the concretizer -- so there's a potentially quadratic evaluation time on virtual constraints, because we don't have a real version to "anchor" things to. We just say that all the constraints need to agree for the virtual constraint to hold. We can investigate adding synthetic versions for virtuals in the future to speed this up.
2021-01-04 | concretizer: consolidate handling of virtuals into spec_clauses | Todd Gamblin | 1 | -26/+20
2021-01-04 | concretizer: make _condtion_id_counter an iterator | Todd Gamblin | 1 | -18/+18
2021-01-04 | concretizer: more detailed section headers in concretize.lp | Todd Gamblin | 1 | -51/+69
2021-01-04 | ci: fix issue with latest sphinx (#20661) | Massimiliano Culpo | 2 | -1/+17
2021-01-04 | bugfix: infinite loop when building a set from incomplete specs (#20649) | Todd Gamblin | 1 | -1/+6
This code in `SpecBuilder.build_specs()`, introduced in #20203, can loop seemingly interminably for very large specs:

```python
set([spec.root for spec in self._specs.values()])
```

It's deceptive, because it seems like there must be an issue with `spec.root`, but that works fine. It's building the set afterwards that takes forever, at least on `r-rminer`. Currently if you try running `spack solve r-rminer`, it loops infinitely and spins up your fan.

The issue (I think) is that the spec is not yet complete when this is run, and something is going wrong when constructing and comparing so many values produced by `_cmp_key()`. We can investigate the efficiency of `_cmp_key()` separately, but for now, the fix is:

```python
roots = [spec.root for spec in self._specs.values()]
roots = dict((id(r), r) for r in roots)
```

We know the specs in `self._specs` are distinct (they just came out of the solver), so we can just use their `id()` to unique them here. This gets rid of the infinite loop.
2021-01-02 | copyrights: update all files with license headers for 2021 | Todd Gamblin | 485 | -486/+504
- [x] add `concretize.lp`, `spack.yaml`, etc. to licensed files
- [x] update all licensed files to say 2013-2021 using `spack license update-copyright-year`
- [x] appease mypy with some additions to package.py that are needed for oneapi.py
2021-01-02 | commands: add `spack license update-copyright-year` | Todd Gamblin | 2 | -17/+58
This adds a new subcommand to `spack license` that automatically updates the copyright year in files that should have a license header.

- [x] add `spack license update-copyright-year` command
- [x] add test
2020-12-30 | extends: add type kwarg (#20045) | Adam J. Stewart | 9 | -20/+7
* extends: add type kwarg
* Flake8 fix
2020-12-29 | concretizer: generate facts for externals | Massimiliano Culpo | 3 | -41/+42
Generate only facts for external specs. Replace the use of already-grounded rules with non-grounded rules in concretize.lp.
2020-12-29 | PythonPackage: add pypi attribute to infer homepage/url/list_url (#17587) | Adam J. Stewart | 6 | -63/+119
2020-12-28 | archspec: fixed a typo in the vendored library (#20584) | Massimiliano Culpo | 2 | -2/+2
2020-12-24 | Remove more variables from build environment (#20156) | Omri Mor | 1 | -2/+7
GCC looks for included files based on several env vars. Remove C_INCLUDE_PATH, CPLUS_INCLUDE_PATH, and OBJC_INCLUDE_PATH from the build environment to ensure it's clean and prevent accidental clobbering.
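For illustration, a standalone sketch of that scrubbing (the variable names come from the message above; `clean_build_env` itself is a hypothetical helper, not Spack's build_environment code):

```python
import os

# GCC consults these variables when resolving include paths; dropping them
# keeps the build environment clean and under Spack's control.
INCLUDE_PATH_VARS = ("C_INCLUDE_PATH", "CPLUS_INCLUDE_PATH", "OBJC_INCLUDE_PATH")


def clean_build_env(env=None):
    """Return a copy of the environment without GCC include-path variables."""
    env = dict(os.environ if env is None else env)
    for var in INCLUDE_PATH_VARS:
        env.pop(var, None)   # missing keys are fine
    return env
```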
2020-12-23 | bugfix: do not write empty default dicts/lists in envs (#20526) | Greg Becker | 3 | -47/+38
Environment yaml files should not have default values written to them. To accomplish this, we change the validator to not add the default values to yaml. We rely on the code to set defaults for all values (and use defaulting getters like dict.get(key, default)). Includes regression test.
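A sketch of the defaulting-getter pattern (the keys used here are just examples, not the actual schema or validator code): readers supply the default at the point of use, and nothing is written back into the parsed yaml.

```python
def read_env_config(yaml_data):
    """Read an environment's config without mutating the parsed yaml."""
    env = yaml_data.get("spack", {})
    # hypothetical examples of defaulting getters
    specs = env.get("specs", [])
    view = env.get("view", True)
    return specs, view
```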
2020-12-23 | concretizer: remove vestigial code and comment | Massimiliano Culpo | 1 | -20/+0
2020-12-23 | style: ensure that all packages pass `spack style -a` | Todd Gamblin | 3 | -5/+13
- fix trailing whitespace and other issues uncovered by better flake8 checking
- fix extra whitespace printed by the `spack style` command
2020-12-23 | Add Intel oneAPI packages (#20411) | Robert Cohn | 4 | -20/+93
This creates a set of packages which all use the same script to install components of Intel oneAPI. This includes:

* An inheritable IntelOneApiPackage which knows how to invoke the installation script based on which components are requested
* For components which include headers/libraries, an inheritable IntelOneApiLibraryPackage is provided to locate them
* Individual packages for DAL, DNN, TBB, etc.
* A package for the Intel oneAPI compilers (icx/ifx). This also includes icc/ifortran but these are not currently detected in this PR
2020-12-22 | add mypy to style checks; rename `spack flake8` to `spack style` (#20384) | Tom Scogland | 42 | -304/+590
I lost my mind a bit after getting the completion stuff working and decided to get mypy working for Spack as well. This adds a `.mypy.ini` that checks all of the spack and llnl modules, though not yet packages, and fixes all of the identified missing types and type issues for the spack library. In addition to these changes, this includes:

* rename `spack flake8` to `spack style`

  Aliases flake8 to style and just runs flake8 as before, but with a warning. The style command runs both `flake8` and `mypy`, in sequence. Added --no-<tool> options to turn off one or the other; they are on by default. Fixed two issues caught by the tools.

* stub typing module for python 2.x

  We don't support typing in Spack for python 2.x. To allow 2.x to support `import typing` and `from typing import ...` without a try/except dance for old versions, this adds a stub module *just* for python 2.x. Doing it this way means we can only reliably use all type hints in python 3.7+, and `.mypy.ini` has been updated to reflect that.

* add non-default black check to spack style

  This is a first step toward requiring black. It doesn't enforce it by default, but it will check it if requested. Currently enforcing the line length of 79 since that's what flake8 requires, but it's a bit odd for a black-formatted project to be quite that narrow. All settings are in the style command since spack has no pyproject.toml and I don't want to add one until more discussion happens. Also re-format `style.py` since it no longer passed the black style check with the new length.

* use style check in github action

  Update the style and docs action to use `spack style`, adding mypy and black to the action even if it isn't running black right now.
2020-12-22 | concretizer: refactor conditional rules to be less repetitious (#20507) | Todd Gamblin | 2 | -89/+60
We have to repeat all the spec attributes in a number of places in `concretize.lp`, and Spack has a fair number of spec attributes. If we instead add some rules up front that establish equivalencies like this:

```
node(Package) :- attr("node", Package).
attr("node", Package) :- node(Package).

version(Package, Version) :- attr("version", Package, Version).
attr("version", Package, Version) :- version(Package, Version).
```

we can rewrite most of the repetitive conditions with `attr` and repeat only for each arity (there are only 3 arities for spec attributes so far) as opposed to each spec attribute. This makes the logic easier to read and the rules easier to follow.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2020-12-22 | Refactor flake8 handling and tool compatibility (#20376) | Tom Scogland | 1 | -229/+112
This PR does three related things to try to improve developer tooling quality of life:

1. Adds new options to `.flake8` so it applies the rules of both `.flake8` and `.flake_package` based on paths in the repository.
2. Adds a re-factoring of the `spack flake8` logic into a flake8 plugin so using flake8 directly, or through editor or language server integration, only reports errors that `spack flake8` would.
3. Allows star import of `spack.pkgkit` in packages. Since this is now the thing that needs to be imported for completion to work correctly in package files, it's nice to be able to do that. I'm sorely tempted to sed over the whole repository and put `from spack.pkgkit import *` in every package, but at least being allowed to do it on a per-package basis helps.

As an example of what the result of this is:

```
~/Workspace/Projects/spack/spack develop* ⇣
❯ flake8 --format=pylint ./var/spack/repos/builtin/packages/kripke/package.py
./var/spack/repos/builtin/packages/kripke/package.py:6: [F403] 'from spack.pkgkit import *' used; unable to detect undefined names
./var/spack/repos/builtin/packages/kripke/package.py:25: [E501] line too long (88 > 79 characters)

~/Workspace/Projects/spack/spack refactor-flake8* 1
❯ flake8 --format=spack ./var/spack/repos/builtin/packages/kripke/package.py

~/Workspace/Projects/spack/spack refactor-flake8*
❯ flake8 ./var/spack/repos/builtin/packages/kripke/package.py
```

* qa/flake8: update .flake8, spack formatter plugin

  Adds:
  * Modern flake8 settings for per-path/glob error ignores, allowing packages to use the same `.flake8` as the rest of spack
  * A spack formatter plugin to flake8 that implements the behavior of `spack flake8` for direct invocations. Makes integration with developer tooling nicer; linting with flake8 reports only errors that `spack flake8` would report. Using pyls and pyls-flake8, or any other non-format-dependent flake8 integration, now works with spack's rules.

* qa/flake8: allow star import of spack.pkgkit

  To get working completion of directives and spack components it's necessary to import the contents of spack.pkgkit. At the moment doing this makes flake8 displeased. For now, allow spack.pkgkit and spack both; the next step is to ban spack * and require spack.pkgkit *.

* first cut at refactoring spack flake8

  This version still copies all of the files to be checked as before, and some other things that probably aren't necessary, but it relies on the spack formatter plugin to implement the ignore logic.

* keep flake8 from rejecting itself

* remove separate packages flake8 config

* fix failures from too many files

  I ran into this in the PR converting pkgkit to std. The solution in that branch does not work in all cases as it turns out, and all the workarounds I tried, using generated configs to get a single invocation of flake8 with a filename option to work, failed. It's an astonishingly frustrating config option. Regardless, this removes all temporary file creation from the command and relies on the plugin instead. To work around the huge number of files in spack and still allow the command to control what gets checked, it scans files in batches of 100. This is a completely arbitrary number but was chosen to be safely under common line-length limits. One side effect of this is that every 100 files the command will produce output, rather than only at the end, which doesn't seem like a terrible thing.
2020-12-22 | concretizer: optimize loop on compiler version | Massimiliano Culpo | 2 | -13/+15
Similar to the optimization on platform
2020-12-22 | concretizer: optimized loop on node platforms | Massimiliano Culpo | 1 | -3/+3
We can speed up the computation by avoiding a double loop in a cardinality constraint and enforcing the rule instead as an integrity constraint.
2020-12-20 | concretizer: fix failing unit tests | Massimiliano Culpo | 2 | -4/+18
2020-12-20 | concretizer: emit facts for integrity constraints | Massimiliano Culpo | 2 | -37/+40
2020-12-20 | concretizer: emit facts for constraints on imposed dependencies | Massimiliano Culpo | 2 | -38/+85
2020-12-20 | concretizer: avoid redundant grounding on dependency types | Massimiliano Culpo | 2 | -27/+24
2020-12-20 | concretizer: move conditional dependency logic into `concretize.lp` | Todd Gamblin | 2 | -18/+60
Continuing to convert everything in `asp.py` into facts, make the generation of ground rules for conditional dependencies use facts, and move the semantics into `concretize.lp`. This is probably the most complex logic in Spack, as dependencies can be conditional on anything, and we need conditional ASP rules to accumulate and map all the dependency conditions to spec attributes. The logic looks complicated, but essentially it accumulates any constraints associated with particular conditions into a fact associated with the condition by id. Then, if *any* condition id's fact is True, we trigger the dependency. This simplifies the way `declared_dependency()` works -- the dependency is now declared regardless of whether it is conditional, and the conditions are handled by `dependency_condition()` facts.
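A schematic Python sketch of that accumulation pattern; the fact names `declared_dependency` and `dependency_condition` come from the message above, while `required_condition`, the clause format, and the function itself are illustrative assumptions rather than the real asp.py code:

```python
import itertools

_condition_id_counter = itertools.count()


def conditional_dependency_facts(pkg, dep, when_clauses):
    """Emit facts tying a dependency's `when=` constraints to one condition id."""
    cond_id = next(_condition_id_counter)
    facts = [
        ("declared_dependency", pkg, dep),            # declared unconditionally
        ("dependency_condition", cond_id, pkg, dep),  # names the condition
    ]
    for clause in when_clauses:                       # e.g. ("version_satisfies", pkg, "2.0:")
        facts.append(("required_condition", cond_id) + clause)
    return facts


# If *any* of a package's condition ids has all of its requirements satisfied,
# the rules in concretize.lp trigger the corresponding dependency.
```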