|
|
|
* xyce: switch to Trilinos 13.2.0 and add dependencies for +pymi
* xyce: clean up Trilinos dependency logic
|
|
|
|
From the release announcement: "This is a special bugfix release ahead of
schedule to address a memory leak that was happening on certain function calls
when using Cython. The memory leak consisted of a small constant amount of bytes
in certain function calls from Cython code. Although in most cases this was not
very noticeable, it was very impactful for long-running applications and certain
usage patterns. Check bpo-46347 for more information."
|
|
|
|
When you install Spack from a tarball, it will always show an exact
version for Spack itself, even when you don't download a tagged commit:
```
$ wget -q https://github.com/spack/spack/archive/refs/heads/develop.tar.gz
$ tar -xf develop.tar.gz
$ ./spack-develop/bin/spack --version
0.16.2
```
This PR sets the Spack version to `0.18.0.dev0` on develop, following [PEP440](https://github.com/spack/spack/pull/25267#issuecomment-896340234) as
suggested by Adam Stewart.
```
spack (fix/set-dev-version)$ spack --version
0.18.0.dev0 (git 0.17.1-1526-e270464ae0)
spack (fix/set-dev-version)$ mv .git .git_
spack $ spack --version
0.18.0.dev0
```
- [x] Update the release guide
- [x] Add `__version__` to Spack's `__init__.py`
- [x] Use PEP 440 canonical version strings
- [x] Make `spack --version` output [actual version] (git version)
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
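A minimal sketch of the reported behavior, not Spack's actual implementation: the canonical PEP 440 dev version is always available, and git details are appended only when a git checkout (and a working `git`) is present. The version strings are taken from the example output above.
```
import subprocess

spack_version = "0.18.0.dev0"  # canonical PEP 440 development version

def get_version():
    """Return '0.18.0.dev0 (git <describe>)' in a git checkout, or just the
    canonical version for a tarball install with no .git directory."""
    try:
        git_info = subprocess.check_output(
            ["git", "describe", "--tags"],
            stderr=subprocess.DEVNULL,
        ).decode().strip()
        return "{0} (git {1})".format(spack_version, git_info)
    except (OSError, subprocess.CalledProcessError):
        return spack_version

print(get_version())
```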
|
|
|
|
* rivet: fix dependency build types
If it isn't a Python package, there is no good reason to change the default dependency type to remove 'link'
* rivet: turn swig into build dependency
|
|
|
|
Add the INTEL_MKL_DIR variable so py-torch can find mkl if provided by
intel-oneapi-mkl.
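A hedged sketch of how a package such as py-torch could pass that hint, assuming the standard Spack `setup_build_environment` hook and an intel-oneapi-mkl provider in the concrete spec; imports and the rest of the recipe are omitted, and this is not a verbatim excerpt from the py-torch package.
```
def setup_build_environment(self, env):
    # Lives inside the package class (e.g. PyTorch).  If MKL is provided by
    # intel-oneapi-mkl, export its prefix so the build can locate it.
    if self.spec.satisfies("^intel-oneapi-mkl"):
        env.set("INTEL_MKL_DIR", self.spec["mkl"].prefix)
```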
|
|
* Added OpenJDK 11.0.14.1 and 17.0.2
* Fix preferred_defined change.
|
|
* Added rust version 1.58.1
* Fixed flake8 style errors
Co-authored-by: Viv Eric Hafener <vehrc@sporcbuild.rc.rit.edu>
|
|
|
|
* Add tests to ensure google cloud storage urls work as mirrors
This commit adds two tests to track that GCS buckets can work as
mirrors, and can be parsed as valid URLs.
Currently, gs:// format URLs are not correctly parsed.
* Fix URL parsing for GCS buckets
This commit adds GCS bucket URLs as valid URLs.
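For illustration only, using the standard library rather than Spack's own URL utilities, this shows the kind of decomposition a gs:// mirror URL has to survive; the bucket name and path are made up.
```
from urllib.parse import urlparse

parsed = urlparse("gs://my-bucket/path/to/mirror")  # hypothetical bucket/path
assert parsed.scheme == "gs"             # bucket access scheme
assert parsed.netloc == "my-bucket"      # bucket name
assert parsed.path == "/path/to/mirror"  # object prefix inside the bucket
print(parsed)
```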
|
|
defining a function themselves; this check erroneously failed if that Package class subclassed another package class, because the check examined all superclasses and mistook the definition we add automatically for an offender (#29569)
|
|
|
|
|
|
This reverts commit 531b1c5c3dcc9bc7bec27e223608aed477e94dbd.
|
|
|
|
|
|
|
|
A new release hasn't been tagged in over a year.
|
|
|
|
|
|
- these versions synchronize with the OpenCL v3.0.10 specification release.
|
|
|
|
spack (#29518)
|
|
* lower priority of package-provided urls
This change favors urls found in a scraped page over those provided by
the package from `url_for_version`. In most cases this doesn't matter,
but R specifically returns known bad URLs in some cases, and the
fallback path for a failed fetch uses `fetch_remote_versions` to find a
substitute. This fixes that problem.
fixes #29204
* consider what links actually exist in all cases
Previously, checksum only actually scraped when called with no versions. It
now always scrapes and, whenever possible, selects URLs from the set of URLs
known to exist.
fixes #25831
* bow to the wrath of flake8
* test-fetch urls from package, prefer if successful
* Update lib/spack/spack/package.py
Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
* reword as suggested
* re-enable mypy specific ignore and ignore pyflakes
* remove flake8 ignore from .flake8
* address review comments
* address comments
* add sneaky missing substitute
I missed this one because we call substitute on a URL that doesn't
contain a version component. I'm not sure how that's supposed to work,
but apparently it's required by at least one mock package, so back in it
goes.
Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
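A rough sketch of the selection policy described in this message, under the assumption that the scraper yields a set of URLs known to exist; this is not Spack's actual code, and the function name is made up.
```
def choose_version_url(scraped_urls, package_url):
    """Prefer the package-provided URL only when it is known to exist;
    otherwise fall back to a scraped URL, and only as a last resort return
    the unverified package guess."""
    if package_url in scraped_urls:
        return package_url
    if scraped_urls:
        return sorted(scraped_urls)[0]
    return package_url

# Example: the package's guess is bad, so a scraped URL wins.
print(choose_version_url({"https://example.org/pkg-1.0.tar.gz"},
                         "https://example.org/old/pkg-1.0.tar.gz"))
```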
|
|
Adds `spack external read-cray-manifest`, which reads a json file that describes a set of package DAGs. The parsed results are stored directly in the database. A user can see these installed specs with `spack find` (like any installed spec). The easiest way to use them right now as dependencies is to run `spack spec ... ^/hash-of-external-package`.
Changes include:
* `spack external read-cray-manifest --file <path/to/file>` will add all specs described in the file to Spack's installation DB and will also install described compilers to the compilers configuration (the expected format of the file, including examples, is described in this PR)
* Database records now may include an "origin" (the command added in this PR registers the origin as "external-db"). In the future, it is assumed users may want to be able to treat installs registered with this command differently (e.g. they may want to uninstall all specs added with this command)
* Hash properties are now always preserved when copying specs if the source spec is concrete
* I don't think the hashes of installed-and-concrete specs should change and this was the easiest way to handle that
* also specs that are concrete preserve their `.normal` property when copied (external specs may mention compilers that are not registered, and without this change they would fail in `normalize` when calling `validate_or_raise`)
* it might be that this should only be the case if the spec was installed
- [x] Improve testing
- [x] Specifically mark DB records added with this command (so that users can do something like "uninstall all packages added with `spack read-external-db`")
* This is now possible with `spack uninstall --all --origin=external-db` (this will remove all specs added from manifest files)
- [x] Strip variants that are listed in json entries but don't actually exist for the package
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
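The manifest schema itself is documented in the PR and is not reproduced here; purely as a conceptual illustration, with an entirely made-up layout and spec entries, reading such a file boils down to loading JSON and registering each described spec.
```
import json

fake_manifest = """
{"specs": [
  {"name": "cray-mpich", "version": "8.1.0",  "hash": "abcdef"},
  {"name": "cce",        "version": "13.0.0", "hash": "123456"}
]}
"""

for entry in json.loads(fake_manifest)["specs"]:
    # A real implementation would build a concrete Spec and add a database
    # record with origin "external-db"; here we only show the iteration.
    print("would register {name}@{version} /{hash}".format(**entry))
```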
|
|
|
|
|
|
* Use same cxx value as root
* Remove pointer syntax from non-pointer type in source
* Run patch function before build
* Use raw string in filter_file and merge edit function with patch
* Escape parentheses
* Use gDirectory from ROOT instead of CurrentDirectory function
|
|
|
|
|
|
This PR removes a few outdated sections from the "Basics" part of the
documentation. It also makes a few topics under the environment section
more prominent by removing an unneeded spack.yaml subsection and
promoting everything under it.
|
|
* Make boost composable
Currently Boost enables a few components through variants by default,
which means that if you want to use only what you need and no more, you
have to explicitly disable these variants, leading to concretization
errors whenever a second package explicitly needs those components.
For instance if package A only needs `+component_a` it might depend on
`boost +component_a ~component_b`. And if package B only needs
`+component_b` it might depend on `boost ~component_a +component_b`. If
package C now depends on both A and B, this leads to unsatisfiable
variants and hence a concretization error.
However, if we default to disabling all components, package A can simply
depend on `boost +component_a` and package B on `boost +component_b`, and
package C will concretize to depend on `boost +component_a +component_b`;
whatever you install, you get only the bare minimum (see the sketch after
this message).
* Fix style
* Added composable boost dependencies for folly
* fixing akantu merge issue
* hpctoolkit boost dependencies already defined
* Fix Styles
* Fixup style once more
* Adding isort fix
* isort one more time
* Fix for package audit issue
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
Co-authored-by: Ryan O'Malley <rd.omalley@comcast.net>
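A sketch of the resulting constraints, using the hypothetical component names from the message; imports and the rest of each recipe are omitted, so this is illustrative rather than copy-pasteable.
```
class PkgA(Package):
    # Only needs component_a; with components off by default this is enough.
    depends_on("boost +component_a")

class PkgB(Package):
    # Only needs component_b; no longer has to say ~component_a.
    depends_on("boost +component_b")

class PkgC(Package):
    # Depends on both; boost concretizes to +component_a +component_b.
    depends_on("pkg-a")
    depends_on("pkg-b")
```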
|
|
|
|
|
|
|
|
|
|
* [py-jaconv] created template
* [py-jaconv]
- added homepage
- added description
- depends on setuptools
- removed fixmes
|
|
|
|
Include support for PowerShell
Prepend the cmd and pwsh prompts with [spack] to denote a Spack-enabled shell
|
|
Broaden support for execution of the test suite
on Windows.
General bug and review fixups
|
|
Consistency is restored on the next transaction
|
|
|
|
Consolidate Spack's internal filepath logic into a select
few places and refactor to consistent internal usage of
os.path utilities. Creates a prefix, and a series of utilities
in the path utility module, that facilitate handling paths
in a platform-agnostic manner.
Convert Windows paths to posix paths internally
Prefer posixpath.join instead of os.path.join
Updated util/ directory to account for Windows integration
Co-authored-by: Stephen Crowell <stephen.crowell@khq.kitware.com>
Co-authored-by: John Parent <john.parent@kitware.com>
Module template format for windows (#23041)
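A small, standard-library illustration of the normalization described above (not the code in this change): keep an internal forward-slash representation and join with posixpath so the result is the same on every platform.
```
import posixpath
from pathlib import PureWindowsPath

def to_posix(path):
    """Convert a native Windows path string to its forward-slash form."""
    return PureWindowsPath(path).as_posix()

assert to_posix(r"C:\Users\spack\stage") == "C:/Users/spack/stage"
# posixpath.join keeps separators consistent regardless of the host os.sep.
assert posixpath.join(to_posix(r"C:\Users\spack"), "stage") == "C:/Users/spack/stage"
```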
|
|
* Incorporate new search location
* Add external user option
* proper doc string
* Explicit commands in getting started
* raise during chgrp on Win
recover installer changes
Notate admin privileges
Windows phase install hooks
Find external python and install ninja (#23496)
Allow external find python to find windows python and spack install ninja
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Betsy McPhail <betsy.mcphail@kitware.com>
|
|
Fixup common tests
* Remove requirement for Python 2.6
* Skip new failing test
Windows: Update url util to handle Windows paths (#27959)
* update url util to handle windows paths
* Update tests to handle fixed url handling
* canonicalize path only when the path type matches the host platform
* Skip some url tests on Windows
Co-authored-by: Omar Padron <omar.padron@kitware.com>
Use threading.TIMEOUT_MAX when available (#24246)
This value was introduced in Python 3.2. Specifying a timeout greater than
this value will raise an OverflowError.
Co-authored-by: Lou Lawrence <lou.lawrence@kitware.com>
Co-authored-by: John Parent <john.parent@kitware.com>
Co-authored-by: Betsy McPhail <betsy.mcphail@kitware.com>
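A sketch of the guard that note implies, with a helper name of my own choosing: clamp the requested timeout to threading.TIMEOUT_MAX (available since Python 3.2) so that acquire() cannot overflow.
```
import threading

def safe_timeout(requested):
    """Clamp a timeout to threading.TIMEOUT_MAX where that attribute exists
    (Python >= 3.2); older interpreters get the value unchanged."""
    limit = getattr(threading, "TIMEOUT_MAX", None)
    return requested if limit is None else min(requested, limit)

lock = threading.Lock()
# 1e12 seconds exceeds TIMEOUT_MAX and would raise OverflowError unclamped.
lock.acquire(timeout=safe_timeout(1e12))
lock.release()
```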
|
|
Add compiler hint to the root spec for Windows
Reporters on Windows (#26038)
Reporters use Jinja2 as the templating engine, and Jinja2 indexes
templates by Unix separators, even on Windows, so search using Unix paths
on all systems.
Support patching on Windows via git (#25871)
Handle GRP on Windows
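A minimal illustration of that workaround, with a hypothetical template path: build Jinja2 template names with posixpath (or replace os.sep) so lookups use '/' on Windows as well.
```
import posixpath

def template_name(*parts):
    # Jinja2 loaders index templates with '/' separators on every platform,
    # so never build the lookup key with os.path.join on Windows.
    return posixpath.join(*parts)

name = template_name("reports", "cdash", "report.xml")  # hypothetical template
assert name == "reports/cdash/report.xml"
```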
|