25 files changed, 1331 insertions, 340 deletions
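The headline change in this commit is the new "spack view" command (lib/spack/spack/cmd/view.py, added below) together with the "Filesystem Views" documentation: a view is built by walking each installed package's prefix and symlinking or hardlinking its files into a single directory tree, with clashes resolved on a first-come-first-served basis. As orientation before the per-file hunks, here is a minimal standalone sketch of that linking strategy; the function name and paths are hypothetical illustrations, not the command's actual code.

import os

def link_tree(prefix, view, link=os.symlink):
    """Mirror every file under `prefix` into `view`, keeping relative paths."""
    for dirpath, _dirnames, filenames in os.walk(prefix):
        if not filenames:
            continue  # like link_one() in the diff below: never create empty dirs
        rel = os.path.relpath(dirpath, prefix)
        targdir = view if rel == '.' else os.path.join(view, rel)
        if not os.path.isdir(targdir):
            os.makedirs(targdir)
        for fname in filenames:
            src = os.path.join(dirpath, fname)
            dst = os.path.join(targdir, fname)
            if os.path.exists(dst):
                # first package to provide a path wins; later files are skipped
                print("Skipping existing file: %s" % dst)
                continue
            link(src, dst)

# Hypothetical usage: link one installed prefix into a view directory;
# passing link=os.link would build a hardlink view instead.
# link_tree('/path/to/spack/opt/.../cmake-3.5.2-abc1234', 'myview')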
diff --git a/lib/spack/docs/basic_usage.rst b/lib/spack/docs/basic_usage.rst
index 2eed9dddd4..50c48b802b 100644
--- a/lib/spack/docs/basic_usage.rst
+++ b/lib/spack/docs/basic_usage.rst
@@ -342,6 +342,7 @@ will find every installed package with a 'debug' compile-time option enabled.
 
 The full spec syntax is discussed in detail in :ref:`sec-specs`.
 
+
 Compiler configuration
 -----------------------------------
 
@@ -1320,6 +1321,120 @@ regenerate all module and dotkit files from scratch:
 
 .. _extensions:
 
+Filesystem Views
+-------------------------------
+
+.. Maybe this is not the right location for this documentation.
+
+The Spack installation area allows for many package installation trees
+to coexist and gives the user choices as to what versions and variants
+of packages to use. To use them, the user needs a way to aggregate a
+subset of those packages. The section on Environment Modules gives one
+good way to do that, which relies on setting various environment
+variables. An alternative way to aggregate is through
+**filesystem views**.
+
+A filesystem view is a single directory tree which is the union of the
+directory hierarchies of the individual package installation trees
+that have been included. The files of the view's installed packages
+are brought into the view by symbolic or hard links back to their
+location in the original Spack installation area. As the view is
+formed, any clashes due to two packages providing a file at the same
+relative path are handled on a first-come-first-served basis and a
+warning is printed. Packages and their dependencies can be both added
+and removed. During removal, empty directories will be purged. These
+operations can be limited to just the packages listed by the user or
+can exclude specific dependencies, and they allow software installed
+outside of Spack to coexist inside the filesystem view tree.
+
+By its nature, a filesystem view represents a particular choice of one
+set of packages among all the versions and variants that are available
+in the Spack installation area. It is thus equivalent to the
+directory hierarchy that might exist under ``/usr/local``. While this
+limits a view to including only one version/variant of any package, it
+provides the benefits of having a simpler and traditional layout which
+may be used without any particular knowledge that its packages were
+built by Spack.
+
+Views can be used for a variety of purposes, including:
+
+- A central installation in a traditional layout, e.g. ``/usr/local`` maintained over time by the sysadmin.
+- A self-contained installation area which may form the basis of a top-level atomic versioning scheme, e.g. ``/opt/pro`` vs. ``/opt/dev``.
+- Providing an atomic and monolithic binary distribution, e.g. for delivery as a single tarball.
+- Producing ephemeral testing or development environments.
+
+Using Filesystem Views
+~~~~~~~~~~~~~~~~~~~~~~
+
+A filesystem view is created and packages are linked in by the ``spack
+view`` command's ``symlink`` and ``hardlink`` sub-commands. The
+``spack view remove`` command can be used to unlink some or all of the
+filesystem view.
+
+The following example creates a filesystem view based
+on an installed ``cmake`` package and then removes from the view the
+files in the ``cmake`` package while retaining its dependencies.
+
+.. code-block:: sh
+
+   $ spack view -v symlink myview cmake@3.5.2
+   ==> Linking package: "ncurses"
+   ==> Linking package: "zlib"
+   ==> Linking package: "openssl"
+   ==> Linking package: "cmake"
+
+   $ ls myview/
+   bin  doc  etc  include  lib  share
+
+   $ ls myview/bin/
+   captoinfo  clear  cpack  ctest  infotocap  openssl  tabs  toe  tset
+   ccmake  cmake  c_rehash  infocmp  ncurses6-config  reset  tic  tput
+
+   $ spack view -v -d false rm myview cmake@3.5.2
+   ==> Removing package: "cmake"
+
+   $ ls myview/bin/
+   captoinfo  c_rehash  infotocap  openssl  tabs  toe  tset
+   clear  infocmp  ncurses6-config  reset  tic  tput
+
+
+Limitations of Filesystem Views
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+This section describes some limitations that should be considered when
+using filesystem views.
+
+Filesystem views are merely organizational. The binary executable
+programs, shared libraries and other build products found in a view
+are only links into the "real" Spack installation area. If a view is
+built with symbolic links, it requires the Spack-installed packages to
+be kept in place. Building a view with hardlinks removes this
+requirement, but any internal paths (e.g. rpath or ``#!`` interpreter
+specifications) will still require the Spack-installed package files
+to be in place.
+
+.. FIXME: reference the relocation work of Hegner and Gartung.
+
+As described above, when a view is built only a single instance of a
+file may exist in the unified filesystem tree. If more than one
+package provides a file at the same path (relative to its own root)
+then it is the first package added to the view that "wins". A warning
+is printed and it is up to the user to determine if the conflict
+matters.
+
+It is up to the user to ensure that a consistent view is produced. In
+particular, if the user excludes packages, limits the following of
+dependencies, or removes packages, the view may become inconsistent.
+For example, if two packages require the same sub-tree of
+dependencies, removing one package (recursively) will remove its
+dependencies and leave the other package broken.
+
+
+
+
+
 Extensions & Python support
 ------------------------------------
diff --git a/lib/spack/spack/__init__.py b/lib/spack/spack/__init__.py
index fc6cb06577..75ddca1abc 100644
--- a/lib/spack/spack/__init__.py
+++ b/lib/spack/spack/__init__.py
@@ -107,7 +107,7 @@ concretizer = DefaultConcretizer()
 # Version information
 from spack.version import Version
-spack_version = Version("0.9")
+spack_version = Version("0.9.1")
 
 #
 # Executables used by Spack
diff --git a/lib/spack/spack/cmd/view.py b/lib/spack/spack/cmd/view.py
new file mode 100644
index 0000000000..8f1fc9be74
--- /dev/null
+++ b/lib/spack/spack/cmd/view.py
@@ -0,0 +1,295 @@
+##############################################################################
+# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
+# Produced at the Lawrence Livermore National Laboratory.
+#
+# This file is part of Spack.
+# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
+# LLNL-CODE-647188
+#
+# For details, see https://github.com/llnl/spack
+# Please also see the LICENSE file for our notice and the LGPL.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License (as published by
+# the Free Software Foundation) version 2.1 dated February 1999.
+#
+# This program is distributed in the hope that it will be useful, but
+# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
+# conditions of the GNU General Public License for more details.
+#
+# You should have received a copy of the GNU Lesser General Public License
+# along with this program; if not, write to the Free Software Foundation,
+# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
+##############################################################################
+'''Produce a "view" of a Spack DAG.
+
+A "view" is a file hierarchy representing the union of a number of
+Spack-installed package file hierarchies. The union is formed from:
+
+- specs resolved from the package names given by the user (the seeds)
+
+- all dependencies of the seeds unless the user specifies `--no-dependencies`
+
+- less any specs with names matching the regular expressions given by
+  `--exclude`
+
+The `view` can be built and torn down via a number of methods (the "actions"):
+
+- symlink :: a file system view which is a directory hierarchy that is
+  the union of the hierarchies of the installed packages in the DAG
+  where installed files are referenced via symlinks.
+
+- hardlink :: like the symlink view but hardlinks are used.
+
+- statlink :: a view producing a status report of a symlink or
+  hardlink view.
+
+The file system view concept is inspired by Nix, implemented by
+brett.viren@gmail.com ca 2016.
+
+'''
+# Implementation notes:
+#
+# This is implemented as a visitor pattern on the set of package specs.
+#
+# The command line ACTION maps to a visitor_*() function which takes
+# the set of package specs and any args which may be specific to the
+# ACTION.
+#
+# To add a new view:
+# 1. add a new cmd line args sub parser ACTION
+# 2. add any action-specific options/arguments, most likely a list of specs.
+# 3. add a visitor_MYACTION() function
+# 4. add any visitor_MYALIAS assignments to match any command line aliases
+
+import os
+import re
+import spack
+import spack.cmd
+import llnl.util.tty as tty
+
+description = "Produce a single-rooted directory view of a spec."
+
+
+def setup_parser(sp):
+    setup_parser.parser = sp
+
+    sp.add_argument(
+        '-v', '--verbose', action='store_true', default=False,
+        help="Display verbose output.")
+    sp.add_argument(
+        '-e', '--exclude', action='append', default=[],
+        help="Exclude packages with names matching the given regex pattern.")
+    sp.add_argument(
+        '-d', '--dependencies', choices=['true', 'false', 'yes', 'no'],
+        default='true',
+        help="Follow dependencies.")
+
+    ssp = sp.add_subparsers(metavar='ACTION', dest='action')
+
+    specs_opts = dict(metavar='spec', nargs='+',
+                      help="Seed specs of the packages to view.")
+
+    # The action parameterizes the command but in keeping with Spack
+    # patterns we make it a subcommand.
+    file_system_view_actions = [
+        ssp.add_parser(
+            'symlink', aliases=['add', 'soft'],
+            help='Add package files to a filesystem view via symbolic links.'),
+        ssp.add_parser(
+            'hardlink', aliases=['hard'],
+            help='Add package files to a filesystem view via hard links.'),
+        ssp.add_parser(
+            'remove', aliases=['rm'],
+            help='Remove packages from a filesystem view.'),
+        ssp.add_parser(
+            'statlink', aliases=['status', 'check'],
+            help='Check status of packages in a filesystem view.')
+    ]
+    # All these options and arguments are common to every action.
+ for act in file_system_view_actions: + act.add_argument('path', nargs=1, + help="Path to file system view directory.") + act.add_argument('specs', **specs_opts) + + return + + +def assuredir(path): + 'Assure path exists as a directory' + if not os.path.exists(path): + os.makedirs(path) + + +def relative_to(prefix, path): + 'Return end of `path` relative to `prefix`' + assert 0 == path.find(prefix) + reldir = path[len(prefix):] + if reldir.startswith('/'): + reldir = reldir[1:] + return reldir + + +def transform_path(spec, path, prefix=None): + 'Return the a relative path corresponding to given path spec.prefix' + if os.path.isabs(path): + path = relative_to(spec.prefix, path) + subdirs = path.split(os.path.sep) + if subdirs[0] == '.spack': + lst = ['.spack', spec.name] + subdirs[1:] + path = os.path.join(*lst) + if prefix: + path = os.path.join(prefix, path) + return path + + +def purge_empty_directories(path): + '''Ascend up from the leaves accessible from `path` + and remove empty directories.''' + for dirpath, subdirs, files in os.walk(path, topdown=False): + for sd in subdirs: + sdp = os.path.join(dirpath, sd) + try: + os.rmdir(sdp) + except OSError: + pass + + +def filter_exclude(specs, exclude): + 'Filter specs given sequence of exclude regex' + to_exclude = [re.compile(e) for e in exclude] + + def exclude(spec): + for e in to_exclude: + if e.match(spec.name): + return True + return False + return [s for s in specs if not exclude(s)] + + +def flatten(seeds, descend=True): + 'Normalize and flattend seed specs and descend hiearchy' + flat = set() + for spec in seeds: + if not descend: + flat.add(spec) + continue + flat.update(spec.normalized().traverse()) + return flat + + +def check_one(spec, path, verbose=False): + 'Check status of view in path against spec' + dotspack = os.path.join(path, '.spack', spec.name) + if os.path.exists(os.path.join(dotspack)): + tty.info('Package in view: "%s"' % spec.name) + return + tty.info('Package not in view: "%s"' % spec.name) + return + + +def remove_one(spec, path, verbose=False): + 'Remove any files found in `spec` from `path` and purge empty directories.' + + if not os.path.exists(path): + return # done, short circuit + + dotspack = transform_path(spec, '.spack', path) + if not os.path.exists(dotspack): + if verbose: + tty.info('Skipping nonexistent package: "%s"' % spec.name) + return + + if verbose: + tty.info('Removing package: "%s"' % spec.name) + for dirpath, dirnames, filenames in os.walk(spec.prefix): + if not filenames: + continue + targdir = transform_path(spec, dirpath, path) + for fname in filenames: + dst = os.path.join(targdir, fname) + if not os.path.exists(dst): + continue + os.unlink(dst) + + +def link_one(spec, path, link=os.symlink, verbose=False): + 'Link all files in `spec` into directory `path`.' 
+ + dotspack = transform_path(spec, '.spack', path) + if os.path.exists(dotspack): + tty.warn('Skipping existing package: "%s"' % spec.name) + return + + if verbose: + tty.info('Linking package: "%s"' % spec.name) + for dirpath, dirnames, filenames in os.walk(spec.prefix): + if not filenames: + continue # avoid explicitly making empty dirs + + targdir = transform_path(spec, dirpath, path) + assuredir(targdir) + + for fname in filenames: + src = os.path.join(dirpath, fname) + dst = os.path.join(targdir, fname) + if os.path.exists(dst): + if '.spack' in dst.split(os.path.sep): + continue # silence these + tty.warn("Skipping existing file: %s" % dst) + continue + link(src, dst) + + +def visitor_symlink(specs, args): + 'Symlink all files found in specs' + path = args.path[0] + assuredir(path) + for spec in specs: + link_one(spec, path, verbose=args.verbose) +visitor_add = visitor_symlink +visitor_soft = visitor_symlink + + +def visitor_hardlink(specs, args): + 'Hardlink all files found in specs' + path = args.path[0] + assuredir(path) + for spec in specs: + link_one(spec, path, os.link, verbose=args.verbose) +visitor_hard = visitor_hardlink + + +def visitor_remove(specs, args): + 'Remove all files and directories found in specs from args.path' + path = args.path[0] + for spec in specs: + remove_one(spec, path, verbose=args.verbose) + purge_empty_directories(path) +visitor_rm = visitor_remove + + +def visitor_statlink(specs, args): + 'Give status of view in args.path relative to specs' + path = args.path[0] + for spec in specs: + check_one(spec, path, verbose=args.verbose) +visitor_status = visitor_statlink +visitor_check = visitor_statlink + + +def view(parser, args): + 'Produce a view of a set of packages.' + + # Process common args + seeds = [spack.cmd.disambiguate_spec(s) for s in args.specs] + specs = flatten(seeds, args.dependencies.lower() in ['yes', 'true']) + specs = filter_exclude(specs, args.exclude) + + # Execute the visitation. 
+ try: + visitor = globals()['visitor_' + args.action] + except KeyError: + tty.error('Unknown action: "%s"' % args.action) + visitor(specs, args) diff --git a/lib/spack/spack/config.py b/lib/spack/spack/config.py index e51016998c..db0787edc6 100644 --- a/lib/spack/spack/config.py +++ b/lib/spack/spack/config.py @@ -152,7 +152,7 @@ section_schemas = { 'type': 'object', 'additionalProperties': False, 'required': ['paths', 'spec', 'modules', 'operating_system'], - 'properties': { + 'properties': { 'paths': { 'type': 'object', 'required': ['cc', 'cxx', 'f77', 'fc'], diff --git a/lib/spack/spack/test/versions.py b/lib/spack/spack/test/versions.py index a026403e2e..4624f901c8 100644 --- a/lib/spack/spack/test/versions.py +++ b/lib/spack/spack/test/versions.py @@ -43,7 +43,6 @@ class VersionsTest(unittest.TestCase): self.assertFalse(a > b) self.assertFalse(a >= b) - def assert_ver_gt(self, a, b): a, b = ver(a), ver(b) self.assertTrue(a > b) @@ -53,7 +52,6 @@ class VersionsTest(unittest.TestCase): self.assertFalse(a < b) self.assertFalse(a <= b) - def assert_ver_eq(self, a, b): a, b = ver(a), ver(b) self.assertFalse(a > b) @@ -63,55 +61,43 @@ class VersionsTest(unittest.TestCase): self.assertFalse(a < b) self.assertTrue(a <= b) - def assert_in(self, needle, haystack): self.assertTrue(ver(needle) in ver(haystack)) - def assert_not_in(self, needle, haystack): self.assertFalse(ver(needle) in ver(haystack)) - def assert_canonical(self, canonical_list, version_list): self.assertEqual(ver(canonical_list), ver(version_list)) - def assert_overlaps(self, v1, v2): self.assertTrue(ver(v1).overlaps(ver(v2))) - def assert_no_overlap(self, v1, v2): self.assertFalse(ver(v1).overlaps(ver(v2))) - def assert_satisfies(self, v1, v2): self.assertTrue(ver(v1).satisfies(ver(v2))) - def assert_does_not_satisfy(self, v1, v2): self.assertFalse(ver(v1).satisfies(ver(v2))) - def check_intersection(self, expected, a, b): self.assertEqual(ver(expected), ver(a).intersection(ver(b))) - def check_union(self, expected, a, b): self.assertEqual(ver(expected), ver(a).union(ver(b))) - def test_two_segments(self): self.assert_ver_eq('1.0', '1.0') self.assert_ver_lt('1.0', '2.0') self.assert_ver_gt('2.0', '1.0') - def test_three_segments(self): self.assert_ver_eq('2.0.1', '2.0.1') self.assert_ver_lt('2.0', '2.0.1') self.assert_ver_gt('2.0.1', '2.0') - def test_alpha(self): # TODO: not sure whether I like this. 
2.0.1a is *usually* # TODO: less than 2.0.1, but special-casing it makes version @@ -120,7 +106,6 @@ class VersionsTest(unittest.TestCase): self.assert_ver_gt('2.0.1a', '2.0.1') self.assert_ver_lt('2.0.1', '2.0.1a') - def test_patch(self): self.assert_ver_eq('5.5p1', '5.5p1') self.assert_ver_lt('5.5p1', '5.5p2') @@ -129,7 +114,6 @@ class VersionsTest(unittest.TestCase): self.assert_ver_lt('5.5p1', '5.5p10') self.assert_ver_gt('5.5p10', '5.5p1') - def test_num_alpha_with_no_separator(self): self.assert_ver_lt('10xyz', '10.1xyz') self.assert_ver_gt('10.1xyz', '10xyz') @@ -137,7 +121,6 @@ class VersionsTest(unittest.TestCase): self.assert_ver_lt('xyz10', 'xyz10.1') self.assert_ver_gt('xyz10.1', 'xyz10') - def test_alpha_with_dots(self): self.assert_ver_eq('xyz.4', 'xyz.4') self.assert_ver_lt('xyz.4', '8') @@ -145,30 +128,25 @@ class VersionsTest(unittest.TestCase): self.assert_ver_lt('xyz.4', '2') self.assert_ver_gt('2', 'xyz.4') - def test_nums_and_patch(self): self.assert_ver_lt('5.5p2', '5.6p1') self.assert_ver_gt('5.6p1', '5.5p2') self.assert_ver_lt('5.6p1', '6.5p1') self.assert_ver_gt('6.5p1', '5.6p1') - def test_rc_versions(self): self.assert_ver_gt('6.0.rc1', '6.0') self.assert_ver_lt('6.0', '6.0.rc1') - def test_alpha_beta(self): self.assert_ver_gt('10b2', '10a1') self.assert_ver_lt('10a2', '10b2') - def test_double_alpha(self): self.assert_ver_eq('1.0aa', '1.0aa') self.assert_ver_lt('1.0a', '1.0aa') self.assert_ver_gt('1.0aa', '1.0a') - def test_padded_numbers(self): self.assert_ver_eq('10.0001', '10.0001') self.assert_ver_eq('10.0001', '10.1') @@ -176,24 +154,20 @@ class VersionsTest(unittest.TestCase): self.assert_ver_lt('10.0001', '10.0039') self.assert_ver_gt('10.0039', '10.0001') - def test_close_numbers(self): self.assert_ver_lt('4.999.9', '5.0') self.assert_ver_gt('5.0', '4.999.9') - def test_date_stamps(self): self.assert_ver_eq('20101121', '20101121') self.assert_ver_lt('20101121', '20101122') self.assert_ver_gt('20101122', '20101121') - def test_underscores(self): self.assert_ver_eq('2_0', '2_0') self.assert_ver_eq('2.0', '2_0') self.assert_ver_eq('2_0', '2.0') - def test_rpm_oddities(self): self.assert_ver_eq('1b.fc17', '1b.fc17') self.assert_ver_lt('1b.fc17', '1.fc17') @@ -202,7 +176,6 @@ class VersionsTest(unittest.TestCase): self.assert_ver_gt('1g.fc17', '1.fc17') self.assert_ver_lt('1.fc17', '1g.fc17') - # Stuff below here is not taken from RPM's tests and is # unique to spack def test_version_ranges(self): @@ -214,7 +187,6 @@ class VersionsTest(unittest.TestCase): self.assert_ver_lt('1.2:1.4', '1.5:1.6') self.assert_ver_gt('1.5:1.6', '1.2:1.4') - def test_contains(self): self.assert_in('1.3', '1.2:1.4') self.assert_in('1.2.5', '1.2:1.4') @@ -233,7 +205,6 @@ class VersionsTest(unittest.TestCase): self.assert_in('1.4.1', '1.2.7:1.4') self.assert_not_in('1.4.1', '1.2.7:1.4.0') - def test_in_list(self): self.assert_in('1.2', ['1.5', '1.2', '1.3']) self.assert_in('1.2.5', ['1.5', '1.2:1.3']) @@ -245,7 +216,6 @@ class VersionsTest(unittest.TestCase): self.assert_not_in('1.2.5:1.5', ['1.5', '1.2:1.3']) self.assert_not_in('1.1:1.2.5', ['1.5', '1.2:1.3']) - def test_ranges_overlap(self): self.assert_overlaps('1.2', '1.2') self.assert_overlaps('1.2.1', '1.2.1') @@ -262,7 +232,6 @@ class VersionsTest(unittest.TestCase): self.assert_overlaps(':', '1.6:1.9') self.assert_overlaps('1.6:1.9', ':') - def test_overlap_with_containment(self): self.assert_in('1.6.5', '1.6') self.assert_in('1.6.5', ':1.6') @@ -273,7 +242,6 @@ class VersionsTest(unittest.TestCase): 
self.assert_not_in(':1.6', '1.6.5') self.assert_in('1.6.5', ':1.6') - def test_lists_overlap(self): self.assert_overlaps('1.2b:1.7,5', '1.6:1.9,1') self.assert_overlaps('1,2,3,4,5', '3,4,5,6,7') @@ -287,7 +255,6 @@ class VersionsTest(unittest.TestCase): self.assert_no_overlap('1,2,3,4,5', '6,7') self.assert_no_overlap('1,2,3,4,5', '6:7') - def test_canonicalize_list(self): self.assert_canonical(['1.2', '1.3', '1.4'], ['1.2', '1.3', '1.3', '1.4']) @@ -316,7 +283,6 @@ class VersionsTest(unittest.TestCase): self.assert_canonical([':'], [':,1.3, 1.3.1,1.3.9,1.4 : 1.5 , 1.3 : 1.4']) - def test_intersection(self): self.check_intersection('2.5', '1.0:2.5', '2.5:3.0') @@ -325,12 +291,11 @@ class VersionsTest(unittest.TestCase): self.check_intersection('0:1', ':', '0:1') self.check_intersection(['1.0', '2.5:2.7'], - ['1.0:2.7'], ['2.5:3.0','1.0']) + ['1.0:2.7'], ['2.5:3.0', '1.0']) self.check_intersection(['2.5:2.7'], - ['1.1:2.7'], ['2.5:3.0','1.0']) + ['1.1:2.7'], ['2.5:3.0', '1.0']) self.check_intersection(['0:1'], [':'], ['0:1']) - def test_intersect_with_containment(self): self.check_intersection('1.6.5', '1.6.5', ':1.6') self.check_intersection('1.6.5', ':1.6', '1.6.5') @@ -338,7 +303,6 @@ class VersionsTest(unittest.TestCase): self.check_intersection('1.6:1.6.5', ':1.6.5', '1.6') self.check_intersection('1.6:1.6.5', '1.6', ':1.6.5') - def test_union_with_containment(self): self.check_union(':1.6', '1.6.5', ':1.6') self.check_union(':1.6', ':1.6', '1.6.5') @@ -346,8 +310,6 @@ class VersionsTest(unittest.TestCase): self.check_union(':1.6', ':1.6.5', '1.6') self.check_union(':1.6', '1.6', ':1.6.5') - - def test_union_with_containment(self): self.check_union(':', '1.0:', ':2.0') self.check_union('1:4', '1:3', '2:4') @@ -356,7 +318,6 @@ class VersionsTest(unittest.TestCase): # Tests successor/predecessor case. 
self.check_union('1:4', '1:2', '3:4') - def test_basic_version_satisfaction(self): self.assert_satisfies('4.7.3', '4.7.3') @@ -372,7 +333,6 @@ class VersionsTest(unittest.TestCase): self.assert_does_not_satisfy('4.8', '4.9') self.assert_does_not_satisfy('4', '4.9') - def test_basic_version_satisfaction_in_lists(self): self.assert_satisfies(['4.7.3'], ['4.7.3']) @@ -388,7 +348,6 @@ class VersionsTest(unittest.TestCase): self.assert_does_not_satisfy(['4.8'], ['4.9']) self.assert_does_not_satisfy(['4'], ['4.9']) - def test_version_range_satisfaction(self): self.assert_satisfies('4.7b6', '4.3:4.7') self.assert_satisfies('4.3.0', '4.3:4.7') @@ -400,7 +359,6 @@ class VersionsTest(unittest.TestCase): self.assert_satisfies('4.7b6', '4.3:4.7') self.assert_does_not_satisfy('4.8.0', '4.3:4.7') - def test_version_range_satisfaction_in_lists(self): self.assert_satisfies(['4.7b6'], ['4.3:4.7']) self.assert_satisfies(['4.3.0'], ['4.3:4.7']) @@ -423,3 +381,11 @@ class VersionsTest(unittest.TestCase): self.assert_satisfies('4.8.0', '4.2, 4.3:4.8') self.assert_satisfies('4.8.2', '4.2, 4.3:4.8') + + def test_formatted_strings(self): + versions = '1.2.3', '1_2_3', '1-2-3' + for item in versions: + v = Version(item) + self.assertEqual(v.dotted, '1.2.3') + self.assertEqual(v.dashed, '1-2-3') + self.assertEqual(v.underscored, '1_2_3') diff --git a/lib/spack/spack/version.py b/lib/spack/spack/version.py index 247f6d2362..858d581472 100644 --- a/lib/spack/spack/version.py +++ b/lib/spack/spack/version.py @@ -43,16 +43,16 @@ be called on any of the types:: intersection concrete """ -import os -import sys import re from bisect import bisect_left from functools import wraps + from functools_backport import total_ordering # Valid version characters VALID_VERSION = r'[A-Za-z0-9_.-]' + def int_if_int(string): """Convert a string to int if possible. Otherwise, return a string.""" try: @@ -62,10 +62,11 @@ def int_if_int(string): def coerce_versions(a, b): - """Convert both a and b to the 'greatest' type between them, in this order: + """ + Convert both a and b to the 'greatest' type between them, in this order: Version < VersionRange < VersionList - This is used to simplify comparison operations below so that we're always - comparing things that are of the same type. + This is used to simplify comparison operations below so that we're always + comparing things that are of the same type. """ order = (Version, VersionRange, VersionList) ta, tb = type(a), type(b) @@ -105,6 +106,7 @@ def coerced(method): @total_ordering class Version(object): """Class to represent versions""" + def __init__(self, string): string = str(string) @@ -124,6 +126,17 @@ class Version(object): # last element of separators is '' self.separators = tuple(re.split(segment_regex, string)[1:-1]) + @property + def dotted(self): + return '.'.join(str(x) for x in self.version) + + @property + def underscored(self): + return '_'.join(str(x) for x in self.version) + + @property + def dashed(self): + return '-'.join(str(x) for x in self.version) def up_to(self, index): """Return a version string up to the specified component, exclusive. @@ -131,15 +144,12 @@ class Version(object): """ return '.'.join(str(x) for x in self[:index]) - def lowest(self): return self - def highest(self): return self - @coerced def satisfies(self, other): """A Version 'satisfies' another if it is at least as specific and has a @@ -147,11 +157,10 @@ class Version(object): gcc@4.7 so that when a user asks to build with gcc@4.7, we can find a suitable compiler. 
""" - nself = len(self.version) + nself = len(self.version) nother = len(other.version) return nother <= nself and self.version[:nother] == other.version - def wildcard(self): """Create a regex that will match variants of this version string.""" def a_or_n(seg): @@ -181,28 +190,22 @@ class Version(object): wc += '(?:[a-z]|alpha|beta)?)?' * (len(segments) - 1) return wc - def __iter__(self): return iter(self.version) - def __getitem__(self, idx): return tuple(self.version[idx]) - def __repr__(self): return self.string - def __str__(self): return self.string - @property def concrete(self): return self - @coerced def __lt__(self, other): """Version comparison is designed for consistency with the way RPM @@ -235,28 +238,23 @@ class Version(object): # If the common prefix is equal, the one with more segments is bigger. return len(self.version) < len(other.version) - @coerced def __eq__(self, other): return (other is not None and type(other) == Version and self.version == other.version) - def __ne__(self, other): return not (self == other) - def __hash__(self): return hash(self.version) - @coerced def __contains__(self, other): if other is None: return False return other.version[:len(self.version)] == self.version - def is_predecessor(self, other): """True if the other version is the immediate predecessor of this one. That is, NO versions v exist such that: @@ -269,16 +267,13 @@ class Version(object): ol = other.version[-1] return type(sl) == int and type(ol) == int and (ol - sl == 1) - def is_successor(self, other): return other.is_predecessor(self) - @coerced def overlaps(self, other): return self in other or other in self - @coerced def union(self, other): if self == other or other in self: @@ -288,7 +283,6 @@ class Version(object): else: return VersionList([self, other]) - @coerced def intersection(self, other): if self == other: @@ -299,6 +293,7 @@ class Version(object): @total_ordering class VersionRange(object): + def __init__(self, start, end): if isinstance(start, basestring): start = Version(start) @@ -310,15 +305,12 @@ class VersionRange(object): if start and end and end < start: raise ValueError("Invalid Version range: %s" % self) - def lowest(self): return self.start - def highest(self): return self.end - @coerced def __lt__(self, other): """Sort VersionRanges lexicographically so that they are ordered first @@ -331,28 +323,24 @@ class VersionRange(object): s, o = self, other if s.start != o.start: - return s.start is None or (o.start is not None and s.start < o.start) + return s.start is None or (o.start is not None and s.start < o.start) # NOQA: ignore=E501 return (s.end != o.end and o.end is None or (s.end is not None and s.end < o.end)) - @coerced def __eq__(self, other): return (other is not None and type(other) == VersionRange and self.start == other.start and self.end == other.end) - def __ne__(self, other): return not (self == other) - @property def concrete(self): return self.start if self.start == self.end else None - @coerced def __contains__(self, other): if other is None: @@ -373,57 +361,55 @@ class VersionRange(object): other.end in self.end))) return in_upper - @coerced def satisfies(self, other): - """A VersionRange satisfies another if some version in this range - would satisfy some version in the other range. To do this it must - either: - a) Overlap with the other range - b) The start of this range satisfies the end of the other range. - - This is essentially the same as overlaps(), but overlaps assumes - that its arguments are specific. 
That is, 4.7 is interpreted as - 4.7.0.0.0.0... . This funciton assumes that 4.7 woudl be satisfied - by 4.7.3.5, etc. - - Rationale: - If a user asks for gcc@4.5:4.7, and a package is only compatible with - gcc@4.7.3:4.8, then that package should be able to build under the - constraints. Just using overlaps() would not work here. - - Note that we don't need to check whether the end of this range - would satisfy the start of the other range, because overlaps() - already covers that case. - - Note further that overlaps() is a symmetric operation, while - satisfies() is not. + """ + A VersionRange satisfies another if some version in this range + would satisfy some version in the other range. To do this it must + either: + a) Overlap with the other range + b) The start of this range satisfies the end of the other range. + + This is essentially the same as overlaps(), but overlaps assumes + that its arguments are specific. That is, 4.7 is interpreted as + 4.7.0.0.0.0... . This funciton assumes that 4.7 woudl be satisfied + by 4.7.3.5, etc. + + Rationale: + If a user asks for gcc@4.5:4.7, and a package is only compatible with + gcc@4.7.3:4.8, then that package should be able to build under the + constraints. Just using overlaps() would not work here. + + Note that we don't need to check whether the end of this range + would satisfy the start of the other range, because overlaps() + already covers that case. + + Note further that overlaps() is a symmetric operation, while + satisfies() is not. """ return (self.overlaps(other) or # if either self.start or other.end are None, then this can't # satisfy, or overlaps() would've taken care of it. self.start and other.end and self.start.satisfies(other.end)) - @coerced def overlaps(self, other): - return ((self.start == None or other.end is None or + return ((self.start is None or other.end is None or self.start <= other.end or other.end in self.start or self.start in other.end) and - (other.start is None or self.end == None or + (other.start is None or self.end is None or other.start <= self.end or other.start in self.end or self.end in other.start)) - @coerced def union(self, other): if not self.overlaps(other): if (self.end is not None and other.start is not None and - self.end.is_predecessor(other.start)): + self.end.is_predecessor(other.start)): return VersionRange(self.start, other.end) if (other.end is not None and self.start is not None and - other.end.is_predecessor(self.start)): + other.end.is_predecessor(self.start)): return VersionRange(other.start, self.end) return VersionList([self, other]) @@ -442,13 +428,12 @@ class VersionRange(object): else: end = self.end # TODO: See note in intersection() about < and in discrepancy. - if not other.end in self.end: + if other.end not in self.end: if end in other.end or other.end > self.end: end = other.end return VersionRange(start, end) - @coerced def intersection(self, other): if self.overlaps(other): @@ -470,7 +455,7 @@ class VersionRange(object): # 1.6 < 1.6.5 = True (lexicographic) # Should 1.6 NOT be less than 1.6.5? Hm. # Here we test (not end in other.end) first to avoid paradox. 
- if other.end is not None and not end in other.end: + if other.end is not None and end not in other.end: if other.end < end or other.end in end: end = other.end @@ -479,15 +464,12 @@ class VersionRange(object): else: return VersionList() - def __hash__(self): return hash((self.start, self.end)) - def __repr__(self): return self.__str__() - def __str__(self): out = "" if self.start: @@ -501,6 +483,7 @@ class VersionRange(object): @total_ordering class VersionList(object): """Sorted, non-redundant list of Versions and VersionRanges.""" + def __init__(self, vlist=None): self.versions = [] if vlist is not None: @@ -515,7 +498,6 @@ class VersionList(object): for v in vlist: self.add(ver(v)) - def add(self, version): if type(version) in (Version, VersionRange): # This normalizes single-value version ranges. @@ -524,9 +506,9 @@ class VersionList(object): i = bisect_left(self, version) - while i-1 >= 0 and version.overlaps(self[i-1]): - version = version.union(self[i-1]) - del self.versions[i-1] + while i - 1 >= 0 and version.overlaps(self[i - 1]): + version = version.union(self[i - 1]) + del self.versions[i - 1] i -= 1 while i < len(self) and version.overlaps(self[i]): @@ -542,7 +524,6 @@ class VersionList(object): else: raise TypeError("Can't add %s to VersionList" % type(version)) - @property def concrete(self): if len(self) == 1: @@ -550,11 +531,9 @@ class VersionList(object): else: return None - def copy(self): return VersionList(self) - def lowest(self): """Get the lowest version in the list.""" if not self: @@ -562,7 +541,6 @@ class VersionList(object): else: return self[0].lowest() - def highest(self): """Get the highest version in the list.""" if not self: @@ -570,7 +548,6 @@ class VersionList(object): else: return self[-1].highest() - @coerced def overlaps(self, other): if not other or not self: @@ -586,14 +563,12 @@ class VersionList(object): o += 1 return False - def to_dict(self): """Generate human-readable dict for YAML.""" if self.concrete: - return { 'version' : str(self[0]) } + return {'version': str(self[0])} else: - return { 'versions' : [str(v) for v in self] } - + return {'versions': [str(v) for v in self]} @staticmethod def from_dict(dictionary): @@ -605,7 +580,6 @@ class VersionList(object): else: raise ValueError("Dict must have 'version' or 'versions' in it.") - @coerced def satisfies(self, other, strict=False): """A VersionList satisfies another if some version in the list @@ -633,20 +607,17 @@ class VersionList(object): o += 1 return False - @coerced def update(self, other): for v in other.versions: self.add(v) - @coerced def union(self, other): result = self.copy() result.update(other) return result - @coerced def intersection(self, other): # TODO: make this faster. This is O(n^2). @@ -656,7 +627,6 @@ class VersionList(object): result.add(s.intersection(o)) return result - @coerced def intersect(self, other): """Intersect this spec's list with other. 
@@ -678,50 +648,40 @@ class VersionList(object): if i == 0: if version not in self[0]: return False - elif all(version not in v for v in self[i-1:]): + elif all(version not in v for v in self[i - 1:]): return False return True - def __getitem__(self, index): return self.versions[index] - def __iter__(self): return iter(self.versions) - def __reversed__(self): return reversed(self.versions) - def __len__(self): return len(self.versions) - @coerced def __eq__(self, other): return other is not None and self.versions == other.versions - def __ne__(self, other): return not (self == other) - @coerced def __lt__(self, other): return other is not None and self.versions < other.versions - def __hash__(self): return hash(tuple(self.versions)) - def __str__(self): return ",".join(str(v) for v in self.versions) - def __repr__(self): return str(self.versions) @@ -730,7 +690,7 @@ def _string_to_version(string): """Converts a string to a Version, VersionList, or VersionRange. This is private. Client code should use ver(). """ - string = string.replace(' ','') + string = string.replace(' ', '') if ',' in string: return VersionList(string.split(',')) @@ -738,7 +698,7 @@ def _string_to_version(string): elif ':' in string: s, e = string.split(':') start = Version(s) if s else None - end = Version(e) if e else None + end = Version(e) if e else None return VersionRange(start, end) else: diff --git a/var/spack/repos/builtin/packages/boost/package.py b/var/spack/repos/builtin/packages/boost/package.py index 1f216a146a..cde76c590a 100644 --- a/var/spack/repos/builtin/packages/boost/package.py +++ b/var/spack/repos/builtin/packages/boost/package.py @@ -27,7 +27,7 @@ import spack import sys import os -import sys + class Boost(Package): """Boost provides free peer-reviewed portable C++ source @@ -75,23 +75,23 @@ class Boost(Package): version('1.34.0', 'ed5b9291ffad776f8757a916e1726ad0') default_install_libs = set(['atomic', - 'chrono', - 'date_time', - 'filesystem', - 'graph', - 'iostreams', - 'locale', - 'log', - 'math', - 'program_options', - 'random', - 'regex', - 'serialization', - 'signals', - 'system', - 'test', - 'thread', - 'wave']) + 'chrono', + 'date_time', + 'filesystem', + 'graph', + 'iostreams', + 'locale', + 'log', + 'math', + 'program_options', + 'random', + 'regex', + 'serialization', + 'signals', + 'system', + 'test', + 'thread', + 'wave']) # mpi/python are not installed by default because they pull in many # dependencies and/or because there is a great deal of customization @@ -109,6 +109,7 @@ class Boost(Package): variant('multithreaded', default=True, description="Build multi-threaded versions of libraries") variant('singlethreaded', default=True, description="Build single-threaded versions of libraries") variant('icu_support', default=False, description="Include ICU support (for regex/locale libraries)") + variant('graph', default=False, description="Build the Boost Graph library") depends_on('icu', when='+icu_support') depends_on('python', when='+python') @@ -120,12 +121,15 @@ class Boost(Package): patch('boost_11856.patch', when='@1.60.0%gcc@4.4.7') def url_for_version(self, version): - """Handle Boost's weird URLs, which write the version two different ways.""" + """ + Handle Boost's weird URLs, + which write the version two different ways. 
+ """ parts = [str(p) for p in Version(version)] dots = ".".join(parts) underscores = "_".join(parts) - return "http://downloads.sourceforge.net/project/boost/boost/%s/boost_%s.tar.bz2" % ( - dots, underscores) + return "http://downloads.sourceforge.net/project/boost" \ + "/boost/%s/boost_%s.tar.bz2" % (dots, underscores) def determine_toolset(self, spec): if spec.satisfies("platform=darwin"): @@ -149,20 +153,20 @@ class Boost(Package): if '+python' in spec: options.append('--with-python=%s' % - join_path(spec['python'].prefix.bin, 'python')) + join_path(spec['python'].prefix.bin, 'python')) with open('user-config.jam', 'w') as f: compiler_wrapper = join_path(spack.build_env_path, 'c++') f.write("using {0} : : {1} ;\n".format(boostToolsetId, - compiler_wrapper)) + compiler_wrapper)) if '+mpi' in spec: f.write('using mpi : %s ;\n' % - join_path(spec['mpi'].prefix.bin, 'mpicxx')) + join_path(spec['mpi'].prefix.bin, 'mpicxx')) if '+python' in spec: f.write('using python : %s : %s ;\n' % - (spec['python'].version, - join_path(spec['python'].prefix.bin, 'python'))) + (spec['python'].version, + join_path(spec['python'].prefix.bin, 'python'))) def determine_b2_options(self, spec, options): if '+debug' in spec: @@ -178,8 +182,7 @@ class Boost(Package): '-s', 'BZIP2_INCLUDE=%s' % spec['bzip2'].prefix.include, '-s', 'BZIP2_LIBPATH=%s' % spec['bzip2'].prefix.lib, '-s', 'ZLIB_INCLUDE=%s' % spec['zlib'].prefix.include, - '-s', 'ZLIB_LIBPATH=%s' % spec['zlib'].prefix.lib, - ]) + '-s', 'ZLIB_LIBPATH=%s' % spec['zlib'].prefix.lib]) linkTypes = ['static'] if '+shared' in spec: @@ -191,7 +194,8 @@ class Boost(Package): if '+singlethreaded' in spec: threadingOpts.append('single') if not threadingOpts: - raise RuntimeError("At least one of {singlethreaded, multithreaded} must be enabled") + raise RuntimeError("""At least one of {singlethreaded, + multithreaded} must be enabled""") options.extend([ 'toolset=%s' % self.determine_toolset(spec), @@ -202,9 +206,9 @@ class Boost(Package): def install(self, spec, prefix): # On Darwin, Boost expects the Darwin libtool. However, one of the - # dependencies may have pulled in Spack's GNU libtool, and these two are - # not compatible. We thus create a symlink to Darwin's libtool and add - # it at the beginning of PATH. + # dependencies may have pulled in Spack's GNU libtool, and these two + # are not compatible. We thus create a symlink to Darwin's libtool + # and add it at the beginning of PATH. if sys.platform == 'darwin': newdir = os.path.abspath('darwin-libtool') mkdirp(newdir) @@ -217,7 +221,8 @@ class Boost(Package): withLibs.append(lib) if not withLibs: # if no libraries are specified for compilation, then you dont have - # to configure/build anything, just copy over to the prefix directory. + # to configure/build anything, just copy over to the prefix + # directory. 
src = join_path(self.stage.source_path, 'boost') mkdirp(join_path(prefix, 'include')) dst = join_path(prefix, 'include', 'boost') @@ -235,6 +240,9 @@ class Boost(Package): withLibs.remove('chrono') if not spec.satisfies('@1.43.0:'): withLibs.remove('random') + if '+graph' in spec and '+mpi' in spec: + withLibs.remove('graph') + withLibs.append('graph_parallel') # to make Boost find the user-config.jam env['BOOST_BUILD_PATH'] = './' @@ -259,6 +267,7 @@ class Boost(Package): for threadingOpt in threadingOpts: b2('install', 'threading=%s' % threadingOpt, *b2_options) - # The shared libraries are not installed correctly on Darwin; correct this + # The shared libraries are not installed correctly + # on Darwin; correct this if (sys.platform == 'darwin') and ('+shared' in spec): fix_darwin_install_name(prefix.lib) diff --git a/var/spack/repos/builtin/packages/c-blosc/package.py b/var/spack/repos/builtin/packages/c-blosc/package.py new file mode 100644 index 0000000000..dee332be14 --- /dev/null +++ b/var/spack/repos/builtin/packages/c-blosc/package.py @@ -0,0 +1,51 @@ +############################################################################## +# Copyright (c) 2013-2016, Lawrence Livermore National Security, LLC. +# Produced at the Lawrence Livermore National Laboratory. +# +# This file is part of Spack. +# Created by Todd Gamblin, tgamblin@llnl.gov, All rights reserved. +# LLNL-CODE-647188 +# +# For details, see https://github.com/llnl/spack +# Please also see the LICENSE file for our notice and the LGPL. +# +# This program is free software; you can redistribute it and/or modify +# it under the terms of the GNU Lesser General Public License (as +# published by the Free Software Foundation) version 2.1, February 1999. +# +# This program is distributed in the hope that it will be useful, but +# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and +# conditions of the GNU Lesser General Public License for more details. 
+# +# You should have received a copy of the GNU Lesser General Public +# License along with this program; if not, write to the Free Software +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA +############################################################################## + +import sys + +from spack import * + +class CBlosc(Package): + """Blosc, an extremely fast, multi-threaded, meta-compressor library""" + homepage = "http://www.blosc.org" + url = "https://github.com/Blosc/c-blosc/archive/v1.9.2.tar.gz" + + version('1.9.2', 'dd2d83069d74b36b8093f1c6b49defc5') + version('1.9.1', '7d708d3daadfacf984a87b71b1734ce2') + version('1.9.0', 'e4c1dc8e2c468e5cfa2bf05eeee5357a') + version('1.8.1', 'd73d5be01359cf271e9386c90dcf5b05') + version('1.8.0', '5b92ecb287695ba20cc33d30bf221c4f') + + depends_on("cmake") + depends_on("snappy") + depends_on("zlib") + + def install(self, spec, prefix): + cmake('.', *std_cmake_args) + + make() + make("install") + if sys.platform == 'darwin': + fix_darwin_install_name(prefix.lib) diff --git a/var/spack/repos/builtin/packages/caliper/package.py b/var/spack/repos/builtin/packages/caliper/package.py index 4b8fe0d8af..a424c73859 100644 --- a/var/spack/repos/builtin/packages/caliper/package.py +++ b/var/spack/repos/builtin/packages/caliper/package.py @@ -34,7 +34,7 @@ class Caliper(Package): homepage = "https://github.com/LLNL/Caliper" url = "" - version('master', git='ssh://git@github.com:LLNL/Caliper.git') + version('master', git='https://github.com/LLNL/Caliper.git') variant('mpi', default=False, description='Enable MPI function wrappers.') diff --git a/var/spack/repos/builtin/packages/dealii/package.py b/var/spack/repos/builtin/packages/dealii/package.py index 9c09b205a8..54b6426d36 100644 --- a/var/spack/repos/builtin/packages/dealii/package.py +++ b/var/spack/repos/builtin/packages/dealii/package.py @@ -244,38 +244,47 @@ class Dealii(Package): print('========= Step-40 Trilinos ==========') print('=====================================') # change Linear Algebra to Trilinos - filter_file(r'(\/\/ #define FORCE_USE_OF_TRILINOS.*)', - ('#define FORCE_USE_OF_TRILINOS'), 'step-40.cc') + # The below filter_file should be different for versions + # before and after 8.4.0 + if spec.satisfies('@8.4.0:'): + filter_file(r'(\/\/ #define FORCE_USE_OF_TRILINOS.*)', + ('#define FORCE_USE_OF_TRILINOS'), 'step-40.cc') + else: + filter_file(r'(#define USE_PETSC_LA.*)', + ('// #define USE_PETSC_LA'), 'step-40.cc') if '^trilinos+hypre' in spec: make('release') make('run', parallel=False) - print('=====================================') - print('=== Step-40 Trilinos SuperluDist ====') - print('=====================================') - # change to direct solvers - filter_file(r'(LA::SolverCG solver\(solver_control\);)', ('TrilinosWrappers::SolverDirect::AdditionalData data(false,"Amesos_Superludist"); TrilinosWrappers::SolverDirect solver(solver_control,data);'), 'step-40.cc') # NOQA: ignore=E501 - filter_file(r'(LA::MPI::PreconditionAMG preconditioner;)', - (''), 'step-40.cc') - filter_file(r'(LA::MPI::PreconditionAMG::AdditionalData data;)', - (''), 'step-40.cc') - filter_file(r'(preconditioner.initialize\(system_matrix, data\);)', - (''), 'step-40.cc') - filter_file(r'(solver\.solve \(system_matrix, completely_distributed_solution, system_rhs,)', ('solver.solve (system_matrix, completely_distributed_solution, system_rhs);'), 'step-40.cc') # NOQA: ignore=E501 - filter_file(r'(preconditioner\);)', (''), 'step-40.cc') - if '^trilinos+superlu-dist' in spec: - 
make('release') - make('run', paralle=False) + # the rest of the tests on step 40 only works for + # dealii version 8.4.0 and after + if spec.satisfies('@8.4.0:'): + print('=====================================') + print('=== Step-40 Trilinos SuperluDist ====') + print('=====================================') + # change to direct solvers + filter_file(r'(LA::SolverCG solver\(solver_control\);)', ('TrilinosWrappers::SolverDirect::AdditionalData data(false,"Amesos_Superludist"); TrilinosWrappers::SolverDirect solver(solver_control,data);'), 'step-40.cc') # NOQA: ignore=E501 + filter_file(r'(LA::MPI::PreconditionAMG preconditioner;)', + (''), 'step-40.cc') + filter_file(r'(LA::MPI::PreconditionAMG::AdditionalData data;)', + (''), 'step-40.cc') + filter_file(r'(preconditioner.initialize\(system_matrix, data\);)', + (''), 'step-40.cc') + filter_file(r'(solver\.solve \(system_matrix, completely_distributed_solution, system_rhs,)', ('solver.solve (system_matrix, completely_distributed_solution, system_rhs);'), 'step-40.cc') # NOQA: ignore=E501 + filter_file(r'(preconditioner\);)', (''), 'step-40.cc') + if '^trilinos+superlu-dist' in spec: + make('release') + make('run', paralle=False) - print('=====================================') - print('====== Step-40 Trilinos MUMPS =======') - print('=====================================') - # switch to Mumps - filter_file(r'(Amesos_Superludist)', - ('Amesos_Mumps'), 'step-40.cc') - if '^trilinos+mumps' in spec: - make('release') - make('run', parallel=False) + print('=====================================') + print('====== Step-40 Trilinos MUMPS =======') + print('=====================================') + # switch to Mumps + filter_file(r'(Amesos_Superludist)', + ('Amesos_Mumps'), 'step-40.cc') + if '^trilinos+mumps' in spec: + make('release') + make('run', parallel=False) print('=====================================') print('============ Step-36 ================') diff --git a/var/spack/repos/builtin/packages/fenics/package.py b/var/spack/repos/builtin/packages/fenics/package.py new file mode 100644 index 0000000000..0845237656 --- /dev/null +++ b/var/spack/repos/builtin/packages/fenics/package.py @@ -0,0 +1,176 @@ +############################################################################## +# Copyright (c) 2013-2016, Lawrence Livermore National Security, LLC. +# Produced at the Lawrence Livermore National Laboratory. +# +# This file is part of Spack. +# Created by Todd Gamblin, tgamblin@llnl.gov, All rights reserved. +# LLNL-CODE-647188 +# +# For details, see https://github.com/llnl/spack +# Please also see the LICENSE file for our notice and the LGPL. +# +# This program is free software; you can redistribute it and/or modify +# it under the terms of the GNU Lesser General Public License (as +# published by the Free Software Foundation) version 2.1, February 1999. +# +# This program is distributed in the hope that it will be useful, but +# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and +# conditions of the GNU Lesser General Public License for more details. 
+# +# You should have received a copy of the GNU Lesser General Public +# License along with this program; if not, write to the Free Software +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA +############################################################################## +from spack import * + + +class Fenics(Package): + """FEniCS is organized as a collection of interoperable components + that together form the FEniCS Project. These components include + the problem-solving environment DOLFIN, the form compiler FFC, the + finite element tabulator FIAT, the just-in-time compiler Instant, + the code generation interface UFC, the form language UFL and a + range of additional components.""" + + homepage = "http://fenicsproject.org/" + url = "https://bitbucket.org/fenics-project/dolfin/downloads/dolfin-1.6.0.tar.gz" + + base_url = "https://bitbucket.org/fenics-project/{pkg}/downloads/{pkg}-{version}.tar.gz" # NOQA: ignore E501 + + variant('hdf5', default=True, description='Compile with HDF5') + variant('parmetis', default=True, description='Compile with ParMETIS') + variant('scotch', default=True, description='Compile with Scotch') + variant('petsc', default=True, description='Compile with PETSc') + variant('slepc', default=True, description='Compile with SLEPc') + variant('trilinos', default=True, description='Compile with Trilinos') + variant('suite-sparse', default=True, description='Compile with SuiteSparse solvers') + variant('vtk', default=False, description='Compile with VTK') + variant('qt', default=False, description='Compile with QT') + variant('mpi', default=True, description='Enables the distributed memory support') + variant('openmp', default=True, description='Enables the shared memory support') + variant('shared', default=True, description='Enables the build of shared libraries') + variant('debug', default=False, description='Builds a debug version of the libraries') + + # not part of spack list for now + # variant('petsc4py', default=True, description='Uses PETSc4py') + # variant('slepc4py', default=True, description='Uses SLEPc4py') + # variant('pastix', default=True, description='Compile with Pastix') + + extends('python') + + depends_on('py-numpy') + depends_on('py-ply') + depends_on('py-six') + depends_on('py-sphinx@1.0.1:', when='+doc') + depends_on('eigen@3.2.0:') + depends_on('boost') + depends_on('mpi', when='+mpi') + depends_on('hdf5', when='+hdf5') + depends_on('parmetis@4.0.2:^metis+real64', when='+parmetis') + depends_on('scotch~metis', when='+scotch~mpi') + depends_on('scotch+mpi~metis', when='+scotch+mpi') + depends_on('petsc@3.4:', when='+petsc') + depends_on('slepc@3.4:', when='+slepc') + depends_on('trilinos', when='+trilinos') + depends_on('vtk', when='+vtk') + depends_on('suite-sparse', when='+suite-sparse') + depends_on('qt', when='+qt') + + # This are the build dependencies + depends_on('py-setuptools') + depends_on('cmake@2.8.12:') + depends_on('swig@3.0.3:') + + releases = [ + { + 'version': '1.6.0', + 'md5': '35cb4baf7ab4152a40fb7310b34d5800', + 'resources': { + 'ffc': '358faa3e9da62a1b1a717070217b793e', + 'fiat': 'f4509d05c911fd93cea8d288a78a6c6f', + 'instant': '5f2522eb032a5bebbad6597b6fe0732a', + 'ufl': 'c40c5f04eaa847377ab2323122284016', + } + }, + { + 'version': '1.5.0', + 'md5': '9b589a3534299a5e6d22c13c5eb30bb8', + 'resources': { + 'ffc': '343f6d30e7e77d329a400fd8e73e0b63', + 'fiat': 'da3fa4dd8177bb251e7f68ec9c7cf6c5', + 'instant': 'b744023ded27ee9df4a8d8c6698c0d58', + 'ufl': '130d7829cf5a4bd5b52bf6d0955116fd', + } + }, + ] 
+ + for release in releases: + version(release['version'], release['md5'], url=base_url.format(pkg='dolfin', version=release['version'])) + for name, md5 in release['resources'].items(): + resource(name=name, + url=base_url.format(pkg=name, **release), + md5=md5, + destination='depends', + when='@{version}'.format(**release), + placement=name) + + def cmake_is_on(self, option): + return 'ON' if option in self.spec else 'OFF' + + def install(self, spec, prefix): + for package in ['ufl', 'ffc', 'fiat', 'instant']: + with working_dir(join_path('depends', package)): + python('setup.py', 'install', '--prefix=%s' % prefix) + + cmake_args = [ + '-DCMAKE_BUILD_TYPE:STRING={0}'.format( + 'Debug' if '+debug' in spec else 'RelWithDebInfo'), + '-DBUILD_SHARED_LIBS:BOOL={0}'.format( + self.cmake_is_on('+shared')), + '-DDOLFIN_SKIP_BUILD_TESTS:BOOL=ON', + '-DDOLFIN_ENABLE_OPENMP:BOOL={0}'.format( + self.cmake_is_on('+openmp')), + '-DDOLFIN_ENABLE_CHOLMOD:BOOL={0}'.format( + self.cmake_is_on('suite-sparse')), + '-DDOLFIN_ENABLE_HDF5:BOOL={0}'.format( + self.cmake_is_on('hdf5')), + '-DDOLFIN_ENABLE_MPI:BOOL={0}'.format( + self.cmake_is_on('mpi')), + '-DDOLFIN_ENABLE_PARMETIS:BOOL={0}'.format( + self.cmake_is_on('parmetis')), + '-DDOLFIN_ENABLE_PASTIX:BOOL={0}'.format( + self.cmake_is_on('pastix')), + '-DDOLFIN_ENABLE_PETSC:BOOL={0}'.format( + self.cmake_is_on('petsc')), + '-DDOLFIN_ENABLE_PETSC4PY:BOOL={0}'.format( + self.cmake_is_on('py-petsc4py')), + '-DDOLFIN_ENABLE_PYTHON:BOOL={0}'.format( + self.cmake_is_on('python')), + '-DDOLFIN_ENABLE_QT:BOOL={0}'.format( + self.cmake_is_on('qt')), + '-DDOLFIN_ENABLE_SCOTCH:BOOL={0}'.format( + self.cmake_is_on('scotch')), + '-DDOLFIN_ENABLE_SLEPC:BOOL={0}'.format( + self.cmake_is_on('slepc')), + '-DDOLFIN_ENABLE_SLEPC4PY:BOOL={0}'.format( + self.cmake_is_on('py-slepc4py')), + '-DDOLFIN_ENABLE_SPHINX:BOOL={0}'.format( + self.cmake_is_on('py-sphinx')), + '-DDOLFIN_ENABLE_TRILINOS:BOOL={0}'.format( + self.cmake_is_on('trilinos')), + '-DDOLFIN_ENABLE_UMFPACK:BOOL={0}'.format( + self.cmake_is_on('suite-sparse')), + '-DDOLFIN_ENABLE_VTK:BOOL={0}'.format( + self.cmake_is_on('vtk')), + '-DDOLFIN_ENABLE_ZLIB:BOOL={0}'.format( + self.cmake_is_on('zlib')), + ] + + cmake_args.extend(std_cmake_args) + + with working_dir('build', create=True): + cmake('..', *cmake_args) + + make() + make('install') diff --git a/var/spack/repos/builtin/packages/hdf5-blosc/package.py b/var/spack/repos/builtin/packages/hdf5-blosc/package.py new file mode 100644 index 0000000000..50f380083c --- /dev/null +++ b/var/spack/repos/builtin/packages/hdf5-blosc/package.py @@ -0,0 +1,206 @@ +############################################################################## +# Copyright (c) 2013-2016, Lawrence Livermore National Security, LLC. +# Produced at the Lawrence Livermore National Laboratory. +# +# This file is part of Spack. +# Created by Todd Gamblin, tgamblin@llnl.gov, All rights reserved. +# LLNL-CODE-647188 +# +# For details, see https://github.com/llnl/spack +# Please also see the LICENSE file for our notice and the LGPL. +# +# This program is free software; you can redistribute it and/or modify +# it under the terms of the GNU Lesser General Public License (as +# published by the Free Software Foundation) version 2.1, February 1999. +# +# This program is distributed in the hope that it will be useful, but +# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the terms and
+# conditions of the GNU Lesser General Public License for more details.
+#
+# You should have received a copy of the GNU Lesser General Public
+# License along with this program; if not, write to the Free Software
+# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
+##############################################################################
+
+import os
+import shutil
+import sys
+
+from spack import *
+
+
+def _install_shlib(name, src, dst):
+    """Install a shared library from directory src to directory dst"""
+    if sys.platform == "darwin":
+        shlib0 = name + ".0.dylib"
+        shlib = name + ".dylib"
+        shutil.copyfile(join_path(src, shlib0), join_path(dst, shlib0))
+        os.symlink(shlib0, join_path(dst, shlib))
+    else:
+        shlib000 = name + ".so.0.0.0"
+        shlib0 = name + ".so.0"
+        shlib = name + ".so"
+        shutil.copyfile(join_path(src, shlib000), join_path(dst, shlib000))
+        os.symlink(shlib000, join_path(dst, shlib0))
+        os.symlink(shlib0, join_path(dst, shlib))
+
+
+class Hdf5Blosc(Package):
+    """Blosc filter for HDF5"""
+    homepage = "https://github.com/Blosc/hdf5-blosc"
+    url = "https://github.com/Blosc/hdf5-blosc/archive/master.zip"
+
+    version('master', '02c04acbf4bec66ec8a35bf157d1c9de')
+
+    depends_on("c-blosc")
+    depends_on("hdf5")
+    depends_on("libtool")
+
+    parallel = False
+
+    def install(self, spec, prefix):
+        # The included cmake recipe doesn't work for Darwin
+        # cmake(".", *std_cmake_args)
+        #
+        # make()
+        # make("install")
+        # if sys.platform == "darwin":
+        #     fix_darwin_install_name(prefix.lib)
+
+        libtool = Executable(join_path(spec["libtool"].prefix.bin, "libtool"))
+        if "+mpi" in spec["hdf5"]:
+            cc = "mpicc"
+        else:
+            cc = "cc"
+        shlibext = "so" if sys.platform != "darwin" else "dylib"
+        mkdirp(prefix.include)
+        mkdirp(prefix.lib)
+
+        # Build and install filter
+        with working_dir("src"):
+            libtool("--mode=compile", "--tag=CC",
+                    "cc", "-g", "-O",
+                    "-c", "blosc_filter.c")
+            libtool("--mode=link", "--tag=CC",
+                    "cc", "-g", "-O",
+                    "-rpath", prefix.lib,
+                    "-o", "libblosc_filter.la",
+                    "blosc_filter.lo",
+                    "-L%s" % spec["c-blosc"].prefix.lib, "-lblosc",
+                    "-L%s" % spec["hdf5"].prefix.lib, "-lhdf5")
+            _install_shlib("libblosc_filter", ".libs", prefix.lib)
+
+            # Build and install plugin
+            # The plugin requires at least HDF5 1.8.11:
+            if spec["hdf5"].satisfies("@1.8.11:"):
+                libtool("--mode=compile", "--tag=CC",
+                        "cc", "-g", "-O",
+                        "-c", "blosc_plugin.c")
+                libtool("--mode=link", "--tag=CC",
+                        "cc", "-g", "-O",
+                        "-rpath", prefix.lib,
+                        "-o", "libblosc_plugin.la",
+                        "blosc_plugin.lo",
+                        "-L%s" % prefix.lib, "-lblosc_filter",
+                        "-L%s" % spec["c-blosc"].prefix.lib, "-lblosc",
+                        "-L%s" % spec["hdf5"].prefix.lib, "-lhdf5")
+                _install_shlib("libblosc_plugin", ".libs", prefix.lib)
+
+        self.check_install(spec)
+
+    def check_install(self, spec):
+        "Build and run a small program to test the installed HDF5 Blosc plugin"
+        print "Checking HDF5-Blosc plugin..."
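+        # The check below compiles a small C program against spec["hdf5"],
+        # creates a chunked dataset with the Blosc filter (ID 32001) requested
+        # as an optional filter, and asserts via H5Pall_filters_avail() that
+        # the plugin is actually loadable at run time.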
+ checkdir = "spack-check" + with working_dir(checkdir, create=True): + source = r"""\ +#include <hdf5.h> +#include <assert.h> +#include <stdio.h> +#include <stdlib.h> + +#define FILTER_BLOSC 32001 /* Blosc filter ID registered with the HDF group */ + +int main(int argc, char **argv) { + herr_t herr; + hid_t file = H5Fcreate("file.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT); + assert(file >= 0); + hsize_t dims[3] = {10, 10, 10}; + hid_t space = H5Screate_simple(3, dims, NULL); + assert(space >= 0); + hid_t create_proplist = H5Pcreate(H5P_DATASET_CREATE); + assert(create_proplist >= 0); + herr = H5Pset_chunk(create_proplist, 3, dims); + assert(herr >= 0); + herr = H5Pset_filter(create_proplist, FILTER_BLOSC, H5Z_FLAG_OPTIONAL, 0, + NULL); + assert(herr >= 0); + htri_t all_filters_avail = H5Pall_filters_avail(create_proplist); + assert(all_filters_avail > 0); + hid_t dataset = H5Dcreate(file, "dataset", H5T_NATIVE_DOUBLE, space, + H5P_DEFAULT, create_proplist, H5P_DEFAULT); + assert(dataset >= 0); + double data[10][10][10]; + for (int k=0; k<10; ++k) { + for (int j=0; j<10; ++j) { + for (int i=0; i<10; ++i) { + data[k][j][i] = 1.0 / (1.0 + i + j + k); + } + } + } + herr = H5Dwrite(dataset, H5T_NATIVE_DOUBLE, space, space, H5P_DEFAULT, + &data[0][0][0]); + assert(herr >= 0); + herr = H5Pclose(create_proplist); + assert(herr >= 0); + herr = H5Dclose(dataset); + assert(herr >= 0); + herr = H5Sclose(space); + assert(herr >= 0); + herr = H5Fclose(file); + assert(herr >= 0); + printf("Done.\n"); + return 0; +} +""" + expected = """\ +Done. +""" + with open("check.c", "w") as f: + f.write(source) + if "+mpi" in spec["hdf5"]: + cc = which("mpicc") + else: + cc = which("cc") + # TODO: Automate these path and library settings + cc("-c", "-I%s" % spec["hdf5"].prefix.include, "check.c") + cc("-o", "check", "check.o", + "-L%s" % spec["hdf5"].prefix.lib, "-lhdf5") + try: + check = Executable("./check") + output = check(return_output=True) + except: + output = "" + success = output == expected + if not success: + print "Produced output does not match expected output." 
+ print "Expected output:" + print "-"*80 + print expected + print "-"*80 + print "Produced output:" + print "-"*80 + print output + print "-"*80 + print "Environment:" + env = which("env") + env() + raise RuntimeError("HDF5 Blosc plugin check failed") + shutil.rmtree(checkdir) + + def setup_environment(self, spack_env, run_env): + spack_env.append_path("HDF5_PLUGIN_PATH", self.spec.prefix.lib) + run_env.append_path("HDF5_PLUGIN_PATH", self.spec.prefix.lib) + + def setup_dependent_environment(self, spack_env, run_env, dependent_spec): + spack_env.append_path("HDF5_PLUGIN_PATH", self.spec.prefix.lib) + run_env.append_path("HDF5_PLUGIN_PATH", self.spec.prefix.lib) diff --git a/var/spack/repos/builtin/packages/hdf5/package.py b/var/spack/repos/builtin/packages/hdf5/package.py index 21137ef356..e46f432be5 100644 --- a/var/spack/repos/builtin/packages/hdf5/package.py +++ b/var/spack/repos/builtin/packages/hdf5/package.py @@ -38,6 +38,7 @@ class Hdf5(Package): list_url = "http://www.hdfgroup.org/ftp/HDF5/releases" list_depth = 3 + version('1.10.0-patch1', '9180ff0ef8dc2ef3f61bd37a7404f295') version('1.10.0', 'bdc935337ee8282579cd6bc4270ad199') version('1.8.16', 'b8ed9a36ae142317f88b0c7ef4b9c618', preferred=True) version('1.8.15', '03cccb5b33dbe975fdcd8ae9dc021f24') diff --git a/var/spack/repos/builtin/packages/libdwarf/package.py b/var/spack/repos/builtin/packages/libdwarf/package.py index c9c6e9404f..594271f655 100644 --- a/var/spack/repos/builtin/packages/libdwarf/package.py +++ b/var/spack/repos/builtin/packages/libdwarf/package.py @@ -23,11 +23,11 @@ # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA ############################################################################## from spack import * -import os # Only build certain parts of dwarf because the other ones break. dwarf_dirs = ['libdwarf', 'dwarfdump2'] + class Libdwarf(Package): """The DWARF Debugging Information Format is of interest to programmers working on compilers and debuggers (and any one @@ -45,12 +45,13 @@ class Libdwarf(Package): list_url = homepage version('20160507', 'ae32d6f9ece5daf05e2d4b14822ea811') - + version('20130729', '4cc5e48693f7b93b7aa0261e63c0e21d') + version('20130207', '64b42692e947d5180e162e46c689dfbf') + version('20130126', 'ded74a5e90edb5a12aac3c29d260c5db') depends_on("libelf") parallel = False - def install(self, spec, prefix): # dwarf build does not set arguments for ar properly make.add_default_arg('ARFLAGS=rcs') @@ -67,7 +68,11 @@ class Libdwarf(Package): install('libdwarf.h', prefix.include) install('dwarf.h', prefix.include) - with working_dir('dwarfdump'): + if spec.satisfies('@20130126:20130729'): + dwarfdump_dir = 'dwarfdump2' + else: + dwarfdump_dir = 'dwarfdump' + with working_dir(dwarfdump_dir): configure("--prefix=" + prefix) # This makefile has strings of copy commands that diff --git a/var/spack/repos/builtin/packages/lmod/package.py b/var/spack/repos/builtin/packages/lmod/package.py index 0a8b9b4577..7d75866d52 100644 --- a/var/spack/repos/builtin/packages/lmod/package.py +++ b/var/spack/repos/builtin/packages/lmod/package.py @@ -23,7 +23,7 @@ # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA ############################################################################## from spack import * -import os + class Lmod(Package): """ @@ -34,17 +34,25 @@ class Lmod(Package): variable. Modulefiles for Library packages provide environment variables that specify where the library and header files can be found. 
""" - homepage = "https://www.tacc.utexas.edu/research-development/tacc-projects/lmod" - url = "http://sourceforge.net/projects/lmod/files/Lmod-6.0.1.tar.bz2/download" + homepage = 'https://www.tacc.utexas.edu/research-development/tacc-projects/lmod' # NOQA: ignore=E501 + url = 'https://github.com/TACC/Lmod/archive/6.4.1.tar.gz' + version('6.4.1', '7978ba777c8aa41a4d8c05fec5f780f4') + version('6.3.7', '0fa4d5a24c41cae03776f781aa2dedc1') version('6.0.1', '91abf52fe5033bd419ffe2842ebe7af9') - depends_on("lua@5.2:") + depends_on('lua@5.2:') + depends_on('lua-luaposix') + depends_on('lua-luafilesystem') + + parallel = False + + def setup_environment(self, spack_env, run_env): + stage_lua_path = join_path( + self.stage.path, 'Lmod-{version}', 'src', '?.lua') + spack_env.append_path('LUA_PATH', stage_lua_path.format( + version=self.version), separator=';') def install(self, spec, prefix): - # Add our lua to PATH - os.environ['PATH'] = spec['lua'].prefix.bin + ';' + os.environ['PATH'] - configure('--prefix=%s' % prefix) - make() - make("install") + make('install') diff --git a/var/spack/repos/builtin/packages/lua-luafilesystem/package.py b/var/spack/repos/builtin/packages/lua-luafilesystem/package.py new file mode 100644 index 0000000000..a61b9dd675 --- /dev/null +++ b/var/spack/repos/builtin/packages/lua-luafilesystem/package.py @@ -0,0 +1,51 @@ +############################################################################## +# Copyright (c) 2013-2016, Lawrence Livermore National Security, LLC. +# Produced at the Lawrence Livermore National Laboratory. +# +# This file is part of Spack. +# Created by Todd Gamblin, tgamblin@llnl.gov, All rights reserved. +# LLNL-CODE-647188 +# +# For details, see https://github.com/llnl/spack +# Please also see the LICENSE file for our notice and the LGPL. +# +# This program is free software; you can redistribute it and/or modify +# it under the terms of the GNU Lesser General Public License (as +# published by the Free Software Foundation) version 2.1, February 1999. +# +# This program is distributed in the hope that it will be useful, but +# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and +# conditions of the GNU Lesser General Public License for more details. +# +# You should have received a copy of the GNU Lesser General Public +# License along with this program; if not, write to the Free Software +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA +############################################################################## +from spack import * + + +class LuaLuafilesystem(Package): + """ + LuaFileSystem is a Lua library developed to complement the set of + functions related to file systems offered by the standard Lua distribution. + + LuaFileSystem offers a portable way to access the underlying directory + structure and file attributes. 
+ + LuaFileSystem is free software and uses the same license as Lua 5.1 + """ + homepage = 'http://keplerproject.github.io/luafilesystem' + url = 'https://github.com/keplerproject/luafilesystem/archive/v_1_6_3.tar.gz' + + version('1_6_3', 'd0552c7e5a082f5bb2865af63fb9dc95') + + extends('lua') + + def install(self, spec, prefix): + rockspec_fmt = join_path(self.stage.path, + 'luafilesystem-v_{version.underscored}', + 'rockspecs', + 'luafilesystem-{version.dotted}-1.rockspec') + luarocks('--tree=' + prefix, 'install', + rockspec_fmt.format(version=self.spec.version)) diff --git a/var/spack/repos/builtin/packages/lua/package.py b/var/spack/repos/builtin/packages/lua/package.py index 1424cf1a17..761932361b 100644 --- a/var/spack/repos/builtin/packages/lua/package.py +++ b/var/spack/repos/builtin/packages/lua/package.py @@ -105,6 +105,9 @@ class Lua(Package): spack_env.set('LUA_PATH', ';'.join(lua_patterns), separator=';') spack_env.set('LUA_CPATH', ';'.join(lua_cpatterns), separator=';') + # Add LUA to PATH for dependent packages + spack_env.prepend_path('PATH', self.prefix.bin) + # For run time environment set only the path for extension_spec and # prepend it to LUAPATH if extension_spec.package.extends(self.spec): @@ -153,5 +156,5 @@ class Lua(Package): """ # Lua extension builds can have lua and luarocks executable functions module.lua = Executable(join_path(self.spec.prefix.bin, 'lua')) - module.luarocks = Executable(join_path(self.spec.prefix.bin, - 'luarocks')) + module.luarocks = Executable( + join_path(self.spec.prefix.bin, 'luarocks')) diff --git a/var/spack/repos/builtin/packages/mvapich2/package.py b/var/spack/repos/builtin/packages/mvapich2/package.py index f4997bdfa1..d1944023d1 100644 --- a/var/spack/repos/builtin/packages/mvapich2/package.py +++ b/var/spack/repos/builtin/packages/mvapich2/package.py @@ -25,6 +25,7 @@ from spack import * import os + class Mvapich2(Package): """MVAPICH2 is an MPI implementation for Infiniband networks.""" homepage = "http://mvapich.cse.ohio-state.edu/" @@ -43,8 +44,9 @@ class Mvapich2(Package): variant('debug', default=False, description='Enables debug information and error messages at run-time') ########## - # TODO : Process managers should be grouped into the same variant, as soon as variant capabilities will be extended - # See https://groups.google.com/forum/#!topic/spack/F8-f8B4_0so + # TODO : Process managers should be grouped into the same variant, + # as soon as variant capabilities will be extended See + # https://groups.google.com/forum/#!topic/spack/F8-f8B4_0so SLURM = 'slurm' HYDRA = 'hydra' GFORKER = 'gforker' @@ -57,7 +59,8 @@ class Mvapich2(Package): ########## ########## - # TODO : Network types should be grouped into the same variant, as soon as variant capabilities will be extended + # TODO : Network types should be grouped into the same variant, as + # soon as variant capabilities will be extended PSM = 'psm' SOCK = 'sock' NEMESISIBTCP = 'nemesisibtcp' @@ -84,8 +87,8 @@ class Mvapich2(Package): @staticmethod def enabled(x): - """ - Given a variant name returns the string that means the variant is enabled + """Given a variant name returns the string that means the variant is + enabled :param x: variant name :return: @@ -93,8 +96,8 @@ class Mvapich2(Package): return '+' + x def set_build_type(self, spec, configure_args): - """ - Appends to configure_args the flags that depends only on the build type (i.e. release or debug) + """Appends to configure_args the flags that depends only on the build + type (i.e. 
release or debug) :param spec: spec :param configure_args: list of current configure arguments @@ -104,7 +107,8 @@ class Mvapich2(Package): "--disable-fast", "--enable-error-checking=runtime", "--enable-error-messages=all", - "--enable-g=dbg", "--enable-debuginfo" # Permits debugging with TotalView + # Permits debugging with TotalView + "--enable-g=dbg", "--enable-debuginfo" ] else: build_type_options = ["--enable-fast=all"] @@ -112,25 +116,41 @@ class Mvapich2(Package): configure_args.extend(build_type_options) def set_process_manager(self, spec, configure_args): - """ - Appends to configure_args the flags that will enable the appropriate process managers + """Appends to configure_args the flags that will enable the + appropriate process managers :param spec: spec :param configure_args: list of current configure arguments """ - # Check that slurm variant is not activated together with other pm variants - has_slurm_incompatible_variants = any(self.enabled(x) in spec for x in Mvapich2.SLURM_INCOMPATIBLE_PMS) - if self.enabled(Mvapich2.SLURM) in spec and has_slurm_incompatible_variants: - raise RuntimeError(" %s : 'slurm' cannot be activated together with other process managers" % self.name) + # Check that slurm variant is not activated together with + # other pm variants + has_slurm_incompatible_variants = \ + any(self.enabled(x) in spec + for x in Mvapich2.SLURM_INCOMPATIBLE_PMS) + + if self.enabled(Mvapich2.SLURM) in spec and \ + has_slurm_incompatible_variants: + raise RuntimeError(" %s : 'slurm' cannot be activated \ + together with other process managers" % self.name) process_manager_options = [] + # See: http://slurm.schedmd.com/mpi_guide.html#mvapich2 if self.enabled(Mvapich2.SLURM) in spec: - process_manager_options = [ - "--with-pm=slurm" - ] + if self.version > Version('2.0'): + process_manager_options = [ + "--with-pmi=pmi2", + "--with-pm=slurm" + ] + else: + process_manager_options = [ + "--with-pmi=slurm", + "--with-pm=no" + ] + elif has_slurm_incompatible_variants: pms = [] - # The variant name is equal to the process manager name in the configuration options + # The variant name is equal to the process manager name in + # the configuration options for x in Mvapich2.SLURM_INCOMPATIBLE_PMS: if self.enabled(x) in spec: pms.append(x) @@ -146,7 +166,9 @@ class Mvapich2(Package): if self.enabled(x) in spec: count += 1 if count > 1: - raise RuntimeError('network variants are mutually exclusive (only one can be selected at a time)') + raise RuntimeError('network variants are mutually exclusive \ + (only one can be selected at a time)') + network_options = [] # From here on I can suppose that only one variant has been selected if self.enabled(Mvapich2.PSM) in spec: @@ -164,6 +186,11 @@ class Mvapich2(Package): configure_args.extend(network_options) + def setup_environment(self, spack_env, run_env): + if self.enabled(Mvapich2.SLURM) in self.spec and \ + self.version > Version('2.0'): + run_env.set('SLURM_MPI_TYPE', 'pmi2') + def setup_dependent_environment(self, spack_env, run_env, extension_spec): spack_env.set('MPICH_CC', spack_cc) spack_env.set('MPICH_CXX', spack_cxx) @@ -178,7 +205,8 @@ class Mvapich2(Package): self.spec.mpif77 = join_path(self.prefix.bin, 'mpif77') def install(self, spec, prefix): - # we'll set different configure flags depending on our environment + # we'll set different configure flags depending on our + # environment configure_args = [ "--prefix=%s" % prefix, "--enable-shared", @@ -208,7 +236,6 @@ class Mvapich2(Package): self.filter_compilers() - def 
filter_compilers(self): """Run after install to make the MPI compilers use the compilers that Spack built the package with. @@ -228,8 +255,17 @@ class Mvapich2(Package): spack_f77 = os.environ['F77'] spack_fc = os.environ['FC'] - kwargs = { 'ignore_absent' : True, 'backup' : False, 'string' : True } - filter_file('CC="%s"' % spack_cc , 'CC="%s"' % self.compiler.cc, mpicc, **kwargs) - filter_file('CXX="%s"'% spack_cxx, 'CXX="%s"' % self.compiler.cxx, mpicxx, **kwargs) - filter_file('F77="%s"'% spack_f77, 'F77="%s"' % self.compiler.f77, mpif77, **kwargs) - filter_file('FC="%s"' % spack_fc , 'FC="%s"' % self.compiler.fc, mpif90, **kwargs) + kwargs = { + 'ignore_absent': True, + 'backup': False, + 'string': True + } + + filter_file('CC="%s"' % spack_cc, + 'CC="%s"' % self.compiler.cc, mpicc, **kwargs) + filter_file('CXX="%s"' % spack_cxx, + 'CXX="%s"' % self.compiler.cxx, mpicxx, **kwargs) + filter_file('F77="%s"' % spack_f77, + 'F77="%s"' % self.compiler.f77, mpif77, **kwargs) + filter_file('FC="%s"' % spack_fc, + 'FC="%s"' % self.compiler.fc, mpif90, **kwargs) diff --git a/var/spack/repos/builtin/packages/openmpi/package.py b/var/spack/repos/builtin/packages/openmpi/package.py index 4e465e1784..0638628a6c 100644 --- a/var/spack/repos/builtin/packages/openmpi/package.py +++ b/var/spack/repos/builtin/packages/openmpi/package.py @@ -142,7 +142,7 @@ class Openmpi(Package): ]) if '+verbs' in spec: path = _verbs_dir() - if path is not None: + if path is not None and path not in ('/usr', '/usr/local'): config_args.append('--with-%s=%s' % (self.verbs, path)) else: config_args.append('--with-%s' % self.verbs) diff --git a/var/spack/repos/builtin/packages/pcre/intel.patch b/var/spack/repos/builtin/packages/pcre/intel.patch new file mode 100644 index 0000000000..f160f55e1b --- /dev/null +++ b/var/spack/repos/builtin/packages/pcre/intel.patch @@ -0,0 +1,12 @@ +diff -up pcre-8.38/pcrecpp.cc.intel pcre-8.38/pcrecpp.cc +--- pcre-8.38/pcrecpp.cc.intel 2014-09-15 07:48:59.000000000 -0600 ++++ pcre-8.38/pcrecpp.cc 2016-06-08 16:16:56.702721214 -0600 +@@ -66,7 +66,7 @@ Arg RE::no_arg((void*)NULL); + // inclusive test if we ever needed it. (Note that not only the + // __attribute__ syntax, but also __USER_LABEL_PREFIX__, are + // gnu-specific.) +-#if defined(__GNUC__) && __GNUC__ >= 3 && defined(__ELF__) ++#if defined(__GNUC__) && __GNUC__ >= 3 && defined(__ELF__) && !defined(__INTEL_COMPILER) + # define ULP_AS_STRING(x) ULP_AS_STRING_INTERNAL(x) + # define ULP_AS_STRING_INTERNAL(x) #x + # define USER_LABEL_PREFIX_STR ULP_AS_STRING(__USER_LABEL_PREFIX__) diff --git a/var/spack/repos/builtin/packages/pcre/package.py b/var/spack/repos/builtin/packages/pcre/package.py index 7a9f3b911d..8e0f83110e 100644 --- a/var/spack/repos/builtin/packages/pcre/package.py +++ b/var/spack/repos/builtin/packages/pcre/package.py @@ -24,6 +24,7 @@ ############################################################################## from spack import * + class Pcre(Package): """The PCRE package contains Perl Compatible Regular Expression libraries. 
These are useful for implementing regular expression @@ -34,6 +35,8 @@ class Pcre(Package): version('8.36', 'b767bc9af0c20bc9c1fe403b0d41ad97') version('8.38', '00aabbfe56d5a48b270f999b508c5ad2') + patch("intel.patch") + def install(self, spec, prefix): configure("--prefix=%s" % prefix) make() diff --git a/var/spack/repos/builtin/packages/py-ply/package.py b/var/spack/repos/builtin/packages/py-ply/package.py new file mode 100644 index 0000000000..47cd3b5dc8 --- /dev/null +++ b/var/spack/repos/builtin/packages/py-ply/package.py @@ -0,0 +1,38 @@ +############################################################################## +# Copyright (c) 2013-2016, Lawrence Livermore National Security, LLC. +# Produced at the Lawrence Livermore National Laboratory. +# +# This file is part of Spack. +# Created by Todd Gamblin, tgamblin@llnl.gov, All rights reserved. +# LLNL-CODE-647188 +# +# For details, see https://github.com/llnl/spack +# Please also see the LICENSE file for our notice and the LGPL. +# +# This program is free software; you can redistribute it and/or modify +# it under the terms of the GNU Lesser General Public License (as +# published by the Free Software Foundation) version 2.1, February 1999. +# +# This program is distributed in the hope that it will be useful, but +# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and +# conditions of the GNU Lesser General Public License for more details. +# +# You should have received a copy of the GNU Lesser General Public +# License along with this program; if not, write to the Free Software +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA +############################################################################## +from spack import * + + +class PyPly(Package): + """PLY is nothing more than a straightforward lex/yacc implementation.""" + homepage = "http://www.dabeaz.com/ply" + url = "http://www.dabeaz.com/ply/ply-3.8.tar.gz" + + version('3.8', '94726411496c52c87c2b9429b12d5c50') + + extends('python') + + def install(self, spec, prefix): + python('setup.py', 'install', '--prefix=%s' % prefix) diff --git a/var/spack/repos/builtin/packages/serf/package.py b/var/spack/repos/builtin/packages/serf/package.py index 3b1d08889c..817db68241 100644 --- a/var/spack/repos/builtin/packages/serf/package.py +++ b/var/spack/repos/builtin/packages/serf/package.py @@ -24,8 +24,10 @@ ############################################################################## from spack import * + class Serf(Package): - """Apache Serf - a high performance C-based HTTP client library built upon the Apache Portable Runtime (APR) library""" + """Apache Serf - a high performance C-based HTTP client library + built upon the Apache Portable Runtime (APR) library""" homepage = 'https://serf.apache.org/' url = 'https://archive.apache.org/dist/serf/serf-1.3.8.tar.bz2' @@ -36,6 +38,7 @@ class Serf(Package): depends_on('scons') depends_on('expat') depends_on('openssl') + depends_on('zlib') def install(self, spec, prefix): scons = which("scons") @@ -44,8 +47,10 @@ class Serf(Package): options.append('APR=%s' % spec['apr'].prefix) options.append('APU=%s' % spec['apr-util'].prefix) options.append('OPENSSL=%s' % spec['openssl'].prefix) - options.append('LINKFLAGS=-L%s/lib' % spec['expat'].prefix) - options.append('CPPFLAGS=-I%s/include' % spec['expat'].prefix) + options.append('LINKFLAGS=-L%s/lib -L%s/lib' % + (spec['expat'].prefix, spec['zlib'].prefix)) + 
options.append('CPPFLAGS=-I%s/include -I%s/include' % + (spec['expat'].prefix, spec['zlib'].prefix)) scons(*options) scons('install') diff --git a/var/spack/repos/builtin/packages/tetgen/package.py b/var/spack/repos/builtin/packages/tetgen/package.py index 5e87ed7fba..c301a5b4e5 100644 --- a/var/spack/repos/builtin/packages/tetgen/package.py +++ b/var/spack/repos/builtin/packages/tetgen/package.py @@ -24,16 +24,19 @@ ############################################################################## from spack import * + class Tetgen(Package): - """TetGen is a program and library that can be used to generate tetrahedral - meshes for given 3D polyhedral domains. TetGen generates exact constrained - Delaunay tetrahedralizations, boundary conforming Delaunay meshes, and - Voronoi paritions.""" + """TetGen is a program and library that can be used to generate + tetrahedral meshes for given 3D polyhedral domains. TetGen + generates exact constrained Delaunay tetrahedralizations, + boundary conforming Delaunay meshes, and Voronoi paritions. + """ homepage = "http://www.tetgen.org" url = "http://www.tetgen.org/files/tetgen1.4.3.tar.gz" version('1.4.3', 'd6a4bcdde2ac804f7ec66c29dcb63c18') + version('1.5.0', '3b9fd9cdec121e52527b0308f7aad5c1', url='http://www.tetgen.org/1.5/src/tetgen1.5.0.tar.gz') # TODO: Make this a build dependency once build dependencies are supported # (see: https://github.com/LLNL/spack/pull/378). diff --git a/var/spack/repos/builtin/packages/trilinos/package.py b/var/spack/repos/builtin/packages/trilinos/package.py index 1eaec86405..6913d79dcc 100644 --- a/var/spack/repos/builtin/packages/trilinos/package.py +++ b/var/spack/repos/builtin/packages/trilinos/package.py @@ -23,18 +23,23 @@ # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA ############################################################################## from spack import * -import os, sys, glob +import os +import sys -# Trilinos is complicated to build, as an inspiration a couple of links to other repositories which build it: +# Trilinos is complicated to build, as an inspiration a couple of links to +# other repositories which build it: # https://github.com/hpcugent/easybuild-easyblocks/blob/master/easybuild/easyblocks/t/trilinos.py#L111 # https://github.com/koecher/candi/blob/master/deal.II-toolchain/packages/trilinos.package # https://gitlab.com/configurations/cluster-config/blob/master/trilinos.sh -# https://github.com/Homebrew/homebrew-science/blob/master/trilinos.rb -# and some relevant documentation/examples: +# https://github.com/Homebrew/homebrew-science/blob/master/trilinos.rb and some +# relevant documentation/examples: # https://github.com/trilinos/Trilinos/issues/175 + + class Trilinos(Package): - """The Trilinos Project is an effort to develop algorithms and enabling technologies within an object-oriented - software framework for the solution of large-scale, complex multi-physics engineering and scientific problems. + """The Trilinos Project is an effort to develop algorithms and enabling + technologies within an object-oriented software framework for the solution + of large-scale, complex multi-physics engineering and scientific problems. A unique design feature of Trilinos is its focus on packages. 
""" homepage = "https://trilinos.org/" @@ -54,49 +59,51 @@ class Trilinos(Package): variant('hypre', default=True, description='Compile with Hypre preconditioner') variant('hdf5', default=True, description='Compile with HDF5') variant('suite-sparse', default=True, description='Compile with SuiteSparse solvers') - # not everyone has py-numpy activated, keep it disabled by default to avoid configure errors + # not everyone has py-numpy activated, keep it disabled by default to avoid + # configure errors variant('python', default=False, description='Build python wrappers') variant('shared', default=True, description='Enables the build of shared libraries') variant('debug', default=False, description='Builds a debug version of the libraries') + variant('boost', default=True, description='Compile with Boost') # Everything should be compiled with -fpic depends_on('blas') depends_on('lapack') - depends_on('boost') + depends_on('boost', when='+boost') depends_on('matio') depends_on('glm') depends_on('swig') - depends_on('metis@5:',when='+metis') - depends_on('suite-sparse',when='+suite-sparse') + depends_on('metis@5:', when='+metis') + depends_on('suite-sparse', when='+suite-sparse') # MPI related dependencies depends_on('mpi') depends_on('netcdf+mpi') - depends_on('parmetis',when='+metis') - # Trilinos' Tribits config system is limited which makes it - # very tricky to link Amesos with static MUMPS, see + depends_on('parmetis', when='+metis') + # Trilinos' Tribits config system is limited which makes it very tricky to + # link Amesos with static MUMPS, see # https://trilinos.org/docs/dev/packages/amesos2/doc/html/classAmesos2_1_1MUMPS.html - # One could work it out by getting linking flags from mpif90 --showme:link (or alike) - # and adding results to -DTrilinos_EXTRA_LINK_FLAGS - # together with Blas and Lapack and ScaLAPACK and Blacs and -lgfortran and - # it may work at the end. But let's avoid all this by simply using shared libs - depends_on('mumps@5.0:+mpi+shared',when='+mumps') - depends_on('scalapack',when='+mumps') - depends_on('superlu-dist',when='+superlu-dist') - depends_on('hypre~internal-superlu',when='+hypre') - depends_on('hdf5+mpi',when='+hdf5') - - depends_on('python',when='+python') + # One could work it out by getting linking flags from mpif90 --showme:link + # (or alike) and adding results to -DTrilinos_EXTRA_LINK_FLAGS together + # with Blas and Lapack and ScaLAPACK and Blacs and -lgfortran and it may + # work at the end. But let's avoid all this by simply using shared libs + depends_on('mumps@5.0:+mpi+shared', when='+mumps') + depends_on('scalapack', when='+mumps') + depends_on('superlu-dist', when='+superlu-dist') + depends_on('hypre~internal-superlu', when='+hypre') + depends_on('hdf5+mpi', when='+hdf5') + depends_on('python', when='+python') patch('umfpack_from_suitesparse.patch') # check that the combination of variants makes sense def variants_check(self): if '+superlu-dist' in self.spec and self.spec.satisfies('@:11.4.3'): - # For Trilinos v11 we need to force SuperLUDist=OFF, - # since only the deprecated SuperLUDist v3.3 together with an Amesos patch - # is working. - raise RuntimeError('The superlu-dist variant can only be used with Trilinos @12.0.1:') + # For Trilinos v11 we need to force SuperLUDist=OFF, since only the + # deprecated SuperLUDist v3.3 together with an Amesos patch is + # working. 
+ raise RuntimeError('The superlu-dist variant can only be used' + + ' with Trilinos @12.0.1:') def install(self, spec, prefix): self.variants_check() @@ -106,54 +113,75 @@ class Trilinos(Package): options.extend(std_cmake_args) mpi_bin = spec['mpi'].prefix.bin - options.extend(['-DTrilinos_ENABLE_ALL_PACKAGES:BOOL=ON', - '-DTrilinos_ENABLE_ALL_OPTIONAL_PACKAGES:BOOL=ON', - '-DTrilinos_VERBOSE_CONFIGURE:BOOL=OFF', - '-DTrilinos_ENABLE_TESTS:BOOL=OFF', - '-DTrilinos_ENABLE_EXAMPLES:BOOL=OFF', - '-DCMAKE_BUILD_TYPE:STRING=%s' % ('DEBUG' if '+debug' in spec else 'RELEASE'), - '-DBUILD_SHARED_LIBS:BOOL=%s' % ('ON' if '+shared' in spec else 'OFF'), - '-DTPL_ENABLE_MPI:BOOL=ON', - '-DMPI_BASE_DIR:PATH=%s' % spec['mpi'].prefix, - '-DTPL_ENABLE_BLAS=ON', - '-DBLAS_LIBRARY_NAMES=blas', # FIXME: don't hardcode names - '-DBLAS_LIBRARY_DIRS=%s' % spec['blas'].prefix.lib, - '-DTPL_ENABLE_LAPACK=ON', - '-DLAPACK_LIBRARY_NAMES=lapack', - '-DLAPACK_LIBRARY_DIRS=%s' % spec['lapack'].prefix, - '-DTPL_ENABLE_Boost:BOOL=ON', - '-DBoost_INCLUDE_DIRS:PATH=%s' % spec['boost'].prefix.include, - '-DBoost_LIBRARY_DIRS:PATH=%s' % spec['boost'].prefix.lib, - '-DTrilinos_ENABLE_EXPLICIT_INSTANTIATION:BOOL=ON', - '-DTrilinos_ENABLE_CXX11:BOOL=ON', - '-DTPL_ENABLE_Netcdf:BOOL=ON', - '-DTPL_ENABLE_HYPRE:BOOL=%s' % ('ON' if '+hypre' in spec else 'OFF'), - '-DTPL_ENABLE_HDF5:BOOL=%s' % ('ON' if '+hdf5' in spec else 'OFF'), - ]) + options.extend([ + '-DTrilinos_ENABLE_ALL_PACKAGES:BOOL=ON', + '-DTrilinos_ENABLE_ALL_OPTIONAL_PACKAGES:BOOL=ON', + '-DTrilinos_VERBOSE_CONFIGURE:BOOL=OFF', + '-DTrilinos_ENABLE_TESTS:BOOL=OFF', + '-DTrilinos_ENABLE_EXAMPLES:BOOL=OFF', + '-DCMAKE_BUILD_TYPE:STRING=%s' % ( + 'DEBUG' if '+debug' in spec else 'RELEASE'), + '-DBUILD_SHARED_LIBS:BOOL=%s' % ( + 'ON' if '+shared' in spec else 'OFF'), + '-DTPL_ENABLE_MPI:BOOL=ON', + '-DMPI_BASE_DIR:PATH=%s' % spec['mpi'].prefix, + '-DTPL_ENABLE_BLAS=ON', + '-DBLAS_LIBRARY_NAMES=blas', # FIXME: don't hardcode names + '-DBLAS_LIBRARY_DIRS=%s' % spec['blas'].prefix.lib, + '-DTPL_ENABLE_LAPACK=ON', + '-DLAPACK_LIBRARY_NAMES=lapack', + '-DLAPACK_LIBRARY_DIRS=%s' % spec['lapack'].prefix, + '-DTrilinos_ENABLE_EXPLICIT_INSTANTIATION:BOOL=ON', + '-DTrilinos_ENABLE_CXX11:BOOL=ON', + '-DTPL_ENABLE_Netcdf:BOOL=ON', + '-DTPL_ENABLE_HYPRE:BOOL=%s' % ( + 'ON' if '+hypre' in spec else 'OFF'), + '-DTPL_ENABLE_HDF5:BOOL=%s' % ( + 'ON' if '+hdf5' in spec else 'OFF'), + ]) + + if '+boost' in spec: + options.extend([ + '-DTPL_ENABLE_Boost:BOOL=ON', + '-DBoost_INCLUDE_DIRS:PATH=%s' % spec['boost'].prefix.include, + '-DBoost_LIBRARY_DIRS:PATH=%s' % spec['boost'].prefix.lib + ]) + else: + options.extend(['-DTPL_ENABLE_Boost:BOOL=OFF']) # Fortran lib - libgfortran = os.path.dirname (os.popen('%s --print-file-name libgfortran.a' % join_path(mpi_bin,'mpif90') ).read()) + libgfortran = os.path.dirname(os.popen( + '%s --print-file-name libgfortran.a' % + join_path(mpi_bin, 'mpif90')).read()) options.extend([ - '-DTrilinos_EXTRA_LINK_FLAGS:STRING=-L%s/ -lgfortran' % libgfortran, + '-DTrilinos_EXTRA_LINK_FLAGS:STRING=-L%s/ -lgfortran' % ( + libgfortran), '-DTrilinos_ENABLE_Fortran=ON' ]) # for build-debug only: - #options.extend([ - # '-DCMAKE_VERBOSE_MAKEFILE:BOOL=TRUE' - #]) + # options.extend([ + # '-DCMAKE_VERBOSE_MAKEFILE:BOOL=TRUE' + # ]) # suite-sparse related if '+suite-sparse' in spec: options.extend([ - '-DTPL_ENABLE_Cholmod:BOOL=OFF', # FIXME: Trilinos seems to be looking for static libs only, patch CMake TPL file? 
- #'-DTPL_ENABLE_Cholmod:BOOL=ON', - #'-DCholmod_LIBRARY_DIRS:PATH=%s' % spec['suite-sparse'].prefix.lib, - #'-DCholmod_INCLUDE_DIRS:PATH=%s' % spec['suite-sparse'].prefix.include, + # FIXME: Trilinos seems to be looking for static libs only, + # patch CMake TPL file? + '-DTPL_ENABLE_Cholmod:BOOL=OFF', + # '-DTPL_ENABLE_Cholmod:BOOL=ON', + # '-DCholmod_LIBRARY_DIRS:PATH=%s' % ( + # spec['suite-sparse'].prefix.lib, + # '-DCholmod_INCLUDE_DIRS:PATH=%s' % ( + # spec['suite-sparse'].prefix.include, '-DTPL_ENABLE_UMFPACK:BOOL=ON', - '-DUMFPACK_LIBRARY_DIRS:PATH=%s' % spec['suite-sparse'].prefix.lib, - '-DUMFPACK_INCLUDE_DIRS:PATH=%s' % spec['suite-sparse'].prefix.include, - '-DUMFPACK_LIBRARY_NAMES=umfpack;amd;colamd;cholmod;suitesparseconfig' + '-DUMFPACK_LIBRARY_DIRS:PATH=%s' % ( + spec['suite-sparse'].prefix.lib), + '-DUMFPACK_INCLUDE_DIRS:PATH=%s' % ( + spec['suite-sparse'].prefix.include), + '-DUMFPACK_LIBRARY_NAMES=umfpack;amd;colamd;cholmod;' + + 'suitesparseconfig' ]) else: options.extend([ @@ -169,9 +197,11 @@ class Trilinos(Package): '-DMETIS_LIBRARY_NAMES=metis', '-DTPL_METIS_INCLUDE_DIRS=%s' % spec['metis'].prefix.include, '-DTPL_ENABLE_ParMETIS:BOOL=ON', - '-DParMETIS_LIBRARY_DIRS=%s;%s' % (spec['parmetis'].prefix.lib,spec['metis'].prefix.lib), + '-DParMETIS_LIBRARY_DIRS=%s;%s' % ( + spec['parmetis'].prefix.lib, spec['metis'].prefix.lib), '-DParMETIS_LIBRARY_NAMES=parmetis;metis', - '-DTPL_ParMETIS_INCLUDE_DIRS=%s' % spec['parmetis'].prefix.include + '-DTPL_ParMETIS_INCLUDE_DIRS=%s' % ( + spec['parmetis'].prefix.include) ]) else: options.extend([ @@ -184,11 +214,14 @@ class Trilinos(Package): options.extend([ '-DTPL_ENABLE_MUMPS:BOOL=ON', '-DMUMPS_LIBRARY_DIRS=%s' % spec['mumps'].prefix.lib, - '-DMUMPS_LIBRARY_NAMES=dmumps;mumps_common;pord', # order is important! + # order is important! 
+ '-DMUMPS_LIBRARY_NAMES=dmumps;mumps_common;pord', '-DTPL_ENABLE_SCALAPACK:BOOL=ON', - '-DSCALAPACK_LIBRARY_NAMES=scalapack' # FIXME: for MKL it's mkl_scalapack_lp64;mkl_blacs_mpich_lp64 + # FIXME: for MKL it's mkl_scalapack_lp64;mkl_blacs_mpich_lp64 + '-DSCALAPACK_LIBRARY_NAMES=scalapack' ]) - # see https://github.com/trilinos/Trilinos/blob/master/packages/amesos/README-MUMPS + # see + # https://github.com/trilinos/Trilinos/blob/master/packages/amesos/README-MUMPS cxx_flags.extend([ '-DMUMPS_5_0' ]) @@ -201,16 +234,20 @@ class Trilinos(Package): # superlu-dist: if '+superlu-dist' in spec: # Amesos, conflicting types of double and complex SLU_D - # see https://trilinos.org/pipermail/trilinos-users/2015-March/004731.html - # and https://trilinos.org/pipermail/trilinos-users/2015-March/004802.html + # see + # https://trilinos.org/pipermail/trilinos-users/2015-March/004731.html + # and + # https://trilinos.org/pipermail/trilinos-users/2015-March/004802.html options.extend([ '-DTeuchos_ENABLE_COMPLEX:BOOL=OFF', '-DKokkosTSQR_ENABLE_Complex:BOOL=OFF' ]) options.extend([ '-DTPL_ENABLE_SuperLUDist:BOOL=ON', - '-DSuperLUDist_LIBRARY_DIRS=%s' % spec['superlu-dist'].prefix.lib, - '-DSuperLUDist_INCLUDE_DIRS=%s' % spec['superlu-dist'].prefix.include + '-DSuperLUDist_LIBRARY_DIRS=%s' % + spec['superlu-dist'].prefix.lib, + '-DSuperLUDist_INCLUDE_DIRS=%s' % + spec['superlu-dist'].prefix.include ]) if spec.satisfies('^superlu-dist@4.0:'): options.extend([ @@ -221,7 +258,6 @@ class Trilinos(Package): '-DTPL_ENABLE_SuperLUDist:BOOL=OFF', ]) - # python if '+python' in spec: options.extend([ @@ -248,23 +284,26 @@ class Trilinos(Package): '-DTrilinos_ENABLE_FEI=OFF' ]) - with working_dir('spack-build', create=True): cmake('..', *options) make() make('install') - # When trilinos is built with Python, libpytrilinos is included through - # cmake configure files. Namely, Trilinos_LIBRARIES in TrilinosConfig.cmake - # contains pytrilinos. This leads to a run-time error: - # Symbol not found: _PyBool_Type - # and prevents Trilinos to be used in any C++ code, which links executable - # against the libraries listed in Trilinos_LIBRARIES. - # See https://github.com/Homebrew/homebrew-science/issues/2148#issuecomment-103614509 + # When trilinos is built with Python, libpytrilinos is included + # through cmake configure files. Namely, Trilinos_LIBRARIES in + # TrilinosConfig.cmake contains pytrilinos. This leads to a + # run-time error: Symbol not found: _PyBool_Type and prevents + # Trilinos to be used in any C++ code, which links executable + # against the libraries listed in Trilinos_LIBRARIES. See + # https://github.com/Homebrew/homebrew-science/issues/2148#issuecomment-103614509 # A workaround it to remove PyTrilinos from the COMPONENTS_LIST : if '+python' in self.spec: - filter_file(r'(SET\(COMPONENTS_LIST.*)(PyTrilinos;)(.*)', (r'\1\3'), '%s/cmake/Trilinos/TrilinosConfig.cmake' % prefix.lib) + filter_file(r'(SET\(COMPONENTS_LIST.*)(PyTrilinos;)(.*)', + (r'\1\3'), + '%s/cmake/Trilinos/TrilinosConfig.cmake' % + prefix.lib) - # The shared libraries are not installed correctly on Darwin; correct this + # The shared libraries are not installed correctly on Darwin; + # correct this if (sys.platform == 'darwin') and ('+shared' in spec): fix_darwin_install_name(prefix.lib) |