author    | Gregory Becker <becker33@llnl.gov> | 2016-05-27 13:13:19 -0700
committer | Gregory Becker <becker33@llnl.gov> | 2016-05-27 13:13:19 -0700
commit    | 9dad7c2acee1329c842a3382b6238c3010e6c931 (patch)
tree      | 386a1d51600864da3513bde0a1c38bd0fcae4256 /lib
parent    | f49644cdea3b00b057f33bf0c609e3a83eb3cd65 (diff)
parent    | f6a0cd1bf84432a31cf08aa764d939b39f61bd43 (diff)
re-merged mainline develop
Diffstat (limited to 'lib')
-rw-r--r-- | lib/spack/docs/basic_usage.rst | 161
-rw-r--r-- | lib/spack/docs/configuration.rst (renamed from lib/spack/docs/site_configuration.rst) | 88
-rw-r--r-- | lib/spack/docs/features.rst | 17
-rw-r--r-- | lib/spack/docs/index.rst | 2
-rw-r--r-- | lib/spack/docs/packaging_guide.rst | 108
-rw-r--r-- | lib/spack/spack/cmd/find.py | 80
-rw-r--r-- | lib/spack/spack/database.py | 99
-rw-r--r-- | lib/spack/spack/fetch_strategy.py | 33
-rw-r--r-- | lib/spack/spack/spec.py | 574
-rw-r--r-- | lib/spack/spack/test/__init__.py | 2
-rw-r--r-- | lib/spack/spack/test/cmd/find.py | 60
11 files changed, 701 insertions, 523 deletions
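Before the raw diff, a minimal sketch (not part of the commit) of the spec syntax that the documentation changes below describe: boolean variants, `name=value` variants, quoted compiler flags, and the new `arch=` specifier. It assumes a Spack checkout of roughly this vintage (Python 2) is importable; the attribute names (`variants`, `compiler_flags`) are taken from the `lib/spack/spack/spec.py` hunks further down and may differ in other versions.

```python
# Illustration only -- not part of the diff below. Assumes a 2016-era Spack
# checkout is on the Python path (Python 2); attribute names follow the
# spec.py hunks in this commit and may differ in other Spack versions.
from spack.spec import Spec

# One spec string combining a version range, a compiler, a boolean variant,
# quoted compiler flags, and the new arch= specifier.
spec = Spec('mpileaks @1.2:1.4 %gcc@4.7.5 +debug cppflags="-g -O3" arch=bgqos_0')

# Boolean variants are stored as VariantSpec objects keyed by name...
print(spec.variants['debug'].value)     # expected: True

# ...while compiler flags live in a FlagMap, split into word lists that the
# compiler wrappers later inject into the compile line.
print(spec.compiler_flags['cppflags'])  # expected: ['-g', '-O3']

# str(spec) round-trips using the same +/~ and name=value syntax.
print(spec)
```

The same `name=value` form also appears in the `packages.yaml` externals and concretization-preference examples in the `configuration.rst` hunks below.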
diff --git a/lib/spack/docs/basic_usage.rst b/lib/spack/docs/basic_usage.rst index 15db2f7a16..2eed9dddd4 100644 --- a/lib/spack/docs/basic_usage.rst +++ b/lib/spack/docs/basic_usage.rst @@ -102,8 +102,8 @@ that the packages is installed: ==> adept-utils is already installed in /home/gamblin2/spack/opt/chaos_5_x86_64_ib/gcc@4.4.7/adept-utils@1.0-5adef8da. ==> Trying to fetch from https://github.com/hpc/mpileaks/releases/download/v1.0/mpileaks-1.0.tar.gz ######################################################################## 100.0% - ==> Staging archive: /home/gamblin2/spack/var/spack/stage/mpileaks@1.0%gcc@4.4.7=chaos_5_x86_64_ib-59f6ad23/mpileaks-1.0.tar.gz - ==> Created stage in /home/gamblin2/spack/var/spack/stage/mpileaks@1.0%gcc@4.4.7=chaos_5_x86_64_ib-59f6ad23. + ==> Staging archive: /home/gamblin2/spack/var/spack/stage/mpileaks@1.0%gcc@4.4.7 arch=chaos_5_x86_64_ib-59f6ad23/mpileaks-1.0.tar.gz + ==> Created stage in /home/gamblin2/spack/var/spack/stage/mpileaks@1.0%gcc@4.4.7 arch=chaos_5_x86_64_ib-59f6ad23. ==> No patches needed for mpileaks. ==> Building mpileaks. @@ -132,10 +132,10 @@ sites, as installing a version that one user needs will not disrupt existing installations for other users. In addition to different versions, Spack can customize the compiler, -compile-time options (variants), and platform (for cross compiles) of -an installation. Spack is unique in that it can also configure the -*dependencies* a package is built with. For example, two -configurations of the same version of a package, one built with boost +compile-time options (variants), compiler flags, and platform (for +cross compiles) of an installation. Spack is unique in that it can +also configure the *dependencies* a package is built with. For example, +two configurations of the same version of a package, one built with boost 1.39.0, and the other version built with version 1.43.0, can coexist. This can all be done on the command line using the *spec* syntax. @@ -246,6 +246,12 @@ Packages are divided into groups according to their architecture and compiler. Within each group, Spack tries to keep the view simple, and only shows the version of installed packages. +``spack find`` can filter the package list based on the package name, spec, or +a number of properties of their installation status. For example, missing +dependencies of a spec can be shown with ``-m``, packages which were +explicitly installed with ``spack install <package>`` can be singled out with +``-e`` and those which have been pulled in only as dependencies with ``-E``. + In some cases, there may be different configurations of the *same* version of a package installed. For example, there are two installations of of ``libdwarf@20130729`` above. We can look at them @@ -328,6 +334,11 @@ of libelf would look like this: -- chaos_5_x86_64_ib / gcc@4.4.7 -------------------------------- libdwarf@20130729-d9b90962 +We can also search for packages that have a certain attribute. For example, +``spack find -l libdwarf +debug`` will show only installations of libdwarf +with the 'debug' compile-time option enabled, while ``spack find -l +debug`` +will find every installed package with a 'debug' compile-time option enabled. + The full spec syntax is discussed in detail in :ref:`sec-specs`. @@ -457,6 +468,26 @@ For compilers, like ``clang``, that do not support Fortran, put Once you save the file, the configured compilers will show up in the list displayed by ``spack compilers``. +You can also add compiler flags to manually configured compilers. 
The +valid flags are ``cflags``, ``cxxflags``, ``fflags``, ``cppflags``, +``ldflags``, and ``ldlibs``. For example,:: + + ... + chaos_5_x86_64_ib: + ... + intel@15.0.0: + cc: /usr/local/bin/icc-15.0.024-beta + cxx: /usr/local/bin/icpc-15.0.024-beta + f77: /usr/local/bin/ifort-15.0.024-beta + fc: /usr/local/bin/ifort-15.0.024-beta + cppflags: -O3 -fPIC + ... + +These flags will be treated by spack as if they were enterred from +the command line each time this compiler is used. The compiler wrappers +then inject those flags into the compiler command. Compiler flags +enterred from the command line will be discussed in more detail in the +following section. .. _sec-specs: @@ -474,7 +505,7 @@ the full syntax of specs. Here is an example of a much longer spec than we've seen thus far:: - mpileaks @1.2:1.4 %gcc@4.7.5 +debug -qt =bgqos_0 ^callpath @1.1 %gcc@4.7.2 + mpileaks @1.2:1.4 %gcc@4.7.5 +debug -qt arch=bgq_os ^callpath @1.1 %gcc@4.7.2 If provided to ``spack install``, this will install the ``mpileaks`` library at some version between ``1.2`` and ``1.4`` (inclusive), @@ -492,8 +523,12 @@ More formally, a spec consists of the following pieces: * ``%`` Optional compiler specifier, with an optional compiler version (``gcc`` or ``gcc@4.7.3``) * ``+`` or ``-`` or ``~`` Optional variant specifiers (``+debug``, - ``-qt``, or ``~qt``) -* ``=`` Optional architecture specifier (``bgqos_0``) + ``-qt``, or ``~qt``) for boolean variants +* ``name=<value>`` Optional variant specifiers that are not restricted to +boolean variants +* ``name=<value>`` Optional compiler flag specifiers. Valid flag names are +``cflags``, ``cxxflags``, ``fflags``, ``cppflags``, ``ldflags``, and ``ldlibs``. +* ``arch=<value>`` Optional architecture specifier (``arch=bgq_os``) * ``^`` Dependency specs (``^callpath@1.1``) There are two things to notice here. The first is that specs are @@ -573,7 +608,7 @@ compilers, variants, and architectures just like any other spec. Specifiers are associated with the nearest package name to their left. For example, above, ``@1.1`` and ``%gcc@4.7.2`` associates with the ``callpath`` package, while ``@1.2:1.4``, ``%gcc@4.7.5``, ``+debug``, -``-qt``, and ``=bgqos_0`` all associate with the ``mpileaks`` package. +``-qt``, and ``arch=bgq_os`` all associate with the ``mpileaks`` package. In the diagram above, ``mpileaks`` depends on ``mpich`` with an unspecified version, but packages can depend on other packages with @@ -629,22 +664,25 @@ based on site policies. Variants ~~~~~~~~~~~~~~~~~~~~~~~ -.. Note:: - - Variants are not yet supported, but will be in the next Spack - release (0.9), due in Q2 2015. - -Variants are named options associated with a particular package, and -they can be turned on or off. For example, above, supplying -``+debug`` causes ``mpileaks`` to be built with debug flags. The -names of particular variants available for a package depend on what -was provided by the package author. ``spack info <package>`` will +Variants are named options associated with a particular package. They are +optional, as each package must provide default values for each variant it +makes available. Variants can be specified using +a flexible parameter syntax ``name=<value>``. For example, +``spack install libelf debug=True`` will install libelf build with debug +flags. The names of particular variants available for a package depend on +what was provided by the package author. ``spack into <package>`` will provide information on what build variants are available. 
-Depending on the package a variant may be on or off by default. For -``mpileaks`` here, ``debug`` is off by default, and we turned it on -with ``+debug``. If a package is on by default you can turn it off by -either adding ``-name`` or ``~name`` to the spec. +For compatibility with earlier versions, variants which happen to be +boolean in nature can be specified by a syntax that represents turning +options on and off. For example, in the previous spec we could have +supplied ``libelf +debug`` with the same effect of enabling the debug +compile time option for the libelf package. + +Depending on the package a variant may have any default value. For +``libelf`` here, ``debug`` is ``False`` by default, and we turned it on +with ``debug=True`` or ``+debug``. If a package is ``True`` by default +you can turn it off by either adding ``-name`` or ``~name`` to the spec. There are two syntaxes here because, depending on context, ``~`` and ``-`` may mean different things. In most shells, the following will @@ -656,7 +694,7 @@ result in the shell performing home directory substitution: mpileaks~debug # use this instead If there is a user called ``debug``, the ``~`` will be incorrectly -expanded. In this situation, you would want to write ``mpileaks +expanded. In this situation, you would want to write ``libelf -debug``. However, ``-`` can be ambiguous when included after a package name without spaces: @@ -671,12 +709,35 @@ package, not a request for ``mpileaks`` built without ``debug`` options. In this scenario, you should write ``mpileaks~debug`` to avoid ambiguity. -When spack normalizes specs, it prints them out with no spaces and -uses only ``~`` for disabled variants. We allow ``-`` and spaces on -the command line is provided for convenience and legibility. +When spack normalizes specs, it prints them out with no spaces boolean +variants using the backwards compatibility syntax and uses only ``~`` +for disabled boolean variants. We allow ``-`` and spaces on the command +line is provided for convenience and legibility. -Architecture specifier +Compiler Flags +~~~~~~~~~~~~~~~~~~~~~~~ + +Compiler flags are specified using the same syntax as non-boolean variants, +but fulfill a different purpose. While the function of a variant is set by +the package, compiler flags are used by the compiler wrappers to inject +flags into the compile line of the build. Additionally, compiler flags are +inherited by dependencies. ``spack install libdwarf cppflags=\"-g\"`` will +install both libdwarf and libelf with the ``-g`` flag injected into their +compile line. + +Notice that the value of the compiler flags must be escape quoted on the +command line. From within python files, the same spec would be specified +``libdwarf cppflags="-g"``. This is necessary because of how the shell +handles the quote symbols. + +The six compiler flags are injected in the order of implicit make commands +in gnu autotools. If all flags are set, the order is +``$cppflags $cflags|$cxxflags $ldflags command $ldlibs`` for C and C++ and +``$fflags $cppflags $ldflags command $ldlibs`` for fortran. + + +Architecture specifiers ~~~~~~~~~~~~~~~~~~~~~~~ .. Note:: @@ -684,12 +745,9 @@ Architecture specifier Architecture specifiers are part of specs but are not yet functional. They will be in Spack version 1.0, due in Q3 2015. -The architecture specifier starts with a ``=`` and also comes after -some package name within a spec. It allows a user to specify a -particular architecture for the package to be built. 
This is mostly -used for architectures that need cross-compilation, and in most cases, -users will not need to specify the architecture when they install a -package. +The architecture specifier looks identical to a variant specifier for a +non-boolean variant. The architecture can be specified only using the +reserved name ``arch`` (``arch=bgq_os``). .. _sec-virtual-dependencies: @@ -767,6 +825,23 @@ any MPI implementation will do. If another package depends on error. Likewise, if you try to plug in some package that doesn't provide MPI, Spack will raise an error. +Specifying Specs by Hash +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +Complicated specs can become cumbersome to enter on the command line, +especially when many of the qualifications are necessary to +distinguish between similar installs, for example when using the +``uninstall`` command. To avoid this, when referencing an existing spec, +Spack allows you to reference specs by their hash. We previously +discussed the spec hash that Spack computes. In place of a spec in any +command, substitute ``/<hash>`` where ``<hash>`` is any amount from +the beginning of a spec hash. If the given spec hash is sufficient +to be unique, Spack will replace the reference with the spec to which +it refers. Otherwise, it will prompt for a more qualified hash. + +Note that this will not work to reinstall a depencency uninstalled by +``spack uninstall -f``. + .. _spack-providers: ``spack providers`` @@ -996,8 +1071,8 @@ than one installed package matches it), then Spack will warn you: $ spack load libelf ==> Error: Multiple matches for spec libelf. Choose one: - libelf@0.8.13%gcc@4.4.7=chaos_5_x86_64_ib - libelf@0.8.13%intel@15.0.0=chaos_5_x86_64_ib + libelf@0.8.13%gcc@4.4.7 arch=chaos_5_x86_64_ib + libelf@0.8.13%intel@15.0.0 arch=chaos_5_x86_64_ib You can either type the ``spack load`` command again with a fully qualified argument, or you can add just enough extra constraints to @@ -1276,7 +1351,7 @@ You can find extensions for your Python installation like this: .. code-block:: sh $ spack extensions python - ==> python@2.7.8%gcc@4.4.7=chaos_5_x86_64_ib-703c7a96 + ==> python@2.7.8%gcc@4.4.7 arch=chaos_5_x86_64_ib-703c7a96 ==> 36 extensions: geos py-ipython py-pexpect py-pyside py-sip py-basemap py-libxml2 py-pil py-pytz py-six @@ -1366,9 +1441,9 @@ installation: .. code-block:: sh $ spack activate py-numpy - ==> Activated extension py-setuptools@11.3.1%gcc@4.4.7=chaos_5_x86_64_ib-3c74eb69 for python@2.7.8%gcc@4.4.7. - ==> Activated extension py-nose@1.3.4%gcc@4.4.7=chaos_5_x86_64_ib-5f70f816 for python@2.7.8%gcc@4.4.7. - ==> Activated extension py-numpy@1.9.1%gcc@4.4.7=chaos_5_x86_64_ib-66733244 for python@2.7.8%gcc@4.4.7. + ==> Activated extension py-setuptools@11.3.1%gcc@4.4.7 arch=chaos_5_x86_64_ib-3c74eb69 for python@2.7.8%gcc@4.4.7. + ==> Activated extension py-nose@1.3.4%gcc@4.4.7 arch=chaos_5_x86_64_ib-5f70f816 for python@2.7.8%gcc@4.4.7. + ==> Activated extension py-numpy@1.9.1%gcc@4.4.7 arch=chaos_5_x86_64_ib-66733244 for python@2.7.8%gcc@4.4.7. Several things have happened here. The user requested that ``py-numpy`` be activated in the ``python`` installation it was built @@ -1383,7 +1458,7 @@ packages listed as activated: .. 
code-block:: sh $ spack extensions python - ==> python@2.7.8%gcc@4.4.7=chaos_5_x86_64_ib-703c7a96 + ==> python@2.7.8%gcc@4.4.7 arch=chaos_5_x86_64_ib-703c7a96 ==> 36 extensions: geos py-ipython py-pexpect py-pyside py-sip py-basemap py-libxml2 py-pil py-pytz py-six @@ -1431,7 +1506,7 @@ dependencies, you can use ``spack activate -f``: .. code-block:: sh $ spack activate -f py-numpy - ==> Activated extension py-numpy@1.9.1%gcc@4.4.7=chaos_5_x86_64_ib-66733244 for python@2.7.8%gcc@4.4.7. + ==> Activated extension py-numpy@1.9.1%gcc@4.4.7 arch=chaos_5_x86_64_ib-66733244 for python@2.7.8%gcc@4.4.7. .. _spack-deactivate: diff --git a/lib/spack/docs/site_configuration.rst b/lib/spack/docs/configuration.rst index 3abfa21a9d..c613071c65 100644 --- a/lib/spack/docs/site_configuration.rst +++ b/lib/spack/docs/configuration.rst @@ -1,6 +1,6 @@ -.. _site-configuration: +.. _configuration: -Site configuration +Configuration =================================== .. _temp-space: @@ -55,7 +55,7 @@ directory is. External Packages -~~~~~~~~~~~~~~~~~~~~~ +---------------------------- Spack can be configured to use externally-installed packages rather than building its own packages. This may be desirable if machines ship with system packages, such as a customized MPI @@ -70,15 +70,15 @@ directory. Here's an example of an external configuration: packages: openmpi: paths: - openmpi@1.4.3%gcc@4.4.7=chaos_5_x86_64_ib: /opt/openmpi-1.4.3 - openmpi@1.4.3%gcc@4.4.7=chaos_5_x86_64_ib+debug: /opt/openmpi-1.4.3-debug - openmpi@1.6.5%intel@10.1=chaos_5_x86_64_ib: /opt/openmpi-1.6.5-intel + openmpi@1.4.3%gcc@4.4.7 arch=chaos_5_x86_64_ib: /opt/openmpi-1.4.3 + openmpi@1.4.3%gcc@4.4.7 arch=chaos_5_x86_64_ib+debug: /opt/openmpi-1.4.3-debug + openmpi@1.6.5%intel@10.1 arch=chaos_5_x86_64_ib: /opt/openmpi-1.6.5-intel This example lists three installations of OpenMPI, one built with gcc, one built with gcc and debug information, and another built with Intel. If Spack is asked to build a package that uses one of these MPIs as a dependency, it will use the the pre-installed OpenMPI in -the given directory. +the given directory. Each ``packages.yaml`` begins with a ``packages:`` token, followed by a list of package names. To specify externals, add a ``paths`` @@ -108,9 +108,9 @@ be: packages: openmpi: paths: - openmpi@1.4.3%gcc@4.4.7=chaos_5_x86_64_ib: /opt/openmpi-1.4.3 - openmpi@1.4.3%gcc@4.4.7=chaos_5_x86_64_ib+debug: /opt/openmpi-1.4.3-debug - openmpi@1.6.5%intel@10.1=chaos_5_x86_64_ib: /opt/openmpi-1.6.5-intel + openmpi@1.4.3%gcc@4.4.7 arch=chaos_5_x86_64_ib: /opt/openmpi-1.4.3 + openmpi@1.4.3%gcc@4.4.7 arch=chaos_5_x86_64_ib+debug: /opt/openmpi-1.4.3-debug + openmpi@1.6.5%intel@10.1 arch=chaos_5_x86_64_ib: /opt/openmpi-1.6.5-intel buildable: False The addition of the ``buildable`` flag tells Spack that it should never build @@ -118,13 +118,73 @@ its own version of OpenMPI, and it will instead always rely on a pre-built OpenMPI. Similar to ``paths``, ``buildable`` is specified as a property under a package name. -The ``buildable`` does not need to be paired with external packages. -It could also be used alone to forbid packages that may be +The ``buildable`` does not need to be paired with external packages. +It could also be used alone to forbid packages that may be buggy or otherwise undesirable. +Concretization Preferences +-------------------------------- + +Spack can be configured to prefer certain compilers, package +versions, depends_on, and variants during concretization. 
+The preferred configuration can be controlled via the +``~/.spack/packages.yaml`` file for user configuations, or the +``etc/spack/packages.yaml`` site configuration. + + +Here's an example packages.yaml file that sets preferred packages: + +.. code-block:: sh + + packages: + dyninst: + compiler: [gcc@4.9] + variants: +debug + gperftools: + version: [2.2, 2.4, 2.3] + all: + compiler: [gcc@4.4.7, gcc@4.6:, intel, clang, pgi] + providers: + mpi: [mvapich, mpich, openmpi] + + +At a high level, this example is specifying how packages should be +concretized. The dyninst package should prefer using gcc 4.9 and +be built with debug options. The gperftools package should prefer version +2.2 over 2.4. Every package on the system should prefer mvapich for +its MPI and gcc 4.4.7 (except for Dyninst, which overrides this by preferring gcc 4.9). +These options are used to fill in implicit defaults. Any of them can be overwritten +on the command line if explicitly requested. + +Each packages.yaml file begins with the string ``packages:`` and +package names are specified on the next level. The special string ``all`` +applies settings to each package. Underneath each package name is +one or more components: ``compiler``, ``variants``, ``version``, +or ``providers``. Each component has an ordered list of spec +``constraints``, with earlier entries in the list being preferred over +later entries. + +Sometimes a package installation may have constraints that forbid +the first concretization rule, in which case Spack will use the first +legal concretization rule. Going back to the example, if a user +requests gperftools 2.3 or later, then Spack will install version 2.4 +as the 2.4 version of gperftools is preferred over 2.3. + +An explicit concretization rule in the preferred section will always +take preference over unlisted concretizations. In the above example, +xlc isn't listed in the compiler list. Every listed compiler from +gcc to pgi will thus be preferred over the xlc compiler. + +The syntax for the ``provider`` section differs slightly from other +concretization rules. A provider lists a value that packages may +``depend_on`` (e.g, mpi) and a list of rules for fulfilling that +dependency. + + + Profiling -~~~~~~~~~~~~~~~~~~~~~ +------------------ Spack has some limited built-in support for profiling, and can report statistics using standard Python timing tools. To use this feature, @@ -133,7 +193,7 @@ supply ``-p`` to Spack on the command line, before any subcommands. .. _spack-p: ``spack -p`` -^^^^^^^^^^^^^^^^^^ +~~~~~~~~~~~~~~~~~ ``spack -p`` output looks like this: diff --git a/lib/spack/docs/features.rst b/lib/spack/docs/features.rst index 0998ba8da4..27a3b4b435 100644 --- a/lib/spack/docs/features.rst +++ b/lib/spack/docs/features.rst @@ -31,14 +31,21 @@ platform, all on the command line. 
# Specify a compiler (and its version), with % $ spack install mpileaks@1.1.2 %gcc@4.7.3 - # Add special compile-time options with + + # Add special compile-time options by name + $ spack install mpileaks@1.1.2 %gcc@4.7.3 debug=True + + # Add special boolean compile-time options with + $ spack install mpileaks@1.1.2 %gcc@4.7.3 +debug - # Cross-compile for a different architecture with = - $ spack install mpileaks@1.1.2 =bgqos_0 + # Add compiler flags using the conventional names + $ spack install mpileaks@1.1.2 %gcc@4.7.3 cppflags=\"-O3 -floop-block\" + + # Cross-compile for a different architecture with arch= + $ spack install mpileaks@1.1.2 arch=bgqos_0 -Users can specify as many or few options as they care about. Spack -will fill in the unspecified values with sensible defaults. +Users can specify as many or few options as they care about. Spack +will fill in the unspecified values with sensible defaults. The two listed +syntaxes for variants are identical when the value is boolean. Customize dependencies diff --git a/lib/spack/docs/index.rst b/lib/spack/docs/index.rst index d6ce52b747..98ed9ff0fe 100644 --- a/lib/spack/docs/index.rst +++ b/lib/spack/docs/index.rst @@ -47,7 +47,7 @@ Table of Contents basic_usage packaging_guide mirrors - site_configuration + configuration developer_guide command_index package_list diff --git a/lib/spack/docs/packaging_guide.rst b/lib/spack/docs/packaging_guide.rst index 650e0ee3b2..1f83f611b0 100644 --- a/lib/spack/docs/packaging_guide.rst +++ b/lib/spack/docs/packaging_guide.rst @@ -1221,11 +1221,13 @@ just as easily provide a version range: depends_on("libelf@0.8.2:0.8.4:") -Or a requirement for a particular variant: +Or a requirement for a particular variant or compiler flags: .. code-block:: python depends_on("libelf@0.8+debug") + depends_on('libelf debug=True') + depends_on('libelf cppflags="-fPIC") Both users *and* package authors can use the same spec syntax to refer to different package configurations. Users use the spec syntax on the @@ -1623,21 +1625,21 @@ the user runs ``spack install`` and the time the ``install()`` method is called. The concretized version of the spec above might look like this:: - mpileaks@2.3%gcc@4.7.3=linux-ppc64 - ^callpath@1.0%gcc@4.7.3+debug=linux-ppc64 - ^dyninst@8.1.2%gcc@4.7.3=linux-ppc64 - ^libdwarf@20130729%gcc@4.7.3=linux-ppc64 - ^libelf@0.8.11%gcc@4.7.3=linux-ppc64 - ^mpich@3.0.4%gcc@4.7.3=linux-ppc64 + mpileaks@2.3%gcc@4.7.3 arch=linux-ppc64 + ^callpath@1.0%gcc@4.7.3+debug arch=linux-ppc64 + ^dyninst@8.1.2%gcc@4.7.3 arch=linux-ppc64 + ^libdwarf@20130729%gcc@4.7.3 arch=linux-ppc64 + ^libelf@0.8.11%gcc@4.7.3 arch=linux-ppc64 + ^mpich@3.0.4%gcc@4.7.3 arch=linux-ppc64 .. 
graphviz:: digraph { - "mpileaks@2.3\n%gcc@4.7.3\n=linux-ppc64" -> "mpich@3.0.4\n%gcc@4.7.3\n=linux-ppc64" - "mpileaks@2.3\n%gcc@4.7.3\n=linux-ppc64" -> "callpath@1.0\n%gcc@4.7.3+debug\n=linux-ppc64" -> "mpich@3.0.4\n%gcc@4.7.3\n=linux-ppc64" - "callpath@1.0\n%gcc@4.7.3+debug\n=linux-ppc64" -> "dyninst@8.1.2\n%gcc@4.7.3\n=linux-ppc64" - "dyninst@8.1.2\n%gcc@4.7.3\n=linux-ppc64" -> "libdwarf@20130729\n%gcc@4.7.3\n=linux-ppc64" -> "libelf@0.8.11\n%gcc@4.7.3\n=linux-ppc64" - "dyninst@8.1.2\n%gcc@4.7.3\n=linux-ppc64" -> "libelf@0.8.11\n%gcc@4.7.3\n=linux-ppc64" + "mpileaks@2.3\n%gcc@4.7.3\n arch=linux-ppc64" -> "mpich@3.0.4\n%gcc@4.7.3\n arch=linux-ppc64" + "mpileaks@2.3\n%gcc@4.7.3\n arch=linux-ppc64" -> "callpath@1.0\n%gcc@4.7.3+debug\n arch=linux-ppc64" -> "mpich@3.0.4\n%gcc@4.7.3\n arch=linux-ppc64" + "callpath@1.0\n%gcc@4.7.3+debug\n arch=linux-ppc64" -> "dyninst@8.1.2\n%gcc@4.7.3\n arch=linux-ppc64" + "dyninst@8.1.2\n%gcc@4.7.3\n arch=linux-ppc64" -> "libdwarf@20130729\n%gcc@4.7.3\n arch=linux-ppc64" -> "libelf@0.8.11\n%gcc@4.7.3\n arch=linux-ppc64" + "dyninst@8.1.2\n%gcc@4.7.3\n arch=linux-ppc64" -> "libelf@0.8.11\n%gcc@4.7.3\n arch=linux-ppc64" } Here, all versions, compilers, and platforms are filled in, and there @@ -1648,8 +1650,8 @@ point will Spack call the ``install()`` method for your package. Concretization in Spack is based on certain selection policies that tell Spack how to select, e.g., a version, when one is not specified explicitly. Concretization policies are discussed in more detail in -:ref:`site-configuration`. Sites using Spack can customize them to -match the preferences of their own users. +:ref:`configuration`. Sites using Spack can customize them to match +the preferences of their own users. .. _spack-spec: @@ -1666,9 +1668,9 @@ running ``spack spec``. For example: ^libdwarf ^libelf - dyninst@8.0.1%gcc@4.7.3=linux-ppc64 - ^libdwarf@20130729%gcc@4.7.3=linux-ppc64 - ^libelf@0.8.13%gcc@4.7.3=linux-ppc64 + dyninst@8.0.1%gcc@4.7.3 arch=linux-ppc64 + ^libdwarf@20130729%gcc@4.7.3 arch=linux-ppc64 + ^libelf@0.8.13%gcc@4.7.3 arch=linux-ppc64 This is useful when you want to know exactly what Spack will do when you ask for a particular spec. @@ -1682,60 +1684,8 @@ be concretized on their system. For example, one user may prefer packages built with OpenMPI and the Intel compiler. Another user may prefer packages be built with MVAPICH and GCC. -Spack can be configured to prefer certain compilers, package -versions, depends_on, and variants during concretization. -The preferred configuration can be controlled via the -``~/.spack/packages.yaml`` file for user configuations, or the -``etc/spack/packages.yaml`` site configuration. - - -Here's an example packages.yaml file that sets preferred packages: - -.. code-block:: sh - - packages: - dyninst: - compiler: [gcc@4.9] - variants: +debug - gperftools: - version: [2.2, 2.4, 2.3] - all: - compiler: [gcc@4.4.7, gcc@4.6:, intel, clang, pgi] - providers: - mpi: [mvapich, mpich, openmpi] - - -At a high level, this example is specifying how packages should be -concretized. The dyninst package should prefer using gcc 4.9 and -be built with debug options. The gperftools package should prefer version -2.2 over 2.4. Every package on the system should prefer mvapich for -its MPI and gcc 4.4.7 (except for Dyninst, which overrides this by preferring gcc 4.9). -These options are used to fill in implicit defaults. Any of them can be overwritten -on the command line if explicitly requested. 
- -Each packages.yaml file begins with the string ``packages:`` and -package names are specified on the next level. The special string ``all`` -applies settings to each package. Underneath each package name is -one or more components: ``compiler``, ``variants``, ``version``, -or ``providers``. Each component has an ordered list of spec -``constraints``, with earlier entries in the list being preferred over -later entries. - -Sometimes a package installation may have constraints that forbid -the first concretization rule, in which case Spack will use the first -legal concretization rule. Going back to the example, if a user -requests gperftools 2.3 or later, then Spack will install version 2.4 -as the 2.4 version of gperftools is preferred over 2.3. - -An explicit concretization rule in the preferred section will always -take preference over unlisted concretizations. In the above example, -xlc isn't listed in the compiler list. Every listed compiler from -gcc to pgi will thus be preferred over the xlc compiler. - -The syntax for the ``provider`` section differs slightly from other -concretization rules. A provider lists a value that packages may -``depend_on`` (e.g, mpi) and a list of rules for fulfilling that -dependency. +See the `documentation in the config section <concretization-preferences_>`_ +for more details. .. _install-method: @@ -1960,6 +1910,12 @@ the command line. ``$rpath_flag`` can be overriden on a compiler specific basis in ``lib/spack/spack/compilers/$compiler.py``. +The compiler wrappers also pass the compiler flags specified by the user from +the command line (``cflags``, ``cxxflags``, ``fflags``, ``cppflags``, ``ldflags``, +and/or ``ldlibs``). They do not override the canonical autotools flags with the +same names (but in ALL-CAPS) that may be passed into the build by particularly +challenging package scripts. + Compiler flags ~~~~~~~~~~~~~~ In rare circumstances such as compiling and running small unit tests, a package @@ -2206,12 +2162,12 @@ example: def install(self, prefix): # Do default install - @when('=chaos_5_x86_64_ib') + @when('arch=chaos_5_x86_64_ib') def install(self, prefix): # This will be executed instead of the default install if # the package's sys_type() is chaos_5_x86_64_ib. - @when('=bgqos_0") + @when('arch=bgqos_0") def install(self, prefix): # This will be executed if the package's sys_type is bgqos_0 @@ -2801,11 +2757,11 @@ build it: $ spack stage libelf ==> Trying to fetch from http://www.mr511.de/software/libelf-0.8.13.tar.gz ######################################################################## 100.0% - ==> Staging archive: /Users/gamblin2/src/spack/var/spack/stage/libelf@0.8.13%gcc@4.8.3=linux-ppc64/libelf-0.8.13.tar.gz - ==> Created stage in /Users/gamblin2/src/spack/var/spack/stage/libelf@0.8.13%gcc@4.8.3=linux-ppc64. + ==> Staging archive: /Users/gamblin2/src/spack/var/spack/stage/libelf@0.8.13%gcc@4.8.3 arch=linux-ppc64/libelf-0.8.13.tar.gz + ==> Created stage in /Users/gamblin2/src/spack/var/spack/stage/libelf@0.8.13%gcc@4.8.3 arch=linux-ppc64. $ spack cd libelf $ pwd - /Users/gamblin2/src/spack/var/spack/stage/libelf@0.8.13%gcc@4.8.3=linux-ppc64/libelf-0.8.13 + /Users/gamblin2/src/spack/var/spack/stage/libelf@0.8.13%gcc@4.8.3 arch=linux-ppc64/libelf-0.8.13 ``spack cd`` here changed he current working directory to the directory containing the expanded ``libelf`` source code. 
There are a diff --git a/lib/spack/spack/cmd/find.py b/lib/spack/spack/cmd/find.py index c2bba13dc8..9649bc7435 100644 --- a/lib/spack/spack/cmd/find.py +++ b/lib/spack/spack/cmd/find.py @@ -38,71 +38,59 @@ description = "Find installed spack packages" def setup_parser(subparser): format_group = subparser.add_mutually_exclusive_group() - format_group.add_argument('-s', - '--short', + format_group.add_argument('-s', '--short', action='store_const', dest='mode', const='short', help='Show only specs (default)') - format_group.add_argument('-p', - '--paths', + format_group.add_argument('-p', '--paths', action='store_const', dest='mode', const='paths', help='Show paths to package install directories') format_group.add_argument( - '-d', - '--deps', + '-d', '--deps', action='store_const', dest='mode', const='deps', help='Show full dependency DAG of installed packages') - subparser.add_argument('-l', - '--long', + subparser.add_argument('-l', '--long', action='store_true', dest='long', help='Show dependency hashes as well as versions.') - subparser.add_argument('-L', - '--very-long', + subparser.add_argument('-L', '--very-long', action='store_true', dest='very_long', help='Show dependency hashes as well as versions.') - subparser.add_argument('-f', - '--show-flags', + subparser.add_argument('-f', '--show-flags', action='store_true', dest='show_flags', help='Show spec compiler flags.') subparser.add_argument( - '-e', - '--explicit', + '-e', '--explicit', action='store_true', help='Show only specs that were installed explicitly') subparser.add_argument( - '-E', - '--implicit', + '-E', '--implicit', action='store_true', help='Show only specs that were installed as dependencies') subparser.add_argument( - '-u', - '--unknown', + '-u', '--unknown', action='store_true', dest='unknown', help='Show only specs Spack does not have a package for.') subparser.add_argument( - '-m', - '--missing', + '-m', '--missing', action='store_true', dest='missing', help='Show missing dependencies as well as installed specs.') - subparser.add_argument('-M', - '--only-missing', + subparser.add_argument('-M', '--only-missing', action='store_true', dest='only_missing', help='Show only missing dependencies.') - subparser.add_argument('-N', - '--namespace', + subparser.add_argument('-N', '--namespace', action='store_true', help='Show fully qualified package names.') @@ -188,7 +176,32 @@ def display_specs(specs, **kwargs): print(hsh + spec.format(format_string, color=True) + '\n') else: - raise ValueError("Invalid mode for display_specs: %s. Must be one of (paths, deps, short)." % mode) # NOQA: ignore=E501 + raise ValueError( + "Invalid mode for display_specs: %s. Must be one of (paths," + "deps, short)." % mode) # NOQA: ignore=E501 + + +def query_arguments(args): + # Check arguments + if args.explicit and args.implicit: + tty.error('You can\'t pass -E and -e options simultaneously.') + raise SystemExit(1) + + # Set up query arguments. + installed, known = True, any + if args.only_missing: + installed = False + elif args.missing: + installed = any + if args.unknown: + known = False + explicit = any + if args.explicit: + explicit = True + if args.implicit: + explicit = False + q_args = {'installed': installed, 'known': known, "explicit": explicit} + return q_args def find(parser, args): @@ -205,22 +218,7 @@ def find(parser, args): if not query_specs: return - # Set up query arguments. 
- installed, known = True, any - if args.only_missing: - installed = False - elif args.missing: - installed = any - if args.unknown: - known = False - - explicit = any - if args.explicit: - explicit = False - if args.implicit: - explicit = True - - q_args = {'installed': installed, 'known': known, "explicit": explicit} + q_args = query_arguments(args) # Get all the specs the user asked for if not query_specs: diff --git a/lib/spack/spack/database.py b/lib/spack/spack/database.py index f3967e6b72..e768ddf5fe 100644 --- a/lib/spack/spack/database.py +++ b/lib/spack/spack/database.py @@ -40,7 +40,6 @@ filesystem. """ import os -import time import socket import yaml @@ -56,6 +55,7 @@ from spack.spec import Spec from spack.error import SpackError from spack.repository import UnknownPackageError + # DB goes in this directory underneath the root _db_dirname = '.spack-db' @@ -69,10 +69,12 @@ _db_lock_timeout = 60 def _autospec(function): """Decorator that automatically converts the argument of a single-arg function to a Spec.""" + def converter(self, spec_like, *args, **kwargs): if not isinstance(spec_like, spack.spec.Spec): spec_like = spack.spec.Spec(spec_like) return function(self, spec_like, *args, **kwargs) + return converter @@ -92,6 +94,7 @@ class InstallRecord(object): dependents left. """ + def __init__(self, spec, path, installed, ref_count=0, explicit=False): self.spec = spec self.path = str(path) @@ -100,16 +103,19 @@ class InstallRecord(object): self.explicit = explicit def to_dict(self): - return { 'spec' : self.spec.to_node_dict(), - 'path' : self.path, - 'installed' : self.installed, - 'ref_count' : self.ref_count, - 'explicit' : self.explicit } + return { + 'spec': self.spec.to_node_dict(), + 'path': self.path, + 'installed': self.installed, + 'ref_count': self.ref_count, + 'explicit': self.explicit + } @classmethod def from_dict(cls, spec, dictionary): d = dictionary - return InstallRecord(spec, d['path'], d['installed'], d['ref_count'], d.get('explicit', False)) + return InstallRecord(spec, d['path'], d['installed'], d['ref_count'], + d.get('explicit', False)) class Database(object): @@ -144,7 +150,7 @@ class Database(object): # Set up layout of database files within the db dir self._index_path = join_path(self._db_dir, 'index.yaml') - self._lock_path = join_path(self._db_dir, 'lock') + self._lock_path = join_path(self._db_dir, 'lock') # Create needed directories and files if not os.path.exists(self._db_dir): @@ -157,17 +163,14 @@ class Database(object): self.lock = Lock(self._lock_path) self._data = {} - def write_transaction(self, timeout=_db_lock_timeout): """Get a write lock context manager for use in a `with` block.""" return WriteTransaction(self, self._read, self._write, timeout) - def read_transaction(self, timeout=_db_lock_timeout): """Get a read lock context manager for use in a `with` block.""" return ReadTransaction(self, self._read, None, timeout) - def _write_to_yaml(self, stream): """Write out the databsae to a YAML file. @@ -183,9 +186,9 @@ class Database(object): # different paths, it can't differentiate. # TODO: fix this before we support multiple install locations. 
database = { - 'database' : { - 'installs' : installs, - 'version' : str(_db_version) + 'database': { + 'installs': installs, + 'version': str(_db_version) } } @@ -194,15 +197,11 @@ class Database(object): except YAMLError as e: raise SpackYAMLError("error writing YAML database:", str(e)) - def _read_spec_from_yaml(self, hash_key, installs, parent_key=None): """Recursively construct a spec from a hash in a YAML database. Does not do any locking. """ - if hash_key not in installs: - parent = read_spec(installs[parent_key]['path']) - spec_dict = installs[hash_key]['spec'] # Install records don't include hash with spec, so we add it in here @@ -224,7 +223,6 @@ class Database(object): spec._mark_concrete() return spec - def _read_from_yaml(self, stream): """ Fill database from YAML, do not maintain old data @@ -246,15 +244,15 @@ class Database(object): return def check(cond, msg): - if not cond: raise CorruptDatabaseError(self._index_path, msg) + if not cond: + raise CorruptDatabaseError(self._index_path, msg) check('database' in yfile, "No 'database' attribute in YAML.") # High-level file checks db = yfile['database'] check('installs' in db, "No 'installs' in YAML DB.") - check('version' in db, "No 'version' in YAML DB.") - + check('version' in db, "No 'version' in YAML DB.") installs = db['installs'] @@ -277,25 +275,25 @@ class Database(object): # hashes are the same. spec_hash = spec.dag_hash() if not spec_hash == hash_key: - tty.warn("Hash mismatch in database: %s -> spec with hash %s" - % (hash_key, spec_hash)) - continue # TODO: is skipping the right thing to do? + tty.warn( + "Hash mismatch in database: %s -> spec with hash %s" % + (hash_key, spec_hash)) + continue # TODO: is skipping the right thing to do? # Insert the brand new spec in the database. Each # spec has its own copies of its dependency specs. - # TODO: would a more immmutable spec implementation simplify this? + # TODO: would a more immmutable spec implementation simplify + # this? data[hash_key] = InstallRecord.from_dict(spec, rec) except Exception as e: tty.warn("Invalid database reecord:", "file: %s" % self._index_path, - "hash: %s" % hash_key, - "cause: %s" % str(e)) + "hash: %s" % hash_key, "cause: %s" % str(e)) raise self._data = data - def reindex(self, directory_layout): """Build database index from scratch based from a directory layout. @@ -320,7 +318,6 @@ class Database(object): self._data = old_data raise - def _check_ref_counts(self): """Ensure consistency of reference counts in the DB. @@ -342,9 +339,8 @@ class Database(object): found = rec.ref_count if not expected == found: raise AssertionError( - "Invalid ref_count: %s: %d (expected %d), in DB %s" - % (key, found, expected, self._index_path)) - + "Invalid ref_count: %s: %d (expected %d), in DB %s" % + (key, found, expected, self._index_path)) def _write(self): """Write the in-memory database index to its file path. @@ -366,7 +362,6 @@ class Database(object): os.remove(temp_file) raise - def _read(self): """Re-read Database from the data in the set location. @@ -381,7 +376,6 @@ class Database(object): # reindex() takes its own write lock, so no lock here. self.reindex(spack.install_layout) - def _add(self, spec, path, directory_layout=None, explicit=False): """Add an install record for spec at path to the database. 
@@ -404,11 +398,11 @@ class Database(object): rec.path = path else: - self._data[key] = InstallRecord(spec, path, True, explicit=explicit) + self._data[key] = InstallRecord(spec, path, True, + explicit=explicit) for dep in spec.dependencies.values(): self._increment_ref_count(dep, directory_layout) - def _increment_ref_count(self, spec, directory_layout=None): """Recursively examine dependencies and update their DB entries.""" key = spec.dag_hash() @@ -438,28 +432,25 @@ class Database(object): with self.write_transaction(): self._add(spec, path, explicit=explicit) - def _get_matching_spec_key(self, spec, **kwargs): """Get the exact spec OR get a single spec that matches.""" key = spec.dag_hash() - if not key in self._data: + if key not in self._data: match = self.query_one(spec, **kwargs) if match: return match.dag_hash() raise KeyError("No such spec in database! %s" % spec) return key - @_autospec def get_record(self, spec, **kwargs): key = self._get_matching_spec_key(spec, **kwargs) return self._data[key] - def _decrement_ref_count(self, spec): key = spec.dag_hash() - if not key in self._data: + if key not in self._data: # TODO: print something here? DB is corrupt, but # not much we can do. return @@ -472,7 +463,6 @@ class Database(object): for dep in spec.dependencies.values(): self._decrement_ref_count(dep) - def _remove(self, spec): """Non-locking version of remove(); does real work. """ @@ -491,7 +481,6 @@ class Database(object): # query spec was passed in. return rec.spec - @_autospec def remove(self, spec): """Removes a spec from the database. To be called on uninstall. @@ -508,7 +497,6 @@ class Database(object): with self.write_transaction(): return self._remove(spec) - @_autospec def installed_extensions_for(self, extendee_spec): """ @@ -519,12 +507,11 @@ class Database(object): try: if s.package.extends(extendee_spec): yield s.package - except UnknownPackageError as e: + except UnknownPackageError: continue # skips unknown packages # TODO: conditional way to do this instead of catching exceptions - def query(self, query_spec=any, known=any, installed=True, explicit=any): """Run a query on the database. @@ -567,14 +554,14 @@ class Database(object): continue if explicit is not any and rec.explicit != explicit: continue - if known is not any and spack.repo.exists(rec.spec.name) != known: + if known is not any and spack.repo.exists( + rec.spec.name) != known: continue if query_spec is any or rec.spec.satisfies(query_spec): results.append(rec.spec) return sorted(results) - def query_one(self, query_spec, known=any, installed=True): """Query for exactly one spec that matches the query spec. @@ -586,10 +573,9 @@ class Database(object): assert len(concrete_specs) <= 1 return concrete_specs[0] if concrete_specs else None - def missing(self, spec): with self.read_transaction(): - key = spec.dag_hash() + key = spec.dag_hash() return key in self._data and not self._data[key].installed @@ -601,7 +587,10 @@ class _Transaction(object): Timeout for lock is customizable. """ - def __init__(self, db, acquire_fn=None, release_fn=None, + + def __init__(self, db, + acquire_fn=None, + release_fn=None, timeout=_db_lock_timeout): self._db = db self._timeout = timeout @@ -636,11 +625,11 @@ class WriteTransaction(_Transaction): class CorruptDatabaseError(SpackError): def __init__(self, path, msg=''): super(CorruptDatabaseError, self).__init__( - "Spack database is corrupt: %s. %s" %(path, msg)) + "Spack database is corrupt: %s. 
%s" % (path, msg)) class InvalidDatabaseVersionError(SpackError): def __init__(self, expected, found): super(InvalidDatabaseVersionError, self).__init__( - "Expected database version %s but found version %s" - % (expected, found)) + "Expected database version %s but found version %s" % + (expected, found)) diff --git a/lib/spack/spack/fetch_strategy.py b/lib/spack/spack/fetch_strategy.py index 7c8cebe0c9..1953d7c1b3 100644 --- a/lib/spack/spack/fetch_strategy.py +++ b/lib/spack/spack/fetch_strategy.py @@ -1,4 +1,4 @@ -############################################################################## +# # Copyright (c) 2013-2016, Lawrence Livermore National Security, LLC. # Produced at the Lawrence Livermore National Laboratory. # @@ -21,7 +21,7 @@ # You should have received a copy of the GNU Lesser General Public # License along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA -############################################################################## +# """ Fetch strategies are used to download source code into a staging area in order to build it. They need to define the following methods: @@ -75,11 +75,13 @@ def _needs_stage(fun): class FetchStrategy(object): + """Superclass of all fetch strategies.""" enabled = False # Non-abstract subclasses should be enabled. required_attributes = None # Attributes required in version() args. class __metaclass__(type): + """This metaclass registers all fetch strategies in a list.""" def __init__(cls, name, bases, dict): @@ -126,6 +128,7 @@ class FetchStrategy(object): @pattern.composite(interface=FetchStrategy) class FetchStrategyComposite(object): + """ Composite for a FetchStrategy object. Implements the GoF composite pattern. """ @@ -134,6 +137,7 @@ class FetchStrategyComposite(object): class URLFetchStrategy(FetchStrategy): + """FetchStrategy that pulls source code from a URL for an archive, checks the archive against a checksum,and decompresses the archive. """ @@ -235,12 +239,13 @@ class URLFetchStrategy(FetchStrategy): # redirects properly. content_types = re.findall(r'Content-Type:[^\r\n]+', headers) if content_types and 'text/html' in content_types[-1]: - tty.warn( - "The contents of " + self.archive_file + " look like HTML.", - "The checksum will likely be bad. If it is, you can use", - "'spack clean <package>' to remove the bad archive, then fix", - "your internet gateway issue and install again.") - + tty.warn("The contents of ", + (self.archive_file if self.archive_file is not None + else "the archive"), + " look like HTML.", + "The checksum will likely be bad. If it is, you can use", + "'spack clean <package>' to remove the bad archive, then", + "fix your internet gateway issue and install again.") if save_file: os.rename(partial_file, save_file) @@ -353,6 +358,7 @@ class URLFetchStrategy(FetchStrategy): class VCSFetchStrategy(FetchStrategy): + def __init__(self, name, *rev_types, **kwargs): super(VCSFetchStrategy, self).__init__() self.name = name @@ -407,6 +413,7 @@ class VCSFetchStrategy(FetchStrategy): class GoFetchStrategy(VCSFetchStrategy): + """ Fetch strategy that employs the `go get` infrastructure Use like this in a package: @@ -466,6 +473,7 @@ class GoFetchStrategy(VCSFetchStrategy): class GitFetchStrategy(VCSFetchStrategy): + """ Fetch strategy that gets source code from a git repository. 
Use like this in a package: @@ -586,6 +594,7 @@ class GitFetchStrategy(VCSFetchStrategy): class SvnFetchStrategy(VCSFetchStrategy): + """Fetch strategy that gets source code from a subversion repository. Use like this in a package: @@ -662,6 +671,7 @@ class SvnFetchStrategy(VCSFetchStrategy): class HgFetchStrategy(VCSFetchStrategy): + """ Fetch strategy that gets source code from a Mercurial repository. Use like this in a package: @@ -806,11 +816,13 @@ def for_package_version(pkg, version): class FetchError(spack.error.SpackError): + def __init__(self, msg, long_msg=None): super(FetchError, self).__init__(msg, long_msg) class FailedDownloadError(FetchError): + """Raised wen a download fails.""" def __init__(self, url, msg=""): @@ -820,16 +832,19 @@ class FailedDownloadError(FetchError): class NoArchiveFileError(FetchError): + def __init__(self, msg, long_msg): super(NoArchiveFileError, self).__init__(msg, long_msg) class NoDigestError(FetchError): + def __init__(self, msg, long_msg=None): super(NoDigestError, self).__init__(msg, long_msg) class InvalidArgsError(FetchError): + def __init__(self, pkg, version): msg = ("Could not construct a fetch strategy for package %s at " "version %s") @@ -838,6 +853,7 @@ class InvalidArgsError(FetchError): class ChecksumError(FetchError): + """Raised when archive fails to checksum.""" def __init__(self, message, long_msg=None): @@ -845,6 +861,7 @@ class ChecksumError(FetchError): class NoStageError(FetchError): + """Raised when fetch operations are called before set_stage().""" def __init__(self, method): diff --git a/lib/spack/spack/spec.py b/lib/spack/spack/spec.py index bbc1abfa9e..a505b3c12e 100644 --- a/lib/spack/spack/spec.py +++ b/lib/spack/spack/spec.py @@ -129,24 +129,24 @@ from spack.build_environment import get_path_from_module, load_module identifier_re = r'\w[\w-]*' # Convenient names for color formats so that other things can use them -compiler_color = '@g' -version_color = '@c' -architecture_color = '@m' -enabled_variant_color = '@B' +compiler_color = '@g' +version_color = '@c' +architecture_color = '@m' +enabled_variant_color = '@B' disabled_variant_color = '@r' -dependency_color = '@.' -hash_color = '@K' +dependency_color = '@.' +hash_color = '@K' """This map determines the coloring of specs when using color output. We make the fields different colors to enhance readability. See spack.color for descriptions of the color codes. """ -color_formats = {'%' : compiler_color, - '@' : version_color, - '=' : architecture_color, - '+' : enabled_variant_color, - '~' : disabled_variant_color, - '^' : dependency_color, - '#' : hash_color } +color_formats = {'%': compiler_color, + '@': version_color, + '=': architecture_color, + '+': enabled_variant_color, + '~': disabled_variant_color, + '^': dependency_color, + '#': hash_color} """Regex used for splitting by spec field separators.""" _separators = '[%s]' % ''.join(color_formats.keys()) @@ -155,6 +155,7 @@ _separators = '[%s]' % ''.join(color_formats.keys()) every time we call str()""" _any_version = VersionList([':']) + def index_specs(specs): """Take a list of specs and return a dict of lists. 
Dict is keyed by spec name and lists include all specs with the @@ -162,7 +163,7 @@ def index_specs(specs): """ spec_dict = {} for spec in specs: - if not spec.name in spec_dict: + if spec.name not in spec_dict: spec_dict[spec.name] = [] spec_dict[spec.name].append(spec) return spec_dict @@ -209,8 +210,8 @@ class CompilerSpec(object): else: raise TypeError( - "Can only build CompilerSpec from string or CompilerSpec." + - " Found %s" % type(arg)) + "Can only build CompilerSpec from string or " + + "CompilerSpec. Found %s" % type(arg)) elif nargs == 2: name, version = args @@ -222,23 +223,19 @@ class CompilerSpec(object): raise TypeError( "__init__ takes 1 or 2 arguments. (%d given)" % nargs) - def _add_version(self, version): self.versions.add(version) - def _autospec(self, compiler_spec_like): if isinstance(compiler_spec_like, CompilerSpec): return compiler_spec_like return CompilerSpec(compiler_spec_like) - def satisfies(self, other, strict=False): other = self._autospec(other) return (self.name == other.name and self.versions.satisfies(other.versions, strict=strict)) - def constrain(self, other): """Intersect self's versions with other. @@ -252,44 +249,37 @@ class CompilerSpec(object): return self.versions.intersect(other.versions) - @property def concrete(self): """A CompilerSpec is concrete if its versions are concrete and there is an available compiler with the right version.""" return self.versions.concrete - @property def version(self): if not self.concrete: raise SpecError("Spec is not concrete: " + str(self)) return self.versions[0] - def copy(self): clone = CompilerSpec.__new__(CompilerSpec) clone.name = self.name clone.versions = self.versions.copy() return clone - def _cmp_key(self): return (self.name, self.versions) - def to_dict(self): - d = {'name' : self.name} + d = {'name': self.name} d.update(self.versions.to_dict()) - return { 'compiler' : d } - + return {'compiler': d} @staticmethod def from_dict(d): d = d['compiler'] return CompilerSpec(d['name'], VersionList.from_dict(d)) - def __str__(self): out = self.name if self.versions and self.versions != _any_version: @@ -303,6 +293,7 @@ class CompilerSpec(object): @key_ordering class VariantSpec(object): + """Variants are named, build-time options for a package. Names depend on the particular package being built, and each named variant can be enabled or disabled. @@ -311,17 +302,14 @@ class VariantSpec(object): self.name = name self.value = value - def _cmp_key(self): return (self.name, self.value) - def copy(self): return VariantSpec(self.name, self.value) - def __str__(self): - if self.value in [True,False]: + if self.value in [True, False]: out = '+' if self.value else '~' return out + self.name else: @@ -329,11 +317,11 @@ class VariantSpec(object): class VariantMap(HashableMap): + def __init__(self, spec): super(VariantMap, self).__init__() self.spec = spec - def satisfies(self, other, strict=False): if strict or self.spec._concrete: return all(k in self and self[k].value == other[k].value @@ -342,7 +330,6 @@ class VariantMap(HashableMap): return all(self[k].value == other[k].value for k in other if k in self) - def constrain(self, other): """Add all variants in other that aren't in self to self. 
@@ -361,7 +348,7 @@ class VariantMap(HashableMap): raise UnsatisfiableVariantSpecError(self[k], other[k]) else: self[k] = other[k].copy() - changed =True + changed = True return changed @property @@ -369,14 +356,12 @@ class VariantMap(HashableMap): return self.spec._concrete or all( v in self for v in self.spec.package_class.variants) - def copy(self): clone = VariantMap(None) for name, variant in self.items(): clone[name] = variant.copy() return clone - def __str__(self): sorted_keys = sorted(self.keys()) return ''.join(str(self[key]) for key in sorted_keys) @@ -385,20 +370,20 @@ class VariantMap(HashableMap): _valid_compiler_flags = [ 'cflags', 'cxxflags', 'fflags', 'ldflags', 'ldlibs', 'cppflags'] + class FlagMap(HashableMap): + def __init__(self, spec): super(FlagMap, self).__init__() self.spec = spec - def satisfies(self, other, strict=False): if strict or (self.spec and self.spec._concrete): return all(f in self and set(self[f]) <= set(other[f]) for f in other) else: return all(set(self[f]) <= set(other[f]) - for f in other if (other[f] != [] and f in self)) - + for f in other if (other[f] != [] and f in self)) def constrain(self, other): """Add all flags in other that aren't in self to self. @@ -408,13 +393,15 @@ class FlagMap(HashableMap): if other.spec and other.spec._concrete: for k in self: if k not in other: - raise UnsatisfiableCompilerFlagSpecError(self[k], '<absent>') + raise UnsatisfiableCompilerFlagSpecError( + self[k], '<absent>') changed = False for k in other: if k in self and not set(self[k]) <= set(other[k]): raise UnsatisfiableCompilerFlagSpecError( - ' '.join(f for f in self[k]), ' '.join( f for f in other[k])) + ' '.join(f for f in self[k]), + ' '.join(f for f in other[k])) elif k not in self: self[k] = other[k] changed = True @@ -428,32 +415,33 @@ class FlagMap(HashableMap): def concrete(self): return all(flag in self for flag in _valid_compiler_flags) - def copy(self): clone = FlagMap(None) for name, value in self.items(): clone[name] = value return clone - def _cmp_key(self): - return ''.join(str(key) + ' '.join(str(v) for v in value) for key, value in sorted(self.items())) - + return ''.join(str(key) + ' '.join(str(v) for v in value) + for key, value in sorted(self.items())) def __str__(self): - sorted_keys = filter(lambda flag: self[flag] != [], sorted(self.keys())) - cond_symbol = ' ' if len(sorted_keys)>0 else '' - return cond_symbol + ' '.join(str(key) + '=\"' + ' '.join(str(f) for f in self[key]) + '\"' for key in sorted_keys) + sorted_keys = filter( + lambda flag: self[flag] != [], sorted(self.keys())) + cond_symbol = ' ' if len(sorted_keys) > 0 else '' + return cond_symbol + ' '.join(str(key) + '=\"' + ' '.join(str(f) + for f in self[key]) + '\"' + for key in sorted_keys) class DependencyMap(HashableMap): + """Each spec has a DependencyMap containing specs for its dependencies. The DependencyMap is keyed by name. """ @property def concrete(self): return all(d.concrete for d in self.values()) - def __str__(self): return ''.join( ["^" + str(self[name]) for name in sorted(self.keys())]) @@ -461,6 +449,7 @@ class DependencyMap(HashableMap): @key_ordering class Spec(object): + def __init__(self, spec_like, *dep_like, **kwargs): # Copy if spec_like is a Spec. if isinstance(spec_like, Spec): @@ -513,7 +502,6 @@ class Spec(object): spec = dep if isinstance(dep, Spec) else Spec(dep) self._add_dependency(spec) - # # Private routines here are called by the parser when building a spec. 
# @@ -521,10 +509,10 @@ class Spec(object): """Called by the parser to add an allowable version.""" self.versions.add(version) - def _add_variant(self, name, value): """Called by the parser to add a variant.""" - if name in self.variants: raise DuplicateVariantError( + if name in self.variants: + raise DuplicateVariantError( "Cannot specify variant '%s' twice" % name) if isinstance(value, basestring) and value.upper() == 'TRUE': value = True @@ -532,7 +520,6 @@ class Spec(object): value = False self.variants[name] = VariantSpec(name, value) - def _add_flag(self, name, value): """Called by the parser to add a known flag. Known flags currently include "arch" @@ -564,11 +551,12 @@ class Spec(object): assert(self.compiler_flags is not None) self.compiler_flags[name] = value.split() else: - self._add_variant(name,value) + self._add_variant(name, value) def _set_compiler(self, compiler): """Called by the parser to set the compiler.""" - if self.compiler: raise DuplicateCompilerSpecError( + if self.compiler: + raise DuplicateCompilerSpecError( "Spec for '%s' cannot have two compilers." % self.name) self.compiler = compiler @@ -617,7 +605,8 @@ class Spec(object): def _add_dependency(self, spec): """Called by the parser to add another spec as a dependency.""" if spec.name in self.dependencies: - raise DuplicateDependencyError("Cannot depend on '%s' twice" % spec) + raise DuplicateDependencyError( + "Cannot depend on '%s' twice" % spec) self.dependencies[spec.name] = spec spec.dependents[self.name] = self @@ -626,8 +615,8 @@ class Spec(object): # @property def fullname(self): - return '%s.%s' % (self.namespace, self.name) if self.namespace else (self.name if self.name else '') - + return (('%s.%s' % (self.namespace, self.name)) if self.namespace else + (self.name if self.name else '')) @property def root(self): @@ -647,12 +636,10 @@ class Spec(object): assert(all(first_root is d.root for d in depiter)) return first_root - @property def package(self): return spack.repo.get(self) - @property def package_class(self): """Internal package call gets only the class object for a package. @@ -660,7 +647,6 @@ class Spec(object): """ return spack.repo.get_pkg_class(self.name) - @property def virtual(self): """Right now, a spec is virtual if no package exists with its name. @@ -672,12 +658,10 @@ class Spec(object): """ return Spec.is_virtual(self.name) - @staticmethod def is_virtual(name): """Test if a name is virtual without requiring a Spec.""" - return (not name is None) and ( not spack.repo.exists(name) ) - + return (name is not None) and (not spack.repo.exists(name)) @property def concrete(self): @@ -699,7 +683,6 @@ class Spec(object): and self.dependencies.concrete) return self._concrete - def traverse(self, visited=None, d=0, **kwargs): """Generic traversal of the DAG represented by this spec. This will yield each node in the spec. Options: @@ -743,14 +726,14 @@ class Spec(object): """ # get initial values for kwargs - depth = kwargs.get('depth', False) - key_fun = kwargs.get('key', id) + depth = kwargs.get('depth', False) + key_fun = kwargs.get('key', id) if isinstance(key_fun, basestring): key_fun = attrgetter(key_fun) yield_root = kwargs.get('root', True) - cover = kwargs.get('cover', 'nodes') - direction = kwargs.get('direction', 'children') - order = kwargs.get('order', 'pre') + cover = kwargs.get('cover', 'nodes') + direction = kwargs.get('direction', 'children') + order = kwargs.get('order', 'pre') # Make sure kwargs have legal values; raise ValueError if not. 
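# Usage of the traversal options handled above: cover/order/direction control
# which nodes are visited and in what order, and depth=True makes each yielded
# item a (depth, spec) pair. Sketch on a freshly parsed (not concretized) spec:
from spack.spec import Spec

s = Spec('mpileaks ^callpath ^mpich')
for d, node in s.traverse(order='post', depth=True):
    print('  ' * d + node.name)   # callpath and mpich at depth 1, then mpileaks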
def validate(name, val, allowed_values): @@ -787,33 +770,29 @@ class Spec(object): visited.add(key) for name in sorted(successors): child = successors[name] - for elt in child.traverse(visited, d+1, **kwargs): + for elt in child.traverse(visited, d + 1, **kwargs): yield elt # Postorder traversal yields after successors if yield_me and order == 'post': yield result - @property def short_spec(self): """Returns a version of the spec with the dependencies hashed instead of completely enumerated.""" return self.format('$_$@$%@$+$=$#') - @property def cshort_spec(self): """Returns a version of the spec with the dependencies hashed instead of completely enumerated.""" return self.format('$_$@$%@$+$=$#', color=True) - @property def prefix(self): return Prefix(spack.install_layout.path_for_spec(self)) - def dag_hash(self, length=None): """ Return a hash of the entire spec DAG, including connectivity. @@ -830,8 +809,9 @@ class Spec(object): return b32_hash def to_node_dict(self): - params = dict( (name, v.value) for name, v in self.variants.items() ) - params.update( dict( (name, value) for name, value in self.compiler_flags.items()) ) + params = dict((name, v.value) for name, v in self.variants.items()) + params.update(dict((name, value) + for name, value in self.compiler_flags.items())) d = { 'parameters' : params, 'arch' : self.architecture, @@ -857,8 +837,7 @@ class Spec(object): d['compiler'] = None d.update(self.versions.to_dict()) - return { self.name : d } - + return {self.name: d} def to_yaml(self, stream=None): node_list = [] @@ -866,10 +845,9 @@ class Spec(object): node = s.to_node_dict() node[s.name]['hash'] = s.dag_hash() node_list.append(node) - return yaml.dump({ 'spec' : node_list }, + return yaml.dump({'spec': node_list}, stream=stream, default_flow_style=False) - @staticmethod def from_node_dict(node): name = next(iter(node)) @@ -901,11 +879,11 @@ class Spec(object): for name in FlagMap.valid_compiler_flags(): spec.compiler_flags[name] = [] else: - raise SpackRecordError("Did not find a valid format for variants in YAML file") + raise SpackRecordError( + "Did not find a valid format for variants in YAML file") return spec - @staticmethod def from_yaml(stream): """Construct a spec from YAML. @@ -938,15 +916,16 @@ class Spec(object): deps[name].dependencies[dep_name] = deps[dep_name] return spec - def _concretize_helper(self, presets=None, visited=None): """Recursive helper function for concretize(). This concretizes everything bottom-up. As things are concretized, they're added to the presets, and ancestors will prefer the settings of their children. 
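# The to_node_dict/to_yaml pair above serializes a spec DAG as one YAML
# document holding a node dict per spec, keyed by name and carrying that
# node's dag_hash; from_yaml rebuilds the DAG. Round-trip sketch (assumes a
# working Spack checkout on the Python path):
from spack.spec import Spec

s = Spec('libelf@0.8.13+debug')
text = s.to_yaml()
t = Spec.from_yaml(text)
assert t.name == 'libelf'
assert '+debug' in str(t)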
""" - if presets is None: presets = {} - if visited is None: visited = set() + if presets is None: + presets = {} + if visited is None: + visited = set() if self.name in visited: return False @@ -967,7 +946,8 @@ class Spec(object): changed |= any( (spack.concretizer.concretize_architecture(self), spack.concretizer.concretize_compiler(self), - spack.concretizer.concretize_compiler_flags(self),#has to be concretized after compiler + spack.concretizer.concretize_compiler_flags( + self), # has to be concretized after compiler spack.concretizer.concretize_version(self), spack.concretizer.concretize_variants(self))) presets[self.name] = self @@ -975,7 +955,6 @@ class Spec(object): visited.add(self.name) return changed - def _replace_with(self, concrete): """Replace this virtual spec with a concrete spec.""" assert(self.virtual) @@ -987,7 +966,6 @@ class Spec(object): if concrete.name not in dependent.dependencies: dependent._add_dependency(concrete) - def _replace_node(self, replacement): """Replace this spec with another. @@ -1005,7 +983,6 @@ class Spec(object): del dep.dependents[self.name] del self.dependencies[dep.name] - def _expand_virtual_packages(self): """Find virtual packages in this spec, replace them with providers, and normalize again to include the provider's (potentially virtual) @@ -1038,12 +1015,14 @@ class Spec(object): # TODO: may break if in-place on self but # shouldn't happen if root is traversed first. spec._replace_with(replacement) - done=False + done = False break if not replacement: - # Get a list of possible replacements in order of preference. - candidates = spack.concretizer.choose_virtual_or_external(spec) + # Get a list of possible replacements in order of + # preference. + candidates = spack.concretizer.choose_virtual_or_external( + spec) # Try the replacements in order, skipping any that cause # satisfiability problems. @@ -1056,11 +1035,12 @@ class Spec(object): copy[spec.name]._dup(replacement.copy(deps=False)) try: - # If there are duplicate providers or duplicate provider - # deps, consolidate them and merge constraints. + # If there are duplicate providers or duplicate + # provider deps, consolidate them and merge + # constraints. copy.normalize(force=True) break - except SpecError as e: + except SpecError: # On error, we'll try the next replacement. continue @@ -1085,7 +1065,6 @@ class Spec(object): feq(replacement.external, spec.external) and feq(replacement.external_module, spec.external_module)): continue - # Refine this spec to the candidate. This uses # replace_with AND dup so that it can work in # place. TODO: make this more efficient. @@ -1096,12 +1075,11 @@ class Spec(object): changed = True self_index.update(spec) - done=False + done = False break return changed - def concretize(self): """A spec is concrete if it describes one build of a package uniquely. This will ensure that this spec is concrete. @@ -1110,9 +1088,9 @@ class Spec(object): of a package, this will add constraints to make it concrete. Some rigorous validation and checks are also performed on the spec. - Concretizing ensures that it is self-consistent and that it's consistent - with requirements of its pacakges. See flatten() and normalize() for - more details on this. + Concretizing ensures that it is self-consistent and that it's + consistent with requirements of its pacakges. See flatten() and + normalize() for more details on this. 
""" if not self.name: raise SpecError("Attempting to concretize anonymous spec") @@ -1128,7 +1106,7 @@ class Spec(object): self._expand_virtual_packages(), self._concretize_helper()) changed = any(changes) - force=True + force = True for s in self.traverse(): # After concretizing, assign namespaces to anything left. @@ -1154,7 +1132,6 @@ class Spec(object): # Mark everything in the spec as concrete, as well. self._mark_concrete() - def _mark_concrete(self): """Mark this spec and its dependencies as concrete. @@ -1165,7 +1142,6 @@ class Spec(object): s._normal = True s._concrete = True - def concretized(self): """This is a non-destructive version of concretize(). First clones, then returns a concrete version of this package without modifying @@ -1174,7 +1150,6 @@ class Spec(object): clone.concretize() return clone - def flat_dependencies(self, **kwargs): """Return a DependencyMap containing all of this spec's dependencies with their constraints merged. @@ -1213,7 +1188,6 @@ class Spec(object): # parser doesn't allow it. Spack must be broken! raise InconsistentSpecError("Invalid Spec DAG: %s" % e.message) - def index(self): """Return DependencyMap that points to all the dependencies in this spec.""" @@ -1222,7 +1196,6 @@ class Spec(object): dm[spec.name] = spec return dm - def flatten(self): """Pull all dependencies up to the root (this spec). Merge constraints for dependencies with the same name, and if they @@ -1230,7 +1203,6 @@ class Spec(object): for dep in self.flat_dependencies(copy=False): self._add_dependency(dep) - def _evaluate_dependency_conditions(self, name): """Evaluate all the conditions on a dependency with this name. @@ -1251,12 +1223,11 @@ class Spec(object): try: dep.constrain(dep_spec) except UnsatisfiableSpecError, e: - e.message = ("Conflicting conditional dependencies on package " - "%s for spec %s" % (self.name, self)) + e.message = ("Conflicting conditional dependencies on" + "package %s for spec %s" % (self.name, self)) raise e return dep - def _find_provider(self, vdep, provider_index): """Find provider for a virtual spec in the provider index. Raise an exception if there is a conflicting virtual @@ -1268,7 +1239,8 @@ class Spec(object): # If there is a provider for the vpkg, then use that instead of # the virtual package. if providers: - # Remove duplicate providers that can concretize to the same result. + # Remove duplicate providers that can concretize to the same + # result. for provider in providers: for spec in providers: if spec is not provider and provider.satisfies(spec): @@ -1287,11 +1259,10 @@ class Spec(object): elif required: raise UnsatisfiableProviderSpecError(required[0], vdep) - def _merge_dependency(self, dep, visited, spec_deps, provider_index): """Merge the dependency into this spec. - This is the core of the normalize() method. There are a few basic steps: + This is the core of normalize(). There are some basic steps: * If dep is virtual, evaluate whether it corresponds to an existing concrete dependency, and merge if so. @@ -1335,7 +1306,7 @@ class Spec(object): changed |= spec_deps[dep.name].constrain(dep) except UnsatisfiableSpecError, e: - e.message = "Invalid spec: '%s'. " + e.message = "Invalid spec: '%s'. 
" e.message += "Package %s requires %s %s, but spec asked for %s" e.message %= (spec_deps[dep.name], dep.name, e.constraint_type, e.required, e.provided) @@ -1346,10 +1317,10 @@ class Spec(object): if dep.name not in self.dependencies: self._add_dependency(dependency) - changed |= dependency._normalize_helper(visited, spec_deps, provider_index) + changed |= dependency._normalize_helper( + visited, spec_deps, provider_index) return changed - def _normalize_helper(self, visited, spec_deps, provider_index): """Recursive helper function for _normalize.""" if self.name in visited: @@ -1380,22 +1351,22 @@ class Spec(object): return any_change - def normalize(self, force=False): """When specs are parsed, any dependencies specified are hanging off the root, and ONLY the ones that were explicitly provided are there. Normalization turns a partial flat spec into a DAG, where: 1. Known dependencies of the root package are in the DAG. - 2. Each node's dependencies dict only contains its known direct deps. + 2. Each node's dependencies dict only contains its known direct + deps. 3. There is only ONE unique spec for each package in the DAG. * This includes virtual packages. If there a non-virtual package that provides a virtual package that is in the spec, then we replace the virtual package with the non-virtual one. - TODO: normalize should probably implement some form of cycle detection, - to ensure that the spec is actually a DAG. + TODO: normalize should probably implement some form of cycle + detection, to ensure that the spec is actually a DAG. """ if not self.name: raise SpecError("Attempting to normalize anonymous spec") @@ -1429,14 +1400,14 @@ class Spec(object): self._normal = True return any_change - def normalized(self): - """Return a normalized copy of this spec without modifying this spec.""" + """ + Return a normalized copy of this spec without modifying this spec. + """ clone = self.copy() clone.normalize() return clone - def validate_names(self): """This checks that names of packages and compilers in this spec are real. If they're not, it will raise either UnknownPackageError or @@ -1457,7 +1428,6 @@ class Spec(object): if vname not in spec.package_class.variants: raise UnknownVariantError(spec.name, vname) - def constrain(self, other, deps=True): """Merge the constraints of other with self. 
@@ -1465,19 +1435,22 @@ class Spec(object): """ other = self._autospec(other) - if not (self.name == other.name or (not self.name) or (not other.name) ): + if not (self.name == other.name or + (not self.name) or + (not other.name)): raise UnsatisfiableSpecNameError(self.name, other.name) - if other.namespace is not None: - if self.namespace is not None and other.namespace != self.namespace: - raise UnsatisfiableSpecNameError(self.fullname, other.fullname) + if (other.namespace is not None and + self.namespace is not None and + other.namespace != self.namespace): + raise UnsatisfiableSpecNameError(self.fullname, other.fullname) if not self.versions.overlaps(other.versions): raise UnsatisfiableVersionSpecError(self.versions, other.versions) for v in other.variants: if (v in self.variants and - self.variants[v].value != other.variants[v].value): + self.variants[v].value != other.variants[v].value): raise UnsatisfiableVariantSpecError(self.variants[v], other.variants[v]) @@ -1526,7 +1499,6 @@ class Spec(object): return changed - def _constrain_dependencies(self, other): """Apply constraints of other spec's dependencies to this spec.""" other = self._autospec(other) @@ -1545,7 +1517,6 @@ class Spec(object): for name in self.common_dependencies(other): changed |= self[name].constrain(other[name], deps=False) - # Update with additional constraints from other spec for name in other.dep_difference(self): self._add_dependency(other[name].copy()) @@ -1553,7 +1524,6 @@ class Spec(object): return changed - def common_dependencies(self, other): """Return names of dependencies that self an other have in common.""" common = set( @@ -1562,14 +1532,12 @@ class Spec(object): s.name for s in other.traverse(root=False)) return common - def constrained(self, other, deps=True): """Return a constrained copy without modifying this spec.""" clone = self.copy(deps=deps) clone.constrain(other, deps) return clone - def dep_difference(self, other): """Returns dependencies in self that are not in other.""" mine = set(s.name for s in self.traverse(root=False)) @@ -1577,11 +1545,11 @@ class Spec(object): s.name for s in other.traverse(root=False)) return mine - def _autospec(self, spec_like): - """Used to convert arguments to specs. If spec_like is a spec, returns it. - If it's a string, tries to parse a string. If that fails, tries to parse - a local spec from it (i.e. name is assumed to be self's name). + """ + Used to convert arguments to specs. If spec_like is a spec, returns + it. If it's a string, tries to parse a string. If that fails, tries + to parse a local spec from it (i.e. name is assumed to be self's name). """ if isinstance(spec_like, spack.spec.Spec): return spec_like @@ -1589,12 +1557,12 @@ class Spec(object): try: spec = spack.spec.Spec(spec_like) if not spec.name: - raise SpecError("anonymous package -- this will always be handled") + raise SpecError( + "anonymous package -- this will always be handled") return spec except SpecError: return parse_anonymous_spec(spec_like, self.name) - def satisfies(self, other, deps=True, strict=False): """Determine if this spec satisfies all constraints of another. @@ -1610,7 +1578,7 @@ class Spec(object): """ other = self._autospec(other) - # a concrete provider can satisfy a virtual dependency. + # A concrete provider can satisfy a virtual dependency. if not self.virtual and other.virtual: pkg = spack.repo.get(self.fullname) if pkg.provides(other.name): @@ -1625,10 +1593,10 @@ class Spec(object): return False # namespaces either match, or other doesn't require one. 
- if other.namespace is not None: - if self.namespace is not None and self.namespace != other.namespace: - return False - + if (other.namespace is not None and + self.namespace is not None and + self.namespace != other.namespace): + return False if self.versions and other.versions: if not self.versions.satisfies(other.versions, strict=strict): return False @@ -1650,9 +1618,6 @@ class Spec(object): # Architecture satisfaction is currently just string equality. # If not strict, None means unconstrained. - - - # TODO: Need to make sure that comparisons can be made via classes if self.architecture and other.architecture: if ((self.architecture.platform and other.architecture.platform and self.architecture.platform != other.architecture.platform) or (self.architecture.platform_os and other.architecture.platform_os and self.architecture.platform_os != other.architecture.platform_os) or @@ -1664,21 +1629,24 @@ class Spec(object): (other.architecture.target and not self.architecture.target)): return False - if not self.compiler_flags.satisfies(other.compiler_flags, strict=strict): + if not self.compiler_flags.satisfies( + other.compiler_flags, + strict=strict): return False # If we need to descend into dependencies, do it, otherwise we're done. if deps: deps_strict = strict if not (self.name and other.name): - deps_strict=True + deps_strict = True return self.satisfies_dependencies(other, strict=deps_strict) else: return True - def satisfies_dependencies(self, other, strict=False): - """This checks constraints on common dependencies against each other.""" + """ + This checks constraints on common dependencies against each other. + """ other = self._autospec(other) if strict: @@ -1689,7 +1657,8 @@ class Spec(object): return False elif not self.dependencies or not other.dependencies: - # if either spec doesn't restrict dependencies then both are compatible. + # if either spec doesn't restrict dependencies then both are + # compatible. return True # Handle first-order constraints directly @@ -1705,11 +1674,12 @@ class Spec(object): if not self_index.satisfies(other_index): return False - # These two loops handle cases where there is an overly restrictive vpkg - # in one spec for a provider in the other (e.g., mpi@3: is not compatible - # with mpich2) + # These two loops handle cases where there is an overly restrictive + # vpkg in one spec for a provider in the other (e.g., mpi@3: is not + # compatible with mpich2) for spec in self.virtual_dependencies(): - if spec.name in other_index and not other_index.providers_for(spec): + if (spec.name in other_index and + not other_index.providers_for(spec)): return False for spec in other.virtual_dependencies(): @@ -1718,12 +1688,10 @@ class Spec(object): return True - def virtual_dependencies(self): """Return list of any virtual deps in this spec.""" return [spec for spec in self.traverse() if spec.virtual] - def _dup(self, other, **kwargs): """Copy the spec other into self. This is an overwriting copy. It does not copy any dependents (parents), but by default @@ -1781,7 +1749,6 @@ class Spec(object): self.external_module = other.external_module return changed - def copy(self, **kwargs): """Return a copy of this spec. By default, returns a deep copy. 
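# satisfies() is directional: a more specific spec satisfies a looser
# constraint, and disjoint constraints never satisfy each other. Sketch:
from spack.spec import Spec

assert Spec('mpich@3.0.4').satisfies('mpich@3:')
assert not Spec('mpich@3.0.4').satisfies('mpich@:2')
# With strict=True every constraint in the other spec must be pinned down in
# self as well, so e.g. Spec('mpich').satisfies('mpich@3:', strict=True) is
# False because the version is still unconstrained.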
Supply dependencies=False @@ -1791,14 +1758,12 @@ class Spec(object): clone._dup(self, **kwargs) return clone - @property def version(self): if not self.versions.concrete: raise SpecError("Spec version is not concrete: " + str(self)) return self.versions[0] - def __getitem__(self, name): """Get a dependency from the spec by its name.""" for spec in self.traverse(): @@ -1817,7 +1782,6 @@ class Spec(object): raise KeyError("No spec with name %s in %s" % (name, self)) - def __contains__(self, spec): """True if this spec satisfis the provided spec, or if any dependency does. If the spec has no name, then we parse this one first. @@ -1829,13 +1793,11 @@ class Spec(object): return False - def sorted_deps(self): """Return a list of all dependencies sorted by name.""" deps = self.flat_dependencies() return tuple(deps[name] for name in sorted(deps)) - def _eq_dag(self, other, vs, vo): """Recursive helper for eq_dag and ne_dag. Does the actual DAG traversal.""" @@ -1848,18 +1810,22 @@ class Spec(object): if len(self.dependencies) != len(other.dependencies): return False - ssorted = [self.dependencies[name] for name in sorted(self.dependencies)] - osorted = [other.dependencies[name] for name in sorted(other.dependencies)] + ssorted = [self.dependencies[name] + for name in sorted(self.dependencies)] + osorted = [other.dependencies[name] + for name in sorted(other.dependencies)] for s, o in zip(ssorted, osorted): visited_s = id(s) in vs visited_o = id(o) in vo # Check for duplicate or non-equal dependencies - if visited_s != visited_o: return False + if visited_s != visited_o: + return False # Skip visited nodes - if visited_s or visited_o: continue + if visited_s or visited_o: + continue # Recursive check for equality if not s._eq_dag(o, vs, vo): @@ -1867,17 +1833,14 @@ class Spec(object): return True - def eq_dag(self, other): """True if the full dependency DAGs of specs are equal""" return self._eq_dag(other, set(), set()) - def ne_dag(self, other): """True if the full dependency DAGs of specs are not equal""" return not self.eq_dag(other) - def _cmp_node(self): """Comparison key for just *this node* and not its deps.""" return (self.name, @@ -1893,12 +1856,10 @@ class Spec(object): """Equality with another spec, not including dependencies.""" return self._cmp_node() == other._cmp_node() - def ne_node(self, other): """Inequality with another spec, not including dependencies.""" return self._cmp_node() != other._cmp_node() - def _cmp_key(self): """This returns a key for the spec *including* DAG structure. @@ -1910,55 +1871,56 @@ class Spec(object): tuple(hash(self.dependencies[name]) for name in sorted(self.dependencies)),) - def colorized(self): return colorize_spec(self) - def format(self, format_string='$_$@$%@+$+$=', **kwargs): - """Prints out particular pieces of a spec, depending on what is - in the format string. The format strings you can provide are:: - - $_ Package name - $. 
Full package name (with namespace) - $@ Version with '@' prefix - $% Compiler with '%' prefix - $%@ Compiler with '%' prefix & compiler version with '@' prefix - $%+ Compiler with '%' prefix & compiler flags prefixed by name - $%@+ Compiler, compiler version, and compiler flags with same prefixes as above - $+ Options - $= Architecture prefixed by 'arch=' - $# 7-char prefix of DAG hash with '-' prefix - $$ $ - - You can also use full-string versions, which leave off the prefixes: - - ${PACKAGE} Package name - ${VERSION} Version - ${COMPILER} Full compiler string - ${COMPILERNAME} Compiler name - ${COMPILERVER} Compiler version - ${COMPILERFLAGS} Compiler flags - ${OPTIONS} Options - ${ARCHITECTURE} Architecture - ${SHA1} Dependencies 8-char sha1 prefix - - ${SPACK_ROOT} The spack root directory - ${SPACK_INSTALL} The default spack install directory, ${SPACK_PREFIX}/opt - - Optionally you can provide a width, e.g. $20_ for a 20-wide name. - Like printf, you can provide '-' for left justification, e.g. - $-20_ for a left-justified name. - - Anything else is copied verbatim into the output stream. - - *Example:* ``$_$@$+`` translates to the name, version, and options - of the package, but no dependencies, architecture, or compiler. - - TODO: allow, e.g., $6# to customize short hash length - TODO: allow, e.g., $## for full hash. - """ - color = kwargs.get('color', False) + """ + Prints out particular pieces of a spec, depending on what is + in the format string. The format strings you can provide are:: + + $_ Package name + $. Full package name (with namespace) + $@ Version with '@' prefix + $% Compiler with '%' prefix + $%@ Compiler with '%' prefix & compiler version with '@' prefix + $%+ Compiler with '%' prefix & compiler flags prefixed by name + $%@+ Compiler, compiler version, and compiler flags with same + prefixes as above + $+ Options + $= Architecture prefixed by 'arch=' + $# 7-char prefix of DAG hash with '-' prefix + $$ $ + + You can also use full-string versions, which elide the prefixes: + + ${PACKAGE} Package name + ${VERSION} Version + ${COMPILER} Full compiler string + ${COMPILERNAME} Compiler name + ${COMPILERVER} Compiler version + ${COMPILERFLAGS} Compiler flags + ${OPTIONS} Options + ${ARCHITECTURE} Architecture + ${SHA1} Dependencies 8-char sha1 prefix + + ${SPACK_ROOT} The spack root directory + ${SPACK_INSTALL} The default spack install directory, + ${SPACK_PREFIX}/opt + + Optionally you can provide a width, e.g. $20_ for a 20-wide name. + Like printf, you can provide '-' for left justification, e.g. + $-20_ for a left-justified name. + + Anything else is copied verbatim into the output stream. + + *Example:* ``$_$@$+`` translates to the name, version, and options + of the package, but no dependencies, arch, or compiler. + + TODO: allow, e.g., $6# to customize short hash length + TODO: allow, e.g., $## for full hash. 
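# Usage of the tokens documented above (output shown is representative; the
# spec is only parsed, not concretized):
from spack.spec import Spec

s = Spec('libelf@0.8.13%gcc@4.4.7+debug')
print(s.format('$_$@$+'))                  # libelf@0.8.13+debug
print(s.format('${PACKAGE}-${VERSION}'))   # libelf-0.8.13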
+ """ + color = kwargs.get('color', False) length = len(format_string) out = StringIO() named = escape = compiler = False @@ -2016,7 +1978,7 @@ class Spec(object): elif compiler: if c == '@': if (self.compiler and self.compiler.versions and - self.compiler.versions != _any_version): + self.compiler.versions != _any_version): write(c + str(self.compiler.versions), '%') elif c == '+': if self.compiler_flags: @@ -2032,10 +1994,10 @@ class Spec(object): elif named: if not c == '}': if i == length - 1: - raise ValueError("Error: unterminated ${ in format: '%s'" - % format_string) + raise ValueError("Error: unterminated ${ in format:" + "'%s'" % format_string) named_str += c - continue; + continue if named_str == 'PACKAGE': name = self.name if self.name else '' write(fmt % self.name, '@') @@ -2081,7 +2043,6 @@ class Spec(object): result = out.getvalue() return result - def dep_string(self): return ''.join("^" + dep.format() for dep in self.sorted_deps()) @@ -2123,16 +2084,15 @@ class Spec(object): def __str__(self): return self.format() + self.dep_string() - def tree(self, **kwargs): """Prints out this spec and its dependencies, tree-formatted with indentation.""" - color = kwargs.pop('color', False) - depth = kwargs.pop('depth', False) + color = kwargs.pop('color', False) + depth = kwargs.pop('depth', False) showid = kwargs.pop('ids', False) - cover = kwargs.pop('cover', 'nodes') + cover = kwargs.pop('cover', 'nodes') indent = kwargs.pop('indent', 0) - fmt = kwargs.pop('format', '$_$@$%@+$+$=') + fmt = kwargs.pop('format', '$_$@$%@+$+$=') prefix = kwargs.pop('prefix', None) check_kwargs(kwargs, self.tree) @@ -2156,7 +2116,6 @@ class Spec(object): out += node.format(fmt, color=color) + "\n" return out - def __repr__(self): return str(self) @@ -2166,28 +2125,33 @@ class Spec(object): # HASH, DEP, AT, COLON, COMMA, ON, OFF, PCT, EQ, QT, ID = range(11) + class SpecLexer(spack.parse.Lexer): + """Parses tokens that make up spack specs.""" + def __init__(self): super(SpecLexer, self).__init__([ - (r'/', lambda scanner, val: self.token(HASH, val)), - (r'\^', lambda scanner, val: self.token(DEP, val)), - (r'\@', lambda scanner, val: self.token(AT, val)), - (r'\:', lambda scanner, val: self.token(COLON, val)), - (r'\,', lambda scanner, val: self.token(COMMA, val)), - (r'\+', lambda scanner, val: self.token(ON, val)), - (r'\-', lambda scanner, val: self.token(OFF, val)), - (r'\~', lambda scanner, val: self.token(OFF, val)), - (r'\%', lambda scanner, val: self.token(PCT, val)), - (r'\=', lambda scanner, val: self.token(EQ, val)), + (r'/', lambda scanner, val: self.token(HASH, val)), + (r'\^', lambda scanner, val: self.token(DEP, val)), + (r'\@', lambda scanner, val: self.token(AT, val)), + (r'\:', lambda scanner, val: self.token(COLON, val)), + (r'\,', lambda scanner, val: self.token(COMMA, val)), + (r'\+', lambda scanner, val: self.token(ON, val)), + (r'\-', lambda scanner, val: self.token(OFF, val)), + (r'\~', lambda scanner, val: self.token(OFF, val)), + (r'\%', lambda scanner, val: self.token(PCT, val)), + (r'\=', lambda scanner, val: self.token(EQ, val)), # This is more liberal than identifier_re (see above). # Checked by check_identifier() for better error messages. 
- (r'([\"\'])(?:(?=(\\?))\2.)*?\1',lambda scanner, val: self.token(QT, val)), + (r'([\"\'])(?:(?=(\\?))\2.)*?\1', + lambda scanner, val: self.token(QT, val)), (r'\w[\w.-]*', lambda scanner, val: self.token(ID, val)), - (r'\s+', lambda scanner, val: None)]) + (r'\s+', lambda scanner, val: None)]) class SpecParser(spack.parse.Parser): + def __init__(self): super(SpecParser, self).__init__(SpecLexer()) self.previous = None @@ -2208,7 +2172,8 @@ class SpecParser(spack.parse.Parser): self.token.value = self.token.value[1:-1] else: self.expect(ID) - specs[-1]._add_flag(self.previous.value, self.token.value) + specs[-1]._add_flag( + self.previous.value, self.token.value) else: specs.append(self.spec(self.previous.value)) self.previous = None @@ -2227,9 +2192,11 @@ class SpecParser(spack.parse.Parser): specs[-1]._add_dependency(self.spec(self.token.value)) else: - # Attempt to construct an anonymous spec, but check that the first token is valid - # TODO: Is this check even necessary, or will it all be Lex errors now? - specs.append(self.spec(None,True)) + # Attempt to construct an anonymous spec, but check that + # the first token is valid + # TODO: Is this check even necessary, or will it all be Lex + # errors now? + specs.append(self.spec(None, True)) except spack.parse.ParseError, e: raise SpecParseError(e) @@ -2242,12 +2209,10 @@ class SpecParser(spack.parse.Parser): s._set_platform(spack.architecture.sys_type()) return specs - def parse_compiler(self, text): self.setup(text) return self.compiler() - def spec_by_hash(self): self.expect(ID) @@ -2256,15 +2221,17 @@ class SpecParser(spack.parse.Parser): spec.dag_hash()[:len(self.token.value)] == self.token.value] if not matches: - tty.die("%s does not match any installed packages." %self.token.value) + tty.die("%s does not match any installed packages." % + self.token.value) if len(matches) != 1: - raise AmbiguousHashError("Multiple packages specify hash %s." % self.token.value, *matches) + raise AmbiguousHashError( + "Multiple packages specify hash %s." % self.token.value, + *matches) return matches[0] - - def spec(self, name, check_valid_token = False): + def spec(self, name, check_valid_token=False): """Parse a spec out of the input. If a spec is supplied, then initialize and return it instead of creating a new one.""" if name: @@ -2276,8 +2243,6 @@ class SpecParser(spack.parse.Parser): spec_namespace = None spec_name = None - - # This will init the spec without calling __init__. 
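# The '/' (HASH) token added to the lexer above, together with spec_by_hash(),
# lets a command-line spec refer to an already installed package by a prefix
# of its DAG hash, along the lines of (hash prefix illustrative, not real):
#
#     spack find /abc1234
#
# A prefix that matches no installed spec aborts with an error message, and
# one that matches several raises AmbiguousHashError.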
spec = Spec.__new__(Spec) spec.name = spec_name @@ -2288,7 +2253,7 @@ class SpecParser(spack.parse.Parser): spec.external = None spec.external_module = None spec.compiler_flags = FlagMap(spec) - spec.dependents = DependencyMap() + spec.dependents = DependencyMap() spec.dependencies = DependencyMap() spec.namespace = spec_namespace spec._hash = None @@ -2306,7 +2271,8 @@ class SpecParser(spack.parse.Parser): else: self.expect(ID) if self.accept(EQ): - raise SpecParseError(spack.parse.ParseError("","","Expected dependency received anonymous spec")) + raise SpecParseError(spack.parse.ParseError( + "", "", "Expected dependency received anonymous spec")) spec.add_dependency(self.spec(self.token.value)) while self.next: @@ -2322,7 +2288,7 @@ class SpecParser(spack.parse.Parser): check_valid_token = False elif self.accept(OFF): - spec._add_variant(self.variant(),False) + spec._add_variant(self.variant(), False) check_valid_token = False elif self.accept(PCT): @@ -2352,9 +2318,8 @@ class SpecParser(spack.parse.Parser): return spec - - def variant(self,name=None): - #TODO: Make generalized variants possible + def variant(self, name=None): + # TODO: Make generalized variants possible if name: return name else: @@ -2378,11 +2343,12 @@ class SpecParser(spack.parse.Parser): # No colon and no id: invalid version. self.next_token_error("Invalid version specifier") - if start: start = Version(start) - if end: end = Version(end) + if start: + start = Version(start) + if end: + end = Version(end) return VersionRange(start, end) - def version_list(self): vlist = [] vlist.append(self.version()) @@ -2390,7 +2356,6 @@ class SpecParser(spack.parse.Parser): vlist.append(self.version()) return vlist - def compiler(self): self.expect(ID) self.check_identifier() @@ -2406,7 +2371,6 @@ class SpecParser(spack.parse.Parser): compiler.versions = VersionList(':') return compiler - def check_identifier(self, id=None): """The only identifiers that can contain '.' 
are versions, but version ids are context-sensitive so we have to check on a case-by-case @@ -2440,10 +2404,15 @@ def parse_anonymous_spec(spec_like, pkg_name): try: anon_spec = Spec(spec_like) if anon_spec.name != pkg_name: - raise SpecParseError(spack.parse.ParseError("","","Expected anonymous spec for package %s but found spec for package %s" % (pkg_name, anon_spec.name) )) + raise SpecParseError(spack.parse.ParseError( + "", + "", + "Expected anonymous spec for package %s but found spec for" + "package %s" % (pkg_name, anon_spec.name))) except SpecParseError: - anon_spec = Spec(pkg_name + ' ' + spec_like) - if anon_spec.name != pkg_name: raise ValueError( + anon_spec = Spec(pkg_name + ' ' + spec_like) + if anon_spec.name != pkg_name: + raise ValueError( "Invalid spec for package %s: %s" % (pkg_name, spec_like)) else: anon_spec = spec_like.copy() @@ -2456,12 +2425,17 @@ def parse_anonymous_spec(spec_like, pkg_name): class SpecError(spack.error.SpackError): + """Superclass for all errors that occur while constructing specs.""" + def __init__(self, message): super(SpecError, self).__init__(message) + class SpecParseError(SpecError): + """Wrapper for ParseError for when we're parsing specs.""" + def __init__(self, parse_error): super(SpecParseError, self).__init__(parse_error.message) self.string = parse_error.string @@ -2469,68 +2443,79 @@ class SpecParseError(SpecError): class DuplicateDependencyError(SpecError): + """Raised when the same dependency occurs in a spec twice.""" + def __init__(self, message): super(DuplicateDependencyError, self).__init__(message) class DuplicateVariantError(SpecError): + """Raised when the same variant occurs in a spec twice.""" + def __init__(self, message): super(DuplicateVariantError, self).__init__(message) class DuplicateCompilerSpecError(SpecError): + """Raised when the same compiler occurs in a spec twice.""" + def __init__(self, message): super(DuplicateCompilerSpecError, self).__init__(message) class UnsupportedCompilerError(SpecError): + """Raised when the user asks for a compiler spack doesn't know about.""" + def __init__(self, compiler_name): super(UnsupportedCompilerError, self).__init__( "The '%s' compiler is not yet supported." % compiler_name) class UnknownVariantError(SpecError): + """Raised when the same variant occurs in a spec twice.""" + def __init__(self, pkg, variant): super(UnknownVariantError, self).__init__( "Package %s has no variant %s!" % (pkg, variant)) -class UnknownArchitectureSpecError(SpecError): - """ Raised when an entry in a string field is neither a platform, - operating system or a target. """ - def __init__(self, architecture_spec_entry): - super(UnknownArchitectureSpecError, self).__init__( - "Architecture spec %s is not a valid spec entry" % ( - architecture_spec_entry)) class DuplicateArchitectureError(SpecError): + """Raised when the same architecture occurs in a spec twice.""" + def __init__(self, message): super(DuplicateArchitectureError, self).__init__(message) class InconsistentSpecError(SpecError): + """Raised when two nodes in the same spec DAG have inconsistent constraints.""" + def __init__(self, message): super(InconsistentSpecError, self).__init__(message) class InvalidDependencyException(SpecError): + """Raised when a dependency in a spec is not actually a dependency of the package.""" + def __init__(self, message): super(InvalidDependencyException, self).__init__(message) class NoProviderError(SpecError): + """Raised when there is no package that provides a particular virtual dependency. 
""" + def __init__(self, vpkg): super(NoProviderError, self).__init__( "No providers found for virtual package: '%s'" % vpkg) @@ -2538,9 +2523,11 @@ class NoProviderError(SpecError): class MultipleProviderError(SpecError): + """Raised when there is no package that provides a particular virtual dependency. """ + def __init__(self, vpkg, providers): """Takes the name of the vpkg""" super(MultipleProviderError, self).__init__( @@ -2549,9 +2536,12 @@ class MultipleProviderError(SpecError): self.vpkg = vpkg self.providers = providers + class UnsatisfiableSpecError(SpecError): + """Raised when a spec conflicts with package constraints. Provide the requirement that was violated when raising.""" + def __init__(self, provided, required, constraint_type): super(UnsatisfiableSpecError, self).__init__( "%s does not satisfy %s" % (provided, required)) @@ -2561,69 +2551,95 @@ class UnsatisfiableSpecError(SpecError): class UnsatisfiableSpecNameError(UnsatisfiableSpecError): + """Raised when two specs aren't even for the same package.""" + def __init__(self, provided, required): super(UnsatisfiableSpecNameError, self).__init__( provided, required, "name") class UnsatisfiableVersionSpecError(UnsatisfiableSpecError): + """Raised when a spec version conflicts with package constraints.""" + def __init__(self, provided, required): super(UnsatisfiableVersionSpecError, self).__init__( provided, required, "version") class UnsatisfiableCompilerSpecError(UnsatisfiableSpecError): + """Raised when a spec comiler conflicts with package constraints.""" + def __init__(self, provided, required): super(UnsatisfiableCompilerSpecError, self).__init__( provided, required, "compiler") class UnsatisfiableVariantSpecError(UnsatisfiableSpecError): + """Raised when a spec variant conflicts with package constraints.""" + def __init__(self, provided, required): super(UnsatisfiableVariantSpecError, self).__init__( provided, required, "variant") + class UnsatisfiableCompilerFlagSpecError(UnsatisfiableSpecError): + """Raised when a spec variant conflicts with package constraints.""" + def __init__(self, provided, required): super(UnsatisfiableCompilerFlagSpecError, self).__init__( provided, required, "compiler_flags") + class UnsatisfiableArchitectureSpecError(UnsatisfiableSpecError): + """Raised when a spec architecture conflicts with package constraints.""" + def __init__(self, provided, required): super(UnsatisfiableArchitectureSpecError, self).__init__( provided, required, "architecture") class UnsatisfiableProviderSpecError(UnsatisfiableSpecError): + """Raised when a provider is supplied but constraints don't match a vpkg requirement""" + def __init__(self, provided, required): super(UnsatisfiableProviderSpecError, self).__init__( provided, required, "provider") # TODO: get rid of this and be more specific about particular incompatible # dep constraints + + class UnsatisfiableDependencySpecError(UnsatisfiableSpecError): + """Raised when some dependency of constrained specs are incompatible""" + def __init__(self, provided, required): super(UnsatisfiableDependencySpecError, self).__init__( provided, required, "dependency") + class SpackYAMLError(spack.error.SpackError): + def __init__(self, msg, yaml_error): super(SpackYAMLError, self).__init__(msg, str(yaml_error)) + class SpackRecordError(spack.error.SpackError): + def __init__(self, msg): super(SpackRecordError, self).__init__(msg) + class AmbiguousHashError(SpecError): + def __init__(self, msg, *specs): super(AmbiguousHashError, self).__init__(msg) for spec in specs: diff 
--git a/lib/spack/spack/test/__init__.py b/lib/spack/spack/test/__init__.py index 480e6290e7..97f142e746 100644 --- a/lib/spack/spack/test/__init__.py +++ b/lib/spack/spack/test/__init__.py @@ -39,7 +39,7 @@ test_names = ['architecture', 'versions', 'url_parse', 'url_substitution', 'pack 'svn_fetch', 'hg_fetch', 'mirror', 'modules', 'url_extrapolate', 'cc', 'link_tree', 'spec_yaml', 'optional_deps', 'make_executable', 'configure_guess', 'lock', 'database', - 'namespace_trie', 'yaml', 'sbang', 'environment', + 'namespace_trie', 'yaml', 'sbang', 'environment', 'cmd.find', 'cmd.uninstall', 'cmd.test_install'] diff --git a/lib/spack/spack/test/cmd/find.py b/lib/spack/spack/test/cmd/find.py new file mode 100644 index 0000000000..371e9650e0 --- /dev/null +++ b/lib/spack/spack/test/cmd/find.py @@ -0,0 +1,60 @@ +############################################################################## +# Copyright (c) 2013-2016, Lawrence Livermore National Security, LLC. +# Produced at the Lawrence Livermore National Laboratory. +# +# This file is part of Spack. +# Created by Todd Gamblin, tgamblin@llnl.gov, All rights reserved. +# LLNL-CODE-647188 +# +# For details, see https://github.com/llnl/spack +# Please also see the LICENSE file for our notice and the LGPL. +# +# This program is free software; you can redistribute it and/or modify +# it under the terms of the GNU Lesser General Public License (as +# published by the Free Software Foundation) version 2.1, February 1999. +# +# This program is distributed in the hope that it will be useful, but +# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and +# conditions of the GNU Lesser General Public License for more details. +# +# You should have received a copy of the GNU Lesser General Public +# License along with this program; if not, write to the Free Software +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA +############################################################################## + + +import spack.cmd.find +import unittest + + +class Bunch(object): + + def __init__(self, **kwargs): + self.__dict__.update(kwargs) + + +class FindTest(unittest.TestCase): + + def test_query_arguments(self): + query_arguments = spack.cmd.find.query_arguments + # Default arguments + args = Bunch(only_missing=False, missing=False, + unknown=False, explicit=False, implicit=False) + q_args = query_arguments(args) + self.assertTrue('installed' in q_args) + self.assertTrue('known' in q_args) + self.assertTrue('explicit' in q_args) + self.assertEqual(q_args['installed'], True) + self.assertEqual(q_args['known'], any) + self.assertEqual(q_args['explicit'], any) + # Check that explicit works correctly + args.explicit = True + q_args = query_arguments(args) + self.assertEqual(q_args['explicit'], True) + args.explicit = False + args.implicit = True + q_args = query_arguments(args) + self.assertEqual(q_args['explicit'], False) + args.explicit = True + self.assertRaises(SystemExit, query_arguments, args) |
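# A hedged sketch of the query_arguments() helper this new test exercises (the
# real implementation lives in lib/spack/spack/cmd/find.py; the flag handling
# below is inferred from the assertions above, not copied from that file):
def query_arguments(args):
    installed = True
    if args.only_missing:
        installed = False            # show only missing dependencies
    elif args.missing:
        installed = any              # show installed and missing alike
    known = False if args.unknown else any
    if args.explicit and args.implicit:
        # mutually exclusive flags; the test expects SystemExit here
        raise SystemExit('--explicit and --implicit are mutually exclusive')
    explicit = any
    if args.explicit:
        explicit = True
    if args.implicit:
        explicit = False
    return {'installed': installed, 'known': known, 'explicit': explicit}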