author     Todd Gamblin <tgamblin@llnl.gov>  2014-10-27 19:53:05 -0700
committer  Todd Gamblin <tgamblin@llnl.gov>  2014-10-27 19:53:05 -0700
commit     e2af2a27bf9b7b9d04eb190d89abdde365c36928 (patch)
tree       a4ff4e7c8745df5bcb713c4afde085addc207453 /lib
parent     87b87199f28542be20e49549f48462aed4b73e51 (diff)
parent     d41d6ed863decfa91720c8f71a2aca1a2c59ace7 (diff)
Merge branch 'features/git-fetching' into develop
Conflicts:
    lib/spack/docs/packaging_guide.rst
    lib/spack/spack/cmd/info.py
    lib/spack/spack/package.py
    lib/spack/spack/stage.py
Diffstat (limited to 'lib')
-rw-r--r--  lib/spack/docs/developer_guide.rst        |    2
-rw-r--r--  lib/spack/docs/packaging_guide.rst        | 1622
-rw-r--r--  lib/spack/llnl/util/compare/__init__.py   |    0
-rw-r--r--  lib/spack/llnl/util/compare/none_high.py  |   70
-rw-r--r--  lib/spack/llnl/util/compare/none_low.py   |   70
-rw-r--r--  lib/spack/spack/cmd/clean.py              |   10
-rw-r--r--  lib/spack/spack/cmd/create.py             |    3
-rw-r--r--  lib/spack/spack/cmd/info.py               |    4
-rw-r--r--  lib/spack/spack/cmd/location.py           |    6
-rw-r--r--  lib/spack/spack/cmd/md5.py                |   52
-rw-r--r--  lib/spack/spack/cmd/mirror.py             |  157
-rw-r--r--  lib/spack/spack/concretize.py             |   10
-rw-r--r--  lib/spack/spack/fetch_strategy.py         |  650
-rw-r--r--  lib/spack/spack/mirror.py                 |  189
-rw-r--r--  lib/spack/spack/package.py                |  210
-rw-r--r--  lib/spack/spack/packages.py               |   21
-rw-r--r--  lib/spack/spack/patch.py                  |    2
-rw-r--r--  lib/spack/spack/relations.py              |   28
-rw-r--r--  lib/spack/spack/spec.py                   |    2
-rw-r--r--  lib/spack/spack/stage.py                  |  190
-rw-r--r--  lib/spack/spack/test/__init__.py          |    7
-rw-r--r--  lib/spack/spack/test/git_fetch.py         |  133
-rw-r--r--  lib/spack/spack/test/hg_fetch.py          |  111
-rw-r--r--  lib/spack/spack/test/install.py           |   38
-rw-r--r--  lib/spack/spack/test/mirror.py            |  156
-rw-r--r--  lib/spack/spack/test/mock_repo.py         |  197
-rw-r--r--  lib/spack/spack/test/package_sanity.py    |    6
-rw-r--r--  lib/spack/spack/test/stage.py             |   22
-rw-r--r--  lib/spack/spack/test/svn_fetch.py         |  123
-rw-r--r--  lib/spack/spack/test/url_extrapolate.py   |   90
-rw-r--r--  lib/spack/spack/test/url_parse.py         |    7
-rw-r--r--  lib/spack/spack/test/versions.py          |   52
-rw-r--r--  lib/spack/spack/version.py                |  131
33 files changed, 3318 insertions(+), 1053 deletions(-)
diff --git a/lib/spack/docs/developer_guide.rst b/lib/spack/docs/developer_guide.rst
index 1f0977b4de..969ed60b15 100644
--- a/lib/spack/docs/developer_guide.rst
+++ b/lib/spack/docs/developer_guide.rst
@@ -4,7 +4,7 @@ Developer Guide
=====================
This guide is intended for people who want to work on Spack itself.
-If you just want to develop pacakges, see the :ref:`packaging-guide`.
+If you just want to develop packages, see the :ref:`packaging-guide`.
It is assumed that you've read the :ref:`basic-usage` and
:ref:`packaging-guide` sections, and that you're familiar with the
diff --git a/lib/spack/docs/packaging_guide.rst b/lib/spack/docs/packaging_guide.rst
index 0ec8047dad..1f95e56d2a 100644
--- a/lib/spack/docs/packaging_guide.rst
+++ b/lib/spack/docs/packaging_guide.rst
@@ -4,223 +4,60 @@ Packaging Guide
=====================
This guide is intended for developers or administrators who want to
-*package* their software so that Spack can install it. We assume that
-you have at least some familiarty with Python, and that you've read
-the :ref:`basic usage guide <basic-usage>`, especially the part
-about :ref:`specs <sec-specs>`.
+package software so that Spack can install it. It assumes that you
+have at least some familiarity with Python, and that you've read the
+:ref:`basic usage guide <basic-usage>`, especially the part about
+:ref:`specs <sec-specs>`.
There are two key parts of Spack:
#. **Specs**: expressions for describing builds of software, and
- #. **Packages**: Python modules that build software according to a
- spec.
+ #. **Packages**: Python modules that describe how to build
+ software according to a spec.
-Package files allow a developer to encapsulate build logic for
-different versions, compilers, and platforms in one place. Specs
-allow a user to describe a *particular* build in a way that a package
-author can understand.
+Specs allow a user to describe a *particular* build in a way that a
+package author can understand. Packages allow a developer to
+encapsulate the build logic for different versions, compilers,
+options, platforms, and dependency combinations in one place.
Packages in Spack are written in pure Python, so you can do anything
in Spack that you can do in Python. Python was chosen as the
-implementation language for two reasons. First, Python is getting to
-be ubiquitous in the HPC community due to its use in numerical codes.
+implementation language for two reasons. First, Python is becoming
+ubiquitous in the HPC community due to its use in numerical codes.
Second, it's a modern language and has many powerful features to help
make package writing easy.
-Finally, we've gone to great lengths to make it *easy* to create
-packages. The ``spack create`` command lets you generate a
-boilerplate package template from a tarball URL, and ideally you'll
-only need to run this once and slightly modify the boilerplate to get
-your package working.
-
-This section of the guide goes through the parts of a package, and
-then tells you how to make your own. If you're impatient, jump ahead
-to :ref:`spack-create`.
-
-Package Files
----------------------------
-
-It's probably easiest to learn about packages by looking at an
-example. Let's take a look at the ``libelf`` package:
-
-.. literalinclude:: ../../../var/spack/packages/libelf/package.py
- :lines: 25-
- :linenos:
-
-Directory Structure
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-A Spack installation directory is structured like a standard UNIX
-install prefix (``bin``, ``lib``, ``include``, ``var``, ``opt``,
-etc.). Most of the code for Spack lives in ``$SPACK_ROOT/lib/spack``.
-Packages themselves live in ``$SPACK_ROOT/var/spack/packages``.
-
-If you ``cd`` to that directory, you will see directories for each
-package:
-
-.. command-output:: cd $SPACK_ROOT/var/spack/packages; ls
- :shell:
- :ellipsis: 10
-
-Each of these directories contains a file called ``package.py``. This
-file is where all the python code for a package goes. For example,
-the ``libelf`` package looks like this::
-
- $SPACK_ROOT/var/spack/packages/
- libelf/
- package.py
-
-Alongside the ``package.py`` file, a package may contain extra files (like
-patches) that it needs to build.
-
-
-Package Names
-~~~~~~~~~~~~~~~~~~
-
-Packages are named after the directory containing ``package.py``. So,
-``libelf``'s ``package.py`` lives in a directory called ``libelf``.
-The ``package.py`` file contains a class called ``Libelf``, which
-extends Spack's ``Package`` class. This is what makes it a Spack
-package. The **directory name** is what users need to provide on the
-command line. e.g., if you type any of these:
-
-.. code-block:: sh
-
- $ spack install libelf
- $ spack install libelf@0.8.13
-
-Spack sees the package name in the spec and looks for
-``libelf/package.py`` in ``var/spack/packages``. Likewise, if you say
-``spack install docbook-xml``, then Spack looks for
-``docbook-xml/package.py``.
-
-We use the directory name to packagers more freedom when naming their
-packages. Package names can contain letters, numbers, dashes, and
-underscores. You can name a package ``3proxy`` or ``_foo`` and Spack
-won't care -- it just needs to see that name in the package spec.
-These aren't valid Python module names, but we allow them in Spack and
-import ``package.py`` file dynamically.
-
-Package class names
-~~~~~~~~~~~~~~~~~~~~~~~
-
-The **class name** (``Libelf`` in our example) is formed by converting
-words separated by `-` or ``_`` in the file name to camel case. If
-the name starts with a number, we prefix the class name with
-``_``. Here are some examples:
-
-================= =================
- Module Name Class Name
-================= =================
- ``foo_bar`` ``FooBar``
- ``docbook-xml`` ``DocbookXml``
- ``FooBar`` ``Foobar``
- ``3proxy`` ``_3proxy``
-================= =================
-
-The class name is needed by Spack to properly import a package, but
-not for much else. In general, you won't have to remember this naming
-convention because ``spack create`` will generate a boilerplate class
-for you, and you can just fill in the blanks.
-
-.. _metadata:
-
-Metadata
-~~~~~~~~~~~~~~~~~~~~
-
-Just under the class name is a description of the ``libelf`` package.
-In Python, this is called a *docstring*: a multi-line, triple-quoted
-(``"""``) string that comes just after the definition of a class.
-Spack uses the docstring to generate the description of the package
-that is shown when you run ``spack info``. If you don't provide a
-description, Spack will just print "None" for the description.
-
-In addition to the package description, there are a few fields you'll
-need to fill out. They are as follows:
-
-``homepage`` (required)
- This is the URL where you can learn about the package and get
- information. It is displayed to users when they run ``spack info``.
-
-``url`` (required)
- This is the URL where you can download a distribution tarball of
- the pacakge's source code.
-
-``versions`` (optional)
- This is a `dictionary
- <http://docs.python.org/2/tutorial/datastructures.html#dictionaries>`_
- mapping versions to MD5 hashes. Spack uses the hashes to checksum
- archives when it downloads a particular version.
-
-``parallel`` (optional) Whether make should be parallel by default.
- By default, this is ``True``, and package authors need to call
- ``make(parallel=False)`` to override. If you set this to ``False``
- at the package level then each call to ``make`` will be sequential
- by default, and users will have to call ``make(parallel=True)`` to
- override it.
-
-``versions`` is optional but strongly recommended. Spack will warn
-usrs if they try to install a version (e.g., ``libelf@0.8.10`` for
-which there is not a checksum available. They can force it to
-download the new version and install, but it's better to provide
-checksums so users don't have to install from an unchecked archive.
-
-
-Install method
-~~~~~~~~~~~~~~~~~~~~~~~
-
-The last element of the ``libelf`` package is its ``install()``
-method. This is where the real work of installation happens, and
-it's the main part of the package you'll need to customize for each
-piece of software.
-
-.. literalinclude:: ../../../var/spack/packages/libelf/package.py
- :start-after: 0.8.12
- :linenos:
-
-``install`` takes a ``spec``: a description of how the package should
-be built, and a ``prefix``: the path to the directory where the
-software should be installed.
-
-:ref:`Writing the install method <install-method>` is documented in
-detail later, but in general, the ``install()`` method should look
-familiar. ``libelf`` uses autotools, so the package first calls
-``configure``, passing the prefix and some other package-specific
-arguments. It then calls ``make`` and ``make install``.
-
-Spack provides wrapper functions for ``configure`` and ``make`` so
-that you can call them in a similar way to how you'd call a shell
-comamnd. In reality, these are Python functions. Spack provides
-these functions to make writing packages more natural. See the section
-on :ref:`shell wrappers <shell-wrappers>`.
+Creating & Editing Packages
+----------------------------------
.. _spack-create:
-Creating Packages
-----------------------------------
-
``spack create``
~~~~~~~~~~~~~~~~~~~~~
-The ``spack create`` command takes the tedium out of making packages.
-It generates boilerplate code for you, so that you can focus on
-getting your package build working.
+The ``spack create`` command generates a boilerplate package template
+from a URL pointing to a tarball or other software archive. In most
+cases, you'll only need to run this once, then slightly modify the
+boilerplate to get your package working.
-All you need is the URL to a tarball you want to package:
+All you need is the URL to a tarball (or other archive format) you
+want to package:
.. code-block:: sh
$ spack create http://www.cmake.org/files/v2.8/cmake-2.8.12.1.tar.gz
-When you run this, Spack will look at the tarball URL, and it will try
-to figure out the name of the package to be created. It will also try
-to figure out what version strings for that package look like. Once
-that is done, it tries to find *additional* versions by spidering the
-package's webpage. Spack then prompts you to tell it how many
-versions you want to download and checksum.
+When you run this, Spack looks at the tarball URL and tries to figure
+out the name of the package to be created. It also tries to determine
+what version strings look like for this package. Using this
+information, it tries to find *additional* versions by spidering the
+package's webpage. If it finds multiple versions, Spack prompts you
+to tell it how many versions you want to download and checksum.
.. code-block:: sh
+ $ spack create http://www.cmake.org/files/v2.8/cmake-2.8.12.1.tar.gz
+ ==> This looks like a URL for cmake version 2.8.12.1.
==> Creating template for package cmake
==> Found 18 versions of cmake.
2.8.12.1 http://www.cmake.org/files/v2.8/cmake-2.8.12.1.tar.gz
@@ -243,7 +80,7 @@ Spack will automatically download the number of tarballs you specify
Note that you don't need to do everything up front. If your package
is large, you can always choose to download just one tarball for now,
then run :ref:`spack checksum <spack-checksum>` later if you end up
-wanting more. Let's say you chose to download 3 tarballs:
+wanting more. Let's say you choose to download 3 tarballs:
.. code-block:: sh
@@ -290,6 +127,9 @@ Now Spack generates boilerplate code and opens the new
version('2.8.12', '105bc6d21cc2e9b6aff901e43c53afea')
version('2.8.11.2', '6f5d7b8e7534a5d9e1a7664ba63cf882')
+ # FIXME: Add dependencies if this package requires them.
+ # depends_on("foo")
+
def install(self, spec, prefix):
# FIXME: Modify the configure line to suit your build system here.
configure("--prefix=" + prefix)
@@ -301,42 +141,85 @@ Now Spack generates boilerplate code and opens the new
The tedious stuff (creating the class, checksumming archives) has been
done for you.
-All the things you still need to change are marked with ``FIXME``
-labels. The first ``FIXME`` refers to the commented instructions at
-the top of the file. You can delete these after reading them. The
-rest of them are as follows:
+.. note::
+
+ If ``spack create`` fails to download or to detect the package
+ version, you can use ``spack edit -f`` to generate simpler
+ boilerplate. See the next section for more on this.
+
+In the generated package, the download ``url`` attribute is already
+set. All the things you still need to change are marked with
+``FIXME`` labels. The first ``FIXME`` refers to the commented
+instructions at the top of the file. You can delete these
+instructions after reading them. The rest of them are as follows:
+
+ #. Add a description.
+
+ Immediately inside the package class is a *docstring* in
+ triple-quotes (``"""``). It's used to generate the description
+ shown when users run ``spack info``.
+
+ #. Change the ``homepage`` to a useful URL.
+
+ The ``homepage`` is displayed when users run ``spack info`` so
+ that they can learn about packages.
+
+ #. Add ``depends_on()`` calls for the package's dependencies.
+
+ ``depends_on`` tells Spack that other packages need to be built
+ and installed before this one. See `dependencies`_.
- #. Add a description in your package's docstring.
- #. Change the homepage to a useful URL (not ``example.com``).
#. Get the ``install()`` method working.
+ The ``install()`` method implements the logic to build a
+ package. The code should look familiar; it is designed to look
+ like a shell script. Specifics will differ depending on the package,
+ and :ref:`implementing the install method <install-method>` is
+ covered in detail later.
+
+Before going into details, we'll cover a few more basics.
+
+.. _spack-edit:
``spack edit``
~~~~~~~~~~~~~~~~~~~~
-Once you've created a package, you can go back and edit it using
-``spack edit``. For example, this:
+One of the easiest ways to learn to write packages is to look at
+existing ones. You can edit a package file by name with the ``spack
+edit`` command:
.. code-block:: sh
- spack edit libelf
+ spack edit cmake
+
+So, if you used ``spack create`` to create a package, then saved and
+closed the resulting file, you can get back to it with ``spack edit``.
+The ``cmake`` package actually lives in
+``$SPACK_ROOT/var/spack/packages/cmake/package.py``, but this provides
+a much simpler shortcut and saves you the trouble of typing the full
+path.
-will open ``$SPACK_ROOT/var/spack/packages/libelf/package.py`` in
-``$EDITOR``. If you try to edit a package that doesn't exist, Spack
-will recommend using ``spack create``:
+
+``spack edit -f``
+~~~~~~~~~~~~~~~~~~~~
+If you try to edit a package that doesn't exist, Spack will recommend
+using ``spack create``:
.. code-block:: sh
$ spack edit foo
==> Error: No package 'foo'. Use spack create, or supply -f/--force to edit a new file.
-And, finally, if you *really* want to skip all the automatic stuff
-that ``spack create`` does for you, then you can run ``spack edit
--f/--force``:
+As the output advises, you can use ``spack edit -f/--force`` to force
+the creation of a new, *very* simple boilerplate package:
+
+.. code-block:: sh
$ spack edit -f foo
-Which will generate a minimal package structure for you to fill in:
+Unlike ``spack create``, which tries to infer names and versions, and
+which actually downloads the tarball and checksums it for you, ``spack
+edit -f`` will substitute dummy values for you to fill in yourself:
.. code-block:: python
:linenos:
@@ -356,17 +239,188 @@ Which will generate a minimal package structure for you to fill in:
make()
make("install")
-This is useful when, e.g., Spack cannot figure out the name and
+This is useful when ``spack create`` cannot figure out the name and
version of your package from the archive URL.
+Naming & Directory Structure
+--------------------------------------
+
+This section describes how packages need to be named, and where they
+live in Spack's directory structure. In general, `spack-create`_ and
+`spack-edit`_ handle creating package files for you, so you can skip
+most of the details here.
+
+``var/spack/packages``
+~~~~~~~~~~~~~~~~~~~~~~~
+
+A Spack installation directory is structured like a standard UNIX
+install prefix (``bin``, ``lib``, ``include``, ``var``, ``opt``,
+etc.). Most of the code for Spack lives in ``$SPACK_ROOT/lib/spack``.
+Packages themselves live in ``$SPACK_ROOT/var/spack/packages``.
+
+If you ``cd`` to that directory, you will see directories for each
+package:
+
+.. command-output:: cd $SPACK_ROOT/var/spack/packages; ls -CF
+ :shell:
+
+Each directory contains a file called ``package.py``, which is where
+all the python code for the package goes. For example, the ``libelf``
+package lives in::
+
+ $SPACK_ROOT/var/spack/packages/libelf/package.py
+
+Alongside the ``package.py`` file, a package may contain extra
+directories or files (like patches) that it needs to build.
+
+
+Package Names
+~~~~~~~~~~~~~~~~~~
+
+Packages are named after the directory containing ``package.py``. So,
+``libelf``'s ``package.py`` lives in a directory called ``libelf``.
+The ``package.py`` file contains a class called ``Libelf``, which
+extends Spack's ``Package`` class. This is what makes it a Spack
+package:
+
+``var/spack/packages/libelf/package.py``
+
+.. code-block:: python
+ :linenos:
+
+ from spack import *
+
+ class Libelf(Package):
+ """ ... description ... """
+ homepage = ...
+ url = ...
+ version(...)
+ depends_on(...)
+
+ def install():
+ ...
+
+The **directory name** (``libelf``) is what users need to provide on
+the command line. e.g., if you type any of these:
+
+.. code-block:: sh
+
+ $ spack install libelf
+ $ spack install libelf@0.8.13
+
+Spack sees the package name in the spec and looks for
+``libelf/package.py`` in ``var/spack/packages``. Likewise, if you say
+``spack install docbook-xml``, then Spack looks for
+``docbook-xml/package.py``.
+
+Spack uses the directory name as the package name in order to give
+packagers more freedom in naming their packages. Package names can
+contain letters, numbers, dashes, and underscores. Using a Python
+identifier (e.g., a class name or a module name) would make it
+difficult to support these options. So, you can name a package
+``3proxy`` or ``_foo`` and Spack won't care. It just needs to see
+that name in the package spec.
+
+Package class names
+~~~~~~~~~~~~~~~~~~~~~~~
+
+Spack loads ``package.py`` files dynamically, and it needs to find a
+special class name in the file for the load to succeed. The **class
+name** (``Libelf`` in our example) is formed by converting words
+separated by ``-`` or ``_`` in the file name to camel case. If the name
+starts with a number, we prefix the class name with ``_``. Here are
+some examples:
+
+================= =================
+ Module Name Class Name
+================= =================
+ ``foo_bar`` ``FooBar``
+ ``docbook-xml`` ``DocbookXml``
+ ``FooBar`` ``Foobar``
+ ``3proxy`` ``_3proxy``
+================= =================
+
+In general, you won't have to remember this naming convention because
+`spack-create`_ and `spack-edit`_ will generate boilerplate for you,
+and you can just fill in the blanks.
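+
+If you're curious, the conversion is easy to picture in a few lines of
+Python. This is only an illustrative sketch (the ``mod_to_class``
+function below is hypothetical, not Spack's actual implementation):
+
+.. code-block:: python
+
+   import re
+
+   def mod_to_class(mod_name):
+       # Split on '-' and '_', then capitalize each word: foo_bar -> FooBar
+       name = ''.join(w.capitalize() for w in re.split(r'[-_]', mod_name))
+       # Class names can't start with a digit, so prefix one: 3proxy -> _3proxy
+       if name and name[0].isdigit():
+           name = '_' + name
+       return name
+
+   mod_to_class('docbook-xml')   # 'DocbookXml'
+   mod_to_class('3proxy')        # '_3proxy'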
+
+
+Adding new versions
+------------------------
+
+The most straightforward way to add new versions to your package is to
+add a line like this in the package class:
+
+.. code-block:: python
+ :linenos:
+
+ class Foo(Package):
+ url = 'http://example.com/foo-1.0.tar.gz'
+ version('8.2.1', '4136d7b4c04df68b686570afa26988ac')
+ ...
+
+Version URLs
+~~~~~~~~~~~~~~~~~
+
+By default, each version's URL is extrapolated from the ``url`` field
+in the package. For example, Spack is smart enough to download
+version ``8.2.1`` of the ``Foo`` package above from
+``http://example.com/foo-8.2.1.tar.gz``.
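+
+Conceptually, the extrapolation just locates the known version in the
+``url`` and substitutes the requested one. The snippet below is only a
+simplified sketch of that idea; Spack's real URL parsing handles many
+more URL layouts:
+
+.. code-block:: python
+
+   known_url     = 'http://example.com/foo-1.0.tar.gz'
+   known_version = '1.0'
+
+   # Substitute the requested version into the known URL.
+   new_url = known_url.replace(known_version, '8.2.1')
+   # -> 'http://example.com/foo-8.2.1.tar.gz'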
+
+If spack *cannot* extrapolate the URL from the ``url`` field, or if
+the package doesn't have a ``url`` field, you can add a URL explicitly
+for a particular version:
+
+.. code-block:: python
+
+ version('8.2.1', '4136d7b4c04df68b686570afa26988ac',
+ url='http://example.com/foo-8.2.1-special-version.tar.gz')
+
+For the URL above, you might have to add an explicit URL because the
+version can't simply be substituted in the original ``url`` to
+construct the new one for ``8.2.1``.
+
+When you supply a custom URL for a version, Spack uses that URL
+*verbatim* when fetching the version, and will *not* perform
+extrapolation.
+
+Checksums
+~~~~~~~~~~~~~~~~~
+
+Spack uses a checksum to ensure that the downloaded package version is
+not corrupted or compromised. This is especially important when
+fetching from insecure sources, like unencrypted http. By default, a
+package will *not* be installed if it doesn't pass a checksum test
+(though users can override this with ``spack install --no-checksum``).
+
+Spack can currently support checksums using the MD5, SHA-1, SHA-224,
+SHA-256, SHA-384, and SHA-512 algorithms.
+
+``spack md5``
+^^^^^^^^^^^^^^^^^^^^^^
+
+If you have a single file to checksum, you can use the ``spack md5``
+command to do it. Here's how you might download an archive and get a
+checksum for it:
+
+.. code-block:: sh
+
+ $ curl -O http://example.com/foo-8.2.1.tar.gz
+ $ spack md5 foo-8.2.1.tar.gz
+ 4136d7b4c04df68b686570afa26988ac foo-8.2.1.tar.gz
+
+Doing this for lots of files, or whenever a new package version is
+released, is tedious. See ``spack checksum`` below for an automated
+version of this process.
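+
+If you prefer, the same digest can be computed with a few lines of
+plain Python. This is just an illustration of what an MD5 checksum is;
+it is not how you normally interact with Spack:
+
+.. code-block:: python
+
+   import hashlib
+
+   md5 = hashlib.md5()
+   with open('foo-8.2.1.tar.gz', 'rb') as f:
+       # Read in chunks so large tarballs don't have to fit in memory.
+       for chunk in iter(lambda: f.read(65536), b''):
+           md5.update(chunk)
+   print(md5.hexdigest())   # e.g. 4136d7b4c04df68b686570afa26988ac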
+
.. _spack-checksum:
``spack checksum``
-~~~~~~~~~~~~~~~~~~~~~~
+^^^^^^^^^^^^^^^^^^^^^^
-If you've already created a package and you want to add more version
-checksums to it, this is automated with ``spack checksum``. Here's an
+If you want to add new versions to a package you've already created,
+this is automated with the ``spack checksum`` command. Here's an
example for ``libelf``:
.. code-block:: sh
@@ -387,10 +441,11 @@ example for ``libelf``:
How many would you like to checksum? (default is 5, q to abort)
-This does the same thing that ``spack create`` did, it just allows you
-to go back and create more checksums for an existing package. It
-fetches the tarballs you ask for and prints out a dict ready to copy
-and paste into your package file:
+This does the same thing that ``spack create`` does, but it allows you
+to go back and add new versions easily as you need them (e.g., as
+they're released). It fetches the tarballs you ask for and prints out
+a list of ``version`` commands ready to copy/paste into your package
+file:
.. code-block:: sh
@@ -400,59 +455,442 @@ and paste into your package file:
version('0.8.11', 'e931910b6d100f6caa32239849947fbf')
version('0.8.10', '9db4d36c283d9790d8fa7df1f4d7b4d9')
-You should be able to add these checksums directly to the versions
-field in your package.
+By default, Spack will search for new tarball downloads by scraping
+the parent directory of the tarball you gave it. So, if your tarball
+is at ``http://example.com/downloads/foo-1.0.tar.gz``, Spack will look
+in ``http://example.com/downloads/`` for links to additional versions.
+If you need to search another path for download links, see the
+reference documentation on `attribute_list_url`_ and
+`attribute_list_depth`_.
+
+.. note::
+
+ * This command assumes that Spack can extrapolate new URLs from an
+ existing URL in the package, and that Spack can find similar URLs
+ on a webpage. If that's not possible, you'll need to manually add
+ ``version`` calls yourself.
+
+ * For ``spack checksum`` to work, Spack needs to be able to
+ ``import`` your package in Python. That means it can't have any
+ syntax errors, or the ``import`` will fail. Use this once you've
+ got your package in working order.
+
+
+.. _vcs-fetch:
+
+Fetching from VCS Repositories
+--------------------------------------
+
+For some packages, source code is hosted in a Version Control System
+(VCS) repository rather than as a tarball. Packages can be set up to
+fetch from a repository instead of a tarball. Currently, Spack
+supports fetching with `Git <git-fetch_>`_, `Mercurial (hg)
+<hg-fetch_>`_, and `Subversion (SVN) <svn-fetch_>`_.
+
+To fetch a package from a source repository, you add a ``version()``
+call to your package with parameters indicating the repository URL and
+any branch, tag, or revision to fetch. See below for the parameters
+you'll need for each VCS system.
+
+Repositories and versions
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+The package author is responsible for coming up with a sensible name
+for each version. For example, if you're fetching from a tag like
+``v1.0``, you might call that ``1.0``. If you're fetching a nameless
+git commit or an older subversion revision, you might give the commit
+an intuitive name, like ``dev`` for a development version, or
+``some-fancy-new-feature`` if you want to be more specific.
+
+In general, it's recommended to fetch tags or particular
+commits/revisions, NOT branches or the repository mainline, as
+branches move forward over time and you aren't guaranteed to get the
+same thing every time you fetch a particular version. Life isn't
+simple, though, so this is not strictly enforced.
+
+In some future release, Spack may support extrapolating repository
+versions as it does for tarball URLs, but currently this is not
+supported.
+
+.. _git-fetch:
+
+Git
+~~~~~~~~~~~~~~~~~~~~
+
+Git fetching is enabled with the following parameters to ``version``:
+
+ * ``git``: URL of the git repository.
+ * ``tag``: name of a tag to fetch.
+ * ``branch``: name of a branch to fetch.
+ * ``commit``: SHA hash (or prefix) of a commit to fetch.
+
+Only one of ``tag``, ``branch``, or ``commit`` can be used at a time.
+
+Default branch
+ To fetch a repository's default branch:
+
+ .. code-block:: python
+
+ class Example(Package):
+ ...
+ version('dev', git='https://github.com/example-project/example.git')
+
+ This is not recommended, as the contents of the default branch
+ change over time.
+
+Tags
+ To fetch from a particular tag, use the ``tag`` parameter along with
+ ``git``:
+
+ .. code-block:: python
+
+ version('1.0.1', git='https://github.com/example-project/example.git',
+ tag='v1.0.1')
+
+Branches
+ To fetch a particular branch, use ``branch`` instead:
+
+ .. code-block:: python
+
+ version('experimental', git='https://github.com/example-project/example.git',
+ branch='experimental')
+
+ This is not recommended, as the contents of branches change over
+ time.
+
+Commits
+ Finally, to fetch a particular commit, use ``commit``:
+
+ .. code-block:: python
+
+ version('2014-10-08', git='https://github.com/example-project/example.git',
+ commit='9d38cd4e2c94c3cea97d0e2924814acc')
+
+ This doesn't have to be a full hash; you can abbreviate it as you'd
+ expect with git:
+
+ .. code-block:: python
+
+ version('2014-10-08', git='https://github.com/example-project/example.git',
+ commit='9d38cd')
+
+ It may be useful to provide a saner version for commits like this,
+ e.g. you might use the date as the version, as done above. Or you
+ could just use the abbreviated commit hash. It's up to the package
+ author to decide what makes the most sense.
+
+Installing
+^^^^^^^^^^^^^^
+
+You can fetch and install any of the versions above as you'd expect,
+by using ``@<version>`` in a spec:
+
+.. code-block:: sh
+
+ spack install example@2014-10-08
+
+Git and other VCS versions will show up in the list of versions when
+a user runs ``spack info <package name>``.
+
+
+.. _hg-fetch:
+
+Mercurial
+~~~~~~~~~~~~~~~~~~~~~~~~~
+
+Fetching with Mercurial works much like `git <git-fetch_>`_, but you
+use the ``hg`` parameter.
+
+Default
+ Add the ``hg`` parameter with no ``revision``:
+
+ .. code-block:: python
+
+ version('hg-head', hg='https://jay.grs.rwth-aachen.de/hg/example')
+
+ Note that this is not recommended; try to fetch a particular
+ revision instead.
+
+Revisions
+ Add ``hg`` and ``revision`` parameters:
+
+ .. code-block:: python
+
+ version('1.0', hg='https://jay.grs.rwth-aachen.de/hg/example',
+ revision='v1.0')
+
+ Unlike ``git``, which has special parameters for different types of
+ revisions, you can use ``revision`` for branches, tags, and commits
+ when you fetch with Mercurial.
+
+As with git, you can fetch these versions using the ``spack install
+example@<version>`` command-line syntax.
+
+.. _svn-fetch:
+
+Subversion
+~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+To fetch with subversion, use the ``svn`` and ``revision`` parameters:
+
+Head
+ Simply add an ``svn`` parameter to ``version``:
+
+ .. code-block:: python
+
+ version('svn-head', svn='https://outreach.scidac.gov/svn/libmonitor/trunk')
+
+ This is not recommended, as the head will move forward over time.
+
+Revisions
+ To fetch a particular revision, add a ``revision`` to the
+ version call:
+
+ .. code-block:: python
+
+ version('svn-head', svn='https://outreach.scidac.gov/svn/libmonitor/trunk',
+ revision=128)
+
+Subversion branches are handled as part of the directory structure, so
+you can check out a branch or tag by changing the ``url``.
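+
+For example, with the common ``trunk``/``branches``/``tags`` layout,
+you might point different versions at different paths (the repository
+URL below is made up for illustration):
+
+.. code-block:: python
+
+   # A released tag.
+   version('1.0', svn='https://example.com/svn/project/tags/v1.0')
+
+   # A development branch (not recommended; branches move over time).
+   version('develop', svn='https://example.com/svn/project/branches/develop')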
+
+.. _patching:
+
+Patches
+------------------------------------------
+
+Depending on the host architecture, package version, known bugs, or
+other issues, you may need to patch your software to get it to build
+correctly. Like many other package systems, spack allows you to store
+patches alongside your package files and apply them to source code
+after it's downloaded.
+
+``patch``
+~~~~~~~~~~~~~~~~~~~~~
+
+You can specify patches in your package file with the ``patch()``
+function. ``patch`` looks like this:
+
+.. code-block:: python
+
+ class Mvapich2(Package):
+ ...
+ patch('ad_lustre_rwcontig_open_source.patch', when='@1.9:')
+
+The first argument can be either a URL or a filename. It specifies a
+patch file that should be applied to your source. If the patch you
+supply is a filename, then the patch needs to live within the spack
+source tree. For example, the patch above lives in a directory
+structure like this::
+
+ $SPACK_ROOT/var/spack/packages/
+ mvapich2/
+ package.py
+ ad_lustre_rwcontig_open_source.patch
+
+If you supply a URL instead of a filename, the patch will be fetched
+from the URL and then applied to your source code.
+
+.. warning::
+
+ It is generally better to use a filename rather than a URL for your
+ patch. Patches fetched from URLs are not currently checksummed,
+ and adding checksums for them is tedious for the package builder.
+ File patches go into the spack repository, which gives you git's
+ integrity guarantees. URL patches may be removed in a future spack
+ version.
+
+``patch`` can take two optional keyword arguments. They are:
+
+``when``
+ If supplied, this is a spec that tells spack when to apply
+ the patch. If the installed package spec matches this spec, the
+ patch will be applied. In our example above, the patch is applied
+ when mvapich is at version ``1.9`` or higher.
+
+``level``
+ This tells spack how to run the ``patch`` command. By default,
+ the level is 1 and spack runs ``patch -p1``. If level is 2,
+ spack will run ``patch -p2``, and so on.
+
+ A lot of people are confused by level, so here's a primer. If you
+ look in your patch file, you may see something like this:
-Note that for ``spack checksum`` to work, Spack needs to be able to
-``import`` your pacakge in Python. That means it can't have any
-syntax errors, or the ``import`` will fail. Use this once you've got
-your package in working order.
+ .. code-block:: diff
+ :linenos:
+ --- a/src/mpi/romio/adio/ad_lustre/ad_lustre_rwcontig.c 2013-12-10 12:05:44.806417000 -0800
+ +++ b/src/mpi/romio/adio/ad_lustre/ad_lustre_rwcontig.c 2013-12-10 11:53:03.295622000 -0800
+ @@ -8,7 +8,7 @@
+ * Copyright (C) 2008 Sun Microsystems, Lustre group
+ */
-Optional Package Attributes
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ -#define _XOPEN_SOURCE 600
+ +//#define _XOPEN_SOURCE 600
+ #include <stdlib.h>
+ #include <malloc.h>
+ #include "ad_lustre.h"
-In addition to ``homepage``, ``url``, and ``versions``, there are some
-other useful attributes you can add to your package file.
+ Lines 1-2 show paths with synthetic ``a/`` and ``b/`` prefixes. These
+ are placeholders for the two ``mvapich2`` source directories that
+ ``diff`` compared when it created the patch file. This is git's
+ default behavior when creating patch files, but other programs may
+ behave differently.
+
+ ``-p1`` strips off the first level of the prefix in both paths,
+ allowing the patch to be applied from the root of an expanded mvapich2
+ archive. If you set level to ``2``, it would strip off ``src``, and
+ so on.
+
+ It's generally easier to just structure your patch file so that it
+ applies cleanly with ``-p1``, but if you're using a patch you didn't
+ create yourself, ``level`` can be handy.
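+
+For example, a patch whose paths contain two leading directory
+components could be declared with ``level=2`` (the patch file name
+below is hypothetical):
+
+.. code-block:: python
+
+   # Runs `patch -p2`, and only when the spec is at version 2.0 or higher.
+   patch('fix-build.patch', level=2, when='@2.0:')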
+
+
+Finding Package Downloads
+----------------------------
+
+We've already seen the ``homepage`` and ``url`` package attributes:
+
+.. code-block:: python
+ :linenos:
+
+ from spack import *
+
+ class Mpich(Package):
+ """MPICH is a high performance and widely portable implementation of
+ the Message Passing Interface (MPI) standard."""
+ homepage = "http://www.mpich.org"
+ url = "http://www.mpich.org/static/downloads/3.0.4/mpich-3.0.4.tar.gz"
+
+These are class-level attributes used by Spack to show users
+information about the package, and to determine where to download its
+source code.
+
+Spack uses the tarball URL to extrapolate where to find other tarballs
+of the same package (e.g. in `spack-checksum`_), but this does not
+always work. This section covers ways you can tell Spack to find
+tarballs elsewhere.
+
+.. _attribute_list_url:
``list_url``
-^^^^^^^^^^^^^^^^^^
+~~~~~~~~~~~~~~~~~~~~~
-When spack tries to find available versions of packages (e.g. in
-``spack checksum``), by default it looks in the parent directory of
-the tarball in the package's ``url``. For example, for libelf, the
-url is:
+When spack tries to find available versions of packages (e.g. with
+`spack-checksum`_), it spiders the parent directory of the tarball in
+the ``url`` attribute. For example, for libelf, the url is:
-.. literalinclude:: ../../../var/spack/packages/libelf/package.py
- :start-after: homepage
- :end-before: versions
+.. code-block:: python
+
+ url = "http://www.mr511.de/software/libelf-0.8.13.tar.gz"
+
+Spack spiders ``http://www.mr511.de/software/`` to find similar
+tarball links and ultimately to make a list of available versions of
+``libelf``.
-Spack will try to fetch the URL ``http://www.mr511.de/software/``,
-scrape the page, and use any links that look like the tarball URL to
-find other available versions. For many packages, the tarball's
-parent directory may be unlistable, or it may not contain any links to
-source code archives. For these, you can specify a separate
-``list_url`` indicating the page to search for tarballs. For example,
-``libdwarf`` has the homepage as the ``list_url``:
+For many packages, the tarball's parent directory may be unlistable,
+or it may not contain any links to source code archives. In fact,
+additional package downloads often aren't even available in the same
+directory as the download URL.
-.. literalinclude:: ../../../var/spack/packages/libdwarf/package.py
- :start-after: Libdwarf
- :end-before: versions
+For these, you can specify a separate ``list_url`` indicating the page
+to search for tarballs. For example, ``libdwarf`` has the homepage as
+the ``list_url``, because that is where links to old versions are:
+
+.. code-block:: python
+ :linenos:
+
+ class Libdwarf(Package):
+ homepage = "http://www.prevanders.net/dwarf.html"
+ url = "http://www.prevanders.net/libdwarf-20130729.tar.gz"
+ list_url = homepage
+
+.. _attribute_list_depth:
``list_depth``
-^^^^^^^^^^^^^^^^^^^^
-
-Some packages may not have a listing of available verisons on a single
-page. For these, you can specify a ``list_depth`` indicating that
-Spack should follow links from the ``list_url`` up to a particular
-depth. Spack will follow links and search each page reachable from
-the ``list_url`` for tarball links. For example, ``mpich`` archives
-are stored in a directory tree of versions, so the package looks like
-this:
+~~~~~~~~~~~~~~~~~~~~~
-.. literalinclude:: ../../../var/spack/packages/mpich/package.py
- :start-after: homepage
- :end-before: versions
+``libdwarf`` and many other packages have a listing of available
+versions on a single webpage, but not all do. For example, ``mpich``
+has a tarball URL that looks like this::
+
+ url = "http://www.mpich.org/static/downloads/3.0.4/mpich-3.0.4.tar.gz"
+
+But its downloads are in many different subdirectories of
+``http://www.mpich.org/static/downloads/``. So, we need to add a
+``list_url`` *and* a ``list_depth`` attribute:
+
+.. code-block:: python
+ :linenos:
+
+ class Mpich(Package):
+ homepage = "http://www.mpich.org"
+ url = "http://www.mpich.org/static/downloads/3.0.4/mpich-3.0.4.tar.gz"
+ list_url = "http://www.mpich.org/static/downloads/"
+ list_depth = 2
+
+By default, Spack only looks at the top-level page available at
+``list_url``. ``list_depth`` tells it to follow up to 2 levels of
+links from the top-level page. Note that here, this implies two
+levels of subdirectories, as the ``mpich`` website is structured much
+like a filesystem. But ``list_depth`` really refers to link depth
+when spidering the page.
+
+.. _attribute_parallel:
+
+Parallel Builds
+------------------
+
+By default, Spack will invoke ``make()`` with a ``-j <njobs>``
+argument, so that builds run in parallel. It figures out how many
+jobs to run by determining how many cores are on the host machine.
+Specifically, it uses the number of CPUs reported by Python's
+`multiprocessing.cpu_count()
+<http://docs.python.org/library/multiprocessing.html#multiprocessing.cpu_count>`_.
+
+If a package does not build properly in parallel, you can override
+this setting by adding ``parallel = False`` to your package. For
+example, OpenSSL's build does not work in parallel, so its package
+looks like this:
+
+.. code-block:: python
+ :emphasize-lines: 8
+ :linenos:
+
+ class Openssl(Package):
+ homepage = "http://www.openssl.org"
+ url = "http://www.openssl.org/source/openssl-1.0.1h.tar.gz"
+
+ version('1.0.1h', '8d6d684a9430d5cc98a62a5d8fbda8cf')
+ depends_on("zlib")
+
+ parallel = False
+
+Similarly, you can disable parallel builds only for specific make
+commands, as ``libelf`` does:
+
+.. code-block:: python
+ :emphasize-lines: 9, 12
+ :linenos:
+
+ class Libelf(Package):
+ ...
+
+ def install(self, spec, prefix):
+ configure("--prefix=" + prefix,
+ "--enable-shared",
+ "--disable-dependency-tracking",
+ "--disable-debug")
+ make()
+
+ # The mkdir commands in libelf's install can fail in parallel
+ make("install", parallel=False)
+
+The first make will run in parallel here, but the second will not. If
+you set ``parallel`` to ``False`` at the package level, then each call
+to ``make()`` will be sequential by default, but packagers can call
+``make(parallel=True)`` to override it.
.. _dependencies:
@@ -460,26 +898,36 @@ this:
Dependencies
------------------------------
-We've now covered how to build a simple package, but what if one
-package relies on another package to build? How do you express that
-in a package file? And how do you refer to the other package in the
-build script for your own package?
+We've covered how to build a simple package, but what if one package
+relies on another package to build? How do you express that in a
+package file? And how do you refer to the other package in the build
+script for your own package?
Spack makes this relatively easy. Let's take a look at the
``libdwarf`` package to see how it's done:
-.. literalinclude:: ../../../var/spack/packages/libdwarf/package.py
+.. code-block:: python
+ :emphasize-lines: 9
:linenos:
- :start-after: dwarf_dirs
- :end-before: def clean
- :emphasize-lines: 10
- :append: ...
-``depends_on``
+ class Libdwarf(Package):
+ homepage = "http://www.prevanders.net/dwarf.html"
+ url = "http://www.prevanders.net/libdwarf-20130729.tar.gz"
+ list_url = homepage
+
+ version('20130729', '4cc5e48693f7b93b7aa0261e63c0e21d')
+ ...
+
+ depends_on("libelf")
+
+ def install(self, spec, prefix):
+ ...
+
+``depends_on()``
~~~~~~~~~~~~~~~~~~~~~
-The ``depends_on('libelf')`` call on line 10 tells Spack that it needs
-to build and install the ``libelf`` package before it builds
+The highlighted ``depends_on('libelf')`` call tells Spack that it
+needs to build and install the ``libelf`` package before it builds
``libdwarf``. This means that in your ``install()`` method, you are
guaranteed that ``libelf`` has been built and installed successfully,
so you can rely on it for your libdwarf build.
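+
+As a hypothetical sketch, ``libdwarf``'s ``install()`` might use the
+dependency like this. It assumes the dependency's install prefix is
+available as ``spec['libelf'].prefix``, and the ``--with-libelf``
+configure flag is purely illustrative:
+
+.. code-block:: python
+
+   def install(self, spec, prefix):
+       # libelf is guaranteed to be installed before this runs, so its
+       # prefix can be handed to libdwarf's build.
+       configure("--prefix=" + prefix,
+                 "--with-libelf=" + spec['libelf'].prefix)   # illustrative flag
+       make()
+       make("install")
+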
@@ -488,28 +936,37 @@ Dependency specs
~~~~~~~~~~~~~~~~~~~~~~
``depends_on`` doesn't just take the name of another package. It
-actually takes a full spec. This means that you can restrict the
-versions or other configuration options of ``libelf`` that
-``libdwarf`` will build with. Here's an example. Suppose that in the
-``libdwarf`` package you wrote:
+takes a full spec. This means that you can restrict the versions or
+other configuration options of ``libelf`` that ``libdwarf`` will build
+with. Here's an example. Suppose that in the ``libdwarf`` package
+you write:
.. code-block:: python
depends_on("libelf@0.8:")
-Now ``libdwarf`` will only ever build with ``libelf`` version ``0.8``
-or higher. If some versions of ``libelf`` are installed but they are
-all older than this, then Spack will build a new version of ``libelf``
-that satisfies the spec's version constraint, and it will build
-``libdwarf`` with that one. You could just as easily provide a
-version range (e.g., ``0.8.2:0.8.4``) or a variant constraint
-(e.g.. ``+debug``) to control how dependencies should be built.
+Now ``libdwarf`` will require ``libelf`` version ``0.8`` or higher in
+order to build. If some versions of ``libelf`` are
+installed but they are all older than this, then Spack will build a
+new version of ``libelf`` that satisfies the spec's version
+constraint, and it will build ``libdwarf`` with that one. You could
+just as easily provide a version range:
+
+.. code-block:: python
-Note that both users and package authors can use the same spec syntax
-to refer to different package configurations. Users use the spec
-syntax on the command line to find installed packages or to install
-packages with particular constraints, and package authors can use it
-to describe relationships between packages.
+ depends_on("libelf@0.8.2:0.8.4:")
+
+Or a requirement for a particular variant:
+
+.. code-block:: python
+
+ depends_on("libelf@0.8+debug")
+
+Both users *and* package authors can use the same spec syntax to refer
+to different package configurations. Users use the spec syntax on the
+command line to find installed packages or to install packages with
+particular constraints, and package authors can use specs to describe
+relationships between packages.
.. _virtual-dependencies:
@@ -519,39 +976,48 @@ Virtual dependencies
In some cases, more than one package can satisfy another package's
dependency. One way this can happen is if a package depends on a
particular *interface*, but there are multiple *implementations* of
-the interface, and the package could be built with either. A *very*
-common interface in HPC is the `Message Passing Interface (MPI)
+the interface, and the package could be built with any of them. A
+*very* common interface in HPC is the `Message Passing Interface (MPI)
<http://www.mcs.anl.gov/research/projects/mpi/>`_, which is used in
many large-scale parallel applications.
MPI has several different implementations (e.g., `MPICH
<http://www.mpich.org>`_, `OpenMPI <http://www.open-mpi.org>`_, and
`MVAPICH <http://mvapich.cse.ohio-state.edu>`_) and scientific
-applicaitons can be built with any one of these. Complicating
-matters, MPI does not have a standardized ABI, so a package built with
-one implementation cannot be relinked with another implementation.
+applications can be built with any one of them. Complicating matters,
+MPI does not have a standardized ABI, so a package built with one
+implementation cannot simply be relinked with another implementation.
Many package managers handle interfaces like this by requiring many
-similar pacakge files, e.g., ``foo``, ``foo-mvapich``, ``foo-mpich``,
+similar package files, e.g., ``foo``, ``foo-mvapich``, ``foo-mpich``,
but Spack avoids this explosion of package files by providing support
for *virtual dependencies*.
-
``provides``
~~~~~~~~~~~~~~~~~~~~~
-In Spack, ``mpi`` is a *virtual package*. A package can depend on it
-just like any other package, by supplying a ``depends_on`` call in the
-package definition. In ``mpileaks``, this looks like so:
+In Spack, ``mpi`` is handled as a *virtual package*. A package like
+``mpileaks`` can depend on it just like any other package, by
+supplying a ``depends_on`` call in the package definition. For example:
-.. literalinclude:: ../../../var/spack/packages/mpileaks/package.py
- :start-after: url
- :end-before: install
+.. code-block:: python
+ :linenos:
+ :emphasize-lines: 7
+
+ class Mpileaks(Package):
+ homepage = "https://github.com/hpc/mpileaks"
+ url = "https://github.com/hpc/mpileaks/releases/download/v1.0/mpileaks-1.0.tar.gz"
+
+ version('1.0', '8838c574b39202a57d7c2d68692718aa')
+
+ depends_on("mpi")
+ depends_on("adept-utils")
+ depends_on("callpath")
-Here, ``callpath`` is an actual pacakge, but there is no package file
-for ``mpi``, so we say it is a *virtual* package. The syntax of
-``depends_on``, however, is the same for both.. If we look inside the
-package file of an MPI implementation, say MPICH, we'll see something
-like this:
+Here, ``callpath`` and ``adept-utils`` are concrete packages, but
+there is no actual package file for ``mpi``, so we say it is a
+*virtual* package. The syntax of ``depends_on`` is the same for
+both. If we look inside the package file of an MPI implementation,
+say MPICH, we'll see something like this:
.. code-block:: python
@@ -560,24 +1026,30 @@ like this:
...
The ``provides("mpi")`` call tells Spack that the ``mpich`` package
-can be substituted whenever a package says it depends on ``mpi``.
+can be used to satisfy the dependency of any package that
+``depends_on('mpi')``.
-Just as you can pass a spec to ``depends_on``, you can pass a spec to
-``provides`` to add constraints. This allows Spack to support the
+Versioned Interfaces
+~~~~~~~~~~~~~~~~~~~~~~
+
+Just as you can pass a spec to ``depends_on``, so can you pass a spec
+to ``provides`` to add constraints. This allows Spack to support the
notion of *versioned interfaces*. The MPI standard has gone through
-many revisions, each with new functions added. Some packages may
-require a recent implementation that supports MPI-3 fuctions, but some
-MPI versions may only provide up to MPI-2. You can indicate this by
-adding a version constraint to the spec passed to ``provides``:
+many revisions, each with new functions added, and each revision of
+the standard has a version number. Some packages may require a recent
+implementation that supports MPI-3 functions, but some MPI versions may
+only provide up to MPI-2. Others may need MPI 2.1 or higher. You can
+indicate this by adding a version constraint to the spec passed to
+``provides``:
.. code-block:: python
provides("mpi@:2")
-Suppose that the above restriction is in the ``mpich2`` package. This
-says that ``mpich2`` provides MPI support *up to* version 2, but if a
-package ``depends_on("mpi@3")``, then Spack will *not* build with ``mpich2``
-for the MPI implementation.
+Suppose that the above ``provides`` call is in the ``mpich2`` package.
+This says that ``mpich2`` provides MPI support *up to* version 2, but
+if a package ``depends_on("mpi@3")``, then Spack will *not* build that
+package with ``mpich2``.
``provides when``
~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -585,28 +1057,34 @@ for the MPI implementation.
The same package may provide different versions of an interface
depending on *its* version. Above, we simplified the ``provides``
call in ``mpich`` to make the explanation easier. In reality, this is
-how ``mpich`` declares the virtual packages it provides:
+how ``mpich`` calls ``provides``:
.. code-block:: python
provides('mpi@:3', when='@3:')
provides('mpi@:1', when='@1:')
-The ``when`` argument to ``provides`` (a `keyword argument
-<http://docs.python.org/2/tutorial/controlflow.html#keyword-arguments>`_
-for those not familiar with Python) allows you to specify optional
-constraints on the *calling* package. The calling package will only
-provide the declared virtual spec when *it* matches the constraints in
-the when clause. Here, when ``mpich`` is at version 3 or higher, it
-provides MPI up to version 3. When ``mpich`` is at version 1 or higher,
-it provides the MPI virtual pacakge at version 1.
-
-The ``when`` qualifier will ensure that Spack selects a suitably high
-version of ``mpich`` to match another package that ``depends_on`` a
-particular version of MPI. It will also prevent a user from building
-with too low a version of ``mpich``. For example, suppose the package
-``foo`` declares that it ``depends_on('mpi@2')``, and a user invokes
-``spack install`` like this:
+The ``when`` argument to ``provides`` allows you to specify optional
+constraints on the *providing* package, or the *provider*. The
+provider only provides the declared virtual spec when *it* matches
+the constraints in the when clause. Here, when ``mpich`` is at
+version 3 or higher, it provides MPI up to version 3. When ``mpich``
+is at version 1 or higher, it provides the MPI virtual package at
+version 1.
+
+The ``when`` qualifier ensures that Spack selects a suitably high
+version of ``mpich`` to satisfy some other package that ``depends_on``
+a particular version of MPI. It will also prevent a user from
+building with too low a version of ``mpich``. For example, suppose
+the package ``foo`` declares this:
+
+.. code-block:: python
+
+ class Foo(Package):
+ ...
+ depends_on('mpi@2')
+
+Suppose a user invokes ``spack install`` like this:
.. code-block:: sh
@@ -653,6 +1131,18 @@ DAG, based on the constraints above::
^libelf@0.8.11
^mpi
+
+.. graphviz::
+
+ digraph {
+ mpileaks -> mpi
+ mpileaks -> "callpath@1.0+debug" -> mpi
+ "callpath@1.0+debug" -> dyninst
+ dyninst -> libdwarf -> "libelf@0.8.11"
+ dyninst -> "libelf@0.8.11"
+ }
+
+
This diagram shows a spec DAG output as a tree, where successive
levels of indentation represent a depends-on relationship. In the
above DAG, we can see some packages annotated with their constraints,
@@ -671,12 +1161,22 @@ the user runs ``spack install`` and the time the ``install()`` method
is called. The concretized version of the spec above might look like
this::
- mpileaks@2.3%gcc@4.7.3=macosx_10.8_x86_64
- ^callpath@1.0%gcc@4.7.3+debug=macosx_10.8_x86_64
- ^dyninst@8.1.2%gcc@4.7.3=macosx_10.8_x86_64
- ^libdwarf@20130729%gcc@4.7.3=macosx_10.8_x86_64
- ^libelf@0.8.11%gcc@4.7.3=macosx_10.8_x86_64
- ^mpich@3.0.4%gcc@4.7.3=macosx_10.8_x86_64
+ mpileaks@2.3%gcc@4.7.3=linux-ppc64
+ ^callpath@1.0%gcc@4.7.3+debug=linux-ppc64
+ ^dyninst@8.1.2%gcc@4.7.3=linux-ppc64
+ ^libdwarf@20130729%gcc@4.7.3=linux-ppc64
+ ^libelf@0.8.11%gcc@4.7.3=linux-ppc64
+ ^mpich@3.0.4%gcc@4.7.3=linux-ppc64
+
+.. graphviz::
+
+ digraph {
+ "mpileaks@2.3\n%gcc@4.7.3\n=linux-ppc64" -> "mpich@3.0.4\n%gcc@4.7.3\n=linux-ppc64"
+ "mpileaks@2.3\n%gcc@4.7.3\n=linux-ppc64" -> "callpath@1.0\n%gcc@4.7.3+debug\n=linux-ppc64" -> "mpich@3.0.4\n%gcc@4.7.3\n=linux-ppc64"
+ "callpath@1.0\n%gcc@4.7.3+debug\n=linux-ppc64" -> "dyninst@8.1.2\n%gcc@4.7.3\n=linux-ppc64"
+ "dyninst@8.1.2\n%gcc@4.7.3\n=linux-ppc64" -> "libdwarf@20130729\n%gcc@4.7.3\n=linux-ppc64" -> "libelf@0.8.11\n%gcc@4.7.3\n=linux-ppc64"
+ "dyninst@8.1.2\n%gcc@4.7.3\n=linux-ppc64" -> "libelf@0.8.11\n%gcc@4.7.3\n=linux-ppc64"
+ }
Here, all versions, compilers, and platforms are filled in, and there
is a single version (no version ranges) for each package. All
@@ -703,186 +1203,39 @@ running ``spack spec``. For example:
^libdwarf
^libelf
- dyninst@8.0.1%gcc@4.7.3=macosx_10.8_x86_64
- ^libdwarf@20130729%gcc@4.7.3=macosx_10.8_x86_64
- ^libelf@0.8.13%gcc@4.7.3=macosx_10.8_x86_64
-
-
-.. _install-environment:
-
-Install environment
---------------------------
-
-In general, you should not have to do much differently in your install
-method than you would when installing a pacakge on the command line.
-Spack tries to set environment variables and modify compiler calls so
-that it *appears* to the build system that you're building with a
-standard system install of everything. Obviously that's not going to
-cover *all* build systems, but it should make it easy to port packages
-that use standard build systems to Spack.
-
-There are a couple of things that Spack does that help with this:
-
-
-Compiler interceptors
-~~~~~~~~~~~~~~~~~~~~~~~~~
-
-Spack intercepts the compiler calls that your build makes. If your
-build invokes ``cc``, then Spack intercepts the ``cc`` call with its
-own wrapper script, and it inserts ``-I``, ``-L``, and ``-Wl,-rpath``
-options for all dependencies before invoking the actual compiler.
+ dyninst@8.0.1%gcc@4.7.3=linux-ppc64
+ ^libdwarf@20130729%gcc@4.7.3=linux-ppc64
+ ^libelf@0.8.13%gcc@4.7.3=linux-ppc64
-An example of this would be the ``libdwarf`` build, which has one
-dependency: ``libelf``. Every call to ``cc`` in the ``libdwarf``
-build will have ``-I$LIBELF_PREFIX/include``,
-``-L$LIBELF_PREFIX/lib``, and ``-Wl,-rpath=$LIBELF_PREFIX/lib``
-inserted on the command line. This is done transparently to the
-project's build system, which will just think it's using a system
-where ``libelf`` is readily available. Because of this, you **do
-not** have to insert extra ``-I``, ``-L``, etc. on the command line.
-
-An exmaple of this is the ``libdwarf`` package. You'll notice that it
-never mentions ``libelf`` outside of the ``depends_on('libelf')``
-call, but it still manages to find its dependency library and build.
-This is due to Spack's compiler interceptors.
-
-
-
-Environment variables
-~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-Spack sets a number of standard environment variables so that build
-systems use its compiler wrappers for their builds. The standard
-enviroment variables are:
-
- ======================= =============================
- Variable Purpose
- ======================= =============================
- ``CC`` C compiler
- ``CXX`` C++ compiler
- ``F77`` Fortran 77 compiler
- ``FC`` Fortran 90 and above compiler
- ``CMAKE_PREFIX_PATH`` Path to dependency prefixes for CMake
- ======================= =============================
-
-All of these are standard variables respected by most build systems,
-so if your project uses something like ``autotools`` or ``CMake``,
-then it should pick them up automatically when you run ``configure``
-or ``cmake`` in your ``install()`` function. Many traditional builds
-using GNU Make and BSD make also respect these variables, so they may
-work with these systems, as well.
-
-If your build systm does *not* pick these variables up from the
-environment automatically, then you can simply pass them on the
-command line or use a patch as part of your build process to get the
-correct compilers into the project's build system.
-
-
-Forked process
-~~~~~~~~~~~~~~~~~~~~~
+This is useful when you want to know exactly what Spack will do when
+you ask for a particular spec.
-To give packages free reign over how they install things, how they
-modify the environemnt, and how they use Spack's internal APIs, we
-fork a new process each time we invoke ``install()``. This allows
-packages to have their own completely sandboxed build environment,
-without impacting other jobs that the main Spack process runs.
-.. _patching:
+.. _install-method:
-Patches
+Implementing the ``install`` method
------------------------------------------
-Depending on the host architecture, package version, known bugs, or
-other issues, you may need to patch your software to get it to build
-correctly. Like many other package systems, spack allows you to store
-patches alongside your package files and apply them to source code
-after it's downloaded.
+The last element of a package is its ``install()`` method. This is
+where the real work of installation happens, and it's the main part of
+the package you'll need to customize for each piece of software.
-``patch``
-~~~~~~~~~~~~~~~~~~~~~
-
-You can specify patches in your package file with the ``patch()``
-function. ``patch`` looks like this:
-
-.. code-block:: python
-
- class Mvapich2(Package):
- ...
- patch('ad_lustre_rwcontig_open_source.patch', when='@1.9:')
-
-The first argument can be either a URL or a filename. It specifies a
-patch file that should be applied to your source. If the patch you
-supply is a filename, then the patch needs to live within the spack
-source tree. For example, the patch above lives in a directory
-structure like this::
-
- $SPACK_ROOT/var/spack/packages/
- mvapich2/
- package.py
- ad_lustre_rwcontig_open_source.patch
-
-If you supply a URL instead of a filename, the patch will be fetched
-from the URL and then applied to your source code.
-
-.. warning::
-
- It is generally better to use a filename rather than a URL for your
- patch. Patches fetched from URLs are not currently checksummed,
- and adding checksums for them is tedious for the package builder.
- File patches go into the spack repository, which gives you git's
- integrity guarantees. URL patches may be removed in a future spack
- version.
-
-``patch`` can take two options keyword arguments. They are:
-
-``when``
- If supplied, this is a spec that tells spack when to apply
- the patch. If the installed package spec matches this spec, the
- patch will be applied. In our example above, the patch is applied
- when mvapich is at version ``1.9`` or higher.
-
-``level``
- This tells spack how to run the ``patch`` command. By default,
- the level is 1 and spack runs ``patch -p1``. If level is 2,
- spack will run ``patch -p2``, and so on.
-
-A lot of people are confused by level, so here's a primer. If you
-look in your patch file, you may see something like this:
-
-.. code-block:: diff
-
- --- a/src/mpi/romio/adio/ad_lustre/ad_lustre_rwcontig.c 2013-12-10 12:05:44.806417000 -0800
- +++ b/src/mpi/romio/adio/ad_lustre/ad_lustre_rwcontig.c 2013-12-10 11:53:03.295622000 -0800
- @@ -8,7 +8,7 @@
- * Copyright (C) 2008 Sun Microsystems, Lustre group
- */
-
- -#define _XOPEN_SOURCE 600
- +//#define _XOPEN_SOURCE 600
- #include <stdlib.h>
- #include <malloc.h>
- #include "ad_lustre.h"
-
-The first two lines show paths with synthetic ``a/`` and ``b/``
-prefixes. These are placeholders for the two ``mvapich2`` source
-directories that ``diff`` compared when it created the patch file.
-This is git's default behavior when creating patch files, but other
-programs may behave differently.
+.. literalinclude:: ../../../var/spack/packages/libelf/package.py
+ :start-after: 0.8.12
+ :linenos:
-``-p1`` strips off the first level of the prefix in both paths,
-allowing the patch to be applied from the root of an expanded mvapich2
-archive. If you set level to ``2``, it would strip off ``src``, and
-so on.
+``install`` takes a ``spec`` (a description of how the package should
+be built) and a ``prefix`` (the path to the directory where the
+software should be installed).
-It's generally easier to just structure your patch file so that it
-applies cleanly with ``-p1``, but if you're using a URL to a patch you
-didn't create yourself, ``level`` can be handy.
+Spack provides wrapper functions for ``configure`` and ``make`` so
+that you can call them in a similar way to how you'd call a shell
+command. In reality, these are Python functions, provided to make
+writing packages more natural. See the section
+on :ref:`shell wrappers <shell-wrappers>`.
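+
+For example, a typical autotools-based ``install()`` might look like
+the following minimal sketch (the exact configure arguments are
+hypothetical and depend on the package being built):
+
+.. code-block:: python
+
+   def install(self, spec, prefix):
+       # configure and make are the Spack-provided wrappers described
+       # above; they run the corresponding shell commands.
+       configure("--prefix=" + prefix)
+       make()
+       make("install")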
-.. _install-method:
-Implementing the ``install`` method
-------------------------------------------
Now that the metadata is out of the way, we can move on to the
``install()`` method. When a user runs ``spack install``, Spack
@@ -931,6 +1284,153 @@ on the version, compiler, dependencies, etc. that your package is
built with. These parameters give you access to this type of
information.
+.. _install-environment:
+
+The Install environment
+--------------------------
+
+In general, you should not have to do much differently in your install
+method than you would when installing a package on the command line.
+In fact, you may need to do *less* than you would on the command line.
+
+Spack tries to set environment variables and modify compiler calls so
+that it *appears* to the build system that you're building with a
+standard system install of everything. Obviously that's not going to
+cover *all* build systems, but it should make it easy to port packages
+to Spack if they use a standard build system. Builds that use
+autotools or CMake are usually straightforward, while builds that use
+custom Makefiles may need extra logic to modify the makefiles.
+
+The remainder of the section covers the way Spack's build environment
+works.
+
+Environment variables
+~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+Spack sets a number of standard environment variables that serve two
+purposes:
+
+ #. Make build systems use Spack's compiler wrappers for their builds.
+ #. Allow build systems to find dependencies more easily
+
+The compiler environment variables that Spack sets are:
+
+ ============ ===============================
+ Variable Purpose
+ ============ ===============================
+ ``CC`` C compiler
+ ``CXX`` C++ compiler
+ ``F77`` Fortran 77 compiler
+ ``FC`` Fortran 90 and above compiler
+ ============ ===============================
+
+All of these are standard variables respected by most build systems.
+If your project uses ``autotools`` or ``CMake``, then it should pick
+them up automatically when you run ``configure`` or ``cmake`` in the
+``install()`` function. Many traditional builds using GNU Make and
+BSD make also respect these variables, so they may work with these
+systems.
+
+If your build system does *not* automatically pick these variables up
+from the environment, then you can simply pass them on the command
+line or use a patch as part of your build process to get the correct
+compilers into the project's build system. There are also some file
+editing commands you can use -- these are described later in
+:ref:`filtering-files`.
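+
+As a sketch, a hand-written Makefile that ignores ``CC`` and ``CXX``
+from the environment can often be handled by passing the variables on
+the ``make`` command line instead (assuming the Makefile honors them
+when given explicitly):
+
+.. code-block:: python
+
+   # cc and c++ resolve to Spack's compiler wrappers, because the
+   # directory containing them is placed first in PATH during install().
+   make('CC=cc', 'CXX=c++')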
+
+In addition to the compiler variables, these variables are set before
+entering ``install()`` so that packages can locate dependencies
+easily:
+
+  ======================= =============================
+  Variable                Purpose
+  ======================= =============================
+  ``PATH``                Set to point to ``/bin`` directories of dependencies
+  ``CMAKE_PREFIX_PATH``   Path to dependency prefixes for CMake
+  ``PKG_CONFIG_PATH``     Path to any pkgconfig directories for dependencies
+  ======================= =============================
+
+``PATH`` is set up to point to the ``/bin`` directories of dependencies
+so that you can use tools installed by dependency packages at build time.
+For example, ``$MPICH_ROOT/bin/mpicc`` is frequently used by packages
+that depend on ``mpich``.
+
+``CMAKE_PREFIX_PATH`` contains a colon-separated list of prefixes
+where ``cmake`` will search for dependency libraries and headers.
+This causes all standard CMake find commands to look in the paths of
+your dependencies, so you *do not* have to manually specify arguments
+like ``-D DEPENDENCY_DIR=/path/to/dependency`` to ``cmake``. More on
+this is `in the CMake documentation <http://www.cmake.org/cmake/help/v3.0/variable/CMAKE_PREFIX_PATH.html>`_.
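+
+As a minimal sketch (assuming a standard CMake project, and that
+``cmake`` is available in ``install()`` the same way ``configure`` and
+``make`` are), this means a CMake-based build can be as simple as:
+
+.. code-block:: python
+
+   # CMake locates dependency headers and libraries through
+   # CMAKE_PREFIX_PATH, so no -D<package>_DIR arguments are needed.
+   cmake('.', '-DCMAKE_INSTALL_PREFIX=' + prefix)
+   make()
+   make('install')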
+
+``PKG_CONFIG_PATH`` is for packages that attempt to discover
+dependencies using the GNU ``pkg-config`` tool. It is similar to
+``CMAKE_PREFIX_PATH`` in that it allows a build to automatically
+discover its dependencies.
+
+
+Compiler interceptors
+~~~~~~~~~~~~~~~~~~~~~~~~~
+
+As mentioned, ``CC``, ``CXX``, ``F77``, and ``FC`` are set to point to
+Spack's compiler wrappers. These are simply called ``cc``, ``c++``,
+``f77``, and ``f90``, and they live in ``$SPACK_ROOT/lib/spack/env``.
+
+``$SPACK_ROOT/lib/spack/env`` is added first in the ``PATH``
+environment variable when ``install()`` runs so that system compilers
+are not picked up instead.
+
+All of these compiler wrappers point to a single compiler wrapper
+script that figures out which *real* compiler it should be building
+with. This comes either from spec :ref:`concretization
+<abstract-and-concrete>` or from a user explicitly asking for a
+particular compiler using, e.g., ``%intel`` on the command line.
+
+In addition to invoking the right compiler, the compiler wrappers add
+flags to the compile line so that dependencies can be easily found.
+These flags are added for each dependency, if they exist:
+
+Compile-time library search paths
+ * ``-L$dep_prefix/lib``
+ * ``-L$dep_prefix/lib64``
+Runtime library search paths (RPATHs)
+ * ``-Wl,-rpath=$dep_prefix/lib``
+ * ``-Wl,-rpath=$dep_prefix/lib64``
+Include search paths
+ * ``-I$dep_prefix/include``
+
+An example of this would be the ``libdwarf`` build, which has one
+dependency: ``libelf``. Every call to ``cc`` in the ``libdwarf``
+build will have ``-I$LIBELF_PREFIX/include``,
+``-L$LIBELF_PREFIX/lib``, and ``-Wl,-rpath=$LIBELF_PREFIX/lib``
+inserted on the command line. This is done transparently to the
+project's build system, which will just think it's using a system
+where ``libelf`` is readily available. Because of this, you **do
+not** have to insert extra ``-I``, ``-L``, etc. on the command line.
+
+Another useful consequence of this is that you often do *not* have to
+add extra parameters on the ``configure`` line to get autotools to
+find dependencies. The ``libdwarf`` install method just calls
+configure like this:
+
+.. code-block:: python
+
+ configure("--prefix=" + prefix)
+
+Because of the ``-L`` and ``-I`` arguments, configure will
+successfully find ``libdwarf.h`` and ``libdwarf.so``, without the
+packager having to provide ``--with-libdwarf=/path/to/libdwarf`` on
+the command line.
+
+Forking ``install()``
+~~~~~~~~~~~~~~~~~~~~~
+
+To give packagers free rein over their install environment, Spack
+forks a new process each time it invokes a package's ``install()``
+method. This allows packages to have their own completely sandboxed
+build environment, without impacting other jobs that the main Spack
+process runs. Packages are free to change the environment or to
+modify Spack internals, because each ``install()`` call has its own
+dedicated process.
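+
+For example, it is safe for a package to tweak its own process
+environment before configuring. This is only a sketch; the flag and
+value are hypothetical, and it assumes ``os`` is imported at the top
+of the package file:
+
+.. code-block:: python
+
+   # Changes to os.environ affect only this forked build process,
+   # not the main Spack process or other builds.
+   os.environ['CFLAGS'] = '-O3'
+   configure('--prefix=' + prefix)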
+
+
.. _prefix-objects:
Prefix objects
@@ -1232,7 +1732,7 @@ build system.
.. _pacakge-lifecycle:
-The package build process
+Useful Packaging Commands
---------------------------------
When you are building packages, you will likely not get things
@@ -1246,7 +1746,7 @@ want to clean up the temporary directory, or if the package isn't
downloading properly, you might want to run *only* the ``fetch`` stage
of the build.
-A typical package development cycle might look like this:
+A typical package workflow might look like this:
.. code-block:: sh
@@ -1324,13 +1824,109 @@ recover disk space if temporary files from interrupted or failed
installs accumulate in the staging area.
-Dirty Installs
-~~~~~~~~~~~~~~~~~~~
+Keeping the stage directory on success
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
By default, ``spack install`` will delete the staging area once a
-pacakge has been successfully built and installed, *or* if an error
-occurs during the build. Use ``spack install --dirty`` or ``spack
-install -d`` to leave the build directory intact. This allows you to
-inspect the build directory and potentially fix the build. You can
-use ``purge`` or ``clean`` later to get rid of the unwanted temporary
-files.
+package has been successfully built and installed. Use
+``--keep-stage`` to leave the build directory intact:
+
+.. code-block:: sh
+
+ spack install --keep-stage <spec>
+
+This allows you to inspect the build directory and potentially debug
+the build. You can use ``purge`` or ``clean`` later to get rid of the
+unwanted temporary files.
+
+
+Keeping the install prefix on failure
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+By default, ``spack install`` will delete any partially constructed
+install prefix if anything fails during ``install()``. If you want to
+keep the prefix anyway (e.g. to diagnose a bug), you can use
+``--keep-prefix``:
+
+.. code-block:: sh
+
+ spack install --keep-prefix <spec>
+
+Note that this may confuse Spack into thinking that the package has
+been installed properly, so you may need to use ``spack uninstall -f``
+to get rid of the install prefix before you build again:
+
+.. code-block:: sh
+
+ spack uninstall -f <spec>
+
+
+Interactive Shell Support
+--------------------------
+
+Spack provides some limited shell support to make life easier for
+packagers. You can enable these commands by sourcing a setup file in
+Spack's ``share/spack`` directory. For ``bash`` or ``ksh``, run::
+
+ . $SPACK_ROOT/share/spack/setup-env.sh
+
+For ``csh`` and ``tcsh`` run::
+
+ setenv SPACK_ROOT /path/to/spack
+ source $SPACK_ROOT/share/spack/setup-env.csh
+
+``spack cd`` will then be available.
+
+
+``spack cd``
+~~~~~~~~~~~~~~~~~
+
+``spack cd`` allows you to quickly cd to pertinent directories in Spack.
+Suppose you've staged a package but you want to modify it before you
+build it:
+
+.. code-block:: sh
+
+ $ spack stage libelf
+ ==> Trying to fetch from http://www.mr511.de/software/libelf-0.8.13.tar.gz
+ ######################################################################## 100.0%
+ ==> Staging archive: /Users/gamblin2/src/spack/var/spack/stage/libelf@0.8.13%gcc@4.8.3=linux-ppc64/libelf-0.8.13.tar.gz
+ ==> Created stage in /Users/gamblin2/src/spack/var/spack/stage/libelf@0.8.13%gcc@4.8.3=linux-ppc64.
+ $ spack cd libelf
+ $ pwd
+ /Users/gamblin2/src/spack/var/spack/stage/libelf@0.8.13%gcc@4.8.3=linux-ppc64/libelf-0.8.13
+
+``spack cd`` here changed the current working directory to the
+directory containing the expanded ``libelf`` source code. There are a
+number of other places you can cd to in the spack directory hierarchy:
+
+.. command-output:: spack cd -h
+
+Some of these change directory into package-specific locations (stage
+directory, install directory, package directory) and others change to
+core spack locations. For example, ``spack cd -m`` will take you to
+the main python source directory of your spack install.
+
+
+``spack location``
+~~~~~~~~~~~~~~~~~~~~~~
+
+``spack location`` is the same as ``spack cd`` but it does not require
+shell support. It simply prints out the path you ask for, rather than
+cd'ing to it. In bash, this::
+
+ cd $(spack location -b <spec>)
+
+is the same as::
+
+ spack cd -b <spec>
+
+``spack location`` is intended for use in scripts or makefiles that
+need to know where packages are installed. For example, in a makefile you
+might write:
+
+.. code-block:: makefile
+
+   DWARF_PREFIX = $(shell spack location -i libdwarf)
+   CXXFLAGS += -I$(DWARF_PREFIX)/include
+   CXXFLAGS += -L$(DWARF_PREFIX)/lib
diff --git a/lib/spack/llnl/util/compare/__init__.py b/lib/spack/llnl/util/compare/__init__.py
deleted file mode 100644
index e69de29bb2..0000000000
--- a/lib/spack/llnl/util/compare/__init__.py
+++ /dev/null
diff --git a/lib/spack/llnl/util/compare/none_high.py b/lib/spack/llnl/util/compare/none_high.py
deleted file mode 100644
index 78b41cbaf6..0000000000
--- a/lib/spack/llnl/util/compare/none_high.py
+++ /dev/null
@@ -1,70 +0,0 @@
-##############################################################################
-# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
-# Produced at the Lawrence Livermore National Laboratory.
-#
-# This file is part of Spack.
-# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
-# LLNL-CODE-647188
-#
-# For details, see https://scalability-llnl.github.io/spack
-# Please also see the LICENSE file for our notice and the LGPL.
-#
-# This program is free software; you can redistribute it and/or modify
-# it under the terms of the GNU General Public License (as published by
-# the Free Software Foundation) version 2.1 dated February 1999.
-#
-# This program is distributed in the hope that it will be useful, but
-# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
-# conditions of the GNU General Public License for more details.
-#
-# You should have received a copy of the GNU Lesser General Public License
-# along with this program; if not, write to the Free Software Foundation,
-# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
-##############################################################################
-"""
-Functions for comparing values that may potentially be None.
-These none_high functions consider None as greater than all other values.
-"""
-
-# Preserve builtin min and max functions
-_builtin_min = min
-_builtin_max = max
-
-
-def lt(lhs, rhs):
- """Less-than comparison. None is greater than any value."""
- return lhs != rhs and (rhs is None or (lhs is not None and lhs < rhs))
-
-
-def le(lhs, rhs):
- """Less-than-or-equal comparison. None is greater than any value."""
- return lhs == rhs or lt(lhs, rhs)
-
-
-def gt(lhs, rhs):
- """Greater-than comparison. None is greater than any value."""
- return lhs != rhs and not lt(lhs, rhs)
-
-
-def ge(lhs, rhs):
- """Greater-than-or-equal comparison. None is greater than any value."""
- return lhs == rhs or gt(lhs, rhs)
-
-
-def min(lhs, rhs):
- """Minimum function where None is greater than any value."""
- if lhs is None:
- return rhs
- elif rhs is None:
- return lhs
- else:
- return _builtin_min(lhs, rhs)
-
-
-def max(lhs, rhs):
- """Maximum function where None is greater than any value."""
- if lhs is None or rhs is None:
- return None
- else:
- return _builtin_max(lhs, rhs)
diff --git a/lib/spack/llnl/util/compare/none_low.py b/lib/spack/llnl/util/compare/none_low.py
deleted file mode 100644
index 307bcc8a26..0000000000
--- a/lib/spack/llnl/util/compare/none_low.py
+++ /dev/null
@@ -1,70 +0,0 @@
-##############################################################################
-# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
-# Produced at the Lawrence Livermore National Laboratory.
-#
-# This file is part of Spack.
-# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
-# LLNL-CODE-647188
-#
-# For details, see https://scalability-llnl.github.io/spack
-# Please also see the LICENSE file for our notice and the LGPL.
-#
-# This program is free software; you can redistribute it and/or modify
-# it under the terms of the GNU General Public License (as published by
-# the Free Software Foundation) version 2.1 dated February 1999.
-#
-# This program is distributed in the hope that it will be useful, but
-# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
-# conditions of the GNU General Public License for more details.
-#
-# You should have received a copy of the GNU Lesser General Public License
-# along with this program; if not, write to the Free Software Foundation,
-# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
-##############################################################################
-"""
-Functions for comparing values that may potentially be None.
-These none_low functions consider None as less than all other values.
-"""
-
-# Preserve builtin min and max functions
-_builtin_min = min
-_builtin_max = max
-
-
-def lt(lhs, rhs):
- """Less-than comparison. None is lower than any value."""
- return lhs != rhs and (lhs is None or (rhs is not None and lhs < rhs))
-
-
-def le(lhs, rhs):
- """Less-than-or-equal comparison. None is less than any value."""
- return lhs == rhs or lt(lhs, rhs)
-
-
-def gt(lhs, rhs):
- """Greater-than comparison. None is less than any value."""
- return lhs != rhs and not lt(lhs, rhs)
-
-
-def ge(lhs, rhs):
- """Greater-than-or-equal comparison. None is less than any value."""
- return lhs == rhs or gt(lhs, rhs)
-
-
-def min(lhs, rhs):
- """Minimum function where None is less than any value."""
- if lhs is None or rhs is None:
- return None
- else:
- return _builtin_min(lhs, rhs)
-
-
-def max(lhs, rhs):
- """Maximum function where None is less than any value."""
- if lhs is None:
- return rhs
- elif rhs is None:
- return lhs
- else:
- return _builtin_max(lhs, rhs)
diff --git a/lib/spack/spack/cmd/clean.py b/lib/spack/spack/cmd/clean.py
index 1df9d87ae2..79dd91c5bf 100644
--- a/lib/spack/spack/cmd/clean.py
+++ b/lib/spack/spack/cmd/clean.py
@@ -52,7 +52,15 @@ def clean(parser, args):
package = spack.db.get(spec)
if args.dist:
package.do_clean_dist()
+ tty.msg("Cleaned %s" % package.name)
+
elif args.work:
package.do_clean_work()
+ tty.msg("Restaged %s" % package.name)
+
else:
- package.do_clean()
+ try:
+ package.do_clean()
+ except subprocess.CalledProcessError, e:
+ tty.warn("Warning: 'make clean' didn't work. Consider 'spack clean --work'.")
+ tty.msg("Made clean for %s" % package.name)
diff --git a/lib/spack/spack/cmd/create.py b/lib/spack/spack/cmd/create.py
index 5f6350c8e8..7ac10285a4 100644
--- a/lib/spack/spack/cmd/create.py
+++ b/lib/spack/spack/cmd/create.py
@@ -72,6 +72,9 @@ class ${class_name}(Package):
${versions}
+ # FIXME: Add dependencies if this package requires them.
+ # depends_on("foo")
+
def install(self, spec, prefix):
# FIXME: Modify the configure line to suit your build system here.
${configure}
diff --git a/lib/spack/spack/cmd/info.py b/lib/spack/spack/cmd/info.py
index 87d272dbc6..29568b8c5d 100644
--- a/lib/spack/spack/cmd/info.py
+++ b/lib/spack/spack/cmd/info.py
@@ -27,6 +27,7 @@ import textwrap
from StringIO import StringIO
from llnl.util.tty.colify import *
import spack
+import spack.fetch_strategy as fs
description = "Get detailed information on a particular package"
@@ -122,7 +123,8 @@ def info_text(pkg):
maxlen = max(len(str(v)) for v in pkg.versions)
fmt = "%%-%ss" % maxlen
for v in reversed(sorted(pkg.versions)):
- print " " + (fmt % v) + " " + pkg.url_for_version(v)
+ f = fs.for_package_version(pkg, v)
+ print " " + (fmt % v) + " " + str(f)
print
print "Dependencies:"
diff --git a/lib/spack/spack/cmd/location.py b/lib/spack/spack/cmd/location.py
index 116cf6ee6c..3fc05d471d 100644
--- a/lib/spack/spack/cmd/location.py
+++ b/lib/spack/spack/cmd/location.py
@@ -55,7 +55,7 @@ def setup_parser(subparser):
'-s', '--stage-dir', action='store_true', help="Stage directory for a spec.")
directories.add_argument(
'-b', '--build-dir', action='store_true',
- help="Expanded archive directory for a spec (requires it to be staged first).")
+ help="Checked out or expanded source directory for a spec (requires it to be staged first).")
subparser.add_argument(
'spec', nargs=argparse.REMAINDER, help="spec of package to fetch directory for.")
@@ -107,8 +107,8 @@ def location(parser, args):
print pkg.stage.path
else: # args.build_dir is the default.
- if not os.listdir(pkg.stage.path):
+ if not pkg.stage.source_path:
tty.die("Build directory does not exist yet. Run this to create it:",
"spack stage " + " ".join(args.spec))
- print pkg.stage.expanded_archive_path
+ print pkg.stage.source_path
diff --git a/lib/spack/spack/cmd/md5.py b/lib/spack/spack/cmd/md5.py
new file mode 100644
index 0000000000..496835c64b
--- /dev/null
+++ b/lib/spack/spack/cmd/md5.py
@@ -0,0 +1,52 @@
+##############################################################################
+# Copyright (c) 2013-2014, Lawrence Livermore National Security, LLC.
+# Produced at the Lawrence Livermore National Laboratory.
+#
+# This file is part of Spack.
+# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
+# LLNL-CODE-647188
+#
+# For details, see https://scalability-llnl.github.io/spack
+# Please also see the LICENSE file for our notice and the LGPL.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License (as published by
+# the Free Software Foundation) version 2.1 dated February 1999.
+#
+# This program is distributed in the hope that it will be useful, but
+# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
+# conditions of the GNU General Public License for more details.
+#
+# You should have received a copy of the GNU Lesser General Public License
+# along with this program; if not, write to the Free Software Foundation,
+# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
+##############################################################################
+import os
+import hashlib
+from external import argparse
+
+import llnl.util.tty as tty
+from llnl.util.filesystem import *
+
+import spack.util.crypto
+
+description = "Calculate md5 checksums for files."
+
+def setup_parser(subparser):
+ setup_parser.parser = subparser
+ subparser.add_argument('files', nargs=argparse.REMAINDER,
+ help="Files to checksum.")
+
+def md5(parser, args):
+ if not args.files:
+ setup_parser.parser.print_help()
+
+ for f in args.files:
+ if not os.path.isfile(f):
+ tty.die("Not a file: %s" % f)
+ if not can_access(f):
+ tty.die("Cannot read file: %s" % f)
+
+ checksum = spack.util.crypto.checksum(hashlib.md5, f)
+ print "%s %s" % (checksum, f)
diff --git a/lib/spack/spack/cmd/mirror.py b/lib/spack/spack/cmd/mirror.py
index b42b329085..22838e1344 100644
--- a/lib/spack/spack/cmd/mirror.py
+++ b/lib/spack/spack/cmd/mirror.py
@@ -23,23 +23,19 @@
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os
-import shutil
+import sys
from datetime import datetime
-from contextlib import closing
from external import argparse
import llnl.util.tty as tty
from llnl.util.tty.colify import colify
-from llnl.util.filesystem import mkdirp, join_path
import spack
import spack.cmd
import spack.config
+import spack.mirror
from spack.spec import Spec
from spack.error import SpackError
-from spack.stage import Stage
-from spack.util.compression import extension
-
description = "Manage mirrors."
@@ -58,6 +54,9 @@ def setup_parser(subparser):
'specs', nargs=argparse.REMAINDER, help="Specs of packages to put in mirror")
create_parser.add_argument(
'-f', '--file', help="File with specs of packages to put in mirror.")
+ create_parser.add_argument(
+ '-o', '--one-version-per-spec', action='store_const', const=1, default=0,
+ help="Only fetch one 'preferred' version per spec, not all known versions.")
add_parser = sp.add_parser('add', help=mirror_add.__doc__)
add_parser.add_argument('name', help="Mnemonic name for mirror.")
@@ -72,8 +71,12 @@ def setup_parser(subparser):
def mirror_add(args):
"""Add a mirror to Spack."""
+ url = args.url
+ if url.startswith('/'):
+ url = 'file://' + url
+
config = spack.config.get_config('user')
- config.set_value('mirror', args.name, 'url', args.url)
+ config.set_value('mirror', args.name, 'url', url)
config.write()
@@ -105,112 +108,63 @@ def mirror_list(args):
print fmt % (name, val)
+def _read_specs_from_file(filename):
+    """Read one spec per line from a file and return the list of specs."""
+    specs = []
+    with open(filename, "r") as stream:
+        for i, string in enumerate(stream):
+            try:
+                s = Spec(string)
+                s.package
+                specs.append(s)
+            except SpackError, e:
+                tty.die("Parse error in %s, line %d:" % (filename, i+1),
+                        ">>> " + string, str(e))
+    return specs
+
+
def mirror_create(args):
"""Create a directory to be used as a spack mirror, and fill it with
package archives."""
# try to parse specs from the command line first.
- args.specs = spack.cmd.parse_specs(args.specs)
+ specs = spack.cmd.parse_specs(args.specs)
# If there is a file, parse each line as a spec and add it to the list.
if args.file:
- with closing(open(args.file, "r")) as stream:
- for i, string in enumerate(stream):
- try:
- s = Spec(string)
- s.package
- args.specs.append(s)
- except SpackError, e:
- tty.die("Parse error in %s, line %d:" % (args.file, i+1),
- ">>> " + string, str(e))
-
- if not args.specs:
- args.specs = [Spec(n) for n in spack.db.all_package_names()]
+ if specs:
+ tty.die("Cannot pass specs on the command line with --file.")
+ specs = _read_specs_from_file(args.file)
+
+ # If nothing is passed, use all packages.
+ if not specs:
+ specs = [Spec(n) for n in spack.db.all_package_names()]
+ specs.sort(key=lambda s: s.format("$_$@").lower())
# Default name for directory is spack-mirror-<DATESTAMP>
- if not args.directory:
+ directory = args.directory
+ if not directory:
timestamp = datetime.now().strftime("%Y-%m-%d")
- args.directory = 'spack-mirror-' + timestamp
+ directory = 'spack-mirror-' + timestamp
# Make sure nothing is in the way.
- if os.path.isfile(args.directory):
- tty.error("%s already exists and is a file." % args.directory)
-
- # Create a directory if none exists
- if not os.path.isdir(args.directory):
- mkdirp(args.directory)
- tty.msg("Created new mirror in %s" % args.directory)
- else:
- tty.msg("Adding to existing mirror in %s" % args.directory)
-
- # Things to keep track of while parsing specs.
- working_dir = os.getcwd()
- num_mirrored = 0
- num_error = 0
-
- # Iterate through packages and download all the safe tarballs for each of them
- for spec in args.specs:
- pkg = spec.package
-
- # Skip any package that has no checksummed versions.
- if not pkg.versions:
- tty.msg("No safe (checksummed) versions for package %s."
- % pkg.name)
- continue
-
- # create a subdir for the current package.
- pkg_path = join_path(args.directory, pkg.name)
- mkdirp(pkg_path)
-
- # Download all the tarballs using Stages, then move them into place
- for version in pkg.versions:
- # Skip versions that don't match the spec
- vspec = Spec('%s@%s' % (pkg.name, version))
- if not vspec.satisfies(spec):
- continue
-
- mirror_path = "%s/%s-%s.%s" % (
- pkg.name, pkg.name, version, extension(pkg.url))
-
- os.chdir(working_dir)
- mirror_file = join_path(args.directory, mirror_path)
- if os.path.exists(mirror_file):
- tty.msg("Already fetched %s." % mirror_file)
- num_mirrored += 1
- continue
-
- # Get the URL for the version and set up a stage to download it.
- url = pkg.url_for_version(version)
- stage = Stage(url)
- try:
- # fetch changes directory into the stage
- stage.fetch()
-
- if not args.no_checksum and version in pkg.versions:
- digest = pkg.versions[version]
- stage.check(digest)
- tty.msg("Checksum passed for %s@%s" % (pkg.name, version))
-
- # change back and move the new archive into place.
- os.chdir(working_dir)
- shutil.move(stage.archive_file, mirror_file)
- tty.msg("Added %s to mirror" % mirror_file)
- num_mirrored += 1
-
- except Exception, e:
- tty.warn("Error while fetching %s." % url, e.message)
- num_error += 1
-
- finally:
- stage.destroy()
-
- # If nothing happened, try to say why.
- if not num_mirrored:
- if num_error:
- tty.error("No packages added to mirror.",
- "All packages failed to fetch.")
- else:
- tty.error("No packages added to mirror. No versions matched specs:")
- colify(args.specs, indent=4)
+ existed = False
+ if os.path.isfile(directory):
+ tty.error("%s already exists and is a file." % directory)
+ elif os.path.isdir(directory):
+ existed = True
+
+ # Actually do the work to create the mirror
+ present, mirrored, error = spack.mirror.create(
+ directory, specs, num_versions=args.one_version_per_spec)
+ p, m, e = len(present), len(mirrored), len(error)
+
+ verb = "updated" if existed else "created"
+ tty.msg(
+ "Successfully %s mirror in %s." % (verb, directory),
+ "Archive stats:",
+ " %-4d already present" % p,
+ " %-4d added" % m,
+ " %-4d failed to fetch." % e)
+ if error:
+ tty.error("Failed downloads:")
+ colify(s.format("$_$@") for s in error)
def mirror(parser, args):
@@ -218,4 +172,5 @@ def mirror(parser, args):
'add' : mirror_add,
'remove' : mirror_remove,
'list' : mirror_list }
+
action[args.mirror_command](args)
diff --git a/lib/spack/spack/concretize.py b/lib/spack/spack/concretize.py
index e6d1bb87d4..eee8cb7fde 100644
--- a/lib/spack/spack/concretize.py
+++ b/lib/spack/spack/concretize.py
@@ -68,11 +68,13 @@ class DefaultConcretizer(object):
# If there are known avaialble versions, return the most recent
# version that satisfies the spec
pkg = spec.package
- valid_versions = pkg.available_versions.intersection(spec.versions)
+ valid_versions = [v for v in pkg.available_versions
+ if any(v.satisfies(sv) for sv in spec.versions)]
+
if valid_versions:
spec.versions = ver([valid_versions[-1]])
else:
- raise NoValidVerionError(spec)
+ raise NoValidVersionError(spec)
def concretize_architecture(self, spec):
@@ -160,9 +162,9 @@ class UnavailableCompilerVersionError(spack.error.SpackError):
"Run 'spack compilers' to see available compiler Options.")
-class NoValidVerionError(spack.error.SpackError):
+class NoValidVersionError(spack.error.SpackError):
"""Raised when there is no available version for a package that
satisfies a spec."""
def __init__(self, spec):
- super(NoValidVerionError, self).__init__(
+ super(NoValidVersionError, self).__init__(
"No available version of %s matches '%s'" % (spec.name, spec.versions))
diff --git a/lib/spack/spack/fetch_strategy.py b/lib/spack/spack/fetch_strategy.py
new file mode 100644
index 0000000000..98c78c2e08
--- /dev/null
+++ b/lib/spack/spack/fetch_strategy.py
@@ -0,0 +1,650 @@
+##############################################################################
+# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
+# Produced at the Lawrence Livermore National Laboratory.
+#
+# This file is part of Spack.
+# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
+# LLNL-CODE-647188
+#
+# For details, see https://scalability-llnl.github.io/spack
+# Please also see the LICENSE file for our notice and the LGPL.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License (as published by
+# the Free Software Foundation) version 2.1 dated February 1999.
+#
+# This program is distributed in the hope that it will be useful, but
+# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
+# conditions of the GNU General Public License for more details.
+#
+# You should have received a copy of the GNU Lesser General Public License
+# along with this program; if not, write to the Free Software Foundation,
+# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
+##############################################################################
+"""
+Fetch strategies are used to download source code into a staging area
+in order to build it. They need to define the following methods:
+
+ * fetch()
+ This should attempt to download/check out source from somewhere.
+ * check()
+ Apply a checksum to the downloaded source code, e.g. for an archive.
+ May not do anything if the fetch method was safe to begin with.
+ * expand()
+      Expand the downloaded file (e.g., an archive) into a source directory.
+ * reset()
+ Restore original state of downloaded code. Used by clean commands.
+ This may just remove the expanded source and re-expand an archive,
+ or it may run something like git reset --hard.
+ * archive()
+ Archive a source directory, e.g. for creating a mirror.
+"""
+import os
+import re
+import shutil
+from functools import wraps
+import llnl.util.tty as tty
+
+import spack
+import spack.error
+import spack.util.crypto as crypto
+from spack.util.executable import *
+from spack.util.string import *
+from spack.version import Version, ver
+from spack.util.compression import decompressor_for, extension
+
+"""List of all fetch strategies, created by FetchStrategy metaclass."""
+all_strategies = []
+
+def _needs_stage(fun):
+ """Many methods on fetch strategies require a stage to be set
+ using set_stage(). This decorator adds a check for self.stage."""
+ @wraps(fun)
+ def wrapper(self, *args, **kwargs):
+ if not self.stage:
+ raise NoStageError(fun)
+ return fun(self, *args, **kwargs)
+ return wrapper
+
+
+class FetchStrategy(object):
+ """Superclass of all fetch strategies."""
+ enabled = False # Non-abstract subclasses should be enabled.
+ required_attributes = None # Attributes required in version() args.
+
+ class __metaclass__(type):
+ """This metaclass registers all fetch strategies in a list."""
+ def __init__(cls, name, bases, dict):
+ type.__init__(cls, name, bases, dict)
+ if cls.enabled: all_strategies.append(cls)
+
+
+ def __init__(self):
+ # The stage is initialized late, so that fetch strategies can be constructed
+ # at package construction time. This is where things will be fetched.
+ self.stage = None
+
+
+ def set_stage(self, stage):
+ """This is called by Stage before any of the fetching
+ methods are called on the stage."""
+ self.stage = stage
+
+
+ # Subclasses need to implement these methods
+ def fetch(self): pass # Return True on success, False on fail.
+ def check(self): pass # Do checksum.
+ def expand(self): pass # Expand archive.
+ def reset(self): pass # Revert to freshly downloaded state.
+
+ def archive(self, destination): pass # Used to create tarball for mirror.
+
+ def __str__(self): # Should be human readable URL.
+ return "FetchStrategy.__str___"
+
+ # This method is used to match fetch strategies to version()
+ # arguments in packages.
+ @classmethod
+ def matches(cls, args):
+ return any(k in args for k in cls.required_attributes)
+
+
+class URLFetchStrategy(FetchStrategy):
+ """FetchStrategy that pulls source code from a URL for an archive,
+    checks the archive against a checksum, and decompresses the archive.
+ """
+ enabled = True
+ required_attributes = ['url']
+
+ def __init__(self, url=None, digest=None, **kwargs):
+ super(URLFetchStrategy, self).__init__()
+
+ # If URL or digest are provided in the kwargs, then prefer
+ # those values.
+ self.url = kwargs.get('url', None)
+ if not self.url: self.url = url
+
+ self.digest = kwargs.get('md5', None)
+ if not self.digest: self.digest = digest
+
+ if not self.url:
+ raise ValueError("URLFetchStrategy requires a url for fetching.")
+
+ @_needs_stage
+ def fetch(self):
+ self.stage.chdir()
+
+ if self.archive_file:
+ tty.msg("Already downloaded %s." % self.archive_file)
+ return
+
+ tty.msg("Trying to fetch from %s" % self.url)
+
+ # Run curl but grab the mime type from the http headers
+ headers = spack.curl('-#', # status bar
+ '-O', # save file to disk
+ '-f', # fail on >400 errors
+ '-D', '-', # print out HTML headers
+ '-L', self.url,
+ return_output=True, fail_on_error=False)
+
+ if spack.curl.returncode != 0:
+ # clean up archive on failure.
+ if self.archive_file:
+ os.remove(self.archive_file)
+
+ if spack.curl.returncode == 22:
+ # This is a 404. Curl will print the error.
+                raise FailedDownloadError(self.url)
+
+ if spack.curl.returncode == 60:
+ # This is a certificate error. Suggest spack -k
+ raise FailedDownloadError(
+ self.url,
+ "Curl was unable to fetch due to invalid certificate. "
+ "This is either an attack, or your cluster's SSL configuration "
+ "is bad. If you believe your SSL configuration is bad, you "
+                    "can try running spack -k, which will not check SSL certificates. "
+ "Use this at your own risk.")
+
+ # Check if we somehow got an HTML file rather than the archive we
+ # asked for. We only look at the last content type, to handle
+ # redirects properly.
+ content_types = re.findall(r'Content-Type:[^\r\n]+', headers)
+ if content_types and 'text/html' in content_types[-1]:
+ tty.warn("The contents of " + self.archive_file + " look like HTML.",
+ "The checksum will likely be bad. If it is, you can use",
+ "'spack clean --dist' to remove the bad archive, then fix",
+ "your internet gateway issue and install again.")
+
+ if not self.archive_file:
+ raise FailedDownloadError(self.url)
+
+
+ @property
+ def archive_file(self):
+ """Path to the source archive within this stage directory."""
+ return self.stage.archive_file
+
+ @_needs_stage
+ def expand(self):
+ tty.msg("Staging archive: %s" % self.archive_file)
+
+ self.stage.chdir()
+ if not self.archive_file:
+ raise NoArchiveFileError("URLFetchStrategy couldn't find archive file",
+ "Failed on expand() for URL %s" % self.url)
+
+ decompress = decompressor_for(self.archive_file)
+ decompress(self.archive_file)
+
+
+ def archive(self, destination):
+ """Just moves this archive to the destination."""
+ if not self.archive_file:
+ raise NoArchiveFileError("Cannot call archive() before fetching.")
+
+ if not extension(destination) == extension(self.archive_file):
+ raise ValueError("Cannot archive without matching extensions.")
+
+ shutil.move(self.archive_file, destination)
+
+
+ @_needs_stage
+ def check(self):
+ """Check the downloaded archive against a checksum digest.
+ No-op if this stage checks code out of a repository."""
+ if not self.digest:
+ raise NoDigestError("Attempt to check URLFetchStrategy with no digest.")
+
+ checker = crypto.Checker(self.digest)
+ if not checker.check(self.archive_file):
+ raise ChecksumError(
+ "%s checksum failed for %s." % (checker.hash_name, self.archive_file),
+ "Expected %s but got %s." % (self.digest, checker.sum))
+
+
+ @_needs_stage
+ def reset(self):
+ """Removes the source path if it exists, then re-expands the archive."""
+ if not self.archive_file:
+ raise NoArchiveFileError("Tried to reset URLFetchStrategy before fetching",
+ "Failed on reset() for URL %s" % self.url)
+ if self.stage.source_path:
+ shutil.rmtree(self.stage.source_path, ignore_errors=True)
+ self.expand()
+
+
+ def __repr__(self):
+ url = self.url if self.url else "no url"
+ return "URLFetchStrategy<%s>" % url
+
+
+ def __str__(self):
+ if self.url:
+ return self.url
+ else:
+ return "[no url]"
+
+
+class VCSFetchStrategy(FetchStrategy):
+ def __init__(self, name, *rev_types, **kwargs):
+ super(VCSFetchStrategy, self).__init__()
+ self.name = name
+
+ # Set a URL based on the type of fetch strategy.
+ self.url = kwargs.get(name, None)
+ if not self.url: raise ValueError(
+ "%s requires %s argument." % (self.__class__, name))
+
+ # Ensure that there's only one of the rev_types
+ if sum(k in kwargs for k in rev_types) > 1:
+ raise FetchStrategyError(
+ "Supply only one of %s to fetch with %s." % (
+ comma_or(rev_types), name))
+
+ # Set attributes for each rev type.
+ for rt in rev_types:
+ setattr(self, rt, kwargs.get(rt, None))
+
+
+ @_needs_stage
+ def check(self):
+ tty.msg("No checksum needed when fetching with %s." % self.name)
+
+
+ @_needs_stage
+ def expand(self):
+ tty.debug("Source fetched with %s is already expanded." % self.name)
+
+
+ @_needs_stage
+ def archive(self, destination, **kwargs):
+ assert(extension(destination) == 'tar.gz')
+ assert(self.stage.source_path.startswith(self.stage.path))
+
+ tar = which('tar', required=True)
+
+ patterns = kwargs.get('exclude', None)
+ if patterns is not None:
+ if isinstance(patterns, basestring):
+ patterns = [patterns]
+ for p in patterns:
+ tar.add_default_arg('--exclude=%s' % p)
+
+ self.stage.chdir()
+ tar('-czf', destination, os.path.basename(self.stage.source_path))
+
+
+ def __str__(self):
+ return "VCS: %s" % self.url
+
+
+ def __repr__(self):
+ return "%s<%s>" % (self.__class__, self.url)
+
+
+
+class GitFetchStrategy(VCSFetchStrategy):
+ """Fetch strategy that gets source code from a git repository.
+ Use like this in a package:
+
+ version('name', git='https://github.com/project/repo.git')
+
+    Optionally, you can provide a branch, tag, or commit to check out, e.g.:
+
+ version('1.1', git='https://github.com/project/repo.git', tag='v1.1')
+
+ You can use these three optional attributes in addition to ``git``:
+
+ * ``branch``: Particular branch to build from (default is master)
+ * ``tag``: Particular tag to check out
+ * ``commit``: Particular commit hash in the repo
+ """
+ enabled = True
+ required_attributes = ('git',)
+
+ def __init__(self, **kwargs):
+ super(GitFetchStrategy, self).__init__(
+ 'git', 'tag', 'branch', 'commit', **kwargs)
+ self._git = None
+
+ # For git fetch branches and tags the same way.
+ if not self.branch:
+ self.branch = self.tag
+
+
+ @property
+ def git_version(self):
+ git = which('git', required=True)
+ vstring = git('--version', return_output=True).lstrip('git version ')
+ return Version(vstring)
+
+
+ @property
+ def git(self):
+ if not self._git:
+ self._git = which('git', required=True)
+ return self._git
+
+ @_needs_stage
+ def fetch(self):
+ self.stage.chdir()
+
+ if self.stage.source_path:
+ tty.msg("Already fetched %s." % self.stage.source_path)
+ return
+
+ args = []
+ if self.commit:
+ args.append('at commit %s' % self.commit)
+ elif self.tag:
+ args.append('at tag %s' % self.tag)
+ elif self.branch:
+ args.append('on branch %s' % self.branch)
+ tty.msg("Trying to clone git repository:", self.url, *args)
+
+ if self.commit:
+ # Need to do a regular clone and check out everything if
+ # they asked for a particular commit.
+ self.git('clone', self.url)
+ self.stage.chdir_to_source()
+ self.git('checkout', self.commit)
+
+ else:
+ # Can be more efficient if not checking out a specific commit.
+ args = ['clone']
+
+ # If we want a particular branch ask for it.
+ if self.branch:
+ args.extend(['--branch', self.branch])
+
+ # Try to be efficient if we're using a new enough git.
+ # This checks out only one branch's history
+ if self.git_version > ver('1.7.10'):
+ args.append('--single-branch')
+
+ args.append(self.url)
+ self.git(*args)
+ self.stage.chdir_to_source()
+
+
+ def archive(self, destination):
+ super(GitFetchStrategy, self).archive(destination, exclude='.git')
+
+
+ @_needs_stage
+ def reset(self):
+ self.stage.chdir_to_source()
+ self.git('checkout', '.')
+ self.git('clean', '-f')
+
+
+ def __str__(self):
+ return "[git] %s" % self.url
+
+
+class SvnFetchStrategy(VCSFetchStrategy):
+ """Fetch strategy that gets source code from a subversion repository.
+ Use like this in a package:
+
+ version('name', svn='http://www.example.com/svn/trunk')
+
+ Optionally, you can provide a revision for the URL:
+
+ version('name', svn='http://www.example.com/svn/trunk',
+ revision='1641')
+ """
+ enabled = True
+ required_attributes = ['svn']
+
+ def __init__(self, **kwargs):
+ super(SvnFetchStrategy, self).__init__(
+ 'svn', 'revision', **kwargs)
+ self._svn = None
+ if self.revision is not None:
+ self.revision = str(self.revision)
+
+
+ @property
+ def svn(self):
+ if not self._svn:
+ self._svn = which('svn', required=True)
+ return self._svn
+
+
+ @_needs_stage
+ def fetch(self):
+ self.stage.chdir()
+
+ if self.stage.source_path:
+ tty.msg("Already fetched %s." % self.stage.source_path)
+ return
+
+ tty.msg("Trying to check out svn repository: %s" % self.url)
+
+ args = ['checkout', '--force']
+ if self.revision:
+ args += ['-r', self.revision]
+ args.append(self.url)
+
+ self.svn(*args)
+ self.stage.chdir_to_source()
+
+
+ def _remove_untracked_files(self):
+ """Removes untracked files in an svn repository."""
+ status = self.svn('status', '--no-ignore', return_output=True)
+ self.svn('status', '--no-ignore')
+ for line in status.split('\n'):
+ if not re.match('^[I?]', line):
+ continue
+ path = line[8:].strip()
+ if os.path.isfile(path):
+ os.unlink(path)
+ elif os.path.isdir(path):
+ shutil.rmtree(path, ignore_errors=True)
+
+
+ def archive(self, destination):
+ super(SvnFetchStrategy, self).archive(destination, exclude='.svn')
+
+
+ @_needs_stage
+ def reset(self):
+ self.stage.chdir_to_source()
+ self._remove_untracked_files()
+ self.svn('revert', '.', '-R')
+
+
+ def __str__(self):
+ return "[svn] %s" % self.url
+
+
+
+class HgFetchStrategy(VCSFetchStrategy):
+ """Fetch strategy that gets source code from a Mercurial repository.
+ Use like this in a package:
+
+ version('name', hg='https://jay.grs.rwth-aachen.de/hg/lwm2')
+
+ Optionally, you can provide a branch, or revision to check out, e.g.:
+
+ version('torus', hg='https://jay.grs.rwth-aachen.de/hg/lwm2', branch='torus')
+
+ You can use the optional 'revision' attribute to check out a
+ branch, tag, or particular revision in hg. To prevent
+ non-reproducible builds, using a moving target like a branch is
+ discouraged.
+
+ * ``revision``: Particular revision, branch, or tag.
+ """
+ enabled = True
+ required_attributes = ['hg']
+
+ def __init__(self, **kwargs):
+ super(HgFetchStrategy, self).__init__(
+ 'hg', 'revision', **kwargs)
+ self._hg = None
+
+
+ @property
+ def hg(self):
+ if not self._hg:
+ self._hg = which('hg', required=True)
+ return self._hg
+
+ @_needs_stage
+ def fetch(self):
+ self.stage.chdir()
+
+ if self.stage.source_path:
+ tty.msg("Already fetched %s." % self.stage.source_path)
+ return
+
+ args = []
+ if self.revision:
+ args.append('at revision %s' % self.revision)
+ tty.msg("Trying to clone Mercurial repository:", self.url, *args)
+
+ args = ['clone', self.url]
+ if self.revision:
+ args += ['-r', self.revision]
+
+ self.hg(*args)
+
+
+ def archive(self, destination):
+ super(HgFetchStrategy, self).archive(destination, exclude='.hg')
+
+
+ @_needs_stage
+ def reset(self):
+ self.stage.chdir()
+
+ source_path = self.stage.source_path
+ scrubbed = "scrubbed-source-tmp"
+
+ args = ['clone']
+ if self.revision:
+ args += ['-r', self.revision]
+ args += [source_path, scrubbed]
+ self.hg(*args)
+
+ shutil.rmtree(source_path, ignore_errors=True)
+ shutil.move(scrubbed, source_path)
+ self.stage.chdir_to_source()
+
+
+ def __str__(self):
+ return "[hg] %s" % self.url
+
+
+def from_url(url):
+ """Given a URL, find an appropriate fetch strategy for it.
+ Currently just gives you a URLFetchStrategy that uses curl.
+
+ TODO: make this return appropriate fetch strategies for other
+ types of URLs.
+ """
+ return URLFetchStrategy(url)
+
+
+def args_are_for(args, fetcher):
+    return fetcher.matches(args)
+
+
+def for_package_version(pkg, version):
+ """Determine a fetch strategy based on the arguments supplied to
+ version() in the package description."""
+ # If it's not a known version, extrapolate one.
+    if version not in pkg.versions:
+        url = pkg.url_for_version(version)
+ if not url:
+ raise InvalidArgsError(pkg, version)
+ return URLFetchStrategy(url)
+
+ # Grab a dict of args out of the package version dict
+ args = pkg.versions[version]
+
+ # Test all strategies against per-version arguments.
+ for fetcher in all_strategies:
+ if fetcher.matches(args):
+ return fetcher(**args)
+
+ # If nothing matched for a *specific* version, test all strategies
+    # against attributes defined on the package itself (e.g., its url).
+ for fetcher in all_strategies:
+ attrs = dict((attr, getattr(pkg, attr, None))
+ for attr in fetcher.required_attributes)
+ if 'url' in attrs:
+ attrs['url'] = pkg.url_for_version(version)
+ attrs.update(args)
+ if fetcher.matches(attrs):
+ return fetcher(**attrs)
+
+ raise InvalidArgsError(pkg, version)
+
+
+class FetchError(spack.error.SpackError):
+    def __init__(self, msg, long_msg=None):
+ super(FetchError, self).__init__(msg, long_msg)
+
+
+class FailedDownloadError(FetchError):
+    """Raised when a download fails."""
+ def __init__(self, url, msg=""):
+ super(FailedDownloadError, self).__init__(
+ "Failed to fetch file from URL: %s" % url, msg)
+ self.url = url
+
+
+class NoArchiveFileError(FetchError):
+    def __init__(self, msg, long_msg=None):
+ super(NoArchiveFileError, self).__init__(msg, long_msg)
+
+
+class NoDigestError(FetchError):
+    def __init__(self, msg, long_msg=None):
+ super(NoDigestError, self).__init__(msg, long_msg)
+
+
+class InvalidArgsError(FetchError):
+ def __init__(self, pkg, version):
+ msg = "Could not construct a fetch strategy for package %s at version %s"
+ msg %= (pkg.name, version)
+ super(InvalidArgsError, self).__init__(msg)
+
+
+class ChecksumError(FetchError):
+ """Raised when archive fails to checksum."""
+ def __init__(self, message, long_msg=None):
+ super(ChecksumError, self).__init__(message, long_msg)
+
+
+class NoStageError(FetchError):
+ """Raised when fetch operations are called before set_stage()."""
+ def __init__(self, method):
+ super(NoStageError, self).__init__(
+ "Must call FetchStrategy.set_stage() before calling %s" % method.__name__)
diff --git a/lib/spack/spack/mirror.py b/lib/spack/spack/mirror.py
new file mode 100644
index 0000000000..9c700cd551
--- /dev/null
+++ b/lib/spack/spack/mirror.py
@@ -0,0 +1,189 @@
+##############################################################################
+# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
+# Produced at the Lawrence Livermore National Laboratory.
+#
+# This file is part of Spack.
+# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
+# LLNL-CODE-647188
+#
+# For details, see https://scalability-llnl.github.io/spack
+# Please also see the LICENSE file for our notice and the LGPL.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License (as published by
+# the Free Software Foundation) version 2.1 dated February 1999.
+#
+# This program is distributed in the hope that it will be useful, but
+# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
+# conditions of the GNU General Public License for more details.
+#
+# You should have received a copy of the GNU Lesser General Public License
+# along with this program; if not, write to the Free Software Foundation,
+# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
+##############################################################################
+"""
+This file contains code for creating spack mirror directories. A
+mirror is an organized hierarchy containing specially named archive
+files. This enables spack to know where to find files in a mirror if
+the main server for a particular package is down. Or, if the computer
+where spack is run is not connected to the internet, it allows spack
+to download packages directly from a mirror (e.g., on an intranet).
+"""
+import sys
+import os
+import llnl.util.tty as tty
+from llnl.util.filesystem import *
+
+import spack
+import spack.error
+import spack.fetch_strategy as fs
+from spack.spec import Spec
+from spack.stage import Stage
+from spack.version import *
+from spack.util.compression import extension
+
+
+def mirror_archive_filename(spec):
+ """Get the path that this spec will live at within a mirror."""
+ if not spec.version.concrete:
+ raise ValueError("mirror.path requires spec with concrete version.")
+
+ fetcher = spec.package.fetcher
+ if isinstance(fetcher, fs.URLFetchStrategy):
+ # If we fetch this version with a URLFetchStrategy, use URL's archive type
+ ext = extension(fetcher.url)
+ else:
+ # Otherwise we'll make a .tar.gz ourselves
+ ext = 'tar.gz'
+
+ return "%s-%s.%s" % (spec.package.name, spec.version, ext)
+
+
+def get_matching_versions(specs, **kwargs):
+ """Get a spec for EACH known version matching any spec in the list."""
+ matching = []
+ for spec in specs:
+ pkg = spec.package
+
+ # Skip any package that has no known versions.
+ if not pkg.versions:
+ tty.msg("No safe (checksummed) versions for package %s." % pkg.name)
+ continue
+
+ num_versions = kwargs.get('num_versions', 0)
+ for i, v in enumerate(reversed(sorted(pkg.versions))):
+ # Generate no more than num_versions versions for each spec.
+ if num_versions and i >= num_versions:
+ break
+
+ # Generate only versions that satisfy the spec.
+ if v.satisfies(spec.versions):
+ s = Spec(pkg.name)
+ s.versions = VersionList([v])
+ matching.append(s)
+
+ return matching
+
+
+def create(path, specs, **kwargs):
+ """Create a directory to be used as a spack mirror, and fill it with
+ package archives.
+
+ Arguments:
+ path Path to create a mirror directory hierarchy in.
+ specs Any package versions matching these specs will be added
+ to the mirror.
+
+ Keyword args:
+      no_checksum: If True, do not checksum when fetching (default False)
+ num_versions: Max number of versions to fetch per spec,
+ if spec is ambiguous (default is 0 for all of them)
+
+ Return Value:
+ Returns a tuple of lists: (present, mirrored, error)
+        * present:  Package specs that were already present.
+ * mirrored: Package specs that were successfully mirrored.
+ * error: Package specs that failed to mirror due to some error.
+
+ This routine iterates through all known package versions, and
+ it creates specs for those versions. If the version satisfies any spec
+ in the specs list, it is downloaded and added to the mirror.
+ """
+ # Make sure nothing is in the way.
+ if os.path.isfile(path):
+ raise MirrorError("%s already exists and is a file." % path)
+
+ # automatically spec-ify anything in the specs array.
+ specs = [s if isinstance(s, Spec) else Spec(s) for s in specs]
+
+ # Get concrete specs for each matching version of these specs.
+ version_specs = get_matching_versions(
+ specs, num_versions=kwargs.get('num_versions', 0))
+ for s in version_specs:
+ s.concretize()
+
+ # Get the absolute path of the root before we start jumping around.
+ mirror_root = os.path.abspath(path)
+ if not os.path.isdir(mirror_root):
+ mkdirp(mirror_root)
+
+ # Things to keep track of while parsing specs.
+ present = []
+ mirrored = []
+ error = []
+
+ # Iterate through packages and download all the safe tarballs for each of them
+ for spec in version_specs:
+ pkg = spec.package
+
+ stage = None
+ try:
+ # create a subdirectory for the current package@version
+ subdir = join_path(mirror_root, pkg.name)
+ mkdirp(subdir)
+
+ archive_file = mirror_archive_filename(spec)
+ archive_path = join_path(subdir, archive_file)
+
+ if os.path.exists(archive_path):
+ tty.msg("Already added %s" % spec.format("$_$@"))
+ present.append(spec)
+ continue
+
+ # Set up a stage and a fetcher for the download
+ unique_fetch_name = spec.format("$_$@")
+ fetcher = fs.for_package_version(pkg, pkg.version)
+ stage = Stage(fetcher, name=unique_fetch_name)
+ fetcher.set_stage(stage)
+
+ # Do the fetch and checksum if necessary
+ fetcher.fetch()
+ if not kwargs.get('no_checksum', False):
+ fetcher.check()
+ tty.msg("Checksum passed for %s@%s" % (pkg.name, pkg.version))
+
+ # Fetchers have to know how to archive their files. Use
+ # that to move/copy/create an archive in the mirror.
+ fetcher.archive(archive_path)
+ tty.msg("Added %s." % spec.format("$_$@"))
+ mirrored.append(spec)
+
+ except Exception, e:
+ if spack.debug:
+ sys.excepthook(*sys.exc_info())
+ else:
+ tty.warn("Error while fetching %s." % spec.format('$_$@'), e.message)
+ error.append(spec)
+
+ finally:
+ if stage:
+ stage.destroy()
+
+ return (present, mirrored, error)
+
+
+class MirrorError(spack.error.SpackError):
+ """Superclass of all mirror-creation related errors."""
+ def __init__(self, msg, long_msg=None):
+ super(MirrorError, self).__init__(msg, long_msg)
diff --git a/lib/spack/spack/package.py b/lib/spack/spack/package.py
index 91419e3a3f..649e772a10 100644
--- a/lib/spack/spack/package.py
+++ b/lib/spack/spack/package.py
@@ -52,6 +52,7 @@ import spack.compilers
import spack.hooks
import spack.build_environment as build_env
import spack.url as url
+import spack.fetch_strategy as fs
from spack.version import *
from spack.stage import Stage
from spack.util.web import get_pages
@@ -114,13 +115,13 @@ class Package(object):
2. The class name, "Cmake". This is formed by converting `-` or
``_`` in the module name to camel case. If the name starts with
- a number, we prefix the class name with ``Num_``. Examples:
+ a number, we prefix the class name with ``_``. Examples:
Module Name Class Name
foo_bar FooBar
docbook-xml DocbookXml
FooBar Foobar
- 3proxy Num_3proxy
+ 3proxy _3proxy
The class name is what spack looks for when it loads a package module.
@@ -300,7 +301,7 @@ class Package(object):
# These variables are defaults for the various "relations".
#
"""Map of information about Versions of this package.
- Map goes: Version -> VersionDescriptor"""
+ Map goes: Version -> dict of attributes"""
versions = {}
"""Specs of dependency packages, keyed by name."""
@@ -337,7 +338,7 @@ class Package(object):
# Sanity check some required variables that could be
# overridden by package authors.
- def sanity_check_dict(attr_name):
+ def ensure_has_dict(attr_name):
if not hasattr(self, attr_name):
-                raise PackageError("Package %s must define %s" % attr_name)
+                raise PackageError("Package %s must define %s" % (self.name, attr_name))
@@ -345,10 +346,10 @@ class Package(object):
if not isinstance(attr, dict):
raise PackageError("Package %s has non-dict %s attribute!"
% (self.name, attr_name))
- sanity_check_dict('versions')
- sanity_check_dict('dependencies')
- sanity_check_dict('conflicted')
- sanity_check_dict('patches')
+ ensure_has_dict('versions')
+ ensure_has_dict('dependencies')
+ ensure_has_dict('conflicted')
+ ensure_has_dict('patches')
# Check versions in the versions dict.
for v in self.versions:
@@ -356,22 +357,23 @@ class Package(object):
# Check version descriptors
for v in sorted(self.versions):
- vdesc = self.versions[v]
- assert(isinstance(vdesc, spack.relations.VersionDescriptor))
+ assert(isinstance(self.versions[v], dict))
# Version-ize the keys in versions dict
try:
self.versions = dict((Version(v), h) for v,h in self.versions.items())
- except ValueError:
- raise ValueError("Keys of versions dict in package %s must be versions!"
- % self.name)
+ except ValueError, e:
+ raise ValueError("In package %s: %s" % (self.name, e.message))
# stage used to build this package.
self._stage = None
- # patch up self.url based on the actual version
- if self.spec.concrete:
- self.url = self.url_for_version(self.version)
+ # If there's no default URL provided, set this package's url to None
+ if not hasattr(self, 'url'):
+ self.url = None
+
+ # Init fetch strategy to None
+ self._fetcher = None
# Set a default list URL (place to find available versions)
if not hasattr(self, 'list_url'):
@@ -383,27 +385,100 @@ class Package(object):
@property
def version(self):
- if not self.spec.concrete:
- raise ValueError("Can only get version of concrete package.")
+ if not self.spec.versions.concrete:
+            raise ValueError("Can only get the version of a package with a concrete version.")
return self.spec.versions[0]
+ @memoized
+ def version_urls(self):
+ """Return a list of URLs for different versions of this
+ package, sorted by version. A version's URL only appears
+ in this list if it has an explicitly defined URL."""
+ version_urls = {}
+ for v in sorted(self.versions):
+ args = self.versions[v]
+ if 'url' in args:
+ version_urls[v] = args['url']
+ return version_urls
+
+
+ def nearest_url(self, version):
+ """Finds the URL for the next lowest version with a URL.
+ If there is no lower version with a URL, uses the
+ package url property. If that isn't there, uses a
+ *higher* URL, and if that isn't there raises an error.
+ """
+ version_urls = self.version_urls()
+ url = self.url
+
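+        # Walk the versions with explicit URLs, keeping the URL of the nearest
+        # version at or below the requested one (or a higher one as a fallback).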
+ for v in version_urls:
+ if v > version and url:
+ break
+ if version_urls[v]:
+ url = version_urls[v]
+ return url
+
+
+ def has_url(self):
+ """Returns whether there is a URL available for this package.
+ If there isn't, it's probably fetched some other way (version
+ control, etc.)"""
+ return self.url or self.version_urls()
+
+
+ # TODO: move this out of here and into some URL extrapolation module?
+ def url_for_version(self, version):
+ """Returns a URL that you can download a new version of this package from."""
+ if not isinstance(version, Version):
+ version = Version(version)
+
+ if not self.has_url():
+ raise NoURLError(self.__class__)
+
+ # If we have a specific URL for this version, don't extrapolate.
+ version_urls = self.version_urls()
+ if version in version_urls:
+ return version_urls[version]
+
+ # If we have no idea, try to substitute the version.
+ return url.substitute_version(self.nearest_url(version),
+ self.url_version(version))
+
+
@property
def stage(self):
if not self.spec.concrete:
raise ValueError("Can only get a stage for a concrete package.")
if self._stage is None:
- if not self.url:
- raise PackageVersionError(self.version)
+ self._stage = Stage(self.fetcher,
+ mirror_path=self.mirror_path(),
+ name=self.spec.short_spec)
+ return self._stage
- # TODO: move this logic into a mirror module.
- mirror_path = "%s/%s" % (self.name, "%s-%s.%s" % (
- self.name, self.version, extension(self.url)))
- self._stage = Stage(
- self.url, mirror_path=mirror_path, name=self.spec.short_spec)
- return self._stage
+ @property
+ def fetcher(self):
+ if not self.spec.versions.concrete:
+ raise ValueError(
+ "Can only get a fetcher for a package with concrete versions.")
+
+ if not self._fetcher:
+ self._fetcher = fs.for_package_version(self, self.version)
+ return self._fetcher
+
+
+ @fetcher.setter
+ def fetcher(self, f):
+ self._fetcher = f
+
+
+ def mirror_path(self):
+ """Get path to this package's archive in a mirror."""
+ filename = "%s-%s." % (self.name, self.version)
+ filename += extension(self.url) if self.has_url() else "tar.gz"
+ return "%s/%s" % (self.name, filename)
def preorder_traversal(self, visited=None, **kwargs):
@@ -524,48 +599,6 @@ class Package(object):
return str(version)
- def url_for_version(self, version):
- """Returns a URL that you can download a new version of this package from."""
- if not isinstance(version, Version):
- version = Version(version)
-
- def nearest_url(version):
- """Finds the URL for the next lowest version with a URL.
- If there is no lower version with a URL, uses the
- package url property. If that isn't there, uses a
- *higher* URL, and if that isn't there raises an error.
- """
- url = getattr(self, 'url', None)
- for v in sorted(self.versions):
- if v > version and url:
- break
- if self.versions[v].url:
- url = self.versions[v].url
- return url
-
- if version in self.versions:
- vdesc = self.versions[version]
- if not vdesc.url:
- base_url = nearest_url(version)
- vdesc.url = url.substitute_version(
- base_url, self.url_version(version))
- return vdesc.url
- else:
- base_url = nearest_url(version)
- return url.substitute_version(base_url, self.url_version(version))
-
-
- @property
- def default_url(self):
- if self.concrete:
- return self.url_for_version(self.version)
- else:
- url = getattr(self, 'url', None)
- if url:
- return url
-
-
-
def remove_prefix(self):
"""Removes the prefix for a package along with any empty parent directories."""
spack.install_layout.remove_path_for_spec(self.spec)
@@ -588,9 +621,7 @@ class Package(object):
self.stage.fetch()
if spack.do_checksum and self.version in self.versions:
- digest = self.versions[self.version].checksum
- self.stage.check(digest)
- tty.msg("Checksum passed for %s@%s" % (self.name, self.version))
+ self.stage.check()
def do_stage(self):
@@ -601,14 +632,13 @@ class Package(object):
self.do_fetch()
- archive_dir = self.stage.expanded_archive_path
+ archive_dir = self.stage.source_path
if not archive_dir:
- tty.msg("Staging archive: %s" % self.stage.archive_file)
self.stage.expand_archive()
- tty.msg("Created stage directory in %s." % self.stage.path)
+ tty.msg("Created stage in %s." % self.stage.path)
else:
tty.msg("Already staged %s in %s." % (self.name, self.stage.path))
- self.stage.chdir_to_archive()
+ self.stage.chdir_to_source()
def do_patch(self):
@@ -617,11 +647,17 @@ class Package(object):
if not self.spec.concrete:
raise ValueError("Can only patch concrete packages.")
+ # Kick off the stage first.
self.do_stage()
+ # If there are no patches, note it.
+ if not self.patches:
+ tty.msg("No patches needed for %s." % self.name)
+ return
+
# Construct paths to special files in the archive dir used to
# keep track of whether patches were successfully applied.
- archive_dir = self.stage.expanded_archive_path
+ archive_dir = self.stage.source_path
good_file = join_path(archive_dir, '.spack_patched')
bad_file = join_path(archive_dir, '.spack_patch_failed')
@@ -631,7 +667,7 @@ class Package(object):
tty.msg("Patching failed last time. Restaging.")
self.stage.restage()
- self.stage.chdir_to_archive()
+ self.stage.chdir_to_source()
# If this file exists, then we already applied all the patches.
if os.path.isfile(good_file):
@@ -791,19 +827,15 @@ class Package(object):
def do_clean(self):
-        if self.stage.expanded_archive_path:
-            self.stage.chdir_to_archive()
+        if self.stage.source_path:
+            self.stage.chdir_to_source()
self.clean()
def clean(self):
"""By default just runs make clean. Override if this isn't good."""
- try:
- # TODO: should we really call make clean, ro just blow away the directory?
- make = build_env.MakeExecutable('make', self.parallel)
- make('clean')
- tty.msg("Successfully cleaned %s" % self.name)
- except subprocess.CalledProcessError, e:
- tty.warn("Warning: 'make clean' didn't work. Consider 'spack clean --work'.")
+        # TODO: should we really call make clean, or just blow away the directory?
+ make = build_env.MakeExecutable('make', self.parallel)
+ make('clean')
def do_clean_work(self):
@@ -815,7 +847,6 @@ class Package(object):
"""Removes the stage directory where this package was built."""
if os.path.exists(self.stage.path):
self.stage.destroy()
- tty.msg("Successfully cleaned %s" % self.name)
def fetch_available_versions(self):
@@ -947,3 +978,10 @@ class VersionFetchError(PackageError):
super(VersionFetchError, self).__init__(
"Cannot fetch version for package %s " % cls.__name__ +
"because it does not define a default url.")
+
+
+class NoURLError(PackageError):
+ """Raised when someone tries to build a URL for a package with no URLs."""
+ def __init__(self, cls):
+ super(NoURLError, self).__init__(
+ "Package %s has no version with a URL." % cls.__name__)
diff --git a/lib/spack/spack/packages.py b/lib/spack/spack/packages.py
index 72f9403a64..047d82a93a 100644
--- a/lib/spack/spack/packages.py
+++ b/lib/spack/spack/packages.py
@@ -47,10 +47,10 @@ _package_file_name = 'package.py'
def _autospec(function):
"""Decorator that automatically converts the argument of a single-arg
function to a Spec."""
- def converter(self, spec_like):
+ def converter(self, spec_like, **kwargs):
if not isinstance(spec_like, spack.spec.Spec):
spec_like = spack.spec.Spec(spec_like)
- return function(self, spec_like)
+ return function(self, spec_like, **kwargs)
return converter
@@ -63,10 +63,14 @@ class PackageDB(object):
@_autospec
- def get(self, spec):
+ def get(self, spec, **kwargs):
if spec.virtual:
raise UnknownPackageError(spec.name)
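+        # new=True forces re-creation of the package by dropping any cached instance.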
+ if kwargs.get('new', False):
+ if spec in self.instances:
+ del self.instances[spec]
+
if not spec in self.instances:
package_class = self.get_class_for_package_name(spec.name)
try:
@@ -78,6 +82,17 @@ class PackageDB(object):
@_autospec
+ def delete(self, spec):
+ """Force a package to be recreated."""
+ del self.instances[spec]
+
+
+ def purge(self):
+ """Clear entire package instance cache."""
+ self.instances.clear()
+
+
+ @_autospec
def get_installed(self, spec):
"""Get all the installed specs that satisfy the provided spec constraint."""
return [s for s in self.installed_package_specs() if s.satisfies(spec)]
diff --git a/lib/spack/spack/patch.py b/lib/spack/spack/patch.py
index 42d49b15e5..b1b6e07738 100644
--- a/lib/spack/spack/patch.py
+++ b/lib/spack/spack/patch.py
@@ -64,7 +64,7 @@ class Patch(object):
"""Fetch this patch, if necessary, and apply it to the source
code in the supplied stage.
"""
- stage.chdir_to_archive()
+ stage.chdir_to_source()
patch_stage = None
try:
diff --git a/lib/spack/spack/relations.py b/lib/spack/spack/relations.py
index 5afb7e7624..b1f4348945 100644
--- a/lib/spack/spack/relations.py
+++ b/lib/spack/spack/relations.py
@@ -79,32 +79,28 @@ import spack
import spack.spec
import spack.error
import spack.url
-
from spack.version import Version
from spack.patch import Patch
from spack.spec import Spec, parse_anonymous_spec
-class VersionDescriptor(object):
- """A VersionDescriptor contains information to describe a
- particular version of a package. That currently includes a URL
- for the version along with a checksum."""
- def __init__(self, checksum, url):
- self.checksum = checksum
- self.url = url
-
-def version(ver, checksum, **kwargs):
- """Adds a version and associated metadata to the package."""
+def version(ver, checksum=None, **kwargs):
+ """Adds a version and metadata describing how to fetch it.
+ Metadata is just stored as a dict in the package's versions
+ dictionary. Package must turn it into a valid fetch strategy
+ later.
+ """
pkg = caller_locals()
-
versions = pkg.setdefault('versions', {})
- patches = pkg.setdefault('patches', {})
- ver = Version(ver)
- url = kwargs.get('url', None)
+ # special case checksum for backward compatibility
+ if checksum:
+ kwargs['md5'] = checksum
- versions[ver] = VersionDescriptor(checksum, url)
+ # Store the kwargs for the package to use later when constructing
+ # a fetch strategy.
+ versions[Version(ver)] = kwargs
def depends_on(*specs):
diff --git a/lib/spack/spack/spec.py b/lib/spack/spack/spec.py
index 4838fd9946..a0ab38c049 100644
--- a/lib/spack/spack/spec.py
+++ b/lib/spack/spack/spec.py
@@ -1619,7 +1619,7 @@ class UnsatisfiableSpecError(SpecError):
class UnsatisfiableSpecNameError(UnsatisfiableSpecError):
"""Raised when two specs aren't even for the same package."""
def __init__(self, provided, required):
- super(UnsatisfiableVersionSpecError, self).__init__(
+ super(UnsatisfiableSpecNameError, self).__init__(
provided, required, "name")
diff --git a/lib/spack/spack/stage.py b/lib/spack/spack/stage.py
index e2e156e916..b371761785 100644
--- a/lib/spack/spack/stage.py
+++ b/lib/spack/spack/stage.py
@@ -32,18 +32,19 @@ from llnl.util.filesystem import *
import spack
import spack.config
+import spack.fetch_strategy as fs
import spack.error
-import spack.util.crypto as crypto
-from spack.util.compression import decompressor_for
STAGE_PREFIX = 'spack-stage-'
class Stage(object):
- """A Stage object manaages a directory where an archive is downloaded,
- expanded, and built before being installed. It also handles downloading
- the archive. A stage's lifecycle looks like this:
+ """A Stage object manaages a directory where some source code is
+ downloaded and built before being installed. It handles
+ fetching the source code, either as an archive to be expanded
+ or by checking it out of a repository. A stage's lifecycle
+ looks like this:
Stage()
Constructor creates the stage directory.
@@ -68,21 +69,31 @@ class Stage(object):
similar, and are intended to persist for only one run of spack.
"""
- def __init__(self, url, **kwargs):
+ def __init__(self, url_or_fetch_strategy, **kwargs):
"""Create a stage object.
Parameters:
- url URL of the archive to be downloaded into this stage.
-
- name If a name is provided, then this stage is a named stage
- and will persist between runs (or if you construct another
- stage object later). If name is not provided, then this
- stage will be given a unique name automatically.
+ url_or_fetch_strategy
+ URL of the archive to be downloaded into this stage, OR
+ a valid FetchStrategy.
+
+ name
+ If a name is provided, then this stage is a named stage
+ and will persist between runs (or if you construct another
+ stage object later). If name is not provided, then this
+ stage will be given a unique name automatically.
"""
+ if isinstance(url_or_fetch_strategy, basestring):
+ self.fetcher = fs.from_url(url_or_fetch_strategy)
+ elif isinstance(url_or_fetch_strategy, fs.FetchStrategy):
+ self.fetcher = url_or_fetch_strategy
+ else:
+ raise ValueError("Can't construct Stage without url or fetch strategy")
+
+ self.fetcher.set_stage(self)
self.name = kwargs.get('name')
self.mirror_path = kwargs.get('mirror_path')
self.tmp_root = find_tmp_root()
- self.url = url
self.path = None
self._setup()
@@ -187,7 +198,10 @@ class Stage(object):
@property
def archive_file(self):
"""Path to the source archive within this stage directory."""
- paths = [os.path.join(self.path, os.path.basename(self.url))]
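+        # Only URL-based fetches have an archive file; VCS checkouts return None.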
+ if not isinstance(self.fetcher, fs.URLFetchStrategy):
+ return None
+
+ paths = [os.path.join(self.path, os.path.basename(self.fetcher.url))]
if self.mirror_path:
paths.append(os.path.join(self.path, os.path.basename(self.mirror_path)))
@@ -198,17 +212,17 @@ class Stage(object):
@property
- def expanded_archive_path(self):
- """Returns the path to the expanded archive directory if it's expanded;
- None if the archive hasn't been expanded.
- """
- if not self.archive_file:
- return None
+ def source_path(self):
+ """Returns the path to the expanded/checked out source code
+ within this fetch strategy's path.
- for file in os.listdir(self.path):
- archive_path = join_path(self.path, file)
- if os.path.isdir(archive_path):
- return archive_path
+           This assumes nothing else is going to be put in the
+ FetchStrategy's path. It searches for the first
+ subdirectory of the path it can find, then returns that.
+ """
+ for p in [os.path.join(self.path, f) for f in os.listdir(self.path)]:
+ if os.path.isdir(p):
+ return p
return None
@@ -220,76 +234,37 @@ class Stage(object):
tty.die("Setup failed: no such directory: " + self.path)
- def fetch_from_url(self, url):
- # Run curl but grab the mime type from the http headers
- headers = spack.curl('-#', # status bar
- '-O', # save file to disk
- '-f', # fail on >400 errors
- '-D', '-', # print out HTML headers
- '-L', url,
- return_output=True, fail_on_error=False)
-
- if spack.curl.returncode != 0:
- # clean up archive on failure.
- if self.archive_file:
- os.remove(self.archive_file)
-
- if spack.curl.returncode == 22:
- # This is a 404. Curl will print the error.
- raise FailedDownloadError(url)
-
- if spack.curl.returncode == 60:
- # This is a certificate error. Suggest spack -k
- raise FailedDownloadError(
- url,
- "Curl was unable to fetch due to invalid certificate. "
- "This is either an attack, or your cluster's SSL configuration "
- "is bad. If you believe your SSL configuration is bad, you "
- "can try running spack -k, which will not check SSL certificates."
- "Use this at your own risk.")
-
- # Check if we somehow got an HTML file rather than the archive we
- # asked for. We only look at the last content type, to handle
- # redirects properly.
- content_types = re.findall(r'Content-Type:[^\r\n]+', headers)
- if content_types and 'text/html' in content_types[-1]:
- tty.warn("The contents of " + self.archive_file + " look like HTML.",
- "The checksum will likely be bad. If it is, you can use",
- "'spack clean --dist' to remove the bad archive, then fix",
- "your internet gateway issue and install again.")
-
-
def fetch(self):
- """Downloads the file at URL to the stage. Returns true if it was downloaded,
- false if it already existed."""
+ """Downloads an archive or checks out code from a repository."""
self.chdir()
- if self.archive_file:
- tty.msg("Already downloaded %s." % self.archive_file)
- else:
- urls = [self.url]
- if self.mirror_path:
- urls = ["%s/%s" % (m, self.mirror_path) for m in _get_mirrors()] + urls
+ fetchers = [self.fetcher]
- for url in urls:
- tty.msg("Trying to fetch from %s" % url)
- self.fetch_from_url(url)
- if self.archive_file:
- break
+ # TODO: move mirror logic out of here and clean it up!
+ if self.mirror_path:
+ urls = ["%s/%s" % (m, self.mirror_path) for m in _get_mirrors()]
- if not self.archive_file:
- raise FailedDownloadError(url)
+ digest = None
+ if isinstance(self.fetcher, fs.URLFetchStrategy):
+ digest = self.fetcher.digest
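+            # Mirror fetchers go first so mirrors are tried before the original source.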
+ fetchers = [fs.URLFetchStrategy(url, digest)
+ for url in urls] + fetchers
+ for f in fetchers:
+ f.set_stage(self)
- return self.archive_file
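+        # Try each fetcher in turn; stop at the first one that succeeds.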
+ for fetcher in fetchers:
+ try:
+ fetcher.fetch()
+ break
+ except spack.error.SpackError, e:
+ tty.msg("Fetching %s failed." % fetcher)
+ continue
- def check(self, digest):
- """Check the downloaded archive against a checksum digest"""
- checker = crypto.Checker(digest)
- if not checker.check(self.archive_file):
- raise ChecksumError(
- "%s checksum failed for %s." % (checker.hash_name, self.archive_file),
- "Expected %s but got %s." % (digest, checker.sum))
+ def check(self):
+ """Check the downloaded archive against a checksum digest.
+ No-op if this stage checks code out of a repository."""
+ self.fetcher.check()
def expand_archive(self):
@@ -297,19 +272,14 @@ class Stage(object):
archive. Fail if the stage is not set up or if the archive is not yet
downloaded.
"""
- self.chdir()
- if not self.archive_file:
- tty.die("Attempt to expand archive before fetching.")
-
- decompress = decompressor_for(self.archive_file)
- decompress(self.archive_file)
+ self.fetcher.expand()
- def chdir_to_archive(self):
+ def chdir_to_source(self):
"""Changes directory to the expanded archive directory.
Dies with an error if there was no expanded archive.
"""
- path = self.expanded_archive_path
+ path = self.source_path
if not path:
tty.die("Attempt to chdir before expanding archive.")
else:
@@ -322,12 +292,7 @@ class Stage(object):
"""Removes the expanded archive path if it exists, then re-expands
the archive.
"""
- if not self.archive_file:
- tty.die("Attempt to restage when not staged.")
-
- if self.expanded_archive_path:
- shutil.rmtree(self.expanded_archive_path, True)
- self.expand_archive()
+ self.fetcher.reset()
def destroy(self):
@@ -398,15 +363,20 @@ def find_tmp_root():
return None
-class FailedDownloadError(spack.error.SpackError):
- """Raised wen a download fails."""
- def __init__(self, url, msg=""):
- super(FailedDownloadError, self).__init__(
- "Failed to fetch file from URL: %s" % url, msg)
- self.url = url
+class StageError(spack.error.SpackError):
+ def __init__(self, message, long_message=None):
+        super(StageError, self).__init__(message, long_message)
+
+
+class RestageError(StageError):
+ def __init__(self, message, long_msg=None):
+ super(RestageError, self).__init__(message, long_msg)
+
+
+class ChdirError(StageError):
+ def __init__(self, message, long_msg=None):
+ super(ChdirError, self).__init__(message, long_msg)
-class ChecksumError(spack.error.SpackError):
- """Raised when archive fails to checksum."""
- def __init__(self, message, long_msg):
- super(ChecksumError, self).__init__(message, long_msg)
+# Keep this in namespace for convenience
+FailedDownloadError = fs.FailedDownloadError
diff --git a/lib/spack/spack/test/__init__.py b/lib/spack/spack/test/__init__.py
index e245d30f39..9eae3261c2 100644
--- a/lib/spack/spack/test/__init__.py
+++ b/lib/spack/spack/test/__init__.py
@@ -48,7 +48,12 @@ test_names = ['versions',
'package_sanity',
'config',
'directory_layout',
- 'python_version']
+ 'python_version',
+ 'git_fetch',
+ 'svn_fetch',
+ 'hg_fetch',
+ 'mirror',
+ 'url_extrapolate']
def list_tests():
diff --git a/lib/spack/spack/test/git_fetch.py b/lib/spack/spack/test/git_fetch.py
new file mode 100644
index 0000000000..f6d9bfcf05
--- /dev/null
+++ b/lib/spack/spack/test/git_fetch.py
@@ -0,0 +1,133 @@
+##############################################################################
+# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
+# Produced at the Lawrence Livermore National Laboratory.
+#
+# This file is part of Spack.
+# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
+# LLNL-CODE-647188
+#
+# For details, see https://scalability-llnl.github.io/spack
+# Please also see the LICENSE file for our notice and the LGPL.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License (as published by
+# the Free Software Foundation) version 2.1 dated February 1999.
+#
+# This program is distributed in the hope that it will be useful, but
+# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
+# conditions of the GNU General Public License for more details.
+#
+# You should have received a copy of the GNU Lesser General Public License
+# along with this program; if not, write to the Free Software Foundation,
+# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
+##############################################################################
+import os
+import unittest
+import shutil
+import tempfile
+from contextlib import closing
+
+from llnl.util.filesystem import *
+
+import spack
+from spack.version import ver
+from spack.stage import Stage
+from spack.util.executable import which
+
+from spack.test.mock_packages_test import *
+from spack.test.mock_repo import MockGitRepo
+
+
+class GitFetchTest(MockPackagesTest):
+ """Tests fetching from a dummy git repository."""
+
+ def setUp(self):
+ """Create a git repository with master and two other branches,
+ and one tag, so that we can experiment on it."""
+ super(GitFetchTest, self).setUp()
+
+ self.repo = MockGitRepo()
+
+ spec = Spec('git-test')
+ spec.concretize()
+ self.pkg = spack.db.get(spec, new=True)
+
+
+ def tearDown(self):
+ """Destroy the stage space used by this test."""
+ super(GitFetchTest, self).tearDown()
+
+ if self.repo.stage is not None:
+ self.repo.stage.destroy()
+
+ self.pkg.do_clean_dist()
+
+
+ def assert_rev(self, rev):
+ """Check that the current git revision is equal to the supplied rev."""
+ self.assertEqual(self.repo.rev_hash('HEAD'), self.repo.rev_hash(rev))
+
+
+ def try_fetch(self, rev, test_file, args):
+ """Tries to:
+ 1. Fetch the repo using a fetch strategy constructed with
+ supplied args.
+ 2. Check if the test_file is in the checked out repository.
+ 3. Assert that the repository is at the revision supplied.
+ 4. Add and remove some files, then reset the repo, and
+ ensure it's all there again.
+ """
+ self.pkg.versions[ver('git')] = args
+
+ self.pkg.do_stage()
+ self.assert_rev(rev)
+
+ file_path = join_path(self.pkg.stage.source_path, test_file)
+ self.assertTrue(os.path.isdir(self.pkg.stage.source_path))
+ self.assertTrue(os.path.isfile(file_path))
+
+ os.unlink(file_path)
+ self.assertFalse(os.path.isfile(file_path))
+
+ untracked_file = 'foobarbaz'
+ touch(untracked_file)
+ self.assertTrue(os.path.isfile(untracked_file))
+ self.pkg.do_clean_work()
+ self.assertFalse(os.path.isfile(untracked_file))
+
+ self.assertTrue(os.path.isdir(self.pkg.stage.source_path))
+ self.assertTrue(os.path.isfile(file_path))
+
+ self.assert_rev(rev)
+
+
+ def test_fetch_master(self):
+ """Test a default git checkout with no commit or tag specified."""
+ self.try_fetch('master', self.repo.r0_file, {
+ 'git' : self.repo.path
+ })
+
+
+ def ztest_fetch_branch(self):
+ """Test fetching a branch."""
+ self.try_fetch(self.repo.branch, self.repo.branch_file, {
+ 'git' : self.repo.path,
+ 'branch' : self.repo.branch
+ })
+
+
+ def ztest_fetch_tag(self):
+ """Test fetching a tag."""
+ self.try_fetch(self.repo.tag, self.repo.tag_file, {
+ 'git' : self.repo.path,
+ 'tag' : self.repo.tag
+ })
+
+
+ def ztest_fetch_commit(self):
+ """Test fetching a particular commit."""
+ self.try_fetch(self.repo.r1, self.repo.r1_file, {
+ 'git' : self.repo.path,
+ 'commit' : self.repo.r1
+ })
diff --git a/lib/spack/spack/test/hg_fetch.py b/lib/spack/spack/test/hg_fetch.py
new file mode 100644
index 0000000000..97c5b665e7
--- /dev/null
+++ b/lib/spack/spack/test/hg_fetch.py
@@ -0,0 +1,111 @@
+##############################################################################
+# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
+# Produced at the Lawrence Livermore National Laboratory.
+#
+# This file is part of Spack.
+# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
+# LLNL-CODE-647188
+#
+# For details, see https://scalability-llnl.github.io/spack
+# Please also see the LICENSE file for our notice and the LGPL.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License (as published by
+# the Free Software Foundation) version 2.1 dated February 1999.
+#
+# This program is distributed in the hope that it will be useful, but
+# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
+# conditions of the GNU General Public License for more details.
+#
+# You should have received a copy of the GNU Lesser General Public License
+# along with this program; if not, write to the Free Software Foundation,
+# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
+##############################################################################
+import os
+import unittest
+import shutil
+import tempfile
+from contextlib import closing
+
+from llnl.util.filesystem import *
+
+import spack
+from spack.version import ver
+from spack.stage import Stage
+from spack.util.executable import which
+from spack.test.mock_packages_test import *
+from spack.test.mock_repo import MockHgRepo
+
+
+class HgFetchTest(MockPackagesTest):
+ """Tests fetching from a dummy hg repository."""
+
+ def setUp(self):
+ """Create a hg repository with master and two other branches,
+ and one tag, so that we can experiment on it."""
+ super(HgFetchTest, self).setUp()
+
+ self.repo = MockHgRepo()
+
+ spec = Spec('hg-test')
+ spec.concretize()
+ self.pkg = spack.db.get(spec, new=True)
+
+
+ def tearDown(self):
+ """Destroy the stage space used by this test."""
+ super(HgFetchTest, self).tearDown()
+
+ if self.repo.stage is not None:
+ self.repo.stage.destroy()
+
+ self.pkg.do_clean_dist()
+
+
+ def try_fetch(self, rev, test_file, args):
+ """Tries to:
+ 1. Fetch the repo using a fetch strategy constructed with
+ supplied args.
+ 2. Check if the test_file is in the checked out repository.
+ 3. Assert that the repository is at the revision supplied.
+ 4. Add and remove some files, then reset the repo, and
+ ensure it's all there again.
+ """
+ self.pkg.versions[ver('hg')] = args
+
+ self.pkg.do_stage()
+ self.assertEqual(self.repo.get_rev(), rev)
+
+ file_path = join_path(self.pkg.stage.source_path, test_file)
+ self.assertTrue(os.path.isdir(self.pkg.stage.source_path))
+ self.assertTrue(os.path.isfile(file_path))
+
+ os.unlink(file_path)
+ self.assertFalse(os.path.isfile(file_path))
+
+ untracked = 'foobarbaz'
+ touch(untracked)
+ self.assertTrue(os.path.isfile(untracked))
+ self.pkg.do_clean_work()
+ self.assertFalse(os.path.isfile(untracked))
+
+ self.assertTrue(os.path.isdir(self.pkg.stage.source_path))
+ self.assertTrue(os.path.isfile(file_path))
+
+ self.assertEqual(self.repo.get_rev(), rev)
+
+
+ def test_fetch_default(self):
+ """Test a default hg checkout with no commit or tag specified."""
+ self.try_fetch(self.repo.r1, self.repo.r1_file, {
+ 'hg' : self.repo.path
+ })
+
+
+ def test_fetch_rev0(self):
+ """Test fetching a branch."""
+ self.try_fetch(self.repo.r0, self.repo.r0_file, {
+ 'hg' : self.repo.path,
+ 'revision' : self.repo.r0
+ })
diff --git a/lib/spack/spack/test/install.py b/lib/spack/spack/test/install.py
index 8047ab92e3..e052f53e77 100644
--- a/lib/spack/spack/test/install.py
+++ b/lib/spack/spack/test/install.py
@@ -32,42 +32,21 @@ from llnl.util.filesystem import *
import spack
from spack.stage import Stage
+from spack.fetch_strategy import URLFetchStrategy
from spack.directory_layout import SpecHashDirectoryLayout
from spack.util.executable import which
from spack.test.mock_packages_test import *
+from spack.test.mock_repo import MockArchive
-dir_name = 'trivial-1.0'
-archive_name = 'trivial-1.0.tar.gz'
-install_test_package = 'trivial_install_test_package'
-
class InstallTest(MockPackagesTest):
"""Tests install and uninstall on a trivial package."""
def setUp(self):
super(InstallTest, self).setUp()
- self.stage = Stage('not_a_real_url')
- archive_dir = join_path(self.stage.path, dir_name)
- dummy_configure = join_path(archive_dir, 'configure')
-
- mkdirp(archive_dir)
- with closing(open(dummy_configure, 'w')) as configure:
- configure.write(
- "#!/bin/sh\n"
- "prefix=$(echo $1 | sed 's/--prefix=//')\n"
- "cat > Makefile <<EOF\n"
- "all:\n"
- "\techo Building...\n\n"
- "install:\n"
- "\tmkdir -p $prefix\n"
- "\ttouch $prefix/dummy_file\n"
- "EOF\n")
- os.chmod(dummy_configure, 0755)
-
- with working_dir(self.stage.path):
- tar = which('tar')
- tar('-czf', archive_name, dir_name)
+ # create a simple installable package directory and tarball
+ self.repo = MockArchive()
# We use a fake package, so skip the checksum.
spack.do_checksum = False
@@ -82,8 +61,8 @@ class InstallTest(MockPackagesTest):
def tearDown(self):
super(InstallTest, self).tearDown()
- if self.stage is not None:
- self.stage.destroy()
+ if self.repo.stage is not None:
+ self.repo.stage.destroy()
# Turn checksumming back on
spack.do_checksum = True
@@ -95,7 +74,7 @@ class InstallTest(MockPackagesTest):
def test_install_and_uninstall(self):
# Get a basic concrete spec for the trivial install package.
- spec = Spec(install_test_package)
+ spec = Spec('trivial_install_test_package')
spec.concretize()
self.assertTrue(spec.concrete)
@@ -103,8 +82,7 @@ class InstallTest(MockPackagesTest):
pkg = spack.db.get(spec)
# Fake the URL for the package so it downloads from a file.
- archive_path = join_path(self.stage.path, archive_name)
- pkg.url = 'file://' + archive_path
+ pkg.fetcher = URLFetchStrategy(self.repo.url)
try:
pkg.do_install()
diff --git a/lib/spack/spack/test/mirror.py b/lib/spack/spack/test/mirror.py
new file mode 100644
index 0000000000..51334198ec
--- /dev/null
+++ b/lib/spack/spack/test/mirror.py
@@ -0,0 +1,156 @@
+##############################################################################
+# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
+# Produced at the Lawrence Livermore National Laboratory.
+#
+# This file is part of Spack.
+# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
+# LLNL-CODE-647188
+#
+# For details, see https://scalability-llnl.github.io/spack
+# Please also see the LICENSE file for our notice and the LGPL.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License (as published by
+# the Free Software Foundation) version 2.1 dated February 1999.
+#
+# This program is distributed in the hope that it will be useful, but
+# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
+# conditions of the GNU General Public License for more details.
+#
+# You should have received a copy of the GNU Lesser General Public License
+# along with this program; if not, write to the Free Software Foundation,
+# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
+##############################################################################
+import os
+from filecmp import dircmp
+
+import spack
+import spack.mirror
+from spack.util.compression import decompressor_for
+from spack.test.mock_packages_test import *
+from spack.test.mock_repo import *
+
+# paths in repos that shouldn't be in the mirror tarballs.
+exclude = ['.hg', '.git', '.svn']
+
+
+class MirrorTest(MockPackagesTest):
+ def setUp(self):
+ """Sets up a mock package and a mock repo for each fetch strategy, to
+ ensure that the mirror can create archives for each of them.
+ """
+ super(MirrorTest, self).setUp()
+ self.repos = {}
+
+
+ def set_up_package(self, name, mock_repo_class, url_attr):
+ """Use this to set up a mock package to be mirrored.
+ Each package needs us to:
+ 1. Set up a mock repo/archive to fetch from.
+ 2. Point the package's version args at that repo.
+ """
+ # Set up packages to point at mock repos.
+ spec = Spec(name)
+ spec.concretize()
+
+ # Get the package and fix its fetch args to point to a mock repo
+ pkg = spack.db.get(spec)
+ repo = mock_repo_class()
+ self.repos[name] = repo
+
+ # change the fetch args of the first (only) version.
+ assert(len(pkg.versions) == 1)
+ v = next(iter(pkg.versions))
+ pkg.versions[v][url_attr] = repo.url
+
+
+ def tearDown(self):
+ """Destroy all the stages created by the repos in setup."""
+ super(MirrorTest, self).tearDown()
+
+ for name, repo in self.repos.items():
+ if repo.stage:
+ repo.stage.destroy()
+
+ self.repos.clear()
+
+
+ def check_mirror(self):
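+        """Create a mirror from self.repos and check that each archive
+           in it expands to match its original repository."""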
+ stage = Stage('spack-mirror-test')
+ mirror_root = join_path(stage.path, 'test-mirror')
+
+ try:
+ os.chdir(stage.path)
+ spack.mirror.create(
+ mirror_root, self.repos, no_checksum=True)
+
+ # Stage directory exists
+ self.assertTrue(os.path.isdir(mirror_root))
+
+ # subdirs for each package
+ for name in self.repos:
+ subdir = join_path(mirror_root, name)
+ self.assertTrue(os.path.isdir(subdir))
+
+ files = os.listdir(subdir)
+ self.assertEqual(len(files), 1)
+
+ # Decompress archive in the mirror
+ archive = files[0]
+ archive_path = join_path(subdir, archive)
+ decomp = decompressor_for(archive_path)
+
+ with working_dir(subdir):
+ decomp(archive_path)
+
+ # Find the untarred archive directory.
+ files = os.listdir(subdir)
+ self.assertEqual(len(files), 2)
+ self.assertTrue(archive in files)
+ files.remove(archive)
+
+ expanded_archive = join_path(subdir, files[0])
+ self.assertTrue(os.path.isdir(expanded_archive))
+
+ # Compare the original repo with the expanded archive
+ repo = self.repos[name]
+ if not 'svn' in name:
+ original_path = repo.path
+ else:
+ co = 'checked_out'
+ svn('checkout', repo.url, co)
+ original_path = join_path(subdir, co)
+
+ dcmp = dircmp(original_path, expanded_archive)
+
+ # make sure there are no new files in the expanded tarball
+ self.assertFalse(dcmp.right_only)
+ self.assertTrue(all(l in exclude for l in dcmp.left_only))
+
+ finally:
+ stage.destroy()
+
+
+ def test_git_mirror(self):
+ self.set_up_package('git-test', MockGitRepo, 'git')
+ self.check_mirror()
+
+ def test_svn_mirror(self):
+ self.set_up_package('svn-test', MockSvnRepo, 'svn')
+ self.check_mirror()
+
+ def test_hg_mirror(self):
+ self.set_up_package('hg-test', MockHgRepo, 'hg')
+ self.check_mirror()
+
+ def test_url_mirror(self):
+ self.set_up_package('trivial_install_test_package', MockArchive, 'url')
+ self.check_mirror()
+
+ def test_all_mirror(self):
+ self.set_up_package('git-test', MockGitRepo, 'git')
+ self.set_up_package('svn-test', MockSvnRepo, 'svn')
+ self.set_up_package('hg-test', MockHgRepo, 'hg')
+ self.set_up_package('trivial_install_test_package', MockArchive, 'url')
+ self.check_mirror()
diff --git a/lib/spack/spack/test/mock_repo.py b/lib/spack/spack/test/mock_repo.py
new file mode 100644
index 0000000000..659f29067a
--- /dev/null
+++ b/lib/spack/spack/test/mock_repo.py
@@ -0,0 +1,197 @@
+##############################################################################
+# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
+# Produced at the Lawrence Livermore National Laboratory.
+#
+# This file is part of Spack.
+# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
+# LLNL-CODE-647188
+#
+# For details, see https://scalability-llnl.github.io/spack
+# Please also see the LICENSE file for our notice and the LGPL.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License (as published by
+# the Free Software Foundation) version 2.1 dated February 1999.
+#
+# This program is distributed in the hope that it will be useful, but
+# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
+# conditions of the GNU General Public License for more details.
+#
+# You should have received a copy of the GNU Lesser General Public License
+# along with this program; if not, write to the Free Software Foundation,
+# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
+##############################################################################
+import os
+import shutil
+from contextlib import closing
+
+from llnl.util.filesystem import *
+
+import spack
+from spack.version import ver
+from spack.stage import Stage
+from spack.util.executable import which
+
+
+#
+# VCS Systems used by mock repo code.
+#
+git = which('git', required=True)
+svn = which('svn', required=True)
+svnadmin = which('svnadmin', required=True)
+hg = which('hg', required=True)
+tar = which('tar', required=True)
+
+
+class MockRepo(object):
+ def __init__(self, stage_name, repo_name):
+ """This creates a stage where some archive/repo files can be staged
+ for testing spack's fetch strategies."""
+ # Stage where this repo has been created
+ self.stage = Stage(stage_name)
+
+ # Full path to the repo within the stage.
+ self.path = join_path(self.stage.path, repo_name)
+ mkdirp(self.path)
+
+
+class MockArchive(MockRepo):
+ """Creates a very simple archive directory with a configure script and a
+ makefile that installs to a prefix. Tars it up into an archive."""
+
+ def __init__(self):
+ repo_name = 'mock-archive-repo'
+ super(MockArchive, self).__init__('mock-archive-stage', repo_name)
+
+ with working_dir(self.path):
+ configure = join_path(self.path, 'configure')
+
+ with closing(open(configure, 'w')) as cfg_file:
+ cfg_file.write(
+ "#!/bin/sh\n"
+ "prefix=$(echo $1 | sed 's/--prefix=//')\n"
+ "cat > Makefile <<EOF\n"
+ "all:\n"
+ "\techo Building...\n\n"
+ "install:\n"
+ "\tmkdir -p $prefix\n"
+ "\ttouch $prefix/dummy_file\n"
+ "EOF\n")
+ os.chmod(configure, 0755)
+
+ with working_dir(self.stage.path):
+ archive_name = "%s.tar.gz" % repo_name
+ tar('-czf', archive_name, repo_name)
+
+ self.archive_path = join_path(self.stage.path, archive_name)
+ self.url = 'file://' + self.archive_path
+
+
+class MockVCSRepo(MockRepo):
+ def __init__(self, stage_name, repo_name):
+ """This creates a stage and a repo directory within the stage."""
+ super(MockVCSRepo, self).__init__(stage_name, repo_name)
+
+        # Names for the files to be created at rev0 and rev1.
+ self.r0_file = 'r0_file'
+ self.r1_file = 'r1_file'
+
+
+class MockGitRepo(MockVCSRepo):
+ def __init__(self):
+ super(MockGitRepo, self).__init__('mock-git-stage', 'mock-git-repo')
+
+ with working_dir(self.path):
+ git('init')
+
+ # r0 is just the first commit
+ touch(self.r0_file)
+ git('add', self.r0_file)
+ git('commit', '-m', 'mock-git-repo r0')
+
+ self.branch = 'test-branch'
+ self.branch_file = 'branch_file'
+ git('branch', self.branch)
+
+ self.tag_branch = 'tag-branch'
+ self.tag_file = 'tag_file'
+ git('branch', self.tag_branch)
+
+ # Check out first branch
+ git('checkout', self.branch)
+ touch(self.branch_file)
+ git('add', self.branch_file)
+            git('commit', '-m', 'r1 test branch')
+
+ # Check out a second branch and tag it
+ git('checkout', self.tag_branch)
+ touch(self.tag_file)
+ git('add', self.tag_file)
+            git('commit', '-m', 'tag test branch')
+
+ self.tag = 'test-tag'
+ git('tag', self.tag)
+
+ git('checkout', 'master')
+
+            # r1 is the same revision as the tip of the test branch.
+ self.r1 = self.rev_hash(self.branch)
+ self.r1_file = self.branch_file
+
+ self.url = self.path
+
+ def rev_hash(self, rev):
+ return git('rev-parse', rev, return_output=True).strip()
+
+
+class MockSvnRepo(MockVCSRepo):
+ def __init__(self):
+ super(MockSvnRepo, self).__init__('mock-svn-stage', 'mock-svn-repo')
+
+ self.url = 'file://' + self.path
+
+ with working_dir(self.stage.path):
+ svnadmin('create', self.path)
+
+ tmp_path = join_path(self.stage.path, 'tmp-path')
+ mkdirp(tmp_path)
+ with working_dir(tmp_path):
+ touch(self.r0_file)
+
+ svn('import', tmp_path, self.url, '-m', 'Initial import r0')
+
+ shutil.rmtree(tmp_path)
+ svn('checkout', self.url, tmp_path)
+ with working_dir(tmp_path):
+ touch(self.r1_file)
+ svn('add', self.r1_file)
+ svn('ci', '-m', 'second revision r1')
+
+ shutil.rmtree(tmp_path)
+
+ self.r0 = '1'
+ self.r1 = '2'
+
+
+class MockHgRepo(MockVCSRepo):
+ def __init__(self):
+ super(MockHgRepo, self).__init__('mock-hg-stage', 'mock-hg-repo')
+ self.url = 'file://' + self.path
+
+ with working_dir(self.path):
+ hg('init')
+
+ touch(self.r0_file)
+ hg('add', self.r0_file)
+ hg('commit', '-m', 'revision 0', '-u', 'test')
+ self.r0 = self.get_rev()
+
+ touch(self.r1_file)
+ hg('add', self.r1_file)
+            hg('commit', '-m', 'revision 1', '-u', 'test')
+ self.r1 = self.get_rev()
+
+ def get_rev(self):
+ """Get current mercurial revision."""
+ return hg('id', '-i', return_output=True).strip()
diff --git a/lib/spack/spack/test/package_sanity.py b/lib/spack/spack/test/package_sanity.py
index e3de695070..6222e7b5f8 100644
--- a/lib/spack/spack/test/package_sanity.py
+++ b/lib/spack/spack/test/package_sanity.py
@@ -56,8 +56,8 @@ class PackageSanityTest(unittest.TestCase):
def test_url_versions(self):
"""Check URLs for regular packages, if they are explicitly defined."""
for pkg in spack.db.all_packages():
- for v, vdesc in pkg.versions.items():
- if vdesc.url:
+ for v, vattrs in pkg.versions.items():
+ if 'url' in vattrs:
# If there is a url for the version check it.
v_url = pkg.url_for_version(v)
- self.assertEqual(vdesc.url, v_url)
+ self.assertEqual(vattrs['url'], v_url)
diff --git a/lib/spack/spack/test/stage.py b/lib/spack/spack/test/stage.py
index a412549dc7..c5a7013675 100644
--- a/lib/spack/spack/test/stage.py
+++ b/lib/spack/spack/test/stage.py
@@ -146,7 +146,7 @@ class StageTest(unittest.TestCase):
stage_path = self.get_stage_path(stage, stage_name)
self.assertTrue(archive_name in os.listdir(stage_path))
self.assertEqual(join_path(stage_path, archive_name),
- stage.archive_file)
+ stage.fetcher.archive_file)
def check_expand_archive(self, stage, stage_name):
@@ -156,7 +156,7 @@ class StageTest(unittest.TestCase):
self.assertEqual(
join_path(stage_path, archive_dir),
- stage.expanded_archive_path)
+ stage.source_path)
readme = join_path(stage_path, archive_dir, readme_name)
self.assertTrue(os.path.isfile(readme))
@@ -170,7 +170,7 @@ class StageTest(unittest.TestCase):
self.assertEqual(os.path.realpath(stage_path), os.getcwd())
- def check_chdir_to_archive(self, stage, stage_name):
+ def check_chdir_to_source(self, stage, stage_name):
stage_path = self.get_stage_path(stage, stage_name)
self.assertEqual(
join_path(os.path.realpath(stage_path), archive_dir),
@@ -271,9 +271,9 @@ class StageTest(unittest.TestCase):
self.check_fetch(stage, stage_name)
stage.expand_archive()
- stage.chdir_to_archive()
+ stage.chdir_to_source()
self.check_expand_archive(stage, stage_name)
- self.check_chdir_to_archive(stage, stage_name)
+ self.check_chdir_to_source(stage, stage_name)
stage.destroy()
self.check_destroy(stage, stage_name)
@@ -284,24 +284,24 @@ class StageTest(unittest.TestCase):
stage.fetch()
stage.expand_archive()
- stage.chdir_to_archive()
+ stage.chdir_to_source()
self.check_expand_archive(stage, stage_name)
- self.check_chdir_to_archive(stage, stage_name)
+ self.check_chdir_to_source(stage, stage_name)
# Try to make a file in the old archive dir
with closing(open('foobar', 'w')) as file:
file.write("this file is to be destroyed.")
- self.assertTrue('foobar' in os.listdir(stage.expanded_archive_path))
+ self.assertTrue('foobar' in os.listdir(stage.source_path))
# Make sure the file is not there after restage.
stage.restage()
self.check_chdir(stage, stage_name)
self.check_fetch(stage, stage_name)
- stage.chdir_to_archive()
- self.check_chdir_to_archive(stage, stage_name)
- self.assertFalse('foobar' in os.listdir(stage.expanded_archive_path))
+ stage.chdir_to_source()
+ self.check_chdir_to_source(stage, stage_name)
+ self.assertFalse('foobar' in os.listdir(stage.source_path))
stage.destroy()
self.check_destroy(stage, stage_name)
diff --git a/lib/spack/spack/test/svn_fetch.py b/lib/spack/spack/test/svn_fetch.py
new file mode 100644
index 0000000000..a48a86dcc3
--- /dev/null
+++ b/lib/spack/spack/test/svn_fetch.py
@@ -0,0 +1,123 @@
+##############################################################################
+# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
+# Produced at the Lawrence Livermore National Laboratory.
+#
+# This file is part of Spack.
+# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
+# LLNL-CODE-647188
+#
+# For details, see https://scalability-llnl.github.io/spack
+# Please also see the LICENSE file for our notice and the LGPL.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License (as published by
+# the Free Software Foundation) version 2.1 dated February 1999.
+#
+# This program is distributed in the hope that it will be useful, but
+# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
+# conditions of the GNU General Public License for more details.
+#
+# You should have received a copy of the GNU Lesser General Public License
+# along with this program; if not, write to the Free Software Foundation,
+# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
+##############################################################################
+import os
+import re
+import unittest
+import shutil
+import tempfile
+from contextlib import closing
+
+from llnl.util.filesystem import *
+
+import spack
+from spack.version import ver
+from spack.stage import Stage
+from spack.util.executable import which
+from spack.test.mock_packages_test import *
+from spack.test.mock_repo import svn, MockSvnRepo
+
+
+class SvnFetchTest(MockPackagesTest):
+ """Tests fetching from a dummy git repository."""
+
+ def setUp(self):
+ """Create an svn repository with two revisions."""
+ super(SvnFetchTest, self).setUp()
+
+ self.repo = MockSvnRepo()
+
+ spec = Spec('svn-test')
+ spec.concretize()
+ self.pkg = spack.db.get(spec, new=True)
+
+
+ def tearDown(self):
+ """Destroy the stage space used by this test."""
+ super(SvnFetchTest, self).tearDown()
+
+ if self.repo.stage is not None:
+ self.repo.stage.destroy()
+
+ self.pkg.do_clean_dist()
+
+
+ def assert_rev(self, rev):
+ """Check that the current revision is equal to the supplied rev."""
+ def get_rev():
+ output = svn('info', return_output=True)
+ self.assertTrue("Revision" in output)
+ for line in output.split('\n'):
+ match = re.match(r'Revision: (\d+)', line)
+ if match:
+ return match.group(1)
+ self.assertEqual(get_rev(), rev)
+
+
+ def try_fetch(self, rev, test_file, args):
+ """Tries to:
+ 1. Fetch the repo using a fetch strategy constructed with
+ supplied args.
+ 2. Check if the test_file is in the checked out repository.
+ 3. Assert that the repository is at the revision supplied.
+ 4. Add and remove some files, then reset the repo, and
+ ensure it's all there again.
+ """
+ self.pkg.versions[ver('svn')] = args
+
+ self.pkg.do_stage()
+ self.assert_rev(rev)
+
+ file_path = join_path(self.pkg.stage.source_path, test_file)
+ self.assertTrue(os.path.isdir(self.pkg.stage.source_path))
+ self.assertTrue(os.path.isfile(file_path))
+
+ os.unlink(file_path)
+ self.assertFalse(os.path.isfile(file_path))
+
+ untracked = 'foobarbaz'
+ touch(untracked)
+ self.assertTrue(os.path.isfile(untracked))
+ self.pkg.do_clean_work()
+ self.assertFalse(os.path.isfile(untracked))
+
+ self.assertTrue(os.path.isdir(self.pkg.stage.source_path))
+ self.assertTrue(os.path.isfile(file_path))
+
+ self.assert_rev(rev)
+
+
+ def test_fetch_default(self):
+ """Test a default checkout and make sure it's on rev 1"""
+ self.try_fetch(self.repo.r1, self.repo.r1_file, {
+ 'svn' : self.repo.url
+ })
+
+
+    def test_fetch_r0(self):
+ """Test fetching an older revision (0)."""
+ self.try_fetch(self.repo.r0, self.repo.r0_file, {
+ 'svn' : self.repo.url,
+ 'revision' : self.repo.r0
+ })
diff --git a/lib/spack/spack/test/url_extrapolate.py b/lib/spack/spack/test/url_extrapolate.py
new file mode 100644
index 0000000000..514d119deb
--- /dev/null
+++ b/lib/spack/spack/test/url_extrapolate.py
@@ -0,0 +1,90 @@
+##############################################################################
+# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
+# Produced at the Lawrence Livermore National Laboratory.
+#
+# This file is part of Spack.
+# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
+# LLNL-CODE-647188
+#
+# For details, see https://scalability-llnl.github.io/spack
+# Please also see the LICENSE file for our notice and the LGPL.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License (as published by
+# the Free Software Foundation) version 2.1 dated February 1999.
+#
+# This program is distributed in the hope that it will be useful, but
+# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
+# conditions of the GNU General Public License for more details.
+#
+# You should have received a copy of the GNU Lesser General Public License
+# along with this program; if not, write to the Free Software Foundation,
+# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
+##############################################################################
+"""\
+Tests ability of spack to extrapolate URL versions from existing versions.
+"""
+import spack
+import spack.url as url
+from spack.spec import Spec
+from spack.version import ver
+from spack.test.mock_packages_test import *
+
+
+class UrlExtrapolateTest(MockPackagesTest):
+
+ def test_known_version(self):
+ d = spack.db.get('dyninst')
+
+ self.assertEqual(
+ d.url_for_version('8.2'), 'http://www.paradyn.org/release8.2/DyninstAPI-8.2.tgz')
+ self.assertEqual(
+ d.url_for_version('8.1.2'), 'http://www.paradyn.org/release8.1.2/DyninstAPI-8.1.2.tgz')
+ self.assertEqual(
+ d.url_for_version('8.1.1'), 'http://www.paradyn.org/release8.1/DyninstAPI-8.1.1.tgz')
+
+
+ def test_extrapolate_version(self):
+ d = spack.db.get('dyninst')
+
+ # Nearest URL for 8.1.1.5 is 8.1.1, and the URL there is
+ # release8.1/DyninstAPI-8.1.1.tgz. Only the last part matches
+ # the version, so only extrapolate the last part. Obviously
+ # dyninst has ambiguous URL versions, but we want to make sure
+ # extrapolation works in a well-defined way.
+ self.assertEqual(
+ d.url_for_version('8.1.1.5'), 'http://www.paradyn.org/release8.1/DyninstAPI-8.1.1.5.tgz')
+
+ # 8.2 matches both the release8.2 component and the DyninstAPI-8.2 component.
+ # Extrapolation should replace both with the new version.
+ self.assertEqual(
+ d.url_for_version('8.2.3'), 'http://www.paradyn.org/release8.2.3/DyninstAPI-8.2.3.tgz')
+
+
+ def test_with_package(self):
+ d = spack.db.get('dyninst@8.2')
+ self.assertEqual(d.fetcher.url, 'http://www.paradyn.org/release8.2/DyninstAPI-8.2.tgz')
+
+ d = spack.db.get('dyninst@8.1.2')
+ self.assertEqual(d.fetcher.url, 'http://www.paradyn.org/release8.1.2/DyninstAPI-8.1.2.tgz')
+
+ d = spack.db.get('dyninst@8.1.1')
+ self.assertEqual(d.fetcher.url, 'http://www.paradyn.org/release8.1/DyninstAPI-8.1.1.tgz')
+
+
+ def test_concrete_package(self):
+ s = Spec('dyninst@8.2')
+ s.concretize()
+ d = spack.db.get(s)
+ self.assertEqual(d.fetcher.url, 'http://www.paradyn.org/release8.2/DyninstAPI-8.2.tgz')
+
+ s = Spec('dyninst@8.1.2')
+ s.concretize()
+ d = spack.db.get(s)
+ self.assertEqual(d.fetcher.url, 'http://www.paradyn.org/release8.1.2/DyninstAPI-8.1.2.tgz')
+
+ s = Spec('dyninst@8.1.1')
+ s.concretize()
+ d = spack.db.get(s)
+ self.assertEqual(d.fetcher.url, 'http://www.paradyn.org/release8.1/DyninstAPI-8.1.1.tgz')
diff --git a/lib/spack/spack/test/url_parse.py b/lib/spack/spack/test/url_parse.py
index a03d6098f1..7a4d201d90 100644
--- a/lib/spack/spack/test/url_parse.py
+++ b/lib/spack/spack/test/url_parse.py
@@ -281,11 +281,16 @@ class UrlParseTest(unittest.TestCase):
'synergy', '1.3.6p2',
'http://synergy.googlecode.com/files/synergy-1.3.6p2-MacOSX-Universal.zip')
- def test_mvapich2_version(self):
+ def test_mvapich2_19_version(self):
self.check(
'mvapich2', '1.9',
'http://mvapich.cse.ohio-state.edu/download/mvapich2/mv2/mvapich2-1.9.tgz')
+    def test_mvapich2_20_version(self):
+ self.check(
+ 'mvapich2', '2.0',
+ 'http://mvapich.cse.ohio-state.edu/download/mvapich/mv2/mvapich2-2.0.tar.gz')
+
def test_hdf5_version(self):
self.check(
'hdf5', '1.8.13',
diff --git a/lib/spack/spack/test/versions.py b/lib/spack/spack/test/versions.py
index 454ab36b8a..20e946e90e 100644
--- a/lib/spack/spack/test/versions.py
+++ b/lib/spack/spack/test/versions.py
@@ -95,6 +95,10 @@ class VersionsTest(unittest.TestCase):
self.assertEqual(ver(expected), ver(a).intersection(ver(b)))
+ def check_union(self, expected, a, b):
+ self.assertEqual(ver(expected), ver(a).union(ver(b)))
+
+
def test_two_segments(self):
self.assert_ver_eq('1.0', '1.0')
self.assert_ver_lt('1.0', '2.0')
@@ -217,12 +221,16 @@ class VersionsTest(unittest.TestCase):
self.assert_in('1.3.5-7', '1.2:1.4')
self.assert_not_in('1.1', '1.2:1.4')
self.assert_not_in('1.5', '1.2:1.4')
- self.assert_not_in('1.4.2', '1.2:1.4')
+
+ self.assert_in('1.4.2', '1.2:1.4')
+ self.assert_not_in('1.4.2', '1.2:1.4.0')
self.assert_in('1.2.8', '1.2.7:1.4')
self.assert_in('1.2.7:1.4', ':')
self.assert_not_in('1.2.5', '1.2.7:1.4')
- self.assert_not_in('1.4.1', '1.2.7:1.4')
+
+ self.assert_in('1.4.1', '1.2.7:1.4')
+ self.assert_not_in('1.4.1', '1.2.7:1.4.0')
def test_in_list(self):
@@ -254,6 +262,17 @@ class VersionsTest(unittest.TestCase):
self.assert_overlaps('1.6:1.9', ':')
+ def test_overlap_with_containment(self):
+ self.assert_in('1.6.5', '1.6')
+ self.assert_in('1.6.5', ':1.6')
+
+ self.assert_overlaps('1.6.5', ':1.6')
+ self.assert_overlaps(':1.6', '1.6.5')
+
+ self.assert_not_in(':1.6', '1.6.5')
+ self.assert_in('1.6.5', ':1.6')
+
+
def test_lists_overlap(self):
self.assert_overlaps('1.2b:1.7,5', '1.6:1.9,1')
self.assert_overlaps('1,2,3,4,5', '3,4,5,6,7')
@@ -311,6 +330,32 @@ class VersionsTest(unittest.TestCase):
self.check_intersection(['0:1'], [':'], ['0:1'])
+ def test_intersect_with_containment(self):
+ self.check_intersection('1.6.5', '1.6.5', ':1.6')
+ self.check_intersection('1.6.5', ':1.6', '1.6.5')
+
+ self.check_intersection('1.6:1.6.5', ':1.6.5', '1.6')
+ self.check_intersection('1.6:1.6.5', '1.6', ':1.6.5')
+
+
+ def test_union_with_containment(self):
+ self.check_union(':1.6', '1.6.5', ':1.6')
+ self.check_union(':1.6', ':1.6', '1.6.5')
+
+ self.check_union(':1.6', ':1.6.5', '1.6')
+ self.check_union(':1.6', '1.6', ':1.6.5')
+
+
+    def test_union_of_ranges(self):
+ self.check_union(':', '1.0:', ':2.0')
+
+ self.check_union('1:4', '1:3', '2:4')
+ self.check_union('1:4', '2:4', '1:3')
+
+ # Tests successor/predecessor case.
+ self.check_union('1:4', '1:2', '3:4')
+
+
def test_basic_version_satisfaction(self):
self.assert_satisfies('4.7.3', '4.7.3')
@@ -326,6 +371,7 @@ class VersionsTest(unittest.TestCase):
self.assert_does_not_satisfy('4.8', '4.9')
self.assert_does_not_satisfy('4', '4.9')
+
def test_basic_version_satisfaction_in_lists(self):
self.assert_satisfies(['4.7.3'], ['4.7.3'])
@@ -341,6 +387,7 @@ class VersionsTest(unittest.TestCase):
self.assert_does_not_satisfy(['4.8'], ['4.9'])
self.assert_does_not_satisfy(['4'], ['4.9'])
+
def test_version_range_satisfaction(self):
self.assert_satisfies('4.7b6', '4.3:4.7')
self.assert_satisfies('4.3.0', '4.3:4.7')
@@ -352,6 +399,7 @@ class VersionsTest(unittest.TestCase):
self.assert_satisfies('4.7b6', '4.3:4.7')
self.assert_does_not_satisfy('4.8.0', '4.3:4.7')
+
def test_version_range_satisfaction_in_lists(self):
self.assert_satisfies(['4.7b6'], ['4.3:4.7'])
self.assert_satisfies(['4.3.0'], ['4.3:4.7'])
diff --git a/lib/spack/spack/version.py b/lib/spack/spack/version.py
index fbf86db8e1..cc83634137 100644
--- a/lib/spack/spack/version.py
+++ b/lib/spack/spack/version.py
@@ -50,10 +50,6 @@ from bisect import bisect_left
from functools import wraps
from external.functools import total_ordering
-import llnl.util.compare.none_high as none_high
-import llnl.util.compare.none_low as none_low
-import spack.error
-
# Valid version characters
VALID_VERSION = r'[A-Za-z0-9_.-]'
@@ -256,18 +252,39 @@ class Version(object):
@coerced
def __contains__(self, other):
- return self == other
+ if other is None:
+ return False
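+        # Containment means "other is a more specific version of self":
+        # e.g. '1.6.5' is in '1.6' because (1, 6) is a prefix of (1, 6, 5),
+        # while '1.6' is not in '1.6.5'.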
+ return other.version[:len(self.version)] == self.version
+
+
+ def is_predecessor(self, other):
+ """True if the other version is the immediate predecessor of this one.
+ That is, NO versions v exist such that:
+ (self < v < other and v not in self).
+ """
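+        # For example, '2.0' is the immediate predecessor of '2.1': nothing
+        # except more specific 2.0.x versions lies between them.  It is not
+        # a predecessor of '2.2' (2.1 lies between), nor of '2.1.5' (the two
+        # versions must have the same number of components).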
+ if len(self.version) != len(other.version):
+ return False
+
+ sl = self.version[-1]
+ ol = other.version[-1]
+ return type(sl) == int and type(ol) == int and (ol - sl == 1)
+
+
+ def is_successor(self, other):
+ return other.is_predecessor(self)
@coerced
def overlaps(self, other):
- return self == other
+ return self in other or other in self
@coerced
def union(self, other):
- if self == other:
+ if self == other or other in self:
return self
+ elif self in other:
+ return other
else:
return VersionList([self, other])
@@ -290,7 +307,7 @@ class VersionRange(object):
self.start = start
self.end = end
-        if start and end and  end < start:
+        if start and end and end < start:
raise ValueError("Invalid Version range: %s" % self)
@@ -312,9 +329,12 @@ class VersionRange(object):
if other is None:
return False
- return (none_low.lt(self.start, other.start) or
- (self.start == other.start and
- none_high.lt(self.end, other.end)))
+ s, o = self, other
+ if s.start != o.start:
+ return s.start is None or (o.start is not None and s.start < o.start)
+
+ return (s.end != o.end and
+ o.end is None or (s.end is not None and s.end < o.end))
@coerced
@@ -335,8 +355,23 @@ class VersionRange(object):
@coerced
def __contains__(self, other):
- return (none_low.ge(other.start, self.start) and
- none_high.le(other.end, self.end))
+ if other is None:
+ return False
+
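+        # A range contains another when both of its endpoints are at least
+        # as permissive.  Containment of the endpoint versions themselves
+        # counts: e.g. 1.4.2 lies within 1.2:1.4 because 1.4.2 is in 1.4,
+        # even though 1.4.2 compares greater than 1.4.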
+ in_lower = (self.start == other.start or
+ self.start is None or
+ (other.start is not None and (
+ self.start < other.start or
+ other.start in self.start)))
+ if not in_lower:
+ return False
+
+ in_upper = (self.end == other.end or
+ self.end is None or
+ (other.end is not None and (
+ self.end > other.end or
+ other.end in self.end)))
+ return in_upper
@coerced
@@ -372,27 +407,75 @@ class VersionRange(object):
@coerced
def overlaps(self, other):
- return (other in self or self in other or
- ((self.start == None or other.end is None or
- self.start <= other.end) and
- (other.start is None or self.end == None or
- other.start <= self.end)))
+ return ((self.start == None or other.end is None or
+ self.start <= other.end or
+ other.end in self.start or self.start in other.end) and
+ (other.start is None or self.end == None or
+ other.start <= self.end or
+ other.start in self.end or self.end in other.start))
@coerced
def union(self, other):
- if self.overlaps(other):
- return VersionRange(none_low.min(self.start, other.start),
- none_high.max(self.end, other.end))
- else:
+ if not self.overlaps(other):
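+            # Disjoint ranges still merge into a single range when they are
+            # adjacent: e.g. 1:2 and 3:4 do not overlap, but their union is
+            # 1:4 because 2 is the immediate predecessor of 3.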
+ if (self.end is not None and other.start is not None and
+ self.end.is_predecessor(other.start)):
+ return VersionRange(self.start, other.end)
+
+ if (other.end is not None and self.start is not None and
+ other.end.is_predecessor(self.start)):
+ return VersionRange(other.start, self.end)
+
return VersionList([self, other])
+ # if we're here, then we know the ranges overlap.
+ if self.start is None or other.start is None:
+ start = None
+ else:
+ start = self.start
+ # TODO: See note in intersection() about < and in discrepancy.
+ if self.start in other.start or other.start < self.start:
+ start = other.start
+
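+        # For the upper bound, keep the broader endpoint: e.g. the union of
+        # :1.6.5 and 1.6 is :1.6, since an end of 1.6 already covers 1.6.5
+        # and every other 1.6.x release.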
+ if self.end is None or other.end is None:
+ end = None
+ else:
+ end = self.end
+ # TODO: See note in intersection() about < and in discrepancy.
+ if not other.end in self.end:
+ if end in other.end or other.end > self.end:
+ end = other.end
+
+ return VersionRange(start, end)
+
@coerced
def intersection(self, other):
if self.overlaps(other):
- return VersionRange(none_low.max(self.start, other.start),
- none_high.min(self.end, other.end))
+ if self.start is None:
+ start = other.start
+ else:
+ start = self.start
+ if other.start is not None:
+ if other.start > start or other.start in start:
+ start = other.start
+
+ if self.end is None:
+ end = other.end
+ else:
+ end = self.end
+ # TODO: does this make sense?
+ # This is tricky:
+ # 1.6.5 in 1.6 = True (1.6.5 is more specific)
+ # 1.6 < 1.6.5 = True (lexicographic)
+ # Should 1.6 NOT be less than 1.6.5? Hm.
+ # Here we test (not end in other.end) first to avoid paradox.
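+            # Concretely, intersecting :1.6.5 with 1.6 keeps 1.6.5 as the
+            # upper bound, giving 1.6:1.6.5 (see test_intersect_with_containment).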
+ if other.end is not None and not end in other.end:
+ if other.end < end or other.end in end:
+ end = other.end
+
+ return VersionRange(start, end)
+
else:
return VersionList()