Adding a new external package to Condor

This document describes how to add a new external package to the Condor build. There are a number of elements to this task, each of which is described below.

Before getting into the specifics, here's the 10,000 foot view of how the Condor build system deals with externals. Each external lives in its own directory under externals/bundles/, which holds a URLS file (pointing at the package's source tarball on our web server) and a build script that knows how to build the package. src/configure.ac decides which version of each external to use (via the CHECK_EXTERNAL macro), and src/Imakefile has an ext_target for each external. At build time, externals/build_external downloads and untars the source tarball into a sandbox under externals/build and runs the package's build script, which installs its results into a package-specific subdirectory of externals/install. When an external builds successfully, a trigger file is written into externals/triggers so that it isn't rebuilt by later makes.

Understanding this overview will help make sense of the specific steps needed for modifying the build system to know about and use a new external.

In general, the process of adding a new version of an existing external is very similar to adding a whole new external. However, at any point in these instructions where something extra is required, that will be made clear...


Naming schemes and conventions

Before we get into the nuts and bolts, there are a few naming conventions that need to be defined and clarified.

Each external package is identified by two things: a [name] and a [version].

These two parts of the package identification are important in numerous places, and it's important to keep them straight. For the rest of this document, anywhere you see [name] or [version], you should substitute the right thing based on the package you're adding.

Regarding the name: it should be descriptive, but not too long or verbose. "z" is not a good name; neither is "zlib-compression-library". Usually the right choice is pretty obvious. If you have any questions, ASK someone else BEFORE you start adding, since it'll eventually require adding a new directory to our repository, and that's impossible to undo...

Regarding the version: ideally, this should be the official version number of the external package as decided by its authors. If we patch a given external, that doesn't change the official version we're basing the package on, and that's what the version number should be.

However, if you want to add new patches to an existing external, you should change the version number and add a new external! We do NOT want two different sets of code using the same version number!

When adding new patches to an existing external, add a -p1 to the [version] to get your new version. If the [version] already ends in -p<number>, increment the number for your new version. You should continue to use the original tarball (with its original name).

Again, if you have any questions or are uncertain, just ask.

Finally, the combination of [name]-[version] is often referred to in the build system (and this document) as the package name, and you'll see it listed as [package-name].


Packaging the external so it can be built by the build system

Each external package used to build Condor lives in a unique directory as part of the externals directory tree. The basic directory layout of externals is as follows:

externals/
         /build
         /install
         /triggers
         /bundles/
                 /[name]
                        /[version]

The build directory is a temporary build sandbox, which we'll talk about more later. The install directory is where each external package installs its build output so it can be used by the Condor build. The triggers directory holds the trigger files (mentioned in the overview) that determine if a given external was successfully built. Want to have your Condor workspace rebuild the zlib external? Just remove the externals/triggers/zlib-[version] file, and the src/Imakefile will ensure that a top-level "make" rebuilds zlib...
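For example, forcing a zlib rebuild is just this (a minimal sketch; substitute the actual zlib version your workspace uses):

  # remove the trigger; the next top-level make will rebuild zlib
  rm externals/triggers/zlib-[version]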

The build, install, and triggers directories can optionally live in a separate location, where they can be shared by multiple build workspaces. On unix, this location is given by the --with-externals option to configure. On Windows, this location is given by the EXTERN_DIR environment variable. At UW-CS, /p/condor/workspaces/externals will be used if it exists.
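For example (a sketch; both paths are placeholders, not official locations):

  # unix: tell configure where the shared externals directories live
  ./configure --with-externals=/scratch/externals-cache

  # Windows: set the EXTERN_DIR environment variable instead (cmd.exe syntax)
  set EXTERN_DIR=C:\externals-cache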

The bundles directory contains subdirectories for each kind of package, and each package directory has subdirectories for each version of that package. For example, at the time of this writing, we've got 5 different versions of the glibc external, each living in their own subdirectory of externals/bundles/glibc.

externals/
         /bundles/
                 /glibc/
                       /2.1.1
                       /2.2.2
                       /2.2.93
                       /2.3.2
                       /2.3.2.27.9.7

Inside each version-specific subdirectory are two main things: a URLS file giving the URL(s) of the package's source tarball(s), and a build script named build_[name]-[version] that builds the package.

Optionally, there may be patch file(s) which include modifications we had to make to the package to get it to work with Condor.

Each of these things is addressed in the following sections...

Setting up the source tarball

Ideally, there's nothing you should need to do to modify the source tarball of the external package. We want the original, unmodified source whenever possible. However, the name of the tarball is important, since the Condor build system makes assumptions about the name so that the build_external script can download and untar the tarball for you (one less thing for your build script to worry about). So, the source tarball must be named "[name]-[version].tar.gz" (and, needless to say, it must be a real gzip'ed tar file). For example, "krb5-1.2.7.tar.gz". An important exception is that the -p<number> at the end of the version is optional in the tarball name. For example, the tarball for the external krb5-1.2.7-p1 can be named krb5-1.2.7.tar.gz (the same tarball used for the external krb5-1.2.7).
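For instance, packaging a hypothetical external named foo at version 1.0 might look like this (a sketch; it assumes the unmodified source unpacks into a foo-1.0/ directory):

  # repack the pristine source under the name the build system expects
  tar czf foo-1.0.tar.gz foo-1.0/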

Putting the source tarball on the web server

The source tarballs live on the web server parrot.cs.wisc.edu. They are synced periodically from the following directories on AFS at UW-CS:

/p/condor/repository/externals
/p/condor/repository/externals-private

The latter is for files that can't be publicly distributed. Currently, the only thing there is a LIGO application for testing standard universe. Once synced, the files can be fetched from the following URLS:

http://parrot.cs.wisc.edu/externals
http://parrot.cs.wisc.edu/externals-private

Once you place a tarball in /p/condor/repository/externals or /p/condor/repository/externals-private, you shouldn't change it. You must not change it once a mention of it is pushed to the shared git repository. If the contents of your source tarball(s) may change while you're preparing to push your changes, you have two options for temporarily locating the tarballs:

In either case, ensure you move the tarball to the official externals location and update the URLS file before pushing your changes.

Making the URLS file

The URLS file is a simple text file containing the URLs of the source tarballs of the external. Normally, there's only one tarball, but a couple externals require several. Each URL should appear on a separate line. All of the externals are hosted on parrot.cs.wisc.edu, and the URLs should look like this:

http://parrot.cs.wisc.edu/externals/krb5-1.4.3.tar.gz

If the tarball contains files that aren't publicly releasable, there's a restricted directory:

http://parrot.cs.wisc.edu/externals-private/krb5-1.4.3.tar.gz
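For an external that needs more than one tarball, the URLS file simply lists each URL on its own line; for example (hypothetical file names):

  http://parrot.cs.wisc.edu/externals/foo-1.0.tar.gz
  http://parrot.cs.wisc.edu/externals/foo-extras-1.0.tar.gz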

Making the build script

When the Condor build is trying to build your external, first it will create a temporary sandbox build directory. The [name]-[version].tar.gz will be downloaded and untarred into the sandbox. Then, the build_[name]-[version] script will be invoked with the current working directory set to the sandbox build directory. This script is responsible for building the external package in whatever way makes sense. The calling layer assumes the responsibility of cleaning up the build sandbox. It will also keep track of whether or not things need to be rebuilt.

Build script environment variables

In addition to being born with the build sandbox directory as the current working directory, the following environment variables will be set (these are literal... no substitution of package name or anything):

  $PACKAGE_NAME           # the name of this package
  $PACKAGE_SRC_NAME       # package name from source tarball
  $PACKAGE_DEBUG          # empty if release, '-g' if debug build
  $PACKAGE_BUILD_DIR      # the sandbox build directory
  $PACKAGE_INSTALL_DIR    # where to put the results
  $EXTERNALS_INSTALL_DIR  # full path to root of externals/install
  $EXTERNALS_CONFIG       # full path to config.sh with config variables

$PACKAGE_NAME is the [name]-[version] identifying string for your package.

$PACKAGE_SRC_NAME is the same as $PACKAGE_NAME, except it will not have any -p<number> on the end if the source tarball doesn't have it. This simplifies allowing multiple external versions to use the same source tarball.

$PACKAGE_BUILD_DIR is a subdirectory of externals/build, named with the package-name. This is just a temporary sandbox directory, and build_external will remove this directory and all its contents as soon as your build script exits.

$PACKAGE_INSTALL_DIR is a subdirectory of externals/install, named with the package-name as well. This directory won't necessarily exist when your build script is spawned, so your script is responsible for creating it in whatever way is appropriate for your package. Most packages don't have to worry about it, since make install takes care of it for you. Some packages need to copy a whole tree to this location, so we don't pre-create the directory and force you to rmdir it if you don't want it there in the first place. However, the parent of this directory (the directory path stored in $EXTERNALS_INSTALL_DIR) is guaranteed to exist, so you don't need to worry about mkdir -p.
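For a package that just copies a tree into place, the install step of the build script might look like this (a sketch; the directory names being copied are hypothetical):

  # $EXTERNALS_INSTALL_DIR is guaranteed to exist, but $PACKAGE_INSTALL_DIR is not,
  # so create it before copying anything into it
  mkdir $PACKAGE_INSTALL_DIR
  if [ $? -ne 0 ]
  then
      echo "mkdir failed"
      exit 3
  fi

  cp -r include lib $PACKAGE_INSTALL_DIR
  if [ $? -ne 0 ]
  then
      echo "install copy failed"
      exit 3
  fi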

If your build script wants to know things that the Condor build system determines at run-time (for example, what globus flavor we're trying to build), it can source the file specified in $EXTERNALS_CONFIG (details on sourcing this file below). This is a simple Bourne shell syntax file that defines a bunch of shell variables holding information that the configure script discovers. If your external needs to know something that the configure script figures out but that isn't yet in config/config.sh, all you have to do is add a new line to config/config.sh.in to define the variable you need. Also, note that if you want to pass any of these variables on to the configure script or build system of your package, you either have to pass them in as arguments or manually export them yourself as environment variables (by default, you just get shell variables in your script, not environment variables).
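For example (a sketch; SOME_SETTING and --with-something are hypothetical names, and this only works if the corresponding value is actually substituted by src/configure.ac), you would add a line like this to config/config.sh.in:

  SOME_SETTING=@some_setting@

Then, in your build script, source the config file and either pass the value along or export it:

  . $EXTERNALS_CONFIG

  # either pass the value along as a configure argument...
  ./configure --prefix=$PACKAGE_INSTALL_DIR --with-something=$SOME_SETTING
  # ...or export it so the package's own build can see it in the environment
  export SOME_SETTING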

Build script syntax

YOUR BUILD SCRIPT MUST BE PLAIN OLD BOURNE SHELL!!! Do NOT use any bash syntax, since it won't work on all 15+ Condor platforms. For example, source is not portable across all versions of Bourne shell. The correct way to include this file in your build script is to use:

  . $EXTERNALS_CONFIG

Similarly, to set an environment variable, you can NOT do this:

  export FOO=bar

You MUST use:

  FOO=bar
  export FOO

Another shell idiom that does NOT work is using this:

  if( ! make ); then blah blah; fi

You MUST use:

  make
  if [ $? -ne 0 ]
  then
      blah blah
  fi

Build script exit codes

If your script encounters an error during the build, exit with status 1.

After building, you should run any tests that are available.

If any tests do not pass or any other error occurs, exit with status 2.

After passing tests, your script should install the package into the directory $PACKAGE_INSTALL_DIR in whatever format is customary for that package.

If there is a problem during installation, exit with status 3.

If there is any problem specific to your package, exit with status 10 or higher. Statuses 4 through 9 are reserved for future use. You can only exit with status 0 if everything is fully installed and working (to the best of your knowledge).

Build script example

Here is an example build_generic_package, which would build most packages that follow the common 'configure / make / make test / make install' convention:

#!/bin/sh
############# build_generic_package

cd $PACKAGE_SRC_NAME/src

./configure --prefix=$PACKAGE_INSTALL_DIR --with-ccopts=$PACKAGE_DEBUG
if [ $? -ne 0 ]
then
    echo "configure failed"
    exit 1
fi

make
if [ $? -ne 0 ]
then
    echo "make failed"
    exit 1
fi

make test
if [ $? -ne 0 ]
then
    echo "test failed"
    exit 2
fi

make install
if [ $? -ne 0 ]
then
    echo "install failed"
    exit 3
fi

exit 0

############# end of build_generic_package

Handling patches (optional)

If your external needs to be patched, the preferred method for dealing with it is to create patch files, and put those patches into the version-specific subdirectory. Then, the build-script for the package can invoke patch as needed. An example to look at would be the krb5-1.2.5 external (externals/bundles/krb5/1.2.5/)
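For instance, the patching step of a build script might look something like this (a sketch; it assumes the patch file ends up in the build sandbox next to the untarred source, and the patch file name and -p level are hypothetical — see the krb5-1.2.5 external for the real convention):

  cd $PACKAGE_SRC_NAME
  patch -p1 < $PACKAGE_BUILD_DIR/condor-fixes.patch
  if [ $? -ne 0 ]
  then
      echo "patch failed"
      exit 1
  fi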

Again, if you want to add additional patches to an existing external, you MUST make an entirely new external package with a different version number (e.g., krb5-1.2.5-p1) so that we can tell the difference between the two versions. This is a little wasteful of space, unfortunately, but there's no way around it at this time.


Telling the build system about the new package

Once your package is set up in the externals tree and the build script is ready, you've got to tell the Condor build system about the new package. There are a few separate places where this needs to happen.

Changing autoconf-related stuff

The first step in integrating a new external with the build system is to modify src/configure.ac, the input file for autoconf. Near the bottom of this file, there's a section of tests that determine what version of each external we want to use. The section begins with the following code block:

  ############################################################
  # What versions of what externals do we need
  ############################################################
  AC_MSG_NOTICE([determining versions of external packages])

Some of the "tests" are trivial: they just define a version to use directly. Other tests do something more fancy, like choosing a version depending on the platform we're on. Ultimately, autoconf is just a fancy way of using M4 macros to make a giant Bourne shell script. If you follow the existing syntax and examples already in this section, you should be in good shape. If you have trouble, check out the autoconf documentation (for example, autoconf 2.57) or just ask someone.

The important thing is that once you know what version of the external you need, you call our special autoconf macro, CHECK_EXTERNAL(), which handles everything else for you.

For example, here's a simple case, zlib:

  CHECK_EXTERNAL([zlib],[1.1.3], [soft])

This handles the rest of the autoconf-side work for you: it records the chosen version in a substitution variable (which config/externals.cf.in and config/config.sh.in pick up), defines the corresponding HAVE_EXT_* setting when the external is available, and enforces the requirement level given as the third argument (described below).

If you're just changing the version of an existing external, that's probably all you'll have to do to the autoconf stuff, and you can skip right to the discussion of git changes. However, if you're adding a whole new external package, there are a few more steps (both for autoconf and imake), so read on... In either case, before using your new external you should run ./build_init to repopulate the cf files with correct info.

To add a whole new kind of external, you've got to understand a bit about that [soft] argument (the requirement level) and modify config/externals.cf.in and config/config.sh.in to hold the results of the new test you added in src/configure.ac. Again, if you just edit the files and follow the existing examples, you'll probably be in fine shape. But, for the sake of complete docs, I'll explain in a little more detail here:

requirement levels

There are three requirement levels, and they determine how dependent Condor is on an external. The levels are soft, hard, and optional. The soft level is the most common and is likely what you need to specify. It means that Condor can compile and operate just fine without the external; when configure is run it will look for soft requirements, but will only give a warning if they are not available. This is in contrast to hard requirements, which are always required: Condor will not compile or run without them, and configure will fail if they are not present. You want to avoid adding hard externals to Condor. The third level, optional, is just as uncommon as hard, if not more so. It operates almost exactly like soft, with one exception: the --with-soft-is-hard option to configure treats all soft requirements as if they were hard requirements, but it does not change how optional externals are treated. So, you really want your external to be soft, unless you have a really good reason for it not to be.
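To make the connection back to CHECK_EXTERNAL() concrete, the requirement level is just the third argument (a sketch; the hard and optional lines below use hypothetical package names and levels, not real classifications):

  CHECK_EXTERNAL([zlib],[1.1.3], [soft])
  CHECK_EXTERNAL([foo],[2.0], [hard])
  CHECK_EXTERNAL([bar],[0.9], [optional])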

config/externals.cf.in

The most important thing is to define a new make variable to hold the version of the external you added. For example, here's what happens for zlib:

  EXT_ZLIB_VERSION = @ext_zlib_version@

The rest of the build system that needs to know how to find zlib can now use this variable. Almost always, what we really care about is the installed output of the external build. This is a common way to deal with that:

  ZLIB_DIR = $(EXT_INSTALL)/$(EXT_ZLIB_VERSION)
  ZLIB_LIB = $(ZLIB_DIR)/lib
  ZLIB_INC = $(ZLIB_DIR)/include

Obviously, the details depend on the specifics of a given package, but in general, something like the above is what you want to do. This way, we have the full path to the lib and include directories for the external, and each of those make variables can be used in regular Condor imake input files so that the Condor build can use the external.

config/config.sh.in

The config/config.sh.in file is converted by configure into config/config.sh. The full path to this config.sh file is passed into the externals/build_external script by the Condor build system, which in turn sets it as the $EXTERNALS_CONFIG environment variable. This way, the package-specific build scripts can use information that configure discovers, if that information is necessary to build an external. In general, this is only needed when a given external depends on a different external. For example, the gahp external needs to know where the globus external was installed, and what "globus flavor" was built. The blahp external also needs to know if the platform we're on is trying to build a statically-linked version of Condor or not. So, config/config.sh defines the following variables:

  ...
  GLOBUS_FLAVOR=...
  EXT_GLOBUS_VERSION=...
  HAS_STATIC=...
  ...

Inside build_gahp-1.0.10, we use the following code:

  . $EXTERNALS_CONFIG

  FLAVOR="--with-flavor=$GLOBUS_FLAVOR"
  GLOBUS="--with-globus=$EXTERNALS_INSTALL_DIR/$EXT_GLOBUS_VERSION"
  STATIC_BUILD="--with-static=$HAS_STATIC"

  cd $PACKAGE_BUILD_DIR/$PACKAGE_NAME/

  ./configure --prefix=$PACKAGE_INSTALL_DIR $GLOBUS $FLAVOR $STATIC_BUILD

Changing imake-related stuff

NOTE: if you're just changing the version of an existing external, you probably don't have to mess with any imake stuff at all.

Once autoconf knows all about your new external, and the various make variables have been set in the files in config, there are a few minor changes to the imake input files themselves that need to be made so the Condor build knows about the external. In particular, the src/Imakefile needs to know about your external.

To do this, all you have to do is add a new ext_target to the file. This magic imake rule handles all the details of ensuring the Condor build depends on your external, and that externals/build_external is invoked to build your external at the appropriate time. This target takes two arguments: the package name (which should be held in the make variable EXT_[NAME]_VERSION), and any externals your package depends on.

Here's the boring zlib example. There's nothing to it, since we always want this external built, and it doesn't depend on anything else:

  ext_target(EXT_ZLIB_VERSION,$(NULL))

Here's the complicated globus example. It depends on the GPT external, and we only want to build it if the HAVE_EXT_GLOBUS imake #define is set, which happens when configure finds the globus external:

#if HAVE_EXT_GLOBUS
ext_target(EXT_GLOBUS_VERSION,$(EXT_TRIGGER)/$(EXT_GPT_VERSION))
...
#endif

Note the use of $(EXT_TRIGGER). That's how imake (and therefore, make) know a given external was successfully built. $(EXT_TRIGGER) holds the full path to the triggers directory (described above). Each trigger file is just named with the package name. So, by making the globus external depend on $(EXT_TRIGGER)/$(EXT_GPT_VERSION), we ensure that GPT is built before Globus.

If your external depends on multiple other externals, all you have to do is include multiple trigger files (separated by spaces) in the second argument to ext_target. For example:

ext_target(EXT_FOO_VERSION,$(EXT_TRIGGER)/$(EXT_BAR_VERSION) $(EXT_TRIGGER)/$(EXT_BAZ_VERSION))

Finally, there's the tricky subject of exactly how the Condor imake system should use your external and integrate it with the rest of the Condor source. Eventually we'll have a better answer for that question here. For now, if it's not totally obvious, just talk to someone else on the staff and try to come up with something good. When in doubt, ask Derek.

Once all this is done, you're in a position to fully test your new external with a Condor build. You'll have to re-run autoconf to generate src/configure and re-run the src/configure script itself. After that, a top-level make inside src should be all you need to see your external package built. Once your external build is working and the Condor source sees and uses the new external, you're ready to commit your changes to git.
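Concretely, the test cycle might look something like this (a sketch; the configure options, and whether you use ./build_init or run autoconf by hand, depend on your setup):

  cd src
  autoconf        # regenerate src/configure from src/configure.ac
  ./configure     # plus whatever options you normally use
  make            # the top-level make builds the new external via externals/build_external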


Dealing with externals caches

The Condor build system allows built externals to be stored outside of your immediate build tree. These external caches can be shared across multiple build locations and users, greatly reducing the time to do a fresh build of Condor. This is why changing anything in how an external is built requires you to create a new version of the external.

The cache consists of the following directories:

externals/build
externals/install
externals/triggers

If you don't use an external cache, these directories will be created in your build directory.

There are three situations where you'll see an external cache:

Your own cache

You can specify your own externals cache directory using the --with-externals command-line option to configure like so:

./configure --with-externals=/scratch/externals-cache

The cache in AFS at UW-CS

The Condor team has a shared externals cache on AFS in /p/condor/workspaces/externals. The tree is set up with @sys links to separate files by platform. If you don't use the --with-externals option to configure and this directory exists, configure will use this cache automatically. If you don't want to use this cache, you can explicitly disable it like this:

./configure --with-externals=`pwd`/../externals

The cache in NMI

We keep an externals cache on all of the machines in the NMI Build and Test facility. By default, the Condor glue scripts for NMI don't use the cache. You can enable use of the cache with the --use-externals-cache option to condor_nmi_submit. The automated builds all do this.

New externals and caching

One of the fundamental assumptions of the externals cache is that a particular external version will never change. While you're preparing a new external or external version, this will not be true. Thus, you need to be careful not to pollute any shared caches with old revisions of your new external.

The easiest way to do this is to not use any shared external caches. If you're using a machine at UW-CS, you can explicitly disable use of the cache in AFS. The downside to this is that you have to build all of the externals in your local build space. You can play games with symlinks to minimize this rebuilding.
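For example, one version of that symlink game looks like this (a sketch only; the shared AFS cache actually uses @sys links to separate platforms, so check the real layout before copying these paths):

  # reuse an already-built, unchanged external from the shared cache,
  # while your new external still builds in your local tree
  ln -s /p/condor/workspaces/externals/install/zlib-1.1.3 externals/install/zlib-1.1.3
  ln -s /p/condor/workspaces/externals/triggers/zlib-1.1.3 externals/triggers/zlib-1.1.3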

If you do decide to use the AFS cache, you must make sure it has the final revision of your external once you're ready to check it in. You can do so by simply removing the appropriate trigger file in /p/condor/workspaces/externals/triggers.

The externals caches in NMI are trickier. Each machine has a local cache and some platforms have multiple machines. Manually clearing the caches on all of the machines is cumbersome and error-prone. Better to never use the externals cache when developing a new external.