OUT OF DATE
This document is out of date. It describes how things were done in the configure/imake world; in the cmake world, things are different. I've done some work to update it (2011-8-28), but there are almost certainly errors.
Adding a new external package to condor
This document describes how to add a new external package to the Condor build. There are a number of elements to this task which are described below:
- Packaging the external so it can be built by the build system
- Setting up the source tarball
- Putting the source tarball on the web server
- Making the build script
- Build script environment
- Build script syntax
- Build script exit codes
- Build script example
- Handling patches (optional)
- Telling the build system about the new package
- Telling CMake about your package
- Dealing with externals caches
Before getting into the specifics, here's the 10,000 foot view of how the Condor build system deals with externals:
- The actual source tarball is hosted on a public web server.
- cmake decides which externals we need. The version is part of the cmake information for the branch.
- build/cmake/CondorConfigure.cmake looks for the external-specific CMakeLists.txt.
- The individual external CMakeLists.txt describe how to download, patch, and build the external.
- Externals are built and stored in /scratch/condor-externals; you can change this default location with something like -DEXTERNAL_STAGE:PATH=/path/to/a/private/directory when you invoke cmake.
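For example, to point the build at a private externals directory instead of the default above (a sketch; the source path is just whatever checkout you are configuring):

  cmake -DEXTERNAL_STAGE:PATH=/path/to/a/private/directory /path/to/condor/source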
Understanding the above overview will help make sense of the specific steps needed for modifying the build system to know about and use a new external.
In general, the process of adding a new version of an existing external is very similar to adding a whole new external.
Naming schemes and conventions
Before we get into the nuts and bolts, there are a few naming conventions that need to be defined and clarified.
Each external package is identified by 2 things:
- Name (e.g. "globus" or "krb5")
- Version (e.g. "1.2.7")
These two parts of the package identification are important in numerous places, and it's important to keep it all straight. For the rest of this document, anywhere you see [name] or [version], you should substitute the right thing based on the package you're adding.
Regarding the name: it should be descriptive, but not too long or verbose. "z" is not a good name; neither is "zlib-compression-library". Usually the right choice is pretty obvious.
Regarding the version: ideally, this should be the official version number of the external package as decided by its authors. If we patch a given external, that doesn't change the official version we're basing the package on, and that's what the version number should be.
However, if you want to add new patches to an existing external, you should change the version number and add a new external! We do NOT want multiple things that are different code using the same version! If different code shares the same name and version, externals caching may lead to builds using the wrong patches.
When adding new patches to an existing external, add a -p1 to the [version] to get your new version. If the [version] already ends in -p<number>, increment the number for your new version. For example, 1.2.7 becomes 1.2.7-p1, and 1.2.7-p1 becomes 1.2.7-p2. Ideally, you would continue to use the original tarball (with its original name); at the moment this is not possible, as the CMakeLists.txt for most (all?) of the externals assume that the tarball name, the directory name inside the tarball, and the version are identical.
If you have any questions or are uncertain, just ask. Good people to ask are TODO.
Finally, the combination of [name]-[version] is often referred to in the build system (and this document) as the package name, and you'll see it listed as [package-name].
Packaging the external so it can be built by the build system
Each external package used to build Condor lives in a unique directory as part of the externals directory tree. The basic directory layout of externals is as follows:

  externals/
      bundles/
          [name]/
              [version]/
The bundles directory contains subdirectories for each kind of package, and each package directory has a subdirectory for each version of that package. Some packages may have multiple versions because different operating systems require different versions. For example, at the time of this writing, we've got 5 different versions of the glibc external, each living in its own subdirectory of externals/bundles/glibc.

  externals/
      bundles/
          glibc/
              2.3.2.27.9.7/
              2.5-20061008T1257-p0/
              2.5-20061008T1257-x86_64-p0/
              2.7-18/
              2.7-18-x86_64/
Inside each version-specific subdirectory are several things:
- CMakeLists.txt, which describes how to build the external. It will include the URL to download the source tarball(s) from (ideally, exactly what you'd download from the author's distribution, unmodified).
- A build script, build_[package-name], usually partnered with a Windows-specific script of the form build_[package-name].bat.
- Optionally, patch file(s) which include modifications we had to make to the package to get it to work with Condor.
Each of these things is addressed in the following sections...
Setting up the source tarball
In theory you shouldn't need to modify the source tarball of the external package at all; we want the original, unmodified source whenever possible. In practice you'll need to repackage the tar so that it follows the naming rules described just below.
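As a minimal sketch of that repackaging, assuming a made-up package called foo at version 1.2.3:

  # Unpack the upstream tarball, rename its top-level directory to match
  # [package-name], and re-tar it under the [package-name].tar.gz convention.
  tar xzf foo_1.2.3_orig.tar.gz       # upstream name; unpacks to e.g. foo-1.2.3-src/
  mv foo-1.2.3-src foo-1.2.3
  tar czf foo-1.2.3.tar.gz foo-1.2.3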
First, the name of the tarball must be in the form [package-name].tar.gz; for example, "krb5-1.2.8.tar.gz". Second, the contents of the tarball must be in a subdirectory named [package-name]. The current CMakeLists.txt files assume this.
Putting the source tarball on the web server
The source tarballs live on the web server parrot.cs.wisc.edu. They are synced periodically from the following directories on AFS at UW-CS:
/p/condor/repository/externals
/p/condor/repository/externals-private
The latter directory is for files that can't be publicly distributed; currently, the only thing there is a LIGO application for testing standard universe. Be sure that the tarball is world readable: the files are copied to local disk on parrot, so AFS permissions are not used, only standard Unix permissions. Once synced, the files can be fetched from the web server.
TODO: How often is this synchronized? Is there a way to force a sync faster?
Once you place a tarball in /p/condor/repository/externals or /p/condor/repository/externals-private, you shouldn't change it, and you must not change it once a mention of it is pushed to the shared git repository. If the contents of your source tarball(s) may change while you're preparing to push your changes, you have two options for temporarily locating the tarballs:
- Local builds: Place the tarball directly in the external's directory. Modify the package's CMakeLists.txt, looking for ExternalProject_Add. Inside that function call, comment out the "DOWNLOAD_COMMAND" line. Comment out the "URL" line and add a new one that looks like "URL /full/path/to/my/tarball.tar.gz" (a sketch follows this list). This will only work on your localhost; you cannot use a relative path in the URL. Do not push any commits that contain the full tarball to the public git repository.
- Local webserver: Place the tarball at some other ftp or http location. Your CS web page (http://pages.cs.wisc.edu/~username) or the temporary directory under Condor's anonymous ftp server are possibilities. Then edit the CMakeLists.txt to load from the unusual location; you'll be changing the DOWNLOAD_COMMAND and URL arguments to ExternalProject_Add.
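Here's a minimal sketch of the local-build variant. The package name, version, and path are hypothetical, and the rest of your package's existing ExternalProject_Add arguments stay exactly as they are:

  ExternalProject_Add( foo-1.2.3
      # DOWNLOAD_COMMAND ...                 <- commented out for a local build
      # URL ...the original download URL...  <- commented out for a local build
      URL /full/path/to/foo-1.2.3.tar.gz
      # ...the package's existing configure/build/install arguments, unchanged...
  )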
TODO: adesmet's edits end here
Making the build script
When the Condor build is trying to build your external, first it will create a temporary sandbox build directory. The [name]-[version].tar.gz will be downloaded and untarred into the sandbox. Then, the build_[name]-[version] script will be invoked with the current working directory set to the sandbox build directory. This script is responsible for building the external package in whatever way makes sense. The calling layer assumes the responsibility of cleaning up the build sandbox. It will also keep track of whether or not things need to be rebuilt.
Build script environment variables
In addition to being born with the build sandbox directory as the current working directory, the following environment variables will be set (these are literal... no substitution of package name or anything):
$PACKAGE_NAME            # the name of this package
$PACKAGE_SRC_NAME        # package name from source tarball
$PACKAGE_DEBUG           # empty if release, '-g' if debug build
$PACKAGE_BUILD_DIR       # the sandbox build directory
$PACKAGE_INSTALL_DIR     # where to put the results
$EXTERNALS_INSTALL_DIR   # full path to root of externals/install
$EXTERNALS_CONFIG        # full path to config.sh with config variables
$PACKAGE_NAME is the [name]-[version] identifying string for your package.
$PACKAGE_SRC_NAME is the same as $PACKAGE_NAME, except it will not have any -p<number> on the end if the source tarball doesn't have it. This simplifies allowing multiple external versions to use the same source tarball; for example, $PACKAGE_NAME might be krb5-1.2.5-p1 while $PACKAGE_SRC_NAME is krb5-1.2.5.
$PACKAGE_BUILD_DIR is a subdirectory of externals/build, named with the package-name. This is just a temporary sandbox directory, and build_external will remove this directory and all its contents as soon as your build script exits.
$PACKAGE_INSTALL_DIR is a subdirectory of externals/install, named with the package-name as well. This directory won't necessarily exist when your build script is spawned, so your script is responsible for creating this directory in whatever way is appropriate for your package. Most packages don't have to worry about it, since make install takes care of it for you. Some packages need to copy a whole tree to this location, so we don't want to make you rmdir if you don't want it there in the first place. However, the parent of this directory (the directory path stored in $EXTERNALS_INSTALL_DIR) is guaranteed to exist, so you don't need to worry about mkdir -p.
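If your package's install step doesn't create that directory itself, a minimal sketch of doing it by hand (the 'lib' and 'include' subdirectories copied here are just an illustration):

  # Create the install directory (its parent is guaranteed to exist) and
  # copy the built tree into it.
  mkdir $PACKAGE_INSTALL_DIR
  cp -r lib include $PACKAGE_INSTALL_DIR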
If your build script wants to know things that the Condor build system determines at run-time (for example, what globus flavor we're trying to build), it can source the file specified in $EXTERNALS_CONFIG (details on sourcing this file below). This is a simple bourne shell syntax file that defines a bunch of shell variables holding information that the configure script discovers. If your external needs to know something that the configure script figures out but that isn't yet in config/config.sh, all you have to do is add a new line to config/config.sh.in to define the variable you need. Also, note that if you want to pass any of these variables on to the configure script or build system of your package, you either have to pass them in as arguments or manually export them yourself as environment variables (by default, you just get shell variables in your script, not environment variables).
Build script syntax
YOUR BUILD SCRIPT MUST BE PLAIN OLD BOURNE SHELL!!!
Do NOT use any bash syntax, since it won't work on all 15+ Condor platforms. For example, source is not portable across all versions of bourne shell. The correct way to include this file in your build script is to use:
. $EXTERNALS_CONFIG
Similarly, to set an environment variable, you can NOT do this:
export FOO=bar
You MUST use:
FOO=bar
export FOO
Another shell idiom that does NOT work is using this:
if( ! make ); then blah blah; fi
You MUST use:
make
if [ $? -ne 0 ]
then
    blah blah
fi
Build script exit codes
If your script encounters an error during the build, exit with status 1.
After building, you should run any tests that are available.
If any tests do not pass or any other error occurs, exit with status 2.
After passing tests, your script should install the package into the directory $PACKAGE_INSTALL_DIR in whatever format is customary for that package.
If there is a problem during installation, exit with status 3.
If there is any problem specific to your package, exit with status 10 or higher; statuses 4-9 are reserved for future use. Only exit with status 0 if everything is fully installed and working (to the best of your knowledge).
Build script example
Here is an example build_generic_package, which would build most things that follow the common tar and 'configure / make / make test / make install' convention:
#!/bin/sh
############# build_generic_package

cd $PACKAGE_SRC_NAME/src

./configure --prefix=$PACKAGE_INSTALL_DIR --with-ccopts=$PACKAGE_DEBUG

make
if [ $? -ne 0 ]
then
    echo "make failed"
    exit 1
fi

make test
if [ $? -ne 0 ]
then
    echo "test failed"
    exit 2
fi

make install
if [ $? -ne 0 ]
then
    echo "install failed"
    exit 3
fi

exit 0
############# end of build_generic_package
Handling patches (optional)
If your external needs to be patched, the preferred method for dealing with it is to create patch files and put those patches into the version-specific subdirectory. Then, the build script for the package can invoke patch as needed. An example to look at would be the krb5-1.2.5 external (externals/bundles/krb5/1.2.5/).
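A minimal sketch of what that might look like inside a build script; the patch file name is hypothetical, and where the patch file lives in your sandbox depends on how your build makes it available:

  # Apply our patch to the unpacked source before configuring.
  patch -p1 -d $PACKAGE_SRC_NAME < fix-build.patch
  if [ $? -ne 0 ]
  then
      echo "patch failed"
      exit 1
  fi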
Again, if you want to add additional patches to an existing external, you MUST make an entirely new external package with a different version number (e.g. krb5-1.2.5-p1) so that we can tell the difference between the two versions. This is a little wasteful of space, unfortunately, but there's no way around that at this time.
Telling the build system about the new package
Once your package is set up in the externals tree and the build script is ready, you've got to tell the Condor build system about the new package. There are a few separate places where this needs to happen.
Changing autoconf-related stuff
The first step in integrating a new external with the build system is to modify src/configure.ac, the input file for autoconf. Near the bottom of this file, there's a section of tests that determine what version of each external we want to use. The section begins with the following code block:
############################################################
# What versions of what externals do we need
############################################################
AC_MSG_NOTICE([determining versions of external packages])
Some of the "tests" are trivial, they just define a version to use
directly. Other tests do something more fancy, like choose a version
depending on the platform we're on. Ultimately, autoconf
is
just a fancy way of using M4 macros to make a giant bourne shell
script. If you following the existing syntax and examples already in
this section, you should be in good shape. If you have trouble,
either check out the autoconf
documentation (for example autoconf 2.57).
The important thing is that once you know what version of the external you need, you call our special autoconf macro, CHECK_EXTERNAL(), which handles everything else for you.
For example, here's a simple case, zlib:
CHECK_EXTERNAL([zlib],[1.1.3], [soft])
This handles the following tasks for you:
- Defines what version of zlib we need.
- Checks that the given version of zlib exists in the externals tree the build is configured to use.
- Prints out a message to the screen about the version of zlib we're going to be using.
- Tells autoconf that any file it is producing should substitute any occurrence of @ext_zlib_version@ with the value "zlib-1.1.3".
- Provides you with HAVE_EXT_ZLIB for use in #ifdef's, if the zlib is found.
- Tells autoconf that the external is a soft requirement. More on this later.
If you're just changing the version of an existing external, that's probably all you'll have to do to the autoconf stuff, and you can skip right to the discussion of git changes. However, if you're adding a whole new external package, there are a few more steps (both for autoconf and imake), so read on... In either case, before using your new external you should run ./build_init to repopulate the cf files with correct info.
To add a whole new kind of external, you've got to understand a bit about that [soft] argument (the requirement level), and modify config/externals.cf.in and config/config.sh.in to hold the results of the new test you added in src/configure.ac. Again, if you just edit the files and follow the existing examples, you'll probably be in fine shape. But, for the sake of complete docs, I'll explain in a little more detail here:
requirement levels
There are three requirement levels, and they determine how dependent Condor is on an external. The levels are soft, hard, and optional. The soft level is the most common and is likely what you need to specify. It means that Condor can compile and operate just fine without the external. When configure is run it will look for soft requirements, but will only give a warning if they are not available. This is in contrast to hard requirements, which are always required: Condor either will not compile or will not run without them. They MUST be present; not being present will cause configure to fail. You want to avoid adding hard externals to Condor. The third level, optional, is just as uncommon as hard, if not more so. It operates almost exactly like soft, with one exception: there is a --with-soft-is-hard option that can be given to configure to treat all soft requirements as if they were hard requirements. This option does not change how optional externals are treated. So, you really want your external to be soft, unless you have a really good reason for it not to be.
config/externals.cf.in
The most important thing is to define a new make variable to hold the version of the external you added. For example, here's what happens for zlib:
EXT_ZLIB_VERSION = @ext_zlib_version@
The rest of the build system that needs to know how to find zlib can now use this variable. Almost always, what we really care about is the installed output of the external build. This is a common way to deal with that:
ZLIB_DIR = $(EXT_INSTALL)/$(EXT_ZLIB_VERSION)
ZLIB_LIB = $(ZLIB_DIR)/lib
ZLIB_INC = $(ZLIB_DIR)/include
Obviously, the details depend on the specifics of a given package, but in general, something like the above is what you want to do. This way, we have the full path to the lib and include directories for the external, and each of those make variables can be used in regular Condor imake input files so that the Condor build can use the external.
config/config.sh.in
The config/config.sh.in file is converted by configure into config/config.sh. The full path to this config.sh file is passed into the externals/build_external script by the Condor build system, which in turn sets it as the $EXTERNALS_CONFIG environment variable. This way, the package-specific build scripts can use information that configure discovers if that information is necessary to build an external. In general, this is only needed when a given external depends on a different external. For example, the gahp external needs to know where the globus external was installed, and what "globus flavor" was built. The blahp external also needs to know if the platform we're on is trying to build a statically-linked version of Condor or not. So, config/config.sh defines the following variables:
...
GLOBUS_FLAVOR=...
EXT_GLOBUS_VERSION=...
HAS_STATIC=...
...
Inside build_gahp-1.0.10, we use the following code:
. $EXTERNALS_CONFIG

FLAVOR="--with-flavor=$GLOBUS_FLAVOR"
GLOBUS="--with-globus=$EXTERNALS_INSTALL_DIR/$EXT_GLOBUS_VERSION"
STATIC_BUILD="--with-static=$HAS_STATIC"

cd $PACKAGE_BUILD_DIR/$PACKAGE_NAME/
./configure --prefix=$PACKAGE_INSTALL_DIR $GLOBUS $FLAVOR $STATIC_BUILD
Changing imake-related stuff
NOTE: if you're just changing the version of an existing external, you probably don't have to mess with any imake stuff at all.
Once autoconf knows all about your new external, and the various make variables have been set in the files in config, there are a few minor changes to the imake input files themselves that need to be made so the Condor build knows about the external. In particular, src/Imakefile needs to know about your external.
To do this, all you have to do is add a new ext_target to the file. This magic imake rule handles all the details of ensuring the Condor build depends on your external, and that externals/build_external is invoked to build your external at the appropriate time. This target takes two arguments: the package name (which should be held in the make variable EXT_[NAME]_VERSION), and any externals your package depends on.
Here's the boring zlib example. There's nothing to it, since we always want this external built, and it doesn't depend on anything else:
ext_target(EXT_ZLIB_VERSION,$(NULL))
Here's the complicated globus example. It depends on the GPT external, and we only want to build it if the HAVE_EXT_GLOBUS imake #define is set, which happens when configure finds the globus external:
#if HAVE_EXT_GLOBUS
ext_target(EXT_GLOBUS_VERSION,$(EXT_TRIGGER)/$(EXT_GPT_VERSION))
...
#endif
Note the use of $(EXT_TRIGGER). That's how imake (and therefore, make) knows a given external was successfully built. $(EXT_TRIGGER) holds the full path to the triggers directory (described above). Each trigger file is just named with the package name. So, by making the globus external depend on $(EXT_TRIGGER)/$(EXT_GPT_VERSION), we ensure that GPT is built before Globus.
If your external depends on multiple other externals, all you have to do is include multiple trigger files (separated by spaces) in the second argument to ext_target. For example:
ext_target(EXT_FOO_VERSION,$(EXT_TRIGGER)/$(EXT_BAR_VERSION) $(EXT_TRIGGER)/$(EXT_BAZ_VERSION))
Finally, there's the tricky subject of exactly how the Condor imake system should use your external and integrate it with the rest of the Condor source. Eventually we'll have a better answer for that question here. For now, if it's not totally obvious, just talk to someone else on the staff and try to come up with something good. When in doubt, ask Derek.
Once all this is done, you're in a position to fully test your new external with a Condor build. You'll have to re-run autoconf to generate src/configure and re-run the src/configure script itself. After that, a top-level make inside src should be all you need to see your external package built. Once your external build is working and the Condor source sees and uses the new external, you're ready to commit your changes to git.
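As a rough sketch of that cycle (exact flags and working directories may differ on your branch):

  # Regenerate and re-run configure, then rebuild, using the stock tools named above.
  cd src
  autoconf          # regenerate src/configure from src/configure.ac
  ./configure       # pick up the new external version
  make              # top-level build; the external is built when it's needed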
Dealing with externals caches
The Condor build system allows built externals to be stored outside of your immediate build tree. These external caches can be shared across multiple build locations, greatly reducing the time to do a fresh build of Condor. This is why changing anything in how an external is built requires you to create a new version of the external.
The cache has one directory per package and version number.
Externals are, by default, built and stored in /scratch/condor-externals; you can change this default location with something like -DEXTERNAL_STAGE:PATH=/path/to/a/private/directory when you invoke cmake.
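The layout of that cache is, roughly, one subdirectory per [package-name]; something like the sketch below (the exact entries depend on which externals your branch uses):

  /scratch/condor-externals/
      krb5-1.4.3-p1/
      zlib-1.1.3/
      ...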
FAQS
How do I rebuild a specific external?
TODO
Unsorted notes
Updating/upgrading
- Update build/cmake/CondorConfigure.cmake, looking for a line like
add_subdirectory(${CONDOR_EXTERNAL_DIR}/bundles/krb5/1.4.3-p1)
- A Windows-specific build file (build-*.bat) may have the version number hard-coded into it. You'll need to update it by hand.
- The version number is embedded in externals/bundles/packagename/version/CMakeLists.txt. Look for something like
condor_pre_external( KRB5 krb5-1.4.3-p0 "lib;include" "include/krb5.h")