Nix-notes.org


Overview

The idea is that every build takes place in a cabal sandbox.

We ultimately want to create a derivation that has been parameterized by a set of constraints. We obtain the right constraints by running cabal install --dependencies-only --dry-run on the downstream package we are trying to build, then using that build plan to constrain the solver when building each upstream package. The buildplan file becomes part of the Nix derivation of the upstream package, meaning we can have multiple instances of the same version of a package that have been built with different dependencies.

Those very specific derivations go into the Nix store so that they might be re-used when the same specific instance of a package version is again required.
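For concreteness, a buildplan is just one package per line in the form "pkgName X.Y.Z" (the format produced by the dryDependencies helper defined below); a hypothetical example:

text 1.2.0.4
transformers-compat 0.4.0.4
lens 4.7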

Installation

The cabbage script is tangled from this file. Put it on your PATH, or invoke it however you prefer. It is a regular bash script that calls out to sed, awk, grep, etc.
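For example (a sketch; ~/bin stands in for any directory already on your PATH):

cp cabbage ~/bin/
chmod +x ~/bin/cabbage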

Development Workflow

Run cabbage in your project’s root directory (containing the project’s .cabal file). Then run,

nix-shell --command 'sh $setup'

to make sure all dependencies are built and linked into your sandbox. At this point, you can cabal configure etc., outside of nix-shell and work as usual.
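Concretely, a typical first run looks like this sketch ($setup is the setup attribute of the generated Nix expression):

cabbage                          # generate default.nix and shell.nix
nix-shell --command 'sh $setup'  # build dependencies and link them into the sandbox
cabal configure                  # now work as usual, outside of nix-shell
cabal build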

To build the current package with Nix, use cabbage -b instead of nix-build shell.nix. The former is a short wrapper around the latter. It simply runs cabal configure and cabal sdist before calling nix-build shell.nix.

Test and Benchmark Suites

If you have benchmark or test suites, you will want to run cabbage -a to indicate that all targets should be considered when gathering dependencies.

System dependencies

It will take some work to alias names in a Cabal package’s extra-libraries field to Nix package names. For the time being, you can use a cabbage.config file in your project directory to specify systemDeps for each package. These will be propagated to downstream dependencies so that they will be available in the build environment. This part of a cabbage.config file looks like this,

systemDeps:
  mypkg: foo bar

This says that foo and bar should be listed in the buildInputs field of mypkg and everything that depends on it. The names foo and bar are resolved in the nixpkgs scope.

A real example is for the zlib package that requires the nixpkgs package named zlib. The repetition is a bit confusing, but remember that the keys here are Cabal packages, while the values are Nix packages.

systemDeps:
  zlib: zlib

This says that the Haskell package zlib (the one to the left of the colon) depends upon the Nix package zlib that we get from nixpkgs. Every Cabal package that depends on the Haskell zlib package will have the Nix zlib package in its build environment.

Flags

If there are flags that should be set for the package you are building or any of its dependencies, you can create a file cabbage.config that looks like this,

flags:
  MyPackage: demos -silly
  transformers-compat: three

This file will cause dependencies in your project guarded by the demos flag to be pulled in, while the negative setting of silly will cause Cabal to treat that flag as False. Assuming that transformers-compat is a transitive dependency of this package, it will be built with the three flag which is needed when your project ultimately must build against transformers < 4.

Using nix-build

Rather than use the entire source directory as the source of the Nix package, cabbage uses a Cabal sdist as the source of the Nix package when building an unpacked Cabal package. This means that Nix doesn’t waste time hashing data files or other junk you may have in your project directory. The upshot is that you must run cabal sdist before running nix-build shell.nix. If you do so, you should get an executable in result/bin as usual for Nix.
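In other words, building by hand rather than with cabbage -b looks like this sketch:

cabal configure
cabal sdist
nix-build shell.nix
ls result/bin   # executables end up here as usual for Nix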

Working with other packages

You can also pass cabbage a path to a .cabal file or give it the name of a package on hackage (which itself may have a version suffix). In this case, cabbage will do its work in a temporary directory. The side effect of building a named package is that the package and all its dependencies will now exist in your Nix store so that subsequent builds may make use of them.

Another use of this named installation interface is to install executables. To link the executables (e.g. pandoc) built from a package into your environment, follow the instructions given in the last line of cabbage output after running cabbage pandoc. It will give you a nix-env -i invocation that points to where pandoc has been built in your Nix store so that it gets linked into your environment.

Building Stackage

Building What We Can

To build the current Stackage Nightly package set, create an empty directory and run cabbage -n stackage. The -n specifies the Stackage Nightly constraint, while the name stackage is treated specially to create a dummy package based on the constraints given by a cabal.config file.

Once this has finished, you can have Nix build all the dependencies by issuing nix-shell shell.nix --keep-going. Missing system dependencies are likely to cause some packages to fail, so we build as much as we can.
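Putting those steps together, a sketch of the whole Nightly build:

mkdir stackage-nightly && cd stackage-nightly
cabbage -n stackage               # fetch the Nightly cabal.config and generate Nix expressions
nix-shell shell.nix --keep-going  # build everything that can be built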

Now you can create a Nix binary cache. This is slightly complicated by the fact that we may not have succeeded in building all the dependencies. To deal with this, we can find all the derivations that are direct dependencies of our Stackage derivation (this will include derivations for packages that failed to build), and then test whether or not the output paths of each of those derivations actually exist.

Pushing to the Cache

nix-store -q --references $(nix-instantiate shell.nix) | grep '\.drv' | awk '{print "nix-instantiate --eval --expr '\''(import " $0 ").outPath'\''"}' | bash | awk '{print "[ -d " $0 " ] && $(nix-store -r " $0 " &>/dev/null) && echo " $0}' | bash | xargs nix-push --dest bcache

where bcache is the directory in which you’d like to create the binary cache.

Shebang

Put a shebang line at the top of our tangled program.
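The block itself is trivial; a sketch (the exact interpreter line in the original may differ):

#!/usr/bin/env bash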

Cabal file helpers

Tooling for working with .cabal files.

cabalFileExists() {
  local NUMCABALS=$(find . -maxdepth 1 -name '?*.cabal' | wc -l)
  if [ "$NUMCABALS" -gt 1 ]; then
    return 2
  elif [ "$NUMCABALS" -eq 1 ]; then
    return 0
  else
    return 1
  fi
}

# Print only the library target portion of a .cabal file and filter
# out line comments.
isolateLibraryTarget() {
  sed -n '/^[Ll]ibrary/,/^[^[:space:]]/ { /^[Ll]ibrary/p; /^[[:space:]]/p; }'
}

# Print only the executable target portions of a .cabal file.
isolateExecutableTarget() {
  sed -n '/^[Ee]xecutable/,/^[^[:space:]]/ { /^[Ee]xecutable/p; /^[[:space:]]/p; }'
}

isolateLibraryAndExecutableTargets() {
  local TMP=$(cat *.cabal | sed 's/--.*$//')
  echo "$TMP" | isolateLibraryTarget
  echo "$TMP" | isolateExecutableTarget
}

# Remove any Cabal block guarded by an "if os(windows)" or "if
# os(solaris)" or "if os(ios)" conditional. This is a very fragile
# test!
removeWindowsBlocks() {
  local AWK
  read -r -d '' AWK<<'EOF'
BEGIN { windowsIndent = 0; }
{
  if(match($0, /if os\(windows\)/)) {
    windowsIndent = RSTART;
  } else if(match($0, /if os\(solaris\)/)) {
    windowsIndent = RSTART;
  } else if(match($0, /if os\(ios\)/)) {
    windowsIndent = RSTART;
  } else if(windowsIndent > 0) {
    match($0, /^[[:space:]]*/);
    if(RLENGTH <= windowsIndent) {
      windowsIndent = 0;
      print($0);
    }
  } else {
    print($0);
  }
}
EOF
  awk "$AWK"
}

# Print the library target of any .cabal file in the current directory
# while removing any blocks guarded by a windows or solaris check.
cabalLibraryTarget() {
  if cabalFileExists ; then
    cat ./*.cabal | sed 's/--.*$//' | isolateLibraryTarget | removeWindowsBlocks
  fi
}

# Take all lines until the next Cabal file stanza begins. The
# assumption is that the first line is the beginning of a stanza, so
# its indentation level determines where the next stanza begins.
stanzaHead() {
  local AWK
  read -r -d '' AWK<<'EOF'
BEGIN { firstLine = 1; }
{
  if(firstLine) {
    match($0, /^[[:space:]]*/);
    stanzaIndent = RLENGTH;
    print $0;
    firstLine = 0;
  } else {
    match($0, /^[[:space:]]*/);
    if(RLENGTH <= stanzaIndent) {
      exit;
    } else {
      print $0;
    }
  }
}
EOF
  awk "$AWK"
  
}

pkgconfig-depends

# Print out a .cabal file starting with a pkgconfig-depends line.
pkgconfigDependsStarts() {
  cat ./*.cabal | sed 's/--.*$//' | sed -n '/[[:space:]]*[Pp]kg[Cc]onfig-[Dd]epends:/,$ p'
}

# Pull all pkgconfig dependencies from a .cabal file
pkgconfigDepends() {
  if cabalFileExists ; then
    grep -q -i "[[:space:]]*pkgconfig\-depends:" *.cabal
    if [ $? -eq 0 ]; then
      pkgconfigDependsStarts | stanzaHead \
 | sed -e 's/[Pp]kg[Cc]onfig-[Dd]epends:[[:space:]]*//' -e 's/,/\
 /g' | sed 's/^[[:space:]]*//' | sed '/^$/d'
    else
      echo
    fi
  fi
}

pkgconfig test

<<cabalHelpers>>
<<pkgconfig>>
cd ../ffmpeg-light
pkgconfigDepends
PKGS=($(pkgconfigDepends))
echo "There are ${#PKGS[@]} pkg deps"

Build-tools

Pick out all build-tools used by a package and all of its dependencies.

# Print each build-tool without any version constraint.
cabalBuildTools() {
  if cabalFileExists ; then
    grep -q -i "[[:space:]]*build-tools:" ./*.cabal
    if [ $? -eq 0 ]; then
      isolateLibraryAndExecutableTargets | removeWindowsBlocks \
    | grep -i "build-tools" | awk 'BEGIN { FS=":"; } { print($2); }' | sed 's/,/\
/g' | sed -e 's/^[[:space:]]*//' -e 's/[[:space:]].*$//'
    else
     echo
    fi
  fi
}

# GTK's build tools package actually defines multiple build-tool
# executables. If a Cabal package refers to the executable name, we
# need to map that back to the Cabal package that provides that
# executable.
resolveBuildToolPackages() {
  sed -e 's/gtk2hsC2hs/gtk2hs-buildtools/' \
      -e 's/gtk2hsTypeGen/gtk2hs-buildtools/' \
      -e 's/gtk2hsHookGenerator/gtk2hs-buildtools/'
}

allBuildToolsAux() {
  cabalBuildTools | resolveBuildToolPackages
  if [ -d .cabbages ]; then
    local DEPS=($(buildplanDependencies))
    local d
    (cd .cabbages && \
     for d in "${DEPS[@]}"; do
       (cd "$d" &&  cabalBuildTools | resolveBuildToolPackages)
     done)
  fi
}

# Print the array of all build-tools used to build this package and
# all its dependencies.
allBuildTools1() {
  local TOOLS=($(allBuildToolsAux | sort -u))

  # hsc2hs comes with GHC
  local special
  for special in hsc2hs ghc; do
    local i=$(findIndex "$special" TOOLS[@])
    if [ "$i" -gt -1 ]; then
      unset TOOLS[$i]
    fi
  done

  echo "${TOOLS[@]}"
}

# Print the array of all build-tools used to build this package, all
# its dependencies, and the build-tools needed to build those
# build-tools.
allBuildTools() {
  local TOOLS=($(allBuildTools1))

  local t
  local TOOLS2=()
  if [ -d .cabbages ]; then
    for t in "${TOOLS[@]}"; do
      local latest=$(find .cabbages -name "$t-[[:digit:].]*" -depth 1 | tail -n 1)
      TOOLS2+=($(cd "$latest" && allBuildTools1))
    done
  fi

  ((for t in "${TOOLS[@]}"; do
     printf "%s\n" "$t"
   done;
   for t in "${TOOLS2[@]}"; do
     printf "%s\n" "$t"
   done) | sort -u)
}

Test

<<cabalHelpers>>
<<findIndex>>
<<allExtraLibraries>>
<<allBuildTools>>
<<dryDependencies>>
#cd ~/temp/rifactor
#cd ~/temp/OpenCL-1.0.3.4
cd ../Hocl-Render
#cd ../Frames
#cd ../Hocl-OpenCL
allBuildTools

Frameworks

We can pick out framework requirements that are needed on OS X (darwin).

cabalFrameworks() {
  if cabalFileExists ; then
    grep -q -i "[[:space:]]*frameworks:" ./*.cabal
    if [ $? -eq 0 ]; then
      cabalLibraryTarget \
  | grep -i "frameworks" | awk 'BEGIN { FS=":"; } { print($2); }' | sed 's/,/\
/' | sort -u
    fi
  fi
}

cabal install --dependencies-only --dry-run

Rather than using cabal-freeze, we can run cabal install --dependencies-only --dry-run in a fresh sandbox to obtain a list of all dependencies not available in the global database.

# Prints out lines of the form "pkgName X.Y.Z" (where X, Y, and Z are
# numbers). One line for each package that will have to be installed
# in a sandbox given the current version of GHC and the contents of
# its global package database.
dryDependencies() {
  local EXTRAS=""
  if [ "$#" -gt 0 ]; then
    if [ $1 -eq 2 ]; then
      EXTRAS="--enable-tests --enable-benchmarks"
    fi
  fi
  if [ -f cabbage.config ]; then
    local MYNAME=$(cat ./*.cabal | grep -i "name:" | awk 'BEGIN { FS=":"; } {print($2);}' | sed 's/^[[:space:]]*//')
    local FLAGS=$(cat cabbage.config | flagsFor "$MYNAME")
    if ! [ "$FLAGS" = "" ]; then
      EXTRAS="$EXTRAS --flags=\"$FLAGS\""
    fi
  fi
  local CMD="cabal install --dependencies-only --dry-run"
  if ! [ "$EXTRAS" = "" ]; then
    CMD="$CMD $EXTRAS"
  fi
  eval $CMD \
| sed -n '3,$ p'  | sed '/Warning: /,$ d' \
| sed -e 's/ .*$//' -e 's/\([-_[:alnum:]]*\)-\([[:digit:].]*\)$/\1 \2/' \
| sed '/^[-_[:alnum:]]* [[:digit:]]/ !d'
}

# Reads the buildplan file in the current directory and prints one
# package per line in the form "pkgname-X.Y.Z".
buildplanDependencies() {
  if [ -f buildplan ]; then
    cat buildplan | sed 's/\([^ ]*\) \(.*\)$/\1-\2/'
  else
    echo "No buildplan found in $(pwd)"
    exit 1
  fi
}

# Generate cabal.config contents from a buildplan. This removes
# cabbage-patched version numbers so that "cabal install" can work
# properly.
buildplanConstraints() {
  echo "constraints:"; (cat buildplan | sed 's/^\([^ ]*\) \(.*\)$/  \1 ==\2,/' | sed 's/\.4552,$/,/' | sed '$ s/,//')
}

Test

<<dryDependencies>>
<<flagsFor>>
#cd ../GLUtil
#cd CabbageDown2
cd ../Frames
dryDependencies

<<dryDependencies>>
cd CabbageDown2
dryDependencies > buildplan
DEPS=($(buildplanDependencies))
echo "We have ${#DEPS[@]} dependencies: ${DEPS[@]}"

Unconstrained

As a last-ditch effort to freeze a build plan, we can remove all version constraints. This is needed for the hsc2hs build too.

unconstrainCabal() {
  local UNCONSTRAIN
  read -r -d '' UNCONSTRAIN<<'EOF'
BEGIN { 
  buildDep = 0;
  FS = ",";
}
{
  lineSkip = 0;
  if(match($0, /^[[:space:]]*[Bb][Uu][Ii][Ll][Dd]-[Dd][Ee][Pp][Ee][Nn][Dd][Ss]:/)) {
    buildDep = 1;
    match($0, /^[[:space:]]*/);
    indentation = RLENGTH;
    for(i = 0; i < RLENGTH; ++i) printf(" ");
    printf("build-depends:");
    sub(/^[[:space:]]*[Bb][Uu][Ii][Ll][Dd]-[Dd][Ee][Pp][Ee][Nn][Dd][Ss]:/,"",$0);
    match($0, /^[[:space:]]*/);
    for(i = 0; i < RLENGTH; ++i) printf(" ");
    sub(/^[[:space:]]*/,"",$0);
  } else if(buildDep) {
    if(match($0,/^[[:space:]]*$/)) {
      lineSkip = 1;
    } else {
      match($0, /^[[:space:]]*/);
      if(RLENGTH <= indentation) {
        buildDep = 0;
      } else {
        for(i = 0; i < RLENGTH; ++i) printf(" ");
        sub(/^[[:space:]]*/,"",$0);
      }
    }
  }
  if(buildDep && !lineSkip) {
    # Update a line of a build-depend
    for(i = 1; i <= NF; ++i) {
      sub(/^[[:space:]]*/,"",$(i));
      sub(/[[:space:]]*$/,"",$(i));
      if(match($(i), "[ ><=]")) {
        pkgName = substr($(i), 1, RSTART - 1);
        printf("%s", pkgName);
      } else {
        printf("%s", $(i));
      }
      if(i < NF) printf(", ");
    }
    printf("\n");
  } else {
    # Everything else gets printed
    print $0
  }
}
EOF
  awk "$UNCONSTRAIN"
}

# Try freezing after removing all version constraints.
freezeUnconstrained() {
 local NUMCABALS=$(find . -maxdepth 1 -name '?*.cabal' | wc -l)
  if [ "$NUMCABALS" -gt 1 ]; then
    echo "Error: Found multiple cabal files in $(pwd)"
    exit 1
  fi
  local REALCABAL=$(basename "$(ls ./*.cabal)")
  (cat "$REALCABAL" | sed 's/--.*$//' | unconstrainCabal) > cabbageDummy.cabal
  mv "$REALCABAL" cabbageBackup.bak
  mv cabbageDummy.cabal "$REALCABAL"
  freezeCabbagePatch 1
  local OK=$?
  return $OK
}
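
As an illustration of what unconstrainCabal does to a build-depends line (hypothetical input; the real function is run over a whole .cabal file):

echo "  build-depends: base >=4.7 && <5, text ==1.2.*" | unconstrainCabal
# prints:   build-depends: base, text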

Finding dependencies

So we’ve got a package and we’ve created a sandbox. We can run dryDependencies to get a buildplan file that lists all dependencies.

add-sourced dependencies

Deal with the output of cabal sandbox list-sources. The add-sourced directories are found between a pair of blank lines. This bit of sed pulls out the directory names.

# List directories of added sources
getAddedSources() {
  sed '1,/^$/ d' | sed '/^$/,$ d'
}
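
For example, with hypothetical input shaped like cabal sandbox list-sources output (the directories sit between two blank lines):

printf 'header\n\n/src/dep-a\n/src/dep-b\n\nfooter\n' | getAddedSources
# prints: /src/dep-a
#         /src/dep-b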

Dependencies on hackage

This is just cabal get. We then need to cabal configure and dryDependencies so that we can do the buildplan intersection with the downstream constraints file.

However, we will also add-source everything that is add-sourced to the downstream package before configuring.

cabal.config Intersection

We want to take the version constraints from a downstream constraints file, and merge them into an upstream constraints file.

There are two approaches to doing this:

  1. Freeze downstream and upstream independently, then intersect the constraints
  2. Freeze downstream, then edit the upstream package’s .cabal file to replace all version constraints with equality constraints gleaned from the downstream cabal.config file

A problem with the first option is that sometimes Cabal’s solver is able to find a build plan for a downstream package while it is unable to find a plan for an upstream dependency. This is rather odd, but it happens. Another problem is that it is a bit slow and feels somewhat redundant: since the downstream freeze fixes all the versions, the upstream freeze is only used to get the transitive closure of the upstream package’s dependencies. This is just a limitation of what cabal-install offers.

The second option is not great as it doesn’t take into account further upstream dependencies that are constrained by other dependencies of the downstream package. It also requires careful surgery of the rather complicated .cabal file format. We must preserve any logic expressed therein so that freezing the newly constrained .cabal file may rely on that logic.

A dummy Cabal Library

# The start of a Cabal library specification, ready for a
# build-depends stanza.
dummyCabalLibrary() {
  echo "name:               Dummy"
  echo "version:            0.1.0.0"
  echo "build-type:         Simple"
  echo "cabal-version:      >=1.10"
  echo ""
  echo "library"
  echo "  exposed-modules:"
}

# Builds a .cabal file that depends on every package listed in a
# cabal.config file consisting solely of a single "constraints"
# stanza. This is intended to work with Stackage releases.
dummyFromConstraints() {
  dummyCabalLibrary; sed -e 's/^constraints:/  build-depends:/' -e 's/^\(   [[:space:]]*\)\(.*\)$/\1    \2/' -e 's/^.* installed,$//' cabal.config
}

Time stamps

Add-sourced dependencies are tracked with a time stamp that cabal uses to see if they have changed since they were last built. We want to work with this mechanism since when we build an add-sourced dependency, we grab the latest source available. Unfortunately, this involves some amount of parsing.

We need to be able to fill in timestamps for a GHC that is not present in the current set of timestamps. We also need to be able to overwrite old timestamps for the GHC we are using. Through this, we should preserve timestamps for any other GHC to be nice to the user.

We don’t tangle this block as it actually gets included in the setup attribute of the generated nix expression.

Unescaped

The above code is a bit gnarly because it must escape things so that it can be tangled into a bash block and then properly escaped again for a Nix expression.

Tests

Test that we can append the new time stamps to an empty list, and replace old timestamps for the correct GHC version in a populated list.

cabbage.config

System dependencies

Concatenate all extra-libraries fields in a build plan. This is a very rough listing as it simply filters out blocks of Cabal files guarded behind one of “if os(windows)”, “os(solaris)”, or “os(ios)”.

# Prints the extra-libraries from a cabal file iff they occur in a
# library target.
cabalExtraLibraries() {
  if cabalFileExists ; then
    grep -q -i "[[:space:]]*extra\-libraries:" ./*.cabal
    if [ $? -eq 0 ]; then
      cat ./*.cabal | sed 's/--.*$//' | isolateLibraryTarget | removeWindowsBlocks | \
      grep -i "extra-libraries" | awk 'BEGIN { FS=":"; } { print($2); }'
    fi
  fi
}

# Looks in the buildplan file to identify all dependencies, then
# visits each of them in the .cabbages directory and prints out all
# extra-libraries.
allExtraLibrariesAux() {
  local DEPS=($(buildplanDependencies))
  local d
  cabalExtraLibraries
  (cd .cabbages && \
   for d in "${DEPS[@]}"; do
     (cd "$d" &&  cabalExtraLibraries)
   done)
}

# Print out an array of possibly-needed extra-libraries.
allExtraLibraries() {
  local LIBS=($(allExtraLibrariesAux | sed 's/,/\
/'))
  if [ "${#LIBS[@]}" -gt 0 ]; then
    printf '%s\n' "${LIBS[@]}" | sort -u | tr '\n' ' '
  else
    echo "${LIBS[@]}"
  fi
}

# Let the user know they might need to prepare system dependencies.
warnExtraLibraries() {
  local LIBS=($(allExtraLibraries))
  if [ "${#LIBS[@]}" -gt 0 ]; then
    echo
    echo "You may need to supply system dependencies!"
    echo
    echo "See the cabbage documentation for how to do this with a 'systemDeps'"
    echo "section in a cabbage.config file."
    echo
    echo "Potentially necessary extra-libraries: ${LIBS[@]}"
    # read -p "Press any key to continue..." -n 1 -t 5
    echo
  fi
}

Test

<<cabalHelpers>>
<<allExtraLibraries>>
cd ~/temp/rifactor
allExtraLibraries

Configuration lookup

We support setting project-wide flags in a cabbage.config file that looks somewhat like a cabal.config file.

# Unversion package name. Remove the version number from a versioned
# package name.
unversionPackageName() {
  sed 's/\(.*\)-[-[:digit:].]*$/\1/' <<< "$1"
}

# Returns any flags set for the given package name in a cabbage.config
# file
flagsFor() {
  local FINDFLAGS
  read -r -d '' FINDFLAGS<<EOF
BEGIN { inFlags = 0; }
/^flags:/ { inFlags = 1; }
/^[^[:space:]]/ { if(inFlags == 2) { exit 0; } }
{
  if(inFlags == 1) {
    inFlags = 2;
  } else if(inFlags == 2) {
    gsub(/^[[:space:]]*/,"",\$1);
    if(\$1 == "$1:") {
      for(i = 2; i <= NF; ++i) {
        printf("%s", \$(i));
        if(i != NF) { printf(" "); }
      }
    }
  }
}
EOF
  awk "$FINDFLAGS"
}

# Find any systemDeps (system dependencies) specified for the named
# package in a cabbage.config file. The package name should be
# unversioned.
systemDepsFor() {
  local FINDDEPS
  read -r -d '' FINDDEPS<<EOF
BEGIN { inDeps = 0; }
/^systemDeps:/ { inDeps = 1; }
/^[^[:space:]]/ { if(inDeps == 2) { exit 0; } }
{
  if(inDeps == 1) {
    inDeps = 2;
  } else if(inDeps == 2) {
    gsub(/^[[:space:]]*/,"",\$1);
    if(\$1 == "$1:") {
      for(i = 2; i <= NF; ++i) {
        printf("%s", \$(i));
        if(i != NF) { printf(" "); }
      }
    }
  }
}
EOF
  awk "$FINDDEPS"
}

Test

Extract the flags for “transformers-compat”.
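
A minimal sketch of that test, assuming a cabbage.config like the example above is in the current directory:

<<flagsFor>>
cat cabbage.config | flagsFor "transformers-compat"
# prints: three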

Distribute flags to the targeted cabbages

Read in a cabbage.config file, and copy the relevant parts of the file to each named dependency in the .cabbages directory.

There is only one flags stanza in a cabbage.config file. Once we’ve finished processing it, we can quit.

BEGIN { FS = ":"; inFlags = 0;}
/flags:/ { inFlags = 1; }
/^[^[:space:]]/ { if(inFlags == 2) { exit 0; } }
{
  if(inFlags == 1) {
    inFlags = 2;
  } else if(inFlags == 2) {
    gsub(/^[[:space:]]*/,"",$1);
    cmd = sprintf("find .cabbages -maxdepth 1 -name '%s-[[:digit:].]*'", $1);
    if( (cmd | getline versionedName) ) {
      flags = sprintf("flags:\n  %s:%s\n", $1, $2);
      cmd = sprintf("echo '%s' > .cabbages/$(basename \"%s\")/cabbage.config", flags, versionedName);
      system(cmd);
    } else {
      # print "Ignoring flag for unknown dependency:", $1
    }
  }
}

# Takes a cabbage.config file and distributes subset cabbage.config
# files to directories in the .cabbages directory on an as-needed
# basis. Specifically, the flags for a named package will be copied
# into a cabbage.config file in that package's directory.
sowFlags() {
  local AWK
  read -r -d '' AWK<<'EOF'
<<sowFlagsAwk>>
EOF
  awk "$AWK"
}

Creating a derivation for each dependency

Our derivations are actually not that complicated from a Nix perspective because we aren’t using much Nix machinery. Instead, we create a sandbox, then manually symlink dependency artefacts into the sandbox and let cabal-install invoke GHC with all the necessary path information.

Getting package dependency sources

We can cabal get things from hackage, but if a dependency has been add-sourced, we should cabal sdist it.

Getting from hackage

Getting from an add-source

# Get the package in this directory's full versioned name. E.g. name-x.y.z
getMyFullName() {
  local CABAL=$(ls ./*.cabal)
  { (cat "$CABAL" | tr -d '\r' | grep -i "^name:" | sed 's/^[Nn]ame:[[:space:]]*\(.*\)$/\1/');
    (cat "$CABAL" | tr -d '\r' | grep -i "^version:" | sed 's/^[Vv]ersion:[[:space:]]*\(.*\)$/\1/'); } \
  | tr '\n' '-' | sed 's/-$//'
}

# Takes a directory name, and returns the package that can be built
# from that directory.
getAddedPackageName() {
  (cd "$1" && getMyFullName)
}

# Get a source distribution of an added-source package
getAddSource() {
  local CWD=$(pwd)
  (cd "$1" && cabal sdist -v0 --output-directory="$CWD"/.cabbages/"$(getMyFullName)")
}

Get Any Dependency Source

We need a helper function that can get the source code of a dependency whether it has been add-sourced or it comes from hackage.

Array membership

Adapted from this StackOverflow question

# Takes an element and an array, returns -1 if the element is /not/ in
# the array; or its index if it is.
findIndex() {
  local i
  local -a arr=("${!2}")
  for i in "${!arr[@]}"; do 
    [[ "${arr[$i]}" == "$1" ]] && echo "$i" && return 0; done
  echo "-1"
  return 1

  # for e in "${@:2}"; do [[ "$e" == "$1" ]] && return 0; done
  # return 1
}
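
A quick usage sketch:

FRUITS=(apple banana cherry)
findIndex banana FRUITS[@]   # prints 1
findIndex durian FRUITS[@]   # prints -1 (and returns non-zero)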

Getting add-sourced dependency package names

We use cabal sandbox list-sources to get the directories of added sources, then getAddedPackageName to get the name+version of the package in each directory.

getDependencySources

Now we can define a function capable of getting the source for a dependency that has been add-sourced to a sandbox or that is available from hackage via cabal get.

# Get all dependency sources for the package in the current
# directory. This handles add-sourced dependencies, or those that
# "cabal get" can get (i.e. from hackage).
getDependencySources() {
  local ADDEDSOURCEDIRS=($(cabal sandbox list-sources | getAddedSources))
  local ADDEDSOURCEPACKAGES
  local i
  for i in "${!ADDEDSOURCEDIRS[@]}"; do
    ADDEDSOURCEPACKAGES[$i]=$(getAddedPackageName "${ADDEDSOURCEDIRS[$i]}")
  done

  local DEPS=($(buildplanDependencies))
  mkdir -p .cabbages
  local d
  for d in "${DEPS[@]}"; do
    i=$(findIndex "$d" ADDEDSOURCEPACKAGES[@])
    if [ "$i" -gt "-1" ]; then
      echo "Getting add-source dependency: $d"
      getAddSource "${ADDEDSOURCEDIRS[$i]}"
    elif [ -d .cabbages/"$d" ]; then
      echo "Using existing source dist of $d"
    elif [ "${d: -5}" == ".4552" ]; then
      echo "Cabbage patching globally installed package: $d"
      cabbagePatch "$d"
    else
      echo "Getting dependency: $d"
      cabal get "$d" -d .cabbages
    fi
  done

  # add-sourced build-tools are handled carefully. We first strip off
  # the version numbers of the add-sourced packages because
  # build-tools are not usually specified with versions. We then
  # perform similar logic to how we get library dependencies.

  local BUILDTOOLS=($(allBuildTools))
  for i in "${!ADDEDSOURCEPACKAGES[@]}"; do
    ADDEDSOURCEPACKAGES[$i]=$(unversionPackageName "${ADDEDSOURCEPACKAGES[$i]}")
  done

  for d in "${BUILDTOOLS[@]}"; do
    i=$(findIndex "$d" ADDEDSOURCEPACKAGES[@])
    if [ "$i" -gt "-1" ]; then
      echo "Getting add-source build-tool: $d"
      getAddSource "${ADDEDSOURCEDIRS[$i]}"
    elif [ -d .cabbages/"$d" ]; then
      echo "Using existing source dist of $d"
    else
      echo "Getting build-tool: $d"
      cabal get "$d" -d .cabbages
    fi
  done
}

Create derivation

We basically use the template suggested by CabbageDown. The only parts we need to fill in are the name and cabbageDeps attributes. The former is the cabal package name prefixed with “haskell-”, and the latter are just the non-builtin dependencies that we callPackage from their paths in the .cabbages directory.

Getting the package db path

We need to figure out a string like “x86_64-osx-ghc-7.8.4” that cabal will use to store things like compiled libraries and a sandbox package database.

getPackageDBPath() {
  if [ -f cabal.sandbox.config ]; then
    cabal sandbox hc-pkg list | grep ".conf.d" | tail -n 1 | sed 's/.*\/\(.*\)-packages.conf.d.*/\1/'
    return 0
  else
    return 1
  fi
}

Generating Nix packages

So we have a directory with a package’s source code, and we have a buildplan from the downstream package. The downstream package may have already had some packages add-sourced to it, so we want to also have those add-sources. We could either create an independent sandbox, or use the downstream package’s sandbox. Interestingly, we’re only doing this to get the benefit of the cabal sandbox add-source commands, so perhaps using the downstream package’s sandbox is the right thing to do.

For the upstream package, we

  • cabal sandbox init --sandbox=../.cabal-sandbox
  • cabal install --dependencies-only --dry-run
  • Get dependencies by intersecting upstream’s cabal.config with downstream’s

A note on cabal install and custom setup scripts

Previously, the builder script in the Nix expression invoked cabal install with various flags. This worked almost all the time, except with custom setup programs. These work okay when built with cabal configure --builddir=..., but the necessary flags don’t seem to be forwarded to the configure phase from an invocation of cabal install. So, for now we manually configure, build, and copy.

This used to be how we configured, built, and installed a package:

A related issue arises when invoking cabal sdist, which also builds the setup program. Even with --builddir passed to cabal, this tries to build setup in a dist directory alongside the source code.

This used to be an early part of the builder:

Helper

getSynopsis() {
  local CABAL=$(ls ./*.cabal)
  cat "$CABAL" | sed -n '/^[Ss]ynopsis/,/^[^[:space:]]/ p' | sed '$d' \
  | sed -e 's/^[Ss]ynopsis:[[:space:]]*//' -e 's/^[[:space:]]*//' -e 's/"/\\"/g' \
  | tr '\n' ' '
}

Making Cabbages

# Define an attribute for each package. Takes an array of attribute
# names, and an array of corresponding directory names that are home
# to Nix package definitions (these are all in the .cabbages
# directory).
callCabbages() {
  local -a NAMES=("${!1}")
  local -a PKGS=("${!2}")
  local i

  for ((i = 0; i < ${#NAMES[@]}; ++i)); do
    local TOOLSARR=($(cd .cabbages/${PKGS[$i]} && allBuildTools))
    local TOOLS=""
    if [ ${#TOOLSARR[@]} -gt 0 ]; then
      TOOLS=" $(echo ${TOOLSARR[@]})"
    fi
    echo "      ${NAMES[$i]} = callPackage .cabbages/${PKGS[$i]} {"
    echo "        inherit frozenCabbages haskellBuildTools pkgs$TOOLS;"
    echo "      };"
  done
}

# Build a .nix file from a .cabal file in the current directory Takes
# the ghcPlatform string, this package's name, and whether or not this
# package should define frozenCabbages: 0 = this is an upstream
# package, 1 = this is a downstream package, 2 = this is a build-tool.
mkCabbage() {
  local NIX
  local FROZENUPSTREAM
  local FROZENDEF
  local LINKSANDBOX
  local DEPS=($(buildplanDependencies))
  local DEPNAMES=($(cat buildplan | sed 's/ .*$//'))

  if [ $3 -gt 0 ]; then
    # This is /the/ downstream package or a build-tool

    # We will need the standard callPackage function
    FROZENUPSTREAM="callPackage, frozenCabbages ? {}"

    # We will define the frozenCabbages attribute
    IFS=$'\n' read -r -d '' FROZENDEF <<EOF
myCabbages = lib.fix (frozenCabbages: {
$(callCabbages DEPNAMES[@] DEPS[@])
    });
EOF

    # We will seed the sandbox /in this directory/ with our
    # dependencies in the nix store so the user can continue using a
    # standard cabal workflow (e.g. tools like ghc-mod).
    mkdir -p .cabal-sandbox/lib/"$1"
    LINKSANDBOX="ln -sFf \${pkg.outPath}/.cabal-sandbox/$1-packages.conf.d/*.conf "$(pwd)"/.cabal-sandbox/$1-packages.conf.d/\n";

    # We create a dummy sdist file so that the src attribute on the
    # downstream package's nix expression is a file, even if its
    # contents are currently bogus. This is done so that Nix can
    # evaluate the expression and install dependencies, without which
    # the configure phase (run in order to produce the sdist) of the
    # downstream package can fail due to missing dependencies.
    if ! [ -d "./dist" ]; then
      mkdir dist
    fi
    if ! [ -f "./dist/$2.tar.gz" ]; then
      touch "./dist/$2.tar.gz"
    fi
  else
    # This is an upstream package (dependency)
    FROZENUPSTREAM="frozenCabbages"
  fi

  local SYNOPSIS=$(getSynopsis)
  local SYSTEMDEPS=""

  if [ -f ../../cabbage.config ]; then
    local MYNAME=$(unversionPackageName "$2")
    SYSTEMDEPS=$(cat ../../cabbage.config | systemDepsFor "$MYNAME")
  fi
  if [ -f cabbage.config ]; then
    local MYNAME=$(unversionPackageName "$2")
    SYSTEMDEPS=$(cat cabbage.config | systemDepsFor "$MYNAME")
  fi
  local PKGS=($(pkgconfigDepends))

  if [ "${#PKGS[@]}" -gt 0 ]; then
    SYSTEMDEPS="$SYSTEMDEPS pkgconfig"
  fi

  local TOOLS=($(allBuildTools))
  local TOOLSDEPS=""
  if [ "${#TOOLS[@]}" -gt 0 ]; then
    TOOLSDEPS=$(echo "${TOOLS[@]}" | awk '{for(i=1;i<=NF;i++) printf(", %s",$(i));}')
  fi

  local FRAMEWORKS=($(cabalFrameworks))
  local NIXFW=""
  SYSTEMDEPS="[ $SYSTEMDEPS ]"
  if [ "${#FRAMEWORKS[@]}" -gt 0 ]; then
    # We need to replace uses of CoreFoundation with darwin.cf-private
    # and ensure that it is at the start of the framework search path
    # since it overlaps with another nixpkgs CoreFoundation framework.
    echo "${FRAMEWORKS[@]}" | grep -q CoreFoundation
    if [ $? -eq 0 ]; then
      FRAMEWORKS=($(echo "${FRAMEWORKS[@]}" | sed 's/CoreFoundation[[:space:]]*//'))
      FRAMEWORKS=("darwin.cf-private ${FRAMEWORKS[@]}")
    fi
    
    # Hack to pull in the Kernel framework.
    # Needed by some uses of IOKit, for example.
    FRAMEWORKS=("${FRAMEWORKS[@]} Kernel")

    SYSTEMDEPS="($SYSTEMDEPS ++ lib.optionals stdenv.isDarwin (with darwin.apple_sdk.frameworks; [ ${FRAMEWORKS[@]} ]))"
    local fw
    for fw in "${FRAMEWORKS[@]}"; do
     NIXFW="$NIXFW -framework $fw"
    done
  fi

  # Now we build up the Nix expression
  IFS=$'\n' read -r -d '' NIX <<EOF
{ stdenv, lib, haskellBuildTools, pkgs$TOOLSDEPS, $FROZENUPSTREAM }:
let cabalTmp = "cabal --config-file=./.cabal/config";
    $FROZENDEF
    mkCmd = pkg: let nm = lib.strings.removePrefix "haskell-" pkg.name;
                     p = pkg.outPath;
                     pkgPath = ".cabal-sandbox/$1-packages.conf.d";
                 in ''ln -sFf \${p}/\${pkgPath}/*.conf \$out/\${pkgPath}/
                    '';
$(if [ $3 -eq 1 ]; then
    echo "    mkSetupCmd = pkg: let nm = lib.strings.removePrefix \"haskell-\" pkg.name;"
    echo "                          p = pkg.outPath;"
    echo "                      in \"$LINKSANDBOX\";"
  elif [ $3 -eq 2 ]; then
    echo "    mkSetupCmd = pkg: \"\";"
  fi)
in
stdenv.mkDerivation rec {
  name = "haskell-$2";
  src = $(if [ $3 -eq 1 ]; then 
            echo "./dist/$2.tar.gz"
          else
            echo "./."
          fi);
$(if [ $3 -gt 0 ]; then
  echo -n "  cabbageDeps = with myCabbages; [ "
else
  echo -n "  cabbageDeps = with frozenCabbages; [ "
fi)
$(echo "${DEPNAMES[@]}") ];
  systemDeps = (with pkgs;  $SYSTEMDEPS) ++
               lib.lists.unique (lib.concatMap (lib.attrByPath ["systemDeps"] []) cabbageDeps);
  propagatedBuildInputs = systemDeps;
  buildInputs = [ stdenv.cc $(echo "${TOOLS[@]}")] ++ haskellBuildTools ++ cabbageDeps ++ systemDeps;

  # Build the commands to merge package databases
  cmds = lib.strings.concatStrings (map mkCmd cabbageDeps);
$(if [ $3 -gt 0 ]; then
    cat << SETUPEOF
  setupCmds = lib.strings.concatStrings (map mkSetupCmd cabbageDeps);
  setup = builtins.toFile "setup.sh" ''
    <<updateTimeStamps>>
    eval "\$setupCmds"
    \${cabalTmp} sandbox hc-pkg recache
    SRCS=(\$(cabal sandbox list-sources | sed '1,/^\$/ d' | sed '/^\$/,\$ d'))
    OLDTIMESTAMPS=\$(cat .cabal-sandbox/add-source-timestamps)
    updateTimeStamps "$1" SRCS[@] "\$OLDTIMESTAMPS" > .cabal-sandbox/add-source-timestamps
  '';
SETUPEOF
  fi)

  builder = builtins.toFile "builder.sh" ''
    source \$stdenv/setup
    mkdir \$out

    if [ -d "\$src" ]; then
      cp -R "\$src"/* .
      #*/
      if [ -f \$src/buildplan ]; then
        mkdir \$out/.cabbageCache
        cp "\$src/buildplan" "\$out/.cabbageCache/buildplan"
      fi
    else
      tar xf "\$src" --strip=1
    fi

    chmod -R u+w .
    if [ -d dist ]; then
      # Copy pre-generated dist files to store
      cp -R dist \$out
    fi
    \${cabalTmp} sandbox --sandbox=\$out/.cabal-sandbox init -v0
    mkdir -p \$out/.cabal-sandbox/lib/$1
    eval "\$cmds"
    \${cabalTmp} sandbox hc-pkg recache

    \${cabalTmp} --builddir=\$out/dist --bindir=\$out/bin --libexecdir=\$out/libexec --libdir=\$out/.cabal-sandbox/lib --with-gcc=\$CC configure -p \$(echo \$NIX_LDFLAGS | awk -e '{ for(i=1;i <= NF; i++) { if(match(\$(i), /^-L/)) printf("--extra-lib-dirs=%s ", substr(\$(i),3)); } }')
    echo "Building..."

    # Ensure framework dependencies are used by GHC as they are needed
    # when GHC invokes a C compiler.
    COPTS=\$(echo "\$NIX_CFLAGS_COMPILE" | awk -e '{ for(i=1; i<=NF; i++) { if (match(\$(i), /^-F/)) printf("-optc %s ", \$(i)); }}')
    \${cabalTmp} --builddir=\$out/dist build -v0 --ghc-options="\$COPTS"
    \${cabalTmp} --builddir=\$out/dist haddock -v0 || true
    \${cabalTmp} --builddir=\$out/dist copy
    \${cabalTmp} --builddir=\$out/dist register
    \${cabalTmp} --builddir=\$out/dist clean || true
  '';    
  meta = {
    description = "$SYNOPSIS";
  };
}
EOF

  echo "$NIX" > default.nix
}

# Freezes the cabal file in the current directory. Takes the versioned
# name of the package to prepare, and the dbPath for the current
# platform (e.g. x86_64-osx-ghc-7.8.4).
prepCabbage() {
  local d="$1"
  local dbPath="$2"
  local FLAGS
  cabal sandbox init --sandbox=../../.cabal-sandbox > /dev/null

  if [ -f cabal.config ]; then
    mv cabal.config cabal.config.bak
  fi

  ln -s ../../bpconstraints cabal.config

  if [ -n "$FLAGS" ]; then
    freezeCabbagePatch 0 > /dev/null
  else
    freezeCabbagePatch > /dev/null
  fi

  rm cabal.config
  if [ -f cabal.config.bak ]; then
    mv cabal.config.bak cabal.config
  fi

  rm cabal.sandbox.config
  mkCabbage "$dbPath" "$d" 0
}

# Takes a flag to determine if the dependencies of all targets should
# be built. If the flag is true, then the build-depends of all targets
# are consolidated and considered when determining a build plan. The
# second argument is another flag for which true indicates this is a
# downstream package, and false indicates this is a build-tool.
mkCabbages() {
  local NUMCABALS=$(find . -maxdepth 1 -name '?*.cabal' | wc -l)
  if [ "$NUMCABALS" -gt 1 ]; then
    echo "Error: Found multiple cabal files in $(pwd)!"
    exit 1
  fi

  if [ "$1" = true ]; then
    freezeCabbagePatch 2
  else
    freezeCabbagePatch 1
    if ! [ $? -eq 0 ]; then
      if [ "$2" = false ]; then
        echo "Trying emergency constraint patch..."
        freezeUnconstrained
      fi
    fi
  fi
  if [ -f "$CABAL.cabbage.bak" ]; then
    mv "$CABAL.cabbage.bak" "$CABAL"
  fi
  local RES=$?
  if [ $RES -ne 0 ]; then
    echo "Freezing the downstream package $(pwd) failed ($RES)" && false
  else
    echo "Froze downstream package at $(pwd)"
  fi
  local dbPath=$(getPackageDBPath)
  local deps=($(buildplanDependencies))
  getDependencySources
  if [ -f cabbage.config ]; then
    cat cabbage.config | sowFlags
  fi

  # Print a message if there are extra-libraries specified in any
  # .cabal file used to build the downstream package that are not
  # obviously guarded by an os(windows) or os(solaris) check.
  warnExtraLibraries

  # Build a constraints file the upstream packages can use when
  # computing their own build plans.
  buildplanConstraints > bpconstraints

  pushd .cabbages > /dev/null
  local d
  
  # We execute several calls to prepCabbage in parallel
  #local numPar=$(getconf _NPROCESSORS_ONLN)
  local numPar=1
  local pids=()
  local numJobs=0
  for d in "${deps[@]}"; do
    echo "Making cabbage: $d"
    if [ -f "$d"/default.nix ]; then
      echo "Using existing default.nix"
    else
      (cd "$d" && prepCabbage "$d" "$dbPath") &
      pids+=($!)
      numJobs=$((numJobs + 1))
      if [ "$numJobs" -eq "$numPar" ]; then
        for p in ${pids[@]}; do
          wait $p
        done
        pids=()
        numJobs=0
      fi
    fi
  done
  for p in ${pids[@]}; do
    wait $p
  done

  popd > /dev/null
  rm bpconstraints

  local BUILDTOOLS=($(allBuildTools))

  if [ "${#BUILDTOOLS}" -gt 0 ]; then
    echo "Making cabbages for build-tools"
    pushd .cabbages > /dev/null
    local bt
    for bt in "${BUILDTOOLS[@]}"; do
      # cabal get "$bt"
      # if ! [ "$?" -eq 0 ]; then
      #   local ANS
      #   echo
      #   echo "WARNING: cabal get did not unpack $bt"
      #   read -p "Should we continue? [Y/n] " -n 1 -t 5 ANS
      #   echo
      #   if [ "$ANS" = "n" ]; then
      #     exit 1
      #   fi
      # fi
      
      local d=$(find . -name "$bt-[[:digit:].]*" -depth 1)
      local dresult=$?
      if [ "$d" = "" ]; then
        echo "WARNING: Couldn't find build tool: $bt"
      else
        d=$(basename "$d")
        echo "Making build-tool cabbage: $d"
        (cd "$d" && cabal sandbox init && mkCabbages false false && cabal sandbox delete)
      fi
    done
    popd > /dev/null
  fi
  
  if [ "$2" = true ]; then
    mkCabbage "$dbPath" "$(getMyFullName)" 1
  else
    mkCabbage "$dbPath" "$(getMyFullName)" 2
  fi
}

Shadowing the global package database

A problem occurs when we want to rebuild a globally installed package with different dependencies. This would leave us with two packages of the same name and version.

There are some limitations to passing GHC packages that have identical names and versions to ones that are installed in the global package database. Namely, even if you pass the -hide-all-packages flag to GHC, then supply it packages with the -package-id flag, a globally installed package with the same name and version as one given via -package-id can interfere with the build. To combat this, we create “cabbage-patched” versions of globally-installed packages.

We do this by copying the package source for the globally-installed package, and appending 4552 to the version number (the PLU code for Napa Cabbage). We then tweak every frozen build plan that refers to the globally-installed package to instead refer to the cabbage-patched version.
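Concretely, the cabbagePatchConfig helper below rewrites matching buildplan lines; a sketch with hypothetical packages (the first input line lists the globally installed packages):

printf 'array-0.5.0.0 transformers-0.3.0.0\ntransformers 0.3.0.0\nlens 4.7\n' | cabbagePatchConfig
# prints: transformers 0.3.0.0.4552
#         lens 4.7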

# Takes a cabbage-patched versioned package name; prepares an sdist.
cabbagePatch() {
  if ! [ ${1: -5} = ".4552" ]; then
    echo "Bad call to cabbagePatch with $1"
    exit 1
  fi
  local NAME=${1%".4552"}
  cabal get "$NAME" -d .cabbages
  (cd .cabbages && \
   mv "$NAME" "$1" && \
   (cd "$1" && \
    local CABAL=$(basename "$(ls ./*.cabal)") && \
     mv "$CABAL" "$CABAL".bak && \
     sed 's/\([Vv]ersion:[[:space:]]*\)\([[:digit:].]*\)$/\1\2.4552/' "$CABAL".bak > "$CABAL" && \
     rm "$CABAL".bak))
}

# Determines if a newer version of a globally installed package is
# required. If so, the exit code is 1. If no globally-installed
# package is to be upgraded, the exit code is 0.
upgradesGlobal() {
  local AWK
  read -r -d '' AWK<<'EOF'
BEGIN {
  firstLine = 1;
}
{
  if(firstLine) {
    split($0,arr," ");
    for(i in arr) {
      match(arr[i], /-[[:digit:].]*$/);
      pkg = substr(arr[i], 1, RSTART-1);
      ver = substr(arr[i],RSTART+1);
      globallyInstalled[pkg] = ver;
    }
    firstLine = 0;
  } else {
    if($1 in globallyInstalled) {
      if(globallyInstalled[$1] != $2) {
        printf("%s-%s is an upgrade from the global package database\n", $1, $2);
        exit 1;
      }
    }
  }
}
EOF
  awk "$AWK"
}

# Tweaks the constraints in a piped ~buildplan~ file to replace
# globally installed packages with cabbage patched versions.
cabbagePatchConfig() {
  local AWK
  read -r -d '' AWK<<'EOF'
BEGIN { firstLine = 1; }
{
  if(firstLine) {
    split($0,arr," ");
    for(i in arr) {
      globallyInstalled[arr[i]] = 1;
    }
    firstLine = 0;
  } else {
    versioned = sprintf("%s-%s", $1, $2);
    if(versioned in globallyInstalled) {
      printf("%s %s.4552\n", $1, $2);
    } else {
      print($0);
    }
  }
}
EOF
  awk "$AWK"
}

# Print the list of globally installed packages that can be
# reinstalled.
getReinstallableGlobals() {
  sed -e 's/base-[[:digit:].]*//' \
      -e 's/bin-package-db-[[:digit:].]*//' \
      -e 's/rts-[[:digit:].]*//' \
      -e 's/ghc-[[:digit:].]*//' \
      -e 's/ghc-prim-[[:digit:].]*//' \
      -e 's/integer-gmp-[[:digit:].]*//' | \
  sed 's/  [ ]*/ /'
}

# Find a build plan, then tweak the ~buildplan~ file to
# cabbage patch references to globally installed packages. If an
# argument is given, we do /not/ search for a cached build plan. This
# is useful when passing cabbage the "-a" flag, which will produce a
# different build plan than when this package is built as a dependency
# of something else.
freezeCabbagePatch() {
  if [ "$#" -gt 0 ]; then
    dryDependencies "$1" > buildplan
    if [ $1 -gt 0 ]; then
      # If a globally-installed package must be upgraded, then we cabbage
      # patch the build plan to allow us to shadow globally-installed
      # packages. Otherwise, we do not mention globally-installed packages
      # in the Nix build plan as GHC will pull them in by default.
      local GLOBALS=$(ghc-pkg list --global --simple-output)
      (echo "$GLOBALS"; cat buildplan) | upgradesGlobal
      if [ $? -eq 1 ]; then
        mv buildplan buildplan.bak
        (((echo "$GLOBALS" | getReinstallableGlobals); cat buildplan.bak) | cabbagePatchConfig) > buildplan
        rm buildplan.bak
      fi
    fi
  else
    dryDependencies > buildplan
  fi
}

Notes on Globally Installed Packages

If a globally-installed package is to be upgraded, we cabbage patch all upgradeable globally-installed packages so that they can have alternate build plans. If no globally-installed package is to be upgraded, we do not cabbage patch, and in fact remove globally-installed packages from the downstream package’s constraints list. This lets us build things that depend upon GHC as a library, as well as things that want to update packages that GHC itself depends on.

Top-level

Default nix expression

We currently build with GHC-7.8.4 and cabal-install-1.20.0.6.

This expression is suitable for nix-shell or to be installed itself.

# A default Nix expression suitable for nix-shell or installation.
defaultShell() {
  local TOOLS=($(allBuildTools))

  local t
  local TOREMOVE=()
  for t in "${TOOLS[@]}"; do
    local dirname=$(find .cabbages -name "$t-[[:digit:].]*" -maxdepth 3 -type d)
    if [ -z "$dirname" ]; then
      TOREMOVE+=($t)
    elif [ "$dirname" = "" ]; then
      TOREMOVE+=($t)
    fi
  done
  for t in "${TOREMOVE[@]}"; do
    local i=$(findIndex "$t" TOOLS[@])
    unset TOOLS[$i]
  done

  local TOOLSDEPS
  if [ "${#TOOLS[@]}" -gt 0 ]; then
    TOOLSDEPS=$(echo " ${TOOLS[@]}")
  else
    TOOLSDEPS=""
  fi
  local NIX
  IFS=$'\n' read -r -d '' NIX <<EOF
{ compiler ? "ghc" }:
let pkgs = import <nixpkgs> {};
    mynix = import <mynix>;
    haskellBuildTools = [ mynix.ghcDefault
                          mynix.cabalDefault ];
$(if [ "${#TOOLS[@]}" -gt 0 ]; then
    echo "    buildHelpers = rec {"
    local t
    for t in "${TOOLS[@]}"; do
      local buildTool=$(find .cabbages -name "$t-[[:digit:].]*" -maxdepth 3 -type d | awk '{print length($0) " " $0;}' | sort -n | cut -d ' ' -f 2- | head -n 1)
      if ! [ "$buildTool" = "" ]; then
        #buildTool=$(basename "$buildTool")
        local buildToolTools=($(cd "$buildTool" && allBuildTools))
        #buildToolTools=(${buildToolTools[@]/#/buildHelpers.})
        echo "      $t = pkgs.callPackage $buildTool/default.nix {"
        echo "        inherit pkgs haskellBuildTools ${buildToolTools[@]};"
        echo "      };"
      else
        echo "WARNING: Couldn't add shell.nix dependency on $t" 1>&2
      fi
    done
    echo "    };"
  fi)
$(if [ "${#TOOLS[@]}" -gt 0 ]; then
echo "in with buildHelpers; pkgs.callPackage ./default.nix {"
echo "   inherit pkgs haskellBuildTools $(echo "${TOOLS[@]}");"
else
echo "in pkgs.callPackage ./default.nix {"
echo "   inherit pkgs haskellBuildTools;"
fi)
}
EOF
  echo "$NIX"
}

getNamedCabbage() {
  local NIX
  read -r -d '' NIX<<EOF
with import <nixpkgs> {};
with import ./shell.nix;
(lib.findFirst (pkg: (builtins.parseDrvName pkg.name).name == "haskell-$1")
               {name="Error";}
               cabbageDeps).outPath
EOF
  echo "$NIX" > getNamedCabbage.nix

  local CABBAGE
  CABBAGE=$(nix-instantiate --eval getNamedCabbage.nix | sed 's/^"\(.*\)"$/\1/')
  echo "To install $1 in your environment, run:"
  echo "nix-env -i $CABBAGE"
}

Arguments

If given an argument, try to get it from hackage.

The technique for creating a temporary directory that works on both Linux and Darwin is from here.

mytmpdir=$(mktemp -d 2>/dev/null || mktemp -d -t 'cabbage-temp')
(cd "$mytmpdir" \
    && getCabalFile "$1" \
    && cabal sandbox init \
    && mkCabbages $ALLTARGETS true \
    && defaultShell > shell.nix \
    && cabal sandbox hc-pkg recache \
    && nix-shell --command "echo 'Done'" \
    && getNamedCabbage "$1")
rm -r "$mytmpdir"

Support to generate a dependency

When the user wants to install a library into the nix store, we generate a dummy package that depends on the package the user wants, then install the dummy package’s dependencies with nix-shell. The cabbage process is driven by cabal freeze which is happy to run the solver on a very minimal cabal file. So, we see what we got from cabal get, then reformat the directory name into a version constraint that we use to populate the dummy cabal file.

# Takes a versioned file name, e.g. "foo-0.8.2",
# and returns "foo ==0.8.2"
mkConstraintString() {
  sed 's/\(.*\)-\([[:digit:]].*\)/\1 ==\2/' <<< "$1"
}

# Takes a versioned file name and produces a minimal cabal file for
# freezing purposes.
mkDummyCabal() {
  local CABAL
  local SELFDEP=$(mkConstraintString "$1")

  read -r -d '' CABAL<<EOF
name:               Dummy
version:            0.1.0.0
build-type:         Simple
cabal-version:      >=1.10

library
  build-depends:    $SELFDEP
  exposed-modules:
EOF

  echo "$CABAL"
}

A quick test
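
A minimal sketch of such a test (lens-4.7 is just an example versioned name):

<<mkDummyCabal>>
mkDummyCabal "lens-4.7"

This prints a Dummy .cabal file whose build-depends is lens ==4.7.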

Getting the file to build in a temporary directory

<<mkDummyCabal>>

# If the argument is a cabal file, copy the contents of the directory
# it is in to the current directory. Otherwise, try using ~cabal get~
# to download the package from hackage.
getCabalFile() {
  if [ -f "$1" ]; then
    cp -R "$(dirname "$1")"/* .
  else
    mkdir -p .cabbages
    cabal get "$1" -d .cabbages
    local VERSIONED_NAME=$(ls .cabbages)
    mkDummyCabal "$VERSIONED_NAME" > dummy.cabal
  fi
}

No arguments

This lets us just run the tangled shell script from the command line to generate a Nix expression for the cabal file in the current directory.

Code

<<getCabalFile>>
<<defaultShell>>

# Get a list of all add-source dependencies, then delete and
# re-initialize the sandbox, then re-add those sources. This flushes
# out the sandbox, which can otherwise constrain the dependency
# solver and leave us with an incomplete buildplan.
regenSandbox() {
  local DIRS=($(cabal sandbox list-sources | sed '1,/^$/ d' | sed '/^$/,$ d'))
  cabal sandbox delete
  cabal sandbox init
  local d
  for d in "${DIRS[@]}"; do
    cabal sandbox add-source "$d"
  done
}

showHelp() {
  echo "Usage: cabbage [-a | -b | -l | -n] [packageName]"
  echo ""
  echo "- Run cabbage in a directory with a .cabal file to build Nix"
  echo "  expressions for the current package and all of its dependencies."
  echo "  Then run 'nix-shell --run 'sh $setup'' to ensure that all "
  echo "  dependencies are available in the Nix store, and to link them into "
  echo "  the sandbox."
  echo "  If no sandbox is in the current directory, a new one will be created."
  echo ""
  echo "- The '-a' option will additionally link the dependencies of any "
  echo "  benchmark and test suites. "
  echo ""
  echo "- The '-b' option will build the project with nix-build."
  echo ""
  echo "- The '-l' option will build against the current Stackage LTS"
  echo "  (see www.stackage.org for more information)"
  echo ""
  echo "- The '-n' option will build against the current Stackage Nightly"
  echo "  (see www.stackage.org for more information)"
  echo ""
  echo "- If cabbage is given a path to a .cabal file or a package name "
  echo "  (with optional version suffix) available on hackage, that package "
  echo "  will be built in a temporary directory so that it is available in "
  echo "  the Nix store for future builds. If you want executables provided "
  echo "  by that package to be linked into your environment, follow the "
  echo "  instructions in the last line of cabbage output."
}

ALLTARGETS=false

while getopts ":ablnh" opt; do
  case "$opt" in
    a) ALLTARGETS=true;;
    b) cabal configure && cabal sdist -v0 && nix-build shell.nix; exit 0;;
    l) curl -L http://www.stackage.org/lts/cabal.config > cabal.config;;
    n) curl -L http://www.stackage.org/nightly/cabal.config > cabal.config;;
    h|\?) showHelp; exit 0;;
  esac
done
shift $((OPTIND - 1))

if [ "$#" -eq 0 ]; then
  if ! [ -f cabal.sandbox.config ]; then
    cabal sandbox init
  else
    regenSandbox
  fi
  mkCabbages $ALLTARGETS true
  if ! [ -f shell.nix ]; then
    defaultShell > shell.nix
  fi
else
  if [ "$1" = "stackage" ]; then
    if ! [ -f cabal.config ]; then
      echo "Building stackage requires a cabal.config file."
      echo "Did you forget to use either the -l or -n flags?"
    else
      if [ -f stackage.cabal ] || [ -f cabal.sandbox.config ]; then
        echo "This directory isnot empty."
        read -p "Are you sure you want to clear it? [Y/n]" -n 1 -t 5 STACKAGE_CLEAR_SB
        echo
        if [ "$STACKAGE_CLEAR_SB" = "n" ]; then
          exit 1
        fi
        cabal sandbox delete
      fi
      dummyFromConstraints > stackage.cabal
      cabal sandbox init
      mkCabbages false true
      defaultShell > shell.nix
    fi
  else
    <<buildInTempDir>>
  fi
fi

Tasks

Cache cabbages

Right now, we always download a package and we always generate a cabbage. What we could do is cache the downloaded source and the result of cabal freeze, then do the buildplan intersection and check if we’ve got an equivalent default.nix in the cache. It’s not clear how much time this would save. We need to do the constraint intersection no matter what. We could hash the constraint intersection with the package’s .cabal file and see if we’ve already generated an equivalent cabbage. This would just save us the trouble of producing the actual .nix files, but much of the work would have already been done.

systemDeps for common packages

We could define a Nix expression that has systemDeps for a bunch of well-known Haskell packages. This could serve to obviate the need for a cabbage.config specification of systemDeps for common packages. Where would we install this expression?

Automatic extra-libraries parsing

A Haskell package like zlib has an extra-libraries field that mentions z. This system dependency, libz, is provided by the nixpkgs package, zlib. It would be nice to parse these out of .cabal files automatically and map them to nixpkgs package names. In the meantime, these can be manually specified in a cabbage.config file.

Make dependency cabbages in parallel

Perhaps use getconf _NPROCESSORS_ONLN to figure out how many jobs to run in parallel. It’s not entirely sensible as we may not be bottlenecked on CPU, but it’s a number.

This would work, but cabal uses a file system lock in the sandbox that gets confused when multiple jobs run simultaneously.