Category Archives: Build systems

Adding/overriding SCons Tools

I had the bright idea to distribute SCons extras via a build helper that adds them to any build, instead of manually copying them into each project’s site_scons/site_tools folder. To begin with, I was adding the latest/modified Visual Studio support code found in the MSCommon directory of the SCons codebase – rather than force everyone to have a bleeding-edge SCons install to build with Visual Studio 2013, I figured I would just wedge these in.

This sounded like a good idea…

First try – DefaultToolpath

There’s a global array called DefaultToolpath in SCons.Tool that SCons’ own startup code puts your site_scons/site_tools folder into, so I figured I would just use it:

import os
import SCons.Tool

def AddSConsTools():
  # Add the appropriate SCons tools to the default toolpath. This
  # contains an override MSCommon folder as well as other builders

  # TBD - have a folder per scons version, in case we have incompatible
  # changes happening
  site_tools = os.path.join(os.path.dirname(__file__), 'scons-2.3.3', 'site_tools')
  print('Adding SCons tools from %s' % site_tools)
  SCons.Tool.DefaultToolpath.insert(0, os.path.abspath(site_tools))

By inserting at the front of the array, my tools come first, at least before anything that is done by SCons itself (DefaultToolpath is filled in when SCons starts up).

We have a packaging system that makes it easy for SCons projects to load dependencies. I have a package that my SConstruct loads, and that package looks like this:


Clever, right? Except it didn’t work.

After some tracing through the code to find out how tools load (SCons/Tool/__init__.py, in the __init__ function for the Tool class), I found out that, while you can add extra tools this way, you can’t add sub-modules of tools, because the Tool intercept loader only searches for top-level modules; it doesn’t handle sub-modules. So I could replace or add Tools themselves, but I couldn’t do this for sub-modules of Tool, e.g. the MSCommon folder.
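To see why, here’s a sketch of the flat search the loader effectively does (my own minimal mock, not the actual SCons code):

```python
import os
import tempfile

def find_tool(name, toolpath):
    """Mock of a flat tool search: look for name.py (or a name/ directory)
    in each toolpath entry. A dotted sub-module name never matches."""
    for d in toolpath:
        if os.path.isfile(os.path.join(d, name + '.py')):
            return os.path.join(d, name + '.py')
        if os.path.isdir(os.path.join(d, name)):
            return os.path.join(d, name)
    return None

tools = tempfile.mkdtemp()
open(os.path.join(tools, 'msvc.py'), 'w').close()   # an override tool
os.makedirs(os.path.join(tools, 'MSCommon'))        # an override sub-module

print(find_tool('msvc', [tools]) is not None)       # True: tools can be overridden
print(find_tool('MSCommon.common', [tools]))        # None: sub-modules can't
```

The search is keyed on the flat tool name, so `MSCommon.common` falls through to the stock import machinery.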

Second try – overload tools that reference MSCommon.

Since I really wanted a replacement MSCommon folder, I decided to put some of the Visual Studio tools in my build helper as well. That makes things a little more fragile, but this way, as long as my Tool (say msvc) is loaded first, even in a dummy environment, it pulls in my MSCommon files, and then those loaded modules are used by other parts of the system. Since msvc is the first loaded tool (by default), I just put msvc in, but I could put everything in.
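The dummy-environment trick, sketched as an SConstruct fragment (tool names are the stock SCons ones; AddSConsTools is my helper from above):

```python
# SConstruct fragment (sketch): load the override msvc tool first, in a
# throwaway Environment, so my MSCommon modules get imported before the
# stock ones. The real environment then reuses the already-loaded modules.
AddSConsTools()                 # puts my site_tools on DefaultToolpath
Environment(tools=['msvc'])     # dummy env; side effect only
env = Environment()             # the real environment
```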

So, to be clear, here’s what my build helper looks like now:


As another quick reference, here are the tools that import modules from MSCommon:

  • linkloc (Phar Lap ETS, niche)
  • midl (Microsoft IDL compiler)
  • mslib (Microsoft Library archiver)
  • mslink (Microsoft Linker)
  • mssdk (Microsoft SDK setup)
  • msvc (Microsoft Visual C++)
  • msvs (Microsoft Visual Studio, generates solution/project files)

Since MSCommon is a module with an __init__.py, importing it directly pulls in three of the submodules (SCons.Tool.MSCommon.sdk, SCons.Tool.MSCommon.vc, and SCons.Tool.MSCommon.vs). Each of those pulls in SCons.Tool.MSCommon.common. I can’t find anything that imports SCons.Tool.MSCommon.arch or SCons.Tool.MSCommon.netframework, so these appear to be dead modules.

So as long as I reference a tool in my own hierarchy first, I can get my custom MSCommon modules to be used.

But, that didn’t work either. I had added debug printing in the previous step, and, depressingly, this is what I saw:

Adding SCons tools from c:\package_cache\ Build Tools\0.75\noarch\build\scons-2.3.3\site_tools
  loading SCons.Tool.default
  found 2nd at C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\ ('.py', 'U', 1)
  found 1st at c:\package_cache\ Build Tools\0.75\noarch\build\scons-2.3.3\site_tools\ ('.py', 'U', 1)
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\__init__.pyc
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\sdk.pyc
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\common.pyc
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\vc.pyc
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\vs.pyc
I am c:\package_cache\ Build Tools\0.75\noarch\build\scons-2.3.3\site_tools\MSCommon\

This is because the tool code is doing absolute imports (from SCons.Tool.MSCommon import …). So, while my override was indeed loaded, my MSCommon folder was still skipped in favor of the one in the SCons 2.3.0 install folder.
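Here’s a small demonstration of the underlying Python behavior, using toy package names of my own: once a package has been imported, sub-module imports resolve against that package’s __path__, so path entries added afterwards are ignored.

```python
import os
import sys
import tempfile

# Build two copies of a package "toolpkg": a stand-in for the installed
# SCons.Tool, and an override copy like my site_tools package.
installed = tempfile.mkdtemp()
override = tempfile.mkdtemp()
for root, marker in ((installed, 'stock'), (override, 'patched')):
    pkg = os.path.join(root, 'toolpkg')
    os.makedirs(pkg)
    open(os.path.join(pkg, '__init__.py'), 'w').close()
    with open(os.path.join(pkg, 'common.py'), 'w') as f:
        f.write('MARKER = %r\n' % marker)

sys.path.insert(0, installed)
import toolpkg                    # binds the package to the "installed" copy

# Too late: prepending the override path changes nothing, because
# toolpkg.__path__ already points into the installed copy.
sys.path.insert(0, override)
import toolpkg.common
print(toolpkg.common.MARKER)      # -> stock
```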

Third try – put folders in the python path

DefaultToolpath isn’t a good idea for anything other than a Tool. For the MSCommon folder, I’ll just insert it into the python search path. So I rearrange my package to look like this:


I didn’t actually need all of it in the package, I just put it there to verify that I had everything working. In my real build package, I have some custom tools that we use for builds. I also trimmed the apparently-obsolete files.

My startup code now looks like this:

import os
import sys
import SCons.Tool

def AddSConsTools():
  # TBD - have a folder per scons version, in case we have incompatible
  # changes happening
  sconsdir = os.path.join(os.path.dirname(__file__), 'scons-2.3.3')
  # Add custom SCons tools to the default toolpath
  site_tools = os.path.join(sconsdir, 'site_tools')
  print('Adding SCons tools from %s' % site_tools)
  SCons.Tool.DefaultToolpath.insert(0, os.path.abspath(site_tools))

  # Add the SCons.Tool.MSCommon folder to the python module path
  MSCommon = os.path.join(sconsdir, 'MSCommon')
  sys.path.insert(0, os.path.abspath(MSCommon))
  print('Patching SCons with %s' % MSCommon)

Of course, that didn’t work either. I get an error saying it can’t find MSCommon.

EnvironmentError: No module named MSCommon

Even worse, it is finding things inside MSCommon – in the wrong place!

I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\__init__.pyc
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\sdk.pyc
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\common.pyc
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\vc.pyc
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\vs.pyc

Fourth try – more care

It would behoove me to pay more attention to how modules inside MSCommon are being imported.

Some files are using absolute module paths

from SCons.Tool.MSCommon import msvs_exists, merge_default_version

but others are using relative paths

from MSCommon import msvc_exists

or (inside the MSCommon folder)

import common

So I can either add an MSCommon parent folder to the python path, or add all the Microsoft-specific tools to my package (or both).

So let’s try that:

import os
import sys
import SCons.Tool

def AddSConsTools():
  # TBD - have a folder per scons version, in case we have incompatible
  # changes happening
  sconsdir = os.path.join(os.path.dirname(__file__), 'scons-2.3.3')
  # Add custom SCons tools to the default toolpath
  site_tools = os.path.join(sconsdir, 'site_tools')
  print('Adding SCons tools from %s' % site_tools)
  SCons.Tool.DefaultToolpath.insert(0, os.path.abspath(site_tools))

  # Add the SCons.Tool.MSCommon folder to the python module path
  scons_patch = os.path.join(sconsdir, 'scons_patch')
  print('Patching SCons with %s' % scons_patch)
  sys.path.insert(0, os.path.abspath(os.path.join(scons_patch, 'SCons', 'Tool')))
  sys.path.insert(0, os.path.abspath(scons_patch))

And it didn’t work!

I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\__init__.pyc
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\sdk.pyc
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\common.pyc
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\vc.pyc
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\vs.pyc
I am c:\package_cache\ Build Tools\0.83\noarch\build\scons-2.3.3\scons_patch\SCons\Tool\MSCommon\

Without being 100% sure, I believe what’s happening is that the SCons.Tool package was already imported from the 2.3.0 install, so an absolute import of SCons.Tool.MSCommon resolves against that already-imported package, and Python never consults my newly added sys.path entries. That last line where it finds my package files is maybe due to the Tool loader code? It’s odd, because supposedly an import checks for cached modules before searching on disk. But I really don’t care that much at the moment.

So, two options

  1. Fiddle with the module path and directly import all the MSCommon stuff, then restore the module path
  2. Use sys.meta_path

The first one is probably the easiest to get working, but it’s hacky, whereas option #2 sounds cool.
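For the record, option #2 would look roughly like this in modern Python (importlib API; the module and folder names here are made up for the sketch):

```python
import importlib.abc
import importlib.util
import os
import sys
import tempfile

class RedirectFinder(importlib.abc.MetaPathFinder):
    """Serve `prefix` and its submodules from .py files under `root`."""
    def __init__(self, prefix, root):
        self.prefix = prefix
        self.root = root

    def find_spec(self, fullname, path=None, target=None):
        if fullname != self.prefix and not fullname.startswith(self.prefix + '.'):
            return None
        tail = fullname[len(self.prefix):].lstrip('.')
        if not tail:
            # The package itself: point it at our root folder.
            init = os.path.join(self.root, '__init__.py')
            return importlib.util.spec_from_file_location(
                fullname, init, submodule_search_locations=[self.root])
        candidate = os.path.join(self.root, *tail.split('.')) + '.py'
        if os.path.isfile(candidate):
            return importlib.util.spec_from_file_location(fullname, candidate)
        return None

# Demo: a patch folder with one module in it.
root = tempfile.mkdtemp()
open(os.path.join(root, '__init__.py'), 'w').close()
with open(os.path.join(root, 'common.py'), 'w') as f:
    f.write('MARKER = "patched"\n')

sys.meta_path.insert(0, RedirectFinder('mscommon_patch', root))
import mscommon_patch.common      # resolved through our finder, not sys.path
print(mscommon_patch.common.MARKER)  # -> patched
```

Because meta_path finders run before the normal path-based import machinery, this can intercept even package-qualified names, which is exactly what the toolpath approach couldn’t do.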

To be continued…

A rant

It’s a lot of work for short-term gain, but it’s also something I’m thinking about in terms of SCons or other build systems – build tools are themselves dependencies, and having globally installed tools with a single name that could be any version is a bad idea. A lot of the Unix tools, specifically GCC, have this issue, and while there are vendor workarounds, there is no standard for doing this. Apple is a little better with XCode, because you can select the version of XCode you want used, and everything is relative to the XCode root. Visual Studio went the “every version has a unique name” route, which is good and bad; they suffer from incompatibility between toolchain releases, or rather they purposefully break compatibility.

Oops, philosophy. I’ll talk more about this in a bit, but build tools should be versioned objects just like source code, and a project declares what build tools it depends on. There has to be a root, but that root would be very lightweight and have no version dependencies.

I call this “build”, but it’s really a build meta-tool that makes sure the right build tools are loaded and used.

Using CMake

Like most build systems, CMake is not documented clearly enough for me. I’m going to use libgit2 as an example of something real that is available to everyone. I’m doing this because there’s still not a single build system that’s good enough for general use, at least not when it comes to working on multi-platform projects.


On Windows you’ll almost always be using CMake to generate and use Visual Studio projects, although you have a choice of:

  • Makefiles: MinGW, MSYS, NMake
  • Visual Studio projects (versions 6-12)

Let’s start with Visual Studio projects, since that’s the common case.

Visual Studio generation

Grab the libgit2 source. Since I’m going to build for PyGit, I want a specific tag for compatibility. I definitely don’t want the development branch :)

> git clone
> cd libgit2
> git checkout -b local-v0.20.0 v0.20.0

You’re meant to run CMake from the output folder. This is weird, but whatever. So here’s the naive way to use CMake.

> mkdir build
> cd build
> cmake ..
-- Building for: Visual Studio 12
-- The C compiler identification is MSVC 18.0.21005.1
-- Check for working C compiler using: Visual Studio 12
-- Check for working C compiler using: Visual Studio 12 -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
zlib was not found; using bundled 3rd-party sources.
-- Found PythonInterp: C:/Python27/python.exe (found version "2.7.6")
-- Configuring done
-- Generating done
-- Build files have been written to: C:/projects/git/github/libgit2/build

Of course, this will auto-pick a Visual Studio toolchain, and since it’s Windows, it won’t use the toolchain found in my path (that I very carefully put there), since it’s actually not common that the Visual Studio toolchain is in the path. CMake will default to the newest version it finds, and while that’s a reasonable thing to do, I need to be specific. So you need to tell CMake about the toolchain.

> mkdir build
> cd build
> cmake -G "Visual Studio 11" ..
-- The C compiler identification is MSVC 17.0.61030.0
-- Check for working C compiler using: Visual Studio 11
-- Check for working C compiler using: Visual Studio 11 -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
zlib was not found; using bundled 3rd-party sources.
-- Found PythonInterp: C:/Python27/python.exe (found version "2.7.6")
-- Configuring done
-- Generating done
-- Build files have been written to: C:/projects/git/github/libgit2/build

It doesn’t look like it’s possible to hard-code a generator in the CMakeLists.txt file. A starter CMakeCache.txt file can go into the target folder, but there’s a chicken-and-egg issue there.

In CMake, there is a distinction between “generator” and “toolset”. E.g. you can use the “Visual Studio 12” generator but have it create projects that use the Visual Studio 11 (v110) toolchain, by passing the toolset name:

> cmake -G "Visual Studio 12" -T v110 ..

Up until now, all we’ve done is create a Visual Studio project file. While that’s useful, we actually want some built libraries and binaries.

You can build from the command-line like so:

> cmake --build .

(assuming you were in the build directory). With a Visual Studio generator this defaults to the Debug configuration; to do a release build, pass the configuration explicitly:

> cmake --build . --config Release

Note that --target selects a build target (e.g. ALL_BUILD or INSTALL), not a configuration; if you do

> cmake --build . --target Release

you’ll get an error. You’ll also find out that cmake is using devenv, when it should now be using msbuild to be a good Windows citizen. CMake is great for creating cross-platform projects, but less good as an actual build tool. So you’ll want to directly use MSBuild.

> msbuild libgit2.sln /t:Build /p:Configuration=Release;Platform=Win32

And now I have libraries and binaries in libgit2/build/Release. If you really want to use devenv (against Microsoft’s desires, but what the heck), then

> devenv libgit2.sln /build Release /project ALL_BUILD

There is nothing that mandates the output folder being named build; it’s merely a convention.





General comments

Once you’ve generated makefiles with a specific generator, you can’t change the generator. You need to wipe the build folder, or pick a new build folder. So for doing cross-platform builds on a single machine, you’ll want some consistent naming for multiple build folders.

CMake likes to generate projects for a single architecture. The generator name selects the architecture; the default is 32-bit:

> cmake -G "Visual Studio 12" ..
> cmake -G "Visual Studio 12 Win64" ..

I don’t know how to generate a multi-architecture project. Or rather, the CMake philosophy is to use multiple build directories with a single source tree, and given CMake’s architecture it sounds like multi-architecture projects just won’t be possible.


CMake documentation


Specific platforms



Using Visual Studio toolchains

This is a collection of information about how to use Visual Studio toolchains from command-lines or from other build systems. It’s probably also useful for people who want to know how things are configured – because when something is broken, you either fix it, or reset and start over.

I’m also only going to cover Visual C++, since that’s what I care about. And this is a little disjoint, but it is a blog post, after all – I’ll try to turn it into actual documentation at some point. Or rather, this is half a blog post, since I’m going to update it multiple times.

Visual Studio 2013

This is also known as Visual Studio 12.

Default install path: C:\Program Files (x86)\Microsoft Visual Studio 12.0\

Location of vcvarsall.bat: $(VSTUDIO)\VC\vcvarsall.bat. This is useful to read or run because it contains all the environment variables needed to run tools from the command line. I’m presuming that the Visual Studio IDE does something equivalent.

Environment variables


These already existed in my environment, but were updated by vcvarsall.bat.

CommonProgramFiles=C:\Program Files\Common Files
CommonProgramFiles(x86)=C:\Program Files (x86)\Common Files
CommonProgramW6432=C:\Program Files\Common Files
ProgramFiles=C:\Program Files
ProgramFiles(x86)=C:\Program Files (x86)
ProgramW6432=C:\Program Files


These are common to the x86 and amd64 toolchains.

ExtensionSdkDir=C:\Program Files (x86)\Microsoft SDKs\Windows\v8.1\ExtensionSDKs
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\INCLUDE;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\ATLMFC\INCLUDE;
  C:\Program Files (x86)\Windows Kits\8.1\include\shared;
  C:\Program Files (x86)\Windows Kits\8.1\include\um;
  C:\Program Files (x86)\Windows Kits\8.1\include\winrt;
  C:\Program Files (x86)\Windows Kits\8.1\References\CommonConfiguration\Neutral;
  C:\Program Files (x86)\Microsoft SDKs\Windows\v8.1\ExtensionSDKs\Microsoft.VCLibs\12.0\References\CommonConfiguration\neutral;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\CommonExtensions\Microsoft\TestWindow;
VCINSTALLDIR=C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\
VSINSTALLDIR=C:\Program Files (x86)\Microsoft Visual Studio 12.0\
WindowsSdkDir=C:\Program Files (x86)\Windows Kits\8.1\
WindowsSDK_ExecutablePath_x64=C:\Program Files (x86)\Microsoft SDKs\Windows\v8.1A\bin\NETFX 4.5.1 Tools\x64\
WindowsSDK_ExecutablePath_x86=C:\Program Files (x86)\Microsoft SDKs\Windows\v8.1A\bin\NETFX 4.5.1 Tools\


These are specific to x86 toolchains.

DevEnvDir=C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\LIB;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\ATLMFC\LIB;
  C:\Program Files (x86)\Windows Kits\8.1\lib\winv6.3\um\x86;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\LIB;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\ATLMFC\LIB;
  C:\Program Files (x86)\MSBuild\12.0\bin;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\BIN;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\Tools;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\VCPackages;
  C:\Program Files (x86)\HTML Help Workshop;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\Team Tools\Performance Tools;
  C:\Program Files (x86)\Windows Kits\8.1\bin\x86;
  C:\Program Files (x86)\Microsoft SDKs\Windows\v8.1A\bin\NETFX 4.5.1 Tools\


These are specific to amd64 toolchains.

  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\LIB\amd64;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\ATLMFC\LIB\amd64;
  C:\Program Files (x86)\Windows Kits\8.1\lib\winv6.3\um\x64;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\LIB\amd64;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\ATLMFC\LIB\amd64;
  C:\Program Files (x86)\MSBuild\12.0\bin\amd64;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\BIN\amd64;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\VCPackages;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\Tools;
  C:\Program Files (x86)\HTML Help Workshop;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\Team Tools\Performance Tools\x64;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\Team Tools\Performance Tools;
  C:\Program Files (x86)\Windows Kits\8.1\bin\x64;
  C:\Program Files (x86)\Windows Kits\8.1\bin\x86;
  C:\Program Files (x86)\Microsoft SDKs\Windows\v8.1A\bin\NETFX 4.5.1 Tools\x64\

Note that many of these are disjoint, so could probably be set in a unified environment.

Some pre-existing environment variables were touched. I think it decided to make sure they were correct, because in one case it’s clear that the previous environment variable was incorrect. Before I ran vcvarsall.bat, I had this

CommonProgramFiles=C:\Program Files (x86)\Common Files

which is clearly wrong. I also had this:


which turned to this for x86


and this for amd64


I’m guessing this is supposed to be recording the architecture of the host machine, not the toolchain target. This

indicates that I’m doing something wrong, I’m using 64-bit tools from a 32-bit cmd.exe. So now this makes slightly more sense. Except Task Manager says otherwise, it says that I’m not running 32-bit cmd.exe processes (there’s a *32 annotation on 32-bit processes). So my machine was set up incorrectly? Something to look into down the road.

SCons Environment in depth, part 3

I’m going to focus on the Microsoft toolchain, with the aim of being able to put a Microsoft toolchain into a package that can be loaded at build time. The plus side to this is that you don’t need toolchains installed on systems, but it requires a little finagling of SCons. And to do that, we need to understand what it’s doing. I covered individual Microsoft-specific tools in a previous part, but in isolation, and with less understanding than I have now. So, onwards.

Note – this is super-sketchy and should be filled in. I started keeping notes for myself as I was working on Visual-C++-in-a-package, and after the initial exploration, I started working. I need to circle back and update this.

How does SCons configure Microsoft Visual C++?

There is a debugging environment variable, SCONS_MSCOMMON_DEBUG, that you can set (to a log filename) to enable some SCons spew from Tool/MSCommon/common.py. If you do that with a simple SConstruct

env = Environment(tools=[], platform='win32', MSVC_VERSION='11.0')

then you’ll get some output that will guide you. We’re trying to find specific Microsoft products, and there are well-known registry keys pointing to each version. Visual Studio 2012 has a registry key pointing to the on-disk location for Visual C++:

Software\Wow6432Node\Microsoft\VisualStudio\11.0\Setup\VC\ProductDir = C:\dev\VC11\VC\

If you don’t specify a Visual C++ version, SCons will enumerate every possible version of Visual Studio going back to the dawn of time, and then pick the first one it finds – since the list it searches is ordered from newest to oldest, this will find the most recent Visual C++ that you have installed.

If you do this while specifying a specific Visual C++ version, you’ll see that it skips the registry scanning and goes straight to enumerating the hard disk. However, something later forgets this, and it scans anyway. This is because vc.msvc_exists() is defective – it uses the (cached) list of versions as proof that Visual C++ exists, but nothing set it up for the case where you bypass it. This is an easy fix; I’ll add it to the list of things I want to patch.

Another nit is that find_vc_pdir is not memoized – it’s called at least three times during setup. The only reason I care is that SCons on Linux (even in a VM) starts up about 0.5 sec faster than on Windows – this might be Python overhead on the two systems, or it could be the Microsoft tools init. I’ll profile it at some point.
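A memoizing wrapper is trivial; something like this (my own generic sketch, with a stand-in for the real registry/disk probe):

```python
def memoize(fn):
    """Cache results per argument tuple so repeated lookups are free."""
    cache = {}
    def wrapper(*args):
        if args not in cache:
            cache[args] = fn(*args)
        return cache[args]
    return wrapper

calls = []

@memoize
def find_vc_pdir(msvc_version):
    calls.append(msvc_version)          # stand-in for the expensive probe
    return r'C:\dev\VC11\VC' if msvc_version == '11.0' else None

find_vc_pdir('11.0')
find_vc_pdir('11.0')
find_vc_pdir('11.0')
print(len(calls))  # -> 1: the probe ran only once
```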

Then it finds the magic BAT file that Microsoft supplies for command-line use, that sets up all the environment variables that the toolchains need to run. There is a boolean construction variable, MSVC_USE_SCRIPT, that lets you disable the use of the Microsoft script – if this is set to False (it defaults to True), then SCons assumes you have done all the setup yourself.
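If you go the MSVC_USE_SCRIPT=False route, the SConstruct side might look like this (a sketch; it assumes PATH/INCLUDE/LIB were already set in the shell that invokes scons):

```python
# SConstruct sketch: bypass vcvarsall.bat and hand SCons a pre-built
# environment instead of letting it run the Microsoft script.
import os
env = Environment(
    MSVC_USE_SCRIPT=False,
    ENV={'PATH': os.environ['PATH'],
         'INCLUDE': os.environ.get('INCLUDE', ''),
         'LIB': os.environ.get('LIB', '')})
```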

And it scans for installed SDKs. This part is missing a preconfigure step to let you select a specific SDK. In general, SDKs are only very loosely coupled with the Visual Studio install.

Visual C++ vcvarsall.bat

This is a batch file that Microsoft has been supplying for a while, as a convenience for configuring an environment for building with Visual C++. It takes an optional architecture parameter that defaults to ‘x86’ if not supplied. And if you’re curious, for amd64 this just runs a different batch file at \bin\amd64\vcvars64.bat, which makes registry queries and calls another batch file, Common7\Tools\VCVarsQueryRegistry.bat, which does most of the real work.

If you run it like this

C:\Program Files (x86)\Microsoft Visual Studio 11.0\VC\vcvarsall.bat amd64

then it will set the following environment variables:

ExtensionSdkDir=C:\Program Files (x86)\Microsoft SDKs\Windows\v8.0\ExtensionSDKs
FSHARPINSTALLDIR=C:\Program Files (x86)\Microsoft SDKs\F#\3.0\Framework\v4.0\
  C:\Program Files (x86)\Windows Kits\8.0\include\shared;
  C:\Program Files (x86)\Windows Kits\8.0\include\um;
  C:\Program Files (x86)\Windows Kits\8.0\include\winrt;
  C:\Program Files (x86)\Windows Kits\8.0\lib\win8\um\x64;
  C:\Program Files (x86)\Windows Kits\8.0\References\CommonConfiguration\Neutral;
  C:\Program Files (x86)\Microsoft SDKs\Windows\v8.0\ExtensionSDKs\Microsoft.VCLibs\11.0\References\CommonConfiguration\neutral;
  C:\Program Files (x86)\HTML Help Workshop;
  C:\dev\VC11\Team Tools\Performance Tools\x64;
  C:\dev\VC11\Team Tools\Performance Tools;
  C:\Program Files (x86)\Windows Kits\8.0\bin\x64;
  C:\Program Files (x86)\Windows Kits\8.0\bin\x86;
  C:\Program Files (x86)\Microsoft SDKs\Windows\v8.0A\bin\NETFX 4.0 Tools\x64;
  C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\Bin\x64;
  C:\Program Files (x86)\Microsoft SDKs\Windows\v8.0A\bin\NETFX 4.0 Tools;
  C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\Bin\
WindowsSdkDir=C:\Program Files (x86)\Windows Kits\8.0\
WindowsSdkDir_35=C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\Bin\
WindowsSdkDir_old=C:\Program Files (x86)\Microsoft SDKs\Windows\v8.0A\

If environment variables already exist, it prepends to them.

Now, this may not be entirely accurate, because I had a few environment variables already set for some reason (I’m assuming the Visual Studio installer did this)

VS100COMNTOOLS=C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\Tools\
VS110COMNTOOLS=C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\Tools\

I removed these from an environment and ran vcvars64.bat for VC11, and got the VS110COMNTOOLS environment variable. I think this comes from the “Visual Studio Tools” folder which contains Spy++ and other top-level tools that you would run from the IDE, not as part of the build environment.

This may be a side-light to you, but I want to package Visual C++ into a downloadable tool that is used by the build system to allow builds on arbitrary machines. Yes, we’ll have to make sure we only do this where we’re appropriately licensed.

HKLM\SOFTWARE\Microsoft\Microsoft SDKs\Windows\v8.0\InstallationFolder

Path to the installed Windows SDK, put into environment variable WindowsSdkDir. The default is C:\Program Files (x86)\Windows Kits\8.0\

Alternate locations

  • HKCU\SOFTWARE\Microsoft\Microsoft SDKs\Windows\v8.0\InstallationFolder
  • HKLM\SOFTWARE\Wow6432Node\Microsoft\Microsoft SDKs\Windows\v8.0\InstallationFolder
  • HKCU\SOFTWARE\Wow6432Node\Microsoft\Microsoft SDKs\Windows\v8.0\InstallationFolder

HKLM\SOFTWARE\Microsoft\Microsoft SDKs\Windows\v8.0A\InstallationFolder

Path to an older Windows SDK (for Visual Studio 2012), put into environment variable WindowsSdkDir_old.

Alternate locations

  • HKCU\SOFTWARE\Microsoft\Microsoft SDKs\Windows\v8.0a\InstallationFolder
  • HKLM\SOFTWARE\Wow6432Node\Microsoft\Microsoft SDKs\Windows\v8.0a\InstallationFolder
  • HKCU\SOFTWARE\Wow6432Node\Microsoft\Microsoft SDKs\Windows\v8.0a\InstallationFolder

Environment variables

Microsoft build tools need to have some environment variables set up.


PATH needs to contain the paths to the various tools that will be invoked. For example, it might look something like this. I edited a tiny bit for clarity, where C:\dev\VC11 is the installation folder for Visual Studio 2012 (typically C:\Program Files (x86)\Microsoft Visual Studio 11.0), and C:\dev\SDKs is the installation folder for Microsoft SDKs (typically C:\Program Files (x86)\Microsoft SDKs).

  C:\dev\VC11\Team Tools\Performance Tools\x64
  C:\dev\VC11\Team Tools\Performance Tools
  C:\Program Files (x86)\Windows Kits\8.0\bin\x64
  C:\Program Files (x86)\Windows Kits\8.0\bin\x86
  C:\dev\SDKs\Windows\v8.0A\bin\NETFX 4.0 Tools\x64
  C:\dev\SDKs\Windows\v8.0A\bin\NETFX 4.0 Tools

As mentioned above, the paths come from executing vcvarsall.bat.
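You can harvest these variables programmatically with the “run the batch file, then dump `set` in the same cmd instance” trick (which is essentially what SCons itself does). A sketch; the parsing half is portable, the harvesting half is Windows-only:

```python
import subprocess

def parse_set_output(text):
    """Turn the output of cmd's `set` into a dict (names uppercased)."""
    env = {}
    for line in text.splitlines():
        if '=' in line:
            name, _, value = line.partition('=')
            env[name.upper()] = value
    return env

def harvest_vcvars(vcvarsall, arch='x86'):
    # Windows-only: run the batch file, then dump the resulting
    # environment with `set` in the same cmd instance.
    out = subprocess.check_output(
        'cmd /c ""%s" %s && set"' % (vcvarsall, arch), shell=True)
    return parse_set_output(out.decode('utf-8', 'replace'))

print(parse_set_output('INCLUDE=C:\\VC\\include\nLIB=C:\\VC\\lib')['INCLUDE'])
# -> C:\VC\include
```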

SCons tidbits

Here are a few things I learned that don’t appear to be in the documentation.

Passing variables to SConscripts

The documentation gives two ways to make variables available for import:

SConscript('build/src/SConscript', exports = 'env')

which exports a variable for just this SConscript to import, or

Export('env')

which adds to a global export list that all SConscripts can import from. They can be combined, and variables exported in the SConscript line take precedence over ones in the global list.

However, there is a third way to do this that is undocumented as far as I can tell

SConscript('build/src/SConscript', 'env')

which is the same as the first method, just sans the exports keyword. And of course, it can be a list of variables, and they can be remapped.
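On the receiving side, whichever way the variable was exported, the SConscript pulls it in the same way (file names here are made up):

```python
# SConscript fragment: import what the parent exported, use it, and
# optionally hand results back to the caller.
Import('env')
lib = env.StaticLibrary('mylib', ['mylib.c'])
Return('lib')
```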


Detect

This is not documented, but it is useful. Detect (or env.Detect) is the call that is used to find an executable. It’s called Detect() because it’s used by the tool config system to see if a tool is installed. It searches through the paths in the PATH environment variable.

The scalar version returns the path to the executable, if the executable can be found in the system:

path = env.Detect('protoc')

The list version returns the path to the first executable that can be found. This is useful if there’s a single conceptual tool that might have multiple names:

path = env.Detect(['protoc', 'protoc9', 'protoc10'])

The latter would only be used for cases where each variant is equivalent in functionality, because it will just pick the first one found. Alternatively, there could be a case for listing tools from superset to subset, if your usage code can detect and handle fallbacks, but it’s probably better to do that with individual Detect() calls.

You can pass in the filename with an extension if you only want to find that exact name. Otherwise, SCons will add the extension appropriate to the operating system (on Windows, it will iterate through PATHEXT).

The idiom is to do this in your tool’s exists() function, if your tool is a wrapper around an installed program.
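The search itself is simple; here’s a rough equivalent of what a Detect()-style lookup does (my own sketch, not the SCons source):

```python
import os
import stat
import tempfile

def detect(name, path_dirs, pathext=('',)):
    """Walk the search path, trying each extension, and return the first
    executable hit (roughly the Detect/WhereIs behavior)."""
    for d in path_dirs:
        for ext in pathext:
            candidate = os.path.join(d, name + ext)
            if os.path.isfile(candidate) and os.access(candidate, os.X_OK):
                return candidate
    return None

# Demo: a fake bin directory containing an executable 'protoc'.
bindir = tempfile.mkdtemp()
exe = os.path.join(bindir, 'protoc')
open(exe, 'w').close()
os.chmod(exe, os.stat(exe).st_mode | stat.S_IXUSR)

print(detect('protoc', [bindir]) == exe)   # -> True
print(detect('protoc99', [bindir]))        # -> None
```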


Travis CI

This is interesting

Travis CI is hosted continuous integration and deployment, with Mac OS X support. It integrates with GitHub and supports a fair number of languages. So, you can test and deploy Mac targets without needing Macs of your own.

The big challenge for some companies is letting their source code leave their walls, because obviously this is going to be building your source on machines you don’t control. But I think that attitude is going to die out over time, as people realize that it’s the writing of source code that is where the value is, not the source code itself.

Travis uses xctool to drive XCode instead of xcodebuild.

Another piece of Travis I think comes from Sauce Labs


A list of make systems

last updated: 2013-12-22 9:50 AM PST.

This is every make system I know of, and I’ll keep updating this. My terminology is that a make system can do dependencies and generate code (object files, binaries, arbitrary tool output); development environments typically sit on top of make systems but include editing and debugging; and build systems offer scheduling, multi-machine control, cross-platform in a single build, reporting/logging, and packaging/distribution.

This needs some sort of organization; I’ll probably do both a family tree, and alphabetical.


make

Make is written in C.

Perhaps the original make utility, written in 1976 and available with all Unix systems. There are two variants that deserve to be called make: POSIX make, the standardized version from IEEE Std 1003.1, and GNU make, which is the de facto standard. Wikipedia has links to useful information. It’s more likely that you have GNU make than POSIX make. The vast majority of open-source projects to date come with make files.

make is typically used only in conjunction with building Unix programs, and Mac programs that stick to POSIX features. make is also often used in conjunction with a configure program to generate a platform-specific or configuration-specific makefile, and autoconf is the most widely used such system.

There are a number of make-compatible variants.

  • Microsoft’s nmake is a subset, and sees little modern use.
  • Android uses GNU make, but passes its makefiles through multiple phases of the build process.
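
As a reminder of the flavor, a minimal makefile (hypothetical file names) looks like this:

```make
# Build 'hello' from hello.c; each rule is "target: prerequisites",
# followed by the commands that produce the target.
hello: hello.o
	gcc -o hello hello.o

hello.o: hello.c
	gcc -c hello.c

clean:
	rm -f hello hello.o
```

Note that the command lines must be indented with a literal tab character, which remains one of make’s most notorious quirks.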


Autotools is written in C.

While listed separately (because it sits on top of make), Autotools is used in conjunction with make. A lot of Unix software from the 1970s through the 1990s was built with Autotools. However, it’s really not recommended for anyone any more: it is very arcane, very slow, and very tied to a specific set of Unix toolchains. KDE, for example, switched from Autotools to CMake.

Don’t use Autotools/Autoconf. Just don’t.

msbuild msbuild is written in C#.

msbuild is the build system for Visual Studio starting with Visual Studio 2005. The msbuild engine directly consumes project files such as .csproj and, since Visual Studio 2010, .vcxproj (older .vcproj files were built by the separate vcbuild tool). Most Windows-only software is built with msbuild.

Xcode Xcode is (presumably) written in Objective-C++. The Xcode IDE has an embedded make system that is available through the xcodebuild command-line tool, or directly from the IDE. Tragically, there is very little direct documentation on it, as compared to msbuild.

SCons SCons is written in Python.

SCons is a fairly cross-platform make system, in that a single SConstruct can be used to build on multiple platforms. It hasn’t seen much development in years, however; it would be a shame if it decayed into uselessness, because it’s the only serious cross-platform make system available. It supports Windows, Mac, and a large variety of Unix systems. SCons has configuration built in; it does not use a separate configuration system.
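
For flavor, a minimal SConstruct (hypothetical file names) is plain Python:

```python
# SConstruct -- SCons build files are Python scripts. Program() declares
# a target; SCons scans sources and computes the dependency graph itself.
env = Environment()
env.Program('hello', ['hello.c'])
```

The same file builds with the platform’s default toolchain on Windows, Mac, or Unix, which is the cross-platform point being made above.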


cmake CMake is written in C++.

CMake isn’t a make system on its own. Instead, it generates projects for other make systems; its main backends are msbuild, Xcode, and make. It supports Windows and Mac, and some Unix systems.
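
A minimal CMakeLists.txt (hypothetical project) shows the single description that the generators consume:

```cmake
# CMakeLists.txt -- cmake generates makefiles, Visual Studio projects,
# or Xcode projects from this one description.
cmake_minimum_required(VERSION 2.8)
project(hello)
add_executable(hello hello.c)
```

Running `cmake .` in the source directory generates a build for the default backend (a makefile on Unix); `cmake -G` selects a different generator.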

ninja Ninja is written in C++, and is similar to make in philosophy. It was written by a Chrome engineer, and is supposedly replacing the other make systems used to build Chrome. Ninja is not usually used as a standalone make system, but in conjunction with other make systems (e.g. CMake can generate Ninja build files). In fact, Ninja files are almost always generated by something else, since they are not particularly convenient to write by hand. According to its author, Ninja is based on what he thought Tup did, minus the parts he disliked.
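
A hand-written build.ninja (hypothetical file names) shows why a generator usually writes these; the format is deliberately low-level, with no wildcards or functions:

```ninja
# build.ninja -- rules name a command template; build statements
# instantiate a rule for concrete inputs and outputs.
rule cc
  command = gcc -c $in -o $out

rule link
  command = gcc $in -o $out

build hello.o: cc hello.c
build hello: link hello.o
```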


gyp gyp is written in Python. Supposedly, it was written in reaction to the Chromium team’s problems with SCons.


tup Tup is written in C. It is a new project notable for its sense of grandiose humor, and for its focus on performance. Its page is well worth reading for anyone, not just those wanting more Tup information. Tup is a dependency system, so it would be interesting to see what a complex code+data build looks like in Tup.
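
A Tupfile sketch (hypothetical file names) shows the rule shape, which reads as inputs |> command |> outputs:

```
# Tupfile -- each :-rule maps inputs through a command to outputs;
# %f expands to the inputs and %o to the outputs. Tup monitors the
# filesystem to know exactly what is out of date.
: hello.c |> gcc -c %f -o %o |> hello.o
: hello.o |> gcc %f -o %o |> hello
```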


premake Premake is written in C and Lua. Like CMake, it generates makefiles for other build systems; it is not a make system on its own.


waf waf is written in Python.

This was originally a fork of SCons, but by this point it can be considered a completely new project.

fabricate Fabricate is written in Python.

Fabricate is interesting in that it has no explicit dependency declarations; it discovers them by watching filesystem operations. It’s essentially a build script that memoizes itself. That said, it only works on a few platforms, and is somewhat fragile, because filesystem watching is not perfectly reliable on any system (it’s typically a bolt-on, or a system that is allowed to drop events when overloaded).
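
A Fabricate build script is just Python calling out to commands. A minimal sketch (hypothetical file names, using Fabricate’s documented run/main entry points):

```python
# build.py -- fabricate runs each command under tracing, records which
# files were read and written, and on the next run skips any command
# whose recorded inputs are unchanged.
from fabricate import *

def build():
    run('gcc', '-c', 'hello.c')
    run('gcc', '-o', 'hello', 'hello.o')

def clean():
    autoclean()  # delete everything fabricate saw being created

main()
```

Note there are no dependency declarations at all; the ordering of the run() calls plus the traced file accesses are the whole dependency story.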

Shake Shake is a Haskell library for writing build systems.

Redo Currently written in Python, although the author stated he wants to re-implement it in C. It also really only works on Unix systems.

This is a clean-room re-implementation of Daniel J. Bernstein’s (unreleased?) redo design: a make successor, intended to fix make’s inherent flaws.

Crumb It looks like this only runs on Unix.

This is not really a make system, but a tool that instruments filesystem calls to determine dependencies; it is intended for use in build systems, providing functionality equivalent to Fabricate’s.

badgerconfig This is written in Python.

This looks like just a Visual Studio project generator.

Rake Rake is written in Ruby.

This is “Ruby Make”; its makefiles (Rakefiles) use Ruby syntax.
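
A minimal Rakefile (hypothetical file names) looks make-like, but everything inside the blocks is ordinary Ruby:

```ruby
# Rakefile -- file targets declare prerequisites, and arbitrary Ruby
# code can run inside each task body.
file 'hello' => ['hello.o'] do
  sh 'gcc -o hello hello.o'
end

file 'hello.o' => ['hello.c'] do
  sh 'gcc -c hello.c'
end

task :default => 'hello'
```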




OMake OMake is written in OCaml.

While OMake uses the syntax of make, it isn’t really a drop-in replacement for make, because it extends make substantially.


FBuild FBuild is written in Python 3.

This seems like yet another build system inspired by the decline of SCons.


mem Mem is written in Python.

Mem works via instrumentation and memoization.



Jam Jam is written in C.

Jam’s makefiles derive from the syntax of make, although Jam is not make compatible. It has been around for a long time, and is fairly well documented. But at heart it’s still make, and it suffers from the same limitations and problems.

FTJam (FreeType Jam). FTJam is written in C.

FTJam is a slight enhancement of the original Perforce Jam.

BJam (Boost Jam) Boost Jam is written in C++.

It was derived from FTJam, which in turn was derived from Perforce’s original Jam.

Boost.Build This is written in C++.

I think this replaced the Boost team’s use of Jam. I don’t know whether anyone else uses Boost.Build. It also appears to be dead: there was a plan in 2007 to reimplement most of it in Python, but it looks like that never started.



Wonderbuild Wonderbuild is written in Python.

Wonderbuild appears to have been written for the Psycle Modular Music Creation Studio project. A perhaps biased benchmark shows it as faster than most other make systems. It is not documented except by reading through the source code.





Other references


Nix, a purely functional package manager

There’s a lot to like about Nix, including that packages are content-named and that dependencies are tracked. There are some crazy parts, like the fact that runtime dependencies are determined by scanning binaries for the content-names of other Nix packages; but since a long content name is pretty much unique, this sounds scary but is actually quite safe.

It also makes building very safe, if you have everything as packages.

One negative is that this only supports Unix-like systems (Linux, Mac OS X, FreeBSD). A lesser negative is that it introduces yet another language, the Nix expression language.
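
To give a taste of that language, here is a minimal package sketch (hypothetical name and source):

```nix
# default.nix -- a derivation; Nix builds it in isolation and stores
# the result under a content-addressed path such as
# /nix/store/<hash>-hello-1.0.
with import <nixpkgs> {};

stdenv.mkDerivation {
  name = "hello-1.0";
  src = ./.;
  # stdenv's default phases run ./configure, make, and make install.
}
```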

The NixOS Linux distribution is built around Nix, and Hydra is a Nix-based continuous build system.