Category Archives: Software Development

MPW shell

I miss MPW – this was Apple’s Macintosh Programmer’s Workshop. The shell was a full text-editor window that you could type in, but if you hit Enter (instead of Return) at the end of a line, or you selected several lines and hit Enter, it would execute those as commands. Output went to the insertion point, or to the point right after the selection.

The plus side was that you had these worksheets that had all your previous commands and their output. You had a default worksheet, but you could open up any file and use it as a worksheet. There were multiple downsides as well, which is one of the reasons why MPW died (it wasn’t a real console or terminal, tools couldn’t run other tools, etc). But as a programmer-working-on-a-system, it was a great idea.

With more smarts in the editor, it could help you curate stuff, or have a much better search interface. Someone should write this for me. The world will love you.

Adding/overriding SCons Tools

I had the bright idea to add SCons extras to my builds via a build helper that adds them to any build, instead of manually copying them into each project’s site_scons/site_tools folder. To begin with, I wanted the latest (modified) Visual Studio support code from the MSCommon directory in the SCons source tree – rather than force everyone to have a bleeding-edge SCons install to build with Visual Studio 2013, I figured I would just wedge these in.

This sounded like a good idea…

First try – DefaultToolpath

There’s a global array called DefaultToolpath in SCons.Tool that SCons’ own startup code puts your site_scons/site_tools folder into, so I figured I would just use it:

import os
import SCons.Tool

def AddSConsTools():
  # Add the appropriate SCons tools to the default toolpath. This
  # contains an override MSCommon folder as well as other builders.

  # TBD - have a folder per scons version, in case we have incompatible
  # changes happening
  site_tools = os.path.join(os.path.dirname(__file__), 'scons-2.3.3', 'site_tools')
  print('Adding SCons tools from %s' % site_tools)
  SCons.Tool.DefaultToolpath.insert(0, os.path.abspath(site_tools))

By inserting at the front of the array, my tools come first, before anything done by SCons itself (DefaultToolpath is filled in when SCons starts up).

We have a packaging system that makes it easy for SCons projects to load dependencies. I have a package that my SConstruct loads, and that package looks like this:

build/
  scons-2.3.3/
    site_tools/
      MSCommon/
        __init__.py
        arch.py
        common.py
        netframework.py
        sdk.py
        vc.py
        vs.py

Clever, right? Except it didn’t work.

After some tracing through the code to find out how tools load (SCons/Tool/__init__.py, in the __init__ function for the Tool class), I found that while you can add extra tools this way, you can’t add sub-modules of tools: the Tool intercept loader only searches for single modules by name, and doesn’t handle sub-modules. So I could replace or add tools themselves, but not sub-modules of SCons.Tool, e.g. the MSCommon folder.

Second try – overload tools that reference MSCommon.

Since I really wanted a replacement MSCommon folder, I decided to put some of the Visual Studio tools in my build helper as well. That makes things a little more fragile, but this way, as long as my Tool (say msvc) is loaded first, even in a dummy environment, it pulls in my MSCommon files, and then those loaded modules are used by other parts of the system. Since msvc.py is the first loaded tool (by default), I just put in msvc.py, but I could put everything in.

So, to be clear, here’s what my build helper looks like now:

build/
  scons-2.3.3/
    site_tools/
      msvc.py
      MSCommon/
        __init__.py
        arch.py
        common.py
        netframework.py
        sdk.py
        vc.py
        vs.py

As another quick reference, here are the tools that import modules from MSCommon:

  • linkloc (Phar Lap ETS, niche)
  • midl (Microsoft IDL compiler)
  • mslib (Microsoft Library archiver)
  • mslink (Microsoft Linker)
  • mssdk (Microsoft SDK setup)
  • msvc (Microsoft Visual C++)
  • msvs (Microsoft Visual Studio, generates solution/project files)

Since MSCommon is a module with an __init__.py, it directly pulls in three of the submodules (SCons.Tool.MSCommon.sdk, SCons.Tool.MSCommon.vc, and SCons.Tool.MSCommon.vs). Each of those pulls in SCons.Tool.MSCommon.common. I can’t find anything that imports SCons.Tool.MSCommon.arch or SCons.Tool.MSCommon.netframework, so these appear to be dead modules.

So as long as I reference a tool in my own hierarchy first, I can get my custom MSCommon modules to be used.

But, that didn’t work either. I had added debug printing in the previous step, and, depressingly, this is what I saw:

Adding SCons tools from c:\package_cache\Battle.net Build Tools\0.75\noarch\build\scons-2.3.3\site_tools
Tool(name=default)
  loading SCons.Tool.default
  found 2nd at C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\default.py ('.py', 'U', 1)
Tool(name=msvc)
  found 1st at c:\package_cache\Battle.net Build Tools\0.75\noarch\build\scons-2.3.3\site_tools\msvc.py ('.py', 'U', 1)
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\__init__.pyc
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\sdk.pyc
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\common.pyc
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\vc.pyc
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\vs.pyc
I am c:\package_cache\Battle.net Build Tools\0.75\noarch\build\scons-2.3.3\site_tools\MSCommon\__init__.py

This is because __init__.py is doing absolute imports. So, while my override msvc.py was indeed loaded, my MSCommon folder was still skipped in favor of the one in the SCons 2.3.0 install folder.
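This behavior is easy to reproduce without SCons at all: once a package has been imported, Python resolves its sub-modules through the cached package in sys.modules, so putting an override copy at the front of sys.path afterwards has no effect. A minimal sketch, using a hypothetical package named pkg:

```python
import importlib
import os
import sys
import tempfile

# Build two on-disk copies of a package "pkg", each with a submodule "mod".
root = tempfile.mkdtemp()
for subdir, tag in [('first', 'ORIGINAL'), ('second', 'OVERRIDE')]:
    d = os.path.join(root, subdir, 'pkg')
    os.makedirs(d)
    open(os.path.join(d, '__init__.py'), 'w').close()
    with open(os.path.join(d, 'mod.py'), 'w') as f:
        f.write('WHICH = %r\n' % tag)

sys.path.insert(0, os.path.join(root, 'first'))
import pkg.mod                     # binds 'pkg' to the "first" copy

# Now put the "second" copy ahead of it on sys.path, as my build helper did.
sys.path.insert(0, os.path.join(root, 'second'))
importlib.invalidate_caches()
import pkg.mod                     # still the "first" copy: 'pkg' is cached

print(pkg.mod.WHICH)               # ORIGINAL, not OVERRIDE
```

Substitute SCons for pkg and you have exactly my situation: SCons is imported long before my helper runs.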

Third try – put folders in the python path

DefaultToolpath isn’t a good idea for anything other than a Tool. For the MSCommon folder, I’ll just insert it into the python search path. So I rearrange my package to look like this:

build/
  scons-2.3.3/
    site_tools/
      msvc.py
    scons_patch/
      SCons/
        Tool/
          MSCommon/
            __init__.py
            common.py
            sdk.py
            vc.py
            vs.py

I didn’t actually need msvc.py in the package, I just put it there to verify that I had everything working. In my real build package, I have some custom tools that we use for builds. I also trimmed the apparently-obsolete files.

My startup code now looks like this:

def AddSConsTools():
  # TBD - have a folder per scons version, in case we have incompatible
  # changes happening
  sconsdir = os.path.join(os.path.dirname(__file__), 'scons-2.3.3')

  # Add custom SCons tools to the default toolpath
  site_tools = os.path.join(sconsdir, 'site_tools')
  print('Adding SCons tools from %s' % site_tools)
  SCons.Tool.DefaultToolpath.insert(0, os.path.abspath(site_tools))

  # Add the SCons.Tool.MSCommon folder to the python module path
  scons_patch = os.path.join(sconsdir, 'scons_patch')
  sys.path.insert(0, os.path.abspath(scons_patch))
  print('Patching SCons with %s' % scons_patch)

Of course, that didn’t work either. I get an error saying it can’t find MSCommon.

EnvironmentError: No module named MSCommon:

Even worse, it is finding things inside MSCommon – in the wrong place!

I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\__init__.pyc
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\sdk.pyc
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\common.pyc
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\vc.pyc
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\vs.pyc

Fourth try – more care

It would behoove me to pay more attention to how modules inside MSCommon are being imported.

Some files are using absolute module paths

from SCons.Tool.MSCommon import msvs_exists, merge_default_version

but others are using relative paths

from MSCommon import msvc_exists

or (inside the MSCommon folder)

import common

So I can either add an MSCommon path to the python path, or add all the Microsoft-specific tools to my package.

So let’s try that:

def AddSConsTools():
  # TBD - have a folder per scons version, in case we have incompatible
  # changes happening
  sconsdir = os.path.join(os.path.dirname(__file__), 'scons-2.3.3')
  
  # Add custom SCons tools to the default toolpath
  site_tools = os.path.join(sconsdir, 'site_tools')
  print('Adding SCons tools from %s' % site_tools)
  SCons.Tool.DefaultToolpath.insert(0, os.path.abspath(site_tools))

  # Add the SCons.Tool.MSCommon folder to the python module path
  scons_patch = os.path.join(sconsdir, 'scons_patch')
  print('Patching SCons with %s' % scons_patch)
  sys.path.insert(0, os.path.abspath(os.path.join(scons_patch, 'SCons', 'Tool')))
  sys.path.insert(0, os.path.abspath(scons_patch))

And it didn’t work!

Tool(name=msvc)
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\__init__.pyc
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\sdk.pyc
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\common.pyc
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\vc.pyc
I am C:\Python27\Scripts\..\Lib\site-packages\scons-2.3.0\SCons\Tool\MSCommon\vs.pyc
I am c:\package_cache\Battle.net Build Tools\0.83\noarch\build\scons-2.3.3\scons_patch\SCons\Tool\MSCommon\__init__.py

Without being 100% sure, I believe this is because the installed SCons package was imported long before my code ran. Once a package is in sys.modules, Python resolves its sub-modules through that package’s __path__ – which points into the scons-2.3.0 install folder – and never consults sys.path for them at all. So my sys.path insertions are simply ignored for anything imported as SCons.Tool.MSCommon. That last line, where it does find my package’s __init__.py, is presumably the relative `from MSCommon import …` imports (or the Tool loader) picking up the scons_patch\SCons\Tool entry I pushed onto the path. But I really don’t care that much at the moment.

So, two options

  1. Fiddle with the module path and directly import all the MSCommon stuff, then restore the module path
  2. Use sys.meta_path

The first one is probably the easiest to get working, but it’s hacky, whereas option #2 sounds cool.
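For the record, here’s roughly what option #2 would look like using the modern importlib machinery (SCons 2.3 runs on Python 2.7, where the hook is the older find_module/load_module protocol instead, but the idea is the same). The PatchFinder name and the prefix-redirect scheme are my own sketch, not a tested patch:

```python
import importlib.abc
import importlib.machinery
import os
import tempfile

class PatchFinder(importlib.abc.MetaPathFinder):
    """Redirect imports under a dotted prefix to a parallel patch tree."""

    def __init__(self, prefix, patch_root):
        self.prefix = prefix          # e.g. 'SCons.Tool.MSCommon'
        self.patch_root = patch_root  # dir containing SCons/Tool/MSCommon/...

    def find_spec(self, fullname, path=None, target=None):
        if fullname != self.prefix and not fullname.startswith(self.prefix + '.'):
            return None               # not ours; let the normal machinery run
        # Search only inside the patch tree, mirroring the package layout.
        parts = fullname.split('.')
        search = [os.path.join(self.patch_root, *parts[:-1])]
        return importlib.machinery.PathFinder.find_spec(fullname, search)

# Installed at the front of sys.meta_path, this is consulted before the
# parent package's __path__ (but not before sys.modules, so it has to be
# in place before MSCommon is first imported):
# sys.meta_path.insert(0, PatchFinder('SCons.Tool.MSCommon', scons_patch))

# Tiny self-check against a throwaway tree (hypothetical package 'top.sub'):
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, 'top', 'sub'))
open(os.path.join(root, 'top', 'sub', '__init__.py'), 'w').close()
spec = PatchFinder('top.sub', root).find_spec('top.sub')
```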

To be continued…

A rant

It’s a lot of work for short-term gain, but it’s also something I’m thinking about in terms of SCons or other build systems – build tools are themselves dependencies, and having globally installed tools with a single name that could be any version is a bad idea. A lot of the Unix tools, GCC in particular, have this issue, and while there are vendor workarounds, there is no standard way to do this. Apple is a little better with Xcode, because you can select the version of Xcode you want used, and everything is relative to the Xcode root. Visual Studio went the “every version has a unique name” route, which is good and bad; the toolchains suffer from incompatibility between releases, or rather they purposefully break compatibility.

Oops, philosophy. I’ll talk more about this in a bit, but build tools should be versioned objects just like source code, and a project declares what build tools it depends on. There has to be a root, but that root would be very lightweight and have no version dependencies.

I call this “build”, but it’s really a build meta-tool that makes sure the right build tools are loaded and used.

FlatBuffers, new Google project

Google announced a new open-source project in a blog post

FlatBuffers: a memory efficient serialization library

This is akin to, but has different use cases from, say, Protocol Buffers. The biggest difference is that you don’t unpack the data; you access it in its delivered format, apparently without parsing overhead.

The Github page is: https://github.com/google/flatbuffers/

I’ll definitely be taking a look at this soon. Maybe a lot of the data we deliver to endpoints will be as FlatBuffers instead of JSON or custom binary blobs or complicated Protobufs.

The impetus and target for this seems to be game developers and their desire for efficiency along with portability. Protobufs are nice, but they’re a heavyweight solution on the client side.

Using CMake

Like most build systems, CMake is not clearly documented enough for me. I’m going to use libgit2 as an example of something real that is available to everyone. I’m doing this because there’s still not a single build system that’s good enough for general use, at least not when it comes to working on multi-platform projects.

Windows

You’ll almost always be using CMake to generate and use Visual Studio projects, although you have a choice of:

  • Makefile: MinGW, MSYS, NMake
  • Visual Studio projects (6-12)
  • nmake

Let’s start with Visual Studio projects, since that’s the common case.

Visual Studio generation

Grab the libgit2 source. Since I’m going to build for PyGit, I want a specific tag for compatibility. I definitely don’t want the development branch :)

> git clone git@github.com:libgit2/libgit2.git
> cd libgit2
> git checkout -b local-v0.20.0 v0.20.0

You’re meant to run CMake from the output folder. This is weird, but whatever. So here’s the naive way to use CMake.

> mkdir build
> cd build
> cmake ..
-- Building for: Visual Studio 12
-- The C compiler identification is MSVC 18.0.21005.1
-- Check for working C compiler using: Visual Studio 12
-- Check for working C compiler using: Visual Studio 12 -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
zlib was not found; using bundled 3rd-party sources.
-- Found PythonInterp: C:/Python27/python.exe (found version "2.7.6")
-- Configuring done
-- Generating done
-- Build files have been written to: C:/projects/git/github/libgit2/build

Of course, this auto-picks a Visual Studio toolchain, and since this is Windows, it won’t use the toolchain found in my path (the one I very carefully put there) – it’s actually uncommon for the Visual Studio toolchain to be in the path. CMake defaults to the newest version it finds, and while that’s a reasonable thing to do, I need to be specific. So you need to tell CMake about the toolchain.

> mkdir build
> cd build
> cmake -G "Visual Studio 11" ..
-- The C compiler identification is MSVC 17.0.61030.0
-- Check for working C compiler using: Visual Studio 11
-- Check for working C compiler using: Visual Studio 11 -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
zlib was not found; using bundled 3rd-party sources.
-- Found PythonInterp: C:/Python27/python.exe (found version "2.7.6")
-- Configuring done
-- Generating done
-- Build files have been written to: C:/projects/git/github/libgit2/build

It doesn’t look like it’s possible to hard-code a generator in the CMakeLists.txt file. A pre-seeded CMakeCache.txt file can go into the target folder, but there’s a chicken-and-egg issue there.

In CMake, there is a distinction between “generator” and “toolset”. E.g. you can use the “Visual Studio 12” generator but have it create projects that use the Visual Studio 11 toolset (“v110”):

> cmake -G "Visual Studio 12" -T "v110" ..

Up until now, all we’ve done is create a Visual Studio project file. While that’s useful, we actually want some built libraries and binaries.

You can build from the command-line like so:

> cmake --build .

(assuming you were in the build directory). With a Visual Studio generator, this builds the Debug configuration by default. Release is a configuration, not a target, so if you do this

> cmake --build . --target Release

you’ll get an error; to select a configuration with a multi-config generator, the flag is --config:

> cmake --build . --config Release

You’ll also find that cmake drives devenv, when it should now be using msbuild to be a good Windows citizen. CMake is great for creating cross-platform projects, but less good as an actual build tool. So you’ll want to directly use MSBuild.

> msbuild libgit2.sln /t:Build /p:Configuration=Release;Platform=Win32

And now I have libraries and binaries in libgit2/build/Release. If you really want to use devenv (against Microsoft’s desires, but what the heck), then

> devenv libgit2.sln /build Release /project ALL_BUILD

There is nothing that mandates the output folder being named build; it’s merely a convention.

Macintosh

TBD

Linux

TBD

General comments

Once you’ve generated makefiles with a specific generator, you can’t change the generator. You need to wipe the build folder, or pick a new build folder. So for doing cross-platform builds on a single machine, you’ll want some consistent naming for multiple build folders.

CMake likes to generate projects for a single architecture.

> cmake -G "Visual Studio 12 Win32" ..
> cmake -G "Visual Studio 12 Win64" ..

I don’t know how to generate a multi-architecture project. Or rather, the CMake philosophy is to use multiple build directories with a single source tree, and from the architecture of CMake it sounds like multi-architecture projects just won’t be possible.

Reference

CMake documentation


Specific platforms

Examples


Timestamps

This is an incomplete rant. I’ll return to this at some point, when I have a more concrete proposal for better handling of time values in programs and data.

Notes on using time

ZIP archives

The base ZIP format stores a file entry last-modified date-time as 4 bytes: epoch is 1 Jan 1980, 2 bytes for time, 2 bytes for date, in MS-DOS format, including FAT’s 2-second resolution. There is no idea of timezone in basic ZIP files, so timestamps are ambiguous; you must know how the creator of the archive stored time values. Some ZIP programs store time as UTC, but this is not mandated anywhere. The original PKWARE ZIP program used local time.

There are several extensions that record time in a more meaningful way (7-Zip stores NTFS timestamps by default, for example).

  • NTFS Extra Field (0x000a), which stores NTFS-compatible mtime, atime and ctime (8-byte values, UTC, epoch is 1 Jan 1601, 100-nanosecond tick).
  • UNIX Extra Field (0x000d), which stores Posix-compatible mtime and atime (4-byte values, UTC, epoch is 1 Jan 1970, 1-second tick).
  • third-party extended timestamp (0x5455), Posix-compatible mtime, atime and ctime.
  • Info-ZIP Unix extra field (0x5855) – obsolete, similar to 0x5455 in layout.

Without one of these extended fields that mandate UTC, you have to guess at what the timestamps mean. It’s probably best to assume UTC by default anyway, and then have some way to manually tweak times as needed.
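As a sketch of what reading one of these looks like: extra-field data is a sequence of little-endian (header ID, data size) records followed by payload, and in the 0x5455 field bit 0 of the flags byte marks an mtime as present. The function name here is my own; it can be fed the raw `extra` bytes from a `zipfile.ZipInfo`:

```python
import struct

def extended_mtime(extra):
    """Return the POSIX mtime (UTC) from a ZIP extra-field blob, or None.

    Scans the little-endian (header-id, size) records for the 0x5455
    extended-timestamp field; bit 0 of its flags byte marks mtime present.
    """
    i = 0
    while i + 4 <= len(extra):
        header_id, size = struct.unpack_from('<HH', extra, i)
        if header_id == 0x5455 and size >= 5 and extra[i + 4] & 1:
            # mtime immediately follows the flags byte, as a signed 4-byte value
            return struct.unpack_from('<i', extra, i + 5)[0]
        i += 4 + size
    return None
```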

See

  • http://www.opensource.apple.com/source/zip/zip-6/unzip/unzip/proginfo/extra.fld
  • http://en.wikipedia.org/wiki/Zip_(file_format)
  • http://www.pkware.com/documents/casestudies/APPNOTE.TXT
  • https://users.cs.jmu.edu/buchhofp/forensics/formats/pkzip.html

MS-DOS time format

The only remaining relic of this is in ZIP files.

The epoch starts at 1 Jan 1980.

Both date and time are 16-bit unsigned values, packed as follows:

  • date: YYYYYYYM MMMDDDDD (7 bits for year, 4 bits for month, 5 bits for day)
  • time: HHHHHMMM MMMSSSSS (5 bits for hour, 6 bits for minute, 5 bits for second)

The day of the month is in the range 1-31. Month is in the range 1-12. Year is in the range 0-127 (1980 to 2107).

Seconds are stored as 0-29 and multiplied by two, i.e. 0-58 (this is where the 2-second time resolution comes from). Minutes are in the range 0-59. Hours are in the range 0-23.
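The decoding falls straight out of the bit layout above; a minimal sketch:

```python
def unpack_dos_datetime(dos_date, dos_time):
    """Decode 16-bit MS-DOS date and time words into (Y, M, D, h, m, s)."""
    year   = 1980 + ((dos_date >> 9) & 0x7F)   # 7 bits, offset from 1980
    month  = (dos_date >> 5) & 0x0F            # 4 bits, 1-12
    day    = dos_date & 0x1F                   # 5 bits, 1-31
    hour   = (dos_time >> 11) & 0x1F           # 5 bits, 0-23
    minute = (dos_time >> 5) & 0x3F            # 6 bits, 0-59
    second = (dos_time & 0x1F) * 2             # 5 bits, stored halved
    return (year, month, day, hour, minute, second)
```

For example, 24 Mar 2010 03:14:36 round-trips as the words (15480, 6610).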

A rant

I hope there is a special circle in hell for all the engineers who designed and wrote code that deals with timestamps – not elapsed time, but “absolute” time. The APIs all suck, there is mass confusion over something that really is quite simple, and users pay for it (Windows users suffered for many years when daylight-saving changes happened, because file times recorded around that point jumped forwards or backwards in time).

In one respect, it’s really simple. There is a global timescale, it runs linearly, and you can do simple math on it.

But it’s complicated by the fact that we don’t work with global time, we work with local time.

There’s the complexity of relativity (two observers see their local time as the global time, and see each others’ local time differently). But while that’s important in some kinds of time measurement, it’s not the pain most of us deal with.

I’m referring to the idea of local offsets. Because we inherited ways of recording time values, our idea of a time value has some sort of offset in it. In fact, it has multiple offsets, and it can even run non-linearly.

The most common offsets are known as time zone and calendar. Instead of working with a timestamp like 1269400476, we call it “Wed Mar 24 03:14:36 2010”. The time zone affects the value by some number of hours (perhaps fractional), and the calendar affects what the timestamp actually means.

That still doesn’t complicate things all that much. We could trivially convert between absolute time T and some local time t, or some vector of local times t0..tn. It’s just math

T = t1*a1 + t2*a2 + … + tn*an

What makes it complicated is that we don’t actually attach unambiguous meanings to our recorded time values. We neglect to say whether we have a global time or a local time, and we neglect to say which local time we are referring to.

Now, there’s definitely a circle of hell for the people who mess with the linearity of the timescale by adding leap seconds. This was a “clever” idea to keep local time in sync with the rotation of the Earth – we decided that the Earth rotates once every 24 hours, and since the rotation is slowing down, we need to add a stray second here and there so that our clocks don’t drift relative to the Earth’s rotation. Of course, that makes it much more complicated to do math on time values. It doesn’t introduce discontinuities per se; it’s simply that some days have 86401 seconds in them instead of 86400. But we stopped doing that – for now.

The way back to sanity is to convert all your input time values into the global time, and exclusively work in global time, and then only convert back to some local time for display. Never store local time values.

Of course, there is currently no good global time. UT isn’t a global time.

References

http://www.epochconverter.com/

Using Visual Studio toolchains

This is a collection of information about how to use Visual Studio toolchains from command-lines or from other build systems. It’s probably also useful for people who want to know how things are configured – because when something is broken, you either fix it, or reset and start over.

I’m also only going to cover Visual C++, since that’s what I care about. And this is a little disjoint, but it is a blog post, after all – I’ll try to turn it into actual documentation at some point. Or rather, this is half a blog post, since I’m going to update it multiple times.

Visual Studio 2013

This is also known as Visual Studio 12.

Default install path: C:\Program Files (x86)\Microsoft Visual Studio 12.0\

Location of vcvarsall.bat: $(VSTUDIO)\VC\vcvarsall.bat. This is useful to read or run because it sets up all the environment variables needed to run the tools from the command line. I’m presuming that the Visual Studio IDE does something equivalent.
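A common way to consume vcvarsall.bat from another build system (and roughly what SCons’s MSCommon code does) is to run it through cmd, dump the resulting environment with set, and parse that. A sketch, with the install path hard-coded as an assumption and the function names my own:

```python
import subprocess

# Assumed default install location for Visual Studio 2013.
VCVARSALL = r'C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\vcvarsall.bat'

def parse_set_output(text):
    """Parse cmd's `set` output (one NAME=value per line) into a dict."""
    env = {}
    for line in text.splitlines():
        name, sep, value = line.partition('=')
        if sep and name:
            env[name] = value
    return env

def vc_environment(arch='x86'):
    """Run vcvarsall.bat for the given arch and capture the environment it sets."""
    out = subprocess.check_output(
        'cmd /s /c ""%s" %s && set"' % (VCVARSALL, arch),
        universal_newlines=True)
    return parse_set_output(out)
```

From the returned dict you can diff against os.environ to see exactly which variables the batch file touched, which is how the listings below were produced by hand.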

Environment variables

Pre-existing

These already existed in my environment, but were updated by vcvarsall.bat.

ChocolateyInstall=C:\Chocolatey
CommonProgramFiles=C:\Program Files\Common Files
CommonProgramFiles(x86)=C:\Program Files (x86)\Common Files
CommonProgramW6432=C:\Program Files\Common Files
ComSpec=C:\windows\system32\cmd.exe
PROCESSOR_ARCHITECTURE=AMD64
ProgramData=C:\ProgramData
ProgramFiles=C:\Program Files
ProgramFiles(x86)=C:\Program Files (x86)
ProgramW6432=C:\Program Files
PSModulePath=C:\windows\system32\WindowsPowerShell\v1.0\Modules\
SystemDrive=C:
SystemRoot=C:\windows
TEMP=C:\Users\bfitz\AppData\Local\Temp
TMP=C:\Users\bfitz\AppData\Local\Temp
windir=C:\windows
windows_tracing_flags=3
windows_tracing_logfile=C:\BVTBin\Tests\installpackage\csilogfile.log

Common

These are common to the x86 and amd64 toolchains.

ExtensionSdkDir=C:\Program Files (x86)\Microsoft SDKs\Windows\v8.1\ExtensionSDKs
Framework40Version=v4.0
FrameworkVersion=v4.0.30319
INCLUDE=
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\INCLUDE;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\ATLMFC\INCLUDE;
  C:\Program Files (x86)\Windows Kits\8.1\include\shared;
  C:\Program Files (x86)\Windows Kits\8.1\include\um;
  C:\Program Files (x86)\Windows Kits\8.1\include\winrt;
LIBPATH=
  C:\Program Files (x86)\Windows Kits\8.1\References\CommonConfiguration\Neutral;
  C:\Program Files (x86)\Microsoft SDKs\Windows\v8.1\ExtensionSDKs\Microsoft.VCLibs\12.0\References\CommonConfiguration\neutral;
Path=
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\CommonExtensions\Microsoft\TestWindow;
VCINSTALLDIR=C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\
VisualStudioVersion=12.0
VSINSTALLDIR=C:\Program Files (x86)\Microsoft Visual Studio 12.0\
WindowsSdkDir=C:\Program Files (x86)\Windows Kits\8.1\
WindowsSDK_ExecutablePath_x64=C:\Program Files (x86)\Microsoft SDKs\Windows\v8.1A\bin\NETFX 4.5.1 Tools\x64\
WindowsSDK_ExecutablePath_x86=C:\Program Files (x86)\Microsoft SDKs\Windows\v8.1A\bin\NETFX 4.5.1 Tools\

x86-specific

These are specific to x86 toolchains.

DevEnvDir=C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\
FrameworkDir=C:\windows\Microsoft.NET\Framework\
FrameworkDIR32=C:\windows\Microsoft.NET\Framework\
FrameworkVersion32=v4.0.30319
LIB=
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\LIB;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\ATLMFC\LIB;
  C:\Program Files (x86)\Windows Kits\8.1\lib\winv6.3\um\x86;
LIBPATH=
  C:\windows\Microsoft.NET\Framework\v4.0.30319;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\LIB;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\ATLMFC\LIB;
Path=
  C:\Program Files (x86)\MSBuild\12.0\bin;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\BIN;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\Tools;
  C:\windows\Microsoft.NET\Framework\v4.0.30319;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\VCPackages;
  C:\Program Files (x86)\HTML Help Workshop;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\Team Tools\Performance Tools;
  C:\Program Files (x86)\Windows Kits\8.1\bin\x86;
  C:\Program Files (x86)\Microsoft SDKs\Windows\v8.1A\bin\NETFX 4.5.1 Tools\

amd64-specific

These are specific to amd64 toolchains.

CommandPromptType=Native
FrameworkDir=C:\windows\Microsoft.NET\Framework64
FrameworkDIR64=C:\windows\Microsoft.NET\Framework64
FrameworkVersion64=v4.0.30319
LIB=
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\LIB\amd64;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\ATLMFC\LIB\amd64;
  C:\Program Files (x86)\Windows Kits\8.1\lib\winv6.3\um\x64;
LIBPATH=
  C:\windows\Microsoft.NET\Framework64\v4.0.30319;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\LIB\amd64;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\ATLMFC\LIB\amd64;
Path=
  C:\Program Files (x86)\MSBuild\12.0\bin\amd64;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\BIN\amd64;
  C:\windows\Microsoft.NET\Framework64\v4.0.30319;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\VCPackages;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\Tools;
  C:\Program Files (x86)\HTML Help Workshop;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\Team Tools\Performance Tools\x64;
  C:\Program Files (x86)\Microsoft Visual Studio 12.0\Team Tools\Performance Tools;
  C:\Program Files (x86)\Windows Kits\8.1\bin\x64;
  C:\Program Files (x86)\Windows Kits\8.1\bin\x86;
  C:\Program Files (x86)\Microsoft SDKs\Windows\v8.1A\bin\NETFX 4.5.1 Tools\x64\
Platform=X64

Note that many of these are disjoint, so could probably be set in a unified environment.

Some pre-existing environment variables were touched. I think it decided to make sure they were correct, because in one case it’s clear that the previous environment variable was incorrect. Before I ran vcvarsall.bat, I had these

CommonProgramFiles=C:\Program Files (x86)\Common Files
PROCESSOR_ARCHITECTURE=AMD64

which is clearly wrong. I also had this:

PROCESSOR_ARCHITECTURE=x86
PROCESSOR_ARCHITEW6432=AMD64

which turned to this for x86

PROCESSOR_ARCHITECTURE=AMD64

and this for amd64

PROCESSOR_ARCHITECTURE=AMD64

I’m guessing this is supposed to be recording the architecture of the host machine, not the toolchain target. This

http://blog.differentpla.net/post/38

indicates that I’m doing something wrong: I’m using 64-bit tools from a 32-bit cmd.exe. So now this makes slightly more sense. Except Task Manager says otherwise; it says I’m not running 32-bit cmd.exe processes (there’s a *32 annotation on 32-bit processes). So was my machine set up incorrectly? Something to look into down the road.

Fun with Git tags

Git tags have several uses to me.

There’s the classic use of “here’s something we released in the past”. It doesn’t need a branch, because it’s no longer under development, but you may need to refer to it at some point. Presumably you have some regular patterns for naming tags, and perhaps you use annotated tags to contain release information.  It’s just good release practice to have branches be for active development only, because you can always create a branch from a tag if you need to start doing work on it again.

There’s another use of “I have some dead/obsolete development work, but I’d still like it to stick around in the permanent record”. I prefer this to spelunking in the reflog, because sooner or later you’ll garbage-collect, and if there are no live references to commits, those commits will go away. Obviously, you should not keep actual garbage, but a historical record can be a valuable thing. And when you tire of that history, you can delete it just by removing the tags. I switched to this instead of keeping branches around, and it makes my repos feel a bit cleaner.

Lightweight tags aren’t objects at all; they simply associate a name with a commit. Annotated tags let you add extra information in the form of a tag message, and there are other benefits as well (you can sign tags, for example). Both kinds have their uses. Some projects only use annotated tags – for example, in looking through the Git source itself, it seems like all the tags are annotated. My preference is to just use annotated tags.

There’s one trouble spot when it comes to sharing tags: tags live in a single namespace, unlike branch refs. Since people rarely share tags, this isn’t usually an issue. But if you fetch tags from a remote repository, they go into the same .git/refs/tags location as your local tags. One interesting suggestion I saw was to have a pattern for naming tags based on remotes, so that you can keep your own tags separate from pulled-in remote tags. It isn’t automatic, though; you have to do it manually. There aren’t common workflows yet around sharing tags, as far as I know.

While tags are normally stored in .git/refs/tags, if you look in that directory, you might only see a few tag files. Refs (tags and branches) can be packed into a single .git/packed-refs file for efficiency’s sake, and this works very well for tags, since tag refs normally never change. A ref will get unpacked if it needs to change. This can be done manually with git pack-refs, and git gc will also do it when it runs automatically.

As of Git 1.9.0, git fetch --tags fetches both branches and tags. By itself, git fetch will only get tags referenced by commits that are brought down; it won’t bring down new tags pointing to commits you already have. One downside to git fetch --tags is that it will fetch and replace all tags. Normally this is fine, but it may be dangerous if you have multiple remotes attached to a single repository, especially if those remotes are disjoint. Just keep in mind that you may need to explicitly pull tags in some cases.

See a separate post I have yet to write about git log/git rev-list and proper use of --all, --branches, --tags and --remotes.

Examples

Create an annotated tag (assumes that the tag message is in the file <tagmessage>):

git tag -a release-1.5.1 -F <tagmessage>

Show the tag and/or related commit (for annotated tag, will show the annotated tag and then the commit; for lightweight tag, will show just the commit):

git show release-1.5.1

Show tags in <remote> repository, where <remote> is the name of a remote attached to your local repository:

git ls-remote --tags <remote>

Show the most recent annotated tag on the current branch:

git describe

Push a specific tag (and related objects) to a remote repository:

git push <remote> release-1.5.1

Push all tags not already in the remote repository:

git push <remote> --tags

Delete a tag in the local repository

git tag -d release-1.5.1

Delete a tag in a remote repository (note: this has the same perils as rebasing, others could be depending on this tag, but it’s not bad in and of itself):

git push <remote> :refs/tags/release-1.5.1

Reference

Git: git-tag

Git book: Git Basics – Tagging

Git Tag Mini Cheat Sheet Revisited

Git Tip of the Week: Tags

On the Perils of Importing Remote Tags in Git

Git Data File Formats

Git Internals – Maintenance and Data Recovery

StackOverflow: Git: distinguish between local and remote tags