
The Quest for the Perfect Build System

First there were punch cards, and people somehow managed to write software. Then came interactive computing with mainframes and personal computers, and people wrote even more software and became even more productive. There is no doubt that our development environments today are light-years ahead of what the computer pioneers had half a century ago. Yet I constantly see projects suffer from horrible environments that force slow iteration cycles on programmers.

I define an iteration cycle as the time elapsed between making a trivial change and being able to see the results of that change. In particular, I’m concentrating on large-scale (around one million lines of code) C++ projects, which are representative of modern PC and console games today.

Of course, there’s more to fast iteration than just the speed of the build system. How quickly you’re able to get in the game and see the results is a big factor (another reason why I wouldn’t want to live without unit tests). The physical dependencies of your program are going to affect how quickly your code builds. And if you’re always working with a full game executable that takes forever to link, your iteration times are going to be shot (yet another reason to use unit tests!).

But let’s put that aside and concentrate on the build system itself.

Fast iteration is about more than just time and speed. It’s also about how you feel about the code and what you dare do with it. When things are slow and painful, you’re going to be a lot less likely to try new things, or fix one last thing to clean up the code, or refactor something out of a header file into its own module. Over time, this accumulates into cruft, hacks, and unmaintainable code. It’s also about not breaking the flow, the mental state you’re in while you’re writing software. Interruptions of more than just a few seconds are much more detrimental than their time value alone.

When you add test-driven development to the mix, fast iteration becomes even more crucial. With test-driven development, you end up doing micro-cycles of modify-compile-test, sometimes several times per minute. Unless you have very fast build times, you’re dead in the water.

Before we go any further, let’s crunch a few numbers, not so much to show specific improvements as to have a reference point going forward. Think about the project you’re currently working on. How long does it take to build when you modify a single cpp file (or even no files at all)? I’ve seen projects that took over two minutes to build, and anywhere from 30 seconds to one minute is fairly typical. Let’s say 30 seconds for this example. How often do you need to do a build? It’s not very fast, so maybe once every 5 minutes, 8 hours a day. That’s 96 builds at 30 seconds each, which adds up to a staggering 48 minutes per day, or 10% of your full work day!

Now reduce the build time to two seconds instead. And to make things more interesting, let’s do a build every two minutes. That’s 240 builds at 2 seconds each, or just 8 minutes per day. Most importantly, each build feels almost instant, so it doesn’t bring you out of the flow state. That’s what our goal should be for a build system.

The Goals

As I started looking into different build systems (there are a lot of them out there!), I noticed that they have very different sets of goals, and a lot of them are fairly irrelevant to my particular needs.

As a game developer, I work with a varied but limited number of platforms (most of which are not supported out of the box by build tools anyway). I’m also not planning on releasing the source code any time soon to let players compile the game on any platform, so I have no need for a build system that automatically detects all the correct settings and does the right thing on every possible platform.

Some other build systems come with features that are useless to me, such as access to version control or the ability to send emails. I consider those types of tasks to be totally beyond what the core build system should do, and I prefer to have those features in a wrapper system that does build/test/deployment by calling the build system itself.

So what exactly do I want in a build system?

  • Super-fast incremental builds (around two seconds). This is the key to fast iteration and I want this at almost any cost.
  • Customizable. I’m going to be using unusual compilers and environments. I want to be able to easily set my own rules and actions and not be tied to any particular platform or compiler.
  • Correctness. I want the build system to build the minimal number of files and still do the right thing under most normal circumstances.
  • Multiprocessor support. I also care about the speed of a full build, and with multiprocessor machines finally becoming popular, using multiple processors is a great way to speed up build times.
  • Scalable. I want all the related source code to be tied to the same build system. I don’t want to have to create a separate build file for every minor tool just so it builds at a reasonable speed. I’d like to simply build any target and have the minimal number of files rebuilt.

The Contenders

Visual Studio .NET

Most game developers doing Windows or Xbox development will be familiar with this build system. I’ve been stuck with it for many years, and while things were not great in Visual Studio 6.0 and earlier versions, it became decidedly unusable when it turned into “.NET”. I don’t know what happened, but I suspect that in their effort to cram all those languages under a single IDE they crippled the C++ build system even more.

My main gripe is how slow Visual Studio .NET is when doing an incremental build on a solution with many projects. It takes roughly a second per project, so if you have 50 projects, there goes almost a full minute for nothing. That’s simply unacceptable, so I’ve always had to work around it by creating many solutions with the minimum number of projects necessary, or by keeping the dependencies in my head and forcing builds by hand. Alternatively, you could throw all the code into a couple of projects instead of breaking it up into many different ones, but that’s like jumping off a cliff to avoid being stung by a bee.

Because Visual Studio mixes the build system with the visual representation of the files, large solutions are not only slow to build, but are a positive pain to work with, making browsing the source code extremely difficult.

The problems with Visual Studio don’t end there. The solution and project files are a pain to generate, they’re full of magical GUIDs and references to the registry, and they’re extremely verbose. The .NET framework offers an API to create those files, but the fact remains that they’re much more complex to create than any of the other build systems.

If you don’t generate the project files programmatically and instead maintain them by hand, you soon enter configuration hell. Anybody who’s had to make sweeping changes to lots of projects with multiple configurations through the IDE will know what a painful process I’m talking about.

Some of my other complaints are not being able to easily set the build order for different projects (it appears to be determined by the order in which they appear in the solution file), and the fact that setting a dependency between two projects forces implicit linking.

All in all, Visual Studio seems well suited only to small, toy projects: a couple of libraries and a few thousand lines of code. Anything bigger than that and it becomes an exercise in frustration.

On the bright side, it is possible to use “makefile projects” in Visual Studio, which completely bypass Visual Studio’s own build system and simply call an external command to build the project. Visual Studio also offers a command-line interface, so at least it is possible to do builds from the command line without launching the IDE. Finally, there are some third-party plug-ins that can speed up the dependency checking, which can help with some of the problems (unfortunately, FastSolutionBuild doesn’t seem to have a command-line interface, which renders it useless for my needs).

Other third-party add-ons like Incredibuild claim to speed up full builds, but they do so at the expense of incremental build speed, which I consider much more important.

Make

Make is the granddaddy of all the build systems. With a distinguished history of over 20 years, it has certainly proved its worth in the real world by building hundreds of thousands of projects over the years.

But make is far from perfect; otherwise there would be no need for other build systems. However, things are not as bad as people make them out to be, especially with modern versions of make (GNU make for example). The lack of portability is not an issue for us because we can just trivially write a new makefile, or a variant, for every platform we support.

The claims about make not being scalable are more serious. The article “Recursive Make Considered Harmful” certainly did much harm to make’s reputation. Specifically, that paper claims that it’s very hard to get the order of recursion right because using recursive make has no global project view. I think that’s only true if you’re dealing with self-modifying source code files or autogeneration of source code. If that’s not the case, I can’t see how the order of recursion can matter at all as long as the dependencies are met.

As for the claim that make is slow, let’s put that off until we compare it to the other build systems.

Personally, I really like make. It’s small, clean, and elegant. It does one thing and it does it very well. It’s easy to extend and modify. At first I was surprised that make didn’t do implicit dependency checking of C files (building a C file when an included header file changes), but it fits perfectly with the simplicity of make. It’s just a dependency graph with actions. If you don’t tell it about a rule, it won’t know about it. Fortunately, we can use tools like makedepend to generate the C dependencies with extreme ease.

One of my only gripes with make is the silly tab syntax. The fact that action commands have to be preceded by a tab character is awkward and out of place today, but it’s a small quirk I’m willing to adapt to. Fortunately, GNU make’s error messages are very clear, and it even asks, “Did you forget to put a tab before the command?”

Jam

Jam was born as an improved make. It tried to keep all the good things about make and fix all the problems. And to a large extent, it succeeded.

Like make, Jam is small and easily portable. It deals better with inter-project dependencies by avoiding recursive Jam invocations (while still allowing individual sections to be built separately). The Jam language, even though it’s still fairly restrictive, is more expressive than make’s, making it easier to write complex functionality.

One of the main differences from make is that Jam actually provides a fair amount of base functionality in its Jambase file. Jam out of the box knows about some of the most popular development environments and languages (including implicit dependency checking for C/C++ files), so it simplifies the build files for the simple cases. In the other cases, you can add your own rules and actions very easily.

I find it funny that while Jam fixes make’s weird tab requirement, it adds its own weird “space semicolon” command terminator (although I know some programmers who think that “space semicolon” is the one true way). Either way, it really doesn’t matter since it’s such a small thing.

Jam has also spawned some forks that add extra functionality but are fully backwards compatible: FTJam and BoostJam.

Scons

Scons has been hailed as the next step in the evolution of build systems. It is supposed to be a much-improved make-like system, not only written in Python, but using Python as the language in which builds themselves are defined. Python is a fully general, object-oriented language, so it’s extremely expressive. It also has the advantage of being a well-established language with a great set of documentation, debuggers, and tools, which can make creating and debugging complex build scripts easier.

Scons also claims to be extremely accurate when it comes to determining which files need to be built. It doesn’t rely on the timestamp of a file; it uses the MD5 signature instead (a checksum of the file’s contents). Another very intriguing feature I didn’t get around to testing is the network cache of built object files.
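
For flavor, here is roughly what a build description looks like in Scons. This is a minimal sketch, and the “lib_0” library name and directory layout are invented for this example:

    # SConstruct -- a minimal sketch of a Scons build description.
    # The "lib_0" library name and directory layout are hypothetical.
    import glob

    env = Environment(CPPPATH=['.'])   # a construction environment holds tools and flags

    # Build a static library from every .cpp file in the directory. Scons scans
    # the sources for #include lines on its own, and it decides what to rebuild
    # from MD5 content signatures rather than file timestamps.
    env.Library('lib_0/lib_0', glob.glob('lib_0/*.cpp'))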

Walking into this test I was a bit afraid of what I might find. I had read some reports of several people having problems with Scons performance on large data sets. However, the latest version (0.96.90), released just a couple of months ago, is supposed to have some performance improvements.

The Method

As a test, I decided to run each of the different build systems on the same codebase. Instead of using a real-world codebase, with its own set of quirks and problems (and the difficulty of easily building it with the different systems), I wrote a script to generate a simple C++ codebase. The structure is based on what I expect to see in my own work: many different projects. The physical dependencies in the generated codebase are extremely well contained, and header files never include other header files. Real codebases would have more complicated dependencies, which would make the tendencies we see here even more exaggerated.

The specific parameters I used for this test were:

  • 50 static libraries
  • 100 classes (2 files per class, .h and .cpp) per library
  • 15 includes from the same library in each class
  • 5 includes from other libraries in each class

This is by all accounts still a small or at most medium-sized codebase. A full game engine and tools can easily become much larger than this.

Thinking back, I really should have done the test with at least 100 libraries, not 50, because all my libraries have an extra associated project for unit tests. No big deal; I don’t think it would have changed the results very much. The important thing was to generate enough code to make the measurements noticeable (if we just build 10 files, every build system is going to be really snappy).
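
To give an idea of what the generator does, here is a stripped-down sketch. This is not the actual generate_libs.py (names and details are simplified, and the real script also emits the build files for each system), just the shape of the idea:

    # Hypothetical sketch of a codebase generator in the spirit of
    # generate_libs.py; simplified, and it does not emit the build files.
    import os
    import random

    NUM_LIBS = 50            # static libraries
    CLASSES_PER_LIB = 100    # 2 files per class, .h and .cpp
    INTERNAL_INCLUDES = 15   # includes from the same library in each class
    EXTERNAL_INCLUDES = 5    # includes from other libraries in each class

    for lib in range(NUM_LIBS):
        os.makedirs('lib_%d' % lib, exist_ok=True)
        for cls in range(CLASSES_PER_LIB):
            name = 'class_%d_%d' % (lib, cls)
            # Headers never include other headers, keeping dependencies contained.
            with open('lib_%d/%s.h' % (lib, name), 'w') as header:
                header.write('void %s_function();\n' % name)
            with open('lib_%d/%s.cpp' % (lib, name), 'w') as source:
                source.write('#include "%s.h"\n' % name)
                for other in random.sample(range(CLASSES_PER_LIB), INTERNAL_INCLUDES):
                    source.write('#include "class_%d_%d.h"\n' % (lib, other))
                for _ in range(EXTERNAL_INCLUDES):
                    other_lib = random.randrange(NUM_LIBS)
                    other_cls = random.randrange(CLASSES_PER_LIB)
                    source.write('#include "../lib_%d/class_%d_%d.h"\n'
                                 % (other_lib, other_lib, other_cls))
                source.write('void %s_function() {}\n' % name)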

For each of the build systems, I measured three operations:

  • Full rebuild: Compiling the full source code for the first time. I didn’t expect this time to change much at all from build system to build system, or even across platforms. I was quite wrong!
  • Incremental build: Doing another build without any changes. This is the really interesting measurement that will tell us a lot about the potential for fast iteration.
  • Incremental build on a single library: Building a single library without any changes.

I did the measurements under both Linux (2.6 kernel) and Microsoft Windows XP for the different build systems. Clearly, some build systems only run on one platform (Visual Studio), but I decided to run some of the other build systems under Windows as well to provide a fairer comparison.

The specific hardware I ran these tests on is not that important, since all we’re comparing are the systems’ relative merits. But for the curious, it’s a P4 2.8 GHz CPU with hyperthreading, 2 GB of fast RAM, and a 7200 rpm EIDE hard drive. The most important part is that I had enough memory to prevent thrashing.

GNU make, Jam, and Scons all support parallel builds. While that won’t speed up incremental builds any, it can reduce the time for full builds dramatically. Since this test was done on a single-CPU machine (and the primary measure was incremental builds), I restricted all the builds to use a single process.

The Results

System         Compiler  Platform    Full build  Incremental  Incremental lib
Visual Studio  VC++      Windows XP  7m 28s      0m 54s       0m 4s
Make           g++       Linux       2m 21s      0m 2.4s      0m 0.0s
Jam            g++       Linux       2m 42s      0m 1.6s      0m 0.1s
Jam            VC++      Windows XP  6m 52s      0m 3.1s      0m 0.3s
Scons          g++       Linux       5m 31s      1m 02s       0m 16s
Scons          VC++      Windows XP  8m 02s      0m 55s       0m 8s

We can make lots of very interesting observations from this table.

First of all, it confirms what I had seen all along, that Visual Studio is horrible for incremental builds with many projects. My off-the-cuff estimate of one second per project ended up being extremely accurate (54 seconds for 50 projects). That’s simply not acceptable for me.

As I feared, Scons failed the fast-iteration test as well. It actually ended up being slower than Visual Studio on all counts, even for individual library rebuilds. It might do the “right” thing under all conditions, but frankly, that’s not a price I’m willing to pay for absolutely correct results. I can’t think of any situation in my everyday work in which Scons would do the right thing and make or Jam wouldn’t.

At this point, I was afraid that I just wasn’t going to be able to get the type of iteration I wanted out of file-based, compiled languages. Fortunately that’s not the case. Both make and Jam do a great job and fall in the range of what I consider acceptable (around a couple of seconds).

There are two interesting observations to be made about the full build times in the chart above. First of all, Scons with g++ under Linux is twice as slow as Jam or make for a full rebuild. I find that extremely surprising, although I suspect it’s the extra minute of dependency checking plus some extra overhead of its own. I tried some of the suggestions for getting faster Scons builds (trading off accuracy for speed), but they only improved incremental build times by a couple of seconds. Clearly, Scons needs to do some catching up before it can play with the big boys.

The other observation comes from comparing g++/Linux with VC++/Windows XP. Jam is over twice as slow with VC++ under Windows XP as it is with g++ under Linux. Is it Windows XP or is it Visual C++? I don’t know. It would be interesting to repeat the experiment with g++ or some other compiler under Windows and see if the times drop at all. I suspect the Windows file system might have something to do with it.

Conclusion

This little experiment cleared up a lot of doubts for me. I’m ready to ditch Visual Studio as a build system and replace it completely with Jam or make. Make is simpler, but Jam probably edges it out because it’s a bit nicer, it doesn’t have any recursion problems, and the default functionality is pretty handy. It’s hard to go wrong with either one.

Since most programmers still expect to work from within the Visual Studio IDE, you can easily create a “makefile” project type and hook it up to the build system of your choice.

One interesting idea that came up during this research in the Scons mailing list is that of a background process that monitors which files change and updates dependency graphs on the fly. So whenever you initiate a build, all the work has already been done and the build can start right away. A variation on this idea that has been brought up in some TDD mailing lists is that of the build system not just computing dependencies in the background, but actually attempting to compile the code and run the unit tests in the background. If any of the tests fail, they can even be highlighted in the source code editor. Sort of like an on-the-fly, smart code checker on steroids.
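
Here is a rough sketch of the first idea. It’s hypothetical: simple mtime polling stands in for a real file-change notification API, and Jam is assumed to be the build command:

    # Hypothetical sketch of a background build watcher: poll the source tree
    # and kick off an incremental build the moment a file is saved.
    import os
    import subprocess
    import time

    def snapshot(root):
        # Map every source file to its last-modified time.
        stamps = {}
        for dirpath, _, filenames in os.walk(root):
            for filename in filenames:
                if filename.endswith(('.cpp', '.h')):
                    path = os.path.join(dirpath, filename)
                    stamps[path] = os.path.getmtime(path)
        return stamps

    previous = snapshot('.')
    while True:
        time.sleep(1.0)
        current = snapshot('.')
        if current != previous:
            previous = current
            # Something changed: build now, so the results are already there
            # by the time the programmer looks for them.
            subprocess.call(['jam'])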

Of course, we could also choose a language with much smaller build times. I haven’t worked on a large-scale C# project yet, but the small tools I’ve created have impressed me with how fast the iteration can be. The same can be said for scripting languages such as Python or Lua. Unfortunately, we’re stuck with C++ for the foreseeable future in game development, so we’d better learn to deal with it the best we can.

For now, I’ll be happy to stick with Jam and two-second incremental builds. Let’s start jamming!

Download: generate_libs.py

30 Comments

  1. What filesystem are you using under Linux? I found a significant increase in build times when I went from ext3 to ReiserFS. I suspect NTFS may be the turd that’s making your build times stink.

    Any chance that you will test Ant too?

    Do you have any experience with RAID and build times? I find the bottleneck a lot of the time is reading from disk and not necessarily the CPU.

  2. I’ve been looking for the perfect build system for years but still haven’t found it.

    Jam was among my favourites but the syntax is too cryptic for my brain to comprehend. I like to employ various mini-languages which compile into C++ and I just couldn’t figure out how to express those rules with Jam. Perhaps I should have been more persistent…

    One of the things I really thought was cool about Jam was that it seemed very good at parallelizing work, even across multiple configurations at once.

  3. Parveen,

    I’m actually using ext3 under Linux. I wonder how much faster it would be with ReiserFS. On the other hand, I’m stuck with Windows XP at work, so there isn’t much I can do there.

    Stefan,

    If you haven’t already, check out the Jambase file that comes with the Jam source code. That shows you all the default rules and actions. I found it a great starting point to add new compilers or completely new rules. I thought it was hard at first, but it turns out to be quite easy. What did you end up using instead, Make?

  4. Could you post the make/jam/scons files you used in your test? I’d like to run them myself and see how they work on my system. I’m unhappy to hear about SCons performance since I was looking to use that on my future projects.

  5. Chris,

    Just download the python script and that will generate both the source code and the build scripts for all the build systems.

  6. Noel,

    I didn’t know there was work being done to check dependencies in the background for scons. I’ve been glancing over at Apple’s Xcode for a while now, and that seems really cool with their predictive compiling (http://tinyurl.com/75flk). I guess the solution is to switch over to MacOS X now (hey, they’re coming to Intel).

    Btw, did you defragment the Windows filesystem before the test? Unix filesystems don’t have the same problem with fragmentation as Windows seems to have. Even though you’re running that evil Linux 🙂

  7. Noel,

    I tried all of the above (and some more) and came to much the same conclusion.

    I’ll keep waiting for someone to get frustrated or wake up and create a build system based on the background process and in-memory dependency graph.

    Patrick

  8. I agree about the world of pain going from VC6 to .NET; however, I changed my mind drastically with .NET 2005, which I find really great and better than VC6 (took them 6 years…). The early .NET versions were an application written from scratch to encapsulate all the languages, hence why C++ suffered. .NET 2005 is a big improvement.

    Did you try or plan on trying with .NET 2005?

  9. In this article, the line “Super-fast incremental builds (around two seconds). This is the key to fast iteration and I want this at almost any cost.” renders the rest of the requirements mere bonuses. In doing so, the article focuses on the developer’s requirements for a build system. For automated builds, reliability will be the top priority. And depending on the complexity of your project, flexibility can be the key, or you simply won’t be able to use the tool at all. For a simple build, like the one in this test, there is no need for scons.

    Scons does lose in the super-fast rebuild role, but it easily wins in reliability (foolproof dependency checking) and flexibility (the Python language).

    Automation is the biggest part of test-driven development, and you need reliable incremental builds to run big projects in Cruise Control. I have multiple complex projects with a lot of generated files (some generators built in the same single run as the generated code), complex dependencies, custom actions, and multiple platforms and configurations. I have yet to find something I can’t do with scons. By choosing scons I gave away those seconds in exchange for hours won across the whole development process and, most importantly, the freedom to do anything.

    This test does show scons’ weak spot. But that is the penalty for correctness and flexibility, which you simply cannot achieve with other tools. Scons is not a one-size-fits-all tool right now. If quick rebuilds are your highest requirement and other tools can handle your build’s complexity, maybe scons is not the best choice; but if you find yourself wishing “If only I could do this!”, or you have to troubleshoot your build weekly, then try scons. Also, scons is in active development right now, and a lot of attention is focused on performance and the requirements of different roles.

  10. “…we’re stuck with C++ for the foreseeable future…”

    The real reduction in iteration time is going to come from transitioning our game logic away from compiled languages. We are only stuck with C++ for most of our development these days because of archaic notions about performance. Iteration time should be at the top of people’s lists these days, and interpreted languages are going to be a huge win there. Move code to C++ only upon determining, with actual measurement, that a given module is too slow (guessing that something is too slow is not really a good way to determine that it is).

    I have spoken with some game developers who are writing the entire game in an interpreted language outside of the middleware and the interpreter. They have managed to get an incredible amount of functionality in a short time.

    This is one of my “One True Ways” of game development. Working primarily in compiled languages is really only a good way to slow yourself down. If you have a decent not-compiled language available then you find you can use it for all sorts of cool things, and even write a quick and dirty console to allow you to affect the game by typing code in while it’s running!

  11. “header files never include other header files”

    Is it really better to use forward declarations than to include all the headers needed by that header? I read your other articles from last year about this and was confused as well. Where do I find more info about when to include a header for another class vs. just a forward declaration?

  12. BTW, I’d also be really interested in your take on staying within the IDE vs. working outside one with respect to debugging. It seems like I can get decent editors to do the coding/building stuff, but I still need to jump back into the IDE for debugging (stepwise and nice displays of variables and stuff). Does this shift negate the effect of using stuff outside the IDE? Or do you customize something like VS.NET to use a different editor and a different build system but use its debugger?

  13. “The real reduction in iteration time is going to come from transitioning our game logic away from compiled languages.”

    Very true. Unfortunately, new game consoles aren’t exactly making it easier to write game code in a high-level interpreted language. In many ways, they have to be dealt with closer to the hardware than some of the previous consoles.

    For PC games on the other hand, there’s no reason not to go down that direction already though.

  14. “Is it really better to use forward declarations than to include all the headers needed by that header?”

    Bill,

    The reason header files don’t include other header files in the codebase created by this script is just simplicity (I would have had to make sure I never had a circular include). Besides, that represents a best case. In the real world, things can only get worse.

    Lakos proposed the best header file inclusion system in his book “Large Scale C++ Software Design”, which I echoed in a past article: http://www.gamesfromwithin.com/articles/0403/000011.html

    The gist of it is that you should minimize includes from a header file, but at the same time a header file should always parse correctly on its own when anybody includes it. I’ve been using that method for years and I think it works great.

  15. “Unfortunately, new game consoles aren’t exactly making it easier to write game code in a high-level interpreted language. In many ways, they have to be dealt with closer to the hardware than some of the previous consoles.”

    Isn’t this “just” a matter, then, of generating sufficiently high-level code in the scripting language so that when the optimization pass comes along and the “low-level interpreted” functionality goes into a compiled language, you can also do interesting things like specifically schedule scripts to maximize cache efficiency and the like?

    I have a real strong feeling that managing the complexity of coding on the next generation of consoles is going to be easier in the long run if we write code that makes it so we don’t have to worry about it. Indeed, using C++ for multithreaded programming is tough enough; on symmetrical n-way machines it requires a lot of boilerplate to be maximally efficient. It seems to me that an intelligent enough scripting engine, coupled with key functionality that the scripts themselves use being aware of the SMP issues can allow scripts to be relatively dumb and the “right thing” will just happen.

  16. One interesting bit I just found in the Nant mailing list:

    “By addmission of an MS consultant, Microsoft have never, and never will use Visual Studio to do any in-house building.”

    I have no idea as to the accuracy of this statement, but it certainly has a ring of truth to it and it fits my suggestion that Visual Studio is only good for “toy” projects.

  17. Noel,

    I built the scons and jam projects on my mac just to see what would happen. I used -j2 on both builds because I have a dual G5 and it’s more fun that way 😉

    I couldn’t build the make project because makedepend is part of the X11 source code which I don’t have and didn’t feel like downloading.

    Jam: clean – 5:37, incremental – 0:04.8

    Scons: clean – 12:22, incremental – 1:22

    I hoped that the dependency checking might take advantage of parallel building, so I tried -j1, -j2, and -j4 with both the Jam and Scons incremental builds; all the same.

  18. Hi,

    I’ve read this article and the one on optimizing the asset pipeline.

    They are both very nice, thank you for that. 🙂

    I’m considering building a pipeline that can be used by artists in a very similar way to what you described, and I also want to be able to do builds for different platforms through a build system. Scons seems quite nice, but after running your tests it proved to be too slow. Jam, as you said, wins in the speed department. My first thought was to use Jam for source and Scons for the content, but if Jam could be used to build data assets easily I would be happier.

  19. Eurico,

    There’s no reason you can’t use Jam for asset builds as well. It certainly seems straightforward enough to write new rules and actions to build your assets. We’ll probably be adopting Jam for assets in my team in the next few weeks.

  20. Hello again,

    Maybe I need to take a deeper look into Jam. I really want the artists to be able to add assets easily without having to learn too much about scripts. Also, in the dependency-checking department, from your experience with Jam, do you think it would be easy to check dependencies between asset files? For instance, say you have an intermediate file in XML that references another file: would it be easy to check whether the referenced file was up to date without writing a specific rule in Jam? With Scons that problem can be solved through a custom “Scanner” that parses your files for dependencies.

    Looks like I still have a lot of experimenting to do with both systems.

  21. > The claims about make not being scalable are more serious. The article “Recursive Make Considered Harmful” certainly did much harm to make’s reputation. Specifically, that paper claims that it’s very hard to get the order of recursion right because using recursive make has no global project view. I think that’s only true if you’re dealing with self-modifying source code files or autogeneration of source code. If that’s not the case, I can’t see how the order of recursion can matter at all as long as the dependencies are met.

    Yes, that article made me stay away from make. Instead we use — VC 🙁

    If make (i.e. GNU make), in general, has no problem with recursion, then it would certainly be much more interesting to use — instead of VC 🙂

    The reasoning is interesting – could you please elaborate!?

    /Tompa

  22. I’d be interested in seeing what the performance times will be with Visual C++ 2005. I know they included it with Visual C++ Express Beta 2 now, and despite previous subscribers to this blog saying it wasn’t going to appear, it appears it will now, so I will be interested to see. I know from personal experience it’s a lot faster.

    http://www.developer.com/net/cplus/article.php/3495506

  23. The build dialogs inside Visual C++ Express have been changed to reference MSBuild; I’m not sure if this means it’ll be using it or not. It seems MSBuild should replace the vcproj files, but it hasn’t.

    Anyway, I do find Visual C++ 2005 faster to compile with and would be interested in timings when it comes out

  24. Glenn,

    I got a copy of Visual Studio 2005 Beta 2, so I’ll give it a try this weekend. I also wanted to try Boost.Build v2 and maybe even Ant, so I might do a mini-article followup with the results.

    From what I understand, though, the build system was improved, but Visual C++ is still not using the underlying MSBuild. It seems we have to wait for the next release for that.

  25. Could you post the Fast Solution Build times, for us GUI-bound people? I’ve been using it with great success, but I’m curious to see real numbers.

  26. I’m almost done collecting data for new build systems (Ant, Nant, MSVC2005, Boost.Build, FastSolutionBuild, etc.) and I’ll put up a followup article in a couple of days.

    In the meanwhile, I’ve also tested FastSolutionBuild, and it performed well: the same build time as MSVC2003 for a full build, and about a 1-second incremental build. Unfortunately, FastSolutionBuild still has too many drawbacks for what I want:

    – No command-line interface (so it’s harder to run from a script before submitting code or on the build machine)

    – No better dependency checking than VC++

    – Only builds a project, not the full solution or a set of projects

    – No complex dependencies (can’t have multiple executables built as part of a build, which is essential for unit tests). Then again, a lot of those drawbacks are simply part of MSVC.

    All that, coupled with the fact that I had lots of problems with it in the past (maybe it’s much better now), doesn’t make me want to go out and use FastSolutionBuild as my primary build system. I can see it being useful for people building from the GUI only and having just one major project with a simple dependency chain though.

  27. another opinion

    “…we’re stuck with C++ for the foreseeable future…”

    Performance is important. Keep an eye on the D language; it is nearing “1.0” status but is quite usable now:

    http://www.digitalmars.com/d/index.htm

    it compiles very fast!
