Monday, December 19, 2016

What is the global cost of Autoconf?

Waiting for autoconf to finish is annoying. But have you ever considered how much time and effort is wasted every day doing that? Neither have I, but let's estimate it anyway.

First we need an estimate of how many projects are using Autotools. 100 is too low and 10 000 is probably too high, so let's say 1000. We also need an assumption of how long an average autoconf run takes. Variability here is massive, ranging from 30 seconds to tens of minutes for big projects on small hardware. Let's say an average of one minute as a round number.

Next we need to know how many times builds are run during a day. Some projects are very active with dozens of developers, whereas most have only one person doing occasional development. Let's say each project has on average 2 developers, and each one does an average of 2 builds per day.

This gives us an estimate of how much time is spent globally just waiting for autoconf to finish:

1000 proj * 2 dev/proj * 2 builds/dev * 1 minute/build =
   4000 minutes ≈ 66 hours

66 hours. Every day. If you are a business person, feel free to do the math on how much that costs.
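The estimate is easy to recompute under different assumptions. A quick sketch, where every input is one of the article's guesses rather than measured data:

```python
# Back-of-envelope estimate of developer time spent per day
# waiting for autoconf. All inputs are rough guesses.
projects = 1000          # projects using Autotools
devs_per_project = 2     # average active developers per project
builds_per_dev = 2       # configure runs per developer per day
minutes_per_run = 1      # average autoconf runtime in minutes

total_minutes = projects * devs_per_project * builds_per_dev * minutes_per_run
print(total_minutes, "minutes ≈", total_minutes // 60, "hours per day")
# 4000 minutes ≈ 66 hours per day
```

Tweak the inputs to taste; even pessimistic choices land in the tens of hours per day.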

But it gets worse. Many organisations and companies build a lot of projects each day. As an example there are hundreds of companies that have their own (embedded) Linux distribution that they do a daily build on. Linux distros do rebuilds constantly and so on. If we assume 10 000 organisations that do a daily build and we do a lowball estimate of 5 dependencies per project (many projects are not full distros, but instead build a custom runtime package or something similar), we get different numbers:

10 000 organisations * 1 build/organisation * 5 dependencies/build * 1 min/dependency = 50 000 minutes = 833 CPU hours

That is, every single day over 800 CPU hours are burned just to run autoconf. Estimating the power draw of those CPUs is difficult, but let's assume an average consumption of 20 watts. This amounts to roughly 17 kWh every day or, assuming 300 working days a year, about 5 MWh a year.
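The same arithmetic for the daily-build case, again using the article's guessed inputs (the 20-watt average draw is an assumption, not a measurement):

```python
# Daily CPU time and energy burned by distro-style daily builds.
orgs = 10_000            # organisations doing a daily build
deps_per_build = 5       # Autotools dependencies configured per build
minutes_per_dep = 1      # average autoconf runtime in minutes
avg_power_w = 20         # assumed average CPU power draw in watts

cpu_hours = orgs * deps_per_build * minutes_per_dep / 60
kwh_per_day = cpu_hours * avg_power_w / 1000
mwh_per_year = kwh_per_day * 300 / 1000   # 300 working days per year

print(round(cpu_hours), "CPU hours/day,",
      round(kwh_per_day, 1), "kWh/day,",
      round(mwh_per_year), "MWh/year")
# 833 CPU hours/day, 16.7 kWh/day, 5 MWh/year
```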

That is a lot of energy to waste just to be absolutely sure that your C compiler ships with stdlib.h.


  1. Not many people use it, but you can cache autoconf test results.
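    For reference, the caching mechanism the commenter alludes to is built into Autoconf itself; a minimal sketch of how it is invoked:

    ```shell
    # -C is Autoconf's shorthand for --cache-file=config.cache:
    # test results are written to config.cache on the first run.
    ./configure -C

    # Subsequent runs with -C read the cached answers instead of
    # re-running every compiler check, so they finish much faster.
    ./configure -C

    # A cache file can also be named explicitly, e.g. to share it:
    ./configure --cache-file=/tmp/shared.cache
    ```

    The cache must be discarded when the toolchain changes, which is one reason it sees little use in practice.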

  2. Ironically, the longest part of a build, for a sufficiently large project, is not the configure step but compilation. I would much rather see Chromium not take 5-6 hours to build than argue over whether the configure step, however implemented, takes 5 or 50 seconds.

    The real cost lies in changing the build system to some niche thing. Not only do your colleagues and downstream users have to learn a new thing, but also be frustrated with the new quirks.

    Which makes it not all that bad.

  3. If Chromium were built with Autotools, its configure step would take ~30 minutes on Windows, because shell scripts are 10-15x slower on Windows than Linux.

    Tests also show that Meson compiles the same code around 2x faster than Autotools (just compilation, ignoring the configure step). See for example

  4. Regarding the number of Autotools projects: currently the Homebrew core for OS X has 3677 packages, of which 1860 are built with Autotools (number estimated by grepping for ./configure). Now, Ubuntu had 73 813 packages at 14.04 LTS, so everyone can jump to their own conclusions.

    1. This comment has been removed by the author.

    2. Too quick. In addition to the packages with a pre-generated ./configure, homebrew-core has 149 packages that are built starting from autogen.sh.

    3. ...and another update: 1776 packages with a pre-made ./configure after removing false wildcard matches. Other systems: 275 packages for CMake, 13 for SCons, 9 for Jam, 3 for Gyp and 1 (libhttpseverywhere) for Meson.