You might recall that exponential growth is really, absurdly fast. This is true of rice on chessboards, of compound interest, and of pandemics, among other things.
Oh, and with computers.
If you’re even slightly interested in computer stuff, you probably have heard about Moore’s Law.
Gordon Moore turned 94 a couple of weeks ago, and has spent his life in silicon and electronics. After a Ph.D. in chemistry and some post-doctoral work in physics, he went to work with Nobel Prize laureate and massive racist William Shockley, who had been one of the inventors of the transistor.
As Shockley was apparently also a terrible manager, Moore and seven other top researchers of the Shockley Semiconductor Lab left and started up their own transistor shop, the Fairchild Semiconductor Corporation. There, they produced the first commercially viable integrated circuit.
In 1968, Moore and Robert Noyce, one of the buddies he had founded Fairchild with (and who had been one of the inventors of the integrated circuit), left to start up a different semiconductor company, this one focused on building memory chips out of transistors: NM Electronics. They didn’t quite like the name, though, and quickly rebranded to “Integrated Electronics”, or Intel for short.
But back in 1965, when he was still the director of R&D at Fairchild, Moore wrote an article where he noticed that the number of components that could fit in an integrated circuit was kind of doubling every year, and extrapolated that would continue for a while. He later revised that from a year to “approximately two years”. This is Moore’s Law, and astonishingly it has been mostly true since then, though the doubling has slowed down in recent years.
Moore’s Law describes exponential growth. If you can fit more components on the same silicon wafer (and you improve how things are connected, and do other smart things along the way), the cost of computing things, even very large things, keeps dropping.
In fact, though the data is not universally accepted, the cost of computation has dropped exponentially too, which means: enormously. For example, you might remember the 1983 nerd cult movie War Games. In that movie, the supercomputer that could simulate both world annihilation and tic-tac-toe was a Cray X-MP. On that machine, the cost of sustaining a billion floating-point operations per second (a GFLOPS, a standard unit of computer performance) was, in today-adjusted money, 48,900,000 US dollars. The same GFLOPS today costs about two US cents.
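Just to get a feel for how big that drop is, here is a quick back-of-the-envelope sketch in Python. It only uses the rough figures quoted above; the 1984 baseline year for the Cray X-MP price and 2023 standing in for “today” are my own assumptions, so read the output as an order-of-magnitude illustration and nothing more.

```python
import math

# Rough figures from the text above; the baseline year and "today" are assumptions.
cost_per_gflops_1984 = 48_900_000   # US dollars per GFLOPS, inflation-adjusted (Cray X-MP era)
cost_per_gflops_today = 0.02        # US dollars per GFLOPS today
years_elapsed = 2023 - 1984         # assumed time span

ratio = cost_per_gflops_1984 / cost_per_gflops_today   # how many times cheaper computing got
halvings = math.log2(ratio)                            # how many times the price halved

print(f"computing is ~{ratio:,.0f} times cheaper")                  # ~2.4 billion times
print(f"that is ~{halvings:.0f} halvings of the price")             # ~31 halvings
print(f"roughly one halving every {years_elapsed / halvings:.1f} years")
```

With those numbers you get a factor of roughly 2.4 billion, or about one price halving every year and a quarter.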
A few years ago, people quipped that their mobile phone had more computing power than the navigation system of the Apollo 11 mission. By now, even the charger of my phone out-computes the Apollo 11 navigation system.
Now, here’s the bit that blows my mind and that I think most people struggle to wrap their heads around.
Let’s assume you can double your computational capacity, for the same price, every two years. This is not quite exact, but not completely off either.
Doubling means that over the next two years we will add exactly as much computing capacity as exists right now, and what exists right now is everything that has been accumulated since the first computer ever in 1946. In other words, the next two years of progress will equal all the progress in computer performance since 1946.
Isn’t that incredible?
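If you want to convince yourself, here is a minimal sketch in Python. The starting unit and the strict two-year doubling are idealized assumptions, not real benchmark data; the point is only the shape of the curve.

```python
# Idealized Moore's Law: one made-up unit of computing power in 1946,
# doubling every two years. Purely illustrative numbers.
capacity = 1
total_progress = capacity

for year in range(1948, 2025, 2):
    gained = capacity        # doubling: we add as much as we already have
    capacity += gained
    total_progress += gained

# The gain over the *next* two years would again be the current capacity,
# which is exactly the sum of everything accumulated since 1946.
print(capacity == total_progress)   # True
```

The trick is the geometric series: 1 + 1 + 2 + 4 + … + 2^(n−1) = 2^n, so each new doubling adds exactly as much as everything that came before it, combined.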