Moore’s law and the future of computing
Gordon Moore, who predicted 58 years ago that the number of transistors in an integrated circuit would continue to double roughly every two years for the foreseeable future, has died at the age of 94. The prediction turned out to be correct, which means that the number of transistors in a circuit increased roughly a trillion-fold between 1955 and 2015.
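As a sanity check on the compounding here (a rough sketch, using only the two-year doubling period and the 1955–2015 span from the paragraph above):

```python
# Back-of-envelope check of Moore's-law compounding.
# Assumption: doubling every 2 years, over the 60 years from 1955 to 2015.
years = 2015 - 1955
doubling_period = 2          # years per doubling, per Moore's formulation
doublings = years / doubling_period
growth = 2 ** doublings
print(f"{doublings:.0f} doublings -> roughly {growth:.2e}-fold growth")

# Note: a trillion-fold (1e12) corresponds to about 40 doublings, so the
# trillion figure implies doubling somewhat faster than every two years
# over that span (closer to every 18 months, a rate Moore himself used
# in some periods).
```

Either way, the point stands: steady doubling over decades compounds into growth factors that are hard to intuit.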
Questions/observations:
(1) I suppose there have to be hard physical limits of some sort to this process, as transistors approach the atomic end of the size scale. If and when increases in computing power become linear rather than exponential, I imagine that's going to have big implications for a lot of things, although it's always possible that alternative computing technologies could keep the exponential ball rolling. (I don't actually know the first thing about computer technology, so I'm curious what people think about all this.)
(2) Of course the big thing in computer technology at the moment is the very rapid advance of what's called artificial intelligence, though the term has always seemed like a misnomer to me. Making a computer more powerful at computing doesn't make it any more intelligent, since there's no reason to think any computer has any intelligence in the first place; the whole Turing Test seems like a category error in this regard. For example, if a nuclear weapon has one trillion times more explosive force than a tank shell (somebody do this math), nobody would assume that nuclear weapons were thereby achieving sentience and intentionality relative to tank shells. Alexa, solve this paradox for me.
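For what it's worth, a rough pass at the parenthetical math. The yield figures below are my own assumptions, not anything from the post: a tank shell carrying on the order of 5 kg of TNT equivalent, and a very large thermonuclear weapon at 50 megatons (Tsar Bomba scale):

```python
# Rough ratio of explosive yields, under assumed (hypothetical) figures.
tank_shell_kg_tnt = 5.0        # assumed: a few kg of TNT equivalent
megaton_in_kg_tnt = 1.0e9      # 1 megaton of TNT = one billion kg
bomb_yield_megatons = 50.0     # assumed: Tsar Bomba-scale device

bomb_kg_tnt = bomb_yield_megatons * megaton_in_kg_tnt
ratio = bomb_kg_tnt / tank_shell_kg_tnt
print(f"yield ratio ~ {ratio:.0e}")  # on the order of 1e10
```

On these assumptions the multiplier comes out closer to ten billion than a trillion, though the exact number depends entirely on which weapons you pick, and the category-error point holds at any ratio.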
(3) That said, Moore’s 1965 paper that formulated his famous law now seems eerily prescient about all sorts of things:
Integrated circuits will lead to such wonders as home computers – or at least terminals connected to a central computer – automatic controls for automobiles, and personal portable communications equipment.
Moore went on to co-found Intel, and at the time of his death he was apparently worth seven billion dollars, so he represents a very rare case of a billionaire who probably actually sort of earned his money, at least relatively speaking.
RIP