Vector supercomputers
There are anecdotal reports that Apple M1 systems are not as fast for general-purpose programs as people have been led to believe. That’s unsurprising.
I think what’s happened is that vector supercomputers have secretly won, and with them come all their performance weirdnesses that make a lot of code really suck. No-one wanted to run anything other than rather specialised programs on a Cray-1 or any of its descendants, because for anything else it was just not very fast. Vector supercomputers were great at numerical loops over large arrays, but they were absolutely terrible at code which had to make lots of actual decisions.
So now we’re seeing machines which are optimised to be extremely good at mashing arrays of numbers, and much less good at general computation. Of course, unlike with the 1970s and 80s machines, ‘much less good’ is now ‘quite good enough’ in almost all cases.
And they’ve won, really, because we’re in the middle of another AI hype cycle. The last hype cycle gave us all sorts of weird hardware: Lisp machines, graph-reduction machines and so on. This one, built as it is on programs which really ought to be written in Fortran, is giving us special-purpose array-mashing machines (vector supercomputers, in other words) which are really good at all the annoying machine-learning things our computers now insist on foisting on us.
Well, this AI hype cycle will be like all the other AI hype cycles: despite the idiot boosters who have conveniently forgotten what happened last time, and all the times before that, we are not anywhere near some kind of strong AI based on machine learning. You can see this already: whatever language model we’re all meant to worship at the feet of has now been trained on all the natural language that exists on the internet, and the results are still not, in fact, acceptable. And there’s nowhere to go from here: there is no more training data.
It remains to be seen whether array-mashing machines outlive the hype that gave rise to them: there are good uses for systems like this, just as there are good uses for machine learning, but when the bubble bursts it may yet take them with it.