Not very far away
in a Pi thread I made mention of the Atlas, a marvellous early computer put to diverse uses, including investigations into pure mathematics.
With only a little prompting, I've been encouraged to share a link to an Atlas thread elsewhere, which should be of interest to anyone curious about such things:
Ferranti Atlas: Britain’s first supercomputer
There are links within to the Chilton computing history site - a site well worth exploring. Be sure to check the Further Reading drop-down menu if you see one. As a random mathematical start page,
here's a conference organised by
John Leech (he of the Leech lattice). As a random practical application page, here's one on
Computer Animation.
30+ years ago, I saw a great book on vintage computers in a used-book store. I wanted to buy it, but thought they were asking way too much, considering it was a used book. Later I decided to go ahead and spend the money, and went back to get it—but it was gone. The new store owner thought books on outdated computers were worthless, and got rid of them! Now of course we can enjoy all of that online, free, with no limit. And we definitely do, impressed by the dedication the pioneers showed in accomplishing what they did with what they had to work with.
I sometimes think about how much money and effort was spent on computers that were, within even a couple of years, surpassed in power, affordability, and other desirable traits. It seems a shame, as such a computer had no chance of paying for itself. It was a necessary part of the development of the computer field though, and probably all of us took part in that to some extent. I used to frequent a surplus electronics store that had an outdoor storage yard for the big stuff, including mainframe computer components that had gone for probably millions of dollars (at least in today's money) 25 years earlier, and there they sat, out in the weather.
Seeing the circuit boards in the video, we could think, "Why didn't they make them denser? Why are parts so spread out?" But I suppose most of us who started in electronics in the 1960s or '70s started the same way, having almost no reference yet, nor anything to even make us think about it. Coming out of the age of vacuum tubes, miniaturization was a new thing. So were so many other parts of computing, like programming languages, and even what a computer could be used for. So much was not even imagined yet, as the field was new and exciting (but also not without insecurity about which companies and systems would survive the coming shakeouts and which would fail).
There's definitely a romance to computer history, like one might have for steam locomotives, sailing ships, lighthouses, and more. Thanks for the link.
One of the jobs I interviewed for after college was at a huge consumer-products company. They had an underground facility with over a dozen IBM mainframes. They told me that each night, tapes were taken from this facility to an identical backup facility many miles away. They explained to me that this was in case the US was attacked with nuclear weapons and the primary facility was destroyed. Apparently, they thought these multi-million-dollar computers did pay for themselves – to the point where it made sense to have two of each, “just in case”.
Thanks, lots of interesting reading there. As an aside, the page about Jack Good mentions the HP-15C.
(03-24-2023 02:31 AM)Garth Wilson Wrote: I sometimes think about how much money and effort was spent on computers that were, within even a couple of years, surpassed in power, affordability, and other desirable traits.
This happens constantly nowadays too. Anyway, one should not discount the development work, especially that done in the past.
Nowadays one may have - more or less - the ability to port software from one mainframe/supercomputer to another (and even that is not strictly true). In the past it wasn't that easy.
In calculator terms, a program written for a TI wouldn't easily port to an HP (in the 70s and 80s - and even nowadays). Therefore it is not only about the HW; it is also about the development platform. It takes time to get up to speed, and changing HW too frequently would make SW development stutter.
All this to say, it is completely fine that systems, apparently obsolete in HW, get used for some more years, because SW development on another system would slow down again. On one side one gains HW speed; on the other one loses SW development speed. SW development speed is the larger part of the problem, unless the product is a library used a gazillion times (in that case program speed is more important).
Just as a note: Sierra and Summit are still running despite (a) being installed in 2018 - their base HW is somewhat dated compared to what is available nowadays - and (b) IBM not really pushing for major supercomputers at the moment.