|Re: new book on the Apollo Guidance Computer|
Message #38 Posted by Garth Wilson on 24 July 2010, 6:04 p.m.,
in response to message #1 by Don Shepherd
Quote:
I'd bet that some of the slide rule fanatics among our little group would say that a new engineer would be a better engineer if he/she understood what a slide rule did.

Yes. The slide rule gives a better understanding of number relations-- not that you have to keep using it, but I still benefit from having gotten proficient at it even though I haven't used it in many years.
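To make the "number relations" point concrete, here is a toy sketch of my own (not from the thread) of the principle a slide rule embodies: multiplication is just addition of logarithms.

    import math

    # A slide rule's C and D scales are marked logarithmically, so sliding
    # one against the other adds log(a) to log(b); the product is read off
    # directly because log(a) + log(b) = log(a*b).
    def slide_rule_multiply(a, b):
        return 10 ** (math.log10(a) + math.log10(b))

    print(slide_rule_multiply(2.0, 3.14))   # ~6.28, to slide-rule accuracy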
Quote:
On the software side, when I went to programming school (1968) we learned to "desk check" our code, which meant following the code, instruction by instruction, on paper with some test data to make sure it really did what you wanted it to. We were taught to do this even before running our program on the computer with real data, and we almost always found errors that otherwise might have slipped by us. I don't think the current generations of programmers are taught that, and despite huge advances in the software development and testing fields today, errors can still slip by.

When I started in school on the mainframe computer in the late 1970's, you had to write your code out by hand, then go to the machines where you punched it into the cards, then submit your card pile to the operators, then come back sometime later hoping they had run your program, only to find a long printout of all the reasons it wouldn't run. Turnaround was anything but instant. But, continuing:

Quote:
There was an old saying my boss used to repeat: don't use the computer to debug your program. I think that is still good advice.

Quote:
Ask yourself "why is this good advice?" The computer is the perfect tool to debug your program and I think you should use it whenever possible. The reason you had to desk check your code before running it on the computer was because it was expensive and time-consuming to run it on the computer. These days, it's much faster and easier to run the real data through the program itself to see if it gets the right answer. So the purpose for desk checking has largely gone away.

I would change the advice to "The computer should not be the only thing you use to debug your program." Debugging needs to be like a concurrently running mental task, part of the process even before you run anything. Working by myself for small, low-budget outfits, I have found that the lack of expensive debugging tools actually teaches you to write better code. You can't have the attitude that "I'll whip out this code in record time and debug it later." Out of necessity, I've become more structured and neat in my programming, documenting everything thoroughly, making the code as readable as I know how, and proofreading.
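To show what desk checking looks like in practice, here is a made-up miniature (my example, not from the thread): you walk the code on paper with test data before the machine ever sees it.

    # Hypothetical routine: average a list of sensor readings.
    def average(readings):
        total = 0
        for r in readings:
            total = total + r
        return total / len(readings)

    # Desk check with test data [2, 4, 6], traced by hand:
    #   total: 0 -> 2 -> 6 -> 12
    #   len(readings) = 3, so the result is 12 / 3 = 4   (correct)
    # Tracing the empty list [] the same way catches the divide-by-zero
    # before the program is ever run-- exactly the kind of error that
    # "run it and see" with typical data lets slip by.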
Ten years ago, large companies finally started seeing the value in this, and the industry magazines ran some articles on code inspection and having committees of the programmers' peers proofread the code. I sometimes catch bugs when further commenting code that's already working but not exhaustively tested yet. I comment as if trying to explain it to someone else who hasn't been following my train of thought on it. (If I need to change it a year later, I'll need the comments anyway.) As a result of this madness, no user has ever found a software bug in any product or automated test equipment I programmed. The projects have ranged from 700 to 10,500 lines of code, and have always been for control of equipment, quite different from desktop applications or data processing.
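As an invented miniature of how commenting for someone else exposes bugs:

    # Writing the comment forces you to restate the intent, and any
    # mismatch with the code is where the bug shows up.
    def trim_header(lines):
        # The log file starts with a three-line header, so the data
        # begins at index 3. Spelling that out in this comment is what
        # exposed the original bug: the code had sliced from index 2,
        # leaving one header line mixed into the data.
        return lines[3:]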
BTW, using a lines-of-code-per-day benchmark of programming performance is a sure way to end up with inefficient, buggy code that's hard to figure out and fix or modify later. I once worked with a programmer who typed non-stop. I always wondered what he was typing. After he left the company, I had to fix a 4-page routine he wrote. When I was done, it was down to a half page, bug-free, clearer, faster, and did more with less memory. Eventually most of what he wrote had to be redone.
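A made-up miniature of the difference (not his actual code):

    # Padded style that looks productive by the lines-per-day measure:
    def any_negative_verbose(values):
        found = False
        for v in values:
            if v < 0:
                found = True
        return found

    # The same job, clearer and faster (it stops at the first hit):
    def any_negative(values):
        return any(v < 0 for v in values)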
Quote:
More important than automated testing systems are the large standardized libraries of working code that handle much of the drudgery that was hand-coded over and over again. Many of the common bugs that were made in the past are gone now.

It just frees you up to advance to the next level of programming and therefore the next level of bugs. ;) As long as there's programming, there will be bugs of some kind.
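An example of the kind of drudgery that's gone (my illustration, using Python's standard library): calendar arithmetic, with its leap years and month lengths, used to be hand-coded over and over and was a reliable source of bugs.

    import datetime

    # Forty-five days after a date, with month lengths and leap years
    # handled by the library instead of by hand:
    due = datetime.date(2010, 7, 24) + datetime.timedelta(days=45)
    print(due)   # 2010-09-07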
Quote:
Still, programming is, to a large extent, still in the dark ages. Ironically, it's about the ONLY discipline where you still work mainly with flat ASCII files.

That's a good thing. I can express myself most clearly and accurately in writing, and I hope the keyboard never goes away. I do of course use INCLude files, which can be nested, so it's not flat in that respect. I also like the DOS/ANSI characters above 127 for drawing diagrams and tables with smooth lines in the source code, as well as for the Greek letters, special symbols, etc. that we use all the time in engineering.
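For instance, a diagram can live right in the source as a comment. DOS code page 437 puts these line-drawing glyphs above 127; they're shown here with their Unicode equivalents so the sketch displays the same way today:

    #   ┌────────┐     ┌─────┐     ┌───────────┐
    #   │ sensor ├─────┤ ADC ├─────┤ processor │
    #   └────────┘     └─────┘     └───────────┘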
Quote:
During a chat I had with a programmer who had grown up totally within the Windows generation, I was aghast to find he didn't know how many bits were in a byte! When I pushed him on this, he replied, "why do I need to know?"

It's like people in Washington not having a grasp on what a trillion is, or even the cost of a single dollar. Bloatware anyone? Of course it matters! And if you're writing applications for embedded control of equipment where you have to know the ports and various hardware resources intimately, there's no way around it.
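A small sketch of why it matters when you're driving an 8-bit output port (the bit assignments are invented for illustration):

    # Hypothetical 8-bit control port; the bit assignments are made up.
    MOTOR_ON   = 0b00000001   # bit 0
    HEATER_ON  = 0b00000010   # bit 1
    VALVE_OPEN = 0b10000000   # bit 7-- only exists because a byte is 8 bits

    control = MOTOR_ON | VALVE_OPEN     # set two bits: 0x81
    control &= ~HEATER_ON & 0xFF        # clear one bit, disturb nothing else
    assert control == 0x81
    # None of this is possible without knowing the width of the byte and
    # what each bit in the port does.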