Programming Languages
Sometime when I was in 7th grade, at Ophelia Parrish Junior High School, a couple of guys I knew showed me Northeast Missouri State University’s teletypes.
There were two teletypes in the NMSU Science building. Dear god, they were noisy, and they jumped around when they printed computer output. But you could play tic-tac-toe, print a calendar with an ASCII-art Snoopy header, run a simulation of crop-eating insects and their interaction with pesticide, and best of all, play a lunar lander simulation. I don’t think anybody successfully landed it. Still, in 1973 or ‘74 this was advanced stuff. Since teletypes printed on paper, we could steal user IDs and passwords whenever Professor Kangas was distracted enough to throw the paper from his sessions in the trash can.
I was in 8th grade (1974) when I had the idea of using BASIC to solve my algebra homework. I had no idea how to do this, so I think I actually solved the equation by hand, and merely wrote a short BASIC program to verify the result.
The BASIC program didn’t work. It didn’t give me the correct answer, merely a number close to the correct answer. This puzzled me. I wrote a short program to try to get a correct answer involving fractions:
10 A = 1 / 3
20 B = A * 3
25 PRINT B
30 IF B <> 1 GO TO 10
40 PRINT A
Still didn’t work. In fact, the program looped, apparently endlessly. I was obviously not a college student, and there were folks waiting in line to use the Teletype that evening, so I gathered up my books and printouts, and left. I realized my mistake as I was walking down the stairs of Laughlin Hall.
The division of 1 by 3 (A = 1 / 3) didn’t give a precise value of one-third, so multiplying the value of A by 3 didn’t result in exactly 1.
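The same surprise is easy to reproduce today. A sketch in modern Python, which uses IEEE 754 double-precision floats (the Honeywell’s BASIC used a different float format, so the details differ; on today’s hardware the rounding in (1/3) * 3 happens to cancel out, but the underlying imprecision is one expression away):

```python
# Fractions like 1/3 (and decimal fractions like 0.1) have no exact
# binary floating-point representation, so arithmetic picks up tiny
# rounding errors.
a = 1 / 3
print(repr(a))            # 0.3333333333333333 -- not exactly one-third

# On IEEE 754 doubles, (1/3) * 3 happens to round back to exactly 1.0,
# so my 1974 loop wouldn't spin forever on a modern machine:
print((1 / 3) * 3 == 1)   # True -- the rounding cancels

# But the same class of error is still right there:
print(0.1 + 0.2 == 0.3)   # False: 0.1 + 0.2 is 0.30000000000000004
```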
Surely I can deal with this mistake somehow…
I went down a few more steps, then an idea hit me like a brick:
How had the makers of the BASIC programming language known what to put in the language so that I could try to solve my homework problem with it?
The idea was so powerful that I had the actual sensation of getting a brick dropped on my head.
Even though I had my dad’s NMSU library card (he was a professor there), I could not scrounge up any books that enlightened me about either why 1 divided by 3 did not equal one-third, or how to get the computer to do my algebra homework.
I quit using NMSU’s teletypes. I no longer logged in on the sly with Prof. Kangas’ user ID.
I remember the actual computer as a Honeywell 1640, a machine with hardly any references on the internet.
It took me a while, but I came back to the issue. I started college classes in the fall of 1979. NMSU was still using the 1640. I signed up for a FORTRAN IV class, and then got a $2.20-per-hour job at the NMSU student data center, running decks of punch cards through the reader, then matching previously read decks with their paper output.
The instructor of my Calculus 2 and 3 classes at NMSU, Don Groff, would give out extra-credit assignments involving things like calculating π with the first so many terms of an infinite series. We got to use FORTRAN or XBASIC to write the programs.
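One plausible version of that exercise uses the Leibniz series, π/4 = 1 − 1/3 + 1/5 − 1/7 + …; I don’t recall which series Prof. Groff actually assigned, so treat this as an illustrative sketch (in Python rather than the FORTRAN or XBASIC we used):

```python
# Estimate pi from the first n terms of the Leibniz series:
#   pi/4 = 1 - 1/3 + 1/5 - 1/7 + ...
# This series converges slowly: the error after n terms is
# roughly 1/n, which is exactly what made it a good exercise.
def pi_leibniz(n_terms):
    total = 0.0
    for k in range(n_terms):
        total += (-1) ** k / (2 * k + 1)
    return 4 * total

print(pi_leibniz(100000))   # close to 3.14159, but only to ~5 digits
```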
Between my student job and extra-credit, I learned that computers were far more complicated than I had thought, but that ordinary college students could write programs that were of interest.
In the early 80s and early 90s, I was a stress analyst, a structural engineer, at two major aerospace corporations. We had to use computers to do otherwise very tedious calculations. It was fun, but I couldn’t figure out how a single computer could run multiple processes. How did “virtual memory” work? In 1986, there were no easy-to-find answers. I bought a Radio Shack Color Computer III because it could multi-task, where the IBM PC could not. I started to learn how computers worked, what programs were, and how to write them.
Many years later, sometime after 2003 but no later than 2016, I wrote my own computer programming language, crapterpreter. I had to learn to structure programs, understand regular expressions, write a grammar, learn how to use the grammar to build up abstract syntax trees, then use those trees in a tree-walk interpreter.
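That pipeline can be sketched in miniature. What follows is a generic illustration of the stages named above, regex tokenizer, grammar-driven recursive-descent parser, abstract syntax tree, tree-walk evaluator, for simple arithmetic; it is not crapterpreter’s actual code:

```python
import re

# Stage 1: regex tokenizer. Each token is ("NUM", value) or ("OP", char).
TOKEN = re.compile(r"\s*(?:(\d+)|(.))")

def tokenize(src):
    tokens = []
    for number, op in TOKEN.findall(src):
        tokens.append(("NUM", int(number)) if number else ("OP", op))
    tokens.append(("EOF", None))
    return tokens

# Stage 2: recursive-descent parser for the grammar
#   expr   -> term (('+'|'-') term)*
#   term   -> factor (('*'|'/') factor)*
#   factor -> NUM | '(' expr ')'
# Each grammar rule becomes a method; AST nodes are (op, left, right).
class Parser:
    def __init__(self, tokens):
        self.tokens = tokens
        self.pos = 0

    def peek(self):
        return self.tokens[self.pos]

    def advance(self):
        tok = self.tokens[self.pos]
        self.pos += 1
        return tok

    def expr(self):
        node = self.term()
        while self.peek() in (("OP", "+"), ("OP", "-")):
            op = self.advance()[1]
            node = (op, node, self.term())
        return node

    def term(self):
        node = self.factor()
        while self.peek() in (("OP", "*"), ("OP", "/")):
            op = self.advance()[1]
            node = (op, node, self.factor())
        return node

    def factor(self):
        kind, value = self.advance()
        if kind == "NUM":
            return value
        if (kind, value) == ("OP", "("):
            node = self.expr()
            self.advance()                  # consume ')'
            return node
        raise SyntaxError(f"unexpected token {value!r}")

# Stage 3: the tree walk. Leaves are numbers; interior nodes apply
# their operator to the recursively evaluated children.
def evaluate(node):
    if isinstance(node, int):
        return node
    op, left, right = node
    a, b = evaluate(left), evaluate(right)
    return {"+": a + b, "-": a - b, "*": a * b, "/": a / b}[op]

ast = Parser(tokenize("2 + 3 * (4 - 1)")).expr()
print(ast)            # ('+', 2, ('*', 3, ('-', 4, 1)))
print(evaluate(ast))  # 11
```

A real language adds variables, statements, and control flow on top, but each addition is the same pattern: a grammar rule, an AST node shape, and a case in the evaluator.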
The issue of language design, how Kemeny and Kurtz decided what to put in their language, and what I put in crapterpreter, is a little more diffuse. If you have a way to add, subtract, multiply, and divide numbers, and your programming language can execute sequences of instructions and perform controlled repetition (looping) over those sequences, the language can calculate pretty much anything. Most programming languages have more features than that, and BASIC and crapterpreter are no exceptions, but once a language has those basics, the rest comes down to taste and preference.
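To make that claim concrete, here is a hypothetical illustration that uses nothing beyond that minimal toolkit, the four arithmetic operations, assignment, a sequence of instructions, and a loop, yet computes square roots via Newton’s iteration:

```python
# Square root with only arithmetic, assignment, and a loop -- the
# minimal feature set described above. Assumes x > 0. The repetition
# count is fixed, so the loop doesn't even need a comparison; 50
# iterations is far more than double precision requires.
def newton_sqrt(x):
    guess = x
    for _ in range(50):
        guess = (guess + x / guess) / 2
    return guess

print(newton_sqrt(2))   # close to 1.4142135623730951
```

Everything a numerical language adds past this point, built-in functions, arrays, formatted output, is convenience rather than new computational power.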
It turns out that solving algebraic equations in general is a lot harder than an 8th grader with minimal knowledge of BASIC could possibly have imagined. But MACSYMA was in development during my 1974 revelation. Its descendant Maxima is still around today. MACSYMA could almost certainly have solved my 8th grade algebra homework.
Sometime in the mid 1980s, two guys, Chris King and Jon Broyles, bought the 1640 from NMSU for $500 as I recall, and were going to run a small business doing batch accounting with it. That was the last I heard of it.
In 2018, I interviewed for a job at Comcast, with their video-over-IP division. I think my performance in the interview was only passable, but at the end of the interview, I got asked for my “computer story”, why I was interested in programming. I told the story above. I believe it got me the job at Comcast.