CSE 130: Principles of Programming Languages
Notes for Chapter 1 of Sethi

Sethi's view of programming languages focuses on the principles behind their design, but it also includes some of the underlying mathematics, along with some historical and cultural background. I believe every well-educated computer scientist should be familiar with such material, and the following is intended to supplement (but not replace) Chapter 1 of Sethi in this regard.

Computer scientists often believe that natural languages and computer languages are very similar, especially since some early work of the linguist Noam Chomsky on formal grammars for natural language has been applied very successfully to computer languages. However, much recent research suggests that natural language is not formal and cannot be formalized, due to its dependence on enormous amounts of implicit (i.e. "tacit") background knowledge and social context, each of which is highly variable, as well as difficult or impossible to formalize. Moreover, small errors in natural language cause native speakers little or no difficulty, whereas they can be disastrous for computer languages. As a result, Chomsky's work is no longer widely accepted as valid for natural languages. It therefore seems that natural and computer languages are really quite different, as is also evident from the vastly different experiences that we actually have in using them.

Both formal linguistics and programming deal mainly with written language, but it is almost certain that spoken language greatly preceded written language, and quite possibly goes back as far as humans themselves, say 300,000 years. It was long thought that Sumerian was the first written language, but findings in 1998 at the very first pyramid show that Egyptian writing is at least as old as Sumerian; moreover, the shards found there, records of tributes to the pharaoh, were written in a phonetic alphabet rather than in the hieroglyphics that later came to dominate Egyptian writing.

Other recent archaeological research shows that certain forms of representation and calculation existed about 5,000 years ago in ancient Egypt and Sumeria, for representing and calculating with numbers, mainly for what we would now call accounting data; in today's terms, we would say they were using data types and algorithms. Work on the sociology of science shows that all mathematics has its origin in practical work; moreover, even today, mathematics research is largely driven by practical considerations; the notion of "pure mathematics" is somewhat of a myth, though a powerful one.

Perhaps Gottfried Leibniz (1646-1716) was the first to dream of a language for computation, though his dream went well beyond the programming languages that we have today, since he envisioned using them for all forms of human reasoning, including legal reasoning. Leibniz also invented binary numbers, inspired by patterns found in the ancient Chinese book of divination, the I Ching. Leibniz is also famous for having invented the calculus, at about the same time as, and independently from, Isaac Newton. Charles Babbage (1791-1871) designed calculating machines but was unable to build them, due to both funding and engineering difficulties; Lady Ada Augusta, Countess of Lovelace (daughter of the great English poet, Lord Byron) was a strong supporter of Babbage, and wrote many programs for his machines in long letters to him. The novel The Difference Engine by William Gibson and Bruce Sterling contains an amusing fictional account of Babbage and Ada Augusta Lovelace, featuring giant steam-driven computers (and much more).

The most important thing about FORTRAN was its excellent optimizing compiler; without this, assembly language programmers would never have changed their habits. ALGOL was designed by a (largely European) committee, and was the first language to have an international standard; it also introduced many important new features. PL/I failed mostly because it was too large. C was designed for systems programming, and cleverly combined high and low level features, drawing on Christopher Strachey's very innovative CPL language ("CPL" was jokingly said to stand for "Christopher's Programming Language"); CPL was designed at Cambridge and London but never fully implemented, although a subset, BCPL ("Basic CPL"), was implemented. Simula, Smalltalk, and of course C++ all grew out of the imperative tradition, and the object oriented paradigm can be considered a variant of the imperative paradigm, as supported by the following quotation from Alan Kay, the designer of Smalltalk:

Though OOP came from many motivations, two were central. The large scale one was to find a better module scheme for complex systems involving hiding of details, and the small scale one was to find a more flexible version of assignment, and then try to eliminate it altogether.
Simula was developed by Kristen Nygaard and Ole-Johan Dahl at the Norwegian Computing Center, and was originally designed for applications to simulation. For some reason, COBOL is missing from Figure 1.4 in Sethi; it was developed for the US military, with Grace Murray Hopper (then a Navy officer, later a rear admiral) playing a leading role, mainly to facilitate the processing of military accounting information; COBOL is still used in many banks today, and Hopper is honored as one of the pioneer women in computing. Another significant omission is the fact that Robert Kowalski, at Edinburgh around 1970, developed the linear resolution proof method, and used it in an interpreter for Horn clause logic, which provided the foundation upon which Prolog was built.

It is interesting to observe that the three major programming paradigms grew out of three major approaches to computable functions. Imperative (and object oriented) languages grew out of Turing machines, which also inspired the von Neumann architecture on which they usually run. The functional programming paradigm grew out of the lambda calculus approach to computable functions due to Alonzo Church, which was soon proved equivalent to the Turing machine approach. LISP was the first functional language, and it directly includes lambda abstraction and higher order functions. LISP was also the first interactive language, as well as the first to have garbage collection and to support symbolic computation; the latter was important for its applications to artificial intelligence, as was the fact that its programs were written in the same structure (S-expressions) that it used for data. More recent functional programming languages include ML and Haskell; the latter takes its name from Haskell Curry, who developed combinatory logic, a variant of the lambda calculus that avoids bound variables.
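To make lambda abstraction and higher-order functions concrete, here is a minimal sketch in Haskell (chosen only because it is mentioned just above; LISP would express the same idea with S-expressions). The function names are illustrative and not taken from Sethi:

    -- 'twice' is a higher-order function: it takes a function f as an
    -- argument and returns a new function that applies f two times.
    twice :: (a -> a) -> a -> a
    twice f = \x -> f (f x)              -- "\x -> ..." is a lambda abstraction

    main :: IO ()
    main = print (twice (\n -> n + 1) 5) -- prints 7

In LISP the body of twice would be written as an S-expression, roughly (lambda (x) (f (f x))), which is itself just a piece of data that LISP programs can also manipulate.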

The so-called logic programming paradigm is related to a different notion of computability, having to do with the manipulation of algebraic terms, arising from the work of Jacques Herbrand. (I say "so-called" because I think it is a misleading name, since its syntax is based on Horn clauses rather than full first order logic, and in any case, there are many other logics besides first order and Horn clause logic.) ML and Prolog both originated at Edinburgh from research on theorem proving and logic, in groups led by Robin Milner and Robert Kowalski, respectively, though these languages were more fully developed elsewhere.
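For concreteness (this example and notation are mine, not Sethi's): a Horn clause is a clause with at most one conclusion, i.e. an implication of the form

    \[ A \;\leftarrow\; B_1 \wedge B_2 \wedge \cdots \wedge B_n \qquad (n \geq 0) \]

where A and the B_i are atomic formulas. For instance, the clause

    \[ \mathit{grandparent}(X,Z) \;\leftarrow\; \mathit{parent}(X,Y) \wedge \mathit{parent}(Y,Z) \]

says that X is a grandparent of Z whenever X is a parent of some Y who is in turn a parent of Z; a Prolog program is just a collection of such clauses, which the interpreter executes by searching for proofs.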

The most important ideas for personal computing came from Doug Engelbart at SRI (then called Stanford Research Institute): the mouse, windows, menus, and networking (recall that interaction came from LISP). Alan Kay at Xerox PARC popularized these ideas in Smalltalk, also adding icons; others at PARC developed the Ethernet and the laser printer. Apple added very little to this mixture, and Microsoft has added nothing of intellectual significance (companies spend a lot on advertising their alleged creativity, often at the expense of the scientists who actually did the work).

Since Java is intended to be used over the internet, it has a very different character from the other languages discussed above; in particular, Java applets take the form of byte code, downloaded from a server to a client machine and run there on a Java virtual machine, so security is an extremely important issue. Here, the hardware is the internet, not just a CPU with associated memory and peripherals. Java has probably been more carefully studied for potential security flaws than any other programming language.

An overview of the history of programming languages reveals a progressive increase in the level of abstraction: machine language; assembly language; (so-called) "high level" languages like FORTRAN; and then ever more powerful features to support abstraction, including blocks, procedures, types, recursion, classes with inheritance, modules, specifications, ... These correlate with improvements in the underlying hardware. Perhaps machine language makes sense if your hardware is a (relatively) small system of gears and shafts, and your programs only compute repetitive tables, as was the case for Babbage and Lovelace; assembly language perhaps makes sense if your hardware consists of (relatively) few vacuum tubes and your programs are for (relatively) small numerical tasks; but large programs running on mainframes and modern PCs require much more abstraction.

There are some very close relationships between mathematics and programming. In particular, the mathematicians John von Neumann and Alan Turing designed computers to solve difficult computational problems associated with World War II (the atom bomb and decrypting German radio signals, respectively). Moreover, the stored program architectures that they developed (around the same time) were inspired by the universal Turing machine, a theoretical model of computation developed by Turing to clarify the logical notions of computable and uncomputable function. In addition, the major programming paradigms correspond to different (but mathematically equivalent) ways of defining the notion of computable function, and the historical trend of rising levels of abstraction also follows a major trend in mathematics. Finally, the increasingly abstract features that have been included in programming languages have generally taken their inspiration from mathematics; these include expressions, data types, functions and procedures, blocks, modules, specifications, and more.


My Essay on Comparative Programming Linguistics gives some further information related to the topics discussed above.

A pdf poster of the history of programming languages, on the O'Reilly site.

A very detailed (perhaps too detailed, and also not always accurate) graphical history of programming languages, by Eric Levenez.

A page of programs for the factorial function in various languages.


Maintained by Joseph Goguen
© 2000 - 2006 Joseph Goguen