Procedures express program structure at a higher level of abstraction than statements, by encapsulating a number of statements, providing a uniform interface to them, and hiding some information about them. The motivation is to make programs easier to understand, and hence to maintain, especially large programs. Of course, this will be most successful if the structures involved are already familiar, for example, from the way that stories are structured, or the way mathematics texts are structured. The local identifiers and lexical scope rules of block structure certainly have this property.
The Ada parameter passing modes, in, out, and in out, are a nice feature, because they make the programmer's intent clearer, and permit the compiler some flexibility in optimizing code. (Sethi calls the in out mode call-by-value-result.)
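The difference between call-by-value-result (copy-in/copy-out) and call-by-reference can be sketched in Python; the names `cell`, `body`, and the two calling conventions below are our illustration, not Ada syntax, and the example shows where the two semantics diverge (when a parameter aliases a global):

```python
# A sketch of call-by-value-result (copy-in/copy-out) versus
# call-by-reference, in Python. All names here are illustrative.

cell = {"x": 1}  # a mutable store standing in for a global variable

def body(read, write):
    # the procedure body: assign 5 to the parameter, then read the global
    write(5)
    return cell["x"] + read()

def call_by_reference():
    # the parameter aliases the global directly
    return body(lambda: cell["x"],
                lambda v: cell.__setitem__("x", v))

def call_by_value_result():
    # copy in, run the body on the local copy, copy back at return
    local = {"v": cell["x"]}
    result = body(lambda: local["v"],
                  lambda v: local.__setitem__("v", v))
    cell["x"] = local["v"]  # copy out
    return result

cell["x"] = 1
print(call_by_reference())     # global updated immediately: 5 + 5 = 10
cell["x"] = 1
print(call_by_value_result())  # global still 1 inside the body: 1 + 5 = 6
```

Under call-by-reference the assignment to the parameter is visible through the global at once, while under call-by-value-result it only takes effect at return; this aliasing behavior is exactly where a compiler's freedom to choose an implementation matters.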
Lisp was the first language to use recursive procedures, and thus the first that had to worry about how to implement them. If you think of recursion as "problem reduction" in the sense of AI, it is not as difficult to understand as it might seem if you think of it as a general programming feature. (In fact, AI problem reduction was McCarthy's motivation for including recursion in Lisp.)
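The problem-reduction reading of recursion can be made concrete with a standard example (Towers of Hanoi, sketched here in Python; the function name is ours): each call reduces the problem to two strictly smaller instances plus one directly solvable step.

```python
# Recursion read as AI-style problem reduction: to solve hanoi(n, ...),
# reduce it to two smaller instances hanoi(n-1, ...) plus one direct move.

def hanoi(n, src, dst, spare):
    """Return the list of moves transferring n disks from src to dst."""
    if n == 0:
        return []                             # trivial problem: nothing to do
    return (hanoi(n - 1, src, spare, dst)     # reduce: clear the top n-1 disks
            + [(src, dst)]                    # solve the unit problem directly
            + hanoi(n - 1, spare, dst, src))  # reduce: restack the n-1 disks

print(len(hanoi(3, "A", "C", "B")))  # 2**3 - 1 = 7 moves
```

The recursive calls are not mysterious self-reference; they are just the two subproblems that the reduction step generates.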
The contour rule says that identifiers declared in a given block are only visible inside that block and any other blocks that are nested within it. This idea came from the working group on ALGOL (IFIP WG2.1), as did much other material in this chapter, including the stack discipline for activation records, and the call-by-name semantics for procedure parameters.
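The contour rule is easy to see with nested function definitions; this Python sketch (names ours) shows that an inner block sees enclosing declarations, while the enclosing block cannot see identifiers declared inside the nested one:

```python
# The contour rule with nested functions: identifiers declared in an
# inner block are invisible outside it, while inner blocks can see
# enclosing declarations (lexical scope).

def outer():
    x = "outer x"               # visible in outer and in inner
    def inner():
        y = "inner y"           # visible only inside inner
        return x + " / " + y    # x is found in the enclosing contour
    result = inner()
    # referencing y here would raise NameError: it is out of scope
    return result

print(outer())  # outer x / inner y
```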
Hope (not HOPE) was a clever functional programming language designed by Rod Burstall, named after Hope Park Square, where his office was located in Edinburgh (Hope Park Square was named after a civil engineer who greatly improved that area of Edinburgh, by draining a swamp, among other things). Its pattern matching feature, which led to that of ML, was in part inspired by discussions between Burstall and Goguen about OBJ, which was also under development during the same period; this explains the similarity between pattern matching in ML and in OBJ. But OBJ's pattern matching power is much greater than that of ML, e.g., in allowing matching modulo associativity and/or commutativity. For example, if we define
    op _+_ : Nat Nat -> Nat [comm] .
    vars M N : Nat .
    eq M + 0 = M .
    eq M + s N = s(M + N) .

then the first equation can match 0 + s s 0 and the second can match (s x) + y, returning s s 0 and s(x + y), respectively. Matching modulo associativity and commutativity can yield even more dramatic differences from ML pattern matching.
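What "matching modulo commutativity" means can be sketched with a toy matcher in Python; this is our illustration only, not OBJ's actual matching algorithm. Terms are encoded as tuples, with `("+", a, b)` for a + b, `("s", a)` for s a, and strings for variables:

```python
# A toy sketch of matching modulo commutativity. Terms: ("+", a, b),
# ("s", a), the constant 0, and strings as variables. Our encoding,
# not OBJ's implementation.

def match(pattern, term, subst=None):
    """Plain syntactic first-order matching; a substitution dict or None."""
    subst = dict(subst or {})
    if isinstance(pattern, str):                  # pattern is a variable
        if pattern in subst:
            return subst if subst[pattern] == term else None
        subst[pattern] = term
        return subst
    if not isinstance(pattern, tuple) or not isinstance(term, tuple):
        return subst if pattern == term else None  # constants must be equal
    if len(pattern) != len(term) or pattern[0] != term[0]:
        return None
    for p, t in zip(pattern[1:], term[1:]):        # match arguments in order
        subst = match(p, t, subst)
        if subst is None:
            return None
    return subst

def match_comm(pattern, term):
    """Try the pattern as written, then with the + arguments swapped."""
    s = match(pattern, term)
    if s is None and isinstance(pattern, tuple) and pattern[0] == "+":
        s = match(("+", pattern[2], pattern[1]), term)
    return s

# eq M + 0 = M matches 0 + s s 0 only modulo commutativity:
two = ("s", ("s", 0))
print(match(("+", "M", 0), ("+", 0, two)))       # None: arguments reversed
print(match_comm(("+", "M", 0), ("+", 0, two)))  # {'M': ('s', ('s', 0))}
```

Purely syntactic matching fails on the reversed arguments, whereas the commutative matcher also tries the swapped orientation, which is exactly why the first equation above can match 0 + s s 0.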
There is a relevant joke about Niklaus Wirth, a member of the Algol 60 design team, and the designer of Pascal, Modula, and Oberon. His name can be confusing for English speakers to pronounce. When asked about the correct pronunciation, he is reported to have said, "If you call me by name, it is 'virt', and if you call me by value, it is 'worth'." (The correct German pronunciation is 'virt'.)
We will see later that call by name is closely related to the lambda calculus.
Text substitution is a form of macro expansion, and is very poor practice for parameter passing: the substituted text may be evaluated more than once, and its identifiers may be captured by local declarations.
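The multiple-evaluation problem can be simulated in Python by literally pasting the argument text into a "macro" body; the macro `double(E) = E + E` and the side-effecting argument below are our illustration:

```python
# Why textual substitution is poor practice for parameters: expanding
# double(E) = E + E by pasting in the argument text evaluates the
# argument expression twice. The macro and counter are illustrative.

calls = 0
def tick():
    """An argument expression with a side effect."""
    global calls
    calls += 1
    return 3

macro_body = "E + E"                          # body of double(E)
expanded = macro_body.replace("E", "tick()")  # textual substitution
print(expanded)        # tick() + tick()
print(eval(expanded))  # 6, but...
print(calls)           # ...the argument was evaluated twice: 2
```

A call-by-value implementation would evaluate the argument once and substitute the resulting value, so the side effect would happen only once.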
Another form of parameter passing is call-by-need, also called
lazy evaluation; here a parameter is evaluated only if some computation
actually needs it. This notion applies to operations just as well as it does
to procedures. For example,
if_then_else_ needs to evaluate its
first argument, but only needs to evaluate its second argument if its first
is true, and its third only if its first is false. Lazy evaluation is featured in many modern
functional languages, such as Haskell.
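Call-by-need can be sketched in Python with memoized thunks; the names `thunk`, `expensive`, and `if_then_else` below are ours, and the point is that an argument is evaluated at most once, and only if some computation actually demands it:

```python
# A sketch of call-by-need using memoized thunks: wrap the argument
# expression in a zero-argument function, evaluate it on first force,
# and cache the result. All names here are illustrative.

def thunk(f):
    """Delay f(); evaluate on first force, then cache the result."""
    cache = []
    def force():
        if not cache:
            cache.append(f())
        return cache[0]
    return force

evaluations = 0
def expensive():
    global evaluations
    evaluations += 1
    return 42

def if_then_else(cond, then_thunk, else_thunk):
    # only the branch that is needed ever gets forced
    return then_thunk() if cond else else_thunk()

t = thunk(expensive)
print(if_then_else(False, t, thunk(lambda: 0)))  # 0; expensive never runs
print(evaluations)                               # 0
print(t() + t())                                 # 84; forced once, cached
print(evaluations)                               # 1
```

With plain call-by-name (no cache), forcing `t` twice would run `expensive` twice; the memoization is what distinguishes call-by-need from call-by-name.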