CSE 230: Principles of Programming Languages
Notes on Chapter 5 of Stansifer (blocks, procedures and modules)

This chapter is about the large grain structure of programs; the motivation for this is to make it easier to understand, and hence to maintain, programs, especially large programs. Of course, this will only work if we use structures that are already familiar to us, for example, from the way that stories are structured, or the way mathematics texts are structured. The local identifiers and lexical scope rules of block structure certainly have this property.

Some of the material in this chapter is explained very briefly and even incompletely; consequently you may want to do some supplementary reading in other books, such as that by Pratt and Zelkowitz, especially for topics like the runtime handling of recursive procedures with activation records, etc.

5.1 It may not be clear from the text, but the contour rule just says that identifiers declared in a given block are only visible inside that block and other blocks that are nested within it. This came from the working group on ALGOL (IFIP WG2.1), as did much other material in this chapter, including the stack discipline for activation records, and the call-by-name semantics for procedure parameters.
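
The rule is easy to see in any lexically scoped language. Here is a minimal sketch in Haskell (my example, not from the text), with nested let-expressions playing the role of nested blocks:

    -- outer's block declares x; the nested block declares y.
    outer :: Int
    outer =
      let x = 1             -- x is visible here and in every nested block
      in  let y = x + 1     -- legal: x is visible inside the inner block
          in  x + y         -- both x and y are visible here; the value is 3
    -- Using y back at the level where only x is declared would be a
    -- scope error: an identifier is invisible outside its own block.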

5.2 Lisp was the first language to use recursion, and thus the first that had to worry about how to implement it. If you think of recursion as "problem reduction" in the sense of AI, it is not so difficult to understand as it might seem if you only think of it as general programming.
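
For instance, the classic Towers of Hanoi puzzle (my example, not Stansifer's) exhibits problem reduction directly: moving n disks reduces to two instances of the same problem with n - 1 disks, plus one primitive move. In Haskell:

    -- Move n disks from peg `from` to peg `to`, using peg `via`.
    hanoi :: Int -> String -> String -> String -> [String]
    hanoi 0 _ _ _       = []
    hanoi n from to via =
      hanoi (n - 1) from via to        -- reduce: clear n-1 disks out of the way
        ++ [from ++ " -> " ++ to]      -- primitive step: move the largest disk
        ++ hanoi (n - 1) via to from   -- reduce: put the n-1 disks back on top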

The material on pages 152-153 is where this class links with the compiler class; the use of offsets, etc. is good engineering, which makes a very efficient runtime environment possible.

5.3 Hope (not HOPE) was a clever functional programming language designed by Rod Burstall, named after Hope Park Square, where his office was located in Edinburgh (Hope Park Square was named after a civil engineer who greatly improved that area of Edinburgh, by draining a swamp, among other things). Its pattern matching feature, which led to that of ML, was in part inspired by discussions between Burstall and Goguen about OBJ, which was also under development during the same period. I hope you noticed the similarity between pattern matching in ML and in OBJ. Moreover, the "pattern matching" capability (if we want to call it that) of OBJ3 is even more powerful than that of ML, in particular, allowing matching modulo associativity and/or commutativity. For example, if we define

   op _+_ : Nat Nat -> Nat [comm] .
   vars M N : Nat .
   eq M + 0   = M .
   eq M + s N = s(M + N) .
then the first equation can match 0 + s s 0 and the second can match (s x) + y, returning s s 0 and s(x + y), respectively. Matching modulo associativity and commutativity can yield even more dramatic differences from ML pattern matching.
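
For comparison, here is the nearest ML-style rendering, sketched in Haskell with a hypothetical Nat datatype. Its patterns are purely syntactic, with no matching modulo commutativity, so 0 + s s 0 is not an instance of the first clause's pattern and must instead be computed by repeated use of the second clause:

    data Nat = Z | S Nat

    add :: Nat -> Nat -> Nat
    add m Z     = m             -- corresponds to eq M + 0 = M
    add m (S n) = S (add m n)   -- corresponds to eq M + s N = s(M + N)

    -- add Z (S (S Z)) uses the second clause twice before the first one
    -- applies; in OBJ3 the first equation matches it in a single step.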

5.4 The joke about Niklaus Wirth is worth thinking about.

5.4.1 The procedure P on page 157 will detect whether or not its two actual parameters are aliased, in case Stansifer has not said that clearly enough.
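
P itself is written in an Algol-like language; the following Haskell sketch (my reconstruction of the idea, not Stansifer's code) shows the trick using mutable references: write through one parameter and check whether the other parameter sees the change.

    import Data.IORef

    -- True iff the two parameters name the same mutable cell.
    aliased :: IORef Int -> IORef Int -> IO Bool
    aliased a b = do
      aOld <- readIORef a
      bOld <- readIORef b
      writeIORef a (bOld + 1)     -- write through the first parameter
      bNew <- readIORef b         -- did the second parameter see it?
      writeIORef a aOld           -- restore the original contents
      return (bNew /= bOld)

    -- (Haskell could simply compare the two references with ==, but the
    -- probe above mirrors what P can do in a language without reference
    -- equality.)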

5.4.5 Text substitution is a form of macro expansion, and is very poor practice for parameter passing.
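
Haskell has no textual macros, but the classic hazard can be simulated (a contrived example of my own): if the text of the actual parameter is in effect re-evaluated at every occurrence of the formal parameter, then its side effects happen once per occurrence.

    import Data.IORef

    -- Like a macro SQUARE(e) that expands to e * e: the actual
    -- parameter is an action, run once at each occurrence.
    square :: IO Int -> IO Int
    square e = do
      x <- e                     -- first occurrence of the substituted text
      y <- e                     -- second occurrence
      return (x * y)

    main :: IO ()
    main = do
      i <- newIORef 3
      let bump = do
            modifyIORef i (+ 1)  -- the actual parameter has a side effect
            readIORef i
      r <- square bump
      print r                    -- prints 20 (= 4 * 5), not a square at all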

5.4.6 There is a bug in the table for Sum near the top of page 163: the code in the third line should read

    Sum(i,1,m,Sum(j,1,i,j*B[j,i])) 
You can skip the procedure GPS on page 163, but if you want to understand it, here is a link to an explanation, provided by Dana Dahlstrom.
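
In a language with first-class functions, the effect of the call-by-name Sum (Jensen's device) is obtained by passing the body as a function of the index variable; here is a sketch in Haskell, with hypothetical names summ and b:

    -- summ lo hi body ~ Sum(i, lo, hi, body), with the index variable
    -- of the body made into an explicit function argument.
    summ :: Int -> Int -> (Int -> Int) -> Int
    summ lo hi body = sum [ body i | i <- [lo .. hi] ]

    -- The corrected doubly nested sum above, for an m-by-m matrix b:
    nested :: Int -> (Int -> Int -> Int) -> Int
    nested m b = summ 1 m (\i -> summ 1 i (\j -> j * b j i))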

5.4.7 Call by need is also called lazy evaluation, and is featured in modern functional languages like Haskell.
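
Call by need is the default evaluation strategy in Haskell; a standard illustration (my example, not from the text):

    -- An infinite list is harmless under call by need: only the part
    -- actually demanded is ever constructed, and each cell at most once.
    nats :: [Integer]
    nats = 0 : map (+ 1) nats

    firstTen :: [Integer]
    firstTen = take 10 nats    -- [0,1,..,9]; the rest of nats is never built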

5.4.8 The Ada parameter passing modes, in, out, and in out, are a nice feature, because they make the programmer's intent clearer, and permit the compiler some flexibility in optimizing code.


Maintained by Joseph Goguen
© 2000, 2001, 2002 Joseph Goguen
Last modified: Sun Feb 24 22:38:39 PST 2002