CSE 271: User Interface Design: Social and Technical Issues
Notes for Sixth Meeting
Notes on Chapter 14 of Shneiderman

This short chapter contains several statements that strongly support the CSCW position, such as

The introversion and isolation of early computer users has given way to lively online communities of busily interacting dyads and bustling crowds of chatty users. (p.478)

Computing has become a social process. (p.502)

Teams of people often work together and use complex shared technology. (p.494)

The first sentence above is the first sentence of this chapter, and the second is the first sentence of the practitioner's summary. Although Shneiderman does not seem to have realized how far-reaching the implications of such statements really are, there are nevertheless many valuable things in this chapter. The 4-element table on p.481 is a basic starting point for considering any communication system. The short list of things that can cause cooperative systems to fail (p.481) should be noted, though it is incomplete:
disparity between who does the work and who gets the benefit; threats to existing political power structures; insufficient critical mass of users who have convenient access; violation of social taboos; and rigidity that counters common practice or prevents exception handling.
The Coordinator (of Flores, Winograd, etc.) that we discussed is mentioned on p.484, but without citing the papers that analyze why it failed. "The dangers of junk electronic mail, ...users who fail to be polite, ... electronic snoopers, ... calculating opportunists" are mentioned (p.485), but the importance of these difficult social factors is far from being sufficiently emphasized.

For synchronous distributed systems, ownership and control are noted as important concerns (p.488), but without much analysis. MUDs are mentioned (p.490), but their growing routine use in business settings is not noted. Potential problems with video on p.491 should be noted: slow session startup and exit; difficulty of identifying speakers (the issue here is not just "distracting audio"); difficulty of making eye contact; changed social status; small image size; and potential invasion of privacy. The importance of a good audio channel is discussed on p.493, but without saying why it is important.

The table on p.503 is very useful, but should have been discussed in some detail; also a lot is missing. The pluses and minuses of anonymity in various situations are discussed in several places, and can be very important. The reason this chapter fails to be very satisfactory is suggested at the very end (p.504):

Although user-interface design of applications will be a necessary component, the larger and more difficult research problems lie in studying the social processes.
where we see that Shneiderman separates user interface design from the social processes that it is supposed to support. This is the disastrous attitude that has led to so many failures.
Notes on Chapter 4 of Latour

This chapter does not break much new ground conceptually, but it does underline some important points, e.g., in the following salient quotes:

The more a technological project progresses, the more the role of technology decreases, in relative terms. (p.126)

To study Aramis after 1981, we have to add to the filaments of its network a small number of people representing other interests and other goals: elected officials, Budget Office authorities, economists, evaluators, ... (p.134)

A single context can bring about contrary effects. Hence the idiocy of the notion of "preestablished context." The people are missing; the work of contextualization is missing. The context is not the spirit of the times, which would penetrate all things equally. (p.137)

In fact, the trajectory of a project depends not on the context but on the people who do the work of contextualizing. (p.150)

The lecture notes for the next meeting will discuss the work done to create infrastructure for technology and science.

The discussion at the top of p.132 makes it clear that theories based on general laws, like Newtonian mechanics, should not be expected to provide sociological explanations. Latour even talks about "love" at some points in this chapter (and in the subtitle of the book), e.g., pp.142, 149. The caption on photo 1 (after p.158) again expresses Latour's opposition to pseudo-scientific explanations of social phenomena. Photo 20 is very sad.


Notes on Class Discussion

I was feeling discouraged that some class members did not believe in the importance of the social for user interface design, so I gave a kind of mini-lecture that began with the importance of the social for engineering in general, first quoting some "wisdom" of experienced engineers, who say things like

In real engineering, there are always competing factors, like safety and cost. So real engineering is about finding the right trade-offs among such factors.

It's the "people factors" that always cause the most trouble.

Engineering is all about finding the right compromises.

I described how Edison and Westinghouse, in their competition over AC vs. DC for mass power distribution, toured the country, giving dramatic demonstrations in theatres. Also, I described how Pasteur, who today would be called a "bioengineer", had to work very hard to get his "germ theory" accepted, and how inoculation was even harder for the public to accept than pasteurization. (Latour has a good book on Pasteur.)

I noted that companies spend a great deal of effort training employees in social skills such as conducting a meeting, ethics, management skills, group skills, etc., and that they claim such skills are essential for advancement. I also noted the recent trend in engineering education to include courses like "Engineering ethics" and "Engineer in Society". UCSD has just started a course in "Team Engineering" (ENG 101).

Then I described a number of large computer disasters, including the $8 billion FAA air traffic control modernization default, the Atlanta Games sports data fiasco, the Denver Airport baggage handling failure, the DoD database modernization default, the London Stock Exchange modernization collapse, and the UK National Health Service information systems disgrace, indicating in each case how social issues were deeply implicated. I also pointed out that these disasters had in common that they were impossible to hide, so that they are in fact the tip of an enormous iceberg of failures, including the near failure of Bank of America in the late 80s, which was never made public.

After that, I described studies from "software economics" showing that for large projects, most of the cost of error correction comes from errors in requirements (approximately 50%), while the smallest part (maybe 5%) comes from errors in coding, and the rest comes from errors in specifying modules (maybe 20%) and in overall system design (maybe 25%). Of course, these figures vary widely among projects. Another important fact is that most projects are cancelled before completion, again usually due to requirements problems. And the final nail in the coffin, so to speak, is that most problems with requirements have a strong social component.

Of course, to see that all this is relevant to user interface design, you have to accept that user interface design is a branch of engineering with a great similarity to software engineering. To me this is obvious, as well as very well supported by experience; I would say that these two fields differ mainly in their scope. So the conclusion of all these arguments is that user interface designers have to take account of social issues, or they are even more likely to fail.

Note that this way of speaking of "social issues" as separate from "technical issues" is highly suspect, because Latour claims they cannot be separated. (However, I think they often can be separated, and in any case it is convenient to speak of "social factors" even though they never exist in isolation.)

After all this, we reviewed the basic semiotics of the previous meeting, and then went a little deeper, with Saussure's idea that signs always come in systems, with examples like the vowel systems of various accents of the same language, and the tense systems for verbs in various languages. The vowel system example shows that the same sign system can be realized in different ways; we call these different models. The vowel system example also shows that two different models of the same sign system can have the same elements but use them in a different way; so it is how elements are used that makes the models different, not the elements themselves. Models of sign systems are not just sets, they are sets with some kind of structure; we will learn more about this later. Alphabets also provide examples where the sets overlap; for example, the Greek, Roman and Cyrillic alphabets. You can't convey any information if your sign system has just one element (technically, the Shannon information content is zero).
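To make that last remark concrete, here is a minimal sketch of the Shannon calculation (in Python; the code and the example probabilities are my own illustration, not part of the readings or the class discussion): the entropy of a sign system measures how much information each sign can carry, and it is zero when the system has only one sign.

    import math

    def shannon_entropy(probabilities):
        """Shannon entropy (in bits) of a distribution over the signs of a system."""
        return sum(-p * math.log2(p) for p in probabilities if p > 0)

    print(shannon_entropy([1.0]))       # one-element sign system: 0.0 bits, no information
    print(shannon_entropy([0.5, 0.5]))  # two equally likely signs: 1.0 bit per sign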

After this discussion, I suggested it may be useful to view sign systems as abstract data types, because the same information can be represented in a variety of different ways; for example, dates, times, and sports scores. This leads naturally to the idea that representations are mappings between abstract data types. I then went over some of the examples in the UCSD Semiotic Zoo, showing how some structure was not being properly preserved by some mapping.
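As a small illustration of this idea (a sketch in Python; the type and function names are my own, not from the Semiotic Zoo), the same abstract date can be given several concrete representations, each of which is a mapping from the abstract data type to strings; a mapping that forgets part of the structure, such as the year, fails to preserve the information, which is the kind of error the Zoo displays.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Date:
        """The abstract data type: what a date is, independent of how it is written."""
        year: int
        month: int
        day: int

    # Two representations: mappings from the abstract type to concrete strings.
    def us_style(d: Date) -> str:
        return f"{d.month}/{d.day}/{d.year}"              # e.g. "2/25/1998"

    def iso_style(d: Date) -> str:
        return f"{d.year:04d}-{d.month:02d}-{d.day:02d}"  # e.g. "1998-02-25"

    # A representation that forgets the year does not preserve the structure.
    def month_day_only(d: Date) -> str:
        return f"{d.month}/{d.day}"

    d = Date(1998, 2, 25)
    print(us_style(d), "|", iso_style(d), "|", month_day_only(d))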

The following is some background on modern research in the sociology of science and technology. In contrast to the 19th century trend of making heroes out of a small number of individuals, late 20th century research often looks for the work that was done to make something happen; in particular, this may be the kind of "infrastructural" work that is usually left out of traditional accounts, e.g., the work of the people who actually build the instruments used in physics experiments, or of the people who somehow obtain the money to build a skyscraper. The research strategy that consciously looks for these omissions is called infrastructural inversion (Susan Leigh Star), and the omissions themselves are called infrastructural submersions. One major example is that the hygiene infrastructure (especially sewers) of cities like London and Paris made possible the medical advances of the mid to late nineteenth century. And of course the experimental work of Newton would have been impossible without the advances in technology that made his experimental apparatus constructible. The same is true of high energy physics today, where (for example) low temperature magnets are an important and highly specialized support for particle accelerators.


25 February 1998