CSE 271: User Interface Design: Social and Technical Issues
5. CSCW Issues

Chapter 14 of Shneiderman is short, but it contains several statements that strongly support the CSCW position, including

The introversion and isolation of early computer users has given way to lively online communities of busily interacting dyads and bustling crowds of chatty users. (p. 478)

Computing has become a social process. (p. 502)

Teams of people often work together and use complex shared technology. (p. 494)

The first sentence above is the first sentence of this chapter, and the second is the first sentence of the practitioner's summary. Although Shneiderman may not realize how far-reaching the implications of such statements really are, there are many valuable things in this chapter. The four-element table on page 481 gives a basic starting point for considering any communication system. The short list of things that can cause cooperative systems to fail (on page 481) is good, though necessarily incomplete:
disparity between who does the work and who gets the benefit; threats to existing political power structures; insufficient critical mass of users who have convenient access; violation of social taboos; and rigidity that counters common practice or prevents exception handling.
Today there are many case studies supporting these (and other) points, and much could be written on this area.

The Coordinator (by Flores, Winograd, et al.) is mentioned on p. 484, but without citing the CSCW papers that analyze why it failed. One major reason is that people very often want to mitigate speech acts in order not to seem rude (see the online notes on speech acts), and having explicit speech act labels on messages forces them either to seem rude or to risk being misunderstood. In fact, many users really hated the system, despite having undergone extensive "training" in its use.

"The dangers of junk electronic mail, ...users who fail to be polite, ... electronic snoopers, ... calculating opportunists" are mentioned (p. 485), but the importance of such difficult social factors probably deserves more emphasis.

For synchronous distributed systems, ownership and control are noted as important concerns (p. 488) but without much analysis of why. Much more could be said on both of these topics. MUDs are mentioned (p. 490) but their growing use in business is not noted. For example, the ACS (Academic Computing Service) at UCSD uses a MUD to coordinate its work.

The potential problems with video listed on page 491 should be noted:

slow session startup and exit; difficulty of identifying speakers (the issue here is not just "distracting audio"); difficulty of making eye contact; changed social status; small image size; and potential invasion of privacy.
The importance of a good audio channel is discussed on p. 493, but without saying why it is important. The reason, confirmed by experimental studies, is that oral narrative provides the context within which video is interpreted; you can also verify this by observing how sound tracks are used in movies.

The table on page 503 is very useful, but I would have liked to see more. The pluses and minuses of anonymity in various situations are discussed in several places and can be very important. The source of weakness in this chapter may be suggested by a sentence at the very end (p. 504):

Although user-interface design of applications will be a necessary component, the larger and more difficult research problems lie in studying the social processes.
Thus Shneiderman seems to separate user interface design from the social processes that interfaces are supposed to support; this is not the point of view taken by modern CSCW researchers, and indeed, similar separations can be blamed for many of the failures that occur in developing large and/or complex systems.

Results about the structure of interaction from the area of ethnomethodology known as conversation analysis (as discussed in Techniques for Requirements Elicitation) can have interesting applications to video conferencing and similar media. Concepts of particular interest include negotiation for turn taking, interrupts, repair, and discourse unit boundaries. One important point is that a long enough delay (perhaps as little as 0.1 second) can cause large disruptions, due to our expectations about the meaning of gaps in conversation. Another point is that separate video images of individuals or groups, especially when there are many of them, can frustrate our expectations about co-presence, such as our expectation that we and other participants have the ability to monitor attention, and to conduct effective recipient design.

The Importance of Social Issues for User Interface Design

The following is a general essay on the importance of social issues for user interface design; this can be seen as part of the general point that it is always artificial to separate social and technical issues. Let's start with the importance of the social for engineering in general, looking at some "wisdom" quotes from experienced engineers, who say things like

In real engineering, there are always competing factors, like safety and cost. So real engineering is about finding the right trade-offs among such factors.

It's the "people factors" that always cause the most trouble.

Engineering is all about finding the right compromises.

There are many interesting historical cases to illustrate these points. For example, Edison and Westinghouse, in their fierce competition over whether AC or DC should be used for mass power distribution, toured the country, giving dramatic demonstrations in theatres. Also Pasteur, who today would be called a "bioengineer", had to work very hard to get his "germ theory" accepted, and then found that inoculation was even harder for the public to accept than "pasteurization" had been. (Bruno Latour has written an excellent book on Pasteur.)

Companies spend a great deal of effort training employees in social skills such as how to conduct a meeting, ethics, management skills, group skills, etc., and many say that such skills are essential for advancement. Also note that there is a recent trend in engineering education to include courses like "Engineering Ethics" and "Engineer in Society". For example, UCSD has introduced a course in "Team Engineering" (ENG 101).

One can also cite many large computer disasters, such as the $8 billion FAA air traffic control modernization default, the Atlanta Games sports data fiasco, the Denver Airport baggage system failure, the default on database modernization for DoD, the London Stock Exchange modernization collapse, and the UK National Health Service information systems disgrace; in each case, social issues were very deeply implicated. What all these disasters had in common is that they were impossible to hide, which suggests that they are in fact the tip of an enormous iceberg of failures which were never made public, such as the near failure of Bank of America in the late 80s.

Another source of data is studies from "software economics" showing that for large projects, most of the cost of error correction comes from errors in requirements (approximately 50%), while the smallest part (maybe 5%) comes from errors in coding, and the rest comes from errors in specifying modules (maybe 20%) and in overall system design (maybe 25%). Of course, these figures vary widely among projects. Another important fact is that most projects are cancelled before completion, again usually due to requirements problems. It is therefore very significant that the most serious problems with requirements have a strong social component.
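The cost breakdown cited above can be made concrete with a small sketch. The figures and the $1,000,000 budget below are illustrative assumptions only; as noted, these shares vary widely among projects.

```python
# Rough error-correction cost shares cited above (illustrative only;
# the text stresses that these figures vary widely among projects).
error_cost_share = {
    "requirements": 0.50,           # ~50% of correction cost
    "overall system design": 0.25,  # ~25%
    "module specification": 0.20,   # ~20%
    "coding": 0.05,                 # ~5%
}

# Sanity check: the cited shares account for the whole budget.
assert abs(sum(error_cost_share.values()) - 1.0) < 1e-9

# Implied dollar allocation for a hypothetical $1,000,000
# error-correction budget, largest source first.
total_budget = 1_000_000
allocation = {src: share * total_budget
              for src, share in error_cost_share.items()}
for src, dollars in sorted(allocation.items(), key=lambda kv: -kv[1]):
    print(f"{src:22s} ${dollars:>9,.0f}")
```

The point of the exercise is simply that, on these figures, an error caught at the requirements stage accounts for ten times the correction cost of all coding errors combined, which is why the social character of requirements work matters so much.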

Of course, to see that all this is relevant to user interface design, you have to accept that user interface design is a branch of engineering with a great similarity to software engineering. To me this seems obvious, and it is also very well supported by experience; I would say that these two fields differ mainly in their scope. So the conclusion of all these arguments is that user interface designers must take account of social issues, or they are much more likely to fail.

A couple of years ago, I attended a meeting at UCSD which discussed the so-called "digital divide", and was surprised at some of the opinions expressed. One person seemed to think that just throwing technology at the problem could solve it, or at least make a major contribution, noting that very large amounts of grant money are available for this. Another wanted an extensive survey of the views of people currently without computer access on what they wanted. (But another participant recalled a recent large survey of this kind done in Chicago, which found that the top two items for what poor people there wanted were: (1) some way to ensure that their trash would be picked up; and (2) inexpensive child care. Alas, these are not the kinds of thing that the internet is good at.) Several wanted a detailed survey of organizations involved with computer literacy, and of facilities currently available; in fact, this already exists for San Diego. Others wanted to emphasize training through grass roots community organizations.

Only one participant seemed to realize that information and services on the web in their current form are not suitable for many people who could certainly benefit from them if they could access them. This person cited a study done in Birmingham about a project to provide information about cancer over the web. It seems that many people were intimidated or confused by the way all this information was organized and the language and social conventions that were used, and as a result jumped to sometimes unwarranted conclusions, such as that they had learned for sure that they were going to die of cancer soon.

Note that to speak of "social issues" as separate from "technical issues" is questionable, and many modern social scientists (such as Bruno Latour and Leigh Star) claim that they really cannot be separated. However, they are often separated "for some immediate practical purpose," and of course it is convenient to speak of "social factors" even though they never exist in isolation, as long as one realizes the limitations of this way of speaking.

The following is a little background on modern research in the sociology of science and technology. In contrast to the 19th century trend of making heroes of a small number of individuals, recent research often looks for the work that was done to make something happen, and in particular, at the kind of "infrastructural" work that is usually left out in traditional accounts, e.g., the work of people who actually build the instruments that are used in physics experiments, or the people who somehow obtain the money to build a skyscraper. The research strategy that consciously looks for these omissions is called infrastructural inversion (Susan Leigh Star), and I have suggested that the omissions themselves should be called infrastructural submersions. One major example is that the hygiene infrastructure (especially sewers) of cities like London and Paris made possible the medical advances of the mid to late nineteenth century. And of course the experimental work of Newton would have been impossible without the advances in technology that made his experimental apparatus constructable. The same is true of high energy physics today, where (for example) low temperature magnets are an important and highly specialized infrastructure for particle accelerators.
Maintained by Joseph Goguen
© 2000, 2001 Joseph Goguen, all rights reserved.
Last modified: Thu May 2 15:10:09 PDT 2002