Date: Mon, 20 Jan 1997 07:22:40 -0800
From: Mark Ackerman <ackerman@sima.ICS.UCI.EDU>

COMMUNICATION AND COLLABORATION FROM A CSCW PERSPECTIVE
Mark Ackerman

Human-centered information systems, whether augmented through AI technology or
anything else, need to have at their core a fundamental understanding of how
people work in groups and organizations.  Otherwise, we will produce unusable
systems, badly mechanizing and distorting collaboration and other social
activity.

I think the fundamental technical question for HCI systems is a meta-question:

    How do we deal with the fundamental tension between the 
    capabilities of current computational technologies and people's
    needs for highly nuanced and contextualized information and
    activity?
	
Since many studies (see below) have determined that information-oriented
activity (or any other activity) in its social environment is very nuanced,
emergent, and contextualized, we should further ask ourselves:

    - When can we successfully ignore the need for this nuance and
    context?
    
    - When can we augment human activity with computer technologies
    suitably to make up for the loss in nuance and context (e.g.,
    different time/place benefits in computer-mediated communications over
    face-to-face)? Can these benefits be systematized so that we know when
    we are adding benefit rather than creating loss?
    
    - What types of future research will solve some of the gaps between
    technical capabilities and what people expect in their full range of
    social and collaborative activities?
	
This meta-question arises from my understanding of findings from the
Computer-Supported Cooperative Work (CSCW) area of Human-Computer Interaction.
Below is my summary of these findings.

A biased summary of CSCW research
---------------------------------

Most of this will be obvious to CSCW researchers, but might be a useful place
to start for non-CSCW researchers.  (References for these findings are
available upon request.)

In addition to Simon and March's limited rational actor model, which most of
computer science assumes, CSCW researchers also tend to assume the following:

-   Members of organizations sometimes have differing (and multiple)
    goals, and conflict may be as important as cooperation in resolving
    issues.  Groups and organizations may not have shared goals,
    knowledge, meanings, or histories.  If goals are hidden or
    conflicting, people will resist articulating them concretely.  On the
    other hand, people are good at resolving communicative and activity
    breakdowns.
    
    Without shared meanings or histories, information will lose context as
    it crosses boundaries.  (Sometimes this loss is beneficial, in that it
    hides the unnecessary details of others' work.  Boundary objects,
    which retain enough common meaning to be usable on both sides of a
    boundary, allow two groups to coordinate.)  Finding ways to manage
    these problems and trade-offs is an active area of CSCW research.
    
-   Social activity is fluid and nuanced, and this makes systems
    technically difficult to construct properly and often awkward to use.
    For example, people have very nuanced behavior concerning how and with
    whom they wish to share information, while access control systems
    often have very simple models (the first sketch following this list
    illustrates this mismatch).  As another example, since people often
    lack shared histories and meanings (especially when they are in
    differing groups or organizations), information must be
    recontextualized in order to reuse experience or knowledge.  One
    finding of CSCW is that it is sometimes easier and better to augment
    technical mechanisms with social mechanisms to control, regulate, or
    encourage behavior.
    
-   Exceptions are normal in work processes.  It has been found that much
    of office work is handling exceptional situations.  Additionally,
    roles are often informal and fluid.  CSCW approaches to workflow and
    process engineering primarily try to deal with exceptions and
    fluidity.
    
-   People prefer to know who else is present in a shared space, and they
    use this awareness to guide their work.  For example, air traffic
    controllers monitor others in their workspace to anticipate their
    future workflow.  An active area of research is adding awareness
    (i.e., knowing who is present) and peripheral awareness (i.e.,
    low-level monitoring of others' activity) to shared communication
    systems.  Very recent research is addressing the trade-offs inherent
    in awareness versus privacy, and in awareness versus disturbing others
    (the second sketch following this list shows one crude way to expose
    that trade-off as a user choice).
    
-   Visibility of communication exchanges and of information enables
    learning and greater efficiencies.  For example, co-pilots learn from
    observing pilots work (situated learning, learning in a community of
    practice).  However, people are aware that making their work visible
    may also open them to criticism or to closer management oversight;
    thus, visibility may also make work more formal and reduce sharing.
    A very active area of CSCW is determining ways to manage the
    trade-offs in sharing.  This is tied to the issue of incentives,
    below.
    
-   The norms for using a CSCW system are often actively negotiated among
    users, and they remain subject to re-negotiation.  CSCW systems should
    therefore have some secondary mechanism or communication back-channel
    that lets users negotiate norms of use, exceptions, and breakdowns
    among themselves, making the system more flexible (the third sketch
    following this list shows one such back-channel).
    
-   There appears to be a critical mass problem for CSCW systems: with an
    insufficient number of users, people will not use a CSCW system.  This
    has been found in e-mail, synchronous communication, and calendar
    systems.  There also appears to be a melt-down problem: usage of a
    communication system can collapse if the number of active users falls
    beneath a threshold (the final sketch following this list gives a toy
    model of both dynamics).
    
-   People not only adapt to their systems, they adapt their systems to
    their needs (co-evolution). One CSCW finding is that people will need
    to change their categories over time.  System designers should assume
    that people will try to tailor their use of a system.
    
-   Incentives are critical.  A classic finding in CSCW, for example, is
    that managers and workers may not share incentive or reward
    structures; systems will be less used than desired if this is true.
    Another classic finding is that people will not share information in
    the absence of a suitable organizational reward structure.  Even small
    incremental costs in collaborating must be compensated (either by
    reducing the cost of collaboration or offering derived benefits).
    Thus, many CSCW researchers try to use available data to reduce the
    cost of sharing and collaborative work.
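
A few brief sketches may make some of these points concrete.  They are
illustrative only: the code is Python, and every name, number, and policy
in them is invented for the example rather than drawn from any deployed
system or from the findings above.

First, the access-control mismatch.  A simple model answers only "is this
user on the list?", while people's actual sharing decisions also weigh who
is asking, in what relationship, and for what purpose:

    from dataclasses import dataclass

    @dataclass
    class Request:
        requester: str      # who is asking
        relationship: str   # e.g., "teammate", "manager", "outsider"
        purpose: str        # e.g., "joint-work", "audit", "curiosity"

    # Simple model: a flat list of authorized users.
    ACL = {"alice", "bob"}

    def acl_allows(req: Request) -> bool:
        return req.requester in ACL

    # Slightly more nuanced (but still crude) model: the decision depends
    # on the requester's relationship to the owner and their purpose.
    def contextual_allows(req: Request) -> bool:
        if req.relationship == "teammate" and req.purpose == "joint-work":
            return True
        if req.relationship == "manager":
            # People often share upward selectively; a real system might
            # return a summary here instead of raw access (not shown).
            return False
        return False

    req = Request("carol", "teammate", "joint-work")
    print(acl_allows(req), contextual_allows(req))  # False True: they disagree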
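
Second, presence awareness with a user-chosen disclosure level, one crude
way to let each person trade awareness against privacy:

    from enum import Enum

    class Disclosure(Enum):
        INVISIBLE = 0   # reveal nothing at all
        PRESENCE = 1    # reveal only present/absent
        ACTIVITY = 2    # reveal coarse activity, for peripheral awareness

    class Participant:
        def __init__(self, name, level):
            self.name = name
            self.level = level
            self.activity = "idle"

        def report(self):
            # Return only as much as this person has chosen to disclose.
            if self.level is Disclosure.INVISIBLE:
                return None
            if self.level is Disclosure.PRESENCE:
                return f"{self.name}: present"
            return f"{self.name}: {self.activity}"

    room = [Participant("ann", Disclosure.ACTIVITY),
            Participant("ben", Disclosure.PRESENCE),
            Participant("cam", Disclosure.INVISIBLE)]
    room[0].activity = "editing section 2"

    for line in filter(None, (p.report() for p in room)):
        print(line)   # ann's activity and ben's presence; nothing for cam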
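
Third, a communication back-channel attached to a shared artifact, so that
a rigid rule (here, a lock) becomes a starting point for negotiation rather
than a dead end:

    class SharedDocument:
        def __init__(self, name):
            self.name = name
            self.locked_by = None
            self.backchannel = []   # free-text negotiation messages

        def edit(self, user):
            if self.locked_by and self.locked_by != user:
                # Rather than failing silently, point the user at the
                # back-channel, where the norm or an exception to it can
                # be negotiated among the people involved.
                return f"locked by {self.locked_by}; post to the back-channel"
            self.locked_by = user
            return "editing"

        def discuss(self, user, message):
            self.backchannel.append((user, message))

    doc = SharedDocument("budget")
    print(doc.edit("alice"))   # editing
    print(doc.edit("bob"))     # locked; bob is sent to the back-channel
    doc.discuss("bob", "Alice, can I take the lock? My deadline is today.")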
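
Finally, a toy model of critical mass and melt-down: each person keeps
using the system only if enough of their contacts are active, so usage
either persists or unravels depending on the starting base.  The population
size, contact structure, and threshold are all arbitrary:

    import random

    random.seed(1)
    N = 100                    # population size
    NEED = 5                   # active contacts needed to keep using it
    contacts = {i: random.sample([j for j in range(N) if j != i], 10)
                for i in range(N)}

    def run(initial_active):
        active = set(random.sample(range(N), initial_active))
        while True:
            # Each person stays only if enough contacts are still active;
            # departures can therefore cascade.
            keep = {i for i in active
                    if sum(c in active for c in contacts[i]) >= NEED}
            if keep == active:
                return len(active)
            active = keep

    print(run(30))   # below critical mass: usage unravels toward zero
    print(run(90))   # above critical mass: a large user base persists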
	
Not every researcher would agree with all of the above assumptions and
findings, and some commercial systems (e.g., workflow systems) sacrifice one
or more of them.  Indeed, we do not know how to produce working systems that
adhere to all of these assumptions; any successful system, commercial or
research, must relax one or more of them.  The list nonetheless provides a
first-order ideal of what should be provided, again with the proviso that
some of the idealization must be ignored to produce a working solution.
This trade-off, of course, produces much of the tension in any given
implementation between "technically working" and "organizationally
workable" systems.  CSCW as a field is notable for its attention to
managing this tension.


Biography
---------

Mark S. Ackerman is an Assistant Professor in the CORPS group of the
Information and Computer Science Department at the University of California,
Irvine.  Dr.  Ackerman received his Ph.D. from MIT in Information Technologies
in 1993.  Prior to attending MIT, he was an R&D software engineer and manager,
working on projects as diverse as home banking, the X Window System Toolkit
(Xt), and the Atari Ms. Pac-Man game.  His areas of interest include
human-computer interaction, computer-supported cooperative work, collaborative
memory, information spaces, computer-mediated communication environments, and
the sociology of computing systems.

This work was done under DARPA contract #N66001-97-M-0157, administered by the
US Navy.
