CS631p Human Computer Interaction
Spring 2001
Lecture 2

CONTENTS

Design and Evaluation of Software

Discipline of Human-Computer Interaction Design

Evaluating systems and their user interfaces

Five approaches to system & interface evaluation

1. Heuristic evaluation with usability guidelines
2. Cognitive Walkthroughs
3. Usability testing
4. Usability Engineering
5. Controlled Experiments

Evaluating Evaluation

Theory-based design

Preserving Design Rationale

Usability Design Process (Gould)


Design and Evaluation of Software

Past: Present:
  • Interactive Design?
  • What skills are required to design for users?

    Discipline of Human-Computer Interaction Design

    Questions asked:
  • How can knowledge of technology and user's needs be synthesized into an appropriate design?
  • What process should be used?
  • How can it be done effectively?


    Answers are found in:

  • Principles - collections of statements that advise the designer on how to proceed
  • Guidelines - collections of tests that can be applied to an interface to determine whether it is acceptable
  • Methodologies - formalized procedures that are believed to guide and structure the process of effective design

    Design Principles

    Hansen (1971) - Four principles:
    1. Know the user
    2. Optimize operations
    3. Minimize memorization
    4. Engineer for errors
    Rubinstein & Hersh (1984) - 93 principles

    Heckel (1991) - 30 design elements

    Shneiderman (1992) - Eight golden rules:

    1. Strive for consistency
    2. Enable frequent users to use shortcuts
    3. Offer informative feedback
    4. Design dialogues to yield closure
    5. Offer simple error handling
    6. Permit easy reversal of actions
    7. Support internal locus of control
    8. Reduce short-term memory load
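    Rule 6, easy reversal of actions, is commonly implemented with an undo stack. A minimal sketch, assuming a hypothetical TextBuffer class (not from the source):

```python
# Sketch of "permit easy reversal of actions" via an undo stack.
# TextBuffer and its operations are hypothetical illustration only.
class TextBuffer:
    def __init__(self):
        self.text = ""
        self._undo = []                # snapshots of prior states

    def insert(self, s):
        self._undo.append(self.text)   # save state before the action
        self.text += s

    def undo(self):
        if self._undo:                 # reversal is always one step away
            self.text = self._undo.pop()

buf = TextBuffer()
buf.insert("Hello")
buf.insert(", world")
buf.undo()
print(buf.text)   # back to "Hello"
```

    Because every action records the state it replaces, the user never faces an irreversible step, which also supports rule 7 (internal locus of control).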
    Gould and Lewis (1985) - Principles:

    Design Methodologies

    Gould's design process
    Informal methodology - 4 Phases:
    NOTE:
    - not a "cookbook" approach
    - others have their own design processes; all are variations on a theme

    Nielsen (1993) develops a "usability engineering lifecycle model" - 11 stages:

    1. Know the user
    2. Competitive analysis
    3. Setting usability goals & financial impact analysis
    4. Parallel design
    5. Participatory design
    6. Coordinated design of the total interface
    7. Apply guidelines and heuristic analysis
    8. Prototyping
    9. Empirical testing
    10. Iterative design (capture design rationale)
    11. Collect feedback from field use
    Nielsen's comments:

    Lewis & Rieman (1993) - shareware/internet book - task-centered design process

    Their comments:

    Getting to know users and their tasks

    Lewis & Rieman -

    Note: HCI did not come first - human factors came first.

    Task analysis - a study of what an operator is required to do, in terms of actions and/or cognitive processes, to achieve a goal.
     
     

    Idea Generation


    Envisionment and Prototyping

    Envisionment of interfaces - the formulation and exploration of system and interface ideas and metaphors at a very early stage of the design process.

    Methods of envisionment - stories, scenarios, storyboards, flipbooks, animated drawings, cutout animation, animation of real objects, computer animation, computer-scripted interactive prototypes.
     

    The Role of Metaphor

    Def: a figure of speech in which a word or phrase denoting one kind of object or action is used in place of another to suggest a likeness or analogy between them

    Metaphors help users understand a target domain they don't yet understand in terms of a source domain they already understand
    (e.g. the typewriter is the source domain for word processors)

    Source → Target
      Evaluating systems and their user interfaces

    Joseph McGrath (1994) - framework for understanding methods of evaluation:

    "the phenomenon of interest involves states and actions of human systems - of individuals, groups, organizations, and larger social entities - and by-products of those associations"

    Terminology: Taxonomy of research strategies
    1. Field strategies - study systems in use on real tasks, in real work settings
      1. Field studies - observing without intervening
      2. Field experiment - observe impact of changing aspect of environment of system (e.g. beta testing products)
    2. Experimental strategies - carried out in the laboratory
      1. Experimental simulations - create a realistic system in the lab, used for experimental purposes by real users
      2. Laboratory experiments - controlled experiments used to study the impact of a particular parameter
    3. Respondent strategies
      1. Judgment studies - responses from a small set of judges, designed to give information about the stimulus (e.g. Delphi methods)
      2. Sample surveys - responses from large set of respondents about respondents (e.g. questionnaires)
    4. Theoretical strategies
      1. Formal theory - gives qualitative insights (e.g. theory of vision)
      2. Computer simulation - run on a computer to derive predictions about system performance
    Five approaches to system & interface evaluation
    1. Heuristic evaluation with usability guidelines

    Nielsen (1994) - 10 design heuristics
      (based on a factor analysis of 249 usability problems from 11 projects):

      1. Visibility of system status
      2. Match between system and real world (e.g. speak user's language)
      3. User Control and Freedom (e.g. easy exit, undo)
      4. Consistency and standards (e.g. same words mean same thing in different contexts)
      5. Error prevention
      6. Recognition rather than recall (e.g. options and actions should be visible)
      7. Flexibility and efficiency of use (e.g. accelerators, short cuts, customization)
      8. Aesthetic and minimalist design
      9. Help users recognize, diagnose, and recover from errors (e.g. error messages in plain language)
      10. Help and Documentation
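    In practice, a heuristic evaluation produces a list of problems, each tagged with the heuristic it violates and a severity rating. A minimal sketch of tallying such findings; the problems, descriptions, and 0-4 severity scale below are hypothetical examples, not data from the source:

```python
# Sketch of recording and tallying heuristic-evaluation findings.
# All problem entries are hypothetical; severity runs 0 (not a
# problem) to 4 (usability catastrophe).
from collections import Counter

problems = [
    # (heuristic violated, severity, short description)
    ("Visibility of system status", 3, "No progress bar during save"),
    ("Error prevention", 4, "Delete has no confirmation"),
    ("Consistency and standards", 2, "'Exit' vs 'Quit' used interchangeably"),
    ("Error prevention", 2, "Date field accepts free text"),
]

# Tally how many problems each heuristic caught, most frequent first.
counts = Counter(h for h, _, _ in problems)
for heuristic, n in counts.most_common():
    worst = max(s for h, s, _ in problems if h == heuristic)
    print(f"{heuristic}: {n} problem(s), worst severity {worst}")
```

    Sorting by frequency and worst severity helps prioritize which fixes to make first.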
    2. Cognitive Walkthroughs
    3. Usability testing
    4. Usability Engineering
       - user testing that is more formal, in the sense that interface specialists set explicit quantitative performance goals known as metrics
       - e.g. new users must be able to create and save forms in their first 10 minutes
       - arguments for explicit quantitative goals:
         - human factors engineers are taken more seriously in an engineering setting because they adopt the quantitative, objective goals familiar to engineers
         - progress can be charted and success recognized
    5. Controlled Experiments
    Evaluating Evaluation:
    So far, the best methods are:


    Possible Uses of Evaluation Methods in a Development Process:


    Theory-based design

    Preserving Design Rationale

    Recording the design history and its rationale could serve several purposes:

    Usability Design Process (Gould)

    General Observations about System Design:

    Steps in Designing a Good System
    1. Define the problem the customer wants solved.
    2. Identify the tasks users want performed.
    3. Learn user capabilities.
    4. Learn hardware/software constraints.
    5. Set specific usability targets.
    6. Sketch out user scenarios.
    7. Design and build prototype.
    8. Test prototype.
    9. Iteratively identify, incorporate, and test changes until usability targets are met.
    10. Install system
    11. Measure customer reaction and acceptance.
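    The test-and-iterate loop of steps 7-9 can be sketched as follows; the severity threshold and the per-round findings are hypothetical placeholders, not from the source:

```python
# Sketch of "test the prototype, revise, and retest until targets met".
# Findings and the severity-3 threshold are hypothetical.
def severe_problems(test_findings):
    """Count findings rated severity 3 or higher (must be fixed)."""
    return sum(1 for severity in test_findings if severity >= 3)

# Severity ratings observed in each round of prototype testing.
test_rounds = [
    [4, 3, 3, 1],   # prototype 1
    [3, 2, 1],      # prototype 2: fixes introduced, retested
    [1, 1],         # prototype 3: no severe problems remain
]
for version, findings in enumerate(test_rounds, start=1):
    remaining = severe_problems(findings)
    if remaining == 0:
        print(f"Prototype {version}: targets met -- install system")
        break
    print(f"Prototype {version}: {remaining} severe problem(s) -- iterate")
```

    The point of the sketch is the loop structure: each round feeds its findings back into the next prototype, and installation happens only once the exit criterion is satisfied.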
    Principles of Usability Design

    Principle 1: Early and Continual Focus on Users

    Example: Twenty experimental participants familiar with the IBM PC but unfamiliar with query languages will receive 60 minutes training using the new online query training system for novice users. They will then perform nine experimental tasks.

    On task 1, 85% of tested users must complete it successfully in less than 15 minutes, with no help from the experimenter. They may use all reference and help materials, but no quick help. Task 1 consists of 6 steps:

      1. Create a query on the displayed query panel using the table SCHOOL.COURSES.
      2. Delete all column names except COURSE and TITLE.
      3. Save the current query with the name CTITLE.
      4. Run the query once.
      5. Get current query panel displayed.
      6. Clear current query panel so it contains nothing.
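    Checking such a benchmark against test data is a simple pass-rate computation. A sketch using the 85%/15-minute criterion from the example above; the completion times themselves are hypothetical:

```python
# Sketch of evaluating the Task 1 benchmark: at least 85% of the 20
# tested users must finish in under 15 minutes. Times are hypothetical.
completion_minutes = [
    8.2, 11.5, 9.7, 14.1, 12.3, 7.9, 13.8, 10.2, 16.4, 9.1,
    12.7, 14.9, 8.8, 11.0, 13.2, 10.6, 9.4, 17.2, 12.1, 11.8,
]

passed = sum(1 for t in completion_minutes if t < 15.0)
rate = passed / len(completion_minutes)
print(f"{passed}/{len(completion_minutes)} users passed ({rate:.0%})")
print("Target met" if rate >= 0.85 else "Target missed")
```

    Stating the goal this precisely is what makes progress chartable: each test round yields a single number to compare against the target.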
      Checklist for Achieving Early and Continual Focus on Users
    __
    We defined a major group of potential users.
    __
    We talked with the users about the good and bad points of their present job and system.
    __
    Our preliminary system design discussions always kept in mind the characteristics of these users.
    __
    We watched these users performing their present jobs.
    __
    We asked them to think aloud as they worked.
    __
    We tried their jobs.
    __
    We did a formal task analysis.
    __
    We developed testable behavioral target goals for our proposed system.

     

    Principle 2: Early - and Continual - User Testing

      Checklist for Achieving Early User Testing
    __
    We made informal, preliminary sketches of a few user scenarios -- specifying exactly what the user and system messages will be -- and showed them to a few prospective users.
    __
    We have begun writing the user manual, and it is guiding the development process.
    __
    We have used simulations to try out functions and organization of the user interface.
    __
    We have done early demonstrations.
    __
    We invited as many people as possible to comment on the on-going instantiations of all usability components.
    __
    We had prospective users think aloud as they use simulations, mock-ups, and prototypes.
    __
    We use hallway-and-storefront methods.
    __
    We used computer conferencing forums to get feedback on usability.
    __
    We did formal prototype user testing.
    __
    We compared our results to established behavioral target goals.
    __
    We met our behavioral benchmark targets.
    __
    We let motivated people try to find bugs in our system.
    __
    We did field studies.
    __
    We did follow-up studies on people who are now using the system we made.

     

    Principle 3: Iterative Design

      Checklist for Carrying out Iterative Design
    __ All aspects of usability could be easily changed, i.e., we had good tools.
    __ We regularly changed our system, manuals, etc., based upon testing results with prospective users.

     

    Principle 4: Integrated Design

      Checklist for Achieving Integrated Design
    __
    We considered all aspects of usability in our initial design.
    __
    One person had responsibility for all aspects of usability.
    __
    User Manual
    __
    Manuals for subsidiary groups (e.g. operators, trainers, etc.)
    __
    Identification of required functions.
    __
    User Interfaces.
    __
    Assure adequate system reliability and responsiveness.
    __
    Outreach programs (e.g. help system, training materials, hot-lines, videotapes, etc.)
    __
    Installation.
    __
    Customization.
    __
    Field Maintenance.
    __
    Support-group users.