CS615 – Software Engineering I

Lecture 6


Chapter 12: User Interface Design

Golden Rules

User Interface Design Models

User Interface Design Process (Spiral Model)

1.    User, task, and environment analysis and modeling

o        Where will the interface be located physically?

o        Will the user be sitting, standing, or performing other tasks unrelated to the interface?

o        Does the interface hardware accommodate space, light, or noise constraints?

o        Are there special human factors considerations driven by environmental factors?

2.    Interface design

o        define a set of interface objects and actions (and their screen representations) that enable a user to perform all defined tasks in a manner that meets every usability goal defined for the system

3.    Interface construction

4.    Interface validation

o        the ability of the interface to implement every user task correctly, to accommodate all task variations, and to achieve all general user requirements

o        the degree to which the interface is easy to use and easy to learn

o        the users' acceptance of the interface as a useful tool in their work

Task Analysis and Modeling

Interface Design Activities

1.    Establish the goals and intentions of each task

2.    Map each goal/intention to a sequence of specific actions (objects and methods for manipulating objects)

3.    Specify the action sequence of tasks and subtasks (user scenario)

4.    Indicate the state of the system at the time the user scenario is performed

5.    Define control mechanisms

6.    Show how control mechanisms affect the state of the system

7.    Indicate how the user interprets the state of the system from information provided through the interface
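Steps 1-2 above (mapping each goal to a sequence of object/method actions) can be sketched as a small data model. This is a minimal illustration; the task, object, and method names are invented, not from the text.

```python
from dataclasses import dataclass, field

@dataclass
class Action:
    """One concrete interface action: an object plus a method for manipulating it."""
    obj: str      # e.g. "file menu"
    method: str   # e.g. "click"

@dataclass
class Task:
    """A user task: a goal mapped to an ordered action sequence (a user scenario)."""
    goal: str
    actions: list = field(default_factory=list)

# Hypothetical example: the goal "open a report" mapped to specific actions.
open_report = Task(
    goal="open a report",
    actions=[Action("file menu", "click"),
             Action("Open... item", "select"),
             Action("report file", "double-click")],
)

# Step 3's "action sequence" falls out as a readable scenario.
scenario = [f"{a.method} {a.obj}" for a in open_report.actions]
```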

Interface Design Issues

User Interface Implementation Tools

User Interface Evaluation Cycle

1.    Preliminary design

2.    Build first interface prototype

3.    User evaluates interface

4.    Evaluation studied by designer

5.    Design modifications made

6.    Build next prototype

7.    If interface is not complete then go to step 3
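The cycle above is a loop with its exit test at step 7. A minimal sketch of that control flow, with hypothetical function names and a toy list of seeded usability problems (each revision fixes one more problem):

```python
def evaluation_cycle(build_prototype, user_evaluate, revise, max_rounds=10):
    """Drive the prototype/evaluate/revise loop (steps 2-7 above) until the
    interface is judged complete or a round limit is reached."""
    prototype = build_prototype(None)                    # step 2: first prototype
    for round_no in range(1, max_rounds + 1):
        complete, findings = user_evaluate(prototype)    # step 3: user evaluation
        if complete:                                     # step 7: exit test
            return prototype, round_no
        design = revise(findings)                        # steps 4-5: study, modify
        prototype = build_prototype(design)              # step 6: next prototype
    return prototype, max_rounds

# Toy stand-ins for the three callbacks.
problems = ["confusing labels", "no undo", "slow feedback"]

def build(design):
    return {"fixed": design or []}

def evaluate(proto):
    remaining = [p for p in problems if p not in proto["fixed"]]
    return (not remaining, remaining)

def revise(findings):
    # Each revision fixes one additional problem from the seeded list.
    return problems[: len(problems) - len(findings) + 1]

final, rounds = evaluation_cycle(build, evaluate, revise)
```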

User Interface Design Evaluation Criteria


Understanding Interfaces

Don Norman's The Design of Everyday Things

Analyzes common objects (door, toaster, VCR, telephone)

Concepts of Good/Bad Design

Affordances - perceived properties of an artifact that determine how it can be used (e.g., knobs, buttons, slots)

Constraints - physical, semantic, cultural, and logical factors that encourage proper actions

Conceptual model - mental model of the system which allows users to:

- understand the system
- predict the effects of actions
- interpret results

Mappings - describe the relationship between controls and their effects on the system

Visibility - the system shows you the conceptual model by showing its state and the actions that can be taken

Feedback - information about the effects of the user's actions


Norman's Seven Stages of Action that explain how people do things:

1.    Form a goal

2.    Form the intention

3.    Specify an action

4.    Execute the action

5.    Perceive the state of the world

6.    Interpret the state of the world

7.    Evaluate the outcome
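The seven stages form an execution/evaluation loop: stages 1-4 cross the gulf of execution, stages 5-7 the gulf of evaluation, and the cycle repeats until the goal is met. The volume-adjustment scenario below is an invented illustration, not from Norman:

```python
def seven_stages(state, goal):
    """Walk Norman's seven stages for a toy task: moving a volume level toward
    a goal, one step per cycle."""
    trace = []
    while True:
        trace.append("form goal")                     # 1: form a goal
        if state == goal:                             # goal already satisfied
            break
        trace.append("form intention")                # 2: form the intention
        step = 1 if state < goal else -1              # 3: specify an action
        trace.append("specify action")
        state += step                                 # 4: execute the action
        trace.append("execute action")
        perceived = state                             # 5: perceive world state
        trace.append("perceive state")
        interpreted = perceived                       # 6: interpret the state
        trace.append("interpret state")
        done = interpreted == goal                    # 7: evaluate the outcome
        trace.append("evaluate outcome")
        if done:
            break
    return state, trace

final_state, trace = seven_stages(state=3, goal=5)
```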

Norman's Prescription for User-Centered Design:

1.    Make it easy to evaluate the current state of the system

2.    Follow natural mappings

o        between actions and effects

o        between visible information and interpretation of system state

Norman's Questions about artifacts:

1.    What is the system's function?

2.    What system actions are possible?

3.    What are the mappings from intention to execution?

4.    Does the device inform the user about what state it is in?

5.      Can the user tell what state the system is in?


Design and Evaluation of Software



What is Interactive Design?

What skills are required to design for users?

Discipline of Human-Computer Interaction Design

Questions asked:

  How can knowledge of technology and user's needs be synthesized into an appropriate design?

  What process should be used?

  How can it be done effectively?


Answers are found in:

  Principles - collections of statements that advise the designer on how to proceed

  Guidelines - collections of tests that can be applied to an interface to determine whether it is acceptable

  Methodologies - formalized procedures that are believed to guide and structure the process of effective design

Design Principles

Hansen (1971) - Four principles:

  1. Know the user
  2. Optimize operations
  3. Minimize memorization
  4. Engineer for errors

Rubinstein & Hersh (1984) - 93 principles

Heckel (1991) - 30 design elements

Shneiderman (1992) - Eight golden rules:

  1. Strive for consistency
  2. Enable frequent users to use shortcuts
  3. Offer informative feedback
  4. Design dialogues to yield closure
  5. Offer simple error handling
  6. Permit easy reversal of actions
  7. Support internal locus of control
  8. Reduce short-term memory load
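Rule 6 (permit easy reversal of actions) is commonly implemented with an undo stack, in which every action records its own inverse. A minimal sketch with an invented Editor class:

```python
class Editor:
    """Toy text editor whose insert action is reversible (golden rule 6)."""

    def __init__(self):
        self.text = ""
        self._undo_stack = []   # inverse operations, newest last

    def insert(self, s):
        start = len(self.text)
        self.text += s
        # Record the inverse operation so the action can be reversed.
        self._undo_stack.append(lambda: self._delete_range(start, len(s)))

    def _delete_range(self, start, length):
        self.text = self.text[:start] + self.text[start + length:]

    def undo(self):
        # Quietly ignore undo with nothing to reverse (simple error handling,
        # in the spirit of rule 5).
        if self._undo_stack:
            self._undo_stack.pop()()

ed = Editor()
ed.insert("Hello")
ed.insert(", world")
ed.undo()          # reverses the most recent insert
```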

Gould and Lewis (1985) - Principles:


Design Methodologies

Gould's design process

Informal methodology - 4 Phases:

- not a "cookbook" approach
- others have design processes; all are variations on the theme

Nielsen (1993) develops a "usability engineering lifecycle model" - 11 stages:

  1. Know the user
  2. Competitive analysis
  3. Set usability goals & perform financial impact analysis
  4. Parallel design
  5. Participatory design
  6. Coordinated design of the total interface
  7. Apply guidelines and heuristic analysis
  8. Prototyping
  9. Empirical testing
  10. Iterative design to capture design rationale
  11. Collect feedback from field use

Nielsen's comments:

Lewis & Rieman (1993) - shareware/internet book - Task-centered design process

Their comments:

Getting to know users and their tasks

Lewis & Rieman -

Note: HCI did not come first - human factors came first. Task analysis: a study of what an operator is required to do, in terms of actions and/or cognitive processes, to achieve a goal.

Idea Generation

Envisionment and Prototyping

Envisionment of interfaces - the formulation and exploration of system and interface, ideas and metaphors at a very early stage of the design process.

Methods of envisionment - stories, scenarios, storyboards, flipbooks, animated drawings, cutout animation, animation of real objects, computer animation, computer-scripted interactive prototypes.

The Role of Metaphor

Def: a figure of speech in which a word or phrase denoting one kind of object or action is used in place of another to suggest a likeness or analogy between them

Metaphors help users understand a target domain they don't understand in terms of a source domain they already understand
(e.g. typewriter - word processors)

Source domain → Target domain


Evaluating systems and their user interfaces

Joseph McGrath (1994) - framework for understanding methods of evaluation:

"the phenomenon of interest involves states and actions of human systems - of individuals, groups, organizations, and larger social entities - and the by-products of those associations"


Taxonomy of research strategies

  1. Field strategies - study systems in use on real tasks, in real work settings
    1. Field studies - observing without intervening
    2. Field experiments - observe the impact of changing some aspect of the environment or system (e.g. beta testing products)
  2. Experimental strategies - carried out in a laboratory
    1. Experimental simulations - create a real system in the lab for experimental purposes, used by real users
    2. Laboratory experiments - controlled experiments used to study the impact of a particular parameter
  3. Respondent strategies
    1. Judgment studies - responses from a small set of judges, designed to give information about the stimulus (e.g. Delphi methods)
    2. Sample surveys - responses from a large set of respondents, about the respondents (e.g. questionnaires)
  4. Theoretical strategies
    1. Formal theory - gives qualitative insights (e.g. a theory of vision)
    2. Computer simulation - run on a computer to derive predictions about system performance

Five approaches to system & interface evaluation

  1. Heuristic evaluation with usability guidelines

Nielsen (1994) - 10 design heuristics:
(Based on a factor analysis of 249 usability problems from 11 projects)

1.      Visibility of system status

2.      Match between system and real world (e.g. speak the user's language)

3.      User control and freedom (e.g. easy exit, undo)

4.      Consistency and standards (e.g. same words mean the same thing in different contexts)

5.      Error prevention

6.      Recognition rather than recall (e.g. options and actions should be visible)

7.      Flexibility and efficiency of use (e.g. accelerators, shortcuts, customization)

8.      Aesthetic and minimalist design

9.      Help users recognize, diagnose, and recover from errors (e.g. error messages in plain language)

10. Help and documentation
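A heuristic evaluation produces a list of problems, each tagged with the violated heuristic and a severity rating. One conventional way to summarize such findings is to tally problems per heuristic and track the worst severity seen; the findings data below is invented for illustration:

```python
from collections import Counter

# Each finding: (violated heuristic, severity 0 = not a problem .. 4 = catastrophe).
findings = [
    ("visibility of system status", 3),
    ("error prevention", 4),
    ("consistency and standards", 2),
    ("visibility of system status", 2),
]

# How many problems were logged against each heuristic.
counts = Counter(heuristic for heuristic, _ in findings)

# The worst severity observed for each heuristic, to prioritize fixes.
worst = {}
for heuristic, severity in findings:
    worst[heuristic] = max(worst.get(heuristic, 0), severity)
```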

  2. Cognitive Walkthroughs

Questions to ask:

  3. Usability testing
  4. Usability Engineering

  User testing that is more formal in the sense that interface specialists set explicit quantitative performance goals known as metrics

  e.g. new users must be able to create and save forms in first 10 mins.

  Arguments for explicit quantitative goals:

  Human factors engineers are taken more seriously in an engineering setting because they adopt quantitative, objective goals familiar to engineers

  Progress can be charted and success recognized

  5. Controlled Experiments

Evaluating Evaluation:
So far best methods are:

Possible Uses of Evaluation Methods in a Development Process:

Theory-based design

Preserving Design Rationale

Recording the design history of its rationale could serve several purposes:

Usability Design Process (Gould)

General Observations about system Design:

Steps in Designing a Good System

  1. Define problem customer wants to be solved.
  2. Identify tasks user wants to be performed.
  3. Learn user capabilities.
  4. Learn hardware/software constraints.
  5. Set specific usability targets.
  6. Sketch out user scenarios.
  7. Design and build prototype.
  8. Test prototype.
  9. Iteratively identify, incorporate, and test changes until:
  10. Install system
  11. Measure customer reaction and acceptance.

Principles of Usability Design

Principle 1: Early and Continual Focus on Users


Twenty experimental participants familiar with the IBM PC but unfamiliar with query languages will receive 60 minutes training using the new online query training system for novice users. They will then perform nine experimental tasks.

On task 1, 85% of tested users must complete it successfully in less than 15 minutes, with no help from the experimenter. They may use all reference and help materials, but no quick help. Task 1 consists of 6 steps:

    1. Create a query on the displayed query panel using the table SCHOOL.COURSES.
    2. Delete all column names except COURSE and TITLE.
    3. Save the current query with the name CTITLE.
    4. Run the query once.
    5. Get current query panel displayed.
    6. Clear current query panel so it contains nothing.
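Whether the quantitative goal above is met reduces to a simple pass-rate check over the measured completion times. The times below are invented illustration data for 20 participants:

```python
# Completion times in minutes for Task 1 (one entry per participant; invented data).
times = [8, 12, 9, 14, 7, 11, 13, 10, 9, 14,
         16, 8, 12, 10, 13, 9, 11, 14, 18, 10]

TARGET_MINUTES = 15     # each user must finish in less than 15 minutes
TARGET_RATE = 0.85      # 85% of tested users must succeed

# Count successes against the time limit and compare to the target rate.
successes = sum(t < TARGET_MINUTES for t in times)
pass_rate = successes / len(times)
goal_met = pass_rate >= TARGET_RATE
```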


Checklist for Achieving Early and Continual Focus on Users


We defined a major group of potential users.


We talked with the users about the good and bad points of their present job and system.


Our preliminary system design discussions always kept in mind the characteristics of these users.


We watched these users performing their present jobs.


We asked them to think aloud as they worked.


We tried their jobs.


We did a formal task analysis.


We developed testable behavioral target goals for our proposed system.


Principle 2: Early - and Continual - User Testing


Checklist for Achieving Early User Testing


We made informal, preliminary sketches of a few user scenarios -- specifying exactly what the user and system messages will be -- and showed them to a few prospective users.


We have begun writing the user manual, and it is guiding the development process.


We have used simulations to try out functions and organization of the user interface.


We have done early demonstrations.


We invited as many people as possible to comment on the on-going instantiations of all usability components.


We had prospective users think aloud as they use simulations, mock-ups, and prototypes.


We used hallway-and-storefront methods.


We used computer conferencing forums to get feedback on usability.


We did formal prototype user testing.


We compared our results to established behavioral target goals.


We met our behavioral benchmark targets.


We let motivated people try to find bugs in our system.


We did field studies.


We did follow-up studies on people who are now using the system we made.


Principle 3: Iterative Design

Software Tools (e.g. UIMS)


Checklist for Carrying out Iterative Design


All aspects of usability could be easily changed, i.e., we had good tools.


We regularly changed our system, manuals, etc., based upon testing results with prospective users.


Principle 4: Integrated Design


Checklist for Achieving Integrated Design


We considered all aspects of usability in our initial design.


One person had responsibility for all aspects of usability.


User Manual


Manuals for subsidiary groups (e.g. operators, trainers, etc.)


Identification of required functions.


User Interfaces.


Assure adequate system reliability and responsiveness.


Outreach programs (e.g., help system, training materials, hot-lines, videotapes, etc.)






Field Maintenance.


Support-group users.