Slides

Dr. Salsabeel Alabbady

Course Information
•  Syllabus: available on eLearning
•  Teaching methods:
  –  PP presentations: available on elarning.ju.edu.jo
  –  Videos (interactive)
  –  Teamwork

•  Evaluation methods:
  –  Team project (20%)
  –  Midterm (30%)
  –  Final exam (50%)

Objectives of the course
•  At the end of this course you will:
  –  Know what is meant by a "good design"
  –  Know the guidelines that can be applied to interface design
  –  Know how to design a GUI
  –  Build your portfolio
•  Work on a project with new ideas to apply
  –  Study a unique topic
•  A computer science course focused on users
  –  Skill building
•  Important in most research

Introduction
•  How many products are there in everyday use? Think for a minute what you use in a typical day: cell phone, computer, personal organizer, remote control, soft drink machine, ATM, ticket machine, library information system, the web, photocopier, watch, printer, stereo, calculator, video game … The list is endless.
•  Now think for a minute how usable they are. How many are actually easy, effortless, and enjoyable to use?
•  All of them, several, or just one or two? This list is probably shorter. Why is this so?!!!

Introduction
•  Many products that require users to interact with them to carry out their tasks (e.g. buying a ticket online from the web, photocopying an article, …) have not necessarily been designed with the users in mind.
•  Typically, they have been engineered as systems to perform set functions. While they may work effectively from an engineering perspective, it is often at the expense of how the system will be used by real people.
•  The aim of interaction design is to redress this concern by bringing usability into the design process.

Why HCI?
•  Examples of poor design:
  –  Not having auto backups in SW
  –  Car radio design: diverts the driver's attention away from the road completely in order to tune the radio or adjust the volume
  –  Buggies for disabled people / children

Why HCI? Examples of poor design

HCI – Definition
•  Human–computer interaction (HCI) "involves the study, planning, design and uses of the interaction between people (users) and computers."
•  HCI is also sometimes referred to as human–machine interaction (HMI), man–machine interaction (MMI) or computer–human interaction (CHI).

HCI – Definition
•  We don't necessarily see a single user with a desktop computer.
•  By USER we may mean: an individual user, a group of users working together, or a sequence of users in an organization, each dealing with some part of the task or process. The user is whoever is trying to get the job done using the technology.
•  By COMPUTER we mean any technology ranging from the general desktop computer to a large-scale computer system, a process control system or an embedded system. The system may include non-computerized parts, including other people.
•  By INTERACTION we mean any communication between a user and computer, be it direct or indirect.
(Dix et al., 2004, p.4)

Who is involved in HCI?
"Because human–computer interaction studies a human and a machine in communication, it draws from supporting knowledge on both the machine and the human side."
•  On the machine side, techniques in computer graphics, operating systems, programming languages, and development environments are relevant.
•  On the human side, communication theory, graphic and industrial design disciplines, linguistics, social sciences, cognitive psychology, social psychology, and human factors such as computer user satisfaction are relevant. And, of course, engineering and design methods are relevant.

HCI Goals
•  PEOPLE use COMPUTER to accomplish WORK
•  This outlines the three major issues of concern:
  –  The people
  –  The computer
  –  And the tasks that are performed
•  The system must support the user's task, which gives us a fourth focus, USABILITY: if the system forces the user to adopt an unacceptable mode of work then it is unusable.
•  Usability means to allow users to carry out tasks:
  –  Safely
  –  Effectively
  –  Efficiently
  –  Enjoyably

HCI Goals
•  The user's current tasks are studied and supported by computers, which can in turn affect the nature of the original task and cause it to evolve, e.g. word processing has made it easy to manipulate paragraphs and reorder documents.
•  There are three 'use' words that must all be true for a product to be successful; it must be:
  –  Useful: accomplish what is required: play music, cook dinner, format a document;
  –  Usable: do it easily and naturally, without danger of error, etc.;
  –  Used: make people want to use it, be attractive, engaging, fun, etc.
(Dix et al., 2004, p.4)

HCI in summary
•  A discipline concerned with the:
  –  Design
  –  Implementation
  –  Evaluation
of interactive computing systems for human use

Differences with related fields – Ergonomics
•  Human factors and ergonomics (HF&E), also known as comfort design, functional design, and user-friendly systems, is the practice of designing products, systems or processes to take proper account of the interaction between them and the people that use them.
•  It is a multidisciplinary field incorporating contributions from psychology, engineering, and industrial design. In essence, it is the study of designing equipment and devices that fit the human body and its cognitive abilities. The two terms 'human factors' and 'ergonomics' are essentially synonymous.
•  HF&E is employed to fulfill the goals of occupational health and safety and productivity. It is relevant in the design of such things as safe furniture and easy-to-use interfaces to machines and equipment. Proper ergonomic design is necessary to prevent repetitive strain injuries, which can develop over time and can lead to long-term disability.

Differences with related fields – Ergonomics
•  Ergonomics can help reduce costs by improving safety.
•  Ergonomics comprises three main fields of research:
  –  physical
  –  cognitive
  –  organizational ergonomics

Ergonomics – Physical
•  Physical ergonomics is concerned with physical activity.
•  Physical ergonomics is important in the medical field.

Ergonomics – Cognitive
•  Cognitive ergonomics is concerned with mental processes, such as memory and reasoning, as they affect interactions among humans and other elements of a system. (Relevant topics include decision-making, skilled performance, human reliability, work stress and training as these may relate to human–system and HUMAN–COMPUTER INTERACTION DESIGN.)

Ergonomics – Organizational
•  Organizational ergonomics is concerned with optimizing the workplace: everything from teamwork to assessing teleworking and quality management.

Task 1
•  Bring examples of how a bad design may lead to problems (and maybe disasters).
•  Keywords: poor design, bad design, HCI
•  Design focus:
  –  Think 'user'
  –  Involve the users
  –  Iterate

Chapter 1
The human

Components of an interactive system
•  Human
•  Computer system
•  The interactive process between them

Input–output channels
•  A person's interaction with the outside world occurs through information being received and sent: input and output.
•  Input in the human occurs mainly through the senses and output through the motor control of the effectors.
•  There are five major senses: sight, hearing, touch, taste and smell.
•  Similarly there are a number of effectors, including the limbs, fingers, eyes, head and vocal system.

Vision
Two stages in vision:
•  physical reception of stimulus
•  processing and interpretation of stimulus

The Eye – physical reception
•  mechanism for receiving light and transforming it into electrical energy
•  light reflects from objects
•  images are focused upside-down on retina
•  retina contains rods for low light vision and cones for colour vision
•  ganglion cells (brain!) detect pattern and movement

Interpreting the signal
•  Size and depth
  –  visual angle indicates how much of the view an object occupies (relates to size and distance from eye)
  –  e.g. a hilltop
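The size-and-distance relation behind the visual angle can be made concrete; a minimal sketch (the 10 m object size and the viewing distances are illustrative values, not from the slides):

```python
import math

def visual_angle_deg(object_size: float, distance: float) -> float:
    """Visual angle subtended by an object, in degrees.
    Both arguments must use the same unit (e.g. metres)."""
    return math.degrees(2 * math.atan(object_size / (2 * distance)))

# The same 10 m object subtends a larger angle from 100 m away
# than from 1000 m away, so it occupies more of the view.
near = visual_angle_deg(10, 100)    # ~5.72 degrees
far = visual_angle_deg(10, 1000)    # ~0.57 degrees
```

This is why a distant hilltop looks small even though we may know it is large: the visual angle shrinks with distance, and perception has to compensate.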

Interpreting the signal (cont)
•  Brightness
  –  subjective reaction to levels of light
  –  e.g. if you want to draw attention to an object, make it bright and the background dark
•  Colour
  –  8% of males and 1% of females are colour blind (unable to discriminate between red and green)

Interpreting the signal (cont)
•  The visual system compensates for:
  –  Size: our expectations affect the way an image is perceived. For example, if we know that an object is a particular size, we will perceive it as that size no matter how far it is from us.
  –  Movement: movement of the image on the retina which occurs as we move around and as the object which we see moves. Although the retinal image is moving, the image that we perceive is stable.

Interpreting the signal (cont)
•  Context is used to resolve ambiguity: the context in which the object appears allows our expectations to clearly disambiguate the interpretation of the object, as either a B or a 13.

Optical Illusions
•  The Ponzo illusion: suggests that the human mind judges an object's size based on its background.
•  The Müller-Lyer illusion.
•  These illusions demonstrate that our perception of size is not completely reliable.

Optical Illusions
•  The quick brown Fox jumps over the the lazy dog.
•  These are just a few examples of how the visual system compensates, and sometimes overcompensates, to allow us to perceive the world around us.

Design focus: Where's the middle?
•  Optical illusions highlight the differences between the way things are and the way we perceive them – and in interface design we need to be aware that we will not always perceive things exactly as they are.
•  The way that objects are composed together will affect the way we perceive them, and we do not perceive geometric shapes exactly as they are drawn.
•  For example, we tend to magnify horizontal lines and reduce vertical. So a square needs to be slightly increased in height to appear square, and lines will appear thicker if horizontal rather than vertical.
•  Optical illusions also affect page symmetry. We tend to see the center of a page as being a little above the actual center – so if a page is arranged symmetrically around the actual center, we will see it as too low down. In graphic design this is known as the optical center – and bottom page margins tend to be increased by 50% to compensate.

Reading
•  Several stages:
  –  visual pattern perceived
  –  decoded using internal representation of language
  –  interpreted using knowledge of syntax, semantics, pragmatics (study of language)
•  Reading involves saccades and fixations:
  –  during reading, the eye makes jerky movements called saccades followed by fixations. Perception occurs during the fixation periods. The eye moves backwards over the text as well as forwards, in what are known as regressions. If the text is complex there will be more regressions.

Reading
•  Word shape is important to recognition:
  –  removing the word shape clues (for example, by capitalizing words) is detrimental to reading speed and accuracy.
•  Negative contrast improves reading from a computer screen (dark characters on a light screen).

Task 2
•  Based on what you have learnt today, give an example of poor or good design (from the vision perspective) and explain why.

Hearing
•  The sense of hearing is often considered secondary to sight, but we tend to underestimate the amount of information that we receive through our ears.
•  Close your eyes for a moment and listen. Hearing provides information about the environment: distances, directions, objects etc.
•  The auditory system filters sounds:
  –  we can attend to sounds over background noise
  –  for example, the cocktail party phenomenon, where we can pick out our name spoken across a crowded noisy room.

Hearing (cont)
•  Sound can convey a remarkable amount of information. It is rarely used to its potential in interface design, usually being confined to warning sounds and notifications.
•  Suggest ideas for an interface which uses the properties of sound effectively. You might approach this exercise by considering how sound could be added to an application with which you are familiar. Use your imagination.

Hearing (cont)
•  Answer: speech and non-speech sounds
•  Speech sounds:
  –  can obviously be used to convey information
  –  useful for visually impaired people
  –  and for any application where the user's attention has to be divided (for example, flight control, tutorials, etc.)

Hearing (cont)
•  Answer: speech and non-speech sounds
•  Non-speech sounds:
  –  Attention – to attract the user's attention to a critical situation or to the end of a process, for example.
  –  Status information – continuous background sounds can be used to convey status information, for example monitoring the progress of a process (without the need for visual attention).
  –  Confirmation – a sound associated with an action to confirm that the action has been carried out, for example associating a sound with deleting a file.
  –  Navigation – using changing sound to indicate where the user is in a system. For example, what about sound to support navigation in hypertext?

Touch
•  Provides important feedback about the environment.
•  It tells us when we touch something hot or cold, and can therefore act as a warning.
•  It also provides us with feedback when we attempt to lift an object, for example. Consider the act of picking up a glass of water. If we could only see the glass and not feel when our hand made contact with it or feel its shape, the speed and accuracy of the action would be reduced.
•  Virtual reality.
•  May be the key sense for someone who is visually impaired.

Touch
•  Some areas are more sensitive than others, e.g. fingers.
•  It is possible to measure the acuity of different areas of the body using the two-point threshold test.

Touch
•  How can we provide 'touch' to the user? "Virtual reality"

University of Liverpool – Virtual Engineering Centre

Touch, Design focus: Handling the goods
•  E-commerce has become very successful in some areas of sales, such as travel services, books and CDs, and food. However, in some retail areas, such as clothes shopping, e-commerce has been less successful. Why?
•  When buying train and airline tickets and, to some extent, books and food, the experience of shopping is less important than the convenience. So, as long as we know what we want, we are happy to shop online.
•  With clothes, the experience of shopping is far more important. We need to be able to handle the goods, feel the texture of the material, check the weight to test quality. Even if we know that something will fit us we still want to be able to handle it before buying.
•  Research into haptic interaction is looking at ways of solving this problem. By using special force feedback and tactile hardware, users are able to feel surfaces and shape. For example, a demonstration environment called TouchCity allows people to walk around a virtual shopping mall, pick up products and feel their texture and weight. A key problem with the commercial use of such an application, however, is that the haptic experience requires expensive hardware not yet available to the average e-shopper. However, in future, such immersive e-commerce experiences are likely to be the norm.


Smell, taste
•  Virtual reality:
  –  Sight/visual: panoramic 3D display
  –  Sound/auditory: 3D audio effect
  –  Touch/tactile: haptic and force feedback; vibration and temperature
  –  Smell/olfactory: smell replication; chemistry, diagnosis, biology, oil exploration in geology
  –  Taste/gustation: taste replication; virtual sense of taste in games, tourism, cooking shows

Smell, taste
•  Getting closer to The Matrix: Japanese show 5-senses virtual reality system. Smells, touch and more added to VR environment.
•  http://www.itworld.com/personal-tech/329667/japanese-show-5-senses-virtual-reality-system

Task 3
•  Suggest an interface that can utilise hearing, smell, touch, or taste to improve user interaction …

Movement
•  Time taken to respond to stimulus: reaction time + movement time.
•  Movement time is dependent on age, fitness etc.
•  Reaction time is dependent on stimulus type:
  –  visual ~ 200 ms
  –  auditory ~ 150 ms
  –  pain ~ 700 ms
•  Demanding faster reactions decreases accuracy in the unskilled operator but not in the skilled operator.

Movement
•  Time taken to respond to stimulus: reaction time + movement time.
•  A simple action such as hitting a button in response to a question involves a number of processing stages:
  –  The stimulus (of the question) is received through the sensory receptors and transmitted to the brain.
  –  The question is processed and a valid response generated.
  –  The brain then tells the appropriate muscles to respond.
•  Each of these stages takes time, which can be roughly divided into reaction time and movement time.

Movement (cont)
•  Movement time is dependent largely on the physical characteristics of the subjects: their age and fitness, for example.
•  Reaction time varies according to the sensory channel through which the stimulus is received:
  –  a person can react to an auditory signal in approximately 150 ms,
  –  to a visual signal in 200 ms,
  –  and to pain in 700 ms.
  –  A combined signal will result in the quickest response.
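The reaction-time figures above can be combined with a movement-time estimate to predict a total response time; a minimal sketch (the 300 ms movement time is an illustrative assumption, not a value from the slides):

```python
# Approximate reaction times from the slide, in milliseconds.
REACTION_MS = {"auditory": 150, "visual": 200, "pain": 700}

def response_time_ms(stimulus: str, movement_ms: float) -> float:
    """Total response time = reaction time + movement time."""
    return REACTION_MS[stimulus] + movement_ms

# Hitting a button ~300 ms after perceiving a beep vs. a flash:
beep = response_time_ms("auditory", 300)   # 450 ms
flash = response_time_ms("visual", 300)    # 500 ms
```

The design implication is that an auditory alert gives the user roughly a 50 ms head start over a visual one, all else being equal.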

Movement (cont)
•  Factors such as skill or practice can reduce reaction time, and fatigue can increase it.
•  A second measure of motor skill is accuracy.
•  One question that we should ask is whether speed of reaction results in reduced accuracy. This is dependent on the task and the user.
•  In some cases, requiring faster reactions reduces accuracy. This is the premise behind many video games where less skilled users fail at levels of play that require faster responses.
•  However, for skilled operators this is not necessarily the case. Studies of keyboard operators have shown that, although the faster operators were up to twice as fast as the others, the slower ones made 10 times the errors.

Movement (cont)
•  Speed and accuracy of movement are important considerations in the design of interactive systems, primarily in terms of the time taken to move to a particular target on a screen. The target may be a button, a menu item or an icon, for example.
•  The time taken to hit a target is a function of the size of the target and the distance that has to be moved.
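The size-and-distance relationship stated above is usually formalized in HCI as Fitts's law; a minimal sketch using the common Shannon formulation, where the constants a and b are device-dependent and the default values here are purely illustrative:

```python
import math

def movement_time_ms(distance: float, width: float,
                     a: float = 100.0, b: float = 150.0) -> float:
    """Fitts's law (Shannon form): MT = a + b * log2(distance/width + 1).
    a and b must be fitted empirically per pointing device;
    the defaults here are made-up illustrative constants."""
    return a + b * math.log2(distance / width + 1)

# A large, nearby target is predicted to be hit faster
# than a small, distant one:
easy = movement_time_ms(distance=100, width=50)   # ~338 ms
hard = movement_time_ms(distance=400, width=10)   # ~904 ms
```

The logarithm captures why doubling a target's size helps about as much as halving its distance.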

Movement (cont)
•  Since users will find it more difficult to manipulate small objects, targets should generally be as large as possible and the distance to be moved as small as possible.
•  This has led to suggestions that pie-chart-shaped menus are preferable to lists, since all options are equidistant.
•  If lists are used, the most frequently used options can be placed closest to the user's start point (for example, at the top of the menu).
⇒  targets as large as possible, distances as small as possible
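The frequency-ordering guideline for list menus can be sketched directly; the menu items and usage counts below are hypothetical:

```python
def order_menu(usage_counts: dict) -> list:
    """Order menu items so the most frequently used options sit
    closest to the user's start point (the top of the menu),
    minimizing the distance moved for common actions."""
    return sorted(usage_counts, key=usage_counts.get, reverse=True)

menu = order_menu({"Open": 40, "Quit": 5, "Save": 55, "Print": 10})
# → ['Save', 'Open', 'Print', 'Quit']
```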

Memory
•  Have you ever played the memory game?
•  The idea is that each player has to recount a list of objects and add one more to the end. There are many variations but the objects are all loosely related: 'I went to the market and bought a lemon, some oranges, . . .' or 'I went to the zoo and saw monkeys, and lions, and tigers . . .' and so on.
•  As the list grows objects are missed out or recalled in the wrong order and so people are eliminated from the game. The winner is the person remaining at the end. Such games rely on our ability to store and retrieve information, even seemingly arbitrary items.
•  This is the job of our memory system.

Memory
•  Much of our everyday activity relies on memory. As well as storing all our factual knowledge, our memory contains our knowledge of actions or procedures.
•  It allows us to repeat actions, to use language, and to use new information received via our senses. It also gives us our sense of identity, by preserving information from our past experiences.

Memory
There are three types of memory function:
•  Sensory memories
•  Short-term memory or working memory
•  Long-term memory
Selection of stimuli is governed by level of arousal.

Sensory memory
•  Buffers for stimuli received through the senses:
  –  iconic memory: visual stimuli. A moving finger (we see it in more than one place), or firework displays where moving sparklers leave a persistent image. This indicates a persistence of the image after the stimulus has been removed.
  –  echoic memory: aural stimuli. Echoic memory allows brief 'play-back' of information. Have you ever had someone ask you a question when you are reading? You ask them to repeat the question, only to realize that you know what was asked after all. This experience, too, is evidence of the existence of echoic memory.
  –  haptic memory: tactile stimuli.
•  These memories are constantly overwritten by new information coming in on these channels.

Short-term memory (STM)
•  Scratch-pad for temporary recall:
  –  For example, calculate the multiplication 35 × 6 in your head. To perform calculations such as this we need to store the intermediate stages for use later.
  –  Or consider reading. In order to comprehend this sentence you need to hold in your mind the beginning of the sentence as you read the rest. Both of these tasks use short-term memory.
•  rapid access ~ 70 ms
•  rapid decay ~ 200 ms
•  limited capacity – 7 ± 2 chunks

Short-term memory (STM)
•  There are two basic methods for measuring memory capacity:
  –  length of a sequence which can be remembered in order
    •  the average person can remember a limited 7 ± 2 chunks
  –  items to be freely recalled in any order

Examples
•  265397620853
•  44 113 245 8920
•  HEC ATR ANU PTH ETR EET
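Chunking is what makes the second line above easier to recall than the first; a minimal sketch of regrouping a raw digit string into chunks:

```python
def chunk(digits: str, size: int = 3) -> list:
    """Split a digit string into fixed-size chunks. A handful of
    chunks is far easier to hold in STM (7 ± 2 chunks) than the
    same material as individual digits."""
    return [digits[i:i + size] for i in range(0, len(digits), size)]

chunks = chunk("265397620853")
# → ['265', '397', '620', '853']  (4 chunks instead of 12 digits)
```

The third line of the examples makes the complementary point: bad chunk boundaries (HEC ATR ANU …) can make even familiar material hard to recognize.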

Examples
•  Cashing in!!!
  –  Closure gives you a nice 'done it' feeling when we complete some part of a task.
  –  At this point our minds have a tendency to flush short-term memory in order to get on with the next job.
  –  Early automatic teller machines (ATMs) gave the customer money before returning their bank card.
  –  On receiving the money the customer would reach closure and hence often forget to take the card.
  –  Modern ATMs return the card first!
•  7 ± 2: it is often suggested that this means that lists, menus and other groups of items should be designed to be no more than 7 items long.

Long-term memory (LTM)
•  Repository for all our knowledge:
  –  slow access ~ 1/10 second
  –  slow decay, if any
  –  huge or unlimited capacity

Long-term memory (LTM): Memorable or secure?
•  As online activities become more widespread, people are having to remember more and more access information, such as passwords and security checks.
•  The average active internet user may have separate passwords and user names for several email accounts, mailing lists, e-shopping sites, e-banking, online auctions and more! Remembering these passwords is not easy.
•  From a security perspective it is important that passwords are random. Words and names are very easy to crack, hence the recommendation that passwords are frequently changed and constructed from random strings of letters and numbers.
•  But in reality these are the hardest things for people to commit to memory. Hence many people will use the same password for all their online activities (rarely if ever changing it) and will choose a word or a name that is easy for them to remember, in spite of the obviously increased security risks. Security here is in conflict with memorability!
•  A solution to this is to construct a nonsense password out of letters or numbers that will have meaning to you but will not make up a word in a dictionary (e.g. initials of names, numbers from significant dates or postcodes, and so on). Then what is remembered is the meaningful rule for constructing the password, and not a meaningless string of alphanumeric characters.
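The 'meaningful rule' suggestion can be sketched as code; the phrase and number below are hypothetical stand-ins for a user's own memorable material:

```python
def password_from_rule(phrase: str, number: str) -> str:
    """Build a non-dictionary password from the initial letter of
    each word in a memorable phrase, plus a significant number.
    What the user commits to LTM is the rule and the phrase,
    not the resulting meaningless-looking string."""
    initials = "".join(word[0] for word in phrase.split())
    return initials + number

pw = password_from_rule("I went to the market and bought a lemon", "1984")
# → 'Iwttmabal1984'
```

Note this is only an illustration of the memorability trade-off from the slide, not security advice; real guidance would also consider length and password managers.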

Individual differences
•  long term – gender, physical and intellectual abilities
•  short term – effect of stress or fatigue
•  changing – age
Ask yourself: will a design decision exclude a section of the user population? For example, the current emphasis on visual interfaces excludes those who are visually impaired, unless the design also makes use of the other sensory channels.

Task 4
•  Give an example of bad/good design regarding what you've learnt about memory …

Chapter 2
The computer

The Computer
A computer system is made up of various elements; each of these elements affects the interaction:
•  input devices – text entry and pointing
•  output devices
•  virtual reality – special interaction and display devices
•  paper – as output (print) and input (scan)
•  memory – RAM & permanent media, capacity & access
•  processing – speed of processing, networks

Interacting with computers
To understand human–computer interaction … we need to understand computers!
•  What goes in and out? Devices, paper, sensors, etc.
•  What can it do? Memory, processing, networks.

Interacting with computers
•  When we interact with computers, what are we trying to achieve?
•  Consider what happens when we interact with each other.
•  We are either passing information to other people, or receiving information from them.
•  Often, the information we receive is in response to the information that we have recently imparted to them, and we may then respond to that.
•  Interaction is therefore a process of information transfer.
•  Relating this to the electronic computer, the same principles hold: interaction is a process of information transfer, from the user to the computer and from the computer to the user.

Interacting with computers
•  The first part of this chapter concentrates on the transference of information from the user to the computer and back.
•  We begin by considering a current typical computer interface and the devices it employs, largely variants of keyboard for text entry, mouse for positioning and screen for displaying output.
•  Then we move on to consider devices that go beyond the keyboard, mouse and screen: entering deeper into the electronic world with virtual reality and 3D interaction, and outside the electronic world looking at more physical interactions.
•  In addition to direct input and output, information is passed to and fro via paper documents, using printers and scanners. Although not requiring the same degree of user interaction as a mouse or keyboard, these are an important means of input and output for many current applications.

Interacting with computers
•  We then consider the computer itself, its processor and memory devices, and the networks that link them together.
•  We note how the technology drives and empowers the interface.
•  The details of computer processing should largely be irrelevant to the end-user,
•  but the interface designer needs to be aware of the limitations of storage capacity and computational power.

A 'typical' computer system
•  screen, or monitor, on which there are windows
•  keyboard
•  mouse/trackpad
•  variations:
  –  desktop
  –  laptop
  –  PDA
  –  smartphone
  –  tablets
The devices dictate the styles of interaction that the system supports. If we use different devices, then the interface will support a different style of interaction.

Interactivity?
•  Batch processing: minimal interaction with the machine
  –  punched card stacks or large data files prepared
  –  long wait … hours or days
  –  line printer output
  –  for example, printing pay checks or entering the results from a questionnaire
•  Now most computing is interactive
  –  rapid feedback
  –  the user in control (most of the time)
  –  seconds or fractions of a second …
•  The field of Human–Computer Interaction largely grew due to this change in interactive pace.

Richer interaction – everywhere, every when
•  Computers are coming out of the box!
•  Sensors and devices everywhere.

Text entry devices
•  keyboards (QWERTY et al.)
•  handwriting, speech

Keyboards
•  Most common text input device.
•  Allows rapid entry of text by experienced users.

Layout – QWERTY
•  Standardised layout (layout of the digits and letters is fixed), but …
  –  non-alphanumeric keys are placed differently
  –  minor differences between UK and USA keyboards (above 3: £ or $)
  –  accented symbols needed for different scripts
•  QWERTY arrangement not optimal for typing – the layout was designed to prevent typewriters jamming! It increased the spacing between common pairs of letters so that sequentially-struck keys would not jam.
  –  It was also developed for two-finger typing.
•  Alternative designs allow faster typing, but the large social base of QWERTY typists produces reluctance to change.

QWERTY (ctd)

    1  2  3  4  5  6  7  8  9  0
     Q  W  E  R  T  Y  U  I  O  P
      A  S  D  F  G  H  J  K  L
       Z  X  C  V  B  N  M  ,  .
               SPACE

alternative keyboard layouts

Alphabetic

–  keys arranged in alphabetic order –  not faster for trained typists –  not faster for beginners either!

Dvorak
–  common letters under dominant fingers
–  biased towards right hand
–  common combinations of letters alternate between hands
–  10–15% improvement in speed and reduction in fatigue
–  but the large social base of QWERTY typists produces market pressures not to change
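The hand-alternation claim above can be checked mechanically. A minimal sketch, assuming the standard left/right-hand split for each layout; the sample sentence is an arbitrary illustration, not data from the slides:

```python
# Count how often consecutive letters in a text are typed on different hands,
# for QWERTY vs. Dvorak. Hand assignments follow the standard touch-typing split.

QWERTY_LEFT = set("qwertasdfgzxcvb")   # QWERTY letters typed with the left hand
DVORAK_LEFT = set("pyaoeuiqjkx")       # Dvorak letters typed with the left hand

def alternation_rate(text, left_hand):
    """Fraction of adjacent letter pairs typed on different hands."""
    letters = [c for c in text.lower() if c.isalpha()]
    pairs = list(zip(letters, letters[1:]))
    if not pairs:
        return 0.0
    crossing = sum((a in left_hand) != (b in left_hand) for a, b in pairs)
    return crossing / len(pairs)

sample = "the quick brown fox jumps over the lazy dog"
print(f"QWERTY alternation: {alternation_rate(sample, QWERTY_LEFT):.0%}")
print(f"Dvorak alternation: {alternation_rate(sample, DVORAK_LEFT):.0%}")
```

Common English digraphs such as "th" and "he" sit on one hand in QWERTY but straddle both hands in Dvorak, which is where the claimed speed and fatigue gains come from.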

Handwriting recognition
•  Text can be input into the computer using a pen and a digitizing tablet
–  natural interaction

•  Technical problems: –  capturing all useful information - stroke path (the way in which the letter is drawn), pressure, etc. in a natural manner –  segmenting joined up writing into individual letters –  interpreting individual letters –  coping with different styles of handwriting

•  Used in PDAs, and tablet computers …

Speech recognition
•  Speech recognition is a promising area of text entry
•  Improving rapidly
•  Most successful when:
–  single user: initial training, and the system learns individualities
–  limited vocabulary systems

•  Problems with:
–  external noise interfering
–  imprecision of pronunciation
–  large vocabularies
–  different speakers

Speech recognition
•  Despite its problems, speech technology has found markets:
–  telephone information systems,
–  access for the disabled,
–  hands-occupied situations (especially military), and for those suffering repetitive strain injury (RSI)

Numeric keypads
•  for entering numbers quickly:
–  calculator, PC keyboard
•  for telephones, not the same!! (ATMs use the phone layout)

    telephone       calculator
    1  2  3         7  8  9
    4  5  6         4  5  6
    7  8  9         1  2  3
    *  0  #         0  .  =
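The practical consequence of the two layouts is that the same physical key position means different digits on different devices. A minimal sketch using lookup tables for the two layouts shown above; the `key_at` helper is illustrative, not a real API:

```python
# Telephone vs. calculator keypads as row/column grids.
# Row 0 is the top row of the physical keypad.

TELEPHONE = [["1", "2", "3"],
             ["4", "5", "6"],
             ["7", "8", "9"],
             ["*", "0", "#"]]

CALCULATOR = [["7", "8", "9"],
              ["4", "5", "6"],
              ["1", "2", "3"],
              ["0", ".", "="]]

def key_at(layout, row, col):
    """Return the label at a physical (row, col) position."""
    return layout[row][col]

# The top-left key is '1' on a phone but '7' on a calculator: a motor habit
# formed on one device produces errors on the other.
print(key_at(TELEPHONE, 0, 0), key_at(CALCULATOR, 0, 0))
```

Note that only the middle row (4 5 6) coincides, which is why skilled ten-key operators mistype on phone keypads and vice versa.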

positioning, pointing and drawing
•  mouse, touchpad
•  trackballs, joysticks etc.
•  touch screens, tablets
•  eyegaze, cursors

Eyegaze •  control interface by eye gaze direction –  e.g. look at a menu item to select it

•  uses laser beam reflected off retina –  … a very low power laser!

•  mainly used for evaluation
•  potential for hands-free control
•  high accuracy requires a headset
•  cheaper, lower-accuracy devices are available; they sit under the screen like a small webcam

Cursor keys
•  Four keys (up, down, left, right) on keyboard.
•  Very, very cheap, but slow.
•  Useful for not much more than basic motion in text-editing tasks.
•  No standardised layout, but the inverted “T” is most common.

display devices
•  screens
•  large displays

paper: printing and scanning

memory
•  short term and long term
•  speed, capacity, compression formats, access

processing and networks
•  speed limits of interaction
•  networked computing

virtual reality and 3D interaction
•  positioning in 3D space
•  moving and grasping
•  seeing 3D (helmets and caves)

Emerging Technology: Virtual Reality (VR)

The Mixed Reality (MR) continuum:
•  Real Environment: our world
•  Augmented Reality (AR): adds computer-generated objects to the real world
•  Augmented Virtuality (AV): merges real-world objects into virtual worlds
•  Virtual Reality (VR): the viewer is immersed in a computer-generated, interactive 3D environment in which no component is real

Virtual Reality Interactive Devices
•  There are interactive devices that immerse the user in a virtual environment (VE), letting the user feel physically present in a non-physical world.
•  These interactive devices send and receive information and are worn as:
–  goggles,
–  headsets,
–  gloves,
–  or body suits (as with 3D movies),
–  with tracking devices.

Head Mounted Display

Motion Capture

Gesture Recognition Gloves

Active 3D stereo Glasses

positioning in 3D space •  cockpit and virtual controls –  steering wheels, knobs and dials … just like real!

•  the 3D mouse –  six-degrees of movement: x, y, z + roll, pitch, yaw

•  data glove –  fibre optics used to detect finger position

•  VR helmets –  detect head motion and possibly eye gaze

•  whole body tracking –  accelerometers strapped to limbs or reflective dots and video processing

3D displays •  desktop VR –  perspective and motion give 3D effect

•  seeing in 3D –  use stereoscopic vision –  VR helmets –  screen plus shuttered specs, etc.

also see extra slides on 3D vision

VR headsets •  small TV screen for each eye •  slightly different angles •  3D effect

Immersion In Virtual Reality And System Variations

Full Immersion:
•  To create the sense of full immersion, the technology can fool the five senses through:
–  Sight/visual: panoramic 3D display
–  Sound/auditory: 3D audio effects
–  Touch/tactile: haptic and force feedback, vibration and temperature
–  Smell/olfactory: smell replication (chemistry, diagnosis, biology, oil exploration in geology)
–  Taste/gustation: taste replication; a virtual sense of taste in games, tourism, cooking shows

–  Cave Automatic Virtual Environment (CAVE)

Immersion In Virtual Reality And System Variations

Semi-immersive:

VR motion sickness
•  In real life, when we move our head the image our eyes see changes accordingly.
•  VR systems produce the same effect by using sensors in the goggles or helmet, and then using the position of the head to determine the right image to show.
•  If the system is slow in producing these images, a lag develops between the user moving his head and the scene changing.
•  If this delay is more than a hundred milliseconds or so, the feeling becomes disorienting (e.g., like seasickness).
•  Users of VR can experience similar nausea, and few can stand it for more than a short while. In fact, keeping laboratories sanitary has been a major push in improving VR technology.
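The hundred-millisecond figure above can be treated as a latency budget for the whole head-motion-to-image pipeline. A minimal sketch; the component names and timings are illustrative assumptions, not measurements from any real headset:

```python
# Check a motion-to-photon latency budget against the rough disorientation
# threshold mentioned in the text (~100 ms).

DISORIENTATION_THRESHOLD_MS = 100  # approximate figure from the text

def motion_to_photon_ms(sensor_ms, render_ms, display_ms):
    """Total lag between a head movement and the updated image appearing."""
    return sensor_ms + render_ms + display_ms

def acceptable(total_ms, threshold=DISORIENTATION_THRESHOLD_MS):
    """True if the lag stays within the disorientation budget."""
    return total_ms <= threshold

fast = motion_to_photon_ms(sensor_ms=5, render_ms=33, display_ms=11)
print(fast, acceptable(fast))    # 49 True: within the budget

slow = motion_to_photon_ms(sensor_ms=20, render_ms=120, display_ms=16)
print(slow, acceptable(slow))    # 156 False: likely to feel disorienting
```

The budget view makes clear why rendering time dominates: a single slow frame (here 120 ms) blows the whole budget regardless of how fast the sensors are.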

Chapter  3  

Interaction Design: “INTERACTION DESIGN: beyond human-computer interaction” by Helen Sharp, Yvonne Rogers, Jenny Preece

Dr.  Salsabeel  Alabbady  


Introduction
•  In the previous two chapters we have looked at the human and the computer respectively.
•  We are interested in how the human user uses the computer as a tool to perform, simplify or support a task. In order to do this the user must communicate his requirements to the computer.
•  There are a number of ways in which the user can communicate with the system. At one extreme is batch input, in which the user provides all the information to the computer at once and leaves the machine to perform the task.
•  At the other extreme are highly interactive input devices and paradigms, such as direct manipulation and the applications of virtual reality. Here the user is constantly providing instruction and receiving feedback. These are the types of interactive system we are considering.
•  In this chapter, we consider the communication between user and system: the interaction.

[Figure: components of an interactive system: the human and the computer system linked by an interactive process]

Introduction   ì  Interac)on  involves  at  least  two  par)cipants:  the  user  and  the  system.     ì  Both  are  complex,  as  we  have  seen,  and  are  very  different  from  each  other  in  

the  way  that  they  communicate  and  view  the  domain  and  the  task.    

ì  The  interface  must  therefore  effec)vely  translate  between  them  to  allow  the  

interac)on  to  be  successful.  This  transla)on  can  fail  at  a  number  of  points  and   for  a  number  of  reasons.    

ì  The  use  of  models  of  interac)on  can  help  us  to  understand  exactly  what  is  

going  on  in  the  interac)on  and  iden)fy  the  likely  root  of  difficul)es.    

ì  They  also  provide  us  with  a  framework  to  compare  different  interac)on  styles  

and  to  consider  interac)on  problems.    

Dr.  Salsabeel  Alabbady  

4  

The execution-evaluation cycle
•  Norman’s model of interaction is perhaps the most influential in Human–Computer Interaction.
•  The user formulates a plan of action,
•  which is then executed at the computer interface.
•  When the plan, or part of the plan, has been executed,
•  the user observes the computer interface to evaluate the result of the executed plan,
•  and to determine further actions.

The execution-evaluation cycle
•  The interactive cycle can be divided into two major phases: execution and evaluation.
•  These can then be subdivided into further stages, seven in all. The stages in Norman’s model of interaction are as follows:
1.  Establishing the goal: the user’s notion of what needs to be done
2.  Forming the intention: translates the user’s notion into a more specific intention
3.  Specifying the action sequence that will reach the goal
4.  Executing the action
5.  Perceiving the system state
6.  Interpreting the system state in terms of the user’s expectations
7.  Evaluating the system state with respect to the goals and intentions
•  If the system state reflects the user’s goal then the computer has done what he wanted and the interaction has been successful; otherwise the user must formulate a new goal and repeat the cycle.
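The seven stages can be sketched as a loop. A toy illustration using Norman's own lamp example; the `Lamp` class and all function names here are hypothetical scaffolding for the sketch, not part of the model:

```python
# Norman's execution-evaluation cycle on a toy "system": a desk lamp.

class Lamp:
    def __init__(self, plugged_in=True):
        self.on = False
        self.plugged_in = plugged_in

    def press_switch(self):             # stage 4: executing the action
        if self.plugged_in:
            self.on = not self.on

def get_more_light(lamp):
    goal_met = lamp.on                  # stage 1: the goal is "more light"
    while not goal_met:
        # stages 2-3: intention "switch on the desk lamp",
        # action sequence "reach over and press the switch"
        lamp.press_switch()
        # stages 5-7: perceive, interpret, and evaluate the new system state
        goal_met = lamp.on
        if not goal_met:
            # interpretation: bulb blown or lamp unplugged, so form a new goal
            lamp.plugged_in = True      # e.g., plug the lamp in and retry
    return goal_met

lamp = Lamp(plugged_in=False)
print(get_more_light(lamp))  # True once the cycle converges on the goal
```

The `while` loop is the point of the sketch: evaluation that fails does not end the interaction, it produces a new goal and another pass through the cycle.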


The execution-evaluation cycle
•  Norman uses a simple example of switching on a light to illustrate this cycle.
•  Imagine you are sitting reading as evening falls.
•  You decide you need more light;
•  that is, you establish the goal to get more light.
•  From there you form an intention to switch on the desk lamp,
•  and you specify the actions required: to reach over and press the lamp switch.
•  If someone else is closer, the intention may be different: you may ask them to switch on the light for you.
•  Your goal is the same but the intention and actions are different.
•  When you have executed the action you perceive the result:
•  either the light is on or it isn’t, and you interpret this based on your knowledge of the world.
•  For example, if the light does not come on you may interpret this as indicating the bulb has blown or the lamp is not plugged into the mains,
•  and you will formulate new goals to deal with this.
•  If the light does come on, you will evaluate the new state according to the original goals: is there now enough light? If so, the cycle is complete. If not, you may formulate a new intention to switch on the main ceiling light as well.


The execution-evaluation cycle
•  If the actions allowed by the system correspond to those intended by the user, the interaction will be effective (the gulf of execution is small).
•  The gulf of evaluation is the distance between the physical presentation of the system state and the expectation of the user.
•  The more effort that is required on the part of the user to interpret the presentation, the less effective the interaction.


What is interaction design?
•  “Designing interactive products to support people in their everyday and working lives.”
•  In particular, it is about creating user experiences that enhance and extend the way people work, communicate and interact.
•  In this sense, it is about finding ways of supporting people.
•  This contrasts with software engineering, which focuses primarily on the production of software solutions for given applications.


What is interaction design?
•  A simple analogy to another profession, one concerned with creating buildings, may clarify this distinction. In his account of interaction design, Terry Winograd asks how architects and civil engineers differ when faced with the problem of building a house.
•  Architects are concerned with the people and their interactions with each other and within the house being built. For example, is there the right mix of family and private spaces? Are the spaces for cooking and eating in close proximity? Will people live in the space being designed in the way it was intended to be used?
•  In contrast, engineers are interested in issues to do with realizing the project. These include practical concerns like cost, durability, structural aspects, environmental aspects, fire regulations, and construction methods.
•  Just as there is a difference between designing and building a house, so too is there a distinction between interaction design and software engineering. In a nutshell, interaction design is related to software engineering in the same way as architecture is related to civil engineering.


Working together as a multidisciplinary team
•  It has always been acknowledged that for interaction design to succeed, many disciplines need to be involved.
•  The importance of understanding how users act and react to events, and how they communicate and interact together, has led people from a variety of disciplines, such as psychologists and sociologists, to become involved.
•  Likewise, the growing importance of understanding how to design different kinds of interactive media in effective and aesthetically pleasing ways has led to a diversity of other practitioners becoming involved, including graphic designers, artists, animators, photographers, film experts, and product designers.


Working together as a multidisciplinary team
•  Bringing together so many people with different backgrounds and training has meant many more ideas being generated, new methods being developed, and more creative and original designs being produced.
•  But …
•  the costs involved will be higher, and
•  it can be more difficult to communicate and progress the designs being generated (people with different backgrounds have different perspectives and ways of seeing and talking about the world).


Working together as a multidisciplinary team
•  In practice, confusion, misunderstanding, and communication breakdowns can often surface in a team.
•  Other problems can arise when a group of people who have not worked as a team is "thrown" together.


The relationship between interaction design, HCI, and other approaches
•  We view interaction design as fundamental to all disciplines, fields, and approaches that are concerned with researching and designing computer-based systems for people.


What is involved in the process of interaction design?
•  Essentially, the process of interaction design involves four basic activities:
–  Identifying needs and establishing requirements.
–  Developing alternative designs that meet those requirements.
–  Building interactive versions of the designs so that they can be communicated and assessed.
–  Evaluating what is being built throughout the process.
•  These activities are intended to inform one another and to be repeated.
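Because evaluation feeds back into redesign, the four activities form a loop rather than a pipeline. A minimal sketch of that iteration, with a toy numeric "usability score" standing in for real evaluation work; all names and numbers are illustrative assumptions:

```python
# Toy model of the iterative interaction-design process: design, prototype,
# and evaluate repeatedly until the usability goal is met.

def evaluate(quality):
    """Toy usability score (0-100) for the current prototype."""
    return min(100, quality)

def design_process(goal=90):
    quality = 30     # first pass: needs identified, requirements, initial design
    iterations = 0
    while evaluate(quality) < goal:   # activity 4: evaluate what has been built
        quality += 20                 # activities 2-3: redesign and re-prototype
        iterations += 1
    return iterations

print(design_process())  # number of redesign iterations until the goal is met
```

The point of the loop is the third key characteristic above: iteration through the four activities is inevitable, because the first design almost never evaluates well enough.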


What is involved in the process of interaction design?
•  Evaluating what has been built is very much at the heart of interaction design.
•  Its focus is on ensuring that the product is usable.
•  It is usually addressed through a user-centered approach to design, which, as the name suggests, seeks to involve users throughout the design process.
•  There are many different ways of achieving this: for example, through observing users, talking to them, interviewing them, testing them using performance tasks, modeling their performance, asking them to fill in questionnaires, and even asking them to become co-designers.


What is involved in the process of interaction design?
•  A main reason for having a better understanding of users is that different users have different needs, and interactive products need to be designed accordingly.
•  For example, children have different expectations from adults about how they want to learn or play. They may find interactive quizzes and cartoon characters helping them along to be highly motivating, whereas most adults find them annoying. So interactive products must be designed to match the needs of different kinds of users.
•  In addition to the four basic activities of design, there are three key characteristics of the interaction design process:
1.  Users should be involved throughout the development of the project.
2.  Specific usability and user experience goals should be identified, clearly documented, and agreed upon at the beginning of the project.
3.  Iteration through the four activities is inevitable.


The goals of interaction design
•  Usability goals: concerned with meeting specific usability criteria (e.g., efficiency).
•  User experience goals: concerned with explicating the quality of the user experience (e.g., being pleasing).


The goals of interaction design
•  Usability goals: ensuring that interactive products are
–  effective to use (effectiveness)
–  efficient to use (efficiency)
–  safe to use (safety)
–  of good utility (utility)
–  easy to learn (learnability)
–  easy to remember how to use (memorability)

The goals of interaction design
•  Effective to use (effectiveness): how good a system is at doing what it is supposed to do.
–  Question: Is the system capable of allowing people to learn well, carry out their work efficiently, access the information they need, buy the goods they want, and so on?
•  Efficient to use (efficiency): refers to the way a system supports users in carrying out their tasks.
–  The answering machine is considered efficient in that it lets the user carry out common tasks (e.g., listening to messages) through a minimal number of steps.
–  Question: Once users have learned how to use a system to carry out their tasks, can they sustain a high level of productivity?


The goals of interaction design
•  Safe to use (safety): involves protecting the user from dangerous conditions and undesirable situations.
•  To make computer-based systems safer in this sense involves
–  (i) preventing the user from making serious errors by reducing the risk of wrong keys/buttons being mistakenly activated (an example is not placing the quit or delete-file command right next to the save command on a menu), and (ii) providing users with various means of recovery should they make errors.
•  Question: Does the system prevent users from making serious errors and, if they do make an error, does it permit them to recover easily?


The goals of interaction design
•  Have good utility (utility): refers to the extent to which the system provides the right kind of functionality so that users can do what they need or want to do.
•  An example of a system with high utility is an accounting software package providing a powerful computational tool that accountants can use to work out tax returns.
•  An example of a system with low utility is a software drawing tool that does not allow users to draw free-hand but forces them to use a mouse to create their drawings, using only polygon shapes.
•  Question: Does the system provide an appropriate set of functions that enable users to carry out all their tasks in the way they want to do them?

The goals of interaction design
•  Learnability refers to how easy a system is to learn to use.
•  It is well known that people don't like spending a long time learning how to use a system. They want to get started straight away and become competent at carrying out tasks without too much effort.
•  A key concern is determining how much time users are prepared to spend learning a system. There seems little point in developing a range of functionality if the majority of users are unable or not prepared to spend time learning how to use it.
•  Question: How easy is it and how long does it take (i) to get started using a system to perform core tasks and (ii) to learn the range of operations to perform a wider set of tasks?


The goals of interaction design
•  Memorability refers to how easy a system is to remember how to use, once learned.
•  This is especially important for interactive systems that are used infrequently.
•  Users need to be helped to remember how to do tasks. There are many ways of designing the interaction to support memorability. For example, placing all the drawing tools in the same place on the screen can help the user remember where to look to find a particular tool at a given stage of a task.
•  Question: What kinds of interface support have been provided to help users remember how to carry out tasks, especially for systems and operations that are used infrequently?


Chapter  3  


The goals of interaction design - User experience goals
•  The realization that new technologies are offering increasing opportunities for supporting people in their everyday lives has led researchers and practitioners to consider further goals.
•  The emergence of technologies (e.g., virtual reality, the web, mobile computing) in a diversity of application areas has brought about a wider set of goals. Interactive products should be:
–  satisfying
–  enjoyable
–  fun
–  entertaining
–  helpful
–  motivating
–  aesthetically pleasing
–  supportive of creativity
–  rewarding
–  emotionally fulfilling


The goals of interaction design - User experience goals
•  The goals of designing interactive products to be fun, enjoyable, pleasurable, aesthetically pleasing and so on are concerned primarily with the user experience.
•  By this we mean what the interaction with the system feels like to the users.
•  For example, a new software package for children to create their own music may be designed with the primary objectives of being fun and entertaining. Hence, user experience goals differ from the more objective usability goals in that they are concerned with how users experience an interactive product from their own perspective.

The goals of interaction design - User experience goals

[Figure: usability and user experience goals. Usability goals are central to interaction design and are operationalized through specific criteria. User experience goals are shown in the outer circle and are less clearly defined.]

More on usability: design and usability principles
•  A number of design principles have been promoted.
•  The best known are concerned with how to determine what users should see and do when carrying out their tasks using an interactive product.
•  Here we briefly describe the most common ones: visibility, feedback, constraints, mapping, consistency, and affordances.
•  Each of these has been written about extensively by Don Norman (1988) in his bestseller The Design of Everyday Things.


More on usability: design and usability principles
1.  Visibility
•  The more visible functions are, the more likely users will know what to do next.
•  In contrast, when functions are "out of sight," it makes them more difficult to find and know how to use.
•  Norman (1988) describes the controls of a car to emphasize this point.
•  The controls for different operations are clearly visible (e.g., indicators, headlights, horn, hazard warning lights), indicating what can be done.

More on usability: design and usability principles
2.  Feedback
•  Feedback is about sending back information about what action has been done and what has been accomplished, allowing the person to continue with the activity.
•  Various kinds of feedback are available for interaction design: audio, tactile, verbal, visual, and combinations of these.
•  Deciding which combinations are appropriate for different kinds of activities and interactivities is central.
•  Using feedback in the right way can also provide the necessary visibility for user interaction.

More on usability: design and usability principles
3.  Constraints
•  The design concept of constraining refers to determining ways of restricting the kinds of user interaction that can take place at a given moment.
•  There are various ways this can be achieved. A common design practice in graphical user interfaces is to deactivate certain menu options by shading them gray,
•  thereby restricting the user to only the actions permissible at that stage of the activity.


More on usability: design and usability principles
•  A menu illustrating restricted availability of options as an example of logical constraining. Shaded areas indicate deactivated options.
•  +ve: prevents the user from selecting incorrect options and thereby reduces the chance of making a mistake.
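The deactivated-menu idea can be sketched in code. A toy text-editor menu in which Paste stays grayed out until something has been copied; the class, method, and option names are illustrative assumptions, not a real GUI API:

```python
# Logical constraining via deactivated menu options: the interface state
# determines which actions are currently permissible.

class Menu:
    def __init__(self):
        self.clipboard_full = False

    def enabled_options(self):
        """Which options are currently selectable (True) or grayed out (False)."""
        return {"Copy": True,
                "Paste": self.clipboard_full}  # constrained by system state

    def select(self, option):
        if not self.enabled_options()[option]:
            raise ValueError(f"{option} is deactivated (shown grayed out)")
        if option == "Copy":
            self.clipboard_full = True
        return f"{option} performed"

menu = Menu()
print(menu.enabled_options()["Paste"])  # False: the incorrect option can't be chosen
menu.select("Copy")
print(menu.enabled_options()["Paste"])  # True: Paste is now a permissible action
```

The constraint is "logical" in Norman's sense: a user can reason about why Paste is gray (nothing has been copied yet) and deduce what action is required first.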


More on usability: design and usability principles
•  Norman (1999) classifies constraints into three categories: physical, logical, and cultural.
3.1  Physical constraints
•  Refer to the way physical objects restrict the movement of things.
•  For example, the way an external disk can be placed into a disk drive is physically constrained by its shape and size, so that it can be inserted in only one way.
•  Likewise, keys on a pad can usually be pressed in only one way.


More on usability: design and usability principles
3.2  Logical constraints
•  Rely on people's common-sense reasoning about actions and their consequences.
•  Making actions and their effects visible enables people to logically deduce what further actions are required.
•  Disabling menu options when not appropriate for the task at hand provides logical constraining: it allows users to reason why (or why not) the options have been designed this way and what options are available.
Dr.  Salsabeel  Alabbady  

12  

More on usability: design and usability principles
3.3  Cultural constraints
•  Rely on learned conventions, like the use of red for warning, the use of certain kinds of audio signals for danger, and the use of the smiley face to represent happy emotions.
•  Most cultural constraints are arbitrary in the sense that their relationship with what is being represented is abstract, and could equally have evolved to be represented in another form (e.g., the use of yellow instead of red for warning).
•  Accordingly, they have to be learned. Once learned and accepted by a cultural group, they become universally accepted conventions.
Dr.  Salsabeel  Alabbady  

13  

More on usability: design and usability principles
4.  Mapping
•  This refers to the relationship between two things:
•  a relationship between what you want to do and what appears possible;
•  the relationship between controls and their effects in the world.

More on usability: design and usability principles

[Figures: two control layouts asking "Which controls which?", illustrating good and poor mapping between controls and their effects]

More on usability: design and usability principles
5.  Consistency
•  This refers to designing interfaces to have similar operations and use similar elements for achieving similar tasks.
•  In particular, a consistent interface is one that follows rules, such as using the same operation to select all objects.
•  A much more effective design solution is to create categories of commands that can be mapped into subsets of operations.

More on usability: design and usability principles
6.  Affordance
•  A term used to refer to an attribute of an object that allows people to know how to use it.
•  To afford means "to give a clue" (Norman, 1988). When the affordances of a physical object are perceptually obvious, it is easy to know how to interact with it.
•  For example, a door handle affords pulling, a cup handle affords grasping, and a mouse button affords pushing.
•  Graphical elements like buttons, icons, links, and scroll bars are talked about with respect to how to make it appear obvious how they should be used: icons should be designed to afford clicking, scroll bars to afford moving up and down, buttons to afford pushing.


TASK 5

•  This assignment is intended for you to put into practice what you have read about in this chapter. Specifically, the objective is to enable you to define usability and user experience goals and to use design and usability principles for evaluating the usability of an interactive product.
•  Find an example and examine how it has been designed, paying particular attention to how the user is meant to interact with it.


CHAPTER 3

Interaction Design ("INTERACTION DESIGN: beyond human-computer interaction" by Helen Sharp, Yvonne Rogers, Jenny Preece)

Introduction 2



In the previous two chapters we have looked at the human and the computer respectively.



We are interested in how the human user uses the computer as a tool to perform, simplify or support a task. In order to do this the user must communicate his requirements to the computer.



Human Computer system

There are a number of ways in which the user can communicate with the system. At one extreme is batch input, in which the user provides all the information to the computer at once and leaves the machine to perform the task.

Interactive process



At the other extreme are highly interactive input devices and paradigms, such as direct manipulation and the applications of virtual reality. Here the user is constantly providing instruction and receiving feedback. These are the types of interactive system we are considering.



In this chapter, we consider the communication between user and system: the interaction.

Introduction 3



Interaction involves at least two participants: the user and the system.



Both are complex, as we have seen, and are very different from each other in the way that they communicate and view the domain and the task.



The interface must therefore effectively translate between them to allow the interaction to be successful. This translation can fail at a number of points and for a number of reasons.



The use of models of interaction can help us to understand exactly what is going on in the interaction and identify the likely root of difficulties.



They also provide us with a framework to compare different interaction styles and to consider interaction problems.

The execution-evaluation cycle

Need: documenting work done
Task: save my sketch
Goal: safely store the sketch in a place from which I can fetch it

The execution/evaluation loop (goal → execution → evaluation → system):
•  user establishes the goal
•  formulates intention
•  specifies actions at interface
•  executes action
•  perceives system state
•  interprets system state
•  evaluates system state with respect to goal


Donald Norman's model

Seven stages:
•  user establishes the goal
•  formulates intention
•  specifies actions at interface
•  executes action
•  perceives system state
•  interprets system state
•  evaluates system state with respect to goal

Norman's model concentrates on the user's view of the interface.
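The seven stages can be sketched as a tiny program loop. This is an illustrative Python toy of my own, not code from the book; all names (`run_cycle`, the lamp dictionary) are invented:

```python
# Toy walk through Norman's execution-evaluation cycle, using the
# "get more light" lamp example from these slides. Invented names,
# purely for illustration.

def run_cycle(execute, perceive, goal_met, max_tries=10):
    """Repeat execution (stages 1-4) and evaluation (stages 5-7)
    until the perceived system state satisfies the goal."""
    for _ in range(max_tries):
        execute()            # formulate intention, specify and execute actions
        state = perceive()   # perceive and interpret the system state
        if goal_met(state):  # evaluate the state with respect to the goal
            return state
    raise RuntimeError("goal not reached; reformulate the goal")

# Example: press the lamp switch until there is enough light.
lamp = {"on": False}
state = run_cycle(
    execute=lambda: lamp.update(on=True),
    perceive=lambda: dict(lamp),
    goal_met=lambda s: s["on"],
)
print(state)  # {'on': True}
```

If the goal is not met, a real user formulates a new goal and repeats the cycle, which is what the loop models.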

The execution-evaluation cycle 10



If the system state reflects the user’s goal then the computer has done what he wanted and the interaction has been successful; otherwise the user must formulate a new goal and repeat the cycle.

The execution-evaluation cycle 11



Norman uses a simple example of switching on a light to illustrate this cycle. 

Imagine you are sitting reading as evening falls.



You decide you need more light;



that is you establish the goal to get more light.



From there you form an intention to switch on the desk lamp,



and you specify the actions required, to reach over and press the lamp switch.



If someone else is closer the intention may be different – you may ask them to switch on the light for you.



Your goal is the same but the intention and actions are different.



When you have executed the action you perceive the result,



either the light is on or it isn’t and you interpret this, based on your knowledge of the world.



For example, if the light does not come on you may interpret this as indicating the bulb has blown or the lamp is not plugged into the mains,



and you will formulate new goals to deal with this.



If the light does come on, you will evaluate the new state according to the original goals – is there now enough light? If so, the cycle is complete. If not, you may formulate a new intention to switch on the main ceiling light as well.

The execution-evaluation cycle 12

Using Norman's model: some systems are harder to use than others.

Gulf of Execution: from the user's formulation of actions to the actions allowed by the system.

The gulf of execution is the difference between the user's formulation of the actions to reach the goal and the actions allowed by the system. If the actions allowed by the system correspond to those intended by the user, the interaction will be effective. The interface should therefore aim to reduce this gulf.

Gulf of Evaluation: from the user's expectation of the changed system state to the actual presentation of this state.

The gulf of evaluation is the distance between the physical presentation of the system state and the expectation of the user. If the user can readily evaluate the presentation in terms of his goal, the gulf of evaluation is small. The more effort that is required on the part of the user to interpret the presentation, the less effective the interaction.


The execution-evaluation cycle 16

•  If the actions allowed by the system correspond to those intended by the user, the interaction will be effective.
•  The Gulf of Evaluation is the distance between the physical presentation of the system state and the expectation of the user.
•  The more effort that is required on the part of the user to interpret the presentation, the less effective the interaction.
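As a toy illustration (my own sketch, not from the slides), the gulf of execution can be pictured as the mismatch between the actions a user formulates and the actions the system actually allows:

```python
# Invented example: the gulf of execution as the set of actions the
# user wants to perform that the system does not support.

def gulf_of_execution(intended_actions, allowed_actions):
    """Actions the user formulates that the system does not allow."""
    return set(intended_actions) - set(allowed_actions)

user_plan = {"press ctrl+s", "drag sketch to folder", "say 'save'"}
system_vocabulary = {"press ctrl+s", "menu: File > Save"}

gap = gulf_of_execution(user_plan, system_vocabulary)
# A non-empty gap means the user must reformulate their actions.
print(sorted(gap))  # ['drag sketch to folder', "say 'save'"]
```

An empty gap corresponds to an effective interaction: everything the user intends maps onto something the system allows.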

Human error: slips and mistakes

Slip: the user understands the system and the goal and formulates the action correctly, but performs an incorrect action.
Mistake: the user may not even have the right goal!

Fixing things?
•  slip: better interface design
•  mistake: better understanding of the system

Human error: slips and mistakes

Slip: you understand the system and the goal and formulate the action correctly, but you mistype or accidentally press the mouse button at the wrong time (incorrect action).
Mistake: you may think that the magnifying glass icon is the 'find' function, but in fact it is there to magnify the text.

Fixing things?
•  slip: better interface design
•  mistake: better understanding of the system

Relevance for interaction design 19

Mental Models

People have mental models of how things work:
•  how does your car start?
•  how does an ATM machine work?
•  how does your computer boot?

Mental models allow people to make predictions about how things will work.

Based on slide by Saul Greenberg

Mental Model 21

If you’ve used an iPad before, your mental model of reading a book on an iPad will be different than that of someone who has never used one, or doesn’t even know what iPads are. If you’ve been using a Kindle for the past year, then your mental model will be different from someone who has never read a book electronically. And once you get the iPad and read a couple of books on it, whichever mental model you had in your head before will start to change and adjust to reflect your experience.

Mental Model 22

Mental Model 23

Step 1: Write the first word that comes to mind when you hear the following words: COLOR, FURNITURE, FLOWER.

Step 2: How many said red for color? How many said blue? For furniture: how many said chair? couch? For flower: how many said rose? daisy?

Mental Models

Mental models are built from:
•  affordances
•  constraints
•  mappings
•  positive transfer
•  cultural associations/standards
•  instructions
•  interactions

Mental models are often wrong!

Based on slide by Saul Greenberg

[Figure: an unusual bicycle design. Our mental models of how bicycles work can simulate it and tell us it won't work. Slide adapted from Saul Greenberg]

Norman's Action Cycle

Human action has two primary aspects:
•  Execution: doing something
•  Evaluation: comparison of what happened to what was desired

[Diagram: Norman's action cycle. Start at Goals. The execution side descends through "intention to act", "sequence of actions", and "execution of the sequence of actions" to The World; the evaluation side ascends from The World through "perceiving the state of the world", "interpreting the perception", and "evaluation of interpretations" back to Goals.]

Norman's Action Cycle

Execution has three stages:
•  start with a goal
•  translate it into an intention
•  translate that into a sequence of actions
Now execute the actions.

Evaluation has three stages:
•  perceive the world
•  interpret what was perceived
•  compare with respect to the original intentions

Gulf of Evaluation

The amount of effort a person must exert to interpret:
•  the physical state of the system
•  how well the expectations and intentions have been met

We want a small gulf!

Example: Scissors

•  affordances: holes for insertion of fingers; blades for cutting
•  constraints: big hole for several fingers, small hole for thumb
•  mapping: between holes and fingers, suggested and constrained by appearance
•  positive transfer: learnt when young
•  conceptual model: implications clear of how the operating parts work

Based on slide by Saul Greenberg

Bad Example: Digital Watch

•  affordances: four push buttons, not clear what they do
•  constraints and mapping: unknown; no visible relation between buttons and the end-result of their actions
•  negative transfer: little association with analog watches
•  cultural standards: somewhat standardized functionality, but highly variable
•  conceptual model: must be taught; not obvious

How to design a better one?

Based on slide by Saul Greenberg

Norman's HCI model 33

Norman's HCI model consists of three parts: the User's Mental Model, the System Image Model, and the Conceptual Model.

The User's Mental Model (something the user forms):
•  the model of a machine's working that a user creates when learning and using a computer
•  it is not technically accurate, and it may not be stable over time
•  users' mental models keep changing and evolving as learning continues
•  the mental model of a device is formed by interpreting its perceived actions and its visible structure

Norman's HCI model 34

The Conceptual Model (something the designer creates):
•  the technically accurate model of the computer/device/system created by designers, teachers, or researchers for their specific internal technical use
•  the conceptual model of the system needs to be as close as possible to the System Image Model

The System Image Model is the system's appearance and the way it operates and responds; the model the user develops to explain the operation of the system is usually a blend of the user's mental model and the conceptual model rolled into one.

Ideally, the design model and the user model should be as close as possible for the system to be accepted.

Relevance for interaction design 35

Norman's HCI model 36

•  Design model: the designer's conceptual model.
•  System model: a model of the way the system works.
•  System image: results from the physical structure of what has been built (including documentation, instructions, labels); it is what the user "sees".
•  User's model: the "mental model" developed by the user through interaction with the system.
•  The user tries to match the mental model to the system model.

Assignment

•  Draw the user's mental model for a transfer of money from one account to another on an ATM.
•  Using Norman's seven stages, draw a Norman interaction diagram for 2 tasks in any application software of your choice.

What is interaction design? 38









“Designing interactive products to support people in their everyday and working lives” In particular, it is about creating user experiences that enhance and extend the way people work, communicate and interact. In this sense, it is about finding ways of supporting people. This contrasts with software engineering, which focuses primarily on the production of software solutions for given applications.

What is interaction design? 39



A simple analogy to another profession, concerned with creating buildings, may clarify this distinction. In his account of interaction design, Terry Winograd asks how architects and civil engineers differ when faced with the problem of building a house.



Architects are concerned with the people and their interactions with each other and within the house being built. For example, is there the right mix of family and private spaces? Are the spaces for cooking and eating in close proximity? Will people live in the space being designed in the way it was intended to be used?



In contrast, engineers are interested in issues to do with realizing the project. These include practical concerns like cost, durability, structural aspects, environmental aspects, fire regulations, and construction methods.



Just as there is a difference between designing and building a house, so too, is there a distinction between interaction design and software engineering.

Working together as a multidisciplinary team 40







It has always been acknowledged that for interaction design to succeed many disciplines need to be involved. The importance of understanding how users act and react to events and how they communicate and interact together has led people from a variety of disciplines, such as psychologists and sociologists, to become involved. Likewise, the growing importance of understanding how to design different kinds of interactive media in effective and aesthetically pleasing ways has led to a diversity of other practitioners becoming involved, including graphic designers, artists, animators, photographers, film experts, and product designers.

Working together as a multidisciplinary team 41



Bringing together so many people with different backgrounds and training has meant many more ideas being generated, new methods being developed, and more creative and original designs being produced. But:
•  the costs involved will be higher
•  it can be more difficult to communicate and move the designs forward (people with different backgrounds have different perspectives and ways of seeing and talking about the world)

Working together as a multidisciplinary team 42





In practice is that confusion, misunderstanding, and communication breakdowns can often surface in a team. Other problems can arise when a group of people is "thrown" together who have not worked as a team.

The relationship between interaction design, HCI, and other approaches 43



We view interaction design as fundamental to all disciplines, fields, and approaches that are concerned with researching and designing computer-based systems for people.

What is involved in the process of interaction design? 44



Essentially, the process of interaction design involves four basic activities:
1. Identifying needs and establishing requirements.
2. Developing alternative designs that meet those requirements.
3. Building interactive versions of the designs so that they can be communicated and assessed.
4. Evaluating what is being built throughout the process.

These activities are intended to inform one another and to be repeated.
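The iterative relationship between the activities can be sketched as a loop. This is a hedged Python sketch of my own; the evaluation stub is invented purely so the example runs (in reality, activity 4 means evaluating with real users):

```python
# Sketch of the four activities as an iterative loop. The evaluate()
# stub stands in for real user evaluation and is invented here.

def evaluate(prototype):
    """Stub evaluation: pretend the goals are met from version 2 on."""
    return prototype["version"] >= 2

def interaction_design(requirements):
    prototype = {"version": 0, "requirements": list(requirements)}
    while True:
        prototype["version"] += 1   # activities 2-3: design and build
        if evaluate(prototype):     # activity 4: evaluate throughout
            return prototype
        # activity 1 revisited: findings feed back into the requirements
        prototype["requirements"].append("revision after evaluation")

final = interaction_design(["store the sketch safely"])
print(final["version"])  # 2
```

The point of the loop structure is that evaluation is not a final gate but the driver of each iteration.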

What is involved in the process of interaction design? 45



 



Evaluating what has been built is very much at the heart of interaction design. Its focus is on ensuring that the product is usable. It is usually addressed through a user-centered approach to design, which, as the name suggests, seeks to involve users throughout the design process. There are many different ways of achieving this: for example, through observing users, talking to them, interviewing them, testing them using performance tasks, modeling their performance, asking them to fill in questionnaires, and even asking them to become codesigners.

What is involved in the process of interaction design? 46



A main reason for having a better understanding of users is that different users have different needs and interactive products need to be designed accordingly. 

For example, children have different expectations about how they want to learn or play from adults. They may find having interactive quizzes and cartoon characters helping them along to be highly motivating, whereas most adults find them annoying. So interactive products must be designed to match the needs of different kinds of users.



In addition to the four basic activities of design (requirements, design, build and evaluate), there are three key characteristics of the interaction design process:



1. Users should be involved through the development of the project.



2. Specific usability and user experience goals should be identified, clearly documented, and agreed upon at the beginning of the project.



3. Iteration through the four activities is inevitable.

The goals of interaction design 47



Usability goals: concerned with meeting specific usability criteria (e.g., efficiency)



User experience goals: concerned with explicating the quality of the user experience (e.g., to be pleasing)

The goals of interaction design - Usability 48

Usability goals: ensuring that interactive products are
•  effective to use (effectiveness)
•  efficient to use (efficiency)
•  safe to use (safety)
•  have good utility (utility)
•  easy to learn (learnability)
•  easy to remember how to use (memorability)

The goals of interaction design - Usability 49

Effective to use (effectiveness): how good a system is at doing what it is supposed to do.
•  Question: Is the system capable of allowing people to learn well, carry out their work efficiently, access the information they need, buy the goods they want, and so on?

Efficient to use (efficiency): refers to the way a system supports users in carrying out their tasks.
•  An answering machine is considered efficient in that it lets the user carry out common tasks (e.g., listening to messages) through a minimal number of steps.
•  Question: Once users have learned how to use a system to carry out their tasks, can they sustain a high level of productivity?

The goals of interaction design - Usability 50

Safe to use (safety): involves protecting the user from dangerous conditions and undesirable situations. Making computer-based systems safer in this sense involves:
•  (i) preventing the user from making serious errors by reducing the risk of wrong keys/buttons being mistakenly activated (an example is not placing the quit or delete-file command right next to the save command on a menu), and
•  (ii) providing users with various means of recovery should they make errors.

Question: Does the system prevent users from making serious errors and, if they do make an error, does it permit them to recover easily?
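The two safety tactics, error prevention and recovery, can be sketched in code. This is a minimal Python sketch under assumed names (a hypothetical `Editor` class, not a real editor API): a confirmation step guards the destructive command, and an undo stack provides the means of recovery:

```python
# Invented illustration of (i) error prevention via confirmation and
# (ii) error recovery via an undo stack.

class Editor:
    def __init__(self):
        self.files = {"report.txt": "draft"}
        self._undo = []  # recovery: stack of (description, restore_fn)

    def delete_file(self, name, confirmed=False):
        if not confirmed:  # prevention: destructive actions need confirmation
            return f"Really delete {name}? Call again with confirmed=True."
        content = self.files.pop(name)
        self._undo.append(("delete " + name,
                           lambda: self.files.__setitem__(name, content)))
        return f"{name} deleted (undo available)."

    def undo(self):
        if not self._undo:
            return "Nothing to undo."
        description, restore = self._undo.pop()
        restore()
        return f"Undid: {description}"

ed = Editor()
print(ed.delete_file("report.txt"))                  # asks for confirmation
print(ed.delete_file("report.txt", confirmed=True))  # deletes, undoable
print(ed.undo())                                     # file restored
```

Menu layout (not placing delete next to save) addresses the same prevention goal at the presentation level; the confirmation/undo pair addresses it at the behavior level.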

The goals of interaction design - Usability 51

Have good utility (utility): refers to the extent to which the system provides the right kind of functionality so that users can do what they need or want to do.
•  An example of a system with high utility is an accounting software package providing a powerful computational tool that accountants can use to work out tax returns.
•  An example of a system with low utility is a software drawing tool that does not allow users to draw free-hand but forces them to use a mouse to create their drawings, using only polygon shapes.
•  Question: Does the system provide an appropriate set of functions that enable users to carry out all their tasks in the way they want to do them?

The goals of interaction design - Usability 52

Learnability refers to how easy a system is to learn to use. 

It is well known that people don't like spending a long time learning how to use a system. They want to get started straight away and become competent at carrying out tasks without too much effort.



A key concern is determining how much time users are prepared to spend learning a system. There seems little point in developing a range of functionality if the majority of users are unable or not prepared to spend time learning how to use it.



Question: How easy is it and how long does it take (i) to get started using a system to perform core tasks and (ii) to learn the range of operations to perform a wider set of tasks?

The goals of interaction design - Usability 53

Measuring Learnability

ISO Standard 9241-11 provides the following guidance on measuring learnability:

Effectiveness measures:
•  number of functions learned
•  percentage of users who manage to learn to criterion

Efficiency measures:
•  time to learn to criterion
•  time to re-learn to criterion

Satisfaction measures:
•  rating scale for ease of learning
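These measures are straightforward to compute from test-session data. A Python sketch with invented sample data and an assumed learning criterion (the numbers are illustrative, not from any real study):

```python
# Toy computation of ISO-style learnability measures. All data values
# and the criterion threshold are invented for illustration.

sessions = [
    {"user": "A", "functions_learned": 8, "minutes_to_criterion": 30},
    {"user": "B", "functions_learned": 5, "minutes_to_criterion": 55},
    {"user": "C", "functions_learned": 9, "minutes_to_criterion": 25},
]
CRITERION_FUNCTIONS = 7  # assumed "learned to criterion" threshold

learned = [s for s in sessions
           if s["functions_learned"] >= CRITERION_FUNCTIONS]
pct_to_criterion = 100 * len(learned) / len(sessions)       # effectiveness
avg_time = (sum(s["minutes_to_criterion"] for s in learned)
            / len(learned))                                 # efficiency

print(f"{pct_to_criterion:.0f}% learned to criterion")  # 67% learned to criterion
print(f"avg time to criterion: {avg_time:.1f} min")     # 27.5 min
```

Satisfaction would be gathered separately, for example as a rating-scale questionnaire after the session.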

The goals of interaction design - Usability 54

Improving the Learnability of a User Interface

Researchers have identified seven factors that impact the learnability of a user interface:
•  Visibility of commands and menu options: make commands and menu options highly visible and easy to find, for example near the object that they impact (a right click on an object displays a list of available operations).
•  Command feedback: provide feedback messages that a user command has succeeded, or advise of failure.
•  Continuity of task sequences: when a user starts a command from a menu or by clicking on an icon, provide direction until the task sequence is complete, for example by providing a sequence of dialog boxes or instructions in a status bar. Do not require users to jump from one menu to another while performing a single task.
•  Design conventions: use design conventions that are common to office and web products, for example the common menu structure Edit > Paste.
•  Help presentation: provide enhanced descriptions for user interface components: dialog boxes, fields that require input, and image details. For example, a user password registration field is labelled "password must be > 6 characters and have letters and numbers".
•  Error prevention: avoid operations that do not succeed because of some simple and predictable mistake. Provide guidance if an inappropriate command is activated, or enable commands only when they can be used in the correct context.

The goals of interaction design - Usability 55

Memorability refers to how easy a system is to remember how to use, once learned.
•  This is especially important for interactive systems that are used infrequently. Users need to be helped to remember how to do tasks.
•  There are many ways of designing the interaction to support this. For example, placing all the drawing tools in the same place on the screen can help the user remember where to look to find a particular tool at a given stage of a task.

Question: What kinds of interface support have been provided to help users remember how to carry out tasks, especially for systems and operations that are used infrequently?

The goals of interaction design - User experience goals 56





The realization that new technologies are offering increasing opportunities for supporting people in their everyday lives has led researchers and practitioners to consider further goals. The emergence of technologies (e.g., virtual reality, the web, mobile computing) in a diversity of application areas (e.g., entertainment, education, home, public areas) has brought about a much wider set of concerns. As well as focusing primarily on improving efficiency and productivity at work, interaction design is increasingly concerning itself with creating systems that are:
•  satisfying
•  enjoyable
•  fun
•  entertaining
•  helpful
•  motivating
•  aesthetically pleasing
•  supportive of creativity
•  rewarding
•  emotionally fulfilling

The goals of interaction design - User experience goals 57







The goals of designing interactive products to be fun, enjoyable, pleasurable, aesthetically pleasing and so on are concerned primarily with the user experience. By this we mean what the interaction with the system feels like to the users. For example, a new software package for children to create their own music may be designed with the primary objectives of being fun and entertaining. Hence, user experience goals differ from the more objective usability goals in that they are concerned with how users experience an interactive product from their perspective, rather than assessing how useful or productive a system is from its own perspective. The relationship between the two is shown in Figure 1.7.

The goals of interaction design - User experience goals 58

Usability and user experience goals.

Usability goals are central to interaction design and are operationalized through specific criteria. User experience goals are shown in the outer circle and are less clearly defined.

More on usability: design and usability principles 59

 





A number of design principles have been promoted. The best known are concerned with how to determine what users should see and do when carrying out their tasks using an interactive product. Here we briefly describe the most common ones: visibility, feedback, constraints, mapping, consistency, and affordances. Each of these has been written about extensively by Don Norman (1988) in his bestseller The Design of Everyday Things.

More on usability: design and usability principles 60

1. Visibility
•  The more visible functions are, the more likely users will be able to know what to do next. In contrast, when functions are "out of sight," it makes them more difficult to find and know how to use.
•  Norman (1988) describes the controls of a car to emphasize this point. The controls for different operations are clearly visible (e.g., indicators, headlights, horn, hazard warning lights), indicating what can be done. The relationship between the way the controls have been positioned in the car and what they do makes it easy for the driver to find the appropriate control for the task at hand.

More on usability: design and usability principles 61

2. Feedback
•  Feedback is about sending back information about what action has been done and what has been accomplished, allowing the person to continue with the activity.
•  Various kinds of feedback are available for interaction design: audio, tactile, verbal, visual, and combinations of these. Deciding which combinations are appropriate for different kinds of activities and interactivities is central.
•  Using feedback in the right way can also provide the necessary visibility for user interaction.

More on usability: design and usability principles 62

3. Constraints
•  The design concept of constraining refers to determining ways of restricting the kind of user interaction that can take place at a given moment.
•  There are various ways this can be achieved. A common design practice in graphical user interfaces is to deactivate certain menu options by shading them, thereby restricting the user to only the actions permissible at that stage of the activity.

More on usability: design and usability principles 63



A menu illustrating restricted availability of options as an example of logical constraining. Shaded areas indicate deactivated options.

Advantage: prevents the user from selecting incorrect options and thereby reduces the chance of making a mistake.

More on usability: design and usability principles 64





Norman (1999) classifies constraints into three categories: physical, logical, and cultural.

3.1 Physical constraints
•  Refer to the way physical objects restrict the movement of things.
•  For example, the way an external disk can be placed into a disk drive is physically constrained by its shape and size, so that it can be inserted in only one way.
•  Likewise, keys on a pad can usually be pressed in only one way.

More on usability: design and usability principles 65

 





3.2 Logical constraints
•  Rely on people's common-sense reasoning about actions and their consequences.
•  Making actions and their effects obvious enables people to logically deduce what further actions are required.
•  Disabling menu options when not appropriate for the task in hand provides logical constraining: it allows users to reason why (or why not) options have been designed this way and what options are available.
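Logical constraining of menu options can be sketched as a small availability rule. An illustrative Python sketch with invented names (no real GUI toolkit is used): each command is enabled only when the current state makes it meaningful.

```python
# Invented sketch of logical constraining: menu options are greyed
# out (disabled) when the current state makes them meaningless.

def menu_state(clipboard, selection):
    """Return each command with its enabled/disabled status."""
    return {
        "Cut":   bool(selection),        # needs something selected
        "Copy":  bool(selection),
        "Paste": clipboard is not None,  # needs something on the clipboard
    }

print(menu_state(clipboard=None, selection=""))
# {'Cut': False, 'Copy': False, 'Paste': False}  -> all greyed out
print(menu_state(clipboard="text", selection="word"))
# {'Cut': True, 'Copy': True, 'Paste': True}
```

Because disabled options stay visible, users can still reason about what the interface offers and deduce what must happen first (select something, copy something) before an option becomes available.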

More on usability: design and usability principles 66

 





3.3 Cultural constraints
•  Rely on learned conventions, like the use of red for warning, the use of certain kinds of audio signals for danger, and the use of the smiley face to represent happy emotions.
•  Most cultural constraints are arbitrary in the sense that their relationship with what is being represented is abstract, and could equally have evolved to be represented in another form (e.g., the use of yellow instead of red for warning). Accordingly, they have to be learned.
•  Once learned and accepted by a cultural group, they become universally accepted conventions.

More on usability: design and usability principles 67

4. Mapping
•  This refers to the relationship between two things.
•  A relationship between what you want to do and what appears possible.
•  The relationship between controls and their effects in the world.

More on usability: design and usability principles 68

Which controls which?

More on usability: design and usability principles 69

Which controls which?

More on usability: design and usability principles 70

5. Consistency



This refers to designing interfaces to have similar operations and use similar elements for achieving similar tasks.



In particular, a consistent interface is one that follows rules, such as using the same operation to select all objects. For example, a consistent operation is using the same input action to highlight any graphical object at the interface, such as always clicking the left mouse button.



A much more effective design solution is to create categories of commands that can be mapped into subsets of operations. For the word-processing application, the hundreds of operations available are categorized into subsets of different menus. All commands that are concerned with file operations (e.g., save, open, close) are placed together in the same file menu
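The command-category idea can be sketched as a simple mapping. An illustrative Python sketch; the menu names follow the word-processor example above, while the helper function and command lists are invented:

```python
# Invented sketch: commands mapped into categories, so related
# operations always appear under the same menu.

MENUS = {
    "File": ["New", "Open", "Save", "Close"],
    "Edit": ["Cut", "Copy", "Paste"],
    "Format": ["Bold", "Italic", "Underline"],
}

def menu_for(command):
    """A consistent rule: every command lives in exactly one category."""
    for menu, commands in MENUS.items():
        if command in commands:
            return menu
    raise KeyError(f"unmapped command: {command}")

print(menu_for("Save"))   # File
print(menu_for("Paste"))  # Edit
```

The consistency benefit is that once users learn the rule "file operations live under File", they can predict where an unfamiliar command will be found.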

More on usability: design and usability principles 71

6. Affordance



Is a term used to refer to an attribute of an object that allows people to know how to use it.



To afford means "to give a clue" (Norman, 1988). When the affordances of a physical object are perceptually obvious it is easy to know how to interact with it.



For example, a door handle affords pulling, a cup handle affords grasping, and a mouse button affords pushing.



Graphical elements like buttons, icons, links, and scroll bars are talked about with respect to how to make it appear obvious how they should be used: icons should be designed to afford clicking, scroll bars to afford moving up and down, buttons to afford pushing.

TASK 6 – Obligatory 72





This assignment is intended for you to put into practice what you have read about in this chapter. Specifically, the objective is to enable you to define usability and user experience goals and to use design and usability principles for evaluating the usability of an interactive product. Find an example and examine how it has been designed, paying particular attention to how the user is meant to interact with it.

Chapter 3

“THE PROCESS OF INTERACTION DESIGN”

Dr. Salsabeel Alabbady

Interaction Design: Four Basic Activities

Four basic activities for interaction design:
1. Identifying needs and establishing requirements
2. Developing alternative designs
3. Building interactive versions of the designs
4. Evaluating designs

Interaction Design Activities

1. Identifying needs and establishing requirements
• In order to design something to support people, we must know who our target users are, and what kind of support an interactive product could usefully provide.
• These needs form the basis of the product's requirements and underpin subsequent design and development.

Interaction Design Activities

2. Developing alternative designs
• This is the core activity of designing: actually suggesting ideas for meeting the requirements.
• This activity can be broken up into two sub-activities: conceptual design and physical design.
• Conceptual design involves producing the conceptual model for the product; a conceptual model describes what the product should do, how it should behave, and what it should look like.
• Physical design considers the detail of the product, including the colors, sounds, and images to use, menu design, and icon design.

Interaction Design Activities

3. Building interactive versions of the designs
• Interaction design involves designing interactive products. The most sensible way for users to evaluate such designs, then, is to interact with them.
• This requires an interactive version of the designs to be built, but that does not mean that a software version is required.
• There are different techniques for achieving "interaction", not all of which require a working piece of software.
• For example, paper-based prototypes are very quick and cheap to build and are very effective for identifying problems in the early stages of design, and through role-playing users can get a real sense of what it will be like to interact with the product.

Interaction Design Activities

4. Evaluating designs
• Evaluation is the process of determining the usability and acceptability of the product or design, measured in terms of a variety of criteria, including:
  — the number of errors users make using it,
  — how appealing it is,
  — how well it matches the requirements, and so on.
• Interaction design requires a high level of user involvement throughout development, and this enhances the chances of an acceptable product being delivered.

A simple lifecycle model for interaction design

Identify needs/establish requirements → (Re)Design → Build an interactive version → Evaluate → Final product
(Evaluation feeds back into redesign; the cycle repeats until the design is acceptable.)
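The cycle in this model can be sketched as a simple iteration loop. This is an illustrative sketch only: the activity names follow the diagram, while the function signatures and the stub activities in the demo are invented for illustration.

```python
# Illustrative sketch of the simple lifecycle: (re)design, build an
# interactive version, and evaluate repeat until the result is acceptable.
def interaction_design_lifecycle(establish_requirements, design, build,
                                 evaluate, max_iterations=10):
    requirements = establish_requirements()
    product = None
    for _ in range(max_iterations):
        candidate = design(requirements, product)   # (re)design
        product = build(candidate)                  # build an interactive version
        if evaluate(product, requirements):         # evaluate
            break                                   # acceptable: final product
    return product

# Tiny demo with stub activities: each redesign bumps a version number,
# and evaluation accepts version 3.
final = interaction_design_lifecycle(
    establish_requirements=lambda: {},
    design=lambda reqs, prev: {"version": (prev["version"] + 1) if prev else 1},
    build=lambda candidate: candidate,
    evaluate=lambda product, reqs: product["version"] >= 3,
)
print(final)  # {'version': 3}
```

The point of the sketch is the shape of the process, not the stubs: evaluation sits inside the loop, so a design that fails evaluation is redesigned rather than shipped.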

Three key characteristics

Three key characteristics permeate these four activities:
1. Focus on users early in the design and evaluation of the artifact.
2. Identify, document and agree specific usability and user experience goals.
3. Iteration is inevitable: designers never get it right the first time.

Some practical issues

These questions must be answered if we are going to be able to "do" interaction design in practice:
• Who are the users?
• What are 'needs'?
• Where do alternatives come from?
• How do you choose among alternatives?

Who are the users?

• Not as obvious as you think:
  — those who interact directly with the product
  — those who manage direct users
  — those who receive output from the product
  — those who test the system
  — those who make the purchasing decision
  — those who use competitors' products
• Three categories of user:
  — primary: frequent hands-on users of the system
  — secondary: occasional users, or users via someone else
  — tertiary: those affected by the introduction of the system, or who will influence its purchase
• Wider term: stakeholders

Who are the users?

• Wider term: stakeholders. There is a surprisingly wide collection of people who all have a stake in the development of a successful product. These people are called stakeholders.
• Stakeholders are "people or organizations who will be affected by the system and who have a direct or indirect influence on the system requirements" (Kotonya and Sommerville, 1998).

Who are the users? (cont'd)

• What are their capabilities? Humans vary in many dimensions. Some examples:
  — size of hands may affect the size and positioning of input buttons;
  — motor abilities may affect the suitability of certain input and output devices;
  — height matters if designing a physical kiosk;
  — strength: a child's toy requires little strength to operate, but greater strength to change batteries.

What are 'needs'?

• Users rarely know what is possible.
• Users can't tell you what they 'need' to help them achieve their goals.
• Instead, look at existing tasks:
  — their context
  — what information do they require?
  — who collaborates to achieve the task?
  — why is the task achieved the way it is?
• Envisioned tasks:
  — can be rooted in existing behaviour
  — can be described as future scenarios

What are 'needs'?

• If a product is a new invention, then it can be difficult to identify the users and representative tasks for them.
• E.g., before microwave ovens were invented, there were no users to consult about requirements and there were no representative tasks to identify.
• Those developing the oven had to imagine who might want to use such an oven and what they might want to do with it.

Where do alternatives come from?

• Humans stick to what they know works.
• But considering alternatives is important to 'break out of the box'.
• Designers are trained to consider alternatives; software people generally are not.
• How do you generate alternatives?
  — 'Flair and creativity': research and synthesis
  — Seek inspiration: look at similar products, or look at very different products

Where do alternatives come from?

• Normally, innovations arise through cross-fertilization of ideas from different applications, the evolution of an existing product through use and observation, or straightforward copying of other, similar products.
• For example, if you think of something commonly believed to be an "invention," such as the steam engine, this was in fact inspired by the observation that the steam from a kettle boiling on the stove lifted the lid.
• Clearly there was an amount of creativity and engineering involved in making the jump from a boiling kettle to a steam engine, but the kettle provided the inspiration to translate experience gained in one context into a set of principles that could be applied in another.
• As an example of evolution, consider the word processor. The capabilities of suites of office software have gradually increased from the time they first appeared. Initially, a word processor was just an electronic version of a typewriter, but gradually other capabilities, including the spell-checker, thesaurus, style sheets, graphical capabilities, etc., were added.

How do you choose among alternatives?

• Choosing among alternatives is about making design decisions:
  — Will the device use keyboard entry or a touch screen?
  — Will the device provide an automatic memory function or not?
• These decisions will be informed by the information gathered about users and their tasks, and by the technical feasibility of an idea.
• The decisions fall into two categories:
  — those that are about externally visible and measurable features: e.g., externally visible and measurable factors for a building design include the ease of access to the building, the amount of natural light in rooms, the width of corridors, and the number of power outlets. In a photocopier, externally visible and measurable factors include the physical size of the machine, the speed and quality of copying, the different sizes of paper it can use, and so on.
  — those that are about characteristics internal to the system that cannot be observed or measured without dissecting it: e.g., the number of power outlets will depend on how the wiring within the building is designed and the capacity of the main power supply; the choice of materials used in a photocopier may depend on its friction rating and how much it deforms under certain conditions.

How do you choose among alternatives?

• Evaluation with users or with peers, e.g., of prototypes.
• Prototyping is used to overcome potential client misunderstandings and to test the technical feasibility of a suggested design and its production.
• Technical feasibility: some designs are simply not possible.
• Quality thresholds: usability goals lead to usability criteria (set early and checked regularly):
  — safety: how safe?
  — utility: which functions are superfluous?
  — effectiveness: appropriate support? task coverage, information available
  — efficiency: performance measurements


A simple lifecycle model for interaction design

Identify needs/establish requirements → (Re)Design → Build an interactive version → Evaluate → Final product
(Evaluation feeds back into redesign; the cycle repeats until the design is acceptable.)

Identifying needs: What, how, and why?

• What are we trying to achieve in this design activity? There are two aims:
• One aim is to understand as much as possible about the users, their work, and the context of that work, so that the system under development can support them in achieving their goals; this we call "identifying needs."
• The second aim is to produce, from the needs identified, a set of stable requirements that form a sound basis to move forward into thinking about design.

Identifying needs: What, how, and why?

• How can we achieve this?
• At the beginning of the requirements activity, we know that we have a lot to find out and to clarify.
• At the end of the activity we will have a set of stable requirements that can be moved forward into the design activity.
• In the middle, there are activities concerned with gathering data, interpreting or analyzing the data, and capturing the findings in a form that can be expressed as requirements.

Identifying needs: What, how, and why?

• Why bother? The importance of getting it right.
• Causes of IT project failure (Taylor, 2000): the research involved detailed questioning of 38 IT professionals in the UK. When asked which project stages caused failure, respondents mentioned "requirements definition" more than any other phase.
• When asked about the cause of failure, "unclear objectives and requirements" was mentioned more than anything else,
• and for critical success factors, "clear, detailed requirements" was mentioned most often.

What are requirements?

• A requirement is a statement about an intended product that specifies what it should do or how it should perform.
• Requirements should be as specific, unambiguous, and clear as possible.
• E.g., a requirement for a website might be that the time to download any complete page is less than 5 seconds.
• Different kinds of requirements in SE: functional and non-functional requirements.
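A requirement such as the 5-second page-download limit above is only useful if it can be checked. A minimal sketch of turning it into a testable criterion follows; the function name and the sample timings are invented for illustration:

```python
# Hypothetical sketch: expressing "the time to download any complete page
# is less than 5 seconds" as a checkable criterion over measured load times.
PAGE_LOAD_LIMIT_SECONDS = 5.0

def meets_download_requirement(measured_load_times):
    """True only if every measured page load is under the limit."""
    return all(t < PAGE_LOAD_LIMIT_SECONDS for t in measured_load_times)

print(meets_download_requirement([1.2, 3.8, 4.9]))  # True
print(meets_download_requirement([1.2, 6.1]))       # False
```

Because the requirement says "any complete page", a single slow page fails the whole check, which is why the sketch uses `all` rather than an average.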

Different kinds of requirements

• Functional requirements: say what the system should do.
• Non-functional requirements: say what constraints there are on the system and its development.
• Examples:
  — A functional requirement for a word processor may be that it should support a variety of formatting styles. A non-functional requirement for a word processor might be that it must be able to run on a variety of platforms such as PCs, Macs and Unix machines.
  — A different kind of non-functional requirement would be that it must be delivered in six months' time. This represents a constraint on the development activity itself rather than on the product being developed.
  — If we consider interaction devices in general, other kinds of non-functional requirements become relevant, such as physical size, weight, and color.
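One lightweight way to keep the two kinds distinct in a requirements document is to record each statement together with its kind. This is a hypothetical sketch; the field names and the record structure are assumptions, not part of the text:

```python
# Hypothetical sketch: recording requirements with their kind so that
# functional and non-functional statements stay clearly separated.
from dataclasses import dataclass

@dataclass
class Requirement:
    kind: str       # "functional" or "non-functional"
    statement: str

requirements = [
    Requirement("functional", "Support a variety of formatting styles."),
    Requirement("non-functional", "Run on PCs, Macs, and Unix machines."),
    Requirement("non-functional", "Deliver in six months' time."),
]

functional = [r for r in requirements if r.kind == "functional"]
print(len(functional))  # 1
```

Tagging each statement makes it easy to review the two lists separately, e.g., constraints on the development process versus behaviour of the product itself.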

More kinds of requirements

1. Functional requirements: capture what the product should do.
• For example, a functional requirement for a smart fridge might be that it should be able to tell when the butter tray is empty.

2. Data requirements: capture the type, volatility, size/amount, persistence, accuracy, and value of the required data.
• All interactive devices have to handle greater or lesser amounts of data.
• For example, if the system under consideration is to operate in the share-dealing application domain, then the data must be up-to-date and accurate, and is likely to change many times a day. In the personal banking domain, data must be accurate, must persist over many months and probably years, and is very valuable, and there is likely to be a lot of it.

More kinds of requirements

3. Environmental requirements, or context of use, refer to the circumstances in which the interactive product will be expected to operate. Four aspects of the environment must be considered when establishing requirements:
  3.1 Physical
  3.2 Social
  3.3 Organizational
  3.4 Technical

• 3.1 First is the physical environment, such as how much lighting, noise, and dust is expected in the operational environment. Will users need to wear protective clothing, such as large gloves or headgear, that might affect the choice of interaction paradigm? How crowded is the environment? For example, an ATM operates in a very public physical environment; using speech to interact with the customer is therefore likely to be problematic.

More kinds of requirements

3.2 The second aspect of the environment is the social environment.
• Social aspects of interaction design, such as collaboration and coordination, need to be explored in the context of the current development.
• For example, will data need to be shared? If so, does the sharing have to be synchronous, e.g., does everyone need to be viewing the data at once, or asynchronous, e.g., two people authoring a report take turns in editing and adding to it? Other factors include the physical location of fellow team members, e.g., do collaborators have to communicate across great distances?

More kinds of requirements

3.3 The third aspect is the organizational environment: e.g., how good is user support likely to be, how easily can it be obtained, and are there facilities or resources for training? How efficient or stable is the communications infrastructure? How hierarchical is the management? And so on.

3.4 Finally, the technical environment will need to be established: for example, what technologies will the product run on or need to be compatible with, and what technological limitations might be relevant?

More kinds of requirements

4. User requirements: capture the characteristics of the intended user group.
• We mentioned the relevance of a user's abilities and skills, and these are an important aspect of user requirements.
• But in addition to these, a user may be a novice, an expert, a casual, or a frequent user.
• For example, a novice (beginner) user will require step-by-step instructions, probably with prompting, and a constrained interaction backed up with clear information.
• An expert, on the other hand, will require a flexible interaction with more wide-ranging powers of control.
• If the user is a frequent user, then it would be important to provide shortcuts such as function keys, rather than expecting them to type long commands or to have to navigate through a menu structure.
• A casual or infrequent user, rather like a novice, will require clear instructions and easily understood prompts and commands, such as a series of menus. The collection of attributes for a "typical user" is called a user profile. Any one device may have a number of different user profiles.
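The mapping from user profile to interaction style described above can be sketched as a simple lookup. The table entries paraphrase the text; the function and key names are invented for illustration:

```python
# Illustrative sketch: matching the interaction style to the user profile,
# following the novice/expert/frequent/casual distinctions above.
INTERACTION_STYLE = {
    "novice":   "step-by-step instructions with prompting",
    "expert":   "flexible interaction with wide-ranging control",
    "frequent": "shortcuts such as function keys",
    "casual":   "clear instructions and easily understood menus",
}

def style_for(user_profile):
    """Return the interaction style suited to a given user profile."""
    return INTERACTION_STYLE[user_profile]

print(style_for("frequent"))  # shortcuts such as function keys
```

A real product would typically support several profiles at once, e.g., menus for casual users alongside function-key shortcuts for frequent ones.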

More kinds of requirements

5. Usability requirements: capture the usability goals and associated measures for a particular product.
• We described a number of usability goals: effectiveness, efficiency, safety, utility, learnability, and memorability. If we want to meet these usability goals, then we must identify the appropriate requirements.
• We also described some user experience goals, such as making products that are fun, enjoyable, pleasurable, aesthetically pleasing, and motivating.
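Usability goals only become requirements once they have associated measures. A minimal sketch of two such measures, effectiveness as a task completion rate and efficiency as a mean task time, follows; the function names and the sample numbers are invented for illustration:

```python
# Illustrative sketch: turning two usability goals into measurable quantities.
# The sample figures are invented examples, not values from the text.
def effectiveness(completed_tasks, attempted_tasks):
    """Effectiveness measured as the fraction of attempted tasks completed."""
    return completed_tasks / attempted_tasks

def efficiency(task_times_seconds):
    """Efficiency measured as the mean time taken to complete a task."""
    return sum(task_times_seconds) / len(task_times_seconds)

print(effectiveness(9, 10))            # 0.9
print(efficiency([30.0, 40.0, 50.0]))  # 40.0
```

A usability requirement would then pair each measure with a threshold (e.g., at least 90% of tasks completed), so the goal can be checked during evaluation rather than merely stated.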

Data gathering

Data gathering techniques are:
• Questionnaires (yes/no, MCQs, open, …) (hard copy, electronic)
• Interviews (structured, semi-structured, unstructured) (face-to-face, telephone, …)
• Focus groups and workshops (groups of stakeholders)
• Observation (outside observation, participant)
• Studying documentation

Some of them, such as the interview, require active participation from stakeholders, while others, such as studying documentation, require no involvement at all.

Overview of data-gathering techniques used in the requirements activity

Questionnaires
• Good for: answering specific questions
• Kind of data: quantitative and qualitative
• Advantages: can reach many people with low resource
• Disadvantages: the design is crucial; response rate may be low; responses may not be what you want

Interviews
• Good for: exploring issues
• Kind of data: some quantitative but mostly qualitative
• Advantages: interviewer can guide the interviewee if necessary; encourages contact between developers and users
• Disadvantages: time consuming; artificial environment may intimidate the interviewee

Focus groups and workshops
• Good for: collecting multiple viewpoints
• Kind of data: some quantitative but mostly qualitative
• Advantages: highlights areas of consensus and conflict; encourages contact between developers and users
• Disadvantages: possibility of dominant characters

Observation
• Good for: understanding the context of user activity
• Kind of data: qualitative
• Advantages: observing actual work gives insights that other techniques can't give
• Disadvantages: very time consuming; huge amounts of data

Studying documentation
• Good for: learning about procedures, regulations and standards
• Kind of data: quantitative
• Advantages: no time commitment from users required
• Disadvantages: day-to-day working will differ from documented procedures

Choosing between techniques

• Kind of information required: at the beginning of the project you may not have any specific questions that need answering, so it's better to spend time exploring issues through interviews rather than sending out questionnaires.
• The resources available will influence your choice, too. For example, sending out questionnaires nationwide requires sufficient time, money, and people to do a good design, try it out (i.e., pilot it), issue it, collate the results, and analyze them. If you only have three weeks and no one on the team has designed a survey before, then this is unlikely to be a success.
• The location and accessibility of the stakeholders need to be considered. It may be attractive to run a workshop for a large group of stakeholders, but if they are spread across a wide geographical area, it is unlikely to be practical.

Data gathering guidelines

• Focus on identifying the stakeholders' needs.
• Involve all the stakeholder groups. It is very important to make sure that you get all the views of the right people.
• Involving only one representative from each stakeholder group is not enough, especially if the group is large. If you only involve one representative stakeholder then you will only get a narrow view.
• Use a combination of data gathering techniques. Each technique will yield a certain kind of information, from a certain perspective.

Data gathering guidelines

• Support the data-gathering sessions with suitable props, such as task descriptions and prototypes if available.
• Run a pilot session if possible, to ensure that your data-gathering session is likely to go as planned.
• How you record the data during a face-to-face data-gathering session is just as important as the technique(s) you use. Video recording, audio recording, and note taking are the main options.

DATA ANALYSIS


A simple lifecycle model for interaction design

Identify needs/establish requirements → (Re)Design → Build an interactive version → Evaluate → Final product
(Evaluation feeds back into redesign; the cycle repeats until the design is acceptable.)

Design, prototyping and construction

• Introduction
• Prototyping and construction
  — What is a prototype?
  — Low-fidelity prototyping
  — High-fidelity prototyping
  — Compromises in prototyping
• Conceptual design: moving from requirements to first design
• Physical design: getting concrete

Introduction

• Design activities begin once a set of requirements has been established.
• There are two types of design: conceptual and physical.
• Conceptual design is concerned with developing a conceptual model that captures what the product will do and how it will behave.
• Physical design is concerned with details of the design, such as screen and menu structures, icons, and graphics.
• The design emerges iteratively, through repeated design-evaluation-redesign cycles involving users (prototyping and construction).

Introduction

• There are two distinct circumstances for design:
  — one where you're starting from scratch,
  — and one where you're modifying an existing product (adding additional features).

Prototyping and construction

• It is often said that users can't tell you what they want, but when they see something and get to use it, they soon know what they don't want.
• Having collected information about work practices and views about what a system should and shouldn't do, we then need to try out our ideas by building prototypes and iterating through several versions. And the more iterations, the better.


What is a prototype?

• A prototype can be anything from a paper-based storyboard through to a complex piece of software, and from a cardboard mockup to a molded or pressed piece of metal.
• A prototype allows stakeholders to interact with an envisioned product, to gain some experience of using it in a realistic setting, and to explore imagined uses.
• So a prototype is a limited representation of a design that allows users to interact with it and to explore its suitability.

What is a prototype?

• E.g., a wooden mockup of the PalmPilot; a cardboard-box mockup of a desktop laser printer.

Why prototype?

• A useful aid when discussing ideas with stakeholders
• A communication device among team members
• An effective way to test out ideas for yourself
• Answers questions and supports designers in choosing between alternatives
• Tests out the technical feasibility of an idea
• Clarifies vague requirements
• Allows user testing and evaluation
• Checks that a certain design direction is compatible with the rest of the system development

Why prototype?

A paper-based prototype of a handheld device to support an autistic child.

Low-fidelity prototyping

• A low-fidelity prototype is one that does not look very much like the final product.
• For example, it uses materials that are very different from the intended final version, such as paper and cardboard rather than electronic screens and metal.
• Low-fidelity prototypes are useful because they tend to be:
  — simple,
  — cheap,
  — quick to produce,
  — and quick to modify, so they support the exploration of alternative designs and ideas.
• This is particularly important in early stages of development, during conceptual design for example, because prototypes that are used for exploring ideas should be flexible and encourage rather than discourage exploration and modification.
• Low-fidelity prototypes are never intended to be kept and integrated into the final product. They are for exploration only.

Examples of low-fidelity prototyping

• Storyboards
• Sketching
• Index cards
• Wizard of Oz

Low-fidelity prototyping

Storyboarding
• A storyboard consists of a series of sketches showing how a user might progress through a task using the device being developed.
• It can be a series of sketched screens for a GUI-based software system, or a series of scene sketches showing how a user can perform a task using the device.
• When used in conjunction with a scenario, the storyboard brings more detail to the written scenario and offers stakeholders a chance to role-play with the prototype, interacting with it by stepping through the scenario.

Low-fidelity prototyping

• The storyboard depicts a person using a new system for digitizing images.
• This example doesn't show detailed drawings of the screens involved, but it describes the steps a user might go through in order to use the system.

Storyboard example

Low-fidelity prototyping

Sketching
• Sketches don't have to be anything more than simple boxes, stick figures, and stars.
• Elements that you may require in a storyboard sketch, for example, include:
  — 'things': people, parts of a computer, desks, books, etc.
  — 'actions': give, find, transfer, and write.
  — icons, dialog boxes, …

Index cards

• Small cards (3 × 5 inches)
• Each card represents one screen
• Multiple screens can be shown easily on a table or the wall
• Thread or lines can indicate relationships between screens, like:
  — sequence
  — hyperlinks
• Often used in website development

Index card example (screen 1)

Index card example (screen 2)

Wizard-of-Oz prototyping

• Simulated interaction: the user thinks they are interacting with a computer, but a developer is providing the output rather than the system.

  User
  >Blurb blurb
  >Do this
  >Why?
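The setup can be sketched as a loop in which the "system" replies actually come from a hidden human wizard. This is a minimal illustrative sketch; in a real session the wizard types replies live, whereas here they are pre-scripted so the example is self-contained:

```python
# Illustrative Wizard-of-Oz sketch: the user-facing loop looks like a real
# system, but every reply is supplied by a hidden human "wizard".
def wizard_of_oz_session(user_inputs, wizard_replies):
    """Pair each user utterance with the wizard's hand-crafted reply."""
    transcript = []
    replies = iter(wizard_replies)
    for utterance in user_inputs:
        # In a live session this next() would be a human typing a response.
        reply = next(replies, "(wizard is thinking...)")
        transcript.append((utterance, reply))
    return transcript

session = wizard_of_oz_session(
    ["Blurb blurb", "Why?"],
    ["Do this", "Because the tray is empty"],
)
for user_msg, system_msg in session:
    print(f"> {user_msg}\n{system_msg}")
```

The value of the technique is that users respond to the envisioned system's behaviour before any of that behaviour has been implemented.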

High-fidelity prototyping

• Choice of materials and methods: similar or identical to the ones in the final product.
• Looks more like the final system: appearance, not functionality.
• Common development environments: Macromedia Director, Visual Basic, Smalltalk, web development tools.
• Can mislead user expectations: users may think they have a full system.

High-­‐fidelity  prototyping    


Difference: low-fidelity vs. high-fidelity prototype

Relative effectiveness of low- vs. high-fidelity prototypes (Rudd et al., 1996)

Low-fidelity prototype

•  Advantages:
   –  Lower development cost.
   –  Evaluate multiple design concepts.
   –  Useful communication device.
   –  Addresses screen layout issues.
   –  Useful for identifying market requirements.
   –  Proof-of-concept.
•  Disadvantages:
   –  Limited error checking.
   –  Poor detailed specification to code to.
   –  Facilitator-driven.
   –  Limited utility after requirements are established.
   –  Limited usefulness for usability tests.
   –  Navigational and flow limitations.

High-fidelity prototype

•  Advantages:
   –  Complete functionality.
   –  Fully interactive.
   –  User-driven.
   –  Clearly defines the navigational scheme.
   –  Use for exploration and test.
   –  Look and feel of the final product.
   –  Serves as a living specification.
   –  Marketing and sales tool.
•  Disadvantages:
   –  More expensive to develop.
   –  Time-consuming to create.
   –  Reviewers and testers tend to comment on superficial aspects rather than content.
   –  Developers are reluctant to change something they have crafted for hours.
   –  A software prototype can set expectations too high.
   –  Not effective for requirements gathering.

Compromises in prototyping

•  Prototypes involve compromises: the intention is to produce something quickly to test an aspect of the product.
•  Low-fidelity (paper-based) prototype: an obvious compromise is that the device doesn't actually work.
•  High-fidelity, software-based prototyping:
   –  the response speed may be slow,
   –  or the exact icons may be sketchy,
   –  or only a limited amount of functionality may be available.


Compromises in prototyping

•  Two common compromises that often must be traded against each other are breadth of functionality provided versus depth.
•  These two kinds of prototyping are called:
   –  Horizontal prototyping: providing a wide range of functions but with little detail.
   –  Vertical prototyping: providing a lot of detail for only a few functions.
•  The compromises made in order to produce the prototype must not be ignored, particularly the ones that are less obvious from the outside. We still must produce a good-quality system, and good engineering principles must be adhered to.

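The breadth-versus-depth trade-off can be made concrete with a small code sketch (a hypothetical Python example; the e-commerce functions and class names are invented): a horizontal prototype stubs out many functions, while a vertical prototype implements a single function fully.

```python
# Illustrative sketch of horizontal vs. vertical prototyping.
# All functions and data are invented demo material.

class HorizontalPrototype:
    """Wide but shallow: every feature is present, none is implemented."""
    def search(self, query):     return "stub: search results screen"
    def add_to_cart(self, item): return "stub: cart screen"
    def checkout(self):          return "stub: checkout screen"

class VerticalPrototype:
    """Narrow but deep: one feature (search) works end to end."""
    def __init__(self, catalog):
        self.catalog = catalog
    def search(self, query):
        # Real matching logic for the single supported function.
        return sorted(item for item in self.catalog
                      if query.lower() in item.lower())

v = VerticalPrototype(["Red shirt", "Blue shirt", "Red hat"])
```

The horizontal version lets users walk the whole flow; the vertical version lets them exercise one task in depth.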

Conceptual design: moving from requirements to first design

•  Conceptual design is concerned with transforming the user requirements and needs into a conceptual model.
•  A conceptual model is "a description of the proposed system in terms of a set of integrated ideas and concepts about what it should do, behave, and look like, that will be understandable by the users in the manner intended."
•  Ideas for a conceptual model may emerge during data gathering, but remember you must separate the real requirements from solution ideas.
•  Key guiding principles of conceptual design are:
   –  Keep an open mind but never forget the users and their context.
   –  Discuss ideas with other stakeholders as much as possible.
   –  Use low-fidelity prototyping to get rapid feedback.
   –  Iterate, iterate, and iterate. Remember: "To get a good idea, get lots of ideas" (Rettig, 1994).
•  Considering alternatives and repeatedly thinking about different perspectives helps to expand the solution space and can help prompt insights. Prototyping and scenarios are two techniques to help you explore ideas and make design decisions.


Physical design: getting concrete

•  Physical design involves considering more concrete, detailed issues of designing the interface, such as screen or keypad design, which icons to use, how to structure menus, etc.
•  During physical design it will be necessary to revisit decisions made during conceptual design.
•  Design is about making choices and decisions, and the designer must strive to balance environmental, user, data, and usability requirements with functional requirements.
•  These are often in conflict.
•  For example, a cell phone must provide a lot of functionality but is constrained by having only a small screen and a small keyboard. This means that the display of information is limited and the number of unique function keys is also limited, resulting in restricted views of information and the need to associate multiple functions with function keys.

Chapter 4

"THE PROCESS OF INTERACTION DESIGN"


A simple lifecycle model for interaction design

[Diagram: Identify needs / establish requirements → (Re)Design → Build an interactive version → Evaluate → Final product]

Evaluation – Introduction

•  Designers assume that if they and their colleagues can use the software and find it attractive, others will too!
•  Designers prefer to avoid doing evaluation because it adds development time and costs money.
•  So why is evaluation important?
•  Because without evaluation, designers cannot be sure that their software is usable and is what users want.
•  But what do we mean by evaluation?
•  There are many definitions and many different evaluation techniques, some of which involve users directly, while others call indirectly on an understanding of users' needs and psychology.
•  In this book we define evaluation as "the process of systematically collecting data that informs us about what it is like for a particular user or group of users to use a product for a particular task in a certain type of environment."

What, why, and when to evaluate

•  Users want systems that are easy to learn and to use as well as effective, efficient, safe, and satisfying.
•  Being entertaining, attractive, challenging, etc. is also essential for some products. So, knowing what to evaluate, why it is important, and when to evaluate are key skills for interaction designers.


What to evaluate

•  There is a huge variety of interactive products with a vast array of features that need to be evaluated.
•  Some features, such as the sequence of links to be followed to find an item on a website, are often best evaluated in a laboratory, since such a setting allows the evaluators to control what they want to investigate.
•  Other aspects, such as whether a collaborative toy is robust and whether children enjoy interacting with it, are better evaluated in natural settings, so that evaluators can see what children do when left to their own devices.


What to evaluate

•  There has also been a growing trend towards observing how people interact with the system in their work, home, and other settings, the goal being to obtain a better understanding of how the product is (or will be) used in its intended setting.
•  For example, at work people are frequently being interrupted by phone calls, others knocking at their door, email arriving, and so on, to the extent that many tasks are interrupt-driven.
•  Only rarely does someone carry a task out from beginning to end without stopping to do something else. Hence the way people carry out an activity (e.g., preparing a report) in the real world is very different from how it may be observed in a laboratory. Furthermore, this observation has implications for the way products should be designed.


Why you need to evaluate

•  Just as designers shouldn't assume that everyone is like them, they also shouldn't presume that following design guidelines guarantees good usability.
•  Evaluation is needed to check that users can use the product and like it.
•  Bruce Tognazzini, a successful usability consultant, comments that:
   –  "Iterative design, with its repeating cycle of design and testing, is the only validated methodology in existence that will consistently produce successful results. If you don't have user-testing as an integral part of your design process you are going to throw buckets of money down the drain."
•  Problems are fixed before the product is shipped, not after.
•  The team can concentrate on real problems, not imaginary ones.
•  The range of features to be evaluated is very broad.

When to evaluate

•  The product being developed may be a brand-new product or an upgrade of an existing product.
•  If the product is new, then considerable time is usually invested in market research. Designers often support this process by developing mockups of the potential product that are used to elicit reactions from potential users. As well as helping to assess market need, this activity contributes to understanding users' needs and early requirements.


When to evaluate

•  In the case of an upgrade, there is limited scope for change and attention is focused on improving the overall product.
•  Evaluations compare user performance and attitudes with those for previous versions.
•  In contrast, new products do not have previous versions and there may be nothing comparable on the market, so more radical changes are possible if evaluation results indicate a problem.


When to evaluate

•  Formative evaluations: evaluations done during design to check that the product continues to meet users' needs.
•  Summative evaluations: evaluations done to assess the success of a finished product.

An evaluation framework – Introduction

•  Evaluation is driven by questions about how well the design or particular aspects of it satisfy users' needs.
•  Some of these questions provide high-level goals to guide the evaluation. Others are much more specific.
•  For example:
   –  Can users find a particular menu item?
   –  Is a graphic useful and attractive?
   –  Is the product engaging?
•  Practical constraints also play a big role in shaping evaluation plans:
   –  tight schedules,
   –  low budgets.

Evaluation paradigms and techniques

•  Paradigm: e.g. usability testing.
•  Technique: e.g. the techniques associated with usability testing are user testing in a controlled environment; observation of user activity in the controlled environment and the field; and questionnaires and interviews.


Evaluation paradigms

•  In this book we identify four core evaluation paradigms:
   1.  "quick and dirty" evaluations;
   2.  usability testing;
   3.  field studies;
   4.  predictive evaluation.

"Quick  and  dirty"  evaluation     ì  A  "quick  and  dirty"  evaluaFon  is  a  common  pracFce  in  which  designers  

informally  get  feedback  from  users  or  consultants  to  confirm  that  their  ideas   are  in  line  with  users'  needs  and  are  liked.    

ì  "Quick  and  dirty"  evaluaFons  can  be  done  at  any  stage  and  the  emphasis  is  on  

fast  input  rather  than  carefully  documented  findings.  

ì   For  example,  early  in  design  developers  may  meet  informally  with  users  to  get  

feedback  on  ideas  for  a  new  product.    

ì  At  later  stages  similar  meeFngs  may  occur  to  try  out  an  idea  for  an  icon,  check  

whether  a  graphic  is  liked,  or  confirm  that  informaFon  has  been  appropriately   categorized  on  a  webpage.    

ì  This  approach  is  oBen  called  "quick  and  dirty"  because  it  is  meant  to  be  done  in  

a  short  space  of  7me.  GeUng  this  kind  of  feedback  is  an  essenFal  ingredient  of   successful  design.    


"Quick  and  dirty"  evaluation     ì  Any  involvement  with  users  will  be  highly  informa7ve  

and  you  can  learn  a  lot  early  in  design  by  observing   what  people  do  and  talking  to  them  informally.    

ì  The  data  collected  is  usually  descrip7ve  and  informal  

and  it  is  fed  back  into  the  design  process  as  verbal  or   wriEen  notes,  and  sketches,  etc.    

ì  Another  source  comes  from  consultants,  who  use  their  

knowledge  of  user  behavior,  the  market  place  and   technical  know-­‐how,  to  review  soBware  quickly  and   provide  suggesFons  for  improvement.  



Usability testing

•  Usability testing involves measuring typical users' performance on carefully prepared tasks that are typical of those for which the system was designed.
•  Users' performance is generally measured in terms of number of errors and time to complete the task.
•  As the users perform these tasks, they are watched and recorded on video and by logging their interactions with software. This observational data is used to calculate performance times, identify errors, and help explain why the users did what they did.
•  User satisfaction questionnaires and interviews are also used to elicit users' opinions.
•  The defining characteristic of usability testing is that it is strongly controlled by the evaluator (Mayhew, 1999).
•  There is no mistaking that the evaluator is in charge! Typically tests take place in laboratory-like conditions that are controlled. Casual visitors are not allowed, telephone calls are stopped, and there is no possibility of talking to colleagues, checking email, or doing any of the other tasks that most of us rapidly switch among in our normal lives.
•  Everything that the participant does is recorded: every keypress, comment, pause, expression, etc., so that it can be used as data.
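The interaction logging described above can be sketched in a few lines (an illustrative Python sketch, not a real logging tool; names are invented). Every event is timestamped so that performance times and error counts can be derived afterwards.

```python
# Tiny interaction logger of the kind used in usability tests.
import time

class InteractionLog:
    def __init__(self):
        self.events = []  # (timestamp, kind, detail) tuples

    def record(self, kind, detail):
        """Timestamp and store one observed event (keypress, error, ...)."""
        self.events.append((time.monotonic(), kind, detail))

    def count(self, kind):
        """How many events of a given kind were logged."""
        return sum(1 for _, k, _ in self.events if k == kind)

log = InteractionLog()
log.record("keypress", "q")
log.record("error", "clicked wrong menu item")
log.record("keypress", "Enter")
```

From such a log the evaluator can compute completion times (differences between timestamps) and tally errors per task.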


Field studies

•  The distinguishing feature of field studies is that they are done in natural settings, with the aim of increasing understanding about what users do naturally and how technology impacts them.
•  Qualitative techniques are used, such as interviews, observation, and participant observation.
•  The data takes the form of events and conversations that are recorded as notes, or by audio or video recording, and later analyzed using a variety of analysis techniques.



Predictive evaluation

•  In predictive evaluations, experts apply their knowledge of typical users, often guided by heuristics, to predict usability problems. Another approach involves theoretically-based models.
•  The key feature of predictive evaluation is that users need not be present, which makes the process quick, relatively inexpensive, and thus attractive to companies; but it has limitations.


Characteristics of different evaluation paradigms

Role of users:
•  "Quick and dirty": natural behavior.
•  Usability testing: to carry out set tasks.
•  Field studies: natural behavior.
•  Predictive: users generally not involved.

Who controls:
•  "Quick and dirty": evaluators take minimum control.
•  Usability testing: evaluators strongly in control.
•  Field studies: evaluators try to develop relationships with users.
•  Predictive: expert evaluators.

Location:
•  "Quick and dirty": natural environment or laboratory.
•  Usability testing: laboratory.
•  Field studies: natural environment.
•  Predictive: laboratory-oriented, but often happens on the customer's premises.

When used:
•  "Quick and dirty": any time you want to get feedback about a design quickly.
•  Usability testing: with a prototype or product.
•  Field studies: most often used early in design to check that users' needs are being met or to assess problems or design opportunities.
•  Predictive: expert reviews (often done by consultants) with a prototype, but can occur at any time. Models are used to assess specific aspects of a potential design.

Characteristics of different evaluation paradigms

Type of data:
•  "Quick and dirty": usually qualitative, informal descriptions.
•  Usability testing: quantitative, sometimes statistically validated; users' opinions collected by questionnaire or interview.
•  Field studies: qualitative descriptions, often accompanied by sketches and scenarios.
•  Predictive: a list of problems from expert reviews; quantitative figures from models.

Fed back into design by:
•  "Quick and dirty": sketches, quotes, descriptive report.
•  Usability testing: report of performance measures, errors, etc. Findings provide a benchmark for future versions.
•  Field studies: descriptions that include quotes and sketches.
•  Predictive: reviewers provide a list of problems, often with suggested solutions. Times calculated from models are given to designers.

Characteristics of different evaluation paradigms

Philosophy:
•  Usability testing: applied approach based on experimentation.
•  Field studies: may be objective observation.
•  Predictive: practical heuristics and practitioner expertise underpin expert reviews; theory underpins models.


Evaluation techniques

•  observing users
•  asking users their opinions
•  asking experts their opinions
•  testing users' performance
•  modeling users' task performance to predict the efficacy of a user interface

Observing users

•  Observation techniques help to identify needs leading to new types of products and help to evaluate prototypes.
•  Notes, audio, video, and interaction logs are well-known ways of recording observations, and each has benefits and drawbacks.
•  Obvious challenges for evaluators are how to observe without disturbing the people being observed, and how to analyze the data, particularly when large quantities of video data are collected or when several different types must be integrated to tell the story (e.g., notes, pictures, sketches from observers).

Asking users

•  Asking users what they think of a product is an obvious way of getting feedback:
   –  whether it does what they want;
   –  whether they like it;
   –  whether the aesthetic design appeals;
   –  whether they had problems using it;
   –  whether they want to use it again.
•  Interviews and questionnaires are the main techniques for doing this. The questions asked can be unstructured or tightly structured. They can be asked of a few people or of hundreds. Interview and questionnaire techniques are also being developed for use with email and the web.


Asking experts

•  Software inspections and reviews are long-established techniques for evaluating software code and structure.
•  Guided by heuristics, experts step through tasks role-playing typical users and identify problems.
•  Developers like this approach because it is usually relatively inexpensive and quick to perform compared with laboratory and field evaluations that involve users.
•  In addition, experts frequently suggest solutions to problems.


User testing

•  Measuring user performance to compare two or more designs has been the bedrock of usability testing.
•  These tests are usually conducted in controlled settings and involve typical users performing typical, well-defined tasks.
•  Data is collected so that performance can be analyzed.
•  Generally the time taken to complete a task, the number of errors made, and the navigation path through the product are recorded.
•  Descriptive statistical measures such as means and standard deviations are commonly used to report the results.

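The descriptive statistics mentioned above are straightforward to compute, for example with Python's standard library (the completion times and error counts below are invented demo data):

```python
# Reporting user-test results with descriptive statistics:
# mean and standard deviation of task completion times and errors.
import statistics

completion_times = [42.0, 55.5, 48.0, 61.2, 50.3]  # seconds, one per participant
errors = [1, 3, 0, 2, 1]                            # errors per participant

mean_time = statistics.mean(completion_times)
sd_time = statistics.stdev(completion_times)        # sample standard deviation
mean_errors = statistics.mean(errors)

print(f"time: {mean_time:.1f}s (sd {sd_time:.1f}s), errors: {mean_errors:.1f}")
```

Reporting the standard deviation alongside the mean shows how much participants varied, which matters when comparing two designs.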

Modeling users' task performance

•  There have been various attempts to model human-computer interaction so as to predict the efficiency and problems associated with different designs at an early stage, without building elaborate prototypes.
•  These techniques are successful for systems with limited functionality.

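One well-known example of such a model is the Keystroke-Level Model (Card, Moran and Newell), which predicts expert execution time by summing standard operator times. The sketch below uses the commonly cited approximate operator times; the task breakdown is invented for illustration.

```python
# Keystroke-Level Model sketch: predicted expert time for a task is the
# sum of the times of its primitive operators. Operator times are the
# commonly cited approximations, in seconds.
KLM = {
    "K": 0.20,  # press a key or button
    "P": 1.10,  # point at a target with the mouse
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def predict_seconds(operators):
    """Predicted expert execution time for a sequence of KLM operators."""
    return sum(KLM[op] for op in operators)

# Example task: think, point at a field, then type three keys: M P K K K
t = predict_seconds(["M", "P", "K", "K", "K"])
```

Two candidate designs can then be compared simply by encoding each as an operator sequence and comparing the predicted times, with no prototype needed.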

The relationship between evaluation paradigms and techniques

Observing users:
•  "Quick and dirty": important for seeing how users behave in their natural environments.
•  Usability testing: video and interaction logging, which can be analyzed to identify errors, investigate routes through the software, or calculate performance time.
•  Field studies: observation is the central part of any field study.
•  Predictive: N/A.

Asking users:
•  "Quick and dirty": discussions with users and potential users, individually, in groups, or in focus groups.
•  Usability testing: user satisfaction questionnaires are administered to collect users' opinions; interviews may also be used to get more details.
•  Field studies: the evaluator may interview or discuss what she sees with participants.
•  Predictive: N/A.

The relationship between evaluation paradigms and techniques

Asking experts:
•  "Quick and dirty": to provide critiques (called "crit reports") of the usability of a prototype.
•  Usability testing: N/A.
•  Field studies: N/A.
•  Predictive: experts use heuristics early in design to predict the efficacy of an interface.

User testing:
•  "Quick and dirty": N/A.
•  Usability testing: testing typical users on typical tasks in a controlled laboratory-like setting is the cornerstone of usability testing.
•  Field studies: N/A.
•  Predictive: N/A.

Modeling users' task performance:
•  "Quick and dirty": N/A.
•  Usability testing: N/A.
•  Field studies: N/A.
•  Predictive: models are used to predict the efficacy of an interface or compare performance times between versions.

A framework to guide evaluation

•  Well-planned evaluations are driven by clear goals and appropriate questions (Basili et al., 1994).
•  To guide our evaluations we use the DECIDE framework, which provides the following checklist to help novice evaluators:
   1.  Determine the overall goals that the evaluation addresses.
   2.  Explore the specific questions to be answered.
   3.  Choose the evaluation paradigm and techniques to answer the questions.
   4.  Identify the practical issues that must be addressed, such as selecting participants.
   5.  Decide how to deal with the ethical issues.
   6.  Evaluate, interpret, and present the data.


Key points

•  An evaluation paradigm is an approach in which the methods used are influenced by particular theories and philosophies. Four evaluation paradigms were identified:
   1.  "quick and dirty"
   2.  usability testing
   3.  field studies
   4.  predictive evaluation
•  Methods are combinations of techniques used to answer a question, but in this book we often use the terms "methods" and "techniques" interchangeably. Five categories were identified:
   1.  observing users
   2.  asking users
   3.  asking experts
   4.  user testing
   5.  modeling users' task performance
•  The DECIDE framework has six parts:
   1.  Determine the overall goals of the evaluation.
   2.  Explore the questions that need to be answered to satisfy the goals.
   3.  Choose the evaluation paradigm and techniques to answer the questions.
   4.  Identify the practical issues that need to be considered.
   5.  Decide on the ethical issues and how to ensure high ethical standards.
   6.  Evaluate, interpret, and present the data.
•  Drawing up a schedule for your evaluation study and doing one or several pilot studies will help to ensure that the study is well designed and likely to be successful.

Chapter 10

Universal Design

Introduction

•  The discussion that we had on human psychology in Chapter 1 talked about general human abilities and, in reality, people are much more varied than the discussion suggests.
•  People have different abilities and weaknesses; they come from different backgrounds and cultures; they have different interests, viewpoints and experiences; they are different ages and sizes.
•  All of these things have an impact on the way in which an individual will use a particular computing application and, indeed, on whether or not they can use it at all.
•  Given such diversity, we cannot assume a 'typical' user or design only for people like ourselves.

Introduction

•  Universal design is the process of designing products so that they can be used by as many people as possible in as many situations as possible.
•  In our case, this means particularly designing interactive systems that are usable by anyone, with any range of abilities, using any technology platform.
•  This can be achieved by designing systems either to have built-in redundancy or to be compatible with assistive technologies. An example of the former might be an interface that has both visual and audio access to commands; an example of the latter, a website that provides text alternatives for graphics, so that it can be read using a screen reader.

Universal design principles

•  We have defined universal design as 'the process of designing products so that they can be used by as many people as possible in as many situations as possible'.
•  But what does that mean in practice?
•  Is it possible to design anything so that anyone can use it? And if we could, how practical would it be?
•  Wouldn't the cost be prohibitive?
•  In reality, we may not be able to design everything to be accessible to everyone, and we certainly cannot ensure that everyone has the same experience of using a product, but we can work toward the aim of universal design and try to provide an equivalent experience.

Universal design example

•  Although it may seem like a huge task, universal design does not have to be complex or costly.
•  In fact, if you are observant, you will see many examples of design that attempt to take account of user diversity.
•  Next time you cross the road, look at the pavement.
•  The curb may be lowered, to enable people who use wheelchairs to cross more easily.
•  The paving near the curb may be of a different texture, with raised bumps or ridges, to enable people who cannot see to find the crossing point.
•  The parent with a child in a buggy, or the traveller with wheeled luggage, can cross the road more easily.

[Image slides: universal design examples, including one example of bad design]

Universal design example

•  Notice how many modern buildings have automatic doors that open on approach.
•  Or lifts that offer both visual and auditory notification of the floor reached.
•  And, whilst these designs make the crossing, the building and the lift more accessible to people who have disabilities, notice too how they also help other users:
   –  the parent with a child in a buggy, or the traveller with wheeled luggage, can cross the road more easily;
   –  the shopper with heavy bags, or the small child, can enter the building;
   –  and people are less likely to miss their floor because they weren't paying attention.

•  Universal design is primarily about trying to ensure that you do not exclude anyone through the design choices you make but, by giving thought to these issues, you will invariably make your design better for everyone.


Universal design principles (NCSU)

•  Seven principles give us a framework in which to develop universal designs:
   1.  Equitable use
   2.  Flexibility in use
   3.  Simple and intuitive to use
   4.  Perceptible information
   5.  Tolerance for error
   6.  Low physical effort
   7.  Size and space for approach and use

universal design principles - NCSU
1.  Equitable use:
•  The design is useful to people with a range of abilities and appealing to all.
•  No user is excluded or stigmatized.
•  Wherever possible, access should be the same for all; where identical use is not possible, equivalent use should be supported.
•  Where appropriate, security, privacy and safety provision should be available to all.

universal design principles - NCSU
2.  Flexibility in use:
•  The design allows for a range of ability and preference, through choice of methods of use and adaptivity to the user’s pace, precision and custom.
3.  Simple and intuitive to use:
•  Regardless of the knowledge, experience, language or level of concentration of the user.
•  The design needs to support the user’s expectations and accommodate different language and literacy skills.
•  It should not be unnecessarily complex and should be organized to facilitate access to the most important areas.
•  It should provide prompting and feedback as far as possible.

universal design principles - NCSU
4.  Perceptible information:
•  The design should provide effective communication of information regardless of the environmental conditions or the user’s abilities.
•  Redundancy of presentation is important: information should be represented in different forms or modes (e.g. graphic, verbal, text, touch).
•  Essential information should be emphasized and differentiated clearly from the peripheral content.
•  Presentation should support the range of devices and techniques used to access information by people with different sensory abilities.

universal design principles - NCSU
5.  Tolerance for error:
•  Minimizing the impact and damage caused by mistakes or unintended behavior.
•  Potentially dangerous situations should be removed or made hard to reach.
•  Potential hazards should be shielded by warnings.
6.  Low physical effort:
•  Systems should be designed to be comfortable to use, minimizing physical effort and fatigue.
•  The physical design of the system should allow the user to maintain a natural posture with reasonable operating effort.
•  Repetitive or sustained actions should be avoided.

universal design principles - NCSU
7.  Size and space for approach and use:
•  The placement of the system should be such that it can be reached and used by any user regardless of body size, posture or mobility.
•  Important elements should be on the line of sight for both seated and standing users.
•  All physical components should be comfortably reachable by seated or standing users.
•  Systems should allow for variation in hand size and provide enough room for assistive devices to be used.
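Principle 5 (tolerance for error) can be made concrete in software with an undo mechanism: a destructive action is made recoverable instead of final. The following minimal Python sketch is illustrative and not from the slides; the DocumentStore class is a hypothetical example.

```python
# Tolerance for error in miniature: deletion keeps an undo history
# instead of destroying data, so a mistake has minimal impact.

class DocumentStore:
    def __init__(self, docs):
        self.docs = dict(docs)
        self._undo = []

    def delete(self, name):
        # the hazardous action is reversible: stash the doc, don't destroy it
        self._undo.append((name, self.docs.pop(name)))

    def undo(self):
        name, doc = self._undo.pop()
        self.docs[name] = doc

store = DocumentStore({"report.txt": "draft"})
store.delete("report.txt")   # a mistake...
store.undo()                 # ...recovered with no damage done
```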

Universal design principles - NCSU
•  These seven principles give us a good starting point in considering universal design.
•  They are not all equally applicable to all situations, of course.
•  For example, principles six and seven would be vital in designing an information booth but less important in designing word-processing software.
•  But they provide a useful checklist of considerations for designers, together with guidelines on how each principle can be achieved.

Multi-modal vs. Multi-media
•  Multi-modal systems
–  use more than one sense (or mode) of interaction, e.g. visual and aural senses: a text processor may speak the words as well as echoing them to the screen

•  Multi-media systems
–  use a number of different media to communicate information, e.g. a computer-based teaching system may use video, animation, text and still images: different media all using the visual mode of interaction; it may also use sounds, both speech and non-speech: two more media, now using a different mode
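The multi-modal idea above can be sketched in code: the same content is dispatched to a renderer per sense. This is an illustrative Python sketch, not from the textbook; both renderer classes are hypothetical stand-ins (a real system would call a GUI toolkit and a text-to-speech engine).

```python
# Sketch of a multi-modal output layer: one message, two modes
# (visual and aural). The renderers just record what they would present.

class ScreenRenderer:
    def __init__(self):
        self.lines = []
    def render(self, text):
        self.lines.append(text)      # visual mode: echo to the screen

class SpeechRenderer:
    def __init__(self):
        self.spoken = []
    def render(self, text):
        self.spoken.append(text)     # aural mode: would call a TTS engine

class MultiModalOutput:
    def __init__(self, *renderers):
        self.renderers = renderers
    def say(self, text):
        for r in self.renderers:     # every mode receives the same content
            r.render(text)

screen, speech = ScreenRenderer(), SpeechRenderer()
out = MultiModalOutput(screen, speech)
out.say("File saved")
```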

Users with disabilities •  visual impairment –  screen readers

•  hearing impairment –  text communication, gesture

•  physical impairment –  speech I/O, eyegaze, gesture, predictive systems (e.g. Reactive keyboard)

•  speech impairment –  speech synthesis, text communication

•  dyslexia –  speech input, output

•  autism –  communication, education

… plus … •  age groups –  older people e.g. disability aids, memory aids, communication tools to prevent social isolation –  children e.g. appropriate input/output devices, involvement in design process

•  cultural differences –  influence of nationality, generation, gender, race, class, religion, political persuasion etc. on interpretation of interface features –  e.g. interpretation and acceptability of language, cultural symbols, gesture and colour

Chapter 6 - Introduction



“HCI Beyond the GUI: Design for Haptic, Speech, Olfactory, and Other Nontraditional Interfaces”, edited by Philip Kortum

Dr. Salsabeel Alabbady

Nontraditional interfaces
1.  Haptic user interfaces
2.  Gesture interfaces
3.  Locomotion interfaces
4.  Auditory interfaces
5.  Speech user interfaces
6.  Interactive voice response interfaces
7.  Olfactory interfaces
8.  Taste interfaces
9.  Small-screen interfaces
10.  Multimode interfaces: two or more interfaces to accomplish the same task
11.  Multimode interfaces: combining interfaces to accomplish a single task

Haptic user interfaces
•  Haptic interfaces use the sensation of touch to provide information to the user.
•  Rather than visually inspecting a virtual three-dimensional object on a computer monitor, a haptic display allows a user to physically “touch” that object.
•  The interface can also provide information to the user in other ways, such as vibrations.
•  Of course, the gaming industry has led the way in introducing many of these nontraditional interfaces to the general public.
•  Various interface technologies have heightened the realism of game play and made games easier and more compelling.

Haptic user interfaces
•  One of the early interfaces to take advantage of haptics can be found in Atari’s Steel Talons sit-down arcade game.
•  The game was fun to play because the controls were reasonably realistic and the action was nonstop.
•  However, unlike other contemporary first-person shooter games, Atari integrated a haptic feedback mechanism that was activated when the user’s helicopter was “hit” by enemy fire.
•  Other contemporary games used sounds and changes in the graphical interface (flashing, bullet holes) to indicate that the user was taking enemy fire.
•  Atari integrated what can best be described as a “knocker” in the seat of the game.
•  This haptic interface was both effective and compelling.

Haptic user interfaces Atari’s Steel Talons helicopter simulation, circa 1991. While the graphics were unremarkable (shaded polygons), the game employed a haptic interface in the player’s seat (as indicated by the arrow) that thumped the player (hard!) when the helicopter was being hit by ground fire. The added interface dimension caused the player to react in more realistic ways to the “threat” and made the information more salient.


Gesture interfaces
•  Gesture interfaces use hand and face movements as input controls for a computer.
•  Although related to haptic interfaces, gesture interfaces differ in the noted absence of machine-mediated proprioceptive or tactile feedback.

Gesture interfaces
•  In 2001, Konami released a game called MoCap Boxing.
•  Unlike earlier versions of boxing games that were controlled with joysticks and buttons, Konami’s game required the player to actually box.
•  The player donned gloves and stood in a specified area that was monitored with infrared motion detectors.

Gesture interfaces
•  This technology, too, has found its way into the home with the recent introduction of Nintendo’s Wii system.
•  Unlike other current home gaming systems, the Wii makes extensive use of the gesture interface in a variety of games, from bowling to tennis,
•  allowing the user to interact in a more natural style than previously, when interaction was controlled via buttons interfaced to the GUI.

Gesture interfaces
•  Gesture interfaces range from systems that employ hand motion for language input to those that use gestures to navigate (e.g., “I want to go that way”) and issue commands (e.g., “Pick that up”) in a virtual-reality environment.

Locomotion interfaces
•  Locomotion interfaces deal with large-scale movement or navigation through an interface.
•  Interfaces in research labs not only include treadmill-type interfaces, but have moved in other interesting directions. These kinds of interfaces are frequently associated with high-end simulators.

Auditory interfaces
•  Auditory interfaces have also been used extensively to augment complex interfaces and to spread the cognitive load in highly visual interfaces.
•  Recently, auditory interfaces have been employed as a substitute for more complex visual interfaces, and the term “sonification” has been coined to describe these kinds of auditory interfaces.
•  In a sonified interface, representations that are typically visual, such as graphs and icons, are turned into sound, that is, sonified, so that they can be interpreted in the auditory rather than the visual domain.
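A minimal sketch of sonification, assuming a simple linear pitch mapping (illustrative only, not from the textbook; the 220–880 Hz range is an arbitrary choice). An audio library would then play each tone in sequence to "display" the graph.

```python
# Sonify a data series: map each value onto an audio frequency so a
# graph can be heard rather than seen. Low values -> low pitch.

def sonify(values, f_low=220.0, f_high=880.0):
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0   # avoid division by zero for flat data
    return [f_low + (v - lo) / span * (f_high - f_low) for v in values]

freqs = sonify([0, 5, 10])
# lowest datum maps to 220 Hz, the midpoint to 550 Hz, the highest to 880 Hz
```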


Speech user interfaces
•  Recent advances in computing power have brought the possibility of robust speech interfaces into reality.
•  Early implementations of single-word speech command interfaces have led to continuous-speech dictation systems and state-of-the-art systems that employ powerful semantic analysis to interpret a user’s intent with unstructured speech.

Interactive Voice Response Interfaces
•  Interactive voice response systems (IVRs) are in widespread use in the commercial world today, yet receive scant attention in traditional human factors.
•  The interface has been embraced by the business community because of its huge potential for cost savings and because, when implemented correctly, it can result in high customer satisfaction ratings from the user as well.
•  Because of the ubiquity of the interface, however, poorly designed IVR interfaces abound, and users are left to suffer.

Olfactory Interfaces
•  As with many of the interfaces in this book, the advent of computers and simulated environments has pushed secondary interfaces, like olfaction, to the fore.
•  Cinemas
•  Shopping (perfume fragrance samples)
•  Entertainment (the smell of burning rubber as you drive a video game race car)

Taste Interfaces
•  Without a doubt, interfaces that rely on taste are among the least explored of the nontraditional interfaces.
•  Taste interfaces are usually discussed in terms of simulation, in which a particular taste is accurately represented to simulate a real taste (e.g., for a food simulator).

Small Screen interfaces
•  While devices such as mobile telephones and MP3 players have continued to shrink, the problems with controlling and using these miniature devices have grown.
•  From the physical ergonomics associated with using the systems to the navigation of tiny menus, the new systems have proven to be substantially different, and more difficult, to use than their bigger brethren.
•  The chapter on small-screen interfaces will discuss how these miniature GUIs are designed and tested, and how special interface methods (predictive typing, rapid serial presentation of text, etc.) can be employed to make these interfaces significantly more usable.
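Predictive typing, one of the special interface methods mentioned above, can be sketched as a T9-style lookup: a digit sequence is matched against a word list instead of multi-tap entry. The keypad letter assignment below is the standard phone layout; the tiny vocabulary is a made-up example.

```python
# T9-style predictive typing: "43556" can produce "hello" directly.

KEYPAD = {'a': '2', 'b': '2', 'c': '2', 'd': '3', 'e': '3', 'f': '3',
          'g': '4', 'h': '4', 'i': '4', 'j': '5', 'k': '5', 'l': '5',
          'm': '6', 'n': '6', 'o': '6', 'p': '7', 'q': '7', 'r': '7',
          's': '7', 't': '8', 'u': '8', 'v': '8', 'w': '9', 'x': '9',
          'y': '9', 'z': '9'}

def word_to_digits(word):
    # the digit sequence a user would press to type this word
    return ''.join(KEYPAD[c] for c in word.lower())

def predict(digits, vocabulary):
    # every vocabulary word whose key sequence matches the input
    return [w for w in vocabulary if word_to_digits(w) == digits]

words = ["hello", "gekko", "cab", "ace"]
predict("43556", words)   # ambiguous: both "hello" and "gekko" fit 4-3-5-5-6
```

The ambiguity in the last call is exactly why real predictive systems rank candidates by word frequency.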


Multimode Interfaces: Two or More Interfaces to Accomplish the Same Task
•  In many instances, a task can be accomplished using one or more interfaces that can be used in a mutually exclusive fashion.
•  Providing multiple interfaces for single systems means that the seemingly independent interfaces must be designed and tested together as a system, to ensure that users who move back and forth between the two interfaces can do so seamlessly.
•  The chapter will explore the more common mutually exclusive multimode interfaces (MEMM), including IVR/GUI, speech/GUI, small screen/GUI, small screen/IVR, and small screen/speech, and discuss the human factors associated with the design and implementation of these multimode interfaces.

Multimode Interfaces: Combining Interfaces to Accomplish a Single Task
•  Interfaces that combine multiple interfaces into a single “system” interface.
•  Auditory and visual interfaces are frequently combined to create effective interfaces.
•  Virtual-reality systems are a prime example of this kind of interface, where vision, audition, speech, haptic, and gesture interfaces are combined in a single integrated experience.




1. Haptic Interfaces
Marcia K. O’Malley, Abhishek Gupta

http://www.youtube.com/watch?v=6wJ9Aakddng
http://www.youtube.com/watch?v=RD5EGQ1nnKk

Introduction  In general the word “haptic” refers to the sense of touch.  This sense is essentially twofold, including both:  Cutaneous touch: refers to the sensation of surface features and tactile perception and is usually conveyed through the skin.  kinesthetic touch sensations, which arise within the

muscles and tendons, allow us to interpret where our limbs are in space and in relation to ourselves.  Haptic sensation combines both tactile and kinesthetic

sensations. Dr. Salsabeel Alabbady

Introduction
•  The sense of touch is one of the most informative senses that humans possess.
•  Mechanical interaction with a given environment is vital when a sense of presence is desired, or when a user wishes to manipulate objects within a remote or virtual environment with manual dexterity.
•  The haptic display, or force-reflecting interface, is the robotic device that allows the user to interact with a virtual environment or teleoperated remote system.
•  The haptic interface consists of a real-time display of a virtual or remote environment and a manipulator, which serves as the interface between the human operator and the simulation.

Introduction
•  The user moves within the virtual or remote environment by moving the robotic device.
•  Haptic feedback, which is essentially force or touch feedback in a man–machine interface, allows computer simulations of various tasks to relay realistic, tangible sensations to a user.
•  Haptic feedback allows objects typically simulated visually to take on actual physical properties, such as mass, hardness, and texture.
•  With the incorporation of haptic feedback into virtual or remote environments, users have the ability to push, pull, feel, and manipulate objects in virtual space, rather than just see a representation on a video screen.

Areas that benefit from haptic interfaces
•  Computer-aided design and manufacturing (CAD/CAM): the use of computer systems to assist in the creation, modification, analysis, or optimization of a design.

Areas that benefit from haptic interfaces
•  Design prototyping: allowing users to manipulate virtual objects before manufacturing them enhances production evaluation.
•  Dangerous work environments
•  Training in surgical procedures
http://www.youtube.com/watch?v=QmtHecrOVXo
•  Training in the military
http://www.youtube.com/watch?v=V34gCw4fyLs

Haptic robot hand and glove

http://www.youtube.com/watch?v=oZXJYCuNtdo


Areas that benefit from haptic interfaces
•  The Tactile Vision Substitution System (TVSS) was perhaps one of the most dramatic early examples of this interest (Bach-y-Rita, 1972).
•  The original TVSS employed a camera connected to a computer and an array of vibrating stimulators on the skin of the back. The basic idea was to allow people to “see” with their skin.
•  A derivative of the TVSS was the Optacon, a portable tactile display to permit blind people to read printed material.
•  The main unit of the Optacon contained a template or “array” with 144 tiny pins.
•  The pins of the array vibrated to create a tactile image of letters as a camera lens was moved over them.
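The Optacon's camera-to-pins idea can be sketched as a simple thresholding step. This is an illustrative Python sketch, not from the textbook (the real device used a 144-pin vibrating array; the 3×4 frame here is a toy example).

```python
# Threshold a small grayscale camera frame into an on/off pin pattern
# for a tactile array: 1 = pin vibrates (dark ink under the lens),
# 0 = pin at rest (white paper).

def to_pin_pattern(image, threshold=128):
    return [[1 if px < threshold else 0 for px in row] for row in image]

camera_frame = [
    [  0, 255, 255,   0],
    [  0,   0,   0,   0],   # a dark stroke across the middle
    [  0, 255, 255,   0],
]
pins = to_pin_pattern(camera_frame)
```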

Areas that benefit from haptic interfaces - Optacon
•  The Optacon (Optical to Tactile Converter) is an electromechanical device that enables blind people to read printed material that has not been transcribed into Braille.
•  Braille is a tactile writing system used by blind and visually impaired people. It is traditionally written with embossed paper.

Areas that benefit from haptic interfaces
http://www.youtube.com/watch?v=_b0J1sI-DOo

Areas that benefit from haptic interfaces - Optacon
•  Recent advances in haptic interfaces have led to renewed research efforts to build haptic interfaces for the blind or visually impaired.
•  There are three types of haptic interfaces for accessibility:
–  devices like the Optacon that tactually display material to be read,
–  haptic navigational aids for navigation without sight,
–  and haptic interfaces for web or computer access.

Areas that benefit from haptic interfaces
•  B-Touch Mobile Phone
•  Braille E-Book
•  Navigation Bracelet
•  Braille Polaroid Camera
http://weburbanist.com/2010/04/05/12-ingenious-gadgets-technologies-for-the-blind/

When to select a haptic interface?
•  Haptic interfaces have a number of beneficial characteristics, such as:
–  enabling perception of limb movement and position,
–  improving skilled performance of tasks (typically in terms of increased precision and speed of execution of the task),
–  enabling virtual training in a safe and repeatable environment,
–  improving feelings of realism in the task,
–  supporting hand–eye coordination tasks.

Data needed to build the interface
•  Upon determining that the inclusion of haptic feedback is beneficial to a virtual or remote environment display, a number of decisions must be made in order to build a haptic interface system.
•  The designer must determine if tactile or kinesthetic feedback is preferred.
•  These decisions are dependent on the type of feedback that the designer wishes to provide.
•  For example, if the desire is to provide a simple alert to the user or to display textures or surface roughness, then a tactile device is most appropriate.
•  In contrast, if 2D or 3D shape perception, discrimination, or presence in the virtual or remote environment is the goal, then kinesthetic devices are preferred.
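The selection rule above can be written out as a small decision function. This is only a restatement of the guideline in code; the goal category names are made up for illustration.

```python
# Tactile vs. kinesthetic: simple alerts and surface properties call for
# tactile feedback; shape perception and presence call for kinesthetic.

TACTILE_GOALS = {"alert", "texture", "surface_roughness"}
KINESTHETIC_GOALS = {"shape_2d", "shape_3d", "discrimination", "presence"}

def feedback_type(goal):
    if goal in TACTILE_GOALS:
        return "tactile"
    if goal in KINESTHETIC_GOALS:
        return "kinesthetic"
    raise ValueError(f"no guideline for goal: {goal}")

feedback_type("texture")    # tactile device is most appropriate
feedback_type("presence")   # kinesthetic device is preferred
```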


Data needed to build the interface
•  If kinesthetic, then the designer must select a probe- or joystick-type device that is grasped by the user, or an exoskeleton device that is worn by the user.
•  When selecting a desktop device versus a wearable exoskeleton device for kinesthetic force display, the designer must decide on the importance of mobility when using the interface and the nature of the feedback to be displayed.
•  Most tactile feedback devices are pin arrays.

Design guidelines
•  Work space well matched to the human operator.
•  Ensure a safe and well-designed system that is not overqualified for the job.
•  Consider human sensitivity to tactile stimuli.
•  Use active movement rather than passive movement of the human operator, and minimize fatigue by avoiding static positions.
•  In multimodal systems, it is important to minimize confusion of the operator and limit control instabilities by avoiding time lags among haptic/visual loops.
•  Ensure realistic display of environments with tactile devices.

Task – Submission at the end of the class
•  Think of an application that a haptic interface may be useful for.
•  For whom will it be useful (users), and why do you think it is useful?
•  Is it tactile and/or kinesthetic based?
•  What hardware will you need?





2. Gesture Interfaces
Michael Nielsen, Thomas B. Moeslund, Moritz Störring, Erik Granum

http://www.youtube.com/watch?v=6ECVHsUVuJg

Introduction
•  Gestures originate from natural interaction between people.
•  They consist of movements of the body and face as nonverbal communication that complements verbal communication.
•  This is the inspiration behind using gesture interfaces between man and machine.

Introduction  Gesture interfaces can navigate a Windows interface just

as well or better than the mouse cursor,  While they may be more or less useless when it comes to

fast computer games, such as three-dimensional (3D) shooters and airplane simulators.  When developing a gesture interface, the objective should

not be “to make a gesture interface”. A gesture interface is not universally the best interface for any particular application. The objective is “to develop a more efficient interface” to a given application. Dr. Salsabeel Alabbady

Technology and applicability
•  The most interesting potential in this field of research is to make accessory-free and wireless gesture interfaces, such as in virtual-reality and intelligent rooms,
•  because the use of physical and wired gadgets makes the interface and gesturing tedious and less natural.
•  The first solutions required expensive data gloves or other such intrusive equipment with wires that made the user feel uncomfortable.
•  Greater success came with pen-based gestures (e.g., Palm handheld devices), where trajectories were recognized as gestures.

Mechanical and tactile interfaces
•  Early gesture interfaces relied on mechanical or magnetic input devices. Examples include the data glove, the body suit, and the Nintendo Wii.
•  Single-point touch interfaces are well known as pen gestures (Long et al., 2000), most commonly seen in Palm handheld devices.
•  But recent research has developed multipoint touches directly onto the screen, used in the iGesturePad, which open up a new and more efficient interface potential.
•  There are examples of research in making the computer aware of human emotions shown in body language. De Silva et al. (2006) detected emotion intensity from gestures using sensors that read galvanic skin response.

Touchless gesture control - enabling revolutionary natural user interfaces

http://www.youtube.com/watch?v=vypsUm2O8Nk


Facial gestures
•  Recent work has also focused on facial gestures (face expressions and poses), detecting reactions and emotions.
•  This information can be used for:
–  automatic annotation in human behavior studies,
–  accessibility for paralyzed people,
–  and feedback to an intelligent learning system.


The Future of Gesture Control -- Introducing Myo: Thalmic Labs at TEDxToronto

http://www.youtube.com/watch?v=8QSx5nBPj6Q


Design guidelines – The Midas Touch
•  The relevance of this story to gesture interface design is important: the Midas touch refers to the ever-returning problem of when to start and stop interpreting a gesture.
•  As a designer you can expect spontaneous gestures from users all the time if the goal is natural immersive behavior.
•  Therefore, the gesture recognition must be very tolerant.
•  Otherwise, users would suffer rigid constraints to their behavior while in the system.
•  Unfortunately, designers often select rigid constraints as solutions, such as forcing the user not to move between gestures.
•  Alternatively, users may have a manual trigger that tells the system when a gesture starts and stops.
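The manual-trigger solution to the Midas touch problem can be sketched as a "clutch": the recognizer only records motion between explicit start and stop events, so spontaneous movement outside that window is ignored. An illustrative Python sketch, not from the textbook.

```python
# Clutched gesture recognition: motion samples count as a gesture only
# while the user holds an explicit trigger.

class ClutchedRecognizer:
    def __init__(self):
        self.active = False
        self.samples = []
        self.gestures = []

    def start(self):                 # trigger pressed: begin a gesture
        self.active = True
        self.samples = []

    def stop(self):                  # trigger released: interpret the gesture
        self.active = False
        if self.samples:
            self.gestures.append(list(self.samples))

    def move(self, sample):
        if self.active:              # movement outside the window is ignored
            self.samples.append(sample)

r = ClutchedRecognizer()
r.move((0, 0))        # spontaneous motion before the trigger: ignored
r.start()
r.move((1, 0))
r.move((2, 0))
r.stop()
r.move((9, 9))        # ignored again after release
```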

Design guidelines – Cultural issues
•  Nonverbal communication is culturally dependent in typology (semantics), rhythm, and frequency. Perhaps there are even gender differences.
•  Conventional interfaces that are international are generally in English, but most software is available with a series of national language packages, and people in some nations use different keyboards.
•  In a gesture interface, this can be translated to selectable gesture vocabularies if it should become a problem that an emblem is illogical to another culture. Furthermore, if a culturally dependent gesture is used, this does not necessarily mean that it is utterly illogical for people of other cultures to learn it.
•  It is critical to consider cultural aspects when analyzing and developing gesture interfaces/detectors with a focus on natural human conversation and behavior. The system must be able to distinguish (and/or synthesize) those parameters on rhythm, frequency, and typology.

Design guidelines – Sequencing
•  The choice of gestures may depend on your choice of sequencing.
•  Avoiding problems later in the process is easier if you design the sequencing from the start.
•  Sequence design involves deciding each step that a user and the system will go through to accomplish a given task.
•  It is obvious that sequencing can affect interface efficiency and learnability.





3. Locomotion Interfaces
Mary C. Whitton, Sharif Razzaque

Introduction
•  Locomotion is a special type of movement.
•  Locomotion, as used by life scientists, refers to the act of an organism moving itself from one place to another.
•  This includes actions such as flying, swimming, and slithering.
•  For humans, locomotion is walking, running, crawling, jumping, swimming, and so on.
•  The focus of this chapter is computer-based locomotion interfaces for moving about in computer-generated scenes.
•  One way to think of these interfaces is that they are virtual-locomotion interfaces for virtual scenes.

Virtual-locomotion interface
University of Utah’s Treadport virtual-locomotion interface. Left: The user walks on a treadmill while viewing the moving virtual landscape on the large projector screens. Right: To simulate hills, the entire treadmill can tilt up. To simulate the user’s virtual inertia, the Treadport physically pushes or pulls the user via a large rod that connects to a user-worn harness.

Real-walking virtual locomotion interface.


Dominant metaphors for virtual locomotion
•  There are three dominant metaphors for virtual locomotion:
1.  In real-walking–style systems, input movements and resulting movement through the space are as natural and as much like really walking as possible. Examples are two-dimensional (2D) treadmills, walking-in-place, and real-walking interfaces.
2.  In vehicle-style interfaces, input movements and response are similar to driving a vehicle. For example, Fleming and colleagues (2002) developed a joystick interface in which pushing the joystick up (away from you) moves you forward, and pushing the joystick left rotates you left but does not move you forward. To move left, you must first turn (rotate) left using the joystick and then move forward. This is similar to how a driver operates a tank. Both real-walking and vehicle-style interfaces fall into a category of locomotion techniques that attempt to mimic, as closely as possible, how we walk or control vehicles in the real world.

Dominant metaphors for virtual locomotion
3.  In contrast, magical-style interfaces are those that permit movements that have no natural corollary in the real world but that serve a useful purpose when you are moving about in a virtual scene. For instance, the ability to teleport between two distant locations is a magical component of a locomotion interface. Similarly, the ability to adapt the length of your virtual stride so that you move miles with each step is a magical property.
•  Regardless of the choice of metaphor, the interface has to convert the user’s physical movement (in the real world) into virtual movement, both direction and speed, in the virtual scene.
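The conversion from physical to virtual movement, including a "magical" stride length, can be sketched as follows. This is an illustrative Python sketch, not from the textbook; the heading convention (0° = straight ahead along +y) and the scale factor are arbitrary choices.

```python
# Convert one physical step into virtual displacement: direction comes
# from the user's heading, distance from step length times a stride
# scale (scale > 1 is the "magical" amplified stride).

import math

def virtual_step(position, heading_deg, step_len_m, stride_scale=1.0):
    d = step_len_m * stride_scale        # virtual distance covered
    rad = math.radians(heading_deg)
    x, y = position
    return (x + d * math.sin(rad), y + d * math.cos(rad))

pos = (0.0, 0.0)
# one ordinary 0.7 m step, amplified 100x: 70 m of virtual travel
pos = virtual_step(pos, 0.0, 0.7, stride_scale=100.0)
```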


Joyman: an Immersive and Entertaining Interface for Virtual Locomotion

http://www.youtube.com/watch?v=VqJAjeWBoBA


Avatar: behind-the-scenes motion capture

Motion capture / Locomotion
http://www.youtube.com/watch?v=uO9c3DUUdzA
http://www.youtube.com/watch?v=MON0b3z_qCs

Design guidelines
•  General interface design rules apply to locomotion interfaces as well:
1.  Always consider the user’s goals for the interface as the highest priority.
2.  Perform many iterations of the design → test → revise cycle.
3.  Always include tests with actual users and the actual tasks those users will perform.

Design guidelines specific to whole-body locomotion interfaces
1.  Match the locomotion metaphor to the goals for the interface
You should always consider whether the locomotion metaphor suits the goal of the whole-body locomotion interface. For example, if the interface’s goal is to simulate real walking, then the interface should require the user to really turn her body to turn (walking metaphor), rather than turning by manipulating a steering wheel or hand controller (vehicle metaphor).
2.  Consider supplementing visual motion cues using other senses
The visual display alone may not always communicate some kinds of movement. For example, if the user of a game is hit with bullets, the quick and transient shock movement will be imperceptible in a visual display because the visual system is not sensitive to those kinds of motions. An auditory display and motion platform (e.g., a shaker) would work better. On the other hand, a motion platform would not be appropriate for conveying very slow and smooth motion such as canoeing on a still pond.

Design guidelines specific to whole-body locomotion interfaces
3.  Consider user safety
Since the user will be physically moving, consider her safety. One thing you must always consider as a designer is the set of cables that might connect to equipment worn by the user. Will the cables unintentionally restrict her movement or entangle her or cause her to fall? What prevents her from falling? And if she does fall, does the equipment she is wearing cause her additional injury? How will the cables be managed?
4.  Consider how long and how often the interface will be used
As a designer you must also consider the length of time the person will be using the interface and how physically fatiguing and stressful the motions she must make are. For example, if the interface requires her to hold her arms out in front of her, she will quickly tire. If the interface requires her to slam her feet into the ground, repeated use might accelerate knee and joint injury. As discussed above, you should also be on the lookout for symptoms of simulator sickness, and if you suspect that your locomotion interface is causing some users to become sick, investigate ways of changing the interface to address it.

Design guidelines specific to whole-body locomotion interfaces
5.  Do not assume that more sophisticated technology (higher tech) is better
One assumption that we have seen novice interface designers make that often results in a bad interface design is that an interface is better simply because it is “higher tech.” You must consider the goals of the user and make design choices based on that.