
Course Description

This course will introduce students to usability design concepts and authentic test methods for interactive media. Usability is grounded in tailoring interface design to the most effective user experience. Students will design, conduct, and analyze usability tests based on established principles, research findings, and theory. Upon completion of this course, students will be able to develop a usability test plan, recruit appropriate users, create robust task scenarios, facilitate testing, and analyze and communicate the test results.

Recommended Textbook(s)

Recommended Reading

  • Dumas, Joseph S. and Beth A. Loring. Moderating Usability Tests: Principles and Practices for Interacting. Morgan Kaufmann, 2008.
  • Dumas, Joseph S. and Janice C. Redish. A Practical Guide to Usability Testing. Intellect Ltd, 1999.
  • Hackos, JoAnn T. and Janice C. Redish. User and Task Analysis for Interface Design. Wiley, 1998.
  • Krug, Steve. Don’t Make Me Think: A Common Sense Approach to Web Usability. Second edition. New Riders Press, 2005.
  • Kuniavsky, Mike. Observing the User Experience: A Practitioner’s Guide to User Research. Morgan Kaufmann, 2003.
  • Snyder, Carolyn. Paper Prototyping: The Fast and Easy Way to Design and Refine User Interfaces. Morgan Kaufmann, 2003.
  • Stone, Debbie, Caroline Jarrett, Mark Woodroffe, and Shailey Minocha. User Interface Design and Evaluation. Morgan Kaufmann, 2005.
  • Tullis, Thomas and Bill Albert. Measuring the User Experience: Collecting, Analyzing, and Presenting Usability Metrics. Morgan Kaufmann, 2008.

Technologies Required

  • Mac or PC with an internet connection
  • Web browsers (Firefox, Opera, Safari, Internet Explorer)
  • TechSmith Camtasia Studio

Competencies

Each topic below lists the competencies students will develop and the method used to evaluate them.

Usability Concepts & Design Strategies
  • Describe usability concepts
  • Identify design strategies
Evaluation method: Journal Blog

Usability Testing Methods
  • Distinguish among the available test methods
  • Describe test methods
  • Identify the appropriate methods to use for testing
Evaluation methods: Usability Quiz; Case Study Evaluation

Planning for a Usability Test
  • Recognize which test method is appropriate for the situation
  • Distinguish between a co-discovery and an individual testing session
  • Identify tools, parameters, and constraints for testing
  • Describe the intended audience
Evaluation method: Test Plan SOW (document): choose one of the sites evaluated for Assignment 1 and build a justification for testing it using one of the methods discussed in the Usability Testing Methods section (which method is best, individual vs. co-discovery, tools, etc.)

Recruiting
  • Identify the intended audience
  • Develop strategies for recruiting individuals who resemble the intended audience
  • Develop recruitment materials (e.g., screener, follow-up email)
  • Explain incentives for participants
  • Define a preferred number of participants
Evaluation method: Screener Document

Designing a Usability Test Plan
  • Design appropriate tasks for the usability test
  • Describe tentative data measures associated with success factors
  • Create a schedule for testing dates and milestones
  • Describe the testing environment, required equipment, and logistics
  • Develop testing materials (e.g., moderator/facilitator guide, consent forms, and surveys)
Evaluation method: Test Plan

Conducting a Usability Test Session
  • Describe the criteria for being an effective moderator (e.g., for co-discovery or individual sessions)
  • Identify and demonstrate appropriate note-taking techniques
  • Execute the usability test plan
Evaluation method: Usability Lab

Analyzing Data
  • Identify and interpret the testing results (e.g., qualitative, quantitative, assessments)
  • Create and organize collected data for further analysis
  • Identify data patterns and recognize data that could potentially be invalid
  • Identify and prioritize findings into design revisions or solutions
Evaluation method: Data Summary Memo

Communicating Results
  • Identify the appropriate communication method for the intended audience using the results
  • Translate findings into actionable recommendations
  • Distinguish the findings that have the most impact
  • Indicate areas where further research is required
Evaluation method: Usability Report

Variation of Techniques
  • Distinguish among the method variations (e.g., eye tracking, biometrics)
  • Describe variations on basic methods
  • Identify the appropriate methods to use for testing
Evaluation method: Brief Paper

Assignments

Journal Blog


Journal Blog Entry

Read Ch. 1: What Makes Something Usable?, Ch. 2: What Is Usability Testing?, and Ch. 3: When Should You Test? Keep a journal of usability topics in the blog. The blog will develop and grow as you learn additional usability concepts and methods. Note: If you do not already have a blog, either sign up for a free blog on Blogger or WordPress, or download WordPress from http://www.wordpress.org and install it on your server (you will need to create an account first at http://wordpress.com). Use the usability evaluation strategies discussed in this course to make your blog more user friendly. You should update the blog each week. Each post should be at least 200 words in length and must be in your own words. Be sure to cite all sources. Below are topics and suggested ideas you might explore on your blog:

  • Return-On-Investment with Usability
  • Usability Myths & Truths
  • Discuss an everyday thing that can be enhanced by usability
  • Measuring the Usability of a Product (Identify how you would measure the usability of the site)
  • Identify a Company that uses a User Centered Design (UCD) approach and specify what makes the approach effective
  • Based on the usability testing method, identify limitations with each method
  • Identify a website that you would like to redesign. Specify the type of techniques/methods you would use to evaluate the usability and why you would use these techniques.
  • Develop a case study as to when you would use exploratory, assessment, validation, and comparison testing.
  • Discuss your experience conducting the usability lab.  Was it everything you imagined?
  • Discuss and explain an alternative method/technique for conducting usability evaluations.

Recommended: Subscribe to your classmates’ blog RSS feeds using Google Reader, Netvibes, Bloglines, or another RSS reader of your choice so you can learn from your peers.
Journal Blog Grading Rubric

Blog Posts (evaluated weekly):
  • 0 points: Blog post was not published on time, is poorly written, does not contain content relevant to the course, or does not meet the post length requirement.
  • 1 point: Blog post was published on time, contains some spelling and/or grammatical errors, and meets the post length requirement, but the content is not very relevant to the course or does not expand upon course topics.
  • 2 points: Blog post was published on time, contains no spelling or grammatical errors, meets the post length requirement, and the content expands upon course topics.
  • 3 points: Blog post is published on time, is very well written with no typos, grammar, or spelling errors, expands upon course topics, and exceeds the minimum post length. Post contains images where relevant to the content and links to plenty of sources and resources.

Usability Quiz Questions

Read Ch. 2: What Is Usability Testing? The following resources provide examples of potential quiz questions:

  • Human Factors International
  • DePaul University — Evaluating Human Computer Interaction
  • Additional Questions (source: http://www.cs.utep.edu/nigel/hci/test2-08.pdf)
    • Give two reasons why you may wish to use a low-fidelity prototype instead of a high-fidelity one.
    • What are cognitive walkthroughs good for?
    • Identify 3 things wrong with the following error message: “You have entered an invalid string character in Field 123A.”
    • Some textbooks on usability engineering include “survey” as a usability method. For example, you could simply mail a list of questions about some software to a population of users. This has various advantages, but also some disadvantages. List two kinds of information about an interface which typically cannot be obtained by a survey, and for each, explain what other usability method would be able to obtain that information.
    • What sorts of potential usability problems would a heuristic evaluation be unable to detect?

Case Study Evaluation

Read Ch. 3: When Should You Test? The following resources provide examples of case studies to be used during discussion and evaluation of usability testing methods:

Test Plan Statement of Work (SOW)

Read Ch. 3: When Should You Test? Students will evaluate a site, product, or interface on which they would like to conduct usability testing. Once the site, product, or interface is identified, students must build a justification for testing it, identify the goals and objectives for conducting the test, and identify the methods used to conduct the evaluation.

Test Plan Statement of Work (SOW) Rubric

Justification:
  • 0 points: Description of the reason to conduct the test is not present.
  • 1 point: Justification is present but not fully complete. Justification identifies only one reason to test the site. There are three or more grammatical errors.
  • 2 points: Justification is complete, accurate, and provides general information regarding the testing method or approach used. There are two or more grammatical errors.
  • 3 points: Justification content is complete. The justification for evaluating the site is clear, complete, and uses citations reinforcing the approach and methods used. There are no grammatical errors.

Goals & Objectives:
  • 0 points: Descriptions of the goals for testing are not present.
  • 1 point: Goals are present; however, the goals are not measurable. The goals are very vague, general, and not traceable. There are three or more grammatical errors.
  • 2 points: Goals are present. The goals are measurable, traceable, and reinforce the justification for testing. There are two or more grammatical errors.
  • 3 points: Goals are present. The goals are measurable, traceable, and reinforce the justification for testing. There are no grammatical errors.

Testing Method or Approach Identified:
  • 0 points: Method or type of evaluation is not presented.
  • 1 point: Method does not match the methods discussed in class and has no relevance to the test scenario being discussed. There are three or more grammatical errors.
  • 2 points: Method(s) are present and relevant to the type of site and its stage in the development process, but the method/approach selected does not help reinforce the goals and objectives for the site, product, or interface.
  • 3 points: Method(s) are present and relevant to the type of site and its stage in the development process. The method selected will help gather the appropriate information to support the usability goals and objectives.

Recruitment Screener

Read Ch. 7: Find and Select Participants. The primary goal of the recruitment screen assignment is to develop the screener and scheduling approach for participants. Students will identify the audience targeted for the study; preferred method for recruiting; develop a list of questions and screening script used to filter participants who are not appropriate for the study; and develop materials for scheduling participants (i.e. scheduling email, reminder call script, method for ensuring a variety of demographic needs are reflected in the participant pool).
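The screening logic described above can be sketched as a simple filter. The sketch below is purely illustrative: the criteria, field names, and candidates are invented for this example and are not part of the assignment.

```python
# Hypothetical screener criteria applied to a pool of candidates.
# All field names and cutoffs are invented for illustration.
candidates = [
    {"name": "A", "age": 34, "shops_online_monthly": True,  "works_in_ux": False},
    {"name": "B", "age": 17, "shops_online_monthly": True,  "works_in_ux": False},
    {"name": "C", "age": 41, "shops_online_monthly": False, "works_in_ux": False},
    {"name": "D", "age": 29, "shops_online_monthly": True,  "works_in_ux": True},
]

def passes_screener(candidate):
    """Screen out minors, non-shoppers, and industry insiders."""
    return (candidate["age"] >= 18
            and candidate["shops_online_monthly"]
            and not candidate["works_in_ux"])

qualified = [c["name"] for c in candidates if passes_screener(c)]
print(qualified)  # ['A']
```

In practice the screener is administered as a questionnaire or phone script, but writing the disqualifying conditions out this explicitly is a useful check that the criteria actually select the intended audience.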

Recruitment Screener and Scheduling Rubric

Audience Identification:
  • 0 points: Description of the audience and their needs was not provided. The audience segments/groups for testing were not identified. No research was performed to identify the audience segments/groups.
  • 1 point: A description of each audience segment/group, along with the preferred sample size for each group, was identified. There are two or more spelling errors.
  • 2 points: Research was performed to help identify the audience segments/groups, and a summary of the research citations was presented. The description of each audience segment includes distinct qualifiers for each segment/group, the preferred sample size for each group, a suggested number of extra participants to accommodate “no shows,” and an approach for handling “no shows,” such as recruiting floaters and double booking.

Recruiting Approach:
  • 0 points: No approach for recruiting was defined.
  • 1 point: An itemized listing of options for how to recruit was provided.
  • 2 points: A detailed approach for recruiting was defined, including whether the recruiting is direct or indirect as well as how it will be executed.

Screener Questions & Script:
  • 0 points: No screener questions or script were defined.
  • 1 point: A list of screener questions was defined, but no recruiting script was created. There are two or more spelling errors.
  • 2 points: Both the screener questions and the recruiting script were created. There are no spelling errors.

Scheduling Materials:
  • 0 points: No scheduling time slots were defined.
  • 1 point: A scheduling method and document were created to assign participants to timeslots. The document displays the participants’ names, contact information (email and phone number), and timeslot.
  • 2 points: A scheduling method and document were created to assign participants to timeslots. The document displays the participants’ names, contact information (email and phone number), timeslot, and basic demographics.

Usability Test Plan

Read Ch. 5: Develop the Test Plan, Ch. 6: Set Up the Testing Environment, and Ch. 8: Prepare Test Materials. The usability test plan is a blueprint that serves as the main vehicle of communication:

  • Outlining the testing logistics;
  • Defining required resources and testing materials;
  • Defining the purpose of the test;
  • Defining the testing methodology;
  • Outlining the task scenarios utilized for achieving the testing goals and purpose;
  • Identifying the data collected during testing;
  • Mapping the specific data collected to testing goals.


After completing the test plan, perform a pilot test using the test plan. In the appendix, include changes that were made to the test plan based on the outcome from the pilot test.
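The last item in the list above, mapping collected data to testing goals, can be made concrete with a small sketch. The goals and metrics below are invented examples, not prescribed by the course:

```python
# Hypothetical mapping from testing goals to the data that will show
# whether each goal is met (goal names and metrics are invented).
goal_metrics = {
    "Users can complete checkout in under 3 minutes": [
        "time on task (seconds)",
        "task completion rate",
    ],
    "Navigation labels are understandable": [
        "first-click accuracy",
        "number of wrong paths taken",
    ],
    "Users leave satisfied": [
        "post-test satisfaction rating (1-5)",
    ],
}

# A quick sanity check: every goal should have at least one measure.
unmapped = [goal for goal, metrics in goal_metrics.items() if not metrics]
assert not unmapped, f"goals with no data mapped: {unmapped}"
print(f"{len(goal_metrics)} goals mapped to "
      f"{sum(len(m) for m in goal_metrics.values())} measures")
```

Keeping the mapping in one place makes it easy to spot a goal with no supporting data, or data being collected that supports no goal, before the sessions begin.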

Designing a Usability Test Plan Rubric

Test Logistics:
  • 0 points: The test logistics are incomplete. They do not include the testing location, a listing of the equipment and materials needed for testing, the software (browser platforms and the resolution testing will be conducted at), and the hardware (the computer type used: Mac, PC, or mobile device).
  • 1 point: A partial outline of the testing logistics is presented. Two of the four logistics (location, equipment, software, and hardware) are presented.
  • 2 points: The testing logistics are complete and accurate. There are two or more grammatical errors.
  • 3 points: The testing logistics are complete and accurate.

Resources & Testing Materials:
  • 0 points: The test plan does not outline the testing materials (consent forms, facilitator guide, demographic questionnaire, and post-test questionnaire) and the resources needed to make the test plan all-encompassing.
  • 1 point: A partial listing of the resources and testing materials is noted.
  • 2 points: A complete listing and description of the testing materials are noted. There are two or more grammatical errors.
  • 3 points: A comprehensive listing and description of the testing materials are noted.

Testing Purpose:
  • 0 points: The purpose, goals, and objectives for conducting the test are not clearly defined.
  • 1 point: The purpose of the test is defined, but the goals and objectives associated with the test are not.
  • 2 points: The purpose, goals, and objectives for conducting the test are defined. There are two or more spelling errors.
  • 3 points: The purpose, goals, and objectives for conducting the test are defined.

Testing Methodology:
  • 0 points: Description of the testing methodology and process is not defined.
  • 1 point: A partial and/or vague explanation of the testing methodology and process is presented. Several spelling errors are identified.
  • 2 points: Description of the testing methodology and process is defined. There are two or more spelling errors.
  • 3 points: Description of the testing methodology and process is clearly defined.

Task Scenarios:
  • 0 points: Task scenarios are not present in the test plan.
  • 1 point: Task scenarios lead the user and have several spelling errors.
  • 2 points: The task scenarios provide context for the task at hand and are not leading. There are two or more spelling errors identified.
  • 3 points: The task scenarios provide context for the task at hand and are not leading. After each task scenario there are appropriate questions for probing when needed. There are no spelling errors.

Data Collection:
  • 0 points: No mention of the type of data to be collected was identified.
  • 1 point: A list of the types of data to be collected was identified.
  • 2 points: A description of the types of data to be collected was identified. Each data type was associated with a testing goal to help ensure that the appropriate metrics are collected to determine whether the goal is met. There are two or more spelling and grammatical errors.
  • 3 points: A description of the types of data to be collected was identified. Each data type was associated with a testing goal to help ensure that the appropriate metrics are collected to determine whether the goal is met. There are no spelling errors.

Pilot Testing:
  • 0 points: No pilot testing was conducted.
  • 1 point: Pilot testing was conducted and areas for enhancement were identified; however, no modifications were made to the test plan to reflect the changes.
  • 2 points: Pilot testing was conducted and areas for enhancement were identified. A brief write-up identifying the changes was completed, and the appropriate changes were made to reflect the feedback from the pilot test. There are two or more spelling errors.
  • 3 points: Pilot testing was conducted and areas for enhancement were identified. A brief write-up identifying the changes was completed, and the appropriate changes were made to reflect the feedback from the pilot test. There are no spelling errors.

Usability Lab

Read Ch. 4: Skills for Test Moderators, Ch. 9: Conduct Test Sessions, and Ch. 10: Debrief Participants and Observers. Conduct at least four usability test sessions, of at least 30 minutes each, with each user group. Take notes about your experiences during the usability sessions, because you will be writing a few paragraphs about them. After the lab, identify ways that the session could be improved to create a better test for the users and/or moderators.

For all usability sessions:

Include the following information in your lab report.

  • Describe the usability test plan you completed.
  • Copy and paste the URL of the website you tested into your lab report.
  • Write a few paragraphs about your experiences using the simulation. Describe any surprises or frustrations you felt, from the user and/or moderator perspective.
  • Suggest three ways that the usability test design could be improved.
Usability Lab Rubric

Content:
  • 0 points: Description of the usability test session is incomplete or has inaccuracies, or there are five or more grammatical errors.
  • 1 point: Content is complete, accurate, and addresses the surprises or frustrations encountered by users and/or moderators. There are three or more grammatical errors.
  • 2 points: Content is complete, accurate, and addresses the surprises or frustrations encountered by users and/or moderators. There are no grammatical errors.

Critical Thinking:
  • 0 points: Lacking critical thinking. Three relevant recommendations for improvement were not provided.
  • 1 point: Some critical thinking (application, analysis, synthesis, and evaluation) is evident. Three relevant recommendations for improvement were provided.
  • 2 points: Clear evidence of critical thinking (application, analysis, synthesis, and evaluation). Three relevant recommendations for improvement were provided.

Data Summary Memo

Read Ch. 11: Analyze Data & Observations and Ch. 14: Expanding from Usability Testing to Designing User Experience. The point of this activity is to write a preliminary summary of the raw data collected in the previous usability lab. The Data Summary Memo requires the following components:

  1. Purpose/introduction
  2. Test user profiles (number of participants, basic demographic information, technology competency, etc.; any information collected prior to the test)
  3. Definition of the anticipated raw data (what was collected, and how)
  4. Summary of the raw results (use tables and graphs wherever possible)
  5. Early insights (illustrate patterns, trends, outliers, etc., citing tables/graphs)
  6. Summary
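The summary of raw results is often easiest to start in a few lines of code. The sketch below uses invented data for a single hypothetical task; the participant IDs, times, and completion flags are illustrative, not real results:

```python
from statistics import mean, median

# Invented raw results for one task from a hypothetical four-participant
# session: (participant, completed?, time in seconds).
raw = [
    ("P1", True, 48),
    ("P2", True, 95),
    ("P3", False, 180),  # gave up; excluded from time-on-task stats
    ("P4", True, 62),
]

completed = [r for r in raw if r[1]]
success_rate = len(completed) / len(raw)
times = [r[2] for r in completed]

print(f"success rate: {success_rate:.0%}")                   # 75%
print(f"mean time on task (successes): {mean(times):.1f}s")  # 68.3s
print(f"median time on task (successes): {median(times)}s")  # 62s
```

Separating failed attempts from the time-on-task averages, as above, keeps a participant who gave up from skewing the summary statistics.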
Data Summary Memo Rubric

User Profiles:
  • 0 points: Profile was not completed.
  • 1 point: Profile is poorly written, incomplete, or has inaccuracies.
  • 2 points: Profile is complete, accurate, and has relevant details.

Data:
  • 0 points: Data definition/summary was not completed.
  • 1 point: Data summary is poorly written, and graphs/tables were not provided. Data definition/summary is incomplete or has inaccuracies.
  • 2 points: Data definition/summary is complete, accurate, and has relevant details.

Memo:
  • 0 points: Memo is incomplete. The content is poorly written. There are three or more grammatical errors.
  • 1 point: Memo includes all required components but is inconsistent with the user profiles and/or data results.
  • 2 points: Memo includes all required components, is accurate, and addresses all relevant profile and data. There are no grammatical errors.

Critical Thinking:
  • 0 points: Lacking critical thinking. Early data insights were not provided.
  • 1 point: Some critical thinking (application, analysis, synthesis, and evaluation) is evident. Early data insights were provided.
  • 2 points: Clear evidence of critical thinking (application, analysis, synthesis, and evaluation). Early data insights were provided.

Usability Report

Write a Usability Report

Read Ch. 12:  Report Findings and Recommendations and Ch. 14:  Expanding from Usability Testing to Designing User Experience. Create a Usability Report for the activity and data you described in the Usability Lab and Data Summary Memo. The Usability Report should include:

  • Introduction/overview of project
  • Data Summary Memo
  • Hierarchy of Findings (distinguish findings that have the most impact)
  • Recommendations (translate findings into actionable recommendations)
  • Future Research (identify areas where further research is required)
  • Conclusion
Usability Report Rubric

Usability Report:
  • 0 points: Usability Report document is incomplete. The report is poorly written. There are three or more grammatical errors.
  • 1 point: Usability Report includes all required components and is well written but is lacking in details. There are three or fewer grammatical errors.
  • 2 points: Usability Report includes all required components, is exceptionally well written, and includes relevant details. There are no grammatical errors.

Critical Thinking:
  • 0 points: Lacking critical thinking. Responses tend to be inaccurate or unclear.
  • 1 point: Some critical thinking (application, analysis, synthesis, and evaluation) is evident, but responses may not directly address the issue.
  • 2 points: Clear evidence of critical thinking (application, analysis, synthesis, and evaluation). Responses are characterized by originality and relevance to the topic.

Brief Paper

Read Ch. 13: Variations on the Basic Method. The Brief Paper is a 2–3 page written critique with a brief description of a usability test technique variation and how it would, or would not, have worked for your previous usability exercise.

Take into account the proposed user audience for the site, test parameters/constraints, test session and findings, etc. (in other words, the topics we discussed in class) and make an attempt to justify why the technique would have worked for your usability exercise conducted in class—or not.

Include the following information in your Brief Paper.

  • Describe the variation of usability technique you selected.
  • Compare and contrast this new technique with the technique you used for class.
  • Justify why the new technique would have worked well for you, or not.
Brief Paper Rubric

Brief Paper:
  • 0 points: Brief Paper was not completed on time, is poorly written, and/or does not contain relevant content.
  • 1 point: Brief Paper was completed on time and is well written, but contains some spelling and/or grammatical errors; most of the content is relevant.
  • 2 points: Brief Paper was completed on time, is extremely well written, contains no spelling or grammatical errors, and all of the content is relevant.

Critical Thinking:
  • 0 points: Lacking critical thinking. The justification(s) tend to be inaccurate or unclear.
  • 1 point: Some critical thinking (application, analysis, synthesis, and evaluation) is evident, but the justification(s) may not directly address the issues.
  • 2 points: Clear evidence of critical thinking (application, analysis, synthesis, and evaluation). Justifications are characterized by relevance to the topic.

Examination Questions


Contributors

Primary Course Developers: Dara Solomon and Dana Bryant

Production Editor: Terry Morris