
Usability Testing | Sree Anirudh J Bhandaram

http://www.bionecron.com/portfolio/usability/usability-testing/


Usability Testing
Prepared for: Staci Autovino Kemp
Prepared by: Kate Cushman Walders, Sanket Anil Naik, Sharmishtha Dhuri, Sree Anirudh J Bhandaram
May 15, 2009

Executive Summary

DataMentor is the friendly front door to assessment data and instruction, with solutions for the classroom-level teacher. It allows teachers, administrators, curriculum specialists, and other professionals interested in improvement through the use of data to analyze district assessment scoring results and select from an array of options to target student growth based on individual or district needs. Users can analyze district assessment scoring data, identify performance gaps between individual school districts and the local BOCES aggregate result, and explore professional development resources.

The goal of the usability test was to validate the usability of the most common tasks on the website. Seven participants agreed to test the site: four tests were conducted in the usability test lab, and three were conducted at the participants' locations of choice. All participants except one used a laptop computer running Windows Vista. Screen-capture software (Webinaria) was used to record mouse clicks, pages visited, and the participants' comments.

Not all participants were able to complete all the tasks successfully. For the task that required the participant to navigate to a trend summary chart, 85% of the participants completed the task; for the task that required the participant to find a lesson plan, only 42% did. The lesson-plan task gave participants the most difficulty, with an average task time of 427 seconds and a completion rate of 42%.
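The completion rates quoted above are simple counts over the seven participants, apparently truncated (not rounded) to whole percentages. A minimal sketch of the arithmetic, assuming the pass/fail counts reported in the task results section:

```python
# Completion rate from raw pass/fail counts, truncated to a whole
# percentage (which matches the 85% and 42% figures quoted above).

def completion_rate(successes: int, participants: int) -> int:
    """Return the completion rate as a truncated whole percentage."""
    return successes * 100 // participants

print(completion_rate(6, 7))  # six of seven succeeded -> 85
print(completion_rate(3, 7))  # three of seven succeeded -> 42
```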


The chart below shows the average task completion time; it is clear that task 4 was the most difficult task.

2 Method

2.1 Participants

The main users of the DataMentor website are teachers and administrators. Of the seven participants, four were male and three were female; three were novice users and four were experts. The participants included one administrator, four teachers, and two data analysts, and many had roles that were not limited to teaching.

2.1.1 Team Roles


Before conducting the usability test, we identified roles for the team members based on the duties and processes involved in the test. We came up with four team roles: test administrator, test moderator, note taker, and observer. We then assigned roles to the team members as follows:
Team Member        | Team Role
Kate Walders       | Administrator
Sanket Naik        | Moderator, Note taker
Sharmishtha        | Moderator, Note taker
Anirudh Bhandaram  | Note taker, Observer

After assigning roles, we conducted a pilot test to familiarize ourselves with them. Although the team roles were specified, the note takers occasionally played different roles during the tests. As the test administrator, Kate recruited all the participants and was the team's point of contact with Staci and the participants. During each test, Kate greeted the participant, read the orientation script, obtained written consent, and administered the pre-test. She also conducted the tests by handing out and reading the tasks aloud to the participant, and took notes on the participant's comments in addition to the note takers. Anirudh helped set up the data-capture software during each session in addition to being the observer, and took notes during one usability test. Sanket and Sharmishtha were test moderators and note takers; they were in the usability testing room with the participant and recorded the participant's comments. Because Sanket and Anirudh could not attend two of the tests due to scheduled classes, Kate and Sharmishtha conducted those two tests on their own, assuming all the roles. After the tests were completed, Sanket analyzed the data, Kate and Sharmishtha prepared the website recommendations, and Anirudh created the usability test report.


2.2 Protocols

The test was designed to collect data relevant to the tasks, which enabled us to make suggestions for the website. The test also helped validate our heuristic evaluation of the website. The participants were encouraged to think aloud, which let us obtain comments and suggestions that would otherwise have been difficult to get. Our main focus was the path a participant followed to complete a task; this gave us insight into the design of the website and the frustrations a participant experienced while completing the tasks. We also collected task times, the number of times the help system was accessed, and the frequency of back-button use.

The primary roles for the testing were administrator and note taker. The administrator was responsible for direct communication with the participants: as the administrator, Kate read the tasks, answered questions, oriented the participants, and conducted the post-task and post-test interviews. The note takers were responsible for recording the various types of data that came out of the test, through observation or through the communication between the participant and the administrator. Users were encouraged to follow the think-aloud protocol as they performed the tasks. Our administrator kept her input to a minimum until the tasks were completed; she discussed the tasks with the participants in more detail while filling out the post-task and post-test questionnaires.
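The per-participant metrics described above (task time, pages visited, back-button presses, and help use) can be captured in a simple per-task record. A hypothetical sketch, with field names of my own choosing rather than anything from the report:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TaskObservation:
    """One participant's measurements for one task.

    Hypothetical structure mirroring the metrics the note takers
    recorded. A time of None marks a failed task attempt.
    """
    participant: int
    task: int
    time_sec: Optional[int]   # None if the participant failed the task
    pages_traversed: int
    back_button_uses: int
    help_time_sec: int        # total time spent in the help system

    @property
    def completed(self) -> bool:
        return self.time_sec is not None

# Example: participant 2's Task 1 run (values from section 4.1)
obs = TaskObservation(participant=2, task=1, time_sec=411,
                      pages_traversed=12, back_button_uses=5,
                      help_time_sec=72)
print(obs.completed)  # True
```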

2.3 Procedure

The usability test lab was reserved for the four participants who consented to test in the lab; they were informed about the test well in advance. One participant tested the website at her residence, and the other two tested in an empty classroom at a suburban middle school. During the test, Kate greeted the participants, gave introductory instructions, and asked them to fill out a pre-test form and sign a consent form. They were informed that their voice and their actions on the screen would be recorded. The participants were encouraged to think aloud and were told to let us know when they thought they had completed a task. If they asked for assistance because they were having trouble, they were redirected but never told the solution to the task (unless they gave up). They were reminded that we were testing the website, not them, and we attempted to make them feel as comfortable as possible. After completing each task, participants answered a set of generic questions about it and were encouraged to discuss the task and offer any comments they had on the website or the task.

2.4 Equipment

All of the participants except one tested on Sharmishtha's laptop computer, which ran Windows Vista. A mouse was provided to make interaction with the website easier. The website was accessed through Mozilla Firefox in its default configuration; the Flash plug-in the website required was preinstalled. Sound was turned on so we could get feedback from users about the sounds on the website.


2.5 Testing

Four usability tests were conducted in the usability test lab in the Thomas Golisano College of Computing and Information Sciences at Rochester Institute of Technology. The lab was designed for occupancy by four people at once. Although it had two computers, we used our own laptop because some participants were not going to test in the lab. Because environmental factors did not play a significant role in the test, we decided to ignore the presence of more than two people in the room. All the tests were conducted with the same administrator, but the remaining roles were filled by different team members across the tests.

3 Tasks

Our task scenarios were designed to mimic the most common tasks a user would perform on the website. Based on these scenarios, we came up with a set of research questions that we wanted to explore during the test:

- How easily and successfully do teachers and administrators find subject resources?
- How quickly can a teacher or administrator find their grade and subject's trend summary charts?
- What are the major usability flaws that cause the most frustration for teachers and administrators?
- How easily do teachers and administrators understand what is clickable?

You can find the tasks used in the test in the appendix.

4 Test Results

During the tests, the note takers observed and noted all the comments that the participants made. The data collected included task times, the number of times help was referenced, and the number of pages traversed. The quantitative data was categorized by task; the qualitative data can be seen in the appendix.

4.1 Task 1 View chart list

On average, a participant spent 4 minutes performing this task, with the bulk of the time spent analyzing the charts. This was the easiest task for the participants, and all users completed it successfully.

Participant | Time (sec) | Pages traversed | Back button used | Time on help/tutorial (sec) | Avg. time per help use (sec)
1           | 94         | 4               | 0                | 0                           | 0
2           | 411        | 12              | 5                | 72                          | 36
3           | 368        | 11              | 4                | 104                         | 72
4           | 96         | 4               | 0                | 0                           | 0
5           | 440        | 7               | 0                | 90                          | 90
6           | 113        | N/A             | N/A              | N/A                         | N/A
7           | 117        | 4               | 0                | 0                           | 0
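As a check, the per-task summary figures can be recomputed from the raw observations; a minimal sketch using the Task 1 times from the table above:

```python
# Recompute the Task 1 summary from the per-participant times above.
times = [94, 411, 368, 96, 440, 113, 117]  # seconds, from the table

avg_time = sum(times) / len(times)
print(f"average time: {avg_time:.0f} s (~{avg_time / 60:.0f} minutes)")
# -> average time: 234 s (~4 minutes), matching the "4 minutes" figure
```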

4.2 Task 2 View trend summary and related performance indicators


All of the users were able to complete this task. On average, a participant spent 6 minutes on it. The average number of pages traversed was 15; one user traversed 51 pages to complete the task, though the ideal path was 5 pages. The Trend Summary link, located at the bottom of the page, was not easily visible.

Participant | Time (sec) | Pages traversed | Back button used | Time on help (sec) | Avg. time per help use (sec)
1           | 60         | 5               | 0                | 0                  | 0
2           | 629        | 10              | 6                | 0                  | 0
3           | 192        | 11              | 5                | 10                 | 10
4           | 199        | 6               | 1                | 0                  | 0
5           | 706        | 51              | 11               | 0                  | 0
6           | 119        | N/A             | N/A              | N/A                | N/A
7           | 269        | 10              | 4                | 0                  | 0

4.3 Task 3 Distinguish resources to locate different assessment views


One user failed this task. The average user took 7 minutes to complete it. Some users tried to locate the January 2007 ELA Regents through the chart list, where they expected it, but did not find it there.

Participant | Time (sec) | Pages traversed | Back button used | Time on help/tutorial (sec) | Avg. time per help use (sec)
1           | 860        | 38              | 11               | 0                           | 0
2           | 398        | 16              | 8                | 32                          | 32
3           | Failed     | 8               | 0                | 0                           | 0
4           | 74         | 4               | 0                | 0                           | 0
5           | 58         | 3               | 0                | 0                           | 0
6           | 386        | N/A             | N/A              | N/A                         | N/A
7           | 582        | 23              | 6                | 134                         | 67

4.4 Task 4 Find the resource to locate lesson plans

This task was considered the most difficult of the four; four out of seven users failed it.

On average, 30 pages were traversed performing this task.

Participant | Time (sec) | Pages traversed | Back button used | Time on help/tutorial (sec) | Avg. time per help use (sec)
1           | 150        | 21              | 14               | 0                           | 0
2           | Failed     | 67              | 17               | 152                         | 38
3           | Failed     | 23              | 10               | 60                          | 60
4           | Failed     | 23              | 6                | 38                          | 38
5           | Failed     | 29              | 6                | 18                          | 18
6           | 204        | N/A             | N/A              | N/A                         | N/A
7           | 205        | 13              | 3                | 0                           | 0

4.5

Conclusions

On average, all tasks took more than 5 minutes, and most of that time was spent searching for the data. An average user traversed 17 web pages per task because they could not find the data where they expected it. The help system was consulted for only 19 seconds on average; users would start reading the manuals but ultimately give up, and few watched the help video for more than a few seconds. The back button was used an average of 5 times per task because users could not find appropriate navigation links. Many of our users went to the Accelerate U site by mistake because the DataMentor website did not notify them that they were leaving the site.

5 Recommendations

5.1 Task 1 View chart list
- Legends should be visible throughout the chart list so that users can refer to them at any time.
- Make the chart-sorting buttons more obvious.

5.2 Task 2 View trend summary and related performance indicators


- The Trend Summary chart option should be listed as a sub-menu item under the chart list.

5.3 Task 3 Distinguish resources to locate different assessment views


- List all similar data in one place. For example, a user could not get to the January 2007 ELA Regents from the chart list.
- The font size of the different assessment views (teachers/students) should be larger.
- Organize the test questions in an assessment so that each question is on its own page, and provide navigation that lets the user choose a question and/or move to the previous or next question.


5.4 Task 4 Find the resource to locate lesson plans


- The lesson plan is frequently used by target users, yet almost all users had problems finding this option. The lesson plans should be part of the resources navigation.

5.5 Overall
- The data should be reorganized into categorical views so that user goals are more easily attainable.
- Rename some information types to be more descriptive; for example, "new features" could be "new assessment updates." Define "resources." Make more information choices readily apparent. Move features that are not primary data (like new features).
- Search is the user's lifeline when navigation fails. Search should be presented as a simple box, since that is what users look for.
- Browsing PDF files within the website is not preferred by users; most PDFs do not follow standard browser conventions. PDF layouts are often optimized for a sheet of paper, which rarely matches the size of the user's browser window, and such documents are difficult to scroll. PDF is great for printing and for distributing manuals; reserve it for that purpose and convert any information that needs to be browsed or read on screen into real web pages.
- A wall of text is deadly for an interactive experience. Help text should be formatted with subheads, bulleted lists, highlighted keywords, and short paragraphs.
- CSS style sheets unfortunately give websites the power to disable a web browser's font-size controls and specify a fixed font size. In DataMentor the font is fixed and small, which significantly reduces readability for most users. The website should provide user preferences and let users resize text as needed. Font size is important for DataMentor because part of the target audience is over age 40.
- Provide a hierarchical view of the website so users can browse it seamlessly without relying on the browser back button.

