Analytical And Empirical Methods For Usability Evaluation Computer Science Essay

Usability evaluation assesses the quality of the usability of an interface design. Two common methods for this process are Heuristic Evaluation (an analytical method) and User Testing (an empirical method). Each method has its own strengths and limitations, depending on the objectives and context of the evaluation.

TASK 1: Methods Description

Heuristic Evaluation

Heuristic evaluation is a usability inspection technique first developed by Jakob Nielsen and his colleagues (Sharp, 2007). According to Nielsen (1994), heuristic evaluation is a usability engineering method for finding the usability problems in a user interface design so that they can be attended to as part of an iterative design process. It is usually conducted by a small group of evaluators who are presented with an interface design and asked to judge whether each of its elements follows a set of established usability principles, or 'heuristics'. Each evaluator inspects the interface independently, working through it several times, at least twice: the first pass is to get a feel for the flow of the interaction, while the second and subsequent passes inspect specific dialogue elements of the interface. All problems encountered are recorded and listed under the relevant heuristics. 'Only after all evaluations have been completed are the evaluators allowed to communicate and have their findings aggregated. This procedure is important in order to ensure independent and unbiased evaluations from each evaluator' (Nielsen, 1990). The rationale for bringing the evaluators back together is to discover as many problems as possible: a single evaluator cannot find them all, and different evaluators discover different problems.

The heuristics developed by Nielsen and Mack (1994) consist of ten general principles, or 'rules of thumb': Visibility of system status; Match between system and the real world; User control and freedom; Consistency and standards; Error prevention; Recognition rather than recall; Flexibility and efficiency of use; Aesthetic and minimalist design; Help users recognize, diagnose, and recover from errors; and Help and documentation. However, evaluators may add new or category-specific heuristics to supplement the general ones.
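The record-then-aggregate procedure described above can be sketched in code. This is an illustrative sketch only, not part of Nielsen's method (the function name and the example findings are invented): each evaluator independently logs (heuristic, problem) pairs, and the findings are merged only at the end, with duplicate discoveries collapsed.

```python
from collections import defaultdict

# Nielsen's ten heuristics, used as the recording categories.
HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose, and recover from errors",
    "Help and documentation",
]

def aggregate_findings(per_evaluator_findings):
    """Merge each evaluator's independent findings into one list per
    heuristic, de-duplicating identical problem descriptions."""
    merged = defaultdict(set)
    for findings in per_evaluator_findings:
        for heuristic, problem in findings:
            if heuristic not in HEURISTICS:
                raise ValueError(f"Unknown heuristic: {heuristic}")
            merged[heuristic].add(problem)
    return {h: sorted(ps) for h, ps in merged.items()}

# Two evaluators, working independently, report their problems.
evaluator_a = [("Error prevention", "No alert on failed form submit"),
               ("Consistency and standards", "Mixed font sizes in titles")]
evaluator_b = [("Error prevention", "No alert on failed form submit"),
               ("User control and freedom", "No quick-exit button")]

report = aggregate_findings([evaluator_a, evaluator_b])
```

The shared "no alert" finding appears once in the merged report, which is the point of aggregating only after the independent inspections are complete.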

Strengths of Heuristic Evaluation:

Ease in Organizing Evaluation: The method is easy to arrange. Because they work independently, the evaluators can inspect at any time, as long as there is a product to inspect. The whole inspection process is handled by the evaluators without outside intervention until the final part of the session, when they come together to discuss everything that has been discovered. Before that, each evaluator repeats the inspection several times, at least twice.

Lower Cost: Compared to user testing, which needs a proper laboratory, heuristic evaluation involves less cost and time. The number of evaluators is much smaller, typically three to five, whereas user testing requires many more participants to be effective. Heuristic inspection can also be conducted anywhere, even at office or industry premises.

Effective Detection of Problems: By virtue of their experience and knowledge, evaluators can readily identify both major and minor problems in the interface. The evaluators' discussion session helps surface problems that individual evaluators missed or overlooked.

Weaknesses

Single Evaluator and Problem Detection: Heuristic evaluation can be done by a single evaluator. However, it is hard for one person to discover all the usability problems in an interface. According to Nielsen, averaged over six projects, a single evaluator found only 35% of the usability problems in an interface.
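Nielsen models the proportion of problems found by a group of evaluators as 1 − (1 − λ)^i, where λ is a single evaluator's detection rate (roughly 0.35, per the figure above) and i is the number of evaluators. A quick sketch of that model (the function name is ours, and λ = 0.35 is an assumption carried over from the figure above):

```python
def proportion_found(evaluators: int, single_rate: float = 0.35) -> float:
    """Expected proportion of usability problems found by `evaluators`
    independent inspectors, per Nielsen's discovery model
    1 - (1 - lambda)^i."""
    return 1 - (1 - single_rate) ** evaluators

for i in (1, 3, 5):
    print(f"{i} evaluator(s): {proportion_found(i):.0%}")
```

With λ = 0.35 the model predicts about 73% coverage from three evaluators and about 88% from five, which is why Nielsen recommends small groups rather than lone inspectors.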

Not Providing Systematic Problem Fixes: 'Heuristic evaluation does not provide a systematic way to generate fixes to the usability problems or a way to assess the probable quality of any design' (Nielsen).

The Paradox of Lower Cost: Employing a single evaluator reduces cost, but the risk is poor coverage of the problems. It is therefore necessary to engage multiple evaluators, which means a higher cost.

The Risk of Professional Differences: If the evaluators are not part of the development team, they may not be aware of technical limitations on the design or why certain design decisions were made. This may lead to disagreements between the development team and the evaluators, which can impede communication and the correction of the usability problems identified during the evaluation.

Capturing One-Time, Low-Priority Problems: As heuristic evaluations are loosely structured, there is a likelihood that evaluators will capture one-time, low-priority problems that may not be important to correct. It wastes time and cost, for both the evaluators and the product developer, if evaluators become preoccupied with addressing this kind of problem.

EMPIRICAL METHOD: User Testing

User testing is a user-centred usability evaluation method in which evaluators observe and record the performance of users during a test session. According to Helen Sharp, quoting Dumas and Redish (1999), 'the goal is to test whether the product being developed is usable by the intended user population to achieve the tasks for which it was designed.' It is thus a method of finding problem areas by engaging people similar to the target users to perform certain tasks. Administered in a usability laboratory, in a controlled environment and within a scheduled time, the users are required to perform specific tasks such as navigating through the menus, searching for information, or reading different typefaces. During the test sessions, the evaluator 'interprets the user's actions in order to infer how these actions are related to the usability issues in the design of the interface' (Nielsen, 1994). The evaluators record what the users do, how, and when: the participants' attitudes, actions, behaviour, facial expressions, remarks, and other gestures are all recorded.

Strengths of User Testing

The Strength of the Empirical Method: Identifying product problems by empirically observing users' interaction with the interface gives a better understanding of the users' problems and needs. This is further enhanced when followed by a brainstorming session.

The Productive Aspect of Brainstorming: Brainstorming or 'think aloud' sessions between the evaluators and the users encourage users to talk and share their experiences. This helps uncover many more problems for the correction or improvement of the product.

Users' Needs Come First: Evaluators or software development specialists sometimes hold their own biased views of what users want or need. User tests neutralize this by revealing what is really needed or expected from the users' perspective. Giving priority to problems raised by the users also cools down professional differences between the evaluators and the system developers.

Weaknesses

Problem in Determining the Right Number of Test Users: The problem is deciding how many users are ideal for the test, and opinions differ on the right number to engage in the lab. Research by Virzi (1990) showed that a group of 5 users uncovered 80% of the problems, and a group of 10 uncovered 90%. That being the case, more users implies more cost. In practice, however, it is not easy to decide on the most cost-effective or optimum number of participants, because results may differ when testing is done in a different context, environment, or time.
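Assuming the geometric discovery model often applied to such figures (each user independently uncovers a fixed proportion λ of the problems), the number of users needed for a target coverage can be estimated as n = log(1 − target) / log(1 − λ). A hedged sketch, with λ back-solved from Virzi's five-user figure rather than taken from any published table:

```python
import math

def users_needed(target: float, per_user_rate: float) -> int:
    """Smallest number of test users expected to uncover at least
    `target` proportion of problems, assuming each user independently
    reveals `per_user_rate` of them (geometric discovery model)."""
    return math.ceil(math.log(1 - target) / math.log(1 - per_user_rate))

# Per-user detection rate implied by Virzi's "5 users -> 80%" figure.
rate = 1 - (1 - 0.80) ** (1 / 5)   # roughly 0.275
print(users_needed(0.90, rate))
```

Under these assumptions about eight users would be expected for 90% coverage, which illustrates the text's point: each extra slice of coverage costs disproportionately more participants, so the "optimum" number is a cost trade-off rather than a fixed figure.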

Non-Coverage of Functionality: The test sometimes cannot cover all the functionality of the product within the constrained time and environment.

Inability to Capture Minor Usability Problems: Though this method is good at discovering major HCI problems, it is quite difficult to observe minor usability problems, such as typographic inconsistency. Moreover, even the same gesture observed in a user may be interpreted differently by each evaluator.

Conclusion

Judging from the strengths and weaknesses of both Heuristic Evaluation and User Testing, it is difficult to conclude that either method is better; it is an 'evaluator-centred' versus a 'user-centred' approach. Ideally, both methods should be used, as each has its own advantages and they complement each other.

REFERENCE

Nielsen, J. and Mack, R. L. (eds.) (1994). Usability Inspection Methods. John Wiley & Sons, New York, NY.

Tan, W., Liu, D. and Bishu, R. (2009). Web evaluation: Heuristic evaluation vs. user testing. International Journal of Industrial Ergonomics, 39(4), 621-627.

Heuristic evaluation: Comparing ways of finding and reporting usability problems. Interacting with Computers, 19(2), March 2007, 225-240.

Bastien, J. M. C. (2010). Usability testing: a review of some methodological and technical aspects of the method. International Journal of Medical Informatics, 79(4), e18-e23.

Sharp, H., Rogers, Y. and Preece, J. (2007). Interaction Design: Beyond Human-Computer Interaction, 2nd edn, Chapters 14-15. John Wiley & Sons, Ltd.

Nielsen, J. How to Conduct a Heuristic Evaluation. http://www.useit.com/papers/heuristic/heuristic_evaluation.html

TASK 2: WEBSITE EVALUATION INFORMATION

Website: http://www.iuctt.edu.my/

Website Owner: International University College of Twintech Technology

Date of Evaluation: 19th November 2010

EVALUATION METHOD : Heuristic Evaluation Model (J. Nielsen, 1994)

EVALUATION INSTRUMENT: Heuristic Evaluation Inspection Check List (see Attachment 1)

The ten (10) Heuristics are:

Visibility of system status

Match between system and the real world

User Control and Freedom

Consistency and standards

Error prevention

Recognition rather than recall

Flexibility and efficiency of use

Aesthetic and minimalist design

Help users recognize, diagnose, and recover from errors

Help and documentation

Five (5) Usability Problems for the User

Homepage Layout

The Problems: The content on the Homepage is not conveniently arranged. Large, actively alternating photographs sit at the top-right of the page, leaving a big white space on the left, which is a waste of website space and makes the upper part of the Homepage look misaligned. Active items such as the 'Twintech Yemen Branch' thumbnail, and the hypertext links for the other branches in Sabah, Kota Bharu, and Sri Damansara, are not visibly located. Even the 'Online Application' flag is not visible when the website opens. Surprisingly, 'Online Application' does not function: when clicked, it leads nowhere, as if it were an inactive flag.

Usability Problems for the User: Poorly arranged content may put off a user, who will have difficulty searching for the required items or pages. Icons that are not visible waste the user's time in searching for the right page for more information or guidance. For example, a user has to search for the other branches somewhere down the page, or go to the next page to look for Twintech branches. There is also poor feedback for basic information: when the 'online application form' icon was clicked, there was no feedback. Instead of displaying the requested online application form, nothing was shown. This can cause potential college applicants to abandon the screen; a non-functioning Online Application is a put-off. The single-level menu also lengthens search times, which affects the flexibility and ease of navigating or controlling the website content. It would be more convenient if a user could select content items directly from the main menu.

"Twintech Yemen Branch" icon is poorly located. It would be more visible if it is located at the top page of the Homepage instead of at the bottom of the page. Likewise, active hypertext or icons for other branches in Malaysia are not placed on the Homepage, instead a user has to look for the link for other branches in Kota Bharu and Sarawak, Malaysia. They are located under the `Students' menu!

Information about the Convocation is very redundant: on the Homepage, convocation information appears in three places. It would be less confusing if the same information were placed under one banner. As it is, a user needs to scroll further down or go to other pages to search for it.

The heuristics not conformed:

Aesthetic and Minimalist Design: Attention was not given to a Homepage layout that would give visual impact to the user. Basic content items should be visibly arranged and attractive. The scattered arrangement of active links to Twintech branches and the dead 'Online Application' flag will discourage a user from continuing to browse. In addition, the arrangement of big photographs at the top of the page creates large white spaces, and the inactive, caption-less thumbnail pictures placed randomly in the pages spoil the attractiveness of the website.

Flexibility and Efficiency of Use: In relation to the above comments, the poor arrangement of active icons limits efficiency in accessing the other pages of the website.

Consistency and Standards: The many forms of text, whether titles, subtitles, paragraphs, captions, or font sizes, show inconsistency and an oversight of basic standards. Even within a single page, a user will find low-contrast title or subtitle lines and faded text that can be mistaken for hypertext links.

The Menu and Search Function (General Navigation Problems):

The Problems: The Menu, located on the upper left of the Homepage, consists of five menu items: Home; Student; Programmes of Studies; About Us; and Contact Us. However, the Menu was not designed with a pull-down feature, which would let the user make selections more quickly, especially under 'Programmes of Studies', 'Student', or 'About Us'.

Usability Problem for the User: If a user wants to access a specific programme, e.g. Biotechnology, the user has to go to the Programmes page and then select 'Biotechnology' from the list there. With a multilevel menu, the user could reach the 'Biotechnology' page with just one click. A menu of only five main items also slows access to the rest of the content. Related to this, the menu does not contain adequate basic content: for example, under the 'About Us' menu there is no information about the college's organizational structure, even though it was established as a University College about 12 years ago. Even when a user clicks 'Contact Us' under the 'About Us' menu, the relevant information is not there.

The Heuristics Not Conformed:

User Control and Freedom: The single-level menu slows the user down; the user has to search 'manually' for information that may be hidden.

Flexibility and Ease of Use: Related to the control problem, the existing menu design limits the user's flexibility. Instead of selecting an item directly from the menu, the user has to go through several search steps.

Consistency and Standards: The menu design does not reach the normal standard of sophistication. Simply put, a 'philosophy of standards' should pay attention to the way an interface pleases the user.

Non-Functioning 'Online Application' Flag and 'Search' Button

The Problems: These two problems are of the same kind: both buttons lead nowhere. The 'Online Application' flag is placed on every page of this website, but when a user clicks it to get the application form, nothing is displayed; in fact, it takes about two minutes of loading time just to return a 'Connection Timed Out' message. Several attempts gave the same result. The website's 'Search' button behaves likewise: it does not function as expected, and the search box is ineffective. There is no feedback after entering a query; the page is merely refreshed. For example, on the Student page, when the word 'biotechnology' was typed into the Search box, the site displayed the Homepage instead of the Biotechnology page. Similarly, clicking the active Convocation picture does not lead to the convocation information page, but back to the Homepage.
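A finding like the timed-out 'Online Application' button can also be verified programmatically, outside the heuristic inspection itself. Below is a minimal link-check sketch using only Python's standard library; the URLs are purely hypothetical, and the injectable `fetch` parameter is our own device so the classification logic can be exercised without a network.

```python
import urllib.request
from urllib.error import URLError

def check_link(url, timeout=10.0, fetch=None):
    """Classify a link as 'ok', 'timeout', or 'error'.

    `fetch` is injectable for testing; by default it performs a real
    HTTP request and returns the response status code."""
    if fetch is None:
        def fetch(u):
            with urllib.request.urlopen(u, timeout=timeout) as resp:
                return resp.status
    try:
        status = fetch(url)
    except TimeoutError:   # server hung past the timeout
        return "timeout"
    except URLError:       # DNS failure, refused connection, etc.
        return "error"
    return "ok" if status < 400 else "error"

def hang(u):
    """Simulated endpoint that never answers, like the dead form above."""
    raise TimeoutError("no response within the timeout")

print(check_link("http://example.com/", fetch=lambda u: 200))    # ok
print(check_link("http://example.com/apply", fetch=hang))        # timeout
```

Running such a check over every link on a site before (or after) an inspection would catch dead buttons like these mechanically, leaving the evaluators free to judge the qualitative heuristics.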

Usability Problems for the User: The non-functioning 'Online Application' will put off the user or potential college applicant. The user may try a few times, but if the feedback remains zero, the session not only wastes the user's time but can also lead to distress. It took about three minutes of waiting for the Online Application to return the 'time expired' message, which is very long by interface efficiency standards. As for the non-performing Search box, it is a sheer waste of the user's time when the requested information is not provided; in fact, the search result was simply the Homepage again. The same feedback appeared when the 'Convocation' picture icon was clicked: the Homepage was displayed once more. This kind of feedback will undoubtedly frustrate a user and disrupts the smoothness of the navigation flow.

The Heuristics Not Conformed:

Visibility of System Status: For all three active items, the feedback was very unsatisfactory. The Online Application failed to produce the application page; the Search box failed to find the right information and instead brought the user back to the Homepage; and the active Convocation picture also led the user to the Homepage. The website did keep the user informed, but about something the user was not interested in.

Error Prevention: No error alert was flashed or signalled to the user, especially for the failure to display the online application form.

Flexibility and Efficiency of Use: As these functions were not working as expected, the website did not give the user the opportunity to browse or explore easily and freely to learn more about its content.

Consistency and Standards: From the results obtained, that is, the poor feedback, it is clear that the website had not been inspected or user-tested satisfactorily. The design of the site missed the basic interface standards for helping users have an enjoyable experience.

User Control and Freedom: The user could not move around freely, because the browsing process was hampered by active yet non-functioning icon buttons.

Quick Exit Problem and the Home Button

The Problems: There is no button for quickly exiting a page. A user has to use the standard Home button to leave a task, so a user who wants to exit a page quickly must scroll right down to the bottom of the page for the Home button, or use the browser's Back arrow. There is also no 'Home' hypertext link visible in the working area that would save the user from hunting for the Home button down the page (or using a 'Previous'/'Next' page approach). Even then it is disappointing, because it takes quite some time. It would be better if Home or Previous/Next page buttons were placed on the active page, in a more visible or easier-to-reach spot on the screen.

Usability Problem for the User: Since a quick exit is not built into the interface design, a user has to exit either by clicking the Home button, which is far down the page, or by using the Back arrow at the upper left of the window. For example, when 'Staff Webmail' is clicked, an authentication window prompts the user for a User Name and Password. If the user decides not to carry on and wants to leave the page, the user has to use the window's Cancel button, because there is no quick exit icon. However, the moment Cancel is clicked, another webmail Log In window is displayed. From this page the user cannot exit quickly, because there is no Home button anywhere; the user has to click the browser's Back arrow to return to the Homepage. There was no 'emergency exit' on this particular page, which is quite inconvenient as it takes more steps to exit.

The heuristics not conformed:

User Control and Freedom: Under this heuristic, there should be an interface function serving as an 'emergency exit' to 'leave the unwanted state without having to go through an extended dialogue' (Nielsen, 1994). In the case above, a user has to take several other steps before exiting via the Back arrow.

Consistency and Standards: To be in sync with today's standards, a system should help the user exit with one keystroke instead of going through several steps.

Flexibility and Efficiency of Use: From the inspection, it is obvious that the interface designer overlooked the 'flexibility and efficiency of use' heuristic, which should facilitate both expert and novice users.

Text Quality: Low colour contrast, inconsistent font sizes, and a mix of upper and lower case in paragraphs and text.

The Problems: In general, the colour contrast of the text may distress a user. The very light shade of grey used for date lines, some subtitles, and some small paragraphs strains the user's eyes. In contrast, the blue 'Contact Us' box occupying almost the whole screen can surprise a user, and some smudgy letters in the address are not readable.

Usability Problem for the User: The low colour contrast, together with the small fonts, tends to strain the user's eyes. Faded text lines usually indicate hypertext, but on this site they do not, which may confuse a user and lead to disappointment and distress. Because the text is not visually clear, owing to its small fonts and faded grey lines, it is quite annoying; a user has to spend time focusing on the text just to read it. Neither the grey shades of colour nor the font-size contrast between titles and subtitles help the user see the lines or paragraphs clearly and conveniently. Furthermore, on the Faculty pages there is a mix of red and black paragraphs. By contrast, the 'Contact Us' page displays very bold, oversized fonts that are visually irritating, and with its blue background and black smudgy writing, a user unfamiliar with the language will find it difficult to read and understand the message.
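The low-contrast complaint can be made measurable: WCAG 2.0 defines a contrast ratio (L1 + 0.05) / (L2 + 0.05) over the relative luminance of the two colours, and requires at least 4.5:1 for normal body text at level AA. A sketch of that calculation, using hypothetical grey-on-white values rather than colours actually sampled from the site:

```python
def _channel(c: int) -> float:
    """Linearize an 8-bit sRGB channel per the WCAG 2.0 definition."""
    c = c / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def contrast_ratio(fg: tuple, bg: tuple) -> float:
    """WCAG contrast ratio between two sRGB colours (range 1:1 to 21:1).
    WCAG AA requires at least 4.5:1 for normal body text."""
    def luminance(rgb):
        r, g, b = (_channel(v) for v in rgb)
        return 0.2126 * r + 0.7152 * g + 0.0722 * b
    l1, l2 = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# A hypothetical pale grey (#BBBBBB) on white, roughly the kind of
# faded date line described above, falls well short of 4.5:1:
print(round(contrast_ratio((187, 187, 187), (255, 255, 255)), 2))
```

Sampling the site's actual grey text and running it through such a function would turn the "strains the eyes" judgement into a pass/fail number against the AA threshold.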

The heuristics not conformed:

Consistency and Standards: Using very faded grey text does not conform to the principle of contrast for effectively conveying a message. There is inconsistency in the choice of colours and shades used to convey certain contents of the website.

Aesthetic and Minimalist Design: Displaying text in a mix of colours and fonts reduces the aesthetic quality of the website. It can become an eyesore if a user stays long on the site.

SUMMARY OF THE EVALUATION OUTCOME

Based on the heuristic evaluation conducted, the problems discovered pertain mainly to Visibility of system status, User control and freedom, Consistency and standards, Flexibility and efficiency of use, and Aesthetic and minimalist design. The inspection could not obtain findings on the other heuristics, namely Match between system and the real world; Error prevention; Recognition rather than recall; Help users recognize, diagnose, and recover from errors; and Help and documentation. Perhaps this is due to the nature of the site, which is designed to give college information rather than to be a more interactive system. One aspect to highlight is the 'Online Application' box: as its page cannot be accessed, it is difficult to inspect whether the system has error prevention. So far, no error alerts were found.

The basic or major problems that IUCTT should rectify are the feedback system, the Homepage layout, the menu quality, the Home button, a quick exit button, and the overall text quality. The non-functioning 'Online Application' flag will put off potential applicants. Likewise, the failure of the Search button to fulfil a user's request can result in frustration, and displaying the Homepage instead of the expected page whenever a user clicks 'Read More' or another 'Twintech branches' link is very irritating. The appearance and layout of the Homepage, as well as the limited menu functions, may reduce the interface's efficiency and effectiveness. The layout of the Homepage and of the other pages is not visually attractive, and there are many white spaces throughout. As for the menu, there is no pop-up or pull-down menu, so the menu functions could be improved to ease navigation of the website's content: to be in sync with today's standards, it should be upgraded from a single level to multiple levels, and the number of menu items increased from five to seven or more where relevant. The 'Contact Us' page looks irritating due to the smudgy black bold font. For greater flexibility and ease of interaction, the system should incorporate easily accessible Home and quick exit buttons to increase the efficiency and effectiveness of navigation. The text of the whole website should be improved in terms of colour contrast, font sizes for titles, subtitles, dates, and hypertext, and the choice of upper and lower case. Judging from the text on the website, it can be inferred that this aspect has not been given attention; it needs to be improved with the aesthetic aspects in mind.

Other minor problems to be addressed are the non-functional, inactive thumbnail-like pictures, the arrangement of active thumbnails for greater visibility, and the layout of all the website pages. For further improvement, this website should be re-inspected heuristically, or evaluated by user testing, in the interest of its users.
