Observing the User Experience - A Practitioner's Guide to User Research

by Mike Kuniavsky

Elsevier Reference Monographs, 2003

ISBN: 9780080497563, 577 pages


CHAPTER 1 Typhoon: A Fable

Sometimes it takes a long time for something to be obvious: a shortcut in the neighborhood that you’ve known all of your life, a connection between two friends, the fact that your parents aren’t so bad. It can take a while for the unthinkable to seem clearly natural in retrospect.

So it is with Web sites and user research. For a long time in the short history of Web development, the concept of putting an unfinished product in front of customers was considered an unthinkable luxury or pointless redundancy. The concerns in Web design circles were about branding (“make sure the logo has a blue arc!”) or positioning (“we’re the amazon.com of bathroom cleaning products!”) or being first to market. Investigating and analyzing what users needed was not part of the budget. If a Web site or a product was vaguely usable, then that meant it was useful (and that it would be popular and profitable and whatever other positive outcomes the developers wanted from it). Asking users was irrelevant and likely to damage the brilliance of the design.

Recent history has clearly proved that model wrong. It’s not enough to be first to market with a blue arc and an online shopping cart. Now it’s necessary to have a product that’s actually desired by people, that fulfills their needs, and that they can actually use. That means user research. User research is the process of understanding the impact of design on an audience. Surveys, focus groups, and other forms of user research conducted before the design phase can make the difference between a Web site (or any designed product) that is useful, usable, and successful, and one that’s an unprofitable exercise in frustration for everyone involved.

Nowadays, it seems obvious that a product should be desired by its audience. But that wasn’t always the case. Let’s step back to the Web world of the mid-1990s, when perfectly smart and reasonable people (including me) couldn’t imagine designing a product that users wouldn’t like. Here’s what happens when you don’t think about the user.

The Short History of Typhoon


In the heady days of 1996, PointCast was king. A service that ingeniously transformed the mundane screen saver into a unique advertising-driven news and stock service, it was the wunderkind on the block. It attracted tens of thousands of users and landed its creators on the covers of all the industry magazines. Information coming to people, rather than people having to ask for it, was a brand-new concept and quickly acquired a buzzword summarizing it. It was push technology, and it was the future. Soon, everybody was on the push bandwagon, building a push service.

Let me tell you a fable about one company that was on the bandwagon. It’s based on a true story, but the details have been changed in the interest of storytelling (and to protect the innocent, of course). Let’s call the company Bengali. Bengali had several high-profile successes with online news and information services, and was now confident, ready, and eager to take on a new challenge. They wanted to create something entirely revolutionary—to challenge everyone’s assumptions about media and create the next television, radio, or printing press. They decided that their dream of creating a new medium through the Internet had its greatest chance for success in push.

It was going to be called Typhoon, and it would put PointCast to shame. Bengali went into skunkworks mode, developing Typhoon completely in-house, using the most talented individuals and releasing it only when it was completely ready.

Developing a PointCast killer takes a lot of work. The developers worked on the project in secret for a year, speaking about it to no one outside the company (and few inside). Starting with “How will the society of the future interact with its media?” the development team created a vision of the medium of the future. They questioned all of their assumptions about media. Each answer led to more questions, and each question required envisioning another facet of the future.

The software and the vision grew and mutated together. The final product was intricate, complex, and patched together in places, but after a year Bengali was ready to release it.

When it was ready to launch, it was shown to top company management. Although undeniably impressed with the magnitude of the achievement, the executives felt some apprehension. Some wondered who the audience was going to be. Others asked the team how people would use it. Although the team had answers for everything (over the year, they had developed a very thorough model of the program and how it was to be used), they admitted that they had to qualify most of their answers because the software had not been put in front of many end users. They were experienced developers, and they had done some in-house evaluation, so they figured that—if not right on—their design was pretty close to what their users would want. But, to placate the executives and check where the rough spots were, the developers decided to do some user research before launching it.

A dozen people were picked and invited for one-on-one user tests. They came in, one at a time, over the course of several days. The plan was to have them sit down, give some initial thoughts about the product, and then let them try a couple of different tasks with it.

The tests were a disaster. This is a portion of a verbatim transcript from one session.

USABILITY TEST SESSION TRANSCRIPT

If this is just graphics that are supposed to look that way it’s kind of confusing because you think that maybe it’s supposed to do something … I don’t know.

All of these words down here are barely legible.

If this is supposed to say something that I’m supposed to understand, I guess it’s very hard to figure what that stuff is.

None of these headlines are making any sense to me.

I don’t know what to make of these.

I know if I click on them they’ll do something but …

It’s not inspiring me to click on any of them so far.

Also, there’s no status bar so you’re not really sure when a page is finished being loaded.

I don’t know if these numbers have anything to do with that or not … I don’t know.

I hope that that’s a story that I’m going to follow the link to.

This must be [downloading over] a 28.8 [modem] I’m assuming.

It seems a little slow.

This doesn’t seem like what I was looking for.

I’m really curious about what these numbers are down here.

They may be nothing.

I just want to know what they’re about.

OK, I don’t really want to follow that …

I’m waiting for some sort of text to tell me what this is going to be about.

Since there’s nothing, I don’t know what’s it’s about.

I’m not even sure if the page is finished loading.

OK, there it is …

When I hold that down I would hope that it would stay there but it keeps going away.

Even without seeing what the user is talking about, the frustration and confusion are clear. When the test participants tried to use it, either they used it differently from how the developers intended or, when they used it as intended, it didn’t behave as they expected. From a usability standpoint, it was largely unusable.

As bad as this situation was, one point was even worse: none of the participants knew what Typhoon was. It became clear that people would never begin trying to work with it because they had no idea what it was for. Usability was beside the point because the product was incomprehensible.

The developers scrambled to fix the problems, to make Typhoon clearer, and to help people use it, but there was no clear direction for them to go in. The product launch was coming up fast; they managed to fix some of the most obvious problems, but many remained, and with their confidence shaken, they nervously suspected that many more existed.

When Typhoon launched, it was met with confusion. Neither the press nor its initial users knew what to do with it. It got significantly less traffic than Bengali had projected and, despite an aggressive advertising campaign, the number of users kept dwindling.

As the development team worked on the next revision of Typhoon, the direction in which to take it became less clear. Fundamental questions kept popping up. There was constant debate about scope, audience, purpose, and functionality. What had seemed certain suddenly seemed precarious. The executives were quickly losing confidence in the team’s ability to fix Typhoon. The debates continued. Their fixes failed to keep visitors from abandoning the product, and after a couple of months, the project leader was replaced with someone who had a more traditional software background. The new project leader tried to revamp Typhoon into a more ordinary news service, but the new specs made Typhoon look just like the company’s other products, which ran contrary to the very premise of the service. Requests for additional staff to implement deeper changes were denied. Opinions about how to attract visitors began to multiply and diverge. The team felt betrayed; they felt that their creativity had been wasted and that their good ideas had been thrown out by management.

Four months after the project launched, the...