Tag: methodology

  • Rethinking user research for the social web

    While the Web has evolved from flat documents to being fluidly ambient, we’re still using user research methods from 1994. In this session, Dana presents 5 major issues confronting UXers working in the social web, challenges you to come up with creative solutions, and shares experiences from pioneering researchers. Watch the video of this talk from ConveyUX in 2013…

  • Crowd-sourced research: trusting a network of co-researchers

    In the fall of 2012, I seized the opportunity to do some research I’ve wanted to do for a long time. Millions of users would be available and motivated to take part. But I needed to figure out how to do a very large study in a short time. By large, I’m talking about reviewing…

  • Usability testing is broken: Rethinking user research for social interaction design

    How many of you have run usability tests that look like this: individual, one-hour sessions, in which the participant performs one or more tasks from a scenario that you and your team have come up with, on a prototype, using bogus or imaginary data. It’s a hypothetical situation for the user; sometimes, they’re even…

  • Researcher as director: scripts and stage direction

    For most teams, the moderator of user research sessions is the main researcher. Depending on the comfort level of the team, the moderator might be a different person from session to session in the same study. (I often will moderate the first few sessions of a study and then hand the moderating over to the…

  • Testing in the wild, seizing opportunity

    When I say “usability test,” you might think of something that looks like a psych experiment, without the electrodes (although I’m sure those are coming as teams think that measuring biometrics will help them understand users’ experiences). Anyway, you probably visualize a lab of some kind, with a user in one room and a researcher…

  • Tools for plotting a future course of design, checking progress

    “Let’s check this against the Nielsen guidelines for intranets,” she said. We were three quarters of the way through completing wireframes for a redesign. We had spent 4 months doing user research, card sorting, prototyping, iterating, and testing (a lot). At the time, going back to the Nielsen Norman Group guidelines seemed like a really…

  • What are you asking for when you ask for a heuristic evaluation?

    Every usability professional I know gets requests to do heuristic evaluations. But it isn’t always clear that the requester actually knows what is involved in doing a heuristic evaluation. Some clients who have asked me to do them have picked up the term “heuristic evaluation” somewhere but often are not clear on the details. Typically,…

  • Looking for love: Deciding what to observe for

    The team I was working with wanted to find out whether a prototype they had designed for a new intranet worked for users. Their new design was a radical change from the site that had been in place for five years and in use by 8,000 users. Going to this new design was a big…

  • Testing in the wild defined

    Lately I’ve been talking a lot about “usability testing in the wild.” There are a lot of people out there who make their livings as usability practitioners. Those people know that the conventional way to do usability testing is in a laboratory setting. If you have come to this blog from outside the world of…

  • Data collecting: Tips and tricks for taking notes

    A common mistake people make when they’re new to conducting usability tests is taking verbatim notes. Note taking for summative tests can be pretty straightforward. For those you should have benchmark data that you’re comparing against or at least clear success criteria. In that case, data collecting could (and probably should) be done mostly by the…