Enjoy some lighthearted fun on a holiday!
Some of us are putting together an informal, free half-day conference focused on what’s new and innovative in user research methods. The idea is to find out what cool ideas and methods others have been trying out, and to share what we have been up to ourselves. We chose a Friday afternoon to make it easy for you to come by. The location is the Bolt & Peters office in downtown San Francisco. The setting will be informal (there will be food and drinks!), followed by a cocktail hour at 5:30 PM.
Talks are 20 minutes long and will cover a wide range of topics related to trends in User Research. Speakers include Wendy Castleman from Intuit, Ravit Lichtenberg from HP, Lane Becker from Adaptive Path and more. Go to the User Research Wiki if you want to learn more or want to sign up.
It should be a fun afternoon. Look forward to seeing you there!
Nate Bolt from Ethnio (a truly web-based usability tool) and I will be giving two talks on remote user research this week. The first talk is at Adaptive Path’s User Experience Week; the second is at Philadelphia CHI. You can find out more on the MindCanvas blog.
In the meantime, we have exciting new developments with MindCanvas. I will blog it as soon as I have a moment free!
I have long argued that the difference between qualitative and quantitative research is more about what a researcher does with the method & data than about the method itself. One key difference is the amount of structure in gathering data. Open-ended methods such as interviews and observation are unstructured ways of gathering data. At the other extreme, surveys and the like are closed-ended: respondents can only choose from a few given options.
In the middle are semi-structured methods like card-sorting and freelisting, which can be used for either qualitative or quantitative research.
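To make the distinction concrete, here is a minimal sketch (with made-up data and item names) of how a semi-structured method like freelisting yields data you can read either way: the raw lists themselves support qualitative interpretation, while a simple frequency tally turns the same data quantitative.

```python
from collections import Counter

# Hypothetical freelisting responses: each participant lists whatever
# "kitchen items" come to mind, in their own words and order.
responses = [
    ["knife", "fork", "spoon", "blender"],
    ["spoon", "knife", "plate"],
    ["blender", "knife", "cup", "spoon"],
]

# Quantitative reading: item salience via frequency across participants.
frequency = Counter(item for person in responses for item in person)
print(frequency.most_common(3))
# e.g. [('knife', 3), ('spoon', 3), ('blender', 2)]
```

The same `responses` structure could instead be read qualitatively, looking at the order items come to mind or the vocabulary participants choose.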
This question has cropped up a few times. The simple answer is: MindCanvas is a service that depends on a number of tools. Some we built, and there are others we use to run the service. It’s what market research companies call full-service research – we understand your design question, we collaboratively create the study (using our templates etc.), and we have contacts with panel companies if you need a specific type of sample… Once data gathering is complete, you get visualizations and all the data within 1-2 days.
One reason it’s exciting to finally be out with MindCanvas is the feedback you receive. For a while now, MindCanvas was something we talked about with friends, but not in a public forum (I think it’s good to release early, but not so early that your design vision isn’t yet communicated).
One of the first pieces of feedback we have received is a query about MindCanvas as a never-ending game linked to from someone’s website or software. This is a model we have talked about on and off (and it’s interesting to see it come up so early in discussions).
For the past year, if someone asked me what I’ve been up to, I told them consulting – which is the truth, but only half the truth. The other half, which has kept me up at night (both literally and figuratively), is MindCanvas. No, don’t click the URL yet – first hear the story.
We started working on it a year ago, but I have been dreaming of it for a long time. It probably started with seeing the boredom in the eyes of my research subjects (sorry – participants) in graduate school. And it continued with consulting, with every computer-based study I ever did. We do qualitative research as well, but truth be told, everyone resorts to surveys at some time or other. The idea of basing product decisions on bored respondents robotically checking HTML boxes and choosing from dropdowns really bothers me.
So, what is the solution? A better survey application with some AJAX peppered in? No, we wanted to go far beyond that. Our goal was to reimagine what online research can be. MindCanvas was the answer. (A side note about the name: it satisfies my geeky desire for it to be something about the mind. The words “brain” & “neuro” were summarily rejected by the rest of the team!)
MindCanvas is a research service to help companies gather insights about customers’ thoughts & feelings. We use Game-like Elicitation Methods (GEMs) to let online users participate in answering the complex questions that you face in designing a product or service.
This thought came back to me again and again during the DUX conference that I just got back from. Many speakers told us about the “ethnographic research” they conducted. Sometimes they shared some video of their observations – of children playing, or people in their homes, sitting on a chair, or watching TV. And the audience would watch delightedly – look at that, it’s people! People playing, laughing, sitting, walking… It all seemed very rosy – “we observed some people, maybe for a few hours, maybe we lived with them for a week or two – they still send us postcards, the dears. And at the end of it, we had the Aha moment, when it all fell into place. And the product was born.” And everyone lived happily ever after.
For the next BayCHI panel we are focusing on User Research – the initial understanding of the user, their needs, mental models, preferences, and the usage context – leading up to product conceptualization.
We have a great set of panelists who work in User Research at four Bay Area companies. Their experiences cover a broad range of products and markets. Panelists are Klaus Kaasgaard, Senior Director of User Experience at Yahoo!; Christian Rohrer, Director of User Experience Research at eBay; Sheryl Ehrlich, Senior User Research Manager at Adobe Systems; and Kaaren Hanson, Director of User Experience at Intuit. I will be moderating this panel.
Broadly I agree – site structure cannot be final until you consider page layout and other aspects of the design. Card-sorting results are merely suggestions. You need to add in other design and business considerations.
But the problems with creating structures based on card-sorting, mentioned in the article, are not really problems with card-sorting. They are problems with a half-baked understanding or usage of the technique. For example, the article mentions that browser pages cannot accommodate too many top-level headings, long titles, etc., and how this impacts structural decisions. But these and other issues can easily be handled with good card-sorting practices and better analysis.
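As an illustration of what “better analysis” can mean in practice, here is a minimal sketch (with made-up cards and groupings, not data from the article) of the usual first step: building a co-occurrence matrix that counts how often participants sorted each pair of cards into the same group. Structures are then derived from these pairwise similarities, typically via clustering, rather than read directly off any one participant’s piles.

```python
from itertools import combinations

# Hypothetical card-sort results: each participant's grouping of four cards,
# expressed as a list of sets (one set per pile).
sorts = [
    [{"cats", "dogs"}, {"cars", "bikes"}],
    [{"cats", "dogs", "bikes"}, {"cars"}],
    [{"cats", "dogs"}, {"cars", "bikes"}],
]

cards = ["cats", "dogs", "cars", "bikes"]

# Count, for every pair of cards, how many participants put them together.
cooc = {pair: 0 for pair in combinations(cards, 2)}
for participant in sorts:
    for group in participant:
        for a, b in combinations(cards, 2):
            if a in group and b in group:
                cooc[(a, b)] += 1

# "cats" and "dogs" were grouped together by all 3 participants.
print(cooc[("cats", "dogs")])  # 3
```

Hierarchical clustering over this matrix suggests candidate groupings; the top-level headings and labels are then a separate design decision, which is where the layout constraints the article raises come in.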