
May 26, 2009: The user research dashboard

Just a few partly-formed thoughts about reports (the kind that make sense of user research of both the web analytics and user experience varieties) and how they might be integrated:

  • Reports should be built around questions. Really, reports should be answers to questions. Of all the cool dataviz work that Jeff Veen and his team did for Google Analytics, bringing the questions to the fore was their most impressive and useful achievement.
  • But even well-designed reports are of limited value if they're just reports of one variety. If you think analytics app when you hear the word "report," you're not getting the full picture. Ditto if you think only of the reports that come out of your user studies. And so on.
  • What we really need are ways to integrate reports from these sources and more. Envision a dashboard that provides access to your analytics reports, help desk log reports, task analysis testing reports, reports from surveys, content inventory reports, the whole array of stuff regardless of whether it's quantitative or qualitative.
  • But a bunch of reports all in one place, while convenient, isn't especially meaningful. We might be able to enable synthesis and, ultimately, derive meaning from our reports if we connect them in logical ways. How? Well, you might have a quantitative report that examines some sort of behavioral data to establish what's going on with your site, identifying interesting questions to follow up on along the way. You might use a qualitative user study to answer some of those questions. Solution: present them together in a way that shows the relationship between the questions from one and the answers from the other.
  • Time fracks up stuff like this. The quantitative report might be from last quarter, while the related qualitative data might come from tests run many months later. And both types might be updated frequently, making it hard to capture a snapshot in which the two are meaningfully connected.
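To make the linking idea a bit more concrete, here's a rough sketch of what the underlying data model might look like: a quantitative report carries the questions it raises, and a qualitative study attaches findings back to those questions. All the names and the example data here are hypothetical, just an illustration of the "what => why" connection, not a real dashboard design.

```python
from dataclasses import dataclass, field

# Hypothetical minimal model: quantitative reports surface questions
# ("what"), and qualitative studies attach answers to them ("why").

@dataclass
class Question:
    text: str
    answers: list = field(default_factory=list)  # filled in by qualitative work

@dataclass
class Report:
    title: str
    kind: str  # "quantitative" or "qualitative"
    questions: list = field(default_factory=list)

def link(question, qualitative_report, finding):
    """Connect a finding from a qualitative study back to the
    quantitative question that prompted it."""
    question.answers.append((qualitative_report.title, finding))

# Example: an analytics report raises a question; a usability test answers it.
analytics = Report("Q1 site analytics", "quantitative")
q = Question("Why do so many visitors abandon the checkout page?")
analytics.questions.append(q)

usability = Report("Checkout usability test", "qualitative")
link(q, usability, "Users missed the continue button below the fold.")

# A dashboard view would render each question next to its answers:
for question in analytics.questions:
    print(question.text)
    for source, finding in question.answers:
        print(f"  -> [{source}] {finding}")
```

Even a toy structure like this makes the time problem visible: each (question, answer) pair could be stamped with the date range of the reports it came from, so the dashboard shows not just the connection but how stale it is.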

Make sense? Nah, I didn't think so; it's late and I'm tired; just wanted to get these thoughts down, hoping that they'll make sense in the morning. Hoping that some brilliant visual thinker like Veen or Dave Gray stumbles upon this posting, makes sense of it, and has it all worked out by the time I wake up...


Comment: Edward Vielmetti (May 27, 2009)

Lou - to overgeneralize, there are two kinds of reports: one set that you generate identically every reporting period to show some kind of continuous adherence to a standard, and another that is always digging a little deeper, or a little differently, into a huge pile of data.

You'd want to give analysts tools for crafting a standard set of reports for whatever you're doing, so that their interns can answer the easy questions just as easily as they can. The report might even be condensed down to a simple yes/no checkbox once you've made the whole process completely routine.

More interesting, more frustrating (and more expensive) is exploratory analytics, where the next question doesn't come up until after you've seen the report and you want to tease something apart. These are expensive enough that, in my experience, it's rare to do them in depth except at major decision time: the kind of background work that justifies a campaign's worth of effort to get the next campaign sold.

I find myself looking at Google Analytics and asking myself the same questions every time, but not being able to pull from a library of good reports to answer them. Things like "traffic spiked this day; wtf happened, and do I want more like this, or was it transient?" You can do all the charting and reporting you want, but until you can tell that story as a use case and a narrative, the numbers are just numbers.

Comment: Lou Rosenfeld (May 27, 2009)

Ed, that's where qualitative research is really worth the investment. The quantitative data told you "what" and made you ask "why?". The qualitative stuff answers "why". (Shameless plug: that's what my recent talk is all about: http://www.slideshare.net/lrosenfeld/marrying-web-analytics-and-user-experience ) In any case, what I'm saying is that I'd like a dashboard that not only shows all sorts of reports, but makes those connections (what => why) visually. Now that would be cool.

Comment: Carol Smith (May 31, 2009)

I have recently come upon a need for something very similar. I'm analyzing a variety of data for a start-up, and I would love to be able to show it to stakeholders in a "dashboard" view.

I want a tool I can update with data from a variety of sources that keeps some history and is flexible (date ranges, depth of detail, etc.). Data sources would include combinations of user searches, analytics data (popular pages, visitors, etc.), survey results and anything else I want.

From what I've found so far it will need to be custom made - if you find a solution please share!


Comments are now closed for this entry.

Comment spam has forced me to close comment functionality for older entries. However, if you have something vital to add concerning this entry (or its associated comments), please email your sage insights to me (lou [at] louisrosenfeld dot com). I'll make sure your comments are added to the conversation. Sorry for the inconvenience.