Aug 09, 2011
 

The numbers look good, but my thinking could be debated

This post takes a lighthearted and self-deprecating look at analytics and metrics.

I will use myself as the first example (not the woman in the picture), because I recently caught myself in a surprising analytics story. I will also get more serious toward the end.

As many of you know, I took on a recent project to move my blog to self-hosted WordPress. One thing I learned in this move was how I had slowly become addicted to analytics:

  • How did Page Views (on wordpress.com) change when I posted various kinds of content, or posted at different times of the day?
  • How am I tracking day-to-day, week-to-week, month-to-month?
  • Are my numbers going up?

So here’s what happened

When I moved looseneurons.com, I lost the built-in analytics, so I switched to good old Google Analytics, thinking about how much more control I would have, how much more information I could gather, and what great new questions I could answer…

Almost immediately I noticed the new reports tracking Visitors as the key metric. The Page Views were still there, but the emphasis had shifted.

I learned some neat things, like the fact that on my young and growing blog, visitors view an average of 4 pages during each visit.

That’s cool… but my philosophy about what was measured and why it mattered hadn’t shifted yet. I mentioned to a friend that looking at a Visitor count that was 1/4 the size of my earlier Page Views was playing on my psychology more than I would have expected.
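To make the arithmetic concrete, here is a minimal sketch with made-up numbers (not my actual stats). It only shows why the same traffic suddenly reads as "one quarter the size" when the headline metric switches from Page Views to Visits:

    # Minimal sketch, hypothetical numbers: the same month of traffic,
    # summarized two different ways.
    page_views = 1200   # the headline number the wordpress.com stats emphasized
    visits = 300        # the headline number Google Analytics emphasizes

    pages_per_visit = page_views / visits
    print(f"Pages per visit: {pages_per_visit:.1f}")                      # 4.0
    print(f"Visits as a share of page views: {visits / page_views:.0%}")  # 25%

Nothing about the underlying traffic changed; only the number I was staring at did.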

I was totally unprepared. And that was a great event to notice!

By the way, his response to me: “If you don’t like what GA is telling you, turn it off.”

Hmm. Perfect. Thanks for the help, Dave.

Turning the situation into action

So I wondered:

  1. Am I the only one this happens to, or does it happen to my customers?
  2. Do metrics that are weak, missing, flawed or incomplete nevertheless build some emotional connection in people (besides me)?
  3. When introducing new metrics, how long does it take to build new thinking at more than a superficial level, where metrics take on meaning, relevance and value beyond simple numeric signals of better or worse?

Have you ever thought about other people’s need for a philosophical shift when introducing new metrics? Do you have examples of this at work?

How do you act in this kind of situation, whether as a customer who became self-aware or as an analyst? It seems to me like more than just a training issue.

What specific actions do you take when metrics change, to ease their adoption and help them take on meaning quickly?

Getting to fundamentals

As I consider this, I notice that I DO pay attention to shifts in metrics when they involve incentive compensation (e.g., designing a commission plan for sales engineers).

What tripped me up in this case was that the metric I was looking at (Page Views or Visitors) is about operational performance… which at a larger scale drives enterprise value and the ultimate incentive for everyone employed!

Now, I work in web sites and metrics all day, every day, but the blog numbers I was looking at had taken on meaning that I confess was not completely rational.

When given “better” numbers, I continued to press them into my old mold of reasoning and measuring, even though I could rationally understand the reasons things appeared to change.

(Incidentally, I am not serious about turning off GA. I have way too many numbers I can fixate on now!)

An example in software

It's the tests that matter – code coverage is a result of good tests!

Many of you also know TDD is a software development practice I truly love. One of the greatest teams I have worked with built an automated test harness that exercised 100% of the code for an automated call center back in 2001.

That “code coverage” metric became so well-known by the people in and around that team… that many of them set the metric (lowered to an 80% threshold) as a “target” against which future projects would be measured.

However, we had never set out intending to produce “100% code coverage” when we wrote the software – it had been an ancillary outcome to our intention of writing an awesome product.

Over the ensuing years, I have seen the code coverage threshold blind development teams and management to poorly written software and fragile test harnesses… even though those tests do exercise a lot of the code.
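To show what I mean, here is a small hypothetical Python sketch (not code from the 2001 project). The test executes every line of the function, so a coverage report shows 100% for the module, yet it never asserts on the prices that come back, so the pricing bug survives:

    # discount.py -- hypothetical example for illustration only
    def apply_discount(price, customer_type):
        """Return the discounted price for a customer."""
        if customer_type == "vip":
            return price * 0.8
        return price * 0.95   # bug: regular customers should pay full price

    # test_discount.py -- executes every line of apply_discount, so coverage
    # reports 100%, but nothing checks the returned values, so the bug above
    # goes unnoticed.
    def test_apply_discount_runs():
        apply_discount(100, "vip")
        apply_discount(100, "regular")

A team can then wire a threshold into the build with something like pytest-cov's --cov-fail-under=80 flag, but that only verifies which lines were executed, not whether the tests actually checked anything.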

It has been a personal struggle of mine to shift management, team and developer philosophy since.

(If you are curious about some other lighthearted posts I have written, check out Post-it Notes, The IT Super-Genius, Corvettes and Google+, Flattened by a Panda and Serendipity and Social Media. They help me channel my Loose Neurons creatively, and like this one they also carry a note of importance.)

  4 Responses to “I swear I’m gonna turn off Google Analytics…”

  1. Ah, you’re playing in an arena where I’ve spent years now, Mr. Faw. Analytics are a fascinating thing, but I believe you hit the most important part about analytics – metrics invoke action, or more importantly “should” invoke action.

    Analytics that does not drive us to understand something or react to something isn’t really analytics…it’s just data. The analysis is in turning it into something measurable and quantifiable to provoke a change. Incidentally, that action sometimes might just be nothing. If the measurements are ideal or better than expected, then things just might be working, but it has still prompted a review of the data against whatever prompted its creation.

    In work practice, I do this all the time (as a good analyst should, I suppose). Many times I will engage in a discussion with someone who simply does not know the facts or even just refuses to admit to the facts. Compiling a meaningful metric and presenting it can be one of the best ways to “leave them speechless” and prompt needed steps to be taken.

    • Actually, Moose, I had you in mind when I wrote this… not as the “bad” example I played, but as the analyst that I know you are. Catching myself in the act was really great.

      Thanks for your observations.

      • It does raise an interesting question, though, concerning the usage of analytics. Some are more cut and dried than others. Sales are up or down, we deep-dive to figure out why or what contributes, and the business adjusts accordingly. But when we look at things like analytics on a blog, what are the actions? It’s an interesting thought. I know you have done some exhaustive tests to measure various trends, but it can be tough to identify tangible actions sometimes, depending on what we are measuring.

  2. In the LinkedIn group Lean IT Enterprise, Richard Askew wrote:

    Thanks to Don Wheeler, I use a general framework that is often called an operational definition. The framework is composed of three questions:

    (1) What do you want to accomplish? (A clearly stated objective is needed for focus, but it’s not sufficient.)

    (2) By what method will you accomplish your objective? (A method for accomplishing the objective is necessary, but it’s still not sufficient.)

    (3) How will you know when you’ve accomplished your objective? (We need a way to measure our movement toward the objective.) A good metric is developed from an objective and a method for achieving it.

    This is means-based improvement, rather than results-based. Our perspective shifts to the process as the means to achieve the objective.

    Don Wheeler writes about this in his book, ‘Twenty Things You Need to Know’ (2009, SPC Press).
