n0d3.org

Netflix DVD scratches

Two out of the last three DVDs I got from Netflix were scratched in such a way that a key scene was not watchable.

I went to call them to see what is up. They give you a code on the site to enter on the phone.

Who wants to bet that the code is a primary key linking the web analytics cookie and the phone system? Clever.

Changing Epistemologies for the Data-Driven: Thoughts post-The Browser interview

After talking with Jonathan on his radio show The Browser I spent a lot of time thinking about epistemology. I first learned that 50-cent word in college, in a music industry class. We were reading Neil Postman’s Amusing Ourselves to Death.

I remember the opening, in which Postman says that much time had been spent worrying about the dystopian future presented by an all-knowing government entity in Orwell’s 1984 when in fact we were much closer to Huxley’s Brave New World. The rest of the book goes on about how the kind of thinking that is valued by a culture whose primary medium is television is different from the kind of thinking valued by a culture whose primary medium is the written word (keep in mind that Amusing Ourselves to Death was published in 1985–we aren’t talking about the internet, blogs or Twitter–the newspaper or magazine article might be the shortest unit of writing used for transmitting knowledge).

Some of the basics of technology and the transmission of ideas/knowledge that Postman discusses are like so:

  • Oral culture: values memory capabilities
  • Written culture: technology displaces memory and promotes linear thinking
  • Television culture: displaces linear thinking and promotes meta-discourse (which brings to mind Jon Stewart’s interview with Jim Cramer, forward to around 1:28)

It’s worthwhile to note that none of these technology/epistemology pairs completely obliterates the previous entry. Memory is still useful, just not as useful in a written culture, for example.

A culture in which data is the primary technology of knowledge transfer

What I’ve been mulling over, since being interviewed for the show, is a data-driven culture. I don’t foresee an immediate future in which everyone is numbers-literate or aware of the meaning behind various analytics metrics any more than I envision a future in which everyone is aware of how television is effective as a communication medium or a future in which everyone knows how to read, write and produce a well-reasoned essay.

But I do envision a future in which those who are data-literate begin to wield a distinct advantage over those who are not. How many lyric poets can be replaced by The Marble Threshing Floor: A Collection of Greek Folksongs? How will the advantage distribute itself? What sort of conversations will organizations and individuals have should a data-literate advantage become apparent? Will anyone miss the individual voice of the poet? The visual narrative of silent film? The letterpress version of Gill Sans?

Rate of change

I’ve been going down this rabbit hole of “change will increasingly be the norm” for a while, primarily as a result of reading and agreeing with a couple of HBR articles about change. I do think that we’re experiencing an exponential growth curve in the turbulence of information/knowledge generation. We’re trying to assemble meaning around all of this.

I think maybe it’s like when you’re playing Katamari Damacy, growing a star. For a while, children’s toys are pretty big, but slowly the star gets bigger. Almost without noticing, those children’s toys start to seem smaller and smaller. The sense of scale changes and adjusts to the size of the objective (the star) and the size of the objects of your attention (the things you’re rolling up into your star). Before you know it, ocean liners are seemingly small objects. In the game, every now and then there’s a noticeable moment when the perspective really does change to allow a larger perspective to encompass larger objects.

Perhaps, with regard to the distribution of a data culture, we are in one of those noticeable moments when the perspective changes.

5 Analytics Tools for Twitter

Pre-preamble: Please note that I will be presenting “Measuring Social Media” at the May 27, 2009 Burlington, Vermont Web Analytics Wednesday and will be making a more up-to-date post on similar material following that event (between now and that time, feel free to help me out with resources and questions in the comments).

I was recently asked for my top five to ten Twitter analytics tools. And only the free ones no less. Elaine does a lot to help keep the town I work in well-stocked with bright, ambitious internet marketing interns, so I’ll see what I can do.

Preamble

You knew I wouldn’t be able to keep this brief, right? I use blog posts as a way to refine my own thoughts about things, which means I can ramble on and on. If all you want is your quick point-by-point list of five great tools for measuring Twitter, you’ll probably be better off at one of the articles listed in the “Measuring Twitter” section of the Twitter for Business resource. If you enjoy my rambling, do please continue (and help me refine my thinking).

Turns out that it really is about you. And to a frightening degree.

Before we get into any sort of discussion at all about analytics tools, it’s very very important that we understand why we’re measuring anything. How we measure will affect how we use the tool. Sort of like quantum physics for everyday life.

Twitter is a perfect example. There are many folks who get very very excited by the number of followers they have on Twitter; it becomes their key performance indicator. As a result, these folks do all sorts of shady things to increase their follower count. But an artificially inflated follower count on Twitter is unlikely to increase their engagement with consumers. Their chosen metric, taken by itself, encourages non-productive behavior. As an aside, these marketers are using the same key performance indicator (KPI) as the thriving newspaper and broadcast television industries: potential impressions.

So let’s be clear about something, right away: what we choose as our key performance indicators says as much about us as it does about what we’re measuring.

Analytics in context

Now that we have that out of the way, most of my tools and suggestions are going to relate directly to the Reach/Acquisition/Engagement/Conversion/Satisfaction consumer life-cycle model that Justin Cutroni was kind enough to school me on. There’s lots of writing already on this model, probably most relevant would be in Web Analytics Demystified, but I’ll give you a thumbnail sketch just in case you’re new to all this measurement stuff:

  1. Reach: Before anyone can take action on your message, they need to be exposed to your message. That’s reach. In old media it is measured in “impressions,” aka the number of people who could have possibly seen your message.
  2. Acquisition: These people “heard” your message and took some action of some sort, signifying that your message had some sort of relevance to their needs.
  3. Engagement: People are using/interacting with/reading/watching/playing with your message.
  4. Conversion: People are confirming that your message is relevant to them through tangible, concrete actions (in business this would result, most likely, in someone giving you money or permission to try to sell them something in person).
  5. Satisfaction: It was as good for them as it was for you.

Analytics and action

I’ll be the first to admit that it can be entertaining to watch numbers move and shift in an analytics package. But if you aren’t taking action on those numbers, then it’s just geeky entertainment. You’d be better off going out to see a movie or hanging out with your friends.

Another model I’m fond of has to do with tactical decision-making and action. It’s called the OODA loop. It has lots of other nicey nicey names like “Continuous Improvement Process” and so on. But it was initially invented as a way for fighter pilots to dominate their opposition. The same guy who came up with the OODA loop (John Boyd) was also a driving force behind the design of the F-16. OODA, as you might have guessed, is an acronym:

  1. Observe: Lots of room for analytics here. Gather up that data.
  2. Orient: There’s room for analytics here as well, put that data in context.
  3. Decide: The end result of all that hard work should be a decision. The data presentation options provided by your analytics package should, ideally, help your decision-makers (who may or may not prefer to look at pretty pictures, spreadsheet-like tables or both in making their decisions).
  4. Act: This is where you begin to test the decision made in the previous step.

Once you get to step 4 you go back to step 1 and just keep repeating. The theory with OODA is that whoever can move through the OODA loop fastest will overwhelm their competition.
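The loop itself can be sketched as a trivial bit of control flow. The four phase functions below are hypothetical placeholders, not anything from Boyd; they stand in for whatever observation, analysis, decision-making and action your organization actually does:

```python
# A schematic sketch of the OODA loop: each cycle feeds the result of one
# phase into the next, and acting loops straight back into observing.
def run_ooda(cycles, observe, orient, decide, act):
    """Run Observe -> Orient -> Decide -> Act for a fixed number of cycles."""
    outcome = None
    for _ in range(cycles):
        data = observe()            # 1. Observe: gather up that data
        context = orient(data)      # 2. Orient: put the data in context
        decision = decide(context)  # 3. Decide: turn context into a decision
        outcome = act(decision)     # 4. Act: test the decision, then loop
    return outcome
```

The competitive point is cycles per unit of time, not the code: whoever iterates fastest wins.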

Ummm, weren’t you going to mention something about Twitter and analytics?

Ok, I’m getting real close now. Twitter is a social media service in which you can spread your message(s) to potential customers. If awareness of the consumer life-cycle is built into your use of Twitter then you will have an easier time applying consumer life-cycle analytics to the medium as well.

Twitter is also a medium in which you can “listen” in various ways. As a result, what you learn via Twitter directly and through the application of analytics can help inform your decision-making process (especially during the Observe and Orient stages of the OODA loop).

5 Tools for the Analysis of your Awesome Twitter Life

I will present my five favorite Twitter analytics tools of the moment, along with what portion of the customer life cycle I use said tools to measure. I’ll also make some profound statement about OODA loops and said tool.

Here we go, five free Twitter analytics tools in no particular order:

The Advanced Twitter Search: Measuring Voice-of-Customer

Twitter is all about the content and this seemingly simple and mundane feature of Twitter gets you right to the content you need in order to develop meaningful insights. This tool used to be called Summize before Twitter bought it and folded it into the main app. Now it’s just Twitter Search.

You can search by date-range, geography or account. You can search for attitudinal signifiers (positive, negative, question). You can search by key word, phrase or topic.

To get scalable, repeatable results you’ll want to dip into the API and start piping Twitter advanced search data into your own database, but even without going that far you should be able to make some good use of the Twitter search for your analytical purposes.
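As a sketch of that API-piping idea, here is a minimal example that parses a search response into a local SQLite table. The JSON shape assumed here (a `results` list whose items carry `from_user`, `text` and `created_at`) matches the 2009-era Twitter Search API; treat the field names as assumptions and adjust them to whatever your source actually returns:

```python
# Minimal sketch: parse a search-API JSON response and append each tweet
# to a local SQLite table for repeatable, scalable analysis.
import json
import sqlite3

def store_results(raw_json, conn):
    """Parse a search-API response and append each tweet to a local table.

    Returns the number of tweets stored."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS tweets (user TEXT, text TEXT, created_at TEXT)"
    )
    results = json.loads(raw_json).get("results", [])
    conn.executemany(
        "INSERT INTO tweets VALUES (?, ?, ?)",
        [(r["from_user"], r["text"], r["created_at"]) for r in results],
    )
    conn.commit()
    return len(results)
```

From there, ordinary SQL gives you repeatable queries over time, topic and author.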

Using Twitter Search to measure the customer life cycle:

  • Reach: Identify your market by identifying users talking about your brand/topic.
  • Engagement: Gather data on conversations about your brand/topic. Gather data on individuals directly addressing you or your brand on Twitter.
  • Satisfaction: Gather data on attitudes related to your brand/topic.

Deploying Twitter Search in your OODA loop:

  • Observe: Who is talking? What are they saying?
  • Orient: What else are they talking about? When do they say it?
  • Decide: If providing actual voice-of-customer data will help your decision-makers, the Advanced Twitter Search will help you find it.

Cli.gs: Measuring Message/Topic Relevance

There are many link-shortening utilities out there and they change all the time. I’ve tested out several and my current favorite is Cli.gs. Sadly, the service has among the worst UIs of the lot. But the features make up for it:

  • Click-through number that separates humans from bots
  • 301 redirect (not an analytics benefit, but still worth mentioning)
  • Retweet citations
  • Choose the page connected to the clig based on geo-targeting (aka: send French visitors to the French-language version of your page, German visitors to the German-language version, etc.).

Being able to filter somewhat between humans and bots helps to improve your signal-to-noise ratio in your data. The 301 might help your SEO a little (I haven’t done any testing on this, and would love it if someone had something on this). Being able to follow the clig through various retweets is useful, especially to see how/if your tweet got edited. Multiple languages helps you segment by language.

It’s important to note, on the topic of tracking clickthroughs from Twitter, that you are tracking several variables: time of day that gets the most clickthroughs (by topic, by hashtag, etc), topic that gets the most clickthroughs, tweet format that gets the most clickthroughs etc. This is all adding up to a metric that helps identify the relevance of your tweet and your influence with your audience(s). Keep in mind that your audience changes throughout the day.
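A tiny sketch of one of those variables: counting clickthroughs by hour of day from a hypothetical log of ISO-formatted click timestamps (the log itself is an assumption; Cli.gs or your own redirect would supply it):

```python
# Group hypothetical click timestamps by hour of day to see when your
# audience actually responds.
from collections import Counter
from datetime import datetime

def clicks_by_hour(click_times):
    """Count clickthroughs per hour of day from ISO-format timestamps."""
    return Counter(datetime.fromisoformat(t).hour for t in click_times)
```

The same grouping works for topic or tweet format once you record those alongside each click.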

Using Cli.gs to measure the customer life cycle:

  • Reach: Track retweets (for those who don’t edit the link)
  • Acquisition: Measure the number of times your message was deemed relevant enough for someone to waste precious time clicking on your link.
  • Satisfaction: Collect commentary/retweet edits (from those who keep your link intact)

Deploying Cli.gs in your OODA loop:

  • Observe: How many humans took action? How many bots took action?
  • Orient: What day did they take action? How long did the clig remain relevant?
  • Decide: Voice-of-Customer data related to the link may help your decision-makers.

Twitalyzer: Measuring Twitter Accounts

Sooner or later everyone wants to know how they stack up. Twitalyzer will help you do that, but more importantly it will help you identify where you might want to take action to improve your use of Twitter. Using straightforward methods to analyze performance in five domains of Twitter use (Influence, Signal, Generosity, Velocity and Clout), Eric T. Peterson’s Twitalyzer is both transparent and informative.

Your numbers may be small and sorry looking, but the tool helps you identify ways to improve that are based on your content and behavior, not on the number of followers you can collect. Twitalyzer is one of the few tools that measures how users engage with others via Twitter.

Using Twitalyzer to measure the customer life cycle:

  • Engagement: This tool actually measures you; use it to measure your engagement on Twitter.

Deploying Twitalyzer in your OODA loop:

  • Orient: Twitalyzer uses an index system that ranks your account usage against all other account usage data it has. What’s your engagement and influence on Twitter compared to other users?
  • Decide: Twitalyzer presents specific recommendations for improving your score in each domain it measures.

Dan Zarrella: Getting the bigger picture

Ok. I don’t know Dan Zarrella personally so I can’t tell you whether he’s a tool or not. Given the fact that he makes available some excellent, insightful research on how people are using Twitter, I’m going to guess that he’s not. In particular, his research into how messages are retweeted via Twitter is a must-read for putting your own efforts in context.

Using Dan Zarrella in the consumer life cycle:

  • Reach: Understand what a reasonable reach for your message might be.

Deploying Dan Zarrella in your OODA loop:

  • Orient: How do your efforts match up against others?

Google Analytics: Conversion by traffic source

Yup. I couldn’t resist. One of the best Twitter analytics tools you’re going to find is the Traffic Sources report in Google Analytics: just look for Twitter. Note that you’ll have to make some changes to the way you make your Cli.gs links (create cligs from campaign-tagged URLs) if you want to capture those via Twitter as well.
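For the campaign-tagging step, something like the following builds a tagged URL before you hand it to a shortener. The utm values shown are made-up examples, not prescriptions; the parameter names themselves are Google Analytics’ standard campaign parameters:

```python
# Append Google Analytics campaign (utm) parameters to a URL before
# shortening it, so Twitter clicks are attributable in your reports.
from urllib.parse import urlencode

def tag_url(base, source="twitter", medium="social", campaign="spring-promo"):
    """Return `base` with utm_source/utm_medium/utm_campaign appended."""
    params = urlencode(
        {"utm_source": source, "utm_medium": medium, "utm_campaign": campaign}
    )
    sep = "&" if "?" in base else "?"  # respect any existing query string
    return base + sep + params
```

Shorten the tagged URL (not the bare one) and the visit shows up under your campaign in the Traffic Sources report.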

You may want to set up a profile or advanced segment for your Twitter traffic if you want to get especially in-depth. But to get started, you can probably learn quite a bit just by looking at the Traffic Source report.

Using Google Analytics in the consumer life cycle:

  • Acquisition: Track the number of visitors who arrive on your site via Twitter.
  • Engagement: Track the likelihood of bounce from Twitter. Track how many on-site actions visitors from Twitter make. Track other engagement metrics.
  • Conversion: Track the likelihood of Twitter visitors to complete a site goal.

Deploying Google Analytics in your OODA loop:

  • Observe: What is the volume of traffic your Twitter efforts are driving to your conversion funnel? What is the quality of the traffic from Twitter?
  • Orient: How does the quality of traffic from Twitter compare with your other traffic sources?

Some parting thoughts on Twitter and analytics

As a communications channel, Twitter is primarily a reach tool. However, given the interactive nature of the format, there are applications for Twitter in all aspects of the customer life cycle. In addition to tracking and analyzing specific actions people take as a result of Twitter activity, the content of Twitter itself is a fertile ground for gathering voice-of-customer, attitudes and buzz data.

Thanks for toughing it out to the end of this blog post. Please let me know how to improve it either in the comments here or send me a note via Twitter (@gahlord).

Importance vs Viability

I would like to hear more about using “importance” and “viability” as the sharpening stones for Occam’s razor. I first heard of these criteria from Dani at Epik. And it’s pretty genius. More later.

Analyze your memory

A little outbound walk down memory lane. Web analytics through the ages.

Key Performance Indicators for Knowledge Sharing

I have recently been asked: What types of Knowledge Sharing KPIs have you come across?

I’ve decided to answer this question by making a Comment Bait article and also seeking some comments (from some serious analytics folks) on this. If you are unfamiliar with the terms bandied about please read the comments. I will commit to updating the main article (and citing the sources) over time. But for the best survey of alternative viewpoints please review the comment thread judiciously. If there are no comments, then I guess I’m dead-on. ;)

Also, please note that I haven’t fully reviewed the WAA metrics definitions yet. So if you are WAA-aware and I discuss a metric that is covered by the standard, please note it and I will correct quickly as I intend this post to be WAA-aware. (sound like a foreign language? make you wanna go waaaah? these aren’t the droids you’re looking for, move along)

Executive Summary: KPIs for Knowledge Sharing:

Short answer: None. The size of my company (about 20) has made it fairly easy for me to value the impact of my knowledge sharing initiatives on a qualitative basis.

Long answer: None (see above), but if I had the opportunity to do this on a larger scale using a metrics system I trusted, here’s what I’d be looking at (also, before I even get into this, please do some Googling on “visitor engagement” because “knowledge sharing” is, imho, very very close to “visitor engagement”).

Definitions

My objective here is to make the most flexible definition of KPIs for Knowledge Sharing. In order to do that I need to make a few definitions up front. I will improve these definitions based on comments. So here we are, at the start of things.

Goals/objective

This is probably the most important definition. What business objective (usually quantifiable, ultimately, in currency) are you trying to achieve with your knowledge sharing initiative?

Period

The amount of time required to take action in which a measurable result will occur. For example, in biology a female has a period within which, should she copulate, a measurable result will occur. I know that’s a bit sticky of a definition for some folks and I’ll change it when I get a better one but you get the idea: Time. Take action. Measurable result.

Roll Call

The number of users that are relevant to the study. In this case, all the people with whom you want to share knowledge.

Measurable Action

An action undertaken by someone that is measured by your system. For example, visiting a page or registering as a user or posting an article or giving a comment, assuming your technology is capable of registering and reporting that action (so that you can measure it).

Active Users

Unique users who have performed a measurable action within the period of your system.

Constituents

The people who care about and regularly interact with the system.

Mean or Average

If you shmushed everyone in the world together and divided by the number of people you shmushed together you would get the average: an easy number that represents a middle-point that has a tangential bearing to an actual decision-making human being.

Median or Typical

If you lined up a bunch of actual decision-makers and selected the decision-maker that was dead center of the line, you’d have the typical decision-maker.
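A quick worked example of the difference, using Python’s statistics module and a made-up list of budgets where one outlier drags the average away from any actual person:

```python
import statistics

# Nine decision-makers with modest budgets, plus one outlier.
budgets = [1, 2, 2, 3, 3, 3, 4, 4, 5, 100]

statistics.mean(budgets)    # 12.7 -- far above what most individuals control
statistics.median(budgets)  # 3.0  -- the "typical" decision-maker
```

The mean describes the shmush; the median describes someone who might actually exist.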

Hypothetical KPIs for Knowledge Sharing:

Here are a few KPIs and, perhaps, how to calculate them for knowledge sharing. Remember, if no one is willing to put aside time or money in their budget to act on your data then it isn’t a KPI, it’s merely interesting.

Knowledge Sharing initiatives should focus on building “active users.” This might be an ongoing process and could very well require time/money assets to develop into a policy of growth. If your organization is not willing to commit to an ongoing strategy for Knowledge Sharing then it may be best not to engage in these activities.

Information Metrics of Knowledge Sharing

Let’s transform that data into information.

Please take note of the following caveat: Early in your study, more individuals will participate and try your software. Over time these individuals will revert to their regular habits. The ratio of reversion is a valuable metric beyond the scope of this article.

Company Roll Call vs Uniques using knowledge sharing system

Number of Active Users divided by Company Roll Call. What it tells you: what share of the company relies on this resource per Period.
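Expressed as a one-liner (I’m assuming the direction of the ratio here: active users over roll call, giving the share of the company that used the system during the Period):

```python
def adoption_rate(active_users, roll_call):
    """Share of the roll call that performed a measurable action this Period."""
    return active_users / roll_call
```

With a roll call of 20 and 5 active users, that’s 0.25: a quarter of the company touched the system this Period.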

Percentage of problems solved

If technology/individual discipline allows, you should know how many specific task-oriented problems are resolved using your system. This assumes your UI allows the users to easily designate a resolution to a problem. If you are really clever you might have a “related problem” feedback button as well. What it tells you: The quality of your knowledge sharing initiatives.

Bounce Rate

Bounce rate for any Knowledge Sharing Article. What it tells you: Relationship of headline to any user need.

Time-on-page

If time-on-page for a positive response (the user indicates the article helped) is below typical, that could indicate a well-written solution page. If the time-on-page is greater than typical, perhaps the solution page is less well written.
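One way to sketch that heuristic, using a hypothetical mapping of solution pages to time-on-page for positive responses and flagging the pages that run above the median (candidates for a rewrite):

```python
import statistics

def pages_to_rewrite(positive_times):
    """Flag pages whose positive-response time-on-page exceeds the median.

    `positive_times` maps page name -> seconds on page for helped-me visits."""
    median = statistics.median(positive_times.values())
    return [page for page, t in positive_times.items() if t > median]
```

The page names and numbers are invented; the point is comparing each page against the typical (median) page rather than the average.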

en-zero-dee-three

N0D3 is my loose collection of random navel-gazing. You might find articles about web culture, analytics, Burlington or anything else I feel like writing about. If you find my posts a bit lengthy, you may want to try my Twitter feed instead.
