I’m going to start the week by running the risk of bumming you out. At least we’ll have the rest of the week to recover, right? I was looking at some analytics data this morning and realized that much of it is wrong. So is a lot of the other information this client is using to make decisions. Yours is too, by the way. I’ll explain why, but along with the realization came an insight that I think will be helpful to your business.
When I began in digital, we used server logs to track traffic. They were reasonably accurate, although quite limited as well. Then web analytics came along, and the quantity and quality of the information we got about who was coming to our websites, how they got there, and what they were doing improved quite a bit. As business people, we were able to make content and marketing decisions based on the data we were getting.
Things have grown quite a bit more complex over the last 20 years, and that complexity has obscured much of the good, useful information. Anyone who knows analytics will tell you that much of the referral data you see (where traffic comes from) is wrong. “Direct” traffic is way overstated. “Referred” traffic is encumbered by referrer spam. A lot of so-called direct traffic is really dark social traffic (I send you a link). Transfers from HTTPS to HTTP sites report as direct as well. Keyword data is “not available.”
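That referrer mess is easy to see in code. Here’s a minimal sketch of how an analytics tool buckets a hit by its referrer header — the function name and rules are my own illustration, not any vendor’s actual logic — showing why those buckets blur together:

```python
# Illustrative only: a rough version of how an analytics tool classifies a
# hit by its referrer header. The rules here are a sketch, not any real
# vendor's implementation.
from urllib.parse import urlparse

def classify_visit(referrer):
    """Bucket a hit the way a typical analytics tool would."""
    if not referrer:
        # True direct traffic, dark social (a link pasted into email or
        # chat), and HTTPS-to-HTTP transitions (browsers strip the
        # referrer) all arrive here looking identical -- hence "direct"
        # is overstated.
        return "direct"
    host = urlparse(referrer).netloc.lower()
    if host.endswith("google.com") or host.endswith("bing.com"):
        return "organic search"  # and the keyword itself is withheld
    return "referral from " + host

print(classify_visit(None))                             # direct
print(classify_visit("https://www.google.com/search"))  # organic search
```

The point of the sketch: three very different behaviors collapse into one “direct” bucket the moment the referrer is missing, and nothing downstream can pull them back apart.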
I’m not trying to make your head hurt nor to get really wonky. The point is that if you’re relying on that data to make decisions, you’re really just guessing. It’s the same with much of your ad data. I’ve written before about the lack of transparency in the programmatic ad markets and that opaqueness obscures the validity of the data as well.
I can add search data, email data, and more to the list of what probably isn’t what you think it is, but all of this fostered a thought: what do we really know that’s truly actionable?
I can answer that. We can know how our products and services are really differentiated and how much better we are at solving people’s problems. We can know (yay, review sites!) how good our customer service is. We can know how our revenues and costs are changing, and we can ask why.
I’m the last guy to say we should ignore the large and growing amount of data every business gets each minute. But maybe the time has come to act more on what we KNOW and less on what we really don’t. What do you think?
The Memorial Day weekend gave me a little time to get caught up on some reading. Some of what I was reading was analytics reports (I know – get a life), and while I very much appreciate the cycle of continual improvement Google fosters within their analytics product, that cycle yields a continuously growing amount of data. The problem I have isn’t so much understanding what I’m reading but figuring out why any of it matters to my clients. I also spend time figuring out which of the numbers are lying to me.
It’s no secret that there are an awful lot of bad actors in the digital world. Once it becomes clear how fraud is detected, those bad actors move on to another form. If viewability is important, they create sites with 100% viewability but no content of any value. I had a client get all excited about an increase in referral traffic until I pointed out that most of that traffic was coming from referrer spam. When we filtered it out, traffic was flat. Another prospect got excited by the large “stickiness” – time on site and pages viewed – that her site had. The numbers were impressive until you filtered out the IP addresses of her employees, who spent hours a day on the site.
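Both of those cleanups are mechanical once you decide to do them. A toy sketch follows — the session records, spam domains, and IP addresses are made-up examples, not a production blocklist:

```python
# Toy session records and filter lists -- the spam domains and IP
# addresses below are illustrative examples, not a real blocklist.
SPAM_REFERRERS = {"semalt.com", "buttons-for-website.com"}
INTERNAL_IPS = {"203.0.113.10", "203.0.113.11"}  # pretend office addresses

def clean_sessions(sessions):
    """Keep only sessions that represent real outside visitors."""
    return [
        s for s in sessions
        if s.get("referrer_domain") not in SPAM_REFERRERS
        and s.get("ip") not in INTERNAL_IPS
    ]

sessions = [
    {"ip": "198.51.100.7", "referrer_domain": "news.example.org"},
    {"ip": "198.51.100.8", "referrer_domain": "semalt.com"},   # referrer spam
    {"ip": "203.0.113.10", "referrer_domain": None},           # an employee
]
print(len(clean_sessions(sessions)))  # 1
```

Three “visits” become one real visitor, which is exactly the deflation my client and that prospect needed to see before drawing conclusions.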
Silly things, I know, but they point to a common problem. An IDG study from a couple of years ago found that nearly half of marketers say they struggle to make sense of the vast amount of data they get. The other half think they know what the numbers mean, yet many of their plans are built to achieve unrealistic metrics. The problem is compounded by what the paper identifies as the accuracy problem I mentioned above:
Why is data accuracy still such a big issue? One possible reason is a lack of investment in a defined data management process that includes ongoing, consistent data migration, data maintenance, quality control and governance. Too often data is held and managed in multiple organizational silos. This results in inconsistency, duplication, gaps and errors.
So while “garbage in, garbage out” isn’t a particular revelation, it does serve as an excellent reminder to take out the trash as best you can while compiling all of that data. You with me?
You’ve probably heard the old joke about the kid and the pile of horse manure. There are many variants, but the basic story is that a kid is digging through a huge pile of horse manure. When he’s asked why, his response is, “With this much manure, there has to be a pony in here somewhere.” It’s a story I use to help clients understand the nature of data. Any of us who are in business see more and more of it each day. In fact, we’re probably setting up systems to provide more of it to us as well. The unfortunate truth is that most of it is…well…manure.
We’re after the pony, or at least we should be. The pony is the actionable insights contained within the data, not the accumulation of data itself. It does take a lot of digging, and that digging can begin only after we set up systems to gather and to organize the flood of data. Knowing that website traffic grew as measured by session count tells you very little. Understanding how it grew, or that the growth was because a bunch of referrer spammers hit it, gives you actionable information (update the spam filters!). Knowing that your store sales were up 5% without understanding that you’ve lost market share can cause you to think that you’re doing well when in fact you’re losing ground.
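The sales example works out like this — all the numbers are invented for illustration:

```python
# Invented numbers: our sales grew 5%, but the overall market grew
# faster, so our share actually fell -- "up 5%" alone hides the
# lost ground.
our_last, our_now = 100.0, 105.0          # our sales: +5%
market_last, market_now = 1000.0, 1150.0  # whole market: +15%

share_before = our_last / market_last     # 0.100 -> 10.0% share
share_after = our_now / market_now        # ~0.091 -> 9.1% share
print(f"{share_before:.1%} -> {share_after:.1%}")  # 10.0% -> 9.1%
```

Same “up 5%” headline, a full point of share gone. That’s the difference between a data point and the context that makes it actionable.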
Say “so what” to yourself a lot. If you can’t explain why a piece of data is meaningful, you need to discard it because it’s the manure surrounding the pony inside. If you can’t put something into a broader context, push to do so. If you can’t determine a course of action based on a particular nugget of information, ignore it and keep digging until you get to the pony. Make sense?
I don’t think there has been a baseball movie made that didn’t feature some weathered old guy seated in the bleachers somewhere. He usually utters undecipherable baseball jargon while taking copious notes. This, dear reader, is the baseball scout, who used to be the way talent was discovered. If you’ve seen or read Moneyball, you know that the scout is an endangered species. This article from USA Today last week talks about how many pro scouts are still unemployed one month before the start of spring training. The reason? Data.
Baseball is in the throes of the Moneyball movement. Teams have been laying off scouts and turning to sabermetrics, which Wikipedia defines as the empirical analysis of baseball, especially baseball statistics that measure in-game activity. Baseball has fallen in love with data. Maybe your business has too.
Here is the problem, both for you and for baseball. There are certain things that don’t show up in data. A player’s leadership qualities in the dugout aren’t quantifiable. Potential can often be visible but not measurable. That’s true in your office as well. The data may show you what is happening, but it’s hard for it to show you what could be happening. That requires humans: scouts.
We all need scouts. We need people who use the data as a tool but who also have the experience and wisdom to know when the data is missing something. That doesn’t mean projecting one’s wishes into the numbers nor distorting the story those numbers tell. It is, however, an acknowledgment that there is often a bigger picture than what’s inside the frame.
Here is a quote from a scout:
“I’ve got 23 years in the business,” Wren said, “and now clubs don’t want that experience? I look at teams now, and they’re hiring guys who aren’t really scouts. They’re sabermetric guys from the office, and they put them in the field like they’re scouts, just to give them a consensus of opinion.”
That’s dangerous for a baseball team. It could be fatal for you. You’re up!
A lot of folks who thought they were in marketing are finding out that they’re really computer scientists. That’s a shame in my book. Surprised I’d say that after all of the rants in this space about the need to measure actionable data? Let me explain what I mean and how I think there will always be a place for real marketers.
Computers and the data they can generate are really good at many things. I, for one, am very much looking forward to the day when they are driving all of our cars. One thing at which computers suck is creativity. They provide great creative tools like Photoshop, but the ability to create is intrinsically human, in my book. They aren’t great at improvising. They can’t “pretend.” I’ve not heard of them mashing up a couple of concepts into a third. Yet those tasks are the essence of great marketing.
We are complex creatures. There are things within the human mind and character that no computer can understand. They might get the “what” (actions you took) but most of us in marketing are interested in the “why” at least as much. It’s great that, as recent research found, 92.3% of respondents said they maintain databases to host information on customers or prospects, at least to some extent. I wonder if that data dependency is replacing the human side of marketing.
I like this quote from a recent article by someone at Adobe:
Buyers’ behavior isn’t always rational. People make strange decisions that defy neat algorithmic understanding. Often, customers are not simply looking for the highest-quality product for the lowest possible price. Indeed, the burgeoning field of behavioral economics is revealing on an almost daily basis how irrational consumers can be—and how seemingly irrelevant factors can influence purchasing decisions. Savvy marketing adapts to these nuances.
Exactly. Computers don’t do irrational. We do need to use data as a tool, but we can’t assume that our jobs are done because we’ve got a system that aggregates and reports. We can’t dive so deeply into data that we drown in it. Computers can’t do marketing well because they lack the skills that make great marketing: intuition, creativity, innovation, compassion, and imagination. You might think your marketing job has morphed into that of a computer scientist, and if it has, you have a problem. Great marketers know how to use those tools within the context of the human to human interactions that make business flow. Do you?
Ever encounter a situation where things seem backwards? Maybe you’ve seen a parent being told what to do by a child, or a customer being berated by a service rep. It makes you wonder who is in charge, or who is working for whom. I have another thought along those lines today, and it has to do with data. There was a post from AdAge by their data reporter, Katie Kaye, who wrote the following about the NY Times piece on Amazon:
The article should inspire us to question the value of decisions based entirely on data to create business efficiencies at the expense of human empathy and the arguable imperfections that can benefit any organization or project.
I like that. It makes you ask who is in charge here: the humans or the numbers. We all ingest more data than we can digest, and, unfortunately, some of us allow that massive intake to be regurgitated as unconsidered decisions. That’s a bad idea. The data is there to serve us, not the other way around.
I’m the first to say that we need lots of data. Without impartial feedback, we’re flying blind, and data can help us make better decisions. The key there is “help US”. Data without the context of a plan is useless. Data that’s not actionable is useless. Data that causes us to overreact, however, is dangerous. If you watched any election coverage last night, you probably heard a lot about early results and the need to wait for data from key precincts. How many times has someone in your organization overreacted to an early piece of data, only to find out that it was not at all typical of the overall results? We need a plan, we need context, and we need a little patience.
When we chase after outliers, we’re working for the data. That’s backward. Data, and all the other technological tools in our arsenals, needs to work for us. Make sense?