They play a football game while the ads aren’t running, although the one they played last night was not great. If you think I’m emphasizing the ads over the game, or being a little too tongue-in-cheek, consider that some polls find up to half of viewers call the ads their favorite part of the viewing experience.
This morning there is ample discussion of the ads, and given that time in the game itself costs north of $4,000,000 for a 30-second unit, the brands running these ads try to deploy them before the game in the hope that they’ll “go viral” to some extent. They were successful: Super Bowl ads running on YouTube in the weeks before the big game were watched 66,058,625 times before this weekend. Since that figure covers all the ads in aggregate, it’s only a fraction of the audience the commercials had in the game broadcast. Still, every eyeball is valuable, and the digital versions can be examined in other ways that demonstrate engagement.
According to Tubular Labs, an analytics company, a number of the ads also generated buzz via tweets and Facebook shares. They compared that activity to the ads’ YouTube views to measure total viewer engagement with the ads. That’s where I get a little lost, and here’s what I mean.
There is an AXE ad with 3.6 million views. It was shared on Facebook 50,000 times and tweeted roughly 5,900 times. The analytics company says these social actions translate into a 1.6% engagement rate, the highest they saw. The lowest engagement, for a Butterfinger ad, was tiny – 0.03% – with shares in the hundreds. Interesting, but it leaves out a key measurement.
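For what it’s worth, those numbers are consistent with a simple definition of engagement rate: social actions divided by views. Tubular Labs doesn’t publish its exact formula here, so this is an assumption, but a minimal sketch of the arithmetic looks like this:

```python
# Hypothetical sketch: engagement rate as social actions divided by views.
# The exact formula Tubular Labs uses is not stated in the piece; this
# simple ratio happens to reproduce the figure they report.

def engagement_rate(views, shares, tweets):
    """Return (shares + tweets) / views, expressed as a percentage."""
    return (shares + tweets) / views * 100

# Figures cited above for the AXE ad:
axe = engagement_rate(views=3_600_000, shares=50_000, tweets=5_900)
print(f"AXE: {axe:.1f}%")  # prints "AXE: 1.6%", matching the reported rate
```

Note that this counts a share and a tweet identically, and counts an angry share the same as an enthusiastic one, which is exactly the problem with the metric.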
What if every one of those shares for the AXE ad went something like this: “‘Kiss For Peace’ is the worst ad I’ve ever seen. Why would you waste money on this crap? I’m never going to consider AXE again.” Engaged? Yes. But is that the sort of engagement we want as marketers? What if every Butterfinger share raved about how good the ad was and expressed a desire to eat a Butterfinger immediately? Better?
It’s always important to measure. It’s also important to dig a little deeper into those measurements. I’d take smaller positive engagement over larger expressions of anger every time. It’s not just “what is it?” but also “what of it?” as we gather data. Make sense?