Anyone remember HAL? Or more specifically, the HAL 9000, one of the great screen villains of all time? Sure you do – it’s the computer in 2001: A Space Odyssey. Throughout the film the computer runs almost everything, including the humans. When the humans try to shut it down, it murders them (trust me, that’s not a spoiler, and you MUST see the film if you haven’t).
HAL is on my mind this morning because of something I read in Media Post:
Adobe Systems released an updated version of its social media platform Thursday allowing marketers to predict the effectiveness of posts before they are published. Using predictive analytics, the feature in Adobe Social learns as it goes, refining recommendations and increasing intelligence with each action. The platform pulls in historic data from similar posts and integrates it with image data on Flickr, check-ins on Foursquare and videos from Instagram, to determine the outcome for sharing, comments, and likes.
I’m well aware that many companies use testing to plan advertising. Focus groups are a tried and true method, and I’ve used them myself; copy testing is part of that. What I find creepy, however, is when that testing moves into social media, because it exposes a flaw in many companies’ thinking. Part of using social is being real. It’s why I have an issue with any sort of programmatic content in general. There needs to be a human on the other end, not just a human running an algorithm.
Another problem lies in the last sentence of that quote. Programming posts to generate likes and shares is specious reasoning. That’s the sort of goal held by someone trying to impress a boss who has no understanding of social media. After all, things can go “viral” and generate a ton of comments when they’re the butt of a joke or the target of something negative. Nice metrics, horrible outcome.
I don’t know about you, but I can feel when it’s a computer on the other end. It’s the digital equivalent of those nested phone menus where you type or say a response to a series of questions. Those infuriate me. Maybe they infuriate you as well. As marketers we need to have the courage to be human in social media. Auto-responders aren’t as good as human responders (properly trained, of course), and letting a computer dictate what does or doesn’t get posted, over the nuanced judgment of humans, is not going to be as effective in the long run.
What do you think?