If you spend any time reading about what’s going on in digital media, you’re probably aware of Rupert Murdoch’s war on free content. His beef is that many companies spend lots of resources to produce original content, and then others such as Google, Drudge, and HuffPo aggregate it and make money off of it with minimal investment. He wants to place a lot of content behind paywalls, which, in my opinion, will be an interesting experiment but one doomed to failure unless there is a systematic change in the nature of the web. But that’s not the subject of today’s piece. Instead, Rupert has now gone one step beyond.
The Silicon Alley Insider reports on an interview in which Rupert says he doesn’t want people coming to his sites through search engines. “What’s the point of having someone coming occasionally?” Murdoch wants them coming to his site directly, and he has a pretty low opinion of searchers: “If they’re just search people… They don’t suddenly become loyal readers.” They get 15-20 headlines and just click on the most interesting.
Now THAT is a problem. One of the things on which I spend a lot of time with my clients is building “discoverability” into their content offerings. With 80%+ of web sessions starting with a search of some sort, being invisible to search engines also means being invisible to consumers. It’s not that I don’t know where, for example, the NY Times’s information lives – I just don’t want to have to surf around their site to find it. I use RSS and search to ingest a lot of information each day, and if what I’m looking for isn’t readily available, much less findable at all, I’m going to move on. The web user is task-oriented, and as a content provider, you need to be too.
Is anyone out there going to rush to the NY Post web site to read it if you didn’t get there via search? While I won’t argue that content might need a better business model, hiding from users isn’t it. Of course, someone I respect a lot, Mark Cuban, totally disagrees with me. What do you think?