Oy Vey – A Survey


As one might expect a couple of weeks before an election, the air is filled with the sound of pollsters. I don’t know about you, but we get called almost every day (and night) to answer some poll about our voting intentions (absolutely), our choice of candidate (not you if you don’t stop calling), and our basic demography (I’m a 29-year-old woman with blond hair and a deep voice as far as they know).
Political polls aren’t the only kind you read about, or rely on in your business, every day. TV ratings are polls, as are almost all media data. So it’s good to keep a few things in mind as you look at the data.

What has me on this bandwagon are a couple of polls that came out the other day.  They were measuring the same Senate race.  One, fielded by the Democrats, had their candidate up 41-36.  The other, fielded by a polling outfit well known to lean Republican, had their guy up 44-40.  Both polls made headlines, and both can’t be right, so which do you trust?

I think many folks trust the result they want and expect, and that’s not limited to politics.  One of the worst things many of us do is see the world as we hope it to be without acknowledging the reality of how it is.   Obviously there are some pretty big differences in the two survey samples above – maybe geographic, maybe demographic – that yielded such different results.  On-line surveys are far less reliable than phone surveys, and phone surveys that are land-line only are badly skewed as well (and yes, stat freaks, I understand the margin of error thing).  I know for a fact that certain companies tell their employees to vote up their cause in some on-line business surveys – think the pollster is weighting out results coming from the company IP address?  Hardly.

The point here is that we need to drill down into any form of data and ask a lot of questions about how it was generated, not just what the results say.  If we act on information derived from a skewed sample, we’re probably going to act incorrectly.  Ask about the sample.  Think about how the pollster, the sample, or the question itself can contain bias.  Look at how the data is expressed – “The number of people voting for candidate A has doubled in the last 2 weeks” is kind of meaningless as a lede if “A” is still 35 points behind.  “The game got a record number of viewers” can be meaningless if the rating is down but the number of eyes is up simply because the viewing universe has grown.

We’ve all heard the “challenge authority” mandate.  Polls of all kinds are exactly the types of “authority” that need challenging.  Especially if you’re making decisions based on their results.

Questions?
