Reported research results rarely tell a true story

Way back in the dark ages, when I was in graduate school getting a master’s in communication, the subject of opinion research so interested me that I developed and conducted a survey for my master’s project.

The result of that intense effort has been my enduring interest in, and unmitigated skepticism about, polls and other studies disseminated by the news media.

Although the news media have grown more sophisticated, especially about political polls, I am continually astonished by how much evidently shoddy research finds validation on the Internet, in newspapers and magazines, and on radio and television.

Regular readers no doubt recall my recurring rants about research on the nation’s housing market, none of which is entirely, or even mostly, accurate. Case-Shiller is my favorite target; a prime example appeared in Sunday’s New York Times, when Shiller cited his research based on 407 and 296 respondents in different years as if they represented a national sample of home buyers.  Impossible!

For that matter, how could 407 and 296 responses each reflect national sentiment?  If 296 is sufficient, why poll 407?  Conversely — well, you get the idea.

If only Shiller were alone.  But none of the others — not Trulia, not Zillow, not RealtyTrac, not CoreLogic, not Radar Logic, not the federal government, not one — reveals the true story.

Findings may be out of date, selective (e.g. only single-family houses), too broad (encompassing whole Metropolitan Statistical Areas), too reliant on repeat sales, too narrowly focused on mid- to low-priced properties and so on.

Yet most journalists report uncritically on the studies that flood their inboxes.

What got me started writing this rant were two reports on the same day last week. One was in the Wall Street Journal, which quoted the National Association of Home Builders (NAHB) as having sponsored new research demonstrating that three-quarters of respondents said owning a home is the best long-term investment. The disinterested NAHB!  Who are they kidding?

Of course, there were more results, but I couldn’t force myself to read on even though I included a link to the piece in my Weekly Roundup post on Friday.

The second report was in the online version of Realtor magazine (and subsequently echoed by other sites).  Said the magazine:

The number of people unhappy about moving is nearly four times higher than the historical average, according to results of a new moving sentiment survey.

I decided to go to the source, which turned out to be a company that sells moving supplies. Nowhere in its press release or on the company’s Web site did I find a scintilla of information about how the so-called survey was conducted. Neither was revealed the number of respondents.

However, I did learn that “husband-and-wife team Matt and Beth McCabe founded [the company] after their own move left them with an awareness of the difficulty and expense of finding moving boxes.”

My point is that we who consume research results need to ask at least four fundamental questions:

  1. Who is sponsoring the survey?
  2. Who is conducting it?
  3. How are the questions phrased?
  4. What is the sampling methodology?

To their credit, many of the traditional news media now make a point of describing research methods.  For example, the Journal included this information in its article:

The NAHB survey of 2,012 likely voters was conducted from May 3-9 by Neil Newhouse, a Republican pollster from Public Opinion Strategies, and Celinda Lake, a Democratic strategist from Lake Research Partners.

But the only way to judge the quality of the research is to know whether the individual reporting on it has a keen eye for subterfuge and evasion, and a genuine regard for the integrity of the work.  Even better, though far too time-consuming, would be to look at the study and its methodology yourself.

Unfortunately, you can’t assume the results are trustworthy just because a recognized pollster such as Gallup or Harris did the work.  If there are 20 questions on a survey, there are 20 ways of manipulating the answers.  If there are 2,000 respondents, there are myriad ways of choosing them and slicing and dicing their replies.

The 2,000 number is a good one for getting a national result.  But once responses are narrowed by one parameter and then others — say, by state, then age, then income — forget about having much confidence in the stats.
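A quick back-of-the-envelope sketch illustrates why those narrowed slices deserve so little confidence. This assumes simple random sampling and the textbook 95% margin-of-error formula — real polls complicate things with weighting and design effects, which only widen the error further:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion p
    estimated from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# A full national sample of 2,000: roughly +/- 2 percentage points.
print(f"n=2000: +/-{margin_of_error(2000):.1%}")  # ~2.2%

# Narrow that to a 100-respondent slice (one state, one age band)
# and the margin of error balloons to roughly +/- 10 points.
print(f"n=100:  +/-{margin_of_error(100):.1%}")   # ~9.8%
```

In other words, even an impeccably drawn 2,000-person sample yields subgroup estimates with several times the uncertainty of the headline number — uncertainty the press release rarely mentions.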

We all know that statistics can be massaged to serve a particular interest.  I, for one, prefer my massages to be hands-on; Swedish will do just fine, thank you very much.

Malcolm Carter
Licensed Associate Real Estate Broker
Senior Vice President
Charles Rutenberg Realty
127 E. 56th Street
New York, NY 10022

M: 347-886-0248
F: 347-438-3201

Malcolm@ServiceYouCanTrust.com
Web site
