Be skeptical of what you read in the paper – Part III

By Paul Sonkin, author of Pitch the Perfect Investment: The Essential Guide to Winning on Wall Street

This post is the third in an ongoing series focusing on incredibly idiotic stories in the newspaper.

Today’s victim is Jason Zweig, a reporter for whom we have the utmost respect (well, maybe the only reporter we have any respect for). But even good reporters can write mediocre (or even nonsensical) stories on slow news days. I feel that the story in Zweig’s December 2, 2017, Intelligent Investor column suffers from “sensationalism bias,” which is when the media overhypes a story. A simple example is when a shark attack makes the news (as a point of reference, there were 98 non-fatal shark attacks and 6 fatal attacks in the US in 2015, according to a story in Newsweek). Contrast this coverage with stories that don’t make the news, like the fact that there were 467,000 bicycle-related injuries and 1,000 bicycle-related deaths in the US in 2015, according to the CDC. Shark attacks make good headlines; bicycle accidents don’t.

Zweig’s story “Ignorance Certainly Isn’t Bliss for Investors” starts by discussing the “bubble” in bitcoin and the fact that “On Nov. 30 the fifth-most popular stock among brokerage customers of Fidelity Investments was Bitcoin Investment Trust, an unregistered fund that holds bitcoin.” This story seems a little like reporting on a shark attack. The Bitcoin Investment Trust has 1,868,700 shares outstanding, giving it a market cap of $3.1 billion. On December 1, 2017, the trust traded 122,022 shares, which at $1,666 per share equals $203 million of stock traded. Contrast this dollar amount with a rough estimate of total dollar volume on US exchanges of approximately $407 billion on December 1st. The Bitcoin Investment Trust accounted for 0.05% of total dollar volume, roughly the same proportion as shark attacks to bicycle injuries (0.02%). The Bitcoin Investment Trust is a very small segment of the market. Hardly worth the mention.
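For anyone who wants to check the arithmetic, here is a quick back-of-the-envelope sketch in Python using the figures quoted above (as reported, not independently verified):

```python
# Reproducing the back-of-the-envelope math above
# (figures as quoted in the post, not independently verified).
shares_outstanding = 1_868_700
price = 1_666                      # $ per share on Dec. 1, 2017
shares_traded = 122_022

market_cap = shares_outstanding * price    # ~$3.1 billion
dollar_volume = shares_traded * price      # ~$203 million
total_us_volume = 407e9                    # ~$407 billion, rough estimate

print(f"Market cap: ${market_cap / 1e9:.1f}B")
print(f"Trust dollar volume: ${dollar_volume / 1e6:.0f}M")
print(f"Share of total volume: {dollar_volume / total_us_volume:.2%}")  # ~0.05%

# Same comparison for shark attacks vs. bicycle injuries:
print(f"{(98 + 6) / 467_000:.2%}")  # ~0.02%
```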

Next, the story frames the importance of “circle of competence,” mentioning Buffett and Munger (to establish instant credibility, because who could possibly argue with them?). Zweig then mentions a recent survey in which 1,900 people indicated that they had retirement plans. The total number surveyed was 2,918 people, which, if extrapolated, would imply that 65% of Americans aged 18-64 who are employed and not working for the government have 401(k)s. That figure is inconsistent with a study mentioned in a Bloomberg story claiming that only 32% of Americans have savings in retirement accounts. Zweig’s story cites other statistics from the survey, concludes that many of the 1,900 people surveyed don’t know much about the fees they are being charged, and then proclaims, “That’s the tip of the iceberg of ignorance.”
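That 65% appears to be simply the 1,900 respondents with plans divided by the 2,918 people surveyed; a one-line sanity check:

```python
# Sanity check on the extrapolation: respondents with retirement
# plans as a share of everyone surveyed.
print(f"{1_900 / 2_918:.0%}")  # ~65%, roughly double the 32% Bloomberg figure
```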

I take issue with this claim. These individuals are at least saving something for retirement. Compared to the people NOT saving, these investors are SMART. As for not knowing what the fees are, I think it’s likely that the individuals in the retirement plan assumed that the sponsor chose funds with moderate fees. Maybe the assumption is incorrect and the fee is high at 1.2%, which isn’t great, but at least the individuals are saving. I don’t know the exact questions asked in the survey – maybe people have a general sense that the fees are 1.0% to 1.5% – but it depends on how the questions were asked; the devil is in the details. I’m not sure that investor stupidity correlates with not knowing one’s 401(k) fees.

Then the story discusses another study, which asked roughly 5,800 LinkedIn users five basic questions concerning financial literacy. Zweig states, “Only 38% of those surveyed got all five questions correct.” That statement sounds pretty damning. The actual study broke out the results into what they called the “Big 3” questions (on compounding, inflation, and diversification) and the “Big 5,” which added two questions on bond prices and mortgages. Of the Big 3, the overall percentage answering correctly was 76%. Adding the other two questions reduced the percentage of correct answers to 38%. Where did people mess up? If we look at the responses to the individual questions (appendix A of the study), we see that most of the questions were answered correctly. The one question that seems to have tripped up most people was the one concerning bond prices.

 

Percent answering correctly:
Compounding: 94%
Inflation: 90%
Diversification: 86%
Mortgage: 94%
Bond prices: 45%

The question that most people got wrong was:

  1. Bond Pricing. If interest rates fall, what should happen to bond prices? Please select one.

_ They will rise

_ They will fall

_ They will stay the same

_ There is no relationship between bond prices and the interest rate

_ Don’t know

_ Prefer not to say

What is not clear from the survey results is whether a response of “don’t know” or “prefer not to say” is counted as an incorrect answer. Looking at the data, however, it appears that individuals got most of the questions correct but did much worse on the bond question.
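For what it’s worth, the relationship the bond question tests is just discounting: a bond’s price is the present value of its fixed cash flows, so a lower discount rate means a higher price. A minimal sketch in Python, using a hypothetical 10-year, 5% annual-coupon bond (the numbers are illustrative, not from the study):

```python
# Minimal sketch: why bond prices rise when interest rates fall.
# Hypothetical 10-year bond, 5% annual coupon, $1,000 face value;
# figures are illustrative, not from the study.

def bond_price(face, coupon_rate, market_rate, years):
    """Present value of the bond's coupons plus its face value."""
    coupon = face * coupon_rate
    pv_coupons = sum(coupon / (1 + market_rate) ** t for t in range(1, years + 1))
    pv_face = face / (1 + market_rate) ** years
    return pv_coupons + pv_face

print(bond_price(1000, 0.05, 0.05, 10))  # ~1000.00: priced at par
print(bond_price(1000, 0.05, 0.03, 10))  # ~1170.60: rates fell, price rose
print(bond_price(1000, 0.05, 0.07, 10))  # ~859.53: rates rose, price fell
```

Lower the market rate and the fixed coupons are worth more today; raise it and they are worth less. Hence the inverse relationship.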

I also take issue with Zweig’s statement, “…more than one-third of chief executives, chief financial officers and chief operating officers didn’t get all the answers right…” The study states:

“…we augmented the January and July, 2014, versions of a monthly omnibus survey that LinkedIn sends to its membership base. We added the Big 5 financial literacy questions to their standard omnibus survey and included additional questions that measured financial knowledge, beliefs and financial behaviors.”

The study then states that respondents spent an average of 8 minutes 41 seconds on the “first wave” of the survey and 12 minutes 18 seconds on the second. It also mentioned that one-third of the people surveyed had a Bachelor’s degree and one-third a Master’s degree, and that one-third of the respondents reported having a C-level job (CEO, CFO, or COO).

Let’s think about this statement for a moment. What C-level executive has the time, or inclination, to respond to a LinkedIn survey on financial literacy that takes 9 to 12 minutes? I seriously doubt that the C-level executives in the study come from Fortune 500 companies – who has that much time? While 1/3 of the individuals in the survey appear to be C-level executives and 2/3 have college or advanced degrees, are the respondents truly representative of the population? I’m not sure they are, which calls the validity of the study into question. It is probably not a stretch to think that a group of fairly educated people, with enough time on their hands to complete a long survey, might confuse the relationship between interest rates and bond prices.

I suppose that if you are a reporter trying to write a provocative story, it is better to say “Only 38% of those surveyed got all five questions correct. Worse, more than one-third of chief executives, chief financial officers and chief operating officers didn’t get all the answers right” than to say, “Most people get basic financial questions correct but don’t fully understand the inverse relationship between bond prices and interest rates.” I think this story is a clear case of “cherry-picking” data to make a point, or at least to cause a stir.

Don’t get me wrong, I think Jason Zweig is an excellent reporter – I just think that this story is not one of his better ones…
