Studies Say…

I love this quotation (attributed to one Andrew Lang, who was born in 1844): “He uses statistics as a drunken man uses lamp-posts… for support rather than illumination.”

Actually, we all do that from time to time, and political psychologists tell us it is the mark of “confirmation bias”–the very human habit of cherry-picking the available information that confirms our preferred worldviews.

Because that is such a common behavior, and because we can easily find ourselves citing to “authorities” that are less than authoritative (and sometimes totally bogus), I’m going to bore you today by sharing information from a very useful tutorial on assessing the credibility of “studies,” as in “studies confirm that…” or “recent studies tell us that…”

Academics who have conducted peer reviews of journal submissions are well aware that many studies are fatally flawed, and should not be used as evidence for an argument or as confirmation of a theory. (If I were doing research on voter attitudes, and drew my sample–the population that I surveyed–from readers of this blog, my results would be worthless. While that might be an extreme case, many efforts at research fail because the methodology is inappropriate, the sample size is too small, the questions are posed in a confusing manner, etc.)

The tutorial suggests that journalists intending to cite to a study ask several pertinent questions before deciding whether to rely upon the research:

The first question is whether the study has been peer-reviewed; in other words, has a group of scholars familiar with the field approved the methodology? This is not foolproof–professors can be wrong–but peer review is typically blind (the reviewers don’t know who conducted the study, and the authors don’t know who is reviewing it), and it tends to be a good measure of reliability. If the study has been published in a well-regarded academic journal, it is generally safe to assume that its conclusions are well-founded.

Other important inquiries include looking at who funded the research in question.

 It’s important to know who sponsored the research and what role, if any, a sponsor played in the design of the study and its implementation or in decisions about how findings would be presented to the public. Authors of studies published in academic journals are required to disclose funding sources. Studies funded by organizations such as the National Science Foundation tend to be trustworthy because the funding process itself is subject to an exhaustive peer-review process.

The source of funding is especially relevant to the possibility that the authors have a conflict of interest. (Remember those “studies” exonerating tobacco as a cause of cancer? Surprise! They were paid for by the tobacco companies.)

Other important elements in the evaluation may include the age of the study, since, as the post noted, “In certain fields — for example, chemistry or public opinion — a study that is several years old may no longer be reliable.”

Sample size and the method used to select survey respondents are obviously important, and statistical conclusions should be presented in a way that allows readers to check the underlying calculations. It’s also worth looking closely to see whether the study’s conclusions are actually supported by the reported data. As the post notes,

Good researchers are very cautious in describing their conclusions – because they want to convey exactly what they learned. Sometimes, however, researchers might exaggerate or minimize their findings or there will be a discrepancy between what an author claims to have found and what the data suggests.
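To make the sample-size point concrete, here is a rough back-of-the-envelope illustration of my own (not anything drawn from the tutorial), using the textbook 95% margin-of-error formula for a simple random sample:

margin of error ≈ 1.96 × √( p(1 − p) / n )

With p = 0.5 (the worst case) and n = 1,000 respondents, that works out to about ±3 percentage points; with only 100 respondents, it balloons to about ±10 points. And that figure captures only random sampling error; it says nothing about the bias introduced when the sample itself is skewed, as in my hypothetical survey of this blog’s readers.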

In an information environment increasingly characterized by misleading claims, spin and outright propaganda, the ability to distinguish trustworthy research findings from those that are intellectually suspect or dishonest is fast becoming an essential skill.


A Spoonful of Sugar Makes the Dishonesty Go Down….

Evidently, you can’t even trust research from Harvard. At least, not all of it.

A number of media outlets have reported that in the 1960s,

prominent Harvard nutritionists published two reviews in a top medical journal downplaying the role of sugar in coronary heart disease. Newly unearthed documents reveal what they didn’t say: A sugar industry trade group initiated and paid for the studies, examined drafts, and laid out a clear objective to protect sugar’s reputation in the public eye.

The consequences of this deception are several, and they are all deeply disturbing.

First–and most obvious–is the misdirection of subsequent research and government efforts to improve heart health. Thanks largely to the reputation of Harvard and its research faculty, the publications sent other medical researchers down different paths and delayed accurate evaluation of the role sugar plays in heart disease.

The trade group solicited Hegsted, a professor of nutrition at Harvard’s public health school, to write a literature review aimed at countering early research linking sucrose to coronary heart disease. The group paid the equivalent of $48,000 in 2016 dollars to Hegsted and colleague Dr. Robert McGandy, though the researchers never publicly disclosed that funding source, Kearns found.

Hegsted and Stare tore apart studies that implicated sugar and concluded that there was only one dietary modification — changing fat and cholesterol intake — that could prevent coronary heart disease. Their reviews were published in 1967 in the New England Journal of Medicine, which back then did not require researchers to disclose conflicts of interest.

These and similar research reports led to the belief that fat, not sugar, was the culprit, and Americans went on a low- and no-fat binge. What was particularly pernicious about the hundreds of new products designed to meet the goal of lowering fat content was the food industry’s preferred method of making low-fat offerings taste good: the addition of sugar. Lots of sugar.

The health consequences of this dishonesty–however grave–are ultimately less troubling than the damage done to academic credibility.

We live in an era where significant numbers of people reject scientific facts that conflict with their preferred worldviews. News of academic corruption provides them with “evidence” that science is a scam and scholarship–especially scholarship that debunks their beliefs–is ideologically tainted.

Even the best, most rigorous research studies are only as good as the hypotheses tested and the methodologies employed. Some will inevitably prove to be flawed, no matter how honestly conducted. That’s unfortunate enough, but when industry can “buy” favorable results, it further undermines the credibility of all research results.

The discovery of the sugar industry’s role in twisting nutritional research results joins what we now know about the similar machinations of cigarette companies and fossil fuel industries.

In 2009, I wrote a book titled Distrust, American Style, examining the causes and effects of our mounting levels of social distrust. I wish I could say that time has made the book and its conclusions obsolete–but I can’t.

It’s understandable–but deeply disturbing–that so many Americans no longer trust science, business, government or each other. Without trust, social capital erodes, suspicion replaces collaboration, and societies disintegrate.


Good Without God

It has been an article of faith (pun intended) among politicians and pundits that Americans will not vote for non-religious candidates. President Eisenhower is famously credited with the sentiment that “Americans need religion, and I don’t care which religion it is,” nicely capturing the conviction of most Americans that only believers can be trusted to do the nation’s business.

Our preference for piety has led–among other things–to the ludicrous spectacle of thrice-married, biblically-ignorant Donald Trump courting Evangelicals and tweeting out “questions” about Hillary Clinton’s religious bona fides.

The public is evidently willing to overlook the history of religious warfare and the long list of injustices perpetrated in the name of religion–at least, when those wars have been waged and those injustices perpetrated by adherents of their own religion.

Americans who remain firmly convinced that religious belief is an unalloyed good will find a recent study reported by the L.A. Times disconcerting.

The article began by noting the growth of what have been called the “nones.”

The number of American children raised without religion has grown significantly since the 1950s, when fewer than 4% of Americans reported growing up in a nonreligious household, according to several recent national studies. That figure entered the double digits when a 2012 study showed that 11% of people born after 1970 said they had been raised in secular homes. This may help explain why 23% of adults in the U.S. claim to have no religion, and more than 30% of Americans between the ages of 18 and 29 say the same.

The obvious question raised by these statistics is the ultimate fate of the children raised by nonbelievers. Can they possibly turn out to be upstanding, moral citizens without experiencing prayers at mealtimes and morality lessons at Sunday school? Without being warned that God is watching them?

Evidently, they can.

Far from being dysfunctional, nihilistic and rudderless without the security and rectitude of religion, secular households provide a sound and solid foundation for children, according to Vern Bengtson, a USC professor of gerontology and sociology.

When Bengtson noticed the growth of nonreligious Americans becoming increasingly pronounced, he decided in 2013 to add secular families to his study in an attempt to understand how family life and intergenerational influences play out among the religionless.

He was surprised by what he found: High levels of family solidarity and emotional closeness between parents and nonreligious youth, and strong ethical standards and moral values that had been clearly articulated as they were imparted to the next generation.

“Many nonreligious parents were more coherent and passionate about their ethical principles than some of the ‘religious’ parents in our study,” Bengtson told me. “The vast majority appeared to live goal-filled lives characterized by moral direction and sense of life having a purpose.”

As the writer of the article noted, nonreligious family life has its own sustaining moral and ethical values, including “rational problem solving, personal autonomy, independence of thought, avoidance of corporal punishment, a spirit of ‘questioning everything’ and, far above all, empathy.”

The article concludes with a summary of the relevant social science research.


Don’t Confuse Me with the Facts!

Just how depressing have America’s policy debates become? To what extent have emotion and ideology replaced reliance on facts, evidence and data–and what are the consequences of our refusal to confront unpleasant realities?

Permit me to offer just two examples.

In Florida, as you have probably heard, state workers are not permitted to use the phrase “climate change.” As the Guardian wryly noted,

You might have missed it, but Florida has solved climate change. Our state, with 1,300 miles of coastline and a mean elevation of 100 feet, did not, however, limit greenhouse emissions. Instead, the state’s Department of Environmental Protection (DEP), under Republican governor Rick Scott, forbade employees from using terms like “climate change,” “global warming” or “sea-level rise”. They’re all gone now. You’re welcome, by the way.

It’s pointless to call linguistic distortions of reality like this Orwellian: people tune you out when you use that word and, besides, Big Brother at least had wit. These are just the foot-stamping insistent lies of intellectual toddlers on the grift. It is “nuh-uh” as public policy. This is an elected official saying, “I put a bag over your head, so that means now I’m invisible” and then going out looting.

It isn’t only Florida; Scott Walker’s Wisconsin has a similar rule.

North Carolina went them one better:

In North Carolina, the legislature passed a ruling after the state’s Coastal Resources Commission released an estimate predicting the sea will rise 39 inches along the state’s coast in a century, ABC News reported.

The estimation alarmed developers and seaside residents. If the state was to take action, it would cost hundreds of millions of dollars, said ABC. North Carolina would need to draw new flood zones, build waste-treatment plants and elevate roads, and several permits of planned development projects would be in jeopardy.

So the state’s legislature promptly addressed the problem–with a bill barring state agencies from relying on such projections; henceforth, sea-level rise “may be predicted based only on historical data.”

It isn’t only climate change. For a number of years, Congress has banned federal research by the CDC on gun violence–a ban it extended in the immediate aftermath of the Charleston church shooting that left 9 people dead.

The ban began with the 1996 Dickey Amendment, which barred the CDC from involvement in any research that could be interpreted as advocating tougher gun laws. Jay Dickey, a Republican Congressman from Pine Bluff, Arkansas, who was then a junior member of the House Appropriations Committee, authored a rider to a spending bill that also slashed $2.6 million from the CDC’s budget—the precise amount that the organization had dedicated to studying gun violence the year before.

Ever since, CDC studies on guns and public health have been virtually non-existent. Dickey later expressed regret over sponsoring the measure.

Every single day, 89 Americans die from gun violence, and yet we refuse to support research on the causes, effects and consequences of those deaths.

Representative David Price, vice chair of the House Gun Violence Prevention Task Force, recently argued that

“Regardless of where we stand in the debate over gun violence, we should all be able to agree that this debate should be informed by objective data and robust scientific research.”

Representative Price is wrong. There is nothing that ideologues and interest groups fear more than “objective data and robust scientific research.” Their most fervent hope is that public policy debates continue to be conducted in the absence of evidence. Their motto is: don’t confuse me with science or fact.

Problem is, as Neil deGrasse Tyson is fond of noting, science is true whether or not you believe in it. Facts exist whether we accept them or not.

Ignoring reality is ultimately unsustainable.


Penny Wise, Pound Foolish

A friend from Wisconsin sometimes sends me clippings from his local papers that he thinks I will find interesting. The most recent was a perfect example of so much that’s wrong with our policymaking today: it told of a researcher at the University of Wisconsin-Madison, Yoshihiro Kawaoka, who had been studying strains of the Ebola virus.

Just as his research was beginning to show promise–just as he and his team had created a potential vaccine–his funding was terminated. As the paper reported:

The experiments demonstrated the vaccine, when administered in two doses, is effective even against the most deadly Ebola strains. That’s when the money ran out…Kawaoka could not proceed with tests to determine whether the vaccine regimen might work with humans.

I’m sure that those parceling out the ever-shrinking resources available for such studies figured Ebola was too abstract an issue–that scarce funds needed to be directed to research with more immediate application.

Whoops….

Researchers in all areas, including but certainly not limited to the sciences, have raised concerns over dwindling government resources for research and development. This lack of concern for investing in our continued progress, like our disinclination to maintain and improve our basic infrastructure, signals a country in decline.

Who was it who said “The true meaning of life is to plant trees under whose shade you do not expect to sit”? Whether that is the meaning of individual lives may be up for debate–but concern for the long term and the willingness to invest in it absolutely must be a central precept for any nation that wishes to be–or remain–great.
