Tag Archives: research

American Exceptionalism

“American Exceptionalism” has meant different things at different times. Usually, however, the meanings ascribed to that phrase have been positive. Over at The World’s Most Dangerous Beauty Parlor, “El Jefe” has described a far less rosy aspect of our exceptionalism.

As a country, the US has 4.4% of the world’s population, yet we own 42% of the world’s guns. Let that sink in. Our homicide rate is more than three times the average of the rest of the OECD.

As he also points out, there are many ways in which the population of the U.S. is not exceptional.

  1. Do we have mental health problems?  Of course, but so does every other country.
  2. Do we sell violent video games?  Yes, but so does every other country.
  3. Do we have violent television shows and movies?  Yes, but so does every other country.
  4. Do we have a breakdown in the family unit?  Yes, but so does every other country.
  5. How about churches?  Are our churches shrinking?  Yes, but they are doing the same in other countries.

What we have that other countries don’t have–or at least, don’t have as much of–is guns. Lots and lots of guns.

After the Las Vegas mass shooting, Americans engaged in what has now become a ritual of hand-wringing and mutual recriminations. Critics of our lax gun regulations pointed out that large majorities of Americans (including a majority of NRA members) want to tighten those restrictions; defenders of the armament status quo insisted that widespread gun ownership equals “freedom.”

Although most of the commentary rehashed arguments we hear after every mass shooting–and we have a lot of mass shootings–I did learn something new, and it was both terrifying and encouraging. Half of the 265 million guns in the U.S. are owned by 3% of the population–and only 22% of us own any firearms.
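A quick back-of-envelope check shows just how concentrated that ownership is. The adult-population figure below is my assumption (roughly 250 million); the gun counts and percentages come from the survey itself:

```python
# Figures from the survey cited above; the adult-population
# number is an assumption, not from the post.
total_guns = 265_000_000
adults = 250_000_000           # assumed U.S. adult population
super_owner_share = 0.03       # the 3% who own half the guns

guns_held_by_top_3pct = total_guns / 2
super_owners = adults * super_owner_share
guns_per_super_owner = guns_held_by_top_3pct / super_owners
print(round(guns_per_super_owner, 1))  # roughly 17.7 guns apiece
```

In other words, under that population assumption, the average member of the 3% owns somewhere in the neighborhood of seventeen or eighteen guns.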

It’s encouraging to know that my non-armed household is in the majority; the news–and the high number of gun deaths–sometimes makes it seem as if every American old enough to lift a gun owns one.

What’s terrifying is the likelihood that (with the possible exception of people collecting historic muskets and powder-horns) the 3% who possess vast arsenals are scary dudes.

We don’t know nearly enough about gun owners or gun violence, because Congress refuses to allow the CDC or other agencies to fund research on the subject. But USA Today recently reported on a privately-financed survey of gun ownership.

Researchers found that the top reason people owned guns was for protection from other people, even though the rate of violent crime has dropped significantly over the past two decades, said Deborah Azrael, director of research at the Harvard Injury Control Research Center and one of the study’s authors.

Azrael said the study tried to update numbers and trends that hadn’t been reviewed in two decades. Separate reports on background checks and gun storage, based on the same survey, are scheduled to be released later this year.

“In a country where 35,000 people a year die by firearms, we haven’t been able to come out with a survey on gun violence for 20 years,” she said. “That’s a real failure of public health and public policy.”

The study also found that gun owners tend to be white, male, conservative, and residents of rural areas. Presumably–hopefully–that means that most of them are hunters, not crazed militia-men. On the other hand, a lot of America’s guns are handguns: the study found 111 million handguns nationwide, a 71% increase from the 65 million handguns in 1994.
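That 71% figure is easy to verify from the two handgun counts the study reports:

```python
# Handgun counts reported by the study cited above.
handguns_1994 = 65_000_000
handguns_now = 111_000_000

pct_increase = (handguns_now - handguns_1994) / handguns_1994 * 100
print(round(pct_increase))  # 71
```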

So long as we have Trump in the White House and a Congress wholly-owned by the NRA and the gun manufacturers, we are unlikely to impose the sorts of reasonable restrictions that other countries have found effective, and we’re equally unlikely to get the kind of research we need.

I’d really like to know more about that 3%…

Love of Money

Here’s a challenge: how many biblical phrases must an evangelical Christian ignore in order to justify supporting Donald Trump?

I know–you have a life, and you are too busy to compile them all.

My personal favorite is the admonition that “Love of money is the root of all evil.” (Note: it isn’t the money–it’s the love of money.) Next time your pious neighbor explains that Trump’s riches are evidence of his worthiness, you might ask him about 1 Timothy 6:10.

I thought about that verse when I read a recent column summarizing research on the moral effects of wealth. It was written by Charles Mathewes, a Professor of Religious Studies at the University of Virginia, and Evan Sandsmark, a PhD student in Religious Studies at the University, and it touched on several issues with which this blog has recently dealt.

The authors note that people with great wealth used to be viewed as morally suspect (“The idea that wealth is morally perilous has an impressive philosophical and religious pedigree.”) but that such attitudes have changed. (As I’ve previously noted, I attribute the change to Calvin…)

We seem to view wealth as simply good or neutral, and chalk up the failures of individual wealthy people to their own personal flaws, not their riches. Those who are rich, we seem to think, are not in any more moral danger than the rest of us.

Recent research suggests otherwise, however. As they explain:

The point is not necessarily that wealth is intrinsically and everywhere evil, but that it is dangerous — that it should be eyed with caution and suspicion, and definitely not pursued as an end in itself; that great riches pose great risks to their owners; and that societies are right to stigmatize the storing up of untold wealth.

After quoting historical figures like Aristotle and religious books (including Hindu texts and the Koran), they quote Pope Francis, who has waxed eloquent on the subject, and then segue to current social science research.

Over the past few years, a pile of studies from the behavioral sciences has appeared, and they all say, more or less, “Being rich is really bad for you.” Wealth, it turns out, leads to behavioral and psychological maladies. The rich act and think in misdirected ways.

When it comes to a broad range of vices, the rich outperform everybody else. They are much more likely than the rest of humanity to shoplift and cheat, for example, and they are more apt to be adulterers and to drink a great deal. They are even more likely to take candy that is meant for children. So whatever you think about the moral nastiness of the rich, take that, multiply it by the number of Mercedes and Lexuses that cut you off, and you’re still short of the mark. In fact, those Mercedes and Lexuses are more likely to cut you off than Hondas or Fords: Studies have shown that people who drive expensive cars are more prone to run stop signs and cut off other motorists.

The rich are the worst tax evaders, and, as The Washington Post has detailed, they are hiding vast sums from public scrutiny in secret overseas bank accounts.

They also give proportionally less to charity — not surprising, since they exhibit significantly less compassion and empathy toward suffering people. Studies also find that members of the upper class are worse than ordinary folks at “reading” people’s emotions and are far more likely to be disengaged from the people with whom they are interacting — instead absorbed in doodling, checking their phones or what have you. Some studies go even further, suggesting that rich people, especially stockbrokers and their ilk (such as venture capitalists, whom we once called “robber barons”), are more competitive, impulsive and reckless than medically diagnosed psychopaths. And by the way, those vices do not make them better entrepreneurs; they just have Mommy and Daddy’s bank accounts (in New York or the Cayman Islands) to fall back on when they fail.

The authors note studies suggesting that great material wealth actually makes people less willing to share.

All in all, not a pretty picture–although we should remember that statistics don’t necessarily describe individuals. (Not every rich guy is a Koch brother or a Donald Trump; there are the Warren Buffetts.) Nevertheless,

So the rich are more likely to be despicable characters. And, as is typically the case with the morally malformed, the first victims of the rich are the rich themselves. Because they often let money buy their happiness and value themselves for their wealth instead of anything meaningful, they are, by extension, more likely to allow other aspects of their lives to atrophy. They seem to have a hard time enjoying simple things, savoring the everyday experiences that make so much of life worthwhile. Because they have lower levels of empathy, they have fewer opportunities to practice acts of compassion — which studies suggest give people a great deal of pleasure. They tend to believe that people have different financial destinies because of who they essentially are, so they believe that they deserve their wealth, thus dampening their capacity for gratitude, a quality that has been shown to significantly enhance our sense of well-being. All of this seems to make the rich more susceptible to loneliness; they may be more prone to suicide, as well.

Given all this, I’m trying to work up my sympathies for our unhappy, morally-malformed President–but his sheer awfulness keeps getting in the way….


Studies Say…

I love this quotation (attributed to one Andrew Lang, who was born in 1844): “He uses statistics as a drunken man uses lamp-posts… for support rather than illumination.”

Actually, we all do that from time to time, and political psychologists tell us it is the mark of “confirmation bias”–the very human habit of cherry-picking available information in order to select that which confirms our preferred worldviews.

Because that is such a common behavior, and because we can easily find ourselves citing to “authorities” that are less than authoritative (and sometimes totally bogus), I’m going to bore you today by sharing information from a very useful tutorial on assessing the credibility of “studies,” as in “studies confirm that…” or “recent studies tell us that…”

Academics who have conducted peer reviews of journal submissions are well aware that many studies are fatally flawed, and should not be used as evidence for an argument or as confirmation of a theory. (If I were doing research on voter attitudes, and drew my sample–the population that I surveyed–from readers of this blog, my results would be worthless. While that might be an extreme case, many efforts at research fail because the methodology is inappropriate, the sample size is too small, the questions are posed in a confusing manner, etc.)

The tutorial suggests that journalists intending to cite to a study ask several pertinent questions before making a decision whether to rely upon the research:

The first question is whether the study has been peer-reviewed; in other words, has a group of scholars familiar with the field approved the methodology? This is not foolproof–professors can be wrong–but peer review is blind (the reviewers don’t know who conducted the study, and the authors don’t know who is reviewing it), and tends to be a good measure of reliability. If the study has been published by a well-regarded academic journal, it’s reasonably safe to assume that its conclusions are well-founded.

Other important inquiries include looking to see who funded the research in question.

 It’s important to know who sponsored the research and what role, if any, a sponsor played in the design of the study and its implementation or in decisions about how findings would be presented to the public. Authors of studies published in academic journals are required to disclose funding sources. Studies funded by organizations such as the National Science Foundation tend to be trustworthy because the funding process itself is subject to an exhaustive peer-review process.

The source of funding is especially relevant to the possibility that the authors have a conflict of interest. (Remember those “studies” exonerating tobacco from causing cancer? Surprise! They were paid for by the tobacco companies.)

Other important elements in the evaluation may include the age of the study, since, as the post noted, “In certain fields — for example, chemistry or public opinion — a study that is several years old may no longer be reliable.”

Sample size and the method used to select survey respondents are obviously important, and statistical conclusions should be presented in a way that allows readers to review their calculations. It’s also worth looking closely to see whether the study’s conclusions are actually supported by the reported data. As the post notes,

Good researchers are very cautious in describing their conclusions – because they want to convey exactly what they learned. Sometimes, however, researchers might exaggerate or minimize their findings or there will be a discrepancy between what an author claims to have found and what the data suggests.

In an information environment increasingly characterized by misleading claims, spin and outright propaganda, the ability to distinguish trustworthy research findings from those that are intellectually suspect or dishonest is fast becoming an essential skill.

A Spoonful of Sugar Makes the Dishonesty Go Down….

Evidently, you can’t even trust research from Harvard. At least, not all of it.

A number of media outlets have reported that in the 1960s,

prominent Harvard nutritionists published two reviews in a top medical journal downplaying the role of sugar in coronary heart disease. Newly unearthed documents reveal what they didn’t say: A sugar industry trade group initiated and paid for the studies, examined drafts, and laid out a clear objective to protect sugar’s reputation in the public eye.

The consequences of this deception are several, and they are all deeply disturbing.

First–and most obvious–is the misdirection of subsequent research and government efforts to improve heart health. Thanks largely to the reputation of Harvard and its research faculty, the publications sent other medical researchers down different paths, and retarded accurate evaluation of the role sugar plays in heart disease.

The trade group solicited Hegsted, a professor of nutrition at Harvard’s public health school, to write a literature review aimed at countering early research linking sucrose to coronary heart disease. The group paid the equivalent of $48,000 in 2016 dollars to Hegsted and colleague Dr. Robert McGandy, though the researchers never publicly disclosed that funding source, Kearns found.

Hegsted and Stare tore apart studies that implicated sugar and concluded that there was only one dietary modification — changing fat and cholesterol intake — that could prevent coronary heart disease. Their reviews were published in 1967 in the New England Journal of Medicine, which back then did not require researchers to disclose conflicts of interest.

These, and similar, research reports led to the belief that fat, not sugar, was the culprit, and Americans went on a low-and-no fat binge. What was particularly pernicious about the hundreds of new products designed to meet the goal of lowering fat content was the food industry’s preferred method of making low-fat offerings taste good: the addition of sugar. Lots of sugar.

The health consequences of this dishonesty–however grave–are ultimately less troubling than the damage done to academic credibility.

We live in an era where significant numbers of people reject scientific facts that conflict with their preferred worldviews. News of academic corruption provides them with “evidence” that science is a scam and scholarship–especially scholarship that debunks their beliefs– is ideologically tainted.

Even the best, most rigorous research studies are only as good as the hypotheses tested and the methodologies employed. Some will inevitably prove to be flawed, no matter how honestly conducted. That’s unfortunate enough, but when industry can “buy” favorable results, it further undermines the credibility of all research results.

The discovery of the sugar industry’s role in twisting nutritional research results joins what we now know about the similar machinations of cigarette companies and fossil fuel industries.

In 2009, I wrote a book titled Distrust, American Style, examining the causes and effects of our mounting levels of social distrust. I wish I could say that time has made the book and its conclusions obsolete–but I can’t.

It’s understandable–but deeply disturbing– that so many Americans no longer trust science, business, government or each other.  Without trust, social capital erodes, suspicion replaces collaboration, and societies disintegrate.

Good Without God

It has been an article of faith (pun intended) among politicians and pundits that Americans will not vote for non-religious candidates. President Eisenhower famously said that “Americans need religion, and I don’t care which religion it is,” nicely capturing the conviction of most Americans that only believers can be trusted to do the nation’s business.

Our preference for piety has led–among other things– to the ludicrous spectacle of thrice-married, biblically-ignorant Donald Trump courting Evangelicals and tweeting out “questions” about Hillary Clinton’s religious bona fides.

The public is evidently willing to overlook the history of religious warfare and the long list of injustices perpetrated in the name of religion–at least, when those wars have been waged and those injustices perpetrated by adherents of their own religion.

Americans who remain firmly convinced that religious belief is an unalloyed good will find a recent study reported by the L.A. Times disconcerting.

The article began by noting the growth of what have been called the “nones.”

The number of American children raised without religion has grown significantly since the 1950s, when fewer than 4% of Americans reported growing up in a nonreligious household, according to several recent national studies. That figure entered the double digits when a 2012 study showed that 11% of people born after 1970 said they had been raised in secular homes. This may help explain why 23% of adults in the U.S. claim to have no religion, and more than 30% of Americans between the ages of 18 and 29 say the same.

The obvious question raised by these statistics is the ultimate fate of the children raised by nonbelievers. Can they possibly turn out to be upstanding, moral citizens without experiencing prayers at mealtimes and morality lessons at Sunday school? Without being warned that God is watching them?

Evidently, they can.

Far from being dysfunctional, nihilistic and rudderless without the security and rectitude of religion, secular households provide a sound and solid foundation for children, according to Vern Bengston, a USC professor of gerontology and sociology.

When Bengston noticed that the growth of nonreligious Americans was becoming increasingly pronounced, he decided in 2013 to add secular families to his study in an attempt to understand how family life and intergenerational influences play out among the religionless.

He was surprised by what he found: High levels of family solidarity and emotional closeness between parents and nonreligious youth, and strong ethical standards and moral values that had been clearly articulated as they were imparted to the next generation.

“Many nonreligious parents were more coherent and passionate about their ethical principles than some of the ‘religious’ parents in our study,” Bengston told me. “The vast majority appeared to live goal-filled lives characterized by moral direction and sense of life having a purpose.”

As the writer of the article noted, nonreligious family life has its own sustaining moral and ethical values, including “rational problem solving, personal autonomy, independence of thought, avoidance of corporal punishment, a spirit of ‘questioning everything’ and, far above all, empathy.”

The article concludes with a summary of social science research: