On Friday, I delivered what the University calls “the Last Lecture.” The idea is that the scholar chosen to deliver the lecture shares lessons based upon his/her life experience and scholarship. In lieu of my usual “Sunday Sermon”–and with apologies for its length–here is the speech I delivered, which was titled: DEFENDING REASON IN AN UNREASONABLE TIME.
I’m immensely honored—and flattered—to be asked to give this talk, but I must tell you that the request threw me into a real panic. “Last Lecture” sounds so portentous—not to mention that it seems to be an invitation to share whatever I might know before I totter off to the old folks home or the grave. Those who know me know I rarely have trouble sharing my opinionated perspectives, but the task of “summing up,” of connecting the conclusions I’ve reached to the life experiences that led me to those conclusions, seemed—and still seems—overwhelming.
But here I am.
I think I have always been a “political” person, in the sense that the question that has always fascinated me is: how should people live together? What sort of social and political arrangements are most likely to nourish our humanity and promote—in Aristotle’s term—human flourishing? If the old African proverb is right, if it “takes a village to raise a child,” what should that village look like, and how should its inhabitants behave? How do we build that kind of village? Is the human community headed in the right direction, or are we on the wrong road? My conclusions have been shaped by my life experiences as much as by my scholarship, and for the last several years, some of them have been keeping me up at night.
Let me begin with an important caveat: unlike so many of you in this room, I am not a scholar in the traditional sense; in fact, I have been a lifelong dilettante. (I do prefer the term “generalist,” but as Popeye said, “I yam what I yam”…) I’ve done a lot of different things over the past 50+ years, and the result is that I know a little about a lot of things, but depth isn’t my strong suit. Over the years, however—probably as a defense mechanism—I’ve convinced myself that there is value in casting one’s intellectual net rather widely. In my case, at least, it has allowed me to connect some seemingly unconnected dots, even when my own mastery of the subjects involved is tenuous or superficial.
Let me get the biography out of the way. I was born in 1941, and I am very much a product of the 1950s, when women were expected to be decorative and submissive—or at the very least, quiet. You can see the problem.
I grew up Jewish in Anderson, Indiana, where being Jewish was at best exotic and at worst, Satanic, and where I was usually the only Jew my classmates had ever encountered. Those experiences undoubtedly deepened my interest in the nature of justice, the tendency to divide people between “we” and “they,” and the effects of marginalization. They also kindled an ongoing fascination with the ways in which religion shapes our worldviews and how it intersects with and influences civil law.
I left Anderson for college when I was 16. I wanted to major in liberal arts, but my father insisted that I get a teaching degree, because if my eventual husband died, I would need something to fall back on. (At the time, educated women were secretaries, teachers or nurses; I couldn't type and the sight of blood made me queasy. Ergo! I'd teach.) Because I was so young, my parents sent me to Stephens College for Women, a two-year school that took very seriously its obligation to act in loco parentis. After Stephens, I briefly attended the University of North Carolina, where the most indelible lesson I learned was that when you pay Full Professors $3,000 a year, you get what you pay for. (This was still the late 1950s, but even then, $3,000 wasn't much.) I transferred to IU Bloomington to finish my undergraduate degree, got married and divorced, and later did a semester at Butler, pursuing an MA in literature that I never finished.
I married a second time and took my first job (well, first if you don't count the summer I worked for my father's friend at his—no kidding—Cadillac-Rambler agency, where I was billed as Anderson's first female used car salesman). I began my adult work life as a high school English teacher. When I became pregnant with my first child, however, I could no longer teach. Even though I was married, in those days, once women teachers or librarians "showed," we could no longer be in the classroom. The theory evidently was that the kids would know what we'd been up to…
I went to law school when I was 30 and had three small children (four if you count the husband I had at that time). There were very few women in law school then, and my most important epiphany revolved around the need for potty parity… the few women's restrooms had been included as a grudging accommodation to the secretarial staff. After graduating law school, I was the first female lawyer hired at what was then Baker and Daniels. (To give you a flavor of the time—serial interviews with prospective associates were conducted by several of the partners, and I was in conversation with two who were being very careful not to ask improper questions—this was barely ten years after creation of the EEOC. Since I had three children, I thought it reasonable to volunteer my childcare arrangements. One of the partners was so obviously relieved that I wasn't some sort of radical feminist, he blurted out: "It isn't that there's anything wrong with being a woman. We hired a man with a glass eye once!")
I practiced corporate law at B & D for three years, until Bill Hudnut asked me to take charge of the City’s legal department. I was the first woman Corporation Counsel in Indianapolis–or, to the best of my knowledge, in any major metropolitan area. At the time, Indianapolis had two newspapers. The afternoon paper, the Indianapolis News, had a front-page “gossip” blurb, and I still recall its juicy little item after my appointment was announced: “What high-ranking city official appointed his most recent honey to a prominent position…” I guess it was inconceivable that I’d been appointed because I was a decent lawyer, or even because I represented a constituency Bill was reaching out to. I was a divorced female, Bill had a reputation, and that could only mean one thing.
I left City Hall to be the Republican candidate for Congress in 1980, running against Andy Jacobs, Jr., in what was then Indiana’s 11th Congressional district. That was back when Republicans were rational, and political campaigns less toxic. The worst thing I said about Andy was that he was a Democrat. My youngest son later served as his Congressional page, and after he retired, Andy and I would occasionally have lunch. As I say, things were different then….
I also remarried during that campaign and I’m happy to report that the third time was the charm—it’s been 35 years and counting.
After the election, I practiced law, started a Real Estate Development Company that went broke during the recession of the late 1980s, and served six years as the Executive Director of Indiana’s ACLU. (NUVO headline: ICLU Taken Over by Card-Carrying Republican!) I joined IUPUI’s faculty in 1998.
Like many of you, I’ve lived through the women’s movement, the Civil Rights movement, the 60s, the sexual revolution, the gay rights movement, the decades of religious zealotry that a friend calls “America’s most recent Great Awakening,” and a dizzying explosion of new technologies. As George Burns once said, I’m so old I remember when the air was clean and sex was dirty.
All of these experiences required me to think in different ways and from a variety of perspectives about the questions with which I began this talk: what is a just society? How do we mediate the tensions between individual rights and the common good? Who gets to decide what the common good is? Can government institutions ensure social order without doing unnecessary damage to individual autonomy? How?
When I first became politically active, at nineteen, it was as a Republican. I was persuaded—and remain persuaded—by what has been called the “libertarian principle,” the belief that the best society is one in which individuals are free to set and pursue our own life goals, determine our own telos, so long as we don’t harm the person or property of a non-consenting other, and so long as we are willing to grant an equal right to others. Back then, with some notable exceptions, the GOP understood the importance of “so long as” in those last two caveats. Times, obviously, have changed. The political party to which I belonged then no longer exists, except in name.
For those who begin with the libertarian principle as I just shared it, good faith political arguments tend to revolve around the nature and severity of the “harms” that government can legitimately prohibit or regulate, and the extent of government’s obligation to provide a physical and social infrastructure to be paid for through citizens’ “dues,” called taxes. Needless to say, we are not having those good faith arguments today—instead, we are in what may well be an existential struggle between science and reason on the one hand, and a variety of fundamentalisms—characterized by the rejection of reason, evidence and empirical knowledge—on the other.
My first book was What's a Nice Republican Girl Like Me Doing at the ACLU? I wrote it while I was still at the ACLU, and had begun to recognize the truly appalling extent to which the general public is ignorant of America's history, philosophy and constitutional system. My book was intended to be a sort of chatty introduction to those subjects. In it, I first articulated something I still call "the American Idea"—that this is a nation not based upon geography, ethnicity or conquest, but on a theory of social organization, a philosophy of governance that was meant to facilitate e pluribus unum—out of the many, one. The American Idea set up an enduring conversation about the proper balance between "I" and "we"—between individual rights and majoritarian passions.
At IUPUI, my first major research project was a three-year study of the Charitable Choice provisions of 1996 welfare reform, usually referred to as George W. Bush’s “faith-based initiative.” The Ford Foundation funded an investigation into the premises upon which that effort was based: the idea that faith-based institutions are better able to move people off welfare, the belief that there were “armies of compassion” waiting to rush in to help (and not so incidentally save the government money), and the belief that religious organizations had been unfairly excluded from governments’ contracting regimes.
All of those assumptions were wrong. Empirical research failed to substantiate the superiority of faith-based nonprofits—indeed, in our three-state study of job-training efforts, the secular organizations had better results—and because most faith-based organizations with the interest and capacity to work with government were already doing so (and had been welcomed, not excluded), the promised “armies of compassion” failed to materialize.
What I really learned from this research was that the Bush Administration hadn’t the foggiest notion how America’s social welfare system actually worked (or didn’t), and—far worse, from my perspective—had only the dimmest understanding of the First Amendment’s religion clauses. (The one thing the Bush Administration did extremely well was convince me that I was no longer a Republican.)
At the ACLU, I had recognized the extent of civic illiteracy among the general population; my Charitable Choice research introduced me to officials at the highest levels of government who viewed the Constitution and Bill of Rights as nuisances to be circumvented. That research also introduced me to the immense and underappreciated influence of religious worldviews on public policy formation.
Of the nine books I’ve written, the two that taught me the most—the ones that required the “deepest dives” into our philosophy of government and suggested some answers to Aristotle’s question—were God and Country: America in Red and Blue and my most recent textbook Talking Politics? What You Need to Know Before You Open Your Mouth.
It was while researching God and Country that I came across Frank Lambert's illuminating description of the difference between the Founding Fathers and those he calls the Planting Fathers. The Planting Fathers did come to the colonies for religious freedom, but they defined religious liberty as "freedom to do the right thing." They came to America to build a "Shining City on the Hill," and most of them believed that God not only wanted them to follow the "right way," but that He also wanted them to make sure their neighbors did too. Religious freedom meant that government would establish the "correct" religion.
One hundred and fifty years later, when George Washington swore to "faithfully execute the office of President of the United States," he undertook to "preserve, protect, and defend" a Constitution those Planters would have found incomprehensible. The new constitution was the product of the men we call the Founding Fathers, and it made no reference whatever to God or divine providence, citing as its sole authority "the people of the United States." In the intervening 150 years, the Enlightenment had changed the way we defined liberty.
It was an epiphany, because it provided a lens through which to understand so much of our current political environment. American politics remains a contest between the numerous Puritans still among us and those I call Modernists, both secular and religious. Policymaking has become a power struggle between Puritans who believe the role of government is to make the rest of us live “godly” lives, based upon their particular version of the good, and those of us who demand that government act upon what John Rawls called “public reasons,” based upon logical persuasion and scientific and empirical understandings. Contemporary Puritans remain deeply antagonistic to the Enlightenment and to secular ways of knowing—especially science—and they utterly reject the notion that each of us gets to define our own morality. Scroll down a Facebook page, or read the comments section of an online newspaper, and you’ll come across posts from fundamentalists of various stripes who wrap themselves in victimhood whenever government fails to impose their preferred worldviews on everyone else.
In God and Country, I wanted to “unpeel the onion,” to explore the ways in which ostensibly secular policy preferences are actually rooted in religious ways of seeing reality. In many of our current policy arguments, of course, the religious dimensions are obvious: think death penalty, abortion and same-sex marriage. But there are also debates about presumably secular policies in which unrecognized religious perspectives rather than data and evidence are really driving the disputes. Take economic issues—particularly those involving poverty and inequality, issues where the influence of America’s early Calvinism remains strong.
Mitt Romney may be Mormon, but his disdain for the 47% was grounded in a culture with a deeply entrenched, if bastardized, version of Calvinism; a belief that God smiles upon the “elect,” and the poor are poor because they are morally defective. Accusations that poor folks lack “middle class values” are a modern, sanitized version of that theologically-rooted conviction.
In God and Country I explored the religious roots of policy preferences about the economy, the environment, foreign policy and criminal justice—and I learned a lot about different versions of Christianity in the process. But what I really came to understand was the importance of paradigms, the worldviews that shape our perceptions of reality. Even those of us who consider ourselves entirely secular, who have no doctrinal religious beliefs, hold views that are rooted in early, religious ways of understanding the world around us.
Paradigms are really where biography matters. My views of reality and human obligation were shaped by my own religious background and culture. On my office wall is a cross-stitched paraphrase of a Talmudic injunction to the effect that, while God doesn’t expect us to perfect the world in our generation, we aren’t free not to try. I was raised in a congregation that chanted “Justice, Justice shalt thou pursue.” I am not religious but I remain a product of a very specific culture. As we all are.
And that brings us back to the question that has consumed me for most of my 73 years: what sort of social order, what kind of legal system, is most likely to protect both individual liberty and the common good?
Given my biography and experiences, it shouldn't be surprising that I think Enlightenment philosophers like Montesquieu and Locke, and Founders like Madison and Jefferson, got the big parts right. Not that our Constitution and jurisprudence don't have some pretty substantial defects, but the basic balance between individual autonomy and government power, the constraints upon the use and misuse of that power, are essential to a society that facilitates human flourishing, and to a village that nurtures its children. The problem is, few of our citizens know anything about that balance or those constraints.
As many of you know, I founded and direct a Center for Civic Literacy here at IUPUI, because we desperately need to reinvigorate civic education. When a polity is very diverse, as in the United States, it is critically important that citizens know at least the basic outlines of the country’s history, philosophy and governing architecture; in the absence of other ties—race, religion, national origin—a common understanding of, and devotion to, constitutional principles is critical to the formation of national identity and commitment to the common good.
Such devotion is a far cry from the faux patriotism so often displayed by our least self-aware policymakers. Someone needs to explain to Rudy Giuliani that genuine patriotism has to be based on an understanding of the nation’s history and institutions—the good, the bad and the ugly—if it is to enable, rather than impede, deliberative discourse and democratic citizenship.
It won’t surprise anyone here that I see education, the search for truth, respect for evidence, the willingness to examine and re-examine one’s most basic beliefs, as the essential key to human progress. (I tell my students that—whatever else they learn, I will consider my classes successful if they leave using two phrases far more frequently than they did before they enrolled: it depends, and it’s more complicated than that.) The ability to live with ambiguity and complexity, the ability to internalize Learned Hand’s observation that “the spirit of liberty is the spirit that is not too certain it is right” is at the heart of the human enterprise.
And that pursuit is increasingly under attack. Recently, Wisconsin Governor Scott Walker proposed cutting $300 million from the University of Wisconsin’s budget. Even more appalling, he wanted the university’s mission changed from “Basic to every purpose of the system is the search for truth” to “meeting the state’s workforce needs.”
As the New York Times wrote in a scathing editorial, “It was as if a trade school agenda were substituted for the idea of a university.”
More recently, legislators in Oklahoma have proposed to eliminate AP History classes in that state, because they teach students “negative things” about America. In other words, the classes educate rather than indoctrinate.
Scott Walker and those Oklahoma legislators are emblematic of the anti-intellectualism and the assault on reason and evidence that has come to characterize the American Right. These shallow and ambitious politicians believe that education and job training are synonymous, that scholarly research and the "search for truth" are elitist non-essentials, and that humans don't need to know anything that doesn't either promote an unreflective nationalism or lead directly to gainful employment. They'd have handed Socrates that cup of hemlock in a heartbeat.
My husband says I've been in a bad mood since 2000, and he's right. The problem, as I see it, is twofold: first, Americans increasingly inhabit alternate realities; second, we no longer understand ourselves to be bound by a social contract.
In Talking Politics, I argued for the need to distinguish between facts that have been documented and agreed to by responsible people of all ideological perspectives, and the different conclusions and interpretations that may legitimately be drawn from those facts. To use an analogy from the courtroom, two sides to a conflict may “stipulate” to what happened, but then proceed to argue in good faith about what those agreed-to stipulations really tell us. That’s the way our political discourse should work. But increasingly, as we all know, it doesn’t.
We have lawmakers who reject evolution and the massive evidence of global climate change, not because they can marshal evidence—persuasive or otherwise—but because they choose not to believe it. There is mounting concern over what scholars call “motivated reasoning”—the tendency to cherry-pick evidence and to see only those things that support one’s preferred beliefs. Our ability to construct our own realities has enormously increased with the precipitous decline of what has been called the journalism of verification, and the advent of what Eli Pariser calls “the filter bubble,” the sophisticated algorithms that filter the information we get online and thus allow us to inhabit a world that reinforces our pre-existing biases and beliefs.
The problem is, as Neil deGrasse Tyson recently put it, reality doesn't care whether you believe in it or not. We may well have passed an environmental tipping point; even if we haven't, the world my grandchildren will inherit will be less conducive to human habitation than the world I've inhabited. I have argued that we should view environmental policy through the lens of Pascal's wager—if we act to mitigate environmental harms, and it turns out we've been wrong about climate change, the only damage done will be to the bottom line of the fossil fuel industry (which can well afford it), and the rest of us will enjoy cleaner air and water. If, however, we ignore 98% of climate scientists, and it turns out they were right, we risk making the planet uninhabitable. That seems like a no-brainer to me, but apparently you can't reason people out of positions they didn't reason themselves into. So much for science and reason.
In the realities occupied by our contemporary social Darwinists and historical revisionists, there is no concept of a Social Contract, and no appreciation for the importance of the physical and social infrastructure required by our village. Too many of our lawmakers and pundits seem oblivious to the multitude of systems—both physical and social— that our village needs in order to function, let alone flourish. When those largely invisible, taken-for-granted networks of support don’t work—or when they have been corrupted or co-opted so that they only work for some groups and individuals—a society fails to function as it should.
Folks in that alternate reality object reflexively to any and all social welfare proposals, seeing them as contrary to the principles that animated the Constitution and Bill of Rights. They need to re-read Locke, and revisit—or visit—social contract theory. That theory tells us that government’s job is to promote both individual freedom and the common good. Despite the increasingly hysterical rhetoric from the Right, the two are not mutually exclusive. In fact, they are interdependent.
Without a social welfare infrastructure, even the most privileged individuals can't flourish for long. Arguments about how a social safety net should be structured, and about the proper limits to place on government largesse, are entirely reasonable and important, but those aren't the arguments we are having. Instead, we are hearing from people who believe they are entitled to the benefits of the physical and social infrastructure we all pay for—but disfavored "others" are not. I have been absolutely astonished by the vituperative and inhumane opposition to the Affordable Care Act (aka "Obamacare"). Opponents aren't arguing for alternative programs—they've made it quite clear that they simply don't want their taxes used to provide health insurance to "unworthy" poor people.
Another example is the current effort—in Indiana and elsewhere—to exempt so-called “bible-believing Christians” from compliance with otherwise applicable civil rights laws. In our system, religious citizens have absolute liberty to believe whatever they want—that’s the individual rights pole of the continuum. But religious or political beliefs, no matter how sincere, don’t entitle people to sacrifice newborns or blow up abortion clinics, and they don’t entitle them to engage in behavior that is contrary to America’s cultural and legal commitment to civic equality. That’s the public good end of the continuum. There is no religious privilege to behave in ways that we collectively deem destructive to our social health.
Let me just share a final few observations:
Social justice is a term we don’t hear very often these days. Social justice is aspirational, and its elements are subject to debate, but at its heart, the concept is concerned with mutual obligation and the common good. In its broadest outlines, a just society is one that meets the basic human needs of its members, without regard to their identities or social status—a society that does not draw invidious distinctions between male and female, black and white, gay and straight, religious and atheist, Republican and Democrat, or any of the other categories into which we like to sort our fellow humans. It is a society that recognizes and respects the inherent dignity and value of each person.
We should want to make our village more just for many reasons, practical as well as moral: for one thing, a more equitable society is in the long-term best interests of even those people who don’t feel any obligation to feed hungry children or find jobs for ex-offenders or make health care accessible to poor people. That’s because in order to remain competitive in the global economy, America needs to make use of all its talent. Social injustices that prevent people from contributing their talents cost all of us in lost opportunities and unrealized promise.
It’s obvious that many Americans don’t much care about wasted resources, but the second argument should be compelling even to the “I’ve got mine and that’s all that counts” crowd. Democracies require stability in order to survive. In countries where there are great gaps between the rich and poor, in countries where some groups of people go through their lives being marginalized or despised while others enjoy privileges and respect, in countries where some people are exploited while others benefit—that stability is hard to come by. A wealthy friend of mine once put it this way: “I’d rather pay more in taxes than spend my days worrying about angry mobs rioting in the streets or desperate people kidnapping my children.”
Even in highly individualistic America, no one succeeds solely by his own efforts. That social and physical infrastructure I’ve been harping on supports and enables entrepreneurship and wealth creation, and we taxpayers have built and maintained that infrastructure. And that’s fine. That’s what it’s for. We all benefit when someone builds a better mousetrap, or improves on the other guy’s widget. But when that entrepreneur profits from his better mousetrap, we who supplied that infrastructure have a right to a portion of his profits in the form of taxes. We also have a moral obligation to those for whom the existing infrastructure just isn’t sufficient or accessible.
There’s a credit card commercial that says “Membership has its privileges.” Membership in society should have its privileges as well. That’s not an argument for massive welfare programs or redistribution of wealth. It is an argument for fundamental fairness, an argument that recognizes that we all benefit when inclusive social structures operate in the interests of all of our members.
From time to time, greed and fear obscure the reality of human interdependence. Unfortunately, we seem to be living in one of those times–an era characterized by an intentional refusal to recognize that there is such a thing as the common good, and a willful ignorance of the importance of mutual social obligation.
Addressing that willful ignorance is what social justice requires, but that is easier said than done.
I’m painfully aware that cultural institutions, folkways and intellectual paradigms influence people far more than logic and reason, and that culture is incredibly difficult to change. Structural barriers and ingrained privilege don’t disappear without significant upheavals or outright revolutions.
We may be approaching such a period of upheaval, not unlike the Sixties. When I look around, I see a depressing revival of tribalism, and widespread expressions of a racism I thought we'd moved beyond. The election of an African-American President was a sign of progress, but it clearly lifted a rock—and what crawled out is unbelievably ugly and destructive. The growth in inequality threatens to exceed the inequities of the Gilded Age, if it hasn't already, and it is hard to argue with those who look around and see not a republic, not a democracy, but an oligarchy.
When I look at America’s politics, I’m reminded of a 1999 movie called “The Sixth Sense.” The young boy in that movie saw dead people; I see crazy people. I know that isn’t politically correct, but how else would you characterize some of the voices dominating our public discourse? How else explain the “birthers” and conspiracy theorists, the “Faux News” pundits and the websites peddling nativism, paranoia and propaganda? In what universe is Sarah Palin a potential Vice-President, or Roy Moore a state Supreme Court Justice or James Inhofe Chair of the Senate Committee on the Environment? On what planet do people pay attention to buffoons like Donald Trump or Ted Cruz or Louie Gohmert?
If I had to guess why so many of our fellow-citizens appear to have gone off the deep end—why they are trying to stockpile guns, roll back women’s rights, put gays back in the closet, stigmatize African-Americans and stereotype Muslims—I think the answer is fear. Change is creating a very different world from the one most of us grew up in, and the pace of that change continues to accelerate. As a result, we have a lot of bewildered and disoriented people who find themselves in an increasingly ambiguous world; they are frantic for bright lines, clear rules, simple answers to complicated issues, and especially, for someone to blame. People who are confounded by new realities, and especially those who are unhappy or dissatisfied with their lives, evidently need to attribute their problems and disappointments to some nefarious “other.” So the old racist and sexist and homophobic tropes get trotted out.
Unfortunately, the desire for a world where moral and policy choices are clear and simple is at odds with the messy reality of life in our global village, and the more these fearful folks are forced to confront that messy reality, the more frantically they cling to their ideological or theological touchstones.
It may be that this phenomenon is nothing new, that there aren’t really more crazy people than before. Maybe, thanks to the Internet and social media, we are just more aware of them. I hope that’s true, but I don’t know–I only know that a scroll through Facebook elevates my blood pressure.
At the end of the day, what will prevent us from fashioning a social order that promotes and enables human flourishing is continuation of this retreat into anti-intellectualism, bigotry and various kinds of fundamentalism. We villagers only become fully human—we only flourish—through constant learning, by opening ourselves to new perspectives, by reaching out and learning from those who are different.
I do see some welcome signs that the fever is abating, at least in the United States and at least among younger Americans. I would turn this country over to my students in a heartbeat: they may not be the best-informed generation, but they are inclusive and intellectually curious, and they care deeply about the planet and about their communities. For my grandchildren’s sake, I hope they can salvage this “village” we call Earth from the mess my generation is leaving them—and despite the fact that this has been my “Last Lecture,” I hope I hang around long enough to see if they succeed.