Newspapers

More on judges, politics, and ideology

Mark D. White

Adam Liptak has a "Sidebar" in today's New York Times titled "'Politicians in Robes'? Not Exactly, but..." discussing judges' voting records and the politics of the president who nominated them, citing data that finds a clear link and accusing judges of deciding cases based on "ideology." My comment is below:

Of course judges are ideological, but this does not necessarily translate into naked politics. Each judge has his or her own style of jurisprudence that may appeal more to presidents of one party or the other. A president will nominate judges with judicial philosophies that support his (or, someday, her) policy agenda. From the point of view of presidents, judges and their judicial philosophies are tools, but the judges themselves are acting on principle. There is little ground for reading a judge's record as political rather than principled simply because his or her decisions are often in favor of the party of the president who nominated him or her.

I have more to say on this theme in this earlier post (written before the Supreme Court decided the Obamacare case, obviously).


Not so sweet: Crossing the line from science to opinion in the sugar wars

Mark D. White

In a New York Times op-ed this morning, Daniel E. Lieberman, an evolutionary biologist at Harvard, makes a strong case for our evolved attraction to sugar, but a weak case for paternalistic action on the part of the government to limit our consumption of it.

The problem is captured by the question he poses after he lays out the scientific reasoning: "What should we do?" Professor Lieberman somehow makes the leap from "people are eating too much sugar" to "the government should do something to stop this" without appreciating the size of the chasm he's jumping. (Just ask David Hume.) He fails to explain why this is a problem that justifies government intrusion into the choices of individuals--he simply takes it for granted that because we have evolved to crave more sugar than he thinks is optimal, the government is entitled to adjust our sugar consumption to bring it in line with what he thinks is optimal.

This is yet another example of the most offensive aspect of paternalism: value substitution. We are lucky to have access to Professor Lieberman's scientific insights regarding why we crave sugar so much. But then he asserts his opinion that we eat too much sugar, based on his opinion regarding our optimal diets (based on our ancestors' nutritional nirvana long before Coca-Cola and Nabisco). This much is fine--everyone is entitled to his (or her) opinion, and he is fortunate that The New York Times gives him a platform to express it. But like all who endorse paternalism based on what they think we can do better, Professor Lieberman crosses a line when he shifts from expressing his opinion regarding our behavior to endorsing government action to adjust our behavior based on that opinion.

But isn't it common sense that we should eat less sugar (and salt, and trans fats) and more healthy foods? Of course. But is that our only concern? Is it our only interest? Should it be? Not only does value substitution misrepresent our true interests, but it also greatly oversimplifies them. I know I shouldn't eat too much sugar. But I also think a sugary treat or drink is a fine complement to a meal, or the cornerstone of a celebration, or a nice way to acknowledge a job well done. People's interests are complex, and while they do include health, they also include a myriad of other things, things that are ignored when a scientific expert or government regulator proclaims, "Too much! Too much!"

After making a reasonable case for limiting unhealthy foods in schools--a proposal I have very little problem with--Professor Lieberman proclaims that "adults need help too." This is the paternalistic mindset in a nutshell: you need help because we know better than you how you should run your life. Again, he makes a tremendous leap, from "you are being tempted by cheap sugar from the food industry" to "you need the government to step in and counter this influence." First, there is no way to know how much sugar consumption is due to "irresistible temptation" and how much is due to open-eyed choice. Paternalistic regulation--whether in the form of prohibition, taxes, or nudges--is a blunt tool that misses the nuances of human decision-making.

Second, this treats people as slaves to their passions--there's Hume again--which must be manipulated by the state to counter manipulation by industry. Professor Lieberman devotes just two sentences to spreading information, but decides that it hasn't done enough--"enough," of course, based on his opinion regarding what should have happened. But here's another possibility: people know how bad sugar is for them, and armed with that information plus all of their multifaceted interests, they nonetheless choose to eat more sugar than Professor Lieberman would like them to.

Professor Lieberman concludes with this: "We have evolved to need coercion." I hope he's not making that claim on the basis of his scientific expertise, because science cannot tell us what we need. Science can help explain why we do what we do--as Professor Lieberman ably details in the early part of his article--but it has nothing to contribute regarding what we need. Such a proclamation requires knowledge of our goals, interests, or "purpose"--the last a teleological notion that scientists normally disavow--none of which science or government knows better than people themselves do.


New York Times' Room for Debate: Wrong Question, Wrong Answers

Mark D. White

That was fast--in a "Room for Debate" feature that went online Saturday evening, the New York Times asked "What's the Best Way to Break Society's Bad Habits?" The contributors, predictably, take the question at face value and answer accordingly. But the question is nonsensical and the answers beside the point.

"Society" does not have bad habits--people do. And it is not for "society," or anyone in it, to decide whether a person's habits are bad, except that person himself or herself. Others are free to tell a person they think he or she has bad habits, to try to persuade or inform him or her about why these habits are bad, but only the person who has these habits can judge whether they are bad, based on his or her own interests.

So the question the Times poses is based on a false premise. A better question would be, what can we do as a society to help people conquer habits that they themselves judge are bad? Paternalism won't work, since it paints with too broad a brush, affecting everyone with a particular habit whether they think it's bad or not. The best way to help people break self-identified bad habits is to hold them responsible for their consequences.

But exactly the opposite is happening: we are moving away from individual responsibility and toward collective responsibility. This shows up most clearly in health care, where the more responsibility the government takes (or forces private insurers to take) for people's unhealthy behavior, without being able to charge more in premiums or deductibles to make up for it, the less incentive people have to moderate such behavior. If they were faced with even some of the costs of their behavior (as they would under a more flexible private health insurance system), people could make a fairly rational decision whether the cigarettes, or soda, or fatty foods, are worth the eventual cost. But now their personal costs are opaque, consisting of taxes or insurance premiums largely unrelated to their behavior.

And the all-too-predictable result of more collective responsibility for health care is more governmental control of behavior. Restrictions on unhealthy behavior are not just paternalistic anymore--they're now a public-cost problem. Cities and states are eager to cite rising Medicare costs as justification for their restrictions on smoking, trans fats, and other health risks. (Forget broccoli: academics today seriously endorse plans to mandate exercise.) But this is like the boy who shot his parents and then begged for mercy because he was an orphan; by claiming responsibility for health care costs, the government has created the crisis (or at least this particular part of it) that "justifies" restrictions on behavior.

Let people judge whether their own habits are good or bad, and let them take responsibility for the consequences of these decisions. That's the right answer to the right question.


Mayor Bloomberg nudges New Yorkers away from the Big Gulp--and towards two Little Gulps instead

Oh, Mayor Bloomberg--you make writing a book about libertarian paternalism and nudges too easy. (Thanks!) But seriously, you help show why it's important to write this book, that it's not just some pie-in-the-sky idea that lives only in the ivory tower, but one that affects the real world.

Yesterday The New York Times reported that New York City Mayor Michael Bloomberg, through his Board of Health, is planning to limit the sizes of sugary drinks--such as soda (other than diet), energy drinks, and sweetened coffee drinks--to 16 ounces. (One person on Twitter remarked that this is still 13 ounces more generous than the TSA.) The limit applies to prepackaged bottles of beverages sold in bodegas or delis (but not grocery stores or convenience stores), as well as to drinks poured by an employee or customer, such as fountain soda sold at fast-food restaurants, sporting events, and movie theaters.

According to the article,

The mayor, who said he occasionally drank a diet soda “on a hot day,” contested the idea that the plan would limit consumers’ choices, saying the option to buy more soda would always be available.

“Your argument, I guess, could be that it’s a little less convenient to have to carry two 16-ounce drinks to your seat in the movie theater rather than one 32 ounce,” Mr. Bloomberg said in a sarcastic tone. “I don’t think you can make the case that we’re taking things away.”

No, he's not taking away people's soda or limiting consumers' choices--people are free to buy more, smaller drinks or take advantage of free refills--but he is hoping to affect their choices, or he wouldn't be doing this in the first place. This element of cynical manipulation lies behind all nudges: the idea that regulators can leave your options substantively unchanged but still change your behavior for the better.

This leads to another offensive aspect of nudges: to change behavior without curtailing options, they rely on the same cognitive biases and dysfunctions that their proponents cite to justify imposing them. I assume that Bloomberg blames short-sightedness or lack of willpower for New Yorkers' heavy consumption of sugary drinks, but his plan will work only if people are too lazy, hurried, or absent-minded to consider other options for getting more soda. (His sarcasm about the inconvenience of buying two sodas is ironic, since that inconvenience is one of the things he's counting on to drive the success of his plan.)

What do I see coming from this? A lot of delis and bodegas working to reclassify themselves as grocery stores instead of "food service establishments" (a health department classification), and a lot more restaurants that serve fountain soda offering free refills or "buy one cup, get one free" deals. Consumers won't have to "seek out" ways to get their fix; businesses will be more than happy to provide them. Like most poorly crafted regulation, this ban on large sugary drinks will certainly shift some behavior--but in efforts to circumvent the ban, not to conform to it.

New Yorkers are smarter than you give them credit for, Mayor Bloomberg. Maybe it's all that sugar.


Let's Be More Productive

Mark D. White

In The New York Times over the weekend, Tim Jackson contributed a piece titled "Let's Be Less Productive," in which he decries the modern obsession with productivity gains while recognizing the role they have played in raising standards of living. He cites the necessarily stagnant productivity of the arts, services, and craft industries--which William Baumol noted years ago, terming it the "cost disease," because wages in those fields must remain competitive while productivity stays the same--but cautions against pushing for higher productivity throughout the economy because of its other detrimental effects, specifically on jobs if higher productivity is not accompanied by growth.

I have no problem with tempering the push for higher productivity, especially in areas in which it can hardly be expected. Productivity is a means to an end, and therefore it is valuable only insofar as it actually serves that end. But I think there is an end that higher productivity can serve which Jackson doesn't see: a less work-centered conception of the meaningful life. Instead, he sees higher productivity as a threat to full employment:

Ever-increasing productivity means that if our economies don’t continue to expand, we risk putting people out of work. If more is possible each passing year with each working hour, then either output has to increase or else there is less work to go around. Like it or not, we find ourselves hooked on growth.

On a certain level he's right; if we produce the same amount of output more efficiently, fewer resources will be required, including labor. For people who want to work, who need to work, this is of great concern, which makes this an important matter to discuss during these dire economic times.

But more generally, we should consider whether work is a means to an end or an end in itself. It's the former for almost everybody, of course, but the latter for only some. It's a cultural stereotype that Americans live to work while Europeans work to live, but it is based on a kernel of truth. Some people find their life's meaning primarily in work, but others find it more in other aspects of life, such as service, art, family, or love. Higher productivity may result in fewer jobs, yes, but insofar as some people find a job a burden--and have other means of support, such as a spouse or partner--higher productivity frees them to enjoy those other aspects of life.

There are other benefits to this aspect of higher productivity. It would relieve the modern necessity of the two-earner family, either allowing a two-parent family to live on one earner's income or a single-parent family to live more comfortably on one income. And higher productivity can also--if you're so inclined--finance a stronger welfare state, to support those who want to work but can't find a job and have no partner or other financial support. Even without growth, higher productivity enables a state to fund social welfare programs. (Just look at Sweden, where a fairly unrestrictive regulatory environment for business has led to productivity gains and growth that support its extensive welfare state.)

There is plenty of room to bemoan the single-minded focus on productivity espoused by many in business and government, and at the same time to recognize that the loss of jobs it creates (in the absence of corresponding growth) has some broader societal benefits, including lessening our reliance on our jobs and careers to give meaning to our lives and relaxing the economic burden on families. Work to live, indeed!


Does cognitive science relieve us of responsibility--or require us to redirect our effort?

Mark D. White

In this morning's New York Times, James Atlas discusses recent books about cognitive processes and neuroscience, such as Jonah Lehrer's Imagine: How Creativity Works, Charles Duhigg's The Power of Habit: Why We Do What We Do in Life and Business, and Leonard Mlodinow's Subliminal: How Your Unconscious Mind Rules Your Behavior. Atlas highlights several interesting things about this publishing trend, including the increasingly analytical focus on the "how" of thought rather than the more existential "why," and the shrinking space allowed for true agency to operate amid the ever-expanding, hidden wiring in the brain. The latter concern motivates the title of his piece, "The Amygdala Made Me Do It," as well as his characterization of this genre as "Can't-Help-Yourself" books.

Even though I'm an advocate of autonomy, willpower, and self-knowledge myself, I find little to be troubled by here, and quite a bit to be excited about. Freud, of course, posited that much of our thought happens under the surface, and this idea has been brought into modern experimental psychology and described in books such as Timothy D. Wilson's Strangers to Ourselves: Discovering the Adaptive Unconscious (which I highly recommend). And it is the title of Wilson's more recent book, Redirect: The Surprising New Science of Psychological Change, that suggests a positive interpretation of these developments, as described by Atlas:

The Power of Habit and Imagine belong to a genre that has become increasingly conspicuous over the last few years: the hortatory book, armed with highly sophisticated science, that demonstrates how we can achieve our ambitions despite our sensory cluelessness.

The discovery of increasing levels of complex scaffolding underneath conscious thought processes need not threaten intentional choice; rather, it can help to enable it. Citing David Hume, William James, and Daniel Kahneman (no Adam Smith, Jonathan!), Atlas focuses on the importance of habits to everyday action, and on how the conscious mind can redirect those habits for its own ends, such as countering a habit of watching TV with a healthier habit of exercising. Such a person makes an intentional and strategic choice by harnessing the power of habit; our habitual nature is thus transformed from a liability into an asset, from a weight into a tool.

The danger lies in letting the easy inertia of habit take over and forgetting that we have the responsibility to choose which habits to nurture and which to reject. The current picture of our brains casts each of us as the CEO of our minds rather than the entry-level employee or even middle manager: we have the ability to command and direct our cognitive resources, but we retain responsibility for what we do with them.


Much (More) Ado about Happiness

Mark D. White

In this morning's Wall Street Journal, James Bovard pokes a little fun at the US government's plans for measuring gross domestic happiness (of which Nicolas Sarkozy was a leading advocate), pointing to how well it currently measures the myriad economic statistics regarding things that aren't entirely subjective. Many economists take this very seriously, however; as it happens, I'm currently working on several projects that, to some extent, deal with this issue (as is Deirdre McCloskey, if I remember correctly). The literature is recent but already vast: a marvelous summary and critique can be found in Daniel Hausman's 2010 Economics and Philosophy article, "Hedonism and Welfare Economics."

My take, in a nutshell, is that measuring happiness is neither feasible nor desirable. It is not feasible because happiness is irreconcilably multifaceted (along several dimensions) and inescapably subjective. It is not desirable because any official focus on happiness violates ideals of liberal neutrality and personal autonomy regarding persons' individual pursuit of the "good life," and such measurement can only lead to excessive government manipulation (if not paternalism) toward that imposed end (as Bovard describes). Rather, institutions should be established and maintained to ensure that persons have the maximal capacity to make choices in pursuit of their own interests, consistent with all others doing the same. Only such a system can ensure respect for persons' own interests and the choices they make toward them.


Revise and resubmit... with style!

Mark D. White

This weekend's Wall Street Journal featured an article by Helen Sword entitled "Yes, Even Professors Can Write Stylishly," in which she criticized the quality of writing by most academics--and praised the exceptions, explaining some features that make good academic writing shine. (More can be found in her book, Stylish Academic Writing.) Having done my share of writing and editing for various audiences, as well as lots of reading and refereeing, I found that Sword's article got me thinking about the challenges of academic writing and the pursuit of style in it.

I would tend to think that scholars in the humanities (such as philosophy and law) have more latitude--and more responsibility--to write stylishly than those in the physical and social sciences. Scientific writing is often rigidly formatted (sometimes explicitly by journals): present the problem, explain the model, derive results (theoretically or empirically), and interpret the results. This should all be written well, of course, but I think style is of less concern when you're explaining a negative second derivative or a statistically significant coefficient.

I remember writing my early economics articles (in theoretical industrial organization), in which it seemed all I was doing in the middle 80% of the paper was bridging the gaps between equations. The only "real" writing came at the beginning and the end, the parentheses that held the "stuff"--but ironically, the parentheses were the only part most people would read, so I learned quickly that careful attention to them was crucial.

Writing in the humanities--a category in which I would include non-scientific economics--is less structured. This gives scholars more freedom to exercise their personal style, but at the same time provides less scaffolding under which to hide bad writing. Naturally, philosophers and legal scholars, who are trained in crafting arguments, often have the most polished prose, but many economists excel at this as well.

Personally, I find much more stylistic freedom writing books (or contributions for edited volumes) than journal articles. While I agree with Sword that journal editors (and referees) value clear writing as much as anybody, I don't know if I'd agree that they appreciate stylish writing. Maybe it's just me, but I feel constrained to write very formally when I write for a journal; based on what I read in journals, that seems to be what most journal editors expect. (I say "most" because I know some journal editors who are exceptional in this way.)

I tend to be a fairly good mimic when I write, so I've been able to adapt my writing style to whatever venue I'm writing for, whether it be a journal, newspaper op-ed, popular magazine, academic book, or popular trade book. Of those, the journal "style" is definitely my least favorite, and happily I'm at a stage in my career where I'm no longer dependent on journal publications for professional advancement. But while books may reach a wider audience (especially outside academia), there is a degree to which regular journal publications help keep your name in the thick of things in a particular academic community, so in that sense I miss writing for journals. (It's just not as enjoyable--and shouldn't writing be enjoyable?)

Mimicry does not always pay off, though. For instance, Deirdre McCloskey is my favorite academic writer as far as style is concerned. I adore her tone--playful and gracious, yet firm and forceful--and I have to be very careful not to indulge my inner mimic and churn out third-rate McCloskeyisms when I write!

I make no claims to any significant degree of craft or style in my writing, academic or popular. I am very grateful when friends or colleagues read my work and say they can "hear my voice" in it, and I am especially happy when they say this about my academic writing, in which it is more challenging to be myself. (My natural voice comes out much more easily in my chapters in the Blackwell Philosophy and Pop Culture books and my Psychology Today posts.) Whatever academic style I have is most apparent in Kantian Ethics and Economics: Autonomy, Dignity, and Character, especially in the introduction (available here). I'll just keep trying to improve and hone my style as I keep writing. (And writing. And writing. And...)


On Character (in The New York Times' The Stone)

Mark D. White

In this morning's The Stone column in The New York Times, UNC visiting professor Iskra Fileva offers "Character and Its Discontents," in which she writes eloquently on the nature of character in response to the situationist critiques of Gilbert Harman and John Doris. Her article doesn't lend itself well to quotes--it really must be read in full to be appreciated--but two points stood out to me.

  1. Even when we judge people to have behaved inconsistently with what we took to be their character traits, this may be the fault of our limited perception of their character rather than any inconsistency on their part. (She attributes this point to psychologist Gordon Allport.) This parallels my point against paternalism: the only knowledge regulators have of a person's interests is what they can infer from his or her choices or behavior, for which there can always be multiple explanations. By the same token, it is difficult to infer character traits from behavior with any confidence, and therefore it is difficult to make any judgments of inconsistency based upon them (just as external judgments of poor choices cannot be made simply from observations of previous ones).
  2. Unity of character is an aspirational goal, rather than something to be taken for granted. This reminds me of Kant's understanding of autonomy as a responsibility as well as a capacity: all of us have the potential to be autonomous, but we have to work at it constantly, exercising our strength of will, in order to maintain it. It is also consistent with what I wrote in Kantian Ethics and Economics (in chapter 3, based on the work of Christine Korsgaard and Ronald Dworkin) about constructing, expressing, and confirming our characters through the choices we make, which is a responsibility for personal integrity that we all have.

I also appreciated that she began the piece with a discussion of character in fiction, which matters for more pragmatic reasons: creators have a responsibility to their audience to maintain behavioral consistency in their characters. Characters can have complex motivations up to a point, and that complexity makes them more fascinating; but beyond that point, the characters themselves become unintelligible, as in the absurdist literature and theater Fileva mentions, or in poorly written traditional fiction (a frequent complaint of fans of "serial fiction" such as myself).