I Don’t Care How Good His Paintings Are, He Still Belongs In Prison

George W. Bush committed an international crime that killed hundreds of thousands of people.

Critics from the New Yorker and the New York Times agree: George W. Bush may have been an inept head of state, but he is a more than capable artist. In his review of Bush’s new book Portraits of Courage: A Commander in Chief’s Tribute to America’s Warriors (Crown, $35.00), New Yorker art critic Peter Schjeldahl says Bush’s paintings are of “astonishingly high” quality, and his “honestly observed” portraits of wounded veterans are “surprisingly likable.” Jonathan Alter, in a review titled “Bush Nostalgia Is Overrated, but His Book of Paintings Is Not,” agrees: Bush is “an evocative and surprisingly adept artist.” Alter says that while he used to think the Iraq War was “the right war with the wrong commander in chief,” he now thinks that it was the “wrong war” but with “the right commander in chief, at least for the noble if narrow purpose of creatively honoring veterans through art.”

Alter and Schjeldahl have roughly the same take on Bush: he is a decent person who made some dreadful mistakes. Schjeldahl says that while Bush “made, or haplessly fronted for, some execrable decisions…hating him took conscious effort.” Alter says that while the Iraq War was a “colossal error” and Bush “has little to show for his dream of democratizing the Middle East,” there is a certain appeal to Bush’s “charming family, warm relationship with the Obamas, and welcome defense of the press,” and his paintings of veterans constitute a “message of love” and a “step toward bridging the civilian-military divide.” Alter and Schjeldahl both see the new book as a form of atonement. Schjeldahl says that with his “never-doubted sincerity and humility,” Bush “obliviously made murderous errors [and] now obliviously atones for them.” Alter says that Bush is “doing penance,” and that the book testifies to “our genuine, bipartisan determination to do it better this time—to support healing in all of its forms.”

This view of Bush as a “likable and sincere man who blundered catastrophically” seems to be increasingly popular among some American liberals. They are horrified by Donald Trump, and Bush is beginning to seem vastly preferable by comparison. If we must have Republicans, let them be Bushes, since Bush at least seems good at heart while Trump is a sexual predator. Jonathan Alter insists he is not becoming nostalgic, but his gauzy tributes to Bush’s “love” and “warmth” fully endorse the idea of Bush’s essential goodness. Now that Bush spends his time painting puppies and soldiers, having mishaps with ponchos and joking about it on Ellen, more and more people may be tempted to wonder how anyone could ever have hated the guy.

Nostalgia takes root easily, because history is easy to forget. But in Bush’s case, the history is easily accessible and extremely well-documented. George W. Bush did not make a simple miscalculation or error. He deliberately perpetrated a war crime, intentionally misleading the public in order to do so, and showed callous indifference to the suffering that would obviously result. His government oversaw a regime of brutal torture and indefinite detention, violating every conceivable standard for the humane treatment of prisoners. And far from trying to “atone,” Bush has consistently misrepresented history, reacting angrily and defensively to those who confront him with the truth. In a just world, he would be painting from a prison cell. And by imputing to Bush a repentance and sensitivity he does not actually possess, Alter and Schjeldahl fabricate history and erase the suffering of Bush’s victims.

First, it’s important to be clear about what Bush actually did. There is a key number missing from both Alter and Schjeldahl’s reviews: 500,000, the estimated number of Iraqi civilians who perished as a result of the U.S. war there. (That’s a conservative estimate, and it stops counting in 2011.) Nearly 200,000 are confirmed to have died violently, blown to pieces by coalition air strikes or suicide bombers, shot by soldiers or insurgents. Others died as a result of the disappearance of medical care, with doctors fleeing the country by the score as their colleagues were killed or abducted. Child and infant mortality shot up, as did malnutrition and starvation, and toxins introduced by American bombardment led to “congenital malformations, sterility, and infertility.” There was mass displacement, by the millions. An entire “generation of orphans” was created, with hundreds of thousands of children losing parents and wandering the streets homeless. The country’s core infrastructure collapsed, and centuries-old cultural institutions were destroyed, with libraries and museums looted, and the university system “decimated” as professors were assassinated. For years and years, suicide bombings were a regular feature of life in Baghdad, and for every violent death, scores more people were left injured or traumatized for life. (Yet in the entire country, there were fewer than 200 social workers and psychiatrists combined to tend to people’s psychological issues.) Parts of the country became a hell on earth; in 2007 the Red Cross said that there were “mothers appealing for someone to pick up the bodies on the street so their children will be spared the horror of looking at them on their way to school.” The amount of death, misery, suffering, and trauma is almost inconceivable.

These were the human consequences of the Iraq War for the country’s population. They generally go unmentioned in the sympathetic reviews of George W. Bush’s artwork. Perhaps that’s because, if we dwell on them, it becomes somewhat harder to appreciate Bush’s impressive use of line, color, and shape. If you begin to think about Iraq as a physical place full of actual people, many of whom have watched their children die in front of them, Bush’s art begins to seem ghoulish and perverse rather than sensitive and accomplished. There is a reason Schjeldahl and Alter do not spend even a moment discussing the war’s consequences for Iraqis. Doing so requires taking stock of an unimaginable series of horrors, one that makes Bush’s colorful brushwork and daytime-TV bantering seem more sickening than endearing.

But perhaps, we might say, it is unfair to linger on the subject of the war’s human toll. All war, after all, is hell. We must base our judgment of Bush’s character not on the ultimate consequences of his decisions, but on the nature of the decisions themselves. After all, Schjeldahl and Alter do not deny that the Iraq War was calamitous, with Alter calling it one of “the greatest disasters in American history,” a “historic folly” with “horrific consequences,” and Schjeldahl using that curious phrase “murderous errors.” It’s true that both obscure reality by using vague descriptors like “disaster” rather than acknowledging what the invasion meant for the people on whom it was inflicted. But their point is that Bush meant well, even though he may have accidentally ended up causing the birth of ISIS and plunging the people of Iraq into an unending nightmare.


Viewing Bush as inept rather than malicious means rejecting the view that he “lied us into war.” On Jonathan Alter’s account, it is not that Bush told the American people that Iraq had weapons of mass destruction when he knew that it did not. Rather, Bush misjudged the situation, relying too hastily and carelessly on poor intelligence, and planned the war incompetently. The war was a “folly,” a bad idea poorly executed, but not an intentional act of deceit or criminality.

This view is persuasive because it’s partially correct. Bush did not “lie that there were weapons of mass destruction,” and it’s unfortunate that anti-war activists have often suggested that he did. Bush claims, quite plausibly, that he believed Iraq possessed WMDs, and there is no evidence to suggest that he didn’t believe this. That supports the “mistake” view: a lie is an intentional false statement, and if Bush believed he was making a true statement, he was mistaken rather than lying.

But the debate over whether Bush lied about WMDs misstates what the actual lie was. It was not when Bush said “the Iraq regime continues to possess and conceal some of the most lethal weapons ever devised” that he lied to the American people. Rather, it was when he said Iraq posed a “threat” and that by invading it the United States was “assuring its own national security.” Bush could not have reasonably believed that the creaking, isolated Saddam regime posed the kind of threat to the United States that he said it did. WMDs or not, there was nothing credible to suggest this. He therefore lied to the American people, insisting that they were under a threat that they were not actually under. He did so in order to create a pretext for a war he had long been intent on waging.

This is not to say that Bush’s insistence that Saddam Hussein had WMDs was sincere. It may or may not have been. The point is not that Bush knew there weren’t WMDs in Iraq, but that he didn’t care whether there were or not. This is the difference between a lie and bullshit: a lie is saying something you know to be untrue, bullshit is saying something without caring to find out if it’s true. The former highest-ranking CIA officer in Europe told 60 Minutes that the Bush White House intentionally ignored evidence contradicting the idea that Saddam had WMDs. According to the officer, when intelligence was provided that contradicted the WMD story, the White House told the officer that “this isn’t about intel anymore. This is about regime change,” from which he concluded that “the war in Iraq was coming and they were looking for intelligence to fit into the policy.” It’s not, then, that Bush knew there were no WMDs. It’s that he kept himself from finding out whether there were WMDs, because he was determined to go to war.

The idea that Saddam posed a threat to the United States was laughable from the start. The WMDs that he supposedly possessed were not nuclear weapons, but chemical and biological ones. WMD is a catch-all category, but the distinction is important; mustard gas is horrific, but it is not a “suitcase nuke.” Bashar al-Assad, for example, possesses chemical weapons, but does not pose a threat to the U.S. mainland. (To Syrians, yes. To New Yorkers, no.) In fact, according to former Saddam aide Tariq Aziz, “Saddam did not consider the United States a natural adversary, as he did Iran and Israel, and he hoped that Iraq might again enjoy improved relations with the United States.” Furthermore, by the time of the U.S. invasion, Saddam “had turned over the day-to-day running of the Iraqi government to his aides and was spending most of his time writing a novel.” There was no credible reason to believe, even if Saddam possessed certain categories of weapons prohibited by international treaty, that he was an active threat to the people of the United States. Bush’s pre-war speeches used terrifying rhetoric to leap from the premise that Saddam was a monstrous dictator to the conclusion that Americans needed to be scared. That was simple deceit.

In fact, Bush had long been committed to removing Saddam, and was searching for a plausible justification. Just “hours after the 9/11 attacks,” Donald Rumsfeld and the Vice Chairman of the Joint Chiefs of Staff were pondering whether they could “hit Saddam at the same time” as Osama bin Laden as part of a strategy to “move swiftly, go massive.” In November of 2001, Rumsfeld and Tommy Franks began plotting the “decapitation” of the Iraqi government, pondering various pretexts for “how [to] start” the war. Possibilities included “US discovers Saddam connection to Sept. 11 attack or to anthrax attacks?” and “Dispute over WMD inspections?” Worried that they wouldn’t find any hard evidence against Saddam, Bush even thought of painting a reconnaissance aircraft in U.N. colors and flying it over Iraqi airspace, goading Saddam into shooting it down and thereby justifying a war. Bush “made it clear” to Tony Blair that “the U.S. intended to invade… even if UN inspectors found no evidence of a banned Iraqi weapons program.”

Thus Bush’s lie was not that there were weapons of mass destruction. The lie was that the war was about weapons of mass destruction. The war was about removing Saddam Hussein from power, and asserting American dominance in the Middle East and the world. Yes, that was partially to do with oil (“People say we’re not fighting for oil. Of course we are… We’re not there for figs,” said former Defense Secretary Chuck Hagel, while Bush CENTCOM commander John Abizaid admitted, “Of course it’s about oil, we can’t really deny that”). But the key point is that Bush detested Saddam and was determined to show he could get rid of him; according to those who attended National Security Council meetings, the administration wanted to “make an example of Hussein” to teach a lesson to those who would “flout the authority of the United States.” “Regime change” was the goal from the start, with “weapons of mass destruction” and “bringing democracy” just convenient pieces of rhetoric.

Nor was the war about the well-being of the people of Iraq. Jonathan Alter says that Bush had a “dream of democratizing the Middle East” but simply botched it; Bush’s story is almost that of a romantic utopian and tragic hero, undone by his hubris in just wanting to share democracy too much. In reality, the Bush White House showed zero interest in the welfare of Iraqis. Bush had been warned that invading the country would lead to a bloodbath; he ignored the warning, because he didn’t care. The typical line is that the occupation was “mishandled,” but this implies that Bush tried to handle it well. In fact, as Patrick Cockburn’s The Occupation and Rajiv Chandrasekaran’s Imperial Life in The Emerald City show, American officials were proudly ignorant of the Iraqi people’s needs and desires. Decisions were made in accordance with U.S. domestic political considerations rather than concern for the safety and prosperity of Iraq. Bush appointed totally inexperienced Republican Party ideologues to oversee the rebuilding effort, rather than actual experts, because the administration was more committed to maintaining neoconservative orthodoxies than actually trying to figure out how to keep the country from self-destructing. When Bush gave Paul Bremer his criteria for who should be the next Iraqi leader, he was emphatic that he wanted someone who would “stand up and thank the American people for their sacrifice in liberating Iraq.”

As the situation in Iraq deteriorated into exactly the kind of sectarian violence the White House had been warned about, the Bush administration tried to hide the scale of the disaster. Patrick Cockburn reported that while Bush told Congress that fourteen out of eighteen Iraqi provinces “are completely safe,” this was “entirely untrue,” and anyone who had gone to these provinces to try to prove it would have immediately been kidnapped or killed. In tallies of body counts, “U.S. officials excluded scores of people killed in car bombings and mortar attacks from tabulations measuring the results of a drive to reduce violence in Baghdad.” Furthermore, according to the Guardian, “U.S. authorities failed to investigate hundreds of reports of abuse, torture, rape and even murder by Iraqi police and soldiers” because they had “a formal policy of ignoring such allegations.” And the Bush administration silently presided over atrocities committed by both U.S. troops (who killed almost 700 civilians for coming too close to checkpoints, including pregnant women and the mentally ill) and hired contractors (in 2005 an American military unit observed as Blackwater mercenaries “shot up a civilian vehicle,” killing a father and wounding his wife and daughter).

Then, of course, there was torture and indefinite detention, both of which were authorized at the highest levels. Bush’s CIA disappeared countless people to “black sites” to be tortured, and while the Bush administration duplicitously portrayed the horrific abuses at Abu Ghraib as isolated incidents, the administration was actually deliberately crafting its interrogation practices around torture and attempting to find legal loopholes to justify it. Philippe Sands reported that the White House tried to pin responsibility for torture on “interrogators on the ground,” a “false” explanation that ignored the “actions taken at the very highest levels of the administration” approving 18 new “enhanced interrogation” techniques, “all of which went against long-standing U.S. military practice as presented in the Army Field Manual.” Notes from 20-hour interrogations reveal the unimaginable psychological distress undergone by detainees:

Detainee began to cry. Visibly shaken. Very emotional. Detainee cried. Disturbed. Detainee began to cry. Detainee bit the IV tube completely in two. Started moaning. Uncomfortable. Moaning. Began crying hard spontaneously. Crying and praying. Very agitated. Yelled. Agitated and violent. Detainee spat. Detainee proclaimed his innocence. Whining. Dizzy. Forgetting things. Angry. Upset. Yelled for Allah. Urinated on himself. Began to cry. Asked God for forgiveness. Cried. Cried. Became violent. Began to cry. Broke down and cried. Began to pray and openly cried. Cried out to Allah several times. Trembled uncontrollably.

Indeed, the U.S. Senate Select Intelligence Committee’s report on CIA interrogation tactics concluded that they were “brutal and far worse than the CIA represented to policymakers.” They included “slamming detainees into walls,” “telling detainees they would never leave alive,” “Threats to harm the children of a detainee, threats to sexually abuse the mother of a detainee, threats to cut a detainee’s mother’s throat,” waterboardings that sometimes “evolved into a series of near drownings,” and the terrifyingly clench-inducing “involuntary rectal feedings.” Sometimes they would deprive detainees of all heat (which “likely contributed to the death of a detainee”) or perform what was known as a “rough takedown,” a procedure by which “five CIA officers would scream at a detainee, drag him outside of his cell, cut his clothes off, and secure him with Mylar tape. The detainee would then be hooded and dragged up and down a long corridor while being slapped and punched.” All of that is separate from the outrage of indefinite detention in itself, which kept people in cages for years upon years without ever being able to contest the charges against them. At Guantanamo Bay, detainees became “so depressed, so despondent, that they no longer had an appetite and stopped eating to the point where they had to be force-fed with a tube that is inserted through their nose.” Their mental and emotional conditions would deteriorate until they were reduced to a childlike babbling, and they frequently attempted self-harm and suicide. The Bush administration even arrested the Muslim chaplain at Guantanamo Bay, U.S. Army Captain James Yee, throwing him in leg irons, threatening him with death, and keeping him in solitary confinement for 76 days after he criticized military practices.


Thus President Bush was not a good-hearted dreamer. He was a rabid ideologue who would spew any amount of lies or B.S. in order to achieve his favored goal of deposing Saddam Hussein, and who oversaw serious human rights violations without displaying an ounce of compunction or ambivalence. There was no “mistake.” Bush didn’t “oops-a-daisy” his way into Iraq. He had a goal, and he fulfilled it, without consideration for those who would suffer as a result.

It should be mentioned that most of this was not just immoral. It was illegal. The Bush Doctrine explicitly claimed the right to launch a preemptive war against a party that had not actually attacked the United States, a violation of the core Nuremberg principle that “to initiate a war of aggression…is not only an international crime; it is the supreme international crime, differing only from other war crimes in that it contains within itself the accumulated evil of the whole.” Multiple independent inquiries have criticized the flimsy legal justifications for the war. Former U.N. Secretary General Kofi Annan openly declared the war illegal, and even Tony Blair’s former Deputy Prime Minister concurred. In fact, it’s hard to see how the Iraq War could be anything but criminal, since no country—even if it gathers a “coalition of the willing”—is permitted to simply depose a head of state at will. The Iraq War made the Nuremberg principles even more empty and selective than they have always been, and Bush’s escape from international justice delegitimizes all other war crimes prosecutions. A core aspect of the rule of law is that it applies equally to all, and if the United States is free to do as it pleases regardless of its international legal obligations, it is unclear what respect anybody should hold for the law.

George W. Bush may therefore be a fine painter. But he is a criminal. And when media figures try to redeem him, or portray him as lovable-but-flawed, they ignore the actual record. In fact, Bush has not even made any suggestion that he is trying to “atone” for a great crime, as liberal pundits have suggested he is. On the contrary, he has consistently defended his decision-making, and the illegal doctrine he espoused. He even wrote an entire book of self-justifications. Bush is not a haunted man. And since any good person, if he had Bush’s record, would be haunted, Bush is not a good person. Kanye West had Bush completely right. He simply does not think very much about the lives of people darker than himself. That sounds like an extreme judgment, but it’s true. If he cared about them, he wouldn’t have put them in cages. George Bush may love his grandchildren, he may paint with verve and soul. But he does not care about black or brown people.

It’s therefore exasperating to see liberals like Alter and Schjeldahl offer glowing assessments of Bush’s book of art, and portray him as soulful and caring. Schjeldahl says that Bush is so likable that hating him “takes conscious effort.” But it only takes conscious effort if you don’t think about the lives of Iraqis. If you do think about the lives of Iraqis, hating him takes no conscious effort at all; it is automatic. Anyone who truly appreciates the scale of what Bush inflicted on the world will feel rage course through their body whenever they hear his voice, or see him holding up a paintbrush, with that perpetual simpering grin on his face.

Alter and Schjeldahl are not alone in being captivated by Bush the artiste. The Washington Post’s art critic concluded that “the former president is more humble and curious than the Swaggering President Bush he enacted while in office [and] his curiosity about art is not only genuine but relatively sophisticated.” This may be the beginning of a critical consensus. But it says something disturbing about our media that a man can cause 500,000 deaths and then have his paintings flatteringly profiled, with the deaths unmentioned. George W. Bush intentionally offered false justifications for a war, destroyed an entire country, and committed an international crime. He tortured people, sometimes to death.

But would you look at those brushstrokes? And have you seen the little doggies?

How Liberals Fell In Love With The West Wing

Aaron Sorkin’s political drama shows everything wrong with the Democratic worldview…

In the history of prestige TV, few dramas have had quite the cultural staying power of Aaron Sorkin’s The West Wing.

Set during the two terms of fictional Democratic President and Nobel Laureate in Economics Josiah “Jed” Bartlet (Martin Sheen), the show depicts the inner workings of a sympathetic liberal administration grappling with the daily exigencies of governing. Every procedure and protocol, every piece of political brokerage—from State of the Union addresses to legislative tugs of war to Supreme Court appointments—is recreated with an aesthetic authenticity enabled by ample production values (a single episode reportedly cost almost $3 million to produce) and rendered with a dramatic flair that stylizes all the bureaucratic banality of modern governance.

Nearly the same, of course, might be said for other glossy political dramas such as Netflix’s House of Cards or Scandal. But The West Wing aspires to more than simply visual verisimilitude. Breaking with the cynicism or amoralism characteristic of many dramas about politics, it offers a vision of political institutions which is ultimately affirmative and approving. What we see throughout its seven seasons are Democrats governing as Democrats imagine they govern, with the Bartlet Administration standing in for liberalism as liberalism understands itself.

More than simply a fictional account of an idealized liberal presidency, then, The West Wing is an elaborate fantasia founded upon the shibboleths that sustain Beltway liberalism and the milieu that produced them.

“Ginger, get the popcorn

The filibuster is in

I’m Toby Ziegler with The Drop In

What Kind of Day Has It Been?

It’s Lin, speaking the truth”

—Lin-Manuel Miranda, “What’s Next?”

During its run from 1999 to 2006, The West Wing garnered immense popularity and attention, capturing three Golden Globe Awards and 26 Emmys and building a devout fanbase among Democratic partisans, Beltway acolytes, and people of the liberal-ish persuasion the world over. Since its finale more than a decade ago, it has become an essential part of the liberal cultural ecosystem, its importance arguably on par with The Daily Show, Last Week Tonight, and the rap musical about the founding fathers people like for some reason.

If anything, its fandom has only continued to grow with age: In the summer of 2016, a weekly podcast hosted by seasons 4-7 star Joshua Malina, launched with the intent of running through all 154 episodes (at a rate of one per week), almost immediately garnered millions of downloads; an elaborate fan wiki with almost 2000 distinct entries is maintained and regularly updated, magisterially documenting every mundane detail of the West Wing cosmos save the characters’ bowel movements; and, in definitive proof of the silence of God, superfan Lin-Manuel Miranda has recently recorded a rap named for one of the show’s most popular catchphrases (“What’s next?”).

While certainly appealing to a general audience thanks to its expensive sheen and distinctive writing, The West Wing’s greatest zealots have proven to be those who professionally inhabit the very milieu it depicts: Washington political staffers, media types, centrist cognoscenti, and various others drawn from the ranks of people who tweet “Big, if true” in earnest and think a lanyard is a talisman that grants wishes and wards off evil.  

The West Wing “took something that was for the most part considered dry and nerdy—especially to people in high school and college—and sexed it up,” former David Axelrod advisor Eric Lesser told Vanity Fair in a longform 2012 feature about the “Sorkinization of politics” (Axelrod himself having at one point advised West Wing writer Eli Attie). It “very much served as inspiration,” said Micah Lasher, a staffer who then worked for Michael Bloomberg.

Thanks to its endless depiction of procedure and policy, the show naturally jibed with the wonkish libidos of future Voxsplainers Matt Yglesias and Ezra Klein. “There’s a cultural meme or cultural suggestion that Washington is boring, that policy is boring, but it’s important stuff,” said Klein, adding that the show dramatized “the immediacy and urgency and concern that people in this town feel about the issues they’re working on.” “I was interested in politics before the show started,” added Yglesias. “But a friend of mine from college moved to D.C. at the same time as me, after graduation, and we definitely plotted our proposed domination of the capital in explicitly West Wing terms: Who was more like Toby? Who was more like Josh?”

Far from the Kafkaesque banality which so often characterizes the real-life equivalent, the mundane business of technocratic governance is made to look exciting, intellectually stimulating, and, above all, honorable. The bureaucratic drudgery of both White House management and governance, from speechwriting, to press conference logistics, to policy creation, is front and center across all seven seasons. A typical episode script is chock-full of dweebish phraseology — “farm subsidies”, “recess appointments”, “census bureau”, “congressional consultation” — usually uttered by swift-tongued, Ivy League-educated staffers darting purposefully through labyrinthine corridors during the infamous “walk-and-talk” sequences. By recreating the look and feel of political processes to a tee, while garnishing them with a romantic veneer, the show gifts the Beltway’s most spiritually-devoted adherents with a vision of how many would probably like to see themselves.

In serving up this optimistic simulacrum of modern US politics, Sorkin’s universe has repeatedly intersected with real-life US politics. Following the first season, and in the midst of the 2000 presidential election contest, Salon’s Joyce Millman wrote: “Al Gore could clinch the election right now by staging as many photo-ops with the cast of The West Wing as possible.” A poll published during the same election found that most voters preferred Martin Sheen’s President Bartlet to Bush or Gore. A 2008 New York Times article predicted an Obama victory on the basis of the show’s season 6-7 plot arc. The same election year, the paper published a fictionalized exchange between Bartlet and Barack Obama penned by Sorkin himself. 2016 proved no exception, with the New Statesman’s Helen Lewis reacting to Donald Trump’s victory by saying: “I’m going to hug my West Wing boxset a little closer tonight, that’s for sure.”

Appropriately, many of the show’s cast members, leveraging their on-screen personas, have participated or intervened in real Democratic Party politics. During the 2016 campaign, star Bradley Whitford—who portrays frenetically wily strategist Josh Lyman—was invited to “reveal” who his [fictional] boss would endorse:

“There’s no doubt in my mind that Hillary would be President Bartlet’s choice. She’s—nobody is more prepared to take that position on day one. I know this may be controversial. But yes, on behalf of Jed Bartlet, I want to endorse Hillary Clinton.”

Six leading members of the cast, including Whitford, were even dispatched to Ohio to stump for Clinton (inexplicably failing to swing the crucial state in her favor).


During the Democratic primary season Rob Lowe (who appeared from 1999-2003 before leaving in protest at the ostensible stinginess of his $75,000/episode salary) even deployed a clip from the show and paraphrased his own character’s lines during an attack on Bernie Sanders’ tax plan: “Watching Bernie Sanders. He’s hectoring and yelling at me WHILE he’s saying he’s going to raise our taxes. Interesting way to communicate.” In Season 2 episode “The Fall’s Gonna Kill You”, Lowe’s character Sam Seaborn angrily lectures a team of speechwriters:  

“Every time your boss got on the stump and said, ‘It’s time for the rich to pay their fair share,’ I hid under a couch and changed my name…The top one percent of wage earners in this country pay for twenty-two percent of this country. Let’s not call them names while they’re doing it, is all I’m saying.”

What is the actual ideology of The West Wing? Just like the real American liberalism it represents, the show proved to be something of a political weather vane throughout its seven seasons on the air.

Debuting during the twilight of the Clinton presidency and spanning much of Bush II’s, it predictably vacillated somewhat in response to events while remaining grounded in a general liberal ethos. Having writing credits for all but one episode in The West Wing’s first four seasons, Sorkin left in 2003, with Executive Producer John Wells characterizing the subsequent direction as more balanced and bipartisan. The Bartlet administration’s actual politics—just like those of the real Democratic Party and its base—therefore run the gamut from the stuff of Elizabeth Warren-esque populism to the neoliberal bilge you might expect to come from a Beltway think tank having its white papers greased by dollars from Goldman Sachs.  

But promoting or endorsing any specific policy orientation is not the show’s true raison d’être. At the conclusion of its seven seasons it remains unclear if the Bartlet administration has succeeded at all in fundamentally altering the contours of American life. In fact, after two terms in the White House, Bartlet’s gang of hyper-educated, hyper-competent politicos do not seem to have any transformational policy achievements whatsoever. Even in their most unconstrained and idealized political fantasies, liberals manage to accomplish nothing.

The lack of any serious attempt to change anything reflects a certain apolitical tendency in this type of politics, one that defines itself by its manner and attitude rather than a vision of the change it wishes to see in the world. Insofar as there is an identifiable ideology, it isn’t one definitively wedded to a particular program of reform, but instead to a particular aesthetic of political institutions. The business of leveraging democracy for any specific purpose comes second to how its institutional liturgy and processes look and, more importantly, how they make us feel—virtue being attached more to posture and affect than to any particular goal. Echoing Sorkin’s 1995 film The American President (in many ways the progenitor of The West Wing), it delights in invoking “seriousness” and the supposedly hard-headed pragmatism of grownups.


Consider a scene from Season 2’s “The War at Home”, in which Toby Ziegler confronts a rogue Democratic Senator over his objections to prospective Social Security cuts to be made in collaboration with a Republican Congress. The episode’s protagonist certainly isn’t the Senator, who tries to draw a line in the sand over the “compromising of basic Democratic values” and threatens to run a third-party presidential campaign, only to be admonished acerbically by Ziegler:

“If you think demonizing people who are trying to govern responsibly is the way to protect our liberal base, then speaking as a liberal…go to bed, would you please?…Come at us from the left, and I’m gonna own your ass.”

The administration and its staff are invariably depicted as tribunes of the serious and the mature, their ideological malleability taken to signify their virtue more than any fealty to specific liberal principles.

Even when the show ventures to criticize the institutions of American democracy, it never retreats from a foundational reverence for their supposed enlightenment and the essential nobility of most of the people who administer them. As such, the presidency’s basic function is to appear presidential and, more than anything, Jed Bartlet’s patrician aura and respectable disposition make him the perfect avatar for the West Wing universe’s often maudlin deference to the liturgy of “the office.” “Seriousness,” then—the superlative quality in the Sorkin taxonomy of virtues—implies presiding over the political consensus, tinkering here and there, and looking stylish in the process by way of soaring oratory and white-collar chic.

“Make this election about smart, and not. Make it about engaged, and not. Qualified, and not. Make it about a heavyweight. You’re a heavyweight. And you’ve been holding me up for too many rounds.”

—Toby Ziegler, Hartsfield’s Landing (Season 3, Episode 14)

Despite its relatively thin ideological commitments, there is a general tenor to the West Wing universe that cannot be called anything other than smug.

It’s a smugness born of the view that politics is less a terrain of clashing values and interests than a perpetual pitting of the clever against the ignorant and obtuse. The clever wield facts and reason, while the foolish cling to effortlessly-exposed fictions and the braying prejudices of provincial rubes. In emphasizing intelligence over ideology, what follows is a fetishization of “elevated discourse” regardless of its actual outcomes or conclusions. The greatest political victories involve semantically dismantling an opponent’s argument or exposing its hypocrisy, usually by way of some grand rhetorical gesture. Categories like left and right become less significant, provided that the competing interlocutors are deemed respectably smart and practice the designated etiquette. The Discourse becomes a category of its own, to be protected and nourished by Serious People conversing respectfully while shutting down the stupid with heavy-handed moral sanctimony.  

In Toby Ziegler’s “smart and not,” “qualified and not” formulation, we can see a preview of the (disastrous) rhetorical strategy that Hillary Clinton would ultimately adopt against Donald Trump. Don’t make it about vision, make it about qualification. Don’t make it about your plans for how to make people’s lives better, make it about your superior moral character. Fundamentally, make it about how smart and good and serious you are, and how bad and dumb and unserious they are.


In this respect, The West Wing’s foundational serious/unserious binary falls squarely within the tradition that has since evolved into the “epic own/evisceration” genre characteristic of social media and late night TV, in which the aim is to ruthlessly use one’s intellect to expose the idiocy and hypocrisy of the other side. In a famous scene from Season 4’s “Game On”, Bartlet debates his Republican rival Governor Robert Ritchie (James Brolin). Their exchange, prompted by a question about the role of the federal government, is the stuff of a John Oliver wet dream:  

Ritchie: “My view of this is simple. We don’t need a federal Department of Education telling us our children have to learn Esperanto, they have to learn Eskimo poetry. Let the states decide, let the communities decide on health care and education, on lower taxes, not higher taxes. Now he’s going to throw a big word at you — ‘unfunded mandate’, he’s going to say if Washington lets the states do it, it’s an unfunded mandate. But what he doesn’t like is the federal government losing power. I call it the ingenuity of the American people.”

Bartlet: “Well, first of all, let’s clear up a couple of things: unfunded mandate is two words, not one big word. There are times when we are 50 states and there are times when we’re one country and have national needs. And the way I know this is that Florida didn’t fight Germany in World War Two or establish civil rights. You think states should do the governing wall-to-wall, now that’s a perfectly valid opinion. But your state of Florida got 12.6 billion dollars in federal money last year from Nebraskans and Virginians and New Yorkers and Alaskans, with their Eskimo poetry — 12.6 out of the state budget of 50 billion. I’m supposed to be using this time for a question, so here it is: Can we have it back, please?”

In an even more famous scene from Season 2 episode “The Midterms”, Bartlet humiliates homophobic talk radio host Jenna Jacobs by quoting scripture from memory, destroying her by her very own logic.


If Ritchie and Jacobs are the obtuse yokels to be epically taken down with facts and reason, the show also elevates several conservative characters to reinforce its postpartisan celebration of The Discourse. Republicans come in two types: slack-jawed caricatures, and people whose high-mindedness and mutual enthusiasm for Putting Differences Aside make them the Bartlet Administration’s natural allies or friends regardless of whatever conflicts of values they may ostensibly have. Foremost among the latter is Senator Arnold Vinick (Alan Alda): a moderate, pro-choice Republican who resembles John McCain (at least the imaginary “maverick” John McCain that liberals continue to pretend exists) and is appointed by Bartlet’s Democratic successor Matthew Santos to be Secretary of State. (In reality, there is no such thing as a “moderate” Republican, only a polite one. The upright and genial Paul Ryan, whom President Bartlet would have loved, is on a lifelong quest to dismantle every part of America’s feeble social safety net.)

Thus Bartlet Democrats do not see Republicans as the “enemy,” except to the extent that they are rude or insufficiently respectful of the rules of political decorum. In one Season 5 plot, the administration opts to install a Ruth Bader Ginsburg clone (Glenn Close) as Chief Justice of the Supreme Court. The price it pays—willingly, as it turns out—is giving the other vacancy to an ultra-conservative justice, for the sole reason that Bartlet’s staff find their amiable squabbling stimulating. Anyone with substantively progressive political values would be horrified by a liberal president’s appointment of an Antonin Scalia-style textualist to the Supreme Court. But if your values are procedural, based more on the manner in which people conduct themselves rather than the consequences they actually bring about, it’s easy to chuckle along with a hard-right conservative, so long as they are personally charming (Ziegler: “I hate him, but he’s brilliant. And the two of them together are fighting like cats and dogs … but it works.”)

“What’s next?”

Through its idealized rendering of American politics and its institutions, The West Wing offers a comforting avenue of escape from the grim and often dystopian reality of the present. If the show, despite its age, has continued to find favor and relevance among liberals, Democrats, and assorted Beltway acolytes alike, it is because it reflects and affirms their worldview with greater fidelity and catharsis than any of its contemporaries.

But if anything gives that worldview pause, it should be the events of the past eight years. Liberals got a real-life Josiah Bartlet in the figure of Barack Obama, a charismatic and stylish politician elected on a populist wave. But Obama’s soaring speeches, quintessentially presidential affect, and deference to procedure did little to fundamentally improve the country or prevent his Republican rivals from storming the Congressional barricades at their first opportunity. Confronted by a mercurial TV personality bent on transgressing every norm and truism of Beltway thinking, Democrats responded by exhaustively informing voters of his indecency and hypocrisy, attempting to destroy him countless times with his own logic, but ultimately leaving him completely intact. They smugly taxonomized as “smart” and “dumb” the very electorate they needed to win over, and retreated into an ideological fever dream in which political success doesn’t come from organizing and building power, but from having the most polished arguments and the most detailed policy statements. If you can just crush Trump in the debates, as Bartlet did to Ritchie, then you’ve won. (That’s not an exaggeration of the worldview. Ezra Klein published an article entitled “Hillary Clinton’s 3 debate performances left the Trump campaign in ruins,” which entirely eliminated the distinction between what happens in debates and what happens in campaigns. The belief that politics is about argument rather than power is likely a symptom of a Democratic politics increasingly incubated in the Ivy League rather than the labor movement.)

Now, facing defeat and political crisis, the overwhelming liberal instinct has not been self-reflection but a further retreat into fantasy and orthodoxy. Like viewers at the climax of The West Wing’s original run, they sit waiting for the decisive gestures and gratifying crescendos of a series finale, only to find their favorite plotlines and characters meandering without resolution. Shockingly, life is not a television program, and Aaron Sorkin doesn’t get to write the ending.

The West Wing is many things: a uniquely popular and lavish effort in prestige TV; an often crisply-written drama; a fictionalized paean to Beltway liberalism’s foundational precepts; a wonkish celebration of institutions and processes; an exquisitely-tailored piece of political fanfiction.

But, in 2017, it is foremost a series of glittering illusions to be abandoned.

Illustrations by Meg T. Callahan.

The Dangerous Academic is an Extinct Species

If these ever existed at all, they are now deader than dodos…

It was curiosity, not stupidity, that killed the dodo. For too long, we have held to the unfair myth that the flightless Mauritian bird became extinct because it was too dumb to understand that it was being killed. But as Stefan Pociask points out in “What Happened to the Last Dodo Bird?”, the dodo was driven into extinction partly because of its desire to learn more about a new, taller, two-legged creature who had disembarked onto the shores of its native habitat: “Fearless curiosity, rather than stupidity, is a more fitting description of their behavior.”

Curiosity does have a tendency to get you killed. The truly fearless don’t last long, and the birds who go out in search of new knowledge are inevitably the first ones to get plucked. It’s always safer to stay close to the nest.

Contrary to what capitalism’s mythologizers would have you believe, the contemporary world does not heap its rewards on those with the most creativity and courage. In fact, at every stage of life, those who venture beyond the safe boundaries of expectation are ruthlessly culled. If you’re a black kid who tends to talk back and call bullshit on your teachers, you will be sent to a special school. If you’re a transgender teenager like Leelah Alcorn in Ohio, and you unapologetically defy gender norms, they’ll make you so miserable that you kill yourself. If you’re Eric Garner, and you tell the police where they can stick their B.S. “loose cigarette” tax, they will promptly choke you to death. Conformists, on the other hand, usually do pretty well for themselves. Follow the rules, tell people what they want to hear, and you’ll come out just fine.

Becoming a successful academic requires one hell of a lot of ass-kissing and up-sucking. You have to flatter and impress. The very act of applying to graduate school to begin with is an exercise in servility: please deem me worthy of your favor. In order to rise through the ranks, you have to convince people of your intelligence and acceptability, which means basing everything you do on a concern for what other people think. If ever you find that your conclusions would make your superiors despise you (say, for example, if you realized that much of what they wrote was utter irredeemable manure), you face a choice: conceal your true self or be permanently consigned to the margins.

The idea of a “dangerous” academic is therefore somewhat self-contradictory to begin with. The academy could, potentially, be a place for unfettered intellectual daring. But the most daring and curious people don’t end up in the academy at all. These days, they’ve probably gone off and done something more interesting, something that involves a little bit less deference to convention and detachment from the material world. We can even see this in the cultural archetype of the Professor. The Professor is always a slightly harrumphy—and always white and male—individual, with scuffed shoes and jackets with leather elbows, hidden behind a mass of seemingly disorganized books. He is brilliant but inaccessible, and if not effeminate, certainly effete. But bouncing with ideas, so many ideas. There is nothing particularly menacing about such a figure, certainly nothing that might seriously threaten the existing arrangements of society. Of ideas he has plenty. Of truly dangerous ones, none at all.

If anything, the university has only gotten less dangerous in recent years. Campuses like Berkeley were once centers of political dissent. There was open confrontation between students and the state. In May of 1970, the Ohio National Guard killed four students at Kent State. Ten days later, police at the historically black Jackson State University fired into a crowd of students, killing two. At Cornell in 1969, armed black students took over the student union building in a demand for recognition and reform, part of a pattern of serious upheaval.

But over the years the university became corporatized. It became a job training center rather than an educational institution. Academic research became progressively more specialized, narrow, technical, and obscure. (The most successful scholarship is that which seems to be engaged with serious social questions, but does not actually reach any conclusions that would force the Professor to leave his office.)


The ideas that do get produced have also become more inaccessible, with research inevitably cloaked behind the paywalls of journals that cost astronomical sums of money. At the cheaper end, the journal Cultural Studies charges individuals $201 for just the print edition, and charges institutions $1,078 for just the online edition. The science journal Biochimica et Biophysica Acta costs $20,000, which makes Cultural Studies look like a bargain. (What makes the pricing especially egregious is that these journals are created mostly with free labor, as academics who produce articles are almost never paid for them.) Ideas in the modern university are not free and available to all. They are in fact tethered to a vast academic industrial complex, where giant publishing houses like Elsevier make massive profits off the backs of researchers.

Furthermore, the academics who produce those ideas aren’t exactly at liberty to think and do as they please. The overwhelming “adjunctification” of the university has meant that approximately 76% of professors… aren’t professors at all, but underpaid and overworked adjuncts, lecturers, and assistants. And while conditions for adjuncts are slowly improving, especially through more widespread unionization, their place in the university is permanently unstable. This means that no adjunct can afford to seriously offend. To make matters worse, adjuncts rely heavily on student evaluations to keep their positions, meaning that their classrooms cannot be places to heavily contest or challenge students’ politics. Instructors could literally lose their jobs over even the appearance of impropriety. One false step—a video seen as too salacious, or a political opinion held as oppressive—could be the end of a career. An adjunct must always be docile and polite.

All of this means that university faculty are less and less likely to threaten any aspect of the existing social or political system. Their jobs are constantly on the line, so there’s a professional risk in upsetting the status quo. But even if their jobs were safe, the corporatized university would still produce mostly banal ideas, thanks to the sycophancy-generating structure of the academic meritocracy. But even if truly novel and consequential ideas were being produced, they would be locked away behind extortionate paywalls.

The corporatized university also ends up producing the corporatized student. Students worry about doing anything that may threaten their job prospects. Consequently, acts of dissent have become steadily de-radicalized. On campuses these days, outrage and anger is reserved for questions like, “Is this sushi an act of cultural appropriation?” When student activists do propose ways to “radically” reform the university, it tends to involve adding new administrative offices and bureaucratic procedures, i.e. strengthening the existing structure of the university rather than democratizing it. Instead of demanding an increase in the power of students, campus workers, and the untenured, activists tend to push for symbolic measures that universities happily embrace, since they do not compromise the existing arrangement of administrative and faculty power.

It’s amusing, then, that conservatives have long been so paranoid about the threat posed by U.S. college campuses. The American right has an ongoing fear of supposedly arch-leftist professors brainwashing nubile and impressionable young minds into following sinister leftist dictates. Since massively popular books like Roger Kimball’s 1990 Tenured Radicals and Dinesh D’Souza’s 1991 Illiberal Education: The Politics of Race and Sex on Campus, colleges have been seen as hotbeds of Marxist indoctrination that threaten the civilized order. This is a laughable idea, for the simple reason that academics are the very opposite of revolutionaries: they intentionally speak to minuscule audiences rather than the masses (on campus, to speak of a “popular” book is to deploy a term of faint disdain) and they are fundamentally concerned with preserving the security and stability of their own position. This makes them deeply conservative in their day-to-day acts, regardless of what may come out of their mouths. (See the truly pitiful lack of support among Harvard faculty when the university’s dining hall workers went on strike for slightly higher wages. Most of the “tenured radicals” couldn’t even be bothered to sign a petition supporting the workers, let alone march in the streets.)

But left-wing academics are all too happy to embrace the conservatives’ ludicrous idea of professors as subversives. This is because it reassures them that they are, in fact, consequential, that they are effectively opposing right-wing ideas, and that they need not question their own role. The “professor-as-revolutionary” caricature serves both the caricaturist and the professor. Conservatives can remain convinced that students abandon conservative ideas because they are being manipulated, rather than because reading books and learning things makes it more difficult to maintain right-wing prejudices. And liberal professors get to delude themselves into believing they are affecting something.


Today, in what many call “Trump’s America,” the idea of universities as sites of “resistance” has been renewed on both the left and right. At the end of 2016, Turning Point USA, a conservative youth group, created a website called Professor Watchlist, which set about listing academics it considered dangerously leftist. The goal, stated on the Turning Point site, is “to expose and document college professors who discriminate against conservative students and advance leftist propaganda in the classroom.”

Some on the left are delusional enough to think that professors as a class can and should be presenting a united front against conservatism. At a recent University of Chicago event, a document was passed around from Refusefascism.org titled, “A Call to Professors, Students and All in Academia,” calling on people to “Make the University a Zone of Resistance to the Fascist Trump Regime and the Coming Assault on the Academy.”

Many among the professorial class seem to want to do exactly this, seeing themselves as part of the intellectual vanguard that will serve as a bulwark against Trumpism. George Yancy, a professor of philosophy and race studies at Emory University, wrote an op-ed in the New York Times, titled “I Am A Dangerous Professor.” Yancy discussed his own inclusion on the Professor Watchlist, before arguing that he is, in fact, dangerous:

“In my courses, which the watchlist would like to flag as ‘un-American’ and as ‘leftist propaganda,’ I refuse to entertain my students with mummified ideas and abstract forms of philosophical self-stimulation. What leaves their hands is always philosophically alive, vibrant and filled with urgency. I want them to engage in the process of freeing ideas, freeing their philosophical imaginations. I want them to lose sleep over the pain and suffering of so many lives that many of us deem disposable. I want them to become conceptually unhinged, to leave my classes discontented and maladjusted…Bear in mind that it was in 1963 that the Rev. Dr. Martin Luther King, Jr. raised his voice and said: ‘I say very honestly that I never intend to become adjusted to segregation and discrimination.’… I refuse to remain silent in the face of racism, its subtle and systemic structure. I refuse to remain silent in the face of patriarchal and sexist hegemony and the denigration of women’s bodies.”

He ends with the words:

“Well, if it is dangerous to teach my students to love their neighbors, to think and rethink constructively and ethically about who their neighbors are, and how they have been taught to see themselves as disconnected and neoliberal subjects, then, yes, I am dangerous, and what I teach is dangerous.”

Of course, it’s not dangerous at all to teach students to “love their neighbors,” and Yancy knows this. He wants to simultaneously possess and devour his cake: he is doing nothing that anyone could possibly object to, yet he is also attempting to rouse his students to overthrow the patriarchy. He suggests that his work is so uncontroversial that conservatives are silly to fear it (he’s just teaching students to think!), but also places himself in the tradition of Martin Luther King, Jr., who was trying to radically alter the existing social order. His teaching can be revolutionary enough to justify Yancy spending time as a philosophy professor during the age of Trump, but benign enough for the Professor Watchlist to be an act of baseless paranoia.

Much of the revolutionary academic resistance to Trump seems to consist of spending a greater amount of time on Twitter. Consider the case of George Ciccariello-Maher, a political scientist at Drexel University who specializes in Venezuela. In December of 2016, Ciccariello-Maher became a minor cause célèbre on the left after getting embroiled in a flap over a tweet. On Christmas Eve, for reasons known only to himself, Ciccariello-Maher tweeted “All I Want for Christmas is White Genocide.” Conservatives became enraged, and began calling upon Drexel to fire him. Ciccariello-Maher insisted he had been engaged in satire, although nobody could understand what the joke was intended to be, or what the tweet even meant in the first place. After Drexel disowned Ciccariello-Maher’s words, a petition was launched in his defense. Soon, Ciccariello-Maher had lawyered up, Drexel confirmed that his job was safe, and the whole kerfuffle was over before the nation’s half-eaten leftover Christmas turkeys had been served up into sandwiches and casseroles.

Ciccariello-Maher continues to spend a great deal of time on Twitter, where he frequently issues macho tributes to violent political struggle, and postures as a revolutionary. But despite his temporary status as a martyr for the cause of academic freedom, one who terrifies the reactionaries, there was nothing dangerous about his act. He hadn’t really stirred up a hornets’ nest; after all, people who poke actual hornets’ nests occasionally get stung. A more apt analogy is that he had gone to the zoo to tap on the glass in the reptile house, or to throw twigs at some tired crocodiles in a concrete pool. (When they turned their rheumy eyes upon him, he ran from the fence, screaming that dangerous predators were after him.) U.S. academics who fancy themselves involved in revolutionary political struggles are trivializing the risks faced by actual political dissidents around the world, including the hundreds of environmental activists who have been murdered globally for their efforts to protect indigenous land.

“University faculty are less and less likely to threaten any aspect of the existing social or political system…”

Of course, it’s true that there are still some subversive ideas on university campuses, and some real threats to academic and student freedom. Many of them have to do with Israel or labor organizing. In 2014, Steven Salaita had his appointment to a tenured position at the University of Illinois revoked over tweets he had made about Israel. (After a protracted lawsuit, Salaita eventually reached a settlement with the university.) Fordham University tried to ban a Students for Justice in Palestine group, and the University of California Board of Regents attempted to introduce a speech code that would have punished much criticism of Israel as “hate speech.” The test of whether your ideas are actually dangerous is whether you are rewarded or punished for expressing them.

In fact, in terms of danger posed to the world, the corporatized university may itself be more dangerous than any of the ideas that come out of it.

In Hyde Park, where I live, the University of Chicago seems ancient and venerable at first glance. Its Ye Olde Kinda Sorta Englande architecture, built in 1890 to resemble Oxbridge, could almost pass for medieval if one walked through it at dusk. But the institution is in fact deeply modern, and like Columbia University in New York, it has slowly absorbed the surrounding neighborhood, slicing into older residential areas and displacing residents in land-grab operations. Despite being home to one of the world’s most prestigious medical and research schools, the university refused for many years to open a trauma center to serve the city’s South Side, which had been without access to trauma care. (The school only relented in 2015, after a long history of protests.) The university ferociously guards its myriad assets with armed guards on the street corners, and conducts massive surveillance of local residents (the university-owned cinema insists on examining bags for weapons and food, a practice I have personally experienced being selectively conducted in a racially discriminatory manner). In the university’s rapacious takeover of the surrounding neighborhood, and its treatment of local residents—most of whom are of color—we can see what happens when a university becomes a corporation rather than a community institution. Devouring everything in the pursuit of limitless expansion, it swallows up whole towns.

The corporatized university, like corporations generally, is an uncontrollable behemoth, absorbing greater and greater quantities of capital and human lives, and churning out little of long-term social value. Thus Yale University needlessly opened a new campus in Singapore despite the country’s human rights record and restrictions on political speech, and New York University needlessly expanded to Abu Dhabi, its new UAE campus built by low-wage workers under brutally repressive conditions. The corporatized university serves nobody and nothing except its own infinite growth. Students are indebted, professors lose job security, surrounding communities are surveilled and displaced. That is something dangerous.

Left professors almost certainly sense this. They see themselves disappearing, the campus becoming a steadily more stifling environment. Posturing as a macho revolutionary is, like all displays of machismo, driven partially by a desperate fear of one’s impotence. They know they are not dangerous, but they are happy to play into the conservative stereotype. But the “dangerous academic” is like the dodo in the 1650s, a decade before its final sighting and extinction: almost nonexistent. And the more universities become like corporations, the fewer of these rare birds will be left. Curiosity kills, and those who truly threaten the inexorable logic of the neoliberal university are likely to end up extinct.

Illustrations by Chris Matthews.

Fines and Fees Are Inherently Unjust

Fining people equally hurts some people far more than others, undermining the justifications of punishment…

Being poor in the United States generally involves having a portion of your limited funds slowly siphoned away through a multitude of surcharges and processing fees. It’s expensive to be without money; it means you’ve got to pay for every medical visit, pay to cash your checks, and frankly, pay to pay your overwhelming debts. It means that a good chunk of your wages will end up in the hands of the payday lender and the landlord. (It’s a perverse fact of economic life that for the same property, it often costs less to pay a mortgage and get a house at the end than to pay rent and end up with nothing. If I am wealthy, I get to pay $750 a month to own my home while my poorer neighbor pays $1,500 a month to own nothing.) It’s almost a law of being poor: the moment you get a bit of money, some kind of unexpected charge or expense will come up to take it away from you. Being poor often feels like being covered in tiny leeches, each draining a dollar here and a dollar there until you are left weak, exhausted, and broke.

One of the most insidious fine regimes comes from the government itself in the form of fines in criminal court, where monetary penalties are frequently used as punishment for common misdemeanors and ordinance violations. Courts have been criticized for increasingly imposing fines indiscriminately, in ways that turn judges into debt collectors and jails into debtors’ prisons. The Department of Justice found that fines and fees in certain courts were exacted in such a way as to force “individuals to confront escalating debt; face repeated, unnecessary incarceration for nonpayment despite posing no danger to the community; lose their jobs; and become trapped in cycles of poverty that can be nearly impossible to escape.” A new report from PolicyLink confirms that “Wide swaths of low-income communities’ resources are being stripped away due to their inability to overcome the daunting financial burdens placed on them by state and local governments.” There are countless stories of people being threatened with jail time for failing to pay fines for “offenses” like un-mowed lawns or cracked driveways.

Critics have targeted these fines because of the consequences they have for poor communities. But it’s also important to note something further. The imposition of flat-rate fines and fees does not just have deleterious social consequences; it also fundamentally undermines the legitimacy of the criminal legal system. It cannot be justified, even in theory.

I work as a criminal defense attorney, and I have defended both rich and poor clients (mostly poor ones). Many of my clients have been given sentences involving the imposition of fines. For everyone, regardless of wealth, if a fine means less (or no) jail time, it’s almost always a better penalty. But, and this should be obvious, fines do not mean the same thing to a poor person as they do to a rich one. For my poor clients, a fine means actual hardship. In extreme cases, it can mean a kind of indenture, as the reports have pointed out. If you make $1,000 a month, and are trying to pay rent and support yourself, a $500 fine means a lot. It means many months of indebtedness as you slowly work off your debt to the court. It might mean not buying clothes for your child, or forgoing necessary medical treatment.

Of course, the situation changes if you’re wealthy, or even middle-class. You write the check, you leave the court, the case is over. For my wealthy clients, a fine isn’t just the best outcome, it’s a fantastic outcome, because it means the crime they are alleged to have committed has led to no consequences that affect them in any substantive way. They haven’t had to make any sacrifices: their lives will look precisely the same in the months after the fine was imposed as they did in the months before. Wealthy defendants want to know: “What can I pay to make this go away?” And sometimes paying to make it go away is exactly what they can do, as courts will often accept pre-trial fines in exchange for dismissal.

As I said, it’s not news that it’s harder to pay a fine if you’re poor. But the implications of this are rarely worked all the way through. For if it’s true that the punishment prescribed by law hurts one class of defendants far more than it hurts another class of defendants, then the underlying justification for having the punishment in the first place is not actually being served, and the basic principle of equality under the law is being undermined.


If fines are imposed at flat rates, poor people are being punished while rich people are not. If it’s true that wealthy defendants couldn’t care less about fines (and a millionaire with a $500 fine really couldn’t care less), then they’re not actually being deprived of anything in consequence of their violation of law. Punishment is supposed to serve the goals of retribution, deterrence, or rehabilitation. Leaving aside for the moment whether these are actually worthy goals, or whether criminal courts actually care about these goals, flat-rate fines don’t serve any of them when it comes to wealthy defendants. There’s no deterrence or rehabilitation, because if you can pay an insignificant fee to commit a crime, there’s no reason not to do it again. It’s wildly unclear how a negligibly consequential fine would deter a wealthy frat boy from continuing to urinate in public, whereas a person trying to escape homelessness might become very careful not to rack up any more fines.

Nor does the retribution imposed have a rational relationship to the significance of the crime. If the point of retribution is to make someone suffer a harm in proportion to the suffering they themselves have imposed (a dubious idea to begin with), flat-rate fines make no sense, because some people are being sentenced to far greater suffering than others. This means that it is unclear what we believe the actual correct retributive amount is supposed to be. It’s as if we punish in accordance with the philosophy of “an eye for an eye,” but we live in a society where some people start with one eye and some people start with twenty eyes. Taking “an eye for an eye” means something quite different when imposed on a one-eyed man than it does on a twenty-eyed man. The one-eyed man has been punished with blindness while the twenty-eyed man can shrug and simply have one of the lenses removed from his spectacles.

This is important for how we view the law. If courts aren’t calibrating fees based on people’s actual wealth, then massively differential punishments are being imposed. Some people receive indenture while others receive no punishment at all, even given the same offense at the same level of culpability. If fines are supposed to have anything to do with making a person experience consequences for their crime, whether retributive consequences or rehabilitative consequences, then punishments are failing their stated purpose and being applied grossly unequally.

It may be objected that fines do not constitute an unequal application of the law, because they are applied equally to all. But the point here is that application of a law equally in each case does not mean “equal application of law to all” in any meaningful sense. In other contexts, this is perfectly clear. A law forbidding anyone from wearing a yarmulke and reading the Torah does not constitute the “equal application of law to all.” It clearly discriminates against Jews, even though Christians, Muslims, Hindus, and the non-religious are equally prohibited from wearing yarmulkes. (The absurdity of “equal application” meaning “legal equality” was well captured by Anatole France, who wrote that “The law, in its majestic equality, forbids the rich as well as the poor to sleep under bridges.”)

Laws will inevitably affect people differently, because people will always be different. But if some people are given something that constitutes far more of a burdensome punishment for them than it is for others, the actual purposes of the law aren’t being served. Separate from the equality arguments, for a large class of people punishment simply isn’t even serving its intended function.

Of course, you could easily take a step toward fixing this by fining people in accordance with a percentage of their income rather than at a flat rate (or by redistributing all wealth). If a fine is, say, 2% of one’s annual income, then a person with a $20,000 income would face a $400 fine whereas a person with a $200,000 income would face a $4,000 fine. That’s still grossly unfair, of course, because $400 means far more to the poorer person than $4,000 does to the richer person. You wouldn’t have a fair system of fines until you figured out how to make the rich experience the same kinds of effects that fines impose on the poor. The fact that even massively increasing fines on the rich wouldn’t bring anything close to equal consequences should show how totally irrational our present system is.
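To make that point concrete, here is a minimal sketch (in Python, with an invented subsistence-cost figure of my own; none of these numbers come from any court or study) of why even a percentage-of-income fine takes a bigger proportional bite out of a poor defendant’s disposable income:

    # Hypothetical illustration: compare a flat fine and a 2%-of-income fine,
    # measuring each against discretionary income (income beyond basic living
    # costs). The $15,000 subsistence figure is an assumption for the example.

    SUBSISTENCE = 15_000  # assumed annual cost of rent, food, transport, etc.

    def burden(income, fine):
        """Fine as a share of discretionary income."""
        discretionary = max(income - SUBSISTENCE, 1)  # avoid division by zero
        return fine / discretionary

    for income in (20_000, 200_000):
        flat_fine = 500
        pct_fine = 0.02 * income
        print(f"income ${income:,}:")
        print(f"  flat $500 fine -> {burden(income, flat_fine):.1%} of discretionary income")
        print(f"  2% fine (${pct_fine:,.0f}) -> {burden(income, pct_fine):.1%} of discretionary income")

On these assumed numbers, the $400 fine eats about 8% of the poorer person’s income above subsistence, while the $4,000 fine eats about 2% of the richer person’s; the flat $500 fine is more lopsided still (10% versus roughly a quarter of one percent).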

But rather than having courts appropriate larger quantities of rich people’s wealth (though their wealth obviously does need appropriating), we could also simply reduce the harm being inflicted on the poor, through reforming local fines-and-fees regimes. It’s clear that in many cases, fines don’t have anything to do with actual punishment; they’re revenue-raising mechanisms, a legalized shakedown operation, as the Justice Department’s report on Ferguson made clear. Courts aren’t interested in actually calculating the deterrence effects of certain financial penalties. They want to fund their operations, and poor people’s paychecks are a convenient piggy bank.

We know that fines and fees have, in many jurisdictions, created pernicious debt traps for the poor, arising from trivial offenses. But it’s when we examine the comparative impact on wealthy defendants that this system is exposed as being irrational as well as cruel. It doesn’t just ensnare the less fortunate in a never-ending Kafkaesque bureaucratic nightmare. It also fundamentally delegitimizes the entire legal system, by severing the relationship between punishments and their purpose. It makes a joke out of the ideas of both the punishment fitting the crime and equality under the law, two bedrock principles necessary for “law” to command any respect at all. So long as flat-rate fines disproportionately punish the poor, there is no reason to believe that criminal courts can ever be places of justice.

Rahm Emanuel’s College Proposal Is Everything Wrong With Democratic Education Policy

Emanuel’s idea is the reductio ad absurdum of the “college solves poverty” idea…

On Wednesday, Chicago Mayor Rahm Emanuel announced a new educational proposal: starting with this year’s freshman class, every student in the Chicago public school system will be required to show an acceptance letter from a college, a trade school or apprenticeship, or a branch of the military in order to graduate. “We live in a period of time when you earn what you learn,” Mayor Emanuel said. (Democratic politicians’ attempts at folksiness are always pretty grim.) “We want to make 14th grade universal,” he also said. The proposed measure is almost certainly a publicity stunt which will have little effect in practice. But Emanuel has made it clear how he thinks educational problems should be solved.

The Emanuel plan is perhaps the stupidest idea a nationally prominent politician has publicly endorsed in the past decade. I hesitate to even explain why it’s stupid lest I insult my readers’ intelligence by belaboring the obvious. But it’s worth spelling out what’s wrong with it, because the fact that a major Obama-aligned Democratic politician would attempt such a thing says a great deal about the worldview of the establishment Democratic Party. So here goes.

In Mayor Emanuel’s opinion, working-class kids are too stupid to recognize their own interests. They’re simply unaware that people who go to college earn more than people who don’t, which is why (silly them) they don’t go to college. If you just force them to go by withholding their high school diplomas until they promise to attend, they’ll all become highly compensated white-collar workers and America will be a wealthier place.

Allow me to propose an alternative model: working-class kids are not stupid. They’re aware that college grads earn more money on average than they ever will. They’re also aware that not all college degrees are created equal, and that a degree from a community college or some fly-by-night for-profit—the kind of school most working-class kids from Chicago might actually get into—is dramatically less valuable than one from Sarah Lawrence, where Rahm got his BA. They’re aware that college degrees aren’t what they once were, partly because so many degrees are from mediocre institutions; perhaps they’ve seen family members work hard to get that University of Phoenix diploma only to wind up little better off than they’d have been otherwise.

They’re also aware that college costs money, not only money for tuition but all the money you won’t be able to earn while you’re in school, and that people whose parents can’t support them, people who may in fact need to help support their families themselves, can’t afford to just not work for two to four years. Finally, they’re aware that college is hard, particularly for working-class kids with less academic preparation than their middle-class peers who also have less social support and need to work while their peers are studying, and that working-class kids are at a high risk of dropping out. They know that going into debt to attend a college and then dropping out with no degree can be financially catastrophic.

In other words, they know, unlike their mayor, that what happens to the average kid who goes to college—a middle-class kid from the suburbs with white-collar parents who can afford to subsidize his textbooks and partying for four years—is a very poor indicator of what will happen to them, personally, if they decide to go to college. Knowing all this, they make their choice: 62% of Chicago’s high school students decide to have a crack at college after they graduate, and 38% don’t.

Now, it may well be that there are a few kids in that 38% who are making the wrong choice, just as there are a few in that 62% (very possibly more than a few) who are making the wrong choice and will just end up dropping out with debt or graduating with a worthless degree and more debt. It might be that a better school guidance program would push some kids into college for whom it’s the right decision. But Rahm isn’t proposing to nudge a few more kids into college; he’s proposing to hold the high school diploma of every student in the system hostage until they all go to college, or sign up for the army, or enter an apprenticeship.

What’s likely to happen if his proposal passes? Well, trade schools and apprenticeship programs are bright enough to know that the world only needs so many plumbers, so not a lot of students are going to manage to go that route. Some will join the army, at which stage Mr. Emanuel can congratulate himself for having forced some working-class kids to die for their country on pain of facing the stigma of the high school dropout for the rest of their lives. Some will simply decide to leave high school without graduating. But many will be forced into a choice they know is the wrong one, and have a crack at whatever community college or awful open-admissions for-profit college they can get an acceptance letter from. Expect to see the already overburdened and underfunded community college system pushed to the wall. Expect to see a small boom in the for-profit college industry and the exploitative student loan industry that feeds it. Expect to see many, many students drop out of school with nothing to show for it but un-bankruptable education debt that will haunt them for years.


And finally, perhaps most importantly, expect to see those students who do manage to graduate from whatever bottom-tier school is willing to accept them quickly discover that the degree Rahm Emanuel forced them to earn at great personal expense isn’t worth the paper it’s printed on. First, because college-educated workers, like any other commodity, are subject to the law of supply and demand, and Rahm’s plot to dump hundreds of thousands more of them onto the Chicago labor market will cause supply to greatly outpace demand and prices to crater. Second, because employers will recognize that people who got a college degree from a bottom-tier school that slashed admissions standards to take advantage of the Rahm-and-debt-fueled bonanza don’t have the same skill set or qualifications as the college graduates they currently pay higher wages to. In other words, producing a genuinely more educated workforce is a lot harder than Rahm’s plan to print a whole bunch more college diplomas. But even if you could produce a genuinely more educated workforce, it wouldn’t raise wages; you’d just have more people competing for the same number of white-collar jobs, and wages would go down.

(Of course, middle-class kids who went to Sarah Lawrence would still do just fine.)

Emanuel’s plan, in other words, will be a disaster if implemented. But if the plan were just his own idiosyncratic idiocy, it would be beneath refutation. Unfortunately, it’s not. The mayor of Chicago is an utterly characteristic representative of the dominant wing of the Democratic Party, and his “you earn what you learn” claptrap reflects what has been a core element of its messaging and policy for decades: the notion that we can solve poverty through education. For most of my lifetime, the Democratic Party’s answer to the apparently permanent stagnation of working-class wages has been to advise the electorate that it’s a knowledge economy and only a better-educated workforce can hope to earn more.

This is terrible policy based on obviously shoddy reasoning: while it’s true that highly educated computer programmers make a lot of money, the notion that if everyone were a highly educated computer programmer everyone would make more money is absurd, first because not everyone can become a highly educated computer programmer and second because if everyone could then computer programmers would no longer make a lot of money.

It should be emphasized, though, that on top of being terrible policy this is also terrible messaging. When voters hear that your analysis of the economy is that it simply has no place anymore for uneducated workers, and that your plan to increase working-class wages is “educate people better for the knowledge economy,” they get three messages: first, that if you’re a low-income thirty-year-old high school graduate with a family who can’t go back to school, the Democrats’ plan for you is that you’ll die poor, because hey, it’s a knowledge economy, what can they do? Second, that Democrats think your poverty is pretty much your fault for not doing better in school. And third, that Democrats are so completely out of touch that they genuinely believe that becoming a high-tech worker is a serious option for your working-class kids. In other words, what you hear is that Democrats don’t know you, don’t care about you, look down on you, and have no plan to help you. Is it any wonder that you don’t bother to vote, or that if you do you vote for someone who promises to bring the jobs back?

Every time Democrats say or imply that there’s no way for people to succeed in the 21st-century economy without a college degree, they announce loud and clear that they’ve largely given up on helping the existing working class.

But if the Democratic line on education fails on policy and political grounds alike, why are they so attached to it? I’d suggest two reasons.

First, claiming that class differences result from educational achievement flatters the American elite’s sense of its own meritocracy. If differences in income are mostly explained by differences in education, elites don’t have to worry about why their own incomes have skyrocketed over the past three decades while the rest of the country has done so poorly; it’s the natural result of market forces rewarding talent and hard work. You can see this perhaps most clearly in Silicon Valley entrepreneurs’ excitement about charter schools, an excitement most of the Democratic establishment shares: charters are the noblesse oblige of an utterly self-confident meritocratic elite, an elite that believes it earned what it has and that the way to make everyone else better off is not to take from the deserving rich and give to the undeserving poor but to make the poor more deserving. (The fact that the educational model of many of these charters is to replace those stupid, lazy public school teachers with brilliant and disruptive Yale graduates says everything here.) The education-solves-poverty line sells well with affluent white-collar professionals, and the average Democratic politician spends vastly more time addressing herself to the needs of those professionals than talking to working-class voters.

But second, and far more importantly, building an economy that once again provides decent, well-paying and dignified jobs for the working class is very difficult. It’s far easier to pretend that the jobs are waiting in the wings if only the working class were educated enough to deserve them than to take on the employers who refuse to offer those jobs. Rebuilding the American working class would require a higher minimum wage, a serious effort to encourage unionization in the service sector, and, at least in areas with sky-high unemployment (places like Chicago), a major federal jobs program to put people to work and force private-sector employers to raise wages. Every one of those initiatives would require direct confrontation with businesses big and small. Creating more innovative charter schools, or forcing more students into college, requires no such confrontation. Placing the burden of fixing the economy on working-class students and their teachers rather than on big business and the wealthy makes plenty of political sense, in its way.

But it won’t work. And liberal pundits who scoff at Trump voters by reminding them that those manufacturing jobs he promised won’t come back would do well to remember that Democrats’ agenda on working-class jobs is just as empty a promise.

The Regrettable Decline of Space Utopias

Why is it only the libertarians who fantasize about space these days?

Star Trek is one of those TV shows whose basic premise would be horrifying if the show weren’t so utterly committed to its own optimism. Viewed in the abstract, it’s hard to imagine how anybody stays sane on a starship. Star Trek characters are constantly flying blind into some fresh hell. In literally every corner of the universe it visits, Starfleet encounters some fucked-up shit that defies all extant scientific knowledge. Crew members are routinely bodyswapped, brainwashed, possessed by alien lifeforms, or implanted with false memories. Oh, and most crew members bring their entire families on board, so during the ship’s weekly brushes with death, they all get to grapple with the knowledge that their spouse and children will almost certainly be burned alive or suffocated in the vacuum of space. Everyone on that show should be on the verge of complete psychosis, but somehow, they all seem pretty contented with their lives. The characters’ preternatural level of peace with the unknown is probably one of the main reasons why Star Trek is extraordinarily comforting to watch.

Another reason why Star Trek is comforting is that there are no goddamn lawyers in space.

This is not completely true. There are a couple of lawyers in space. But there are no lawyers affiliated with the United Federation of Planets, the big, happy humanitarian alliance of planetary civilizations that are committed to universal peace, cultural interchange, and the accumulation of scientific knowledge. There are a few itinerant JAGs, but there’s no shipboard counsel. There are no legal teams dispatched to scenes of interstellar conflict. When characters find themselves in compromising situations, they never ask if they can speak to an attorney.

This, on the one hand, is completely bonkers. After all, non-Federation planets have all kinds of nutty legal standards, ranging from “guilty until proven innocent” to “automatic death penalty for anybody who accidentally steps on a flowerbed inside the invisible Punishment Zone.” Given the many entirely foreseeable dangers of this approach, you’d think that every starship would have some highly-trained legal wonk on board, ready to deal with these horrifying situations. But nope. It’s implied that the Federation does have lawyers somewhere, and there even is a loose notion that they are important to the effective functioning of the judicial system. In one episode, we learn that during a period of Earth history known as the Post-Atomic Horror (which is scheduled to occur—get ready, guys—in the mid-21st century), all the world’s lawyers were systematically murdered. This is characterized as having been an undesirable development for humanity, so we can infer that the legal profession was subsequently reinstated. But whenever there’s a legal hearing of any kind, Starfleet personnel either A) represent themselves, or B) are represented by a random bridge officer who is deputed to act as counsel.

Now you might say that we shouldn’t read too much into this. Maybe writing a random lawyer into a storyline would just have meant one more actor cluttering up the set, frittering away the weekly episode budget on dispensable lines. But the complete absence of lawyers across multiple Star Trek series, each under different creative direction, each with their own standalone law-centric episodes, is at least a little weird. So is there some other reason why the Federation has no need for lawyers?


One of the central premises of the Star Trek universe, which is set a couple centuries into the future, is that humanity has evolved—not dramatically beyond all recognition, but nonetheless significantly. After a period of mass calamity on Earth, characterized by nuclear war, genocide, and famine, the remainder of Earth’s global population finally comes to the negotiating table, as it were. A world government is established. Societies are rebuilt. Money is abolished. All basic human needs are provided for. People enter professions, learn trades, and provide services because they find these activities fulfilling, not out of economic necessity. Crime is almost nonexistent; with the elimination of material want, the impetus for most kinds of crime is also eliminated, and it’s implied that psychological dispositions towards violence are somehow detected and rehabilitated in their early stages. The establishment of an egalitarian regime of resource distribution, and the discovery of alien civilizations on other planets, seems to have drawn the human species together and eroded social distinctions. While there are still pockets of institutional corruption, and although humans still sometimes give in to their lesser impulses, people are largely motivated by goodwill. Federation officers in particular have a widespread reputation for honesty, which other civilizations, weirdly, mostly seem to accept at face value.

These characteristics seem to percolate through the Federation legal system. In the courtroom episodes, there are never “gotcha” moments where somebody wins on a technicality or gets tripped up by an arcane legal formulation. Making a common-sense argument, or a soliloquy to general principles of justice, is usually enough to win over an adjudicator. The implication seems to be that in a world where fact-finders are honest, and where parties can make more or less sensible claims in their own defense, the system can afford to be equitable and ad hoc. It’s the ultimate access-to-justice dream where—even better than a lawyer for every client—the law is so reasonable and the judges so fair that every person can represent themselves in court with total confidence, or, at most, bring along a moderately clever friend to help them make their case. In addition, when interacting with other legal systems, the strong presumption of integrity on the part of Federation actors often helps the legal process along.

This all may seem fairly pie-in-the-sky—but could it actually be possible? Could humanity, someday, theoretically, if basic material insecurities were resolved, reach a general state of compassion and reasonability towards one another? Could lawyers, at present a hideous but necessary evil, eventually be rendered obsolete by more humane social attitudes? God, that would be amazing, wouldn’t it?

Of course, the opposing theory of human nature says that our impulse towards selfishness and cruelty is so deeply-rooted, spiritually or biologically, that we can never hope to eliminate it; that at most, we might mitigate it, but that this will never be a durable achievement across cultures or across generations. This theory is quite popular, but we have no idea if it’s true. It certainly seems to be humanity’s default mode, if we make no attempts at self-improvement. But our species hasn’t been around terribly long, in the grand scheme of things, and if we’re honest with ourselves, most of us haven’t exactly been doing our utmost to better the world we live in. As G.K. Chesterton once wrote: “The Christian ideal has not been tried and found wanting. It has been found difficult; and left untried.” The same could easily be said for most schemes of social organization that require some form of moral effort or voluntary material renunciation.

Sadly, utopias are presently out of vogue, as the tedious proliferation of dystopian fiction and disaster films seems to indicate. No genre is safe. Game of Thrones is the dystopian reboot of Lord of the Rings; House of Cards is the dystopian reboot of The West Wing; Black Mirror is the dystopian reboot of The Twilight Zone. The slate of previews at every movie theatre has become an indistinguishably sepia-toned effluence of zombies, terrorists, and burnt-out post-apocalyptic hellscapes. Even supposedly light-hearted superhero movies now devote at least 3.5 hours of their running time to the lavishly-rendered destruction of major metropolises.

There is clearly some deep-seated appeal to these kinds of films; and indeed, it would take a heart of inhuman moral fiber to truly regret the sudden vanishing of New York City, whose existence serves no beneficial purpose for humanity that I’m aware of. But my general feeling is that our fondness for dystopian narratives is a pretty nasty indulgence, especially for those of us who live mostly comfortable lives, far-removed from the visceral realities of human suffering. Watching scenes of destruction from the plush chair of a movie theater, or perhaps on our small laptop screen while curled up in bed, heightens our own immediate sense of safety. It numbs us to the grinding, intermittent, inescapable reality of violence in neglected parts of our world, which unmakes whole generations of human beings with terror and dread.


Immersing ourselves in narratives where 99% of the characters are totally selfish also engrains a kind of fashionable faux-cynicism that feels worldly, but is in fact simply lazy. I say faux-cynicism because I don’t believe that most people who profess to be pessimists truly believe that humanity is doomed, at least not in their lifetimes, or in their particular geographic purviews: if they did, then watching a film that features the drawn-out annihilation of a familiar American landscape would probably make them crap their pants. But telling yourself that everything is awful, and nothing can be fixed, is a marvelously expedient way to absolve yourself of personal responsibility. There is, happily, nothing about an apocalyptic worldview that obligates you to give up any of the comforts and conveniences that have accrued to you as a consequence of global injustice; and you get to feel superior to all those tender fools who still believe that a kinder world is possible! It’s a very satisfying form of moral escapism. No wonder our corporate tastemakers have been churning this stuff out.

And there’s no doubt that it’s often hard to make utopias seem dramatically sophisticated. Star Trek is renowned, even by those who love it, for being campy as hell. Moral tales in general are too often sugary and insubstantial. They’re suitable for kids, or maybe emotionally-stunted adults, but they’re not something to be taken seriously. We have come to view utopian narratives as inherently hokey and preachy. But dystopias are, of course, their own form of preaching: they preach a rival hypothesis about humanity, one which, thanks to moody lighting and oblique dialogue, has an entirely undeserved appearance of profundity, and the illusory farsightedness of a self-fulfilling prophecy.

TWO PLEAS FOR THE FUTURE OF HUMANITY

But don’t we all want a world without lawyers? Isn’t that, at least, something that our whole species can agree on? Star Trek tells us that there are two hurdles between us and this great goal: global economic justice, and warp-speed technology. These may take several more centuries to achieve. But here are two things we can all start working on now.

1. Make utopias popular again.

Fictional narratives are a huge factor in shaping our expectations of what is possible. However, as discussed earlier, utopias are hard to write. You have to forfeit a lot of the cheap tricks that writers use to generate dramatic momentum. After all, it’s always easy to create tension when all your characters are self-serving, back-stabbing bastards; less so when your characters mostly get along. (The writers of Star Trek: TNG famously tore their hair out over creator Gene Roddenberry’s insistence that all the main cast had to be friends.) Constructing plots that are based primarily around problem-solving takes a lot of intricate planning. But we’ve seen a thousand narrative iterations of societal collapse: why not write some narratives about societal construction? What would a better world look like, at different stages of its realization—at its inception? Weathering early internal crises? When facing an existential threat? We should put more imagination into thinking about what this could look like, and how to generate emotional investment in the outcome.

Aspirational fiction seems especially important at this moment in our national history, when a significant number of Americans cast a ballot for a candidate they disliked, or were even disturbed by, simply because they wanted something different. There’s always been a gambling madness in the human spirit, a kind of perverse, instinctive itchiness that suddenly makes us willing to court disaster, simply on the off-chance of altering the mundane or miserable parameters of our daily lives. If we could transform some of that madness into a madness of optimism and creativity, rather than boredom, rage, and despair, that could only be a good thing.

2. Don’t let assholes win the space race.

Do you know who’s really excited about interplanetary exploration these days? Silicon Valley tycoons, and white supremacists. Elon Musk wants to set up a creepy private colony on Mars for ultra-rich survivalists who can shell out $200,000 for their spot, and has stated his own intention of dying on Mars. Meanwhile, a fresh-faced crop of racists are convinced that if the U.S. would only give up trying to provide social services and education to its citizens, lily-white geniuses could easily be conquering the galaxy at this very moment. As Richard Spencer (of “Heil Trump” fame) has it:

“[O]ur Faustian destiny to explore the outer universe. That is what we were put on this earth to do. We weren’t put on this earth to be nice to minorities, or to be a multiculti fun nation. Why are we not exploring Jupiter at this moment? Why are we trying to equalize black and white test scores? I think our destiny is in the stars. Why aren’t we trying for the stars?”

These dickheads are trying for the stars! The rest of us therefore need to make sure they don’t get there first. If the likes of Elon Musk and Richard Spencer are humanity’s ambassadors, our entrée into outer space will simply be a high-tech recapitulation of all the moral horrors of our last Age of Exploration. Thankfully, I’m pretty sure Richard Spencer is no astrophysicist, and Elon Musk’s would-be spacecraft keep exploding on the launchpad. Now is our chance to thwart them!

Space exploration doesn’t have to be a last-ditch effort to save the species after we screw everything up on earth; nor should it be an alternative project to building an egalitarian global society. We still have time to make a better world here, on the planet we do have, before we inflict ourselves on other parts of the universe. Space travel may well have an improving effect on humanity, but we should also make a point of improving ourselves before we head out into the interstellar beyond. Only then will we have earned the privilege to Boldly Go.

Starfleet or bust!

Illustrations by Mike Freiheit 

CNN Will Never Be Good For Humanity

Cable news is incapable of being a serious adversary to Donald Trump…

It should be perfectly obvious to anyone that there is no war between Donald Trump and CNN. It may look like there is. But there isn’t. This is because Donald Trump and CNN share the exact same core objective: to put on a really good show.

I say this is “perfectly obvious.” That’s because it’s an undeniable fact that CNN exists to serve the interests of the Turner Broadcasting System, which in turn exists to serve the interests of Time Warner, Inc., which exists to serve the interests of the shareholders of Time Warner, Inc. And Donald Trump exists to serve the interests of Donald Trump, whose primary interest is in appearing on television a lot and being famous and powerful. These two sets of interests are perfectly symbiotic, and there is no reason that there should be any serious conflict between them. Donald Trump wants to be on television. CNN wants people to watch television. And because people watch television when Donald Trump is on it, neither CNN nor Trump has any reason to make any effort to seriously undermine the other.

It’s bizarre, however, that when I have mentioned to people the simple fact that Donald Trump and CNN have the same relationship as clownfish and sea anemones, I have been treated like some kind of conspiracy theorist. I am, it is suggested, positing some kind of worldview in which media and political elites gather in backrooms and conspire over cigars. I am being cynical, and implying that nothing is as it seems and that we’re all stupefied, zombified sheeple, unaware that the powers that be are laughing behind our backs while we obsess over a spectacle manufactured for consumption.

But in actual fact, I’m implying nothing conspiratorial at all, and it exasperates me endlessly that the idea should be perceived this way. I don’t think Sean Spicer and Wolf Blitzer meet for breakfast each morning and plot out the day’s Trump feud. Rather, it’s simply that by independently pursuing their own personal/institutional objectives, they benefit one another. This requires no shady collusion whatsoever. After all, the clownfish and the sea anemone do not have to work things out in a smoke-filled room. They don’t even particularly have to like one another. They simply go about their business, and the same thing happens to be good for both parties. Thinking about how relationships emerge from rational self-interest doesn’t make you Glenn Beck with his chalkboard; it’s standard economic thinking.

I’ll give you further evidence that I’m not offering a “conspiracy”: you don’t usually see conspiracies described openly in the pages of the Hollywood Reporter. And yet here we are:

On the TV front, [network president Jeff Zucker] and CNN have ridden the Trump wave as adeptly as any outlet. In the critical 25-to-54 demographic, CNN’s daytime audience in January was up 51 percent year-over-year (Fox News was up 55 percent); it pulled in an extra $100 million in ad revenue (counting both TV and digital) last year compared with past election years. Profit for 2016 neared $1 billion, and the short-term outlook suggests the Trump bump will lead to another $1 billion haul. “It’s going to turn 2017 into an even better year than we already expected to have,” says Zucker. 

Here’s the New York Daily News’s Don Kaplan:

The feud between Donald J. Trump and CNN is like an iceberg: There’s so much more going on beneath the surface than anyone knows. At first glance, it would seem completely adversarial, but it’s not… Those who know Zucker understand his ego is almost as outsized as Trump’s, and given their history, the pair shares a special bond — one that entitles Zucker to a level of access other news executives do not enjoy. Zucker told New York Magazine the pair talked at least once a month during Trump’s campaign for the White House.

And Politico:

In fact, the presidential campaign and the first few weeks of the Trump administration have proven to be a boon to the bottom line for CNN and its competition. In many respects, Trump’s vitriol toward the media and the tough coverage of his administration reinforce themselves, driving coverage forward.

By all accounts, the rise of Donald Trump in American politics has been fantastically good news for CNN, which has seen an incredible ratings boost and reaped a billion dollar profit from the campaign cycle. And Jeff Zucker is an old friend of Donald Trump’s, having launched Trump’s television career by commissioning The Apprentice in 2004. (You can find lots of photos of them hanging out together.) For the head of a network with an ostensibly adversarial relationship with the new president, Zucker has seemed remarkably pleased with the direction of things: “This is the best year in the history of cable news … for everybody. We’ve all benefited.” (The New York Times recently observed that “nibbling filet mignon in a private dining room overlooking Central Park, Jeffrey A. Zucker, the president of CNN, did not look like a man perturbed.”) According to Politico, Zucker and CNN recognized early on that “Trump would be a ratings machine,” and deliberately gave him “quite a bit of coverage,” including broadcasting many of Trump’s rallies and speeches in full. Faced with the fact of his own complicity in the rise of a terrifying and incompetent president, Zucker said he had no regrets, and reportedly “sleeps great at night.”

Donald Trump and CNN’s Jeff Zucker

All of this is completely at odds with the received idea that Trump and the network are in a fight to the death, with Trump undermining journalists, ushering in a post-fact era, and posing a serious threat to the freedom of the press. CNN contributors and correspondents declare that Trump represents an “existential crisis” for American journalism and a threat to democracy and free speech. But television executives don’t seem to share that opinion. During the election, CBS’s Les Moonves seconded Zucker’s perspective:

It may not be good for America, but it’s damn good for CBS… For us, economically speaking, Donald’s place in this election is a good thing… The money’s rolling in, and this is fun. It’s a terrible thing to say. But bring it on, Donald. Keep going.

Could anyone who actually had grave concerns about Trump speak like this? (Moonves later insisted he had been joking, though since what he said was true, it’s unclear what the joke was supposed to be.) Certainly anyone who thought that the future of the press was at stake, or who recognized that millions of lives could potentially be destroyed through mass deportation (let alone nuclear war and climate change), would have a hard time classifying anything about the election as “fun” or wishing Trump continued political success. Yet that’s how the heads of CBS and CNN are feeling: they’re not worried. They’re downright pleased. For them (as opposed to everyone else), this is great. It is, as Zucker put it, “a very exciting time.” You don’t have to speculate especially wildly, then, in order to be skeptical of there being any real “hostility” between Trump and CNN. All you have to do is listen to its chief executive’s words.

Again, this doesn’t necessitate believing that there is a conscious effort on CNN’s part to help Trump. While overt media-political collaboration does happen (according to Cenk Uygur’s insider account of working at MSNBC, the Obama administration had significant pull with executives there and shaped the network’s tone), the real question is simply whether it’s possible for a profit-driven media company to care much about serious journalism or moral values if ratings and profits lie elsewhere. Financial self-interest powerfully shapes us on a subconscious level, and it’s easy to see why the optimal position for CNN at the moment is to feel like they are opposing Trump while not actually doing anything to seriously undermine him.


And that’s precisely what seems to be happening. Yes, there are regular spats with Sean Spicer and Kellyanne Conway. These are entertaining; they even go viral! But after Donald Trump’s recent speech to Congress, in which he accomplished the spectacular feat of reading from a set of prepared remarks for the first time in his political career, CNN declared him “presidential,” with even the network’s progressive commentators gushing over Trump. It was somewhat bizarre to see Trump’s supposed bitter adversaries giving him totally undeserved praise for a transparently manipulative bit of agitprop. But as The Atlantic‘s Derek Thompson explained, television news is a show, and shows demand narratives, and Trump steadily becoming statesmanlike is a great narrative, so there was no reason not to give Trump the story he wanted:

The fundamental bias in punditry is not toward “presidential” behavior or against “resistance.” It is more simply pro-plot twist. Narrative shifts are great for television, so great that it is irresistible to manufacture them in the absence of actual shifting narratives.

(Journalistic symbiosis with Trump has a long history, by the way. Ever since the New York Times compared him to Robert Redford in 1976, before writing in 1989 that The Art of the Deal made one “believe in the American Dream again,” Trump has been offering the press great stories, and the press have dutifully printed them. Trump knows the ins and outs of media as well as anyone alive, and has been phenomenally successful at using the news to his advantage in order to build his celebrity and, ultimately, his power.)

Anybody who believes that CNN’s rhetorical commitment to journalism is actually serious should read the Hollywood Reporter’s account of Zucker’s plans for the network. Serious adversarial reporting such as Jake Tapper’s has a place because Tapper successfully draws viewers. But the rest of the network’s plans have barely any connection to anything resembling journalism. Its future is in stand-up comics (W. Kamau Bell) and TV chefs (Anthony Bourdain—I love him, but that’s what he is). They’re paying $25 million to a YouTube vlogger named Casey Neistat, a man whose specialty appears to be giddily trying out incredibly expensive goods and services on camera, and whose plans for how to use the $25 million are inscrutably vague and buzzword-laden. To bolster their investigative reporting, CNN poached a team from BuzzFeed who had “broken several major stories, including Trump’s appearance in a soft-core Playboy video.” (A consequential scoop if there ever was one.)

But while the network’s preference for popularity over integrity would seem undeniable, CNN editorial VP Andrew Morse has insisted that it isn’t what it looks like: “We are decidedly not in the clickbait business… We don’t do cat videos, we don’t do waterskiing squirrels.” Morse might be a little more believable if the network’s politics section didn’t literally run headlines like “Haha Guys, This Bird Looks Like Donald Trump.” (He might also want to check the network archives before confidently declaring that CNN is free of cat and squirrel-based news stories; in fact, CNN is the perfect place to go for a “Squirrels Eating Potato Chips” video, and in the weeks before the election they were literally running stories like “Here’s The Whole Election In Cat GIFS.”)

The point here is not that there is something wrong with providing access to amusing cat photos or clips of squirrels noshing on Pringles. It is simply that CNN is a company, not a public service, and it can be expected to act like a company. Its aim is to produce content that people will watch. Sometimes the public’s taste will coincide with the public good. But not too often. And the rise of somebody like Donald Trump, who constitutes both a unique threat to human wellbeing and a unique opportunity for compelling television, heightens the tension between the journalistic and economic motivations of CNN. And since it’s the economic dimension that directs most corporate action, especially when there are billions of dollars to be made, CNN has a lot to gain from being just antagonistic enough toward Trump to guarantee some good entertainment without being so antagonistic as to bring him down and have to return to C-SPAN levels of thrilling political discourse. Thus, to use Moonves’s formulation, in the Trump era, what’s “bad for America” is great for CNN.

The fact that CNN will never be good for humanity is not really the fault of the people who work at CNN. After all, it’s hard to see how they could do anything differently. (Though, to their credit, they have experimented with some impressively elevated programming.) Once your mandate is to get viewers, you’ve already got a pernicious conflict of interest, and the quest for viewers (or clicks) is endemic to contemporary American media. So much is driven by the pursuit of eyes on the page or screen, and anyone working within that system will struggle to do things that are morally necessary but don’t really attract a viewership.

This is a very old criticism, but I think in many ways it is a correct one. (The most clichéd sentiments are also often the truest sentiments.) When the production of media is motivated by profit, the temptations to sacrifice integrity are going to be great. In the case of Donald Trump, these temptations will be all but irresistible. An age that requires resistance therefore requires independent nonprofit media. Economics still runs the world, and behind the apparent war between CNN and the Trump administration is a relationship just as agreeable as that of the clownfish and the sea anemone.

Looking Where The Light Is

The left has focused on the easy fights rather than the necessary ones…

The glamour of the Oscar red carpet and the grime of a violent street protest like those that greeted Milo Yiannopoulos at the University of California at Berkeley last month seem like an incongruous pairing. Yet in the left’s fixation on each I see a strange symmetry.

The ongoing efforts to diversify the Academy Awards, and their limited, temporary success, are noble and worthwhile. However little they may have to do with actual quality in movies, the Oscars matter, culturally and economically. The ceremony is watched by millions, and who gets awarded influences who gets to continue making movies, and at what stature. In any given human competition, even one as cynical as the fight for status in Hollywood, we should strive to make the playing field more equitable and more diverse. There’s little doubt that celebrity shapes our cultural conceptions of what kinds of lives are valued, for good and bad, and so we should want our showcases of celebrity to reflect the full sweep of human difference. Much work remains to be done to make the film industry and its award shows more inclusive, diverse spaces, but when a little progress was made on Oscar night, I was pleased.

Yet I can’t help but observe that this particular pageant now draws a truly inordinate amount of attention in left-wing discursive spaces, on an annual cycle. The #OscarsSoWhite controversy dominates discussion of race and diversity for weeks before every ceremony and for weeks after. Social media buzzes with endless debate about the symbolic meaning of various nominations and wins; the hot-take industry churns out reams of nearly identical copy, probing every possible dimension of this story. Meanwhile, the vast and seemingly invulnerable architecture of white supremacy stands untroubled. I don’t expect an awards show to tear down our racist system, nor do I think every victory has to be a major one. But it would seem others disagree. What else would explain the sheer volume of attention this story attracts year after year? Given the vast number of ways that people of color remain marginalized and oppressed in our country, particularly in the contemporary political situation, the outsized priority placed on diversifying this tiny awards show seems misguided. Hollywood is a small industry, and the number of people who could ever plausibly win an Academy Award is truly limited.

That stance—that diversifying the Oscars and other high-profile ventures enjoyed by a tiny elite is a worthwhile endeavor, that we should celebrate it and take inspiration from it, but that it is ultimately a minor victory that does not imply a larger ability to address racial inequality—seems sensible, to me, and not worthy of great controversy. And yet when I push back gently against the larger meaning of the ceremony, I receive howls of objection. To question the preeminent role that the Academy Awards take on in our race discourse is to be accused of not caring about diversity at all. Of course we should push for diversity in this context; of course representation matters. But in a world of limited political and attentional resources, I don’t think it’s unfair to ask basic questions about priority.

I can’t help but conclude that the disproportionate attention fixed on the Oscars stems from a natural but potentially destructive impulse: the desire to focus our political gaze on arenas where it seems we might plausibly win. Hollywood is a business and its corporations are as unprincipled as any others, but at least the industry is reliably made up of people with progressive sympathies. The people who make up the Academy may be affluent and disconnected from middle- and working-class American life, but they are solidly blue. The media that follows the industry is almost universally politically liberal. Prominent people who commit gaffes and say offensive things are regularly called to account in the industry; the institutions of the entertainment business at least pay lip service to fighting racism and sexism. So the attention we pay to those worlds seems somehow proportionate to our odds of achieving progress within them. The problem is that almost nobody lives in those worlds, and the space between them and the day-to-day lives of average people of color is vast. Saying so does not disrespect the achievement of those who have finally begun to be recognized for their excellence by their industry, nor does it imply that representation doesn’t matter. It merely insists on recognizing the numbers we’re talking about here.

What does this have to do with black bloc protests against Milo Yiannopoulos and the punching of Richard Spencer? In these instances, too, I perceive a dogged insistence on fixating on the pleasant-but-minor at the expense of taking in the broad horrors of the larger picture.

The left has always had a certain preoccupation with political violence. Wherever you find contemporary left-wing protests, you will find sentiment about “really doing something,” usually implying some kind of insurrectionary violence. Comparisons to past victories achieved through force, such as in the French or Cuban revolutions, are common. So too are discussions about the moral permissibility of such violence under different political philosophies. Indeed, if you’ve been on the left for as long as I have, you will have found them inescapable: endless, dorm-room-style conversations about who is a fair target for violence, of which type, under which circumstances. For a long time I have opted out of those conversations, for a simple reason: the question of the morality of left-wing political violence is irrelevant in a world in which the potential efficacy of left-wing political violence is so limited. The state’s monopoly on violent power has grown exponentially since the great armed socialist revolutions, and so has its surveillance capability. Meanwhile the most recent examples of left violence in the United States could hardly be less encouraging, with groups like the Weather Underground having achieved none of their strategic aims despite planting a lot of bombs. 21st-century America is not 1950s Cuba or 1910s Russia. There is no potential for armed liberation here, even if we had some sort of an army, which we don’t. I do not have time for moral arguments based on ludicrous hypotheticals.

Incidents like the black bloc protests at Berkeley or the punching of Richard Spencer grant people license to overestimate the current potential of violent resistance. Hey, Spencer got punched; never mind that the Trump administration reinstituted the global gag rule on abortion the next day. Hey, Milo’s talk got canceled; never mind that the relentless drive to deport thousands, a bipartisan effort for which the Obama administration deserves considerable blame, went on without a hitch. Better to make yet another meme out of Spencer getting hit than to attempt to confront the full horror of our current predicament.

I mean, think about it: if I said “the Nazi punch” to fellow travelers on the left, every one of them would know exactly which incident I was talking about. So what really is the value of this tactic? How important can a tactic be if its application is so rare that a single use of it attracted so much attention? If I said “the protest” or “the legislation” or “the strike,” the immediate question would be: what protest, what legislation, what strike? Because those things are routinely accessed parts of political organizing. Punching Nazis is not, because as execrable as Spencer is, and as much responsibility as we have to protect people of color from his followers, the actual number of Nazis wandering the American streets is very low. The national conference of Spencer’s organization drew about 200 attendees in a country of 315 million. Meanwhile mainstream conservatism has an army of millions. But again, perhaps that is the reason for this fixation: Spencer, a cartoon villain, seems defeatable. The relentless and organized conservative movement does not.


That Yiannopoulos has attracted an enormous amount of attention relative to his actual power has not gone unnoticed. Neither he nor Spencer has as much real-world power as, say, the treasurer of Wichita, Kansas. And there is certainly a danger in contributing to this disproportionate attention here. But it’s worth asking whether that attention is precisely a function of Yiannopoulos’s relative lack of power. We attacked his book contract because the left is well-represented in publishing; we criticized his appearances at college campuses because we have some power in universities. His followers are not the huge numbers of the wealthy and connected that the Republican Party enjoys, but a limited pool of marginal gamers and social outcasts. Yes, of course, he has the potential to do real harm to real people, and we must prevent that from happening. But consider the claim that he was going to out an undocumented student during his visit to campus. Who really threatened that student? Yiannopoulos, or the uniformed authorities who would have actually carried out the violent application of state force? (It is entirely unclear to me why Yiannopoulos would not have simply shared that information with ICE after his appearance was shut down anyway. Does Milo not own a cellphone?) Again, the same dynamic: Yiannopoulos’s followers seem punchable, subject to a level of force that we imagine we can bring to bear. ICE doesn’t. The forces of state violence, I assure you, are perfectly capable of rolling right over the most passionate antifas. It turns out you can’t punch an MRAP or a Predator drone.

This, then, is what I think the political investment in the Oscars and the rabid fixation on Nazi punching and the black bloc share: they provide the left with something pleasant to think about. Neither is a vehicle for any kind of larger victory. Neither can be replicated at the scale that would be necessary to rescue us from our current condition. But both become objects of online obsession thanks to the convenient fact that both seem like battlefields on which we can win.

It’s become a cliché, at this point, but it’s still a powerful image: the man who searches for his keys at night not where he lost them but next to a lamppost, because that’s where he has light to look. That’s what I think about when I see the left fixating on these things: a political movement so desperate for good news that it’s willing to lie to itself to find it. The conservative takeover of state governments, Congress, and the federal executive has been a slow-building horror. The compromises and betrayals of the Obama administration have revealed how little soaring rhetoric and liberal promises mean. Years of seeming progress on social issues did not prevent a man who regularly engaged in racist tropes and bragged of molesting women from winning the White House. A left-wing insurgent movement captured widespread dissatisfaction with a rigged economy and a feckless Democratic Party to build an unprecedentedly enthusiastic youth movement, powered by a sophisticated messaging and fundraising apparatus, and pushed for the nomination of a solidly left-wing presidential candidate. That effort failed, as the centrist establishment waged all-out war on the candidate and his followers, a war that continued after the election with the smear campaign against Keith Ellison. The Trump presidency has been as terrible as advertised, as he has put together a brutish kakistocracy filled with a rogues’ gallery of America’s worst people. We are powerless to stop many of his actions. The urge to retreat to fantasy and fixation has never been more understandable, or more dangerous.

The left has almost no political power, but it has cultural power, so it obsesses over cultural spaces. The left controls few institutions, so it obsesses over college campuses, where it does enjoy a modicum of control, despite the fact that full-time residential college students are a tiny fraction of the population. The left cannot keep the president from saying patently offensive things about immigrants and Muslims, so it enforces a rigid and unforgiving linguistic code in progressive media. We cannot stop drug companies from gouging destitute people with AIDS in sub-Saharan Africa, so we scourge Justine Sacco for making jokes about it. Arguments about the morality of no-platforming conservative speakers studiously ignore the fact that in most places, it is precisely the conservatives who have the power to dictate who gets to speak and when, not the leftists. The more that genuine power to do good slips from our grasp, the more tightly we clutch the few tendrils of control we seem to have.

The stock reply, always, is “we can do both” – that there is world enough and time to punch Richard Spencer, crank out a few memes, and then go stuff envelopes for the local tenants’ union. I have no doubt that many of the people who spend a great deal of their attention on issues of dubious connection to the broader effort for social justice do go out into the real world and do the work. But I want to trouble this contention that we can do both. I always want to ask not whether we can do both but whether we are doing both. The reflexive, unthinking insistence on what we hypothetically could be doing, in addition to fixating on symbolic victories, seems remote from a real-world political condition in which we aren’t actually doing much more than fixating. To look out at how limited our progress has been should compel us to ask whether, given the very real weakness of the left in our present era, we might actually have to make tough choices about where to focus our time and our attention. Maybe we need to divert some of our mental energy from being the class clowns and discourse police back into more tangible forms of political work.

For weeks, the memes and jokes about Spencer getting hit went on. For weeks, Milo dominated left-wing conversation. Meanwhile Donald Trump put people like Betsy DeVos and Jeff Sessions into positions of considerable real-world power. Both attracted attention, to the credit of the left and our conversation, but in left spaces neither came close to drawing the fixation lavished on the two neo-fascist figures who incontrovertibly, indisputably enjoy vastly less power than either DeVos or Sessions. I pointed out, several times, that this all seemed like a poor use of resources. The pushback to my questions was intense and vociferous. I was accused of Nazi sympathies, of caring more about broken windows than undocumented immigrants, of making free speech arguments I had in fact never made. Yet when I turned the conversation back to the actual practical effect of political violence, when I asked basic questions about what our larger goals are and how these tactics make them easier to meet, I never encountered anyone seriously arguing that these tactics could create change. Everyone, to their credit, seemed aware that we are not punching our way out of our problems. But the obsession continued, as did the reflexive, angry lashing out at anyone who asked whether any of this was useful. The response to questions about the real-world usefulness of Nazi punching was not disagreement on the questions themselves but, more or less, an anguished cry of just let us have this.

I can’t help noticing how the worm has turned. After all, for the entirety of the 2016 presidential primaries and election, the left critiqued the liberal addiction to politics-as-therapy. The Trump-is-Voldemort, Hillary-as-Khaleesi, West Wing fantasy school of liberal political iconography was roundly mocked in the radical left’s online spaces. And not without cause. As we said at the time, the fixation on this symbolic engagement, which depended on a set of social and cultural connections enjoyed by very few, seemed to run directly counter to the interests of actually winning a campaign, which requires playing to as large an audience as possible. Many people noted that Hillary’s appearance on the trendy show Broad City simply played to the precise kind of cultured urbanites who would never have voted for her opponent in the first place. Meanwhile all of the “yas kweens” and Game of Thrones mashups served merely to distract from the potent weaknesses of her candidacy.

But what would happen if that same unforgiving microscope were turned on the left, post-election? Could the obsession with Richard Spencer and Milo Yiannopoulos survive the same sorts of questions? It seems perfectly plain to me that setting the Spencer punch to the tune of “Never Gonna Give You Up” is precisely as therapeutic as porting Hillary into Doctor Who. Both do far more to identify the people creating these memes with a particular social caste than to spread any plausibly constructive political message. Neither is connected to any coherent narrative of political victory. And yet the same people who mocked the Hillary memes now while away long hours delicately adjusting Photoshop layers for yet another meme of that punch. I cannot conceive of a consistent, internally coherent philosophy that sees the former as worthless and the latter as worthwhile. Liberals, too, said “just let us have this,” and the answer from the left was a loud “no.” What right does the left now have to demand otherwise for itself? “Politics is not therapy,” it turns out, is a statement that applies to everyone or no one.

None of this is to reject the importance of satire. None of it is to suggest that we must be joyless. Satire remains an absolutely vital part of a healthy political tendency. The problem develops when the satirical sensibility so fully saturates an ideology that the satire essentially never ends. I love to read a good satirical article in magazines like the Baffler or listen to a political comedy podcast like Chapo Trap House. Then the article is over and the podcast ends, and you have to return to the grim reality. But social media and the 24-hour internet cycle mean that the satire never has to end, that you can always jack right back in, and there’s always another person to tell you that those conservative rubes are uncool and unfunny, always an escape into “lol nothing matters.” The jokey, superior, blankly sarcastic tone of limitless derision is ubiquitous online, but it is essentially universal in left spaces; snarky gloating there is now almost impossible to avoid, the old stereotype of the dour communist looking entirely old-fashioned next to the digitally-enabled class clown. Strange that this attitude has grown at a time of near-total defeat for the left. Strange that so many on the left gloat like the Harlem Globetrotters while they lose like the Washington Generals. Or perhaps not so strange.

Many people who take part in social media politics deny that it has any impact, strange as that may seem for people who engage in call-outs morning, noon, and night. They insist that they know the online space does not meaningfully affect real-world politics. But I think this is wrong. I would, at this point, reject the notion that social media and online political spaces are irrelevant to real-world political engagement. It is true that the online space cannot be a site of activism or organizing, that the levers of power simply do not exist in those forums, that no one can tweet their way to justice. But I have increasingly come to find that the basic communicative tenor of broad political movements is in fact deeply influenced by how people interact online, by the vocabulary and norms and social codes that can appear so inscrutable from the outside. We are social creatures, and every hit of dopamine from the likes and retweets we consciously dismiss as unimportant conditions us, in this massive experiment in behaviorism called the internet. No, social media can’t get a union certified or block legislation, but it can etch into the minds of the young and impressionable ideas about which behaviors the social hierarchy rewards. That this condition amounts to the worst of both worlds should go without saying.

And so I think that perhaps it is time to say that all of the ironizing and jokes and endless meme-ification are not just politically inert, as nearly everyone acknowledges, but actively malignant. A generation of young leftists is being conditioned to fully separate their emotional and communicative engagement with politics from the actual reality of politics. We are creating a vast social architecture to make losing feel like winning. We need not experience the joys of hard-won progress when the temporary thrills of a sick burn are always moments away. The addiction to jokes is like the addiction to anything else – it starts out as a method to achieve pleasure but gives way to pathology, and though victory remains elusive, you can always get another hit, and then another, and then another…. Meanwhile, the world is what it is.

I am not counseling despair. There are green shoots. The Women’s March protests and many that have followed demonstrate widespread populist unrest with our current political leadership. Groups like the Democratic Socialists of America have seen their ranks swell since the election. Organizations both national (like the ACLU) and local (like many urban tenant unions and immigrant rights groups) have found new public support and interest. Left-wing discontent within the Democratic Party is not going away, and Trump’s presidency is uniquely embattled for one so young. But let’s not fool ourselves about how grim the situation is, and let’s not allow our coping strategies to overwhelm our basic understanding of just how badly we are losing.

Make and enjoy satire when useful; it’s an important tool. Tell jokes when you feel it’s appropriate; I will too. Enjoy the moments of victory along the way, which will be rare and valuable. But tell the truth. Tell the truth about where we actually are, about how bad things have gotten. Be real, with yourself and with others, about just how deep the pit we find ourselves in is, and be prepared to face it without the numbing analgesic of endless jokes and memes. You don’t have to succumb to fatalism. I myself have not; a better world is possible. But to achieve it you must have the courage to live in the mire of our awful, awful reality.

The Social Science of Success

Duckworth, Cuddy, and Gladwell promise people the secret ingredients of human achievement…

What if I told you that all of your professional and personal dreams were within your grasp? That if you just had the right knowledge, you could accomplish whatever you wished? Step right up, step right up! Come quickly now! Our psychologists have run the experiments, crunched the numbers, and done The Science! This is The Science that overturns any obstacles in your path. Guaranteed! Call today!

American carnival barkers have long made comfortable livings selling panaceas to desperate people. In a country where so many live lives of frustration and economic misery, plenty of willing customers can be found for those promising to unlock the doors to success and riches. Pop social science literature has its own kind of snake oil to sell you. It doesn’t take the form of a cure-all elixir, a late night infomercial, or a dubious start-up pitch. Rather, it is peddled by well-credentialed academics, who promise to give you the Science that will tell you how to live. Drawing on findings from their research, they insist on having found a Theory of Everything, one that can explain All Human Achievement. And they want to share it with you, for a very reasonable price.

Based on the gushing blurbs to be found on these two books, naïve readers might believe that indeed, the True Secret of Success has recently been discovered. On the back of Presence: Bringing Your Boldest Self to Your Biggest Challenges, Jane McGonigal writes that “this book will forever change how you carry yourself.” Simon Sinek adds: “This book is a must-read for every doer out there.”

The praise for Angela Duckworth’s Grit: The Power of Passion and Perseverance is equally dizzying. Daniel Gilbert, an esteemed social psychologist at Harvard and bestselling author, writes: “Psychologists have spent decades searching for the secret of success, but Duckworth is the one who found it.” The very secret of success itself! Larry Summers was impressed enough to write: “The ideas in this book have the potential to transform education, management, and the way its readers live. Duckworth’s Grit is a national treasure.” “This book will change your life,” says Dan Heath, a professor at Duke’s business school and bestselling author.

Angela Duckworth’s résumé is perhaps peerless. Former White House intern, McKinsey consultant turned tough-neighborhood middle-school teacher, holder of degrees from Harvard and Oxford, start-up co-founder, now a tenured professor at the University of Pennsylvania and a MacArthur “genius grant” recipient. When she announces, from her own position of success, that she has discovered the source of human achievement, one is encouraged to take her seriously.

Duckworth defines “grit” in her book as “perseverance and passion for long-term goals,” and as she self-deprecatingly notes in her talks, her 20s were defined by a chaotic search for a purpose—Duckworth had “little grit.” She had no grand goals, but during her stint as a teacher she noticed that it was not always the most intelligent students who did the best; rather, it was the ones who toughed it out and worked hard as hell—those with grit. Duckworth headed to graduate school to explore this observation further. There, she began studying high achievement through interviews with professionals in “investment banking, painting, journalism, academia, medicine, and law” in order to figure out what distinguishes “star performers.” From these interviews, she confirmed that neither innate ability nor the raw number of hours of practice explained who was in the 1% of the top 1%. Rather, there was something else: “a ferocious determination.” After one especially enlightening interview, she describes her reaction: “I came to a fundamental insight that would guide my future work: Our potential is one thing. What we do with it is quite another.”

Duckworth formalized this insight into a questionnaire—the “Grit Scale”: 12 simple questions measuring things like whether respondents set goals, commit to long-term success, overcome failure and adversity, and, generally speaking, get shit done. Duckworth then went out into the real world to test her idea. Her book examines the “Beast Barracks,” the rigorous summer boot camp that every incoming West Point freshman must go through. She administered the Grit Scale to all cadets in 2004, and found that “98%” of the grittiest cadets made it through the Beast. Duckworth concludes: “What matters for making it through Beast? Not your SAT scores, not your high school rank, not your leadership experience, not your athletic ability. Not your Whole Candidate Score. What matters is grit.” Further studies of finalists in the National Spelling Bee and of GPAs among Ivy League undergraduates corroborated these findings – grittier spellers went further, and grittier Ivy League undergraduates had better grades.

This all sounds quite compelling, and even commonsensical. It’s also a useful corrective to the conservative fetishization of Ayn Rand’s “lone genius.” For Duckworth, success is about commitment, not about being a Nietzschean superman.

But Duckworth’s theory suffers from a glaring myopia: it examines success only among a particular subset of people, essentially those from the top 5% of the distribution of a given profession. Duckworth is interested in studying success among successful people. She’s looking at environments where everyone is already very successful, such as West Point and the National Spelling Bee.

This means that Duckworth isn’t looking at determinants of success and failure such as, for instance, wealth. She explicitly leaves aside social context early on. As she says:

“Of course, your opportunities – for example, having a great coach or teacher – matter tremendously, too, and maybe more than anything about the individual. My theory doesn’t address these outside forces, nor does it include luck. It’s about the psychology of achievement, but because psychology isn’t all that matters, it’s incomplete.”

It’s a fair admission. But she makes it only briefly before returning to expounding at length on the power of her theory.

Social scientists typically refer to this bias as “sampling on the dependent variable.” That is to say, her dependent variable of interest, the thing she wishes to explain, is achievement, and she selects only cases of high-achieving individuals. One might be impressed to learn that 98% of “gritty” West Point cadets made it through Beast Barracks, but there’s an additional statistic you need to know: 95% of all West Point cadets make it through. (Duckworth acknowledges this fact in her academic paper on developing the Grit Scale, but it is conspicuously absent from her book.) Thus grit may explain something, but it doesn’t explain much. It might tell us why certain West Point cadets do slightly better than certain other West Point cadets. But it leaves aside an important question: how do people become West Point cadets to begin with?
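For those who like the base-rate problem made concrete, here is a trivial back-of-envelope sketch in Python. Only the 95% and 98% figures come from the statistics above; the class size of 1,200 is an invented number used purely for illustration.

    # Illustrative arithmetic only: 95% of all cadets survive Beast Barracks,
    # while 98% of the grittiest do. The class size is an assumption.
    class_size = 1_200

    dropouts_overall = class_size * (1 - 0.95)    # ~60 cadets leave in a typical class
    dropouts_grittiest = class_size * (1 - 0.98)  # ~24 would leave even in a maximally gritty class

    print(f"expected dropouts, ignoring grit entirely: {dropouts_overall:.0f}")
    print(f"expected dropouts among the grittiest:     {dropouts_grittiest:.0f}")
    # Grit moves the needle for roughly three cadets in a hundred, within a
    # group already winnowed by an admissions process the theory says nothing about.

Grit, in other words, buys you about three percentage points inside an institution that was already passing almost everyone.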

In fact, we don’t even know that “grit” at West Point tells us anything about success at all. That’s because Duckworth doesn’t study the people who leave West Point, just the people who stay. But for all we know, the people who drop out are not failures. Perhaps they just didn’t enjoy military service that much. Is it really so unthinkable that a few of the more independent-minded 18-year-olds could arrive at West Point, only to make a swift exit after having a drill instructor scream in their faces because a quarter didn’t bounce off the bed? It could be that plenty of (eventually highly successful) people come in with a naïve, romantic notion of military service, but quickly figure out it’s not for them. Duckworth hasn’t produced a study showing that grit predicts success, but one showing that grit predicts conformity and the ability to endure institutions.

The issue here isn’t that Duckworth is doing uninteresting research—far from it. It’s that she is trying to convince us that it implies more than it actually does. (Is she explaining 10% of the world or 90% of it?) It’s also true that by picking the particular groups she does, Duckworth furthers a dangerous myth about “success.” She may have an accurate theory explaining variations among the people in the top 10% of the income distribution. But for the remaining 90%, whom she does not study, the determinants of “success” are far different. For them, social circumstances, rather than individual psychology, could be more important. When Duckworth puts aside “outside forces,” she somehow imagines that the mind can exist in a vacuum. That we can assume away the structural impediments to success, such as a lack of access to healthcare or a stable income, endemic interpersonal violence, state coercion, and persistent forms of bigotry. Because she only looks at success, but doesn’t study failure, she doesn’t see how perfectly gritty and determined kids can be held back by the misfortune of growing up in the wrong neighborhood.

Consequently, Duckworth’s findings could just as easily lend themselves to a full-throated endorsement of social democratic redistributionist policies and politics. She ends the book by acknowledging that grit is not the only thing that matters in life; she even says that she would much rather have good kids than gritty or great ones. Nevertheless, she emphasizes individual psychology over social conditions:

“We all face limits—not just in talent, but in opportunity. But more often than we think, our limits are self-imposed. We try, fail, and conclude we’ve bumped our heads against the ceiling of possibility. Or maybe after taking just a few steps we change direction. In either case, we never venture as far as we might have…To be gritty is to invest, day after week after year, in challenging practice. To be gritty is to fall down seven times, and rise eight.”

This frames life outcomes in terms of individual effort. But she could just as easily have concluded that in order for grit to matter, people need to be free of institutional barriers to success, or that we should make sure people aren’t pushed down seven times out of eight. If everyone started as social or economic equals, then grit might be the deciding factor. But they don’t start as equals.

If the core argument of Grit is that the ability to pursue one’s goals is far more important than innate differences in talent, then Duckworth could come out in favor of removing impediments to goal pursuit, such as the drudgery of low-wage labor. She could have taken a note from John Maynard Keynes, who suggested in “Economic Possibilities for Our Grandchildren” that a future of abundant free time would be the norm if the gains from technology were redistributed. Instead, Duckworth has given the misleading impression that grit is what’s needed to overcome structural obstacles, even though she has only studied people who have made it past those structural obstacles already.


Duckworth’s book therefore provides convenient arguments for those who wish to justify inequality. After all, it’s grit that determines success. If you don’t succeed, you’re probably just insufficiently gritty. This may partly explain why her book has reached such heights of popularity; Americans love theories that simultaneously tell individuals they can do anything (even though they probably can’t) and rationalize the economic status quo.

Of course, this isn’t what Duckworth says, and she cannot control the uses of her book. Journalists have oversimplified the findings of Grit. Moreover, she has been forthcoming and responsive to criticisms of the book; in an interview with NPR she said, “I aspire to be a scientist who remains open to criticism because I can’t possibly be 100% right about everything!” And she came out publicly against a Department of Education initiative to transform grit into a portion of national educational assessment, writing in the New York Times: “I worry I’ve contributed, inadvertently, to an idea I vigorously oppose: high-stakes character assessment.” Intellectual integrity like this must be celebrated.

But while Duckworth cannot perfectly control how her work will be used, she could nonetheless have made sure the book emphasized the limitations of her studies. Instead, she frames grit as an explanation of success qua success, not of one marginal aspect of success within a small, non-representative subpopulation. And she overplays her hand, presenting grit as the secret sauce of achievement, a claim well beyond what her research can actually support.


Amy Cuddy’s work follows a similar pattern: an initial study with some interesting empirical findings, blown far beyond its boundaries into a theory of nearly everything. Unlike Duckworth, however, Cuddy bears more of the responsibility for the misrepresentation.

Cuddy’s major idea is “power poses,” the notion that if one adopts an open and expansive body posture, one can become less nervous and a better leader. Supposedly, the correct poses trigger one’s brain to increase the production of testosterone and lower the amount of cortisol. Cuddy’s initial experiments suggested that adopting a power pose for a few minutes had a measurable effect on body chemistry, hormonally inducing confidence and competence.

As attractive as that sounds, unfortunately, the central findings of Cuddy’s work have largely been discredited. Dorsa Amir, a biological anthropology PhD student at Yale, explained on a popular biology blog shortly after Cuddy’s book appeared that her ideas make little sense from a natural science standpoint:

“In general, hormones like testosterone and cortisol are dynamic. Both hormones have a diurnal rhythm, which means they change throughout the day. They’re also influenced by dozens of variables: the obvious ones like age, sex, and weight help determine clinical guidelines for what ‘normal’ levels look like…. How did Cuddy and colleagues control for these phenomena? In short: they didn’t.”

Noted statistician Andrew Gelman of Columbia University and his colleague Kaiser Fung expressed further doubts that Cuddy followed sound statistical procedures. They wrote in Slate that the “power poses” concept was a prime example of “social scientific malpractice”: the small sample size of the original study meant that “variation is high, so anything that does appear to be statistically significant (the usual requirement for publication) will necessarily be large, even if it represents nothing but chance fluctuations.” In other words, one can see exactly how this “massive effect” was obtained: hormone levels naturally vary a great deal between people and across the day, and with a sample of only 42 people, chance fluctuations alone can masquerade as a large effect.
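Gelman and Fung’s point is easy to demonstrate for yourself. Below is a minimal simulation sketch in Python; the 42-subject sample size comes from the study described above, while the tiny “true effect” and everything else are assumptions chosen purely for illustration. The logic of the significance filter is that when samples are small, the rare results that clear the p < 0.05 bar are, by construction, the exaggerated ones.

    # A toy demonstration of the significance filter: with a tiny true effect
    # and a small sample, the few results reaching p < 0.05 are gross
    # overestimates. All numbers except the 42-person sample are assumptions.
    import numpy as np
    from scipy.stats import ttest_ind

    rng = np.random.default_rng(0)
    true_effect = 0.1   # assumed true effect, in standard-deviation units
    n_per_group = 21    # 42 subjects split into two groups

    significant = []
    for _ in range(10_000):
        control = rng.normal(0.0, 1.0, n_per_group)
        posed = rng.normal(true_effect, 1.0, n_per_group)
        _, p = ttest_ind(posed, control)
        if p < 0.05:    # the usual bar for publication
            significant.append(abs(posed.mean() - control.mean()))

    print(f"runs reaching significance: {len(significant) / 10_000:.1%}")
    print(f"average 'significant' effect: {np.mean(significant):.2f} (true effect: {true_effect})")

In a run like this, only about one experiment in twenty reaches significance, and the ones that do report an effect several times larger than the truth.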

This criticism has led Cuddy’s colleagues to distance themselves from this work. For instance, Dana Carney, one of the coauthors of the original power poses paper, posted an unequivocal rebuke on her faculty website:

“I do not believe that ‘power pose’ effects are real. I discourage others from studying power poses. I do not teach power poses in my classes anymore. I do not talk about power poses in the media and haven’t for over 5 years (well before skepticism set in).”

In response, Cuddy shifted the goalposts: “The key finding, the one that I would call ‘the power posing effect,’ is simple: adopting expansive postures causes people to feel more powerful… The other outcomes (behavior, physiology, etc.) are secondary to the key effect.” Notice how she has adjusted the claim. The original claim was that if one adopts a power pose, one’s primordial Darwinian brain stem goes into action and one’s body chemistry shifts. The claim Cuddy now defends is that if one adopts the power pose, one feels more powerful. But this isn’t much of a claim at all, since all it suggests is a placebo effect. (And even this is dubious, since the findings themselves are likely just an artifact of statistical noise.)


Cuddy’s is a clearer case of malpractice. Her work was subjected to criticism for years prior to the publication of Presence. Unlike Duckworth, Cuddy has not responded openly to the scrutiny of the scientific process, and she has only recently dealt with it at all. The original study failed to replicate, yet she delivered a TED talk on her work in 2012 (now the second most watched talk of all time) and released Presence in 2015.

It’s a shame that Cuddy staked so much on power poses, because the (significant) portions of her book that have nothing to do with the poses are quite interesting. Her main point is about “presence” itself, which she defines as “the state of being attuned to and able to comfortably express our true thoughts, feelings, values, and potential.” These parts of her work are well-written and compelling. Her chapter on “imposter syndrome” and her own self-doubts is written with great humanity and humility. She recounts the countless emails she has received from people all over the world who were inspired by her work, especially young women in countries with brutal patriarchal structures. We are introduced to people who overcame major adversity and went on to reach incredible academic and professional heights. Cuddy herself has quite a life story: she entered her PhD program just a year after a traumatizing car accident that resulted in severe head trauma. If she had left aside the “science” of power poses and instead mused on confidence, adversity, and the realization of human potential, it would have made for a solid and enlightening read.


Both of these psychology books have clearly scratched an itch, topping bestseller lists and establishing public platforms for their authors. And both purport to explain success as a function of individual-level characteristics, offering readers strategies to change themselves for the better. One suggests that diligence and hard work pay off in the long run; the other argues that interpersonal dynamics can be changed by adjusting one’s body language. What the two theories share is that they individualize people’s social outcomes, suggesting that it’s factors of our own making (rather than, say, oppressive social structures) that shape our chances in life.

In placing so much emphasis on factors like grit and body language, Duckworth and Cuddy present a worryingly apolitical view of inequality. Look, they say, don’t bother to protest. God forbid you should join a union. Don’t attribute economic differences to historical forces, or to bigotry. Just strike the right pose. Grit your teeth. Forget structural disadvantages and the precarious post-industrial economy, just have passion and perseverance.


One can perhaps blame Malcolm Gladwell for a lot of this. In the late 2000s, Gladwell pioneered the “this nifty thing explains success” subgenre of nonfiction. Whether it was his 10,000-hours hypothesis (the Beatles were good because they practiced a lot) or his “David and Goliath” idea (seeming impediments can actually be people’s unique advantages), Gladwell offers a series of empirically questionable mini-theories, each of which is designed to explain success using every means other than social structure. Gladwell has dedicated his professional career to trying to uncover what it is about individuals that makes some succeed while others fail. He has never considered the possibility that perhaps it isn’t something about individuals at all. (One can imagine a Gladwell-style book cover with the title Capitalism: Why Some Individuals Succeed While Others Fail. But one cannot necessarily imagine anyone reading it.)

This also speaks to a broader incentive problem in the social sciences. In terms of making one a highly sought-after public intellectual, clever Gladwellian empirical findings are far more valuable than nuanced, humble, career-spanning research. James Heckman, the Nobel Prize-winning economist who has spent his career refining statistical methods and empirically studying the sources of poverty, expressed his frustration in a 2005 interview with the Federal Reserve Bank of Minneapolis: “In some quarters of our profession, the level of discussion has sunk to the level of a New Yorker article: coffee-table articles about ‘cute’ topics, papers using ‘clever’ instruments….Most of this work is without substance, but it makes a short-lived splash and it’s easy to do. Many young economists are going for the cute and the clever at the expense of working on hard and important foundational problems.” Though he doesn’t name the book, Heckman is almost certainly referring in part to the effect of Freakonomics on the profession. Figuring out why some nations are poor and others are rich is a very hard question. On the other hand, producing clever statistics showing that sumo wrestlers cheat, as Steven Levitt does in Freakonomics, is much more fun and lucrative.


The rewards of producing bestselling “pop” theories exist across professions. Niall Ferguson, the now Stanford-based economic historian and fetishist of empires, went from producing detailed histories of banking to pumping out books like Civilization: The West and the Rest, which explained “The Six Killer Apps of Western Civilization.” Ferguson’s bestsellers landed him on the speaker circuit, enabling him to charge more than $75,000 per talk. He evidently now goes from hedge fund to hedge fund telling financiers how morally sound and intellectually innovative their work is. At this, he is apparently quite good, at least according to Steve Drobny of Drobny Capital, who says: “Niall Ferguson is the best speaker we’ve hired for our hedge fund events.” Why bother to do the hard work when you can grift hedge fund managers with a quick spin through the killer apps of the West?

Ferguson, Gladwell, Duckworth, and Cuddy thereby illustrate two serious problems with the contemporary intelligentsia. First, you’re under great pressure to produce a novel empirical finding, and if you can develop one surprising enough, you can get yourself a TED talk. Second, there are deep anxieties within our contemporary society and economy, and the bestselling ideas are those that simultaneously flatter the rich and comfort the poor. Tell the wealthy they are gritty rather than lucky, that they are special Davids who slew Goliath. Tell them that they pose with power. Tell the poor that life is tough, but if they stick it out, and develop some presence, they too can make it. With a hell of a lot of people at the bottom, and a few at the top, one can do well by offering people secrets for how to get from one end to the other. Above all, don’t ever suggest that it’s luck or pre-existing wealth that determines your lot in life. What readers want is one weird trick to fix it all.

If you want to get rich, then, we know how to do it. The true secret to success may be neither grit nor presence. But Grit and Presence have certainly made their authors very successful indeed.

Peculiarities of the Yankee Confederate

When small town New Englanders embrace Dixie kitsch…

Earlier this year, my rural Massachusetts hometown became unexpectedly embroiled in controversy, after a police officer mounted a Confederate flag at his home in plain view of the 10-year-old African American boy who lived across the street. The boy’s parents, raising their son in the age of Tamir Rice, naturally felt somewhat alarmed to discover that local law enforcement harbored Confederate sympathies. The town’s Human Rights Commission (we have those here) was promptly alerted and a town meeting was called. There, most attendees condemned the officer’s actions and tried to explain the (seemingly) obvious racial subtext.

But plenty of town residents defended the officer. The local newspaper heard from readers insisting that “saying someone is racist by owning a flag” was far more racist than the flag itself. Another encouraged the boy’s family to “get over it,” lamenting that “if it’s not a flag, it’s how you say ‘happy holidays.’ If it’s not that, it’s a Starbucks cup.” And the officer’s own response? “The flag has no negative connotations to me.”

One can sympathize, for perhaps a second, with those professing themselves baffled by anyone “mad about a flag.” But for them, it may be useful to consider how the same response would sound if someone hoisted a “Death to Black People” flag with a picture of a lynching on it. “I can’t believe you’re mad about a flag; next you’ll be mad about a coffee cup” doesn’t sound quite so reasonable when we draw out what the Confederacy means to a black audience. (Remember, too, that it was not social justice types but right-wing Christians who threw a fit over the insufficient festiveness of the paper cups at Starbucks.) But the more curious question is: if the flag doesn’t have any negative connotations, what possible connotations does it have, when flown in small-town New England? What causes people born and raised in the North, many of them with no historical or familial connection to the South, to align themselves with a symbol of Southern pride, treason, and slavery?


When challenged, fans of the Stars ’n’ Bars have plenty of rehearsed answers. Most often, they will say they appreciate the Confederacy’s place in American history and lament the efforts of revisionist historians to erase it from our collective memory. And following up with “Appreciate what about it, precisely?” will get one nothing except mumbled clichés about the rebel spirit.

The charge that the left is attempting to wipe away history is a strange one. In reality, it would be nearly impossible to find a left-leaning historian who doesn’t want Americans to talk more about the Civil War, slavery, and Reconstruction in order to better understand modern institutional racism. Nobody is less inclined to erase the Confederacy from American history than the left. When we do see efforts to remove inconvenient facts from the standard curriculum, they usually come from conservatives in the South. It was the Texas Board of Education that refused to allow the fact-checking of history textbooks that used hilariously banal euphemisms to describe chattel slavery, referring to slaves as “immigrants” and “workers.” The movement to sanitize and decontextualize Confederate imagery is a far greater crime against the integrity of the historical record than the efforts of leftists to point out that the South did not just stand for “states’ rights,” but for the states’ right to maintain a very particular thing. It’s their own fact-blindness that leaves history-challenged conservatives genuinely stunned that anyone would want to remove the flag from the South Carolina State House after an avowed neo-Confederate and white supremacist massacred nine black churchgoers.

Understanding the cultural pathology behind Northern use of the Confederate flag is like understanding the rise of Donald Trump as a serious politician. It is inexplicable, essentially unfathomable. Yet one can attempt tentative hypotheses, which involve a nuanced examination of race, class, the rural/urban divide, and the widespread human attraction to nauseating kitsch. Just as one can only hope to approximate the structural causes of our 45th president, one can only guess cautiously at why, in the Berkshires of Connecticut and Massachusetts, the Stars and Stripes and the Stars and Bars can hang from the same flagpole without anyone batting an eye or sensing a paradox.


The entire idea of the flag as an enduring Southern symbol is its own revisionist lie. After all, the Stars and Bars flag was barely used in the Old South, revived only in the mid-20th century by white supremacists who would rather see black children hanged from trees than given equal access to the public school system. The symbols of the Confederacy had largely remained the domain of veterans groups until they were deliberately resurrected as a way to resist the Civil Rights Movement. The rebirth began shortly after World War II, when Truman’s decision to integrate the Army increased tensions between Northern and Southern Democrats and inspired Strom Thurmond to run for president as a Dixiecrat. Thurmond, the grandson of a Confederate veteran and a staunch segregationist, employed the battle flag in his campaign as an explicitly racist gesture. In 1956, Georgia creatively incorporated the battle flag design into its state flag to protest Brown v. Board of Education.

In 1963, Governor George Wallace raised the battle flag over the Alabama state capitol. Wallace, one of the most passionate defenders of segregation, also espoused a white-centered form of populism. He targeted the federal government not just because it outlawed segregated schools, but because it enriched elites at the expense of the common man. He tailored his message to blue-collar white voters who felt left behind and condescended to by Washington. Wallace had a gift for pandering: “…when the liberals and intellectuals say the people don’t have any sense, they talkin’ about us people… But hell, you can get good solid information from a man drivin’ a truck, you don’t need to go to no college professor,” he said in 1966. Rather than embracing a truly populist platform like Huey Long’s in the 1930s, Wallace encouraged his white supporters to direct most of their anger toward newly enfranchised blacks. When he ran as an independent in the 1968 presidential election, he won 13.5% of the popular vote, a significant improvement upon Thurmond’s 2.4%. Despite being a neo-Confederate at heart, he made significant headway outside the South, attracting tens of thousands at rallies above the Mason-Dixon line; his populist rhetoric and outsider image endeared him to blue-collar whites as far north as Wisconsin. Many union members who would otherwise have voted Democratic bought into his warning that integration would destroy the labor movement. (As always, people straddling the line between the lower and middle classes were the easiest prey for fear-based politics.) Through all this, Wallace stood with the Confederate flag behind him, figuratively and literally. Among the many disastrous consequences of the 1968 election was the permanent association of unpolished white populism with Southern pride. From then on, it became a safe bet that whenever lower-middle-class white resentment bubbled to the surface, no matter where in the country, it would come wrapped in the Confederate flag.


Northern whites lack a unified ethnocultural identity. This could be due to the outcome of the Civil War—the victors may write history, but the losers are often awash in fear, resentment, and self-pity. Such forces bind the populace together and can prove very dangerous in the hands of nationalists (think interwar Germany). It may also be due to their relative diversity; in the 19th and 20th centuries America received a massive influx of immigrants from all over Europe and the majority settled in the heavily industrialized Northeast and mid-Atlantic. Maintaining a straightforward regional identity in the face of constant demographic upheaval is difficult if not impossible.

Now, imagine yourself in the rural North in an age where it is mandated that you consciously create a capital-I Identity for yourself. One is supposed to create this “identity” through consumer choices and Facebook cover photos. You are white, as are most of the people you know. You have a high school education and all your employment prospects are either blue collar or low-level white collar. You subscribe to a personal philosophy that emphasizes disciplined physical labor as the bedrock of proper morality, but you also take pride in your lack of city-boy etiquette and frequently engage in lighthearted but legal hedonism. How do you categorize yourself? What do you “identify” as?

Well, fortunately, an identity just for you has been consolidated into a few symbols, hobbies, and character traits, turned into a packaged cultural commodity for your instantaneous adoption and consumption. This identity is The South. The fake, commodified South, that is, not to be confused with the actually existing South, which has a rich cultural history and (unlike the commodified South) has black people in it. This imaginary South is about all-camo outfits and huntin’, fishin’, and spittin’ to spite coastal elites who want to make it illegal to hunt, fish, and spit. The commodified South is Duck Dynasty, McDonald’s sweet tea, and country songs that have “country” in the title. People seem to really like this stuff, which is why, compared to other regions, the South is overrepresented among Zippo lighter designs and truck decals.

Partially divorced from context, what was once a symbol of an aristocratic slave society becomes, paradoxically, part of a tradition of populist Americana along with John Wayne, Chief Wahoo, and the Pixar version of Route 66. Fully divorced from context, the flag becomes a symbol of vague, noncommittal rebellion. It takes its place alongside a series of meaningless but ubiquitous kitschy products: wolf shirts, the pissing Calvin decal, skull-adorned lighters, and overly aggressive Minions memes about what people can and can’t do before they’ve had their coffee.

The small bit of context that the flag does retain is put to sinister ends. Among rural whites, a watered-down version of neoconfederate ideology serves as a kind of mutant substitute for class consciousness. This is especially evident in modern country music, where many songs are essentially a bullet-point list of stereotypes: big trucks, cheap beer, dirt roads, and physically demanding blue-collar work. Take, for example, Lee Brice’s 2014 smash hit “Drinking Class”:

“I belong to the drinking class / Monday through Friday, man we bust our backs / If you’re one of us, raise your glass / I belong to the drinking class.”

The structure of Brice’s lyrics shows a keen awareness of socioeconomic class. But this is not the labor movement’s conception of class, with its exhortations to social change. The Lee Brice theory of class is empty of meaning. It’s hopeless and sad; nothing is left but solipsistic in-group pride and alcoholism. The vice neuters any revolutionary fervor. A member of the Drinking Class isn’t interested in social climbing, and he would never dream of doing away with class distinctions altogether.


The Drinking Class man knows life is pretty rotten, that you work and drink until you die. But, strongly encouraged by millionaire tribunes of the working poor like the guy from Dirty Jobs, the guy from Duck Dynasty, and the guy from Larry the Cable Guy (plus fellow reality star Donald J. Trump), he adopts flimsy, prejudiced rationalizations to explain his very real feelings of being forgotten and exploited. He justifies his toil as morally necessary, rather than exploitative. And like a surly teen alienated from his parents and bored with masturbation, he joins a cultural clique and cements his place in it by lashing out at its real or imaginary enemies. To get back at the elites who mocked him for making little sense, he begins to do things that make little sense, such as flying a Confederate flag in Massachusetts. (Half-assed clique membership is often embarrassing, like when homophobic metalheads get tricked into wearing leather daddy outfits.)

We can therefore find explanations, if not justifications, for the peculiar existence of our Yankee Confederate. Some of it is stupid, some of it is racist, and some of it is a misguided response to the need for identity and solidarity. Like depressed teens, alienated rural whites aren’t imagining their suffering, and they do have legitimate grievances about the unending despair of the American status quo. But they have reacted in a way that’s difficult to defend either rationally or morally.

The solution here is to organize against the policies that created an alienated rural working class in the first place. To the extent that the flag is a product of the search for identity and community, the left needs to offer people a better, less appalling identity. To the extent that the flag is a product of racism, what is racism itself a product of? Working-class whites have often blamed their problems on nonwhites, but this is irrational scapegoating. And since it’s irrational scapegoating, the left should think seriously about how to give people real explanations for their problems, as well as real solutions. The New England Confederate is a bizarre and horrifying sight, but he is not without his structural causes. If we can offer a unifying message to working-class people of all races, we may see fewer members of the Drinking Class embrace backward cultural symbols and buy into the South as a consumer lifestyle brand. Battle flag keychains may create a cheap rush of ersatz proletarian solidarity, but they are no substitute for the real thing.

Illustrations by Gurleen Rai