The Dangerous Academic is an Extinct Species

If these ever existed at all, they are now deader than dodos…

It was curiosity, not stupidity, that killed the dodo. For too long, we have held to the unfair myth that the flightless Mauritian bird became extinct because it was too dumb to understand that it was being killed. But as Stefan Pociask points out in “What Happened to the Last Dodo Bird?”, the dodo was driven into extinction partly because of its desire to learn more about a new, taller, two-legged creature who disembarked onto the shores of its native habitat: “Fearless curiosity, rather than stupidity, is a more fitting description of their behavior.”

Curiosity does have a tendency to get you killed. The truly fearless don’t last long, and the birds who go out in search of new knowledge are inevitably the first ones to get plucked. It’s always safer to stay close to the nest.

Contrary to what capitalism’s mythologizers would have you believe, the contemporary world does not heap its rewards on those with the most creativity and courage. In fact, at every stage of life, those who venture beyond the safe boundaries of expectation are ruthlessly culled. If you’re a black kid who tends to talk back and call bullshit on your teachers, you will be sent to a special school. If you’re a transgender teenager like Leelah Alcorn in Ohio, and you unapologetically defy gender norms, they’ll make you so miserable that you kill yourself. If you’re Eric Garner, and you tell the police where they can stick their B.S. “loose cigarette” tax, they will promptly choke you to death. Conformists, on the other hand, usually do pretty well for themselves. Follow the rules, tell people what they want to hear, and you’ll come out just fine.

Becoming a successful academic requires one hell of a lot of ass-kissing and up-sucking. You have to flatter and impress. The very act of applying to graduate school to begin with is an exercise in servility: please deem me worthy of your favor. In order to rise through the ranks, you have to convince people of your intelligence and acceptability, which means basing everything you do on a concern for what other people think. If ever you find that your conclusions would make your superiors despise you (say, for example, if you realized that much of what they wrote was utter irredeemable manure), you face a choice: conceal your true self or be permanently consigned to the margins.

The idea of a “dangerous” academic is therefore somewhat self-contradictory to begin with. The academy could, potentially, be a place for unfettered intellectual daring. But the most daring and curious people don’t end up in the academy at all. These days, they’ve probably gone off and done something more interesting, something that involves a little bit less deference to convention and detachment from the material world. We can even see this in the cultural archetype of the Professor. The Professor is always a slightly harrumphy—and always white and male—individual, with scuffed shoes and jackets with leather elbows, hidden behind a mass of seemingly disorganized books. He is brilliant but inaccessible, and if not effeminate, certainly effete. But bouncing with ideas, so many ideas. There is nothing particularly menacing about such a figure, certainly nothing that might seriously threaten the existing arrangements of society. Of ideas he has plenty. Of truly dangerous ones, none at all.

If anything, the university has only gotten less dangerous in recent years. Campuses like Berkeley were once centers of political dissent. There was open confrontation between students and the state. In May of 1970, the Ohio National Guard killed four students at Kent State. Ten days later, police at the historically black Jackson State University fired into a crowd of students, killing two. At Cornell in 1969, armed black students took over the student union building in a demand for recognition and reform, part of a pattern of serious upheaval.

But over the years the university became corporatized. It became a job training center rather than an educational institution. Academic research became progressively more specialized, narrow, technical, and obscure. (The most successful scholarship is that which seems to be engaged with serious social questions, but does not actually reach any conclusions that would force the Professor to leave his office.)


The ideas that do get produced have also become more inaccessible, with research inevitably cloaked behind the paywalls of journals that cost astronomical sums of money. At the cheaper end, the journal Cultural Studies charges individuals $201 for just the print edition, and charges institutions $1,078 for just the online edition. The science journal Biochimica et Biophysica Acta costs $20,000, which makes Cultural Studies look like a bargain. (What makes the pricing especially egregious is that these journals are created mostly with free labor, as academics who produce articles are almost never paid for them.) Ideas in the modern university are not free and available to all. They are in fact tethered to a vast academic industrial complex, where giant publishing houses like Elsevier make massive profits off the backs of researchers.

Furthermore, the academics who produce those ideas aren’t exactly at liberty to think and do as they please. The overwhelming “adjunctification” of the university has meant that approximately 76% of professors… aren’t professors at all, but underpaid and overworked adjuncts, lecturers, and assistants. And while conditions for adjuncts are slowly improving, especially through more widespread unionization, their place in the university is permanently unstable. This means that no adjunct can afford to seriously offend. To make matters worse, adjuncts rely heavily on student evaluations to keep their positions, meaning that their classrooms cannot be places to heavily contest or challenge students’ politics. Instructors could literally lose their jobs over even the appearance of impropriety. One false step—a video seen as too salacious, or a political opinion held as oppressive—could be the end of a career. An adjunct must always be docile and polite.

All of this means that university faculty are less and less likely to threaten any aspect of the existing social or political system. Their jobs are constantly on the line, so there’s a professional risk in upsetting the status quo. But even if their jobs were safe, the corporatized university would still produce mostly banal ideas, thanks to the sycophancy-generating structure of the academic meritocracy. But even if truly novel and consequential ideas were being produced, they would be locked away behind extortionate paywalls.

The corporatized university also ends up producing the corporatized student. Students worry about doing anything that may threaten their job prospects. Consequently, acts of dissent have become steadily de-radicalized. On campuses these days, outrage and anger are reserved for questions like, “Is this sushi an act of cultural appropriation?” When student activists do propose ways to “radically” reform the university, it tends to involve adding new administrative offices and bureaucratic procedures, i.e. strengthening the existing structure of the university rather than democratizing it. Instead of demanding an increase in the power of students, campus workers, and the untenured, activists tend to push for symbolic measures that universities happily embrace, since they do not compromise the existing arrangement of administrative and faculty power.

It’s amusing, then, that conservatives have long been so paranoid about the threat posed by U.S. college campuses. The American right has an ongoing fear of supposedly arch-leftist professors brainwashing nubile and impressionable young minds into following sinister leftist dictates. Since massively popular books like Roger Kimball’s 1990 Tenured Radicals and Dinesh D’Souza’s 1991 Illiberal Education: The Politics of Race and Sex on Campus, colleges have been seen as hotbeds of Marxist indoctrination that threaten the civilized order. This is a laughable idea, for the simple reason that academics are the very opposite of revolutionaries: they intentionally speak to minuscule audiences rather than the masses (on campus, to speak of a “popular” book is to deploy a term of faint disdain) and they are fundamentally concerned with preserving the security and stability of their own position. This makes them deeply conservative in their day-to-day acts, regardless of what may come out of their mouths. (See the truly pitiful lack of support among Harvard faculty when the university’s dining hall workers went on strike for slightly higher wages. Most of the “tenured radicals” couldn’t even be bothered to sign a petition supporting the workers, let alone march in the streets.)

But left-wing academics are all too happy to embrace the conservatives’ ludicrous idea of professors as subversives. This is because it reassures them that they are, in fact, consequential, that they are effectively opposing right-wing ideas, and that they need not question their own role. The “professor-as-revolutionary” caricature serves both the caricaturist and the professor. Conservatives can remain convinced that students abandon conservative ideas because they are being manipulated, rather than because reading books and learning things makes it more difficult to maintain right-wing prejudices. And liberal professors get to delude themselves into believing they are affecting something.


Today, in what many call “Trump’s America,” the idea of universities as sites of “resistance” has been renewed on both the left and right. At the end of 2016, Turning Point USA, a conservative youth group, created a website called Professor Watchlist, which set about listing academics it considered dangerously leftist. The goal, stated on the Turning Point site, is “to expose and document college professors who discriminate against conservative students and advance leftist propaganda in the classroom.”

Some on the left are delusional enough to think that professors as a class can and should be presenting a united front against conservatism. At a recent University of Chicago event, a document was passed around titled “A Call to Professors, Students and All in Academia,” calling on people to “Make the University a Zone of Resistance to the Fascist Trump Regime and the Coming Assault on the Academy.”

Many among the professorial class seem to want to do exactly this, seeing themselves as part of the intellectual vanguard that will serve as a bulwark against Trumpism. George Yancy, a professor of philosophy and race studies at Emory University, wrote an op-ed in the New York Times, titled “I Am A Dangerous Professor.” Yancy discussed his own inclusion on the Professor Watchlist, before arguing that he is, in fact, dangerous:

“In my courses, which the watchlist would like to flag as ‘un-American’ and as ‘leftist propaganda,’ I refuse to entertain my students with mummified ideas and abstract forms of philosophical self-stimulation. What leaves their hands is always philosophically alive, vibrant and filled with urgency. I want them to engage in the process of freeing ideas, freeing their philosophical imaginations. I want them to lose sleep over the pain and suffering of so many lives that many of us deem disposable. I want them to become conceptually unhinged, to leave my classes discontented and maladjusted…Bear in mind that it was in 1963 that the Rev. Dr. Martin Luther King, Jr. raised his voice and said: ‘I say very honestly that I never intend to become adjusted to segregation and discrimination.’… I refuse to remain silent in the face of racism, its subtle and systemic structure. I refuse to remain silent in the face of patriarchal and sexist hegemony and the denigration of women’s bodies.”

He ends with the words:

“Well, if it is dangerous to teach my students to love their neighbors, to think and rethink constructively and ethically about who their neighbors are, and how they have been taught to see themselves as disconnected and neoliberal subjects, then, yes, I am dangerous, and what I teach is dangerous.”

Of course, it’s not dangerous at all to teach students to “love their neighbors,” and Yancy knows this. He wants to simultaneously possess and devour his cake: he is doing nothing that anyone could possibly object to, yet he is also attempting to rouse his students to overthrow the patriarchy. He suggests that his work is so uncontroversial that conservatives are silly to fear it (he’s just teaching students to think!), but also places himself in the tradition of Martin Luther King, Jr., who was trying to radically alter the existing social order. His teaching can be revolutionary enough to justify Yancy spending time as a philosophy professor during the age of Trump, but benign enough for the Professor Watchlist to be an act of baseless paranoia.

Much of the revolutionary academic resistance to Trump seems to consist of spending a greater amount of time on Twitter. Consider the case of George Ciccariello-Maher, a political scientist at Drexel University who specializes in Venezuela. In December of 2016, Ciccariello-Maher became a minor cause célèbre on the left after getting embroiled in a flap over a tweet. On Christmas Eve, for who knows what reason, Ciccariello-Maher tweeted “All I Want for Christmas is White Genocide.” Conservatives became enraged, and began calling upon Drexel to fire him. Ciccariello-Maher insisted he had been engaged in satire, although nobody could understand what the joke was intended to be, or what the tweet even meant in the first place. After Drexel disowned Ciccariello-Maher’s words, a petition was launched in his defense. Soon, Ciccariello-Maher had lawyered up, Drexel confirmed that his job was safe, and the whole kerfuffle was over before the nation’s half-eaten leftover Christmas turkeys had been served up into sandwiches and casseroles.

Ciccariello-Maher continues to spend a great deal of time on Twitter, where he frequently issues macho tributes to violent political struggle, and postures as a revolutionary. But despite his temporary status as a martyr for the cause of academic freedom, one who terrifies the reactionaries, there was nothing dangerous about his act. He hadn’t really stirred up a hornet’s nest; after all, people who poke actual hornets’ nests occasionally get stung. A more apt analogy is that he had gone to the zoo to tap on the glass in the reptile house, or to throw twigs at some tired crocodiles in a concrete pool. (When they turned their rheumy eyes upon him, he ran from the fence, screaming that dangerous predators were after him.) U.S. academics who fancy themselves involved in revolutionary political struggles are trivializing the risks faced by actual political dissidents around the world, including the hundreds of environmental activists who have been murdered globally for their efforts to protect indigenous land.


Of course, it’s true that there are still some subversive ideas on university campuses, and some true existing threats to academic and student freedom. Many of them have to do with Israel or labor organizing. In 2014, Steven Salaita was fired from a tenured position at the University of Illinois for tweets he had made about Israel. (After a protracted lawsuit, Salaita eventually reached a settlement with the university.) Fordham University tried to ban a Students for Justice in Palestine group, and the University of California Board of Regents attempted to introduce a speech code that would have punished much criticism of Israel as “hate speech.” The test of whether your ideas are actually dangerous is whether you are rewarded or punished for expressing them.

In fact, in terms of danger posed to the world, the corporatized university may itself be more dangerous than any of the ideas that come out of it.

In Hyde Park, where I live, the University of Chicago seems ancient and venerable at first glance. Its Ye Olde Kinda Sorta Englande architecture, built in 1890 to resemble Oxbridge, could almost pass for medieval if one walked through it at dusk. But the institution is in fact deeply modern, and like Columbia University in New York, it has slowly absorbed the surrounding neighborhood, slicing into older residential areas and displacing residents in landgrab operations. Despite being home to one of the world’s most prestigious medical and research schools, the university refused for many years to open a trauma center to serve the city’s South Side, which had been without access to trauma care. (The school only relented in 2015, after a long history of protests.) The university ferociously guards its myriad assets with armed guards on the street corners, and enacts massive surveillance on local residents (the university-owned cinema insists on examining bags for weapons and food, a practice I have personally experienced being selectively conducted in a racially discriminatory manner). In the university’s rapacious takeover of the surrounding neighborhood, and its treatment of local residents—most of whom are of color—we can see what happens when a university becomes a corporation rather than a community institution. Devouring everything in the pursuit of limitless expansion, it swallows up whole towns.

The corporatized university, like corporations generally, is an uncontrollable behemoth, absorbing greater and greater quantities of capital and human lives, and churning out little of long-term social value. Thus Yale University needlessly decided to open a new campus in Singapore despite the country’s human rights record and restrictions on political speech, and New York University decided to needlessly expand to Abu Dhabi, its new UAE campus built by low-wage workers under brutally repressive conditions. The corporatized university serves nobody and nothing except its own infinite growth. Students are indebted, professors lose job security, surrounding communities are surveilled and displaced. That is something dangerous.

Left professors almost certainly sense this. They see themselves disappearing, the campus becoming a steadily more stifling environment. Posturing as a macho revolutionary is, like all displays of machismo, driven partially by a desperate fear of one’s impotence. They know they are not dangerous, but they are happy to play into the conservative stereotype. But the “dangerous academic” is like the Dodo in 1659, a decade before its final sighting and extinction: almost nonexistent. And the more universities become like corporations, the fewer and fewer of these unique birds will be left. Curiosity kills, and those who truly threaten the inexorable logic of the neoliberal university are likely to end up extinct.

Illustrations by Chris Matthews.

Bourgeois Feminist Bullshit

The Rebecca Traister view of gender and the world…

The last decade has seen the flourishing of a genre we might describe as “Sociology Porn,” a form of pop-culture social studies that emerges from the incomprehensible minds of people like David Brooks. These are the richly rewarded ramblings eagerly consumed by a public that wants to appear learned without doing too much of the actual research. Nuggets like the sort offered regularly by Brooks are like bacon-wrapped dates: Tiny morsels of fact wrapped in a rich coating of fatty nonsense.

Of late, Sociology Porn has turned its attention to Women and, in particular, Single Women. In the last few years, we’ve seen books like Moira Weigel’s Labor of Love: The Invention of Dating, Jessica Valenti’s Sex Object: A Memoir, and Kate Bolick’s Spinster: Making a Life of One’s Own. As the (actual) sociologist Bella DePaulo began pointing out over a decade ago in Singled Out: How Singles Are Stereotyped, Stigmatized, and Ignored, and Still Live Happily Ever After, single people — not just women — have been on the rise for many years. Yet society and the laws governing mundane-but-deeply-important matters like taxation and housing have yet to catch up with the reality that fewer people than ever before see marriage as their lives’ crowning glory. (In addition, social benefits like childcare, healthcare, and paid sick leave remain entirely inaccessible to all but the most privileged single people.)

The newer books take such sociological analysis and numbers and turn them into fun, peppy narratives about how incredibly great and awesome it is that so many women are now resolutely single. Such a surge of interest in The Single Woman should bring joy to the millions of women who have borne the hardship of centuries of stigma. Single Women were once feared as witches (and then summarily executed), and they have been rendered Sad Sadies so desperate for sex that they take to hallucinating partners (recall Miss Lonelyhearts in Rear Window). The times are a-changing and today Single Women can take heart in the fact that they exist in numbers large enough to attract the attention of so many. Presumably, singletons read such books avidly and want to know: Will there now be more men to fuck? More women? What do I do with my infinite spare time? Will a Cat Cafe open in my neighborhood?

The latest in this body of pop sociology is Rebecca Traister’s All the Single Ladies: Unmarried Women and the Rise of an Independent Nation, based on a number of interviews with single women and a history of marriage drawn from several sources. The title, derived from Beyoncé’s hit single, firmly locates the book in popular culture, immediately signaling its affinity with the now, the hip, the happening. The subtitle is reassuring: It’s not that Unmarried Women (note the emphasis on marital status) are on the rise, because that would imply they were acting in solidarity and, perhaps, against the general (and married) public. Rather, their growing presence is proof that America is, true to its storied (if entirely fictional) history, The Land of the Free.  We Americans have allowed these women to remain unmarried and they add to our national identity and reaffirm that which has served to define us since the Boston Tea Party, our Love of Independence.

All the Single Ladies is an earnestly researched project, written by someone who is acutely sensitive to the political sensibilities of every single population. Much like the campaign of Hillary Clinton, a woman Traister has admired and written about for nearly a decade (her first book, Big Girls Don’t Cry, was about Clinton’s 2008 presidential campaign), this book reads like it went through multiple focus groups. It is careful never to offend and, in the process, offers little of substance that has not been said before.


Like any decent research project, it is not without a few useful bits. Traister carefully points out, for instance, that in the years following World War II, the freedom from domesticity that so many white suburban women were able to enjoy in the late 1950s and onwards was enabled by the fact that so many black women could only find employment as domestic labor in their households. In terms of policy changes, her goals, which include more childcare resources and better pay, seem laudable enough.

But, ultimately, Traister has no real interest in Single Women except as a foil to the Married; she’s more interested in maintaining, not disrupting, the primacy of coupledom itself.  The biggest flaw in the book, and in Traister’s vision as a whole, is that she ultimately sees singles not as independent people, valuable in and of themselves, but as those whose value is created by their usefulness to married couples and the state. So, for instance, single women are amazing because, according to the Bureau of Labor Statistics, they “spend more than a trillion dollars annually.” According to another report, “single, childless, non-cohabiting women over the age of twenty-seven are spending more per capita than any other category of women on dining out, rent, or mortgage, furnishings, recreation, entertainment, and apparel: $50 billion a year on food, $22 billion on entertainment, and $18 billion on cars.”

But more than anything else, single women are now making for better marriages. What does Traister mean by this puzzling assertion? Well, it turns out that single women are not necessarily abandoning marriage altogether (phew!), but simply pursuing it later. In the process, according to Traister, they’re actually helping make marriage better. And they’re not just making their own marriages better, by being more sure and confident of their needs, she tells us excitedly: They help improve other people’s marriages by modeling what confident women should be like, for their married male coworkers. She quotes Susan B. Anthony: “Once men were afraid of women with ideas and a desire to vote. Today, our best suffragettes are sought in marriage by the best class of men.” Traister’s happy conclusions about the usefulness of unmarried women come long after she explores their rise in numbers. Which is to say: First, she gives us the bad news, that there are more of them, and then she reassures us that it’s all, actually, good news.


All of this will no doubt be a huge relief for millions of Single Women. Their single status is now no longer a blight upon the land but a set of Very Useful Functions that keep the economy running smoothly. They spend more! They eat well! Goodbye to stale cheese and rice and beans! They tap into the housing market! They make other women’s husbands better, like training wheels on a bicycle!

Traister’s utilitarian view of single women explains the kind of single women she interviewed for the book. Of course, in keeping with her earnest adherence to a focus-group-like laser focus, they are culturally and racially diverse. But none of these women are anything but secure and successful, and they all have stable incomes.

And none of them are so gauche as to dislike or, Goddess Hera forbid, hate or be opposed to marriage. Instead, in the spirit of modern bourgeois feminists, they cast their decisions within the penumbra of “choice” — not rebellion, not revolt against an institution that has historically enslaved women and children, not a desire to strike a path away from the economic and political constraints foisted upon them, not an angry demand that the state should do better about the needs of a marginalized population, but merely a choice, like picking the shirt or handbag that suits you best. She approvingly quotes Anita Hill, who says she wants people to understand that, “you can have a good life, despite what convention says, and be single. That doesn’t mean you have to be against marriage. It just means that there are choices that society should not impose on you.”

In Sex and the City, which lurks in the background for all works in this genre of Sociology Porn about Single Women, Charlotte famously shouts, while justifying her decision to become dependent on her husband, “I choose my choice, I choose my choice!” The much-quoted lines are symptomatic of the attitude towards marriage among bourgeois feminists. The mantra of “choice” is taken to mean that there are multiple kinds of feminism, including the sort where women relinquish their economic independence, or support the erasure of abortion rights. The feminism Traister upholds, as Hillary Clinton upheld, is what we might call a “Big Tent Feminism,” the sort that makes allowances for every possible variation of “feminism” under the logic that If Women Want It, It Must Be Feminist. No matter how poisonous the effects  may be (such as Hillary Clinton’s vote to authorize a brutal war that killed many thousands of innocent women), an empowered woman’s act is always a feminist act.

But what if Traister had interviewed those feminists who don’t see a separation between their economic interests and their gendered interests? What about the feminists who are single because they are resolutely against marriage? What about those who aren’t an economic benefit to everyone else, or who hate shopping? And what about those who think marriage is a terrible, rotting institution, yet eventually succumb to it because everything about American society, from healthcare to childcare to housing rights, is still organized to shut down women’s aspirations toward independence?

What if, for instance, we heard from the single woman in New York, also the epicenter of Traister’s world, who isn’t able to make rent but can’t stomach the idea of marrying that person with access to a rent-controlled apartment so she can stay? What of the single woman pushed into a homeless shelter and separated from her child by the Department of Child and Family Services, which deemed her an unfit mother because she was too poor, the city having refused to help her find housing? Try telling these women how wonderful being single is, what a truly liberating choice they have freely made.


At the end of the book, Traister provides a section titled “Where Are They Now?”, with updates from some of her interview subjects. These are chirpy and chipper: “Happily single and living in Atlanta.” “She is happy.” “While finding work-life balance and managing finances remain ongoing challenges, Letisha wouldn’t trade her experiences of being a mom for anything.”

But those “ongoing challenges” eventually wear many women down, unless they become independently wealthy. As DePaulo astutely pointed out, the lives of married people are often literally subsidized by Singles — it’s Singles who are still expected to take up the (often un- or underpaid) slack when married colleagues won’t work overtime, for instance. Even when tenured, single female academics are often expected to take on additional roles of nurturing students and extra unpaid administrative work.

In a section on wealth and poverty, Traister recognizes that there is a sticky underbelly of economic displacement and exploitation. She sees that the absence of wealth makes marriage coercive, and that being single is great if you’re rich and sucks if you’re poor. Yet on the whole, she paints a relentlessly sunny portrait of what it means to be Single in America. She knows that unless women are well-off, their single status is actually a source of material hardship. But this does not alter her generally rosy assessment, nor does it cause her to believe that a fundamental change in the economic order is necessary.

Broadly speaking, what Traister offers in her book is not an expansive history of a growing social trend, but a reassurance to a certain class of women and men that singledom does not threaten either the state of marriage or the state that requires people to be married and, most crucially, that the rise of singles will never threaten the stranglehold of capitalism. But a politically sharp diagnosis of singledom would not simply show that singles are rising in number, it would indicate the potential for their growth to actually disrupt the political status quo.

What would a more disruptive view of Singles look like?  What if we actually took gender out of the picture altogether? To consider those questions, we have to first understand why Traister’s work and bourgeois feminism have been so appealing in the first place.

Traister’s book is, as far as these things can be accurately gauged, wildly successful. It has been talked about a great deal and she has appeared in and on several venues, like Real Time with Bill Maher, CBS’s This Morning, Elle, and NPR’s Fresh Air. The blurbs for her work are full of a level of praise unusual even in the world of blurb-writing: The New York Times describes her as “visionary,” while Anne Lamott declares hers to be “the most brilliant voice on feminism in this country.” It’s not simply that Traister’s book has sold a lot of copies, but that her vision of Singles is an influential one. But what explains the over-the-top praise? Why would something making rather basic observations be hailed as “visionary” and “brilliant”? What’s with these gushing blurbs?

In fact, the blurbs make perfect sense. Bourgeois feminism has a strong hold on the liberal and progressive imagination. Traister’s analysis echoes every principle of this feminism, and that explains why an unremarkable book that reads like a homework assignment is praised as “visionary.” All the Single Ladies and its particular version of feminism appeal to the upper-middle-class feminist sensibility: It appears to empower, while in fact reaffirming power as it already exists. It flatters people into thinking that the existence of single women is revolutionary in itself, even though its whole argument is that they don’t disrupt the economy or anything else. But being single is no more revolutionary or interesting or world-changing than marrying; the point ought not to be what people are doing in their personal lives, but what changes we can bring about in their political and economic lives.


As we saw too clearly in the last election, this kind of feminism is incapable of thinking beyond the symbolic, and of creating meaningful changes in women’s lives. One cannot discuss Rebecca Traister without discussing Hillary Clinton, since Traister’s support for Clinton has been the source of much of her dispute with fellow feminists. Clinton is the exemplar of a “non-threatening” feminism, one that gets women into boardrooms (and into public office) without actually changing the underlying structure of companies or governments. The reason rich Democrats overwhelmingly favored Hillary Clinton in the primary is that Hillary Clinton offers their ideal political platform: symbolic gains for the gender, without the actual material gains that might require sacrificing some wealth.

Yet wealthy feminists like Traister, because they do not understand how women’s interests can actually conflict based on economic class, cannot understand opposition to Clinton as anything other than sexism. Thus when fifty-three percent of white women voted for Trump against the first female major-party presidential candidate, it created a puzzle for Traister. She was among those who saw Trump’s victory squarely as a result of racism and sexism. As she wrote in her diagnosis for New York magazine, Trump “was made possible by voters threatened by the increased influence of women and people of color.” For Traister, it is impossible that these women had a class identity, that they disliked Hillary Clinton for her ties to Wall Street. The logic of bourgeois feminism is that if you don’t vote for a woman, there must be something wrong with you. Implicit in this summation is a nasty bit of class-based innuendo: Only White Trash would vote for Trump against a female candidate. On this reasoning, women who don’t vote for women are essentially betraying their sex. We might recall here Madeleine Albright’s infamous statement in support of Hillary Clinton, that “there’s a special place in hell for women who don’t help each other!”

But Albright, Clinton, and Traister’s feminism is not the feminism of working class or middle class women; it is inherently about solidifying the interests of wealthy women — consider, for instance, Hillary Clinton’s deep, expressed contempt for baristas. In October of 2016, a taped speech of Hillary Clinton speaking to her wealthy fundraisers emerged, in which she described Sanders supporters as delusional “basement-dwellers.”  Her comments are worth quoting at some length:

“They’re children of the Great Recession. And they are living in their parents’ basement. They feel they got their education and the jobs that are available to them are not at all what they envisioned for themselves. And they don’t see much of a future….If you’re feeling like you’re consigned to, you know, being a barista, or you know, some other job that doesn’t pay a lot, and doesn’t have some other ladder of opportunity attached to it, then the idea that maybe, just maybe, you could be part of a political revolution is pretty appealing.”

This is the same contemptuous logic echoed by Traister, who can’t be bothered to interview any of these baristas who “don’t see much of a future,” and who might also be resolutely single women. Clinton could not, cannot, and will not conceive that someone her daughter’s age might actually want to be a barista because she loves the job. Perhaps she just wants to be paid well enough that she never has to take a second job. She wants to be unionized to guarantee she will not be fired because she refused to give her (married) manager a blowjob behind the fridge. And she doesn’t want to have to leave abruptly because she had a child, unwanted or not.

In Clinton’s remarks about baristas, one can sense the values held by this kind of feminism. One is “consigned” to being a barista, because being a barista is not what the successful people do. To the extent there is a problem, it is that people are not getting to run startups rather than pour coffee. But a socialist looks at the situation differently: The problem is not that people are “stuck” being baristas, it is that baristas are not accorded the respect and economic security that they should be given. For Clinton/Traister, there should be a meritocracy in which anyone can rise from their lowly, pitiful, underpaid position to become the boss. For someone committed to actual material equality, there shouldn’t be bosses, or lowly, underpaid positions, to begin with. It’s not that everyone should be able to get to the top of the hierarchy of female success, it’s that the hierarchy must be destroyed, so that people can do what they want with their lives without having to worry about whether they will be able to feed themselves or their children.

This lack of a serious vision of economic equality for women explains why Traister has massively overpraised Hillary Clinton’s significance for women. In her book on the 2008 election, Traister calls it “the election that changed everything for American women,” and has a chapter entitled “Hillary is us.” But Clinton’s 2008 campaign changed literally nothing for American women. They were still working the same jobs the day after she conceded as they were the day before she announced her run. And she definitely isn’t us in any important way. She’s not us, first and foremost, because she has several hundred million dollars of wealth, and because she doesn’t recognize that our lives are defined by the constraints of our economic conditions.

Clinton and Traister’s bourgeois feminism therefore absorbs the logic of capitalism — the accrual of wealth by the few — and vomits it out again as the affirmation of gender. In Traister’s worldview, single women are defined largely by their gendered interests, with economic interests being secondary. Traister makes feminism about empathy, desire, and shopping. But feminism is not something that comes about simply because of the presence of women; it is fundamentally about changing the world so that everyone, regardless of gender, has the same access to material benefits without needing to perform some economically useful function to the state and society. Gender is the tunnel through which we travel and understand one set of very, very stark oppressions. But a feminist revolution that simply ascribes “proper” functions to women alongside men or other women is not a revolution; it is simply a realignment of the status quo.

Feminist principles are not, ultimately, simply about making things better for women. They are about paying attention to gender in order to think about policies that make things better for everyone. So, for instance, a feminism that is simply about ensuring that women at the top get bathrooms with diaper-changing stations means nothing if the women and men who are cleaning those bathrooms — and presumably wiping baby shit from the walls — get neither time off nor the ability to place their children in care while at work. A policy that ensures that female professors get to take a year off after having their babies is useless if the system continues to simply hire adjuncts of all genders — who get no such benefits, no matter how well paid they are — to fill in for them.

Rebecca Traister and Bourgeois Feminists like her neither understand nor want any of this. And there is a Special Place In Hell for women who refuse to consider a feminism meant to ensure freedom for all, regardless of gender.

Killing You Softly With Her Dreams

Arianna Huffington’s war on sleep…

Arianna Huffington wants to put you to sleep.

In her new book, The Sleep Revolution: Transforming Your Life, One Night at a Time, Huffington dramatically announces that we are in the middle of an unacknowledged sleep crisis. There is a problem in our society, Huffington tells us: we have forgotten how to sleep. Fortunately, sleepless readers need not fear: Huffington’s handy little book is here to show you how to combat sleeplessness.

Sleep Revolution is written in classic Huffington style: part Deepak Chopra, part Oprah, and strung together with quotes from everyone from Persian poet Rumi to art critic Jonathan Crary to even (bafflingly, for a self-described progressive) the anti-immigrant, Brexit-enabling, racist former Mayor of London, Boris Johnson.

The writing, it should go without saying, is bad. A chapter begins: “From the beginning of time, people have struggled with sleep.” In fact, from the beginning of time, sophomore English teachers have been taking red pen to any essay that starts with “from the beginning of time.” Her phrasing is often corny and riddled with exclamation points.

Sleep Revolution is less a book than a business plan, a typical product of the can-do inspiration industry made popular by the likes of Andrew Weil and Suze Orman, the snake oil salespeople of the 21st century. Like them, Huffington first tells you that you have a problem, one you were unaware you had. She then generously reveals the many products that can help alleviate your symptoms, suggesting plenty of expensive solutions. Huffington has learnt her trade from the best hucksters. She absorbs the techniques of assorted rich people’s gurus, like cult leaders Bhagwan Rajneesh and John-Roger, combining new age verbiage with sly admonitions to give up one’s material wealth (into their outstretched hands, of course).

Huffington undoubtedly possesses a kind of brilliance. It lies not in the quality of her thought or writing, but in her ability to understand and exploit the zeitgeist. The ideas in Sleep Revolution, such as they are, are mostly bits and pieces about sleep deprivation and the problems thereof cribbed and culled from a range of sources (likely the product of several intensive hours of Googling). To be sure, they are banal. And yet Huffington’s book is perfect for our moment in time: it arrives just as capitalism is making many of us more sleepless than ever.

Huffington is never so impolite as to mention that capitalism, which has done well by her and made her a multimillionaire, may be to blame for keeping people working long, sleepless hours. She prefers proposing solutions to diagnosing causes. She tells you to leave your smartphone outside your bedroom, to have warm baths, to disengage. Don’t tackle work emails after a certain time.

Her solutions have the convenient consequence of making you a better worker for your employers, without actually raising your material standard of living. After all, she writes, “it would actually be better for business if employees called in tired, got a little more sleep, and then came in a bit late, rather than call in sick a few days later or, worse, show up sick, dragging themselves through the day while infecting others.” Her advice to her fellow bosses is purely expedient: if the worker drones rest, more labor can be wrung out of them.

This approach to sleep is common in the discourse of “self-care,” in which people are constantly admonished to heal themselves with candles, self-affirmation, and long baths but not told that they can actually revolt against the systems that create their exhaustion in the first place. According to a massive amount of sleep literature, the worst thing we do is not sleep enough, yet that same literature never bothers to wonder what might be keeping us up at night.

Yet many people know full well why they can’t sleep. Many of us juggle multiple jobs to cobble together our livings, and the problem of sleeplessness cuts across class barriers. While those with little or no money battle exhaustion as they travel from job to job, even wealthier people are frequently like hamsters in their wheels, constantly working against the clock to hold on to and add to their fortunes. No matter who you are, under competitive capitalism the rule is the same: you sleep, you lose. Marx once pointed out that capital is vampire-like and feeds on dead labor. But that’s somewhat unfair to vampires. After all, unlike vampires, capital never sleeps.

Capitalism has never slept much, and has always relied on the lack of sleep of millions of workers to be as efficient as possible. In fact, until the invention of the eight-hour day and the weekend (both startlingly new ideas, for which workers had to fight hard) “work” as such simply carried on day by draining day. Even the idea of a legally mandated lunch break is astonishingly recent.

Among all of the Huffingtonian pro-sleep, self-help guidance, there is no discussion of the fact that people are compelled to walk around like zombies, without sleep. Take, for instance, the website Everyday Health, which poses the question: “Why Don’t Americans Sleep Enough?” The answer: “Reasons why we’re not getting enough sleep abound, but one of the biggest changes behind the sleep decline is the availability of electricity and technological advances that allow us to work and play 24/7.” Note the phrasing: allow us to work 24/7! Yet most people don’t actually have a choice.

Consider that even something as simple as the lack of good transit systems can effectively ruin your chances of a good night’s sleep. In Chicago, where I live, and where the city’s segregation is enforced through its transit system, it can take two hours or more to get from the mostly white north side to the mostly black and brown south and west sides, and the trip usually involves multiple buses and trains. That’s a commute performed daily by many poorly-paid workers.

And that’s Chicago, a place with relatively good infrastructure. The situation is much worse for those living in cities and towns with little or no public transit (which is most of the United States). Researchers point to the economic consequences of rough commutes, but there are also substantial health costs involved when people spend so much of their lives traveling to and from their jobs and have little energy or time left to recharge or fully rest before the next day’s work. The sheer stress of getting to work can, in the long run, literally kill you. But work we must if we are to survive, and those on the bottom rungs run themselves ragged even before they start their workday.


Huffington is willfully oblivious to all of this, evading questions about workplace conditions even when they are most obvious. She writes that a “2015 Stanford University study of Chinese workers found that those who worked from home saw their productivity go up by 13 percent.”  Only Arianna Huffington could so blithely use the words “Chinese workers” and “productivity” together and not even offer the slightest hint that, perhaps, the rise in productivity is due to factors like the grinding exploitation they are likely to experience. Examples of such obtuseness about the exploitation of capitalism abound in the book, including her glowing praise for Goldman Sachs banning summer interns from staying overnight. Quartz’s sarcastic response to the news puts it best: “A rule that may be obvious to those of us in normal people jobs, this apparently was not clear enough to the aspiring bankers entering the intense Wall Street working environment for the first time.” Praise for such global and rapacious corporations makes it clear that success for Huffington is defined at astronomical levels; it’s not at all about ordinary workers, whose only job is to buy the products she and her friends sell.

Instead of discussing the larger context surrounding sleeplessness, Huffington wants, instead, to remind you of different consequences. Wrinkles, for instance. She cites a UK experiment that showed that a lack of sleep resulted in a 45 percent increase in fine lines and wrinkles in women, and a 13 percent rise in blemishes. She is also concerned that sleeplessness can cause “bad decisions,” and explains away Bill Clinton’s most indefensible presidential decisions as a possible result of a lack of sleep, for example “his inept handling of the issues of gays in the military — now widely considered to be one of the low points of his two-term presidency.” Here, as everywhere in the book, she simply ignores political ideology in favor of a diagnosis that locates acts and consequences entirely on the plane of personal problems.

Huffington is an inveterate name-dropper, and that’s no surprise given that her biggest project so far, Huffington Post, relies on the appearance of many of her celebrity “friends” to supply free labor. “My friend Nora Ephron” makes an appearance, and she describes how “at a lunch for Jennifer Aniston, her manager took me aside,” and the time “when I interviewed [the Dalai Lama].” Oh, and we must not forget the time when “for my Thrive e-course with Oprah, I invited basketball great Kobe Bryant…” (That last one is a small masterpiece of economy, rolling her business enterprise, the planet’s most famous woman after the Queen of England, and a sports legend all into the same sentence.) Huffington’s desire to suck up (there is no elegant way to put this) to powerful and famous people requires her to be spectacularly clueless at times. Following up on the wrinkles theme, she writes effusively that “Jane Fonda credits her age-defying looks to sleep.” In fact, Fonda has gone on record as having had plastic surgery, a fact confirmed by no fewer than three aggregated stories on the Huffington Post itself.

Ultimately, Sleep Revolution tells us very little about what we need to know to get more sleep. Huffington’s slender thesis (“Sleep more so you can make more money”) is covered fully in her 4-minute TED talk on the subject, and solutions to sleeplessness are available in innumerable resources on the internet. The book is less important for what it says and more for what it reveals about Huffington’s place in enabling a particularly rapacious form of capitalism, one which first deprives people of sleep and then sells them the methods by which they might regain some of it.


Arianna Huffington likes to tell her life story as follows: once, a middle-class 16-year-old Greek girl saw a picture of Cambridge University and decided to study there. Against all odds, and with the help of a determined mother, she entered the august institution and quickly made a name for herself, even becoming only the third female president of the 200-year-old Cambridge Union. She became a well-known conservative author and public figure in England, and eventually left for America where she gained spectacular amounts of both wealth and fame.

But the story’s reality is somewhat more complex, and reflects her alliances with two particular powerful men. At the age of 21, Huffington, whose maiden name was Stassinopoulos, met the famed and influential British intellectual Bernard Levin, 22 years her senior, on a game show. It was during their years together that Huffington wrote books insisting that feminism could only appeal to “women with strong lesbian tendencies.”

Not surprisingly, it was in England, still replete with class snobbery, that she earned her most infamous put-downs, being labeled “the most upwardly mobile Greek since Icarus,” as well as “the Edmund Hillary of social climbing.” They’re good lines, though they’re also sexist. No one calls Bill Gates a social climber, and women seem to be the only ones subjected to such snide comments as they make their way upwards. That said, it’s true that large parts of Huffington’s social and financial capital have come about because she was the consort of two powerful men, and she does make much of her immense network of famous friends.

Huffington remained with Levin till she was 30, and then embarked on the next step of her journey, to New York. Only six years after her arrival in America, having ensconced herself in a social circle that included Barbara Walters and Henry Kissinger, she married the oil billionaire Michael Huffington. Levin had given her access to enormous intellectual and cultural capital; Huffington provided her with massive amounts of financial capital.

They divorced in 1997, when their two daughters were eight and six. She would go on to tell an interviewer that she doesn’t believe in marriage, just very good divorces. (Her settlement reportedly gained her $25 million.) Soon after, Michael Huffington came out as bisexual, and Arianna turned into a blazing liberal (whether or not those two facts are connected was the subject of speculation). She began working with Al Franken on Air America. (Remember Air America?) Explaining her sudden shift from right to left, Huffington insists that she had always been socially liberal, and simply saw the light. A different hypothesis can be found in a friend’s observation that in famously liberal Los Angeles, to which Huffington returned after her divorce, her conservatism “would not have gotten [her] invited to a lot of parties.”

Huffington’s rapid geographic and ideological shape-shifting also meant additional scrutiny of the contradiction between her politics and her lifestyle. In 2003, the same year she ran unsuccessfully against Arnold Schwarzenegger in a gubernatorial campaign, she launched an incendiary ad campaign linking SUV owners to terrorists, despite having driven a Lincoln Navigator until the previous year. Huffington has complained about big money corroding democracy, but was a pivotal part of her husband’s unsuccessful campaign against Dianne Feinstein, in which he spent a then-unprecedented $30 million of his personal wealth. Whenever she has been challenged on these inconsistencies, Huffington has simply claimed to have subsequently seen the light.

In a 1995 Mother Jones piece designed as a Guide to Republicans, the comedian Paula Poundstone wrote, “It’s hard to pin down Arianna’s species. If only her ears drooped forward.” It’s a sharp assessment of Huffington’s innate tendency to switch positions. Poundstone also described what was then the celebrity’s fourth book, The Fourth Instinct:  “[S]he says we should be nice. She says it in 248 pages, using her own nice thoughts as a standard toward which we all should strive.” Clearly, the ability to expand a few scant phrases into hundreds of pages has not left Huffington.

But when it comes to discerning what species of political animal Huffington represents, the most striking and truthful description may come from an anonymous source, quoted by the Washington Post, speaking about her then-husband’s disastrous second campaign:

[O]ne person who knows the couple makes a particularly unflattering analogy. It is to the movie a while ago in which a creature would suddenly spring out of a human’s chest.

“I think of that thing in John Hurt in ‘Alien,’” he says, “but with better hair.”

“In Michael,” he says, “she’s found a host.”

In the mythology of the Alien films, the central figure (the aforementioned “thing”) is a vicious space species that exists purely to breed and take over every terrain it encounters, whether a ship or an entire planet. Its method of self-propagation, enabled by a gigantic queen, is to implant eggs in any available host. The egg gestates quickly, finally emerging as a fast-developing creature that mutates in the process, eventually becoming more human-like. By the fourth film in the series, Alien: Resurrection, the creature has developed a womb and gives live birth to its progeny, which proceeds to eat its mother alive.


In the films, alongside the titular, rapacious and monstrous being, there exists another equally deadly force: the ubiquitous Weyland Corporation. All through the series, it becomes clear that Weyland is, if not the only one left, at least one of the biggest corporations in the known universe. Its interests extend from the petty junk-harvesting of space debris and old ships to dreams of universal domination. Its intense desire to harness the Alien itself comes from the corporation’s ambition to use the creature as the ultimate biological weapon. The alien is a perfect killing machine, with acid for blood so toxic it can melt thick steel, spurting out at even the slightest injury and causing massive harm to its adversaries. In the first film, the robot Ash describes the creature with admiration as a “[p]erfect organism. Its structural perfection is matched only by its hostility….I admire its purity. A survivor… unclouded by conscience, remorse, or delusions of morality.”

It is no wonder, then, that the Washington Post’s source should be reminded of the Alien franchise when asked to analyze Huffington.

Yet the Alien comparisons are striking not only for their insight into Huffington personally, but as a means by which to understand her enterprise and the larger formations of capitalism that she has helped to create and cement.

In August 2016, Huffington announced that she was leaving the Huffington Post to focus on her new startup, Thrive Global. The venture, according to the Wall Street Journal, will “work with companies to improve the well-being of their employees.” Set to launch in November, Thrive describes itself as a “corporate and consumer well-being and productivity platform.” At the company’s website, the visitor is led to understand that “By reducing stress and exhaustion, we can improve people’s health and increase productivity for both companies and individuals around the world.”

The point of such an enterprise, wrapped in such transparently vacuous new age verbiage, remains a mystery. For all their pretense otherwise, it’s clear that Huffington and her commercial partners care very little about the effects of sleeplessness on those who are not their target audience. In April 2016 a sleep-deprived Uber driver, too tired to continue driving, asked his passenger to take over, and woke up to find the car embroiled in a high-speed chase with police. A Huffington Post reporter, Sarah DiGiulio, was prevented from “writing” about the story. (At the HuffPo, “writing” means “linking to.”) Post senior editor Gregory Beyer told DiGiulio that they wouldn’t be linking to it because Huffington Post was currently “partnering with Uber on our drowsy driving campaign.” In other words, Huffington’s policy was to ignore or actively censor any story that actually proved that sleeplessness is a function of capitalism, and to protect her financial partner from being implicated in any resulting damage. In response to the story, Uber suspended the driver, then issued a statement about the dangers of sleeplessness (which predictably cited the company’s link-up with the HuffPo and Toyota “to raise awareness of the issue and help save lives.”)

“I cried to dream again,”

—Caliban, The Tempest

The great irony of Huffington’s new enterprises, which promise both sleep and thriving, is that the Huffington Post itself feeds off the sleeplessness of its writers, people who are compelled to stay up all night in order to read and repost pieces about how sleeplessness is ruining their lives. The Huffington Post is notorious for paying not a single cent for most of its contributions, paying writers solely in illusory “publicity.” By building a hugely popular website on unpaid labor, HuffPo played a major role in establishing the pitiful compensation structure currently faced by online writers. If writers can’t sleep, it’s because they make HuffPo rates, i.e. nothing.

The Sleep Revolution is therefore a work of extraordinary gall. There is no consideration of the structural problems with sleeplessness, no critique of the systems which drive people from their beds toward jobs where they nod off to sleep in exhaustion. Arianna Huffington did not invent the web, but she is among those who created the news that never sleeps, in turn created by aggregators working around the clock, so that you might wake up at midnight or three or four in the morning, entertained by yet another set of links about Kate Middleton in a red dress or a hammock for your head so you can sleep on the train on the way to work.

In the Alien films, the Weyland Corporation sends its workers across the universe, millions of light years away in search of material and profits. But travel across the cosmos is time-consuming; workers would inevitably age along the journey, dulling their efficiency. Weyland’s solution is simple: Sleep pods that hold the bodies in suspended animation. Here all natural bodily functions cease, and the workers are reduced to nothing more than bodies. Once at their destination, the ship, a machine that possesses complete control over them, wakes them up and they continue their work. Everyone is a freelancer; everyone is put to sleep till their next gig. In the first film, when Captain Dallas hacks into the ship’s computer to discover the mission’s operating mandate, he discovers a chilling command stating that capturing the alien is the first and only priority. “Crew expendable,” it reads.

On her Twitter feed, Huffington retweets yet another famous billionaire, Melinda Gates, wife of Bill Gates: “Make sure to be gentle to yourself. Take time for yourself. Make sure that you’re taking care of yourself in order to be the best person and do your best job.” Ultimately, that’s all that matters to Huffington and her ilk, that the workers remain at their most fit, churning out content when awake, then suspended in pods until their labor is next required. And should these freelancers prove too costly, well, “crew expendable.” In space, no one can hear you cry in your dreams.

Illustrations by Chris Matthews

Racism and the American Pit Bull

The fear of certain breeds of dogs mirrors the fear of certain people…

Cecil and Harambe. The names circulate in the public consciousness like those of beloved celebrity icons or fallen heroes, or street names evoking particular histories that we might be in danger of forgetting.

Cecil the lion was named after Cecil Rhodes, one of the most brutally racist imperialists ever to roam the planet. In July of 2015, Cecil was lured away from Zimbabwe’s Hwange Game Reserve and shot for sport by Walter James Palmer, an American dentist and hunting enthusiast. International outrage over Cecil’s death was instant and long-lasting, resulting in the creation of laws banning or curtailing trophy game hunting.

Harambe was a 17-year-old lowland gorilla, born and raised in captivity. He was shot to death by zookeepers in May 2016 when a three-year-old boy wandered into his enclosure. The gorilla had supposedly shown signs that he might prove harmful to the child after dragging the boy through the water. The shooting provoked a public furor; a petition signed by over 500,000 insisted that the child’s family be prosecuted for negligence.

When such deaths and stories publicly erupt, they reveal more about the place of animals in human social relations than they do about the actual animals themselves. The culture relentlessly anthropomorphizes them, granting them names and imbuing them with human qualities in order to render them more sympathetic, more deserving of our attention and sympathy.

But such love for animals is profoundly selective. Only certain classes of relatable animals, ones bearing endearing names, are empathized with. Consider, in contrast, the fate of countless and nameless pit bulls.

Pit bulls have long been the bogey dogs of America, subject to harassment and torture because of unwarranted fears about them. Few breeds have been as demonized, though a persistent public relations effort on the part of the dogs' fervent supporters may slowly be turning the tide of opinion. Pit bulls nevertheless continue to be exterminated as part of pre-emptive measures designed to protect the public. Bronwyn Dickey's new book, Pit Bull: The Battle over an American Icon, reproduces a photograph of pit bulls euthanized in Kansas. There are no names attached to the photograph, no individual dogs here, only a dogpile, a small mountain of canine carcasses seemingly thrown casually atop one another, heads and paws facing in different directions.

The laws surrounding pit bulls are as vicious as the dogs are reputed to be. Out of all breeds, pit bulls are the most likely to be subject to Breed Specific Legislation (BSL), and they have been cruelly and mercilessly killed by irate neighbors and police. American police have become notorious for routinely executing dogs when entering houses under search warrants, and an inner-city pit bull owner cannot expect to see their dog survive any such encounter. There are few legal remedies available when police shoot dogs, and the Internet is full of disturbing, heartbreaking testimonies from bereaved owners who have seen their pets gunned down before their eyes.

Importantly, there is no such thing as a pit bull “breed” to begin with. There are several different breeds of dogs that are broadly defined as such, including the English bulldog, a short-legged, slobbering animal that was once literally bred to fight and corral bulls but is now light years away from its long-legged, active ancestor. No one could possibly look at a contemporary bulldog and imagine it as a vicious hunter. (These days they give off more of an “asthmatic Winston Churchill” vibe.)


One of the most prominent myths about pit bulls is that they have special locking jaws and that, once they've sunk their teeth into flesh, they cannot be dislodged without either thrusting a rod between their teeth or killing them. This is not the case, as can be deduced both by common sense and by a glance at the skull of any pit bull. Another myth is that these demon jaws can exert up to 740 pounds of pressure per square inch. This, too, is false.

As a result of such attitudes, pit bulls can essentially be hunted down at will, and their owners suffer from various forms of stigma. BSL means that people with pit bulls or other dogs defined as dangerous breeds cannot rent in many neighborhoods and are compelled to find housing in poorer and often more precarious areas. In 2012, a Maryland court ruled that pit bulls, unlike other dogs, were “inherently” dangerous, thus increasing the owners’ liability for their acts. (The court’s decision was later undone by the state legislature.)

It wasn't always like this. Fans of vintage television may be familiar with Pete the Pup, the pit bull with a ring around his right eye who became a star of The Little Rascals. Then there was "Stubby" (widely held to be a pit bull), who served with the Twenty-Sixth Yankee Division and played an active role with American troops as they traveled to fight alongside the French in 1917. Stubby even reportedly "took" his own German prisoner of war.

Dickey's book explores how the pit bull went from beloved American icon to despised demon dog, subject to extermination at will. The shift in attitudes toward pit bulls reveals much about American society. Dickey's assiduously researched book takes us through the creation of the breed, from its deployment in New York City's notorious 19th-century dogfighting rings to its place as a stalwart companion to war heroes (and, indeed, even as a war hero itself).


The 1970s witnessed the precipitous decline of America's cities. As urban areas struggled, poorer residents, often Latino and Black, came to depend on pit bulls, an affordable means of protection and companionship.

The media vilification of pit bulls soon followed. Dickey suggests that the creation of the 24-hour news cycle, inaugurated by CNN in 1980, represented a turning point. The rise of cable television created a salacious interest in "ghetto" and "thug" stories, and the news networks loved to report on the viciousness of urban "animals," both canine and human. A July 1987 Sports Illustrated story about pit bulls featured a cover illustration of a snarling dog, open-mouthed, fangs on full display. The title, in large print and all caps: "BEWARE OF THIS DOG." During this time, at the height of the Drug War, the media similarly stigmatized Latino and Black men. They were treated as toxic carriers of drug addiction and social dysfunction, much as rats and other animals have been cast as sources of disease.

The link made between savage beasts or dangerous animals and black humans is as old as the history of enslavement. As the actor Michael B. Jordan memorably phrased it: “Black males, we are America’s pit bull. We’re labeled vicious, inhumane, and left to die on the street.” (Jordan made the comment in a promotional interview for the film Fruitvale Station, in which he played Oscar Grant. Grant was an Oakland resident fatally shot by transit police, in a killing that anti-police brutality activists have described as an execution, and proof that black lives in America are treated as expendable.)


The history of relations between African Americans and dogs is complex. On plantations, dogs were trained to track and hunt runaway slaves, a practice that continued in the Southern use of police dogs against civil rights activists. Yet slaves also forged loving relationships with the animals. Dickey writes about Charles Ball, a slave “who escaped from a South Carolina plantation around 1812” and for whom “the love of a dog provided the only sense of comfort he knew.” Ball named his beloved dog Trueman but had to leave him behind during his final escape, knowing that the dog’s bark might give him away. In a poignant section of his memoirs, he wrote, “I recollected that he had always been ready to lay down his life for me; that when I was tied and bound to the tree to be whipped, they were forced to compel me to order my dog to be quiet, to prevent him from attacking my executioner in my defense.”

But one cannot tell the story of relations between African Americans and animals without noting the ways in which black people have been consistently dehumanized themselves. In a slave economy, Africans were treated not just as exploited labor but as display items, suitable for zoos. Their bodies were presented as evidence that they were closer to baser animals like apes. Hundreds of years of racial pseudoscience, which lasted long after slavery's abolition, offered supposed proof that they were less evolved than their white rulers and owners.

Black people were quite literally exhibited as curios and specimens. The most notorious example may be that of Saartje Baartman, born in South Africa in 1789 and sold in her twenties to two white men who took her around the world and put her on public view. They forced her to endure the throngs who came to see and even poke the "Hottentot Venus," endowed with larger buttocks and, so the rumor went, more extensive labia than white women. Baartman would die penniless in Paris only a few years later, and her genitals, brain, and skeleton could be viewed in the Museum of Man until the 1970s. Her remains were only returned to her homeland and buried in 2002.

Baartman was no anomaly. Dickey recounts the story of the Congolese pygmy Ota Benga who, in 1906, was exhibited alongside an orangutan trained to do tricks in the Bronx Zoo. The New York Times weighed in that Benga was part of “a race that scientists do not rate high on the human scale” adding that “it is probably a good thing that Benga doesn’t think very deeply.” Desperate and unable to return to his home, Benga committed suicide in 1916.

Charles Darwin’s theories of evolution were used to further dehumanize entire races of people, put on display or discussed, as Baartman and Benga were, as proof of “living links” between apes and men.

That dehumanization, the belief that black, brown, and other non-white people are lesser beings, persists today, most evident in the police brutality incidents that break out with depressing regularity in the United States. When Rodney King was beaten almost to death in Los Angeles, police officers told of their fear that he was under the influence of phencyclidine (PCP). The same justification was heard in Chicago after the police killing of Laquan McDonald, a 17-year-old shot sixteen times while walking down the street. Since McDonald had drugs in his system, he had become that deadliest of creatures, the frenzied black male, capable of anything. In public discourse, a black man killed by police is inevitably described as someone possessed by PCP, and thus by forces beyond his control, forces which make him so lethal that only death will quell the danger to those around him.

We might recall the myth of pit bulls and their locking jaws.

The links between “animality” and race have always been vividly present even if never explicitly discussed. The concepts of breed, blood, and race have served to determine what constitutes the human, the non-human, and the purported differences between the two. Dickey traces the history of the concept of “breed”:

…how we think about breed and how we think about race inform each other, even though we may not always realize it. The very word ‘race’ comes from the world of dogs, in fact. It was first coined in medieval France, where hunters and falconers classed their animals according to function, like the English, but also according to “nobility,” in a quasi-caste system. The hounds belonging to French royalty were placed in the “highest” race, and the common guard dog belonged to the “lowest.” For several hundred years thereafter, writers across Europe referred to races, rather than breeds, of dog. This was transposed onto humans sometime during the Enlightenment as naturalists, most notably Buffon and Linnaeus, expanded their taxonomies.

The notions surrounding classification made it easy to attest that the "race" of pit bulls was inherently fixed, with persistent breed characteristics that could never diminish. To a degree, of course, dogs can be bred to exhibit some characteristics more than others. Australian shepherds will herd their humans if they're not put to work in actual fields. But as Dickey shows in an entire chapter devoted to the issue, a "breed" has to be carefully maintained: its defining features can literally disappear in a matter of just a few generations of puppies. And, as the animal theorist Colin Dayan points out, "There is no pit bull gene for danger."


In fact, Dickey's research indicates that most of the animals supposedly involved in vicious killings or injuries were not even actual pit bulls. Instead, the simple fact of an attack caused the animal to be identified as a pit bull, with even Golden Retrievers labeled as such. In the tautology established around pit bulls, all pit bulls are dangerous dogs and all dangerous dogs are pit bulls.

African Americans are subjected to the same axiomatic reasoning, even though the races of humans are just as indeterminate. Racial categories are a fluid mess, impossible to define with any precision. But the scientific reality, that race is far more social than biological, has done nothing to prevent confident pronouncements on the essential characteristics of racial groups. And it has certainly never kept African American men from being treated as a dangerous breed, in need of locking up.

The racist attitude connecting dogs and African Americans was never clearer than in the case of Michael Vick. In 2007 came the explosive revelation that Vick, star quarterback for the Atlanta Falcons, had been running a dogfighting ring out of his home in Virginia. The news shocked and horrified his fans and the general public. Reports emerged that several dogs from his "Bad Newz Kennels" had been "drowned, hanged, electrocuted, and beaten to death in addition to the daily pain and suffering they experienced as victims of dogfighting." Vick himself had killed several of them.

Vick served twenty-three months of a three-year sentence, and after his release faced a massive public backlash. He found himself a pariah. No matter how many apologies Vick delivered, scant forgiveness was on display.

But as Dickey points out, none of the athlete’s public denouncers seemed to recall that in 1969, Doug Atkins, the white defensive end of the Saints, openly admitted to using his pit bull Rebel in dogfights. Atkins was inducted into the Pro Football Hall of Fame in 1982.

Vick, unlike Atkins, was not just denounced but relentlessly dehumanized. Dickey writes:

Critics called for Vick to be 'neutered,' electrocuted, or torn apart by dogs. Cartoonists portrayed him as an animal. PETA demanded that he receive a brain scan to test for possible psychopathy before being allowed to return to football. Threats were made against Vick's family members, specifically his children. In 2010, the conservative television commentator Tucker Carlson said, 'I'm a Christian, I've made mistakes myself, I believe fervently in second chances, but Michael Vick killed dogs, and he did [it] in a heartless and cruel way. And I think personally he should have been executed for that. He wasn't.'

The brutal language used to denounce Vick was widespread, legitimized by the fact that it was in defense of dogs. As Jane Berkey, founder of the Animal Farm Foundation put it to Dickey, “Finally, the public hated something worse than it hated pit bulls, and that was Michael Vick.”

Ironically, the Vick revelations created a massive turnaround in the public perception of pit bulls. Vick was ordered to pay $1 million for the long-term care of forty-nine dogs seized from his property. Eventually, according to reports, all but two were accepted by shelters and homes around the country, and they were referred to as the “Vicktory Dogs.” Sports Illustrated ran a story about the dogs, featuring one of the rescued animals on the cover.

Today, after a massive public relations campaign waged by Berkey and others, pit bulls are attaining a nearly mythic image, one completely opposite to their former reputation. They have been termed “nanny dogs” for their temperament and ability to get along with children. On its website, the online resource site Pit Bull Rescue Central lists figures like Helen Keller and Fred Astaire as notable owners of the dogs. All of the famous owners on the list are white. The implication is simple: Who is the typical pit bull owner? Not Michael Vick. The pit bull’s redemption in the public mind directly coincided with its transition from a dangerous “black” dog to a lovable “white” dog. Michael Vick, by contrast, was an animal and a savage, who deserved to be put down.


Dickey is relentless in exposing the brutal racism and classism at the heart of the pit bull scare. For that reason alone, her book is an invaluable resource for those who want and need a counter-narrative to the usual stereotyping of animals. Pit Bull is an important study of how one animal and its context can reveal everything about the link between race, class, and “animality.”

Yet, as much as it presents necessary histories and analyses, Pit Bull works less as a book than as a collection of mostly interesting essays. Dickey is clearly a superb journalist, but there's a difference between writing a series of journalistic pieces and writing a unified book on a theme. As a monograph on pit bulls, the work lacks an animating principle and often lags in tempo. Dickey is too caught up in her factual reporting to keep her eye on the larger coherence of the book.

To her credit, Dickey does not take the easy way out by concluding her book with details on happy pit bull owners. Instead, she focuses on organizations like Pets for Life, which provides pet care supplies and veterinary services to those who cannot afford to keep their dogs, even though they depend on their animals as a lifeline of love and support. Rather than romanticizing pit bulls as “nanny dogs” in accordance with the current trend, Dickey writes simply that:

They are no more or less deserving of good homes. They didn’t cause society’s ills, nor can their redemption—real or imagined—solve them….More important, there never was a ‘pit bull problem.’ What happened to these animals was a byproduct of human fears, and what humans feared was one another.

Meanwhile, the question of race rises to the surface every time animals return to public discussion. The death of Harambe, the lowland gorilla, provoked an outcry against the child's family that was distinctly racialized. Media reports demonized the African American family of the child, focusing on the father's previous history with drugs. Some argued that the mother's irresponsibility proved that all her other children were being neglected at home. Over 500,000 petitioners insisted that, surely, the mother's behavior was negligent, and that she should face criminal charges. (Apparently none of the petitioners had ever actually lived with toddlers; preventing a truly determined preschooler from clambering into a gorilla enclosure would require superhuman vigilance.)

About a month later, a white two-year-old, Lane Graves, waded into the waters at Orlando's Walt Disney World and, while his horrified parents tried to save him, was drowned by an alligator that made off with the body (it was eventually recovered). In that case, there was never any question of the parents facing charges for negligence. Instead, the nation mourned. Five alligators were killed in the hunt for the one that had drowned the child. But no Facebook groups sprang up to pay tribute to the alligators.

The impulse to quickly separate black parents from their children in order to provide the children with supposedly better homes has historical roots, as Dorothy Roberts demonstrates in her book, Shattered Bonds: The Color of Child Welfare. Roberts critiques a foster care system that has consistently wrested black children away from their parents and funneled them towards white adoptive or foster parents. In an eerie echo of this process, pit bull and other animal owners left bereft by Katrina, many of whom had refused to leave their pets in the face of disaster, watched as animal rescue organizations swooped in and took away their beloved companions. Attorney Steve Wise, quoted by Dickey, put it bluntly: "The message is, 'You're poor, and we can take care of these dogs a lot better than you can.'" It doesn't take much to stir up the public perception that African Americans are irresponsible and uncivilized.

Of course, there are other explanations for the differing treatment of Harambe’s death versus that of the alligators. One is simply that alligators, lacking in any cuddly features, rank lower than gorillas on the likeability scale for humans. But Harambe’s death and the outrage that surrounded it also reflected a difference between the value placed on animal lives versus black lives. Once, apes were seen as contiguous to Africans and other non-white people, hence the placement of the orangutan next to Ota Benga. But apes are now anthropomorphized, and many would rather have seen the child die than the gorilla.

Contrary to previous mythology, apes are no longer signifiers of blackness. They are treated with compassion and dignity, recognized for their intelligence and sophistication. Yet no such transformation has occurred in the treatment of race for humans. Black bodies are still shot at will and caged by the hundreds of thousands. Black people continue to be treated as animals even as animals have become human.

The redemption of the pit bull shows that animals have finally transcended race. It is only black humans who must continue to bear its burden.

Whose Gay History?

Chicago went from raiding gay nightclubs to painting rainbows all over its streets. But queer history is more complicated than the standard progressive fable…

When 38-year-old Ron Huberman landed the coveted job of heading Chicago's schools, the country's third-largest school system, he did so with absolutely no background in education. But Chicago was ruled by then-Mayor Richard M. Daley, son of Richard J., and Daley continued a proud dynastic tradition of political appointments. Huberman had formerly been appointed president of the Chicago Transit Authority (by Daley); before that, he was Daley's Chief of Staff; and before that, Executive Director of the Office of Emergency Management and Communications (also appointed by Daley). By Daley standards, he was a perfect fit for the job.

But the news of Huberman's appointment in 2009 was soon dwarfed by his apparent revelation to the Chicago Sun-Times: that he was gay. Huberman's coming out left many in Chicago's influential gay community bemused; he had already been out for a long time. He had a partner with whom he openly attended social and workplace events in gay bars and establishments all over town, and he had been out to his parents since the age of 15. In effect, Huberman re-emerged from a closet he had thrown wide open many years before.

Daley and his administration always kept a tight grip on the stories that accompanied news of appointments, so the coming-out story was clearly no accident. It was meant to deflect attention from both Huberman's lack of qualifications and the controversies surrounding CPS at the time. Daley had just announced the closure and reorganization of 22 schools, and everywhere parents and students were agitating against the slashing of funds to the beleaguered system. Daley himself was not doing well in the polls, facing widespread criticism for having ceded too much on a citywide parking meter contract that quadrupled residents' parking costs. He would eventually decide not to run for re-election, and Huberman, who began his CPS term promising to stay in for the long haul, would hand in his resignation soon after Daley's announcement.

But in retrospect, the Huberman appointment was a novel kind of political scheme. Got a school district to kill? Hire the gay guy, have him "come out" to the press, and continue your decimation of schools while everyone is momentarily distracted.

To be gay in Chicago was once a potential source of shame and stigma, especially under the senior Daley's administration. On April 25, 1964, police carried out an early morning raid on a nightclub called the Fun Lounge, to which the city's gays flocked to mix and mingle. As John D. Poling writes in Out and Proud in Chicago, Cook County Sheriff Richard Ogilvie had placed the club under surveillance, describing its activities as "too loathsome to describe." The raid resulted in the arrests of 109 people. The Chicago Tribune printed the names of eight teachers and four municipal employees, ruining their lives and several others' in the process.

The Chicago of today is almost unrecognizably different. The city has become a hospitable landscape for gays, especially the wealthy and powerful sort. Chicago is now home to numerous gay nonprofits and swarms with gay politicians, activists, and officials. It is home to gay men like Chuck Renslow, the founder of International Mr. Leather, a long-standing (since 1979) annual conference and contest for leathermen. It is also home to wealthy gay men like Fred Eychaner, one of the most powerful and influential men in the country, ranked as the sixth-highest contributor to the Democratic National Committee. In 1998, Daley renovated, with great fanfare, the predominantly gay Lakeview neighborhood popularly known as Boystown. The $3.2 million facelift came with giant, phallic rainbow pylons marking the area's limits, the ultimate sign that the city of Chicago loves its gays, at least of a certain type.

Within people’s lifetimes, then, Chicago went from police raids on gay lounges to taxpayer-funded rainbow streetscapes. All of which raises a baffling question: how did the city get from there to here?

It’s the question examined by historian Timothy Stewart-Winter’s new book Queer Clout: Chicago and the Rise of Gay Politics, which looks at Chicago in the post-war years in an attempt to identify just how these dramatic changes came about.

For the most part, gay history has focused on the coasts. It is widely, and in some sense accurately, assumed that those fleeing the repression of the heartland inevitably departed for the east or the west. But there have always been gays in the Midwest, and their story is only beginning to be told. Chicago is, as Stewart-Winter rightly points out, "a major transportation hub and one of the nation's largest cities, and it drew gay migrants from across the Midwest." It is also an international city, home to several immigrant communities, and historically a bastion of left-wing organizing (we gave the world the eight-hour day, and you're very welcome).

Chicago has a long and storied history, having been home to many political and social movements. Given this complexity, it makes sense that Stewart-Winter focuses on a particular period of gay history, and on how it unfolded within the context of the growing civil rights movement.

Sensibly, he also emphasizes the local aspect, and the importance of state laws. State and municipal politics and law have always had a large effect on the lives of gay people. It was, after all, local laws and policing practices that first made life hell for gays and then eased the restrictions on them.

And the state government has consistently been of major import. For example, in 1961, the Illinois legislature passed two laws that almost contradicted each other. The first decriminalized gay sex by repealing the Illinois "crime against nature" statute. But the second "altered liquor regulations in a way that gave the city of Chicago more power to keep gay bars closed after a raid." This, Stewart-Winter points out, had a negative effect on gays as individuals and as a group: "Chicago's experience thus revealed that legalizing intimate acts was not enough to make gay people feel safe when they gathered."

But who were these "gay people"? They were a far more diverse group, economically and racially, than is traditionally acknowledged. One of the contributions of Stewart-Winter's book is to examine how the struggles of gay people were fought in tandem with those of African Americans.

2016 marks the 100th anniversary of the start of the Great Migration, and Chicago was one of the cities to which African Americans moved from the South. The city's racial history has been a troubling one, marked more by hostility, stigma, and exclusion than by acceptance, and the urban segregation dividing whites and blacks can also be seen in its gay community. Queer Clout examines the rise of gay power in the unavoidable context of black-white relations. Stewart-Winter posits that gay activists employed the tactics of the civil rights movement and even briefly worked with its leaders. Today's gay movement is largely cynical in its use and appropriation of civil rights history and rhetoric: gay marriage activists have repeatedly and troublingly likened themselves to Rosa Parks. But at least for a brief period in Chicago, the alliance between white gays and African American civil rights activists was more palpable and genuine.

Early in the morning of December 4, 1969, Fred Hampton, chairman of the Illinois Black Panther Party and deputy chair of the national BPP, was murdered by police during a raid, while he lay sleeping. Also killed was Mark Clark, the Black Panther member on security duty at the time. The killings incited explosive responses amongst blacks and whites, with support or denunciation falling along mostly predictable racial lines. Stewart-Winter writes that the incident would “cement the fragile black-gay alliance in Chicago” when the leaders of the Mattachine Midwest, the leading gay organization at the time, were taken on a tour of the apartment in which the two men were killed, the walls still riddled with bullets. Shortly after, Mattachine Midwest and Chicago Gay Liberation, another new and more radical group, issued a joint statement supporting the Panthers in challenging the police version of the raid.

Equally fascinating is the political history of Chicago's black politicians and their efforts on behalf of the gay community. Gays fighting restrictive laws found allies in men like Alderman Clifford Kelley and Harold Washington, Chicago's first and so far only black mayor. Such alliances were not entirely outside of traditional Machine politics. Washington, for instance, was genuinely progressive in his sympathies, but his stance came just as much from political necessity: he needed gay white progressive votes to combat the racist vote-gathering of the white politicians opposed to him. Ultimately, though, neither the alliances between activists nor the ones between politicians and activists would last very long. This was because, as Stewart-Winter writes, "…ironically, in the very years when policing and punishment in black neighborhoods began to increase, the policing of predominantly white gay establishments and neighborhoods became far less systematic." As the gay rights movement scored victories and the police raids on mainstream (white) bars finally stopped, the experiences of black and gay people were no longer as obviously comparable. Those victories came at the same time as deepening poverty and increased police surveillance on the south and west sides, where blacks and a growing Latino population resided. Eventually, the racial rifts between white gays and the rest of the city widened again, as the two populations experienced vastly different degrees of security and safety from the state.

Over the course of detailing such shifts and changes, Queer Clout introduces hitherto relatively unknown Chicago activists like Pearl Hart, a Jewish lesbian lawyer who defended prostitutes and left-wing activists, and Ron Sable, a gay physician and activist who would be instrumental in developing gay-focused health care resources in the city. And it reveals interesting details about those who have since gone on to rosy careers as established progressives. For instance: Jesús "Chuy" Garcia, a Latino member of the Cook County Board of Commissioners, became famous in 2015 for nearly ousting incumbent Mayor Rahm Emanuel. In a city famous for its allegiance to the Machine, and for having elected only one non-white mayor (Washington), Garcia was lauded as the lefty-progressive alternative to Emanuel, who has long been seen as one of the more conventional liberals of the Democratic Party. But in Queer Clout, we learn that Garcia had to be "hauled in and sort of beaten" by union people when he attempted to wiggle out of supporting a gay ordinance in the late '80s, according to writer Achy Obejas.

Queer Clout gives us many such tantalizing glimpses into Chicago political life, though it sometimes feels discordant and episodic as it tries to mold several stories and a wide range of characters into a larger, coherent narrative. The book will become a resource for those curious about gay history outside the coasts, and could easily have been at least twice its size.

In his most lucid chapter, “Lesbian Survival School,” Stewart-Winter paints a comprehensive and poignant picture of the challenges facing Chicago’s lesbian community as it worked on developing what was often a radical feminist agenda and also dealt with the complexities of race and ethnicity. Chicago was “the epicenter of the socialist feminist union movement that spread to more than a dozen cities in the 1970s,” Stewart-Winter points out. “Since the emergence of gay liberation,” he notes, “lesbian politics has been far more inflected by radicalism than has gay male politics.” As women, lesbians needed to pay more attention to matters like workplace harassment, equal pay, and abortion and reproductive rights. As feminists, they were more inclined to resist the pathway of marriage towards respectability.

But lesbians are not a homogeneous bloc, and there have been racial divisions from the beginning. Early lesbian feminism was imbued with the politics of separatism, something with which black lesbians, who have historically needed and wanted to be part of their families and communities, have not always aligned. In addition, white lesbians have historically tended to work on the assumption that queer = white.

As nearly anyone with the most casual understanding of the city knows, race continues to centrally define Chicago life and politics. In 2009, reporting on Chicago being ranked the most segregated city (it has recently moved down to third place), the Chicago Tribune interviewed several residents, asking why they chose to live in segregated neighborhoods. Not once did the paper even use the word “racism.” It concluded the piece by quoting a black woman: “There is a comfort level being among people of your own race…I don’t think that there was any intention of segregation behind that.”

But segregation in this city is not some genteel agreement between the races and ethnicities to quietly live away from each other. Rather, the segregation that is starkly evident to anyone who travels beyond the city’s justly celebrated downtown landscape is marked by economic devastation on the south side and much of the west. Everywhere on the south side, school buildings are shuttered, grocery stores are scarce, and large patches of neighborhoods are simply boarded up. Nearly all of this has been the result of many decades of brutally enforced, plantation-style racist and economic policies and actions. As recently as 1975, the Chicago Reporter sent a black journalist, Stephan Garnett, to Marquette Park, located in a white neighborhood, to report on its facilities. He was set upon by nearly 20 white men, had a beer bottle broken over his head, and had his car set on fire. In 2011, mostly white and gay residents of Boystown, insisting that black youth coming to the area’s social service agencies were committing crimes in their neighborhood, declared that they would create dog squads to patrol in the evenings.

All of this is to say: the alliances between black and white gays and lesbians were doomed to fail in a city whose racism survives in its most unmediated, primal form and serves as its living, breathing heart. Stewart-Winter does not quite name racism as a motivation but does point to the ways in which these racial inequalities reinforce other kinds. Quoting a 1993 report by the Human Relations Foundation of Chicago, he writes about how the concentration of gay organizations in (mostly white) Lake View compels queers of color who live elsewhere to travel long distances for basic services.

Yet when it comes to the racial schism in Chicago, Stewart-Winter’s own analysis echoes the very problems that he attributes to queer activism. Given his attention to race, it’s surprising that Stewart-Winter often falls into the familiar trap of distinguishing between gay activists and black activists, thereby erasing the group of people who are both gay and black. We learn little about Chicago’s queer of color activist community or how it has been nourished in its historic spaces, like the famed Bronzeville, home to Chicago jazz and blues. The city’s black queer scene was celebrated as early as the 1920s by the Chicago Defender, black America’s paper of record. In more recent times, black queer life has been sustained by both groups and individuals. Affinity, a nonprofit group, has worked to provide resources for social services and support. The poet and organizer C.C. Carter organized Pow-Wow, a performance space that, for years, often showcased black queer talent. Academics like Cathy Cohen and Beth Richie have served as mentors to more radical groups like the Black Youth Project. BYP, along with Assata’s Daughters, has worked with the local chapter of Black Lives Matter and others to defeat State’s Attorney Anita Alvarez in her recent re-election bid. Alvarez was notorious for not immediately investigating the killing of Laquan McDonald, the black teenager shot 16 times by officer Jason Van Dyke.

Assata’s Daughters was founded by black queer women under 30, like Hannah Baptiste and Page May. The black prison abolitionist and activist Mariame Kaba has been the driving force behind an astonishingly large array of mostly queer black and youth-led radical organizing. Among Latinas, groups like the now dissolved Amigas Latinas survived for over two decades, bringing together lesbians and trans people in spaces that helped budding activists develop their own organizing talents. Many of them have gone on to work as immigration activists.

This kind of dynamic, activist fervor did not come out of the blue but has been decades in the making. Much of it has been slowly and carefully nurtured out of the limelight. In contrast, the white gay community has received far more city and community funding, and has, especially in recent years, been far more beholden to a mainstream national agenda which has included issues like gay marriage, hate crimes legislation, and Don’t Ask, Don’t Tell.

One of the few black activists discussed in the book is the late Vernita Gray, who died in March 2014 and who is acknowledged here as a primary source and mentor to the author. Her story becomes a way for Stewart-Winter to trace the arc from the lack of clout at the book’s start to its conclusion, where she becomes the symbol of the ultimate attainment of queer clout.

At the close of the book, Stewart-Winter proves that gay political success has been achieved by relating the tale of a wedding: the marriage of Gray, an out African American lesbian, to her white partner Patricia (Pat) Ewert. Gray was a longtime activist in the city, with a large part of her career devoted to working within the Chicago municipal machine. During the Daley administration, she worked for 18 years in the office of Cook County State’s Attorney Dick Devine. Devine was first assistant to Richard M. Daley when Daley was the State’s Attorney, and both men have been publicly called out by police torture activists for not prosecuting police commander Jon Burge, who was eventually convicted in connection with the torture of more than 100 black suspects. While she cannot be held responsible for these acts, Gray’s role in the office required her to frequently justify, defend, and explain away the actions of Devine, whom queer activists frequently accused of enabling police brutality (full disclosure: I was among the queers who protested against him).

Gray was diagnosed with terminal cancer in 2012. By then, she was already in a civil union with Ewert. Illinois legalized gay marriage in 2013, but the law was not scheduled to take effect until June 2014. Gray, fearing she would die before then, asked for and received, with the help of Lambda Legal, a waiver. On November 27, 2013, Gray and Ewert had a wedding ceremony in their home and were given a legal marriage certificate.

This intimate ceremony was covered worldwide, with write-ups in The Independent, the Chicago Tribune, The Guardian and the Daily Mail. The wedding was a brilliant framing of the poignancy of gay marriage: who but the most heartless monster could be critical of the cause after seeing a dying black woman unite with her white partner in holy matrimony? It was a classic media moment.

But concluding a book about queer clout in Chicago, one that claims to focus on local politics, with a gay marriage, makes no sense at all. Throughout, Stewart-Winter barely mentions the larger national battles like gay marriage. In fact, gay marriage was largely funded by and brought to local cities and towns by national organizations, and it was ultimately decided at the federal level, not the local one.

The concluding section on Gray’s wedding seems beautiful and innocuous. But it’s actually insidious, insofar as it makes gay marriage the center of gay politics. The message—and Stewart-Winter’s politics—are clear: Gay marriage constitutes the ultimate success of gays and lesbians. In a book that pays little attention to the vast richness of black queer life except as an accessory to white gay organizing, Gray (one of the few black lesbian activists mentioned a few times) becomes a tool for reconciling the deep racism that still exists in this city.

For many liberals and even progressives and leftists, gay marriage is seen as the pinnacle of achievement for the gay community, marking its entrance into a system of state-endowed rights and privileges not available to the unmarried. But looked at more critically, gay marriage is not an achievement for a community; rather, it shows the deep division in that community’s priorities.

In the early 1990s, the gay community, ravaged by the onset of AIDS, marched on behalf of Haitian immigrants who were being rounded up and placed in camps as suspected carriers of HIV. Collectively, they demanded an end to the discrimination that also kept HIV-positive gays out of hospitals, and called for the institution of universal healthcare. Today, most wealthy and well-off white gays can afford HIV medications, while the more vulnerable (mostly poor, mostly women, and mostly people of color) have to struggle for access. Stewart-Winter himself points out that the majority of social service and healthcare resources in Chicago are on the north side, compelling black and other queers of color to travel long distances to gain access to them.

A better postscript, then, would have continued Stewart-Winter’s documentation of the inequality in resources, and the continuing challenges facing queers of color in a city marked by an intense rise in violence and brutality towards immigrants and black and brown people. A key problem with Stewart-Winter’s book is that its focus on the development of clout leads him away from a fuller consideration of the power behind it. In the end, he attempts to paint a happy portrait of reconciliation—a black-white wedding, the national triumph of gay marriage. But power is at work behind clout, and power ultimately defines who lives or dies, who gets funding for HIV resources and who doesn’t.

When it comes to queer clout, it is not enough to note how it came about; we must also ask the bigger question: what is clout used for?

Photograph courtesy of Margaret Olin.

Elizabeth Gilbert and the Pinterest Fantasy Life

If only all writers had the luxury to think of their work as nothing but pure, magical creativity…

Elizabeth Gilbert’s latest—her seventh—book, Big Magic (Riverhead Books, $24.95), is a typical product of the hybrid world of publishing today: it began as a series of TED talks, its cover was premiered on the e-commerce site Etsy, and it now exists as a 288-page text that she has referred to as a “manifesto.” Big Magic is, on the surface, a cheery self-help manual, an optimistic and sunny nudge towards creativity for those who may hesitate to plunge headlong into making what she calls “whatever creates a revolution in your heart.”

Gilbert is expansive in her definition of the “creative,” and insists that writing is only one of many such endeavors available to anyone who wants to take up anything from, say, raising goats or cross-stitch to making quilts to sell on, well, Etsy. Still, given her experience, what she returns to most often is the world of writing.

In that sense, Big Magic is Gilbert’s first book-length foray into writing about writing, a profession she has been a part of for nearly twenty-five years. Gilbert is also an enormously successful writer — according to The Daily Beast’s Lauren Streib, she has “easily” made $10 million in royalties just from her 2007 blockbuster bestseller Eat, Pray, Love: One Woman’s Search for Everything across Italy, India, and Indonesia.

We can assume that it is the phenomenal success of that book that made Gilbert so successful on the TED circuit and left so many so eager for her advice. In a time when the “creative” fields are diminished in value but also seen as potentially profitable and good for you, it is inevitable that millions would tune in to see what advice a best-selling author might have to offer on how to emulate her example.

The line she walks in this new book is a fuzzy one: Gilbert wants to seem assured in conveying she has insider knowledge about her field, but she doesn’t want to acknowledge that being a writer is actually a profession. To do so would demystify her entire career and, really, mean that hers would be no different than the many books lining bookstores everywhere, promising everyone the best-kept secrets to publishing or the “creative life.” So, instead, she presents, in Big Magic, not a description of how to become a writer but how to be a writer. This is entirely in keeping with the message of Eat, Pray, Love, which similarly swept aside any material considerations—how, exactly, are women to embark upon epic journeys across the globe without independent and very large means?—in favor of a can-do quasi-spiritual set of injunctions about uncovering one’s true self. Because Big Magic is in so many ways an addendum to the earlier text, it becomes a case study of how a hyper-successful writer conceives of herself and her profession. In the end, Big Magic is not about actually helping people become better at creating work. Rather, it’s about furthering the informal literary empire spawned by Eat, Pray, Love.

That’s not to say that Big Magic is entirely without merit. Gilbert offers many useful checks against the unnecessarily dramatic stereotypes people are apt to immerse themselves in when they look for the creative life, such as the one that dictates that genius can only emerge from tormented lives. She calls for discipline in turning out work even when it seems impossible to keep at it. All of that is necessary advice—in a world where writing and/or creative work is as fetishized as it is ill-paid, it’s useful for those looking to create over the course of a lifetime to know that some myths are best abandoned.

Despite such helpful admonitions, most of Big Magic reads like carnival puffery from a fortune-teller. Gilbert combines Oprah-esque pithiness with bizarre suppositions that render her the Deepak Chopra of Creative Work.

Take, for instance, her idea about creativity or magic, the theme of the book:

And when I refer to magic here, I mean it literally. Like, in the Hogwarts sense. I am referring to the supernatural, the mystical, the inexplicable, the surreal, the divine, the transcendent, the otherworldly. Because the truth is, I believe that creativity is a force of enchantment—not entirely human in its origins.

Writers/creators across the ages have attempted to describe the creative process and how it takes hold of them. But to describe it as “otherworldly” and “not entirely human in its origins” in the first quarter of the twenty-first century dissociates it from the material realities in which it takes place. It is inevitable, given the title of her book, that Gilbert should deploy this kind of language. But it is also disingenuous given that she also criticizes creative workers for being unrealistic about their writing practices.

Then, there is her idea about ideas:

I believe that our planet is inhabited not only by animals and plants and bacteria and viruses, but also by ideas. Ideas are a disembodied, energetic life-form. They are completely separate from us, but capable of interacting with us—albeit strangely. Ideas have no material body, but they do have consciousness, and they most certainly have will. Ideas are driven by a single impulse: to be made manifest. And the only way an idea can be made manifest in our world is through collaboration with a human partner. It is only through a human’s efforts that an idea can be escorted out of the ether and into the realm of the actual.

In other words, ideas are entities looking for the perfect home in the bodies and minds of creators. To support her theory, she gives an example of a novel she once planned to write, about Brazil and rainforest development. Due to various circumstances, she eventually stopped researching and developing it. In the meantime, she became friends with Ann Patchett, a fellow writer whom she first met at a conference. The two women bonded instantly, with Patchett landing a loving kiss on Gilbert after their panel.

They became epistolary friends as well, exchanging handwritten letters where they discussed their lives and work. Somewhere along the way, it transpired that Patchett, too, was considering a novel about Brazil with similar themes. But it was only when they met for breakfast one day that Gilbert discovered Patchett’s work bore striking similarities to her own, down to their both including a protagonist who was a spinster from Minnesota. 

From this coincidence, Gilbert decides that her theory is right: she had stopped working on the idea and it, presumably in a huff, floated off to take residence within Patchett’s mind instead.

Somehow, it never seems to occur to Gilbert that perhaps such a close friendship had to have resulted in some kind of basic symbiotic intellectual relationship. Instead, she decides that it confirms that ideas are like spectral beings that leap from body to body, seeking the ones that will put them forth into the world.

Or, perhaps, Gilbert simply ignores the truth, that intellectual work is rarely exclusively original, and is acted upon by factors too varied to see in the immediate moment. After all, that kind of theory would severely undercut the theme of the book which she announces quite smugly: “And that, my friends, is Big Magic.”

Such reductive and, really, bizarre assertions about the creative process seem out of place in a book that lays claim to helping readers get away from other myths, like the one about tormented genius. But they are in keeping with the mystical premises of Eat, Pray, Love.

That book redefined self-help literature for women. It has, and I think rightly, been criticized as “priv-lit,” dwelling too much on lifestyles only attainable by those who can afford to take a year off and travel in relative comfort, as Gilbert did. Jessa Crispin locates Gilbert’s memoir in a long tradition of inward-looking white female memoirists who travel through foreign lands without ever considering the cultures they march through with depth or curiosity. Despite many such criticisms, the book has made Gilbert that rare thing, a multimillionaire author who will never again have to worry about financing her work.

Gilbert will always be defined as the author of Eat, Pray, Love. Even the republished version of her 2002 biography of Eustace Conway, The Last American Man, has her authorship of that memoir clearly noted on the front cover. For some authors—Harper Lee comes to mind—the enormous success of a first book can become an albatross, an achievement that clouds and freezes one’s sense of movement as a writer. But Gilbert, to her credit, has continued to write. After Eat, Pray, Love came a sequel of sorts, Committed, about the reasons why she decided to marry the man she wrote about falling in love with in the best seller. Sales were respectable, but it saw nothing like the success of Eat, Pray, Love (arguably, what could?). In 2013, Gilbert returned to fiction, and published a tome of a novel, The Signature of All Things, which received positive reviews.

The Signature of All Things was written with the luxury of time and place. Gilbert bought the largest and oldest house built on the tallest hill in artsy Frenchtown, New Jersey. She had the enormous attic fitted out with bookshelves and secret cabinets designed by the well-known carpenter Michael Flood. Her custom-made desk was built out of a 15-foot-long slab of acacia.

But the book also meant a return to intensive research, three years spent studying arcane histories of herbs and biology. Elsewhere and in Big Magic, she talks about the process of the work that went into it, filling up boxes with note cards, producing a 70-page synopsis before she even began writing the book.

None of this comes to Gilbert as a set of recent habits: she has always been a writer. As she describes it in Big Magic, she grew up on her parents’ Connecticut Christmas tree farm (her father was a chemical engineer who grew the trees on the side, and her mother a nurse of Swedish descent); she and her sister had no television but were encouraged to read and write and to create their own worlds. According to Gilbert, she took vows early on, actual vows, to do everything she could to become and stay a writer all her life. She moved to New York to attend New York University and received a bachelor’s degree in political science.

This is the point at which Gilbert’s account of her writing life in Big Magic (told not chronologically but in terms of themes woven through the book) varies significantly from the reality she has alluded to in prior work. It’s not so much that she lies, exactly, but that she engages in strategic acts of omission. In Eat, Pray, Love, she admits to being a highly successful freelance writer. In 1993, she became the first unpublished short story writer to appear in Esquire since Norman Mailer. A 2013 New York Times profile notes that her editors still remember with “reverence” the skill apparent in her work, and she was widely published in top magazines like GQ. Her GQ story about the Coyote Ugly Saloon became the basis for the hit film about the bar. She made enough money that she lost a considerable fortune in her divorce. Her first three books won awards or accolades from sources like The New York Times.

There has never been an idyllic time for writers, but in the ‘90s, when Gilbert flourished, writers who made it to the upper echelons of the top magazines could make a decent or even excellent living from writing. The proliferation of internet publishing and related factors has since exploded all that.

In light of all this, Gilbert could have written a very different and more realistic book. She could have retained the advice about discipline and plugging away, dispensed with the hoo-ha about ideas as beings and provided a more realistic view of what it takes to become a successful writer/creative producer like her.

But that would undercut all of Big Magic’s otherworldly mystification about the nature of creative production. So, instead, she completely downplays Eat, Pray, Love’s success and insists it took her completely by surprise. This is how she describes what happened:

I once wrote a book that accidentally became a giant best seller, and for a few years there, it was like I was living in a hall of fun house mirrors.


It was never my intention to write a giant best seller, believe me. I wouldn’t know how to write a giant best seller if I tried.

On the one hand, even the most savvy publishing houses will admit that, bar a few stratospheric authors like Tom Clancy, there is never any predicting a best seller. Still, Gilbert received a $200,000 advance to go forth, travel, and write the book. Her publisher, at least, seemed fairly confident in the book’s sales potential. (While Gilbert has spoken openly about her advance elsewhere, she doesn’t mention it in Big Magic.)

She also goes on to write, “It never occurred to me that my own thoughts and feelings might intersect so intensely with the thoughts and feelings of so many other people.” This is simply a lie: unless she had the cachet to get away with not writing one, publishing houses generally demand book proposals before committing to publication.  Pitching a book—and receiving such a large advance—is entirely about revealing exactly why the “thoughts and feelings” of the author might coincide with her readers. Even if she didn’t have to produce one, Viking would have at the very least asked for some sense from her as to why the book mattered enough at that particular time for them to publish it. In which case, Gilbert would have had to provide at least a perfunctory sense of her target audience and why they would want to buy her book.

In other words, Eat, Pray, Love was not some mere accident but a well-planned intervention into the zeitgeist of publications by and about women.

There’s nothing wrong with that, but the success—the potion that Gilbert is trying to market as “magic”—would never have come about if she had blithely and carelessly gone off on a mysteriously funded jaunt across the world. Rather than convince the reader to anthropomorphize every aspect of the process, it would have been more honest of Gilbert to point to its structural, procedural elements—that you don’t just get a book contract like the one she received for Eat, Pray, Love without an agent and a few lawyers, for instance, or that book proposals are necessary and hard work.

Instead, Big Magic resolutely erases any evidence of such work. It turns writing into, well, a “creative process,” and thus renders it not a profession but something between an act of deep meditation and wishful thinking. In fact, she relentlessly mocks those who complain about the conditions of writing as a profession:

From the volume of complaints that emerges from the professional creative class, you would think these people had been sentenced to their vocations by an evil dictator, rather than having chosen their work with a free will and an open heart.

In other words, Gilbert, who has spent half her life as a professional writer, now believes that hers is simply a vocation. Yet, when she actually describes the trajectory of her career—and it has been a long and illustrious one—she treats it not as a mystical calling but as work. At one point, for instance, she relates how her editor at GQ, where she was then a staff writer, pulled a story she had worked on for five months, a travel story about Serbia on which the publication had spent a lot of money. The editor’s rationale was that he realized she was not the person for the job and there was no point in her pursuing it any further; he told her to simply move on to the next assignment. Gilbert’s point in relating the anecdote is that writers must always be prepared to end projects that aren’t working. But we might glean a different story here: that no one hires a casual, vocational writer to work on a travel story about Serbia for five months. The freedom to flit, to cut one’s losses and move on, is possible only when one has the backing of a serious institution and serious money, plus an editor who can sign off on half-a-year’s salary and travel expenses for a project that never sees completion.

In fact, this is one of the dominant threads in conversations about the breakup of media outlets: that fewer places are able and willing to develop the kinds of writers who can do sustained long-form writing, and that this has been a negative for media in general. Gilbert displays no awareness of or interest in these changes, even as they fundamentally diminish the possibility of following her advice and becoming the kind of liberated, magical creative spirit she insists all writers should be.

This is yet another way in which Gilbert sidesteps the institutional and structural questions currently haunting the landscape of the creative fields—similar problems are rife in, say, dance or art—in favor of aphorisms designed to make the reader feel that she has been immersed in a spa offering creative well water as a lubricant for the soul. Yet everywhere in the arts, people are revolting against what they forthrightly call the exploitation of artists. In January 2016, the bestselling British writer Philip Pullman resigned as patron of the Oxford Literary Festival, citing the event’s refusal to pay featured writers as his reason. In New York, the Freelancers Union is gaining steady momentum as it collectivizes writers, arguing for not just fair but also timely compensation.

The current writing economy is generally inhospitable for those who want to write for a living, even though there are some changes afoot, like the kind described above. Gilbert occupies a rare stratosphere of the creative world, but it took her years of hard work to get there, and in a time when writers were paid decently. In a non-Gilbertian world, the average writer is a freelancer (given how many magazines have cut their budgets) who has to hang on to editors like a bulldog on a mailman’s leg just to get tiny checks mailed to her. The utter instability of writing as a profession has meant that a long and steady career like hers is unlikely, no matter how much determination one brings to the effort.

Gilbert is clearly an intelligent and well-read woman, and has to be aware of these shifts. But she doesn’t really have to care about reality. Big Magic will undoubtedly make her big money, and while it’s not likely to become a high-grossing film (the concept of ideas as amorphous supernatural beings does not quite make for the same cinematic experience as Mumbai sunsets and Italian dessert tables), it will probably be incorporated into book clubs and become a teaching tool for a particular subset of women.

Gilbert makes a point of saying that the money is not the point, but offers little to explain how someone with, say, multiple jobs and unstable shifts might carve out the time and the energy to continue with creative work. As with Eat, Pray, Love, Gilbert isn’t interested in the reality of lives different from hers. Rather, she sells the idea that everyone can access her kind of success by magic.

In 2014, Gilbert sold her famously well-appointed house, telling the New York Times that she always had to move from a place once she had started and finished a project there. And so, the bookcases, the furnishings, even the imported statuary in the gardens, all of it was for sale for $999,999. It’s lovely, truly, that Gilbert has the financial resources to do so, but she appears to have lost a sense of the reality for many writers, who generally stay put in the same place and are barely able to make rent.

Without getting into any crude analysis of class politics, surely we can ask the simple question: if any place and a routine and discipline are all that’s required for a writer, why does Elizabeth Gilbert require such majestic spaces to write in?

Or we could ask an even simpler question of Gilbert, who scoffs at the very idea that the creative world should ever offer a living or stability: why shouldn’t someone who works tirelessly on a piece for, say, five months, expect to get paid really well for it?

Or even to be able to earn enough for rent? Gilbert can afford to believe in theories of creativity as magic, and wax on about the arrogance of creative workers who expect to make a living off their work—now that she has accumulated a small fortune of her own.

Ultimately, Big Magic isn’t really aimed at the “creative class,” but at a very particular kind of woman, a female consumer who wants to spend her money on a promise of a different life. In her acknowledgments, Gilbert thanks several people, but also thanks Etsy. It’s fitting; Etsy, like Gilbert, is a purveyor of goods with a quirky, homemade but polished aesthetic, professional goods given a carefully-honed sheen of amateurism.

Etsy’s visual cousin is Pinterest, a website that would have been inconceivable at the dawn of the internet: a visual repository of images of, well, things. Need to know what a painted blue wooden table could look like, in fifty different shades and sizes? There will be a hundred images for you. Pinterest is ostensibly for the hobbyist — the idea is that you find, say, an image of a painted blue table and proceed to buff and transform that five-dollar table you found and carted home from the garage sale last summer.

But the truth is that what Pinterest offers most is a fantasy of what your imagined world might look like. If you’re like most people, your table will not be transformed. You might daydream about spending days lovingly sanding it and turning it a blue pastel, but the realities of life and work will intervene. Your table will collect dings and scratches over the years and become at best a larger holder of keys and the detritus of your life. Finally, one day, when you get ready to move, you’ll look at it and decide it’s too much trouble to take an unremarkable brown table with you. It will be stacked neatly against your dumpster, to be found by a delighted neighbor walking by, who will take it home with the exact same enthusiasm you once demonstrated, and will resolve to buff it and paint it blue, and the cycle will continue.

Big Magic is like a DIY Pinterest project, but about life itself. It is ultimately designed not for people who would like to think of writing as a profession, but for those who can afford to dabble in it. What Big Magic promises is akin to something you might find for sale on Etsy, to be recorded on Pinterest: a tiny mason jar that is also a snowglobe, a wishful, frozen fantasy of what the writing world might look like.

Photo by Timothy Greenfield-Sanders, provided courtesy of Riverhead Books.