Viewing Blog: The Chicago Blog, Most Recent at Top
Results 26 - 50 of 1,858
Publicity news from the University of Chicago Press including news tips, press releases, reviews, and intelligent commentary.
26. Excerpt: Renegade Dreams

An excerpt from Laurence Ralph’s Renegade Dreams

***

“Nostalgia, or the Stories a Gang Tells about Itself”

At the West Side Juvenile Detention Center, inmates hardly ever look you in the eyes. They almost never notice your face. Walk into a cell block at recreation time, for example, when young gang members are playing spades or sitting in the TV room watching a movie, and their attention quickly shifts to your shoes. They watch you walk to figure out why you came. I imagine what goes through their heads: Navy blue leather boots, reinforced steel toe, at least a size twelve. Must be a guard. That’s an easy one. Then the glass door swings open again. Expensive brown wingtips, creased khakis cover the tongue. A Northwestern law student come to talk about legal rights. Yep.

Benjamin Gregory wears old shoes, the kind a young affiliate wouldn’t be caught dead in. Still, the cheap patent leather shines, and, after sitting in the Detention Center’s waiting room for nearly an hour and a half, the squeak of his wingtips is a relief. It’s a muggy day, late in the spring of 2008. “I’ve been coming here for five years now,” he says. Mr. Gregory is a Bible-study instructor. “It’s a shame, but you can just tell which ones have their mothers and fathers, or someone who cares about them at home. Most of these kids don’t. Their pants gotta sag below their waist, even in prison garbs. All they talk about is selling drugs and gym shoes.”

Though I generally disagree with Mr. Gregory’s assessment of today’s young people—“hip hoppers,” as he calls them, not knowing I’m young enough to be counted in that group—his observations are, if not quite accurate, at least astute. The relationship between jail clothes and gym shoes is direct, with gang renegades—young gang affiliates that seasoned members claim don’t have the wherewithal to be in the gang—at the center. Until recently, Mr. Gregory couldn’t tell you what a gang renegade was; I educated him on the topic when he overheard inmates tossing the term around for sport. According to gang leaders, I tell him, renegades are to blame for gang underperformance. They are the chief instigators of “senseless” violence, say the leaders, and thus deserve any form of harm that befalls them, be it death, debility, or incarceration.

Ironically, Mr. Gregory’s generalized depiction of drug- and shoe-obsessed young inmates (shared by many prison guards, teachers, and even some scholars) can be compared to the way that gang members view renegades. Just as community leaders criticize the actions and affiliations of longtime Eastwoodians, older generations of gang members level critiques at young renegades. In what follows, I complicate the assumptions many have made about renegades by examining subjective versions of the Divine Knights’ contested—and contestable—history. Investigating the gang’s fraught past will help make clear the problems facing them at present. In the midst of unprecedented rates of incarceration, the anxieties that gang members harbor about the future of their organization are projected on to the youngest generation of gang members—and their gym shoes.

More precisely, in Eastwood gym shoes are emblems that embody historical consciousness. For gang members currently forty to sixty years old, the emergence of gym shoes signaled the end of an era in which affiliates pursued grassroots initiatives and involved themselves in local protest movements. Meanwhile, for the cohort of gang members who came of age in the “pre-renegade” era—those twenty-five to forty years old—gym shoes recall a time of rampant heroin trafficking, when battalions of young soldiers secured territories within a centralized leadership structure. As the younger of the two generations remembers it, this was the moment when loyalty began to translate into exorbitant profits. That these two elder generations of the Divine Knights hanker for a centralized and ordered system of governance places an enormous amount of pressure on the current generation, those gang members who are fifteen to twenty-five years old. We’ll see that just like the game of shoe charades that inmates play in jail, a renegade’s footwear can reveal his place in the world.

In the Divine Knights’ organization, wearing the latest pair of sneakers is considered the first status marker in the life and career of a gang member. For new members, having a fashionable pair of shoes signals one’s position as a legitimate affiliate. Later, in one’s teens and twenties, success is measured by whether a person can afford a nice car or an apartment of his own. Because most of the teenagers referred to as “renegades” have yet to progress to that stage, however, a fashionable pair of gym shoes is the pinnacle of possession.

Even though gang leaders claim that the fashion trends of young gang members nowadays are too beholden to mainstream dictates and don’t represent Divine Knights culture, gym shoes remain the badge of prestige most coveted by renegades. Exclusivity—whether the shoes can be easily purchased in ubiquitous commercial outlets like Foot Locker or only in signature boutiques—goes a long way in determining a shoe’s worth, as does pattern complexity: the more colors and textures that are woven onto the canvas of the shoe, the more valued that shoe becomes.

Over a two-year period during which I listened to gang members in informal settings and in facilitated focus groups with Divine Knights affiliates, I was able to sketch an outline of attributes concerning the five most popular gym shoes worn by young gang members in Eastwood. In some cases, the most popular brands and fashion trends evoke a past that has ceased to exist. Behold, the renegades’ “Top 5” (in ascending order of significance):

№ 5

“Tims,” or Timberland boots ($180), are not technically a gym shoe. But in Chicago, the term is used as a catchall for various types of men’s footwear. The construction boot of choice to tackle Chicago’s harsh winters, Tims serve a functional purpose in addition to being appreciated aesthetically. The tan “butter-soft” suede atop a thick rubber sole with dark brown leather ankle supports are staples of any shoe collection (and are typically the first pair of boots a renegade purchases). If in addition to the tan suede variety a person has Tims in other colors, he or she is thought to be an adept hustler in any climate.

№ 4

“Recs,” or Creative Recreations ($150), are a relatively new brand of sneaker popular with young renegades because they are available in an array of bright colors. Multiple textures—metallics, suedes, rubbers, and plastics—are combined on the synthetic leather canvas of each shoe. Recs also have a distinctive Velcro strap that runs across the toe. Considered the trendy of-the-moment shoe, Recs are held in high esteem by young renegades because they can only be found in a select few of Chicago’s signature boutiques.

№ 3

As the Timberland boot is to winter, the Air Force One, commonly referred to as “Air Forces” or “Ones” ($90), is to the other three seasons. This shoe is a staple of the renegade’s collection. If a young gang member has only one pair of gym shoes, they will likely be Ones. Although they come in a variety of color combinations, most affiliates begin with either white or black, with the expectation that their collection will grow in colorfulness. Moderately priced and available in a vast number of different styles, these might be the most popular gym shoes in the Divine Knights society.

№ 2

Signature shoes ($165). Young renegades are also likely to purchase the signature shoe of their favorite basketball player. For some, that’s LeBron James; for others, Kobe Bryant or, perhaps, Chicago-local Derrick Rose. As a gang member, one’s affinity for a particular player can override the aesthetic judgment of his or her friends. Still, purchasing a signature shoe entails several calculations, including when the shoe was released, which company manufactures them, and the popularity of the player in question at the moment. Given the danger that one’s signature shoe may prove undeserving of the time and effort invested in its purchase, no current player’s footwear can surpass the model by which the success of his shoe will no doubt be measured: Michael Jordan’s.

№ 1

“Jordans” ($230) are the signature shoe. A pair of Jordans is valuable to the young renegade for a number of reasons, chief among them that Michael Jordan, considered the greatest basketball player of all time, made his name playing for the Chicago Bulls. Thus, a particular geographic pride is associated with his apparel. Second, the risks involved with purchasing this particular signature shoe are greatly reduced because Jordan’s legacy is cemented in history. Third, since the first shoes one buys are not usually Jordans (because they are so expensive), there is a sense of achievement connected with finally being able to afford a pair.

Pre–renegade era Divine Knights can recall down to the year—sometimes even the day—that they purchased the same model of shoes currently being worn by young renegades. That older gang members hypocritically hassle renegades for the same consumer fetishes they themselves once held dear bolsters the point that gym shoes have accrued additional symbolic value. At once, they point to the past and the future, similar to Eastwood’s greystones. Recall that greystones reference the past, specifically an era of Great Migration during which blacks traveled from the South to the Midwest in search of manufacturing jobs. At the same time, greystones are the primary form of capital for governmental investment. Just as city planners project future tax revenues based on empty and abandoned domiciles, a young renegade speculates on his future by buying a pair of Jordans.

For the Divine Knights, this form of speculation has, historically, required a young affiliate to position himself as a noteworthy member, thereby attracting the attention of a gang leader. Ideally, that leader will take a young Knight under his wing, bestow that affiliate with responsibilities, and reward his hard work with a share of the organization’s profits. In such a climate, adorning oneself with the most fashionable pair of shoes is a precondition for a person to prove himself worthy of the gang’s investment. A symbol of speculative capital, gym shoes—like greystones—are endowed with a double quality: They express highly charged notions of social mobility for one generation; and for another, older generation, they evoke a sense of nostalgia.

To fully understand the way in which the renegade’s gym shoes trigger an idealized notion of the past, it’s productive to dwell for a moment on the idea of nostalgia itself. From its initial use—in 1688, when Johannes Hofer, a Swiss doctor, coined the term in his medical dissertation—nostalgia has been used to connect forms of social injury to the physical reality of the body. Hofer combined two Greek roots to form the term for this newfound malady: nostos (return home) and algia (longing). It describes “a longing for a home that no longer exists or has never existed.” Among the first to become debilitated by and diagnosed with this disease were Swiss soldiers who had been hired to fight in the French Revolution. Upon returning home, these soldiers were struck with “nausea, loss of appetite, pathological changes in the lungs, brain inflammation, cardiac arrests, high fever, and a propensity for suicide.” One of nostalgia’s most persistent symptoms was an ability to see ghosts.

To cure nostalgia, doctors prescribed anything from a trip to the Swiss Alps to having leeches implanted and then pulled from the skin, to sizable doses of opium. Nothing seemed to work. The struggles of ensuing generations only confirmed the difficulty, if not impossibility, of a cure. By the end of the eighteenth century, the meaning of nostalgia had shifted from a curable, individual sickness to what literature scholar Svetlana Boym once called an incurable “historical emotion.” The burdens of nostalgia—the pressing weight of its historical emotion—are still very much with us. Interrupting the present with incessant flashes of the past, nostalgia retroactively reformulates cause and effect, and thus our linear notions of history.

“I love this walking stick,” Mr. Otis says to me. “And it’s not just ’cause I’m an old man, either.” He taps the stick on his stoop, adding, “I’ve had it since I was your age.”

Of all the Divine Knights symbols, the cane is Mr. Otis’s favorite. This is ironic, given that young gang members increasingly need canes as a consequence of the very violence Mr. Otis laments. Still, this seasoned gang veteran doesn’t associate his cane with injury but with pride and a masterful breadth of knowledge about his organization. When he was young, Mr. Otis tells me, canes, a symbol of gang unity, were hand-drawn on the custom-made shirts the Knights wore. Nowadays, Mr. Otis’s generation often contrasts the stability of the cane and its understated sophistication against the extravagance of sneakers. Why, I ask on a dusky October evening, is the cane his most cherished emblem? Mr. Otis clenches his hand into a fist, then releases one digit at a time, enumerating each of the gang’s symbols.

“Well, the top hat represents our ability to make things happen, like magicians do,” he says, wiggling his pinkie. Next comes the ring finger. “The dice represents our hustle. You know what they say: Every day as a Divine Knight is a gamble. The playboy rabbit,” he continues, “represents that we’re swift in thought, silent in movement, and sm-o-o-th in deliverance. Of course, the champagne glass represents celebration.” Mr. Otis pauses briefly. “You can probably tell that all of these symbols have the young boys thinking that gang life is about trying to be pimps and players. But the cane”—signified by the pointer finger—“the cane represents consciousness. The knowledge that you must rely on the wisdom from your elders. The cane represents that we have to support one another—and support the community—to survive.”

We can’t see much on nights like this, but that doesn’t stop us from sitting on the stoop and watching the corner. The lights on Mr. Otis’s street either don’t work or are never on. In fact, were it not for the lamppost at the street’s end that serves as a mount for a police camera, the streetlights wouldn’t serve any purpose at all. Residents dismissively refer to the camera as the “blue light.” The device, which rests in a white box topped with a neon blue half-sphere, lights up every few seconds. Stationed to surveil the neighborhood, the blue light fulfills another unintended purpose: in the absence of working streetlights, the intermittent flash nearly illuminates the entire street. It is a vague luminescence, but just enough to make clear the molded boards of the vacant houses across the street. You can also distinguish the occasional trash bag blowing in the wind, like urban tumbleweed.

And you can spot the T-shirts—all of the young Eastwoodians in white tees—but that’s about all the blue light at the end of the street can brighten for Mr. Otis and me. From where we sit, you can’t identify the owners of those shirts; their faces aren’t perceptible, not even their limbs—just clusters of white tees floating in the distance, ghost-like. Mr. Otis, a veteran both of Eastwood stoops and Eastwood’s oldest gang, sees the ghosts as fleeting images of the “good ol’ gang,” as he calls it—a gang about to sink into oblivion.

Mr. Otis watches the street intently, as if he’s being paid for the task. And in a sense, he is: central to Mr. Otis’s work at the House of Worship’s homeless shelter is the supervision of his neighborhood. His street credentials, however, are far more valuable than anything he can see from his stoop. Mr. Otis was one of the first members to join the nascent gang in the 1950s. This was during the second Great Migration, when African Americans moved from the South to Chicago, settling in European immigrant neighborhoods. Back then, black youths traveled in packs for camaraderie, and to more safely navigate streets whose residents resented their presence. Because they were known to fight their white peers over access to recreational spaces, the image of black gangs as groups of delinquents emerged.

Mr. Otis became a leader of the Divine Knights in the 1960s, around the age of twenty-six. For the next forty years, he was—and remains—prominent both in the gang and in the community. Nowadays, he speaks about his youth with a mix of fondness and disdain. The two great narratives of his life, community decline and gang devolution, are also interwoven. “Things were different when we were on the block,” he says. “We did things for the community. We picked up trash, even had a motto: ‘Where there is glass, there will be grass.’ And white folks couldn’t believe it. The media, they were shocked. Channel Five and Seven came around here, put us on the TV screen for picking up bottles.”

In these lively recollections, Mr. Otis connects the Divine Knights’ community-service initiatives to the political struggles of the civil rights movement. As a youngster, Mr. Otis was part of a gang whose stated goal was to end criminal activity. Around this time, in the mid-1960s, a radical new thesis articulated by criminologists and the prison-reform movement gained momentum. These researchers argued that people turned to crime because social institutions had largely failed them. Major street gangs became recipients of private grants and public funds (most notably from President Johnson’s War on Poverty) earmarked for community organization, the development of social welfare programs, and profit-making commercial enterprises. The Divine Knights of the 1960s opened community centers, reform schools, and a number of small businesses and management programs. Such were the possibilities when Reverend Dr. Martin Luther King Jr. relocated his family to a home near Eastwood.

In local newspaper articles, King explained that his decision to live on the West Side was political as well as purposeful. “I don’t want to be a missionary in Chicago, but an actual resident in a slum section so that we can deal firsthand with the problems,” King said. “We want to be in a section that typifies all the problems that we’re seeking to solve in Chicago.”

King’s organization, the Southern Christian Leadership Conference (SCLC), geared up for a broad attack on racism in the North. Their first northern push focused on housing discrimination; and they referred to it as “the open-housing campaign” because the SCLC wanted to integrate Chicago’s predominately white neighborhoods. As the SCLC gathered community support for their cause in May 1966, they developed relationships with Chicago’s street gangs. On Memorial Day, a riot broke out after a white man killed a black man with a baseball bat. Chaos ensued, resulting in the destruction of many local businesses. Gang members were rumored to be among the looters. Some civil rights leaders, in turn, feared that a spate of recent riots might jeopardize their campaign of nonviolence. When, during a rally at Soldier Field, a gang affiliate overheard a member of the SCLC state his reluctance to involve “gang fighters,” Chicago gang members (including many Knights) took this as a sign of disrespect and threatened to abandon King. A Chicago gang member was quoted as saying:

I brought it back to [a gang leader named] Pep and said if the dude feel this way and he’s supposed to be King’s number one man, then we don’t know how King feels and I believe we’re frontin’ ourselves off. Pep say there wasn’t no reason for us to stay there so we rapped with the other groups and when we gave our signal, all the [gang members] stood up and just split. When we left, the place was half empty and that left the King naked.

Days after the Soldier Field incident, in an effort to mend fences, King set up a meeting in his apartment and reassured gang members that he “needed the troops.” The Divine Knights were among the Chicago gangs to subsequently reaffirm their allegiance to King. After meeting with various gangs, top SCLC representatives were confident that gangs could not only be persuaded to refrain from rioting, but might also be convinced to help calm trouble that might arise on their respective turfs. Moreover, “the sheer numbers of youths loyal to these organizations made them useful to the Southern Christian Leadership Conference’s objective of amassing an army of nonviolent protesters—even if including them came with the additional challenge of keeping them nonviolent.”

In June 1966, the Divine Knights were persuaded to participate in the two marches that Dr. King led into all-white neighborhoods during the Chicago Freedom Movement’s open-housing campaign. Inspired by the movement’s demand that the Chicago City Council increase garbage collection, street cleaning, and building-inspection services in urban areas, the Knights organized their own platform for political action. They scheduled a press conference with local media outlets to unveil their agenda on April 4, 1968. But just before the reporters arrived, King was assassinated. Less than twenty-four hours later, Eastwood erupted in riots. The fires and looting following King’s murder destroyed many of the establishments along Murphy Road, Eastwood’s major commercial district at the time.

Many store owners left the neighborhood when insurance companies canceled their policies or prohibitively increased premiums, making it difficult to rebuild businesses in their previous location. This cycle of disinvestment, which peaked after King’s murder but had been steadily increasing since 1950, affected all of Eastwood’s retailers. By 1970, 75 percent of the businesses that had buoyed the community just two decades earlier were shuttered. There has not been a significant migration of jobs, or people, into Eastwood since World War II.

In the decades after the massive fires and looting, Mr. Otis and other gang elders maintain that the Divine Knights saw their power decline because they could do little to stop the other factions of the Knights from rioting. Neighborhood residents not affiliated with the gang were likewise dismayed. Here was evidence, with King’s murder, that the injustices allegedly being fought by the Divine Knights were, in fact, intractable. From Mr. Otis’s perspective, the disillusionment that accompanied King’s death, and the riots that followed—not to mention other assassinations, such as that of Black Panther leader Fred Hampton—all but ensured a downward spiral. The noble promise of the civil rights era was shattered, its decline as awful as its rise was glorious.

For Mr. Otis, the modern-day Divine Knights are as much about their forgotten history of activism as anything else. So on nights such as these—sitting on his stoop, watching the latest generation of gang members—he feels it his duty to share a finely honed civil rights legacy narrative with a novice researcher. “Take notes on that,” he says. “Write that down. We, the Divine Knights, got government money to build a community center for the kids. We were just trying to show ’em all: gangs don’t have to be bad, you know. Now these guys don’t have no history. They’re ‘Anonymous,’ ” Mr. Otis says sarcastically, referring to the name of one of many factions in this new renegade landscape, the Anonymous Knights.

Out in front of Mr. Otis’s stoop, ten or so gang members face each other like an offense about to break huddle. And then they do just that. The quarterback—Kemo Nostrand, the gang leader—approaches, retrieves a cell phone from his car, and then rejoins the loiterers. I ask Mr. Otis about Kemo and his crew: “Are they as disreputable as the younger gang members?”

“Look at ’em,” Mr. Otis says. “They’re all outside, ain’t they? Drinking, smoking, wasting their lives away. They’re all outside.”

Nostalgia for the politically oriented gang is a desire for a different present as much as it is a yearning for the past. In Mr. Otis’s lamentations about the contemporary state of the gang, structural changes in the American social order are reduced to poor decision making. Mr. Otis and gang members of his generation fail to acknowledge that the gang’s latter-day embrace of the drug economy was not a simple matter of choice. The riots also marked the end of financial assistance for street organizations wanting to engage in community programming. When drug dealing emerged as a viable economic alternative for urban youth in the late 1970s, politicians had more than enough ammunition to argue that the Knights would always be criminal, as opposed to a political organization. The fact that both the local and federal government feared gangs like the Divine Knights for their revolutionary potential is airbrushed out of the romantic histories that Mr. Otis tells, where he invokes civilized marches in criticism of the gang’s present-day criminal involvement. In his version, for example, there is no mention of the gang members who, even during the civil rights heyday, were not at all civic-minded.

Whether or not this glorious perception of a political gang persists (or if it ever existed in the way Mr. Otis imagines), it is deployed nevertheless. Like the shiny new surveillance technology responsible for transforming a person’s visage into a ghostly specter at night, the rosy civil rights lens through which Mr. Otis views the gang helps fashion the image that haunts him. Nostalgia, this historical emotion, reorders his memory.

The interview unfolds in a West Side Chicago barbershop, long since closed. Red Walker, the short, stocky, tattoo-covered leader of the Roving Knights—a splinter group of the Divine Knights—reminisces about what it has been like growing up in a gang. Walker has been a member of the Roving Knights for twenty years (since he was nine). Now, as a captain of the gang set, Red feels that the organization’s biggest problem is a lack of leadership. Comparing the gang of old to the one he now commands, he says, wistfully, “When I was growing up, we had chiefs. We had honor. There were rules that Knights had to follow, a code that gang members were expected to respect.”

A few of the Roving Knights’ strictures, according to Red: If members of the gang were shooting dice and somebody’s mom walked down the street, the Knights would move out of respect. When young kids were coming home from school, the Knights would temporarily suspend the sale of drugs. “We would take a break for a couple of hours,” Red says. “Everybody understood that. And plus, when I was coming up in the gang, you had to go to school. You could face sanctions if you didn’t. And nobody was exempt. Not even me.”

Red’s mother, he says, was a “hype”—the favored West Side term for drug addict. His father wasn’t present, and he didn’t have siblings. Red did, however, have a “soldier” assigned to him, whose responsibilities included taking him to school in the morning and greeting him when he got out. “Made sure I did my homework and everything,” Red says. “These kids don’t have that. There’s no structure now. They govern themselves, so we call them renegades.”

It’s likely I will meet a lot of renegades on the streets of Eastwood, Red warns. Most are proudly independent, boastful of their self-centered goals. Red says, “They’ll even tell you, ‘Yeah, I’m just out for self. I’m trying to get my paper. Fuck the gang, the gang is dead.’ They’ll tell you, straight up. But, you know what? They’re the ones that’s killing it, them renegades. I even had one in my crew.” Plopping down in the barber’s chair beside me, Red indicates that the story he’s about to tell is somewhat confidential, but he’s going to tell me anyway because he likes me—I’m a “studious motherfucker,” he jokes.

“You know how niggers be in here selling everything, right?” Red says. (He is referring to the daily transactions involving bootleg cable, DVDs, CDs, and candy.) “Well, back in the day, a long, long, long time ago, niggers used to sell something the police didn’t like us selling. We used to sell”—here Red searches for the right euphemism, settling on “muffins.” “Yeah, we had a bakery in this motherfucker. And cops, they hate muffins. So they would come up in here, try to be friendly, they’d snoop around, get they free haircut, and try to catch someone eating muffins or selling muffins, or whatever. But they could never catch nobody with muffin-breath around here. Never.”

One day, though, the police apprehended one of the “little shorties” working for Red, and the young man happened to have a muffin in his pocket. “Now, this wasn’t even an entire muffin. It was like a piece of a muffin—a crumb,” Red says. “Shorty wouldn’t have got in a whole lot of trouble for a crumb, you know? But this nigger sung. The nigger was singing so much, the cops didn’t have to turn on the radio. They let him out on the next block. He told about the whole bakery: the cooks, the clients. He told on everybody. And I had to do a little time behind that. That’s why in my new shop,” Red continues, glaring again at the recorder, “WE. DO. NOT. SELL. MUFFINS. ANY. MORE.”

Red pauses, seemingly satisfied by his disavowal of any current illegal muffin activity, then adds, “But, real talk: That’s how you know a renegade. No loyalty. They’ll sell you down the river for a bag of weed and a pair of Jordans.”

To read more about Renegade Dreams, click here.

27. Excerpt: In Search of a Lost Avant-Garde


An excerpt from In Search of a Lost Avant-Garde: An Anthropologist Investigates the Contemporary Art Museum

by Matti Bunzl

***

“JEFF KOONS ♥ CHICAGO”

I’m sitting in the conference room on the fifth floor of the MCA, the administrative nerve center, which is off limits to the public. It is late January and the temperatures have just plunged to near zero. But the museum staff is bustling with activity. With four months to go until the opening of the big Jeff Koons show, all hands are on deck. And there is a little bit of panic. Deadlines for the exhibit layout and catalogue are looming, and the artist has been hard to pin down. Everyone at the MCA knows why. Koons, who commands a studio that makes Warhol’s Factory look like a little workshop, is in colossal demand. For the MCA, the show has top priority. But for Koons, it is just one among many. In 2008 alone, he will have major exhibits in Berlin, New York, and Paris. The presentation at the Neue Nationalgalerie is pretty straightforward. Less so New York, where Koons is scheduled to take over the roof of the Metropolitan Museum, one of the city’s premier art destinations. But it may well be the French outing that most preoccupies the artist. With an invitation to present his work at Versailles, the stakes could not be higher. Indeed, when the show opens in the fall, the photos of Koons’s work at the Rococo palace, shockingly incongruous yet oddly at home, go around the world.

But on this morning, there is good news to share. As the marketing department reports, Koons has approved the publicity strategy for the MCA show. Most everyone in the group of curators, museum educators, and development staffers breathes a sigh of relief, not unlike the response of Don Draper’s team after a successful pitch. Mad Men, which made its widely hailed debut only a few months earlier, is on my mind, in fact. Sure, there is no smoking and drinking at the MCA. But the challenge faced by the museum is a lot like that of the fictional Sterling Cooper: how to take a product and fit it with an indelible essence, a singularity of feeling. Jeff Koons is hardly as generic as floor cleaner or facial cream. But given his ubiquity across the global art scene, the MCA presentation still needs a hook, something that can give it the luster of uniqueness.

Koons has history in Chicago, and that turns out to be the key. Yes, he may have been born and raised in Pennsylvania, graduated from the Maryland Institute College of Art in Baltimore, and settled in New York, where, after working as a commodities trader, he became a professional artist. But in between, for one year in the mid-1970s, Koons lived and studied in Chicago, taking courses at the School of the Art Institute and serving as an assistant to painter Ed Paschke. Enough to imagine the MCA show as a homecoming, and the second one at that. The first, it now appears, was the 1988 exhibit, which had paved his way to superstardom and cemented an enduring relationship between artist and city. No one in the room can be certain how Koons actually feels about his old stomping grounds. But the slogan stands: Jeff Koons ♥ Chicago.

 ***

It’s a few weeks later. The group is back on the fifth floor. The mood is determined. On the agenda for today is the ad campaign. It will be a “communications blitz,” one of the staffers on the marketing team says. It will start with postcards sent to museum members urging them to save the date of the opening. “We need to communicate that it will be a real happening!”

“Koons is like a rock star,” someone seconds, “and we need to treat him like that.” Apparently, Justin Timberlake stopped by the MCA a few weeks ago, causing pandemonium among the school groups that happened to be touring the museum at the time. “Koons is just like that!” one of the marketers enthuses. “No, he’s not,” I’m thinking to myself. But for what seems like an endless few seconds, no one has the heart to burst the bubble. Finally, someone conjectures that, Koons’s art‑star status notwithstanding, people might not know what he looks like. It is suggested that the postcard feature a face shot.

The graphics for the ad campaign and catalogue cover are central to the conversation. We look at mockups, and the marketers share their excitement about the splashy images. There is much oohing and aahing. But, then, a minor hiccup. A curator notes that one of the pieces depicted in the copy will not actually be in the show, the loan request having been refused. Another image presents an issue as well. That piece will be on display, but it belongs to another museum. Maybe it, too, should be purged. No one is overly concerned, though. Given a virtually inexhaustible inventory of snazziness, Koons’s oeuvre is certain to throw up excellent replacements.

Splashy images would also be the focal point of an advertorial the marketing department is considering for a Condé Nast publication. Such a piece, to be run in either Vanity Fair or Architectural Digest, would be written by the MCA but laid out in the magazine’s style. It would complement the more conventional press strategies, like articles in local newspapers, and would be on message. This, as signs from the real world indicate that Koons may, in fact, truly like Chicago. He wants to attend a Bulls game with his kids while in town. From the standpoint of marketing, it’s a golden opportunity. “This artist likes Chicago sports,” one staffer gushes, something people would be “pleasantly surprised by.” The narrative that emerges is that of the local boy who made good. Indeed, when the official marketing copy arrives a few weeks later, it features sentences like these: “The kid who went to art school in Chicago and loved surrealism, dada, and Ed Paschke and the imagists—that kid made it big.”

 ***

A few weeks later still, back in the conference room on the fifth floor. Today’s topic: tie‑ins and merchandizing. Some of it is very straightforward, including a proposed relationship with Art Chicago, the local art fair. Other ideas are more outlandish, like a possible connection to The Incredible Hulk. The superhero movie, based on the Marvel Comics and starring Edward Norton, is set to open in mid‑June, and the staff member pitching the idea seems to be half joking. But as I look around the room, there are a lot of nods. The show, after all, will feature several pieces from Koons’s Hulk Elvis series, large paintings that juxtapose the cartoon character with such Americana as the Liberty Bell.

More concrete is a tie-in with Macy’s. This past fall, a large balloon version of Koons’s Rabbit made its debut in New York’s Thanksgiving Day Parade. For the MCA show, it could make a trip to Chicago, where it would be displayed at the Macy’s store in the Loop. I’m wondering if that might be a risky move. After all, the company is considered something of an interloper in the city, having taken over Marshall Field’s beloved flagship store in 2006. But the marketers are ecstatic about the opportunity. “This will really leverage the promotional aspects,” one of them exclaims. The word that keeps coming up is “cross-marketing,” and I take away that Jeff Koons stuff might soon be everywhere.

Stuff, in fact, is what really gets the group going today. Koons, it turns out, is a veritable merchandizing machine, which means that a lot of things can be sold in conjunction with the show. The list of products bearing his art ranges from the affordably populist (beach towels from Target) to the high-end luxurious (designs by Stella McCartney). But the news gets even better. Koons has given the MCA permission to manufacture a whole new line of T-shirts featuring Rabbit. We pass around production samples, and everyone agrees that the baby tees, in light blue and pink, are too cute for words.

Then it’s time for the food. As early as January, I heard about plans to delight museum patrons with Koons‑themed cuisine. Initially, there was some talk of cheesecake, but more recently, the word in the hallways has been cookies. Turns out, it’s a go. Koons just approved three separate designs. We’re back to oohing and aahing, when one of the marketers suggests that some of the cookies could be signed and sold as a limited edition. Another joke, I think. But he goes on, noting that some people would choose not to eat the cookies so they could sell them at Sotheby’s in a couple of decades. The atmosphere is jocund. So when one of the curators points out that worms would be crawling out of the cookies by then, it’s all taken as part of the general levity.

 ***

The concept of marketing is quite new to contemporary art museums. In the good old days, it was simply not seen as a necessity. Giving avant-garde art a place was the objective, which meant that institutions catered to a small group of cognoscenti and worried little about attracting the general public. All this changed once the spiral of growth started churning. Larger museums required larger audiences, not just to cover ever-increasing overhead but to validate contemporary art museums’ reinvention as major civic players.

The MCA is paradigmatic. Until the late 1970s, there was no marketing operation per se. What little advertising was done ran under the rubric of “public relations,” which was itself subsumed under “programming.” Public relations became a freestanding unit in the 1980s, and by the end of the decade marketing was officially added to its agenda. But it was not until the move into the new building in 1996 that a stand‑alone marketing department was added to the institution’s roster. The department made its presence felt immediately. Right away, its half dozen members began issuing a glossy magazine, initially called Flux and later renamed MCAMag, MCA Magazine, and eventually MCA Chicago. A few years later, a sprightly e‑news enterprise followed, keyed to the ever‑expanding website.

But marketing is much more pervasive in the contemporary art museum. Just before I arrive at the MCA, for example, its marketing department spearheads the museum’s fortieth‑anniversary celebration, a forty‑day extravaganza with numerous events and free admission. Some of the first meetings I attend at the institution are the postmortems, where the marketers take a lead in tallying the successes and failures of the initiative. There is much talk of “incentivizing membership.” Branding, too, is emphasized, particularly the ongoing need to “establish the museum’s point of view” by defining the contemporary. The latter is especially pressing in light of the imminent opening of the Art Institute’s Modern Wing. But the key word that recurs is gateway. The MCA, the marketers consistently argue, needs shows that appeal to “new and broader audiences” and signal that all Chicagoans are “welcome at the museum.”

Jeff Koons is their big hope for 2008. In a handout prepared for a meeting in February, they explain why: “Jeff Koons is by far one of the (if not the) most well known living artists today.” With the recent sale of Hanging Heart making “news even in InTouch Weekly” and his participation in the Macy’s Thanksgiving Day Parade, he “is doing what no artist has done since Andy Warhol.” He is becoming part of the “mainstream.” Even more importantly, “the art itself helps to make this a gateway.” The “images of pop culture icons and inflatable children’s toys democratize the art experience. Even the most novice of art viewers feel entitled to react to his work.”

With this, the marketing department takes a leading role in the preparations for the Koons show. Its members help organize and run the weekly meetings coordinating the museum‑wide efforts and rally the troops when morale is down. This is done corporate‑style, as in an exercise in which staffers go around the table to share what excites them personally about Jeff Koons. The curators can be conspicuously silent when marketing talk dominates the agenda. But that doesn’t mean there aren’t any tensions.

 ***

I’m having lunch with one of the curators. We’re sitting on the cafeteria side of Puck’s at the MCA, the vast, high‑ceilinged restaurant on the museum’s main exhibition floor. The conversation circles around a loaded topic, the frustrations with the marketing department. “I understand where they’re coming from,” she tells me, only to add that they may not “believe in the same things I do.” I ask for specifics and get a torrent that boils down to this: The curator sees the MCA as a space for adventure and experimentation where visitors encounter a contemporary culture they don’t already know. What marketing wants to do, she says, is to give people a pleasant experience amidst items they already like. “If they had their way, it would be Warhol all the time.” Individual viewpoints vary, of course. But I’m hearing similar things from other members of the curatorial department. Marketing, they tell me, can be fecklessly populist and insufficiently attuned to the intricacies of contemporary art and artists.

The feelings are mutual, or, to be more accurate, inverted. To the marketers, the curatorial department sometimes comes off as elitist and quixotic. When its members talk about some of the art they want to show, one of them tells me, it can “just sound crazy.” “Sometimes,” she continues, “I don’t even know what they are doing.” Even more exasperating, however, is the curators’ seeming disinterest in growing the museum’s audience. “They never think about how to attract more viewers” is a complaint I hear on more than one occasion.

If there is a convergence of views, it is only that the other side has too much power and influence.

***

For a while, I think that the fretting might be personal. Every institution, after all, breeds animosities, petty and otherwise, and the ever‑receptive anthropologist would seem to be the perfect outlet. But I am struck that the grievances are never ad hominem. The MCA’s employees, in fact, seem to genuinely like and respect one another. This is not surprising. The museum, after all, is a nonprofit whose staffers, no matter how “corporate” in orientation, could pursue eminently more lucrative careers elsewhere. The resulting feeling is that “we are all in this together,” a sentiment I hear expressed with equal regularity and conviction.

What, then, is it between curation and marketing? Over time, I come to see the tensions as intrinsic to the quest of bringing contemporary art to ever‑larger audiences. The issue, in other words, is structural.

 ***

When marketers look at contemporary art, they see a formidable challenge. Here is a product the general public knows little about, finds largely incomprehensible, and, occasionally, experiences as outright scary. This is as far as one can be from, say, introducing a new kind of soap. There, the relevance and purpose of the generic product are already well established, leaving marketers to work the magic of brand association. Maybe the campaign is all about fragrance or vitality or sex appeal—what it won’t be about is how soap itself is good for you.

Much of the marketing in the domain of high culture works in this very manner. When Chicago’s Lyric Opera advertises its new season, for example, it can safely assume that folks have a pretty accurate sense of the genre. What’s more, there is little need to justify the basic merits of the undertaking. Most people don’t go to the opera. But even those who find it boring or tedious are likely to accede to its edifying nature.

The same holds true for universal museums. In Chicago, that would be the Art Institute, whose holdings span the globe and reach from antiquity to the present. Marketing has relevance there, too. But much like at the Lyric, the value of the product is readily understood. So is its basic nature, particularly when it can take the form of such widely recognized icons as Georges Seurat’s La Grande Jatte or Grant Wood’s American Gothic.

At its most elemental, the marketing of a museum is orchestrated on the marquees at its entrance. With this in mind, the Art Institute’s advertising strategy is clear. It is the uncontested classics that get top billing, whether they are culled from the museum’s unparalleled collection or make an appearance as part of a traveling show. Mounted between the ornate columns at the majestic Michigan Avenue entrance, a typical tripartite display, as the one from April 2007, looks like this: In the middle, bright red type on a blue banner advertises Cézanne to Picasso, a major show on the art dealer Ambroise Vollard, co‑organized by the Art Institute and fresh from its stop at the Metropolitan Museum in New York. On the left, a bisected flag adds to the theme with apples by Cézanne and Picasso’s The Old Guitarist, one of the museum’s best‑loved treasures. On the right, finally, a streamer with an ornately decorated plate—horse, rider, birds, and plants—publicizes Perpetual Glory, an exhibit of medieval Islamic ceramics. A couple of months down the road, the billboards announce a show of prints and drawings collected by prominent Chicago families, an exhibit on courtly art from West Africa, and free evening admission on Thursdays and Fridays. A little later still, it is an exhibit of sixteenth‑and seventeenth‑century drawings, a display of European tapestries, and the Art Institute’s logo.

What’s not on the Art Institute marquees is contemporary art. Sure, a canonized living master like my good friend Jasper Johns can make an occasional appearance. But the edgy fare served up by the museum’s contemporary department is absent. Nothing announces William Pope.L in 2007, Mario Ybarra Jr. in 2008, or Monica Bonvicini in 2009. The contemporary art market may be booming, but the Art Institute’s marketers assume that the general public cares only so much.

Their colleagues at the MCA don’t have that option. Tasked with marketing a contemporary art museum to an ever‑expanding audience, they have to find ways to engage the general public in their rarefied institution. It is an act of identification. “Often I, myself, don’t understand the art in the museum at first,” one marketer tells me, “but that gives me an advantage. I get where our audience is coming from.”

The issue goes far beyond marquees, then, although they are its perfect representation. For what’s at stake is the public imaginary of contemporary art. This is where marketing and curation are at loggerheads. The two departments ultimately seek to tell fundamentally different stories about the MCA and its contents. For the curators, the museum is a space for the new and therefore potentially difficult. For the marketers, that is precisely the problem. “People tend to spend leisure time doing something that is guaranteed to be a good use of their time,” they implore their colleagues. “That often means sticking with the familiar.” And so the stage is set for an uneasy dance, a perpetual pas de deux in which the partners are chained together while wearing repelling magnets.

***

To read more about In Search of a Lost Avant-Garde, click here.

28. Excerpt: Elena Conis's Vaccine Nation

An excerpt from Vaccine Nation: America’s Changing Relationship with Immunization

by Elena Conis

(recent pieces featuring the book at the Washington Post and Bloomberg News)

***

“Mumps in Wartime”

Between 1963 and 1969, the nation’s flourishing pharmaceutical industry launched several vaccines against measles, a vaccine against mumps, and a vaccine against rubella in rapid succession. The measles vaccine became the focus of the federally sponsored eradication campaign described in the previous chapter; the rubella vaccine prevented birth defects and became entwined with the intensifying abortion politics of the time. Both vaccines overshadowed the debut of the vaccine against mumps, a disease of relatively little concern to most Americans in the late 1960s. Mumps was never an object of public dread, as polio had been, and its vaccine was never anxiously awaited, like the Salk polio vaccine had been. Nor was mumps ever singled out for a high-profile immunization campaign or for eradication, as measles had been. All of which made it quite remarkable that, within a few years of its debut, the mumps vaccine would be administered to millions of American children with little fanfare or resistance.

The mumps vaccine first brought to market in 1968 was developed by Maurice Hilleman, then head of Virus and Cell Biology at the burgeoning pharmaceutical company Merck. Hilleman was just beginning to earn a reputation as a giant in the field of vaccine development; upon his death in 2005, the New York Times would credit him with saving “more lives than any other scientist in the 20th century.” Today the histories of mumps vaccine that appear in medical textbooks and the like often begin in 1963, when Hilleman’s daughter, six-year-old Jeryl Lynn, came down with a sore throat and swollen glands. A widower who found himself tending to his daughter’s care, Hilleman was suddenly inspired to begin work on a vaccine against mumps—which he began by swabbing Jeryl Lynn’s throat. Jeryl Lynn’s viral strain was isolated, cultured, and then gradually weakened, or attenuated, in Merck’s labs. After field trials throughout Pennsylvania proved the resulting shot effective, the “Jeryl Lynn strain” vaccine against mumps, also known as Mumpsvax, was approved for use.

But Hilleman was not the first to try or even succeed at developing a vaccine against mumps. Research on a mumps vaccine began in earnest during the 1940s, when the United States’ entry into World War II gave military scientists reason to take a close look at the disease. As U.S. engagement in the war began, U.S. Public Health Service researchers began reviewing data and literature on the major communicable infections affecting troops during the First World War. They noted that mumps, though not a significant cause of death, was one of the top reasons troops were sent to the infirmary and absent from duty in that war—often for well over two weeks at a time. Mumps had long been recognized as a common but not “severe” disease of childhood that typically caused fever and swelling of the salivary glands. But when it struck teens and adults, its usually rare complications—including inflammation of the reproductive organs and pancreas—became more frequent and more troublesome. Because of its highly contagious nature, mumps spread rapidly through crowded barracks and training camps. Because of its tendency to inflame the testes, it was second only to venereal disease in disabling recruits. In the interest of national defense, the disease clearly warranted further study. PHS researchers estimated that during World War I, mumps had cost the United States close to 4 million “man days” from duty, contributing to more total days lost from duty than foreign forces saw.

The problem of mumps among soldiers quickly became apparent during the Second World War, too, as the infection once again began to spread through army camps. This time around, however, scientists had new information at hand: scientists in the 1930s had determined that mumps was caused by a virus and that it could, at least theoretically, be prevented through immunization. PHS surgeon Karl Habel noted that while civilians didn’t have to worry about mumps, the fact that infection was a serious problem for the armed forces now justified the search for a vaccine. “To the military surgeon, mumps is no passing indisposition of benign course,” two Harvard epidemiologists concurred. Tipped off to the problem of mumps by a U.S. Army general and funded by the Office of Scientific Research and Development (OSRD), the source of federal support for military research at the time, a group of Harvard researchers began experiments to promote mumps virus immunity in macaque monkeys in the lab.

Within a few years, the Harvard researchers, led by biologist John Enders, had developed a diagnostic test using antigens from the monkey’s salivary glands, as well as a rudimentary vaccine. In a subsequent set of experiments, conducted both by the Harvard group and by Habel at the National Institute of Health, vaccines containing weakened mumps virus were produced and tested in institutionalized children and plantation laborers in Florida, who had been brought from the West Indies to work on sugar plantations during the war. With men packed ten to a bunkhouse in the camps, mumps was rampant, pulling workers off the fields and sending them to the infirmary for weeks at a time. When PHS scientists injected the men with experimental vaccine, one man in 1,344 went into anaphylactic shock, but he recovered with a shot of adrenaline and “not a single day of work was lost,” reported Habel. To the researchers, the vaccine seemed safe and fairly effective—even though some of the vaccinated came down with the mumps. What remained, noted Enders, was for someone to continue experimenting until scientists had a strain infective enough to provoke a complete immune response while weak enough not to cause any signs or symptoms of the disease.

Those experiments would wait for well over a decade. Research on the mumps vaccine, urgent in wartime, became a casualty of shifting national priorities and the vagaries of government funding. As the war faded from memory, polio, a civilian concern, became the nation’s number one medical priority. By the end of the 1940s, the Harvard group’s research was being supported by the National Foundation for Infantile Paralysis, which was devoted to polio research, and no longer by OSRD. Enders stopped publishing on the mumps virus in 1949 and instead turned his full-time attention to the cultivation of polio virus. Habel, at the NIH, also began studying polio. With polio occupying multiple daily headlines throughout the 1950s, mumps lost its place on the nation’s political and scientific agendas.

Although mumps received scant resources in the 1950s, Lederle Laboratories commercialized the partially protective mumps vaccine, which was about 50 percent effective and offered about a year of protection. When the American Medical Association’s Council on Drugs reviewed the vaccine in 1957, they didn’t see much use for it. The AMA advised against administering the shot to children, noting that in children mumps and its “sequelae,” or complications, were “not severe.” The AMA acknowledged the vaccine’s potential utility in certain populations of adults and children—namely, military personnel, medical students, orphans, and institutionalized patients—but the fact that such populations would need to be revaccinated every year made the vaccine’s deployment impractical. The little professional discussion generated by the vaccine revealed a similar ambivalence. Some observers even came to the disease’s defense. Edward Shaw, a physician at the University of California School of Medicine, argued that given the vaccine’s temporary protection, “deliberate exposure to the disease in childhood … may be desirable”: it was the only way to ensure lifelong immunity, he noted, and it came with few risks. The most significant risk, in his view, was that infected children would pass the disease to susceptible adults. But even this concern failed to move experts to urge vaccination. War had made mumps a public health priority for the U.S. government in the 1940s, but the resulting technology (imperfect as it was) generated little interest or enthusiasm in a time of peace, when other health concerns loomed larger.

After the war but before the new live virus vaccine was introduced, mumps went back to being what it long had been: an innocuous and sometimes amusing childhood disease. The amusing nature of mumps in the 1950s is evident even in seemingly serious documents from the time. When the New York State health department published a brochure on mumps in 1955, they adopted a light tone and a comical caricature of chipmunk-cheeked “Billy” to describe a brush with the disease. In the Chicago papers, health columnist and Chicago Medical Society president Theodore Van Dellen noted that when struck with mumps, “the victim is likely to be dubbed ‘moon-face.’” Such representations of mumps typically minimized the disease’s severity. Van Dellen noted that while mumps did have some unpleasant complications—including the one that had garnered so much attention during the war—“the sex gland complication is not always as serious as we have been led to believe.” The health department brochure pointed out that “children seldom develop complications,” and should therefore not be vaccinated: “Almost always a child is better off having mumps: the case is milder in childhood and gives him life-long immunity.”

Such conceptualizations helped shape popular representations of the illness. In press reports from the time, an almost exaggeratedly lighthearted attitude toward mumps prevailed. In Atlanta, papers reported with amusement on the oldest adult to come down with mumps, an Englishwoman who had reached the impressive age of ninety-nine. Chicago papers featured the sad but cute story of the boy whose poodle went missing when mumps prevented him from being able to whistle to call his dog home. In Los Angeles, the daily paper told the funny tale of a young couple forced to exchange marital vows by phone when the groom came down with mumps just before the big day. Los Angeles Times readers speculated on whether the word “mumps” was singular or plural, while Chicago Daily Defender readers got to laugh at a photo of a fat-cheeked matron and her fat-cheeked cocker spaniel, heads wrapped in matching dressings to soothe their mumps-swollen glands. Did dogs and cats actually get the mumps? In the interest of entertaining readers, newspapers speculated on that as well.

The top reason mumps made headlines throughout the fifties and into the sixties, however, was its propensity to bench professional athletes. Track stars, baseball players, boxers, football stars, and coaches all made the news when struck by mumps. So did Washington Redskins player Clyde Goodnight, whose story revealed a paradox of mumps at midcentury: the disease was widely regarded with casual dismissal and a smirk, even as large enterprises fretted over its potential to cut into profits. When Goodnight came down with a case of mumps in 1950, his coaches giddily planned to announce his infection to the press and then send him into the field to play anyway, where the Pittsburgh Steelers, they gambled, would be sure to leave him open for passes. But the plan was nixed before game time by the Redskins’ public relations department, who feared the jubilant Goodnight might run up in the stands after a good play and give fans the mumps. Noted one of the team’s publicists: “That’s not good business.”

When Baltimore Orioles outfielder Frank Robinson came down with the mumps during an away game against the Los Angeles Angels in 1968, however, the tone of the team’s response was markedly different. Merck’s new Mumpsvax vaccine had recently been licensed for sale, and the Orioles’ managers moved quickly to vaccinate the whole team, along with their entire press corps and club officials. The Orioles’ use of the new vaccine largely adhered to the guidelines that Surgeon General William Stewart had announced upon the vaccine’s approval: it was for preteens, teenagers, and adults who hadn’t yet had a case of the mumps. (For the time being, at least, it wasn’t recommended for children.) The Angels’ management, by contrast, decided not to vaccinate their players—despite their good chances of having come into contact with mumps in the field.

Baseball’s lack of consensus on how or whether to use the mumps vaccine was symptomatic of the nation’s response as a whole. Cultural ambivalence toward mumps had translated into ambivalence toward the disease’s new prophylactic, too. That ambivalence was well-captured in the hit movie Bullitt, which came out the same year as the new mumps vaccine. In the film’s opening scene, San Francisco cop Frank Bullitt readies himself for the workday ahead as his partner, Don Delgetti, reads the day’s headlines aloud. “Mumps vaccine on the market … the government authorized yesterday what officials term the first clearly effective vaccine to prevent mumps … ,” Delgetti begins—until Bullitt sharply cuts him off. “Why don’t you just relax and have your orange juice and shut up, Delgetti.” Bullitt, a sixties icon of machismo and virility, has more important things to worry about than the mumps. So, apparently, did the rest of the country. The Los Angeles Times announced the vaccine’s approval on page 12, and the New York Times buried the story on page 72, as the war in Vietnam and the race to the moon took center stage.

Also ambivalent about the vaccine—or, more accurately, the vaccine’s use—were the health professionals grappling with what it meant to have such a tool at their disposal. Just prior to Mumpsvax’s approval, the federal Advisory Committee on Immunization Practices at the CDC recommended that the vaccine be administered to any child approaching or in puberty; men who had not yet had the mumps; and children living in institutions, where “epidemic mumps can be particularly disruptive.” Almost immediately, groups of medical and scientific professionals began to take issue with various aspects of these national guidelines. For some, the vaccine’s unknown duration was troubling: ongoing trials had by then demonstrated just two years of protection. To others, the very nature of the disease against which the shot protected raised philosophical questions about vaccination that had yet to be addressed. The Consumers Union flinched at the recommendation that institutionalized children be vaccinated, arguing that “mere convenience is insufficient justification for preventing the children from getting mumps and thus perhaps escorting them into adulthood without immunity.” The editors of the New England Journal of Medicine advised against mass application of mumps vaccine, arguing that the “general benignity of mumps” did not justify “the expenditure of large amounts of time, efforts, and funds.” The journal’s editors also decried the exaggeration of mumps’ complications, noting that the risk of damage to the male sex glands and nervous system had been overstated. These facts, coupled with the ever-present risk of hazards attendant with any vaccination program, justified, in their estimation, “conservative” use of the vaccine.

This debate over how to use the mumps vaccine was often coupled with the more generalized reflection that Mumpsvax helped spark over the appropriate use of vaccines in what health experts began referring to as a new era of vaccination. In contrast to polio or smallpox, the eradication of mumps was far from urgent, noted the editors of the prestigious medical journal the Lancet. In this “next stage” of vaccination, marked by “prevention of milder virus diseases,” they wrote, “a cautious attitude now prevails.” If vaccines were to be wielded against diseases that represented only a “minor inconvenience,” such as mumps, then such vaccines needed to be effective, completely free of side effects, long-lasting, and must not in any way increase more severe adult forms of childhood infections, they argued. Immunization officials at the CDC acknowledged that with the approval of the mumps vaccine, they had been “forced to chart a course through unknown waters.” They agreed that the control of severe illnesses had “shifted the priorities for vaccine development to the remaining milder diseases,” but how to prevent these milder infections remained an open question. They delineated but a single criterion justifying a vaccine’s use against such a disease: that it pose less of a hazard than its target infection.

To other observers, this was not enough. A vaccine should not only be harmless—it should also produce immunity as well as or better than natural infection, maintained Oklahoma physician Harris Riley. The fact that the mumps vaccine in particular became available before the longevity of its protection was known complicated matters for many weighing in on the professional debate. Perhaps, said Massachusetts health officer Morton Madoff, physicians should be left to decide for themselves how to use such vaccines as “a matter of conscience.” His comment revealed a hesitancy to delineate policy that many displayed when faced with the uncharted territory the mumps vaccine had laid bare. It also hinted at an attempt to shift future blame in case mumps vaccination went awry down the line—a possibility that occurred to many observers given the still-unknown duration of the vaccine’s protection.

Mumps was not a top public health priority in 1967—in fact, it was not even a reportable disease—but the licensure of Mumpsvax would change the disease’s standing over the course of the next decade. When the vaccine was licensed, editors at the Lancet noted that there had been little interest in a mumps vaccine until such a vaccine became available. Similarly, a CDC scientist remarked that the vaccine had “stimulated renewed interest in mumps” and had forced scientists to confront how little they knew about the disease’s etiology and epidemiology. If the proper application of a vaccine against a mild infection remained unclear, what was clear—to scientists at the CDC at least—was that such ambiguities could be rectified through further study of both the vaccine and the disease. Given a new tool, that is, scientists were determined to figure out how best to use it. In the process of doing so, they would also begin to create new representations of mumps, effectively changing how they and Americans in general would perceive the disease in the future.

A Changing Disease

Shortly after the mumps vaccine’s approval, CDC epidemiologist Adolf Karchmer gave a speech on the infection and its vaccine at an annual immunization conference. In light of the difficulties that health officials and medical associations were facing in trying to determine how best to use the vaccine, Karchmer devoted his talk to a review of existing knowledge on mumps. Aside from the fact that the disease caused few annual deaths, peaked in spring, and affected mostly children, particularly males, there was much scientists didn’t know about mumps. They weren’t certain about the disease’s true prevalence; asymptomatic cases made commonly cited numbers a likely underestimate. There was disagreement over whether the disease occurred in six- to seven-year cycles. Scientists weren’t sure whether infection was truly a cause of male impotence and sterility. And they didn’t know the precise nature of the virus’s effects on the nervous system. Karchmer expressed a concern shared by many: if the vaccine was administered to children and teens, and if it proved to wear off with time, would vaccination create a population of non-immune adults even more susceptible to the disease and its serious complications than the current population? Karchmer and others thus worried—at this early stage, at least—that trying to control mumps not only wouldn’t be worth the resources it would require, but that it might also create a bigger public health problem down the road.

To address this concern, CDC scientists took a two-pronged approach to better understanding mumps and the potential for its vaccine. They reinstated mumps surveillance, which had been implemented following World War I but suspended after World War II. They also issued a request to state health departments across the country, asking for help identifying local outbreaks of mumps that they could use to study both the disease and the vaccine. Within a few months, the agency had dispatched teams of epidemiologists to study mumps outbreaks in Campbell and Fleming Counties in Kentucky, the Colin Anderson Center for the “mentally retarded” in West Virginia, and the Fort Custer State Home for the mentally retarded in Michigan.

The Fort Custer State Home in Augusta, Michigan, hadn’t had a single mumps outbreak in its ten years of existence when the CDC began to investigate a rash of 105 cases that occurred in late 1967. In pages upon pages of detailed notes, the scientists documented the symptoms (largely low-grade fever and runny noses) as well as the habits and behaviors of the home’s children. They noted not only who slept where, who ate with whom, and which playgrounds the children used, but also who was a “toilet sitter,” who was a “drippy, drooley, messy eater,” who was “spastic,” who “puts fingers in mouth,” and who had “impressive oral-centered behavior.” The index case—the boy who presumably brought the disease into the home—was described as a “gregarious and restless child who spends most of his waking hours darting from one play group to another, is notably untidy and often places his fingers or his thumbs in his mouth.” The importance of these behaviors was unproven, remarked the researchers, but they seemed worth noting. Combined with other observations—such as which child left the home, for example, to go on a picnic with his sister—it’s clear that the Fort Custer children were viewed as a petri dish of infection threatening the community at large.

Although the researchers’ notes explicitly stated that the Fort Custer findings were not necessarily applicable to the general population, they were presented to the 1968 meeting of the American Public Health Association as if they were. The investigation revealed that mumps took about fifteen to eighteen days to incubate, and then lasted between three and six days, causing fever for one or two days. Complications were rare (three boys ages eleven and up suffered swollen testes), and attack rates were highest among the youngest children. The team also concluded that crowding alone was insufficient for mumps to spread; interaction had to be “intimate,” involving activities that stimulated the flow and spread of saliva, such as the thumb-sucking and messy eating so common among not only institutionalized children but children of all kinds.

Mumps preferentially strikes children, so it followed that children offered the most convenient population for studying the disease’s epidemiology. But in asking a question about children, scientists ipso facto obtained an answer—or series of answers—about children. Although mumps had previously been considered a significant health problem only among adults, the evidence in favor of immunizing children now began to accumulate. Such evidence came not only from studies like the one at Fort Custer, but also from local reports from across the country. When Bellingham and Whatcom Counties in Washington State made the mumps vaccine available in county and school clinics, for example, few adults and older children sought the shot; instead, five- to nine-year-olds were the most frequently vaccinated. This wasn’t necessarily a bad thing, said Washington health officer Phillip Jones, who pointed out that there were two ways to attack a health problem: you could either immunize a susceptible population or protect them from exposure. Immunizing children did both, as it protected children directly and in turn stopped exposure of adults, who usually caught the disease from kids. Immunizing children sidestepped the problem he had noticed in his own county. “It is impractical to think that immunization of adults and teen-agers against mumps will have any significant impact on the total incidence of adult and teen-age mumps. It is very difficult to motivate these people,” said Jones. “On the other hand, parents of younger children eagerly seek immunization of these younger children and there are numerous well-established programs for the immunization of children, to which mumps immunization can be added.”

Setting aside concerns regarding the dangers of giving children immunity of unknown duration, Jones effectively articulated the general consensus on immunization of his time. The polio immunization drives described in chapters 1 and 2 had helped forge the impression that vaccines were “for children” as opposed to adults. The establishment of routine pediatric care, also discussed in chapter 1, offered a convenient setting for broad administration of vaccines, as well as an audience primed to accept the practice. As a Washington, D.C., health officer remarked, his district found that they could effectively use the smallpox vaccine, which most “mothers” eagerly sought for their children, as “bait” to lure them in for vaccines against other infections. The vaccination of children got an added boost from the news that Russia, the United States’ key Cold War opponent and foil in the space race, had by the end of 1967 already vaccinated more than a million of its youngsters against mumps.

The initial hesitation to vaccinate children against mumps was further dismantled by concurrent discourse concerning a separate vaccine, against rubella (then commonly known as German measles). In the mid-1960s, rubella had joined polio and smallpox in the ranks of diseases actively instilling fear in parents, and particularly mothers. Rubella, a viral infection that typically caused rash and a fever, was harmless in children. But when pregnant women caught the infection, it posed a risk of harm to the fetus. A nationwide rubella epidemic in 1963 and 1964 resulted in a reported 30,000 fetal deaths and the birth of more than 20,000 children with severe handicaps. In fact, no sooner had the nation’s Advisory Committee on Immunization Practices been formed, in 1964, than its members began to discuss the potential for a pending rubella vaccine to prevent similar outbreaks in the future. But as research on the vaccine progressed, it became apparent that while the shot produced no side effects in children, in women it caused a “rubella-like syndrome” in addition to swollen and painful joints. Combined with the fact that the vaccine’s potential to cause birth defects was unknown, and that the vaccination of women planning to become pregnant was perceived as logistically difficult, federal health officials concluded that “the widespread immunization of children would seem to be a safer and more efficient way to control rubella syndrome.” Immunization of children against rubella was further justified based on the observation that children were “the major source of virus dissemination in the community.” Pregnant women, that is, would be protected from the disease as long as they didn’t come into contact with it.

The decision to recommend the mass immunization of children against rubella marked the first time that vaccination was deployed in a manner that offered no direct benefit to the individuals vaccinated, as historian Leslie Reagan has noted. Reagan and, separately, sociologist Jacob Heller have argued that a unique cultural impetus was at play in the adoption of this policy: as an accepted but difficult-to-verify means of obtaining a therapeutic abortion at a time when all other forms of abortion were illegal, rubella infection was linked to the contentious abortion politics of the time. A pregnant woman, that is, could legitimately obtain an otherwise illegal abortion by claiming that she had been exposed to rubella, even if she had no symptoms of the disease. Eliminating rubella from communities through vaccination of children would close this loophole—or so some abortion opponents likely hoped. Eliminating rubella was also one means of addressing the growing epidemic of mental retardation, since the virus was known to cause birth defects and congenital deformities that led children to be either physically disabled or cognitively impaired. Rubella immunization promotion thus built directly upon the broader public’s anxieties about abortion, the “crippling” diseases (such as polio), and mental retardation.

In its early years, the promotion of mumps immunization built on some of these same fears. Federal immunization brochures from the 1940s and 1950s occasionally mentioned that mumps could swell the brain or the meninges (the membranes surrounding the brain), but they never mentioned a risk of brain damage. In the late 1960s, however, such insinuations began to appear in reports on the new vaccine. Hilleman’s early papers on the mumps vaccine trials opened with the repeated statement that “Mumps is a common childhood disease that may be severely and even permanently crippling when it involves the brain.” When Chicago announced Mumps Prevention Day, the city’s medical director described mumps as a disease that can “contribute to mental retardation.” Though newspaper reporters focused more consistently on the risk that mumps posed to male fertility, many echoed the “news” that mumps could cause permanent damage to the brain. Such reports obscured substantial differentials of risk noted in the scientific literature. For unlike the link between mumps and testicular swelling, the relationship between mumps and brain damage or mental retardation was neither proven nor quantified, even though “benign” swelling of meninges was documented to appear in 15 percent of childhood cases. In a nation just beginning to address the treatment of mentally retarded children as a social (instead of private) problem, however, any opportunity to prevent further potential cases of brain damage, no matter how small, was welcomed by both parents and cost-benefit-calculating municipalities.

The notion that vaccines protected the health (and, therefore, the productivity and utility) of future adult citizens had long been in place by the time the rubella vaccine was licensed in 1969. In addition to fulfilling this role, the rubella vaccine and the mumps vaccine—which, again, was most commonly depicted as a guard against sterility and “damage to the sex glands” in men—were also deployed to ensure the existence of future citizens, by protecting the reproductive capacities of the American population. The vaccination of children against both rubella and mumps was thus linked to cultural anxiety over falling fertility in the post-Baby Boom United States. In this context, mumps infection became nearly as much a cause for concern in the American home as it had been in army barracks and worker camps two decades before. This view of the disease was captured in a 1973 episode of the popular television sitcom The Brady Bunch, in which panic ensued when young Bobby Brady learned he might have caught the mumps from his girlfriend and put his entire family at risk of infection. “Bobby, for your first kiss, did you have to pick a girl with the mumps?” asked his father, who had made it to adulthood without a case of the disease. This cultural anxiety was also evident in immunization policy discussions. CDC scientists stressed the importance of immunizing against mumps given men’s fears of mumps-induced impotence and sterility—even as they acknowledged that such complications were “rather poorly documented and thought to occur rarely, if at all.”

As the new mumps vaccine was defining its role, the revolution in reproductive technologies, rights, and discourse that extended from the 1960s into the 1970s was reshaping American—particularly middle-class American—attitudes toward children in a manner that had direct bearing on the culture’s willingness to accept a growing number of vaccines for children. The year 1967 saw more vaccines under development than ever before. Merck’s own investment in vaccine research and promotion exemplified the trend; even as doctors and health officials were debating how to use Mumpsvax, Hilleman’s lab was testing a combined vaccine against measles, rubella, and mumps that would ultimately help make the company a giant in the vaccine market. This boom in vaccine commodification coincided with the gradual shrinking of American families that new contraceptive technologies and the changing social role of women (among other factors) had helped engender.

The link between these two trends found expression in shifting attitudes toward the value of children, which were well-captured by Chicago Tribune columnist Joan Beck in 1967. Beck predicted that 1967 would be a “vintage year” for babies, for the 1967 baby stood “the best chance in history of being truly wanted” and the “best chance in history to grow up healthier and brighter and to get a better education than his forebears.” He’d be healthier—and smarter—thanks in large part to vaccines, which would enable him to “skip” mumps, rubella, and measles, with their attendant potential to “take the edge off a child’s intelligence.” American children might be fewer in number as well as costly, Beck wrote, but they’d be both deeply desired and ultimately well worth the tremendous investment. This attitude is indicative of the soaring emotional value that children accrued in the last half of the twentieth century. In the 1960s, vaccination advocates appealed directly to the parent of the highly valued child, by emphasizing the importance of vaccinating against diseases that seemed rare or mild, or whose complications seemed even rarer still. Noted one CDC scientist, who extolled the importance of vaccination against such diseases as diphtheria and whooping cough even as they became increasingly rare: “The disease incidence may be one in a thousand, but if that one is your child, the incidence is a hundred percent.”

Discourse concerning the “wantedness” of individual children in the post-Baby Boom era reflected a predominantly white middle-class conceptualization of children. As middle-class birth rates continued to fall, reaching a nadir in 1978, vaccines kept company with other commodities—a suburban home, quality schooling, a good college—that shaped the truly wanted child’s middle-class upbringing. From the late 1960s through the 1970s, vaccination in general was increasingly represented as both a modern comfort and a convenience of contemporary living. This portrayal dovetailed with the frequent depiction of the mild infections, and mumps in particular, as “nuisances” Americans no longer needed to “tolerate.” No longer did Americans of any age have to suffer the “variety of spots and lumps and whoops” that once plagued American childhood, noted one reporter. Even CDC publications commented on “the luxury and ease of health provided by artificial antigens” of the modern age.

And even though mumps, for one, was not a serious disease, remarked one magazine writer, the vaccination was there “for those who want to be spared even the slight discomfort of a case.” Mumps vaccination in fact epitomized the realization of ease of modern living through vaccination. Because it kept kids home from school and parents home from work, “it is inconvenient, to say the least, to have mumps,” noted a Massachusetts health official. “Why should we tolerate it any longer?” Merck aimed to capitalize on this view with ads it ran in the seventies: “To help avoid the discomfort, the inconvenience—and the possibility of complications: Mumpsvax,” read the ad copy. Vaccines against infections such as mumps might not be perceived as absolutely necessary, but the physical and material comfort they provided should not be undervalued.

To read more about Vaccine Nation, click here.

29. Free e-book for February: Floating Gold

9780226430362

Our free e-book for February is Christopher Kemp’s idiosyncratic exegesis on the backstory of whale poop, Floating Gold: A Natural (and Unnatural) History of Ambergris.

***

“Preternaturally hardened whale dung” is not the first image that comes to mind when we think of perfume, otherwise a symbol of glamour and allure. But the key ingredient that makes the sophisticated scent linger on the skin is precisely this bizarre digestive by-product—ambergris. Despite being one of the world’s most expensive substances (its value is nearly that of gold and has at times in history been triple it), ambergris is also one of the world’s least known. But with this unusual and highly alluring book, Christopher Kemp promises to change that by uncovering the unique history of ambergris.

A rare secretion produced only by sperm whales, which have a fondness for squid but an inability to digest their beaks, ambergris is expelled at sea and floats on ocean currents for years, slowly transforming, before it sometimes washes ashore looking like a nondescript waxy pebble. It can appear almost anywhere but is found so rarely, it might as well appear nowhere. Kemp’s journey begins with an encounter on a New Zealand beach with a giant lump of faux ambergris—determined after much excitement to be nothing more exotic than lard—that inspires a comprehensive quest to seek out ambergris and its story. He takes us from the wild, rocky New Zealand coastline to Stewart Island, a remote, windswept island in the southern seas, to Boston and Cape Cod, and back again. Along the way, he tracks down the secretive collectors and traders who populate the clandestine modern-day ambergris trade.

Floating Gold is an entertaining and lively history that not only covers these precious gray lumps and those who covet them, but also presents a highly informative account of the natural history of whales, squid, ocean ecology, and even a history of the perfume industry. Kemp’s obsessive curiosity is infectious, and eager readers will feel as though they have stumbled upon a precious bounty of this intriguing substance.

Download your free copy of Floating Gold here.

30. Sandra M. Gustafson on the State of the Union (2015)

President Obama Delivers State Of The Union Address

As in past years, we are fortunate to have scholar Sandra M. Gustafson contribute a post following Barack Obama’s annual State of the Union address, considering the stakes for Obama’s rhetorical position in light of recent events in Ferguson, Missouri, and New York City (while pointing toward their more deeply embedded and disturbing legacies, respectively). Read Gustafson’s 2015 post in full after the jump below.

***

Lives that Matter: Reflections on the 2015 State of the Union Address

by Sandra M. Gustafson

In his sixth State of the Union address, President Barack Obama summarized the major achievements of his administration to date – bringing the American economy back from the Great Recession, passing and implementing the Affordable Care Act, advancing civil rights, and winding down wars in Iraq and Afghanistan, while shifting the emphasis of US foreign policy toward diplomacy and multilateralism – and presented a framework for new initiatives that he called “middle class economics,” including affordable child care, a higher minimum wage, and free community college. Commentators compared the president’s emphasis on the successes of his six years in office to an athlete taking a victory lap. Some considered that tone odd in light of Republican midterm victories, while others speculated about his aspirations to shape the 2016 presidential election. More and more, the president’s rhetoric and public actions inform an effort to shape his legacy, both in terms of the direction of his party and with regard to his historical reputation. The 2015 State of the Union address was a prime example of the narrative emerging from the White House.

The announcement earlier on the day of the address that the president will visit Selma, Alabama, to commemorate the fiftieth anniversary of Bloody Sunday and the movement to pass the Voting Rights Act was just one of many examples of how he has presented that legacy over the years: as an extension of the work of Martin Luther King, Jr. Community organizing, nonviolent protest, and political engagement are the central components of the route to social change that the president offered in The Audacity of Hope, his 2006 campaign autobiography. The need to nurture a commitment to progressive change anchored in an expanded electorate and an improved political system has been a regular theme of his time in office.

In the extended peroration that concluded this State of the Union address, the president alluded to his discussion of deliberative democracy in The Audacity of Hope. He called for “a better politics,” which he described as one where “we appeal to each other’s basic decency instead of our basest fears,” “where we debate without demonizing each other; where we talk issues and values, and principles and facts,” and “where we spend less time drowning in dark money for ads that pull us into the gutter, and spend more time lifting young people up with a sense of purpose and possibility.” He also returned to his 2004 speech to the Democratic National Convention in Boston, quoting a now famous passage, “there wasn’t a liberal America or a conservative America; a black America or a white America—but a United States of America.”

The president’s biracial background and his preference for “both/and” ways of framing conflicts have put him at odds with critics such as Cornel West and Tavis Smiley, who have faulted him for not paying sufficient attention to the specific problems of black America. The approach that Obama took in his address to the police killings of unarmed black men in Ferguson, Missouri, and New York City did not satisfy activists in the Black Lives Matter coalition, which issued a rebuttal to his address in the form of a State of the Black Union message. To the president’s claim that “The shadow of crisis has passed, and the State of the Union is strong,” the activists responded emphatically, offering a direct rebuttal in the subtitle of their manifesto: “The Shadow of Crisis has NOT Passed.” Rejecting his assertions of economic growth and social progress, they assembled a list of counterclaims.

The president came closest to engaging the concerns of the activists when he addressed the issue of violence and policing. “We may have different takes on the events of Ferguson and New York,” he noted, juxtaposing micronarratives of “a father who fears his son can’t walk home without being harassed” and “the wife who won’t rest until the police officer she married walks through the front door at the end of his shift.” By focusing on the concerns of a father and a wife, rather than the young man and the police officer at risk, he expanded the possibilities for identification in a manner that echoes his emphasis on family. The “State of the Black Union” extends the notion of difference in an alternative direction and responds with a macronarrative couched in terms of structural violence: “Our schools are designed to funnel our children into prisons. Our police departments have declared war against our community. Black people are exploited, caged, and killed to profit both the state and big business. This is a true State of Emergency. There is no place for apathy in this crisis. The US government has consistently violated the inalienable rights our humanity affords.”

To the president’s language of the nation as a family, and to his statement that “I want our actions to tell every child in every neighborhood, your life matters, and we are committed to improving your life chances[,] as committed as we are to working on behalf of our own kids,” the manifesto responds by rejecting his image of national solidarity and his generalization of the “black lives matter” slogan. Instead it offers a ringing indictment: “This corrupt democracy was built on Indigenous genocide and chattel slavery. And continues to thrive on the brutal exploitation of people of color. We recognize that not even a Black President will pronounce our truths. We must continue the task of making America uncomfortable about institutional racism. Together, we will re-imagine what is possible and build a system that is designed for Blackness to thrive.”  After presenting a list of demands and declaring 2015 “the year of resistance,” the manifesto concludes with a nod to Obama’s 2008 speech on race, “A More Perfect Union”: “We the People, committed to the declaration that Black lives matter, will fight to end the structural oppression that prevents so many from realizing their dreams. We cannot, and will not stop until America recognizes the value of Black life.”

This call-and-response between the first African American president and a coalition of activists has two registers.  One register involves the relationship between part and whole (e pluribus unum). President Obama responds to demands that he devote more attention to the challenges facing Black America by emphasizing that he is the president of the entire nation. What is at stake, he suggests, is the ability of an African American to represent a heterogeneous society.

The other register of the exchange exemplifies a persistent tension over the place of radicalism in relation to the institutions of democracy in the United States.  The Black Lives Matter manifesto draws on critiques of American democracy in Black Nationalist, Black radical, and postcolonial thought. As I discuss in Imagining Deliberative Democracy in the Early American Republic, these critiques have roots reaching back before the Civil War, to abolitionist leaders such as David Walker and Maria Stewart, and even earlier to the Revolutionary War veteran and minister Lemuel Haynes. The recently released film Selma, which portrays the activism leading to the passage of the 1965 Voting Rights Act, highlights the tactics of Dr. King and his associates as they pressure President Johnson to take up the matter of voting. The film characterizes the radical politics of Malcolm X and the threat of violence as a means to enhance the appeal of King’s nonviolent approach, an argument that Malcolm himself made. It then includes a brief scene in which Malcolm meets with Coretta Scott King in a tentative rapprochement that occurred shortly before his assassination. This tripartite structure of the elected official, the moderate or nonviolent activist, and the radical activist willing to embrace violence has become a familiar paradigm of progressive social change.

Aspects of this paradigm inform Darryl Pinckney’s “In Ferguson.” Reporting on the violence that followed the grand jury’s failure to indict Officer Darren Wilson for Michael Brown’s killing, Pinckney quotes the Reverend Osagyefo Sekou, one of the leaders of the Don’t Shoot coalition, on the limits of electoral politics. Voting is “an insider strategy,” Sekou says. “If it’s only the ballot box, then we’re finished.” Pinckney also cites Hazel Erby, the only black member of the seven-member county council of Ferguson, who explained the overwhelmingly white police force as a result of low voter turnout. Pinckney summarizes: “The city manager of Ferguson and its city council appoint the chief of police, and therefore voting is critical, but the complicated structure of municipal government is one reason many people have been uninterested in local politics.” This type of local narrative has played a very minor role in the coverage. It occupies a register between President Obama’s micronarratives focused on individuals and families, on the one hand, and the structural violence macronarrative of the Black Lives Matter manifesto on the other. This middle register is where specific local situations are addressed and grassroots change happens. It can also provide insight into broad structural problems that might otherwise be invisible.

The value of this middle register of the local narrative emerges in the light that Rachel Aviv shines on police violence in an exposé of the Albuquerque Police Department. In “Your Son is Deceased,” Aviv focuses on the ordeal of the middle-class Torres family when Christopher Torres, a young man suffering from schizophrenia, is shot and killed by police in the backyard of the family home. Christopher’s parents, a lawyer and the director of human resources for the county, are refused information and kept from the scene of their son’s killing for hours. They learn what happened to Christopher only through news reports the following day. The parallels between the Torres and Brown cases are striking, as are the differences. Though the confrontation with the police that led to Torres’s death happened just outside his home, and though his parents knew and worked with city officials including the mayor, his death and the official response to it share haunting similarities with Brown’s. Aviv does not ignore the issue of race and ethnicity, mentioning the sometimes sharp conflicts in this borderlands region between Latino/as, Native Americans, and whites. But in presenting her narrative, she highlights the local factors that foster the corruption that she finds to be endemic in the Albuquerque Police Department; she also foregrounds mental illness as a decisive element in a number of police killings – one that crosses racial and economic boundaries.

There is a scene in Selma in which Dr. King invites his colleagues to explore the dimensions of the voter suppression problem. They begin listing the contributing factors—the literacy tests, the poll tax—and then one of the organizers mentions laws requiring that a sponsor who is a voter must vouch for someone who wishes to register. The sponsor must know the would-be voter and be able to testify to her or his character. In rural areas of the South, there might not be a registered black voter for a hundred miles, and so many potential voters could not find an acquaintance to sponsor them. The organizers agree this should be their first target, since without a sponsor, a potential voter cannot even reach the downstream hurdles of the literacy test and the poll tax. This practice of requiring a sponsor was specifically forbidden in the Voting Rights Act. At present, there are attempts to revive a version of the voucher test.

*

Selma as a whole, and this scene in particular, exemplifies many of the central features of democratic self-governance that Danielle Allen describes in Our Declaration: A Reading of the Declaration of Independence in Defense of Equality. Allen, a classicist and political theorist at the Institute for Advanced Study in Princeton, develops what she calls a “slow reading” of the Declaration of Independence in order to draw out the meaning of equality, which she relates to political processes focused on democratic deliberation and writing. From the language of the Declaration, Allen draws five interconnected facets of the ideal of equality. Equality, she explains, involves freedom from domination, for both states and individuals. It also involves “recognizing and enabling the general human capacity for political judgment” coupled with “access to the tool of government.” She finds equality to be produced through the Aristotelian “potluck method,” whereby individuals contribute their special forms of knowledge to foster social good, and through reciprocity or mutual responsiveness, which contributes to equality of agency. And she defines equality as “co-creation, where many people participate equally in creating a world together.”[i]

Selma illustrates all of these features of equality at work in the Civil Rights Movement, and the discussion of how to prioritize different aspects of voter suppression is a compelling dramatization of the “potluck method.” Following Allen, what is called for now is the sharing of special knowledge among individuals and communities affected by violent policing, including representatives of the police.  The December killings of New York City police officers Wenjian Liu and Rafael Ramos further heightened the polarization between police and protestors. President Obama offered one strategy for defusing that polarization in his State of the Union address when he presented scenarios designed to evoke reciprocity and mutual responsiveness.  Christopher Torres’s killing introduces an additional set of issues about the treatment of people with mental illness that complicates the image of a white supremacist state dominating black bodies—as does the fact that neither Liu nor Ramos was white.

What is needed now is a forum to produce and publicize a middle register of knowledge that addresses both local circumstances, such as the overly complicated government structure in Ferguson or the corruption in the Albuquerque Police Department, and more systemic problems such as the legacy of racism, a weak system of mental health care, and ready access to guns. Such a forum would exemplify the potluck method and embody the ideals of deliberative democracy as President Obama described them in The Audacity of Hope. Noting the diffuse operations of power in the government of the United States, he emphasized the importance of building a deliberative democracy where, “all citizens are required to engage in a process of testing their ideas against an external reality, persuading others of their point of view, and building shifting alliances of consent.” The present focus on police violence offers an opportunity to engage in such a democratic deliberation. The issues are emotional, and the stakes are high. But without the social sharing that Aristotle compared to a potluck meal, we will all remain hungry for solutions.

[i] In “Equality as Singularity:  Rethinking Literature and Democracy,” I relate Allen’s treatment of equality to the approach developed by French theorist Pierre Rosanvallon and consider both in relation to literature. The essay appears in a forthcoming special issue of New Literary History devoted to political theory.

*

Sandra M. Gustafson is professor of English and American studies at the University of Notre Dame. She is writing a book on conflict and democracy in classic American fiction with funding from the National Endowment for the Humanities.

To read more about Imagining Deliberative Democracy in the Early American Republic, click here.

31. Excerpt: How Many is Too Many?

9780226190655

Excerpted from

How Many is Too Many?: The Progressive Argument for Reducing Immigration into the United States 

by Philip Cafaro

***

How many immigrants should we allow into the United States annually, and who gets to come?

The question is easy to state but hard to answer, for thoughtful individuals and for our nation as a whole. It is a complex question, touching on issues of race and class, morals and money, power and political allegiance. It is an important question, since our answer will help determine what kind of country our children and grandchildren inherit. It is a contentious question: answer it wrongly and you may hear some choice personal epithets directed your way, depending on who you are talking to. It is also an endlessly recurring question, since conditions will change, and an immigration policy that made sense in one era may no longer work in another. Any answer we give must be open to revision.

This book explores the immigration question in light of current realities and defends one provisional answer to it. By exploring the question from a variety of angles and making my own political beliefs explicit, I hope that it will help readers come to their own well-informed conclusions. Our answers may differ, but as fellow citizens we need to keep talking to one another and try to come up with immigration policies that further the common good.

Why are immigration debates frequently so angry? People on one side often seem to assume it is just because people on the other are stupid, or immoral. I disagree. Immigration is contentious because vital interests are at stake and no one set of policies can fully accommodate all of them. Consider two stories from among the hundreds I’ve heard while researching this book.

* * *

It is lunchtime on a sunny October day and I’m talking to Javier, an electrician’s assistant, at a home construction site in Longmont, Colorado, near Denver. He is short and solidly built; his words are soft-spoken but clear. Although he apologizes for his English, it is quite good. At any rate much better than my Spanish.

Javier studied to be an electrician in Mexico, but could not find work there after school. “You have to pay to work,” he explains: pay corrupt officials up to two years’ wages up front just to start a job. “Too much corruption,” he says, a refrain I find repeated often by Mexican immigrants. They feel that a poor man cannot get ahead there, can hardly get started.

So in 1989 Javier came to the United States, undocumented, working various jobs in food preparation and construction. He has lived in Colorado for nine years and now has a wife (also here illegally) and two girls, ages seven and three. “I like USA, you have a better life here,” he says. Of course he misses his family back in Mexico. But to his father’s entreaties to come home, he explains that he needs to consider his own family now. Javier told me that he’s not looking to get rich, he just wants a decent life for himself and his girls. Who could blame him?

Ironically one of the things Javier likes most about the United States is that we have rules that are fairly enforced. Unlike in Mexico, a poor man does not live at the whim of corrupt officials. When I suggest that Mexico might need more people like him to stay and fight “corruption,” he just laughs. “No, go to jail,” he says, or worse. Like the dozens of other Mexican and Central American immigrants I have interviewed for this book, Javier does not seem to think that such corruption could ever change in the land of his birth.

Do immigrants take jobs away from Americans? I ask. “American people no want to work in the fields,” he responds, or as dishwashers in restaurants. Still, he continues, “the problem is cheap labor.” Too many immigrants coming into construction lowers wages for everyone—including other immigrants like himself.

“The American people say, all Mexicans the same,” Javier says. He does not want to be lumped together with “all Mexicans,” or labeled a problem, but judged for who he is as an individual. “I don’t like it when my people abandon cars, or steal.” If immigrants commit crimes, he thinks they should go to jail, or be deported. But “that no me.” While many immigrants work under the table for cash, he is proud of the fact that he pays his taxes. Proud, too, that he gives a good day’s work for his daily pay (a fact confirmed by his coworkers).

Javier’s boss, Andy, thinks that immigration levels are too high and that too many people flout the law and work illegally. He was disappointed, he says, to find out several years ago that Javier was in the country illegally. Still he likes and respects Javier and worries about his family. He is trying to help him get legal residency.

With the government showing new initiative in immigration enforcement—including a well-publicized raid at a nearby meat-packing plant that caught hundreds of illegal workers—there is a lot of worry among undocumented immigrants. “Everyone scared now,” Javier says. He and his wife used to go to restaurants or stores without a second thought; now they are sometimes afraid to go out. “It’s hard,” he says. But: “I understand. If the people say, ‘All the people here, go back to Mexico,’ I understand.”

Javier’s answer to one of my standard questions—“How might changes in immigration policy affect you?”—is obvious. Tighter enforcement could break up his family and destroy the life he has created here in America. An amnesty would give him a chance to regularize his life. “Sometimes,” he says, “I dream in my heart, ‘If you no want to give me paper for residence, or whatever, just give me permit for work.’ ”

* * *

It’s a few months later and I’m back in Longmont, eating a 6:30 breakfast at a café out by the Interstate with Tom Kenney. Fit and alert, Tom looks to be in his mid-forties. Born and raised in Denver, he has been spraying custom finishes on drywall for twenty-five years and has had his own company since 1989. “At one point we had twelve people running three trucks,” he says. Now his business is just him and his wife. “Things have changed,” he says.

Although it has cooled off considerably, residential and commercial construction was booming when I interviewed Tom. The main “thing that’s changed” is the number of immigrants in construction. When Tom got into it twenty-five years ago, construction used almost all native-born workers. Today estimates of the number of immigrant workers in northern Colorado range from 50% to 70% of the total construction workforce. Some trades, like pouring concrete and framing, use immigrant labor almost exclusively. Come in with an “all-white” crew of framers, another small contractor tells me, and people do a double-take.

Tom is an independent contractor, bidding on individual jobs. But, he says, “guys are coming in with bids that are impossible.” After all his time in the business, “no way they can be as efficient in time and materials as me.” The difference has to be in the cost of labor. “They’re not paying the taxes and insurance that I am,” he says. Insurance, workmen’s compensation, and taxes add about 40% to the cost of legally employed workers. When you add the lower wages that immigrants are often willing to take, there is plenty of opportunity for competing contractors to underbid Tom and still make a tidy profit. He no longer bids on the big new construction projects, and jobs in individual, custom-built houses are becoming harder to find.

“I’ve gone in to spray a house and there’s a guy sleeping in the bathtub, with a microwave set up in the kitchen. I’m thinking, ‘You moved into this house for two weeks to hang and paint it, you’re gonna get cash from somebody, and he’s gonna pick you up and drive you to the next one.’ ” He seems more upset at the contractor than at the undocumented worker who labors for him.

In this way, some trades in construction are turning into the equivalent of migrant labor in agriculture. Workers do not have insurance or workmen’s compensation, so if they are hurt or worn out on the job, they are simply discarded and replaced. Workers are used up, while the builders and contractors higher up the food chain keep more of the profits for themselves. “The quality of life [for construction workers] has changed drastically,” says Tom. “I don’t want to live like that. I want to go home and live with my family.”

Do immigrants perform jobs Americans don’t want to do? I ask. The answer is no. “My job is undesirable,” Tom replies. “It’s dirty, it’s messy, it’s dusty. I learned right away that because of that, the opportunity is available to make money in it. That job has served me well”—at least up until recently. He now travels as far away as Wyoming and southern Colorado to find work. “We’re all fighting for scraps right now.”

Over the years, Tom has built a reputation for quality work and efficient and prompt service, as I confirmed in interviews with others in the business. Until recently, that was enough to secure a good living. Now, though, like a friend of his who recently folded his small landscaping company (“I just can’t bid ’em low enough”), Tom is thinking of leaving the business. He is also struggling to find a way to keep up the mortgage payments on his house.

He does not blame immigrants, though. “If you were born in Mexico, and you had to fight for food or clothing, you would do the same thing,” Tom tells me. “You would come here.”

* * *

Any immigration policy will have winners and losers. So claims Harvard economist George Borjas, a leading authority on the economic impacts of immigration. My interviews with Javier Morales and Tom Kenney suggest why Borjas is right.

If we enforce our immigration laws, then good people like Javier and his family will have their lives turned upside down. If we limit the numbers of immigrants, then good people in Mexico (and Guatemala, and Vietnam, and the Philippines …) will have to forgo opportunities to live better lives in the United States.

On the other hand, if we fail to enforce our immigration laws or repeatedly grant amnesties to people like Javier who are in the country illegally, then we forfeit the ability to set limits to immigration. And if immigration levels remain high, then hard-working men and women like Tom and his wife and children will probably continue to see their economic fortunes decline. Economic inequality will continue to increase in America, as it has for the past four decades.

In the abstract neither of these options is appealing. When you talk to the people most directly affected by our immigration policies, the dilemma becomes even more acute. But as we will see further on when we explore the economics of immigration in greater detail, these appear to be the options we have.

Recognizing trade-offs—economic, environmental, social—is indeed the beginning of wisdom on the topic of immigration. We should not exaggerate such conflicts, or imagine conflicts where none exist, but neither can we ignore them. Here are some other trade-offs that immigration decisions may force us to confront:

  • Cheaper prices for new houses vs. good wages for construction workers.
  • Accommodating more people in the United States vs. preserving wildlife habitat and vital resources.
  • Increasing ethnic and racial diversity in America vs. enhancing social solidarity among our citizens.
  • More opportunities for Latin Americans to work in the United States vs. greater pressure on Latin American elites to share wealth and opportunities with their fellow citizens.

The best approach to immigration will make such trade-offs explicit, minimize them where possible, and choose fairly between them when necessary.

Since any immigration policy will have winners and losers, at any particular time there probably will be reasonable arguments for changing the mix of immigrants we allow in, or for increasing or decreasing overall immigration, with good people on all sides of these issues. Whatever your current beliefs, by the time you finish this book you should have a much better understanding of the complex trade-offs involved in setting immigration policy. This may cause you to change your views about immigration. It may throw your current views into doubt, making it harder to choose a position on how many immigrants to let into the country each year; or what to do about illegal immigrants; or whether we should emphasize country of origin, educational level, family reunification, or asylum and refugee claims, in choosing whom to let in. In the end, understanding trade-offs ensures that whatever policies we wind up advocating for are more consciously chosen, rationally defensible, and honest. For such a contentious issue, where debate often generates more heat than light, that might have to suffice.

* * *

Perhaps a few words about my own political orientation will help clarify the argument and goals of this book. I’m a political progressive. I favor a relatively equal distribution of wealth across society, economic security for workers and their families, strong, well-enforced environmental protection laws, and an end to racial discrimination in the United States. I want to maximize the political power of common citizens and limit the influence of large corporations. Among my political heroes are the three Roosevelts (Teddy, Franklin, and Eleanor), Rachel Carson, and Martin Luther King Jr.

I also want to reduce immigration into the United States. If this combination seems odd to you, you are not alone. Friends, political allies, even my mother the social worker shake their heads or worse when I bring up the subject. This book aims to show that this combination of political progressivism and reduced immigration is not odd at all. In fact, it makes more sense than liberals’ typical embrace of mass immigration: an embrace shared by many conservatives, from George W. Bush and Orrin Hatch to the editorial board of the Wall Street Journal and the US Chamber of Commerce.

In what follows I detail how current immigration levels—the highest in American history—undermine attempts to achieve progressive economic, environmental, and social goals. I have tried not to oversimplify these complex issues, or mislead readers by cherry-picking facts to support pre-established conclusions. I have worked hard to present the experts’ views on how immigration affects US population growth, poorer workers’ wages, urban sprawl, and so forth. Where the facts are unclear or knowledgeable observers disagree, I report that, too.

This book is divided into four main parts. Chapters 1 and 2 set the stage for us to consider how immigration relates to progressive political goals. Chapter 2, “Immigration by the Numbers,” provides a concise history of US immigration policy. It explains current policy, including who gets in under what categories of entry and how many people immigrate annually. It also discusses population projections for the next one hundred years under different immigration scenarios, showing how relatively small annual differences in immigration numbers quickly lead to huge differences in overall population.

Part 2 consists of chapters 3–5, which explore the economics of immigration, showing how flooded labor markets have driven down workers’ wages in construction, meatpacking, landscaping, and other economic sectors in recent decades, and increased economic inequality. I ask who wins and who loses economically under current immigration policies and consider how different groups might fare under alternative scenarios. I also consider immigration’s contribution to economic growth and argue that unlike fifty or one hundred years ago America today does not need a larger economy, with more economic activity or higher levels of consumption, but rather a fairer economy that better serves the needs of its citizens. Here as elsewhere, the immigration debate can clarify progressive political aspirations; in this case, helping us rethink our support for endless economic growth and develop a more mature understanding of our economic goals.

Part 3, chapters 6–8, focuses on the environment. Mass immigration has increased America’s population by tens of millions of people in recent decades and is set to add hundreds of millions more over the twenty-first century. According to Census Bureau data our population now stands at 320 million people, the third-largest in the world, and at current immigration rates could balloon to over 700 million by 2100. This section examines the environmental problems caused by a rapidly growing population, including urban sprawl, overcrowding, habitat loss, species extinctions, and increased greenhouse gas emissions. I chronicle the environmental community’s historic retreat from population issues over the past four decades, including the Sierra Club’s failed attempts to adopt a consensus policy on immigration, and conclude that this retreat has been a great mistake. Creating an ecologically sustainable society is not just window dressing; it is necessary to pass on a decent future to our descendants and do our part to solve dangerous global environmental problems. Because sustainability is incompatible with an endlessly growing population, Americans can no longer afford to ignore domestic population growth.

Part 4, chapters 9–11, looks for answers. The chapter “Solutions” sketches out a comprehensive proposal for immigration reform in line with progressive political goals, focused on reducing overall immigration levels. I suggest shifting enforcement efforts from border control to employer sanctions—as several European nations have done with great success—and a targeted amnesty for illegal immigrants who have lived in the United States for years and built lives here (Javier and his wife could stay, but their cousins probably would not get to come). I propose changes in US trade and aid policies that could help people create better lives where they are, alleviating some of the pressure to emigrate. In these ways, Americans can meet our global responsibilities without doing so on the backs of our own poor citizens, or sacrificing the interests of future generations. A companion chapter considers a wide range of reasonable progressive “Objections” to this more restrictive immigration policy. I try to answer these objections honestly, focusing on the trade-offs involved. A short concluding chapter reminds readers of all that is at stake in immigration policy, and affirms that we will make better policy with our minds open.

How Many Is Too Many? shows that, by thinking through immigration policy, progressives can get clearer about our own goals. These do not include having the largest possible percentage of racial and ethnic minorities, but creating a society free of racial discrimination, where diversity is appreciated. They do not include an ever-growing economy, but rather an economy that works for the good of society as a whole. They most certainly do not include a crowded, cooked, polluted, ever-more-tamed environment, but instead a healthy, spacious landscape that supports us with sufficient room for wild nature. Finally, our goals should include playing our proper role as global citizens, while still paying attention to our special responsibilities as Americans. Like it or not, those responsibilities include setting US immigration policy.

* * *

Although I hope readers across the political spectrum will find this book interesting, I have written it primarily for my fellow progressives. Frankly, we need to think harder about this issue than we have been. Just because Rush Limbaugh and his ilk want to close our borders does not necessarily mean progressives should be for opening them wider. But this is not an easy topic to discuss and I appreciate your willingness to consider it with me. In fact I come to this topic reluctantly myself. I recognize immigration’s contribution to making the United States one of the most dynamic countries in the world. I also find personal meaning in the immigrant experience.

My paternal grandfather came to America from southern Italy when he was twelve years old. As a child I listened entranced to his stories, told in an accent still heavy after half a century in his adopted country. Stories of the trip over and how excited he was to explore everything on the big ship (a sailor, taking advantage of his curiosity, convinced him to lift some newspapers lying on deck, to see what was underneath …). Stories of working as a journeyman shoe repairman in cities and towns across upstate New York and Ohio (in one store, the foreman put my grandfather and his lathe in the front window so passers-by would stop to watch how fast and well he did his work). Stories of settling down and starting his own business, marrying Nana, raising a family.

I admired Grandpa’s adventurousness in coming to a new world, his self-reliance, his pride in his work, and his willingness to work hard to create a better future for himself and his family, including, eventually, me. Stopping by the store, listening to him chat with his customers, I saw clearly that he was a respected member of his community. When he and the relatives got together for those three-hour meals that grew ever longer over stories, songs, and a little wine, I felt part of something special, something different from my everyday life and beyond the experience of many of my friends.

So this book is not a criticism of immigrants! I know that many of today’s immigrants, legal and illegal, share my grandfather’s intelligence and initiative. The lives they are creating here are good lives, rich in love and achievement. Nor is it an argument against all immigration: I favor reducing immigration into the United States, not ending it. I hope immigrants will continue to enrich America for many years to come. In fact, reducing current immigration levels would be a good way to ensure continued widespread support for immigration.

Still, Americans sometimes forget that we can have too much of a good thing. Sometimes when Nana passes the pasta, it’s time to say basta. Enough.

When to say enough, though, can be a difficult question. How do we know when immigration levels need to be scaled back? And do any of us, as the descendants of immigrants, have the right to do so?

Answering the first question, in detail, is one of the main goals of this book. Speaking generally I think we need to reduce immigration when it seriously harms our society, or its weakest members. The issues are complex, but I think any country should consider reducing immigration:

  • When immigration significantly drives down wages for its poorer citizens.
  • When immigrants are regularly used to weaken or break unions.
  • When immigration appears to increase economic inequality within a society.
  • When immigration makes the difference between stabilizing a country’s population or doubling it within the next century.
  • When immigration-driven population growth makes it impossible to rein in sprawl, decrease greenhouse gas emissions sufficiently, or take the other steps necessary to create an ecologically sustainable society.
  • When rapid demographic shifts undermine social solidarity and a sense of communal purpose.
  • When most of its citizens say that immigration should be reduced.

Of course, there may also be good reasons to continue mass immigration: reasons powerful enough to outweigh such serious social costs or the expressed wishes of a nation’s citizens. But they had better be important. And in the case at hand they had better articulate responsibilities that properly belong to the United States and its citizens—and not help our “sender” countries avoid their own problems and responsibilities. Reversing gross economic inequality and creating a sustainable society are the primary political tasks facing this generation of Americans. Progressives should think long and hard before we accept immigration policies that work against these goals.

But what about the second question: do Americans today have a right to reduce immigration? To tell Javier’s cousins, perhaps, that they cannot come to America and make better lives for themselves and their families?

Yes, we do. Not only do we have a right to limit immigration into the United States, as citizens we have a responsibility to do so if immigration levels get so high that they harm our fellow citizens, or society as a whole. Meeting this responsibility may be disagreeable, because it means telling good people that they cannot come to America to pursue their dreams. Still, it may need to be done.

Those of us who want to limit immigration are sometimes accused of selfishness: of wanting to hog resources or keep “the American way of life” for ourselves. There may be some truth in this charge, since many Americans’ interests are threatened by mass immigration. Still, some of those interests seem worth preserving. The union carpenter taking home $30 an hour who owns his own house, free and clear, or the outdoorsman walking quietly along the edge of a favorite elk meadow or trout stream, may want to continue to enjoy these good things and pass them on to their sons and daughters. What is wrong with that?

Besides, the charge of selfishness cuts both ways. Restaurant owners and software tycoons hardly deserve the Mother Teresa Self-Sacrifice Medal when they lobby Congress for more low-wage workers. The wealthy progressive patting herself on the back for her enlightened views on immigration probably hasn’t ever totaled up the many ways she and her family benefit from cheap labor.

In the end our job as citizens is to look beyond our narrow self-interest and consider the common good. Many of us oppose mass immigration not because of what it costs us as individuals, but because we worry about the economic costs to our fellow citizens, or the environmental costs to future generations. Most Americans enjoy sharing our country with foreign visitors and are happy to share economic opportunities with reasonable numbers of newcomers. We just want to make sure we preserve those good things that make this a desirable destination in the first place.

All else being equal, Americans would just as soon not interfere with other people’s decisions about where to live and work. In fact, such a laissez-faire approach to immigration lasted for much of our nation’s history. But today all else is not equal. For one thing, this is the age of jet airplanes, not tall-masted sailing ships or coal-fired steamers. It is much quicker and easier to come here than it used to be, and the pool of would-be immigrants has increased by an order of magnitude since my grandfather’s day. (In 2006, there were 6 million applications for the 50,000 green cards available under that year’s “diversity lottery.”) For another, we do not have an abundance of unclaimed land for farmers to homestead, or new factories opening up to provide work for masses of unskilled laborers. Unemployment is high and projected to remain high for the foreseeable future. For a third, we recognize new imperatives to live sustainably and do our part to meet global ecological challenges. Scientists are warning that we run grave risks should we fail to do so.

Americans today overwhelmingly support immigration restrictions. We disagree about the optimal amount of immigration, but almost everyone agrees that setting some limits is necessary. Of course, our immigration policies should be fair to all concerned. Javier Morales came to America illegally, but for most of his time here our government just winked at illegal immigration. It also taxed his paychecks. After two and a half decades of hard work that has benefited our country, I think we owe Javier citizenship. But we also owe Tom Kenney something. Perhaps the opportunity to prosper, if he is willing to work hard. Surely, at a minimum, government policies that do not undermine his own attempts to prosper.

* * *

The progressive vision is alive and well in the United States today. Most Americans want a clean environment with flourishing wildlife, a fair economy that serves all its citizens, and a diverse society that is free from racism. Still, it will take a lot of hard work to make this vision a reality and success is not guaranteed. Progressives cannot shackle our hopes to an outmoded immigration policy that thwarts us at every turn.

Given the difficulties involved in getting 320 million Americans to curb consumption and waste, there is little reason to think we will be able to achieve ecological sustainability while doubling or tripling that number. Mass immigration ensures that our population will continue growing at a rapid rate and that environmentalists will always be playing catch up. Fifty or one hundred years from now we will still be arguing that we should destroy this area rather than that one, or that we can make the destruction a little more aesthetically appealing—instead of ending the destruction. We will still be trying to slow the growth of air pollution, water use, or carbon emissions—rather than cutting them back.

But the US population would quickly stabilize without mass immigration. We can stop population growth—without coercion or intrusive domestic population policies—simply by returning to pre-1965 immigration levels.

Imagine an environmentalism that was not always looking to meet the next crisis and that could instead look forward to real triumphs. What if we achieved significant energy efficiency gains and were able to enjoy those gains with less pollution, less industrial development on public lands, and an end to oil wars, because those efficiency gains were not swallowed up by growing populations?

Imagine if the push to develop new lands largely ended and habitat for other species increased year by year, with a culture of conservation developed around restoring and protecting that habitat. Imagine if our demand for fresh water leveled off and instead of fighting new dam projects we could actually leave more water in our rivers.

And what of the American worker? It is hard to see how progressives will succeed in reversing current powerful trends toward ever greater economic inequality in a context of continued mass immigration, particularly with high numbers of relatively unskilled and poorly educated immigrants. Flooded labor markets will harm poorer workers directly, by driving down wages and driving up unemployment. Mass immigration will also continue to harm workers indirectly by making it harder for them to organize and challenge employers, by reducing the percentage of poor workers who are citizens and thus able to vote for politicians who favor the poor, and by limiting sympathy between the haves and have-nots, since with mass immigration they are more likely to belong to different ethnic groups.

But it does not have to be this way. We can tighten labor markets and get them working for working people in this country. Combined with other good progressive egalitarian measures—universal health care; a living minimum wage; a more progressive tax structure—we might even reverse current trends and create a more economically just country.

Imagine meatpacking plants and carpet-cleaning companies competing with one another for scarce workers, bidding up their wages. Imagine unions able to strike those companies without having to worry about scabs taking their members’ jobs. Imagine college graduates sifting through numerous job offers, like my father and his friends did fifty years ago during that era’s pause in mass immigration, instead of having to wait tables and just hope for something better.

Imagine poor children of color in our inner cities, no longer looked on as a problem to be warehoused in failing schools, or jails, but instead seen as an indispensable resource: the solution to labor shortages in restaurants and software companies.

Well, why not? Why are we progressives always playing catch up? The right immigration policies could help lead us toward a more just, egalitarian, and sustainable future. They could help liberals achieve our immediate goals and drive the long-term political agenda. But we will not win these battles without an inspiring vision for a better society, or with an immigration policy that makes that vision impossible to achieve.

To read more about How Many Is Too Many?, click here.

Add a Comment
32. Everything’s coming up Howie

 

 


Adam Gopnik, writing in the New Yorker, recently profiled eminent American sociologist Howard S. Becker (Howie, please: “Only my mother ever called me Howard”), one of the biggest names in the field for over half a century, yet still, as with so many purveyors of haute critique, better known in France. Becker is no wilting lily on these shores, however—since the publication of his pathbreaking Outsiders: Studies in the Sociology of Deviance (1963), he’s been presiding as grand doyen over methodological confrontations with the particularly slippery slopes of human existence, including our very notion of “deviance.” All this, a half dozen or so honorary degrees, a lifetime achievement award, a smattering of our most prestigious fellowships, and the 86-year-old Becker is still going strong, with his most recent book published only this past year.

From the New Yorker profile:

This summer, Becker published a summing up of his life’s method and beliefs, called “What About Mozart? What About Murder?” (The title refers to the two caveats or complaints most often directed against his kind of sociology’s equable “relativism”: how can you study music as a mere social artifact—what about Mozart? How can you consider criminal justice a mutable convention—what about Murder?) The book is both a jocular personal testament of faith and a window into Becker’s beliefs. His accomplishment is hard to summarize in a sentence or catchphrase, since he’s resolutely anti-theoretical and suspicious of “models” that are too neat. He wants a sociology that observes the way people act around each other as they really do, without expectations about how they ought to.

The subjects of that sociology have included jazz musicians, marijuana users, art world enthusiasts, social science researchers, medical students, musicologists, murderers, and “youth,” to name a few.

9780226166490

As mentioned earlier, his latest book What About Mozart? What About Murder?  considers the pull of two methodologies: one, more pragmatic, which addresses its subjects with caution and rigor on a case-by-case basis, and the other, which employs a more speculative approach (guesswork) by asking “killer questions” that force us to reposition our stance on hypothetical situations, such as whether or not, indeed, murder is always already (*Becker might in fact kill me for a foray into that particular theoretical shorthand*) “deviant.”

Via Gopnik:

His work is required reading in many French universities, even though it seems to be a model of American pragmatism, preferring narrow-seeming “How?” and “Who, exactly?” questions to the deeper “Why?” and “What?” supposedly favored by French theory. That may be exactly its appeal, though: for the French, Becker seems to combine three highly American elements—jazz, Chicago, and the exotic beauties of empiricism.

On the heels of his appearance in the New Yorker, Becker participated in a recent, brief sit-down with the New York Times, where he relayed thoughts on Charlie Hebdo and the French media, Nate Silver, and jazz trios, among other concerns.

From that New York Times Q & A:

I work out in a gym with a trainer twice a week. Oh, it’s pure torture, but I’m 86 so you’ve got to do something to stay in shape. I do a mixture of calisthenics, Pilates and yoga—a lot of work on balance. My trainer has this idea that every year on my birthday I should do the same number of push-ups as I have years old. We work up to it over the year. I was born on the anniversary of the great San Francisco earthquake and fire in 1906. It seems auspicious but I don’t know why.

Auspicious indeed.

To read more by Becker, click here.

Add a Comment
33. A Show-Trial: An excerpt from Bengt Jangfeldt’s Mayakovsky

9780226056975

“A Show-Trial”

Excerpted from Mayakovsky: A Biography by Bengt Jangfeldt

***

Mayakovsky returned to Moscow on 17 or 18 September. The following day, Krasnoshchokov was arrested, accused of a number of different offenses. He was supposed to have lent money to his brother Yakov, head of the firm American–Russian Constructor, at too low a rate of interest, and to have arranged drink- and sex-fueled orgies at the Hotel Europe in Petrograd, paying the Gypsy girls who entertained the company with pure gold. He was also accused of having passed on his salary from the Russian–American Industrial Corporation ($200 a month) to his wife (who had returned to the United States), of having bought his mistress flowers and furs out of state funds, of renting a luxury villa, and of keeping no fewer than three horses. Lenin was now so ill that he could not have intervened on Krasnoshchokov’s behalf even if he had wanted to.

His arrest was a sensation of the first order. It was the first time that such a highly placed Communist had been accused of corruption, and the event cast a shadow over the whole party apparatus. Immediately after Krasnoshchokov’s arrest, and in order to prevent undesired interpretations of what had happened, Valerian Kuybyshev, the commissar for Workers’ and Peasants’ Inspection, let it be known that “incontrovertible facts have come to light which show Krasnoshchokov has in a criminal manner exploited the resources of the economics department [of the Industry Bank] for his own use, that he has arranged wild orgies with these funds, and that he has used bank funds to enrich his relatives, etc.” He had, it was claimed, “in a criminal manner betrayed the trust placed in him and must be sentenced to a severe punishment.”

Krasnoshchokov was, in other words, judged in advance. There was no question of any objective legal process; the intention was to set an example: “The Soviet power and the Communist Party will […] root out with an iron hand all sick manifestations of the NEP and remind those who ‘let themselves be tempted’ by the joys of capitalism that they live in a workers’ state run by a Communist party.” Krasnoshchokov’s arrest was deemed so important that Kuybyshev’s statement was printed simultaneously in the party organ Pravda and the government organ Izvestiya. Kuybyshev was a close friend of the prosecutor Nikolay Krylenko, who had led the prosecution of the Socialist Revolutionaries the previous year, and who in time would turn show trials and false charges into an art form.

When Krasnoshchokov was arrested, Lili and Osip were still in Berlin. In the letter that Mayakovsky wrote to them a few days after the arrest, the sensational news is passed over in total silence. He gives them the name of the civil servant in the Berlin legation who can give them permission to import household effects (which they had obviously bought in Berlin) into Russia; he tells them that the squirrel which lives with them is still alive and that Lyova Grinkrug is in the Crimea. The only news item of greater significance is that he has been at Lunacharsky’s to discuss Lef and is going to visit Trotsky on the same mission. But of the event which the whole of Moscow was talking about, and which affected Lili to the utmost degree—not a word.

Krasnoshchokov’s trial took place at the beginning of March 1924. Sitting in the dock, apart from his brother Yakov, were three employees of the Industry Bank. Krasnoshchokov, who was a lawyer, delivered a brilliant speech in his own defense, explaining that, as head of the bank, he had the right to fix lending rates in individual cases and that one must be flexible in order to obtain the desired result. As for the charges of immoral behavior, he maintained that his work necessitated a certain degree of official entertainment and that the “luxury villa” in the suburb of Kuntsevo was an abandoned dacha which, in addition, was his sole permanent dwelling. (It is one of the ironies of history that the house had been owned before the Revolution by the Shekhtel family and accordingly had often had Mayakovsky as a guest—see the chapter “Volodya.”) Finally, he pointed out that his private life was not within the jurisdiction of the law.

This opinion was not shared by the court, which ruled that Krasnoshchokov had lived an immoral life during a time when a Communist ought to have set a good example and not surrender to the temptations offered by the New Economic Policy. Krasnoshchokov was also guilty of having used his position to “encourage his relatives’ private business transactions” and having caused the bank to lose 10,000 gold rubles. He was sentenced to six years’ imprisonment and in addition three years’ deprivation of citizen’s rights. Moreover, he was excluded from the Communist Party. His brother was given three years’ imprisonment, while the other three coworkers received shorter sentences.

Krasnoshchokov had in fact been a very successful bank director. Between January 1923 and his arrest in September he had managed to increase the Industry Bank’s capital tenfold, partly thanks to a flexible interest policy which led to large American investments in Russia. There is a good deal of evidence that the charges against him were initiated by persons within the Finance Commissariat and the Industry Bank’s competitor, the Soviet National Bank. Shortly before his arrest Krasnoshchokov had suggested that the Industry Bank should take over all the National Bank’s industrial–financial operations. Exactly the opposite happened: after Krasnoshchokov’s verdict was announced, the Industry Bank was subordinated to the Soviet National Bank.

There is little to suggest that the accusations of orgies were true. Krasnoshchokov was not known to be a rake, and his “entertainment expenses” were hardly greater than those of other highly placed functionaries. But he had difficulties defending himself, as he maintained not one mistress but two—although he had a wife and children. The woman who figured in the trial was not, as one might have expected, Lili, but a certain Donna Gruz—Krasnoshchokov’s secretary, who six years later would become his second wife. This fact undoubtedly undermined his credibility as far as his private life was concerned.

When Lili and Elsa showed Nadezhda Lamanova’s dresses in Paris in the winter of 1924, it attracted the attention of both the French and the British press, where this photograph was published with the caption “soviet sack fashion.—Because of the lack of textiles in Soviet Russia, Mme. Lamanoff, a Moscow fashion designer, had this dress made out of sackcloth from freight bales.”

By the time the judgment was announced, Lili had been in Paris for three weeks. She was there for her own amusement and does not seem to have had any particular tasks to fulfill. But she had with her dresses by the Soviet couturier Nadezhda Lamanova which she and Elsa showed off at two soirees organized by a Paris newspaper. She would like to go to Nice, she confided in a letter home to Moscow on 23 February, but her plans were frustrated by the fact that Russian emigrants were holding a congress there. She was thinking of traveling to Spain instead, or somewhere else in France, to “bake in the sun for a week or so.” But she remained in Paris, where she and Elsa went out dancing the whole time. Their “more or less regular cavaliers” were Fernand Léger (whom Mayakovsky had got to know in Paris in 1922) and an acquaintance from London who took them everywhere with him, “from the most chic of places to the worst of dives.” “It has been nothing but partying here,” she wrote. “Elsa has instituted a notebook in which she writes down all our rendezvous ten days in advance!” As clothes are expensive in Paris too, she asks Osip and Mayakovsky to send her a little money in the event of their managing to win “some mad sum of money” at cards.

When she was writing this letter, there were still two weeks to go before Krasnoshchokov’s trial. “How is A[lexander] M[ikhailovich]?” she asked, in the middle of reporting on the fun she was having. But she did not receive a reply, or if she did, it has not been preserved. On 26 March, after a month in Paris, she took the boat to England to visit her mother, who was in poor health, but that same evening she was forced to return to Calais after being stopped at passport control in Dover—despite having a British visa issued in Moscow in June 1923. What she did not know was that after her first visit to England in October 1922 she had been declared persona non grata, something which all British passport control points “for Europe and New York” had been informed of in a secret circular of 13 February 1923.

 

 

“You can’t imagine how humiliating it was to be turned back at the British border,” she wrote to Mayakovsky: “I have all sorts of theories about it, which I’ll tell you about when I see you. Strange as it may seem, I think they didn’t let me in because of you.” She guessed right: documents from the Home Office show that it was her relationship with Mayakovsky, who wrote “extremely libellous articles” in Izvestiya, which had proved her undoing. Strangely enough, despite being refused entry to Britain, she was able to travel to London three weeks later. The British passport authorities have no record of her entry to the country. Did she come in by an illegal route?

At the same time that Lili traveled to Paris, Mayakovsky set out on a recital tour in Ukraine. Recitals were an important source of income for him. During his stay in Odessa he mentioned in a newspaper interview that he was planning to set out soon on a trip round the world, as he had been invited to give lectures and read poems in the United States. Two weeks later he was back in Moscow, and in the middle of April he went to Berlin, where Lili joined him about a week later. According to one newspaper, Mayakovsky was in the German capital “on his way to America.”

The round-the-world trip did not come off, as Mayakovsky failed to obtain the necessary visas. It was not possible to request an American visa in Moscow, as the two countries lacked diplomatic ties. Mayakovsky’s plan was therefore to try to get into the United States via a third country. Britain’s first Labour government, under Ramsay MacDonald, had scarcely recognized the Soviet Union (on 1 February 1924) before Mayakovsky requested a British visa, on 25 March. From England he planned to continue his journey to Canada and India. In a letter to Ramsay MacDonald, Britain’s chargé d’affaires in Moscow, Mr. Hodgson, asked for advice about the visa application. Mayakovsky was not known to the mission, he wrote, but was “a member of the Communist party and, I am told, is known as a Bolshevik propagandist.” Mr. Hodgson would not have needed to do this if he had known that, on 9 February, the Home Office had also issued a secret circular about Mayakovsky, “one of the principal leaders of the ‘Communist’ propaganda and agitation section of the ‘ROSTA,’” who since 1921 had been writing propaganda articles for Izvestiya and “should not be given a visa or be allowed to land in the United Kingdom” or any of its colonies. In Mayakovsky’s case the circular was sent to every British port, consulate, and passport and military checkpoint, as well as to Scotland House and the India Office. But in the very place where people really ought to have known about it, His Majesty’s diplomatic mission in Moscow, they were completely unaware of it.

While he waited for an answer from the British, Mayakovsky made a couple of appearances in Berlin, where he talked about Lef and recited his poems. Tired of waiting for a notification that never came, he traveled back to Moscow on 9 May in the company of Lili and Scotty, the Scotch terrier she had picked up in England. When he got to Moscow he found out that on 5 May London had instructed the British mission there to turn down his visa application.

VLADIMIR ILYICH

The preliminary investigation and subsequent trial of Krasnoshchokov caused a great stir, but it would certainly have got even more column inches if it had not been played out in the shadow of a significantly more important event. On 21 January 1924, Vladimir Lenin died after several years of illness.

Among the thousands of people jostling one another in the queues which snaked around in front of Trade Unions House, where the leader of the Revolution lay in state, were Mayakovsky, Lili, and Osip. Lenin’s death affected Mayakovsky deeply. “It was a terrible morning when he died,” Lili recalled. “We wept in the queue in Red Square where we were standing in the freezing cold to see him. Mayakovsky had a press card, so we were able to bypass the queue. I think he viewed the body ten times. We were all deeply shaken.”

Mayakovsky with Scotty, whom Lili bought in England. The picture was taken in the summer of 1924 at the dacha in Pushkino. Scotty loved ice cream, and, according to Rodchenko, Mayakovsky regarded “with great tenderness how Scotty ate and licked his mouth.” “He took him in his arms and I photographed them in the garden,” the photographer remembered. “I took two pictures. Volodya kept his tender smile, wholly directed at Scotty.” The photograph with Scotty is in fact one of the few where Mayakovsky can be seen smiling.

The feelings awakened by Lenin’s death were deep and genuine, and not only for his political supporters. Among those queuing were Boris Pasternak and Osip Mandelstam, who shared a far more lukewarm attitude to the Revolution and its leader. “Lenin dead in Moscow!” exclaimed Mandelstam in his coverage of the event. “How can one fail to be with Moscow in this hour! Who does not want to see that dear face, the face of Russia itself? The time? Two, three, four? How long will we stand here? No one knows. The time is past. We stand in a wonderful nocturnal forest of people. And thousands of children with us.”

Shortly after Lenin’s death Mayakovsky tackled his most ambitious project to date: a long poem about the Communist leader. He had written about him before, in connection with his fiftieth birthday in 1920 (“Vladimir Ilyich!”), and when Lenin suffered his first stroke in the winter of 1923 (“We Don’t Believe It!”), but those were shorter poems. According to Mayakovsky himself, he began pondering a poem about Lenin as early as 1923, but that may well have been a rationalization after the event. What set his pen in motion was in any case Lenin’s death in January 1924.

Mayakovsky had only a superficial knowledge of Lenin’s life and work and was forced to read up on him before he could write about him. His mentor, as on so many other occasions, was Osip, who supplied him with books and gave him a crash course in Leniniana. Mayakovsky himself had neither the time nor the patience for such projects. The poem was written during the summer and was ready by the beginning of October 1924. It was given the title “Vladimir Ilyich Lenin” and was the longest poem Mayakovsky ever wrote; at three thousand lines, it was almost twice as long as “About This.” In the autumn of 1924 he gave several poetry readings and fragments of the poem were printed in various newspapers. It came out in book form in February 1925.

The line to the Trade Unions’ House in Moscow, where Lenin was lying in state.

So the lyrical “About This” was followed by an epic poem, in accordance with the conscious or unconscious scheme that directed the rhythm of Mayakovsky’s writing. If even a propaganda poem like “To the Workers in Kursk” was dedicated to Lili, such a dedication was impossible in this case. “Vladimir Ilyich Lenin” was dedicated to the Russian Communist Party, and Mayakovsky explains why, with a subtle but unambiguous reference to “About This”:

I can write
about this,
about that,
but now
is not the time
for love–drivel.
All my
resounding power
as a poet
give to you,
attacking class.

In “Vladimir Ilyich Lenin” Lenin is portrayed as a Messiah-like figure, whose appearance on the historical scene is an inevitable consequence of the emergence of the working class. Karl Marx revealed the laws of history and, with his theories, “helped the working class to its feet.” But Marx was only a theoretician, who in the fullness of time would be replaced by someone who could turn theory into practice, that is, Lenin.

The poem is uneven, which is not surprising considering the format. From a linguistic point of view—the rhyme, the neologisms—it is undoubtedly comparable to the best of Mayakovsky’s other works, and the depiction of the sorrow and loss after Lenin’s death is no less than a magnificent requiem. But the epic, historical sections are too long and prolix. The same is true of the tributes to the Communist Party, which often rattle with empty rhetoric (which in turn can possibly be explained by the fact that Mayakovsky was never a member of the party):

I want
once more to make the majestic word
“PARTY”
shine.
One individual!
Who needs that?!
The voice of an individual
is thinner than a cheep.
Who hears it—
except perhaps his wife?

The party
is a hand with millions of fingers
clenched
into a single destroying fist.
The individual is rubbish,
the individual is zero  …
We say Lenin,
but mean
The Party.
We say
The Party,
but mean Lenin.

One of the few reviewers who paid any attention to the poem, the proletarian critic and anti-Futurist G. Lelevich, was quite right in pointing out that Mayakovsky’s “ultraindividualistic” lines in “About This” stand out as “uniquely honest” in comparison with “Vladimir Ilyich Lenin,” which “with few exceptions is rationalistic and rhetorical.” This was a “tragic fact” that Mayakovsky could only do something about by trying to “conquer himself.” The Lenin poem, wrote Lelevich, was a “flawed but meaningful and fruitful attempt to tread this path.”

Lelevich was right to claim that “About This” is a much more convincing poem than the ode to Lenin. But the “tragic” thing was not what Lelevich perceived as such, but something quite different, namely, Mayakovsky’s denial of the individual and his importance. In order to “conquer” himself, that is, the lyrical impulse within himself, he would have to take yet more steps in that direction—which he would in fact do, although it went against his innermost being.

If there is anything of lasting value in “Vladimir Ilyich Lenin,” it is not the paeans of praise to Lenin and the Communist Party—poems of homage are seldom good—but the warnings that Lenin, after his death, will be turned into an icon. The Lenin to whom Mayakovsky pays tribute was born in the Russian provinces as “a normal, simple boy” and grew up to be the “most human of all human beings.” If he had been “king-like and god-like” Mayakovsky would without a doubt have protested and taken a stance “opposed to all processions and tributes”:

I ought
to have found words
for lightning–flashing curses,
and while
I
and my yell
were trampled underfoot
I should have
hurled blasphemies
against heaven
and tossed
like bombs at the Kremlin
my: NO!

The worst thing Mayakovsky can imagine is that Lenin, like Marx, will become a “cooling plaster dotard imprisoned in marble.” This is a reference back to “The Fourth International,” in which Lenin is depicted as a petrified monument.

I am worried that
processions
and mausoleums,
celebratory statues
set in stone,
will drench
Leninist simplicity
in syrup–smooth balsam—

Mayakovsky warns, clearly blind to the fact that he himself is contributing to this development with his seventy-five-page-long poem.

The fear that Lenin would be canonized after his death was deeply felt—and well grounded. It did not take long before Gosizdat (!) began advertising busts of the leader in plaster, bronze, granite, and marble, “life-size and double life-size.” The busts were produced from an original by the sculptor Merkurov—whom Mayakovsky had apostrophized in his Kursk poem—and with the permission of the Committee for the Perpetuation of the Memory of V. I. Lenin. The target groups were civil-service departments, party organizations and trade unions, cooperatives, and the like.

After his return from Berlin in May 1924, Mayakovsky met with the Japanese author Tamisi Naito, who was visiting Moscow. Seated at the table next to Mayakovsky and Lili is Sergey Tretyakov’s wife, Olga. To the left of Naito (standing in the center) are Sergey Eisenstein and Boris Pasternak.

The Lef members’ tribute to the dead leader was of a different nature. The theory section in the first issue of Lef for 1924 was devoted to Lenin’s language, with contributions by leading Formalists such as Viktor Shklovsky, Boris Eikhenbaum, Boris Tomashevsky, and Yury Tynyanov—groundbreaking attempts to analyze political language by means of structuralist methods. Lenin was said to have “decanonized” the language, “cut down the inflated style,” and so on, all in the name of linguistic efficiency. This striving for powerful simplicity was in line with the theoretical ambitions of the Lef writers but stood in stark contrast to the canonization of Lenin which was set in train by his successors as soon as his corpse was cold.

This entire issue of Lef was in actual fact a polemic against this development—indirectly, in the essays about Lenin’s language, and in a more undisguised way in the leader article. In a direct reference to the advertisements for Lenin busts, the editorial team at Lef in their manifesto “Don’t Trade in Lenin!” sent the following exhortation to the authorities:

We insist:
Don’t make matrices out of Lenin.
Don’t print his portrait on posters, oilcloths, plates, drinking vessels, cigarette boxes.
Don’t turn Lenin into bronze.
Don’t take from him his living gait and human physiognomy, which he managed to preserve at the same time as he led history.
Lenin is still our present.
He is among the living.
We need him living, not dead.
Therefore:
Learn from Lenin, but don’t canonize him.
Don’t create a cult around a man who fought against all kinds of cults throughout his life.
Don’t peddle artifacts of this cult.
Don’t trade in Lenin.

In view of the extravagant cult of Lenin that would develop later in the Soviet Union, the text is insightful to the point of clairvoyance. But the readers of Lef were never to see it. According to the list of contents, the issue began on page 3 with the leader “Don’t Trade in Lenin!” But in the copies that were distributed, this page is missing and the pagination begins instead on page 5. The leadership of Gosizdat, which distributed Lef, had been incensed by the criticism of the advertisements for Lenin busts and had removed the leader. As if by some miracle, it has been preserved in a few complimentary copies which made it to the libraries before the censor’s axe fell.

To read more about Mayakovsky, click here.

Add a Comment
34. Daniel Albright (1945–2015)


On January 3, 2015, scholar Daniel Albright (1945–2015), the Ernest Bernbaum Professor of Literature at Harvard University—whose accolades included an NEH Fellowship, a Guggenheim Fellowship, and a Berlin Prize Fellowship at the American Academy in Berlin—passed away unexpectedly. The author of sixteen books, which ranged from literary criticism and musicology to panaesthetics and the history of modernism, Albright taught in three departments at Harvard, where he had worked for the past decade.

From an article in the Harvard Crimson:

As an undergraduate at Rice University, Albright originally declared a major in mathematics before switching to English. Upon graduating from Rice in 1967, he attended Yale, where he received his M.Phil in 1969 and his Ph.D. in 1970. Prior to his arrival at Harvard in 2003, Albright taught at the University of Virginia, the University of Munich, the University of Rochester, and the Eastman School of Music.

Once at Harvard, he taught in the English, Music, and Comparative Literature departments. English Department chair and professor W. James Simpson spoke highly of Albright’s career in Cambridge.

“Whenever Dan was in a room, the room was full of fun and amusement and delight because of his range of literary allusions and music allusions,” Simpson said. “He was constantly delighting an audience.”

Among those books he edited or authored were three published and/or distributed by the University of Chicago Press: Untwisting the Serpent: Modernism in Music, Literature, and Other Arts; Modernism and Music: An Anthology of Sources; and Evasions (Sylph Editions).

To read more about those works, click here.

To read a remembrance on Albright’s website, click here.

Add a Comment
35. Alice Kaplan on Patrick Modiano


Below follows, in full, an interview with Alice Kaplan on the career of recent Nobel Laureate Patrick Modiano. The interview was originally published online by the French newspaper Libération shortly after the Nobel announcement.

***

The American academic Alice Kaplan, author of the outstanding The Collaborator: The Trial and Execution of Robert Brasillach, and more recently, Dreaming in French, teaches Modiano at Yale University, where she chairs the Department of French. She evokes for us the particular aura of the French Nobel Laureate in the United States.

Is Patrick Modiano well known in American universities?

There have been sixteen PhD dissertations on Modiano in American universities since 1987, a significant number, given that he is both a foreigner and a contemporary novelist. Yale University Press has just published a trilogy of novels originally published by the Editions du Seuil under the title Suspended Sentences. Modiano’s attraction comes from his style, which is laconic and beautiful but also quite accessible, in English as well as in French. Then there is the particular genre he invented, inspired by detective fiction, familiar to American readers. The obstacle is obviously the number of references to specific places in Paris that are everywhere in his books—all those street names and addresses that capture so well the atmosphere of different neighborhoods, so that it’s probably necessary to have visited Paris at least once to really get him. At the same time, he knows exactly how to create an atmosphere. I always think about Edward Hopper when I read Modiano: there is a sense in his books that something horrible has happened, a crime floating in the air, a sense of someone or something missing. Modiano could write stories that take place in Brooklyn or in New Jersey. He’s invented such a specific notion of place that you can think of certain places as being “Modianesque.”

Does he have many American readers?

It is difficult to say. He is published by David R. Godine, a publisher of fine literary fiction rather than a mass-market house. My sense is that he is appreciated by the kind of reader who appreciates James Salter, for example, or by the kind of American reader who might have read Hélène Berr’s wartime diary in translation—which Modiano prefaced in the original French edition. His novel Dora Bruder has been widely read in the U.S. by intellectuals interested in the memory of the Holocaust—precisely because it is a book about forgetting. Historians find Modiano especially interesting because he offers a challenge to a certain kind of positivism with respect to memory. In the United States, people are always hunting for their identities. Genealogical search engines like ancestry.com and television shows like Finding Your Roots are phenomenally successful. And here comes a writer who understands the mixed nature of ancestry, the racial and cultural mix fundamental to American identity, and who describes the desire to understand where we come from, but also—and this is important—the impossibility of knowing everything, the confusion. I teach Dora Bruder in a class on the archives and the relationship of history to literature, because it is a book that tells us that we must also respect what we can’t know. In Modiano’s books, the person who knows the answer has just died, or else the narrator is so tired of searching that he stops before he gets to the last garage on his list. Modiano is often compared to Proust and even considered a kind of modern-day Proust, and there is something to this, except that Proust is never tired of searching for lost time! The fact that Modiano’s fiction seems to slip through our fingers is an integral part of his literary project. He helps us understand forgetting, the same way Proust helps us understand jealousy, or Stendhal, ambition.

What does research on Modiano look like?

The last thesis I directed on Modiano investigated his use of the first person, the fact that he doesn’t engage in what the French call “autofiction”—i.e., fictional autobiography—but in something much more subtle. You can’t read Modiano for the identity politics that have become so fundamental in the American academy, where we read systematically through the grid of race and gender. Modiano is always questioning those categories, by showing the error of simplistic shortcuts. You cannot, for example, categorize him as a “Jewish writer.”

How did you discover his books?

I read La Place de l’Etoile while I was working on Céline’s anti-Semitic writing. I find it astonishing that Modiano published that book (a parody of anti-Semitic prose) in 1968, well before the great wave of consciousness about the French collaboration with the Nazis that came in the wake of Robert Paxton’s groundbreaking research. Think about it: Modiano was alone, not part of any literary school, and he wrote about French anti-Semitism with incredible intuition. He was especially attuned to the anti-Semitic rhetoric around Jewish names. After La Place de l’Etoile, I devoured all of his books.

To read more by Alice Kaplan, click here.

36. Free E-Book for January: The Hunt for Nazi Spies

9780226438931

 

Our free e-book for January is Simon Kitson’s The Hunt for Nazi Spies: Fighting Espionage in Vichy France; read more about the book below:

From 1940 to 1942, French secret agents arrested more than two thousand spies working for the Germans and executed several dozen of them—all despite the Vichy government's declared collaboration with the Third Reich. A previously untold chapter in the history of World War II, this duplicitous activity is the gripping subject of The Hunt for Nazi Spies, a tautly narrated chronicle of the Vichy regime's attempts to maintain sovereignty while supporting its Nazi occupiers. Simon Kitson informs this remarkable story with findings from his investigation—the first by any historian—of thousands of Vichy documents seized in turn by the Nazis and the Soviets and returned to France only in the 1990s. His pioneering detective work uncovers a puzzling paradox: a French government that was hunting down left-wing activists and supporters of Charles de Gaulle's Free French forces was also working to undermine the influence of German spies who were pursuing the same Gaullists and resisters. In light of this apparent contradiction, Kitson does not deny that Vichy France was committed to assisting the Nazi cause, but illuminates the complex agendas that characterized the collaboration and shows how it was possible to be both anti-German and anti-Gaullist.

Combining nuanced conclusions with dramatic accounts of the lives of spies on both sides, The Hunt for Nazi Spies adds an important new dimension to our understanding of the French predicament under German occupation and the shadowy world of World War II espionage.

Download (through January 31) your free copy here.

37. Paddy Woodworth on Our Once and Future Planet

9780226907390

A little more than a year ago, we published Paddy Woodworth's Our Once and Future Planet, an ambitious, even monumental account of the past, present, and future of the ecological restoration movement, recently named one of the year's Outstanding Academic Titles by the ALA's Choice. Then, this past autumn, Paddy came to the States and spent a little over a month talking with people about the book in a variety of settings. Now that he's back in Ireland and settling in for the holidays, we asked Paddy to offer some thoughts on what it's like to hit the road promoting not just a book, but an idea.

Publishing a book is a little like casting a stone into a well. We write, as Seamus Heaney put it, “to set the darkness echoing.” And often we wait a long time for the echoes, and must count ourselves lucky if we hear any at all.

Our Once and Future Planet was published by the University of Chicago Press a year ago last October. It charts my journey into the challenging world of ecological restoration projects worldwide; it examines, and ultimately finds precious if tenuous hope in, restoration ecology's promise to reverse the globalized degradation of our environment.

The echoes from the book returned slowly at first, especially in the US and UK print media, though at home the Irish media (press, radio, TV, and online) responded very quickly and positively. Individuals I respected in the restoration field, and readers I had never met, wrote to me praising the book generously. Two of its more controversial chapters also engendered pained, and sometimes painful, personal responses from protagonists, but no real public engagement. Sales were respectable, but the book was hardly flying off the shelves.

And, for a while thereafter, it seemed that that was that – the book was out there, somewhere, but not sending back many more messages.

Then, over the summer, new and louder echoes became audible. Excellent reviews in Bioscience and Science were particularly flattering for a journalist who had learned his ecology on the hoof and on the wing, as it were, while researching Planet.

But I had dreamed of reaching a public far beyond academia. The book’s narrative style, with a focus on the human dramas behind restoration projects, aims at engaging ordinary citizens with the urgent need to restore our landscapes, wherever we live. An opportunity to find this audience came, ironically enough, with an invitation from an academic to teach a seminar based entirely around my text, at DePaul University, Chicago.

Holding the attention of undergraduates is an acid test for any writer. I had some doubts, for sure: did my work really offer strong enough material to engage fifteen bright kids, with lots of other study and leisure options, over five three-hour sessions?

It was very heartening to find that it did: the seminar, co-taught with the invaluable and congenial support of my host, Dr Liam Heneghan, got gratifying reviews from students on Facebook: “by far one of the most stimulating courses I’ve had to date. . . . This is just what I needed,” one student wrote.

It was exciting to find that key restoration questions, like “who decides when and where to restore, and what target to restore to?” and issues like the “novel ecosystem” concept, aroused passionate and well-informed debates. And the quality of the written responses on restoration topics was, Liam and I agreed, exceptionally high.

The students' first instinct was often to take the position that it must be scientists, or at least "experts," who decide on restoration targets. But as we excavated that idea, and found that leading restoration ecologists, looking at the same ecological and cultural context, often disagree on the correct managerial response, many of them swung towards giving the final say to "community" consensus.

But again, examining the painful history of the North Branch Restoration Project in Chicago, there was a recognition that the comforting yet fuzzy notion of ‘community’ is deeply problematic; different stake-holding communities may have radically different views as to the appropriate ecological vision for a local landscape they all love in different ways.

“I had clear ideas about how to do restoration when I started the class,” one student said in our final session. “Now I can only see how complex it all is.” I guess that is a fair definition of an educational advance.

It was enlightening, too, to explore these topics at discussions organized through DePaul's Institute of Nature and Culture, where I was a Visiting Fellow. The participants were faculty members from fields as diverse as politics, literature, religion, philosophy and economics. To hear the arguments from my book explored, challenged and advanced through debate, at both undergraduate and faculty level, was a writer's dream come true. This was the darkness echoing, indeed.

It was particularly illuminating for me to hear academics from the humanities endorse my book's analysis of the polemical series of articles and books advocating the "novel ecosystems" concept. In my view, their authors use dangerously misleading rhetorical devices to inflate policy arguments in favour of abandoning classic restoration targets. In language heavy with polemic and light on data, they argue for settling for the management of so-called "novel" ecosystems ("chronically degraded" is a much more accurate and appropriate term) for whatever limited goods and services they may offer. They thus undermine the case for investing in restoration, just at the moment when restoration science and practice is producing the most fruitful results in this young movement's short history.

But equally, it was enormously refreshing to find that one of our brightest undergraduate students felt that we had all reached too cosy a consensus against the “novel” ecosystems concept, and set out to defend the contrary position, very cogently, in her final paper. That is, after all, what truly open debate is all about.

A growing sense that ecological restoration is now getting traction as a key conservation strategy, and provoking questions that help us reframe our troubled relationship to the natural world in imaginative and positive ways, was confirmed again and again in these classes, and on visits to other colleges. I was also invited to teach, in October and November, at the universities of Iowa, Wisconsin-Madison, Loyola (Chicago), Missouri, William & Mary, Dartmouth College and Mount Holyoke.

And I found that restoration ecology now has a heightened profile on the core research and conservation agenda at Missouri Botanical Garden, where I was honoured to be made a research associate during a lecture visit.

Through this whole process, I was repeatedly challenged to reassess my own work. A book is a frozen moment in one’s development, and while I was happy to find that many of its arguments stood up well, I also found that my own position was evolving beyond some of my year-old conclusions. It was Curt Meine, the lucid historian of American conservation thinking, who alerted me to this.

He asked me, after I had given a lecture on the "novel ecosystem" concept at the Nelson Institute in Madison, whether I had not become more sharply critical of the concept than I had been in the final chapter of the book. And I realized that he was quite right: my erstwhile effort to synthesise opposing arguments, made by respected colleagues in good faith, was yielding to a more sharply honed opposition to proposals I saw doing real damage in the world.

What was perhaps most encouraging about this US trip, however, were the indications that a much broader American audience now wants to learn about restoration issues. Public lectures organized by Morton Arboretum and by the Kane County Wild Ones in Illinois, and by Mount Kendal Retirement Home in New Hampshire and the Osher Institute at Dartmouth, all drew full houses, with more than 100 people at some events, along with substantial book sales.

The challenge now is to find ways of identifying and reaching the networks, in the US and elsewhere, where there is hunger for discussion about restoration, and about how this strategy is being adopted in radically different socio-ecological contexts. Suggestions welcome!

38. The Best Books of 2014


“Best,” from the Old English betest (adjective), betost, betst (adverb), of Germanic origin; related to Dutch and German best, also “to better.”*

*To better your end-of-the-year book perusing considerations, here’s a list of our titles we were geeked to see on many of the year’s Best of 2014 lists, from non-fiction inner-city ethnographies to the taxonomies of beetle sheaths:

John Drury’s Music at Midnight was named one of the ten best nonfiction books of 2014 by the Wall Street Journal.

Alice Goffman’s On the Run was named one of the 100 Notable Books of 2014 by the New York Times Book Review, one of only two university press books on the list, and one of the 30 best nonfiction books of 2014 by Publishers Weekly.

Rachel Sussman’s Oldest Living Things in the World was named one of the 100 best books of 2014 by Amazon, was one of three Chicago books on the Wall Street Journal’s six-book list of the Best Nature Gift Books of 2014, and topped Maria Popova’s list of the year’s best art, design, and photo books at Brainpickings.

Mark E. Hauber’s The Book of Eggs was one of three Chicago books on the Wall Street Journal’s six-book list of the Best Nature Gift Books of 2014.

Helen and William Bynum’s Remarkable Plants that Shape Our World was one of three Chicago books on the Wall Street Journal’s six-book list of the Best Nature Gift Books of 2014, was named by the Guardian’s Grrl Scientist to her list of the year’s best science books, and was named to the Globe and Mail’s Best Gift Books list for 2014.

Sylvia Sumira’s Globes was named one of the best gift books of the year in the design category by the Wall Street Journal.

Atif Mian and Amir Sufi’s House of Debt was named one of the best economics books of the year by the Financial Times.

Patrice Bouchard's Book of Beetles was named one of the best science books of the year by Wired and was named to the Globe and Mail's Best Gift Books list for 2014.

Scott Richard Shaw’s Planet of the Bugs was named by the Guardian’s Grrl Scientist to her list of the year’s best science books.

James W. P. Campbell’s The Library was named to Buzzfeed’s 47 Incredibly Unique Books to Buy Everyone on Your List.

Howard S. Becker’s What About Mozart? What About Murder? was named a Book of the Year by two different contributors to the Times Higher Education.

Marta Gutman’s A City for Children was named a Book of the Year by the Times Higher Education. 

Retha Edens-Meier and Peter Bernhardt’s Darwin’s Orchids was named one of the best science books for the holidays by American Scientist.

Donald Westlake’s The Getaway Car was named to the Favorite Books of 2014 list by Christianity Today’s Books and Culture magazine and one of the five best suspense books to give as gifts this year in the Chicago Tribune’s Printers Row supplement.

Philip Ball’s Serving the Reich was named by Physics World as one of the top ten physics books of 2014.

Barbara Taylor's The Last Asylum was named one of the Guardian's best psychology books for 2014, one of the Guardian's Best Books of 2014 (twice!), and a Book of the Year by two contributors to the Times Higher Education.

Five books in paperback by Peter De Vries made William Giraldi’s Year in Reading list for the Millions.

Peggy Shinner’s You Feel So Mortal was recommended as a gift book in the Chicago Tribune’s Printers Row supplement.

Margaret F. Brinig and Nicole Stelle Garnett’s Lost Classroom, Lost Community was named the best book of the year by Education Next.

Happy holidays! Buy a book!

39. Launch: Baronova and Baryshnikov

9780226167169

Irina Baronova and the Ballets Russes de Monte Carlo chronicles one of the most acclaimed touring ballet companies of the twentieth century, along with its prima ballerina and muse, the incomparable Irina Baronova. Along the way, it expands upon the rise of modern ballet as a medium, through an unprecedented archive of letters (over 2,000 of them), photographs, oral histories, and interviews conducted by Victoria Tennant, the book's author and Baronova's daughter. Earlier this month, the book was feted at a launch hosted by none other than Mikhail Baryshnikov at his eponymous Arts Center in New York City. Though less sumptuous than the images collected in the book, some candid photos from the event follow:


Victoria Tennant and ballerina Wendy Whelan


Mikhail Baryshnikov, Tennant, and Blythe Danner (L to R)


Bebe Neuwirth, Tennant, and Chris Calkins (L to R)

 

To read more about Irina Baronova and the Ballets Russes de Monte Carlo, click here.

40. House of Debt on the Independent’s Best of 2014

9780226271651

Atif Mian and Amir Sufi's House of Debt, a polemic about the Great Recession and a call to action on the borrowing and lending practices that led us into the fiscal pit, already made a splash on the shortlist for the Financial Times's Best Business Book of 2014. Now, over at the Independent, the book tops another Best of 2014 list, this time proclaimed "the jewel of 2014." From Ben Chu's review, which also heralds another university press title—HUP's blockbuster Capital by Thomas Piketty ("the asteroid"):

As with Capital, House of Debt rests on some first-rate empirical research. Using micro data from America, the professors show that the localities where the accumulation of debt by households was the most rapid were also the areas that cut back on spending most drastically when the bubble burst. Mian and Sufi argue that policymakers across the developed world have had the wrong focus over the past half decade. Instead of seeking to restore growth by encouraging bust banks to lend, they should have been writing down household debts. If the professors are correct—and the evidence they assemble is powerful indeed—this work will take its place in the canon of literary economic breakthroughs.

We've blogged about the book previously here and here, and no doubt it will appear on more "Best of" lists for business and economics. It's a read with teeth and legs, and the advice it offers for avoiding future crises points a finger at criminal lending practices, greedy sub-prime investments, and our failure to share risk-taking—financially and conceptually—in our monetary practices.

You can read more about House of Debt here.
41. The Hoarders


David Drummond’s cover for The Hoarders, one of Paste Magazine’s 30 Best Book Covers of 2014.

This past week, New Yorker critic Joan Acocella profiled Scott Herring's The Hoarders, a foray into the history of material culture from the perspective of clutter fetish and our fascination with the perils surrounding the urge to organize. The question Herring asks, namely, "What counts as an acceptable material life—and who decides?," takes on a gradient of meaning for Acocella, who confronts the material preferences of her ninety-three-year-old mother, which prove to be in accord with the DSM-5's suggestion that "hoarding sometimes begins in childhood, but that by the time the hoarders come to the attention of the authorities they tend to be old."

In The Hoarders, Herring tells the tale of Homer and Langley Collyer, two brothers to whom we can trace a legend (um, legacy?) of modern hoarding, whose eccentricity and ill health (Langley took care of Homer, who was both rheumatic and blind) led to a lion’s den of accrual, and a rather unfortunate end. As Acocella explains:

In 1947, a caller alerted the police that someone in the Collyer mansion may have died. After a day’s search, the police found the body of Homer, sitting bent over, with his head on his knees. But where was Langley? It took workers eighteen days to find him. The house contained what, in the end, was said to have been more than a hundred and seventy tons of debris. There were toys, bicycles, guns, chandeliers, tapestries, thousands of books, fourteen grand pianos, an organ, the chassis of a Model T Ford, and Dr. Collyer’s canoe. There were also passbooks for bank accounts containing more than thirty thousand dollars, in today’s money.

As Herring describes it, the rooms were packed almost to the ceilings, but the mass, like a Swiss cheese, was pierced by tunnels, which Langley had equipped with booby traps to foil burglars. It was in one of those tunnels that his corpse, partly eaten by rats, was finally discovered, only a few feet away from where Homer’s body had been found. He was apparently bringing Homer some food when he accidentally set off one of his traps and entombed himself. The medical examiner estimated that Langley had been dead for about a month. Homer seems to have died of starvation, waiting for his dinner.

The New Yorker piece also confronts the grand dames of American hoarding, Little Edie and Big Edie Bouvier Beale, cousins to Jacqueline Kennedy Onassis, and the subjects of Albert and David Maysles’ cult classic documentary Grey Gardens. Acocella positions the Beales as camped-out, if charming, odd fellows, but also points to an underlying class-based assumption in our embrace of their peculiarities:

If they crammed twenty-eight rooms with junk, that’s in part because they had a house with twenty-eight rooms. And if they declined to do the dishes, wouldn’t you, on many nights, have preferred to omit that task?

This is only slightly out of step with the stance Herring adopts in the book: as material culture changes, so too do our interactions with it—and with each other. You can read much more from Acocella in her profile, but her takeaway—we are what we stuff, except what we stuff is subject to the scrutinies and perversions of our social order (class and race among them)—is worth mentioning: we should get out from under it now, because it's only going to get worse.

Read more about The Hoarders here.
42. Forthcoming: The Big Jones Cookbook

It's unconventional, to say the least, for a university press to publish a cookbook. But an exception to this rule, coming in Spring 2015, is Paul Fehribach's Big Jones Cookbook, which expands upon the southern Lowcountry cuisine of the eponymous Chicago restaurant. As mentioned in the book's catalog copy, "from its inception, Big Jones has focused on cooking with local and sustainably grown heirloom crops and heritage livestock, reinvigorating southern cooking through meticulous technique and the unique perspective of its Midwest location." More expansively, Fehribach's restaurant positions the social and cultural inheritances involved in regional cooking at the forefront, while the cookbook expands upon the associated recipes by situating their ingredients (and the culinary alchemy involved in their joining!) as part of a rich tradition invigorated by a kind of heirloom sociology, as well as a sustainable farm-to-table ethos.

This past week, as part of the University of Chicago Press's Spring 2015 sales conference, much of the Book Division sat down to a celebratory meal at Big Jones, and the photos below, by editorial director Alan Thomas, show Fehribach in his element and commemorate the occasion:

[Photos by Alan Thomas from the sales-conference dinner at Big Jones]

To read more about The Big Jones Cookbook, forthcoming in Spring 2015, click here.
43. Citizen: Jane Addams and the labor movement


On this day in 1931, Jane Addams became the first American woman to win the Nobel Peace Prize. Read an excerpt from Louise W. Knight's Citizen: Jane Addams and the Struggle for Democracy, about the ethics and deeply held moral beliefs permeating the labor movement—and Addams's own relationship to it—after the jump.

***

From Chapter 13, “Claims” (1894)

On May 11 Addams, after giving a talk at the University of Wisconsin and visiting Mary Addams Linn in Kenosha, wrote Alice that their sister's health was improving. The same day, a major strike erupted at the Pullman Car Works, in the southernmost part of Chicago. The immediate cause of the strike was a series of wage cuts the company had made in response to the economic crisis. Since September the company had hired back most of the workers it had laid off at the beginning of the depression, but during the same period workers' wages had also fallen an average of 30 percent. Meanwhile, the company, feeling pinched, was determined to increase its profits from rents. In addition to the company's refusing to lower the rent rate to match the wage cuts, its foremen threatened to fire workers living outside of Pullman who did not relocate to the company town. The result was that two-thirds of the workforce was soon living in Pullman. By April, many families were struggling to pay the rents and in desperate straits; some were starving. The company's stance was firm. "We just cannot afford in the present state of commercial depression to pay higher wages," Vice President Thomas H. Wickes said. At the same time, the company continued to pay its stockholders dividends at the rate of 8 percent per annum, the same rate it had paid before the depression hit.

The workers had tried to negotiate. After threatening on May 5 to strike if necessary, leaders of the forty-six-member workers’ grievance committee met twice with several company officials, including, at the second meeting, George Pullman, the company’s founder and chief executive, to demand that the company reverse the wage cuts and reduce the rents. The company refused, and on May 11, after three of the leaders of the grievance committee had been fired and a rumor had spread that the company would lock out all employees at noon, twenty-five hundred of the thirty-one hundred workers walked out. Later that day, the company laid off the remaining six hundred. The strike had begun. “We struck at Pullman,” one worker said, “because we were without hope.”

For Addams, the coincidental timing of the strike and Mary’s illness, both of which would soon worsen, made each tragedy, if possible, a greater sorrow. The strike was a public crisis. Its eruption raised difficult questions for Addams about the ethics of the industrial relationship. What were George Pullman’s obligations to his employees? And what were his employees’ to him? Was it disloyal of him to treat his workers as cogs in his economic machine? Or was it disloyal of his workers to strike against an employer who supplied them with a fine town to live in? Who had betrayed whom? Where did the moral responsibility lie? Mary’s illness was Addams’s private crisis. Mary was the faithful and loving sister whose affection Addams had always relied on and whose life embodied the sacrifices a good woman made for the sake of family. Mary had given up her chance for further higher education for her family’s sake and had been a devoted wife to a husband who had repeatedly failed to support her and their children. The threat of her death stirred feelings of great affection and fears of desperate loss in Addams.

As events unfolded, the two crises would increasingly compete for Addams’s loyalty and time. She would find herself torn, unsure whether she should give her closest attention to her sister’s struggle against death or to labor’s struggle against the capitalist George Pullman. It was a poignant and unusual dilemma; still, it could be stated in the framework she had formulated in “Subjective Necessity”: What balance should she seek between the family and the social claim?

The causes of the Pullman Strike went deeper than the company's reaction to the depression. For the workers who lived in Pullman, the cuts in wages and the high rents of 1893–94 were merely short-term manifestations of long-term grievances, all of them tied to company president George Pullman's philosophy of industrial paternalism. These included the rules regarding life in Pullman, a privately owned community located within the city of Chicago. Pullman had built the town in 1880 to test his theory that if the company's workers lived in a beautiful, clean, liquor- and sin-free environment, the company would prosper. Reformers, social commentators, and journalists across the country were fascinated by Pullman's "socially responsible" experiment. Addams would later recall how he was "dined and feted throughout Europe . . . as a friend and benefactor of workingmen." The workers, however, thought the Pullman Company exercised too much control. Its appointees settled community issues that elsewhere would have been dealt with by an elected government, company policy forbade anyone to buy a house, the town newspaper was a company organ, labor meetings were banned, and company spies were everywhere. Frustrated by this as well as by various employment practices, workers organized into unions according to their particular trades (the usual practice), and these various unions repeatedly struck Pullman Company in the late 1880s and early 1890s. The May 1894 strike was the first that was companywide.

Behind that accomplishment lay the organizing skills of George Howard, vice president of the American Railway Union (ARU), the new cross-trades railroad union that Eugene Debs, its president, had founded the previous year. To organize across trades was a bold idea. Howard had been in Chicago since March signing up members, and by early May he was guiding the workers in their attempted negotiations with the company. The ARU’s stated purpose was to give railroad employees “a voice in fixing wages and in determining conditions of employment.” Only one month earlier it had led railroad workers at the Great Northern Railroad through a successful strike. Thanks to the ARU as well as to the mediating efforts of some businessmen from St. Paul, Minnesota, voluntary arbitration had resolved the strike, and three-fourths of the wage cut of 30 percent had been restored. Impressed, 35 percent of Pullman’s workers joined the ARU in the weeks that followed, hoping that the new union could work the same magic on their behalf.

At first, the prospects for a similar solution at Pullman did not look promising. After the walkout, George Pullman locked out all employees and, using a business trip to New York as his excuse, removed himself from the scene. Meanwhile, a few days after the strike began, Debs, a powerful orator, addressed the strikers to give them courage. He had the rare ability to elevate a controversy about wages into a great moral struggle. The arguments he used that day, familiar ones in the labor movement, would be echoed in Jane Addams’s eventual interpretation of the Pullman Strike. “I do not like the paternalism of Pullman,” he said. “He is everlastingly saying, what can we do for the poor workingmen? . . . The question,” he thundered, “is what can we do for ourselves?”

At this point, the Civic Federation of Chicago decided to get involved. Its president, Lyman Gage, an enthusiast for arbitration, appointed a prestigious and diverse conciliation board to serve as a neutral third party to bring the disputing sides before a separate arbitration panel. Made up partly of members of the federation’s Industrial Committee, on which Addams sat, it was designed to be representative of various interests, particularly those of capital, labor, academia, and reform. It included bank presidents, merchants, a stockbroker, an attorney, presidents of labor federations, labor newspaper editors, professors, and three women civic activists: Jane Addams, Ellen Henrotin, and Bertha Palmer.

The board divided itself into five committees. In the early phase of the strike it would meet nightly, in Addams’s words, to “compare notes and adopt new tactics.” Having had some success in arranging arbitrations in the Nineteenth Ward, Addams was eager to see the method tried in the Pullman case. She would soon emerge as the driving force and the leading actor in the initiative.

The first question the board discussed was whether the Pullman workers wanted the strike to be arbitrated. Addams investigated the question by visiting the striking workers in Pullman, eating supper with some of the women workers, touring the tenement housing, and asking questions. Afterwards, she asked the president of the local ARU chapter, Thomas Heathcoate, and ARU organizer George Howard to allow the conciliation board to meet with the Strike Committee. Refusing her request, Howard told her that the ARU was willing to have the committee meet with the board but that first the Pullman Company would have to state its willingness to go to arbitration.

Meanwhile, three men from the conciliation board were supposed to try to meet with the Pullman Company. The board’s president, A. C. Bartlett, a businessman, was to arrange the meeting but, as of May 30, two weeks into the strike, he had done nothing. Frustrated, Addams stepped in. On June 1 she arranged for Bartlett, Ralph Easley (the Civic Federation’s secretary), and herself to meet with Vice President Wickes and General Superintendent Brown. At the meeting, which Bartlett failed to attend, Wickes merely repeated the company’s well-known position: that it had “nothing to arbitrate.”

Thwarted, Addams decided, with the board’s support, to try again to arrange for the board to meet with the Strike Committee. At a Conciliation Board meeting, Lyman Gage suggested that she propose that rent be the first issue to be arbitrated. Agreeing, Addams decided that, instead of taking the idea to the uncooperative Howard, she would take it over his head to Debs. Persuaded by Addams, Debs immediately arranged for members of the board to speak that night to the Strike Committee about the proposal. Once again, however, Addams’s colleagues failed to follow through. She was the only board member to turn up.

At the meeting, the strike leaders were suspicious, believing that arbitration was the company's idea. No report survives of how Addams made her case to them, but one can glean impressions from a description of Addams that a reporter published in a newspaper article in June 1894. She described Addams as a "person of marked individuality[;] she strikes one at first as lacking in suavity and graciousness of manner but the impression soon wears away before [her] earnestness and honesty." She was struck, too, by Addams's paleness, her "deep" eyes, her "low and well-trained voice," and the way her face was "a window behind which stands her soul."

Addams must have made a powerful presentation to the Strike Committee. After she spoke, it voted to arbitrate not only the rents but any point. It was the breakthrough Addams had been hoping for. “Feeling that we had made a beginning toward conciliation,” Addams remembered, she reported her news to the board.

Meanwhile, with the workers and their families’ hunger and desperation increasing, tensions were mounting. Wishing to increase the pressure on the company, Debs had declared on June 1 that the ARU was willing to organize a nationwide sympathy boycott of Pullman cars among railroad employees generally if the company did not negotiate. The Pullman Company’s cars, though owned and operated by the company, were pulled by various railroads. A national boycott of Pullman cars could bring the nation’s already devastated economy to a new low point. Meanwhile, the ARU opened its national convention in Chicago on June 12. Chicago was nervous. Even before the convention began, Addams commented to William Stead, in town again for a visit, that “all classes of people” were feeling “unrest, discontent, and fear. We seem,” she added, “to be on the edge of some great upheaval but one can never tell whether it will turn out a tragedy or a farce.”

Several late efforts at negotiation were made. On June 15 an ARU committee of twelve, six of them Pullman workers, met with Wickes to ask again whether the company would arbitrate. His answer was the same: there was nothing to arbitrate and the company would not deal with a union. Soon afterward George Pullman returned to town. He agreed to meet with the conciliation board but, perhaps sensing the danger that the sincere and persuasive Jane Addams posed, only with its male members. At the meeting he restated his position: no arbitration. At this point, Addams recalled, the board’s effort collapsed in “failure.” The strike was now almost two months old. Addams had done everything she could to bring about arbitration. Resourceful, persistent, even wily, she had almost single-handedly brought the workers to the table, but because she was denied access to George Pullman on the pretext of her gender, she had failed to persuade the company. Her efforts, however, had made something very clear to herself and many others—that George Pullman’s refusal to submit the dispute to arbitration was the reason the strike was continuing.

The situation now became graver. At the ARU convention, the delegates voted on June 22 to begin a national boycott of Pullman cars on June 26 if no settlement were reached. Abruptly, on the same day as the vote, a powerful new player, the General Managers Association (GMA), announced its support for the company. The GMA had been founded in 1886 as a cartel to consider “problems of management” shared by the twenty-four railroad companies serving Chicago; it had dabbled in wage-fixing and had long been opposed to unions. George Pullman’s refusal to arbitrate had been, among other things, an act of solidarity with these railroad companies, his business partners. Disgusted with the outcome of the Great Northern Strike, they were determined to break the upstart ARU, which threatened to shrink the profits of the entire industry. Pullman departed the city again in late June for his vacation home in New Jersey, leaving the GMA in charge of the antistrike strategy. It announced that any railroad worker who refused to handle Pullman cars would be fired.

The ARU was undaunted. On June 26 the boycott began. Within three days, one hundred thousand men had stopped working and twenty railroads were frozen. Debs did not mince words in his message to ARU members and their supporters. This struggle, he said, “has developed into a contest between the producing classes and the money power of this country.” Class warfare was at hand.

Jane Addams was not in the city when the ARU voted for the boycott. She had gone to Cleveland to give a commencement speech on June 19 at the College for Women of Western Reserve University. But she was in Chicago when the boycott began. Chicago felt its impact immediately. There was no railroad service into or out of the city, and public transportation within the city also ceased as the streetcar workers joined the boycott. With normal life having ground to a halt, the city’s mood, which had been initially sympathetic to the workers, began to polarize along class lines. Working people’s sympathies for the railroad workers and hostility toward capitalists rose to a fever pitch while many people in the middle classes felt equally hostile toward the workers; some thought that the strikers should be shot. In Twenty Years Addams writes, “During all those dark days of the Pullman strike, the growth of class bitterness was most obvious.” It shocked her. Before the strike, she writes, “there had been nothing in my experience [that had] reveal[ed] that distinct cleavage of society which a general strike at least momentarily affords.”

The boycott quickly spread, eventually reaching twenty-seven states and territories and involving more than two hundred thousand workers. It had become the largest coordinated work stoppage in the nation’s history and the most significant exercise of union strength the nation had ever witnessed. The workers were winning through the exercise of raw economic power. Virtually the only railcars moving were the federal mail cars, which the boycotting railroad workers continued to handle, as required by federal law and as Debs had carefully instructed them. The railroad yards in the city of Chicago were full of striking workers and boycotters determined to make sure that other railroad cars did not move and to protect them from vandalism.

Now the GMA took aggressive steps that would change the outcome of the strike. On June 30 it used its influence in Washington to arrange for its own lawyer, Edwin Walker, to be named a U.S. Special Attorney. Walker then hired four hundred unemployed men, deputized them as U.S. Marshals, armed them, and sent them to guard the federal mail cars in the railroad yards to be sure the mail got through. In the yards, the strikers and marshals eyed each other nervously.

Meanwhile, on June 29, Jane Addams's family crisis worsened. Jane had visited Mary on June 28 but returned to Chicago the same day. That night she received word that her sister's condition suddenly had become serious, and the following day she rushed back to Kenosha, accompanied by Mary's son Weber Linn (apparently they traveled in a mail car thanks to Addams's ties to the strikers). She deeply regretted having been gone so much. "My sister is so pleased to have me with her," she wrote Mary Rozet Smith, "that I feel like a brute when I think of the days I haven't been here." As Addams sat by Mary's bed in Kenosha, the situation in Chicago remained relatively calm. Nevertheless, the GMA now took two more steps that further drew the federal government into the crisis. Claiming that the strikers were blocking the movement of the federal mails (although the subsequent federal investigation produced no evidence that this was true), the GMA asked the U.S. attorney general to ask President Grover Cleveland to send federal troops to shut down the strike. Cleveland agreed, and on July 3 the first troops entered the city. The same day the attorney general ordered Special Attorney Walker to seek an injunction in federal court against the ARU in order to block the union from preventing workers from doing their work duties. The injunction was immediately issued.

By July 1 it was clear that Mary Addams Linn was dying. Addams wired her brother-in-law John and the two younger children, Esther, thirteen, and Stanley, who had recently turned eleven, to come from Iowa to Kenosha. By July 3 they had somehow reached Chicago but, because of the boycott, they could not find a train for the last leg of their trip. At last John signed a document relieving the railroad of liability; then, he, Esther, and Stanley boarded a train (probably a mail train) and within hours had arrived in Kenosha, protected, or so Esther later believed, by the fact that they were relatives of Jane Addams, “who was working for the strikers.” Mary’s family was now all gathered around her except for her oldest son, John, who was still in California. Unconscious by the time they arrived, she died on July 6.

Illinois National Guard troops in front of the Arcade Building in Pullman during the Pullman Strike. [Neg. i21195aa.tif, Chicago Historical Society.]

While Jane Addams’s private world was crumbling, so was Chicago’s civic order. On July 4, one thousand federal troops set up camp around the Post Office and across the city, including the Pullman headquarters. On July 5 and 6 thousands of unarmed strikers and boycotters crowded the railroad yards, joined by various hangers-on—hungry, angry, unemployed boys and men. Many were increasingly outraged by the armed marshals and the troops’ presence. Suddenly a railroad agent shot one of them, and they erupted into violence. Hundreds of railroad cars burned as the troops moved in. Now the strikers were fighting not only the GMA but also the federal government. This had been the GMA’s aim all along. In the days to come, thousands more federal troops poured into the city.

After attending Mary’s funeral in Cedarville Jane Addams returned on July 9 to find Chicago an armed camp and class warfare on everyone’s minds. In working-class neighborhoods such as the Nineteenth Ward, people wore white ribbons in support of the strike and the boycott. Across town, middle-class people were greeted at their breakfast tables by sensational newspaper headlines claiming that the strikers were out to destroy the nation. One Tribune headline read, “Dictator Debs versus the Federal Government.” The national press echoed the theme of uncontrolled disorder. Harper’s Weekly called the strikers “anarchists.” And the nation remained in economic gridlock. Farmers and producers were upset that they could not move their produce to market. Passengers were stranded. Telegrams poured into the White House.

Like the strikers’ reputation, Hull House’s was worsening daily. Until the strike took place, Addams later recalled, the settlement, despite its radical Working People’s Social Science Club, had been seen as “a kindly philanthropic undertaking whose new form gave us a certain idealistic glamour.” During and after the strike, the situation “changed markedly.” Although Addams had tried to “maintain avenues of intercourse with both sides,” Hull House was now seen as pro-worker and was condemned for being so. Some of the residents were clearly pro-worker. Florence Kelley and one of her assistant factory inspectors, Alzina Stevens, befriended Debs during the strike and its aftermath. Stevens sheltered him for a time in her suburban home when authorities were trying to arrest him; Kelley tried to raise money for his bail after he was arrested later in July.

Addams and Hull House began to be severely criticized. Donors refused to give. Addams told John Dewey, who had come to town to take up his new position at the University of Chicago, that she had gone to meet with Edward Everett Ayer, a Chicago businessman with railroad industry clients who had often supported Hull House’s relief work, to ask him for another gift. Dewey wrote his wife, “[Ayer] turned on her and told her that she had a great thing and now she had thrown it away; that she had been a trustee for the interests of the poor, and had betrayed it [sic]—that like an idiot she had mixed herself in something which was none of her business and about which she knew nothing, the labor movement and especially Pullman, and had thrown down her own work, etc., etc.” That autumn Addams had “a hard time financing Hull-House,” a wealthy friend later recalled. “Many people felt she was too much in sympathy with the laboring people.” Addams merely notes in Twenty Years that “[in] the public excitement following the Pullman Strike Hull House lost many friends.”

And there were public criticisms as well. Some middle- and upper-class people attacked Addams, one resident remembered, as a “traitor to her class.” When Eugene Debs observed that “epithets, calumny, denunciation . . . have been poured forth in a vitriolic tirade to scathe those who advocated and practiced . . . sympathy,” one suspects that he had in mind the treatment Jane Addams received. Meanwhile, the workers were angry that Addams would not more clearly align herself with their cause. Her stance—that she would take no side—guaranteed that nearly everyone in the intensely polarized city would be angry with her.

Standing apart in this way was extremely painful. She was “very dependent on a sense of warm comradeship and harmony with the mass of her fellowmen,” a friend, Alice Hamilton, recalled. “The famous Pullman strike” was “for her the most painful of experiences, because . . . she was forced by conviction to work against the stream, to separate herself from the great mass of her countrymen.” The result was that Addams “suffered from . . . spiritual loneliness.” In these circumstances, no one could mistake Addams’s neutrality for wishy-washiness. Practicing neutrality during the Pullman Strike required integrity and courage. In being true to her conscience, she paid a tremendous price.

Of course, the strike was not the only reason she was lonely. Mary’s death was the other. And if, as we may suspect, Mary’s passing evoked the old trauma for Jane of their mother Sarah’s passing, not to mention the later losses of their sister Martha and their father, then the loneliness Addams felt in the last days of the strike and the boycott was truly profound.

She does not describe these feelings when she writes about the strike and Mary’s death in Twenty Years, but in the chapter about Abraham Lincoln, she conveys her feelings well enough. She tells about a walk she took in the worst days of the strike. In that “time of great perplexity,” she writes, she decided to seek out Lincoln’s “magnanimous counsel.” In the sweltering heat, dressed in the long skirt and long-sleeved shirtwaist that were then the fashion, Addams walked—because the streetcars were on strike—four and a half “wearisome” miles to St. Gaudens’s fine new statue of Lincoln, placed at the entrance to Lincoln Park just two years earlier, and read the words cut in stone at the slain president’s feet: “With charity towards all.” And then, still bearing on her shoulders the burden of public hatred that Lincoln had also borne, she walked the four and a half miles home.

Although the deployment of troops had broken the strike’s momentum, the government needed to put the strike’s leader behind bars to bring the strike to an end. On July 10 Debs was indicted by a grand jury for violating the injunction and arrested. Bailed out two days later, he was arrested again on July 17 to await trial in jail. However, when the government prosecutor, the ubiquitous Edwin Walker, became ill, the trial was postponed, and Debs went home to Indiana, where he collapsed gratefully into bed. The trial was held in November 1894; Debs would begin serving his six-month sentence in January 1895.

With Debs removed from leadership and fourteen thousand armed troops, police, and guardsmen bivouacked in Chicago, the strike and the boycott soon collapsed. On August 2 the ARU called off the strike, and on the same day the Pullman Company partially reopened. The railroads were soon running again. The anti-labor forces had won. Private industry and the federal government had shown that, united and with the power of the law on their side, no one, not even the hundreds of thousands of workers who ran the nation’s most crucial industry, could defeat them. If the strike had been successful, it would have turned the ARU into the nation’s most powerful union. Given that the strike failed, the opposite result took place. As the GMA had intended, the ARU died. After Debs was released from jail, he did not resurrect the union.

Although the strike was over, innumerable questions remained unanswered. For the country as a whole, whose only sources of information had been sensational news stories and magazine articles, the first question was: What were the facts? To sort these out, President Grover Cleveland appointed a three-person fact-finding commission to investigate and issue a report. Jane Addams would testify before the United States Strike Commission in August, as would George Pullman.

Meanwhile, for Addams and other labor and middle-class reformers in Chicago, the question was how to prevent or resolve future strikes. The Conciliation Board’s effort to promote voluntary arbitration had been promising, but its failure revealed, Addams believed, certain “weaknesses in the legal structure,” that is, in state and federal laws. On July 19, two days after Debs’ second arrest, as the troops began slowly to withdraw from the city, the Central Council of the Civic Federation met at the Commerce Club. At the meeting, M. C. Carroll, editor of a labor magazine and a member of the Conciliation Board, proposed that the federation host a conference “on arbitration or conciliation” to seek ideas about ways to avert “strikes and boycotts in the future.” The Central Council “enthusiastically endorsed” the proposal and appointed a committee to devise a plan. The hope was to do something immediately, while interest was high, to increase public support for arbitration legislation in Illinois and across the nation.

Addams missed the meeting because she was assisting at the Hull House Summer School at Rockford College, which began on July 10. But she was back in Chicago by the second week in August and had soon joined the arbitration conference committee. It devised a three-part strategy. First, it would convene “capital and labor” at a national conference titled “Industrial Conciliation and Arbitration” in Chicago in November to provide a forum for “calm discussion” of the questions raised by the strike and bring together information about methods of arbitration and conciliation. Second, conference participants from Illinois would press the Illinois General Assembly to pass a law creating a state board of arbitration. Third, a national commission would be named at the end of the conference to press for federal legislation. Elected as secretary to the committee, Jane Addams threw herself into organizing the event.

At the same time, she took on new family responsibilities. With Mary’s death, Jane Addams, at thirty-three, became the guardian and mother of the two younger Linn children. Their father had decided he could not afford to keep them. For the fall, she and Alice agreed that Stanley would live at Hull House and Esther would attend the preparatory boarding school that was affiliated with Rockford College. Weber, nineteen, was still a student at the University of Chicago. He would spend his vacations at Hull House. The oldest son, John, twenty-two, having returned from California, was once again a resident at Hull House and studying for the Episcopalian priesthood. Esther remembered Addams as taking “me and my brothers in as her own children. . . . [She] was a wonderful mother to us all.” Addams was particularly close to Stanley, who, according to Alice’s daughter Marcet, “became . . . Aunt Jane’s very own little boy[;] . . . he was always like a son to her.”

Jane Addams would honor this family claim for the rest of her life. Her niece and nephews, later joined by their children, would gather with her for holidays, live with her at Hull House at various times in their lives, and rely on her for advice, as well as for a steady supply of the somewhat shapeless sweaters that she would knit for them. Because few letters between Addams and the Linn children have survived, the historical record is mostly silent about the affectionate bonds that linked them and the faithfulness with which she fulfilled the maternal role. Her devotion arose from a deep understanding of what it felt like for a child to lose its mother and from a deep gratitude that she could give to Mary’s children the gift Mary had given her.

The Pullman Strike was a national tragedy that aroused fierce passions and left many scars. For many in the middle classes, including Jane Addams, some of the most painful scars were the memories of the intense hatred the strike had evoked between the business community and the workers. Was such class antagonism inevitable? Many were saying so, but Addams, committed as she was to Tolstoyan and Christian nonviolence, social Christian cooperation, and Comtean societal unity, found it impossible to accept the prevailing view. That fall she and John Dewey, now the first chair of the Department of Philosophy at the University of Chicago, discussed this question. In a letter to his wife Alice Dewey reported telling Addams that conflict was not only inevitable but possibly a good thing. Addams disagreed. She “had always believed and still believed,” he wrote, that “antagonism was not only useless and harmful, but entirely unnecessary.” She based her claim on her view that antagonisms were not caused by, in Dewey’s words, “objective differences, which would always grow into unity if left alone, but [by] a person’s mixing in his own personal reactions.” A person was antagonistic because he took pleasure in opposing others, because he desired not to be a “moral coward,” or because he felt hurt or insulted. These were all avoidable and unnecessary reactions. Only evil, Addams said, echoing Tolstoy, could come from antagonism.

During their conversation, she asked Dewey repeatedly what he thought. Dewey admitted that he was uncomfortable with Addams’s theory. He agreed that personal reactions often created antagonism, but as for history, he was enough of a social Darwinist and a Hegelian to believe that society progressed via struggle and opposition. He questioned her. Did she not think that, in addition to conflict between individuals, there were conflicts between ideas and between institutions, for example, between Christianity and Judaism and between “Labor and Capital”? And was not the “realization of . . . antagonism necessary to an appreciation of the truth and to a consciousness of growth”?

Again she disagreed. To support her case Addams gave two examples of apparently inevitable conflicts involving ideas or institutions that she interpreted differently. When Jesus angrily drove the moneychangers out of the temple, she argued, his anger was personal and avoidable. He had “lost his faith,” she said, “and reacted.” Or consider the Civil War. Through the antagonism of war, we freed the slaves, she observed, but they were still not free individually, and in addition we have had to “pay the costs of war and reckon with the added bitterness of the Southerner besides.” The “antagonisms of institutions,” Dewey told Alice, summarizing Addams’s response, “were always” due to the “injection of the personal attitude and reaction.”

Dewey was stunned and impressed. Addams’s belief struck him as “the most magnificent exhibition of intellectual & moral faith” that he had ever seen. “[W]hen you think,” he wrote Alice, “that Miss Addams does not think this as a philosophy, but believes it in all her senses & muscles—Great God.” Dewey, gripped by the power of Addams’s grand vision, told Alice, “I never had anything take hold of me so.”

But his intellect lagged behind. Struggling to find a way to reconcile his and Addams’s views, Dewey attempted a formulation that honored Addams’s devotion to unity, which he shared, while retaining the principle of antagonistic development that Addams rejected but he could not abandon. “[T]he unity [is not] the reconciliation of opposites,” he explained to his wife. Rather, “opposites [are] the unity in its growth.” But he knew he had avoided a real point of disagreement between them. He admitted to Alice, “[M]y pride of intellect . . . revolts at thinking” that conflict between ideas or institutions “has no functional value.” His and Addams’s disagreement—was it an antagonism?—was real, and in discovering it, the two had taken each other’s measure. Addams’s principled vision and spiritual charisma had met their match in the cool machinery of John Dewey’s powerful mind.

Two days later Dewey sent Addams a short note in which he retracted part of what he had said. He was now willing to agree, he wrote, that a person’s expectation of opposition was in and of itself not good and even that it caused antagonism to arise. “[T]he first antagonism always come[s] back to the assumption that there is or may be antagonism,” he wrote, and this assumption is “bad.” In other words, he was agreeing with Addams’s points that antagonism was evil and that it always began in the feelings or ideas of the individual. Dewey did not, however, retract his claim that conflict had its historical uses. These were, as he had said, to appreciate truth and to be conscious of its growth, that is, its spread. He was speaking as the Christian idealist he still was—someone who saw truth as God’s revelation. Antagonism, in other words, helped bring man to see the truth, and this was its value.

When Dewey agreed with Addams that opposition originated in individual feelings, he was joining her in rejecting the usual view that objective differences justified antagonism. This was the view that unions held. Workers believed that the antagonism between themselves and employers arose because workers lacked something real and necessary: sufficient negotiating power in the relationship. In denying this, Dewey and Addams were being, in the simplest sense, determinedly apolitical. Addams, despite her recent involvement with strikes and politics, still refused to believe that actual conditions could provide legitimate grounds for opposition. Her idealism, expressed in her fierce commitment to cooperation, Christian love, nonresistance, and unity, stood like a wall preventing her from seeing that power, as much as personal feelings, soured human relations. A strong mind is both an asset and a liability.

That fall, Hull House, returning to normalcy, resumed its rich schedule of classes, club meetings, lectures, and exhibits. As usual, Addams was seriously worried about the settlement’s finances. The size of the total deficit for the year is unknown, but her awareness that the household operating account was $888 in arrears surfaced in a letter to Mary. As she had in previous years, Addams paid for part of the debt herself (how much is unclear; the documentation does not survive). Mary Rozet Smith, among others, sent a generous check. “It gives me a lump in my throat,” Addams wrote her in appreciation, “to think of the dollars you have put . . . into the . . . prosaic debt when there are so many more interesting things you might have done and wanted to do.” Aware of the delicacy of asking a close friend for donations, Addams sounded a note of regret. “It grieves me a little lest our friendship should be jarred by all these money transactions.”

As before, the residents were in the dark about the state of Hull House’s finances. Despite her intentions to keep them informed, Addams had convened no Residents’ Meeting between April and October, perhaps because the strike and Mary’s illness had absorbed so much of her attention. Finally, in early November she and the residents had “a long solemn talk,” as she wrote Mary. She had laid “before folks” the full situation and asked them “for help and suggestions.” And she had vowed that she would “never . . .let things get so bad again” before she consulted them. “I hope,” she told Mary, “we are going to be more intimate and mutually responsible on the financial side.”

Addams was renewed in her determination for two reasons. First, there was the problem of her own worsening finances. Since July she had assumed the new financial burden, apparently without any help from Alice, of raising Mary’s two younger children. Second, there was her increasing fear, as the depression deepened and donations dropped because of Hull House’s involvement with the Pullman Strike, that her personal liability for Hull House’s debts could literally put her in the Dunning poorhouse. Meanwhile, she pushed herself to speak as often as she could to earn lecture fees. In October she reported to Alice that she had given five talks in one week. In November, she gave lectures in three states—Illinois, Wisconsin, and Michigan. It was all that she could think of to do: to work harder.

Hull House was doing well enough by other measures. The residents’ group continued to grow. Despite the house’s recently stained reputation and the risky state of its finances, five new residents arrived, all women, bringing the total to twenty. For 1894–95, the residents had decided, probably at Addams’s urging, to limit the size of the residents’ group to that number. There was now a good mix of old and new, with the majority, like Starr, Lathrop, and Kelley, having been there two years or more. The number of men had shrunk from seven to two, but in a few years it would be back to five. Addams, as always, took her greatest pleasure in the effervescent dailiness of it all. The settlement was first and foremost something “organic,” a “way of life,” she told an audience at the University of Chicago that fall.

Furthermore, the residents’ book of maps was moving toward completion. Conceived originally as a way to publicize some of the data about the neighborhood from the Department of Labor study, it had expanded to include a collection of essays on various related subjects and had acquired a sober New York publisher, Thomas Y. Crowell and Company, and a glorious title, Hull-House Maps and Papers: A Presentation of Nationalities and Wages in a Congested District of Chicago, Together with Comments and Essays on Problems Growing Out of the Social Conditions. The byline, it was agreed, would read “Residents of Hull-House.” It would be published in March 1895. Five of the essays, those by Kelley, Lathrop, Starr, and Addams, were much-expanded versions of the presentations they had made at the Congress on Social Settlements the previous year. Five others rounded out the collection. These dealt with the Bohemians, the Italians, and the Jews of the neighborhood, the maps, and the wages and expenses of cloakmakers in Chicago and New York. The maps were the book’s original inspiration and its most extravagant feature. Printed on oiled paper, folded and tucked into special slots in the book’s front and back covers, they displayed, block by block and in graphic, color-coded detail, where people of different nationalities lived in the ward and the range of wages they earned.

Addams, happy to be back in the editor’s chair, wrote the prefatory note, edited essays, and wrote the meaty appendix that described the settlement’s activities and programs. The book’s title was likely also her handiwork. Descriptive, indeed, exhaustive, it was the sort of title in which she specialized. As she once admitted to Weber Linn, “I am very poor at titles.” The book was very close to her heart. When she wrote Henry Demarest Lloyd on December 1 to thank him for sending the house a copy of his Wealth Against Commonwealth, she observed, “I have a great deal of respect for anyone who writes a good book.” After Maps was published she noted to those to whom she sent copies, “We are very proud of the appearance of the child.”

Jane Addams’s contribution to Maps was her essay “The Settlement as a Factor in the Labor Movement.” Her intention was to give a history of Hull House’s relations with unions as a sort of case study and to examine why and how settlements should be engaged with the labor movement. The piece is straightforward in tone, nuanced, not polemical. In it she settles fully into the even-handed interpretive role she had first attempted in her speech on domestic servants eighteen months earlier.

But the essay also burns with the painful knowledge she gained from the Pullman Strike. She wrestles with the tension between the labor movement’s loyalty to its class interests and her own vision of a classless, universalized, democratic society. And she probes the philosophical question she and Dewey had been debating: Are (class) antagonisms inevitable? Are antagonisms useful? The resulting essay was the most in-depth exploration of the subject of class that Addams would ever write. She was trying to find her way back from the edge of the cliff—class warfare—to which the Pullman Strike had brought her and the nation.

On the question of what the strike accomplished, her thoughts had shifted somewhat. Although she had told Dewey that antagonism was always useless, she argues in “The Settlement as a Factor” that strikes, which were certainly a form of antagonism, can be useful and necessary. Strikes are often “the only method of arresting attention to [the workers’] demands”; they also offer the permanent benefits of strengthening the ties of “brotherhood” among the strikers and producing (at least when successful) a more “democratic” relation between workers and their employer. Perhaps Dewey had been more persuasive than he realized.

She still felt, however, that personal emotion was the main cause of antagonisms, including strikes. She admits that labor has a responsibility to fight for the interests of the working people (that is, more leisure and wealth) but only because achieving them would help the workingman feel less unjustly treated. She charges labor with the storing up of “grudges” against “capitalists” and calls this “selfish.” She ignores the question of whether low wages and long hours are fair. Social justice is not a touchstone for her arguments in this essay.

Instead, Addams stresses the ideal she had emphasized since coming to Chicago: that of a society united by its sense of common humanity. She writes prophetically of “the larger solidarity which includes labor and capital” and that is based on a “notion of universal kinship” and “the common good.” One might read into her argument the conclusion of social justice, yet the principle remains uninvoked. Instead, Addams stays focused on feelings. She is calling for sympathy for others’ suffering, not for a change in workers’ physical condition.

Addams disapproves of capitalism but not because of its effects on the workers. The moral failings of the individual capitalist trouble her. She slips in a rather radical quotation by an unnamed writer: “The crucial question of the time is, ‘In what attitude stand ye toward the present industrial system? Are you content that greed . . . shall rule your business life, while in your family and social life you live so differently? Shall Christianity have no play in trade?’” In one place, although only one place, she takes the workers’ perspective and refers to capitalists as “the power-holding classes.” (Here at last was a glancing nod toward power.) The closest she comes to making a social justice argument is in a sentence whose Marxist flavor, like the previous phrase, suggests Florence Kelley’s influence, yet it, too, retains Addams’s characteristic emphasis on feelings. She hopes there will come a time “when no factory child in Chicago can be overworked and underpaid without a protest from all good citizens, capitalist and proletarian.” While Debs had wanted to arouse middle-class sympathies as a way to improve the working conditions of the Pullman laborers, Addams wanted the labor movement to cause society to be more unified in its sympathies. Their means and ends were reversed.

Addams found the idea that labor’s organizing efforts could benefit society compelling. “If we can accept” that possibility, she adds, then the labor movement is “an ethical movement.” The claim was a startling one for her to make. It seems the strike had shown her at least one moral dimension to the workers’ struggle. The negative had become the potentially positive. Instead of seeing labor’s union organizing as a symptom of society’s moral decay, as she once had and many other middle-class people still did, she was considering the hypothesis that labor organizing was a sign of society’s moral redemption.

The Pullman Strike also cracked her moral absolutism. In “The Settlement as a Factor” she argues for the first time that no person or group can be absolutely right or absolutely wrong. “Life teaches us,” she writes, that there is “nothing more inevitable than that right and wrong are most confusingly mixed; that the blackest wrong [can be] within our own motives.” When we triumph, she adds, we bear “the weight of self-righteousness.” In other words, no one—not unions and working-class people, not businesses and middle-class people, not settlement workers and other middle-class reformers—could claim to hold or ever could hold the highest moral ground. The absolute right did not exist.

For Addams, rejecting moral absolutism was a revolutionary act. She had long believed that a single true, moral way existed and that a person, in theory, could find it. This conviction was her paternal inheritance (one recalls her father’s Christian perfectionism) and her social-cultural inheritance. Moral absolutism was the rock on which her confident Anglo-American culture was grounded. (It is also the belief that most sets the nineteenth century in the West apart from the twenty-first century.) Now she was abandoning that belief. In the territory of her mind, tectonic plates were shifting and a new land mass of moral complexity was arising.

In the fall of 1894, as she was writing “The Settlement as a Factor,” this new perspective became her favorite theme. In October she warned the residents of another newly opened settlement, Chicago Commons, “not to be alarmed,” one resident recalled, “if we found our ethical standards broadening as we became better acquainted with the real facts of the lives of our neighbors.” That same month, speaking to supporters of the University of Chicago Settlement, she hinted again at the dangers of moral absolutism. Do not, she said, seek “to do good.” Instead, simply try to understand life. And when a group of young men from the neighborhood told her they proposed to travel to New York City that fall to help end political corruption and spoke disdainfully of those who were corrupt, she admonished them against believing that they were purer than others and asked them if they knew what harm they did in assuming that they were right and others were wrong.

What had she seen during the Pullman Strike that led to this new awareness? She had seen the destructive force of George Pullman’s moral self-righteousness. It seemed to her that his lack of self-doubt, that is, his unwillingness to negotiate, had produced a national tragedy; his behavior and its consequences had revealed the evil inherent in moral absolutism. In Twenty Years she writes of how, in the midst of the strike’s worst days, as she sat by her dying sister’s bedside, she was thinking about “that touch of self-righteousness which makes the spirit of forgiveness well-nigh impossible.”

She grounded her rejection of absolute truth in her experience. “Life teaches us,” she wrote. This was as revolutionary for her as the decision itself. In “Subjective Necessity” she had embraced experience as a positive teacher in a practical way. Here she was allowing experience to shape her ethics. The further implication was that ethics might evolve, but the point is not argued in “The Settlement as a Factor.” Still, in her eyes ideas no longer had the authority to establish truth that they once had. Her pragmatism was strengthening, but it had not yet blossomed into a full-fledged theory of truth.

The Pullman Strike taught her in a compelling way that moral absolutism was dangerous, but she had been troubled by its dangers before. She had made her own mistakes and, apparently, a whole train of them related to self-righteousness. The details have gone unrecorded, but they made her ready to understand, and not afterwards forget, something James O. Huntington, the Episcopal priest who had shared the podium with her at the Plymouth conference, had said in a speech at Hull House the year before the strike. “I once heard Father Huntington say,” she wrote in 1901, that it is “the essence of immorality to make an exception of one’s self.” She elaborated. “[T]o consider one’s self as . . . unlike the rank and file is to walk straight into the pit of self-righteousness.” As Addams interpreted Huntington, he meant there was no moral justification for believing in one’s superiority, not even a belief that one was right and the others wrong.

A deeply held, central moral belief is like a tent pole: it influences the shape of the entire tent that is a person’s thought. A new central belief is like a taller or shorter tent pole; it requires the tent to take a new shape. The tent stakes must be moved. Jane Addams had decided there was no such thing as something or someone that was purely right or purely wrong, but the rest of her thought had yet to be adjusted. Among other things, she still believed that a person of high culture was superior to those who lacked it; that is, she still believed that cultural accomplishment could justify self-righteousness.

Some hints of this can be found in the adjectives Addams attaches to democracy in “The Settlement as a Factor.” After proposing that the workers might lead the ethical movement of democracy, she anticipates the fear her readers might feel at this idea. “We must learn to trust our democracy,” she writes, “giant-like and threatening as it may appear in its uncouth strength and untried applications.” Addams was edging toward trusting that working-class people, people without the cultural training in “the best,” could set their own course. Such trust, should she embrace it, would require her to go beyond her old ideas—her enthusiasm for egalitarian social etiquette, for the principle of cooperation, and for the ideal of a unified humanity. Not feeling such trust yet, she was unable to give working people’s power a ringing endorsement. The essay is therefore full of warnings about the negative aspects of the labor movement.

These radical claims—that the labor movement was or could become ethical, that the movement was engaged in a struggle that advanced society morally, that capitalists were greedy and ethically compromised, and that there was no absolute right or wrong—opened up a number of complicated issues. Addams decided she needed to write a separate essay—would it be a speech?—to make these points more fully and to make them explicitly, as honesty compelled her to do, about the Pullman strike. Sometime in 1894, she began to write it. A page from the first draft, dated that year, survives with the title “A Modern Tragedy.” In its first paragraph she writes that, because we think of ourselves as modern, “it is hard to remember that the same old human passions persist” and can often lead to “tragedy.” She invited her readers to view “one of these great tragedies” from “the historic perspective,” to seek an “attitude of mental detachment” and “stand aside from our personal prejudices.” Still grieving over what had happened, Addams was hoping that the wisdom of culture, of the humanities, of Greek and Shakespearean tragedy could give her the comfort of emotional distance. But she had pulled too far back. The opening was so blandly vague and philosophical that no one could tell what the essay was about. She set the piece aside.

To read more about Citizen, click here.

Add a Comment
44. #UPWeek: Turabian Teacher Collaborative

9780226816319

 

Welcome to the third annual #UPWeek blog tour—we’re excited to contribute under Monday’s umbrella theme, “Collaboration,” with a post on the Turabian Teacher Collaborative. To get the ball rolling and further the mission, here’s where you can find other university presses, big and small, far and wide, posting on similarly synergetic projects today: the University Press of Colorado on veterinary immunology, the University of Georgia Press on the New Georgia Encyclopedia Project, Duke University Press on Eben Kirksey’s The Multispecies Salon, the University of California Press on Dr. Paul Farmer and Dr. Jim Yong Kim’s work on the Ebola epidemic in West Africa, the University of Virginia Press on their project Chasing Shadows (a special e-book and website devoted to Watergate-era Oval Office conversations), McGill-Queen’s University Press on the online gallery Landscape Architecture in Canada, Texas A & M University Press on a new consumer health advocacy series, Project MUSE on their history of collaboration, and Yale University Press on their Museum Quality Books series. Remember to follow #UPWeek on Twitter, and read on after the jump for the story of the Turabian Teacher Collaborative’s first two years.

***

One of the foundational principles of Kate Turabian’s classic writing guides is that research creates a community between writers and readers. Professors Joseph Williams and Gregory Colomb put the principle of a community into action when they collaborated several years ago to adapt Turabian’s guides for a new generation of student researchers. During their writing process, they circulated and reworked each other’s contributions so much that, “by the end of the process, no one could quite remember who had drafted what.”

Channeling the spirit of this “rotational” writing process, the Turabian Teacher Collaborative adds high school teachers and a university press into the mix of colleagues working to bring Turabian’s principles to a new audience. The University of Chicago Press developed this project with University of Iowa English education professors Bonnie Sunstein and Amy Shoultz, after determining that much in Turabian’s Student’s Guide to Writing College Papers aligns with the Common Core State Standards for English Language Arts. Sunstein and Shoultz suggested that the Press begin by inviting high school teachers to test the effectiveness of Turabian’s book, both at helping high schools meet the Common Core standards and at helping students become college ready.

To strategize for the project’s pilot year, participating teachers—from urban, rural, and suburban high schools in California, Illinois, Massachusetts, and Iowa—convened for a workshop at the Press in the summer of 2013. They all left equipped with a set of books and free classroom resources drawn from the book, including topic sheets and ELA Common Core–aligned lesson plans. Following the workshop, this team of teachers integrated these materials into their curricula and exchanged resources and insights on their experiences throughout the year. Later this month, several members of the Turabian Teacher Collaborative will share what they have learned with teachers from across the country at a workshop following the NCTE annual convention in Washington, DC.

And, of course, high school students are now part of the collaboration and its community of researchers, as they envision the needs of readers by engaging in peer review at every step of the writing process. As participating teacher Deb Aldrich of Kennedy High School in Cedar Rapids, Iowa, said of her students’ response to the book: “[They] acted as sounding boards, polite disagree-ers, questioners, cheerleaders, and empathizers. They would come to class and ask if we were meeting in our research groups today, which showed how much they valued participating in a real shared research conversation, not just an imaginary one in their heads. They acted and felt like academic researchers!”

The Press plans to use feedback like this to develop a teachers’ resource guide this year, as well as additional resources for research writing in future high school classrooms. As the collaborative moves into its second year, it is expanding to include high school teachers from across the disciplines who teach research and academic writing skills. Are you one of them? For more information, e-mail turabianteacher@press.uchicago.edu.

(in the spirit of #UPWeek, this post was collaboratively generated by University of Chicago Press staff members working with the TTC)

To learn more about the TTC project, click here.

Stay tuned for more from #UPWeek’s blog tour!

 

Add a Comment
45. UPWeek Day 2: Irina Baronova launch in pictures

Today is day two of #UPWeek, which considers the past, present, and future of scholarly publishing through pictures. Among posts dotting the web, you’ll find: a photographic history of Indiana University Press, documentation of 1950s and ’60s print publishing at Stanford University Press, a photo collage from Fordham University Press, a Q & A with art director Martha Sewell and short film of author and illustrator Val Kells at Johns Hopkins University Press, and images of the University Press of Florida through the years. With these surveys in mind, we’re happy to share a few snapshots from our own recent launch of Victoria Tennant’s Irina Baronova and the Ballets Russes de Monte Carlo at Peter Fetterman’s Gallery in Santa Monica, California (including a cameo by Norman Lear). Don’t forget to follow #UPWeek on Twitter to keep up with the AAUP’s celebration of university presses’ blogging culture.

***

[Photos from the Irina Baronova and the Ballets Russes de Monte Carlo launch at Peter Fetterman Gallery, Santa Monica]

To read more about Irina Baronova and the Ballets Russes de Monte Carlo, click here.

Add a Comment
46. #UPWeek: FF is really TBT

Today is the last day of #UPWeek—and with it goes another successful tour of university press blogs. On that note, Friday’s theme is one of following: What are your must-reads on the internet? Whom do you follow on social media? Which venues and scholars are doing it right? University of Illinois Press tracks the geopolitics of imagination, University of Minnesota Press (hi, Maggie!) author John Hartigan explains the foibles of scholars on social media, University of Nebraska Press delivers another social media primer, NYU Press teaches us Key Words in Cultural Studies, Island Press tracks the interests of its editors, and Columbia University Press talks their University Press Round-Up.

Us? We’re running with the idea that history and progress aren’t synonymously bound. The way forward with media is often the way back or through, or at least a trip to the past demonstrates that the seeds for new forms of mediation are (apologies for this) always already planted. I realize this makes Follow Friday a bit of Throwback Thursday, but here’s a great photo from UCP author Alan Thomas, making the rounds on Twitter, of the very first e-book we published. Richard A. Lanham’s The Electronic Word required 2 MB of RAM and a floppy disk reader, yet in its “out-of-timeness,” we can already see the othering of the book-as-object and our desire to store information in as portable (and small) a capacity as possible. Kindle Fire quivers. We keep moving.

[Photo: the floppy disk edition of Richard A. Lanham’s The Electronic Word, the Press’s first e-book]

 

For more on #UPWeek, follow the hashtag on Twitter.

Add a Comment
47. Top 40 Democracy

9780226896182

Eric Weisbard’s Top 40 Democracy: The Rival Mainstreams of American Music considers the shifting terrain of the pop music landscape, in which FM radio (once an indisputably dominant medium) constructed multiple mainstreams, tailoring each to target communities built on race, gender, class, and social identity. Charting (no pun intended) how categories rivaled and pushed against each other in their rise to reach American audiences, the book posits a counterintuitive notion: when even the blandest incarnation of a particular sub-group (the Isley Brothers version of R & B, for instance) rose to the top of the charts, so too did the visibility of that group’s culture and perspective, making musical formatting one of the master narratives of late-twentieth-century identity.

In a recent piece for the Sound Studies blog, Weisbard wrote about the rise of both Taylor Swift and, via mid-term elections, the Republican Party:

The genius, and curse, of the commercial-cultural system that produced Taylor Swift’s Top 40 democracy win in the week of the 2014 elections, is that its disposition is inherently centrist. Our dominant music formats, rival mainstreams engaged in friendly combat rather than culture war, locked into place by the early 1970s. That it happened right then was a response to, and recuperation from, the splintering effects of the 1960s. But also, a moment of maximum wealth equality in the U.S. was perfect to persuade sponsors that differing Americans all deserved cultural representation.

And, as Weisbard concludes:

Pop music democracy too often gives us the formatted figures of diverse individuals triumphing, rather than collective empowerment. It’s impressive what Swift has accomplished; we once felt that about President Obama, too. But she’s rather alone at the top.

To read more about Top 40 Democracy, click here.

 

Add a Comment
48. Excerpt: Top 40 Democracy

9780226896182

To follow up on yesterday’s post, here’s an excerpt from Eric Weisbard’s Top 40 Democracy: The Rival Mainstreams of American Music.

***

“The Logic of Formats”

Nearly every history of Top 40 launches from an anecdote about how radio station manager Todd Storz came up with the idea sometime between World War II and the early 1950s, watching with friends in a bar in Omaha as customers repeatedly punched up the same few songs on the jukebox. A waitress, after hearing the tunes for hours, paid for more listens, though she was unable to explain herself. “When they asked why, she replied, simply: ‘I like ’em.’ ” As Storz said on another occasion, “Why this should be, I don’t know. But I saw waitresses do this time after time.” He resolved to program a radio station following the same principles: the hits and nothing but the hits.

Storz’s aha moment has much to tell about Top 40’s complicated relationship to musical diversity. He might be seen as an entrepreneur with his ear to the ground, like the 1920s furniture salesman who insisted hillbilly music be recorded or the 1970s Fire Island dancer who created remixes to extend the beat. Or he could be viewed as a schlockmeister lowering standards for an inarticulate public, especially women —so often conceived as mass-cultural dupes. Though sponsored broadcasting had been part of radio in America, unlike much of the rest of the world, since its beginnings, Top 40 raised hackles in a postwar era concerned about the numbing effects of mass culture. “We become a jukebox without lights,” the Radio Advertising Bureau’s Kevin Sweeney complained. Time called Storz the “King of the Giveaway” and complained of broadcasting “well larded with commercials.”

Storz and those who followed answered demands that licensed stations serve a communal good by calling playlist catholicity a democracy of sound: “If the public suddenly showed a preference for Chinese music, we would play it . . . I do not believe there is any such thing as better or inferior music.” Top 40 programmer Chuck Blore, responding to charges that formats stifled creative DJs, wrote, “He may not be as free to inflict his musical taste on the public, but now, and rightfully, I think, the public dictates the popular music of the day.” Mike Joseph boasted, “When I first go into a market, I go into every record store personally. I’ll spend up to three weeks doing interviews, with an average of forty-five minutes each. And I get every single thing I can get: the sales on every configuration, every demo for every single, the gender of every buyer, the race of every buyer. . . . I follow the audience flow of the market around the clock.” Ascertaining public taste became a matter of extravagant claim for these professional intermediaries: broadcasting divided into “dayparts” to impact commuters, housewives, or students.

Complicating the tension between seeing formats as pandering or as deferring to popular taste was a formal quality that Top 40 also shared with the jukebox: it could encompass many varieties of hits or group a subset for a defined public. This duality blurred categories we often keep separate. American show business grew from blackface minstrelsy and its performative rather than innate notion of identity —pop as striking a pose, animating a mask, putting on style or a musical. More folk and genre-derived notions of group identity, by contrast, led to the authenticity-based categories of rock, soul, hip-hop, and country. Top 40 formats drew on both modes, in constantly recalibrated proportions. And in doing so, the logic of formats, especially the 1970s format system that assimilated genres, unsettled notions of real and fake music.

Go back to Storz’s jukebox. In the late 1930s, jukeboxes revived a record business collapsed by free music on radio and the Great Depression. Jack Kapp in particular, working for the US branch of British-owned Decca, tailored the records he handled to boom from the pack: swing jazz dance beats, slangy vernacular from black urban culture, and significant sexual frankness. This capitalized on qualities inherent in recordings, which separated sound from its sources in place, time, and community, allowing both new artifice — one did not know where the music came from, exactly — and new realism: one might value, permanently, the warble of a certain voice, suggesting a certain origin. Ella Fitzgerald, eroticizing the nursery rhyme “A-Tisket, A-Tasket” in 1938 on Decca, with Chick Webb’s band behind her, could bring more than a hint of Harlem’s Savoy Ballroom to a place like Omaha, as jukeboxes helped instill a national youth culture. Other jukeboxes highlighted the cheating songs of honky-tonk country or partying R&B: urban electrifications of once-rural sounds. By World War II, pop was as much these brash cross-genre jukebox blends as it was the Broadway-Hollywood-network radio axis promoting Irving Berlin’s genteel “White Christmas.”

Todd Storz’s notion of Top 40 put the jukebox on the radio. Records had not always been a radio staple. Syndicated network stations avoided “canned music”; record labels feared the loss of sales and often stamped “Not Licensed for Radio Broadcast” on releases. So the shift that followed television’s taking original network programming was twofold: local radio broadcasting that relied on a premade consumer product. Since there were many more records to choose from than network shows, localized Top 40 fed a broader trend that allowed an entrepreneurial capitalism — independent record-label owners such as Sam Phillips of Sun Records, synergists such as American Bandstand host Dick Clark, or station managers such as Storz — to compete with corporations like William Paley’s Columbia Broadcasting System, the so-called Tiffany Network, which included Columbia Records. The result, in part, was rock and roll, which had emerged sonically by the late 1940s but needed the Top 40 system to become dominant with young 45 RPM singles buyers by the end of the 1950s.

An objection immediately presents itself, one that will recur throughout this study: Was Top 40 rock and roll at all, or a betrayal of the rockabilly wildness that Sam Phillips’s roster embodied for the fashioning of safe teen idols by Dick Clark? Did the format destroy the genre? The best answer interrogates the question: Didn’t the commerce-first pragmatism of formatting, with its weak boundaries, free performers and fans inhibited by tighter genre codes? For Susan Douglas, the girl group records of the early 1960s made possible by Top 40 defy critics who claim that rock died between Elvis Presley’s army induction and the arrival of the Beatles. Yes, hits like “Leader of the Pack” were created by others, often men, and were thoroughly commercial. Yes, they pulled punches on gender roles even as they encouraged girls to identify with young male rebels. But they “gave voice to all the warring selves inside us struggling.” White girls admired black girls, just as falsetto harmonizers like the Beach Boys allowed girls singing along to assume male roles in “nothing less than musical cross-dressing.” Top 40’s “euphoria of commercialism,” Douglas argues, did more than push product; “tens of millions of young girls started feeling, at the same time, that they, as a generation, would not be trapped.” Top 40, like the jukebox before it and MTV afterward, channeled cultural democracy: spread it but contained it within a regulated, commercialized path.

We can go back further than jukebox juries becoming American Bandstands. Ambiguities between democratic culture and commodification are familiar within cultural history. As Jean-Christophe Agnew points out in his study Worlds Apart, the theater and the marketplace have been inextricable for centuries, caught up as capitalism developed in “the fundamental problematic of a placeless market: the problems of identity, intentionality, accountability, transparency, and reciprocity that the pursuit of commensurability invariably introduces into that universe of particulate human meanings we call culture.” Agnew’s history ranges from Shakespeare to Melville’s Confidence Man, published in 1857. At that point in American popular culture, white entertainers often performed in blackface, jumping Jim Crow and then singing a plaintive “Ethiopian” melody by Stephen Foster. Eric Lott’s book on minstrelsy gives this racial mimicry a handy catchphrase: Love and Theft. Tarred-up actors, giddy with the new freedoms of a white man’s democracy but threatened by industrial “wage slavery,” embodied cartoonish blacks for social comment and anti-bourgeois rudeness. Amid vicious racial stereotyping could be found performances that respectable theater disavowed. Referring to a popular song of the era, typically performed in drag, the New York Tribune wrote in 1853, “ ‘Lucy Long’ was sung by a white negro as a male female danced.” And because of minstrelsy’s fixation on blackness, African Americans after the Civil War found an entry of sorts into entertainment: as songwriter W. C. Handy unceremoniously put it, “The best talent of that generation came down the same drain. The composers, the singers, the musicians, the speakers, the stage performers —the minstrel shows got them all.” If girl groups showcase liberating possibility in commercial constraints, minstrelsy challenges unreflective celebration.

Entertainment, as it grew into the brashest industry of modernizing America, fused selling and singing as a matter of orthodoxy. The three-act minstrel show stamped formats on show business early on, with its song-and-dance opening, variety-act olio, and dramatic afterpiece, its interlocutors and end men. Such structures later migrated to variety, vaudeville, and Broadway. After the 1890s, tunes were supplied by Tin Pan Alley sheet-music publishers, who professionalized formula songwriting and invented “payola” — ethically dubious song plugging. These were song factories, unsentimental about creativity, yet the evocation of cheap tinniness in the name was deliberately outrageous, announcing the arrival of new populations — Siberian-born Irving Berlin, for example, the Jew who wrote “White Christmas.” Tin Pan Alley’s strictures of form but multiplicity of identity paved the way for the Brill Building teams who wrote the girl group songs, the Motown Records approach to mainstreaming African American hits, and even millennial hitmakers from Korean “K-Pop” to Sweden’s Cheiron Studios. Advertisers, Timothy Taylor’s history demonstrates, used popular music attitude as early as they could — sheet-music parodies, jingles, and the showmanship of radio hosts like crooner Rudy Vallee designed to give products “ginger, pep, sparkle, and snap.”

The Lucky Strike Hit Parade, a Top 40 forerunner with in-house vocalists performing the leading tunes, was “music for advertising’s sake,” its conductor said in 1941.

Radio, which arrived in the 1920s, was pushed away from a BBC model and toward what Thomas Streeter calls “corporate liberalism” by leaders like Herbert Hoover, who declared as commerce secretary, “We should not imitate some of our foreign colleagues with governmentally controlled broadcasting supported by a tax upon the listener.” In the years after the 1927 Radio Act, the medium consolidated around sponsor-supported syndicated network shows, successfully making radio present by 1940 in 86 percent of American homes and some 6.5 million cars, with average listening of four hours a day. The programming, initially local, now fused the topsy-turvy theatrics of vaudeville and minstrelsy —Amos ’n’ Andy ranked for years with the most popular programs —with love songs and soap operas aimed at the feminized intimacy of the bourgeois parlor. Radio’s mass orientation meant immigrants used it to embrace a mainstream American identity; women confessed sexual feelings for the likes of Vallee as part of the bushels of letters sent to favored broadcasters; and Vox Pop invented the “man on the street” interview, connecting radio’s commercialized public with more traditional political discourse and the Depression era’s documentary impulse. While radio scholars have rejected the view of an authoritarian, manipulative “culture industry,” classically associated with writers such as the Frankfurt School’s Theodor Adorno, historian Elena Razlogova offers an important qualification: “by the 1940s both commercial broadcasters and empirical social scientists . . . shared Adorno’s belief in expert authority and passive emotional listening.” Those most skeptical of mass culture often worked inside the beast.

Each network radio program had a format. So, for example, Kate Smith, returning for a thirteenth radio season in 1942, offered a three-act structure within each broadcast: a song and comedy slot, ad, drama, ad, and finally a segment devoted to patriotism —fitting for the singer of “God Bless America.” She was said by Billboard, writing with the slangy prose that characterized knowing and not fully genteel entertainment professionals, to have a show that “retains the format which, tho often heavy handed and obvious, is glovefit to keep the tremendous number of listeners it has acquired and do a terrific selling job for the sponsor”— General Foods. The trade journal insisted, “Next to a vocal personality, a band on the air needs a format —an idea, a framework of showmanship.”

Top 40 formats addressed the same need to fit broadcast, advertiser, and public, but through a different paradigm: what one branded with an on-air jukebox approach was now the radio station itself, to multiple sponsors. Early on, Top 40s competed with nonformat stations, the “full service” AM’s that relied on avuncular announcers with years of experience, in-house news, community bulletins, and songs used as filler. As formats came to dominate, with even news and talk stations formatted for consistent sound, competing sonic configurations hailed different demographics. But no format was pure: to secure audience share in a crowded market, a programmer might emphasize a portion of a format (Quiet Storm R&B) or blur formats (country crossed with easy listening). Subcategories proliferated, creating what a 1978 how-to book called “the radio format conundrum.” The authors, listing biz slang along the lines of MOR, Good Music, and Chicken Rock, explained, “Words are coined, distorted and mutilated, as the programmer looks for ways to label or tag a format, a piece of music, a frame of mind.”

A framework of showmanship in 1944 had become a frame of mind in 1978. Formats began as theatrical structures but evolved into marketing devices — efforts to convince sponsors of the link between a mediated product and its never fully quantifiable audience. Formats did not idealize culture; they sold it. They structured eclecticism rather than imposing aesthetic values. It was the customer’s money —a democracy of whatever moved people.

The Counterlogic of Genres

At about the same time Todd Storz watched the action at a jukebox in Omaha, sociologist David Riesman was conducting in-depth interviews with young music listeners. Most, he found, were fans of what was popular— uncritical. But a minority of interviewees disliked “name bands, most vocalists (except Negro blues singers), and radio commercials.” They felt “a profound resentment of the commercialization of radio and musicians.” They were also, Riesman reported, overwhelmingly male.

American music in the twentieth century was vital to the creation of what Grace Hale’s account calls “a nation of outsiders.” “Hot jazz” adherents raved about Louis Armstrong’s solos in the 1920s, while everybody else thought it impressive enough that Paul Whiteman’s orchestra could syncopate the Charleston and introduce “Rhapsody in Blue.” By the 1930s, the in-crowd were Popular Front aligned, riveted at the pointedly misnamed cabaret Café Society, where doormen had holes in their gloves and Billie Holiday made the anti-lynching, anti-minstrelsy “Strange Fruit” stop all breathing. Circa Riesman’s study, the hipsters Norman Mailer and Jack Kerouac would celebrate redefined hot as cool, seeding a 1960s San Francisco scene that turned hipsters into hippie counterculture.

But the urge to value music as an authentic expression of identity appealed well beyond outsider scenes and subcultures. Hank Williams testified, “When a hillbilly sings a crazy song, he feels crazy. When he sings, ‘I Laid My Mother Away,’ he sees her a-laying right there in the coffin. He sings more sincere than most entertainers because the hillbilly was raised rougher than most entertainers. You got to know a lot about hard work. You got to have smelt a lot of mule manure before you can sing like a hillbilly. The people who has been raised something like the way the hillbilly has knows what he is singing about and appreciates it.” Loretta Lynn reduced this to a chorus: “If you’re looking at me, you’re looking at country.” Soul, rock, and hip-hop offered similar sentiments. An inherently folkloric valuation of popular music, Karl Miller has written, “so thoroughly trounced minstrelsy that historians rarely discuss the process of its ascendance. The folkloric paradigm is the air that we breathe.”

For this study, I want to combine subcultural outsiders and identity-group notions of folkloric authenticity into a single opposition to formats: genres. If entertainment formats are an undertheorized category of analysis, though a widely used term, genres have been highly theorized. By sticking with popular music, however, we can identify a few accepted notions. Music genres have rules: socially constructed and accepted codes of form, meaning, and behavior. Those who recognize and are shaped by these rules belong to what pioneering pop scholar Simon Frith calls “genre worlds”: configurations of musicians, listeners, and figures mediating between them who collectively create a sense of inclusivity and exclusivity. Genres range from highly specific avant-gardes to scenes, industry categories, and revivals, with large genre “streams” to feed subgenres. If music genres cannot be viewed —as their adherents might prefer —as existing outside of commerce and media, they do share a common aversion: to pop shapelessness.

Deconstructing genre ideology within music can be as touchy as insisting on minstrelsy’s centrality: from validating Theft to spitting in the face of Love. Producer and critic John Hammond, progressive in music and politics, gets rewritten as the man who told Duke Ellington that one of his most ambitious compositions featured “slick, un-negroid musicians,” guilty of “aping Tin Pan Alley composers for commercial reasons.” A Hammond obsession, 1930s Mississippi blues guitarist Robert Johnson has his credentials to be called “King of the Delta Blues” and revered by the likes of Bob Dylan, Eric Clapton, and the Rolling Stones questioned by those who want to know why Delta blues, as a category, was invented and sanctified after the fact and how that undercut more urban and vaudeville-inflected, not to mention female, “classic” blues singers such as Ma Rainey, Mamie Smith, and Bessie Smith.

The tug-of-war between format and genre, performative theatrics and folkloric authenticity, came to a head with rock, the commercially and critically dominant form of American music from the late 1960s to the early 1990s. Fifties rock and roll had been the music of black as much as white Americans, southern as much as northern, working class far more than middle class. Rock was both less inclusive and more ideological: what Robert Christgau, aware of the politics of the shift from his first writing as a founding rock critic, called “all music deriving primarily from the energy and influence of the Beatles—and maybe Bob Dylan, and maybe you should stick pretensions in there someplace.” Ellen Willis, another pivotal early critic, centered her analysis of the change on the rock audience’s artistic affiliations: “I loved rock and roll, but I felt no emotional identification with the performers. Elvis Presley was my favorite singer, and I bought all his records; just the same, he was a stupid, slicked-up hillbilly, a bit too fat and soft to be really good-looking, and I was a middle-class adolescent snob.” Listening to Mick Jagger of the Rolling Stones was a far different process: “I couldn’t condescend to him — his ‘vulgarity’ represented a set of social and aesthetic attitudes as sophisticated as mine.”

The hippies gathered at Woodstock were Riesman’s minority segment turned majority, but with a difference. They no longer esteemed contemporary versions of “Negro blues singers”: only three black artists played Woodstock. Motown-style format pop was dismissed as fluff in contrast to English blues-rock and other music with an overt genre lineage. Top 40 met disdain, as new underground radio centered on “freeform”— meaning free of format. Music critics like Christgau, Willis, and Frith challenged these assumptions at the time, with Frith’s Sound Effects the strongest account of rock’s hypocritical “intimations of sincerity, authenticity, art — noncommercial concerns,” even as “rock became the record industry.” In a nation of outsiders, rock ruled, or as a leftist history, Rock ’n’ Roll Is Here to Pay, snarked, “Music for Music’s Sake Means More Money.” Keir Keightley elaborates, “One of the great ironies of the second half of the twentieth century is that while rock has involved millions of people buying a mass-marketed, standardized commodity (CD, cassette, LP) that is available virtually everywhere, these purchases have produced intense feelings of freedom, rebellion, marginality, oppositionality, uniqueness and authenticity.” In 1979, rock fans led by a rock radio DJ blew up disco records; as late as 2004, Kelefa Sanneh felt the need to deconstruct rock-ism in the New York Times.

Yet it would be simplistic to reduce rockism to its disproportions of race, gender, class, and sexuality. What fueled and fuels such attitudes toward popular music, ones hardly limited to rock alone, is the dream of music as democratic in a way opposite to how champions of radio formats justified their playlists. Michael Kramer, in an account of rock far more sympathetic than most others of late, argues that the countercultural era refashioned the bourgeois public sphere for a mass bohemia: writers and fans debated in music publications, gathered with civic commitment at music festivals, and shaped freeform radio into a community instrument. From the beginning, “hip capitalism” battled movement concerns, but the notion of music embodying anti-commercial beliefs, of rock as revolutionary or at least progressive, was genuine. The unity of the rock audience gave it more commercial clout: not just record sales, but arena-sized concerts, the most enduring music publication in Rolling Stone, and ultimately a Rock and Roll Hall of Fame to debate rock against rock and roll or pop forever. Discursively, if not always in commercial reality, this truly was the Rock Era.

The mostly female listeners of the Top 40 pop formats bequeathed by Storz’s jukebox thus confronted, on multiple levels, the mostly male listeners of a rock genre that traced back to the anti-commercial contingent of Riesman’s interviewees. A democracy of hit songs, limited by its capitalist nature, was challenged by a democracy of genre identity, limited by its demographic narrowness. The multi-category Top 40 strands I will be examining were shaped by this enduring tension.

Pop Music in the Rock Era

Jim Ladd, a DJ at the Los Angeles freeform station KASH-FM, received a rude awakening in 1969 when a new program director laid down some rules. “We would not be playing any Top 40 bullshit, but real rock ’n’ roll; and there was no dress code. There would, however, be something known as ‘the format.’ ” Ladd was now told what to play. He writes bitterly about those advising stations. “The radio consultant imposed a statistical grid over the psychedelic counterculture, and reduced it to demographic research. Do you want men 18–24, adults 18–49, women 35–49, or is your target audience teens? Whatever it may be, the radio consultant had a formula.” Nonetheless, the staff was elated when, in 1975, KASH beat Top 40 KHJ, “because to us, it represented everything that we were trying to change in radio. Top 40 was slick, mindless pop pap, without one second of social involvement in its format.” Soon however, KAOS topped KASH with a still tighter format: “balls-out rock ’n’ roll.”

Ladd’s memoir, for all its biases, demonstrates despite itself why it would be misleading to view rock/pop or genre/format dichotomies as absolute divisions. By the mid-1970s, album-oriented rock (AOR) stations, like soul and country channels, pursued a format strategy as much as Top 40 or AC, guided by consultants and quarterly ratings. Rock programmers who used genre rhetoric of masculine rebellion (“balls-out rock ’n’ roll”) still honored Storz’s precept that most fans wanted the same songs repeated. Stations divided listeners explicitly by age and gender and tacitly by race and class. The division might be more inclusive: adults, 18–49; or less so: men, 18–34. The “psychedelic counterculture” ideal of dropping out from the mass had faded, but so had some of the mass: crossover appeal was one, not always desirable, demographic. And genre longings remained, with Ladd’s rockist disparagement of Top 40 symptomatic: many, including those in the business, quested for “social involvement” and disdained format tyranny. If AOR was formatted à la pop, pop became more like rock and soul, as seen in the power ballad, which merged rock’s amplification of sound and self with churchy and therapeutic exhortation.

Pop music in the rock era encompassed two strongly appealing, sometimes connected, but more often opposed impulses. The logic of formats celebrated the skillful matching of a set of songs with a set of people: its proponents idealized generating audiences, particularly new audiences, and prided themselves on figuring out what people wanted to hear. To believe in formats could mean playing it safe, with the reliance on experts and contempt for audiences that Razlogova describes in an earlier radio era: one cliché in radio was that stations were never switched off for the songs they didn’t play, only the ones they did. But there were strong business reasons to experiment with untapped consumer segments, to accentuate the “maturation” of a buying group with “contemporary”— a buzzword of the times —music to match. To successfully develop a new format, like the urban contemporary approach to black middle-class listeners, marked a great program director or consultant, and market-to-market experimentation in playlist emphasis was constant. Record companies, too, argued that a song like “Help Me Make It through the Night,” Kris Kristofferson’s explicit 1971 hit for Sammi Smith, could attract classier listeners for the country stations that played it.

By contrast, the logic of genres —accentuated by an era of counterculture, black power, feminism, and even conservative backlash — celebrated the creative matching of a set of songs and a set of ideals: music as artistic expression, communal statement, and coherent heritage. These were not necessarily anti-commercial impulses. Songwriters had long since learned the financial reasons to craft a lasting Broadway standard, rather than cash in overnight with a disposable Tin Pan Alley ditty. As Keightley shows, the career artist, steering his or her own path, was adult pop’s gift to the rock superstars. Frank Sinatra, Chairman of the Board, did not only symbolically transform into Neil Young, driving into the ditch if he chose. Young actually recorded for Reprise Records, the label that Sinatra had founded in 1960, whose president, Mo Ostin, went on to merge it with, and run, the artist-friendly and rock-dominated major label Warner Bros. Records.

Contrast Ladd’s or Young’s sour view of formatting with Clive Davis, who took over as president of Columbia Records during the rise of the counterculture. Writing just after the regularizing of multiple Top 40 strands, Davis found the mixture of old-school entertainment and new-school pop categories he confronted, the tensions between format and genre, endlessly fascinating. He was happy to discourse on the reasons why an MOR release by Ray Conniff might outsell an attention-hogging album by Bob Dylan, then turn around and explain why playing Las Vegas had tainted the rock group Blood, Sweat & Tears by rebranding them as MOR. Targeting black albums, rather than singles, to music buyers intrigued him, and here he itemized how he accepted racial divisions as market realities, positioning funk’s Earth, Wind & Fire as “progressive” to white rockers while courting soul nationalists too. “Black radio was also becoming increasingly militant; black program directors were refusing to see white promotion men. . . . If a record is ripe to be added to the black station’s play list, but is not quite a sure thing, it is ridiculous to have a white man trying to convince the program director to put it on.”

The incorporation of genre by formats proved hugely successful from the 1970s to the 1990s. Categories of mainstream music multiplied, major record labels learned boutique approaches to rival indies in what Timothy Dowd calls “decentralized” music selling, and the global sounds that Israeli sociologist Motti Regev sums up as “pop-rock” fused national genres with a common international structure of hitmaking, fueled by the widespread licensing in the 1980s of commercial radio channels in countries formerly limited to government broadcasting. In 2000, I was given the opportunity, for a New York Times feature, to survey a list of the top 1,000 selling albums and top 200 artists by total US sales, as registered by SoundScan’s barcode-scanning process since the service’s introduction in 1991. The range was startling: twelve albums on the list by Nashville’s Garth Brooks, but also twelve by the Beatles and more than twenty linked to the gangsta rappers in N.W.A. Female rocker Alanis Morissette topped the album list, with country and AC singer Shania Twain not far behind. Reggae’s Bob Marley had the most popular back-catalogue album, with mammoth total sales for pre-rock vocalist Barbra Streisand and jazz’s Miles Davis. Even “A Horse with No Name” still had fans: America’s Greatest Hits made a top 1,000 list that was 30 percent artists over forty years old in 2000 and one-quarter 1990s teen pop like Backstreet Boys. Pop meant power ballads (Mariah Carey, Celine Dion), rock (Pink Floyd, Metallica, Pearl Jam), and Latin voices (Selena, Marc Anthony), five mellow new age Enya albums, and four noisy Jock Jams compilations.

Yet nearly all this spectrum of sound was owned by a shrinking number of multinationals, joined as the 1990s ended by a new set of vast radio chains like Clear Channel, allowed by a 1996 Telecommunications Act in the corporate liberal spirit of the 1927 policies. The role of music in sparking countercultural liberation movements had matured into a well-understood range of scenes feeding into mainstreams, or train-wreck moments by tabloid pop stars appreciated with camp irony by omnivorous tastemakers. The tightly formatted world that Jim Ladd feared and Clive Davis coveted had come to pass. Was this true diversity, or a simulation? As Keith Negus found when he spoke with those participating in the global pop order, genre convictions still pressed against format pragmatism. Rock was overrepresented at record labels. Genre codes shaped the corporate cultures that framed the selling of country music, gangsta rap, and Latin pop. “The struggle is not between commerce and creativity,” Negus concluded, “but about what is to be commercial and creative.” The friction between competing notions of how to make and sell music had resulted in a staggering range of product, but also intractable disagreements over that product’s value within cultural hierarchies.

To read more about Top 40 Democracy, click here.

49. Marked: Race, Crime, and Finding Work in an Era of Mass Incarceration

9780226644844

An excerpt from Marked: Race, Crime, and Finding Work in an Era of Mass Incarceration

by Devah Pager

***

Introduction

At the start of the 1970s, incarceration appeared to be a practice in decline. With incarceration criticized for its overuse and detrimental effects, practitioners and reformers looked to community-based alternatives as a more promising strategy for managing criminal offenders. A 1967 report published by the President’s Commission on Law Enforcement and Administration of Justice concluded: “Life in many institutions is at best barren and futile, at worst unspeakably brutal and degrading. The conditions in which [prisoners] live are the poorest possible preparation for their successful reentry into society, and often merely reinforces in them a pattern of manipulation or destructiveness.” The commission’s primary recommendation involved developing “more extensive community programs providing special, intensive treatment as an alternative to institutionalization for both juvenile and adult offenders.” Echoing this sentiment, a 1973 report by the National Advisory Commission on Criminal Justice Standards and Goals took a strong stand against the use of incarceration. “The prison, the reformatory, and the jail have achieved only a shocking record of failure. There is overwhelming evidence that these institutions create crime rather than prevent it.” The commission firmly recommended that “no new institutions for adults should be built and existing institutions for juveniles should be closed.” Following what appeared to be the current of the time, historian David Rothman in 1971 confidently proclaimed, “We have been gradually escaping from institutional responses and one can foresee the period when incarceration will be used still more rarely than it is today.”

Contrary to the predictions of the time, incarceration began a steady ascent, with prison populations expanding sevenfold over the next three decades. Today the United States boasts the highest rate of incarceration in the world, with more than two million individuals currently behind bars. Characterized by a rejection of the ideals of rehabilitation and an emphasis on “tough on crime” policies, the practice of punishment over the past thirty years has taken a radically different turn from earlier periods in history. Reflecting the stark shift in orientation, the U.S. Department of Justice released a report in 1992 stating “there is no better way to reduce crime than to identify, target, and incapacitate those hardened criminals who commit staggering numbers of violent crimes whenever they are on the streets.” Far removed from earlier calls for decarceration and community supervision, recent crime policy has emphasized containment and harsh punishment as a primary strategy of crime control.

The revolving door

Since the wave of “tough on crime” rhetoric spread throughout the nation in the early 1970s, the dominant concern of crime policy has been getting criminals off the streets. Surprisingly little thought, however, has gone into developing a longer-term strategy for coping with criminal offenders. With more than 95 percent of those incarcerated eventually released, the problems of offender management do not end at the prison walls. According to one estimate, there are currently more than twelve million ex-felons in the United States, representing roughly 9 percent of the male working-age population. The yearly influx of returning inmates is double the current number of legal immigrants entering the United States from Mexico, Central America, and South America combined.

Despite the vast numbers of inmates leaving prison each year, little provision has been made for their release; as a result, many do not remain out for long. Of those recently released, nearly two-thirds will be charged with new crimes, and more than 40 percent will return to prison within three years. In fact, the revolving door of the prison has now become its own source of growth, with the faces of former inmates increasingly represented among annual admissions to prison. By the end of the 1990s, more than a third of those entering state prison had been there before.

The revolving door of the prison is fueled, in part, by the social contexts in which crime flourishes. Poor neighborhoods, limited opportunities, broken families, and overburdened schools each contribute to the onset of criminal activity among youth and its persistence into early adulthood. But even beyond these contributing factors, evidence suggests that experience with the criminal justice system in itself has adverse consequences for long-term outcomes. In particular, incarceration is associated with limited future employment opportunities and earnings potential, which themselves are among the strongest predictors of desistance from crime. Given the immense barriers to successful reentry, it is little wonder that such a high proportion of those released from prison quickly make their way back through the prison’s revolving door.

The criminalization of young black men

As the cycle of incarceration and release continues, an ever greater number of young men face prison as an expected marker of adulthood. But the expansive reach of the criminal justice system has not affected all groups equally. More than any other group, African Americans have felt the impact of the prison boom, comprising more than 40 percent of the current prison population while making up just 12 percent of the U.S. population. At any given time, roughly 12 percent of all young black men between the ages of twenty-five and twenty-nine are behind bars, compared to less than 2 percent of white men in the same age group; roughly a third are under criminal justice supervision. Over the course of a lifetime, nearly one in three young black men (and well over half of young black high school dropouts) will spend some time in prison. According to these estimates, young black men are more likely to go to prison than to attend college, serve in the military, or, in the case of high school dropouts, be in the labor market. Prison is no longer a rare or extreme event among our nation’s most marginalized groups. Rather, it has now become a normal and anticipated marker in the transition to adulthood.

There is reason to believe that the consequences of these trends extend well beyond the prison walls, with widespread assumptions about the criminal tendencies among blacks affecting far more than those actually engaged in crime. Blacks in this country have long been regarded with suspicion and fear; but unlike progressive trends in other racial attitudes, associations between race and crime have changed little in recent years. Survey respondents consistently rate blacks as more prone to violence than any other American racial or ethnic group, with the stereotype of aggressiveness and violence most frequently endorsed in ratings of African Americans. The stereotype of blacks as criminals is deeply embedded in the collective consciousness of white Americans, irrespective of the perceiver’s level of prejudice or personal beliefs.

While it would be impossible to trace the source of contemporary racial stereotypes to any one factor, the disproportionate growth of the criminal justice system in the lives of young black men (and the corresponding media coverage of this phenomenon, which presents an even more skewed representation) has likely played an important role. Experimental research shows that exposure to news coverage of a violent incident committed by a black perpetrator not only increases punitive attitudes about crime but further increases negative attitudes about blacks generally. The more exposure we have to images of blacks in custody or behind bars, the stronger our expectations become regarding the race of assailants or the criminal tendencies of black strangers.

The consequences of mass incarceration, then, may extend far beyond the costs to the individual bodies behind bars, to the families that are disrupted, and to the communities whose residents cycle in and out. The criminal justice system may itself legitimate and reinforce deeply embedded racial stereotypes, contributing to the persistent chasm in this society between black and white.

The credentialing of stigma

The phenomenon of mass incarceration has filtered into the public consciousness through cycles of media coverage and political debates. But a more lasting source of information detailing the scope and reach of the criminal justice system is generated internally by state courts and departments of corrections. For each individual processed through the criminal justice system, police records, court documents, and corrections databases detail dates of arrest, charges, conviction, and terms of incarceration. Most states make these records publicly available, often through on-line repositories, accessible to employers, landlords, creditors, and other interested parties. With increasing numbers of occupations, public services, and other social goods becoming off-limits to ex-offenders, these records can be used as the official basis for eligibility determination or exclusion. The state in this way serves as a credentialing institution, providing official and public certification of those among us who have been convicted of wrongdoing. The “credential” of a criminal record, like educational or professional credentials, constitutes a formal and enduring classification of social status, which can be used to regulate access and opportunity across numerous social, economic, and political domains.

Within the employment domain, the criminal credential has indeed become a salient marker for employers, with increasing numbers using background checks to screen out undesirable applicants. The majority of employers claim that they would not knowingly hire an applicant with a criminal background. These employers appear less concerned about the specific information conveyed by a criminal conviction and its bearing on a particular job; rather, they view this credential as an indicator of general employability or trustworthiness. Well beyond the single incident at its origin, the credential comes to stand for a broader internal disposition.

The power of the credential lies in its recognition as an official and legitimate means of evaluating and classifying individuals. The negative credential of a criminal record represents one such tool, offering formal certification of the offenders among us and official notice of those demographic groups most commonly implicated. To understand fully the impact of this negative credential, however, we must rely on more than speculation as to when and how these official labels are invoked as the basis for enabling or denying opportunity. Because credentials are often highly correlated with other indicators of social status or stigma (e.g., race, gender, class), we must examine their direct and independent impact. In addition, credentials may affect certain groups differently than others, with the official marker of criminality carrying more or less stigma depending on the race of its bearer. As increasing numbers of young men are marked by their contact with the criminal justice system, it becomes a critical priority to understand the costs and consequences of this now prevalent form of negative credential.

What do we know about the consequences of incarceration?

Despite the vast political and financial resources that have been mobilized toward prison expansion, very little systematic attention has been focused on the potential problems posed by the large and increasing number of inmates being released each year. A snapshot of ex-offenders one year after release reveals a rocky path of reintegration, with rates of joblessness in excess of 75 percent and rates of rearrest close to 45 percent. But one simple question remains unanswered: Are the employment problems of ex-offenders caused by their offender status, or does this population simply comprise a group of individuals who were never very successful at mainstream involvement in the first place? This question is important, for its answer points to one of two very different sets of policy recommendations. To the extent that the problems of prisoner reentry reflect the challenges of a population poorly equipped for conventional society, our policies would be best targeted toward some combination of treatment, training, and, at the extreme, containment. If, on the other hand, the problems of prisoner reentry are to some degree caused by contact with the criminal justice system itself, then a closer examination of the (unintended) consequences of America’s war on crime may be warranted. Establishing the nature of the relationship between incarceration and subsequent outcomes, then, is critical to developing strategies best suited to address this rapidly expanding ex-offender population.

In an attempt to resolve the substantive and methodological questions surrounding the consequences of incarceration, this book provides both an experimental and an observational approach to studying the barriers to employment for individuals with criminal records. The first stage observes the experiences of black and white job seekers with criminal records in comparison to equally qualified nonoffenders. In the second stage, I turn to the perspectives of employers in order to better understand the concerns that underlie their hiring decisions. Overall, this study represents an accounting of trends that have gone largely unnoticed or underappreciated by academics, policy makers, and the general public. After thirty years of prison expansion, only recently has broad attention turned to the problems of prisoner reentry in an era of mass incarceration. By studying the ways in which the mark of a criminal record shapes and constrains subsequent employment opportunities, this book sheds light on a powerful, emergent mechanism of labor market stratification. Further, this analysis recognizes that an investigation of incarceration in the contemporary United States would be inadequate without careful attention to the dynamics of race. As described earlier, there is a strong link between race and crime, both real and perceived, and yet the implications of this relationship remain poorly understood. This study takes a hard look at the labor market experiences of young black men, both with and without criminal pasts. In doing so, we gain a close-up view of the powerful role race continues to play in shaping the labor market opportunities available to young men. The United States remains sharply divided along color lines. Understanding the mechanisms that perpetuate these divisions represents a crucial step toward their resolution.

To read more about Marked, click here.

50. Free e-book for December: Swordfish

9780226922904

Our free e-book for December is renowned marine biologist Richard Ellis’s Swordfish: A Biography of the Ocean Gladiator.
***
A perfect fish in the evolutionary sense, the broadbill swordfish derives its name from its distinctive bill—much longer and wider than the bill of any other billfish—which is flattened into the sword we all recognize. And though the majesty and allure of this warrior fish have commanded much attention—from adventurous sportfishers eager to land one to ravenous diners eager to taste one—no one has yet been bold enough to truly take on the swordfish as a biographer. Who better to do so than Richard Ellis, a master of marine natural history? Swordfish: A Biography of the Ocean Gladiator is his masterly ode to this mighty fighter.
The swordfish, whose scientific name means “gladiator,” can take on anyone and anything, including ships, boats, sharks, submarines, divers, and whales, and in this book Ellis regales us with tales of its vitality and strength. Ellis makes it easy to understand why it has inspired so many to take up the challenge of epic sportfishing battles as well as the longline fishing expeditions recounted by writers such as Linda Greenlaw and Sebastian Junger. Ellis shows us how the bill is used for defense—contrary to popular opinion it is not used to spear prey, but to slash and debilitate, like a skillful saber fencer. Swordfish, he explains, hunt at the surface as well as thousands of feet down in the depths, and like tuna and some sharks, have an unusual circulatory system that gives them a significant advantage over their prey, no matter the depth in which they hunt. Their adaptability enables them to swim in waters the world over—tropical, temperate, and sometimes cold—and the largest ever caught on rod and reel was landed in Chile in 1953, weighing in at 1,182 pounds (and this heavyweight fighter, like all the largest swordfish, was a female).
Ellis’s detailed, fascinating, fact-filled biography takes us behind the swordfish’s huge, cornflower-blue eyes and provides a complete history of the fish from prehistoric fossils to its present-day endangerment, as our taste for swordfish has had a drastic effect on their population the world over. Throughout, the book is graced with many of Ellis’s own drawings and paintings, which capture the allure of the fish and bring its splendor and power to life for armchair fishermen and landlocked readers alike.
To download your free copy, click here.
