Americans tend to see negative campaign ads as just that: negative. Pundits, journalists, voters, and scholars frequently complain that such ads undermine elections and even democratic government itself. But John G. Geer here takes the opposite stance, arguing that when political candidates attack each other, raising doubts about each other’s views and qualifications, voters—and the democratic process—benefit.
In Defense of Negativity, Geer’s study of negative advertising in presidential campaigns from 1960 to 2004, asserts that the proliferating attack ads are far more likely than positive ads to focus on salient political issues, rather than politicians’ personal characteristics. Accordingly, the ads enrich the democratic process, providing voters with relevant and substantial information before they head to the polls.
An important and timely contribution to American political discourse, In Defense of Negativity concludes that if we want campaigns to grapple with relevant issues and address real problems, negative ads just might be the solution.
“Geer has set out to challenge the widely held belief that attack ads and negative campaigns are destroying democracy. Quite the opposite, he argues in his provocative new book: Negativity is good for you and for the political system. . . . In Defense of Negativity adds a new argument to the debate about America’s polarized politics, and in doing so it asserts that voters are less bothered by today’s partisan climate than many believe. If there are problems—and there are—Geer says it’s time to stop blaming it all on 30-second spots.”—Washington Post
Download your free copy of In Defense of Negativity here.
Watch “The Bear,” one of those 30-second spots (less an attack ad, and more a foray into American surrealism) produced for Ronald Reagan’s 1984 presidential campaign, below:
When you think about Wikipedia, you might not immediately envision it as a locus for a political theory of openness—and that might well be due to a cut-and-paste utopian haze that masks the site’s very real politicking around issues of shared decision-making, administrative organization, and the push for and against transparencies. In Wikipedia and the Politics of Openness, forthcoming this December, Nathaniel Tkacz cuts through the glow and establishes how issues integral to the concept of “openness” play themselves out in the day-to-day reality of Wikipedia’s existence. Recently, critic Alan Liu, whose prescient scholarship on the relationship between our literary/historical and technological imaginations has shaped much of the humanities turn to new media, endorsed the book via Twitter:
With that in mind, the book’s jacket copy furthers a frame for Tkacz’s argument:
Few virtues are as celebrated in contemporary culture as openness. Rooted in software culture and carrying more than a whiff of Silicon Valley technical utopianism, openness—of decision-making, data, and organizational structure—is seen as the cure for many problems in politics and business.
But what does openness mean, and what would a political theory of openness look like? With Wikipedia and the Politics of Openness, Nathaniel Tkacz uses Wikipedia, the most prominent product of open organization, to analyze the theory and politics of openness in practice—and to break its spell. Through discussions of edit wars, article deletion policies, user access levels, and more, Tkacz enables us to see how the key concepts of openness—including collaboration, ad-hocracy, and the splitting of contested projects through “forking”—play out in reality.
The resulting book is the richest critical analysis of openness to date, one that roots media theory in messy reality and thereby helps us move beyond the vaporware promises of digital utopians and take the first steps toward truly understanding what openness does, and does not, have to offer.
Read more about Wikipedia and the Politics of Openness, available December 2014, here.
By: Kristi McGuire,
Blog: The Chicago Blog
In wrapping up the year’s best-of-2012 lists, we couldn’t help but single out the University of Chicago Press titles that made the cut as reads worth remembering. With that in mind, here’s a list of our books that earned praise as cream of the crop here and abroad, from scholarly journals, literary blogs, metropolitan newspapers, and the like. If you’re looking, might we (and others) recommend—
A Naked Singularity by Sergio De La Pava
A World in One Cubic Foot: Portraits of Biodiversity by David Liittschwager
Alive in the Writing: Crafting Ethnography in the Company of Chekhov by Kirin Narayan
And Bid Him Sing: A Biography of Countée Cullen by Charles Molesworth
The Art of Medicine: 2,000 Years of Images and Imagination by Julie Anderson, Emm Barnes, and Emma Shackleton
Bewilderment by David Ferry
Book Was There: Reading in Electronic Times by Andrew Piper
Dangerous Work: Diary of an Arctic Adventure by Arthur Conan Doyle
Dreaming in French: The Paris Years of Jacqueline Bouvier Kennedy, Susan Sontag, and Angela Davis by Alice Kaplan
The Last Walk: Reflections on Our Pets at the End of Their Lives by Jessica Pierce
Other Criteria: Confrontations with Twentieth-Century Art by Leo Steinberg
- announced as a book of the year by the Art Newspaper (originally published in 2007: TIME WARP)
The Romantic Machine: Utopian Science and Technology after Napoleon by John Tresch
The Sounding of the Whale: Science and Cetaceans in the Twentieth Century by D. Graham Burnett
The Structure of Scientific Revolutions: Fiftieth Anniversary Edition by Thomas S. Kuhn
- made Nature magazine’s Top Twelve of 2012 list
The Timeline of Presidential Elections: How Campaigns Do (And Do Not) Matter by Robert S. Erikson and Christopher Wlezien
Vegetables: A Biography by Evelyne Bloch-Dano
- included as one of the best books of 2012 by Audubon magazine
You’ll Know When You Get There: Herbie Hancock and the Mwandishi Band by Bob Gluck
The devastation wrought by Hurricane Sandy to the Mid-Atlantic and Northeastern United States, the Caribbean, and Eastern Canada continues to exceed early damage estimates, with almost 66 billion dollars in losses currently anticipated for the US alone, and a death toll of 253 across seven nations. In his recent book The Human Shore: Seacoasts in History, John R. Gillis articulates—and even anticipates—how our relationship to the sea has begun to take on new and potentially catastrophic dimensions. Accounting for more than 100,000 years of seaside civilization, Gillis argues that in spite of mass movement to the coasts in the last half-century, we have forgotten how to live with our oceans. Applying this knowledge to our tenuous responses to this most recent disaster, Gillis explains how a shift in education, awareness, and planning might yet allow us to learn the lessons necessary for sustainable coinhabitance with the seas. You can read more of his thoughts on what we can do below.
“History Has Lessons for Post-Sandy America” by John R. Gillis
In the wake of Hurricane Sandy, Americans are finally beginning to ask themselves whether or not it might be advisable to build up to the edge of the sea. It is dawning on us that we are dealing with a human-made rather than natural disaster. The surge of populations to the sea has been accelerating in recent decades and losses have begun to mount astronomically as expensive properties, encouraged by federally-subsidized insurance, crowd the seashore. On American coasts, a culture of coping—the product of thousands of years of human habitation, on shores that began in prehistoric Africa and ultimately circled the globe—is rapidly vanishing.
Our ancestors knew not just how to live on the sea, but with it. They came there to enjoy the most productive environment the world could offer: in terms of what the land could provide, as well as the even-richer marine biota located just offshore. First as hunters and gatherers, and later assisted by sail and ultimately steam, coastal societies generated social and economic resources greater than their inland neighbors. In the early modern period, it was by means of seaborne empires that Europe extended its world dominance. The United States was born coastal, discovered and settled by sea. In 1837, Alexis de Tocqueville predicted for the young republic a glorious maritime future. The opening up of the North American continent ultimately turned this country inward, but it has always been multishored, facing out toward the Atlantic, Pacific, and the Gulf.
America’s native people had been farmer-fishers. The Europeans who followed them were similar in their orientation to both land and sea. These settlers, like the hunter-gatherers they replaced, were highly mobile, moving alongshore in search of their livelihood. They built dwellings of light, transportable materials and when they settled permanently, they confined themselves well back from the sea, often facing away from what they knew to be its ever-present dangers. These people were not risk-averse, but they were well-informed and cautious about the ways of the sea. Their beaches were strewn with wrecks, as testimony to the uncontrolled power of the oceans to take, as well as give, life. They did not ask to be rescued but instead coped as communities.
In the late twentieth century, older coastal inhabitants have been largely displaced by interior populations who have come to shore to recreate rather than earn their livings. These new residents have confined the fishers to a few small ports, taking over the beaches between, and clearing away even the memories of working life, not to mention the life-and-death struggles that once played out on the seas. Today, the beach is supposed to be the place where we get away from the world—and even the thought of its troubles. Fishing villages have now been turned into some of the world’s highest-priced real estate, forcing fishers and clammers to live elsewhere, as they commute to the few working waterfronts that still exist. In most places, these have been replaced by what John Cheever called a “second shore,” ports of “antique shops, restaurants, and tea shops.”
Gone are not only the old coastal peoples but their well-developed cultures of risk and coping. Risk has been displaced to the national treasury; coping is left to governments at the state and federal levels. This new coastal generation no longer knows how to live lightly on the shores or how to construct portable buildings that can be removed from the path of danger. Earlier generations knew the sea to be an ever-present risk, but did not treat it as an enemy from which there can be no retreat. Americans now fly flags in the face of hurricanes and resist the pulling back of lighthouses threatened by beach erosion as a betrayal of national sovereignty.
The first response of politicians to Sandy—to restore and rebuild in place—was not at all promising, but there is still time for wiser counsel. Already there have been calls for risk to be gradually shifted from the government to property owners. Instead of quick fixes like manufacturing bigger sea walls and expensive storm barriers, we can wait for nature to do its part by rebuilding barrier islands and wetlands. But we also need to do our part by educating the public on the history and culture of risk and coping. We can do the first by taking financial responsibility for our own mistakes. The second can be accomplished by sensible coastal planning and new building codes that are informed by the history of local resilience, which has much to offer if we are only willing to consult its long record.
John R. Gillis is the author of The Human Shore: Seacoasts in History; Islands of the Mind; A World of Their Own Making: Myth, Ritual, and the Quest for Family Values; and Commemorations. A professor of history emeritus at Rutgers University, he now divides his time between two coasts: Northern California and Maine.
The 2012 class of Guggenheim Fellows was announced this week by the John Simon Guggenheim Memorial Foundation, inciting some exuberant responses on the part of several winners (check out Terry Teachout’s Twitter feed). The Guggenheim has long been hailed as the “mid-career award,” honoring scholars, scientists, poets, artists, and writers, who have likely published a book or three, professed a fair amount of research, and are actively engaged in projects of significant scope. The fellowship possesses some tortured origins—(John) Simon Guggenheim, who served as president of the American Smelting and Refining Company and Republican senator from Colorado, seeded the award (1925) following the death of his son John (1922) from mastoiditis (Guggenheim’s second son George later committed suicide, and more infamously his older brother Benjamin went down with the Titanic). Among this year’s crop (we dare say more forward-leaning than previous years?) is a roster of standout “professionals who have demonstrated exceptional ability by publishing a significant body of work in the fields of natural sciences, social sciences, humanities, and the creative arts,” affiliated with the University of Chicago Press: Creative Arts Christian Wiman, editor of Poetry magazine and author of three poetry collections, coeditor of The Open Door: [...]
What follows below is a list of proper nouns mentioned by Alan Gilbert, author of Black Patriots and Loyalists: Fighting for Emancipation in the War for Independence, during an interview with 3 AM magazine:
Richard Gilbert, United States, Harvard, World War II, Wobblies, Schenley Industries, New York, Ayub Khan, Pakistan, Little Rock, Central High, New York Times, South Africa, Emma, Democrats, Taj, Americans, Adamjee, East Pakistan, West Pakistan, Ashraf Adamjee, Wouter Tim, Marx, Indian Ocean, Chestertown, Maryland, Freedom Summer, Walden School, New York, Andy Goodman, James Cheney, Michael Schwerner, Vietnam War, Bernard Fall, Denis Warner, Jean Lacouture, Stanley Hoffmann, Barrington Moore, French, German, English, Government 1a, Carl Friedrich, Max Weber, Adam Smith, Emile Durkheim, Sigmund Freud, David Hume, I. F. Stone, Herbert Marcuse, McGeorge Bundy, May 2nd Movement, London School of Economics, Ralph Miliband, Labour Party, Ecole Normale, Paris, Althusser, Montesquieu, Das Kapital, England, Michael Walzer, Dita Skhlar, Artistotle, Hilary Putnam, John Rawls, Dick Boyd, SDS, Alan Garfinkel, Forms of Explanation, Norm Daniels, Cornell, Nick Sturgeon, Richard Miller, David Lyons, American Council of Learned Societies, Marx’s Politics: Communists and Citizens, Leo Strauss, Karl Loewith, the Right, Adolf Hitler, Plato, Thomas Hobbes, J. J. Rousseau, Alex Rosenberg, the Iliad, Simone Weil, Chicago, Africa, Obama, Bin Laden, Goldman Sachs, Tunisia, Egypt, Greece, Spain, Occupy, Carl Schmitt, Political Theology, Karl Loewith, Constitution, Bob Goldwin, Mike Malbin, Dick Cheney, Scott Horton, Eugene Shepperd, Michael Zank, William Altman, Shylock, Fagin, Hannah Arendt, Martin Heidegger, Brown vs. Board of Education, Charles Percy, Cuba, the American President, Bradley Manning, Iraq War, Americanism, Evangelicism, Bill Clinton, George W. 
Bush, Awlaki, Jack Balkin, Ron Paul, British Tories, Andrew Sullivan, Bob Barr, Condi Rice, Democratic Individuality, Magna Carta, Law Lords, Catholic Church, Spirit of the Laws, Gilbert Harman, Socrates, Meno, American South, Ku Klux Klan, John Woolman, John Laurens, Thomas Peters, Black Patriots and Loyalists: Fighting for Emancipation in the War for Independence, Lincoln, Thomas Hobbes, George Washington, Gabriel Prosser, the Republic, Thrasymachus, the Adamjee Jute Mill, Brian Leiter, Bangladesh, Hilary Putnam, Democratic Individuality, Martin Luther King, Thich Nat Hanh, Vienna, Jean-Paul Sartre, Charles Taylor, Vichy, the Riviera, G. A. Cohen, the Communist Manifesto, Engels, the Eighteenth Brumaire, Genealogy of Morals, Politics as a Vocation, 11th Thesis on Feuerbach, Mayor Bloomberg, Franklin Delano Roosevelt, National Labor Relations Act, Flint, San Francisco, Harry Bridges, National Guard, Wisconsin, May Day, the Second International, Haymarket, Civil Rights Acts, Vincent Harding, Memphis, Must Global Politics Constrain Democracy, International Working Men’s Association, Hans Morganthau, George Kennan, Robert Gilpin, Robert Keohane, John Brown, Frederick Douglass, Walt Whitman, Henry David Thoreau, Ralph Waldo Emerson, Michel Foucault, A Theory of Justice, Rupert Murdoch, Jacopo Arbenz, Guatemala, ITT, Salvador Allende, Law of Peoples, David Levine, Blackwater, Xe Corporation, Yitzhak Perlman, Stradivarius, C. P. Snow, Henry Giroux, Max Planck, Denver, Koch Brothers, National Public Radio, Mitt Romney,
Forty-eight years ago today, then-president Lyndon Johnson formally introduced his platform for the “Great Society” at the University of Michigan’s commencement in Ann Arbor on May 22, 1964. Coined by speechwriter Richard N. Goodwin (who also wrote for Robert F. Kennedy—he’s still living, and is the spouse of historian Doris Kearns Goodwin), the Great Society sponsored a series of social initiatives that helped Johnson win election later that fall in a landslide victory, and many of them—decades later—remain with us today, including Medicaid, Medicare, and the Older Americans Act.
Several agencies and institutions were first endowed by Great Society–funded legislation, including the National Endowment for the Arts, the National Endowment for the Humanities, and the Hirshhorn Museum and Sculpture Garden. Among the landmark legislation passed in Johnson’s term was the Economic Opportunity Act of 1964, the Civil Rights Act of 1964, the Voting Rights Act of 1965, and the Civil Rights Act of 1968; the Social Security Act (1965), the Food Stamp Act (1964), and the Immigration and Nationality Act (1965); and, the Elementary and Secondary Education Act (1965), the Higher Education Act (1965), and the Bilingual Education Act (1968). The Cigarette Labeling Act. The Motor Vehicle Safety Act. The Clean Air, Water Quality, and Clean Water Restoration Acts and Amendments. These legislative endeavors, voted into law by the Eighty-Ninth Congress (the Johnson Administration submitted eighty-seven bills to Congress, and Johnson signed eighty-four, or 96 percent, perhaps the most successful legislative agenda in U.S. Congressional history), were imperative enough to twentieth-century American life that we don’t need to footnote their contributions (or, sluggish sigh: maybe we do). So, too, with the organizations that sprang up through GS–helmed research initiatives and public partnerships: Head Start, the Job Corps, VISTA, the Corporation for Public Broadcasting, and National Public Radio.
W. J. T. Mitchell, writing in Cloning Terror: The War of Images, 9/11 to the Present, writes the following on the origins of the phrase “war on terror,” tracing it to the nineteenth-century’s “war on tuberculosis”:
The metaphor was updated by Lyndon Johnson’s “war on poverty” and Richard Nixon’s “war on drugs” (another war, incidentally, that has proved to be endless and unwinnable). All these “wars” were properly understood in quotation marks, as serious efforts to solve systemic problems in public health. LBJ did not envision the bombing of poor neighborhoods as the way to conduct a war on poverty. (The drug war, on the other hand, is well on its way down the slippery slope toward literalization as military action.)
From Politicians Don’t Pander: Political Manipulation and the Loss of Democratic Responsiveness by Lawrence R. Jacobs and Robert Y. Shapiro:
The Declaration of Independence was animated by a demand for “consent of the governed” and the promise of popular control has inspired a long and, at times, violent struggle for the right to vote by all Americans, the full and equal right to freedom of speech and assembly, and other essential rights.
Does the American government respond to the broad public or to the interests and values of narrowly constituted groups committed to advancing their private policy agendas? On one side lies democratic accountability; on the other a closed and insular government that is ill-suited to address the wishes or wants of most citizens. When politicians persistently disregard the public’s policy preferences, popular sovereignty and representative democracy are threatened.
The responsiveness of national policymakers to what most Americans prefer has declined and remained low for almost two decades.
Can we rely on competitive elections to fend off muted responsiveness to centrist opinion? After all, congressional Democrats suffered stunning setbacks in the 1994 elections following Clinton’s campaign for an unpopular health care reform plan and the Republicans’ congressional majorities were reduced in the 1996 and 1998 elections after they pursued policies that defied strong public preferences. We argue that electoral punishment may not be enough to improve the public’s influence on government: the responsiveness of national policymakers to what most Americans prefer has declined and remained low for almost two decades despite electoral setbacks to Democrats and Republicans. Politicians have worked hard to obscure their true positions and to distort the positions of their opponents, which makes it hard for the electorate to identify the policy positions of elected officials and to punish politicians for pursuing unpopular policies. In addition, most members of Congress today attach greater electoral importance to following the policy goals of party activists than responding to centrist opinion. The bottom line is that most politicians are keenly motivated and amply skilled at evading electoral accountability for long periods. Their success has impaired our system of accountability and sullied the quality of citizenship by eroding public trust and fuelling the news media’s increasing focus on political conflict and strategy rather than on the substantive issues raised by government policy.
Our analysis should not be confused, however, with naive populism. We recognize that the sheer complexity and scope of government decisions require elite initiative, at times without public guidance. And, on occasion, elites may need to defy ill-informed and unreasoned public opinion in defense of larger considerations and, instead, rely upon the public’s post hoc evaluations of their actions and their arguments justifying their actions. Franklin Roosevelt’s arming of merchant ships . . .
Unless you’ve been sleeping under a rock (under Iraq? Unforgivable pun?), yesterday’s Supreme Court decision to uphold the majority of the Patient Protection and Affordable Care Act (PPACA), ruled in National Federation of Independent Business v. Sebelius, likely caught your attention. Despite attempts to repeal the act by both the 111th and 112th Congresses, the Court determined that the government mandate for health care was a tax, and thus fell under Congress’s taxing authority, with the caveat that the federal government could not withhold Medicaid funds in their entirety from states that refused to comply with Medicaid expansion. The Washington Post has a helpful electronic cheat sheet that explains how the legislation will affect you directly in the months and years to come, based on the type of insurance you do or do not carry, your income, and household status. With that in mind, we asked scholar Beatrix Hoffman, author of Health Care for Some: Rights and Rationing in the United States since 1930, to weigh in on the Court’s ruling in light of her own research on America’s long tradition of unequal access to health care. Her thoughts follow below.
A Historic Ruling for Health Care
The Supreme Court shocker that (mostly) saved the Affordable Care Act adds a new chapter to the history of health-care reform in the United States. As we heard frequently throughout the debates, several presidents and numerous politicians have proposed national health-care plans in the past. Over the course of nearly 100 years, only Lyndon Johnson was successful, with the creation of Medicare and Medicaid in 1965. Then, when Congress passed the Affordable Care Act (ACA) in March 2010, Barack Obama achieved the most sweeping reform in history. Yesterday, just a single vote by Chief Justice John Roberts saved the entire law from being declared unconstitutional.
This was the first time that national health-care reform faced a constitutional challenge. Medicare was funded through a payroll deduction, which (the Supreme Court just reminded us) was fully within Congress’s power to tax. Medicare’s framers deliberately grafted their new health-insurance program for the elderly onto a popular, efficient, payroll-tax-based system that already existed and was fully constitutional: Social Security.
President Obama and the members of Congress who wrote the Affordable Care Act broke with this successful reform tradition by rejecting the option of expanding Medicare or otherwise building on existing tax-based social insurance programs. Believing such an approach to be politically unfeasible, they instead opted for the “individual mandate,” which requires the uninsured to purchase private health insurance, an idea that originated in a conservative think-tank and was first applied in Massachusetts under then-Governor Mitt Romney. Despite this dramatic political compromise, the individual mandate did not succeed in capturing the votes of the Congressional opposition—not a single Republican voted for the Affordable Care Act.
“And the great owners, who must lose their land in an upheaval, the great owners with access to history, with eyes to read history and to know the great fact: when property accumulates in too few hands it is taken away. And that companion fact: when a majority of the people are hungry and cold they will take by force what they need. And the little screaming fact that sounds through all history: repression works only to strengthen and knit the repressed.”
Heat wave as repression? Not an exact science. But something about the sweltering temperatures this weekend (the feeling of exodus, perhaps, but not migration) prompted a return to The Grapes of Wrath. 1936 was the year that set many of the record temperatures in the United States that we’re now dabbling in breaking; it was also the year of the coup d’état that triggered the Spanish Civil War (farewell, Abraham Lincoln Brigade!), and a massive sit-down strike by the United Auto Workers in Flint, Michigan. In the middle of the Dust Bowl’s prairie-afflicted sandstorms and the Depression, our wealth inequality peaked and would remain at the highest levels the country had seen, until just prior to the Too Big to Fail crisis (2007).
In July of 1995, a similar wave struck Chicago. NASDAQ topped 1000, we tried to “disarm” Iraq, and a slow municipal response converged with the vulnerability of the elderly poor living in the heart of the city, ushering in an unprecedented death toll (over 700) and procedural calamity.
Eric Klinenberg’s Heat Wave: A Social Autopsy of a Disaster in Chicago explores the social, political, and institutional organs of the city that made this urban disaster so much worse than it ought to have been:
“Today if you ask Chicagoans about the heat wave they will likely tell you that not all the deaths were ‘really real.’ That’s a direct legacy of the politics of the disaster.”
More from Eric Klinenberg, in an interview that further engages the consequences of natural disaster, human error, and social unrest here.
Joseph Cropsey—American political philosopher; distinguished service professor emeritus in the Department of Political Science at the University of Chicago; dedicated teacher; and coeditor of the “Strauss–Cropsey Reader” (History of Political Philosophy), a staple in universities for fifty years—died last week at the age of 92.
Cropsey completed his PhD in economics at Columbia University in 1952, with a dissertation on the work of Adam Smith, one of his lifelong scholarly interests (in addition to interstitial aspects in the works of Plato and Karl Marx, the figure of Socrates and issues of philosophical sobriety, and the limitations and entrapments of modern liberalism). By 1957, Cropsey was at the University of Chicago (after stints at CCNY and the New School) as a Rockefeller Foundation Fellow, following Leo Strauss, who would become his most significant collaborator, and assist in his intellectual turn from economics to political philosophy.
The University of Chicago News Office reports on their intellectual partnership:
Strauss encouraged Cropsey to examine texts deeply. “When Strauss was at the head of his class, sitting up there, he would at a certain point say, ‘What does this mean?’ When I have to deal with a text of Plato, I have constantly to be asking myself, ‘What does that truly mean?’ Until one comes to grips with the question, one has not done one’s duty to the object or to oneself,” he told Dialogo.
Cropsey continued teaching at Chicago until 2004, garnering the Quantrell Award for Excellence in Undergraduate Teaching, serving on 134 PhD dissertation committees, and directing the John M. Olin Center for Inquiry into the Theory and Practice of Democracy.
In addition to the History of Political Philosophy, Cropsey authored and edited numerous volumes. Among those are Thomas Hobbes’s A Dialogue between a Philosopher and a Student of the Common Laws of England, an invaluable later writing by Hobbes, and Plato’s World: Man’s Place in the Cosmos, which culminated Cropsey’s lifelong work on the philosopher.
The dissemination of a war against terror has depended on a locution full of historical and contemporary ironies, for terror began its lexical life as the policy of the state, and wars are traditionally waged by states, so the war against terror can be (and has been) deciphered as the war of the state against itself. But international events are not the only sources of interruption of or distraction from the working out of memorial vocabularies for the dead of 9/11. There is also the ongoing negotiation between commerce and commemoration at the WTC site, a process that pits the declared obligations of memory and due respect against those of a future civic life, both economic and cultural. It is easy to cast the moguls of Manhattan as insensitive and materialistic, but the memorial process has also been aggressively suborned by the politicians, whose avowed respect for the dead is not beyond suspicions of present and future self-interest. Debates about the use of the site have not been unmarked by the assumption that the dead should bury the dead and thus by an embarrassingly hasty inclination to get on with life. Many residents have made it clear that they do not wish to live in a national memorial emptied of retail, full of tourists by day and deserted by night. On the other side, melancholic extremes can also be identified among some of the survivor families and other involved groups who want the site to remain always a shrine to the departed.
. . . .
The event we call 9/11 has a past that we can rediscover, a present that we must monitor, and a future we can project. Many of us who were addressing even the most circumscribed of publics—our students or fellow academics—felt the urge, in the immediate aftermath of September 11, 2001, to make a statement, to testify, to register a response, to initiate some sort of commemoration. Many of those responses took the form of grief, sorrow, shock, and above all, self-recrimination at the appearance of carrying on as before. The rhetoric veered wildly between sympathy and self-importance—as if it were a moral duty that each of us should speak—but what was notable was the need to register awareness of some sort. Many people all across America, not only those who knew one of the dead or knew someone who knew someone, reported feelings of acute personal anxiety and radical insecurity, but there was never a point at which this response could be analyzed as prior to or outside of its mediation by television and by political manipulation. With the passage of time it may come to appear that 9/11 did not blow away our past in an eruption of the unimaginable but that it refigured that past into patterns open to being made into new and often dangerous forms of sense.
—from 9/11: The Culture of Commemoration by David Simpson (2006)