Viewing: Blog Posts Tagged with: american history, Most Recent at Top
Results 1 - 25 of 311
1. Thanksgiving with Benjamin Franklin

“A Full Belly is the Mother of all Evil,” Benjamin Franklin counseled the readers of Poor Richard’s Almanack. For some mysterious reason this aphorism hasn’t had the sticking power of some of the inventor’s more famous sayings, like “he who lies down with dogs shall rise up with fleas.” Most of us are more inclined to see a full belly as one of life’s blessings. The offending epigram, however, can’t be described as an aberration. Franklin’s writings are filled with variations on this advice: “A full Belly makes a dull brain”; “The Muse starves in a Cook’s shop”; and “Three good meals a day makes bad living.” It’s no wonder that one canny writer has taken advantage of the unquenchable American appetite for both the founding fathers and diet books to publish The Benjamin Franklin Diet, a complete guide to slimming down, eighteenth-century style.

Franklin’s antipathy to a full belly reflected his Puritan upbringing, which stigmatized gustatory pleasures as low or impure. When he was growing up, he recalled in his Autobiography, “little or no Notice was ever taken of what related to the Victuals on the table, whether it was well or ill dressed, in or out of season, of good or bad flavour, preferable or inferior.” Franklin claimed to have thoroughly adopted this legacy of indifference to food, but there is good evidence to the contrary. He abandoned an early commitment to vegetarianism when, on board the ship that carried him away from bondage to his brother in Boston, he succumbed to the temptation to indulge in a catch of cod. As he confessed, “I had formerly been a great Lover of fish, & when this came hot out of the Frying Pan, it smeled admirably well.” Reasoning that fish ate other fish, and thus why shouldn’t he, the pragmatic Franklin “din’d upon Cod very heartily.” The famous portrait of Franklin by Joseph Siffred Duplessis, painted decades later in France, suggests that he gained no better control of his appetites as he matured. Not even a hero worshipper could call the man thin. A second chin falls heavy below his jawline, his belly strains against the buttons of his sumptuous waistcoat, and his arms bear a resemblance to fattened sausages.


Not a total hypocrite, Franklin did include passages in his writing that treat the pleasures of the table more positively. Poor Richard’s advice that “Fools make Feasts and Wise Men eat them” suggests that frugality, more than distaste, motivated Franklin’s counsel to be temperate. During his embassy in Paris, when Franklin sought to win France over to the American cause, he ate out six nights a week. And without a doubt he enjoyed many of the nice things he was served, such as îles flottantes and champagne.

A proud American, Franklin also sought to introduce his French friends to some of the glories of his native cuisine. He insisted that American corn flour could make a sweeter bread than wheat alone (several of the philosophes were engaged in pursuit of a more nutritious bread recipe to improve the condition of the peasantry, who derived the majority of their calories from the staff of life). Later, after his return to Philadelphia, Franklin sent his friends shipments of Pennsylvania hams – remarkable for the sweetness of their fat, which he attributed to the pigs’ subsisting on corn.

If you want to try Benjamin Franklin’s recipe for corn bread, you can find it in the appendix to Gilbert Chinard’s wonderful 1958 essay “Benjamin Franklin on the Art of Eating.” This little pamphlet, printed by the American Philosophical Society, contains a number of recipes found among Franklin’s papers, few of which could be described as dietetic. Franklin’s recipe for roasted pig pays great attention to producing a delicious crackling. His oyster sauce is heavy on the cream. And his puff pastry, recommended for encasing his apple pudding, calls for a pound of butter. Franklin’s apple pudding makes a tempting proposition for a food historian on the eve of Thanksgiving, especially since, like many eighteenth-century recipes, Franklin’s terse instructions offer just enough detail to inspire certainty that the end result would be inedible by twenty-first-century standards. What better reason could there be to break out the mixing bowl!

*   *   *   *   *

To make an apple pudding.

Make a good puff-paste, roll it out half an inch thick, pare your apples, and core them, enough to fill the crust, and close it up, tie it in a cloth and boil it. If a small pudding, two hours: if a large one three or four hours. When it is enough turn it into your dish, cut a piece of the crust out of the top, butter and sugar it to your palate; lay on the crust again, and send it to table hot.

*   *   *   *   *

The sense of the unfamiliar has always been what compels me about history; it gives me the feeling of discovery and assures me that I am not just finding my own reflection in the sources. I, for example, do not bring a love of boiling to my reading of dessert recipes. Baking I expect – hours of boiling, not so much. I boil few foods, and those only briefly. I boil pasta 7 to 12 minutes, always anxious to drain the pot while the noodles are still al dente. Sometimes I boil green beans, but just for a couple of minutes, and often I steam them instead. I boil eggs, but I like the yolks soft so I don’t leave them in for more than six minutes. I never boil dessert pastries. But Benjamin Franklin told me to, so for the sake of historical knowledge I threw all my cooking know-how to the wind and set out to slavishly follow his orders.

Difficulties confronted me long before I arrived at the boiling. To begin, Franklin directed that I make a puff pastry, mixing four pints, or a quarter of a peck, of flour with half a pound of butter. How much did eighteenth-century dry pints weigh? And did they weigh the same in the colonies as they did in England? Today the imperial wet pint is four ounces more than the American wet pint (20 oz vs. 16 oz). One thing is for certain: whatever the exact weight of an eighteenth-century dry pint might be, four of them is a whopping amount. I made the executive decision to count a pint as 16 oz and cut the recipe in half so that I didn’t completely empty our flour bin. Halving the butter as well, I ended up with a very dry mix:

image#1
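
For anyone checking the math behind that executive decision, here is the arithmetic as a minimal back-of-envelope sketch, assuming (as the author decided) that a dry pint of flour weighs 16 oz; the second half pound of butter comes from the rolling-and-buttering step described below:

```python
# Back-of-envelope arithmetic for halving Franklin's puff-paste.
# Assumption (the author's "executive decision"): 1 dry pint of flour = 16 oz.
PINT_OZ = 16

flour_oz = 4 * PINT_OZ     # "four pints, or a quarter of a peck" -> 64 oz (4 lb)
base_butter_oz = 8         # half a pound of butter mixed into the base dough
rolled_in_butter_oz = 8    # another half pound worked in while rolling and buttering

for name, ounces in [("flour", flour_oz),
                     ("base butter", base_butter_oz),
                     ("rolled-in butter", rolled_in_butter_oz)]:
    print(f"{name}: {ounces} oz full recipe -> {ounces / 2:g} oz halved")
# flour: 64 oz full recipe -> 32 oz halved (2 lb -- still a substantial amount)
```

Even halved, that is two pounds of flour and, all told, half a pound of butter before the boiling begins.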

The next direction was to add cold water until a stiff dough formed. Having spent the past twenty-five years of baking trying to add as little water as possible to my pie dough, to keep it from turning tough, I had to tamp down all my better instincts before pouring in the cup and a half of cold water that my dry mix required to come together.

image#2

The brick of paste that resulted was so hard that it had to be beaten into submission to follow the next directions, which called for the dough to be rolled out, buttered, rolled up, rolled out, and buttered again, nine to ten successive times, until another half pound of butter had been added.

image#3

After an hour of buttering and rolling, I was left with a lovely, pliable, yellow dough, which I rolled out “half a thumb’s thickness” and set on a cheese cloth.

image#4

Franklin’s recipe calls next for chopped cored apples to be placed on the dough. No seasoning is done at this stage: no spices added to the apples, no sugar, no butter, no lemon. Just apples. How big? How many? Over how much of the dough? It doesn’t say.

image#5

Nor did the recipe explain how to seal the dough. I went for crimping and ended up with something that looked like a giant Cornish pasty.

image#6

At least until I wrapped it up in the cheesecloth and began the boiling, whereupon it commenced to look more like a brain. It was hard to willfully destroy this beautiful pasty rather than pop the parcel into a hot oven where it might grow golden and crisp. What was the purpose of building up ten layers of lamination only to melt out all the butter in a bubbling pot? Again, Franklin was mute.

image#7

image#8

The cooking instructions said to boil the pudding from two to four hours depending on its size. Unsure whether mine counted as small or large, I decided on three hours. There were no further cooking directions, and perhaps I should have just let it be, but, worried that the pudding wasn’t getting cooked on the top, which bounced above the bubbling water, I flipped the package each hour. Perhaps if I hadn’t, the pudding would have developed more of a crust.

For the final step, Franklin directs that the top of the pudding be removed, sugar and butter be mixed in with the apples, then the top replaced and the whole served immediately. When I cut away the muslin and lifted the soggy lid I found that the apples inside had reduced to a beautiful sauce within the boiled pastry casing. I added some chopped butter and brown sugar, then closed the pudding back up and let the flavors meld. I can’t say the result would win first prize in a pie contest; it wouldn’t even win honorable mention. But I can report that the mess tasted quite nice in a bland, comforting, soft sort of way. Not a bad match for turkey at all.

image#9

image#10

image#11

Featured image: “The First Thanksgiving,” by Jean Leon Gerome Ferris (c. 1912). Public domain via Wikimedia Commons.

The post Thanksgiving with Benjamin Franklin appeared first on OUPblog.

2. The legitimate fear that months of civil unrest in Ferguson, Missouri will end in rioting

On 9 August 2014, Officer Darren Wilson of the police department of Ferguson, Missouri (a suburb of St. Louis), shot and killed Michael Brown, an unarmed 18-year-old. Officer Wilson is white and Michael Brown was black, sparking allegations from wide swaths of the local and national black community that Wilson’s shooting of Brown, and the Ferguson Police Department’s reluctance to arrest the officer, were racially motivated events that smack of a historic trend of black inequality within the US criminal justice system.

The fact that the Ferguson Police Department and city government are predominantly white, while the town is predominantly black, has underscored this distrust. So too have recent events in Los Angeles, New York, Ohio, South Carolina, St. Louis, and other places that suggest a disturbing pattern of white police personnel’s use of excessive force in the beatings or deaths of blacks across the nation. So disturbing, in fact, that this case and the others linked to it not only have inspired an organic, and diverse, crop of youth activists, but also have captured the close attention of President Barack Obama, Attorney General Eric Holder, national civil rights organizations, and the national black leadership. Indeed, not one or two, but three concurrent investigations of Officer Wilson’s shooting of Michael Brown are ongoing—one by the St. Louis Police Department and the other two by the FBI and the Justice Department, which are concerned with possible civil rights violations. The case also has a significant international following. The parents of Michael Brown raised this profile recently when they testified in Geneva, Switzerland, before the United Nations Committee against Torture. There, they joined a US delegation to plead for support to end police brutality aimed at profiled black youth.

The details of the shooting investigations, each bit eagerly seized by opposing sides (those who support Brown and those who defend Wilson) as they become publicly available, still don’t give a comprehensive view of what actually happened between the officer and the teen, leaving too much speculation as to whether the Ferguson grand jury, which has been considering the case since 20 August, will return an indictment against Officer Wilson.

Protest at Ferguson Police Dept, by Jamelle Bouie. CC-BY-2.0 via Wikimedia Commons.

What is known of the incident is that about noon on that Saturday, Michael Brown and a friend, Dorian Johnson, were walking down Canfield Drive in Ferguson when Darren Wilson approached the two in his squad car, telling them to get out of the street and onto the sidewalk. A scuffle ensued between Brown and Wilson within the police car. In his defense, Officer Wilson has stated that Brown attacked him and tried to grab his weapon. Dorian Johnson has countered that Wilson pulled Michael Brown into his car, suggesting that Brown was trying to defend himself from an overly aggressive Wilson. Shots were fired in Wilson’s police car and Brown ran down the street, pursued by Wilson. Autopsy reports indicate that Brown was shot at least six times: four times in his left arm, once through his left eye, and once in the top of his head. The last wound caused the youth’s death. Michael Brown’s body lay in the street, uncovered, for several hours while the police conducted a preliminary investigation, prompting even more outrage among black onlookers.

Since Michael Brown’s death, protestors from the area and across the nation have occupied the streets of Ferguson, demanding justice for the slain teen and his family. There were nights of initial confrontation between protestors and police forces (the Ferguson Police, the St. Louis Police, the Missouri State Troopers, and the National Guard have all been deployed in Ferguson at some time, and in some capacity, since the shooting), and though there has been some arson, looting, protestor and police violence, and arrests—even of news reporters—the protests generally have been peaceful. Not only police action during these protests but also police equipment has sparked criticism and a growing demand that law enforcement agencies demilitarize. The daily protests have persisted, at times growing greatly in number, as during a series of “Hands up, Don’t Shoot” events held not just in Ferguson but in many cities nationwide, including Chicago, New York, Washington, D.C., Los Angeles, and Omaha, Nebraska, in August and September. The “hands up” stance protests Brown’s shooting, which some, but not all, witnesses have stated occurred while Brown’s hands were raised in a gesture of surrender to Wilson.

Missouri Governor Jay Nixon, and other state and local officials, along with many of the residents of Ferguson, fear that if the Grand Jury does not indict Darren Wilson for Michael Brown’s murder, civil unrest will erupt into violence, producing an event similar to the Los Angeles Riots of 1992. In Los Angeles, large numbers of persons rioted when it seemed that the legal outcomes of two back-to-back criminal cases smacked of black injustice—the acquittal of four white police officers indicted in the assault of black motorist Rodney King, and the no-jail-time sentence of a Korean shopkeeper found guilty of voluntary manslaughter in the killing of Latasha Harlins, a black teen. The result was the worst race riot in US history, with more than 50 people killed, the burning of a substantial portion of the ethnic business enclave of Koreatown, and at least a billion dollars in property damage.

Certainly the fear is a legitimate one. The vast majority of US race riots centered on black participation have been sparked by similar conditions—the community’s belief that a youth or vulnerable person among them has been brutalized with state sanction. The nation has witnessed these events not only in Los Angeles in 1965 and 1992; but also in Harlem in 1935 and 1964; Richmond, California in 1968; San Francisco in 1986; Tampa, Florida in 1967 and 1986; Miami in 1980; Newark, New Jersey in 1967; York, Pennsylvania in 1969; Crown Heights (Brooklyn), New York in 1991; St. Petersburg, Florida in 1996; Cincinnati, Ohio in 2001; Benton Harbor, Michigan in 2003; Oakland, California in 2009 and 2010; and the list goes on. These events all have served as cautionary tales that, unfortunately, have not resulted in either the perception or the reality of black equality before the law. It is this legacy that frustrates and frightens Ferguson residents.

The post The legitimate fear that months of civil unrest in Ferguson, Missouri will end in rioting appeared first on OUPblog.

3. The Republican view on bipartisanship

Anyone who expects bipartisanship in the wake of last Tuesday’s elections has not been paying attention. The Republican Party does not believe in a two-party system that includes the Democrats, and it never has. Ever since the Civil War, when Republicans were convinced that their Democratic opposition was in treacherous league with the Confederacy, the Grand Old Party has, in season and out, doubted the legitimacy of Democratic power. While Republicans have accepted the results of national elections as facts they could not change, they have never believed that the Democrats held power legitimately. Democratic victories, in the minds of Republicans, are the result of fraud and abuse.

Consider some examples: In 1876, Republicans in New York said the Democratic party was “the same in character and spirit as when it sympathized with treason.” Half a century later, speaking of Woodrow Wilson, Henry Cabot Lodge told the 1920 Republican national convention that “Mr. Wilson stands for a theory of administration and government which is not American.” When Senator Joseph R. McCarthy spoke of “twenty years of treason” in the 1950s, he was not joking. He meant the statement as literal fact. So too did an aide to George H.W. Bush in 1992 when he observed, “We are America. These other people are not America.”

So when Rush Limbaugh comments that “Democrats were not elected to govern,” or Leon H. Wolf of RedState says Democrats “should not even be invited to be part of the discussion lest their gangrenous, festering and destructive ideas should further infect our caucus,” they are reflecting an attitude toward the Democrats that is at least a century and a half old.

If, as many Republicans believe, there are elements of illegitimacy and evil in the Democratic Party under the leadership of President Obama, then a posture of intense resistance becomes a necessary GOP tactic. Meeting the threat that the Democrats pose on such issues as same-sex marriage, climate change, and immigration reform requires going beyond politics as usual and employing any means necessary to save the nation.

For contemporary Republicans, scorched earth tactics and all-out opposition seem the appropriate response to the presence of a pretender in the White House who in their minds is pursuing the collapse of the American republic. There no longer exists between Republicans and Democrats a rough consensus about the purpose of the United States.

The 2008 Republican National Convention. Photo part of the Carol M. Highsmith Archive, Prints and Photographs Division of the Library of Congress.

How has it come to this? A long review of both political parties suggests that the experience of the Civil War introduced a flaw into American democracy that was never resolved or recognized. The Republicans regarded the wartime flirtation of some Democrats with the Confederacy as evidence of treason. So it may have been at that distant time. What rendered that conclusion toxic was the perpetuation of the idea of Democratic illegitimacy and betrayal long after 1865.

After their extended years in the wilderness during the New Deal, Republicans reasserted their presidential dominance, with a few Democratic interruptions from 1952 to 1992. Republicans thus saw in the ascendancy of Dwight Eisenhower, Richard Nixon, Gerald Ford, Ronald Reagan, and the two Bushes a return to the proper order of politics in which the Republicans were destined to be in charge and Democrats to occupy a position of perennial deference outside of Congress.

Then the unthinkable happened. Not just a Democrat but a black Democrat won the White House. The southern-based Republican Party saw its worst fears coming true. A man with a foreign-sounding name, an equivocal religious background, and a black skin was president and pursuing what were to most Republicans sinister goals. Under his administration, blacks became assertive, gays married, the poor got health care, and the wealthy faced both a lack of due respect and a claim on their income.

The Republican allegiance to traditional democratic practices now seemed to them outmoded in this national crisis. Americans could not really have elected Barack Obama and put his party in control of the destiny of the nation. Such an outcome must be illegitimate. And what is the remedy for illegitimacy, treason, and godlessness? To quote Leon Wolf again: “Working with these people is not what America elected you to do. Republicans, it elected you to stop them.” Pundits who forecast a new era of bipartisanship comparable to what Dwight D. Eisenhower, Everett Dirksen, Sam Rayburn, and Lyndon B. Johnson achieved in the 1950s are living in a nostalgic dream world. Richard Nixon viewed politics as war and contemporary Republicans will proceed to explore the validity of his insight over the next two years. For the American voter, clinging to the naive notion of the parties working together, each taking part of the loaf, the best guide may be Bette Davis in All About Eve: “Fasten your seat belts. It’s going to be a bumpy night.”

Featured image: Members of the Republican Party gather at the 1900 National Convention. Public domain via Wikimedia Commons.

The post The Republican view on bipartisanship appeared first on OUPblog.

4. The Civil War in five senses

Historians are tasked with recreating days past, setting vivid scenes that bring the past to the present. Mark M. Smith, author of The Smell of Battle, the Taste of Siege: A Sensory History of the Civil War, engages all five senses to recall the roar of cannon fire at Vicksburg, the stench of rotting corpses at Gettysburg, and many more of the sights and sounds of battle. In doing so, Smith creates a multi-dimensional vision of the Civil War and captures the human experience during wartime. Here, Smith speaks to how our senses work to inform our understanding of history and why the Civil War was a singular sensory event.

  • Sensory overload in the Civil War
  • Using sensory history to understand the past
  • How the Civil War transformed taste

Headline image credit: The Siege of Vicksburg. Lithograph by Kurz and Allison, 1888. Public domain via the Library of Congress.

The post The Civil War in five senses appeared first on OUPblog.

5. Innovation and safety in high-risk organizations

The construction or recertification of a nuclear power plant often draws considerable attention from activists concerned about safety. However, nuclear-powered US Navy (USN) ships routinely dock in the most heavily populated areas without creating any controversy at all. How has the USN managed to maintain such an impressive safety record?

The USN is not alone; many organizations, such as nuclear public utilities, confront the need to maintain perfect reliability or face catastrophe. However, this compelling need to be reliable does not insulate them from the need to innovate and change. Given the high stakes, and the risk that changes in one part of an organization’s system will have consequences for others, how can such organizations make better decisions regarding innovation? The experience of the USN is apt here as well.

Given that they have a nuclear reactor at their core, navy submarines are clearly high-risk organizations that need to innovate yet must maintain 100% reliability. Shaped by the disastrous loss of the USS Thresher in 1963, the USN adopted a very cautious approach dominated by safety considerations. In contrast, the Soviet Navy, mindful of its inferior naval position relative to the United States and her allies, adopted a much more aggressive approach focused on pushing the limits of what its submarines could do.

Decision-making in the two navies differed greatly, but in both it was a complex interaction among individuals confronting a central problem (their opponents’ capabilities) with a wide range of solutions. In addition, each solution was arrived at through a negotiated political process in response to another party that was, ironically, never directly engaged: the submarines never actually fought the opponent.

Perhaps ironically, given its government’s reputation for rigidity, it was the Soviet Navy that was far more entrepreneurial and innovative. The Soviets developed multiple types of attack submarine – SSGNs, armed with scores of guided missiles to attack U.S. carrier battle groups, and smaller submarines designed to attack other submarines. In contrast, the USN adopted a much more conservative approach, choosing to modify its designs slightly, such as by adding vertical launch tubes to its Los Angeles class submarines. It helped the USN that it needed its submarines mostly to do one thing – attack enemy submarines – while the Soviets needed their submarines to attack both other submarines and USN carrier groups.

The Hunt for Red October, Soviet Submarine – 1970s, by Kevin Labianco. CC-BY-NC-ND-2.0 via Flickr.

As a result of this innovation, aided by their use of design bureaus – something that does not exist in the U.S. military-industrial complex – the Soviets made great strides in closing the performance gap with the USN. Their Alfa class submarines were very fast and deep diving. Their final class of submarine before the disintegration of the Soviet Union – the Akula class – was largely a match for the Los Angeles class boats of the USN. These strides, however, came at a high price.

Soviet submarines suffered many accidents, including ones involving their nuclear reactors. Both their SSGNs, designed to attack USN carrier groups, and their attack submarines had many problems. After 1963 the Soviets had at least 15 major accidents that resulted in the total loss of a boat or major damage to its nuclear reactor. One submarine, the K-429, actually sank twice. The innovative Alfas, immortalized in The Hunt for Red October, were so trouble-prone that they were all decommissioned in 1990, save for one that had its innovative reactor replaced with a conventional one. In contrast, the USN had no such accidents, though one submarine, the USS Scorpion, was lost in 1968 to unknown causes.

Why were the USN submarines so much more reliable? There were four basic reasons. First, the U.S. system allowed for much more open communication among the relevant actors, which made mutual adjustment between the complex yet tightly integrated systems easier. Second, the U.S. system diffused power much more than the Soviet political system did; as a result, the U.S. pursued less radical innovations. Third, in the U.S. system decision makers often worked with more than one group – a U.S. admiral, for example, not only worked within the Navy but also interacted with the shipyards and with Congress. Finally, Admiral Hyman Rickover was a forceful safety advocate who instilled a safety culture that has endured to this day.

In short, share information, share power, make sure you know what you are doing and have someone powerful who is an advocate for safety. Like so much in management it sounds like common sense if you explain it well, but in reality it is very hard to do, as the Soviets discovered.

Featured image credit: Submarine, by subadei. CC-BY-2.0 via Flickr.

The post Innovation and safety in high-risk organizations appeared first on OUPblog.

6. The anti-urban tradition in America: why Americans dislike their cities

Another election season is upon us, and so it is time for another lesson in electoral geography. Americans are accustomed to color-coding our politics red and blue, and we shade the handful of states that swing both ways purple. These color choices, of course, vastly simplify the political dynamic of the country.

Look more closely at those maps, and you’ll see that the real political divide is between metropolitan America and everywhere else. The blue dots on the map are, in fact, tiny, and the country is otherwise awash in red. Those blue dots, though, are where most of us live — 65% of us, according to the Brookings Institution, live in metro regions of 500,000 or more — and those big red areas are increasingly empty.

The urban-rural divide has existed in American politics from the very beginning. It is a central irony of American political life that we are an urbanized nation inhabited by people who are deeply ambivalent about cities.

It’s what I call the “anti-urban tradition” in American life, and it comes in two parts.

On the one hand, American cities — starting with Philadelphia in the 18th century — have always been places of ethnic, racial, religious, and cultural diversity. First stop for immigrant arrivals from eastern Europe or the American south, cities embodied the cosmopolitan ideal that critic Randolph Bourne celebrated in his 1916 essay “Trans-National America.”

Not all Americans were as enthusiastic as Bourne about cities filling up with Catholics from Italy and Poland, Jews from Russia and Lithuania, and African-Americans from Mississippi and North Carolina. Many, in fact, recoiled in horror at all this heterogeneity. Many, of course, still do, as when Republican Vice Presidential candidate Sarah Palin campaigned in North Carolina and called small towns there “real America.”

Italian immigrants pictured along Mulberry Street on Manhattan’s Lower East Side, ca. 1900. Public domain via Wikimedia Commons.

On the other hand, the industrial cities that boomed at the turn of the 20th century relied on the actions of government to make life livable. Paved streets, clean water, sanitary sewers — all this infrastructure required the intervention of local, state, and eventually the federal government. Indeed, the 20th century city is where our commitments to the public realm have been given their widest expression — public space, public transportation, public education, public housing. And anti-urbanists then and now have a deep suspicion of those public, “collective” commitments.

In this sense, cities stand as antithetical to the basic, bedrock, “real” American values: self-reliant individualism and the supremacy of all things private. The 2012 Republican Party Platform, for example, denounced “sustainable development,” often associated with urbanist design principles, as nothing less than an assault on “the American way of life of private property ownership, single family homes, private car ownership and individual travel choices, and privately owned farms.”

Yet while anti-urbanism today is closely associated with Tea Party conservatives, its history in the 20th century is more complicated. The American antipathy toward our cities has been common across the political spectrum.

Franklin Roosevelt, architect of the modern liberal state, disliked cities personally — one of his closest aides described him as a “child of the country” who saw cities as “a perhaps necessary nuisance.” He was, to borrow the title of a 1940 biography, a “country squire in the White House.”

The New Deal reflected that anti-urban feeling. While a number of his New Deal programs addressed themselves to the failing industrial economy of the nation’s cities, FDR’s larger ambition was to “decentralize” cities by moving people and industry out into the hinterlands. This urge tied together the Tennessee Valley Authority, the Civilian Conservation Corps, and the program to build a series of entirely new towns. After all, according to New Dealer Rexford Tugwell, FDR “always did, and always would, think people better off in the country.”

A generation later, the counter-culture of the 1960s, which had emerged on college campuses in Berkeley, Madison, Ann Arbor, and elsewhere, manifested its own version of anti-urbanism. Fed up with what they saw as America’s un-savable cities, its members went back to the land in dozens of different communal experiments. So many young people joined the exodus out of the city that Newsweek magazine declared 1969 “The Year of the Commune.”

Whether in the hills of Vermont or the hills of Marin County, communards shared the anti-urban impulse with their parents, who had left the city to move to the suburbs in the 1950s. As Steve Diamond put it in a 1971 book describing a trip from his commune back to New York: “you could feel yourself approaching the Big C (City, Civilization, Cancer) itself, deeper and deeper into the decaying heart.” These rebels might not have recognized their resemblance to their parents, but it was there in their shared anti-urban rhetoric.

Certainly, our ambivalence toward our cities lies beneath our unwillingness to tackle urban problems, whether in Detroit or Cleveland or Philadelphia. But the consequences of our anti-urban tradition are more wide-ranging. Our inability to think in public terms, to address the commonweal, grows directly out of our experience running away from cities in the 20th century. If we want a more effective and invigorated politics in the 21st century, therefore, we will have to outgrow our anti-urban habits.

Featured image: Aerial view of the tip of Manhattan, New York, United States, ca. 1931. Public domain via Wikimedia Commons.

The post The anti-urban tradition in America: why Americans dislike their cities appeared first on OUPblog.

7. Catesby’s American Dream: religious persecution in Elizabethan England

Over the summer of 1582 a group of English Catholic gentlemen met to hammer out their plans for a colony in North America — not Roanoke Island, Sir Walter Raleigh’s settlement of 1585, but Norumbega in present-day New England.

The scheme was promoted by two knights of the realm, Sir George Peckham and Sir Thomas Gerard, and it attracted several wealthy backers, including a gentleman from the midlands called Sir William Catesby. In the list of articles drafted in June 1582, Catesby agreed to be an Associate. In return for putting up £100 and ten men for the first voyage (forty for the next), he was promised a seignory of 10,000 acres and election to one of “the chief offices in government”. Special privileges would be extended to “encourage women to go on the voyage” and according to Bernardino de Mendoza, the Spanish ambassador in London, the settlers would “live in those parts with freedom of conscience.”

Religious liberty was important for these English Catholics because they didn’t have it at home. The Mass was banned, their priests were outlawed and, since 1571, even the possession of personal devotional items, like rosaries, was considered suspect. In November 1581, Catesby was fined 1,000 marks (£666) and imprisoned in the Fleet for allegedly harboring the Jesuit missionary priest, Edmund Campion, who was executed in December.

Campion’s mission had been controversial. He had challenged the state to a public debate and he had told the English Catholics that those who had been obeying the law and attending official church services every week — perhaps crossing their fingers, or blocking their ears, or keeping their hats on, to show that they didn’t really believe in Protestantism — had been living in sin. Church papistry, as it was known pejoratively, was against the law of God. The English government responded by raising the fine for non-attendance from 12 pence to £20 a month. It was a crippling sum and it prompted Catesby and his friends to go in search of a promised land.

The American venture was undeniably risky — “wild people, wild beasts, unexperienced air, unprovided land” did not inspire investor confidence — but it had some momentum in the summer of 1582. Francis Walsingham, Elizabeth I’s secretary of state, was behind it, but the Spanish scuppered it. Ambassador Mendoza argued that the emigration would drain “the small remnant of good blood” from the “sick body” of England. He was also concerned for Spain’s interests in the New World. The English could not be allowed a foothold in the Americas. It mattered not a jot that they were Catholic: “they would immediately have their throats cut as happened to the French.” Mendoza conveyed this threat to the would-be settlers via their priests with the further warning that “they were imperilling their consciences by engaging in an enterprise prejudicial to His Holiness” the Pope.

Revellers commemorate the failed 1605 assassination attempt against King James I each year on November 5. Public domain via Wikimedia Commons.

So Sir William Catesby did not sail the seas or have a role in the plantation of what — had it succeeded — would have been the first English colony in North America. He remained in England and continued to strive for a peaceful solution. “Suffer us not to be the only outcasts and refuse of the world,” he and his friends begged Elizabeth I in 1585, just before an act was passed making it a capital offense to be, or even to harbor, a seminary priest in England. Three years later, as the Spanish Armada beat menacingly towards England’s shore, Sir William and other prominent Catholics were clapped up as suspected fifth columnists. In 1593 those Catholics who refused to go to church were forbidden by law from traveling beyond five miles of their homes without a license. And so it went on until William’s death in 1598.

Seven years later, in the reign of the next monarch, James I (James VI of Scotland), William’s son Robert became what we would today call a terrorist. Frustrated, angry, and “beside himself with mindless fanaticism,” he contrived to blow up the king and the House of Lords at the state opening of Parliament on 5 November 1605. “The nature of the disease,” he told his recruits, “required so sharp a remedy.” The plot was discovered and anti-popery became ever more entrenched in English culture. Only in 2013 was the constitution weeded of a clause that insisted that royal heirs who married Catholics were excluded from the line of succession.

Every 5 November, we English and Scottish set off our fireworks and let our children foam with marshmallow, and we enjoy “bonfire night” as a bit of harmless fun, without really thinking about why the plotters sought their “sharp remedy” or, indeed, about the tragedy of the father’s failed American Dream, a dream for religious freedom that was twisted out of all recognition by the son.

Featured image: North East America, by Abraham Ortelius 1570. Public Domain via Wikimedia Commons.

The post Catesby’s American Dream: religious persecution in Elizabethan England appeared first on OUPblog.

8. The Wilderness Act of 1964 in historical perspective

Signed into law by President Lyndon Johnson on 3 September 1964, the Wilderness Act defined wilderness “as an area where the earth and its community of life are untrammeled by man, where man himself is a visitor who does not remain.” It not only put 9.1 million acres under federal protection but also created an entire preservation system that today includes nearly 110 million acres across forty-four states and Puerto Rico—some 5% of the land in the United States. These public lands include wildlife refuges and national forests and parks where people are welcome as visitors, but may not take up permanent residence.

The definition of what constitutes “wilderness” is not without controversy, and some critics question whether preservation is the best use of specific areas. Nevertheless, most Americans celebrate the guarantee that there will always be special places in the United States where nature can thrive in its unfettered state, without human intervention or control. Campers, hikers, birdwatchers, scientists and other outdoor enthusiasts owe much to Howard Zahniser, the act’s primary author.

In recent decades, environmental awareness and protection have become values just about as all-American as Mom and apple pie. Despite the ill-fated “Drill, Baby, Drill” slogan of the 2008 campaign, virtually all political candidates, whatever their party, profess concern about the environment and a commitment to its protection. As a professor, I have a hard time persuading my students, who were born more than two decades after the first Earth Day (in 1970), that environmental protection was once commonly considered downright traitorous.

For generations, many Americans were convinced that it was the exploitation of natural resources that made America great. The early pioneers survived because they wrested a living from the wilderness, and their children and grandchildren thrived because they turned natural resources into profit. Only slowly did the realization come that people had been so intent on pursuing vast commercial enterprises that they failed to consider their environmental impact. When, according to the 1890 census, the frontier was closed, the nation was no longer a land of ever-expanding boundaries and unlimited resources. Birds like the passenger pigeon and the Carolina parakeet were hunted into extinction; practices like strip-mining left ugly scars on the land, and clear-cutting made forest sustainability impossible.

At the turn of the last century members of the General Federation of Women’s Clubs called for the preservation of wilderness, especially through the creation of regional and national parks. They enjoyed the generous cooperation of the Forest Service during the Theodore Roosevelt administration, but found that overall, “it is difficult to get anyone to work for the public with the zeal with which men work for their own pockets.”

President Theodore Roosevelt and naturalist John Muir atop Glacier Point in Yosemite. Public domain via Wikimedia Commons.

Not surprisingly, Theodore Roosevelt framed his support for conservation in terms of benefiting people rather than (non-human) nature. In 1907 he addressed both houses of Congress to gain support for his administration’s effort to “get our people to look ahead and to substitute a planned and orderly development of our resources in place of a haphazard striving for immediate profit.” It is a testament to Roosevelt’s persona that he could sow the seeds of conservationism within a male population deeply suspicious of any argument even remotely tinged with what was derided as “female sentimentality.” Writer George L. Knapp, for example, termed the call for conservation “unadulterated humbug” and the dire prophecies of further extinction “baseless vaporings.” He preferred to celebrate the fruits of men’s unregulated resource consumption: “The pine woods of Michigan have vanished to make the homes of Kansas; the coal and iron which we have failed—thank Heaven!—to ‘conserve’ have carried meat and wheat to the hungry hives of men and gladdened life with an abundance which no previous age could know.” According to Knapp, men should be praised, not chastened, for turning “forests into villages, mines into ships and skyscrapers, scenery into work.”

The press reinforced the belief that the use of natural resources equaled progress. The Houston Post, for example, declared, “Smoke stacks are a splendid sign of a city’s prosperity,” and the Chicago Record Herald reported that the Creator who made coal “knew that smoke would be a good thing for the world.” Pittsburgh city leaders equated smoke with manly virtue and derided the “sentimentality and frivolity” of those who sought to limit industry out of baseless fear of the by-products it released into the air.

Pioneering educator and psychologist G. Stanley Hall confirmed that “caring for nature was female sentiment, not sound science.” Gifford Pinchot, made first chief of the Forest Service in 1905, was a self-avowed conservationist. He escaped charges of effeminacy by making it clear that he measured nature’s value by its service to humanity. He dedicated his agency to “the art of producing from the forest whatever it can yield for the service of man.” “Trees are a crop, just like corn,” he famously proclaimed. “Wilderness is waste.”

Looking back at the last fifty years, the Wilderness Act of 1964 is an important achievement. But it becomes truly remarkable when viewed in the context of the long history that preceded it.

Headline image credit: Cook Lake, Bridger-Teton National Forest, Wyoming. Photos by the Pinedale Ranger District of the Bridger-Teton National Forest. Public domain via Wikimedia Commons.

The post The Wilderness Act of 1964 in historical perspective appeared first on OUPblog.

9. Alice Paul, suffragist and activist, in 10 facts

Ninety-four years ago today, the Nineteenth Amendment to the Constitution of the United States took effect, enshrining American women’s right to vote. Fifty years later, in the midst of a new wave of feminist activism, Congress designated 26 August as Women’s Equality Day in the United States. The 1971 Joint Resolution read, in part, “the women of the United States have been treated as second-class citizens and have not been entitled the full rights and privileges, public or private, legal or institutional, which are available to male citizens of the United States” and women “have united to assure that these rights and privileges are available to all citizens equally regardless of sex.” For that reason, Congress was prevailed upon to declare 26 August a day to commemorate the Nineteenth Amendment as a “symbol of the continued fight for equal rights.”

Alice Paul was a pivotal and controversial figure in the last years of the American battle to win the vote for women. Her first national action was to organize a grand suffrage procession in Washington, DC on 3 March 1913. She organized the parade on behalf of the National American Woman Suffrage Association (NAWSA), the only group working to win women the vote on a national scale. She later founded her own organization, the National Woman’s Party, and charted a surprisingly aggressive course of social protests to convince Congress to pass a woman suffrage amendment to the Constitution.

Alice Paul lived long enough to see Women’s Equality Day established; she died in 1977. She did not live to see the project which consumed the remaining years of her life ratified — an Equal Rights Amendment to the Constitution. In 2014, a renewed effort emerged to pass the ERA.

As Women’s Equality Day is celebrated around the country today, here are a few things you may not know about suffrage leader and ERA author Alice Paul:

Alice Paul. Public Domain via Wikimedia Commons.

1.  Alice Paul was proudly a birthright Quaker, but as she became interested in politics, she became frustrated with her faith’s reluctance to actively work for woman suffrage. We often associate Quakers with political activism, but in the late nineteenth century, the vast majority of Quakers disapproved of such efforts.

2.  Paul loved dancing and sports. Indeed, her love for physical activity was a factor in drawing her into social protest, first in England, then in America. In her high school and college years, she played softball, basketball, hockey, and tennis, and also ice skated when she could. She learned to dance while attending Swarthmore College near Philadelphia and regretted her few opportunities to attend dances in her later years.

3.  Paul was arrested seven times in England for her suffrage activism, but only once in America. The longest sentence she served in Britain was one month. In the United States, she was sentenced to seven months, but only served one.

4.  Paul endured forced feeding fifty-five times in London’s Holloway Prison in 1909 and perhaps another twenty-five times while at the District of Columbia’s Jail in 1917. Authorities used forced feeding to break the hunger strikes initiated by suffrage prisoners. Some women suffered health problems as a result. Alice Paul struggled with digestive issues for years after and may have lost her sense of smell.

5.  Paul is often portrayed as eager to leave NAWSA to found her own militant suffrage group. In fact, she did so only when her hand was forced. Divisions over strategy or tactics are nothing new to any political group, and NAWSA itself came about only in 1890, after two long-estranged suffrage organizations compromised in order to present a united front. The 1914 effort to oust the controversial Alice Paul from NAWSA arose from multiple sources, including the current NAWSA president, Anna Howard Shaw, and the once-and-future president, Carrie Chapman Catt.

6.  Paul’s persona as a leader combined stereotypically feminine and masculine traits in a way that invited fervent loyalty or deep-seated antipathy. Her dislike of the spotlight and ingrained modesty lent her a vulnerability which undercut concerns about her militant past and her powerful drive. Others found her charismatic authority threatening.

7.  Though the protests of Paul’s National Woman’s Party are often described as “civil disobedience,” Paul believed all of her actions were completely within the law. Before Paul initiated picketing to protest the lack of a suffrage amendment in 1917, picketing was largely the province of labor organizations. After consulting with attorneys about the legality of the practice, Paul adapted the silent vigil of two earlier protests and sent “silent sentinels” to picket the White House. While labor picketing often prompted violence on both sides, Paul gave her troops strict instructions to remain non-violent. Violence was, however, visited upon them by bystanders outraged by the women’s insistence on pressing for suffrage while the country was engaged in World War I.

8.  Paul’s most colorful protests occurred after the House of Representatives passed the suffrage amendment bill. It took another eighteen months to convince the Senate to pass the amendment. To maintain pressure on Congress, Alice Paul crafted watchfire protests across from the White House in Lafayette Square, during which suffragists burned President Wilson’s words about his much-celebrated belief in democracy. They even burned Wilson in effigy to urge him to use his political power to sway the Senate.

9.  Alice Paul was not present during the frenzied effort to make Tennessee the thirty-sixth and final state needed to ratify the suffrage amendment. She longed to be at the Tennessee statehouse, but NWP lobbying required a constant input of cash. Her ability to raise funds surpassed anyone else’s, so she chose to stay in Washington to keep the money flowing. Paul’s ability to raise funds was a key factor in the success of the NWP.

10.  Alice Paul bequeathed us the iconic images of the battle for the ballot: photographs of the 1913 procession, the 1917 White House pickets, the 1918 watchfire protests. These images speak to the courage, the persistence and the fortitude of all the women who fought to gain the most fundamental right of citizenship: the right to consent.

Featured image: Alice Paul. Public Domain via Library of Congress.

The post Alice Paul, suffragist and activist, in 10 facts appeared first on OUPblog.

0 Comments on Alice Paul, suffragette and activist, in 10 facts as of 8/26/2014 8:19:00 AM
Add a Comment
10. Alice Paul, suffragist and activist, in 10 facts

Ninety-four years ago today, the Nineteenth Amendment to the Constitution of the United States took effect, enshrining American women’s right to vote. Fifty years later, in the midst of a new wave of feminist activism, Congress designated 26 August as Women’s Equality Day in the United States. The 1971 Joint Resolution read, in part, “the women of the United States have been treated as second-class citizens and have not been entitled the full rights and privileges, public or private, legal or institutional, which are available to male citizens of the United States” and women “have united to assure that these rights and privileges are available to all citizens equally regardless of sex.” For that reason, Congress was prevailed upon to declare 26 August a day to commemorate the the Nineteenth Amendment as a “symbol of the continued fight for equal rights.”

Alice Paul was a pivotal and controversial figure in the last years of the American battle to win the vote for women. Her first national action was to organize a grand suffrage procession in Washington, DC on 3 March 1913. She organized the parade on behalf of the National American Woman Suffrage Association (NAWSA), the only group working to win women the vote on a national scale. She later founded her own organization, the National Woman’s Party, and charted a surprisingly aggressive course of social protests to convince Congress to pass a woman suffrage amendment to the Constitution.

Alice Paul lived long enough to see Women’s Equality Day established; she died in 1977. She did not live to see the project which consumed the remaining years of her life ratified — an Equal Rights Amendment to the Constitution. In 2014, a renewed effort emerged to pass the ERA.

As Women’s Equality Day is celebrated around the country today, here are a few things you may not know about suffrage leader and ERA author Alice Paul:

Alice Paul. Public Domain via Wikimedia Commons.
Alice Paul. Public Domain via Wikimedia Commons.

1.  Alice Paul was proudly a birthright Quaker, but as she became interested in politics, she became frustrated with her faith’s reluctance to actively work for woman suffrage. We often associate Quakers with political activism, but in the late nineteenth century, the vast majority of Quakers disapproved of such efforts.

2.  Paul loved dancing and sports. Indeed, her love for physical activity was a factor in drawing her into social protest, first in England, then in America. In her high school and college years, she played softball, basketball, hockey, and tennis, and also ice skated when she could. She learned to dance while attending Swarthmore College near Philadelphia and regretted her few opportunities to attend dances in her later years.

3.  Paul was arrested seven times in England for her suffrage activism, but only once in America. The longest sentence she served in Britain was one month. In the United States, she was sentenced to seven months, but only served one.

4.  Paul endured forced feeding fifty-five times in London’s Holloway Prison in 1909 and perhaps another twenty-five times while at the District of Columbia’s Jail in 1917. Authorities used forced feeding to break the hunger strikes initiated by suffrage prisoners. Some women suffered health problems as a result. Alice Paul struggled with digestive issues for years after and may have lost her sense of smell.

5.  Paul is often portrayed as eager to leave NAWSA to found her own militant suffrage group. In fact, she did so only when her hand was forced. Divisions over strategy and tactics are nothing new to any political group, and NAWSA itself came about only in 1890, after two long-estranged suffrage organizations compromised in order to present a united front. The 1914 effort to oust the controversial Alice Paul from NAWSA arose from multiple sources, including then-NAWSA president Anna Howard Shaw and once-and-future president Carrie Chapman Catt.

6.  Paul’s persona as a leader combined stereotypically feminine and masculine traits in a way that invited fervent loyalty or deep-seated antipathy. Her dislike of the spotlight and ingrained modesty lent her a vulnerability which undercut concerns about her militant past and her powerful drive. Others found her charismatic authority threatening.

7.  Though the protests of Paul’s National Woman’s Party are often described as “civil disobedience,” Paul believed all of her actions were completely within the law. Before Paul initiated picketing to protest the lack of a suffrage amendment in 1917, picketing was largely the province of labor organizations. After consulting with attorneys about the legality of the practice, Paul adapted the silent vigil of two earlier protests and sent “silent sentinels” to picket the White House. While labor picketing often prompted violence on both sides, Paul gave her troops strict instructions to remain non-violent. Violence was, however, visited upon them by bystanders outraged by the women’s insistence on pressing for suffrage while the country was engaged in World War I.

8.  Paul’s most colorful protests occurred after the House of Representatives passed the suffrage amendment bill. It took another eighteen months to convince the Senate to pass the amendment. To maintain pressure on Congress, Alice Paul crafted watchfire protests across from the White House in Lafayette Square, during which suffragists burned President Wilson’s words about his much-celebrated belief in democracy. They even burned Wilson in effigy to urge him to use his political power to sway the Senate.

9.  Alice Paul was not present during the frenzied effort to make Tennessee the thirty-sixth and final state to ratify the suffrage amendment. She longed to be at the Tennessee statehouse, but NWP lobbying required a constant input of cash. Her ability to raise funds surpassed anyone else’s, so she chose to stay in Washington to keep the money flowing, a decision that proved central to the NWP’s success.

10.  Alice Paul bequeathed us the iconic images of the battle for the ballot: photographs of the 1913 procession, the 1917 White House pickets, the 1918 watchfire protests. These images speak to the courage, the persistence and the fortitude of all the women who fought to gain the most fundamental right of citizenship: the right to consent.

Featured image: Alice Paul. Public Domain via Library of Congress.

11. The road to hell is mapped with good intentions

Antebellum Americans were enamored of maps. In addition to mapping the United States’ land hunger, they plotted weather patterns, epidemics, the spread of slavery, and events from the nation’s past.

And the afterlife.

Imaginative maps to heaven and hell form a peculiar subset of antebellum cartography, as Americans surveyed not only the things they could see but also the things unseen. Inspired by the biblical injunction to “Enter ye in at the strait gate: for wide is the gate, and broad is the way, that leadeth to destruction… and narrow is the way, which leadeth unto life, and few there be that find it” (Matthew 7:13-14 KJV), the maps provided striking graphics connecting beliefs and behavior in this life to the next.

12. George Burroughs: Salem’s perfect witch

On 19 August 1692, George Burroughs stood on the ladder and calmly made a perfect recitation of the Lord’s Prayer. Some in the large crowd of observers were moved to tears, so much so that it seemed the proceedings might come to a halt. But Reverend Burroughs had uttered his last words. He was soon “turned off” the ladder, hanged to death for the high crime of witchcraft. After the execution, Reverend Cotton Mather, who had been watching the proceedings from horseback, acted quickly to calm the restless multitude. He reminded them, among other things, “that the Devil has often been transformed into an Angel of Light” — that despite his pious words and demeanor, Burroughs had been the leader of Satan’s war against New England. Thus assured, the executions would continue. Five people would die that day, one of the most dramatic and important days in the course of the Salem witch trials. For the audience on 19 August realized that if a Puritan minister could hang for witchcraft, then no one was safe. Their tears and protests were the beginning of the public opposition that would eventually bring the trials to an end. Unfortunately, by the time that happened, nineteen people had been executed, one had been pressed to death, and five had perished in the wretched squalor of the Salem and Boston jails.

The fact that a Harvard-educated Puritan minister was considered the ringleader of the largest witch hunt in American history is one of the many striking oddities about the Salem trials. Yet a close look at Burroughs reveals that his character and his background personified virtually all the fears and suspicions that ignited witchcraft accusations in 1692. There was no single cause, no simple explanation for why the Salem crisis happened. Massachusetts Bay faced a confluence of events that produced the fears and doubts that led to the crisis. Likewise, a wide range of people faced charges for having supposedly committed diverse acts of witchcraft against a broad swath of the populace. Yet there were many reasons people were suspicious of George Burroughs; indeed, he was the perfect witch.

In 1680, when Burroughs was hired to be the minister of Salem Village, he quickly became a central figure in the ongoing controversy over religion, politics, and money that would span more than thirty years and result in the departure of the community’s first four ministers. One of Burroughs’s parishioners wrote to him, complaining that “Brother is against brother and neighbors against neighbors, all quarreling and smiting one another.” After a little over two years in office, the Salem Village Committee stopped paying Burroughs’s salary, so he wisely left town to return to his old job, as minister of Falmouth (now Portland, Maine).

George Burroughs spent most of his career in Falmouth, a town on the edge of the frontier. He was fortunate to escape the bloody destruction of the settlement by Native Americans in 1676 (during King Philip’s War) and 1690 (during King William’s War). The latter conflict brought a string of disastrous defeats to Massachusetts, and as many historians have noted, the ensuing war panic helped trigger the witch trials. The war was a spiritual defeat for the Puritan colony as they were losing to French Catholics allied with people they considered to be “heathen” Indians. It seemed Satan’s minions would end the Puritans’ New England experiment. Burroughs was one of many refugees from Maine who were either afflicted by or accused of witchcraft. In addition, most of the judges were military officers as well as speculators in Maine lands that the war had made worthless. Some of the afflicted refugees were suffering what today would be considered post-traumatic shock. Used to the manual labor of the frontier, Burroughs was so incredibly strong that several would testify in 1692 to his feats of supernatural strength. The minister’s seemingly miraculous escapes from Falmouth in 1676 and 1690 also brought him under suspicion. Perhaps he had done so with the help of the devil, or the Indians.

Bench in memory of George Burroughs at the Salem Witch Trials Memorial, Salem, Massachusetts. Photo by Emerson W. Baker.

Tainted by his frontier ties, the twice-widowed Burroughs’s personal life and perceived religious views amplified fears of the minister. At his trial, several testified to his secretive ways, his seemingly preternatural knowledge, and his strict rule over his wives. He forbade his wives to speak about him to others, and even censored their letters to family. Meanwhile the afflicted said they saw the specters of Burroughs’s late wives, who claimed he murdered them. The charges were groundless. However, his controlling ways and the spectacular testimony against him at least raised the question of domestic abuse. Such perceived abuse of authority — at the family, community or colony-wide level — is a common thread linking many of Salem’s accused.

Some observers believed Burroughs was secretive because they suspected he was a Baptist. This Protestant sect had legal toleration but, like the Quakers, was considered dangerous by most Massachusetts Puritans because of its belief in adult baptism and adult-only membership in the church. Burroughs admitted to the Salem judges that he had not recently received Puritan communion and had not baptized his younger children (both signs that he might be a Baptist). His excuse was that he was never ordained and hence could not lead the communion service, nor could he baptize children. However, he admitted that since leaving his post in Maine he had visited Boston and Charlestown, where ordained ministers could have provided these rites, and had failed to take advantage of the opportunity.

Even if he was not a Baptist, as a Puritan minister he was at risk. Burroughs was just one of five ministers cried out upon in 1692. Fully 30 percent of the people accused were ministers, their immediate family members, or extended kin. In many ways, the witch trials were a critique of the religious and political policies of the colony. But that is another story.

Header image taken by Emerson W. Baker.

13. The long journey to Stonewall

By Nancy C. Unger


When I was invited by the Commonwealth Club of San Francisco to participate in its month-long program “The LGBT Journey,” I was a bit overwhelmed by all the possibilities. I’ve been teaching “Lesbians and Gay Men in the U.S.” since 2002, and my enthusiasm for the subject grows every time the course is offered. It’s a passion shared by my students. They never sigh and say, “Gay and lesbian history again?”

But what to present in only forty-five minutes? My most recent scholarship examines lesbian alternative environments in the 1970s and 1980s. In the end, though, I decided to make a larger point. For many people, LGBTQ American history begins with the Stonewall Riots in 1969, so I determined to use this opportunity to talk about the history of same-sex desire that is as old as this nation.

To briefly develop that fascinating history, I touch on some of the sodomy trials in the colonial period, in which communities were surprisingly tolerant of men who were well known for seeking sexual contact with other men. I note women in early America who passed as men, often marrying other women, and develop the difficulty in determining if these were lesbians — or simply women who had no other way to earn a living wage, vote, walk the streets unescorted, and enjoy independence and autonomy. Those same questions also apply to the Boston Marriages that began forming in the late 1800s. Professional women (many of whom graduated from the new, elite women’s colleges in the Boston area) entered into lifelong partnerships with other women. Certainly, some were lesbian. But, like passing as a man, being with a person of the same sex is what allowed a woman to have a career, to travel, to enjoy all the independence that came with not being a subservient wife.

Boston Marriages and same-sex intimate friendships became less socially acceptable with increasing public awareness of same-sex desires. The “medicalization” of those desires began in the 1870s and 1880s, with the term “homosexual” coming into being around 1892. Same-sex sexual behavior acquired a name — and was defined as deviant. Arrests of men began to increase. And with the presidency of Theodore Roosevelt in 1901, homosexuality became unpatriotic and un-American. As Kevin Murphy develops in Political Manhood, in 1907 Roosevelt warned Harvard undergraduates against becoming “too fastidious, too sensitive to take part in the rough, hurly-burly of the actual work of the world.” He cautioned that “the weakling and coward are out of place in a strong and free community.” The “mollycoddle” Roosevelt warned against was sufficiently similar to emerging definitions of the male homosexual that the two were often conflated, and used to marginalize and stigmatize certain men as weak, cowardly, sissy, and potentially disloyal.

Environmentalists, denounced as being anti-progress, were ridiculed as “short haired women and long haired men.” John Muir, for example, was lampooned as both effeminate and impotent. He was depicted in drag on the front page of the San Francisco Call in 1909 for his efforts to sweep back the waters flooding Hetch Hetchy Valley.

John Muir lampooned for being effeminate in a San Francisco Call cartoon from December 13, 1909. Public domain via the Library of Congress.

Gay men and lesbians operated under a variety of burdens: religious, legal, medical, economic, and social. So how did we get to Stonewall and beyond? Out of changes wrought by World War II and the Cold War came a number of early organizations and challenges to homophobia.

In 1957 it occurred to psychologist Evelyn Hooker that all of the big medical studies on the pathology of the homosexual were based on gay men hospitalized for depression. Her report, “The Adjustment of the Male Overt Homosexual,” demonstrated that, despite pervasive homophobia, most self-identified homosexuals were no worse in social adjustment than the general population. Her work was an important step towards the American Psychiatric Association’s decision in 1973 to remove homosexuality from its list of illnesses.

Frank Kameny was a World War II combat veteran who earned his PhD in astronomy at Harvard in 1956. In the middle of the Cold War and the nascent space race, astronomers were at a premium. Kameny, however, was terminated from his position in the US Army Map Service when his arrest on a lewd conduct charge was uncovered. He took his case all the way to the Supreme Court in 1961, but lost. As John D’Emilio notes in Sexual Politics, Sexual Communities, Kameny urged his gay brothers and sisters in 1964 to quit debating whether homosexuality is caused by nature or nurture: “I do not see the NAACP and CORE worrying about which chromosome and gene produced a black skin, or about the possibility of bleaching the Negro [as the solution to racism]. I do not see any great interest on the part of the B’nai B’rith Anti-defamation League in the possibility of solving problems of anti-Semitism by converting Jews to Christians . . . We are interested in obtaining rights for our respective minorities as Negroes, as Jews, and as Homosexuals. Why we are Negroes, Jews, or Homosexuals is totally irrelevant, and whether we can be changed to Whites, Christians, or Heterosexuals is equally irrelevant . . . I take the stand that not only is homosexuality. . . not immoral, but that homosexual acts engaged in by consenting adults are moral, in a positive and real sense, and are right, good, and desirable, both for the individual participants and for the society in which they live.” In 1965 Kameny organized the picketing of the White House to protest homophobia in the government.

Clearly, queer American history did not begin with the Stonewall Riots. It’s a history of oppression that spans several centuries, but also an inspiring story of people fighting for equal rights and acceptance for all Americans.

Nancy C. Unger is Professor of History at Santa Clara University. Her publications include Fighting Bob La Follette: The Righteous Reformer and Beyond Nature’s Housekeepers: American Women in Environmental History. You can follow her on Facebook and listen to her CSPAN lecture on the subject.

14. A Q&A with John Ferling on the American Revolution

John Ferling is one of the premier historians on the American Revolution. He has written numerous books on the battles, historical figures, and events that led to American independence, most recently with contributions to The American Revolution: A Historical Guidebook. Here, he answers questions and discusses some of the lesser-known aspects of the American Revolution.

What was the greatest consequence of the American Revolution?

The greatest consequence of the American Revolution stemmed from Jefferson’s master stroke in the Declaration of Independence. His ringing declaration that “all men are created equal” and all possess the natural right to “life, liberty, and the pursuit of happiness” has inspired generations hopeful of realizing the meaning of the American Revolution.

What was the most underrated battle of the Revolutionary War?

King’s Mountain often gets lost in the shuffle, but if Washington’s brilliant Trenton-Princeton campaign was crucially important, King’s Mountain was no less pivotal. Washington’s victory was America’s first in nearly a year, King’s Mountain the first of significance in three years. Trenton-Princeton was vital for recruiting a new army in 1777; King’s Mountain stopped Britain’s recruitment of southern Tories in its tracks. Enemy losses were nearly identical at Trenton-Princeton and King’s Mountain. Finally, Sir Henry Clinton thought the defeat at King’s Mountain was pivotal, and soon thereafter he told one of his generals that with the setback “all his Dreams of Conquest quite vanish’d.”

Sketch of the Battle of Trenton by Andreas Wiederholt (b. 1752?). Public domain via Wikimedia Commons.

What’s the one unanswered question about the American Revolution you’d most like answered?

The war in the South in 1780 and 1781 is shot through with mysteries. Why did Benjamin Lincoln stay put in Charleston in 1780? He might have withdrawn to the interior, as did those defending against Burgoyne’s invasion, or he might have made a stand behind the Ashley River — as Washington did on the Brandywine — and then retreated to the interior.

Why in the summer that followed did Horatio Gates immediately take the field when his army was so unprepared and he faced no immediate threat? Why did Gates in August at Camden position his men so that the militia faced Cornwallis’s regulars?

Why, in 1781, did Sir Henry Clinton not order General Cornwallis back to the Carolinas, or summon him and most of his army to New York?

With all the mistakes, maybe the biggest mystery of the war is how anyone won.

What is your favorite quote by a Revolutionary?

George Washington by Gilbert Stuart. Public domain via Wikimedia Commons.

Aside from the egalitarian and natural rights portions of Jefferson’s Declaration of Independence, I have two favorite quotations from revolutionaries. One is that of Captain Levi Preston of Danvers, Massachusetts. When asked why he had soldiered on the first day of the war, he responded: “[W]hat we meant in going for those Redcoats was this: we always had governed ourselves and we always meant to. They didn’t mean we should.” My second favorite is Washington’s remark on learning of Lexington and Concord: a “Brother’s Sword has been sheathed in a Brother’s breast.”

Aside from John and Abigail, what was the best husband-wife duo of the Revolution?

If “best” means the duo that best aided the American Revolution, I am sure there must have been countless nameless men who bore arms while their spouses at home made bullets. But of those with whom I am familiar, I opt for Joseph and Esther Reed. He played an important role in Pennsylvania’s insurgency, served in the army and as Washington’s secretary, played a crucial role in the Continental Army’s escape after the Second Battle of Trenton, sat in the Continental Congress, and was the chief executive of his state for three years. She organized the Ladies Association in Philadelphia in 1780 and published a broadside urging women not to purchase unnecessary consumer items, but instead to donate the money that they saved to aid the soldiery in the Continental Army. Altogether, her campaign raised nearly $50,000 in four states.

What is your favorite Revolutionary War site (battlefield, home, museum, etc.) to visit today?

If limited to choosing only one site, it would be Mount Vernon. For one thing, George Washington seemed to have a hand in almost everything that occurred in America from 1753 until his death in 1799. In addition, he was a farmer, a pursuit that is alien to most of us today. Mount Vernon includes an informative museum, a functioning distillery and mill, farm land, animals, gardens, and of course the mansion, which opens a window onto the life of a wealthy Virginia planter. Those who lived there as slaves are not overlooked, and slavery at Mount Vernon is not whitewashed. Nearly a full day is required to take in everything, and at day’s end a visitor who comes without much understanding of the man and his time will leave having received a decent and illuminating introduction to Washington and eighteenth-century life and culture.

Mount Vernon by Ad Meskens. CC-BY-SA-3.0 via Wikimedia Commons.

Propaganda was important during the Revolution. What is your favorite propaganda item?

Had there been an Abraham Zapruder armed with a motion picture camera on Lexington Green on 19 April 1775, we might know precisely what occurred when the first shot was fired in the Revolutionary War. But we will never know if that shot was fired accidentally, whether it was fired by British soldiers following orders, or as some alleged if it was fired by a colonist in hiding. What is known is that soon after that historic day the Massachusetts Committee of Safety deposed witnesses of the bloody event, from which it cobbled together an account showing that the regulars opened fire after being commanded to “Fire, by God, Fire.” That account circulated before the official British report was published. In a day when knowledge of who fired the first shot to launch a war was still important, the Massachusetts radicals had scored a propaganda master stroke.

If you could time travel and visit any American city, colony, or state for one year between 1763 and 1783, which would you choose?

I would choose to be living in Boston in 1763. I would like to know what people in the city were thinking about Anglo-America prior to the Sugar and Stamp Acts and how many had ever heard of The Independent Whig. I would like to visit grog shops to discover whether there was a hint of rebellion among the workers and whether they thought Samuel Adams would ever amount to anything. While there, perhaps I could catch a game at Fenway when the St. Louis Browns come to town.

In your opinion, what was George Washington’s biggest blunder of the war?

A book about Washington’s blunders would be large, but his most baffling mistake occurred in September and October 1776. Although fully aware that he was soon to be trapped in Harlem Heights by a superior British army and utterly dominant Royal Navy, Washington made no attempt to escape his snare. His letters at the time indicate an awareness of his dilemma. They also suggest that in addition to his customary indecisiveness, Washington was not just thoroughly exhausted, but in the throes of a black depression. These assorted factors likely explain his potentially fatal torpor. He and the American cause were saved from the looming disaster by the arrival of General Charles Lee, whose advice Washington still respected. Lee took one look and urged Washington to get the army out of the trap. Washington listened, and escaped.

In your opinion, who was the most overrated revolutionary?

Franklin is the most overrated. He was not unimportant – indeed, I think he was a very great man – but as he was abroad for years, he played a minor role in the insurgency between 1765 and 1775. Furthermore, while Franklin was popular in France, Vergennes was a realist who acted in the interest of his country. It is ludicrous to think that Franklin pulled his strings.

Who was the most underrated revolutionary?

General Nathanael Greene is so underrated that many today are unaware of him. But he was the general to whom Washington turned for good advice; he made personal sacrifices to try to straighten out the quartermaster corps and waged an absolutely brilliant campaign in the Carolinas between January and March 1781. It was his heroics in the South that helped drive Cornwallis to take his fateful step into Virginia, and to his doom. Had it not been for Greene, it is difficult to envision a pivotal allied victory on the scale of Yorktown in 1781, and without Yorktown the war would have had a different ending, possibly one that did not include American independence.

Was American independence inevitable?

Chatham and Burke knew how independence could be avoided, but it involved surrendering much of Parliament’s power over the colonists. Burke also glimpsed the possibility of using proffered concessions to play on the divisions in the Continental Congress, which included many delegates who opposed a break with Britain. Burke’s notion might have worked. But from the beginning the great majority in Parliament thought that in a worst case scenario the use of force would bring the colonists to heel. Given the political realities of the day, war appears to have been virtually inevitable. Even so, independence very likely would have been prevented had Britain had an adequate number of troops in America in April 1775 or a capable general to lead the campaign for New York in 1776, someone like Earl Cornwallis.

A version of this Q&A first appeared on the Journal of the American Revolution.

John Ferling is Professor Emeritus of History at the University of West Georgia. He is a leading authority on late 18th and early 19th century American history. His latest book, Jefferson and Hamilton: The Rivalry that Forged a Nation, was published in October 2013. He is the author of many books, including Independence, The Ascent of George Washington, Adams vs. Jefferson: The Tumultuous Election of 1800, Almost a Miracle: The American Victory in the War of Independence, Setting the World Ablaze: Washington, Adams, Jefferson, and the American Revolution, John Adams: A Life, and A Leap in the Dark: The Struggle to Create the American Republic. He lives in Atlanta, Georgia.

15. On the 95th anniversary of the Chicago Race Riots

By Elaine Lewinnek


On 27 July 1919, a black boy swam across an invisible line in the water. “By common consent and custom,” an imaginary line extending out across Lake Michigan from Chicago’s 29th Street separated the area where blacks were permitted to swim from where whites swam. Seventeen-year-old Eugene Williams crossed that line. He may have strayed across it by accident or may have challenged it on purpose. We do not know his motives because the whites on the beach reacted by throwing stones and Eugene Williams drowned. Police at the beach arrested black bystanders, infuriating other blacks so much that one black man shot at the police, who returned fire, shooting into the crowd of blacks. The violence spread from there. Over the next week, in the middle of that hot summer of 1919, 38 people died, 537 were hospitalized, and approximately 1,000 were left homeless. White and black Chicagoans fought over access to beaches, parks, streetcars, and especially residential space. The burning of houses, during this riot, inflamed passions almost as much as the killing of people. It took a rainstorm and the state militia to end the violence in July 1919, which nevertheless simmered just below the surface, erupting in smaller clashes between blacks and whites throughout the next four decades, especially every May, during Chicago’s traditional moving season.

Family leaving damaged home after the 1919 Chicago race riot. From Chicago Commission on Race Relations, The Negro in Chicago: A Study of Race Relations and a Race Riot (1922). Public domain via Wikimedia Commons.

The 1910s were the first decade of the Great Migration, a decade when 70,000 blacks moved to Chicago, more than doubling the existing black population. This was also a decade when the lines of Chicago’s residential apartheid were hardening. Historically, Chicago’s blacks found homes in industrial suburbs such as Maywood and Chicago Heights, domestic service hubs such as Evanston and Glencoe, rustic owner-built suburbs such as Robbins and Dixmoor, and some recently annexed suburban space such as Morgan Park and Lilydale. Increasingly, though, blacks were confined to a narrow four-block strip around State Street on Chicago’s South Side known as the Black Belt. Half of Chicago’s blacks lived there in 1900, while 90% of Chicago’s blacks lived there by 1930.

The Black Belt was a crowded space where two or three families often squeezed into one-room apartments, landlords neglected to repair rotting floors or hinge-less doors, schools eventually ran on shifts so that each child was educated for only half a day, and the police tolerated gamblers and brothels. It was so unhealthy that Richard Wright called it “our death sentence without a trial.” Blacks who tried to move beyond the Black Belt were met with vandalism, arson, and bomb-throwers, including 24 bombs thrown in the first half of 1919 alone.

Earlier, some Chicago neighborhoods had welcomed black homeowners, but after the First World War there was an increasingly widespread belief that blacks hurt property values. Chicago realtor L. M. Smith and his Kenwood and Hyde Park Property Owners Association spread the notion that any black moving into a neighborhood was akin to a thief, robbing that street of its property values. By the 1920s, Chicago Realtors prohibited members from introducing any new racial group into a neighborhood and encouraged the spread of restrictive covenants, legally barring blacks while also consolidating ideas of whiteness. As late as 1945, two Chicago sociologists reported that, while “English, German, Scotch, Irish, and Scandinavian have little adverse effect on property values[,] Northern Italians are considered less desirable, followed by Bohemians and Czechs, Poles, Lithuanians, Greeks, and Russian Jews of the Lower class. Southern Italians, along with Negroes and Mexicans, are at the bottom of the scale.” As historians of race recognize, many European immigrants were considered not quite white before 1950. Those immigrants eventually joined the alliance of groups considered white partly because realtors, mortgage lenders, and housing economists established a bright line between the property values of “whites” and those of blacks.

The lines established in 1919 have lingered. As late as 1990, almost half of Chicago’s suburban blacks lived in the same fourteen suburbs that blacks had lived in before 1920: they had not gained access to newer spaces. It was black neighborhoods that suffered disproportionately from urban renewal and the construction of tall-tower public housing in the twentieth century, further reinforcing the overlaps between race and space in Chicago. Many whites inherit property whose value has increased because of the racist real-estate policies founded after the violence of 1919. Recently, Ta-Nehisi Coates has used the history of Chicago’s property market to publicize “The Case for Reparations” after generations of policies denying blacks access to homeowner equity.

It is worth remembering the events of 95 years ago, when Eugene Williams and 37 other people died, as Chicagoans clashed in the streets over emerging ideas of racialized property values.

Elaine Lewinnek is a professor of American Studies at California State University, Fullerton and the author of The Working Man’s Reward: Chicago’s Early Suburbs and the Roots of American Sprawl.

16. Revolution, by Deborah Wiles | Book Review

Revolution, Deborah Wiles’ second novel in The Sixties Trilogy, sends readers on a journey to Greenwood, Mississippi, in the summer of 1964, also known as “Freedom Summer.”

17. Same-sex marriage now and then

By Rachel Hope Cleves


Same-sex marriage is having a moment. The accelerating legalization of same-sex marriage at the state level since the Supreme Court’s June 2013 United States v. Windsor decision, striking down the Defense of Marriage Act, has truly been astonishing. Who is not dumbstruck by the spectacle of legal same-sex marriages performed in a state such as Utah, which criminalized same-sex sexual behavior until 2003? The historical whiplash is dizzying.

Daily headlines announcing the latest changes to the legal landscape of same-sex marriage are feeding public curiosity about the history of such unions, and several of the books that top the “Gay & Lesbian History” bestsellers lists focus on same-sex marriage. However, they tend to focus on the immediate antecedents for today’s legal decisions, rather than the historical roots of the issue.

At first consideration, it may seem anachronistic to describe a same-sex union from the early nineteenth century as a “marriage,” but this is the language that several who knew Charity Bryant (1777-1851) and Sylvia Drake (1784-1868) used at the time. As a young boy growing up in western Vermont during the antebellum era, Hiram Harvey Hurlburt Jr. paid a visit to a tailor shop run by the two women to order a suit of clothes. Noticing something unusual about the women, Hurlburt asked around town and “heard it mentioned as if Miss Bryant and Miss Drake were married to each other.” Looking back from the vantage of old age, Hurlburt chose to include their story in a handwritten memoir he left to his descendants. Like homespun suits, the women were a relic of frontier Vermont, which was receding swiftly into the distance as the twentieth century surged forward. Once upon a time, Hurlburt recalled for his relatives, two women of unusual character could be known around town as a married couple.

There were many who agreed with Hurlburt. Charity Bryant’s sister-in-law, Sarah Snell Bryant, mother to the beloved antebellum poet and journalist William Cullen Bryant, wrote to the women “I consider you both one as man and wife are one.” The poet himself described his Aunt Sylvia as a “fond wife” to her “husband,” his Aunt Charity. And Charity called Sylvia her “helpmeet,” using one of the most common synonyms for wife in early America.

The evidence that Charity and Sylvia possessed a public reputation as a married couple in their small Vermont town, and among the members of their family, goes a long way to constituting evidence that their union should be labeled as a same-sex marriage and seen as a precedent for today’s struggle. In the legal landscape of the early nineteenth century, “common law” marriages could be verified based on two conditions: a couple’s public reputation as being married, and their sharing of a common residence. Charity and Sylvia fit both those criteria. After they met in the spring of 1807, while Charity was paying a visit to Sylvia’s hometown of Weybridge, Vermont, Charity decided to rent a room in town and invited Sylvia to come live with her. The two commenced their lives together on 3 July 1807, a date that the women regarded as their anniversary forever after. The following year they built their own cottage, initially a twelve-by-twelve foot room, which they moved into on the last day of 1808. They lived there together for the rest of their days, until Charity’s death in 1851 from heart disease. Sylvia lasted another eight years in the cottage, before moving into her older brother’s house for the final years of her life.

The grave of Charity Bryant and Sylvia Drake. Photo by Rachel Hope Cleves. Do not use without permission.

Of course, Charity and Sylvia did not fit one very important criterion for marriage, common-law or statutory: that the union be established between a man and a woman. But then, their transgression of this requirement likened their union to other transgressive marriages of the age: those between couples where one or both spouses were already married, or where one or both spouses were beneath the age of consent at the formation of the union, or where one spouse was legally enslaved. In each of these latter circumstances, courts called on to pass judgment over questions of inheritance or the division of property sometimes recognized the validity of marriages even where the spouses could not legally be married according to statute. Since Charity and Sylvia never argued over property in life, and since their inheritors did not challenge the terms of the women’s wills which split their common property between their families, the courts never had a reason to rule on the legality of the women’s marriage. Ultimately, the question of whether their union constituted a legal marriage in its time cannot be resolved.

Regardless, it is vital that the history of marriage include relationships socially understood to be marriages as well as those relationships that fit the legal definition. Although the legality of same-sex marriage has been the subject of focused attention in the past decade (and the past year especially), we cannot forget that marriage exists first and foremost as a social fact. To limit the definition of marriage entirely to those who fit within its statutory terms would, for example, exclude two and a half centuries of enslaved Americans from the history of marriage. It would confuse law’s prescriptive powers with a description of reality, and give statute even more power than its oversized claims.

Awareness of how hard-fought the last decade’s legalization battle has been makes it difficult to believe that during the early national era two same-sex partners could really and truly be married. However, a close look at Charity and Sylvia’s story compels us to re-examine our beliefs. History is not a progress narrative, we all know. What’s only just become possible now may have also been possible at points in the past. Historians of the early American republic might want to ask why Charity and Sylvia’s marriage was possible in the first decades of the nineteenth century, whether it would have been so forty years later or forty years before, and what their marriage can tell us about the possibilities for sexual revolution and women’s independence in the years following the Revolution. For historians of any age, Charity and Sylvia’s story is a reminder of the unexpected openings and foreclosures that make the past so much more interesting than our assumptions.

Rachel Hope Cleves is Associate Professor of History at the University of Victoria. She is author of Charity and Sylvia: A Same-Sex Marriage in Early America.

18. The trouble with military occupations: lessons from Latin America

By Alan McPherson


Recent talk of declining US influence in the Middle East has emphasized the Obama administration’s diplomatic blunders. Its poor security in Benghazi, its failure to predict events in Egypt, its difficulty in reaching a deal on withdrawal in Afghanistan, and its powerlessness before sectarian violence in Iraq, to be sure, all are symptoms of this loss of influence.

Yet all miss a crucial point about a region of the world crawling with US troops. What makes a foreign military presence most unpopular is simply that it is, well, military. It is a point almost too obvious to make, but one that is forgotten again and again as the United States and other nations keep sending troops abroad and flying drones to take out those who violently oppose them.

The warnings against military occupation are age-old. The Founding Fathers understood the harshness of an occupying force, at least a British one, and thus declared in the Third Amendment that “no soldier shall, in time of peace be quartered in any house, without the consent of the owner, nor in time of war, but in a manner to be prescribed by law.”

US occupations of small Latin American countries a century ago taught a similar lesson. What was most irksome for those who saw the US marines occupy Nicaragua, Haiti, and the Dominican Republic between 1912 and 1934 was not the State Department’s desire to protect US lives and property. It also wasn’t its paranoia about German gunboats during World War I. It wasn’t even the fact itself of intervention. While some of these occupations began with minimal violence and a surprising welcome by occupied peoples, as months turned into years, the behavior of US troops on the ground became so brutal, so arbitrary, and so insensitive to local cultures that they drove many to join movements to drive the marines back into the sea.

Ocupación militar del 1916 en República Dominicana by Walter. CC-BY-SA-3.0 via Wikimedia Commons.

It was one thing to subdue armed supporters of the regimes overthrown by the United States in all three countries. Latin Americans often met the demise of those unpopular governments with relief. Besides, armed groups recognized the overwhelming training and firepower of the marines and agreed readily to disarmament. As one Haitian palace guard recalled of the marine landing of 1915, “Everyone fled. Me too. You had only to see them, with their weaponry, their massive, menacing appearance, to understand both that they came to do harm to our country and that resistance was futile.”

It was the violence during “peace” time that turned the masses against occupation. At times marines and their native constabularies hunted small groups of insurgents and treated all locals as potential traitors. Other times they enforced new regulations, replaced local political officials, or militarized borders. And they could do it all with virtual impunity since any of their crimes would be tried by their own in US-run military provost courts.

Abuses were particularly frequent and grave either when no marines supervised constabularies or when a single marine — often a non-commissioned officer elevated to officer status in the constabulary — unleashed a reign of terror in a small town.

The residents of Borgne, in Haiti, for instance, hated a Lieutenant Kelly for approving beatings and imprisonments for trivial crimes or for no apparent reason at all. In the Dominican Republic, around Hato Mayor, Captain Thad Taylor ruled over his own fear-filled fiefdom. As one marine described it, Taylor “believed that all circumstances called for a campaign of frightfulness; he arrested indiscriminately upon suspicion; then people rotted in jail pending investigation or search for evidence.” In Nicaragua, the “M company” was widely accused of violence against children, especially throwing them into the air and spearing them with bayonets. In Haiti, a forced labor system known as the corvée saw Americans stop people, peasants, servants, or anyone else, in the street and make them work, particularly to build roads. The identification of the corvée with occupation abuse was so strong that, years later, Haitians gladly worked for foreign corporations but refused to build roads for them.

Rape added an element of gendered terror to abuses. The cases that appear in the historical record — many more likely went unreported — indicate an attitude of permissiveness, fed by the occupations’ monopoly of force, that bred widespread fear among occupied women. Assault was so dreaded that Haitian women stopped bathing in rivers.

Marines were none too careful about keeping abuses quiet and perhaps wanted them to be widely known so as to terrorize the populace. They repeatedly beat and hanged occupied peoples in town plazas, walked them down country roads with ropes around their necks, and ordered them to dig graves for others.

Brutality also marked the behavior of US troops in the cities, where there was no open rebellion. Often, marines and sailors grew bored and turned to narcotics, alcohol, and prostitution, vices that were sure to bring on trouble. A Navy captain described the humiliation suffered by Nicaraguan police, soldiers, and artisans, who all made less money than local prostitutes. He also noted the “racial feeling, . . . which leads to the assumption of an air of superiority on the part of the marines.” In response, the US minister requested that Nicaragua provide space for a canteen, dance hall, motion picture theatre, and other buildings to occupy the marines’ time.

Even the non-violent aspects of occupation were repressive. A French minister in Haiti noted that all Haitians resented intrusions in their daily freedoms, especially the curfew. Trigger-happy US soldiers on patrol also exhibited “an extraordinary lack of discipline and in certain cases, an incredible disdain for propriety.” Dominicans considered US citizens to be “hypocrites” because they drank so much abroad while living under prohibition at home. Reviewing such cases, a commanding officer assessed that 90 percent of his “troubles with the men” stemmed from alcohol. In a particularly terrifying incident, private Mike Brunski left his Port-au-Prince legation post at 6 a.m. and started shooting Haitians “apparently without provocation,” killing one and wounding others. He walked back to the legation “and continued his random firing from the balcony.” Medical examiners pronounced Brunski “sane but drunk.”

Abuse proved a recruiting bonanza for insurgents. Little useful information, and even less military advance, was gained from abuse and torture in Nicaragua. A Haitian group also admitted that “internal peace could not be preserved because the permanent and brutal violation of individual rights of Haitian citizens was a perpetual provocation to revolt.” Terror otherwise hardened the population against the occupation. Many in the Dominican Republic later testified with disgust to having seen lynchings with their own eyes. There and in Haiti, newspaper editors braved prison sentences to publish tales of atrocities.

By 1928, the State Department’s Sumner Welles observed that occupation “inevitably loses for the United States infinitely more, through the continental hostility which it provokes, through the fears and suspicions which it engenders, through the lasting resentment on the part of those who are personally injured by the occupying force, than it ever gains for the United States through the temporary enforcement of an artificial peace.”

Hostile, troublesome, horrifying: the US military in Latin America encountered a range of accusations as it went on believing that it brought security and prosperity to the region. Sound familiar? The lessons of Latin American resistance a century ago are especially relevant today, when counterinsurgency doctrine posits that a successful occupation, such as the one by coalition forces in Afghanistan, must win over local public opinion in order to bring lasting security. Thankfully, it appears that the lesson has sunk in to a certain extent, with diplomats increasingly reclaiming their role from the military as go-betweens with locals. Yet much of the damage has been done, and in the case of drones, no amount of diplomacy can assuage the feeling of terror felt throughout the countryside of Afghanistan and Pakistan when that ominous buzzing sound approaches overhead. It is, perhaps for better and certainly for worse, the new face of US military occupation.

Alan McPherson is a professor of international and area studies at the University of Oklahoma and the author of The Invaded: How Latin Americans and their Allies Fought and Ended US Occupations.

19. Putting an end to war

By Barry S. Levy and Victor W. Sidel


War is hell. War kills people, mainly non-combatant civilians, and injures and maims many more — both physically and psychologically. War destroys the health-supporting infrastructure of society, including systems of medical care and public health services, food and water supply, sanitation and sewage treatment, transportation, communication, and power generation. War destroys the fabric of society and damages the environment. War uproots individuals, families, and often entire communities, making people refugees or internally displaced persons. War diverts human and financial resources. War reinforces the mistaken concept that violence is an acceptable way of resolving conflicts and disputes. And war often creates hatreds that are passed on from one generation to the next.

During the Korean War, a grief stricken American infantryman whose friend has been killed in action is comforted by another soldier. In the background, a corpsman methodically fills out casualty tags. Haktong-ni area, Korea. August 28, 1950. Public domain via Wikimedia Commons.

War is hell. Yet we, as a society, have sanitized the reality of “war” in many ways. In the absence of a draft, many of us have no direct experience of war and do not even personally know people who have recently fought in war. And the US Congress has long since ceded to the President its authority to declare war.

The government and the media infrequently use the word “war.” Instead, they use many euphemisms for “war,” such as “military campaign” and “armed conflict,” and for the tactics of war, like “combat operations” and “surgical strikes.” Nevertheless, we, as a society, often think in a war-like context. We use “war” as a metaphor: the War on Poverty, the War on Cancer, the War on Drugs. And militaristic metaphors pervade the language of medicine and public health: Patients battle cancer. Physicians fight AIDS. Health care providers addressing especially challenging problems work on the front lines or in the trenches. Public health workers target vulnerable populations. And the main office of the leading professional organization in public health is called “headquarters.”

We envision a world without war and see the need to develop the popular and political will to end war. To create a world without war, we, as a society, would need to stop using sanitized phrases to describe “war” and to stop thinking in a militaristic context. But much more would need to be done to create a world without war.

A central concept in public health for the prevention of disease is the use of a triangle, with its three points labeled “host,” “agents,” and “environment.” This concept could be applied for developing strategies to create a world without war — strategies aimed at the host (people), strategies aimed at agents (weapons of war and the military), and strategies aimed at the environment (the conditions in which people live).

Strategies aimed at people could include promoting better understanding and more tolerance among people and among nations, promoting economic and social interdependency among nations, promoting nonviolent resolution of disputes and conflicts, and developing the popular and political will to prevent war and promote peace.

Strategies aimed at weapons of war and the military could include controlling the international arms trade, eliminating weapons of mass destruction, reducing military expenditures, and intervening in disputes and conflicts to prevent war.

Strategies aimed at improving the conditions in which people live — which often contribute to the outbreak of war — could include protecting human rights and civil liberties, reducing poverty and socioeconomic inequalities, improving education and employment opportunities, and ensuring personal security and legal protections.

War is hell. A world without war would be heavenly.

Barry S. Levy, M.D., M.P.H. is an Adjunct Professor of Public Health at Tufts University School of Medicine. Victor W. Sidel, M.D. is Distinguished University Professor of Social Medicine Emeritus at Montefiore Medical Center and Albert Einstein Medical College, and an Adjunct Professor of Public Health at Weill Cornell Medical College. Dr. Levy and Dr. Sidel are co-editors of the recently published second edition of Social Injustice and Public Health as well as two editions each of the books War and Public Health and Terrorism and Public Health, all of which have been published by Oxford University Press. They are both past presidents of the American Public Health Association.

20. The top 10 historic places from the American Revolution

In 1996, Congress commissioned the National Park Service to compile a list of sites and landmarks that played a part in the American Revolution. From battlefields to encampments, meeting houses to museums, these places offer us a chance to rediscover the remarkable men and women who founded this nation and to recognize the relevance of not just what they did, but where they did it. Frances Kennedy, editor of The American Revolution: A Historical Guidebook, has compiled a list of her favorite places, and why.

1.  Minute Man National Historical Park, Concord, Lincoln, and Lexington, Massachusetts

The American Revolution began years before the first battles of the Revolutionary War, fought at Lexington and Concord in April 1775. In 1765 Parliament passed the Stamp Act, which imposed taxes on the colonies. After the colonists began protests, the British ordered soldiers to Boston. In April 1775, after the royal governor of Massachusetts learned that colonists were storing military supplies in Concord, he ordered British regulars to march from Boston to seize the supplies. The fighting that followed produced “the shot heard round the world.” Men on both sides were killed in Lexington and Concord; more British soldiers were killed on their march back to Boston by militiamen firing from behind walls and trees. The Historical Park includes the sites of the battles at Lexington Green and at North Bridge in Concord. In July, the Continental Congress named George Washington commander in chief of the Continental Army.

Minute Man National Park, Concord, MA by Jay Sullivan. CC-BY-SA-3.0 via Wikimedia Commons.

2.  Boston National Historical Park, Dorchester Heights in South Boston, Massachusetts

In early March 1776, General Washington surprised the British forces occupying Boston by fortifying Dorchester Heights, the hills high above the ships in the harbor. He had received about 60 pieces of artillery in late February, after Colonel Henry Knox, formerly a Boston bookseller, had succeeded in dragging them across more than 200 miles of ice and snow from Fort Ticonderoga on Lake Champlain. Washington’s successful siege forced the British to evacuate Boston. They sailed out of the harbor on March 17 toward New York. The tall white marble monument on the Heights commemorates the siege.

3.  Independence National Historical Park, Philadelphia, Pennsylvania

The clocktower at Independence Hall. Philadelphia, PA by Captain Albert E. Theberge, NOAA Corps (ret.) (NOAA Photo Library: amer0024). CC-BY-2.0 via Wikimedia Commons.

Great beginnings in America are commemorated in the Historical Park. In Carpenters’ Hall in 1774, delegates from twelve colonies (all except Georgia) met in the First Continental Congress to consider how to respond to acts of Parliament that threatened their liberties. The State House of the Province of Pennsylvania, now Independence Hall, was the meeting place for the Second Continental Congress, beginning on May 10, 1775. In 1776 Thomas Jefferson drafted the Declaration of Independence; Congress edited it and, on July 4, adopted it in the same Hall. After the Articles of Confederation were ratified, the Confederation Congress met in the Hall until June 21, 1783. The Constitutional Convention met in the Hall from May to September 1787 and approved the Constitution on September 17.

4.  Washington Crossing Historic Park, Washington Crossing, Pennsylvania

Washington crossed the Delaware River on Christmas night 1776 in a raging snowstorm, and the next day he won the battle of Trenton, New Jersey. Henry Knox, the same officer who had dragged the artillery from Fort Ticonderoga for the successful siege of Boston, was in command of the crossing. Among the men who managed the boats in the dangerous crossing, amid floating cakes of ice, were the mariners of Colonel John Glover’s Massachusetts regiment, which included blacks, American Indians, and whites. The historic river crossing is commemorated in the two parks, one on each side of the river.

5.  Saratoga National Historical Park, Stillwater, New York

In July 1777 General Burgoyne’s 8,000-man British army, which included German soldiers and American Indians, marched down the Hudson River Valley with the goal of splitting the states and isolating New England. Slowed by the dense forests and its long supply line, the army finally crossed the river in mid-September, won a costly victory against the Americans in the battle of Freeman’s Farm on September 19, lost a second battle at Bemis Heights on October 7, and was forced by the American siege to surrender 6,800 soldiers on October 17. The Historical Park, along the Hudson River, includes the sites of the battles and the British camp.

6.  Valley Forge National Historical Park, King of Prussia, Pennsylvania

The army barely survived the 1777-1778 winter in the Valley Forge camp. More than 2,000 soldiers died as a result of inadequate food and clothing. Conditions improved in March after Nathanael Greene was appointed quartermaster general. Baron von Steuben, a former Prussian officer recommended to Washington by Benjamin Franklin, arrived at the camp in February and began a training regimen for the army. By June 1778, when the soldiers marched out of Valley Forge, they were better trained and more confident. The 5.4-square-mile Historical Park includes historic buildings, sites of the brigade encampments, the historic trace road, monuments, and the camp defense lines.

Valley Forge National Historical Park. National Park Service Digital Image Archives. Public domain via Wikimedia Commons.

7.  Kings Mountain National Military Park, Blacksburg, South Carolina

In the fall of 1780, the patriots known as over-mountain men rode from their farms in Virginia, present-day Tennessee, and North Carolina to battle British Major Patrick Ferguson and his command of loyalists. After riflemen from South Carolina joined the patriots, their force numbered about two thousand. On October 7, they attacked Ferguson’s camp, killed him and one-third of his force, and captured about 650 loyalists in the hour-long battle. It was the first major patriot victory after the British capture of Charleston the previous May. The Military Park includes the battle site, monuments, and Ferguson’s grave. The Overmountain Victory National Historic Trail traces the patriots’ routes on today’s maps.

8.  Cowpens National Battlefield, Gaffney, South Carolina

Cowpens, a pastureland in South Carolina, is the site of General Daniel Morgan’s attack on Lieutenant Colonel Banastre Tarleton’s British Legion on January 17, 1781. Called a tactical masterpiece, Morgan’s battle plan opened with militia sharpshooters, who fired two volleys and then fell back behind the Continentals, who attacked next, followed by the cavalry. In less than an hour, the British suffered nearly 1,000 casualties, including about 500 captured. Tarleton escaped and rode to General Cornwallis to report the bad news. The 845-acre park includes the battlefield and monuments.

9.  Guilford Courthouse National Military Park, Greensboro, North Carolina

In the small community of Guilford Courthouse, General Nathanael Greene used General Morgan’s Cowpens tactics to battle General Cornwallis on March 15, 1781. Cornwallis won the battle against Greene’s militia and Continentals, but his force suffered such heavy casualties that he was left with only about 1,500 soldiers able to fight. He retreated to the coast and decided to march north to join the British force in Virginia. The Military Park includes the battlefield and monuments.

10.  Colonial National Historical Park, Yorktown, Virginia

In May 1781, General Cornwallis took command of the British soldiers in Virginia and waited for reinforcements to arrive from General Henry Clinton in New York. In mid-August, General Washington and the Comte de Rochambeau, in command of the French army in America, began their march south after learning that the French fleet was sailing for the Chesapeake Bay. On September 5 the French defeated the British fleet in a battle off the mouth of the Bay, and Cornwallis could not be reinforced. By September 26 the American and French armies, about 18,000 soldiers, had arrived at Yorktown and put Cornwallis’s 5,500-man force under siege. On October 19, Cornwallis surrendered, ending the siege at Yorktown, the last major battle of the Revolutionary War. The Historical Park includes Historic Jamestowne in Jamestown and the Yorktown Battlefield in Yorktown. The Washington-Rochambeau Revolutionary Route National Historic Trail traces on today’s maps the routes taken by the two armies: the French from Rhode Island, and both together south from New York.

Frances H. Kennedy is a conservationist and historian. Her books include The Civil War Battlefield Guide, American Indian Places, and, most recently, The American Revolution: A Historical Guidebook.

21. The Ultimate Summertime Food!

Hot Diggity Dog: The History of the Hot Dog

By Adrienne Sylver; illustrated by Elwood H. Smith

 

Summer is approaching with the speed at which kids consume hot dogs at a summer barbecue. So what better picture book to greet its approach than Adrienne Sylver’s homage to the hot dog?

This historical hymn to the hot dog will put parents and young readers in touch with their inner child and bring back fond memories of favorite hot dog hangouts and haunts. It certainly did for me. Mine was Callahan’s in Fort Lee, New Jersey. Since 1950, this roadside stand sat next to a rival hot dog stand called Hiram’s (yes, these TWO distinct places existed side by side for many years, each with its own faithful followers) and quenched the taste buds of hot dog hounds from near and far! The motto of Callahan’s was “So Big, So Good,” and it was. More to follow, faithful frank followers!

But Ms. Sylver’s picture book lets us in on the facts of how Americans fell in love with the “dog on a bun,” aka the frank, wiener, or “red hot,” as it’s known regionally. We continue to consume hot dogs at a rate of 2 billion each July! A tasty tidbit Ms. Sylver mentions: that’s enough hot dog links to encircle the earth seven times. Actually, the hot dog is not as all-American a food as you may think. It seems to have originated in Frankfurt, Germany, or Vienna, Austria, in the 1400s, and it found its way to America in the 1860s.

Kids will love learning facts about the birth of the bun. Did you know hot dogs were first sold bunless? Ouch! That burns the fingers! After losing a ton of the gloves he handed out as hot dog “holders,” one vendor asked his brother-in-law, who happened to be a baker, for a hand. The baker handed him a BUN for his franks. Necessity definitely was the “Mother of Invention” here!

The famous Coney Island hot dog sold for a mere nickel, and during the Great Depression, when jobs (and therefore food) were scarce, a nickel could buy a great meal. Their popularity boomed!

Young readers will find out how the hot dog found its way to ballparks and learn that actors like the famous Humphrey Bogart were devotees of the dog. Bogart declared, “A hot dog at the ball park is better than a steak at the Ritz!”

Elwood H. Smith’s lively illustrations match this hot dog handbook to a tee! And the sidebar facts and suggestions included on each page will pique your child’s interest. It’s the perfect summer read. So, now let’s get back to MY favorite hot dog haven. Callahan’s has morphed into a food truck, as that seems to be the new craze today. You follow the truck for one of the greatest hot dogs EVER. Even the relish is a special blend! Now where is that truck parked next? Hmmmm. Callahan’s forever!

Here is a link to let you in on a taste of the history of Callahan’s! http://www.callahanshotdogs.com/


22. Best Selling Picture Books | July 2014

Three of the books in The Children's Book Review's best-selling picture books list for July fall under the category of American history. Each is deliciously rich in visual cues.

23. July 4th and the American Dream in a season of uncertainty

By Jim Cullen


There’s not much history in our holidays these days. For most Americans, they’re vehicles of leisure more than remembrance. Labor Day means barbecues; Washington’s Birthday (lumped together with Lincoln’s) is observed as a presidential Day of Shopping. The origins of Memorial Day in Confederate grave decoration or Veterans Day in the First World War are far less relevant than the prospect of a day off from work or school.

Independence Day fares a little better. Most Americans understand it marks the birth of their national identity, and it’s significant enough not to be moved around to the first weekend of July (though we’re happy enough when, as is the case this year, it conveniently forms the basis of a three-day weekend). There are flags and fireworks abundantly in evidence. That said, the American Revolution is relatively remote to 21st-century citizens, few of whom have ancestral ties to, much less sympathy for, the views of the partisans of 1776, some of whom were avowedly pro-slavery and all of whom were what we would regard as woefully patriarchal.

The Declaration of Independence by John Trumbull. Public Domain via Wikimedia Commons.

The main reason why Independence Day matters to us is that it commemorates the debut of the Declaration of Independence in American life (Congress actually voted for independence on July 2nd; it adopted the document announcing the decision two days later). Far more than any other document in American history, including the Constitution, the Declaration resonates in everyday American life. We can all cite its famous affirmation of “life, liberty, and the pursuit of happiness” because in it we sense our true birthright, the DNA to which we all relate. The Declaration gave birth to the American Dream — or perhaps I should say an American Dream. Dreams have never been the exclusive property of any individual or group of people. But never had a place been explicitly constituted to legitimate human aspiration in a new and living way. Dreams did not necessarily come true in the United States, and there were all kinds of politically imposed barriers to their realization alongside those that defied human prediction or understanding. But such has been the force of the idea that US history has been widely understood — in my experience as a high school teacher working with adolescents, instinctively so — as a progressive evolution in which barriers are removed for ever-widening concentric circles that bring new classes of citizens — slaves, women, immigrants, gays — into the fold.

This is, in 2014, our mythic history. (I use the word “myth” in the anthropological sense, as a widely held belief whose empirical reality cannot be definitively proved or denied.) But myths are not static things; they wax and wane and morph over the course of their finite lives. As with religious faith, the paradox of myths is that they’re only fully alive in the face of doubt: there’s no need to honor the prosaic fact or challenge the evident falsity. Ambiguity is the source of a myth’s power.

Here in the early 21st century, the American Dream is in a season of uncertainty. The myth does not assert that all dreams do come true, only that all dreams can come true, and for most of us the essence of can resides in a notion of equality of opportunity. We’ve never insisted on equality of condition (indeed, relatively few Americans ever had much desire for it, in stark contrast to other peoples in the age of Marx). Differential outcomes are more than fine as long as we believe it’s possible anyone can end up on top. But the conventional wisdom of our moment, from the columns of Paul Krugman to the pages of Thomas Piketty, suggests that the game is hopelessly rigged. In particular, race and class privilege seem to give some people an insuperable advantage over those seeking to achieve upward mobility. The history of the world is full of Ciceros and Genghis Khans and Joans of Arc who improbably overcame great odds. But in the United States, such people aren’t supposed to be exceptional. They’re supposed to be almost typical.

One of the more curious aspects of our current crisis in equality of opportunity is that it isn’t unique in American history. As those pressing the point frequently observe, inequality is greater now than at any time since the 1920s, and before that the late nineteenth century. Or, before that, the antebellum era: for slaves, the difference between freedom and any form of equality — now so seemingly cavernous, even antithetical — was understandably hard to discern. And yet the doubts about the legitimacy of the American Dream, always present, did not seem quite as prominent in those earlier periods as they do now. Frederick Douglass, Horatio Alger, Emma Lazarus: these were soaring voices of hope during earlier eras of inequality. F. Scott Fitzgerald’s Jay Gatsby was a cautionary tale, for sure, but the greatness of his finite accomplishments was not denied even by the normally skeptical Nick Carraway. What’s different now may not be our conditions so much as our expectations. Like everything else, they have a price.

I don’t want to brush away serious concerns: it may well be that on 4 July 2014 an American Dream is dying, that we’re on a downward arc different than that of a rising power. But it is perhaps symptomatic of our condition — a condition in which economic realities are considered the only ones that matter — that the Dream is so closely associated with notions of wealth. We all know about the Dreams of Andrew Carnegie, Henry Ford, Bill Gates, and Mark Zuckerberg. But the American Dream was never solely, or even primarily, about money — even for Benjamin Franklin, whose cheeky subversive spirit lurks beneath his adoption as the patron saint of American capitalism. Anne Bradstreet, Thomas Jefferson, Martin Luther King: some of these people were richer than others, and all had their flaws. But none of them thought of their aspirations primarily in terms of how wealthy they became, or measured success in terms of personal gain. Their American Dreams were about their hopes for their country as a better place. If we can reconnect our aspirations to their faith, perhaps our holidays can become more active vessels of thanksgiving.

Jim Cullen is chair of the History Department of the Ethical Culture Fieldston School in New York. He is the author of The American Dream: A Short History of an Idea that Shaped a Nation and Sensing the Past: Hollywood Stars and Historical Visions, among other books. He is currently writing a cultural history of the United States since 1945.

24. Mapping the American Revolution

By Frances H. Kennedy


From the rocky coast of Maine to the shores of northern Florida to the cornfields of Indiana, there are hundreds of sites and landmarks in the eastern United States that are connected to the American Revolution. Some of these sites, such as Bunker Hill and Valley Forge, are better known, and others are more obscure, but all are integral to learning about where and how American independence was fought for, and eventually secured. Beginning with the Boston Common, first occupied by British troops in 1768, and closing with Fraunces Tavern in New York, where George Washington bade farewell to his officers on 4 December 1783, this map plots the locations of these sites and uses The American Revolution: A Historical Guidebook to explain why they were important.

Frances H. Kennedy is a conservationist and historian. Her books include The Civil War Battlefield Guide, American Indian Places, and, most recently, The American Revolution: A Historical Guidebook.

25. What if the Fourth of July were dry?

By Kyle G. Volk


In 1855, the good citizens of the state of New York faced this very prospect. Since the birth of the republic, alcohol and Independence Day have gone hand in hand, and in the early nineteenth century alcohol went hand in hand with every day. Americans living then downed an average of seven gallons of alcohol per year, more than twice what Americans drink now. In homes and workshops, churches and taverns; at barn-raisings, funerals, the ballot box; and even while giving birth — they lubricated their lives with ardent spirits morning, noon, and night. If there was an annual apex in this prolonged cultural bender, it was the Fourth of July, when many commemorated the glories of independence with drunken revelry.

Beginning in the 1820s, things began to change. A rising lot of middle-class evangelical Protestants hoped to banish the bottle not only from the nation’s birthday but from the nation itself. With the evidence of alcohol’s immense personal and social costs before them, millions of men and women joined the temperance crusade and made it one of America’s first grass-roots social movements. Reformers demonized booze and made “teetotalism” (what we call abstinence) a marker of moral respectability. As consumption levels began to fall by mid-century, activists sought to seal their reformation with powerful state laws prohibiting the sale of alcohol. They insisted — in true democratic fashion — that an overwhelming majority of citizens were ready for a dry America.

State legislators played along, initiating America’s first experiment with prohibition not in the well-remembered 1920s but rather in the 1850s. It was a time when another moral question — slavery — divided the nation, but it was also a time when hard-drinking Irish and German immigrants — millions of them Catholics — threatened to overwhelm Protestant America. With nativism in the air, 13 states enacted prohibition laws. Predicting the death of alcohol and the salvation of the nation, temperance reformers set out to see these laws enforced.

New York’s moment came in 1855. The state legislature passed a prohibition statute in the spring and chose the Fourth of July for the measure to take effect. If prohibition was going to work, it had to work on the wettest day of the year. The bold timing was not lost on contemporaries, who imagined the “sensation” that would undoubtedly accompany a dry Independence Day. “Cannon will have a novel sound in the ears of some people” and “flags will have a curious look to some eyes,” the prohibitionist New York Times jibed. American eyes and ears, of course, had long been impaired by “brandy smashes” and “gin slings.” But now a proper, sober celebration of the nation’s birth could proceed without alcohol’s irreverent influence.

Laborers dispose of liquor during Prohibition. Public domain via Wikimedia Commons.

A stiff cocktail of workingmen and entrepreneurs, immigrants, and native-born Americans, however, burst onto the scene to keep the Fourth of July, and every day thereafter, wet. These anti-prohibitionists condemned prohibition as an affront to their cultural traditions and livelihoods. To them, prohibition exposed the grave threat that organized moral reform and invasive state governments posed to personal liberty and property rights. It revealed American democracy’s despotic tendencies — what anti-prohibitionists repeatedly called the “tyranny of the majority.” Considering themselves an oppressed minority, liquor dealers, hotel keepers, brewers, distillers, and other alcohol-purveying businessmen led America’s first wet crusade. In the process, they became critical pioneers in America’s lasting tradition of popular minority-rights politics. As the Fourth of July approached, they initiated opinion campaigns, using mass meetings and the press to bombard the public with anti-prohibitionist propaganda that placed minority rights and constitutional freedom at the heart of America’s democratic experiment. They formed special “liquor dealer associations” and used them to raise funds, lobby politicos, hire attorneys, and determine a course of resistance once prohibition took effect.

In some locales their public-opinion campaigns worked, as skittish officials refused to enforce the law. As the New York Times grudgingly observed of the Fourth of July in Manhattan, “The law was in no respect observed.” But elsewhere, officials threatened enforcement and reports of a dry Fourth circulated. In Yonkers, for example, there wasn’t “a drop of liquor to be had.” With prohibition enforced in Brooklyn, Buffalo, and elsewhere, anti-prohibitionists implemented their plan of civil disobedience — intentionally and peacefully resisting a law that they deemed unjust and morally reprehensible. They defiantly sold booze and hoped to be arrested so they could “test” prohibition’s constitutionality in court. Liquor dealer associations organized these efforts and guaranteed members access to their legal defense funds to cover the costs of fines and litigation. The battle had fully commenced.

In New York, as in other states, anti-prohibitionists’ activism paid off. Their efforts soon turned prohibition into a dead letter throughout the state, and they convinced New York’s Court of Appeals to declare prohibition unconstitutional. The Fourth of July had been a dry affair in many New York towns in 1855, but anti-prohibitionists ensured that the national birthday in 1856 was a wet one. Their temperance adversaries, of course, would persist and emerge victorious when national Prohibition in the form of the 18th Amendment took full effect in 1920. But anti-prohibitionists continued to counter with tactics intended to protect civil liberties and minority rights in America’s democracy. That Independence Day today remains a wet affair owes much to their resistance and to the brand of minority-rights politics they popularized in the mid-nineteenth century.

Kyle G. Volk is Associate Professor of History at the University of Montana. He is author of Moral Minorities and the Making of American Democracy, recently published by Oxford University Press.

