JacketFlap connects you to the work of more than 200,000 authors, illustrators, publishers and other creators of books for Children and Young Adults. The site is updated daily with information about every book, author, illustrator, and publisher in the children's / young adult book industry. Members include published authors and illustrators, librarians, agents, editors, publicists, booksellers, publishers and fans. Join now (it's free).
Viewing: Blog posts tagged with "american history," most recent at top. Results 1-25 of 327.
In mid-February the Public Broadcasting Service aired a four-hour documentary entitled The Italian Americans, an absorbing chronicle of one immigrant group’s struggles and successes in America. It has received rave reviews across the country. For all its virtues, however, the film falls short in at least one important respect.
The recent letter written by 47 Republican senators to the government of Iran about nuclear negotiations has revived talk about the classic phrase “politics stops at the water’s edge.” The tag line, arguing that partisanship should be put aside in foreign policy, is often attributed to Senator Arthur Vandenberg (R-Michigan) who used it in endorsing some of the diplomatic initiatives of the Democratic Truman administration at the start of the Cold War.
The book is a nonfiction picture book about Dr. Gordon Sato, whose mangrove tree-planting project transformed an impoverished village in Eritrea into a self-sufficient community. Dr. Sato named his project the Manzanar Project, partly inspired by the time he spent as a child in Manzanar, a Japanese internment camp in California.
A few weeks ago, Susan Roth, co-author and illustrator of the book, received this message from someone who had known Dr. Sato a very long time ago (reposted with permission):
A few years ago, Dr. Gordon Sato sent me a copy of your book, “Mangrove Tree” and I would like to share with you the Gordon Sato that I know. I too was imprisoned at Manzanar because I looked like the enemy. I took 24 units of UC educational courses to qualify as Provisional High School teacher at Manzanar. I was selected to teach high school Physics. Gordon Sato was a student in my Physics class. It was some forty years after Manzanar closed that Gordon Sato phoned me and said he wanted to come and see me. He told me that he had received a BS degree from USC in 1951 and his Doctorate degree from Caltech in 1955. He said he was ready to go to Eritrea, Africa on a scientific project to help Eritrea out of poverty. He said he called it The Manzanar Project and handed me a copy of that project. I did not know of all of the scientific research he had done nor the scientific accomplishment he had achieved. While this Nisei has dedicated his life for humanity, I want you to know the other Gordon Sato.
For a student to seek his former teacher is in itself a wonderful tribute to me. But then, at our meeting, Gordon Sato said he wanted to thank me for inspiring him to get a college education. Two little words, “Thank You” showed me a man who stands tall among all of us with courage and humility. I too had hoped that something good would come out of that place of injustice. Little did I know that I had planted a seed that would blossom into something beautiful for the world to see. That is the Gordon Sato that I know.
We love this reminder that behind every leader, innovator, scientist, and world changer, there’s a great teacher! Thank you, Gordon Sato, and thank you, Tadashi Kishi!
Who here used to play the computer game “Oregon Trail” obsessively as a child? Let’s see a show of hands. Think back fondly on the days you used to carefully select your wagon train, hunt for buffalo, and decide whether you needed to ford the river or caulk your wagon. (Sometimes, when I am driving, I feel like I am rafting down the Columbia River and trying to avoid boulders and like driftwood and stuff. A fun fact about me, I know. If you would like to relive the magic, you can play the game here, btw. It’s not perfect but it sparked my interest in this period of American history as a kid.) Anyway. When I found out that Under the Painted Sky was about two young women – one Chinese-American, one African-American – who cross-dress as teenage boys in order to navigate the Oregon Trail – I was sold. If...
March is Women’s History Month and as the United States gears up for the 2016 election, I propose we salute a pathbreaking woman candidate for president. No, not Hillary Rodham Clinton, but Shirley Chisholm, who became the first woman and the first African American to seek the nomination of the Democratic Party for president. And yet far too often Shirley Chisholm is seen as just a footnote or a curiosity, rather than as a serious political contender who demonstrated that a candidate who was black or female or both belonged in the national spotlight.
Speaker of the House John Boehner is learning the enduring truth of Lyndon Johnson’s famous distinction between a cactus and a caucus. In a caucus, said LBJ, all the pricks are on the inside. Presumably Speaker Boehner seldom thinks about his Republican predecessors as leaders of the House.
Lucy Stone, a nineteenth-century abolitionist and suffragist, became by the 1850s one of the most famous women in America. She was a brilliant orator, played a leading role in organizing and participating in national women’s rights conventions, served as president of the American Equal Rights Association [...]
What do opera singer Leontyne Price, activist Victoria Gray Adams, civil rights organizer Bayard Rustin, and Harvard sociologist William Julius Wilson have in common? They all attended or graduated from Wilberforce University. Located outside of Dayton, Ohio, Wilberforce was the first institution of higher education to be owned and operated by African Americans.
If it were not for his impeachment on 24 February 1868, and the subsequent trial in the Senate that led to his acquittal, Andrew Johnson would probably reside among the faded nineteenth-century presidents whom only historical specialists now remember. Succeeding to the White House after the murder of Abraham Lincoln in April 1865, Johnson proved to be a presidential failure [...]
Last week marked two important events in the unfinished story of southern racial violence. On February 10, the Alabama-based Equal Justice Initiative released Lynching in America, an unflinching report that documents 3,959 black victims of mob violence in twelve southern states between 1877 and 1950.
On Valentine’s Day, we usually think of romance and great love stories. But there is another type of love we often overlook: love between friends, particularly between men and women in a platonic friendship. This is not a new phenomenon: loving friendships were possible and even fairly common among elite men and women in America’s founding era. These were affectionate relationships of mutual respect, emotional support, and love that had to carefully skirt the boundaries of romance. While extravagant declarations of love would have raised eyebrows, these friends found socially acceptable ways to express their affection for one another. Learn more about some special pairs of platonic friends from early America, including some very familiar names.
Eloise Payne and William Ellery Channing
William was best known as a Unitarian minister and early transcendentalist, but to a bright young teacher named Eloise he was “my dear friend.” Eloise looked to William, seven years her senior, for religious and professional advice, but she wasn’t afraid to rebuke him when he became too critical. When she worried that his affections were waning after he started courting the woman who would become his wife, he replied, “You hold the same place in my heart as ever, and I can now say to you with more propriety than before, that few hold a higher.” (Photo credit: Public Domain via The Frick Collection.)
George Washington and Elizabeth Powel
George and Elizabeth met while George was in Elizabeth’s hometown of Philadelphia for the Constitutional Convention. George often spent the evening with Elizabeth and she later visited him at Mount Vernon. They had frank political discussions and exchanged gifts for over a decade. For her fiftieth birthday, George sent her a poetic tribute written by a friend of Elizabeth’s, and he signed one of his last letters to her before his death, “I am truly yours.” (Photo credit: Public Domain via Wikimedia Commons.)
Thomas Jefferson and Abigail Adams
Abigail Adams called her friend Thomas Jefferson “one of the choice ones on earth,” and Thomas greatly admired the wife of his long-time friend John Adams. They both lived in Paris in the 1780s and attended plays and other events together. Later, he jokingly referred to her as Venus; he wrote from Paris that while selecting Roman busts to send for the Adamses’ London home, he passed over the figure of Venus because he “thought it out of taste to have two at table at the same time.” (Photo credit: Public Domain via Library of Congress.)
Margaret Bayard Smith and Anthony Bleecker
Margaret and Anthony first met as young adults in New York City as part of the same circle of writers and intellectuals. Some twenty-five years later, Margaret wrote a novel which Anthony helped to edit. The novel’s central love story was based upon her friendship with Anthony. “Has not friendship recollections as sweet and dear as those of love?” she wrote to him. Her answer: “Yes, indeed it has—at least in my heart.” (Photo credit: Public Domain via Wikimedia Commons.)
John Rodgers and Ann Pinkney
John Rodgers is best remembered as a naval hero who fought the Barbary pirates and fired the first shots of the War of 1812. But while he was across the Atlantic fighting pirates, he relied on his friend Ann Pinkney at home in Maryland to help further his courtship of a young woman named Minerva Denison. Ann reported back to John on his “goddess” and was pleased to extract a confession of Minerva’s love for John, which she passed along. John and Minerva married, while John and Ann remained friends. (Photo credit: Public Domain via Wikimedia Commons.)
Benjamin Franklin and Georgiana Shipley
Benjamin Franklin was notorious for his flirtations with women, but it’s likely that most of his flirting was merely part of playful friendships. Such appears to be the case with a teenage girl he befriended in London in 1772, Georgiana Shipley. He gave her a pet squirrel named Mungo as well as a snuff box with his portrait painted on the lid. He declared himself “your affectionate friend” and she was even more effusive: “The love and respect I feel for my much-valued friend are sentiments so habitual to my heart that no time nor circumstance can lessen the affection.” (Photo credit: Public Domain via Wikimedia Commons.)
Gilbert Stuart and Sarah Wentworth Morton
Gilbert Stuart is best known for his portraits of presidents, but his friendship with Boston writer Sarah Wentworth Morton prompted his only known poetry. Gilbert created three portraits of Sarah, one of which he kept for himself. She published a poem praising his artistry, beginning with “Stuart, thy portraits speak with skill divine.” He replied that her poetry created “a cheering influence at my heart” and that ultimately poetry was superior to painting. This was a friendship between a very talented pair! (Photo credit: Public Domain via Wikimedia Commons.)
Elizabeth Graeme Fergusson and Benjamin Rush
The doctor and writer Benjamin Rush had a long friendship with one of Philadelphia’s smartest women, Elizabeth Graeme Fergusson. Elizabeth wrote Benjamin frequently from her country estate but sometimes worried she didn’t receive enough letters in return. As she wrote in a poem she sent him in 1793, “One Letter a week she surely might claim,/ To keep alive Friendship; and fan its pure Flame.” He may not have written as often as she would have liked, but he admired her greatly. She was, he said after her death, “a woman of uncommon talents and virtues” who was “beloved by a numerous circle of friends and acquaintances.” (Photo credit: Public Domain via Wikimedia Commons.)
Eliza Parke Custis and Marquis de Lafayette
The Marquis de Lafayette formed a lasting bond with George Washington during the American Revolution, and his affections later extended to Washington’s step-granddaughter Eliza Parke Custis. Lafayette was a father figure for Eliza, whose own father died when she was young. Eliza confided her troubles in him and he wrote long letters in reply offering advice and affection. The pair wrote each other for years, with Lafayette conveying his “paternal love” and “most affectionate respectful attachments.” (Photo credit: Public Domain via Wikimedia Commons.)
Featured image: Scene at the Signing of the Constitution of the United States, Howard Chandler Christy (1940). Image courtesy of Wikimedia Commons.
These books, guides, and cards offer interesting trivia and facts, engaging formats, and lively illustrations: a perfect combination to pique interest for hours of casual reading, followed by days of reciting trivia, and hopefully, years of knowledge about these important people in American history.
In the 1960s, the South was rife with racial tension. The Supreme Court had declared, in its landmark case Brown vs. Board of Education, that racial segregation in public schools was unconstitutional, and the country was in the midst of a growing Civil Rights Movement. In response to these events, Ku Klux Klan activity boomed, reaching an intensity not seen since the 1920s, when the Klan boasted over four million members. Surprisingly, North Carolina, which had been one of the more progressive Southern states, had the largest and most active Klan membership — greater than the rest of the South combined — earning it the nickname “Klansville, USA”. This slideshow features images from the time of the Civil Rights-era Klan.
A rally against school integration, 1959
In the wake of the Brown vs. Board of Education decision, and in the midst of the growing Civil Rights Movement, Ku Klux Klan activity boomed, reaching an intensity not seen since the 1920s. (Image credit: United States Library of Congress via Wikimedia Commons)
United Klans of America Charter and Business Card
The UKA adopted the trappings of a bureaucratic organization. North Carolina Klan leader Bob Jones distributed business cards that announced him as Grand Dragon. (Image courtesy of the National Archives and Records Administration)
Crowd at 1963 March on Washington
“We have the same right as the Negro to demonstrate,” Bob Jones told reporters, responding in part to the previous week’s March on Washington, which had attracted an estimated quarter-million Civil Rights supporters to the nation’s capital. (Image credit: National Archives and Records Administration via Wikimedia Commons)
United Klans of America Flyer
The UKA printed up to two thousand of these flyers to advertise each rally. Members passed them out to likely candidates at service stations, cafes, and other meeting spaces. (Image courtesy of the National Archives and Records Administration)
UKA Membership Cards
UKA members stapling their membership cards to a cross burned at a rally in September 1969. With Bob Jones in prison on contempt of Congress charges, the group never recovered. (Image courtesy of Don Sturkey)
Be sure to check out the American Experience documentary Klansville U.S.A. airing Tuesday, 13 January on PBS.
Heading image: The Ku Klux Klan on parade down Pennsylvania Avenue, 1928. U.S. National Archives and Records Administration. Public domain via Wikimedia Commons.
Seventy years ago today, in Korematsu v. United States, the Supreme Court upheld the constitutionality of the Japanese-American internment program authorized by President Franklin Roosevelt’s Executive Order 9066. The Korematsu decision and the internment program that forcibly removed over 100,000 people of Japanese ancestry from their homes during World War II are often cited as ugly reminders of the dangers associated with wartime hysteria, racism, fear-mongering, xenophobia, an imperial president, and judicial dereliction of duty. But the events surrounding Korematsu are also a harrowing reminder of what happens to liberty when the “Madisonian machine” breaks down — that is, when the structural checks and balances built into our system of government fail and give way to the worst forms of tyranny.
Our 18th century system of separated and fragmented government — what Gordon Silverstein calls the “Madisonian machine” — was engineered to prevent tyranny, or rather tyrannies. Madison’s Federalist 51 outlines a prescription for avoiding “Big T Tyranny” — the concentration of power in any one branch of government. This would be accomplished by dividing and separating powers among the three branches of government and between the federal government and the states. “Ambition must be made to counteract ambition,” Madison wrote. Each branch would jealously protect its own powers while guarding against encroachments by the others.
But this wasn’t the only form of tyranny the framers worried about. In a democracy, minorities are always at risk of being oppressed by majorities — what I call “little t tyranny.” Madison’s solution to this kind of tyranny is articulated in Federalist 10. The cure to this disease was firstly to elect representatives who could filter the passions of the masses and make more enlightened decisions. Secondly, Madison observed that as long as the citizenry is sufficiently divided and carved up into numerous smaller “factions,” it would be unlikely that a unified majority would emerge to oppress a minority faction.
In the events leading up to and including the Supreme Court’s decision in Korematsu, these safeguards built into the Madisonian machine broke down, giving way to both forms of T/tyranny. Congress not only acquiesced to President Roosevelt’s executive order, it responded with alacrity to support it. After just one hour of floor debate and virtually no dissent, Congress passed Public Law 503, which promulgated the order and assigned criminal penalties for violating it. And the branch furthest removed from the whims and passions of the majority, the Supreme Court, declined to second-guess the wisdom of the elected branches. As Justice Hugo Black wrote for the majority in Korematsu, “we cannot reject as unfounded the judgment of the military authorities and of Congress…” If Congress had been more skeptical, perhaps the Supreme Court might have been, too. But the Supreme Court has a long track record of deference to the executive when Congress gives express consent for his actions – especially in times of war. Unfortunately, under the Madisonian design, this is exactly when the Supreme Court ought to be the most skeptical of executive power.
To be sure, these checks and balances built into the Madisonian system were only meant to function as “auxiliary precautions.” The most important safeguard against T/tyranny would be the people themselves. Through a campaign of misinformation and fear-mongering, however, this protection was also rendered ineffective. Public opinion data was used selectively to convey the impression to both legislators and west coast citizens that the majority of Americans supported the internment program. The passions of the public were further manipulated by the media and west coast newspaper headlines such as “Japanese Here Sent Vital Data to Tokyo,” “Lincoln Would Intern Japs,” and “Danger in Delaying Jap Removal Cited.” Any dissent or would-be countervailing “factions,” to use Madison’s phrase, were effectively silenced.
In Korematsu, ambition did not counteract ambition as Madison had intended, and the machine broke down. That’s because in order to function properly, the Madisonian machine requires access to information and time for genuine deliberation. It also requires friction. It requires people to disagree – for our elected representatives to disagree with one another, for the Supreme Court to police the elected branches, for citizens to pause, faction off, and check one another. So we can complain of gridlock in government, but let’s not forget that the alternative, as demonstrated by the unforgivable and tragic events of Korematsu, exposes the most vulnerable among us to the worst forms of tyranny.
Featured image credit: A young evacuee of Japanese ancestry waits with the family baggage before leaving by bus for an assembly center. US National Archives and Records Administration. Public domain via Wikimedia Commons.
With the arrival of the celebration of Hanukkah, I wanted to revisit a special book I have spoken about before: Hanukkah at Valley Forge. In 2007 this book received the Sydney Taylor Award from the Association of Jewish Libraries, given in recognition of picture books, and books for teens, that authentically reflect the Jewish experience. Here, the book’s vivid watercolor illustrations coupled with Mr. Krensky’s fictionalized retelling of a historically researched anecdote come together for what I think is a powerful picture book.
Stephen Krensky’s book, Hanukkah at Valley Forge, combines history and holiday in an interesting way. The parallels of American and Jewish history intertwine on a bitterly cold winter evening at Valley Forge. Faced with increasing uncertainty and mounting odds, General George Washington meets a Polish immigrant observing the first night of Hanukkah with the lighting of the candles there amidst the fading hope of Washington’s ragtag colonial army.
Common themes of man’s need to hope in the face of increasing despair and of the price of liberty’s cause echo in the meeting of these two men at a pivotal point in our nation’s early history. Some historical basis for the encounter was apparently uncovered in the research for the book, and it is left to the reader to wonder whether chance meetings sometimes turn the tides of men and war.
Fifty years ago today, a most unlikely figure was called to speak at the Oxford Union Debating Society: Mr. Malcolm X. The Union, with its historic chamber modeled on the House of Commons, was the political training ground for the scions of the British establishment. Malcolm X, by contrast, had become a global icon of black militancy, with a reputation as a dangerous Black Muslim. The visit seemed something of an awkward pairing. Malcolm X encountered a hotel receptionist who tried to make him write his name in full in the guest book (she had never heard of him), sat through a bow tie silver service dinner ahead of the debate, and had to listen to a conservative debating opponent accuse him of being a racist on a par with the Prime Minister of South Africa. A closer look at the event, though, reveals the pairing of Malcolm X and the Oxford Union to be a good fit — and reveals much about the issues of race and rights then, and now.
From the perspective of the Oxford Union, a controversial speaker was an entirely good thing. The BBC covered Malcolm X’s costs and broadcast the debate. In late 1964, though, Malcolm X also spoke to student concerns about race equality. For many years, the British media’s (sympathetic) coverage of anti-racist protests in the American South and South Africa gave the impression that racial discrimination was chiefly to be found elsewhere. A bitter election which turned on anti-immigration sentiment in late 1964 in Smethwick, in the English midlands, with its infamous slogan, “If you want a n***** for your neighbour, vote Labour,” exposed the virulence of the race issue in Britain, too. Students followed this news abroad and at home. Some visited “racial hotspots” in person. Others joined demonstrations in solidarity. Still, on the surface, such issues seemed a world away from Oxford’s dreaming spires.
But some students in Oxford were also grappling with the question of race in their own institution. The Union President, Eric Antony Abrahams, was a Jamaican Rhodes Scholar, who had vowed to his sister in his first week that he would “fill the Union chamber with blacks.” Abrahams was part of a growing cohort of students from newly independent nations who studied in Britain, many of whom called for changes in curriculum and representation. Three days before Malcolm X arrived, Oxford students released a report showing that more than half of University landladies in the city refused to accept students of color. The University had an official policy of non-discrimination, but the fact that many landladies turned down black applicants in practice had been a running sore for years. The report, and Malcolm X’s visit, brought the matter to public attention. Student activism ultimately forced a change in practice, part of a nationwide series of protests against the unofficial color-bar in many British lodgings. At a time when Ferguson is rightly at the forefront of the news, events in Oxford in 1964 remind us that atrocities elsewhere should serve as a prompt to address, rather than a reason to ignore, questions of rights and representation nearer to home.
For Malcolm X, coming to Oxford was an exciting challenge. He loved pitting his wits against the brightest and the best. As chance would have it, as Prisoner 22843 in the Norfolk Penal Colony in Massachusetts, he may well have debated against a visiting team from Oxford. More germane, though, was Malcolm’s desire in what turned out to be the final year of his life to place the black freedom struggle in America within the global context of human rights. He had spent the better part of 1964 in the Middle East and Africa. In each stop along his dizzying itinerary of states, he attempted to build support for international opposition to racial discrimination in America. Malcolm’s visits to Europe in late 1964 were no different. But it was Oxford that afforded him the opportunity to broadcast his views before his widest single audience yet. Citing the recent murders of civil rights activists in Mississippi, Malcolm X told his audience: “In that country, where I am from, still our lives are not worth two cents.”
At a time when cities across the United States have recently braced themselves against the threat of rebellion in the aftermath of a grand jury’s decision not to indict Michael Brown’s killer, it is hard not to conclude that for many African Americans, Malcolm’s words at Oxford continue to haunt the nation. Indeed, by placing the civil rights movement in broad relief internationally, Malcolm sought to link the fate of African Americans with West Indians, Pakistanis, West Africans, Indians, and others, seeking their own justice in the capitals and banlieues of Europe. Emphasizing the independence of this new emergent world both within and outside of the confines of Europe, Malcolm hoped that the “time of revolution” his audience was living in would in part be defined by a broader sense of what it meant to be human. There could no longer be distinctions between “black” and “white” deaths — despite his condemnation of the media for continuing to indulge such distinctions.
“A Full Belly is the Mother of all Evil,” Benjamin Franklin counseled the readers of Poor Richard’s Almanack. For some mysterious reason this aphorism hasn’t had the sticking power of some of the inventor’s more famous sayings, like “he who lies down with dogs shall rise up with fleas.” Most of us are more inclined to see a full belly as one of life’s blessings. The offending epigram, however, can’t be described as an aberration. Franklin’s writings are filled with variations on this advice: “A full Belly makes a dull brain”; “The Muse starves in a Cook’s shop”; and “Three good meals a day makes bad living.” It’s no wonder that one canny writer has taken advantage of the unquenchable American appetite for both the founding fathers and diet books to publish The Benjamin Franklin Diet, a complete guide to slimming down, eighteenth-century style.
Franklin’s antipathy to a full belly reflected his Puritan upbringing, which stigmatized gustatory pleasures as low or impure. When he was growing up, he recalled in his Autobiography, “little or no Notice was ever taken of what related to the Victuals on the table, whether it was well or ill dressed, in or out of season, of good or bad flavour, preferable or inferior.” Franklin claimed to have thoroughly adopted this legacy of indifference to food, but there is good evidence to the contrary. He abandoned an early commitment to vegetarianism when, on board the ship that carried him away from bondage to his brother in Boston, he succumbed to the temptation to indulge in a catch of cod. As he confessed, “I had formerly been a great Lover of fish, & when this came hot out of the Frying Pan, it smeled admirably well.” Reasoning that fish ate other fish, and thus why shouldn’t he, the pragmatic Franklin “din’d upon Cod very heartily.” The famous portrait of Franklin by Joseph Siffred Duplessis, painted decades later in France, suggests that he gained no better control of his appetites as he matured. Not even a hero worshipper could call the man thin. A second chin falls heavy below his jaw line, his belly strains against the buttons of his sumptuous waistcoat, and his arms bear a resemblance to fattened sausages.
Not a total hypocrite, Franklin did include passages in his writing that treat the pleasures of the table more positively. Poor Richard’s advice that “Fools make Feasts and Wise Men eat them” suggests that frugality, more than distaste, motivated Franklin’s advice to be temperate. During his embassy in Paris, when Franklin sought to win France over to the American cause, he ate out six nights a week. And without a doubt he enjoyed many of the nice things he was served, such as îles flottantes and champagne.
A proud American, Franklin also sought to introduce his French friends to some of the glories of his native cuisine. He insisted that American corn flour could make a sweeter bread than wheat alone (several of the philosophes were engaged in pursuit of a more nutritious bread recipe to improve the condition of the peasantry, who derived the majority of their calories from the staff of life). Later, after his return to Philadelphia, Franklin sent his friends shipments of Pennsylvania hams – remarkable for the sweetness of their fat, which he attributed to the pigs’ subsisting on corn.
If you want to try Benjamin Franklin’s recipe for corn bread you can find it in the appendix to Gilbert Chinard’s wonderful 1958 essay “Benjamin Franklin on the Art of Eating.” This little pamphlet, printed by the American Philosophical Society, contains a number of recipes found among Franklin’s papers, few of which could be described as dietetic. Franklin’s recipe for roasted pig pays great attention to producing a delicious crackling. His oyster sauce is heavy on the cream. And his puff pastry, recommended for encasing his apple pudding, calls for a pound of butter. Franklin’s apple pudding makes a tempting proposition for a food historian on the eve of Thanksgiving, especially since, like many eighteenth-century recipes, Franklin’s terse instructions offer just enough detail to inspire certainty that the end result would be inedible by twentieth-century standards. What better reason could there be to break out the mixing bowl!
* * * * *
To make an apple pudding.
Make a good puff-paste, roll it out half an inch thick, pare your apples, and core them, enough to fill the crust, and close it up, tie it in a cloth and boil it. If a small pudding, two hours: if a large one three or four hours. When it is enough turn it into your dish, cut a piece of the crust out of the top, butter and sugar it to your palate; lay on the crust again, and send it to table hot.
* * * * *
The sense of the unfamiliar has always been what compels me about history; it gives me the feeling of discovery and assures me that I am not just finding my own reflection in the sources. I, for example, do not bring a love of boiling to my reading of dessert recipes. Baking I expect – hours of boiling, not so much. I boil few foods, and those only briefly. I boil pasta 7 to 12 minutes, always anxious to drain the pot while the noodles are still al dente. Sometimes I boil green beans, but just for a couple of minutes, and often I steam them instead. I boil eggs, but I like the yolks soft so I don’t leave them in for more than six minutes. I never boil dessert pastries. But Benjamin Franklin told me to, so for the sake of historical knowledge I threw all my cooking know-how to the wind and set out to slavishly follow his orders.
Difficulties confronted me long before I arrived at the boiling. To begin, Franklin directed that I make a puff pastry, mixing four pints, or a quarter of a peck, of flour with half a pound of butter. How much did eighteenth-century dry pints weigh? And did they weigh the same in the colonies as they did in England? Today the imperial liquid pint is four ounces more than the American liquid pint (20 oz vs. 16 oz). One thing is for certain: whatever the exact weight of an eighteenth-century dry pint might be, four of them is a whopping amount. I made the executive decision to weigh a pint at 16 oz and cut the recipe in half so that I didn’t completely empty our flour bin. Halving the butter as well, I ended up with a very dry mix:
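For readers keeping score at home, the scaling above can be sketched as a quick back-of-envelope calculation. The 16 oz-per-pint figure is my working assumption, not a historical certainty, and the butter total reflects the half pound mixed into the paste plus the half pound added later during lamination:

```python
# Sanity-checking the halved recipe, assuming a dry pint of flour
# weighs 16 oz (an assumption, not an eighteenth-century fact).

PINT_OZ = 16  # assumed weight of one pint of flour, in ounces

full_flour_oz = 4 * PINT_OZ   # "four pints, or a quarter of a peck"
full_butter_oz = 8 + 8        # half a pound mixed in, half a pound laminated

half_flour_oz = full_flour_oz // 2
half_butter_oz = full_butter_oz // 2

print(f"Full recipe:   {full_flour_oz} oz flour, {full_butter_oz} oz butter")
print(f"Halved recipe: {half_flour_oz} oz flour, {half_butter_oz} oz butter")
```

Even halved, that works out to two pounds of flour and half a pound of butter, which explains why the flour bin took such a hit.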
The next direction was to add cold water until a stiff dough formed. Having spent the past twenty-five years of baking trying to add as little water to my pie dough as possible to prevent it from turning tough, I needed to tamp down all my better instincts to pour in the cup and a half of cold water that my dry mix required to come together.
The brick of paste that resulted was so hard that it had to be beaten into submission to follow the next directions, which called for the dough to be rolled out, buttered, rolled up, rolled out, and buttered again, nine to ten successive times until another half pound of butter had been added.
After an hour of buttering and rolling, I was left with a lovely, pliable, yellow dough, which I rolled out “half a thumb’s thickness” and set on a cheese cloth.
Franklin’s recipe calls next for chopped cored apples to be placed on the dough. No seasoning is done at this stage: no spices added to the apples, no sugar, no butter, no lemon. Just apples. How big? How many? Over how much of the dough? It doesn’t say.
Nor did the recipe explain how to seal the dough. I went for crimping and ended up with something that looked like a giant Cornish pasty.
At least until I wrapped it up in pastry and began the boiling, whereupon it commenced to look more like a brain. It was hard to commit such willful destruction on this beautiful pasty, rather than pop the parcel into a hot oven where it might grow golden and crisp. What was the purpose of building up ten layers of lamination only to melt out all the butter in a bubbling pot? Again, Franklin was mute.
The cooking instructions said to boil the pudding from two to four hours depending on its size. Unsure of the standard of measurement, I decided on three hours. There were no further cooking directions and perhaps I should have just let it be, but worried that the pudding wasn’t getting cooked on the top, which bounced above the bubbling water, I flipped the package each hour. Perhaps if I hadn’t, the pudding would have developed more of a crust.
For the final step, Franklin directs that the top of the pudding be removed, sugar and butter be mixed in with the apples, then the top replaced and the whole served immediately. When I cut away the muslin and lifted the soggy lid I found that the apples inside had reduced to a beautiful sauce within the boiled pastry casing. I added some chopped butter and brown sugar, then closed the pudding back up and let the flavors meld. I can’t say the result would win first prize in a pie contest; it wouldn’t even win honorable mention. But I can report that the mess tasted quite nice in a bland, comforting, soft sort of way. Not a bad match for turkey at all.
Featured image: “The First Thanksgiving,” Jean Leon Gerome Ferris (c. 1912). Public domain via Wikimedia Commons.
On 9 August 2014, Officer Darren Wilson of the Ferguson, Missouri (a suburb of St. Louis) Police Department shot and killed Michael Brown, an unarmed 18-year-old. Officer Wilson is white and Michael Brown was black, a fact that sparked allegations from wide swaths of the local and national black community that Wilson’s shooting of Brown, and the Ferguson Police Department’s reluctance to arrest the officer, were racially motivated events that smack of an historic trend of black inequality within the US criminal justice system.
The fact that the Ferguson Police Department and city government are predominantly white, while the town is predominantly black, has underscored this distrust. So too have recent events in Los Angeles, New York, Ohio, South Carolina, St. Louis, and other places that suggest a disturbing pattern of white police personnel’s use of excessive force in the beatings or deaths of blacks across the nation. So disturbing, in fact, that this case and the others linked to it not only have inspired an organic, and diverse, crop of youth activists, but also have captured the close attention of President Barack Obama, Attorney General Eric Holder, national civil rights organizations, and the national black leadership. Indeed, not one or two, but three concurrent investigations of Officer Wilson’s shooting of Michael Brown are ongoing—one by the St. Louis Police Department and the other two by the FBI and the Justice Department, which are concerned with possible civil rights violations. The case also has a significant international following. The parents of Michael Brown raised this profile recently when they testified in Geneva, Switzerland, before the United Nations Committee against Torture. There, they joined a US delegation to plead for support to end police brutality aimed at profiled black youth.
The details of the shooting investigations, each bit eagerly seized by opposing sides (those who support Brown and those who defend Wilson) as they become publicly available, still don’t give a comprehensive view of what actually happened between the officer and the teen, leaving too much speculation as to whether or not the Ferguson Grand Jury, which has been considering the case since 20 August, will return an indictment against Officer Wilson.
What is known of the incident is that about noon on that Saturday, Michael Brown and a friend, Dorian Johnson, were walking down Canfield Drive in Ferguson when Darren Wilson approached the two in his squad car, telling them to get out of the street and onto the sidewalk. A scuffle ensued between Brown and Wilson within the police car. In his defense, Officer Wilson has stated that Brown attacked him and tried to grab his weapon. Dorian Johnson has countered that Wilson pulled Michael Brown into his car, suggesting that Brown was trying to defend himself from an overly aggressive Wilson. Shots were fired in Wilson’s police car and Brown ran down the street, pursued by Wilson. Autopsy reports indicate that Brown was shot at least six times, four times in his left arm, once through his left eye and once in the top of his head. The latter caused the youth’s death. Michael Brown’s body lay in the street, uncovered, for several hours while the police conducted a preliminary investigation, prompting even more outrage by black onlookers.
Since Michael Brown’s death, protestors from the area and across the nation have occupied the streets of Ferguson, demanding justice for the slain teen and his family. There were nights of initial confrontations between protestors and police forces (the Ferguson Police, the St. Louis Police, the Missouri State Troopers, and the National Guard have all been deployed in Ferguson at some time, and in some capacity, since the shooting), and though there has been some arson, looting, protestor and police violence, and arrests—even of news reporters—the protests generally have been peaceful. Not only police action during these protests, but their equipment as well, has sparked criticism and a growing demand that law enforcement agencies demilitarize. The daily protests have persisted, at times growing greatly in number, as during a series of “Hands up, Don’t Shoot” events held not just in Ferguson, but in many cities nationwide, including Chicago, New York, Washington, D.C., Los Angeles, and Omaha, Nebraska, in August and September. The “hands up” stance protests Brown’s shooting, which some, but not all, witnesses have stated came even as Brown held his hands up in a gesture of surrender to Wilson.
Missouri Governor Jay Nixon, and other state and local officials, along with many of the residents of Ferguson, fear that if the Grand Jury does not indict Darren Wilson for Michael Brown’s murder, civil unrest will erupt into violence, producing an event similar to the Los Angeles Riots of 1992. In Los Angeles, large numbers of persons rioted when it seemed that the legal outcomes of two back-to-back criminal cases smacked of black injustice—the acquittal of four white police officers indicted in the assault of black motorist Rodney King, and the no-jail-time sentence of a Korean shopkeeper found guilty of voluntary manslaughter in the killing of Latasha Harlins, a black teen. The result was the worst race riot in US history, with more than 50 people killed, the burning of a substantial portion of the ethnic business enclave of Koreatown, and at least a billion dollars in property damage.
Certainly the fear is a legitimate one. The vast majority of US race riots centered on black participation have been sparked by like conditions—the community’s belief that a youth or vulnerable person among them has been brutalized with state sanction. The nation has witnessed these events not only in Los Angeles in 1965 and 1992, but also in Harlem in 1935 and 1964; Richmond, California in 1968; San Francisco in 1986; Tampa, Florida in 1967 and 1986; Miami in 1980; Newark, New Jersey in 1967; York, Pennsylvania in 1969; Crown Heights (Brooklyn), New York in 1991; St. Petersburg, Florida in 1996; Cincinnati, Ohio in 2001; Benton Harbor, Michigan in 2003; Oakland, California in 2009 and 2010, and the list goes on. These events all have served as cautionary tales that, unfortunately, have not resulted in either the perception or reality of black equality before the law. It is this legacy that frustrates and frightens Ferguson residents.
Ninety-four years ago today, the Nineteenth Amendment to the Constitution of the United States took effect, enshrining American women’s right to vote. Fifty years later, in the midst of a new wave of feminist activism, Congress designated 26 August as Women’s Equality Day in the United States. The 1971 Joint Resolution read, in part, “the women of the United States have been treated as second-class citizens and have not been entitled the full rights and privileges, public or private, legal or institutional, which are available to male citizens of the United States” and women “have united to assure that these rights and privileges are available to all citizens equally regardless of sex.” For that reason, Congress was prevailed upon to declare 26 August a day to commemorate the Nineteenth Amendment as a “symbol of the continued fight for equal rights.”
Alice Paul was a pivotal and controversial figure in the last years of the American battle to win the vote for women. Her first national action was to organize a grand suffrage procession in Washington, DC on 3 March 1913. She organized the parade on behalf of the National American Woman Suffrage Association (NAWSA), the only group working to win women the vote on a national scale. She later founded her own organization, the National Woman’s Party, and charted a surprisingly aggressive course of social protests to convince Congress to pass a woman suffrage amendment to the Constitution.
Alice Paul lived long enough to see Women’s Equality Day established; she died in 1977. She did not live to see the project which consumed the remaining years of her life ratified — an Equal Rights Amendment to the Constitution. In 2014, a renewed effort emerged to pass the ERA.
As Women’s Equality Day is celebrated around the country today, here are a few things you may not know about suffrage leader and ERA author Alice Paul:
1. Alice Paul was proudly a birthright Quaker, but as she became interested in politics, she became frustrated with her faith’s reluctance to actively work for woman suffrage. We often associate Quakers with political activism, but in the late nineteenth century, the vast majority of Quakers disapproved of such efforts.
2. Paul loved dancing and sports. Indeed, her love for physical activity was a factor in drawing her into social protest, first in England, then in America. In her high school and college years, she played softball, basketball, hockey, and tennis, and also ice skated when she could. She learned to dance while attending Swarthmore College near Philadelphia and regretted her few opportunities to attend dances in her later years.
3. Paul was arrested seven times in England for her suffrage activism, but only once in America. The longest sentence she served in Britain was one month. In the United States, she was sentenced to seven months, but only served one.
4. Paul endured forced feeding fifty-five times in London’s Holloway Prison in 1909 and perhaps another twenty-five times while at the District of Columbia’s Jail in 1917. Authorities used forced feeding to break the hunger strikes initiated by suffrage prisoners. Some women suffered health problems as a result. Alice Paul struggled with digestive issues for years after and may have lost her sense of smell.
5. Paul is often portrayed as eager to leave NAWSA to found her own militant suffrage group. In fact, she did so only when her hand was forced. Divisions over strategy or tactics are nothing new to any political group, and NAWSA itself came about only in 1890 after two long-estranged suffrage organizations compromised in order to present a united front. The 1914 effort to oust the controversial Alice Paul from NAWSA arose from multiple sources, including the then-current NAWSA president, Anna Howard Shaw, and the once-and-future president, Carrie Chapman Catt.
6. Paul’s persona as a leader combined stereotypically feminine and masculine traits in a way that invited fervent loyalty or deep-seated antipathy. Her dislike of the spotlight and ingrained modesty lent her a vulnerability which undercut concerns about her militant past and her powerful drive. Others found her charismatic authority threatening.
7. Though the protests of Paul’s National Woman’s Party are often described as “civil disobedience,” Paul believed all of her actions were completely within the law. Before Paul initiated picketing to protest the lack of a suffrage amendment in 1917, picketing was largely the province of labor organizations. After consulting with attorneys about the legality of the practice, Paul adapted the silent vigil of two earlier protests and sent “silent sentinels” to picket the White House. While labor picketing often prompted violence on both sides, Paul gave her troops strict instructions to remain non-violent. Violence was, however, visited upon them by bystanders outraged by the women’s insistence on pressing for suffrage while the country was engaged in World War I.
8. Paul’s most colorful protests occurred after the House of Representatives passed the suffrage amendment bill. It took another eighteen months to convince the Senate to pass the amendment. To maintain pressure on Congress, Alice Paul crafted watchfire protests across from the White House in Lafayette Square, during which suffragists burned President Wilson’s words about his much-celebrated belief in democracy. They even burned Wilson in effigy to urge him to use his political power to sway the Senate.
9. Alice Paul was not present during the frenzied effort to make Tennessee the final ratifying state for the suffrage amendment. She longed to be at the Tennessee statehouse, but NWP lobbying required a constant input of cash. Her ability to raise funds surpassed anyone else’s, so she chose to stay in Washington to keep the money flowing. Paul’s ability to raise funds was a key factor in the success of the NWP.
10. Alice Paul bequeathed us the iconic images of the battle for the ballot: photographs of the 1913 procession, the 1917 White House pickets, the 1918 watchfire protests. These images speak to the courage, the persistence and the fortitude of all the women who fought to gain the most fundamental right of citizenship: the right to consent.
Signed into law by President Lyndon Johnson on 3 September 1964, the Wilderness Act defined wilderness “as an area where the earth and its community of life are untrammeled by man, where man himself is a visitor who does not remain.” It not only put 9.1 million acres under federal protection; it created an entire preservation system that today includes nearly 110 million acres across forty-four states and Puerto Rico—some 5% of the land in the United States. These public lands include wildlife refuges and national forests and parks where people are welcome as visitors, but may not take up permanent residence.
The definition of what constitutes “wilderness” is not without controversy, and some critics question whether preservation is the best use of specific areas. Nevertheless, most Americans celebrate the guarantee that there will always be special places in the United States where nature can thrive in its unfettered state, without human intervention or control. Campers, hikers, birdwatchers, scientists and other outdoor enthusiasts owe much to Howard Zahniser, the act’s primary author.
In recent decades, environmental awareness and protection have become values just about as all-American as Mom and apple pie. Despite the ill-fated “Drill, Baby, Drill” slogan of the 2008 campaign, virtually all political candidates, whatever their party, profess concern about the environment and a commitment to its protection. As a professor, I have a hard time persuading my students, who were born more than two decades after the first Earth Day (in 1970), that environmental protection was once commonly considered downright traitorous.
For generations, many Americans were convinced that it was the exploitation of natural resources that made America great. The early pioneers survived because they wrested a living from the wilderness, and their children and grandchildren thrived because they turned natural resources into profit. Only slowly did the realization come that people had been so intent on pursuing vast commercial enterprises that they failed to consider their environmental impact. When, according to the 1890 census, the frontier was closed, the nation was no longer a land of ever-expanding boundaries and unlimited resources. Birds like the passenger pigeon and the Carolina parakeet were hunted into extinction; practices like strip-mining left ugly scars on the land; and clear-cutting made forest sustainability impossible.
At the turn of the last century members of the General Federation of Women’s Clubs called for the preservation of wilderness, especially through the creation of regional and national parks. They enjoyed the generous cooperation of the Forest Service during the Theodore Roosevelt administration, but found that overall, “it is difficult to get anyone to work for the public with the zeal with which men work for their own pockets.”
Not surprisingly, Theodore Roosevelt framed his support for conservation in terms of benefiting people rather than (non-human) nature. In 1907 he addressed both houses of Congress to gain support for his administration’s effort to “get our people to look ahead and to substitute a planned and orderly development of our resources in place of a haphazard striving for immediate profit.” It is a testament to Roosevelt’s persona that he could sow the seeds of conservationism within a male population deeply suspicious of any argument even remotely tinged with what was derided as “female sentimentality.” Writer George L. Knapp, for example, termed the call for conservation “unadulterated humbug” and the dire prophecies of further extinction “baseless vaporings.” He preferred to celebrate the fruits of men’s unregulated resource consumption: “The pine woods of Michigan have vanished to make the homes of Kansas; the coal and iron which we have failed—thank Heaven!—to ‘conserve’ have carried meat and wheat to the hungry hives of men and gladdened life with an abundance which no previous age could know.” According to Knapp, men should be praised, not chastened, for turning “forests into villages, mines into ships and skyscrapers, scenery into work.”
The press reinforced the belief that the use of natural resources equaled progress. The Houston Post, for example, declared, “Smoke stacks are a splendid sign of a city’s prosperity,” and the Chicago Record Herald reported that the Creator who made coal “knew that smoke would be a good thing for the world.” Pittsburgh city leaders equated smoke with manly virtue and derided the “sentimentality and frivolity” of those who sought to limit industry out of baseless fear of the by-products it released into the air.
Pioneering educator and psychologist G. Stanley Hall confirmed that “caring for nature was female sentiment, not sound science.” Gifford Pinchot, made the first chief of the Forest Service in 1905, was a self-avowed conservationist. He escaped charges of effeminacy by making it clear that he measured nature’s value by its service to humanity. He dedicated his agency to “the art of producing from the forest whatever it can yield for the service of man.” “Trees are a crop, just like corn,” he famously proclaimed. “Wilderness is waste.”
Looking back over the last fifty years, we can see the Wilderness Act of 1964 as an important achievement. But it becomes truly remarkable when viewed in the context of the long history that preceded it.
Headline image credit: Cook Lake, Bridger-Teton National Forest, Wyoming. Photos by the Pinedale Ranger District of the Bridger-Teton National Forest. Public domain via Wikimedia Commons.
Over the summer of 1582 a group of English Catholic gentlemen met to hammer out their plans for a colony in North America — not Roanoke Island, Sir Walter Raleigh’s settlement of 1585, but Norumbega in present-day New England.
The scheme was promoted by two knights of the realm, Sir George Peckham and Sir Thomas Gerard, and it attracted several wealthy backers, including a gentleman from the midlands called Sir William Catesby. In the list of articles drafted in June 1582, Catesby agreed to be an Associate. In return for putting up £100 and ten men for the first voyage (forty for the next), he was promised a seignory of 10,000 acres and election to one of “the chief offices in government”. Special privileges would be extended to “encourage women to go on the voyage” and according to Bernardino de Mendoza, the Spanish ambassador in London, the settlers would “live in those parts with freedom of conscience.”
Religious liberty was important for these English Catholics because they didn’t have it at home. The Mass was banned, their priests were outlawed and, since 1571, even the possession of personal devotional items, like rosaries, was considered suspect. In November 1581, Catesby was fined 1,000 marks (£666) and imprisoned in the Fleet for allegedly harboring the Jesuit missionary priest, Edmund Campion, who was executed in December.
Campion’s mission had been controversial. He had challenged the state to a public debate and he had told the English Catholics that those who had been obeying the law and attending official church services every week — perhaps crossing their fingers, or blocking their ears, or keeping their hats on, to show that they didn’t really believe in Protestantism — had been living in sin. Church papistry, as it was known pejoratively, was against the law of God. The English government responded by raising the fine for non-attendance from 12 pence to £20 a month. It was a crippling sum and it prompted Catesby and his friends to go in search of a promised land.
The American venture was undeniably risky — “wild people, wild beasts, unexperienced air, unprovided land” did not inspire investor confidence — but it had some momentum in the summer of 1582. Francis Walsingham, Elizabeth I’s secretary of state, was behind it, but the Spanish scuppered it. Ambassador Mendoza argued that the emigration would drain “the small remnant of good blood” from the “sick body” of England. He was also concerned for Spain’s interests in the New World. The English could not be allowed a foothold in the Americas. It mattered not a jot that they were Catholic, “they would immediately have their throats cut as happened to the French.” Mendoza conveyed this threat to the would-be settlers via their priests with the further warning that “they were imperilling their consciences by engaging in an enterprise prejudicial to His Holiness” the Pope.
So Sir William Catesby did not sail the seas or have a role in the plantation of what — had it succeeded — would have been the first English colony in North America. He remained in England and continued to strive for a peaceful solution. “Suffer us not to be the only outcasts and refuse of the world,” he and his friends begged Elizabeth I in 1585, just before an act was passed making it a capital offense to be, or even to harbor, a seminary priest in England. Three years later, as the Spanish Armada beat menacingly towards England’s shore, Sir William and other prominent Catholics were clapped up as suspected fifth columnists. In 1593 those Catholics who refused to go to church were forbidden by law from traveling beyond five miles of their homes without a license. And so it went on until William’s death in 1598.
Seven years later, in the reign of the next monarch James I (James VI of Scotland), William’s son Robert became what we would today call a terrorist. Frustrated, angry and “beside himself with mindless fanaticism,” he contrived to blow up the king and the House of Lords at the state opening of Parliament on 5 November 1605. “The nature of the disease,” he told his recruits, “required so sharp a remedy.” The plot was discovered and anti-popery became ever more entrenched in English culture. Only in 2013 was the constitution weeded of a clause that insisted that royal heirs who married Catholics were excluded from the line of succession.
Every 5 November, we English and Scottish set off our fireworks and let our children foam with marshmallow, and we enjoy “bonfire night” as a bit of harmless fun, without really thinking about why the plotters sought their “sharp remedy” or, indeed, about the tragedy of the father’s failed American Dream, a dream for religious freedom that was twisted out of all recognition by the son.
Featured image: North East America, by Abraham Ortelius 1570. Public Domain via Wikimedia Commons.
Another election season is upon us, and so it is time for another lesson in electoral geography. Americans are accustomed to color-coding our politics red and blue, and we shade those handful of states that swing both ways purple. These color choices, of course, vastly simplify the political dynamic of the country.
Look more closely at those maps, and you’ll see that the real political divide is between metropolitan America and everywhere else. The blue dots on the map are, in fact, tiny, and the country is otherwise awash in red. Those blue dots, though, are where most of us live — 65% of us, according to the Brookings Institution, live in metro regions of 500,000 or more — and those big red areas are increasingly empty.
The urban-rural divide has existed in American politics from the very beginning. It is a central irony of American political life that we are an urbanized nation inhabited by people who are deeply ambivalent about cities.
It’s what I call the “anti-urban tradition” in American life, and it comes in two parts.
On the one hand, American cities — starting with Philadelphia in the 18th century — have always been places of ethnic, racial, religious, and cultural diversity. First stop for immigrant arrivals from eastern Europe or the American south, cities embodied the cosmopolitan ideal that critic Randolph Bourne celebrated in his 1916 essay “Trans-National America.”
Not all Americans were as enthusiastic as Bourne about cities filling up with Catholics from Italy and Poland, Jews from Russia and Lithuania, and African-Americans from Mississippi and North Carolina. Many, in fact, recoiled in horror at all this heterogeneity. Many, of course, still do, as when Republican Vice Presidential candidate Sarah Palin campaigned in North Carolina and called small towns there “real America.”
On the other hand, the industrial cities that boomed at the turn of the 20th century relied on the actions of government to make life livable. Paved streets, clean water, sanitary sewers — all this infrastructure required the intervention of local, state, and eventually the federal government. Indeed, the 20th century city is where our commitments to the public realm have been given their widest expression — public space, public transportation, public education, public housing. And anti-urbanists then and now have a deep suspicion of those public, “collective” commitments.
In this sense, cities stand as antithetical to the basic, bedrock, “real” American values: self-reliant individualism and the supremacy of all things private. The 2012 Republican Party Platform, for example, denounced “sustainable development,” often associated with urbanist design principles, as nothing less than an assault on “the American way of life of private property ownership, single family homes, private car ownership and individual travel choices, and privately owned farms.”
Yet while anti-urbanism today is closely associated with Tea Party conservatives, its history in the 20th century is more complicated. The American antipathy toward our cities has been common across the political spectrum.
Franklin Roosevelt, architect of the modern liberal state, disliked cities personally — one of his closest aides described him as a “child of the country” who saw cities as “a perhaps necessary nuisance.” He was, to borrow the title of a 1940 biography, a “country squire in the White House.”
The New Deal reflected that anti-urban feeling. While a number of his New Deal programs addressed themselves to the failing industrial economy of the nation’s cities, FDR’s larger ambition was to “decentralize” cities by moving people and industry out into the hinterlands. This urge tied together the Tennessee Valley Authority, the Civilian Conservation Corps, and the program to build a series of entirely new towns. After all, according to New Dealer Rexford Tugwell, FDR “always did, and always would, think people better off in the country.”
A generation later, the counter-culture of the 1960s, which had emerged on college campuses in Berkeley, Madison, Ann Arbor, and elsewhere, manifested its own version of anti-urbanism. Fed up with what they saw as America’s un-savable cities, its adherents went back to the land in dozens of different communal experiments. So many young people joined the exodus out of the city that Newsweek magazine declared 1969 “The Year of the Commune.”
Whether in the hills of Vermont or the hills of Marin County, communards shared the anti-urban impulse with their parents, who had left the city to move to the suburbs in the 1950s. As Steve Diamond put it in a 1971 book describing a trip from his commune back to New York: “you could feel yourself approaching the Big C (City, Civilization, Cancer) itself, deeper and deeper into the decaying heart.” These rebels might not have recognized their resemblance to their parents, but it was there in their shared anti-urban rhetoric.
Certainly, our ambivalence toward our cities lies beneath our unwillingness to tackle urban problems, whether in Detroit or Cleveland or Philadelphia. But the consequences of our anti-urban tradition are more wide-ranging. Our inability to think in public terms, to address the commonweal, grows directly out of our experience running away from cities in the 20th century. If we want a more effective and invigorated politics in the 21st century, therefore, we will have to outgrow our anti-urban habits.
Featured image: Aerial view of the tip of Manhattan, New York, United States, ca. 1931. Public domain via Wikimedia Commons.
The construction or recertification of a nuclear power plant often draws considerable attention from activists concerned about safety. However, nuclear powered US Navy (USN) ships routinely dock in the most heavily populated areas without creating any controversy at all. How has the USN managed to maintain such an impressive safety record?
The USN is not alone: many organizations, such as nuclear public utilities, confront the need to maintain perfect reliability or face catastrophe. However, this compelling need to be reliable does not insulate them from the need to innovate and change. Given the high stakes, and the risk that changes in one part of an organization's system will have consequences for others, how can such organizations make better decisions regarding innovation? The experience of the USN is apt here as well.
Given that they have at their core a nuclear reactor, navy submarines are clearly high-risk organizations that need to innovate yet must maintain 100% reliability. Shaped by the disastrous loss of the USS Thresher in 1963, the USN adopted a very cautious approach dominated by safety considerations. In contrast, the Soviet Navy, mindful of its inferior naval position relative to the United States and her allies, adopted a much more aggressive approach focused on pushing the limits of what its submarines could do.
Decision-making in both organizations was complex, and it differed greatly between the two. In each navy, it was a complex interaction among individuals confronting a central problem (their opponents' capabilities) that admitted a wide range of solutions. In addition, the solution was arrived at through a negotiated political process in response to another party that was, ironically, never directly engaged: the submarines never actually fought the opponent.
Perhaps ironically, given its government's reputation for rigidity, it was the Soviet Navy that was far more entrepreneurial and innovative. The Soviets often decided to develop multiple different types of attack submarine: submarines armed with scores of guided missiles to attack U.S. carrier battle groups, referred to as SSGNs, and smaller submarines designed to attack other submarines. In contrast, the USN adopted a much more conservative approach, choosing to modify its designs only slightly, such as by adding vertical launch tubes to its Los Angeles class submarines. It helped the USN that it needed its submarines to do mostly one thing, attack enemy submarines, while the Soviets needed their submarines to attack both submarines and USN carrier groups.
As a result of their innovation, aided by their use of design bureaus, institutions that do not exist in the U.S. military-industrial complex, the Soviets made great strides in closing the performance gaps with the USN. Their Alfa class submarines were very fast and deep diving. Their final class of submarine before the disintegration of the Soviet Union, the Akula class, was largely a match for the Los Angeles class boats of the USN. However, they paid a high price for these gains.
Soviet submarines suffered from many accidents, including ones involving their nuclear reactors. Both their SSGNs, designed to attack USN carrier groups, and their attack submarines had many problems. After 1963 the Soviets had at least 15 major accidents that resulted in the total loss of a boat or major damage to its nuclear reactor. One submarine, the K-429, actually sank twice. The innovative Alfas, immortalized in The Hunt for Red October, were so trouble-prone that they were all decommissioned in 1990 save for one that had its innovative reactor replaced with a conventional one. In contrast, the USN had no accidents, though one submarine, the USS Scorpion, was lost in 1968 to unknown causes.
Why were the USN submarines so much more reliable? There were four basic reasons. First, the U.S. system allowed for much more open communication among the relevant actors, which made mutual adjustment easier within complex yet tightly integrated systems. Second, the U.S. system diffused power far more than the Soviet political system did; as a result, the U.S. pursued less radical innovations. Third, in the U.S. system decision makers often worked with more than one group: a U.S. admiral, for example, not only worked within the Navy but also interacted with the shipyards and with Congress. Finally, Admiral Rickover was a strong safety advocate who instilled a safety culture that has endured to this day.
In short: share information, share power, make sure you know what you are doing, and have someone powerful who is an advocate for safety. Like so much in management, it sounds like common sense when explained well, but in reality it is very hard to do, as the Soviets discovered.
Feature image credit: Submarine, by subadei. CC-BY-2.0 via Flickr.
Historians are tasked with recreating days past, setting vivid scenes that bring the past to the present. Mark M. Smith, author of The Smell of Battle, the Taste of Siege: A Sensory History of the Civil War, engages all five senses to recall the roar of cannon fire at Vicksburg, the stench of rotting corpses at Gettysburg, and many more of the sights and sounds of battle. In doing so, Smith creates a multi-dimensional vision of the Civil War and captures the human experience during wartime. Here, Smith speaks to how our senses work to inform our understanding of history and why the Civil War was a singular sensory event.
Anyone who expects bipartisanship in the wake of last Tuesday’s elections has not been paying attention. The Republican Party does not believe in a two-party system that includes the Democrats, and it never has. Ever since the Civil War, when the Republicans were convinced that their Democratic opposition was in treacherous league with the Confederacy, the Grand Old Party in season and out has doubted the legitimacy of the Democrats to hold power. While the Republicans have accepted the results of national elections as facts they could not change, they have not believed that the Democrats were ever legitimately holding power. Democratic victories, in the minds of Republicans, are the result of fraud and abuse.
Consider some examples: In 1876, Republicans in New York said the Democratic party was “the same in character and spirit as when it sympathized with treason.” Half a century later, speaking of Woodrow Wilson, Henry Cabot Lodge told the 1920 Republican national convention that “Mr. Wilson stands for a theory of administration and government which is not American.” When Senator Joseph R. McCarthy spoke of “twenty years of treason” in the 1950s, he was not joking. He meant the statement as literal fact. So too did an aide to George H.W. Bush in 1992 when he observed, “We are America. These other people are not America.”
So when Rush Limbaugh comments that "Democrats were not elected to govern," or Leon H. Wolf of Redstate says Democrats "should not even be invited to be part of the discussion lest their gangrenous, festering and destructive ideas should further infect our caucus," they are reflecting an attitude toward the Democrats that is at least a century and a half old.
If, as many Republicans believe, there are elements of illegitimacy and evil in the Democratic Party under the leadership of President Obama, then a posture of intense resistance becomes a necessary GOP tactic. Meeting the threat that the Democrats pose on such issues as same-sex marriage, climate change, and immigration reform requires going beyond politics as usual and employing any means necessary to save the nation.
For contemporary Republicans, scorched earth tactics and all-out opposition seem the appropriate response to the presence of a pretender in the White House who in their minds is pursuing the collapse of the American republic. There no longer exists between Republicans and Democrats a rough consensus about the purpose of the United States.
How has it come to this? A long review of both political parties suggests that the experience of the Civil War introduced a flaw into American democracy that was never recognized or resolved. The Republicans regarded the wartime flirtation of some Democrats with the Confederacy as evidence of treason. So it may have been at that distant time. What rendered that conclusion toxic was the perpetuation of the idea of Democratic illegitimacy and betrayal long after 1865.
After their extended years in the wilderness during the New Deal, Republicans reasserted their presidential dominance, with a few Democratic interruptions from 1952 to 1992. Republicans thus saw in the ascendancy of Dwight Eisenhower, Richard Nixon, Gerald Ford, Ronald Reagan, and the two Bushes a return to the proper order of politics in which the Republicans were destined to be in charge and Democrats to occupy a position of perennial deference outside of Congress.
Then the unthinkable happened. Not just a Democrat but a black Democrat won the White House. The southern-based Republican Party saw its worst fears coming true. A man with a foreign-sounding name, an equivocal religious background, and a black skin was president and pursuing what were to most Republicans sinister goals. Under his administration, blacks became assertive, gays married, the poor got health care, and the wealthy faced both a lack of due respect and a claim on their income.
The Republican allegiance to traditional democratic practices now seemed to them outmoded in this national crisis. Americans could not really have elected Barack Obama and put his party in control of the destiny of the nation. Such an outcome must be illegitimate. And what is the remedy for illegitimacy, treason, and godlessness? To quote Leon Wolf again: “Working with these people is not what America elected you to do. Republicans, it elected you to stop them.” Pundits who forecast a new era of bipartisanship comparable to what Dwight D. Eisenhower, Everett Dirksen, Sam Rayburn, and Lyndon B. Johnson achieved in the 1950s are living in a nostalgic dream world. Richard Nixon viewed politics as war and contemporary Republicans will proceed to explore the validity of his insight over the next two years. For the American voter, clinging to the naive notion of the parties working together, each taking part of the loaf, the best guide may be Bette Davis in All About Eve: “Fasten your seat belts. It’s going to be a bumpy night.”
Featured image: Members of the Republican Party gather at the 1900 National Convention. Public domain via Wikimedia Commons.