Nursing lore has long maintained that the mysterious illness that sent Florence Nightingale to bed for 30 years after her return from the Crimea was syphilis. At least that’s what many nursing students were told in the 1960s, when my wife was working on her BSN. Syphilis, however, would be difficult to reconcile with the fact that Nightingale was likely celibate her entire life and had not a single sign or symptom typical of that venereal infection.
On 6 August 2015, the Voting Rights Act (VRA) will be turning 50 years old. In 1965, President Lyndon B. Johnson approved this groundbreaking legislation to eliminate discriminatory barriers to voting. The Civil Rights Movement played a notable role in pushing the VRA to become law. In honor of the law's birthday, Oxford University Press has put together a quiz to test how much you know about its background, including a major factor in its success, Section 5.
Today marks the forty-sixth anniversary of Prince Charles’s formal investiture as Prince of Wales. At the time of this investiture, Charles himself was just shy of his twenty-first birthday, and in a video clip from that year, the young prince looks lean and fresh-faced in his suit, his elbows resting on his knees, his hands clasping and unclasping as he speaks to the importance of the investiture.
On 9 July 1755, British troops under the command of General Edward Braddock suffered one of the greatest disasters of military history. Braddock's Defeat, or the Battle of the Monongahela, was the most important battle prior to the American Revolution, carrying with it enormous consequences for the British, French, and Native American peoples of North America.
John Paul Jones died in Paris on this day in 1792, lonely and forgotten by the country he helped bring into existence. Shortly before his death, he began to lose his appetite. Then his legs began to swell, and then his abdomen, making it difficult for him to button his waistcoat and to breathe.
Over the past half-century, Medicare and Medicaid have constituted the bedrock of American healthcare, together providing insurance coverage for more than 100 million people. Yet these programs remain controversial: clashes endure between opponents who criticize costly, “big government” programs and supporters who see such programs as essential to the nation's commitment to protect the vulnerable.
When Charles Darwin died at age 73 on this day 133 years ago, his physicians decided that he had succumbed to “degeneration of the heart and greater vessels,” a disorder we now call “generalized arteriosclerosis.” Few would argue with this diagnosis, given Darwin’s failing memory, and his recurrent episodes of “swimming of the head,” “pain in the heart”, and “irregular pulse” during the decade or so before he died.
‘Anzac’ (soon transmuting from acronym to word) came to sum up the Australian desire to reflect on what the war had meant. What was the first Anzac Day? At least four explanations exist of the origins of the idea of Anzac, the most enduring legacy of Australia’s Great War.
May the Fourth be with you! Playing off a pun on one of the movie’s most famous quotes, May the 4th is the unofficial holiday on which Star Wars fans across the globe celebrate the beloved blockbuster series. The original Star Wars movie, now known as Star Wars: Episode IV – A New Hope, was released on 25 May 1977, but to those of us who waited in line after line to see it again and again in theaters, it will always be just Star Wars.
On this day in 1863, General Thomas J. “Stonewall” Jackson, one of the wiliest military commanders this country ever produced, died eight days after being shot by his own men. He had lost a massive amount of blood before having his left arm amputated by Dr. Hunter Holmes McGuire, arguably the most celebrated Civil War surgeon of either side.
Of the many controversies surrounding the life and legacy of Christopher Columbus, who died on this day 510 years ago, one of the most intriguing but least discussed questions is his true country of origin. For reasons lost in time, Columbus has been identified with unquestioned consistency as an Italian of humble beginnings from the Republic of Genoa. Yet in over 536 existing pages of his letters and documents, not once does the famous explorer claim to have come from Genoa.
On this day in 1953, the New Zealand mountaineer Edmund Hillary and Nepali-Indian Sherpa mountaineer Tenzing Norgay became the first people to reach the summit of Mount Everest. In the following excerpt from his book, Exploration: A Very Short Introduction (OUP, 2015), Stewart A. Weaver discusses why we, as humans, want to explore and discover. For all the different forms it takes in different historical periods, for all the worthy and unworthy motives that lie behind it, exploration, travel for the sake of discovery and adventure, seems to be a human compulsion.
On June 21, Mt. Zion United Methodist Church in Philadelphia, Mississippi will hold its fifty-first memorial service for three young civil rights workers murdered by the Ku Klux Klan at the start of the Freedom Summer. Andrew Goodman, James Chaney, and Michael Schwerner were activists who planned to create a voting rights school at the church, located in rural Neshoba County.
This year marks the fiftieth anniversary of the congressional passage of the Hart-Celler Immigration and Nationality Act, signed into law by President Lyndon B. Johnson. It was the culmination of a trend toward reforming immigrant admissions and naturalization policies that had gathered momentum in the early years of the Cold War era.
As we head towards another General Election in 2015, once again politicians from the Right and Left will battle it out, hoping to persuade the electorate that either a big state or small state will best address the challenges facing our society. For 40 years, Germans living behind the Iron Curtain in the German Democratic Republic (GDR) had first-hand experience of a big state, with near-full employment and heavily subsidized rent and basic necessities. Then, when the Berlin Wall fell, and East Germany was effectively taken over by West Germany in the reunification process, they were plunged into a new capitalist reality. The whole fabric of daily life changed, from the way people voted, to the brand of butter they bought, to the newspapers that they read. Circumstances forced East Germans to swap Communism for Capitalism, and their feelings about this change remain quite diverse.
Initially, East Germans flooded across the border, bursting with excitement and curiosity to see what the West was like – a ‘West’ that most had only known through watching Western television. For some, sampling a McDonald’s hamburger – the ultimate symbol of Western capitalism – was high on the to-do list, for others it was access to Levi’s jeans or exotic fruit that was particularly novel.
While delighting in these consumer improvements, however, many East Germans felt ambivalent about the wider changes. Decades of state propaganda painting Western societies as unjust places where homelessness, drugs, and unemployment were rife had left their mark, and many East Germans felt unsure and slightly fearful of what was to come.
From a position of full employment in 1989, East Germany saw 15% of its workforce out of work within three years of reunification. For those who struggled to put bread on the table after reunification, the advantages of having a wider range of goods to buy remained a largely theoretical gain. For others, however, reunification led to greater freedom to pursue individual career choices that were not dictated by the state’s needs.
For those who were made to feel vulnerable and afraid by a regime that watched, trailed and threatened to imprison them, such as political opponents, Christians, environmental activists or other non-conformists, the fall of the Wall and the end of the GDR most often brought relief: the Western set-up allowed for greater freedom of expression and greater freedom of movement.
For the majority of East Germans, though, this was not how they felt: since many say they had no idea of the extent to which the Stasi was intertwined with daily life, the end of the GDR did not bring with it a sense of relief. In fact, many East Germans felt that there was much the West could learn from the GDR, and were resentful at the lack of openness to incorporating any East German policies or practices in the reunification process.
“Everywhere is becoming like a foreign land. I have long wished to travel to foreign parts, but I have always wanted to be able to come home … The landscapes remain the same, the towns and villages have the same names, but everything here is becoming increasingly unfamiliar.”
This view was echoed by many East Germans, who were conscious that they, for example, dressed differently from their Western compatriots, didn’t know how to pronounce items on the McDonald’s menu when ordering, and didn’t know how to work coin-operated supermarket trolleys in the West. With the fall of the Wall, a whole way of life evaporated. The certainties on which day-to-day routines had been built ceased to exist.
Swapping Communism for Capitalism has prompted diverse reactions from East Germans. Few would wish to return to the GDR, even if it were possible. However, while many delight in having greater individual choice about what they eat, where they go, what they do, and what they say, they often also feel a wistful nostalgia for life before reunification, when the disparity between rich and poor was smaller and the solidarity between citizens seemed greater.
“This is a historic day. East Germany has announced that, starting immediately, its borders are open to everyone. The GDR is opening its borders … the gates in the Berlin Wall stand open.”
—Hans Joachim Friedrichs, reporting for the Tagesthemen, 9 November 1989
On 9 November 1989, at midnight, the East German government opened its borders to West Germany for the first time in almost thirty years: a city divided, and families and friends separated for a generation, were reunited. For much of the Wall’s existence, attempting to cross it carried a grave risk of death: around 80 East Germans were killed in the attempt, shot down by border guards as they tried to make their escape. With this announcement, however, the gates were thrown open.
The mood was euphoric. East Germans surged through the opened gates, shouting and cheering, to be met by West Germans on the other side. That same night, together they began dismantling the barrier that had kept them apart, chipping away bricks to keep as mementos. The fall of the Wall, an ugly scar across Berlin, adorned with barbed wire and patrolled by guards with machine guns, was a pivotal event in German history. A nation crippled by the most devastating conflict in living memory, and then carved up and separated from itself by the victors, could finally shrug off the long shadow cast by a dark history and look toward a brighter, unified future.
The seismic consequences of the Wall’s fall were also felt well beyond the borders of Germany and, along with the slow rusting and decay of the Iron Curtain, helped to spell the end of the USSR. Within two years, the Soviet Union would cease to exist, ending the era we now call the Cold War. A period of some forty-five years, marked by suspicion, space rockets, assassinations, espionage, show trials, paranoia, and propaganda, and one that brought the world to the brink of destruction with the Cuban Missile Crisis, was finally at an end.
To mark the 25th anniversary of this momentous moment, we’ve compiled a selection of free chapters and articles across our online resources, which shed further light on the history behind the wall, what it meant to live in a city divided by it, and how the USSR declined and eventually fell.
On Sunday 13 August 1961, the wall was erected. This chapter, drawing on first-hand accounts, examines the initial reactions to the wall. As quoted in the chapter, one source describes the atmosphere of the day the wall went up as if “East Berlin was dead. It was as if a bell‐jar had been placed over it and all the air sucked out. The same oppressiveness which hung over us, hung over all Berlin. There was no trace of big city life, of hustle and bustle. Like when a storm moves across the city. Or when the sky lowers and people ask if hail is on the way.”
Whilst taking very little direct action against the wall, the West did offer covert assistance to groups of East Berlin activists trying to provide escape routes for those who wanted them. One of these groups, led by Rudolf Müller, had dug a tunnel underneath the wall and was busy ferrying through escapees when a group of East German soldiers surprised them. Though the activists escaped unscathed, the confrontation left a twenty-one-year-old soldier, Egon Schultz, dead. This chapter examines how Schultz and his death were idealised and politicised by the East German state, transforming him into a hero-victim of the ‘socialist frontier.’
In the early 1980s the USSR was struggling with a war in Afghanistan, economic problems, and changes of leadership. From the middle of the 1980s, Soviet policy changes under Gorbachev ended the arms race and eventually relinquished control of Eastern Europe, bringing about an end to the Cold War and the USSR. This chapter looks at these final years of the Cold War, and explores the impact of Reagan and Gorbachev.
One of the key factors in the demise of the USSR was the USSR itself – or, rather, the reforms of Gorbachev. With twin policies of ‘perestroika’ (literally ‘restructuring’) and ‘glasnost’ (a policy calling for increased transparency in the Soviet Union), Gorbachev began the slow process toward democratization, dismantling the totalitarian psychology that had marked previous Soviet regimes and paving the way for progressive reforms.
Gorbachev’s policies, coupled with Hungary opening its borders to tens of thousands of East Germans, left the state with a crisis on its hands. When the East German government then moved to close off travel to Hungary, many citizens took to the streets in protest, in what quickly became a large movement. Troops were sent to disperse the protesters, but the demonstrators’ non-violent tactics made it difficult to justify the use of force, leading many of the troops to defy orders and defect. As the movement’s momentum grew, the strength of the state declined, leading to the fall of the wall and the eventual dissolution of East Germany.
Guyatt examines different historical perspectives on what caused the end of the Cold War, as well as the psychological, strategic, and political effects of its aftermath. Was it the press statement made by Gorbachev’s spokesman after the fall of the Berlin Wall, declaring that the tensions which had stretched “from Yalta to Malta” were over, that marked the War’s official end? Perhaps the end came with Gorbachev’s 1988 statement to the United Nations renouncing the use of Soviet military force to subdue the satellite states of the Warsaw Pact. The article explores these catalysts, among others, to present a comprehensive look at the War’s end and the feelings of anxiety, fear, and “triumphalism” that abounded in Western Europe as a result.
As the wall came down, Germans were faced with a new challenge: how to forge a new, modern Germany. Linking the ‘macro’ worlds of institutional change to the ‘micro’ worlds of the lives and individual histories of its citizens, this chapter paints a fascinating portrait of a once socialist and totalitarian state transitioning into the democratic Federal Republic of Germany.
The dismantling of the wall, which was both a symbolic and literal division between East and West, could have served as a potent symbol for a unified Germany and played an integral part in its foundation myth, yet this was not the case. Why was this? Charting the reasons behind this – including the pre-existing German fields of memory left by its dark past – the chapter explains why the fall of the wall is likely to remain a “muted, tempered memory” in German politics.
What books would you add to our Berlin Wall reading list?
On this day in 1984, musicians from the worlds of pop and rock came together to record the iconic ‘Do They Know It’s Christmas?’ single for Band Aid. The single has gone down in history as an example of the power of music to help right the wrongs in the world. The song leapt to the number one spot over the Christmas of 1984, selling over a million copies in under a week and totalling sales of three million by the end of that year. The Band Aid super-group featured the cream of eighties pop, including David Bowie, Phil Collins, George Michael, Sting, Cliff Richard and Paul McCartney.
The sales target for the single was £70,000, all of which was to be donated to the African famine relief fund. With support from Radio 1 DJs and a Top of the Pops Christmas Special, sales sky-rocketed, and Geldof, feeling the strength of public opinion behind him, went toe-to-toe with the Conservative government in an attempt to have the tax on the single waived. Margaret Thatcher initially refused the plea, but as public outcry grew, Thatcher caved in to public demands and the tax on sales worth nearly £9 million was donated back to charity.
Bob Geldof and a host of artists old and new have re-recorded the single to help raise funds to stem the Ebola crisis. Our infographic marks the 30th anniversary of the original recording and illustrates the movers and shakers that made this monumental milestone in pop history possible.
The disease that carried Mozart off 224 years ago today was as sudden as it was mysterious. It struck during a year in which he was uncommonly healthy and also spectacularly productive. Only its essential elements are known, the most striking of which was progressive swelling (i.e., edema) of the entire body, ultimately so profound that a few days before Mozart died he was unable to make the smallest movement and had to be fitted with a gown that opened at the back to facilitate changing. By then, according to his son, Carl Thomas, he also had a stench so awful (likely due to retained urinary waste products), that “an autopsy was rendered impossible.” Although Mozart was the disorder’s most notable victim, he was by no means its only one. According to Dr. Eduard Vincent Guldener von Lobes, one of several consulting physicians: “A great number of inhabitants of Vienna were at this time laboring under the same complaint, and the number of cases which terminated fatally, like that of Mozart’s, was considerable. I saw the body after death, and it exhibited no appearances beyond those usual in such cases.” Von Lobes’ statement was recently confirmed by Zeger, Weigl and Steptoe, who found a marked increase in “deaths from edema among young men” recorded in Vienna’s official daily register in the weeks surrounding Mozart’s death compared with previous and following years.
Although over 100 different diagnoses have been proposed as the cause of Mozart’s fatal illness, none fits its character, course, and epidemiological characteristics better than acute glomerulonephritis – acute inflammation of the microscopic filters of the kidneys (the glomeruli) induced by a preceding streptococcal infection. Mozart, in fact, was no stranger to streptococcal infections and their complications. He had a series of severe illnesses as a child, which were almost certainly recurrent episodes of strep throat and streptococcus-induced acute rheumatic fever. Therefore, if his final, fatal illness was acute, post-streptococcal glomerulonephritis, it would have been just one of many times his life could have been cut short by an encounter with streptococci. However, unlike acute rheumatic fever, post-streptococcal glomerulonephritis is typically a benign disorder of young children, which virtually always resolves fully in a matter of weeks. How then, could acute, post-streptococcal glomerulonephritis explain not just Mozart’s death, but also those of the many other young Viennese men who died of “edema” during the winter of 1791/2?
The answer lies with the particular species of streptococcus responsible for the cases of acute glomerulonephritis. Streptococcus pyogenes is the species of streptococcus responsible for the vast majority of acute post-streptococcal glomerulonephritis (as well as “strep throat” and rheumatic fever) – those benign cases, involving children who recover completely after relatively brief illnesses. There is, however, another, rarer form of post-streptococcal glomerulonephritis, a much more severe form, which attacks and sometimes kills adults. It’s caused by a different species of streptococcus, Streptococcus equi, the agent responsible for “strangles,” a highly contagious infection of horses. The bacterium also attacks cows, and in the rare instances in which humans are infected, consumption of milk or milk products from S. equi-infected cows is responsible. The infection produces an illness typical of acute glomerulonephritis, in which over 90% of the victims are adults. One in 50 dies, even with the best care available today. One in 20 requires renal dialysis to recover, which, of course, was not available in Mozart’s day.
In the final analysis, of the myriad diagnoses proposed to date, only an epidemic of acute post-streptococcal glomerulonephritis caused by milk or cheese contaminated with S. equi, explains both the clinical and the epidemiological features of Mozart’s fatal illness.
Headline image credit: Mozart family portrait, circa 1780. Public domain via Wikimedia Commons.
Many students, when asked by a teacher or professor to volunteer in front of the class, shy away, avoid eye contact, and try to seem as plain and unremarkable as possible. The same is true in dental school – unless it comes to laughing gas.
As a fourth year dental student, I’ve had times where I’ve tried to avoid professors’ questions about anatomical variants of nerves, or the correct way to drill a cavity, or what type of tooth infection has symptoms of hot and cold sensitivity. There are other times where you cannot escape having to volunteer. These include being the first “patient” to receive an injection from one of your classmate’s unsteady and tentative hands. Or having an impression taken with too much alginate so that all of your teeth (along with your uvula and tonsils) are poured up in a stone model.
But volunteering in the nitrous oxide lab … that’s a different story. The lab day is about putting ourselves in our patients’ shoes, to be able to empathize with them when they need to be sedated. For me, the nitrous oxide lab might have been the most enjoyable 5 minutes of my entire dental education.
In today’s dental practice, nitrous oxide is a readily available, well-researched, incredibly safe method of reducing patient anxiety with little to no undesired side effects. But this was not always the case.
The Oxford Textbook of Anaesthesia for Oral and Maxillofacial Surgery argues that “with increasingly refined diets [in the mid-nineteenth century] and the use of copious amounts of sugar, tooth decay, and so dentistry, were on the increase.” Prior to the modern day local anesthesia armamentarium, extractions and dental procedures were completed with no anesthesia. Patients self-medicated with alcohol or other drugs, but there was no predictable or controllable way to prevent patients from experiencing excruciating pain.
That is, until Horace Wells, a dentist from Hartford, Connecticut, started taking an interest in nitrous oxide as a method of numbing patients to pain.
Wells became convinced of the analgesic properties of nitrous oxide on December 11, 1844, after observing a public display in Hartford of a man inhaling the gas and subsequently hitting his shin on a bench. After the gas wore off, the man miraculously felt no pain. With inspiration from this demonstration and a strong belief in the analgesic (and possibly the amnestic) qualities of nitrous oxide, on December 12, Wells proceeded to inhale a bag of nitrous oxide and have his associate John Riggs extract one of his teeth. It was risky—and a huge success. With this realization that dental work could be pain free, Wells proceeded to test his new anesthesia method on over a dozen patients in the following weeks. He was proud of his achievement, but he chose not to patent his method because he felt pain relief should be “as free as the air.”
This discovery brought Wells to the Ether Dome at the Massachusetts General Hospital in Boston. Before an audience of Harvard Medical School faculty and students, Wells convinced a volunteer from the audience to have their tooth extracted after inhaling nitrous oxide. Wells’ success came to an abrupt halt when this volunteer screamed out in pain during the extraction. Looking back on this event, it is very likely that the volunteer did not inhale enough of the gas to achieve the appropriate anesthetic effect. But the reason didn’t matter—Wells was horrified by his volunteer’s reaction, his own apparent failure, and was laughed out of the Ether Dome as a fraud.
The following year, William Morton successfully demonstrated the use of ether as an anesthetic for dental and medical surgery. He patented the discovery of ether as a dental anesthetic and sold the rights to it. To this day, most credit the success of dental anesthesia to Morton, not Wells.
After giving up dentistry, Horace Wells worked unsuccessfully as a salesman and traveled to Paris to see a presentation on updated anesthesia techniques. But his ego had been broken. After returning to the United States, he developed a dangerous addiction to chloroform (perhaps another risky experiment in patient sedation gone awry) that left him mentally unstable. In 1848, while under the influence, he assaulted a streetwalker. He was sent to prison and, in the end, took his own life.
This is the sad story of a man whose discovery revolutionized dentists’ ability to effectively care for patients while keeping them calm and out of pain. As a student at the University of Connecticut School of Dental Medicine, it is a point of pride knowing that Dr. Wells made this discovery just a few miles from where I have learned about the incredible effects of nitrous oxide. My education has taught me to use it effectively for patients who are nervous about a procedure and to improve the safety of care for patients with high blood pressure. This is a day we can remember a brave man who risked his own livelihood in the name of patient care.
Featured image credit: Laughing gas, by Rumford Davy. Public domain via Wikimedia Commons.
Seventy years ago today, in Korematsu v. United States, the Supreme Court upheld the constitutionality of the Japanese-American internment program authorized by President Franklin Roosevelt’s Executive Order 9066. The Korematsu decision and the internment program that forcibly removed over 100,000 people of Japanese ancestry from their homes during World War II are often cited as ugly reminders of the dangers associated with wartime hysteria, racism, fear-mongering, xenophobia, an imperial president, and judicial dereliction of duty. But the events surrounding Korematsu are also a harrowing reminder of what happens to liberty when the “Madisonian machine” breaks down — that is, when the structural checks and balances built into our system of government fail and give way to the worst forms of tyranny.
Our 18th century system of separated and fragmented government — what Gordon Silverstein calls the “Madisonian machine” — was engineered to prevent tyranny, or rather tyrannies. Madison’s Federalist 51 outlines a prescription for avoiding “Big T Tyranny” — the concentration of power in any one branch of government. This would be accomplished by dividing and separating powers among the three branches of government and between the federal government and the states. “Ambition must be made to counteract ambition,” Madison wrote. Each branch would jealously protect its own powers while guarding against encroachments by the others.
But this wasn’t the only form of tyranny the framers worried about. In a democracy, minorities are always at risk of being oppressed by majorities — what I call “little t tyranny.” Madison’s solution to this kind of tyranny is articulated in Federalist 10. The cure for this disease was, first, to elect representatives who could filter the passions of the masses and make more enlightened decisions. Second, Madison observed that as long as the citizenry was sufficiently divided and carved up into numerous smaller “factions,” it would be unlikely that a unified majority would emerge to oppress a minority faction.
In the events leading up to and including the Supreme Court’s decision in Korematsu, these safeguards built into the Madisonian machine broke down, giving way to both forms of T/tyranny. Congress not only acquiesced to President Roosevelt’s executive order, it responded with alacrity to support it. After just one hour of floor debate and virtually no dissent, Congress passed Public Law 503, which promulgated the order and assigned criminal penalties for violating it. And the branch furthest removed from the whims and passions of the majority, the Supreme Court, declined to second-guess the wisdom of the elected branches. As Justice Hugo Black wrote for the majority in Korematsu, “we cannot reject as unfounded the judgment of the military authorities and of Congress…” If Congress had been more skeptical, perhaps the Supreme Court might have been, too. But the Supreme Court has a long track record of deference to the executive when Congress gives express consent for his actions – especially in times of war. Unfortunately, under the Madisonian design, this is exactly when the Supreme Court ought to be the most skeptical of executive power.
To be sure, these checks and balances built into the Madisonian system were only meant to function as “auxiliary precautions.” The most important safeguard against T/tyranny would be the people themselves. Through a campaign of misinformation and fear-mongering, however, this protection was also rendered ineffective. Public opinion data was used selectively to convey the impression to both legislators and west coast citizens that the majority of Americans supported the internment program. The passions of the public were further manipulated by the media and west coast newspaper headlines such as “Japanese Here Sent Vital Data to Tokyo,” “Lincoln Would Intern Japs,” and “Danger in Delaying Jap Removal Cited.” Any dissent or would-be countervailing “factions,” to use Madison’s phrase, were effectively silenced.
In Korematsu, ambition did not counteract ambition as Madison had intended, and the machine broke down. That’s because in order to function properly, the Madisonian machine requires access to information and time for genuine deliberation. It also requires friction. It requires people to disagree – for our elected representatives to disagree with one another, for the Supreme Court to police the elected branches, for citizens to pause, faction off, and check one another. So we can complain of gridlock in government, but let’s not forget that the alternative, as demonstrated by the unforgivable and tragic events of Korematsu, exposes the most vulnerable among us to the worst forms of tyranny.
Featured image credit: A young evacuee of Japanese ancestry waits with the family baggage before leaving by bus for an assembly center. US National Archives and Records Administration. Public domain via Wikimedia Commons.
Two hundred years ago American and British delegates signed a treaty in the Flemish town of Ghent to end a two-and-a-half-year conflict between the former colonies and mother country. Overshadowed by the American Revolution and Napoleonic Wars in the two nations’ historical memories, the War of 1812 has been somewhat rehabilitated during its bicentennial. Yet arguing for the importance of a status quo antebellum treaty that concluded a war in which neither belligerent achieved its war aims, no territory was exchanged, and no victor formally declared can be a tough sell. Compared to the final defeat of Napoleon at the Battle of Waterloo, fought just a few months later and forty-odd miles down the road from Ghent, the end of the War of 1812 admittedly lacked cinema-worthy drama.
But the Treaty of Ghent mattered enormously (and not just to historians interested in the War of 1812). The war it ended saw relatively light casualties, measured in the thousands compared to the millions who died in the French Revolutionary and Napoleonic Wars that raged across the rest of the globe. Nevertheless, for the indigenous and colonizing peoples that inhabited the borderlands surrounding the United States, the conflict had proved devastating. Because the American and British economies were intertwined, the war had also wreaked havoc on American agriculture and British manufacturing, and each side had wrecked the other’s merchant navy. Moreover, public support for the war in the British Empire and the United States had been lukewarm, with plenty of outspoken opponents who had worked tirelessly first to prevent and then to quickly end the war.
Not surprisingly, peace resulted in widespread celebration across the Atlantic. The Leeds Mercury, many of whose readers were connected to the manufacturing industries that had relied on American markets, even compared the news with that of the Biblical account of the angelic chorus’s announcement of the birth of Jesus: “This Country, thanks to the good Providence of God, is now at Peace with Europe, with America, and with the World. . . . There is at length ‘Peace on Earth,’ and we trust the revival of ‘Good-will among men’ will quickly follow the close of national hostilities.” When the treaty reached Washington for ratification, President James Madison and Congress fell over themselves in a rush to sign it.
Far more interesting than what the relatively brief Treaty of Ghent includes is what was left out. When the British delegation arrived at Ghent in August 1814, they had every possible advantage. Britain had won the naval war, the United States was on the brink of bankruptcy, and the end of Britain’s war with France meant that hardened veterans were being deployed for an imminent invasion of the United States. Later that month British troops would humiliatingly burn Washington. Even Ghent itself was a home field advantage, as it was occupied by British troops and within a couple of days’ communication with ministers in London.
In consequence, Britain’s initial demands were severe. If the United States wanted peace, it had to cede 250,000 square miles of its northwestern lands (amounting to more than 15% of US territory, including all or parts of the modern states of Michigan, Illinois, Indiana, Ohio, Missouri, Iowa, Wisconsin, and Minnesota). These lands would be used to create an independent American Indian state—promises of which the British had used to recruit wary Indian allies. Britain also demanded a new border for Canada, which included the southern shores of the Great Lakes and a chunk of British-occupied Maine—changes that would have given Canada considerable natural defenses. The Americans, claimed the British, were “aggrandizers”, and these measures would ensure that such ambitions would be forever thwarted.
The significance of the terms is difficult to overestimate. Western expansion would have ground to a halt in the face of a powerful British-led alliance with the Spanish Empire and new American Indian state. The humiliation would likely have resulted in the collapse of the United States. The long-marginalized New England Federalists had been outspoken in their opposition to the war and President James Madison’s Southern-dominated Republican Party, with some of their leaders openly threatening secession. The Island of Nantucket had already signed a separate peace with Britain, and many inhabitants of British-occupied Maine had signed oaths of allegiance to Britain. The Governor of Massachusetts had even sent an agent to Canada to discuss terms of British support for his state’s secession, which included a request for British troops. The counterfactuals of a New England secession are too great to explore here, but the implications are epic—not least because, unlike in 1861, the US government in 1814 was in no position to stop one. In the end, a combination of the American delegates’ obstinacy and a rapidly fading British desire to keep the nation on an expensive war footing solely to fight the Americans led the British to abandon their harsh terms.
In consequence, the Treaty of Ghent cemented the United States rather than destroyed it. Historians have long debated who truly won the war. However, what mattered most was that neither side managed a decisive victory. The Americans lacked the organization and national unity to win; the British lacked the will to wage an expensive, offensive war in North America. American inadequacy ensured that all of Canada would prosper as part of the British Empire, even though Upper Canada (now Ontario) had arguably closer links to the United States and was populated largely by economic migrants from the United States. British desire to avoid further confrontation enabled the United States to focus its attention on eliminating the other, and considerably weaker, obstacles to continental supremacy: the American Indians and the remnants of the Spanish Empire, who proved to be the real losers of the War of 1812 and the Treaty of Ghent.
Featured image: The Signing of the Treaty of Ghent, Christmas Eve, 1814, Amédée Forestier (1814). Public domain via Wikimedia Commons.
Today, 8 January, is the 80th birthday of Elvis Presley. Born to Vernon Elvis Presley and Gladys Love Presley (née Smith) in 1935, the ‘King of Rock and Roll’ left an indelible mark on American popular culture. In celebration, we present a brief extract from Elvis Presley: A Southern Life by Joel Williamson.
One photograph of the small Presley family captures the essence of their lives then and thereafter. Elvis, about three years old, is posed with Gladys and Vernon. Elvis is standing, and his parents are sitting on either side of him.
The exact date of the picture is unknown. Decades later it showed up in the photograph collection of the Official Elvis Presley Fan Club in Leicester, England. Elvis’s age in the photograph has been estimated through interviews with pediatricians, pediatric nurses, mothers, fathers, grandmothers, and grandfathers.
The blank, clean, slightly gray background is probably the concrete wall of the brand-new Lee County jail in Tupelo. Vernon is a prisoner, having been arrested on November 16, 1937, for forging a check. The county jail had recently been built by the Works Progress Administration (WPA), a New Deal project to employ the unemployed. Previously, county prisoners had been lodged in the run-down town jail. Only the white prisoners were moved to the new jail.
In the photograph, mother, child, and father are close, body to body as if huddled against a coming moment of separation. Gladys’s left arm reaches behind and across Elvis’s back to Vernon. Her open hand rests lightly on Vernon’s left shoulder, as if to hold him in gently, to affirm her presence with him. It is a hand that seeks to comfort, but its loose openness signals her powerlessness.
Vernon had been charged with forging a check on Orville Bean, the dairy farmer who was his landlord and employer. He had been arrested and arraigned during the fall term of criminal court. He pled not guilty, but he would not get a speedy trial. His plea came too late for him to be tried in the fall term of court. His case would have to wait for the spring term, which began six months later on Monday, May 23, 1938. Before the court convened that spring, the local papers were full of suggestions that the docket was overfull and that justice in Lee County must be meted out more rapidly than before.
Only days before Vernon’s case would have been tried, he changed his plea to guilty. Justice swiftly followed. On Wednesday, May 25, Judge Thomas H. Johnston sentenced Vernon to three years in the state penitentiary. He got no credit for the six months he had spent in the county jail. After sentencing came the anxious wait before the prison guards trucked him off to Parchman Farm.
On Saturday, May 28, Circuit Court Clerk Joe J. Kilgo wrote out the papers committing Vernon and eleven other convicts to Parchman. The twelve men waited in the county jail for the dreaded arrival of “Long Chain Charley,” a sergeant on the guard force at Parchman who circulated through the state collecting convicts for transport to prison. He always brought a long chain to which he shackled his prisoners to prevent their escape.
Six months in the county jail waiting for a trial had been bad enough, but there was always at least some hope for relief. Orville Bean might decide not to press charges against Vernon. Relatives and friends might somehow intervene. If it came to a trial, a good lawyer might rise to defend him and the jury might find him innocent. Having changed his plea to guilty, Vernon faced the certainty of serving at hard labor in a notoriously tough prison for three long years, years in which he could not come home every night to his wife and child in their little two-room wooden house in East Tupelo nor earn money to support them.
Sensing the pathos in the photograph does not require knowledge of its history. The bodies of the man and woman are tense with anxiety and dread. The child is anxious and confused. Vernon has put his hat on his head as if making ready to leave. He faces the camera, but his eyes cut to his left as if watching fearfully for someone or something to appear that he already hears. Gladys also stares to the left, her body stiff.
The little boy’s gaze is less focused, as if he were told to look at the camera but senses something he needs to see off to the left too. He wears bib overalls over a dark, long-sleeved shirt, charmingly trimmed with white cuffs and a white collar. Gladys is a talented seamstress. She wears a flower-print dress. Her dress, like Elvis’s shirt, is attractively set off by a collar of a different color. Elvis, like his father, wears a hat. His hat seems almost man-sized, cocked at a rakish angle on his round little head. His full cherubic lips are twisted down to the right as if he realizes that he should say something and set his jaw in some certain way to assert an attitude, but he doesn’t know what to say or how.
This is the earliest photograph of Elvis. The photographer was most likely a friend or a relative who had driven Gladys and Elvis a couple of miles over from their home in East Tupelo. It was a defining moment in the lives of Elvis, Gladys, and Vernon Presley, individually and collectively. The very fact of the visit, the camera, and the one photograph that has been preserved indicates that they understood that they were at a critical juncture in their lives. The petty and foolish crime that Vernon committed in the fall of 1937, when he was twenty-one, Gladys twenty-five, and Elvis less than three, deeply marked their lives.
Today, 8 January, would have been Elvis Presley’s 80th birthday. In remembrance of his fascinating life we’re sharing a slideshow of the beautiful images in Elvis Presley: A Southern Life by Joel Williamson. How did this Southern boy make it from Nashville and Vegas to Grafenwoehr and the White House?
Elvis with his parents, 1950. Joseph A. Tunzi/ JAT Publishing.
Elvis Presley with Scotty and Bill poster, Cape Girardeau, Mo., July 1955. Taken at the Country Music Hall of Fame in Nashville, Tennessee. Thomas Hawk, photographer. Available via Flickr.
Elvis on his way to fame at the Louisiana Hayride, 1956. LSU-Shreveport Archives and Special Collections.
An impromptu session with Jerry Lee Lewis, Carl Perkins, Elvis Presley, and Johnny Cash at the Sun Record Studios in Memphis, Tennessee, on December 4, 1956. Originally published in the Memphis Press-Scimitar. Courtesy of the Memphis and Shelby County Room, Memphis Public Library & Information Center.
Headline and 1956 photo from article on Elvis and Mae Axton, who wrote “Heartbreak Hotel,” just after the record sold 1 million copies, 1956. Published in the Memphis Commercial Appeal. Courtesy of the Memphis and Shelby County Room, Memphis Public Library & Information Center.
Elvis Presley in Grafenwoehr, 1958. Courtesy of U.S. Army Garrison Grafenwoehr.
The façade of Graceland in the late 1950s or early 1960s.
Elvis during his ’68 Comeback Special on NBC. Available via Joseph A. Tunzi/ JAT
Elvis and Priscilla’s wedding at the Aladdin Hotel, Las Vegas, May 1, 1967. Available via Getty.
Priscilla and Elvis at a dinner. Memphis and Shelby County Room, Available via Memphis Public Library & Information Center.
Elvis after a performance in Las Vegas, January or February 1970. Available via Joseph A. Tunzi/ JAT Publishing.
Elvis rehearsing in Las Vegas for his 1970 documentary, “Elvis: That’s the Way It Is.” Available via Joseph A. Tunzi/ JAT Publishing.
Elvis Presley meets President Richard Nixon on December 21, 1970. White House Chief Photographer Oliver F. Atkins. General Services Administration. National Archives and Records Service. Office of Presidential Libraries. Office of Presidential Papers. Collection RN-WHPO: White House Photo Office Collection (Nixon Administration), 01/20/1969–08/09/1974.
Marquee of the International Hotel, Las Vegas, 1971. Available via Joseph A. Tunzi/ JAT Publishing.
From just behind the gates at Graceland, a look at the mourners gathered on the day Elvis died, as the police try to hold back the crowds, August 16, 1977. Photographed by Saul Brown. Memphis and Shelby County Room, Memphis Public Library & Information Center.
Featured image credit: Headline and 1956 photo from article on Elvis and Mae Axton, who wrote “Heartbreak Hotel,” just after the record sold 1 million copies, 1956. Published in the Memphis Commercial Appeal. Courtesy of the Memphis and Shelby County Room, Memphis Public Library & Information Center.
As anyone knows who has looked at the newspapers over the festive season, 2015 is a bumper year for anniversaries: among them Magna Carta (800 years), Agincourt (600 years), and Waterloo (200 years). But it is January which sees the first of 2015’s major commemorations, for it is fifty years since Sir Winston Churchill died (on the 24th) and received a magnificent state funeral (on the 30th). As Churchill himself had earlier predicted, he died on just the same day as his father, Lord Randolph Churchill, had done, in 1895, exactly seventy years before.
The arrangements for Churchill’s funeral, codenamed ‘Operation Hope Not’, had long been in the planning, which meant that Churchill would receive the grandest obsequies afforded to any commoner since the funerals of Nelson and Wellington. And unlike Magna Carta or Agincourt or Waterloo, there are many of us still alive who can vividly remember those sad yet stirring events of half a century ago. My generation (I was born in 1950) grew up in what were, among other things, the sunset years of Churchillian apotheosis. They may, as Lord Moran’s diary makes searingly plain, have been sad and enfeebled years for Churchill himself, but they were also years of unprecedented acclaim and veneration. During the last decade of his life, he was the most famous man alive. On his ninetieth birthday, thousands of greeting cards were sent, addressed to ‘The Greatest Man in the World, London’, and they were all delivered to Churchill’s home. During his last days, when he lay dying, there were many who found it impossible to contemplate the world without him, just as Queen Victoria had earlier wondered, at the time of the Duke of Wellington’s death in 1852, how Britain would manage without him.
Like all such great ceremonial occasions, the funeral itself had many meanings, and for those of us who watched it on television, by turns enthralled and tearful, it has also left many memories. In one guise, it was the final act of homage to the man who had been described as ‘the saviour of his country’, and who had lived a life so full of years and achievement and honour and controversy that it was impossible to believe anyone in Britain would see his like again. But it was also, and in a rather different emotional and historical register, not only the last rites of the great man himself, but also a requiem for Britain as a great power. While Churchill might have saved his country during the Second World War, he could not preserve its global greatness thereafter. It was this sorrowful realization that had darkened his final years, just as his funeral, attended by so many world leaders and heads of state, was the last time that a British figure could command such global attention and recognition. (The turnout for Margaret Thatcher’s funeral, in 2013, was nothing like as illustrious.) These multiple meanings made the ceremonial the more moving, just as there were many episodes which made it unforgettable: the bearer party struggling and straining to carry the huge, lead-lined coffin up the steps of St Paul’s; Clement Attlee—Churchill’s former political adversary—old and frail, but determined to be there as one of the pallbearers, sitting on a chair outside the west door brought especially for him; the cranes of the London docks dipping in salute, as Churchill’s coffin was borne up the Thames from Tower Pier to Waterloo Station; and the funeral train, hauled by a steam engine of the Battle of Britain class, named Winston Churchill, steaming out of the station.
For many of us, the funeral was made the more memorable by Richard Dimbleby’s commentary. Already stricken with cancer, he must have known that this would be the last he would deliver for a great state occasion (he would, indeed, be dead before the year was out), and this awareness of his own impending mortality gave to his commentary a tone of tender resignation that he had never quite achieved before. As his son, Jonathan, would later observe in his biography of his father, ‘Richard Dimbleby’s public was Churchill’s public, and he had spoken their emotions.’
Fifty years on, the intensity of those emotions cannot be recovered, but many events have been planned to commemorate Churchill’s passing, and to ponder the nature of his legacy. Two years ago, a committee was put together, consisting of representatives of the many institutions and individuals that constitute the greater Churchill world, both in Britain and around the world, which it has been my privilege to chair. Significant events are planned for 30 January: in Parliament, where a wreath will be laid; on the River Thames, where Havengore, the ship that bore Churchill’s coffin, will retrace its journey; and at Westminster Abbey, where there will be a special evensong. It will be a moving and resonant day, and the prelude to many other events around the country and around the world. Will any other British prime minister be so vividly and gratefully remembered fifty years after his—or her—death?
Headline image credit: Franklin D. Roosevelt and Winston Churchill, New Bond Street, London. Sculpted by Lawrence Holofcener. Public domain via Wikimedia Commons.
Since Beethoven’s death on this day 188 years ago, debate has raged as to the cause of his deafness, generating scores of diagnoses ranging from measles to Paget’s disease. If deafness had been his only problem, diagnosing the disorder might have been easier, although his ear problem was of a strange character no longer seen. It began ever so surreptitiously and took over two decades to complete its destruction of Beethoven’s hearing.