Blog Posts Tagged with: us, Most Recent at Top
Results 1 - 25 of 147
1. Inequality in democracies: interest groups and redistribution

We are by now more or less aware that income inequality in the US and in most of the rich OECD world is higher today than it was 30 to 40 years ago. Interpretations of what caused the rise vary, but the increase itself has been persistent, holding through both expansionary and contractionary economic times. One might even say it has become a stylized fact of the developed world (with some notable exceptions). The question on everyone's lips is: how can a democracy produce rising inequality?

The post Inequality in democracies: interest groups and redistribution appeared first on OUPblog.

0 Comments on Inequality in democracies: interest groups and redistribution as of 2/25/2015 4:55:00 AM
2. CBTV: ‘Journey of Two’ by Joshua Mulligan

Two best friends wake up and start the day.

3. Animated Fragments #26

Animated Fragments is our semi-regular feature of animation tests, experiments, micro-shorts, and other bits of cartoon flotsam that don't fit into other categories. To view the previous 25 installments, go to the Fragments archive.

4. Animated Fragments #24

Interesting animation is being produced everywhere you look nowadays. This evening, we’re delighted to present animated fragments from six different countries: Chile, Iran, UK, US, Japan and Spain. For more, visit the Animated Fragments archive.

“Lollypop Man—The Escape” (work-in-progress) by Estudio Pintamonos (Chile)

“Bazar” by Mehdi Alibeygi (Iran)

“Time” by Max Halley (UK)

Hand-drawn development animation for Wreck-It Ralph to “explore animation possibilities before [Gene's] model and rig were finalised” by Sarah Airriess (US)

“Rithm loops” for an iPhone/iPad app by AllaKinda (Spain)

“Against” by Yukie Nakauchi (Japan)

5. Weekend Groove: Music Videos from Poland, US, The Netherlands, and UK

Our semi-regular roundup of interesting, creative and original animated music videos.

“Birthday” directed by Renata Gąsiorowska (Poland)

Music video for Alphabets Heaven.

“The Mystery of You” directed by Eric Deuel (US)

Music video for Spencer Day.

“Been Too Long” (“Duurt te Lang”) directed by Job, Joris & Marieke (The Netherlands)

Music video for Fit.

“G.O.D.” directed by Tom Bunker and Nicos Livesey (UK)

Music video for Binary.
Lead Animators (2D & 3D): Blanca Martinez de Rituerto & Joe Sparrow
Secondary 2D Animation: Andy Baker, Tom Bunker, Nicos Livesey

“Joy” directed by Hayley Morris (US)

Music video for Iron and Wine.
A behind-the-scenes video is also available.
Director/Animator: Hayley Morris
Fabricators: Hayley Morris, Denise Hauser and Randy Bretzin
Color Correction: Evan Kultangwatana
Model for watercolor animation: Louise Sheldon

6. Weekend Groove: Music Videos from US, UK, Spain and Belgium

Our weekly roundup of the most interesting, creative and original animated music videos.

“Latter Days” directed by Matt Christensen (US)

Music video for The Middle Eight. Go to Matt’s website for a behind-the-scenes photo album.

“We Can Be Ghosts Now” directed by Tom Jobbins (UK)

Music video for Hiatus feat. Shura.
Art Director: John Jobe Reynolds
Cinematographer: Matthias Pilz
Colorist: Danny Atkinson
Compositor: Jonathan Topf
Editor: Robert Mila

“Magdalena” directed by Lucas Borras (Spain/US)

Music video for Quantic & Alice Russell.

“Separated” directed by Mark Borgions (Belgium)

Music video for Stan Lee Cole.

7. Weekend Groove: Music Videos from Poland, US, UK, and Australia

The weekend is almost over, but it’s not too late to check out these quality animated music videos that have recently come to my attention.

“Katachi” directed by Kijek/Adamski (Poland)

Music video for Shugo Tokumaru. Watch the making-of video.

“the light that died in my arms” directed by Alan Foreman (US)

Music video for Ten Minute Turns. Song written and recorded by Alan Foreman and Roger Paul Mason.

“Easy” directed by Wesley Louis and Tim McCourt (UK)

Music video for Mat Zo and Porter Robinson. Credits:

Directed By Louis & McCourt
Art Direction by Bjorn Aschim
Animators: Jonathan ‘Djob’ Nkondo, James Duveen, Sam Taylor, Wesley Louis, Tim McCourt
Backgrounds and Layouts: Bjorn Aschim, Mike Shorten
Compositing: Sam Taylor, Jonathan Topf
3D VFX Directing by Jonathan Topf
Graphic Design by Hisako Nakagawa
Producers: Jack Newman, Drew O’Neill
Produced by Bullion

“Time to Go” directed by Darcy Prendergast and Seamus Spilsbury (Australia)

Music video for Wax Tailor produced by Oh Yeah Wow. Credits:

Animators: Sam Lewis, Mike Greaney, Seamus Spilsbury, Darcy Prendergast
VFX supervisor: Josh Thomas
Assistant animators: Alexandra Calisto de Carvalho, Joel Williams
Compositors: Josh Thomas, Jeremy Blode, James Bailey, Alexandra Calisto de Carvalho, Keith Crawford, Dan Steen
Crochet sculptor: Julie Ramsden
Colour grade: Dan Stonehouse, Crayon

8. African American lives

February marks a month of remembrance for Black History in the United States. It is a time to reflect on the events that have enabled freedom and equality for African Americans, and a time to celebrate the achievements and contributions they have made to the nation.

Dr. Carter Woodson, an advocate for black history studies, initially created “Negro History Week” between the birthdays of two great men who strove to influence the lives of African Americans: Frederick Douglass and Abraham Lincoln. This celebration was then expanded to the month of February and became Black History Month. Find out more about important African American lives with our quiz.

Rev. Ralph David Abernathy speaks at Nat’l. Press Club luncheon. Photo by Warren K. Leffler. 1968. Library of Congress.


The landmark American National Biography offers portraits of more than 18,700 men & women — from all eras and walks of life — whose lives have shaped the nation. The American National Biography is the first biographical resource of this scope to be published in more than sixty years.


The post African American lives appeared first on OUPblog.

0 Comments on African American lives as of 2/24/2013 4:05:00 AM
9. Jazz lives in the African American National Biography

By Scott Yanow


When I was approached by the good folks at Oxford University Press to write some entries on jazz artists, I noticed that while the biggest names (Louis Armstrong, Duke Ellington, Charlie Parker, Miles Davis, John Coltrane, etc.) were already covered, many other artists were also deserving of entries. There were several qualities that I looked for in musicians before suggesting that they be written about. Each musician had to have a distinctive sound (always a prerequisite before any artist is considered a significant jazz musician), a strong body of work, and recordings that sound enjoyable today. It did not matter if the musician’s prime was in the 1920s or today. If their recordings still sounded good, they were eligible to be given prestigious entries in the African American National Biography.

Some of the entries included in the February update to the Oxford African American Studies Center are veteran singers Ernestine Anderson, Ernie Andrews, and Jon Hendricks; trumpet legends Harry “Sweets” Edison, Kenny Dorham, and Art Farmer; and a few giants of today, including pianist Kenny Barron, trumpeter Roy Hargrove, and clarinetist Don Byron.


In each case, in addition to including the musicians’ basic biographical information, key associations, and recordings, I have included a few sentences that place each artist in their historic perspective, talking about how they fit into their era, describing their style, and discussing their accomplishments. Some musicians had only a brief but important prime period, but there is a surprising number of artists whose careers lasted over 50 years. In the case of Benny Carter, the alto saxophonist/arranger was in his musical prime for a remarkable 70 years, still sounding great when he retired after his 90th birthday.

Jazz, whether from 90 years ago or today, has always overflowed with exciting talents. While jazz history books often simplify events, making it seem as if there were only a handful of giants, the number of jazz greats is actually in the hundreds. There was more to the 1920s than Louis Armstrong, more to the swing era than Benny Goodman and Glenn Miller, and more to the classic bebop era than Charlie Parker and Dizzy Gillespie. For example, while Duke Ellington is justly celebrated, during the 49 years that he led his orchestra, he often had as many as ten major soloists in his band at one time, all of whom had colorful and interesting lives.

Because jazz has had such a rich history, it is easy for reference books and encyclopedias to overlook the very viable scene of today. The music did not stop with the death of John Coltrane in 1967 or the end of the fusion years in the late 1970s. Because the evolution of jazz was so rapid between 1920 and 1980, continuing in almost a straight line as the music became freer and more advanced, it is easy (but inaccurate) to say that the music has not continued evolving. What has happened during the past 35 years is that instead of developing in one basic way, the music evolved in a number of directions. The music world became smaller and many artists utilized aspects of World and folk music to create new types of “fusions.” Some musicians explored earlier styles in creative ways, ranging from 1920s jazz to hard bop. The avant-garde or free jazz scene introduced many new musicians, often on small label releases. And some of the most adventurous players combined elements of past styles — such as utilizing plunger mutes on horns or engaging in collective improvisations — to create something altogether new.

While many veteran listeners might call one period or another jazz’s “golden age,” the truth is that the music has been in its prime since around 1920 (when records became more widely available) and is still in its golden age today. While jazz deserves a much larger audience, there is no shortage of creative young musicians of all styles and approaches on the scene today. The future of jazz is quite bright and the African American National Biography’s many entries on jazz greats reflect that optimism.

Scott Yanow is the author of eleven books on jazz, including The Great Jazz Guitarists, The Jazz Singers, Trumpet Kings, Jazz On Record 1917-76, and Jazz On Film.

The Oxford African American Studies Center combines the authority of carefully edited reference works with sophisticated technology to create the most comprehensive collection of scholarship available online to focus on the lives and events which have shaped African American and African history and culture. It provides students, scholars and librarians with more than 10,000 articles by top scholars in the field.

Image Credit: Kenny Barron 2001, Munich/Germany. Photo by Sven.petersen, public domain via Wikimedia Commons

The post Jazz lives in the African American National Biography appeared first on OUPblog.

0 Comments on Jazz lives in the African American National Biography as of 2/19/2013 3:57:00 AM
10. Five books for Presidents Day

By John Ferling


Picking out five books on the founding of the nation, and its leaders, is not an easy task. I could easily have listed twenty-five that were important to me. But here goes:

Merrill Jensen, The Founding of a Nation: A History of the American Revolution, 1763-1776 (New York, Oxford University Press, 1968)

This book remains the best single-volume history of the American Revolution through the Declaration of Independence. This isn’t flag-waving, but a warts-and-all treatment in which Jensen demonstrates that many of the now-revered Founders feared and resisted the insurgency that led to American independence.

Merrill Jensen, The American Revolution Within America (New York, New York University Press, 1974)

Obviously I admire the work of Merrill Jensen. Lectures delivered to university audiences quite often are not especially readable, but this collection of three talks that he delivered at New York University is a wonderful read. Jensen pulls no punches. He shows what some Founders sought to gain from the Revolution and what others hoped to prevent, and he makes clear that those who wished (“conspired” might be a better word) to stop the political and social changes unleashed by the American Revolution were in the forefront of those who wrote and ratified the US Constitution.

Gordon Wood, The Radicalism of the American Revolution (New York, Knopf, 1992)

While this book is far from a complete history of the American Revolution (and it never pretended to be), it chronicles how America was changed by the Revolution. I think the first eighty or so pages were among the best ever written in detailing how people thought and behaved prior to the American Revolution. I always asked the students in my introductory US History survey classes to read that section of the book.

James Thomas Flexner, George Washington and the New Nation, 1783-1793 (Boston, Little Brown, 1970) and George Washington: Anguish and Farewell, 1793-1799 (Boston, Little Brown, 1972)

Alright, I cheated. There are two books here, bringing my total number of books to six. Flexner was a popular writer who produced a wonderful four volume life of Washington in the 1960s and 1970s. These two volumes chronicle Washington following the War of Independence, and they offer a rich and highly readable account of Washington’s presidency and the nearly three years left to him following his years as chief executive.

Peter Onuf, ed., Jeffersonian Legacies (Charlottesville, University Press of Virginia, 1993)

This collection of fifteen original essays by assorted scholars scrutinizes the nooks and crannies of Thomas Jefferson’s life and thought. As in any such collection, some essays are better than others, but on the whole this is a good starting point for understanding Jefferson and what scholars have thought of him. Though these essays were published twenty years ago, most remain surprisingly fresh and modern.

John Ferling is Professor Emeritus of History at the University of West Georgia. He is a leading authority on late 18th and early 19th century American history. His new book, Jefferson and Hamilton: The Rivalry that Forged a Nation, will be published in October. He is the author of many books, including Independence, The Ascent of George Washington, Almost a Miracle, Setting the World Ablaze, and A Leap in the Dark. He lives in Atlanta, Georgia.


The post Five books for Presidents Day appeared first on OUPblog.

0 Comments on Five books for Presidents Day as of 2/18/2013 12:07:00 PM
11. The presidents that time forgot

By Michael J. Gerhardt


If you think that Barack Obama can only learn how to build a lasting legacy from our most revered presidents, like Abraham Lincoln, you should think twice. I am sure that Obama knows what great presidents did that made them great. He can also learn, however, from some once popular presidents who are now forgotten because of mistakes they made or circumstances that helped to bury their legacies.

Enduring presidential legacies require presidents to do things and express constitutional visions that stand the test of time. To be lasting, presidential legacies need to inspire subsequent presidents and generations to build on them. Without such inspiration and investment, legacies are lost and eventually forgotten.

Consider, for example, James Monroe, the only man besides Obama to be the third president in a row to win reelection. Once wildly popular, he is now largely forgotten. His first term was known as the era of good feelings because it coincided with the demise of any viable opposition party. When he was reelected in 1820, he won every electoral vote but one. Yet most Americans know nothing about his presidency except perhaps the “Monroe Doctrine” supporting American intervention to protect the Americas from European interference. The doctrine endures because subsequent presidents have adhered to it.

Monroe’s record is largely forgotten for three reasons: First, his legislative achievements eroded over time. He signed two of the most significant laws enacted in the nineteenth century — the Missouri Compromise, restricting slavery in the Missouri territory, and the Tenure of Office Act, which restricted the terms of certain executive branch officials. But subsequent presidents differed over both laws’ constitutionality and tried to repeal or amend them. Eventually, the Supreme Court struck them both down.

Second, Monroe had no distinctive vision of the presidency or Constitution. He entered office as the fourth and last member of the Virginia dynasty of presidents. He had nothing to offer that could match the vision and stature of his three predecessors from Virginia — Washington, Jefferson, and Madison.  Even with no opposition party, he was unsure where to lead the country. His last two years in office were so fractious, they became known as the era of bad feelings.

Third, Monroe had no close political ally to follow him in office. While he had been his mentor Madison’s logical successor, Monroe had no natural heir. Subsequent presidents, including John Quincy Adams who had been his Secretary of State, felt little fidelity to his legacy.

If President Obama wants to avoid Monroe’s mistakes, he must plan for the future. He should consider whom he would like to follow him and which of his legislative initiatives Republicans might support. If Obama stands on the sidelines in the next election or fails to produce significant bipartisan achievements in his second term, he risks having his successor(s) bury his legacy.

Grover Cleveland, another two-term president, is more forgotten than Monroe. If he is remembered at all, it is as the only man to serve two non-consecutive terms as president. He was the only Democrat elected president in the second half of the nineteenth century and the only president other than Franklin Roosevelt to win the most popular votes in three consecutive presidential elections.

Yet Cleveland’s record is forgotten because he blocked rather than built things. He devoted his first term to vetoing laws he thought favored special interests. He cast more vetoes than any president except FDR, and in his second term the Senate retaliated against his efforts to remove executive officials and create vacancies to fill, stalling hundreds of his nominations.

Cleveland successfully appealed to the American people to break the impasse with the Senate, but his constant clashes with Congress took their toll. In his second term, his disdain for Congress, bullying its members to do what he wanted, and stubbornness prevented him from reaching any meaningful accord to deal with the worst economic downturn before the Great Depression of the 1930s. While Cleveland resisted building bridges to Republicans in Congress, Obama still has time to build some.

Finally, Calvin Coolidge had the vision and rhetoric required for an enduring legacy, but his results failed the test of time. He was virtually unknown when he became Republican Warren Harding’s Vice-President. But when Harding died, Coolidge inherited a scandal-ridden administration. He worked methodically with Congress to root out the corruption in the administration, established regular press briefings, and easily won the 1924 presidential election. Over the next four years, he signed the most significant federal disaster relief bill until Hurricane Katrina and the first federal regulations of broadcasting and aviation. He supported establishing the World Court and the Kellogg-Briand Pact, which outlawed war.

Coolidge’s vision had wide appeal. His conviction that the business of America was business still resonates among many Republicans, but he quickly squandered his good will with the Republican-led Senate when, shortly after his inauguration in 1925, he insisted on re-nominating Charles Warren as Attorney General after it had rejected his nomination. Coolidge could have easily won reelection, but he lost interest in politics after his son died in 1924. He did not help his Secretary of Commerce Herbert Hoover win the presidency in 1928 and said nothing as the economy lapsed into the Great Depression. His penchant for silence, for which he was widely ridiculed, and the failures of his international initiatives and economic policies destroyed his legacy.

As Obama enters his second term, he cannot stand above the fray like Monroe and Coolidge. He must lead the nation through it. He must work with Congress rather than become mired in squabbles with it as did Cleveland, whose contempt for Congress and limited vision made grand bargains impossible. On many issues, including gay rights and resolving the debt ceiling, President Obama’s detachment has allowed him to be perceived as having been led rather than leading. He still has a chance to lead through his words and actions and define his legacy as something more than his having been the first African-American elected president or the controversy over the individual mandate in the Affordable Care Act. Unlike forgotten presidents, he still has the means to construct a legacy Americans will value and remember, but to avoid their fates he must use them — now.

Michael Gerhardt is Samuel Ashe Distinguished Professor of Constitutional Law at the University of North Carolina, Chapel Hill. A nationally recognized authority on constitutional conflicts, he has testified in several Supreme Court confirmation hearings, and has published several books, including The Forgotten Presidents: Their Untold Constitutional Legacy.

Image credit: Images from The Forgotten Presidents.

The post The presidents that time forgot appeared first on OUPblog.

0 Comments on The presidents that time forgot as of 2/18/2013 6:47:00 AM
12. A quiz on Prohibition

Old Man Prohibition hung in effigy from a flagpole as New York celebrated the advent of Repeal after years of bootleg booze. Source: NYPL.

How much do you know about the era of Prohibition, when gangsters rose to power and bathtub gin became a staple? 2013 marks the 80th anniversary of the repeal of the wildly unpopular 18th Amendment, initiated on 17 February 1933 when the Blaine Act passed the United States Senate. To celebrate, test your knowledge with this quiz below, filled with tidbits of 1920s trivia gleaned from The Oxford Encyclopedia of Food and Drink in America: Second Edition.


The second edition of The Oxford Encyclopedia of Food and Drink in America thoroughly updates the original, award-winning title, while capturing the shifting American perspective on food and ensuring that this title is the most authoritative, current reference work on American cuisine. Editor Andrew F. Smith teaches culinary history and professional food writing at The New School University in Manhattan. He serves as a consultant to several food television productions (airing on the History Channel and the Food Network), and is the General Editor for the University of Illinois Press’ Food Series. He has written several books on food, including The Tomato in America, Pure Ketchup, and Popped Culture: A Social History of Popcorn in America. The Oxford Encyclopedia of Food and Drink is also available on Oxford Reference.


The post A quiz on Prohibition appeared first on OUPblog.

0 Comments on A quiz on Prohibition as of 2/17/2013 6:53:00 AM
13. The bombing of Monte Cassino

On the 15th of February 1944, Allied planes bombed the abbey at Monte Cassino as part of the extended Allied campaign in Italy. St. Benedict of Nursia established his first monastery, the source of the Benedictine Order, here around 529. Over four months, the Battle of Monte Cassino would inflict some 200,000 casualties and rank as one of the most horrific battles of World War Two. This excerpt from Peter Caddick-Adams’s Monte Cassino: Ten Armies in Hell recounts the bombing.

On the afternoon of 14 February, Allied artillery shells scattered leaflets containing a printed warning in Italian and English of the abbey’s impending destruction. These were produced by the same US Fifth Army propaganda unit that normally peddled surrender leaflets and devised psychological warfare messages. The monks negotiated a safe passage through the German lines for 16 February — too late, as it turned out. American Harold Bond, of the 36th Texan Division, remembered  the texture of the ‘honey-coloured Travertine stone’ of the abbey that fine Tuesday morning, and how ‘the Germans seemed to sense that something important was about to happen for they were strangely quiet’. Journalist Christopher Buckley wrote of ‘the cold blue on that late winter morning’ as formations of Flying Fortresses ‘flew in perfect formation with that arrogant dignity which distinguishes bomber aircraft as they set out upon a sortie’. John Buckeridge of 1/Royal Sussex, up on Snakeshead, recalled his surprise as the air filled with the drone of engines and waves of silver bombers, the sun glinting off their bellies, hove into view. His surprise turned to concern when he saw their bomb doors open — as far as his battalion was concerned the raid was not due for at least another day.

Brigadier Lovett of 7th Indian Brigade was furious at the lack of warning: ‘I was called on the blower and told that the bombers would be over in fifteen minutes… even as I spoke the roar [of aircraft] drowned my voice as the first shower of eggs [bombs] came down.’ At the HQ of the 4/16th Punjabis, the adjutant wrote: ‘We went to the door of the command post and gazed up… There we saw the white trails of many high-level bombers. Our first thought was that they were the enemy. Then somebody said, “Flying Fortresses.” There followed the whistle, swish and blast as the first flights struck at the monastery.’ The first formation released their cargo over the abbey. ‘We could see them fall, looking at this distance like little black stones, and then the ground all around us shook with gigantic shocks as they exploded,’ wrote Harold Bond. ‘Where the abbey had been there was only a huge cloud of smoke and dust which concealed the entire hilltop.’

The aircraft which committed the deed came from the massive resources of the US Fifteenth and Twelfth Air Forces (3,876 planes, including transports and those of the RAF in theatre), whose heavy and medium bombardment wings were based predominantly on two dozen temporary airstrips around Foggia in southern Italy (by comparison, a Luftwaffe return of aircraft numbers in Italy on 31 January revealed 474 fighters, bombers and reconnaissance aircraft in theatre, of which 224 were serviceable). Less than an hour’s flying time from Cassino, the Foggia airfields were primitive, mostly grass affairs, covered with Pierced Steel Planking runways, with all offices, accommodation and other facilities under canvas, or quickly constructed out of wood. In mid-winter the buildings and tents were wet and freezing, and often the runways were swamped with oceans of mud which inhibited  flying. Among the personnel stationed there was Joseph Heller, whose famous novel Catch-22 was based on the surreal no-win-situation chaos of Heller’s 488th Bombardment Squadron, 340th Bomb Group, Twelfth Air Force, with whom he flew sixty combat missions as a bombardier (bomb-aimer) in B-25 Mitchells.

After the first wave of aircraft struck Cassino monastery, a Sikh company of 4/16th Punjabis fell back, understandably, and a German wireless message was heard to announce: ‘Indian troops with turbans are retiring’. Bond and his friends were astonished when, ‘now and again, between the waves of bombers, a wind would blow the smoke away, and to our surprise we saw the gigantic walls of the abbey still stood’. Captain Rupert Clarke, Alexander’s ADC, was watching with his boss. ‘Alex and I were lying out on the ground about 3,000 yards from Cassino. As I watched the bombers, I saw bomb doors open and bombs began to fall well short of the target.’ Back at the 4/16th Punjabis, ‘almost before the ground ceased to shake the telephones were ringing. One of our companies was within 300 yards of the target and the others within 800 yards; all had received a plastering and were asking questions with some asperity.’ Later, when a formation of B-25 medium bombers passed over, Buckley noticed, ‘a bright flame, such as a giant might have produced by striking titanic matches on the mountain-side, spurted swiftly upwards at half a dozen points. Then a pillar of smoke 500 feet high broke upwards into the blue. For nearly five minutes it hung around the building, thinning gradually upwards.’

Nila Kantan of the Royal Indian Army Service Corps was no longer driving trucks, as no vehicles could get up to the 4th Indian Division’s positions overlooking the abbey, so he found himself portering instead. ‘On our shoulders we carried all the things up the hill; the gradient was one in three, and we had to go almost on all fours. I was watching from our hill as all the bombers went in and unloaded their bombs; soon after, our guns blasted the hill, and ruined the monastery.’ For Harold Bond, the end was the strangest, ‘then nothing happened. The smoke and dust slowly drifted away, showing the crumbled masonry with fragments of walls still standing, and men in their foxholes talked with each other about the show they had just seen, but the battlefield remained relatively quiet.’

The abbey had been literally ruined, not obliterated as Freyberg had required, and was now one vast mountain of rubble with many walls still remaining up to a height of forty or more feet, resembling the ‘dead teeth’ General John K. Cannon of the USAAF wanted to remove; ironically those of the north-west corner (the future target of all ground assaults through the hills) remained intact. These the Germans, sheltering from the smaller bombs, immediately occupied and turned into excellent defensive positions, ready to slaughter the 4th Indian Division when they belatedly attacked. As Brigadier Kippenberger observed: ‘Whatever had been the position before, there was no doubt  that the enemy was now entitled to garrison the ruins, the breaches in the fifteen-foot-thick walls were nowhere complete, and we wondered whether we had gained anything.’

Peter Caddick-Adams is a Lecturer in Military and Security Studies at the United Kingdom’s Defence Academy, and author of Monte Cassino: Ten Armies in Hell and Monty and Rommel: Parallel Lives. He holds the rank of major in the British Territorial Army and has served with U.S. forces in Bosnia, Iraq, and Afghanistan.

Image credits: (1) Source: U.S. Air Force; (2) Bundesarchiv, Bild 146-2005-0004 / Wittke / CC-BY-SA; (3) Bundesarchiv, Bild 183-J26131 / Enz / CC-BY-SA

The post The bombing of Monte Cassino appeared first on OUPblog.

0 Comments on The bombing of Monte Cassino as of 2/15/2013 12:11:00 PM
14. Who was Dorothy Wrinch?

Remembered today for her much-publicized feud with Linus Pauling over the shape of proteins, known as “the cyclol controversy,” Dorothy Wrinch made essential contributions to the fields of Darwinism, probability and statistics, quantum mechanics, x-ray diffraction, and computer science. The first woman to receive a doctor of science degree from Oxford University, she developed an understanding of the science of crystals and the ever-changing notion of symmetry that has been fundamental to science.

We sat down with Marjorie Senechal, author of I Died for Beauty: Dorothy Wrinch and the Cultures of Science, to explore the life of this brilliant and controversial figure.

Who was Dorothy Wrinch?

Dorothy Wrinch was a British mathematician and a student of Bertrand Russell. An exuberant, exasperating personality, she knew no boundaries, academic or otherwise. She sowed fertile seeds in many fields of science — philosophy, mathematics, seismology, probability, genetics, protein chemistry, crystallography.

What is she remembered for?

Unfortunately, she’s mainly remembered for her battle with the chemist Linus Pauling. Dorothy proposed the first-ever model for protein architecture, provoking a world-class controversy in scientific circles. Linus led her opponents; few noticed that his arguments were as wrong as her model was.

Why did he attack her research and career with such ferocity?

In those days before scientific imaging, scientists imagined. Outsized personalities, fierce ambitions, and cultural misunderstandings also played a role. And gender bias: Dorothy didn’t know her place. She didn’t suffer critics gratefully, or fools gladly. On a deeper level, the fight was philosophical. Imagination and experiment, beauty and truth are entangled inseparably, then and now.

Linus won two Nobel prizes. What became of Dorothy?

Dorothy, a single mother, came to the United States with her daughter at the beginning of World War II, and eventually settled in Massachusetts; she taught at Smith College for many years and wrote scientific books and papers. But her career never recovered. I wrote this book to find out why.

Marjorie Senechal is the Louise Wolff Kahn Professor Emerita in Mathematics and History of Science and Technology, Smith College, and Co-Editor of The Mathematical Intelligencer. She is the author of I Died for Beauty: Dorothy Wrinch and the Cultures of Science.


The post Who was Dorothy Wrinch? appeared first on OUPblog.

0 Comments on Who was Dorothy Wrinch? as of 2/11/2013 7:20:00 AM
15. A history of smuggling in America

Today America is the world’s leading anti-smuggling crusader. While honorable, that title is also an ironic one when you consider America’s own long history of… smuggling. Our illicit imports have ranged from West Indies molasses and Dutch gunpowder in the 18th century, to British industrial technologies and African slaves in the 19th century, to French condoms and Canadian booze in the early 20th century, to Mexican workers and Colombian cocaine in the modern era. Simply put, America was built by smugglers.

In this video from Brown University’s Watson Institute for International Studies, Peter Andreas, author of Smuggler Nation: How Illicit Trade Made America, explains America’s long relationship with smuggling and illicit trade.


Peter Andreas is a professor in the Department of Political Science and the Watson Institute for International Studies at Brown University. He was previously an Academy Scholar at Harvard University, a Research Fellow at the Brookings Institution, and an SSRC-MacArthur Foundation Fellow on International Peace and Security. Andreas has written numerous books, including Smuggler Nation: How Illicit Trade Made America, published widely in scholarly journals and policy magazines, presented Congressional testimony, written op-eds for major newspapers, and provided frequent media commentary.


The post A history of smuggling in America appeared first on OUPblog.

0 Comments on A history of smuggling in America as of 2/5/2013 9:16:00 AM
16. Weekend Groove: Music Videos from Japan, US and UK

“Cirrus” directed by Cyriak (UK)

“Yamasuki Yamazaki” directed by Shishi Yamazaki (Japan)

“Tourniquet” directed by Jordan Bruner (US)

Music video for Hem. Animated by Greg Lytle and Jordan Bruner.

17. “Third Nation” along the US-Mexico border

By Michael Dear


Not long ago, I passed a roadside sign in New Mexico which read: “Es una frontera, no una barrera / It’s a border, not a barrier.” This got me thinking about the nature of the international boundary line separating the US from Mexico. The sign’s message seemed accurate, but what exactly did it mean?

On 2 February 1848, a ‘Treaty of Peace, Friendship, Limits and Settlement’ was signed at Guadalupe Hidalgo, thus terminating the Mexican-American War. The conflict was ostensibly about securing the boundary of the recently-annexed state of Texas, but it was clear from the outset that US President Polk’s ambition was territorial expansion. As consequences of the Treaty, Mexico gained peace and $15 million, but eventually lost one-half of its territory; the US achieved the largest land grab in its history through a war that many (including Ulysses S. Grant) regarded as dishonorable.

In recent years, I’ve traveled the entire length of the 2,000-mile US-Mexico border many times, on both sides. There are so many unexpected and inspiring places! Mutual interdependence has always been the hallmark of cross-border communities. Border people are staunchly independent and composed of many cultures with mixed loyalties. They get along perfectly well with people on the other side, but remain distrustful of far-distant national capitals. The border states are among the fastest-growing regions in both countries — places of economic dynamism, teeming contradiction, and vibrant political and cultural change.

A small fence separates densely populated Tijuana, Mexico, right, from the United States in the Border Patrol’s San Diego Sector.

Yet the border is also a place of enormous tension associated with undocumented migration and drug wars. Neither of these problems has its source in the borderlands, but border communities are where the burdens of enforcement are geographically concentrated. It’s because of our country’s obsession with security, immigration, and drugs that after 9/11 the US built massive fortifications between the two nations, and in so doing, threatened the well-being of cross-border communities.

I call the spaces between Mexico and the US a ‘third nation.’ It’s not a sovereign state, I realize, but it contains many of the elements that would otherwise warrant that title, such as a shared identity, common history, and joint traditions. Border dwellers on both sides readily assert that they have more in common with each other than with their host nations. People describe themselves as ‘transborder citizens.’ One man who crossed daily, living and working on both sides, told me: “I forget which side of the border I’m on.” The boundary line is a connective membrane, not a separation. It’s easy to reimagine these bi-national communities as a ‘third nation’ slotted snugly in the space between two countries. (The existing Tohono O’Odham Indian Nation already extends across the borderline in the states of Arizona and Sonora.)

But there is more to the third nation than a cognitive awareness. Both sides are also deeply connected through trade, family, leisure, shopping, culture, and legal connections. Border-dwellers’ lives are intimately connected by their everyday material lives, and buttressed by innumerable formal and informal institutional arrangements (NAFTA, for example, as well as water and environmental conservation agreements). Continuity and connectivity across the border line existed for centuries before the border was put in place, even back to the Spanish colonial era and prehistoric Mesoamerican times.

Do the new fortifications built by the US government since 9/11 pose a threat to the well-being of borderland communities? Certainly there have been interruptions to cross-border lives: crossing times have increased; the number of US Border Patrol ‘boots on ground’ has doubled; and a new ‘gulag’ of detention centers has been instituted to apprehend, prosecute and deport all undocumented migrants. But trade has continued to increase, and cross-border lives are undiminished. US governments are opening up new and expanded border crossing facilities (known as ports of entry) at record levels. Gas prices in Mexican border towns are tied to the cost of gasoline on the other side. The third nation is essential to the prosperity of both countries.

So yes, the roadside sign in New Mexico was correct. The line between Mexico and the US is a border in the geopolitical sense, but it is submerged by communities that do not regard it as a barrier to centuries-old cross-border intercourse. The international boundary line is only just over a century-and-a-half old. Historically, there was no barrier; and the border is not a barrier nowadays.

The walls between Mexico and the US will come down. Walls always do. The Berlin Wall was torn down virtually overnight, its fragments sold as souvenirs of a calamitous Cold War. The Great Wall of China was transformed into a global tourist attraction. Left untended, the US-Mexico Wall will collapse under the combined assault of avid recyclers, souvenir hunters, and local residents offended by its mere presence.

As the US prepares once again to consider immigration reform, let the focus this time be on immigration and integration. The framers of the Treaty of Guadalupe Hidalgo were charged with making the US-Mexico border, but on this anniversary of the Treaty’s signing, we may best honor the past by exploring a future when the border no longer exists. Learning from the lives of cross-border communities in the third nation would be an appropriate place to begin.

Michael Dear is a professor in the College of Environmental Design at the University of California, Berkeley, and author of Why Walls Won’t Work: Repairing the US-Mexico Divide (Oxford University Press).


The post “Third Nation” along the US-Mexico border appeared first on OUPblog.

0 Comments on “Third Nation” along the US-Mexico border as of 2/2/2013 8:48:00 AM
18. Burrowing into Punxsutawney Phil’s hometown data

In the United States, a German belief about the badger (applied in Switzerland to the wolf) has been transferred to the woodchuck, better known as the groundhog: on Candlemas he breaks his hibernation in order to observe the weather; if he can see his shadow he returns to his slumbers for six weeks, but if it rains he stays up and about, since winter will soon be over. This has earned Candlemas the name of ‘Groundhog Day’. In Quarryville, Lancaster County, Pa., a Slumbering Groundhog Lodge was formed, whose members, wearing silk hats and carrying canes, went out in search of a groundhog burrow; on finding one they watched its inhabitant’s conduct and reported back. Of twenty observations recorded, eight prognostications proved true, seven false, and five were indeterminate. The ritual is now carried on at Punxsutawney, Pa., where the weather prophet has been named Punxsutawney Phil. (The Oxford Companion to the Year)

By Sydney Beveridge


Every February Second, people across Pennsylvania and the world look to a famous rodent to answer the question—when will spring come?

For over 120 years, Punxsutawney Phil Soweby (Punxsutawney Phil for short) has offered his predictions, based on whether he sees his shadow (more winter) or not (an early spring).

The first official Groundhog Day celebration took place in 1887 and Phil has gone on to star in a blockbuster film, dominate the early February news cycle, and even appear on Oprah. (He also has his own Beanie Baby and his own flower.)

In addition to weather predictions, Phil also loves data, and while people think he is hibernating, he is actually conducting demographic analysis. As a Social Explorer subscriber, he used the site’s mapping and reporting tools to look at the composition of his hometown.


Punxsutawney, PA, located outside of Pittsburgh, is part of Jefferson County. Examining Census data from 1890, Phil learned that the population was 44,405 around the time of his first predictions. While the rest of the nation was becoming more urban, Jefferson County remained more rural with only one eighth of the population living in places with 2,500 people or more (compared to nearly half statewide and more than a third in the US).

Many Jefferson residents worked in the farming industry. Back then, there were 3.2 families for every farm in Jefferson County, compared with 5.0 families per farm statewide, a sign that farming was more prevalent locally than in the rest of the state.

Less than three decades after the Civil War, the county (located in a northern state) was 99.9 percent white, a little higher than the statewide share (97.9 percent) and also higher than the nationwide share of 87.8 percent. (The Census also noted that there was one Chinese resident of Jefferson County in 1890.)

Groundhog Day was originally called Candlemas, a day on which, Germans said, the hibernating groundhog took a break from slumbering to check the weather (according to The Oxford Companion to the Year). If the creature sees its shadow, and is frightened, winter will hold on and hibernating will continue, but if not, the groundhog will stay awake and spring will come early. Back in 1890, there were 703 Germans living in Jefferson County (representing 1.6 percent of the county population and 11.3 percent of the foreign born), making Germany the fourth most common place of birth among the foreign born, behind England, Scotland, and Austria. Groundhog Day is also said to have Celtic roots, so perhaps the 623 Irish residents (representing 1.4 percent of the county population and 10.1 percent of the foreign born) helped to establish the tradition in Pennsylvania.

Looking to today’s numbers, Phil was astonished to learn from the 2010 Census that Jefferson County has just 795 more people than it did 120 years ago. While Jefferson grew by 1.8 percent, the state grew by 141.6 percent and the nation grew by 393.0 percent.

Phil dug deeper. The 2008-10 American Community Survey data reveal that the once-prominent farming industry had shrunk considerably. (Because it is a small group, “agriculture” is now grouped with other industries including forestry, fishing and hunting, and mining.) While Jefferson residents are more likely to work in the industry than other Pennsylvanians, that share represents just 4.4 percent of the employed civilian workforce.

According to the Census, Jefferson is still predominately white (98.3 percent), while the rest of the state and nation have become somewhat more diverse (81.9 percent white in Pennsylvania and 72.4 percent nationwide). Today there are 24 Chinese residents (out of a total of 92 Asian residents).

As Phil rises from his burrow this February second, he will survey the shadows with new insight into his community and audience. To learn more about Punxsutawney Phil’s hometown burrow (and your own borough), please visit our mapping and reporting tools.

Sydney Beveridge is the Media and Content Editor for Social Explorer, where she works on the blog, curriculum materials, how-to-videos, social media outreach, presentations and strategic planning. She is a graduate of Swarthmore College and the Columbia University Graduate School of Journalism. A version of this article originally appeared on the Social Explorer blog. You can use Social Explorer’s mapping and reporting tools to investigate dreams, freedoms, and equality further.

Social Explorer is an online research tool designed to provide quick and easy access to current and historical census data and demographic information. The easy-to-use web interface lets users create maps and reports to better illustrate, analyze and understand demography and social change. From research libraries to classrooms to the front page of the New York Times, Social Explorer is helping people engage with society and science.


The post Burrowing into Punxsutawney Phil’s hometown data appeared first on OUPblog.

0 Comments on Burrowing into Punxsutawney Phil’s hometown data as of 2/2/2013 8:48:00 AM
19. Oral historians and online spaces

By Caitlin Tyler-Richards


In November 2012, a thread appeared on the H-Net Oral History listserv with the enticing subject line “experimental uses of oral history.” Amid assorted student projects and artistic explorations, two projects in particular caught my eye: the VOCES Oral History Project and the Freedom Mosaic. As we work towards our upcoming special issue on Oral History in the Digital Age, I’ve been mulling over how oral historians negotiate online spaces, and how the Internet and related advancing technologies can inspire, but also challenge, the manner in which they share their scholarship. I believe the VOCES Oral History Project and the Freedom Mosaic offer two distinct paths historians may take in carving out online space, and raise an interesting issue regarding content versus aesthetics.

Based out of the University of Texas at Austin, the VOCES Oral History Project (previously the US Latino & Latina WWII Oral History Project) launched in the spring of 1999 in response to the dearth of Latin@ experiences in WWII scholarship. Since its inception, the project has conducted over 500 interviews, which have spawned a variety of media, from mini-documentaries to the play Voices of Valor by James E. Garcia. My personal favorite is the Narratives Newspaper (1999-2004), a bi-annually printed collection of stories written by journalism students, based on interviews conducted with the WWII veterans. Thanks to a grant from the Institute for Museum and Library Services, in 2009 VOCES expanded its project to include the Vietnam and Korean Wars — hence the name change.

VOCES does not have the most technologically innovative or visually attractive website; it relies on text and links much more than contemporary web design generally allows. However, it still serves as a strong, online base for the project, allowing the staff to maintain and occasionally expand the project, and facilitates greater access to the public. For instance, VOCES has transferred the Narratives Newspaper stories into an indexed, online archive, one which they continue to populate with new reports featuring personal photos. The website also helps to sustain the project by inviting the public to participate, encouraging them to sign up for VOCES’ volunteer database or conduct their own interviews. They continue to offer a print subscription to a biannual Narratives Newsletter, which speaks to the manner in which they straddle the line between print and digital. It’s understandable given their audience (i.e. war veterans), yet I wonder if it also hinders a full transition into the digital realm.

The second project that intrigued me was the Freedom Mosaic, a collaboration between the National Center for Civil and Human Rights, CNN and the Ford Foundation to share “individual stories that changed history” from major civil and human rights movements. The Freedom Mosaic is a professionally-designed “microsite” that opens with an attractive, interactive interface made up of interviewees’ pictures, which viewers can click on to access multimedia profiles of civil and human rights participants. Each profile includes something like a player’s card for the interviewee on the left side of the page — imagine a title like “Visionary”, a full body picture, an inspiring quote, personal photos, and perhaps a map. On the right side, viewers can click “Play Story” to watch a mini-documentary on the subject, including interview clips. Viewers can also click the tab “More” at the bottom to bring up a brief text biography, additional interviews and interview transcripts.

The Freedom Mosaic is not a standard oral history project. According to Dr. Clifford Kuhn, who served as the consulting historian and interviewer and introduced the site to the H-Net Oral History listserv, “The idea was to develop a dynamic web site that departed from many of the archivally-oriented civil rights-themed web sites, in an attempt to especially appeal to younger users, roughly 15-30 years old.” On one hand, I greatly approve of beautifully designed projects that trick the innocent Internet explorer into learning something — and I think the Freedom Mosaic could succeed. However, I’m bothered by the site’s dearth of… history. Especially when sharing contemporary stories like that of immigration activist Jessica Colotl, I would have liked a bit more background on immigration in the United States than her brief bio provides.

VOCES and the Freedom Mosaic are excellent examples of how historians may establish a space online, amid the cat memes and indie movie blogs, for academic research. While I have my concerns, I believe both sites succeed in fulfilling their projects’ respective goals, whether those goals are archival depth or eye-catching design. However, I would ask you, lovely readers: Is there a template between VOCES’ text-heavy archive and the Freedom Mosaic’s dazzling pentagons that might better serve in sharing oral history? What would a rubric for “successful oral history sites” look like? Or should every site be tailor-fit to each project?

To the comments! The more you share, the better the eventual OHR website will look.

Caitlin Tyler-Richards is the editorial/ media assistant at the Oral History Review. When not sharing profound witticisms at @OralHistReview, Caitlin pursues a PhD in African History at the University of Wisconsin-Madison. Her research revolves around the intersection of West African history, literature and identity construction, as well as a fledgling interest in digital humanities. Before coming to Madison, Caitlin worked for the Lannan Center for Poetics and Social Practice at Georgetown University.

The Oral History Review, published by the Oral History Association, is the U.S. journal of record for the theory and practice of oral history. Its primary mission is to explore the nature and significance of oral history and advance understanding of the field among scholars, educators, practitioners, and the general public. Follow them on Twitter at @oralhistreview and like them on Facebook to preview the latest from the Review, learn about other oral history projects, connect with oral history centers across the world, and discover topics that you may have thought were even remotely connected to the study of oral history. Keep an eye out for upcoming posts on the OUPblog for addendum to past articles, interviews with scholars in oral history and related fields, and fieldnotes on conferences, workshops, etc.


The post Oral historians and online spaces appeared first on OUPblog.

0 Comments on Oral historians and online spaces as of 2/1/2013 10:16:00 AM
Add a Comment
20. The non-interventionist moment

By Andrew J. Polsky


The signs are clear. President Barack Obama has nominated two leading skeptics of American military intervention for the most important national security cabinet posts. Meeting with Afghan President Hamid Karzai, who would prefer a substantial American residual presence after the last American combat troops have departed in 2014, Obama has signaled that he wants a more rapid transition out of an active combat role (perhaps as soon as this spring, rather than during the summer). The president has also countered a push from his own military advisors to keep a sizable force in Afghanistan indefinitely by agreeing to consider the “zero option” of a complete withdrawal. We appear on the verge of a non-interventionist moment in American politics, when leaders and the general public alike shun major military actions.

Only a decade ago, George W. Bush stood before the graduating class at West Point to proclaim the dawn of a new era in American security policy. Neither deterrence nor containment, he declared, sufficed to deal with the threat posed by “shadowy terrorist networks with no nation or citizens to defend” or with “unbalanced dictators” possessing weapons of mass destruction. “[O]ur security will require all Americans to be forward looking and resolute, to be ready for preemptive action when necessary to defend our liberty and to defend our lives.” This new “Bush Doctrine” would soon be put into effect. In March 2003, the president ordered the US military to invade Iraq to remove one of those “unbalanced dictators,” Saddam Hussein.

This post-9/11 sense of assertiveness did not last. Long and costly wars in Iraq and Afghanistan discredited the leaders responsible and curbed any popular taste for military intervention on demand. Over the past two years, these reservations have become obvious as other situations arose that might have invited the use of troops just a few years earlier: Obama intervened in Libya but refused to send ground forces; the administration has rejected direct measures in the Syrian civil war such as no-fly zones; and the president refused to be stampeded into bombing Iranian nuclear facilities.

The reaction against frustrating wars follows a familiar historical pattern. In the aftermath of both the Korean War and the Vietnam War, Americans expressed a similar reluctance about military intervention. Soon after the 1953 truce that ended the Korean stalemate, the Eisenhower administration faced the prospect of intervention in Indochina, to forestall the collapse of the French position with the pending Viet Minh victory at Dien Bien Phu. As related by Fredrik Logevall in his excellent recent book, Embers of War, both Eisenhower and Secretary of State John Foster Dulles were fully prepared to deploy American troops. But they realized that in the backwash from Korea neither the American people nor Congress would countenance unilateral action. Congressional leaders indicated that allies, the British in particular, would need to participate. Unable to secure agreement from British foreign secretary Anthony Eden, Eisenhower and Dulles were thwarted, and decided instead to throw their support behind the new South Vietnamese regime of Ngo Dinh Diem.

Another period marked by reluctance to use force followed the Vietnam War. Once the last American troops withdrew in 1973, Congress rejected the possibility they might return, banning intervention in Indochina without explicit legislative approval. Congress also adopted the War Powers Resolution, more significant as a symbolic statement about the wish to avoid being drawn into a protracted military conflict by presidential initiative than as a practical measure to curb presidential bellicosity.

It is no coincidence that Obama’s key foreign and defense policy nominees were shaped by the crucible of Vietnam. Both Chuck Hagel and John Kerry fought in that war and came away with “the same sensibility about the futilities of war.” Their outlook contrasts sharply with that of Obama’s initial first-term selections to run the State Department and the Pentagon: both Hillary Clinton and Robert Gates backed an increased commitment of troops in Afghanistan in 2009. Although Senators Hagel and Kerry supported the 2002 congressional resolution authorizing the use of force in Iraq, they became early critics of the war. Hagel has expressed doubts about retaining American troops in Afghanistan or using force against Iran.

Given the present climate, we are unlikely to see a major American military commitment during the next several years. Obama’s choice of Kerry and Hagel reflects his view that, as he put it in the 2012 presidential debate about foreign policy, the time has come for nation-building at home. It will suffice in the short run to hold distant threats at bay. Insofar as possible, the United States will rely on economic sanctions and “light footprint” methods such as drone strikes on suspected terrorists.

If the past is any guide, however, this non-interventionist moment won’t last.  The post-Korea and post-Vietnam interludes of reluctance gave way within a decade to a renewed willingness to send American troops into combat. By the mid-1960s, Lyndon Johnson had embraced escalation in Vietnam; Ronald Reagan made his statement through his over-hyped invasion of Grenada to crush its pipsqueak revolutionary regime. The American people backed both decisions.

The return to interventionism will recur because the underlying conditions that invite it have not changed significantly. In the global order, the United States remains the hegemonic power that seeks to preserve stability. We retain a military that is more powerful by several orders of magnitude than any other, and will surely remain so even after the anticipated reductions in defense spending. Psychologically, the American people have long been sensitive to distant threats, and we have shown that we can be stampeded into endorsing military action when a president identifies a danger to our security. (And presidents themselves become vulnerable to charges that they are tolerating American decline whenever a hostile regime comes to power anywhere in the world.)

Those of us who question the American proclivity to resort to the use of force, then, should enjoy the moment.

Andrew Polsky is Professor of Political Science at Hunter College and the CUNY Graduate Center. A former editor of the journal Polity, his most recent book is Elusive Victories: The American Presidency at War. Read Andrew Polsky’s previous blog posts.

Subscribe to the OUPblog via email or RSS.
Subscribe to only law and politics articles on the OUPblog via email or RSS.
Subscribe to only American history articles on the OUPblog via email or RSS.

The post The non-interventionist moment appeared first on OUPblog.

0 Comments on The non-interventionist moment as of 1/30/2013 12:48:00 PM
Add a Comment
21. Jim Downs on the Emancipation Proclamation

The editors of the Oxford African American Studies Center spoke to Professor Jim Downs, author of Sick From Freedom: African-American Illness and Suffering during the Civil War and Reconstruction, about the legacy of the Emancipation Proclamation 150 years after it was first issued. We discuss the health crisis that affected so many freedpeople after emancipation, current views of the Emancipation Proclamation, and insights into the public health crises of today.

Emancipation was problematic, indeed disastrous, for so many freedpeople, particularly in terms of their health. What was the connection between newfound freedom and health?

I would not say that emancipation was problematic; it was a critical and necessary step in ending slavery. I would first argue that emancipation was not an ending point but part of a protracted process that began with the collapse of slavery. By examining freedpeople’s health conditions, we can see how that process unfolded—we can see how enslaved people liberated themselves from the shackles of Southern plantations but then were confronted with a number of questions: How would they survive? Where would they get their next meal? Where were they to live? How would they survive in a country torn apart by war and disease?

Because freedpeople lacked many of these basic necessities, hundreds of thousands of former slaves became sick and died.

The traditional narrative of emancipation begins with liberation from slavery in 1862-63 and follows freedpeople returning to Southern plantations after the war for employment in 1865 and then culminates with grassroots political mobilization that led to the Reconstruction Amendments in the late 1860s. This story places formal politics as the central organizing principle in the destruction of slavery and the movement toward citizenship without considering the realities of freedpeople’s lives during this seven- to eight- year period. By investigating freedpeople’s health conditions, we first notice that many formerly enslaved people died during this period and did not live to see the amendments that granted citizenship and suffrage. They survived slavery but perished during emancipation—a fact that few historians have considered. Additionally, for those that did survive both slavery and emancipation, it was not such a triumphant story; without food, clothing, shelter, and medicine, emancipation unleashed a number of insurmountable challenges for the newly freed.

Was the health crisis that befell freedpeople after emancipation any person, government, or organization’s fault? Was the lack of a sufficient social support system a product of ignorance or, rather, a lack of concern?

The health crises that befell freedpeople after emancipation resulted largely from the mere fact that no one considered how freedpeople would survive the war and emancipation; no one was prepared for the human realities of emancipation. Congress and the President focused on the political question that emancipation raised: what was the status of formerly enslaved people in the Republic?

When the federal government did consider freedpeople’s condition in the final years of the war, they thought the solution was to simply return freedpeople to Southern plantations as laborers. Yet, no one in Washington thought through the process of agricultural production: Where was the fertile land? (Much of it was destroyed during the war; and countless acres were depleted before the war, which was why Southern planters wanted to move west.) How long would crops grow? How would freedpeople survive in the meantime?

Meanwhile, a drought erupted in the immediate aftermath of the war that thwarted even the most earnest attempts to develop a free labor economy in the South. Therefore, as a historian, I am less invested in arguing that someone is at fault, and more committed to understanding the various economic and political forces that led to the outbreak of sickness and suffering. Creating a new economic system in the South required time and planning; it could not be accomplished simply by sending freedpeople back to Southern plantations and farms. And in the interim of this process, which seemed like a good plan to federal leaders in Washington, a different reality unfolded on the ground in the postwar South. Land and labor did not offer an immediate panacea to the war’s destruction, the process of emancipation, and the ultimate rebuilding of the South. Consequently, freedpeople suffered during this period.

When the federal government did establish the Medical Division of the Freedmen’s Bureau – an agency that established over 40 hospitals in the South, employed over 120 physicians, and treated an estimated one million freedpeople — the institution often lacked the finances, personnel, and resources to stop the spread of disease. In sum, the government did not create this division with a humanitarian — or to use 19th century parlance, “benevolence” — mission, but rather designed this institution with the hope of creating a healthy labor force.

So, if an epidemic broke out, the Bureau would do its best to stop its spread. Yet, as soon as the number of patients declined, the Bureau shut down the hospital. The Bureau relied on a system of statistical reporting that dictated the lifespan of a hospital. When a physician reported a declining number of patients treated, admitted, or dying in the hospital, Washington officials would order the hospital to be closed. However, the statistical report failed to capture the actual behavior of a virus, like smallpox. Just because the numbers declined in a given period did not mean that the virus stopped spreading among susceptible freedpeople. Often, it continued to infect formerly enslaved people, but because the initial symptoms of smallpox were confused with those of other illnesses, it was overlooked. Or, as was often the case, the Bureau doctor in an isolated region noticed a decline among a handful of patients, but not far away, on a neighboring plantation or in a town that the Bureau doctor did not visit, smallpox spread and remained unreported. Yet, according to the documentation, at a particular moment the virus seemed to dissipate, which was not the case. So, even when the government, in the shape of Bureau doctors, tried to do its best to halt the spread of the disease, there were not enough doctors stationed throughout the South to monitor the virus, and their methods of reporting on smallpox were problematic.

You draw an interesting distinction between the terms refugee and freedmen as they were applied to emancipated slaves at different times. What did the term refugee entail and how was it a problematic description?

I actually think that freedmen or freedpeople could be a somewhat misleading term, because it defines formerly enslaved people purely in terms of their political status—the term freed places a polish on their condition and glosses over their experience during the war in which the military and federal government defined them as both contraband and refugees. Often forced to live in “contraband camps,” which were makeshift camps that surrounded the perimeter of Union camps, former slaves’ experience resembled a condition more associated with that of refugees. More to the point, the term freed does not seem to jibe with what I uncovered in the records—the Union Army treats formerly enslaved people with contempt, they assign them to laborious work, they feed them scraps, they relegate them to muddy camps where they are lucky if they can use a discarded army tent to protect themselves against the cold and rain. The term freedpeople does not seem applicable to those conditions.

That said, I struggle with my usage of these terms, because on one level they are politically no longer enslaved, but they are not “freed” in the ways in which the prevailing history defines them as politically mobile and autonomous. And then on a simply rhetorical level, freedpeople is a less awkward and clumsy expression than constantly writing formerly enslaved people.

Finally, during the war abolitionists and federal officials argued over these terms and classifications in the records. During the war years, the Union army referred to the formerly enslaved as refugees, contraband, and even fugitives. When the war ended, the federal government classified formerly enslaved people as freedmen, and used the term refugee to refer to white Southerners displaced by the war. This is fascinating because it implies that white people can be dislocated and strung out but that formerly enslaved people can’t be—and if they are it does not matter, because they are “free.”

Based on your understanding of the historical record, what were Lincoln’s (and the federal government’s) goals in issuing the Emancipation Proclamation? Do you see any differences between these goals and the way in which the Emancipation Proclamation is popularly understood?

The Emancipation Proclamation was a military tactic to deplete the Southern labor force. This was Lincoln’s main goal—it invariably, according to many historians, shifted the focus of the war from a war for the Union to a war of emancipation. I never really understood what that meant, or why there was such a fuss over this distinction, largely because enslaved people had already begun to free themselves before the Emancipation Proclamation and many continued to do so after it without always knowing about the formal proclamation.

The implicit claim historians make when explaining how the motivation for the war shifted is that the Union soldiers therefore cared about emancipation, so that the idea that it was a military tactic fades from view and we are instead placed in a position of imagining Union soldiers entering the Confederacy to destroy slavery—that they were somehow concerned about black people. Yet, what I continue to find in the record is case after case of Union officials making no distinction about the objective of the war and rounding up formerly enslaved people and shuffling them into former slave pens, barricading them in refugee camps, sending them on death marches to regions in need of laborers. I begin to lose my patience when various historians prop up the image of the Union army (or even Lincoln) as great emancipators when on the ground they literally turned their backs on children who starved to death; children who froze to death; children whose bodies were covered with smallpox. So, from where I stand, I see the Emancipation Proclamation as a central, important, and critical document that served a valuable purpose, but the sources quickly divert my attention to the suffering and sickness that defined freedpeople’s experience on the ground.

Do you see any parallels between the situation of post-Civil War freedpeople and the plights of currently distressed populations in the United States and abroad? What can we learn about public health crises, marginalized groups, etc.?

Yes, I do, but I would prefer to put this discussion on hold momentarily and simply say that we can see parallels today, right now. For example, there is a massive outbreak of the flu spreading across the country. Some are even referring to it as an epidemic. Yet in Harlem, New York, the pharmacies are currently operating with a policy that they cannot administer flu shots to children under the age of 17, which means that if a mother takes time off from work and makes it to Rite Aid, she can’t get her children their necessary shots. Given that all pharmacies in that region follow a particular policy, she and her children are stuck. In Connecticut, Kathie Lee Gifford of NBC’s Today Show relayed a similar problem, but she explained that she continued to travel throughout the state until she could find a pharmacy to administer her husband a flu shot. The mother in Harlem, who relies on the bus or subway, has to wait until Rite Aid revises its policy. Rite Aid is revising the policy now, as I write this response, but this means that every day that it takes for a well-intentioned, well-meaning pharmacy to amend its rules, the mother in Harlem or a mother in any other impoverished area must continue to send her children to school without the flu shot, where they remain susceptible to the virus.

In the Civil War records, I saw a similar health crisis unfold: people were not dying from complicated, unknown illnesses but rather from the failures of a bureaucracy, from the inability to provide basic medical relief to those in need, and from the fact that their economic status greatly determined their access to basic health care.

Tim Allen is an Assistant Editor for the Oxford African American Studies Center.

The Oxford African American Studies Center combines the authority of carefully edited reference works with sophisticated technology to create the most comprehensive collection of scholarship available online to focus on the lives and events which have shaped African American and African history and culture. It provides students, scholars and librarians with more than 10,000 articles by top scholars in the field.

Subscribe to the OUPblog via email or RSS.
Subscribe to only American history articles on the OUPblog via email or RSS.

The post Jim Downs on the Emancipation Proclamation appeared first on OUPblog.

0 Comments on Jim Downs on the Emancipation Proclamation as of 1/30/2013 7:04:00 AM
Add a Comment
22. Who was Harry Hopkins?

By David Roll


He was a spectral figure in the Franklin Roosevelt administration. Slightly sinister. A ramshackle character, but boyishly attractive. Gaunt, pauper-thin. Full of nervous energy, fueled by caffeine and Lucky Strikes.

Hopkins was an experienced social worker, an in-your-face New Deal reformer. Yet he sought the company of the rich and well born. He was a gambler, a bettor on horses, cards, and the time of day. Between his second and third marriages he dated glamorous women — movie stars, actresses and fashionistas.

It was said that he had a mind like a razor, a tongue like a skinning knife. A New Yorker profile described him as a purveyor of wit and anecdote. He loved to tell the story of the time President Roosevelt wheeled himself into Winston Churchill’s bedroom unannounced. It was when the prime minister was staying at the White House. Churchill had just emerged from his afternoon bath, stark naked and gleaming pink. The president apologized and started to withdraw. “Think nothing of it,” Churchill said. “The Prime Minister has nothing to hide from the President of the United States.” Whether true or not, Hopkins dined out on this story for years.

On the evening of 10 May 1940 — a year and a half before the United States entered the Second World War — Roosevelt and Hopkins had just finished dinner. They were in the Oval Study on the second floor of the White House. As usual, they were gossiping and enjoying each other’s jokes and stories. Hopkins was forty-nine; the president was fifty-eight. They had known one another for a decade; Hopkins had run several of Roosevelt’s New Deal agencies that put millions of Americans to work on public works and infrastructure projects. Franklin and Eleanor Roosevelt had consoled Harry following the death of his second wife, Barbara, in 1937. The first lady was the surrogate mother of Hopkins’s daughter, Diana, age seven. Hopkins had become part of the Roosevelt family. He was Roosevelt’s closest advisor and friend.

The president sensed that Harry was not feeling well. He knew that Hopkins had had more than half of his stomach removed due to cancer and was suffering from malnutrition and weakness in his legs. Roosevelt insisted that his friend spend the night.


Hopkins was the man who came to dinner and never left. For most of the next three-and-one-half years Harry would live in the Lincoln suite a few doors down the hall from the Roosevelts and his daughter would live on the third floor near the Sky Parlor. Without any particular portfolio or title, Hopkins conducted business for the president from a card table and a telephone in his bedroom.

During those years, as the United States was drawn into the maelstrom of the Second World War, Harry Hopkins would devote his life to helping the president prepare for and win the war. He would shortly form a lifelong friendship with Winston Churchill and his wife Clementine. He would even earn a measure of respect and a degree of trust from Joseph Stalin, the brutal dictator of the Soviet Union. He would play a critical role, arguably the critical role, in establishing and preserving America’s alliance with Great Britain and the Soviet Union that won the war.

Harry Hopkins was the pectin and the glue. He understood that victory depended on holding together a three-party coalition: Stalin, Churchill, and Roosevelt. This would be his single-minded focus throughout the war years. Churchill was awed by Hopkins’s ability to focus; he often addressed him as “Lord Root of the Matter.”

Much of Hopkins’s success was due to social savvy, what psychologists call emotional intelligence or practical intelligence. He knew how to read people and situations, and how to use that natural talent to influence decisions and actions. He usually knew when to speak and when to remain silent. Whether it was at a wartime conference, alone with Roosevelt, or a private meeting with Stalin, when Hopkins chose to speak, his words were measured to achieve effect.

At a dinner in London with the leaders of the British press during the Blitz, when Britain stood alone, Hopkins’s words gave the press barons the sense that though America was not yet in the war, she was marching beside them. One of the journalists who was there wrote, “We were happy men all; our confidence and our courage had been stimulated by a contact for which Shakespeare, in Henry V, had a phrase, ‘a little touch of Harry in the night.’”

Hopkins’s touch was neither little nor light. To Stalin, Hopkins spoke po dushe (according to the soul). Churchill saw Hopkins as a “crumbling lighthouse from which there shone beams that led great fleets to harbour.” To Roosevelt, he gave his life, “asking for nothing except to serve.” They were the “happy few.” And Hopkins had made himself one of them.

David Roll is the author of The Hopkins Touch: Harry Hopkins and the Forging of the Alliance to Defeat Hitler. He is a partner at Steptoe & Johnson LLP and founder of Lex Mundi Pro Bono Foundation, a public interest organization that provides pro bono legal services to social entrepreneurs around the world. He was awarded the Purpose Prize Fellowship by Civic Ventures in 2009.

Subscribe to the OUPblog via email or RSS.
Subscribe to only American history articles on the OUPblog via email or RSS.

The post Who was Harry Hopkins? appeared first on OUPblog.

0 Comments on Who was Harry Hopkins? as of 1/29/2013 11:06:00 AM
Add a Comment
23. A letter from Harry Truman to Judge Learned Hand

Learned Hand was born on this day in 1872. In a letter dated 15 May 1951, Judge Learned Hand wrote President Harry S. Truman to declare his intention to retire from “regular active service.” President Truman responded to Hand’s news with a letter praising his service to the country. These letters are excerpted from Reason and Imagination: The Selected Letters of Learned Hand, edited by Constance Jordan.

To Judge Learned Hand

May 23, 1951
The White House, Washington, D.C.

Dear Judge Hand:

Your impending retirement fills me with regret, which I know is shared by the American people. It is hard to accept the fact that after forty-two years of most distinguished service to our Nation, your activities are now to be narrowed. It is always difficult for me to express a sentiment of deep regret; what makes my present task so overwhelming is the compulsion I feel to attempt, on behalf of the American people, to give in words some inkling of the place you have held and will always hold in the life and spirit of our country.

Your profession has long since recognized the magnitude of your contribution to the law. There has never been any question about your preeminent place among American jurists – indeed among the nations of the world. In your writings, in your day-to-day work for almost half a century, you have added purpose and hope to man’s quest for justice through the process of law. As judge and philosopher, you have expressed the spirit of America and the highest in civilization which man has achieved. America and the American people are the richer because of the vigor and fullness of your contribution to our way of life.

We are compensated in part by the fact that you are casting off only a part of the burdens which you have borne for us these many years, and by our knowledge that you will continue actively to influence our life and society for years to come. May you enjoy many happy years of retirement, secure in the knowledge that no man, whatever his walk of life, has ever been more deserving of the admiration and the gratitude of his country, and indeed of the entire free world.

Very sincerely yours, Harry S. Truman

Hand immediately responded to the President’s letter:

To President Harry S. Truman
May 24, 1951

Dear Mr. President:

Your letter about my retirement quite overwhelms me. I dare not believe that it is justified by anything which I have done, yet I cannot but be greatly moved that you should think that it is. The best reward that anyone can expect from official work is the approval of those competent to judge who become acquainted with it; your words of warm approval are much more than I could conceivably have hoped to receive. I can only tell you of my deep gratitude, and assure you that your letter will be a possession for all time for me and for those who come after me.

Respectfully yours, LEARNED HAND

The letters above were excerpted from Reason and Imagination: The Selected Letters of Learned Hand, edited by Constance Jordan, a retired professor of comparative literature and also Hand’s granddaughter. Learned Hand served on the United States District Court for the Southern District of New York and later on the United States Court of Appeals for the Second Circuit, and is commonly thought to be the most influential judge never to serve on the Supreme Court. He corresponded with people in different walks of life, some of whom were among his friends and acquaintances, others of whom were strangers to him.

Subscribe to the OUPblog via email or RSS.
Subscribe to only law articles on the OUPblog via email or RSS.
Image credit: (1) Harry S Truman. US National Archives. Public domain via Wikimedia Commons. (2) Judge Learned Hand circa 1910. Public domain via Wikimedia Commons.

The post A letter from Harry Truman to Judge Learned Hand appeared first on OUPblog.

0 Comments on A letter from Harry Truman to Judge Learned Hand as of 1/27/2013 9:54:00 AM
Add a Comment
24. ‘Ebonics’ in flux

By Tim Allen


On this day forty years ago, the African American psychologist Robert Williams coined the term “Ebonics” during an education conference held at Washington University in Saint Louis, Missouri. At the time, his audience was receptive to, even enthusiastic about, the word. But invoke the word “Ebonics” today and you’ll have no trouble raising the hackles of educators, journalists, linguists, and anyone else who might have an opinion about how people speak. (That basically accounts for all of us, right?) The meaning of the controversial term, however, has never been entirely stable.

For Williams, Ebonics encompassed not only the language of African Americans, but also their social and cultural histories. Williams fashioned the word “Ebonics”—a portmanteau of the words “ebony” (black) and “phonics” (sounds)—in order to address a perceived lack of understanding of the shared linguistic heritage of those who are descended from African slaves, whether in North America, the Caribbean, or Africa. Williams and several other scholars in attendance at the 1973 conference felt that the then-prevalent term “Black English” was insufficient to describe the totality of this legacy.

Ebonics managed to stay under the radar for the next couple of decades, but then re-emerged at the center of a national controversy surrounding linguistic, cultural, and racial diversity in late 1996. At that time, the Oakland, California school board, in an attempt to address some of the challenges of effectively teaching standard American English to African American schoolchildren, passed a resolution recognizing the utility of Ebonics in the classroom. The resolution suggested that teachers should acknowledge the legitimacy of the language that their students actually spoke and use it as a sort of tool in Standard English instruction. Many critics understood this idea as a lowering of standards and an endorsement of “slang”, but the proposed use of Ebonics in the classroom did not strike most linguists or educators as particularly troublesome. However, the resolution also initially characterized Ebonics as a language nearly entirely separate from English. (For example, the primary advocate of this theory, Ernie Smith, has called Ebonics “an antonym for Black English.” (Beyond Ebonics, p. 21)) The divisive idea that “Ebonics” could be considered its own language—not an English dialect but more closely related to West African languages—rubbed many people the wrong way and gave a number of detractors additional fodder for their derision.

Linguists were quick to respond to the controversy and offer their own understanding of “Ebonics”. In the midst of the Oakland debate, the Linguistic Society of America resolved that Ebonics is a speech variety that is “systematic and rule-governed like all natural speech varieties. […] Characterizations of Ebonics as “slang,” “mutant,” “lazy,” “defective,” “ungrammatical,” or “broken English” are incorrect and demeaning.” The linguists refused to make a pronouncement on the status of Ebonics as either a language or dialect, stating that the distinction was largely a political or social one. However, most linguists agree on the notion that the linguistic features described by “Ebonics” compose a dialect of English that they would more likely call “African American Vernacular English” (AAVE) or perhaps “Black English Vernacular”. This dialect is one among many American English dialects, including Chicano English, Southern English, and New England English.

And if the meaning of “Ebonics” weren’t muddy enough, a fourth perspective on the term emerged around the time of the Oakland debate. Developed by Professor Carol Aisha Blackshire-Belay, this idea takes the original view of Ebonics as a descriptive term for languages spoken by the descendants of West African slaves and expands it to cover the language of anyone from Africa or in the African diaspora. Her Afrocentric vision of Ebonics, in linguist John Baugh’s estimation, “elevates racial unity, but it does so at the expense of linguistic accuracy.” (Beyond Ebonics, p. 23)

The term “Ebonics” seems to have fallen out of favor recently, perhaps due to the unpleasant associations with racially-tinged debate that it engenders (not to mention the confusing multitude of definitions it has produced!). However, the legacy of the Ebonics controversy that erupted in the United States in 1996 and 1997 has been analyzed extensively by scholars of language, politics, and race in subsequent years. And while “Ebonics”, the word, may have a reduced presence in our collective vocabulary, many of the larger issues surrounding its controversial history are still with us: How do we improve the academic achievement of African American children? How can we best teach English in school? How do we understand African American linguistic heritage in the context of American culture? Answers to these questions may not be immediately forthcoming, but we can, perhaps, thank “Ebonics” for moving the national conversation forward.

Tim Allen is an Assistant Editor for the Oxford African American Studies Center.

The Oxford African American Studies Center combines the authority of carefully edited reference works with sophisticated technology to create the most comprehensive collection of scholarship available online to focus on the lives and events which have shaped African American and African history and culture. It provides students, scholars and librarians with more than 10,000 articles by top scholars in the field.

Subscribe to the OUPblog via email or RSS.
Subscribe to only American history articles on the OUPblog via email or RSS.

The post ‘Ebonics’ in flux appeared first on OUPblog.

0 Comments on ‘Ebonics’ in flux as of 1/26/2013 7:36:00 AM
Add a Comment
25. Choices and rights, children and murder

By Leigh Ann Wheeler

Demonstration protesting anti-abortion candidate Ellen McCormack at the Democratic National Convention, New York City. Photo by Warren K. Leffler, 14 July 1976. Source: Library of Congress.

“Abortion is a Personal Decision, Not a Legal Debate!”
“My Body, My Choice!”
“Abortion Rights, Social Justice, Reproductive Freedom!”

Such are today’s arguments for upholding Roe v. Wade, whose fortieth birthday many of us are celebrating.

Others are mourning.

“It’s a child, not a choice!”
“Abortion kills children!”
“Stop killing babies!”

How did we arrive at this stunningly polarized place in our discussion — our national shouting match — over women’s reproductive rights?

Certainly it wasn’t always this way. Indeed, consensus and moderation on the issue of abortion were the rule until recently.

Even if we go back to biblical times, the brutal and otherwise misogynist law of the Old Testament made no mention of abortion, despite popular use of herbal abortifacients at the time. Moreover, it did not treat a person who caused a miscarriage as a murderer. Fast-forward several thousand years to North American indigenous societies where women regularly aborted unwanted pregnancies. Even Christian Europeans who settled in their midst did not prohibit abortion, especially before “quickening,” or the appearance of fetal movement. Support for restrictions on abortion emerged only in the 1800s, a time when physicians, eager to gain professional status, sought control over the procedure. Not until the twentieth century did legislation forbidding all abortions begin to blanket the land.

What happened during those decades to women with unwanted pregnancies is well documented. For a middle-class woman, a nine-month “vacation” with distant relatives, a quietly performed abortion by a reputable physician, or, for those without adequate support, a “back-alley” job; for a working-class woman, nine months at a home for unwed mothers, a visit to a back-alley butcher, or maybe another mouth to feed. Women made do, sometimes by giving their lives, one way or another.

But not until the 1950s did serious challenges to laws against abortion emerge. They began to gain a constitutional foothold in the 1960s, when the Planned Parenthood Federation of America and the American Civil Liberties Union (ACLU) persuaded the US Supreme Court to declare state laws that prohibited contraceptives in violation of a newly articulated right to privacy. By the 1970s, the notion of a right to privacy actually cut many ways, but on January 22, 1973, it cut straight through state criminal laws against abortion. In Roe, the Supreme Court adopted the ACLU’s claim that the right to privacy must “encompass a woman’s decision whether or not to terminate her pregnancy.” But the Court also permitted intrusion on that privacy according to a trimester timetable that linked a woman’s rights to the stage of her pregnancy and a physician’s advice; as the pregnancy progressed, the Court allowed the state’s interest in preserving the woman’s health or the life of the fetus to take over.

Roe actually returned the country to an abortion law regime not so terribly different from the one that had reigned for centuries if not millennia before the nineteenth century. The first trimester of a pregnancy, or the months before “quickening,” remained largely under the woman’s control, though not completely, given the new role of the medical profession. The other innovation was that women’s control now derived from a constitutional right to privacy — a right made meaningful only by the availability and affordability of physicians willing to perform abortions.

With these exceptions, the Supreme Court’s decision in Roe did little more than return us to an older status quo.  So why has it left us screaming at each other over choices and children, rights and murder?

There are many answers to this question, but a major one involves partisan politics.

On the eve of Roe, to be a Catholic was practically tantamount to being a Democrat. Moreover, feminists were as plentiful in the Republican Party as they were in the Democratic Party. Not so today, on the eve of Roe’s fortieth birthday. Why?

As the Catholic Church cemented its position against abortion and feminists embraced abortion rights as central to a women’s rights agenda, politicians saw an opportunity to poach on their opponent’s constituency and activists saw an opportunity to hitch their fortunes to one of the two major parties. In the 1970s, Paul Weyrich, the conservative activist who coined the phrase “moral majority,” urged Republicans to adopt a pro-life platform in order to woo Catholic Democrats. More recently, the 2012 election showed us Republican candidates who would prohibit all abortions — at all stages of a pregnancy and even in cases of rape and incest — and a proudly, loudly pro-choice Democratic Party.

In the past forty years, abortion has played a major role in realigning our major political parties, associating one with conservative Christianity and the other with women’s rights — a phenomenon that has contributed to the emergence of a twenty-point gender gap, the largest in US history. Perhaps, then, it is no surprise that we are screaming at each other.

Leigh Ann Wheeler is Associate Professor of History at Binghamton University. She is co-editor of the Journal of Women’s History and the author of How Sex Became a Civil Liberty and Against Obscenity: Reform and the Politics of Womanhood in America, 1873-1935.

Subscribe to the OUPblog via email or RSS.
Subscribe to only American history articles on the OUPblog via email or RSS.

The post Choices and rights, children and murder appeared first on OUPblog.

0 Comments on Choices and rights, children and murder as of 1/22/2013 8:46:00 AM
Add a Comment

View Next 25 Posts