The seemingly unassailable rise of the MOOC – the Massive Open On-Line Course – has many universities worried. Offering access to millions of potential students, it seems like the solution to so many of the problems that beset higher education. Fees are low, or even non-existent; anyone can sign up; staff time is strictly limited as even grading is done by peers or automated multiple-choice questionnaires. In an era of ever-rising tuition fees and of concerns about the barriers that stop the less well-off from applying to good universities, the MOOC can seem like a panacea.
In 1958, the prominent childcare advice writer and pediatrician Dr. Benjamin Spock told readers that ‘a man can be a warm father and a real man at the same time’. In this revised edition of the bestseller Baby and Child Care, the American author dedicated a whole section to ‘The Father’s Part’. This was a much lengthier discussion of men’s role in caring for their babies and young children than in the first edition, but the role of the father remained very much secondary to that of the mother. Though Spock advised readers it was ‘the wrong idea’ to consider childcare as the sole responsibility of the mother, it was clear that he thought the father’s responsibility in day-to-day care remained rather minimal, in part because of the lack of interest of fathers themselves. He added that it was ‘Better to play for fifteen minutes enjoyably, and then say, “Now I’m going to read my paper,” than to spend all day at the zoo, crossly.’
Having children had long been understood as a sign of manhood, proving men’s virility and adult status. Jim Bullock, for example, who was born in 1903, recollected the definite ideas around virility and masculinity in the mining village of Bowers Row in Yorkshire. He described:
The first child was conceived as soon as was decently possible, for the young husband had to prove his manhood. If a year passed without a child—or the outward sign of one being on the way—this man was taunted by his mates both at work and on the street corner by such cruel remarks.
He added that men were expected to suffer some of the same symptoms as their wives during pregnancy, such as morning sickness and toothache, as well as losing weight as their wives gained it. If a man didn't experience these effects, his love for and fidelity to his wife could be questioned.
With increasing knowledge about birth control, sex, and childbirth across many parts of British society as the twentieth century progressed, these views became outdated.
Having children was still a sign of achieving adult masculinity. However, too much interaction with anything to do with pregnancy, birth and babies could also be emasculating—this was, of course, ‘women’s business’. David, a labourer from Nottingham, who became a father in the 1950s, highlighted how he kept his distance from both the birth and caring for his new baby, ‘because it wasn’t manly’.
Some fathers were becoming more willing to help out with children. Mr. K from Preston described how ‘relaxing’ he found it to sit giving one of his babies a bottle after work. Yet, though attitudes to men’s roles in childcare were gradually shifting, it was the relationship between masculinity and fatherhood that changed more substantially in the middle of the twentieth century.
What can be found in the 1940s and 1950s in Britain was a new kind of relationship between fatherhood and masculinity. This was, in fact, a time when the ‘celebrity dad’ became prominent in the press. In 1955, for example, the Daily Mirror published a feature on actor Kenneth More, interviewed whilst he took care of his toddler. In 1957, it featured an article and large image of the singer Lonnie Donegan with his three-year-old daughter, apparently enjoying singing together at home. Sports stars and royals were also the subject of this kind of attention, and seemed to embody Spock’s claim that men could indeed be a real man and a warm father at the same time. More ‘ordinary’ dads also hinted at this change. Whilst taking a very active role in the physical care of babies remained potentially tricky for many men, their identities were increasingly encompassing a more caring and fatherly side. Mr. G, born in 1903, suggested that there was change around the First World War; by the 1920s, men were much happier to be seen taking their child for a walk in the part of Lancashire where he lived. And Martin from Oldham, whose first child was born in the mid-1950s, described how he proudly took his child in its pram for a beer in his local pub. Men’s roles with their children hadn’t been radically reshaped. But whilst in earlier generations, it was simply having children that was a sign of manliness, by the 1950s, being seen as an involved father was becoming part of an ideal vision of masculinity.
The importance of fatherhood to the achievement of a certain ideal of masculinity has ebbed and flowed across the twentieth century; it could both prove and challenge a sense of manliness. Today we see plenty of evidence of men proudly displaying their fatherhood—the man with a pram or carrying a baby in a sling isn’t so rare any more. Yet, in every generation there are more or less involved fathers; plenty of men throughout the twentieth century, and much earlier, enjoyed spending time with their children and felt close to them. Today, women, for the most part, still take on the burden of childcare, even if there are plenty of couples who do things differently. Historical research helps question the idea that the ‘new man’ of the last couple of decades is quite so new—and by thinking about how fatherhood relates to masculine identity, we can better understand changes to parenting and gender roles over time.
Image Credit: “Father’s Strength” by Shavar Ross. CC BY-NC-ND 2.0 via Flickr.
The Reformation was a seismic event in history, whose consequences are still working themselves out in Europe and across the world. The protests against the marketing of indulgences staged by the German monk Martin Luther in 1517 belonged to a long-standing pattern of calls for internal reform and renewal in the Christian Church. But they rapidly took a radical and unexpected turn, engulfing first Germany and then Europe as a whole in furious arguments about how God’s will was to be discerned, and how humans were to be ‘saved’. However, these debates did not remain confined to a narrow sphere of theology. They came to reshape politics and international relations; social, cultural, and artistic developments; relations between the sexes; and the patterns and performances of everyday life.
Below we take a look at some of the key events that shaped the Reformation. In The Oxford Illustrated History of the Reformation, Peter Marshall and a team of experts tell the story of how a multitude of rival groups and individuals, with or without the support of political power, strove after visions of ‘reform’.
Featured image credit: Fishing for Souls, Adriaen Pietersz van de Venne, 1614. Rijksmuseum, Amsterdam. Public domain via Wikimedia Commons.
There was a great change in peace settlements after World War I. Not only were the Central Powers supposed to pay reparations, cede territory, and submit to new rules concerning the citizenship of their former subjects, but they were also required to deliver nationals accused of violations of the laws and customs of war (or violations of the laws of humanity, in the case of the Ottoman Empire) to the Allies to stand trial.
This was the first time in European history that victor powers imposed such a demand following an international war. This was also the first time that regulations specified by the Geneva and Hague Conventions were enforced after a war ended. Previously, states used their own military tribunals to enforce the laws and customs of war (as well as regulations concerning espionage), but they typically granted amnesty for foreigners after a peace treaty was signed.
The Allies intended to create special combined military tribunals to prosecute individuals whose violations had affected persons from multiple countries. They demanded post-war trials for many reasons. Legal representatives to the Paris Peace Conference believed that “might makes right” should not supplant international law; therefore, the rules governing the treatment of civilians and prisoners-of-war must be enforced. They declared the war had created a modern sensibility that demanded legal innovations, such as prosecuting heads of state and holding officers responsible for the actions of subordinates. British and French leaders wanted to mollify domestic feelings of injury as well as propel an interpretation that the war had been a fight for “justice over barbarism,” rather than a colossal blood-letting. They also sought to use trials to exert pressure on post-war governments to pursue territorial and financial objectives.
The German, Ottoman, and Bulgarian governments resisted extradition demands and foreign trials, yet staged their own prosecutions. Each fulfilled a variety of goals by doing so. The Weimar government in Germany was initially forced to sign the Versailles Treaty with its extradition demands, then negotiated to hold its own trials before its Supreme Court in Leipzig because the German military, along with right-wing political parties, refused the extradition of German officers. The Weimar government, led by the Social Democratic party, needed the military’s support to suppress communist revolutions. The Leipzig trials, held from 1921 to 1927, only covered a small number of cases, serving to deflect responsibility for the most serious German violations, such as the massacre of approximately 6,500 civilians in Belgium and the deportation of civilians to work in Germany. The limited scope of the trials did not purge the German military as the Allies had hoped. Yet the trials presented an opportunity for German prosecutors to take international charges and frame them in German law. Although the Allies were disturbed by the small number of convictions, this was the first time that a European country had agreed to try its own after a major war.
The Ottoman imperial government first destroyed the archives of the “Special Organization,” a secret group of Turkish nationalists who deported Greeks from the Aegean region in 1914 and planned and executed the massacre of Armenians in 1915. But in late 1918, a new Ottoman imperial government formed a commission to investigate parliamentary deputies and former government ministers from the Turkish nationalist party, the Committee of Union and Progress, which had planned the attacks. It also sought to prosecute Committee members who had been responsible for the Ottoman Empire’s entrance into the war. The government then held a series of military trials of its own accord in 1919 to prosecute actual perpetrators of the massacres, as well as purge the government of Committee members, as these were opponents of the imperial system. It also wanted to quash the British government’s efforts to prosecute Turks with British military tribunals. Yet after the British occupied Istanbul, the nationalist movement under Mustafa Kemal retaliated by arresting British officers. Ultimately, the Kemalists gained control of the country, ended all Turkish military prosecutions for the massacres, and nullified guilty verdicts.
Like Germany and the Ottoman Empire, Bulgaria began a rocky governmental and social transformation after the war. The initial post-war government signed an armistice with the Allies to avoid the occupation of the capital, Sofia. It then passed a law granting amnesty for persons accused of violating the laws and customs of war. However, a new government came to power in 1919, representing a coalition of the Agrarian Union, a pro-peasant party, and right-wing parties. The government arrested former ministers and generals and prosecuted them with special civilian courts in order to purge them; they were blamed for Bulgaria’s entrance into the war. Some were prosecuted because they led groups of refugees from Macedonia in a terrorist organization, the Internal Macedonian Revolutionary Organization. Suppressing Macedonian terrorism was an important condition for Bulgaria to improve its relationship with its neighbor, the Kingdom of the Serbs, Croats, and Slovenes. In 1923, however, Aleksandar Stambuliski, the leader of the Agrarian Union, was assassinated in a military coup, leading to new problems in Bulgaria.
We could ask a counter-factual question: What if the Allies had managed to hold mixed military tribunals for war-time violations instead of allowing the defeated states to stage their own trials? If an Allied tribunal for Germany had been run fairly and political posturing had been suppressed, it might have established important legal precedents, such as individual criminal liability for violations of the laws of war and the responsibility of officers and political leaders for ordering violations. On the other hand, guilty verdicts might have given Germany’s nationalist parties new heroes in their quest to overturn the Versailles order.
An Allied tribunal for the Armenian massacres would have established the concept that a sovereign government’s ministers and police apparatus could be held criminally responsible under international law for actions undertaken against their fellow nationals. It might also have created a new historical source about this highly contested episode in Ottoman and Turkish history. Yet it is speculative whether the Allies would have been able to compel the post-war Turkish government to pay reparations to Armenian survivors and return stolen property.
Finally, an Allied tribunal for alleged Bulgarian war criminals, if constructed impartially, might have resolved the intense feelings of recrimination that several of the Balkan nations harbored toward each other after World War I. It might also have helped the Agrarian Union survive against its military and terrorist enemies. However, a trial concentrating only on Bulgarian crimes would not have dealt with crimes committed by Serbian, Greek, and Bulgarian forces and paramilitaries during the Balkan Wars of 1912-13, so a selective tribunal after World War I may not have healed all wounds.
Image Credit: Château de Versailles Hall of Mirrors Ceiling. Photo by Dennis Jarvis. CC BY-SA 2.0 via Flickr.
Winston Churchill’s Victory broadcast of 13 May 1945, in which he claimed that but for Northern Ireland’s “loyalty and friendship” the British people “should have been confronted with slavery or death,” is perhaps the most emphatic assertion that the Second World War entrenched partition from the southern state and strengthened the political bond between Britain and Northern Ireland.
Two years earlier, however, in private correspondence with US President Roosevelt, Churchill had written disparagingly of the young men of Belfast, who unlike their counterparts in Britain were not subject to conscription, loafing around “with their hands in their pockets,” hindering recruitment and the vital work of the shipyards.
Churchill’s role as a unifying figure, galvanising the war effort through wireless broadcasts and morale-boosting public appearances, is much celebrated in accounts of the British Home Front. The further away from London and the South East of England that one travels, however, the more questions should be asked of this simplistic narrative. Due to Churchill’s actions as Liberal Home Secretary during the 1910 confrontations between miners and police in South Wales, for example, he was far less popular in Wales, and indeed in Scotland, than in England during the war. But in Northern Ireland, too, Churchill was a controversial figure at this time. The roots of this controversy are to be found in events that took place more than a quarter of a century before, in 1912.
Then First Lord of the Admiralty, Churchill was booed on arrival in Belfast that February, before his car was attacked and his effigy brandished by a mob of loyalist demonstrators. Later at Belfast Celtic Football Ground he was cheered by a crowd of five thousand nationalists as he spoke in favour of Home Rule for Ireland. Churchill was not sympathetic to the Irish nationalist cause but believed that Home Rule would strengthen the Empire and the bond between Britain and Ireland; he also saw this alliance as vital to the defence of the United Kingdom.
Loyalists were outraged. Angry dockers hurled rotten fish at Churchill and his wife Clementine as they left the city; historian and novelist Hugh Shearman reported that their car was diverted to avoid thousands of shipyard workers who had lined the route with pockets filled with “Queen’s Island confetti,” local slang for rivet heads. (Harland and Wolff were at this time Belfast’s largest employer, and indeed one of the largest shipbuilding firms in the world; at the time of the Churchills’ visit the Titanic was being fitted out.)
Two years later in March 1914 Churchill made a further speech in Bradford in England, calling for a peaceful solution to the escalating situation in Ulster and arguing that the law in Ireland should be applied equally to nationalists and unionists without preference. Three decades later, this speech was widely reprinted and quoted in several socialist and nationalist publications in Northern Ireland, embarrassing the unionist establishment by highlighting their erstwhile hostility to the most prominent icon of the British war effort. Churchill’s ignominious retreat from Belfast in 1912 was also raised by pamphleteers and politicians who sought to exploit a perceived hypocrisy in the unionist government’s professed support for the British war effort as it sought to suppress dissent within the province. One socialist pamphlet attacked unionists by arguing that “The Party which denied freedom of speech to a member of the British Government before it became the Government of Northern Ireland is not likely to worry overmuch about free speech for its political opponents after it became the Government.”
And in London in 1940 Victor Gollancz’s Left Book Club published a polemic by the Dublin-born republican activist Jim Phelan, startlingly entitled Churchill Can Unite Ireland. In this Phelan expressed hopes that Churchill’s personality itself could effect positive change in Ireland. He saw Churchill as a figure who could challenge what Phelan called “punctilio,” the adherence to deferential attitudes that kept vested interests in control of the British establishment. Phelan identified a cultural shift in Britain following Churchill’s replacement of Chamberlain as Prime Minister, characterised by a move towards plain speaking: he argued that for the first time since the revolutionary year of 1848 “people are saying and writing what they mean.”
Jim Phelan’s ideas in Churchill Can Unite Ireland were often fanciful, but they alert us to the curious patterns of debate that can be found away from more familiar British narratives of the Second World War. Here a proud Irish republican could assert his faith in a British Prime Minister with a questionable record in Ireland as capable of delivering Irish unity.
Despite publicly professed loyalty to the British war effort, unionist mistrust of the London government endured over the course of the war, partly due to Churchill’s perceived willingness to deal with Irish Taoiseach Éamon de Valera. Phelan’s book concluded with the words: “Liberty does not grow on trees; it must be fought for. Not ‘now or never’. Now.” Eerily these lines presaged the infamous telegram from Churchill to de Valera following the bombing of Pearl Harbor in December 1941, which, it is implied, offered Irish unity in return for the southern state’s entry into the war on the side of the Allies, and read in part “Now is your chance. Now or never. A Nation once again.”
As anyone who has looked at the newspapers over the festive season knows, 2015 is a bumper year for anniversaries: among them Magna Carta (800 years), Agincourt (600 years), and Waterloo (200 years). But it is January which sees the first of 2015’s major commemorations, for it is fifty years since Sir Winston Churchill died (on the 24th) and received a magnificent state funeral (on the 30th). As Churchill himself had earlier predicted, he died on just the same day as his father, Lord Randolph Churchill, had done, in 1895, exactly seventy years before.
The arrangements for Churchill’s funeral, codenamed ‘Operation Hope Not’, had long been in the planning, which meant that Churchill would receive the grandest obsequies afforded to any commoner since the funerals of Nelson and Wellington. And unlike Magna Carta or Agincourt or Waterloo, there are many of us still alive who can vividly remember those sad yet stirring events of half a century ago. My generation (I was born in 1950) grew up in what were, among other things, the sunset years of Churchillian apotheosis. They may, as Lord Moran’s diary makes searingly plain, have been sad and enfeebled years for Churchill himself, but they were also years of unprecedented acclaim and veneration. During the last decade of his life, he was the most famous man alive. On his ninetieth birthday, thousands of greeting cards were sent, addressed to ‘The Greatest Man in the World, London’, and they were all delivered to Churchill’s home. During his last days, when he lay dying, there were many who found it impossible to contemplate the world without him, just as Queen Victoria had earlier wondered, at the time of the Duke of Wellington’s death in 1852, how Britain would manage without him.
Like all such great ceremonial occasions, the funeral itself had many meanings, and for those of us who watched it on television, by turns enthralled and tearful, it has also left many memories. In one guise, it was the final act of homage to the man who had been described as ‘the saviour of his country’, and who had lived a life so full of years and achievement and honour and controversy that it was impossible to believe anyone in Britain would see his like again. But it was also, and in a rather different emotional and historical register, not only the last rites of the great man himself, but also a requiem for Britain as a great power. While Churchill might have saved his country during the Second World War, he could not preserve its global greatness thereafter. It was this sorrowful realization that had darkened his final years, just as his funeral, attended by so many world leaders and heads of state, was the last time that a British figure could command such global attention and recognition. (The turnout for Margaret Thatcher’s funeral, in 2013, was nothing like as illustrious.) These multiple meanings made the ceremonial the more moving, just as there were many episodes which made it unforgettable: the bearer party struggling and straining to carry the huge, lead-lined coffin up the steps of St Paul’s; Clement Attlee—Churchill’s former political adversary—old and frail, but determined to be there as one of the pallbearers, sitting on a chair outside the west door brought especially for him; the cranes of the London docks dipping in salute, as Churchill’s coffin was borne up the Thames from Tower Pier to Waterloo Station; and the funeral train, hauled by a steam engine of the Battle of Britain class, named Winston Churchill, steaming out of the station.
For many of us, the funeral was made the more memorable by Richard Dimbleby’s commentary. Already stricken with cancer, he must have known that this would be the last he would deliver for a great state occasion (he would, indeed, be dead before the year was out), and this awareness of his own impending mortality gave to his commentary a tone of tender resignation that he had never quite achieved before. As his son, Jonathan, would later observe in his biography of his father, ‘Richard Dimbleby’s public was Churchill’s public, and he had spoken their emotions.’
Fifty years on, the intensity of those emotions cannot be recovered, but many events have been planned to commemorate Churchill’s passing, and to ponder the nature of his legacy. Two years ago, a committee was put together, consisting of representatives of the many institutions and individuals that constitute the greater Churchill world, both in Britain and around the world, which it has been my privilege to chair. Significant events are planned for 30 January: in Parliament, where a wreath will be laid; on the River Thames, where Havengore, the ship that bore Churchill’s coffin, will retrace its journey; and at Westminster Abbey, where there will be a special evensong. It will be a moving and resonant day, and the prelude to many other events around the country and around the world. Will any other British prime minister be so vividly and gratefully remembered fifty years after his—or her—death?
Headline image credit: Franklin D. Roosevelt and Winston Churchill, New Bond Street, London. Sculpted by Lawrence Holofcener. Public domain via Wikimedia Commons.
January 2015 sees the addition of 226 biographies to the Oxford Dictionary of National Biography, offering the lives of those who have played their part in shaping British history between the late 20th and early 21st century. The sectors and professions each of these individuals influenced range from medicine to film, including Nobel Prize and Oscar winners. Explore our infographic below as we highlight a selection of these new lives: some renowned, some lesser-known, yet all significant.
The New Year brings with it a new instalment of Oxford DNB biographies which, as every January, extend the Dictionary’s coverage of people who shaped British life in the late twentieth and early twenty-first century. This January we add biographies of 226 men and women who died during 2011. These new biographies were commissioned by my predecessor as editor, Lawrence Goldman, but having recently assumed the editor’s chair, I take full and appreciative responsibility for introducing them.
The new biographies bear vivid witness to an astonishing diversity of personal experience, individual achievement, and occasional delinquency; and they range from Claude Choules (b.1901), the last British-born veteran of the First World War, who died at the age of 110, to the singer and songwriter Amy Winehouse (b.1983), who died from alcohol poisoning aged just twenty-seven. The great majority of the people whose biographies are now added (191, or 84%) were born before the outbreak of the Second World War, and the majority (137, or 60%) were born before 1930. Typically, therefore, most were active between the 1940s and the 1980s, but some (such as Choules) are included for their activities before 1918, and several (such as Winehouse, or the anti-war campaigner, Brian Haw) only came to prominence in the 2000s.
The lives of Choules and Winehouse—the one exceptionally long, the other cut tragically short—draw attention to two of the most significant groups to be found in this new selection. A generation after Choules, many Britons served bravely during the Second World War, among them the SOE veteran Nancy Wake who led a group of resistance fighters and who killed German soldiers with her bare hands; SOE’s French section sent 39 women agents into France during the war, and Wake was undoubtedly among the toughest and most redoubtable. Her fellow SOE officer, Patrick Leigh Fermor, is best known for his capture, on Crete, of the German officer General Kreipe—an event that was retold in the film Ill Met by Moonlight (1957). In March 1942 Leslie Audus was captured by the Japanese and put to work in a slave labour camp. There he employed his skills as a botanist to create a nutritional supplement from fermented soya beans, saving him and hundreds of his fellow prisoners from starvation. After the war, Audus enjoyed a distinguished scientific career though, with great modesty, he made little of his remarkable prison work, which remained known only to former captives who owed him their lives.
The troubled creative life of our latest-born person, Amy Winehouse, is representative of a second significant group of lives to emerge from our new set of biographies. These were the entertainers for whom the celebrity-devouring world of show business was a place of some highs but ultimately of disenchantment and disappointment. Forty years before Winehouse came to public attention, the singer Kathy Kirby enjoyed a glittering career. Ubiquitous in the early 1960s with hit after hit, she was reputedly the highest-paid female singer of her generation. However, she failed to adapt to the rise of rock’n’roll, and soon spiralled into drug and alcohol abuse, bankruptcy, and psychiatric problems. The difficulties Kathy Kirby experienced bear similarities to those of the Paisley-born songwriter Gerry Rafferty, best known for his hit single ‘Baker Street’, which deals with loneliness in a big city; Rafferty too found fame hard to cope with, and eventually succumbed to alcoholism.
Of course, not all encounters with modern British popular culture were so troubled. One of the longest biographies added in this new update is that of the actress Elizabeth Taylor who shot to stardom in National Velvet (1944) and remained ever after a figure of international standing. While Taylor’s private life garnered almost as much attention as her screen roles, she’s also notable for pioneering the now popular association between celebrity and charitable causes—in Taylor’s case for charities working to combat HIV/AIDS. To that of Elizabeth Taylor we can also add other well-known names, among them Lucian Freud—by common consent the greatest British artist of his day, whose depictions of human flesh are unrivalled in their impact and immediacy; the journalist and author Christopher Hitchens, who made his career in the US; Ken Russell, the enfant terrible of British cinema; and the dramatist Shelagh Delaney, best known for her play, A Taste of Honey (1958).
In addition to documenting the lives, and legacies, of well-known individuals—such as Freud, Hitchens, and Delaney—it’s also the purpose of each January update of the ODNB to include people of real historical significance who did not make the headlines. In creating a rounded picture of those who’ve shaped modern Britain, we’re helped enormously by more than 400 external specialists. Divided into specialist panels—from archaeology and broadcasting to the voluntary sector and zoology—our advisers recommend people for inclusion from long lists of possible candidates. And it’s their insight that ensures we provide biographies of many less familiar figures responsible for some truly remarkable achievements. Here is just one example. Leslie Collier was a virologist who, in the 1960s, developed a heat-stable vaccine for smallpox which made possible a mass vaccination programme in Africa and South America. The result was the complete eradication of smallpox as proclaimed by the World Health Organization in 1980. How many figures can claim to have abolished what was once a terrifying global disease?
Whether long or short, good or bad, exemplary or tragic, or something more nuanced and complex in-between, the 226 new biographies now added to the Oxford DNB make fascinating—and sometimes sobering—reading.
Featured image credit: Amy Winehouse, singing, by NRK P3. CC-BY-NC-SA-2.0 via Flickr.
Two hundred years ago American and British delegates signed a treaty in the Flemish town of Ghent to end a two-and-a-half-year conflict between the former colonies and mother country. Overshadowed by the American Revolution and Napoleonic Wars in the two nations’ historical memories, the War of 1812 has been somewhat rehabilitated during its bicentennial. Yet arguing for the importance of a status quo antebellum treaty that concluded a war in which neither belligerent achieved its war aims, no territory was exchanged, and no victor formally declared can be a tough sell. Compared to the final defeat of Napoleon at the Battle of Waterloo, fought just a few months later and forty-odd miles down the road from Ghent, the end of the War of 1812 admittedly lacked cinema-worthy drama.
But the Treaty of Ghent mattered enormously (and not just to historians interested in the War of 1812). The war it ended saw relatively light casualties, measured in the thousands compared to the millions who died in the French Revolutionary and Napoleonic Wars that raged across the rest of the globe. Nevertheless, for the indigenous and colonizing peoples that inhabited the borderlands surrounding the United States, the conflict had proved devastating. Because the American and British economies were intertwined, the war had also wreaked havoc on American agriculture and British manufacturing, and the two nations had wrecked each other’s merchant navies. Moreover, public support for the war in the British Empire and the United States had been lukewarm, with plenty of outspoken opponents who had worked tirelessly first to prevent and then to quickly end the war.
Not surprisingly, peace resulted in widespread celebration on both sides of the Atlantic. The Leeds Mercury, many of whose readers were connected to the manufacturing industries that had relied on American markets, even compared the news to the Biblical account of the angelic chorus’s announcement of the birth of Jesus: “This Country, thanks to the good Providence of God, is now at Peace with Europe, with America, and with the World. . . . There is at length ‘Peace on Earth,’ and we trust the revival of ‘Good-will among men’ will quickly follow the close of national hostilities.” When the treaty reached Washington for ratification, President James Madison and Congress fell over themselves in a rush to sign it.
Far more interesting than what the relatively brief Treaty of Ghent includes is what was left out. When the British delegation arrived at Ghent in August 1814, they had every possible advantage. Britain had won the naval war, the United States was on the brink of bankruptcy, and the end of Britain’s war with France meant that hardened veterans were being deployed for an imminent invasion of the United States. Later that month British troops would humiliatingly burn Washington. Even Ghent itself was a home field advantage, as it was occupied by British troops and within a couple of days’ communication with ministers in London.
In consequence, Britain’s initial demands were severe. If the United States wanted peace, it had to cede 250,000 square miles of its northwestern lands (amounting to more than 15% of US territory, including all or parts of the modern states of Michigan, Illinois, Indiana, Ohio, Missouri, Iowa, Wisconsin, and Minnesota). These lands would be used to create an independent American Indian state—promises of which the British had used to recruit wary Indian allies. Britain also demanded a new border for Canada, which included the southern shores of the Great Lakes and a chunk of British-occupied Maine—changes that would have given Canada considerable natural defenses. The Americans, claimed the British, were “aggrandizers”, and these measures would ensure that such ambitions would be forever thwarted.
The significance of the terms is difficult to overestimate. Western expansion would have ground to a halt in the face of a powerful British-led alliance with the Spanish Empire and the new American Indian state. The humiliation would likely have resulted in the collapse of the United States. The long-marginalized New England Federalists had been outspoken in their opposition to the war and to President James Madison’s Southern-dominated Republican Party, with some of their leaders openly threatening secession. The island of Nantucket had already signed a separate peace with Britain, and many inhabitants of British-occupied Maine had signed oaths of allegiance to Britain. The Governor of Massachusetts had even sent an agent to Canada to discuss terms of British support for his state’s secession, which included a request for British troops. The counterfactuals of a New England secession are too great to explore here, but the implications are epic—not least because, unlike in 1861, the US government in 1814 was in no position to stop one. In the end, a combination of the American delegates’ obstinacy and a rapidly fading British desire to keep the nation on an expensive war footing solely to fight the Americans led the British to abandon their harsh terms.
In consequence, the Treaty of Ghent cemented the United States rather than destroying it. Historians have long debated who truly won the war. However, what mattered most was that neither side managed a decisive victory. The Americans lacked the organization and national unity to win; the British lacked the will to wage an expensive, offensive war in North America. American inadequacy ensured that all of Canada would prosper as part of the British Empire, even though Upper Canada (now Ontario) had arguably closer links to the United States and was populated largely by economic migrants from that country. British desire to avoid further confrontation enabled the United States to focus its attention on eliminating the other, and considerably weaker, obstacles to continental supremacy: the American Indians and the remnants of the Spanish Empire, who proved to be the real losers of the War of 1812 and the Treaty of Ghent.
Featured image: The Signing of the Treaty of Ghent, Christmas Eve, 1814, by Amédée Forestier (1914). Public domain via Wikimedia Commons.
By December 1914 the Great War had been raging for nearly five months. If anyone had really believed that it would be ‘all over by Christmas’ then it was clear that they had been cruelly mistaken. Soldiers in the trenches had gained a grudging respect for their opposite numbers. After all, they had managed to fight each other to a standstill.
On Christmas Eve there was a severe frost. From the perspective of the freezing-cold trenches the idea of the season of peace and goodwill seemed surrealistic. Yet parcels and Christmas gifts began to arrive in the trenches and there was a strange atmosphere in the air. Private William Quinton was watching:
We could see what looked like very small coloured lights. What was this? Was it some prearranged signal and the forerunner of an attack? We were very suspicious, when something even stranger happened. The Germans were actually singing! Not very loud, but there was no mistaking it. Suddenly, across the snow-clad No Man’s Land, a strong clear voice rang out, singing the opening lines of “Annie Laurie”. It was sung in perfect English and we were spellbound. To us it seemed that the war had suddenly stopped! Stopped to listen to this song from one of the enemy.
On Christmas Day itself, in some sectors of the line, there was no doubting the underlying friendly intent. Yet the men who took the initiative in starting a truce were brave – or foolish – as was witnessed by Sergeant Frederick Brown:
Sergeant Collins stood waist high above the trench waving a box of Woodbines above his head. German soldiers beckoned him over, and Collins got out and walked halfway towards them, in turn beckoning someone to come and take the gift. However, they called out, “Prisoner!” A shot rang out, and he staggered back, shot through the chest. I can still hear his cries, “Oh my God, they have shot me!”
This was not a unique incident. Yet, despite the obvious risks, men were still tempted. Individuals would climb out of the trench, then dive back in, gradually becoming bolder, as Private George Ashurst recalled:
It was grand, you could stretch your legs and run about on the hard surface. We tied an empty sandbag up with its string and kicked it about on top – just to keep warm of course. We did not intermingle. Part way through we were all playing football. It was so pleasant to get out of that trench from between them two walls of clay and walk and run about – it was heaven.
The idea that football matches were played between the British and Germans in No Man’s Land has taken a grip, but the evidence is intangible.
The truce was not planned or controlled – it just happened. Even senior officers recognised that there was little that could be done in this strange state of affairs. Brigadier General Lord Edward Gleichen accepted the truce as a fait accompli, but was keen to ensure that the Germans did not get too close to the ramshackle British trenches:
They came out of their trenches and walked across unarmed, with boxes of cigars and seasonable remarks. What were our men to do? Shoot? You could not shoot unarmed men. Let them come? You could not let them come into your trenches; so the only thing feasible was done – and our men met them half-way and began talking to them. Meanwhile our officers got excellent close views of the German trenches.
Another practical reason for embracing the truce was the opportunity it presented for burying the dead that littered No Man’s Land. Private Henry Williamson was assigned to a burial party:
The Germans started burying their dead which had frozen hard. Little crosses of ration box wood nailed together and marked in indelible pencil. They were putting in German, ‘For Fatherland and Freedom!’ I said to a German, “Excuse me, but how can you be fighting for freedom? You started the war, and we are fighting for freedom!” He said, “Excuse me English comrade, but we are fighting for freedom for our country!”
It should be noted that the truce was by no means universal, particularly where the British were facing Prussian units.
For the vast majority of the participants, the truce was a matter of convenience and maudlin sentiment. It did not mark some deep flowering of the human spirit, or signify political anti-war emotions taking root amongst the ranks. The truce simply enabled them to celebrate Christmas in a freer, more jovial, and, above all, safer environment, while satisfying their rampant curiosity about their enemies.
The truce could not last: it was a break from reality, not the dawn of a peaceful world. The gradual end mirrored the start, for any misunderstandings could cost lives amongst the unwary. For Captain Charles Stockwell it was handled with a consummate courtesy:
At 8.30am I fired three shots in the air and put up a flag with ‘Merry Christmas!’ on it, and I climbed on the parapet. He put up a sheet with, ‘Thank you’ on it, and the German captain appeared on the parapet. We both bowed and saluted and got down into our respective trenches – he fired two shots in the air and the war was on again!
In other sectors, the artillery behind the lines opened up and the bursting shells soon shattered the truce.
War regained its grip on the whole of the British sector. When it came to it, the troops went back to war willingly enough. Many would indeed have rejoiced at the end of the war, but they were still willing to accept orders, still willing to kill Germans. Nothing had changed.
The light in the Orkneys is so clear, so bright, so lucid, it feels like you are on top of the world looking through thin clouds into heaven.
It doesn’t even feel part of the UK: when you sail off the edge of Scotland by the Scrabster to Stromness ferry, you feel you are departing the real world to land in a magical realm.
Nowhere else on earth can you see eight thousand years of continuous history in such a tiny space.
Skara Brae is what remains of a neolithic village, older than Stonehenge and the pyramids, kept secret underground until uncovered by a severe storm in 1850. You can walk in and sit down, look around at the stone walls, stone beds, stone cupboards, dressers, seats, and storage boxes. Recognizably human people lived here, seeing this same landscape and coast, feeling the same wind on their faces that you do, their eyes resting on the doors, hearths and toilets (one in each dwelling).
This is the ‘stone age’, but talk of such ages is a misnomer in the Orkneys, which had no appreciable bronze age or iron age and so proceeded from the non-use of one metal to the non-use of another in what is now the best preserved neolithic site in Europe.
The Orkneys have been so fascinating for so long that even the vandalism needs to be preserved. In Maeshowe burial mound you can see where Viking tourists who came to the monument, already ancient by their time, wrote graffiti about their girlfriends on the walls. They wrote in Norse runes.
The Orkney islands were the headquarters of the Viking invasion fleets, and to this day the Orkneys are the only place in the world besides Norway where the Norwegian national day is celebrated.
The islands are filled with Tolkienesque place names like the Ring of Brodgar, the Brough of Birsay, and the Standing Stones of Stenness. Sagas were born here, like that of the peaceable 12th-century Earl of Orkney, treacherously assassinated and now known as St Magnus, after whom the cathedral is named.
Sagas were created here in living memory. This is where the British home fleet was at anchor and the German fleet still lies. The battle fleet of the German Imperial Navy transferred in its entirety to Scapa Flow in 1919 to await a decision on its future. The German sailors could not bring themselves to give up their ships; they opened the seacocks and scuttled them all. At low tide you can still see the rusting hulks of Wilhelmine ambitions to dominate Europe.
If the Orkneys sound bleak and rocky, that would be the wrong impression to leave. They have rich and fertile farming land with green plains rolling on under a pearl sky. People tell folk tales around the peat fires, drinking ginger-flavoured whisky; an orange cat pads around the grain heaps in the Highland Park distillery, and the islands shimmer under the ‘simmer dim’ of nightless summer days. I should be there now.
Like much historical research, my chapter in the Britain’s Soldiers collection came about more or less by accident. It relates to an incident that I discovered in the War Office papers in 2007. I was taking a group of History students from the University of Northampton to The National Archives in Kew, to help them with their undergraduate dissertations. I had a small amount of time to do some of my own research, so I ordered WO 43/404. The title sounded promising: ‘A book containing copies of correspondence in 1761, relative to the Behaviour of the Duke of Richmond’s regiment & Militia at Stamford on the 15th April 1761’.
What arrived was a letter book. It immediately struck me as being unusual, as the War Office usually just kept in-letters, whereas this book copied both sides of a lengthy correspondence between the Secretary at War and various other protagonists. Something noteworthy had clearly occurred, and was therefore preserved, possibly as a precedent to inform future action. I was pushed for time, however, so I quickly photographed the whole book and returned to my students.
Four years later I finally had an opportunity to transcribe the letters. What emerged was a bizarre event. In brief, the Lincolnshire Militia was quartered in Stamford and a regiment of the regular army that was on the march approached the town. The regulars had such contempt for the militia that they marched straight in, disregarding the usual military convention that they send advance word, and proceeded to refuse any of the other courtesies that the militia attempted to offer them. The militia’s commander took such umbrage at these slights that he posted sentries at entrances to the town, ‘to prevent any armed Troops entering the Town for the future without my knowledge and consent’. When further regulars attempted to enter the town, the militia stopped them at the point of their bayonets and a fight ensued in which men were injured, and could have been killed.
This was more than a local scrap. Neither side would admit fault and so wrote to the War Office to intercede. Despite the fact that Britain was in the midst of the Seven Years War – the largest global conflict that Britain had then fought – the Secretary at War took the incident very seriously indeed, and the letter book records how the fallout preoccupied him for a further two months. The dispute even drew in the King himself, who as Commander in Chief was keen to preserve ‘that Equality and Harmony in Service which is so much to be wished and cultivated’.
I was intrigued by the story that emerged from this letter book, and a paper trail ensued as I attempted to flesh it out with other sources. My attempts to find references to the affair in public sources such as newspapers drew a blank, but I had more luck in private family papers and local government records. This was not an incident that was publicly known about at the time, which perhaps explains why historians had overlooked it.
As a cultural historian, what drew me to this incident was that it was an example of people behaving oddly. It is often only when people are behaving abnormally that we get an insight into the normal expectations of behaviour that go unspoken – in this case, attitudes towards military precedence and masculine honour. I think that incidents like the one at Stamford in 1761 highlight the crucial significance of ‘polite’ and controlled conduct in the eighteenth-century military. To our eyes, the interpersonal conduct of Georgian army officers may seem terribly mannered – but this is clearly desirable when you have large numbers of men who are highly defensive of their status and armed to the teeth. Rather than just reconstructing micro-incidents like this for their own sake, therefore, it is helpful to think about them more anthropologically in order to shed light on the workings of a past society.
Headline image credit: Battle of Bunker Hill by E. Percy Moran. Public domain via Wikimedia Commons.
Two hundred years ago last Friday the owner of the London Times, John Walter II, is said to have surprised a room full of printers who were preparing hand presses for the production of that day’s paper. He showed them an already completed copy of the paper and announced, “The Times is already printed – by steam.” The paper had been printed the night before on a steam-driven press, and without their labor. Walter anticipated and tried to mitigate the shock and unrest with which this news was met by the now-idled printers. It was one of many scenes of change and conflict in early industrialization where the hand was replaced by the machine. Similar scenes of hand labor versus steam entered into cultural memory, from Romantic poetry about framebreaking Luddites to John Henry’s hand-hammering race against the steam drill.
There were many reasons to celebrate the advent of the steam press in 1814, as well as reasons to worry about it. Steam printing brought the cost of printing down, increased the number of possible impressions per day by four times, and, in a way, we might say that it helped “democratize” access to information. That day, the Times proclaimed that the introduction of steam was the “greatest improvement” to printing since its very invention. Further down that page, which itself was “taken off last night by a mechanical apparatus,” we read why the hand press printers might have been concerned: “after the letters are placed by the compositors… little more remains for man to do, than to attend upon, and watch this unconscious agent in its operations.”
Moments of technological change do indeed put people out of work. My father, who worked at the Buffalo News for nearly his entire career, often told me about layoffs or fears of layoffs coming with the development of new computerized presses, print processes, and dwindling markets for print. But the narrative of the hand versus the machine, or of the movement from the hand to the machine, obscures a truth about labor, especially information labor. Forms of human labor are replaced (and often quantifiably reduced), but they are also rearranged, creating new forms of work and social relations around them. We would do well to avoid the assumption that no one worked the steam press once hand presses went mostly idle. As information production and circulation become more technologically abstracted from the hands of workers, there is an increased tendency to assume that no labor is behind them.
Two hundred years after the morning when the promise of faster, cheaper, and more accessible print created uncertainty among the workers who produced it, I am writing to you using an Apple computer made by workers in Shenzhen, China with metals mined all over the global South. The software I am using to accomplish this task was likely written and maintained by programmers in India managed by consultants in the United States. You are likely reading this on a similar device. Information has been transmitted between us via networks of wires, servers, cable lines, and wireless routers, all with their own histories of people who labor. If you clicked over here from Facebook, a worker in a cubicle in Manila may have glanced over this link among thousands of others while trying to filter out content that violates the social network’s terms of service. Technical laborers, paid highly or almost nothing at all, and working under a range of conditions, are silently mediating this moment of exchange between us. Though they may no longer be hand-pressed, the surfaces on which we read and write are never too distant from the hands of information workers.
Like research in book history and print culture studies, the common appearance of a worker’s hand in Google Books reminds us that, despite radical changes in technology over centuries, texts are material objects and are negotiated by numerous people for diverse purposes, only some of which we would call “reading” proper. The hand pulling the lever of a hand press and the hand turning pages in a scanner may be representative of two poles on a two-century timeline, but, for me, they suggest many more continuities between early print and contemporary digital cultures than ruptures. John Walter II’s proclamation on 28 November 1814 was not a turn away from a Romantic past of artisanal labor toward a bleak and mechanized future. Rather, it was an early moment in an ongoing struggle to create and circulate words and images to ever more people while also sustaining the lives of those who produce them. Instead of assuming, two hundred years on, that we have been on a trajectory away from the hand, we must continue looking for and asking about the conditions of the hand in the machine.
Headline image credit: Hand of Google, by Unknown CC BY-SA 3.0 via Wikimedia Commons.
One of the ironies of the Scottish independence referendum is that Scotland is widely recognised to be a changed place despite the majority voting in favour of the union. It became clear during the course of 2014 that something significant was happening. Scotland witnessed levels of public engagement and debate never before seen. Hugh MacDiarmid’s ‘Glasgow 1960’ comes to mind. Returning to Glasgow ‘after long exile’, MacDiarmid’s narrator encounters packed trams heading for Ibrox, the home of Rangers football club, but discovers that the crowds are going to listen to a debate between ‘Professor MacFadyen and a Spainish pairty’ and that newspapers with headlines ‘Special! Turkish Poet’s Abstruse New Song’ were selling ‘like hot cakes’.
The Scottish Question may not have been debated on quite so elevated a level but debates were conducted the length and breadth of Scotland in a remarkably civil, engaging, and open manner. Those who sought to portray these debates as something sinister could do no better than refer to a professional politician who had an egg thrown at him while he addressed meetings on top of an Irn Bru crate. The dull, limited, predictable, binary debate of the conventional press contrasted with the expansive, lively, and engaging discussions that took place in often novel venues in every nook and cranny of Scotland. The Scottish Question, as debated by the public, was not restricted to a narrow constitutional question but became a genuine dialogue about what kind of place Scotland should seek to become. The referendum started a process that has not been halted by the outcome of a referendum on whether Scotland should become an independent country, the formal question that provoked this all-embracing national conversation.
The result of the referendum, and the reaction to it, have been in stark contrast to the referendum on devolution 35 years ago. In 1979, Scots had narrowly voted for a very limited form of devolution – 51.6% in favour on a turnout of 63.7% – but the measure on offer was not implemented as it failed to achieve the weighted majority demanded by Parliament at Westminster. The expectation in the run-up to that referendum had been that a decisive majority would vote for devolution. The slight numeric majority hid a defeat in expectations. Expectations were very different in the months leading up to September 18th this year. Early in 2014, opponents of independence thought that they might push support for independence below 30%, and were still convinced that it would win less than 40% only a few weeks before Scots went to vote. In the event, 55.3% voted for the union on a record turnout of 84.6%, but it is the 45% that has been celebrated as a victory. It is the membership of the Yes parties that has increased dramatically, with the membership of the Scottish National Party now dwarfing that of the other Scottish parties. With just under 100,000 members, the SNP can claim to be the only mass party in the UK today. Politics is an expectations game, and supporters of independence knew that they had a ‘mountain to climb’, in the words of the chair of the official Yes campaign.
As opinion polls narrowed towards the end of the campaign, a ‘Vow’ was signed by the three main UK party leaders promising substantially more devolution while protecting Scotland’s share of public spending. This means that even the debate around the narrowed constitutionalist understanding of the Scottish Question will continue. More powers will be delivered with ramifications for the rest of the United Kingdom. Scotland is a changed place but an answer to the Scottish Question remains as elusive as ever.
Headline image credit: Glencoe, Scotland panorama by Gil Cavalcanti. CC BY-SA 3.0 via Wikimedia Commons.
Autumn is here again – in England, the season of mists and mellow fruitfulness, in the US also the season of Thanksgiving. On the fourth Thursday in November, schoolchildren across the country will stage pageants, some playing black-suited Puritans, others Native Americans bedecked with feathers. By tradition, Barack Obama will ‘pardon’ a turkey, but 46 million others will be eaten in a feast complete with corncobs and pumpkin pie. The holiday has a long history: Lincoln fixed the date (amended by Roosevelt in 1941), and Washington made it a national event. Its origins, of course, lay in the Pilgrim Fathers’ first harvest of 1621.
Who now remembers who these intrepid migrants were – not the early ‘founding fathers’ they became, but who they were when they left? The pageant pilgrims are undifferentiated. Who knows the name of Christopher Martin, a merchant from Billericay near Chelmsford in Essex? He took his whole family on the Mayflower, most of whom, including Martin himself, perished in New Plymouth’s first winter. They died Essex folk in a strange land: there was nothing ‘American’ about them. And as for Thanksgiving, well that habit came from the harvest festivals and religious observances of Protestant England. Even pumpkin pie was an English dish, exported then forgotten on the eastern side of the Atlantic.
Towns like Billericay, Chelmsford and Colchester were crucial to American colonization: ordinary places that produced extraordinary people. The trickle of migrants of the 1620s became a flood in the next decade, leading to some remarkable transformations. In 1630 Francis Wainwright was drawing ale and clearing pots in a Chelmsford inn when his master, Alexander Knight, decided to emigrate to Massachusetts. It was an age of austerity, of bad harvests and depression in the cloth industry. Moreover, those who wanted the Protestant Reformation to go further – Puritans – feared that under Charles I it was slipping backwards. Many thought they would try their luck elsewhere until England’s fortunes were restored, perhaps even that by building a ‘new’ England they could help with this restoration. Wainwright, aged about fourteen, went with Knight, and so entered a world of hardship and danger and wonder.
One May dawn, seven years later, Wainwright was standing by the Mystic River in Connecticut, one of seventy troops waiting to shoot at approaching Pequot warriors. According to an observer, the Englishmen ‘being bereaved of pity, fell upon the work without compassion’, and by dusk 400 Indians lay dead in their ruined encampment. The innkeeper’s apprentice had fired until his ammunition was exhausted, then used his musket as a club. One participant celebrated the victory, remarking that English guns had been so fearsome, it was ‘as though the finger of God had touched both match and flint’. Another rejoiced that providence had made a ‘fiery oven’ of the Pequots’ fort. Wainwright took two native heads home as souvenirs. Unlike many migrants, he stayed in America, proud to be a New Englander, English by birth but made different by experience. He lived a long life in commerce, through many fears and alarms, and died at Salem in 1692 during the white heat of the witch-trials.
The story poses hard historical questions. What is identity, and how does it change? Thanksgiving pageants turn Englishmen into Americans as if by magic; but the reality was more gradual and nuanced. Recently much scholarly energy has been poured into understanding past emotions. We may think our emotions are private, but they leak out all the time; we may even use them to get what we want. Converted into word and deed, emotions leave traces in the historical record. When the Pilgrim William Bradford called the Pequot massacre ‘a sweet sacrifice’, he was not exactly happy but certainly pleased that God’s will had been done.
Puritans are not usually associated with emotion, but they were deeply sensitive to human and divine behaviour, especially in the colonies. Settlers were proud to be God’s chosen people – like Israelites in the wilderness – yet pride brought shame, followed by doubt that God liked them at all. Introspection led to wretchedness, which was cured by the Holy Spirit, and they were back to their old censorious selves. In England, even fellow Puritans thought they’d lost the plot, as did most (non-Puritan) New Englanders. But godly colonists established what historians call an ‘emotional regime’ or ‘emotional community’ in which their tears and thunder were not only acceptable but carried great political authority.
John Winthrop, the leader of the fleet that carried Francis Wainwright to New England, was an intensely emotional man who loved his wife and children almost as much as he loved God. Gaunt, ascetic and tirelessly judgmental, he became Massachusetts Bay Colony’s first governor, driven by dreams of building a ‘city upon a hill’. It didn’t quite work out: Boston grew too quickly, and became diverse and worldly. And not everyone cared for Winthrop’s definition of liberty: freedom to obey him and his personal interpretation of God’s designs. But presidents from Reagan to Obama have been drawn to ‘the city upon the hill’ as an emotionally potent metaphor for the US in its mission to inspire, assist, and police the world.
Winthrop’s feelings, however, came from and were directed at England. His friend Thomas Hooker, ‘the father of Connecticut’, cut his teeth as a clergyman in Chelmsford when Francis Wainwright lived there. Partly thanks to Wainwright, one assumes, he found the town full of drunks, with ‘more profaneness than devotion’. But Hooker ‘quickly cleared streets of this disorder’. The ‘city upon the hill’, then, was not a blueprint for America, but an exemplar to help England reform itself. Indeed, long before the idea was associated with Massachusetts, it related to English towns – notably Colchester – that aspired to be righteous commonwealths in a country many felt was going to the dogs. Revellers did not disappear from Chelmsford and Colchester – try visiting on a Saturday night – but, as preachers and merchants and warriors, their people did sow the seeds from which grew the most powerful nation in the world.
So if you’re celebrating Thanksgiving this year, or you know someone who is, it’s worth remembering that the first colonists to give thanks were not just generic Old World exiles, uniformly dull until America made them special, but living, breathing emotional individuals with hearts and minds rooted in English towns and shires. To them, the New World was not an upgrade on England: it was a space in which to return their beloved country to its former glories.
Featured image credit: Signing of the Constitution, by Thomas P. Rossiter. Public domain via Wikimedia Commons
Alan Mathison Turing (1912-1954) was a mathematician and computer scientist, remembered for his revolutionary Automatic Computing Engine, on which the first personal computer was based, and his crucial role in breaking the Enigma code during the Second World War. He continues to be regarded as one of the greatest scientists of the 20th century.
We live in an age that Turing both predicted and defined. His life and achievements are starting to be celebrated in popular culture, largely with the help of the newly released film The Imitation Game, starring Benedict Cumberbatch as Turing and Keira Knightley as Joan Clarke. We’re proud to publish some of Turing’s own work in mathematics, computing, and artificial intelligence, as well as numerous explorations of his life and work. Use our interactive Enigma Machine below to learn more about Turing’s extraordinary achievements.
Image credits: (1) Bletchley Park Bombe by Antoine Taveneaux. CC-BY-SA-3.0 via Wikimedia Commons. (2) Alan Turing Aged 16, Unknown Artist. Public domain via Wikimedia Commons. (3) Good question by Garrett Coakley. CC-BY-SA 2.0 via Flickr.
Remembrance Day is a memorial day observed in Commonwealth of Nations member states since the end of the First World War to remember those who have died in the line of duty. It is observed by a two-minute silence on the ’11th hour on the 11th day of the 11th month’, in accordance with the armistice signed by representatives of Germany and the Entente on 11 November 1918. The First World War officially ended with the signing of the Treaty of Versailles on 28 June 1919. In the UK, Remembrance Sunday occurs on the Sunday closest to 11 November, and is marked by ceremonies at local war memorials in most villages, towns, and cities. The red poppy has become a symbol for Remembrance Day due to the poem In Flanders Fields, by Lieutenant Colonel John McCrae.
You can discover more about the history behind the First World War by exploring the free resources included in the interactive image above.
Feature image credit: Poppy Field, by Martin LaBar. CC-BY-NC-2.0 via Flickr.
Time passes quickly. As we track the progression of events hundred years ago on the Western Front, the dramas flash by. In the time it takes to answer an e-mail the anniversary of another battle has come and gone.
We have celebrated the fumbling British skirmishes at Mons and Le Cateau in late August, but largely forgotten the French triumph at the Battle of the Marne which first stemmed and threw back the German wheeling attack through Belgium into Northern France under the Schlieffen Plan. We have already bypassed the spirited Franco-British attempts at the Battle of the Aisne in September to take the Chemin des Dames. The Race to the Sea was under way: the British and German Armies desperately trying to turn their enemy’s northern flank.
Throughout, the performance of the British Expeditionary Force has often been exaggerated. Imaginative accounts of Germans advancing in massed columns and being blown away by rapid rifle fire are common. A rather more realistic assessment is that the British infantry were steadfast enough in defence, but unable to function properly in coordination with their artillery or machine guns. The Germans seemed to have a far better grip of the manifold disciplines of modern warfare.
Yet everything changed in October. The Germans were scraping the barrel for manpower and decided to throw new reserve formations into the battle: young men with the minimum of training, incapable of sophisticated battle tactics. They were marched forward in a last gambler’s throw of the dice to try to break through to the Channel ports. To do that they needed first to capture the small Belgian city of Ypres.
One might have thought that Ypres was some fabled city, fought over to secure untold wealth or a commanding tactical position. Nothing could be further from the truth. Ypres was just an ordinary town, lying in the centre of the fertile Western Flanders plain. Yet the low ridges to the east represented one of the last feasible lines of defence. The British also saw the town, not as an end in itself, but as a stepping stone to more strategically important locations pushing eastwards, such as the rail centre at Roulers or the ports of Ostend and Zeebrugge. For both sides Ypres was on the road to somewhere.
The battle began in mid-October and soon began to boil up. Time and time again the Germans hurled themselves forward, the grey-green hordes pressing on and being shot down in their hundreds. The British had learnt many lessons and this was where they finally proved themselves worthy adversaries for the German Army. On the evening of 23 October young Captain Harry Dillon was fighting for his life:
A great grey mass of humanity was charging, running for all God would let them, straight on to us not 50 yards off. Everybody’s nerves were pretty well on edge as I had warned them what to expect, and as I fired my rifle the rest all went off almost simultaneously. One saw the great mass of Germans quiver. In reality some fell, some fell over them, and others came on. I have never shot so much in such a short time, could not have been more than a few seconds and they were down. Suddenly one man – I expect an officer – jumped up and came on. I fired and missed, seized the next rifle and dropped him a few yards off. Then the whole lot came on again and it was the most critical moment of my life. Twenty yards more and they would have been over us in thousands, but our fire must have been fearful, and at the very last moment they did the most foolish thing they possibly could have done. Some of the leading people turned to the left for some reason, and they all followed like a great flock of sheep. We did not lose much time, I can give you my oath. My right hand is one huge bruise from banging the bolt up and down. I don’t think one could have missed at the distance and just for one short minute or two we poured the ammunition into them in boxfuls. My rifles were red hot at the finish. The firing died down and out of the darkness a great moan came. People with their arms and legs off trying to crawl away; others who could not move gasping out their last moments with the cold night wind biting into their broken bodies and the lurid red glare of a farm house showing up clumps of grey devils killed by the men on my left further down. A weird awful scene; some of them would raise themselves on one arm or crawl a little distance, silhouetted as black as ink against the red glow of the fire. [p. 287-288, Fire & Movement, by Peter Hart]
Some of the Germans had got within 25 yards of Dillon’s line. It had been a close-run thing, and after his unit was relieved later that night the French reported that some 740 German corpses littered the ground in front of his trenches. This was the real war: not a skirmish like the earlier battles, but the real thing.
The German attacks continued, followed, as day follows night, by French and British counter-attacks to restore the situation. The Germans nibbled at the Allied line but were unable to achieve anything of importance. Yet for all the sound and fury, over the next few days the front line stayed relatively static. The German troops were flagging in their efforts. After one last effort on 11 November the Germans threw in the towel. They would not break through the Allied lines in 1914. The British and French lines had held: battered, bruised, but unbroken. The First Battle of Ypres had confirmed the strategic victory gained by the French at the Marne. The German advance in the west had been blocked; if they sought victory in 1915 they would have to look to the east and attack Russia.
The 1914 campaign would prove decisive to the war. The utter failure of the Schlieffen Plan, designed to secure the rapid defeat of France, meant that Germany would be condemned to ruinous hostilities on two fronts. This was the great turning-point of the whole war. The pre-war predictions of the German strategists that they could not prevail in a long-drawn-out war against the combined forces of France and Russia proved accurate, especially when the British Empire and the United States joined the fight. The German Army fought with sustained skill and endurance, but after 1914 the odds really were stacked against them.
This year is the 250th anniversary of Horace Walpole’s The Castle of Otranto, first published on Christmas Eve 1764 as a seasonal ghost story. The Castle of Otranto is often dubbed the “first Gothic novel” because Walpole described it as a “Gothic story,” but for him the Gothic meant something very different from what it means today. While the Gothic was certainly associated with the supernatural, it was predominantly a theory of English progress rooted in Anglo-Saxon and medieval history — effectively the cultural wing of parliamentarian politics and Protestant theology. The genre of the “Gothic novel,” with all its dire associations of uncanny horror, would not come into being for at least another century. Instead, the writing that followed in the wake of Otranto was known as the German School, the ‘Terrorist System of Writing’, or even hobgobliana.
Reading Otranto today, however, it is almost impossible to forget what 250 years of Gothickry have bequeathed to our culture in literature, architecture, film, music, and fashion: everything from the great Gothic Revival design of the Palace of Westminster to none-more-black clothes for sale on Camden Town High Street and the eerie music of Nick Cave, Jordan Reyne, and Fields of the Nephilim.
And the cinema has been instrumental in spreading this unholy word. Despite being rooted in the history of the barbarian tribes who sacked Rome and the thousand-year epoch of the Dark Ages, the Gothic was also a state-of-the-art movement. Technology drove the Gothic dream, enabling, for instance, the towering spires and colossal naves of medieval cathedrals, or enlisting in nineteenth-century art and literature the latest scientific developments in anatomy and galvanism (Frankenstein), the circulation of the blood and infection (The Vampyre), or drug use and psychology (Strange Case of Dr Jekyll and Mr Hyde).
The moving image on the cinema screen therefore had an immediate and compelling appeal. The very experience of cinema was phantasmagoric — kaleidoscopic images projected in a darkened room, accompanied by often wild, expressionist music. The hallucinatory visions of Henry Fuseli and Gustave Doré arose and, like revenants, came to life.
Camera tricks, special effects, fantastical scenery, and monstrous figures combined in a new visual style, most notably in Robert Wiene’s The Cabinet of Dr Caligari (1920) and F. W. Murnau’s Nosferatu: A Symphony of Terror (1922). Murnau’s Nosferatu, the first vampire film, fed parasitically on Bram Stoker’s Dracula; it was rumored that Max Schreck, who played the nightmarish Count Orlok, was indeed a vampire himself. The horror film had arrived.
Mid-century Hollywood movie stars such as Bela Lugosi, who first played Dracula in 1931, and Boris Karloff, who played Frankenstein’s monster in the same year, made these roles iconic. Lugosi played Dracula as a baleful East European, deliberately melodramatic; Karloff was menacing in a different way: mute, brutal, and alien. Both embodied the threat of the “other”: communist Russia, as conjured up by the cinema. Frankenstein’s monster is animated by the new cinematic energy of electricity and light, while in Dracula the Count’s life and death are endlessly replayed on the screen in an immortal and diabolical loop.
It was in Britain, however, that horror films really took the cinema-going public by the throat. Britain was made for Gothic cinema: British studios such as Hammer could draw on the nation’s rich literary heritage, its crumbling ecclesiastical remains and ruins, the dark and stormy weather, and its own homegrown movie stars such as Peter Cushing and Christopher Lee. Lee in particular radiated a feral sexuality, enabling Hammer Horror to mix a heady cocktail of sex and violence on the screen. It was irresistible.
The slasher movies that have dominated international cinema since Hammer through franchises such as Hellraiser and Saw are more sensationalist melodrama than Gothic, but Gothic film does thrive and continues to create profound unease in audiences: The Exorcist, the Alien films, Blade Runner, The Blair Witch Project, and more overtly literary pictures such as Bram Stoker’s Dracula are all contemporary classics — as is Buffy the Vampire Slayer on TV.
And despite the hi-tech nature of film-making, the profound shift in the meaning of Gothic, and the gulf of 250 years, the pulse of The Castle of Otranto still beats in these films. The action of Otranto takes place predominantly in the dark in a suffocatingly claustrophobic castle and in secret underground passages. Inexplicable events plague the plot, and the dead — embodying the inescapable crimes of the past — haunt the characters like avenging revenants. Otranto is a novel of passion and terror, of human identity at the edge of sanity. In that sense, Horace Walpole did indeed set down the template of the Gothic. The Gothic may have mutated since 1764, it may now go under many different guises, but it is still with us today. And there is no escape.
Autumn 2014 marks the tenth anniversary of the publication of the Oxford Dictionary of National Biography. In a series of blog posts, academics, researchers, and editors look at aspects of the ODNB’s online evolution in the decade since 2004. Here the ODNB’s publication editor, Philip Carter, considers how an ever-evolving Dictionary is being transformed by new opportunities in digital research.
When it was first published in September 2004, the Oxford DNB brought together the work of more than 10,000 humanities scholars charting the lives of nearly 55,000 historical individuals. Collectively it captured a generation’s understanding and perception of the British past and Britons’ reach worldwide. But if the Dictionary was a record of scholarship within a particular timeframe, it was also seen from the outset as a work in progress. This is most evident in the decision to include in the ODNB every person who had appeared in the original, Victorian DNB. Doing so defined the 2004 Dictionary (to quote the entry on Colin Matthew, its founding editor) as ‘a collective account of the attitudes of two centuries: the nineteenth as well as the twentieth, the one developing organically from the other.’
In the decade since 2004 this notion of the ODNB as an organic ‘work in progress’ has gone a step further. This is seen, in part, in the continued extension of biographical coverage, both of the ‘recently deceased’ and of newly documented lives from earlier periods—as discussed in other articles in this 10th anniversary series. But in addition to new content there’s also been the evolution—in the form of corrections, revisions, amplifications, and re-appraisals—of a sizeable share of the ODNB’s 55,000 existing biographies, as new scholarship comes to light.
The need to ‘keep up’ with fresh research is not new. In 1908 the Victorian DNB was reprinted in an edition that collated the marginalia and correspondence born of several decades of reading. Thereafter, no further reprints were undertaken and later findings remained on file: information relating to the birthplace of the Quaker reformer Elizabeth Fry, for example—submitted by postcard in 1918—could not be addressed until the 2004 edition of the Dictionary. Such things are today unimaginable. Over the past ten years, and alongside the programme of new biographies, existing ODNB entries have been regularly updated online—with proposed amendments reviewed by the Dictionary’s academic editors in consultation with authors and reviewers. It’s worth remembering that today’s expectation of regular online updating is one that’s emerged in the lifetime of the published ODNB. Just 10 years ago, many saw online reference as a means of delivery, not a new entity in its own right. The expectation that scholarly online reference could and should keep in step with new research and publications (and could do so while maintaining academic standards) is one pioneered, in part, by works like the ODNB.
One consequence is that Dictionary editors now focus on conservation (just as museum or gallery curators care for items in their collection) as well as on commissioning. In doing so we draw heavily on an ever-growing range of digitized records that have become available in the lifetime of the published Dictionary. This has been a truly remarkable development in humanities research in the past 5 to 10 years. For British history we’ve seen the digitization of (to name a few): the census returns for England and Wales (to 1911); indexes of civil registration in England and Wales (births, marriages, and deaths from 1838); Scottish parish registers from about 1500; early modern wills and probates; and 300 years’ worth of national and provincial newspapers. And this just scratches the surface.
In 2004 there were many people in the ODNB for whom the biographical trail ran cold. Access to paper records alone once meant that certain individuals simply disappeared from the historical record. Of course, some lives remain puzzles. But with these newly digitized sources we’re now able to address many of the previously unknown and untraceable episodes that were scattered across the 2004 edition. A decade on we’ve added details of nearly 3000 previously unknown births, marriages, and deaths for ODNB subjects. Access to newly digitized sources also prompts more wide-ranging revisions. Take, for example, the traveller Eliza Fay (1755/6-1816), known for her Original Letters from India, whose Dictionary entry has recently doubled in length owing to new genealogical research that minutely plots a troubled personal life that led Fay to travel to India and the business ventures she maintained there.
The case of Eliza Fay reminds us that this boom in digitized resources is particularly valuable for better understanding the lives of nineteenth and twentieth-century women. As a result of multiple marriages and/or multiple name changes many such biographies are prone to obscurity. There are also many occasions when women gave false information about their age, often for professional reasons. With digital resources, and a little detective work, it’s now possible to recover these stories. One example is Valerie, Lady Meux (1852-1910), who married into one of Britain’s wealthiest brewing families. To her contemporaries, and to generations of researchers, Lady Meux appeared the epitome of high society. But recent research uncovers a very different story: that of Susie Langton, the daughter of a Devon baker who—via multiple changes of birthdate and name—worked her way into the London elite. To Susie Langton (or Lady Meux), the discovery of her true past may not have been welcome, but for modern historians it becomes a key part of her story, and a fascinating case study of late-Victorian social mobility.
A good deal of this detective work is being done from the ODNB office. But much more comes in from thousands of researchers worldwide who are also making use of digitized resources. It’s our good fortune that the ODNB online is growing up with the Who Do You Think You Are? generation—a band of genealogists from whom we’ve benefited greatly thanks to their willingness to share new information. Such discoveries obviously enhance our understanding of the ODNB’s 60,000 main subjects, but they’re similarly adding much to the Dictionary’s 300,000 ‘other’ people: the parents, children, spouses, in-laws, patrons, teachers, business partners, and lovers who also populate these biographies. Looking ahead to our second decade, we anticipate that more will be made of these hundreds of thousands of ‘extras’ in creating a richer picture of the British past—as the ODNB continues to document and add to what we know.
Autumn 2014 marked the tenth anniversary of the publication of the Oxford Dictionary of National Biography. In a series of blog posts, academics, researchers, and editors looked at aspects of the ODNB’s online evolution in the decade since 2004. In this final post of the series, Alex May—ODNB’s editor for the very recent past— considers the Dictionary as a record of contemporary history.
When it was first published in September 2004, the Oxford DNB included biographies of people who had died (all in the ODNB are deceased) on or before 31 December 2001. In the subsequent ten years we have continued to extend the Dictionary’s coverage into the twenty-first century—with regular updates recording those who have died since 2001. Of the 4300 people whose biographies have been added to the online ODNB in this decade, 2172 died between 1 January 2001 and 31 December 2010 (our current terminus)—i.e., about 220 per year of death. While this may sound a lot, the average number of deaths per year over the same period in the UK was just short of 500,000, indicating a roughly one in 2300 chance of entering the ODNB. This does not yet approach the levels of inclusion for people who died in the late nineteenth century, let alone earlier periods: someone dying in England in the first decade of the seventeenth century, for example, had a nearly three-times greater chance of being included in the ODNB than someone who died in the first decade of the twenty-first century.
‘Competition’ for spaces at the modern end of the dictionary is therefore fierce. Some subjects are certainties—prime ministers such as Ted Heath or Jim Callaghan, or Nobel prize-winning scientists such as Francis Crick or Max Perutz. There are perhaps fifty or sixty potential subjects a year about whose inclusion no-one would quibble. But there are as many as 1500 people on our lists each year, and for perhaps five or six hundred of them a very good case could be made.
This is where our advisers come in. Over the last ten years we have relied heavily on the help of some 500 people, experts and leading figures in their fields whether as scholars or practitioners, who have given unstintingly of their time and support. Advisers are enjoined to consider all the aspects of notability, including achievement, influence, fame, and notoriety. Of course, their assessments can often vary, particularly in the creative fields, but even in those it is remarkable how often they coincide.
Our advisers have also in most cases been crucial in identifying the right contributor for each new biography, whether he or she be a practitioner from the same field (we often ask politicians to write on politicians—Ted Heath and Jim Callaghan are examples of this—lawyers on lawyers, doctors on doctors, and so on), or a scholar of the particular subject area. Sadly, a number of our advisers and contributors have themselves entered the dictionary in this decade, among them the judge Tom Bingham, the politician Roy Jenkins, the journalist Tony Howard, and the historian Roy Porter.
Just as the selection of subjects is made with an eye to an imaginary reader fifty or a hundred years hence (will that reader need or want to find out more about that person?), so the entries themselves are written with such a reader in view. ODNB biographies are not always the last word on a subject, but they are rarely the first. Most of the ‘recently deceased’ added to the Dictionary have received one or more newspaper obituaries. ODNB biographies differ from newspaper obituaries in providing more, and more reliable, biographical information, as well as being written after a period of three to four years’ reflection between death and publication of the entry—allowing information to emerge and reputations to settle. In addition, ODNB lives attempt to provide an understanding of context, and a considered assessment (implicit or explicit) of someone’s significance: in short, they aim to narrate and evaluate a person’s life in the context of the history of modern Britain and the broad sweep of a work of historical reference.
The result, over the last ten years, has been an extraordinary collection of biographies offering insights into all corners of twentieth and early twenty-first century British life, from multiple angles. The subjects themselves have ranged from the soprano Elisabeth Schwarzkopf to the godfather of punk, Malcolm McLaren; the high tory Norman St John Stevas to the IRA leader Sean MacStiofáin; the campaigner Ludovic Kennedy to the jester Jeremy Beadle; and the turkey farmer Bernard Matthews to Julia Clements, founder of the National Association of Flower Arranging Societies. By birth date they run from the founder of the Royal Ballet, Dame Ninette de Valois (born in 1898, who died in 2001), to the ‘celebrity’ Jade Goody (born in 1981, who died in 2009). Mention of the latter reminds us of Leslie Stephen’s determination to represent the whole of human life in the pages of his original, Victorian DNB. Poignantly, in light of the 100th anniversary of the outbreak of the First World War, among the oldest subjects included in the dictionary are three of the ‘last veterans’, Harry Patch, Henry Allingham, and Bill Stone, who, as the entry on them makes clear, reacted very differently to the notion of commemoration and their own late fame.
The work of selecting from thousands of possible subjects, coupled with the writing and evaluation of the chosen biographies, builds up a contemporary picture of modern Britain as we record those who’ve shaped the very recent past. As we begin the ODNB’s second decade this work continues: in January 2015 we’ll publish biographies of 230 people who died in 2011 and we’re currently editing and planning those covering the years 2012 and 2013, including what will be a major article on the life, work, and legacy of Margaret Thatcher.
Links between biography and contemporary history are further evident online—creating opportunities to search across the ODNB by profession or education, and so reveal personal networks, associations, and encounters that have shaped modern national life. Online it’s also possible to make connections between people active in or shaped by national events. Searching for Dunkirk, or Suez, or the industrial disputes of the 1970s brings up interesting results. Searching for the ‘Festival of Britain’ identifies the biographies of 35 men and women who died between 2001 and 2010: not just the architects who worked on the structures or the sculptors and artists whose work was showcased, but journalists, film-makers, the crystallographer Helen Megaw (whose diagrams of crystal structures adorned tea sets used during the Festival), and the footballer Bobby Robson, who worked on the site as a trainee electrician. Separately, these new entries shed light not only on the individuals concerned but on the times in which they lived. Collectively, they amount to a substantial and varied slice of modern British national life.
Headline image credit: Harry Patch, 2007, by Jim Ross. CC-BY-SA-3.0 via Wikimedia Commons.
Today, 27 October sees the centenary of the birth of the poet, Dylan Marlais Thomas. Born on Cwmdonkin Drive, Swansea, and brought up in the genteel district of Uplands, Thomas’s childhood was suburban and orthodox — his father an aspirational but disappointed English teacher at the local grammar school.
Swansea would remain a place for home comforts. But from the mid-1930s, Thomas began a wandering life that took in London’s Fitzrovia — and in particular its pubs, the Fitzroy Tavern and the Wheatsheaf — and then (as a dysfunctionally married man) the New Forest, squalid rooms in wartime London, New Quay on Cardigan Bay, Italy, Laugharne in Carmarthenshire, and from 1950 the United States where he gained a popular student following and where he died in Manhattan, aged thirty-nine.
For all his wanderings, few of Thomas’s poems were written outside Wales. Indeed, half of the published poems for which he is known were written, in some form, while he was living at home in Swansea between 1930 and 1934. As Paul Ferris, his Oxford DNB biographer writes, “commonplace scenes and characters from childhood recur in his writing: the park that adjoins Cwmdonkin Drive; the bay and sands that were visible from the windows; a maternal aunt he visited” — the latter giving rise to one of Thomas’s best-known poems, “Fern Hill.” In literary London, and in numerous bar rooms thereafter, Thomas’s “drinking and clowning were indispensable to him, but they were only half the story; ‘I am as domestic as a slipper’ he once observed, with some truth.”
On 27th October 1914 Dylan Thomas was born in Swansea, South Wales. He is widely regarded as one of the most significant Welsh writers of the 20th century. Thomas’s popular reputation has continued to grow since his death on 9th November 1953, despite some critics describing his work as too ‘florid’. He wrote prolifically throughout his lifetime but is arguably best known for his poetry. His poem The hand that signed the paper is taken from Jon Stallworthy’s edited collection The Oxford Book of War Poetry, and can be found below:
One hundred years ago today, far from the erupting battlefields of Europe, a small German force in the city of Tsingtau (Qingdao), Germany’s most important possession in China, was preparing for an impending siege. The small fishing village of Qingdao and the surrounding area had been reluctantly leased to the German Empire by the Chinese government for 99 years in 1898, and German colonists soon set about transforming this minor outpost into a vibrant city boasting many of the comforts of home, including the forerunner of the now-famous Tsingtao Brewery. By 1914, Qingdao had over 50,000 residents and was the primary trading port in the region. Given its further role as the base for the Far East Fleet of the Imperial German Navy, however, Qingdao was unable to avoid becoming caught up in the faraway European war.
The forces that besieged Qingdao in the autumn of 1914 were composed of troops from Britain and Japan, the latter entering the war against Germany in accord with the Anglo-Japanese Alliance. The Alliance had been agreed in 1902 amid growing anxiety in Britain regarding its interests in East Asia, and rapidly modernizing Japan was seen as a useful ally in the region. For Japanese leaders, the signing of such an agreement with the most powerful empire of the day was seen as a major diplomatic accomplishment and an acknowledgement of Japan’s arrival as one of the world’s great powers. More immediately, the Alliance effectively guaranteed the neutrality of third parties in Japan’s looming war with Russia, and Japan’s victory in the Russo-Japanese War of 1904-05 sent shockwaves across the globe as the first defeat of a great European empire by a non-Western country in a conventional modern war.
In Britain, Japan’s victory was celebrated as a confirmation of the strength of its Asian ally, and represented the peak of a fascination with Japan in Britain that marked the first decade of the twentieth century. This culminated in the 1910 Japan-British Exhibition in London, which saw over eight million visitors pass through during its six-month tenure. In contrast, before the 1890s, Japan had been portrayed in Britain primarily as a relatively backward yet culturally interesting nation, with artists and intellectuals displaying considerable interest in Japanese art and literature. Japan’s importance as a military force was first recognized during the Sino-Japanese War of 1894-95, and especially from the time of the Russo-Japanese War, Japan’s military prowess was popularly attributed to a supposedly ancient warrior spirit that was embodied in ‘bushido’, or the ‘way of the samurai’.
The ‘bushido’ ideal was popularized around the world especially through the prominent Japanese educator Nitobe Inazo’s (1862-1933) book Bushido: The Soul of Japan, which was originally published in English in 1900 and achieved global bestseller status around the time of the Russo-Japanese War (a Japanese translation first appeared in 1908). The British public took a positive view of the ‘national spirit’ of its ally, and many saw Japan as a model for curing perceived social ills. Fabian Socialists such as Beatrice Webb (1858-1943) and Oliver Lodge (1851-1940) lauded the supposed collectivism of ‘bushido’, while Alfred Stead (1877-1933) and other promoters of the Efficiency Movement celebrated Japan’s rapid modernization. For his part, H.G. Wells’s 1905 novel A Modern Utopia included a ‘voluntary nobility’ called ‘samurai,’ who guided society from atop a governing structure that he compared to Plato’s ideal republic. At the same time, British writers lamented the supposed decline of European chivalry from an earlier ideal, contrasting it with the Japanese, who had seemingly managed to turn their ‘knightly code’ into a national ethic followed by citizens of all social classes.
The ‘bushido boom’ in Britain was not mere Orientalization of a distant society, however, but was strongly influenced by contemporary Japanese discourse on the subject. The term ‘bushido’ only came into widespread use around 1900, and even a decade earlier most Japanese would have been bemused by the notion of a national ethic based on the former samurai class. Rather than being an ancient tradition, the modern ‘way of the samurai’ developed from a search for identity among Japanese intellectuals at the end of the nineteenth century. This process saw an increasing shift away from both Chinese and European thought towards supposedly native ideals, and the former samurai class provided a useful foundation. The construction of an ethic based on the ‘feudal’ samurai was given apparent legitimacy by the popularity of idealized chivalry and knighthood in nineteenth-century Europe, with the notion that English ‘gentlemanship’ was rooted in that nation’s ‘feudal knighthood’ proving especially influential. This early ‘bushido’ discourse profited from the nationalistic fervor following Japan’s victory over China in 1895, and the concept increasingly came to be portrayed as a unique and ancient martial ethic. At the same time, those theories that had drawn inspiration from European models came to be ignored, with one prominent Japanese promoter of ‘bushido’ deriding European chivalry as ‘mere woman-worship’.
In the first years of the twentieth century, the Anglo-Japanese Alliance contributed greatly to the positive reception in Britain of theories positing a Japanese ‘martial race’, and the fate of ‘bushido’ in the UK demonstrated the effect of geopolitics on theories of ‘national characteristics’. By 1914, British attitudes had begun to change amid increasing concern regarding Japan’s growing assertiveness. Even the Anglo-Japanese operation that finally captured Qingdao in November was marked by British distrust of Japanese aims in China, a sentiment that was strengthened by Japan’s excessive demands on China the following year. After the war, Japan’s reluctance to return the captured territory to China deepened British opposition to its China policy, contributing to the end of the Anglo-Japanese Alliance in 1923. The two countries subsequently drifted even further apart, and by the 1930s ‘bushido’ was popularly described in Britain as an ethic of treachery and cruelty, only regaining its positive status after 1945 through samurai films and other popular culture as Japan and Britain again became firm allies in the Cold War.
Headline image credit: Former German Governor’s Residence in Qingdao, by Brücke-Osteuropa. Public domain via Wikimedia Commons.
The fifth of November is not just an excuse to marvel at sparklers, fireworks, and effigies; it is part of a national tradition rooted in one of the most famous moments in British political history. The Gunpowder Plot itself was actually foiled on the night of Monday 4 November 1605. Throughout the following day, however, Londoners were asked to light bonfires to celebrate the failure of the assassination attempt on King James I of England. Since then, the fifth of November has become known as ‘Bonfire Night’ or even ‘Guy Fawkes Night’ – named after the most posthumously famous of the thirteen conspirators. Guy Fawkes became the symbol of the conspiracy after being caught during the failed treason attempt. For centuries after 1605, boys parading a cloaked effigy – based on Guy Fawkes’s disguised appearance in the vaults beneath the House of Lords – have asked passers-by for “a penny for the Guy”.
Below is a timeline that describes the events leading up to the failed Gunpowder Plot and the execution of Guy Fawkes and his fellow conspirators. If you would like to learn more about Bonfire Night, you can explore the characters behind the Gunpowder Plot, the traditions associated with it, or simply learn how to throw the best Guy Fawkes Night party.
Feature image credit: Guy Fawkes, by Crispijn van de Passe der Ältere. Public domain via Wikimedia Commons.