World War Two was the most devastating conflict in recorded human history. It was both global in extent and total in character. It has understandably left a long and dark shadow across the decades. Yet it is three generations since hostilities formally ended in 1945 and the conflict is now a lived memory for only a few. And this growing distance in time has allowed historians to think differently about how to describe it, how to explain its course, and what subjects to focus on when considering the wartime experience.
The first female Juliet appears to have been Mary Saunderson, to Henry Harris’s Romeo in 1662 when her future husband, Thomas Betterton, played Mercutio. Later she acted admirably as Ophelia and Lady Macbeth but nothing I have read characterizes her as great. Elizabeth Barry (c.1658–1713) succeeded her as Betterton’s leading lady, excelling in pathetic roles and achieving her greatest successes in the heroic tragedies of her own time.
As I approached retirement, it seemed appropriate that I should tackle one of the most controversial aspects of Liverpool history: race relations. Since there is outstanding scholarship on the operation, legacy, and memorialisation of the heinous slave trade, I chose to concentrate on later developments, particularly the growth of a large ‘black’ population from the late 19th century, primarily composed of 'seamen' who dropped anchor in ‘sailortown’ Liverpool.
History is surfeited with examples of the interactions between society and individual sexuality. Same-sex desire in particular has been, up until the present moment, a topic largely shrouded in shame, secrecy, and silence. As a result, it is often visualized through the image of 'the closet,' conveying notions of entrapment, protection, and liberation. Dominic Janes, author of Picturing the Closet: Male Secrecy and Homosexual Visibility in Britain, recently sat down with us to talk about visualization of same-sex desire in eighteenth-century Britain to the present.
Every campus has one, and sometimes more than one: the often unlovely and usually unloved concrete building put up at some point in the 1960s. Generally neglected and occasionally even unfinished, with steel reinforcing rods still poking out of it, the sixties building might be a hall of residence or a laboratory, a library or lecture room. It rarely features in prospectuses and is never – never ever – used to house the vice chancellor’s office.
On 25 February 1603, Queen Elizabeth I’s cousin and friend, Katherine Howard, countess of Nottingham, died. Although Katherine had been ill for some time, her death hit the queen very hard; indeed one observer wrote that she took the loss ‘muche more heavyly’ than did Katherine’s husband, Charles, Earl of Nottingham. The queen’s grief was unsurprising, for Elizabeth had known the countess longer than almost anyone else alive at that time.
Imagine a plant that grew into a plum pudding, a cricket bat, or even a pair of trousers. Rather than magical transformations straight out of Cinderella, these ‘wonderful plants’ were to be found in Victorian Britain. Just one of the Fairy-Tales of Science introduced by chemist and journalist John Cargill Brough in his ‘book for youth’ of 1859, these real-world connections and metamorphoses that traced the origins of everyday objects were arguably even more impressive than the fabled conversion of pumpkin to carriage (and back again).
The flow of girls in particular from the safety of Britain into the war zones of the Middle East causes much hand-wringing. A report from the Institute for Strategic Dialogue says one in six of the foreigners going to Syria and Iraq are women or girls.
The beliefs of British Prime Ministers since 1941 about the nation’s security and role in the world have been of critical importance in understanding the development and retention of a nuclear capability. Winston Churchill supported the development as a means of national survival during the Second World War.
Two hundred years ago American and British delegates signed a treaty in the Flemish town of Ghent to end a two-and-a-half-year conflict between the former colonies and mother country. Overshadowed by the American Revolution and Napoleonic Wars in the two nations’ historical memories, the War of 1812 has been somewhat rehabilitated during its bicentennial. Yet arguing for the importance of a status quo antebellum treaty that concluded a war in which neither belligerent achieved its war aims, no territory was exchanged, and no victor formally declared can be a tough sell. Compared to the final defeat of Napoleon at the Battle of Waterloo, fought just a few months later and forty-odd miles down the road from Ghent, the end of the War of 1812 admittedly lacked cinema-worthy drama.
But the Treaty of Ghent mattered enormously (and not just to historians interested in the War of 1812). The war it ended saw relatively light casualties, measured in the thousands compared to the millions who died in the French Revolutionary and Napoleonic Wars that raged across the rest of the globe. Nevertheless, for the indigenous and colonizing peoples that inhabited the borderlands surrounding the United States, the conflict had proved devastating. Because the American and British economies were intertwined, the war had also wreaked havoc on American agriculture and British manufacturing, and wrecked both nations’ merchant navies. Moreover, public support for the war in the British Empire and the United States had been lukewarm, with plenty of outspoken opponents who had worked tirelessly to prevent and then quickly end it.
Not surprisingly, peace resulted in widespread celebration across the Atlantic. The Leeds Mercury, many of whose readers were connected to the manufacturing industries that had relied on American markets, even compared the news to the Biblical account of the angelic chorus announcing the birth of Jesus: “This Country, thanks to the good Providence of God, is now at Peace with Europe, with America, and with the World. . . . There is at length ‘Peace on Earth,’ and we trust the revival of ‘Good-will among men’ will quickly follow the close of national hostilities.” When the treaty reached Washington for ratification, President James Madison and Congress fell over themselves in a rush to sign it.
Far more interesting than what the relatively brief Treaty of Ghent includes is what was left out. When the British delegation arrived at Ghent in August 1814, they had every possible advantage. Britain had won the naval war, the United States was on the brink of bankruptcy, and the end of Britain’s war with France meant that hardened veterans were being deployed for an imminent invasion of the United States. Later that month British troops would humiliatingly burn Washington. Even Ghent itself offered a home-field advantage: it was occupied by British troops and within a couple of days’ communication of ministers in London.
In consequence, Britain’s initial demands were severe. If the United States wanted peace, it had to cede 250,000 square miles of its northwestern lands (amounting to more than 15% of US territory, including all or parts of the modern states of Michigan, Illinois, Indiana, Ohio, Missouri, Iowa, Wisconsin, and Minnesota). These lands would be used to create an independent American Indian state—promises of which the British had used to recruit wary Indian allies. Britain also demanded a new border for Canada, which included the southern shores of the Great Lakes and a chunk of British-occupied Maine—changes that would have given Canada considerable natural defenses. The Americans, claimed the British, were “aggrandizers”, and these measures would ensure that such ambitions would be forever thwarted.
The significance of the terms is difficult to overstate. Western expansion would have ground to a halt in the face of a powerful British-led alliance with the Spanish Empire and new American Indian state. The humiliation would likely have resulted in the collapse of the United States. The long-marginalized New England Federalists had been outspoken in their opposition to the war and to President James Madison’s Southern-dominated Republican Party, with some of their leaders openly threatening secession. The island of Nantucket had already signed a separate peace with Britain, and many inhabitants of British-occupied Maine had signed oaths of allegiance to Britain. The Governor of Massachusetts had even sent an agent to Canada to discuss terms of British support for his state’s secession, which included a request for British troops. The counterfactuals of a New England secession are too great to explore here, but the implications are epic—not least because, unlike in 1861, the US government in 1814 was in no position to stop one. In the end, a combination of the American delegates’ obstinacy and a rapidly fading British desire to keep the nation on an expensive war footing solely to fight the Americans led the British to abandon their harsh terms.
In consequence, the Treaty of Ghent cemented the United States rather than destroyed it. Historians have long debated who truly won the war. However, what mattered most was that neither side managed a decisive victory. The Americans lacked the organization and national unity to win; the British lacked the will to wage an expensive, offensive war in North America. American inadequacy ensured that all of Canada would prosper as part of the British Empire, even though Upper Canada (now Ontario) had arguably closer links to the United States and was populated largely by economic migrants from the United States. British desire to avoid further confrontation enabled the Americans to focus their attention on eliminating the other, and considerably weaker, obstacles to continental supremacy: the American Indians and the remnants of the Spanish Empire, who proved to be the real losers of the War of 1812 and the Treaty of Ghent.
Featured image: The Signing of the Treaty of Ghent, Christmas Eve, 1814, by Amédée Forestier (1914). Public domain via Wikimedia Commons.
The New Year brings with it a new instalment of Oxford DNB biographies which, as every January, extend the Dictionary’s coverage of people who shaped British life in the late twentieth and early twenty-first centuries. This January we add biographies of 226 men and women who died during 2011. These new biographies were commissioned by my predecessor as editor, Lawrence Goldman, but having recently assumed the editor’s chair, I take full and appreciative responsibility for introducing them.
The new biographies bear vivid witness to an astonishing diversity of personal experience, individual achievement, and occasional delinquency; and they range from Claude Choules (b.1901), the last British-born veteran of the First World War, who died at the age of 110, to the singer and songwriter Amy Winehouse (b.1983), who died from alcohol poisoning aged just twenty-seven. The great majority of the people whose biographies are now added (191, or 84%) were born before the outbreak of the Second World War, and the majority (137, or 60%) were born before 1930. Typically, therefore, most were active between the 1940s and the 1980s, but some (such as Choules) are included for their activities before 1918, and several (such as Winehouse, or the anti-war campaigner, Brian Haw) only came to prominence in the 2000s.
The lives of Choules and Winehouse—the one exceptionally long, the other cut tragically short—draw attention to two of the most significant groups to be found in this new selection. A generation after Choules, many Britons served bravely during the Second World War, among them the SOE veteran Nancy Wake, who led a group of resistance fighters and who killed German soldiers with her bare hands; SOE’s French section sent 39 women agents into France during the war, and Wake was undoubtedly among the toughest and most redoubtable. Her fellow SOE officer, Patrick Leigh Fermor, is best known for his capture, on Crete, of the German officer General Kreipe—an event that was retold in the film Ill Met by Moonlight (1957). In March 1942 Leslie Audus was captured by the Japanese and put to work in a slave labour camp. There he employed his skills as a botanist to create a nutritional supplement from fermented soya beans, saving himself and hundreds of his fellow prisoners from starvation. After the war, Audus enjoyed a distinguished scientific career though, with great modesty, he made little of his remarkable prison work, which remained known only to former captives who owed him their lives.
The troubled creative life of our latest-born person, Amy Winehouse, is representative of a second significant group of lives to emerge from our new set of biographies. These were the entertainers for whom the celebrity-devouring world of show business was a place of some highs but ultimately of disenchantment and disappointment. Forty years before Winehouse came to public attention, the singer Kathy Kirby enjoyed a glittering career. Ubiquitous in the early 1960s with hit after hit, she was reputedly the highest-paid female singer of her generation. However, she failed to adapt to the rise of rock’n’roll, and soon spiralled into drug and alcohol abuse, bankruptcy, and psychiatric problems. The difficulties Kathy Kirby experienced bear similarities to those of the Paisley-born songwriter Gerry Rafferty, best known for his hit single ‘Baker Street’, which deals with loneliness in a big city; Rafferty too found fame hard to cope with, and eventually succumbed to alcoholism.
Of course, not all encounters with modern British popular culture were so troubled. One of the longest biographies added in this new update is that of the actress Elizabeth Taylor, who shot to stardom in National Velvet (1944) and remained ever after a figure of international standing. While Taylor’s private life garnered almost as much attention as her screen roles, she was also notable for pioneering the now familiar association between celebrity and charitable causes—in Taylor’s case for charities working to combat HIV/AIDS. To that of Elizabeth Taylor we can also add other well-known names, among them Lucian Freud—by common consent the greatest British artist of his day, whose depictions of human flesh are unrivalled in their impact and immediacy; the journalist and author Christopher Hitchens, who made his career in the US; Ken Russell, the enfant terrible of British cinema; and the dramatist Shelagh Delaney, best known for her play, A Taste of Honey (1958).
In addition to documenting the lives, and legacies, of well-known individuals—such as Freud, Hitchens, and Delaney—it’s also the purpose of each January update of the ODNB to include people of real historical significance who did not make the headlines. In creating a rounded picture of those who’ve shaped modern Britain, we’re helped enormously by more than 400 external specialists. Divided into specialist panels—from archaeology and broadcasting to the voluntary sector and zoology—our advisers recommend people for inclusion from long lists of possible candidates. And it’s their insight that ensures we provide biographies of many less familiar figures responsible for some truly remarkable achievements. Here is just one example. Leslie Collier was a virologist who, in the 1960s, developed a heat-stable vaccine for smallpox which made possible a mass vaccination programme in Africa and South America. The result was the complete eradication of smallpox as proclaimed by the World Health Organization in 1980. How many figures can claim to have abolished what was once a terrifying global disease?
Whether long or short, good or bad, exemplary or tragic, or something more nuanced and complex in-between, the 226 new biographies now added to the Oxford DNB make fascinating—and sometimes sobering—reading.
Featured image credit: Amy Winehouse, singing, by NRK P3. CC-BY-NC-SA-2.0 via Flickr.
January 2015 sees the addition of 226 biographies to the Oxford Dictionary of National Biography, offering the lives of those who have played their part in shaping British history between the late 20th and early 21st centuries. The sectors and professions these individuals influenced range from medicine to film, and include Nobel Prize and Oscar winners. Explore our infographic below as we highlight a selection of these new lives: some renowned, some lesser-known, yet all significant.
As anyone knows who has looked at the newspapers over the festive season, 2015 is a bumper year for anniversaries: among them Magna Carta (800 years), Agincourt (600 years), and Waterloo (200 years). But it is January which sees the first of 2015’s major commemorations, for it is fifty years since Sir Winston Churchill died (on the 24th) and received a magnificent state funeral (on the 30th). As Churchill himself had earlier predicted, he died on just the same day as his father, Lord Randolph Churchill, had done, in 1895, exactly seventy years before.
The arrangements for Churchill’s funeral, codenamed ‘Operation Hope Not’, had long been in the planning, which meant that Churchill would receive the grandest obsequies afforded to any commoner since the funerals of Nelson and Wellington. And unlike Magna Carta or Agincourt or Waterloo, there are many of us still alive who can vividly remember those sad yet stirring events of half a century ago. My generation (I was born in 1950) grew up in what were, among other things, the sunset years of Churchillian apotheosis. They may, as Lord Moran’s diary makes searingly plain, have been sad and enfeebled years for Churchill himself, but they were also years of unprecedented acclaim and veneration. During the last decade of his life, he was the most famous man alive. On his ninetieth birthday, thousands of greeting cards were sent, addressed to ‘The Greatest Man in the World, London’, and they were all delivered to Churchill’s home. During his last days, when he lay dying, there were many who found it impossible to contemplate the world without him, just as Queen Victoria had wondered, when the Duke of Wellington died in 1852, how Britain would manage without him.
Like all such great ceremonial occasions, the funeral itself had many meanings, and for those of us who watched it on television, by turns enthralled and tearful, it has also left many memories. In one guise, it was the final act of homage to the man who had been described as ‘the saviour of his country’, and who had lived a life so full of years and achievement and honour and controversy that it was impossible to believe anyone in Britain would see his like again. But it was also, and in a rather different emotional and historical register, not only the last rites of the great man himself, but also a requiem for Britain as a great power. While Churchill might have saved his country during the Second World War, he could not preserve its global greatness thereafter. It was this sorrowful realization that had darkened his final years, just as his funeral, attended by so many world leaders and heads of state, was the last time that a British figure could command such global attention and recognition. (The turnout for Margaret Thatcher’s funeral, in 2013, was nothing like as illustrious.) These multiple meanings made the ceremonial the more moving, just as there were many episodes which made it unforgettable: the bearer party struggling and straining to carry the huge, lead-lined coffin up the steps of St Paul’s; Clement Attlee—Churchill’s former political adversary—old and frail, but determined to be there as one of the pallbearers, sitting on a chair outside the west door brought especially for him; the cranes of the London docks dipping in salute, as Churchill’s coffin was borne up the Thames from Tower Pier to Waterloo Station; and the funeral train, hauled by a steam engine of the Battle of Britain class, named Winston Churchill, steaming out of the station.
For many of us, the funeral was made the more memorable by Richard Dimbleby’s commentary. Already stricken with cancer, he must have known that this would be the last he would deliver for a great state occasion (he would, indeed, be dead before the year was out), and this awareness of his own impending mortality gave to his commentary a tone of tender resignation that he had never quite achieved before. As his son, Jonathan, would later observe in his biography of his father, ‘Richard Dimbleby’s public was Churchill’s public, and he had spoken their emotions.’
Fifty years on, the intensity of those emotions cannot be recovered, but many events have been planned to commemorate Churchill’s passing, and to ponder the nature of his legacy. Two years ago, a committee was put together, consisting of representatives of the many institutions and individuals that constitute the greater Churchill world, both in Britain and around the world, which it has been my privilege to chair. Significant events are planned for 30 January: in Parliament, where a wreath will be laid; on the River Thames, where Havengore, the ship that bore Churchill’s coffin, will retrace its journey; and at Westminster Abbey, where there will be a special evensong. It will be a moving and resonant day, and the prelude to many other events around the country and around the world. Will any other British prime minister be so vividly and gratefully remembered fifty years after his—or her—death?
Headline image credit: Franklin D. Roosevelt and Winston Churchill, New Bond Street, London. Sculpted by Lawrence Holofcener. Public domain via Wikimedia Commons.
Winston Churchill’s Victory broadcast of 13 May 1945, in which he claimed that but for Northern Ireland’s “loyalty and friendship” the British people “should have been confronted with slavery or death,” is perhaps the most emphatic assertion that the Second World War entrenched partition from the southern state and strengthened the political bond between Britain and Northern Ireland.
Two years earlier, however, in private correspondence with US President Roosevelt, Churchill had written disparagingly of the young men of Belfast, who unlike their counterparts in Britain were not subject to conscription, loafing around “with their hands in their pockets,” hindering recruitment and the vital work of the shipyards.
Churchill’s role as a unifying figure, galvanising the war effort through wireless broadcasts and morale-boosting public appearances, is much celebrated in accounts of the British Home Front. The further away from London and the South East of England that one travels, however, the more questions should be asked of this simplistic narrative. Due to Churchill’s actions as Liberal Home Secretary during the 1910 confrontations between miners and police in South Wales, for example, he was far less popular in Wales, and indeed in Scotland, than in England during the war. But in Northern Ireland, too, Churchill was a controversial figure at this time. The roots of this controversy are to be found in events that took place more than a quarter of a century before, in 1912.
Then First Lord of the Admiralty, Churchill was booed on arrival in Belfast that February, before his car was attacked and his effigy brandished by a mob of loyalist demonstrators. Later at Belfast Celtic Football Ground he was cheered by a crowd of five thousand nationalists as he spoke in favour of Home Rule for Ireland. Churchill was not sympathetic to the Irish nationalist cause but believed that Home Rule would strengthen the Empire and the bond between Britain and Ireland; he also saw this alliance as vital to the defence of the United Kingdom.
Loyalists were outraged. Angry dockers hurled rotten fish at Churchill and his wife Clementine as they left the city; historian and novelist Hugh Shearman reported that their car was diverted to avoid thousands of shipyard workers who had lined the route with pockets filled with “Queen’s Island confetti,” local slang for rivet heads. (Harland and Wolff were at this time Belfast’s largest employer, and indeed one of the largest shipbuilding firms in the world; at the time of the Churchills’ visit the Titanic was being fitted out.)
Two years later, in March 1914, Churchill made a further speech in Bradford, England, calling for a peaceful solution to the escalating situation in Ulster and arguing that the law in Ireland should be applied equally to nationalists and unionists without preference. Three decades later, this speech was widely reprinted and quoted in several socialist and nationalist publications in Northern Ireland, embarrassing the unionist establishment by highlighting their erstwhile hostility to the most prominent icon of the British war effort. Churchill’s ignominious retreat from Belfast in 1912 was also raised by pamphleteers and politicians who sought to exploit a perceived hypocrisy in the unionist government’s professed support for the British war effort as it sought to suppress dissent within the province. One socialist pamphlet attacked unionists by arguing that “The Party which denied freedom of speech to a member of the British Government before it became the Government of Northern Ireland is not likely to worry overmuch about free speech for its political opponents after it became the Government.”
And in London in 1940 Victor Gollancz’s Left Book Club published a polemic by the Dublin-born republican activist Jim Phelan, startlingly entitled Churchill Can Unite Ireland. In this Phelan expressed hopes that Churchill’s personality itself could effect positive change in Ireland. He saw Churchill as a figure who could challenge what Phelan called “punctilio,” the adherence to deferential attitudes that kept vested interests in control of the British establishment. Phelan identified a cultural shift in Britain following Churchill’s replacement of Chamberlain as Prime Minister, characterised by a move towards plain speaking: he argued that for the first time since the revolutionary year of 1848 “people are saying and writing what they mean.”
Jim Phelan’s ideas in Churchill Can Unite Ireland were often fanciful, but they alert us to the curious patterns of debate that can be found away from more familiar British narratives of the Second World War. Here a proud Irish republican could assert his faith in a British Prime Minister with a questionable record in Ireland as capable of delivering Irish unity.
Despite publicly professed loyalty to the British war effort, unionist mistrust of the government in London endured over the course of the war, partly due to Churchill’s perceived willingness to deal with the Irish Taoiseach Éamon de Valera. Phelan’s book concluded with the words: “Liberty does not grow on trees; it must be fought for. Not ‘now or never’. Now.” Eerily, these lines presaged the infamous telegram from Churchill to de Valera following the attack on Pearl Harbor in December 1941, which, it is implied, offered Irish unity in return for the southern state’s entry into the war on the side of the Allies, and read in part “Now is your chance. Now or never. A Nation once again.”
There was a great change in peace settlements after World War I. Not only were the Central Powers supposed to pay reparations, cede territory, and submit to new rules concerning the citizenship of their former subjects, but they were also required to deliver nationals accused of violations of the laws and customs of war (or violations of the laws of humanity, in the case of the Ottoman Empire) to the Allies to stand trial.
This was the first time in European history that victor powers imposed such a demand following an international war. This was also the first time that regulations specified by the Geneva and Hague Conventions were enforced after a war ended. Previously, states used their own military tribunals to enforce the laws and customs of war (as well as regulations concerning espionage), but they typically granted amnesty for foreigners after a peace treaty was signed.
The Allies intended to create special combined military tribunals to prosecute individuals whose violations had affected persons from multiple countries. They demanded post-war trials for many reasons. Legal representatives to the Paris Peace Conference believed that “might makes right” should not supplant international law; therefore, the rules governing the treatment of civilians and prisoners-of-war must be enforced. They declared that the war had created a modern sensibility demanding legal innovations, such as prosecuting heads of state and holding officers responsible for the actions of subordinates. British and French leaders wanted to mollify domestic feelings of injury as well as promote the interpretation that the war had been a fight for “justice over barbarism,” rather than a colossal blood-letting. They also sought to use trials to exert pressure on post-war governments to pursue territorial and financial objectives.
The German, Ottoman, and Bulgarian governments resisted extradition demands and foreign trials, yet staged their own prosecutions. Each fulfilled a variety of goals by doing so. The Weimar government in Germany was initially forced to sign the Versailles Treaty with its extradition demands, then negotiated to hold its own trials before its Supreme Court in Leipzig because the German military, along with right-wing political parties, refused the extradition of German officers. The Weimar government, led by the Social Democratic party, needed the military’s support to suppress communist revolutions. The Leipzig trials, held from 1921 to 1927, covered only a small number of cases, serving to deflect responsibility for the most serious German violations, such as the massacre of approximately 6,500 civilians in Belgium and the deportation of civilians to work in Germany. The limited scope of the trials did not purge the German military as the Allies had hoped. Yet the trials presented an opportunity for German prosecutors to take international charges and frame them in German law. Although the Allies were disturbed by the small number of convictions, this was the first time that a European country had agreed to try its own after a major war.
The Ottoman imperial government first destroyed the archives of the “Special Organization,” a secret group of Turkish nationalists who deported Greeks from the Aegean region in 1914 and planned and executed the massacre of Armenians in 1915. But in late 1918, a new Ottoman imperial government formed a commission to investigate parliamentary deputies and former government ministers from the Turkish nationalist party, the Committee of Union and Progress, which had planned the attacks. It also sought to prosecute Committee members who had been responsible for the Ottoman Empire’s entrance into the war. The government then held a series of military trials of its own accord in 1919 to prosecute actual perpetrators of the massacres, as well as purge the government of Committee members, as these were opponents of the imperial system. It also wanted to quash the British government’s efforts to prosecute Turks with British military tribunals. Yet after the British occupied Istanbul, the nationalist movement under Mustafa Kemal retaliated by arresting British officers. Ultimately, the Kemalists gained control of the country, ended all Turkish military prosecutions for the massacres, and nullified guilty verdicts.
As in Germany and the Ottoman Empire, Bulgaria began a rocky governmental and social transformation after the war. The initial post-war government signed an armistice with the Allies to avoid the occupation of the capital, Sofia. It then passed a law granting amnesty for persons accused of violating the laws and customs of war. However, a new government came to power in 1919, representing a coalition of the Agrarian Union, a pro-peasant party, and right-wing parties. The government arrested former ministers and generals and prosecuted them with special civilian courts in order to purge them; they were blamed for Bulgaria’s entrance into the war. Some were prosecuted because they led groups of refugees from Macedonia in a terrorist organization, the Internal Macedonian Revolutionary Organization. Suppressing Macedonian terrorism was an important condition for Bulgaria to improve its relationship with its neighbor, the Kingdom of the Serbs, Croats, and Slovenes. In 1923, however, Aleksandar Stambuliski, the leader of the Agrarian Union, was assassinated in a military coup, leading to new problems in Bulgaria.
We could ask a counter-factual question: What if the Allies had managed to hold mixed military tribunals for war-time violations instead of allowing the defeated states to stage their own trials? If an Allied tribunal for Germany had been run fairly and political posturing suppressed, it might have established important legal precedents, such as individual criminal liability for violations of the laws of war and the responsibility of officers and political leaders for ordering such violations. On the other hand, guilty verdicts might have given Germany’s nationalist parties new heroes in their quest to overturn the Versailles order.
An Allied tribunal for the Armenian massacres would have established the concept that a sovereign government’s ministers and police apparatus could be held criminally responsible under international law for actions undertaken against their fellow nationals. It might also have created a new historical source about this highly contested episode in Ottoman and Turkish history. Yet it is speculative whether the Allies would have been able to compel the post-war Turkish government to pay reparations to Armenian survivors and return stolen property.
Finally, an Allied tribunal for alleged Bulgarian war criminals, if constructed impartially, might have resolved the intense feelings of recrimination that several of the Balkan nations harbored toward each other after World War I. It might also have helped the Agrarian Union survive against its military and terrorist enemies. However, a trial concentrating only on Bulgarian crimes would not have dealt with crimes committed by Serbian, Greek, and Bulgarian forces and paramilitaries during the Balkan Wars of 1912-13, so a selective tribunal after World War I may not have healed all wounds.
Image Credit: Château de Versailles Hall of Mirrors Ceiling. Photo by Dennis Jarvis. CC BY-SA 2.0 via Flickr.
The Reformation was a seismic event in history, whose consequences are still working themselves out in Europe and across the world. The protests against the marketing of indulgences staged by the German monk Martin Luther in 1517 belonged to a long-standing pattern of calls for internal reform and renewal in the Christian Church. But they rapidly took a radical and unexpected turn, engulfing first Germany and then Europe as a whole in furious arguments about how God’s will was to be discerned, and how humans were to be ‘saved’. However, these debates did not remain confined to a narrow sphere of theology. They came to reshape politics and international relations; social, cultural, and artistic developments; relations between the sexes; and the patterns and performances of everyday life.
Below we take a look at some of the key events that shaped the Reformation. In The Oxford Illustrated History of the Reformation, Peter Marshall and a team of experts tell the story of how a multitude of rival groups and individuals, with or without the support of political power, strove after visions of ‘reform’.
Featured image credit: Fishing for Souls, Adriaen Pietersz van de Venne, 1614. Rijksmuseum, Amsterdam. Public domain via Wikimedia Commons.
In 1958, the prominent childcare advice writer and pediatrician Dr. Benjamin Spock told readers that ‘a man can be a warm father and a real man at the same time’. In this revised edition of the bestseller Baby and Child Care, the American author dedicated a whole section to ‘The Father’s Part’. This was a much lengthier discussion of men’s role in caring for their babies and young children than in the first edition, but the role of the father remained very much secondary to that of the mother. Though Spock advised readers it was ‘the wrong idea’ to consider childcare as the sole responsibility of the mother, it was clear that he thought the father’s responsibility in day-to-day care remained rather minimal, in part because of the lack of interest of fathers themselves. He added that it was ‘Better to play for fifteen minutes enjoyably, and then say, “Now I’m going to read my paper,” than to spend all day at the zoo, crossly.’
Having children had long been understood as a sign of manhood, proving men’s virility and adult status. Jim Bullock, for example, who was born in 1903, recollected the definite ideas around virility and masculinity in the mining village of Bowers Row in Yorkshire. He described:
The first child was conceived as soon as was decently possible, for the young husband had to prove his manhood. If a year passed without a child—or the outward sign of one being on the way—this man was taunted by his mates both at work and on the street corner by such cruel remarks.
He added that men were expected to suffer some of the same symptoms as their wives during pregnancy, such as morning sickness and toothache, as well as losing weight as their wives gained it. If a man didn’t experience these effects, his love for and fidelity to his wife could be questioned.
With increasing knowledge about birth control, sex, and childbirth across many parts of British society as the twentieth century progressed, these views became outdated.
Having children was still a sign of achieving adult masculinity. However, too much interaction with anything to do with pregnancy, birth and babies could also be emasculating—this was, of course, ‘women’s business’. David, a labourer from Nottingham, who became a father in the 1950s, highlighted how he kept his distance from both the birth and caring for his new baby, ‘because it wasn’t manly’.
Some fathers were becoming more willing to help out with children. Mr. K from Preston described how ‘relaxing’ he found it to sit giving one of his babies a bottle after work. Yet, though attitudes to men’s roles in childcare were gradually shifting, it was the relationship between masculinity and fatherhood that changed more substantially in the middle of the twentieth century.
What can be found in the 1940s and 1950s in Britain was a new kind of relationship between fatherhood and masculinity. This was, in fact, a time when the ‘celebrity dad’ became prominent in the press. In 1955, for example, the Daily Mirror published a feature on actor Kenneth More, interviewed whilst he took care of his toddler. In 1957, it featured an article and large image of the singer Lonnie Donegan with his three-year-old daughter, apparently enjoying singing together at home. Sports stars and royals were also the subject of this kind of attention, and seemed to embody Spock’s claim that a man could indeed be a real man and a warm father at the same time. More ‘ordinary’ dads also hinted at this change. Whilst taking an overly active role in the physical care of babies remained potentially tricky for many men, their identities were increasingly encompassing a more caring and fatherly side. Mr. G, born in 1903, suggested that there was change around the First World War; by the 1920s, men were much happier to be seen taking their child for a walk in the area of Lancashire where he lived. And Martin from Oldham, whose first child was born in the mid-1950s, described how he proudly took his child in its pram for a beer in his local pub. Men’s roles with their children hadn’t been radically reshaped. But whilst in earlier generations, it was simply having children that was a sign of manliness, by the 1950s, being seen as an involved father was becoming part of an ideal vision of masculinity.
The importance of fatherhood to the achievement of a certain ideal of masculinity has ebbed and flowed across the twentieth century; it could both prove and challenge a sense of manliness. Today we see plenty of evidence of men proudly displaying their fatherhood—the man with a pram or carrying a baby in a sling isn’t so rare any more. Yet, in every generation there are more or less involved fathers; plenty of men throughout the twentieth century, and much earlier, enjoyed spending time with their children and felt close to them. Today, women, for the most part, still take on the burden of childcare, even if there are plenty of couples who do things differently. Historical research helps question the idea that the ‘new man’ of the last couple of decades is quite so new—and by thinking about how fatherhood relates to masculine identity, we can better understand changes to parenting and gender roles over time.
Image Credit: “Father’s Strength” by Shavar Ross. CC BY-NC-ND 2.0 via Flickr.
The seemingly unassailable rise of the MOOC – the Massive Open Online Course – has many universities worried. Offering access to millions of potential students, it seems like the solution to so many of the problems that beset higher education. Fees are low, or even non-existent; anyone can sign up; staff time is strictly limited as even grading is done by peers or automated multiple-choice questionnaires. In an era of ever-rising tuition fees and of concerns about the barriers that stop the less well-off from applying to good universities, the MOOC can seem like a panacea.
Historians should be banned from watching movies or TV set in their area of expertise. We usually bore and irritate friends and family with pedantic interjections about minor factual errors and chronological mix-ups. With Hilary Mantel’s novels Wolf Hall and Bring Up the Bodies, and the sumptuous BBC series based on them, this pleasure is denied us. The series is as ferociously well researched as it is superbly acted and directed. Cranmer probably didn’t have a beard in 1533, but, honestly, that’s about the best I can do.
This is only my second Kazuo Ishiguro book following on from Never Let Me Go. For me, coming off a novel about cloning, I had no expectations about where he would go next. Much has been made about this novel being a “departure” for Ishiguro but I would argue that he has gone back to something […]
If you want to win votes and get elected in Britain, at least in general elections, then you had better get a party. The occasional and isolated exceptions only prove the rule. Before the 2010 general election, in the wake of the parliamentary expenses scandal, there was speculation that independent candidates might do unusually well, but in the event this did not happen. Elected politicians have a wonderful capacity for persuading themselves that their electoral success is to be explained by their obvious personal qualities, but the evidence is all against them.
Two hundred years ago last Friday, the owner of the London Times, John Walter II, is said to have surprised a room full of printers who were preparing hand presses for the production of that day’s paper. He showed them an already completed copy of the paper and announced, “The Times is already printed – by steam.” The paper had been printed the night before on a steam-driven press, and without their labor. Walter anticipated and tried to mediate the shock and unrest with which this news was met by the now-idled printers. It was one of many scenes of change and conflict in early industrialization where the hand was replaced by the machine. Similar scenes of hand labor versus steam entered into cultural memory, from Romantic poetry about framebreaking Luddites to John Henry’s hand-hammering race against the steam drill.
There were many reasons to celebrate the advent of the steam press in 1814, as well as reasons to worry about it. Steam printing brought the cost of printing down, increased the number of possible impressions per day by four times, and, in a way, we might say that it helped “democratize” access to information. That day, the Times proclaimed that the introduction of steam was the “greatest improvement” to printing since its very invention. Further down that page, which itself was “taken off last night by a mechanical apparatus,” we read why the hand press printers might have been concerned: “after the letters are placed by the compositors… little more remains for man to do, than to attend upon, and watch this unconscious agent in its operations.”
Moments of technological change do indeed put people out of work. My father, who worked at the Buffalo News for nearly his entire career, often told me about layoffs or fears of layoffs coming with the development of new computerized presses, print processes, and dwindling markets for print. But the narrative of the hand versus the machine, or of the movement from the hand to the machine, obscures a truth about labor, especially information labor. Forms of human labor are replaced (and often quantifiably reduced), but they are also rearranged, creating new forms of work and social relations around them. We would do well to avoid the assumption that no one worked the steam press once hand presses went mostly idle. As information production and circulation become more technologically abstracted from the hands of workers, there is an increased tendency to assume that no labor is behind it.
Two hundred years after the morning when the promise of faster, cheaper, and more accessible print created uncertainty among the workers who produced it, I am writing to you using an Apple computer made by workers in Shenzhen, China, with metals mined all over the global South. The software I am using to accomplish this task was likely written and maintained by programmers in India managed by consultants in the United States. You are likely reading this on a similar device. Information has been transmitted between us via networks of wires, servers, cable lines, and wireless routers, all with their own histories of people who labor. If you clicked over here from Facebook, a worker in a cubicle in Manila may have glanced over this link among thousands of others while trying to filter out content that violates the social network’s terms of service. Technical laborers, paid highly or almost nothing at all, and working under a range of conditions, are silently mediating this moment of exchange between us. Though they may no longer be hand-pressed, the surfaces on which we read and write are never too distant from the hands of information workers.
Like research in book history and print culture studies, the common appearance of a worker’s hand in Google Books reminds us that, despite radical changes in technology over centuries, texts are material objects and are negotiated by numerous people for diverse purposes, only some of which we would call “reading” proper. The hand pulling the lever of a hand press and the hand turning pages in a scanner may be representative of two poles on a two-century timeline, but, for me, they suggest many more continuities between early print and contemporary digital cultures than ruptures. John Walter II’s proclamation on 28 November 1814 was not a turn away from a Romantic past of artisanal labor toward a bleak and mechanized future. Rather, it was an early moment in an ongoing struggle to create and circulate words and images to ever more people while also sustaining the lives of those who produce them. Instead of assuming, two hundred years on, that we have been on a trajectory away from the hand, we must continue looking for and asking about the conditions of the hand in the machine.
Headline image credit: Hand of Google, by Unknown. CC BY-SA 3.0 via Wikimedia Commons.
Like much historical research, my chapter in the Britain’s Soldiers collection came about more or less by accident. It relates to an incident that I discovered in the War Office papers in 2007. I was taking a group of History students from Northampton University to The National Archives in Kew, to help them with their undergraduate dissertations. I had a small amount of time to do some of my own research, so I ordered WO 43/404. The title sounded promising: ‘A book containing copies of correspondence in 1761, relative to the Behaviour of the Duke of Richmond’s regiment & Militia at Stamford on the 15th April 1761’.
What arrived was a letter book. It immediately struck me as being unusual, as the War Office usually just kept in-letters, whereas this book copied both sides of a lengthy correspondence between the Secretary at War and various other protagonists. Something noteworthy had clearly occurred, and was therefore preserved, possibly as a precedent to inform future action. I was pushed for time, however, so I quickly photographed the whole book and returned to my students.
Four years later I finally had an opportunity to transcribe the letters. What emerged was a bizarre event. In brief, the Lincolnshire Militia was quartered in Stamford and a regiment of the regular army that was on the march approached the town. The regulars had such contempt for the militia that they marched straight in, disregarding the usual military convention that they send advance word, and proceeded to refuse any of the other courtesies that the militia attempted to offer them. The militia’s commander took such umbrage at these slights that he posted sentries at entrances to the town, ‘to prevent any armed Troops entering the Town for the future without my knowledge and consent’. When further regulars attempted to enter the town, the militia stopped them at the point of their bayonets and a fight ensued in which men were injured, and could have been killed.
This was more than a local scrap. Neither side would admit fault and so wrote to the War Office to intercede. Despite the fact that Britain was in the midst of the Seven Years War – the largest global conflict that Britain had then fought – the Secretary at War took the incident very seriously indeed, and the letter book records how the fallout preoccupied him for a further two months. The dispute even drew in the King himself, who as Commander in Chief was keen to preserve ‘that Equality and Harmony in Service which is so much to be wished and cultivated’.
I was intrigued by the story that emerged from this letter book, and a paper trail ensued as I attempted to flesh out the story with other sources. My attempts to find references to the affair in public sources such as newspapers drew a blank, but I had more luck in private family papers and local government records. This was not an incident that was publicly known about at the time, which perhaps explains why historians had overlooked it.
As a cultural historian, what drew me to this incident was that it was an example of people behaving oddly. It is often only when people are behaving abnormally that we get an insight into the normal expectations of behaviour that go unspoken – in this case, attitudes towards military precedence and masculine honour. I think that incidents like the one at Stamford in 1761 highlight the crucial significance of ‘polite’ and controlled conduct in the eighteenth-century military. To our eyes, the interpersonal conduct of Georgian army officers may seem terribly mannered – but this is clearly desirable when you have large numbers of men who are highly defensive of their status and armed to the teeth. Rather than just reconstructing micro-incidents like this for their own sake, therefore, it is helpful to think about them more anthropologically in order to shed light on the workings of a past society.
Headline image credit: Battle of Bunker Hill by E. Percy Moran. Public domain via Wikimedia Commons.
The light in the Orkneys is so clear, so bright, so lucid, it feels like you are on top of the world looking through thin clouds into heaven.
It doesn’t even feel part of the UK: when you sail off the edge of Scotland on the Scrabster to Stromness ferry, you feel you are departing the real world to land in a magical realm.
Nowhere else on earth can you see eight thousand years of continuous history in such a tiny space.
Skara Brae is what remains of a neolithic village, older than Stonehenge and the pyramids, kept secret underground until uncovered by a severe storm in 1850. You can walk in and sit down, look around at the stone walls, stone beds, stone cupboards, dressers, seats, and storage boxes. Recognizably human people lived here, seeing this same landscape and coast, feeling the same wind on their faces that you do, their eyes resting on the doors, hearths and toilets (one in each dwelling).
This is the ‘stone age’, but talking about such ages is a misnomer in the Orkneys, where they had no appreciable bronze age or iron age, and so proceeded from the non-use of one metal to the non-use of another in what is now the best preserved neolithic site in Europe.
The Orkneys have been so fascinating for so long that even the vandalism needs to be preserved. In Maeshowe burial mound you can see where Viking tourists who came to the monument, already ancient by their time, wrote graffiti about their girlfriends on the walls. They wrote in Norse runes.
The Orkney islands were the headquarters of the Viking invasion fleets, and to this day the Orkneys are the only place in the world besides Norway where the Norwegian national day is celebrated.
The islands are filled with Tolkienesque place names like the Ring of Brodgar, the Brough of Birsay, the Standing Stones of Stenness. Sagas were born here, like that of the peaceable 12th century Earl of Orkney, treacherously assassinated and now known as St Magnus, after whom the cathedral is named.
Sagas were created here in living memory. This is where the British home fleet was at anchor and the German fleet still lies. The battle fleet of the German Imperial Navy transferred in its entirety to Scapa Flow in 1919 to await a decision on its future. The German sailors could not bring themselves to give up their ships; they opened the seacocks and scuttled them all. At low tide you can still see the rusting hulks of Wilhelmine ambitions to dominate Europe.
If the Orkneys sound bleak and rocky, that would be the wrong impression to leave. They have rich and fertile farming land with green plains rolling on under a pearl sky. People tell folk tales around the peat fires, drinking ginger-flavoured whisky; an orange cat pads around the grain heaps in the Highland Park distillery, and the islands shimmer under the ‘simmer dim’ of nightless summer days. I should be there now.
By December 1914 the Great War had been raging for nearly five months. If anyone had really believed that it would be ‘all over by Christmas’ then it was clear that they had been cruelly mistaken. Soldiers in the trenches had gained a grudging respect for their opposite numbers. After all, they had managed to fight each other to a standstill.
On Christmas Eve there was a severe frost. From the perspective of the freezing-cold trenches the idea of the season of peace and goodwill seemed surrealistic. Yet parcels and Christmas gifts began to arrive in the trenches and there was a strange atmosphere in the air. Private William Quinton was watching:
We could see what looked like very small coloured lights. What was this? Was it some prearranged signal and the forerunner of an attack? We were very suspicious, when something even stranger happened. The Germans were actually singing! Not very loud, but there was no mistaking it. Suddenly, across the snow-clad No Man’s Land, a strong clear voice rang out, singing the opening lines of “Annie Laurie“. It was sung in perfect English and we were spellbound. To us it seemed that the war had suddenly stopped! Stopped to listen to this song from one of the enemy.
On Christmas Day itself, in some sectors of the line, there was no doubting the underlying friendly intent. Yet the men who took the initiative in starting a truce were brave – or foolish – as was witnessed by Sergeant Frederick Brown:
Sergeant Collins stood waist high above the trench waving a box of Woodbines above his head. German soldiers beckoned him over, and Collins got out and walked halfway towards them, in turn beckoning someone to come and take the gift. However, they called out, “Prisoner!” A shot rang out, and he staggered back, shot through the chest. I can still hear his cries, “Oh my God, they have shot me!”
This was not a unique incident. Yet, despite the obvious risks, men were still tempted. Individuals would climb out of the trench, then dive back in, gradually becoming bolder, as Private George Ashurst recalled:
It was grand, you could stretch your legs and run about on the hard surface. We tied an empty sandbag up with its string and kicked it about on top – just to keep warm of course. We did not intermingle. Part way through we were all playing football. It was so pleasant to get out of that trench from between them two walls of clay and walk and run about – it was heaven.
The idea that football matches were played between the British and Germans in No Man’s Land has taken a grip, but the evidence remains elusive.
The truce was not planned or controlled – it just happened. Even senior officers recognised that there was little that could be done in this strange state of affairs. Brigadier General Lord Edward Gleichen accepted the truce as a fait accompli, but was keen to ensure that the Germans did not get too close to the ramshackle British trenches:
They came out of their trenches and walked across unarmed, with boxes of cigars and seasonable remarks. What were our men to do? Shoot? You could not shoot unarmed men. Let them come? You could not let them come into your trenches; so the only thing feasible was done – and our men met them half-way and began talking to them. Meanwhile our officers got excellent close views of the German trenches.
Another practical reason for embracing the truce was the opportunity it presented for burying the dead that littered No Man’s Land. Private Henry Williamson was assigned to a burial party:
The Germans started burying their dead which had frozen hard. Little crosses of ration box wood nailed together and marked in indelible pencil. They were putting in German, ‘For Fatherland and Freedom!’ I said to a German, “Excuse me, but how can you be fighting for freedom? You started the war, and we are fighting for freedom!” He said, “Excuse me English comrade, but we are fighting for freedom for our country!”
It should be noted that the truce was by no means universal, particularly where the British were facing Prussian units.
For the vast majority of the participants, the truce was a matter of convenience and maudlin sentiment. It did not mark some deep flowering of the human spirit, or signify political anti-war emotions taking root amongst the ranks. The truce simply enabled them to celebrate Christmas in a freer, more jovial, and, above all, safer environment, while satisfying their rampant curiosity about their enemies.
The truce could not last: it was a break from reality, not the dawn of a peaceful world. The gradual end mirrored the start, for any misunderstandings could cost lives amongst the unwary. For Captain Charles Stockwell it was handled with consummate courtesy:
At 8.30am I fired three shots in the air and put up a flag with ‘Merry Christmas!’ on it, and I climbed on the parapet. He put up a sheet with, ‘Thank you’ on it, and the German captain appeared on the parapet. We both bowed and saluted and got down into our respective trenches – he fired two shots in the air and the war was on again!
In other sectors, the artillery behind the lines opened up and the bursting shells soon shattered the truce.
War regained its grip on the whole of the British sector. When it came to it, the troops went back to war willingly enough. Many would indeed have rejoiced at the end of the war, but they were still willing to accept orders, still willing to kill Germans. Nothing had changed.