Viewing: Blog Posts Tagged with: british, Most Recent at Top
Autumn is here again – in England, the season of mists and mellow fruitfulness, in the US also the season of Thanksgiving. On the fourth Thursday in November, schoolchildren across the country will stage pageants, some playing black-suited Puritans, others Native Americans bedecked with feathers. By tradition, Barack Obama will ‘pardon’ a turkey, but 46 million others will be eaten in a feast complete with corncobs and pumpkin pie. The holiday has a long history: Lincoln fixed the date (amended by Roosevelt in 1941), and Washington made it a national event. Its origins, of course, lay in the Pilgrim Fathers’ first harvest of 1621.
Who now remembers who these intrepid migrants were – not the early ‘founding fathers’ they became, but who they were when they left? The pageant pilgrims are undifferentiated. Who knows the name of Christopher Martin, a merchant from Billericay near Chelmsford in Essex? He took his whole family on the Mayflower, most of whom, including Martin himself, perished in New Plymouth’s first winter. They died Essex folk in a strange land: there was nothing ‘American’ about them. And as for Thanksgiving, well, that habit came from the harvest festivals and religious observances of Protestant England. Even pumpkin pie was an English dish, exported then forgotten on the eastern side of the Atlantic.
Towns like Billericay, Chelmsford and Colchester were crucial to American colonization: ordinary places that produced extraordinary people. The trickle of migrants of the 1620s became, in the next decade, a flood, leading to some remarkable transformations. In 1630 Francis Wainwright was drawing ale and clearing pots in a Chelmsford inn when his master, Alexander Knight, decided to emigrate to Massachusetts. It was an age of austerity, of bad harvests and depression in the cloth industry. Moreover, those who wanted the Protestant Reformation to go further – Puritans – feared that under Charles I it was slipping backwards. Many thought they would try their luck elsewhere until England’s fortunes were restored, perhaps even that by building a ‘new’ England they could help with this restoration. Wainwright, aged about fourteen, went with Knight, and so entered a world of hardship and danger and wonder.
One May dawn, seven years later, Wainwright was standing by the Mystic River in Connecticut, one of seventy troops waiting to shoot at approaching Pequot warriors. According to an observer, the Englishmen ‘being bereaved of pity, fell upon the work without compassion’, and by dusk 400 Indians lay dead in their ruined encampment. The innkeeper’s apprentice had fired until his ammunition was exhausted, then used his musket as a club. One participant celebrated the victory, remarking that English guns had been so fearsome, it was ‘as though the finger of God had touched both match and flint’. Another rejoiced that providence had made a ‘fiery oven’ of the Pequots’ fort. Wainwright took two native heads home as souvenirs. Unlike many migrants, he stayed in America, proud to be a New Englander, English by birth but made different by experience. He lived a long life in commerce, through many fears and alarms, and died at Salem in 1692 during the white heat of the witch-trials.
The story poses hard historical questions. What is identity, and how does it change? Thanksgiving pageants turn Englishmen into Americans as if by magic; but the reality was more gradual and nuanced. Recently much scholarly energy has been poured into understanding past emotions. We may think our emotions are private, but they leak out all the time; we may even use them to get what we want. Converted into word and deed, emotions leave traces in the historical record. When the Pilgrim William Bradford called the Pequot massacre ‘a sweet sacrifice’, he was not exactly happy but certainly pleased that God’s will had been done.
Puritans are not usually associated with emotion, but they were deeply sensitive to human and divine behaviour, especially in the colonies. Settlers were proud to be God’s chosen people – like Israelites in the wilderness – yet pride brought shame, followed by doubt that God liked them at all. Introspection led to wretchedness, which was cured by the Holy Spirit, and they were back to their old censorious selves. In England, even fellow Puritans thought they’d lost the plot, as did most (non-Puritan) New Englanders. But godly colonists established what historians call an ‘emotional regime’ or ‘emotional community’ in which their tears and thunder were not only acceptable but carried great political authority.
John Winthrop, the leader of the fleet that carried Francis Wainwright to New England, was an intensely emotional man who loved his wife and children almost as much as he loved God. Gaunt, ascetic and tirelessly judgmental, he became Massachusetts Bay Colony’s first governor, driven by dreams of building a ‘city upon a hill’. It didn’t quite work out: Boston grew too quickly, and became diverse and worldly. And not everyone cared for Winthrop’s definition of liberty: freedom to obey him and his personal interpretation of God’s designs. But presidents from Reagan to Obama have been drawn to ‘the city upon the hill’ as an emotionally potent metaphor for the US in its mission to inspire, assist, and police the world.
Winthrop’s feelings, however, came from and were directed at England. His friend Thomas Hooker, ‘the father of Connecticut’, cut his teeth as a clergyman in Chelmsford when Francis Wainwright lived there. Partly thanks to Wainwright, one assumes, he found the town full of drunks, with ‘more profaneness than devotion’. But Hooker ‘quickly cleared streets of this disorder’. The ‘city upon the hill’, then, was not a blueprint for America, but an exemplar to help England reform itself. Indeed, long before the idea was associated with Massachusetts, it related to English towns – notably Colchester – that aspired to be righteous commonwealths in a country many felt was going to the dogs. Revellers did not disappear from Chelmsford and Colchester – try visiting on a Saturday night – but, as preachers and merchants and warriors, its people did sow the seeds from which grew the most powerful nation in the world.
So if you’re celebrating Thanksgiving this year, or you know someone who is, it’s worth remembering that the first colonists to give thanks were not just generic Old World exiles, uniformly dull until America made them special, but living, breathing emotional individuals with hearts and minds rooted in English towns and shires. To them, the New World was not an upgrade on England: it was a space in which to return their beloved country to its former glories.
Featured image credit: Signing of the Constitution, by Thomas P. Rossiter. Public domain via Wikimedia Commons
Alan Mathison Turing (1912-1954) was a mathematician and computer scientist, remembered for his revolutionary Automatic Computing Engine, on which the first personal computer was based, and his crucial role in breaking the Enigma code during the Second World War. He continues to be regarded as one of the greatest scientists of the 20th century.
We live in an age that Turing both predicted and defined. His life and achievements are starting to be celebrated in popular culture, largely with the help of the newly released film The Imitation Game, starring Benedict Cumberbatch as Turing and Keira Knightley as Joan Clarke. We’re proud to publish some of Turing’s own work in mathematics, computing, and artificial intelligence, as well as numerous explorations of his life and work. Use our interactive Enigma Machine below to learn more about Turing’s extraordinary achievements.
Image credits: (1) Bletchley Park Bombe by Antoine Taveneaux. CC-BY-SA-3.0 via Wikimedia Commons. (2) Alan Turing Aged 16, Unknown Artist. Public domain via Wikimedia Commons. (3) Good question by Garrett Coakley. CC-BY-SA 2.0 via Flickr.
Remembrance Day is a memorial day observed in Commonwealth of Nations member states since the end of the First World War to remember those who have died in the line of duty. It is observed by a two-minute silence at the ’11th hour of the 11th day of the 11th month’, in accordance with the armistice signed by representatives of Germany and the Entente on 11 November 1918. The First World War officially ended with the signing of the Treaty of Versailles on 28 June 1919. In the UK, Remembrance Sunday occurs on the Sunday closest to 11 November, and is marked by ceremonies at local war memorials in most villages, towns, and cities. The red poppy has become a symbol for Remembrance Day due to the poem In Flanders Fields, by Lieutenant Colonel John McCrae.
You can discover more about the history behind the First World War by exploring the free resources included in the interactive image above.
Feature image credit: Poppy Field, by Martin LaBar. CC-BY-NC-2.0 via Flickr.
Time passes quickly. As we track the progression of events hundred years ago on the Western Front, the dramas flash by. In the time it takes to answer an e-mail the anniversary of another battle has come and gone.
We have celebrated the fumbling British skirmishes at Mons and Le Cateau in late August, but largely forgotten the French triumph at the Battle of the Marne which first stemmed and threw back the German wheeling attack through Belgium into Northern France under the Schlieffen Plan. We have already bypassed the spirited Franco-British attempts at the Battle of the Aisne in September to take the Chemin des Dames. The Race to the Sea was under way: the British and German Armies desperately trying to turn their enemy’s northern flank.
Throughout, the performance of the British Expeditionary Force has often been exaggerated. Imaginative accounts of Germans advancing in massed columns and being blown away by rapid rifle fire are common. A rather more realistic assessment is that the British infantry were steadfast enough in defence, but unable to function properly in coordination with their artillery or machine guns. The Germans seemed to have a far better grip of the manifold disciplines of modern warfare.
Yet everything changed in October. The Germans were scraping the barrel for manpower and decided to throw new reserve formations into the battle: young men with the minimum of training, incapable of sophisticated battle tactics. They were marched forward in a last gambler’s throw of the dice to try and break through to the Channel Ports. To do that they needed first to capture the small Belgian city of Ypres.
One might have thought that Ypres was some fabled city, fought over to secure untold wealth or a commanding tactical position. Nothing could be further from the truth. Ypres was just an ordinary town, lying in the centre of the fertile Western Flanders plain. Yet the low ridges to the east represented one of the last feasible lines of defence. The British also saw the town, not as an end in itself, but as a stepping stone to more strategically important locations pushing eastwards, such as the rail centre at Roulers or the ports of Ostend and Zeebrugge. For both sides Ypres was on the road to somewhere.
The battle began in mid-October and soon boiled up. Time and time again the Germans hurled themselves forward, the grey-green hordes pressing on and being shot down in their hundreds. The British had learnt many lessons, and this was where they finally proved themselves worthy adversaries for the German Army. On the evening of 23 October young Captain Harry Dillon was fighting for his life:
A great grey mass of humanity was charging, running for all God would let them, straight on to us not 50 yards off. Everybody’s nerves were pretty well on edge as I had warned them what to expect, and as I fired my rifle the rest all went off almost simultaneously. One saw the great mass of Germans quiver. In reality some fell, some fell over them, and others came on. I have never shot so much in such a short time, could not have been more than a few seconds and they were down. Suddenly one man – I expect an officer – jumped up and came on. I fired and missed, seized the next rifle and dropped him a few yards off. Then the whole lot came on again and it was the most critical moment of my life. Twenty yards more and they would have been over us in thousands, but our fire must have been fearful, and at the very last moment they did the most foolish thing they possibly could have done. Some of the leading people turned to the left for some reason, and they all followed like a great flock of sheep. We did not lose much time, I can give you my oath. My right hand is one huge bruise from banging the bolt up and down. I don’t think one could have missed at the distance and just for one short minute or two we poured the ammunition into them in boxfuls. My rifles were red hot at the finish. The firing died down and out of the darkness a great moan came. People with their arms and legs off trying to crawl away; others who could not move gasping out their last moments with the cold night wind biting into their broken bodies and the lurid red glare of a farm house showing up clumps of grey devils killed by the men on my left further down. A weird awful scene; some of them would raise themselves on one arm or crawl a little distance, silhouetted as black as ink against the red glow of the fire. [p. 287-288, Fire & Movement, by Peter Hart]
Some of the Germans had got within 25 yards of Dillon’s line. It had been a close-run thing: when the French relieved his unit later that night, they reported that some 740 German corpses littered the ground in front of his trenches. This was the real war: not a skirmish like the earlier battles, but the real thing.
The German attacks continued, followed, as day follows night, by French and British counter-attacks to restore the situation. The Germans nibbled at the Allied line but were unable to achieve anything of importance. Yet for all the sound and fury, over the next few days the front line stayed relatively static. The German troops were flagging in their efforts. After one last effort on 11 November the Germans threw in the towel. They would not break through the Allied lines in 1914. The British and French lines had held: battered, bruised, but unbroken. The First Battle of Ypres had confirmed the strategic victory gained by the French at the Marne. The German advance in the west had been blocked; if they sought victory in 1915, they would have to look to the east and attack Russia.
The 1914 campaign would prove decisive to the war. The utter failure of the Schlieffen Plan, designed to secure the rapid defeat of France, meant that Germany would be condemned to ruinous hostilities on two fronts. This was the great turning-point of the whole war. The pre-war predictions from the German strategists that they could not prevail in a long-drawn out war against the combined forces of France and Russia proved accurate, especially when the British Empire and United States joined the fight. The German Army fought with a sustained skill and endurance, but after 1914, the odds really were stacked against them.
The fifth of November is not just an excuse to marvel at sparklers, fireworks, and effigies; it is part of a national tradition that is based on one of the most famous moments in British political history. The Gunpowder Plot itself was actually foiled on the night of Monday 4 November 1605. However, throughout the following day, Londoners were asked to light bonfires in order to celebrate the failure of the assassination attempt on King James I of England. Henceforth, the fifth of November has been known as ‘Bonfire Night’ or even ‘Guy Fawkes Night’ – named after the most posthumously famous of the thirteen conspirators. Guy Fawkes became the symbol for the conspirators after being caught during the failed treason attempt. For centuries after 1605, boys have created cloaked effigies – based on Guy Fawkes’s disguised appearance in the Vaults at the House of Lords – and asked for “a penny for the Guy”.
Below is a timeline that describes the events leading up to the failed Gunpowder Plot and the execution of Guy Fawkes and his fellow conspirators. If you would like to learn more about Bonfire Night, you can explore the characters behind the Gunpowder Plot, the traditions associated with it, or simply learn how to throw the best Guy Fawkes Night party.
Feature image credit: Guy Fawkes, by Crispijn van de Passe der Ältere. Public domain via Wikimedia Commons.
One hundred years ago today, far from the erupting battlefields of Europe, a small German force in the city of Tsingtau (Qingdao), Germany’s most important possession in China, was preparing for an impending siege. The small fishing village of Qingdao and the surrounding area had been reluctantly leased to the German Empire by the Chinese government for 99 years in 1898, and German colonists soon set about transforming this minor outpost into a vibrant city boasting many of the comforts of home, including the forerunner of the now-famous Tsingtao Brewery. By 1914, Qingdao had over 50,000 residents and was the primary trading port in the region. Given its further role as the base for the Far East Fleet of the Imperial German Navy, however, Qingdao was unable to avoid becoming caught up in the faraway European war.
The forces that besieged Qingdao in the autumn of 1914 were composed of troops from Britain and Japan, the latter entering the war against Germany in accord with the Anglo-Japanese Alliance. The Alliance had been agreed in 1902 amid growing anxiety in Britain regarding its interests in East Asia, and rapidly modernizing Japan was seen as a useful ally in the region. For Japanese leaders, the signing of such an agreement with the most powerful empire of the day was seen as a major diplomatic accomplishment and an acknowledgement of Japan’s arrival as one of the world’s great powers. More immediately, the Alliance effectively guaranteed the neutrality of third parties in Japan’s looming war with Russia, and Japan’s victory in the Russo-Japanese War of 1904-05 sent shockwaves across the globe as the first defeat of a great European empire by a non-Western country in a conventional modern war.
In Britain, Japan’s victory was celebrated as a confirmation of the strength of its Asian ally, and represented the peak of a fascination with Japan in Britain that marked the first decade of the twentieth century. This culminated in the 1910 Japan-British Exhibition in London, which saw over eight million visitors pass through during its six-month tenure. In contrast, before the 1890s, Japan had been portrayed in Britain primarily as a relatively backward yet culturally interesting nation, with artists and intellectuals displaying considerable interest in Japanese art and literature. Japan’s importance as a military force was first recognized during the Sino-Japanese War of 1894-95, and especially from the time of the Russo-Japanese War, Japan’s military prowess was popularly attributed to a supposedly ancient warrior spirit that was embodied in ‘bushido’, or the ‘way of the samurai’.
The ‘bushido’ ideal was popularized around the world especially through the prominent Japanese educator Nitobe Inazo’s (1862-1933) book Bushido: The Soul of Japan, which was originally published in English in 1900 and achieved global bestseller status around the time of the Russo-Japanese War (a Japanese translation first appeared in 1908). The British public took a positive view towards the ‘national spirit’ of its ally, and many saw Japan as a model for curing perceived social ills. Fabian Socialists such as Beatrice Webb (1858-1943) and Oliver Lodge (1851-1940) lauded the supposed collectivism of ‘bushido’, while Alfred Stead (1877-1933) and other promoters of the Efficiency Movement celebrated Japan’s rapid modernization. For his part, H. G. Wells’s 1905 novel A Modern Utopia included a ‘voluntary nobility’ called ‘samurai’, who guided society from atop a governing structure that he compared to Plato’s ideal republic. At the same time, British writers lamented the supposed decline of European chivalry from an earlier ideal, contrasting it with the Japanese who had seemingly managed to turn their ‘knightly code’ into a national ethic followed by citizens of all social classes.
The ‘bushido boom’ in Britain was not mere Orientalization of a distant society, however, but was strongly influenced by contemporary Japanese discourse on the subject. The term ‘bushido’ only came into widespread use around 1900, and even a decade earlier most Japanese would have been bemused by the notion of a national ethic based on the former samurai class. Rather than being an ancient tradition, the modern ‘way of the samurai’ developed from a search for identity among Japanese intellectuals at the end of the nineteenth century. This process saw an increasing shift away from both Chinese and European thought towards supposedly native ideals, and the former samurai class provided a useful foundation. The construction of an ethic based on the ‘feudal’ samurai was given apparent legitimacy by the popularity of idealized chivalry and knighthood in nineteenth-century Europe, with the notion that English ‘gentlemanship’ was rooted in that nation’s ‘feudal knighthood’ proving especially influential. This early ‘bushido’ discourse profited from the nationalistic fervor following Japan’s victory over China in 1895, and the concept increasingly came to be portrayed as a unique and ancient martial ethic. At the same time, those theories that had drawn inspiration from European models came to be ignored, with one prominent Japanese promoter of ‘bushido’ deriding European chivalry as ‘mere woman-worship’.
In the first years of the twentieth century, the Anglo-Japanese Alliance contributed greatly to the positive reception in Britain of theories positing a Japanese ‘martial race’, and the fate of ‘bushido’ in the UK demonstrated the effect of geopolitics on theories of ‘national characteristics’. By 1914, British attitudes had begun to change amid increasing concern regarding Japan’s growing assertiveness. Even the Anglo-Japanese operation that finally captured Qingdao in November was marked by British distrust of Japanese aims in China, a sentiment that was strengthened by Japan’s excessive demands on China the following year. Following the war, Japan’s reluctance to return the captured territory to China caused British opposition to Japan’s China policy to increase, leading to the end of the Anglo-Japanese Alliance in 1923. The two countries subsequently drifted even further apart, and by the 1930s, ‘bushido’ was popularly described in Britain as an ethic of treachery and cruelty, only regaining its positive status after 1945 through samurai films and other popular culture as Japan and Britain again became firm allies in the Cold War.
Headline image credit: Former German Governor’s Residence in Qingdao, by Brücke-Osteuropa. Public domain via Wikimedia Commons.
Today, 27 October, sees the centenary of the birth of the poet Dylan Marlais Thomas. Born on Cwmdonkin Drive, Swansea, and brought up in the genteel district of Uplands, Thomas had a suburban and orthodox childhood; his father was an aspirational but disappointed English teacher at the local grammar school.
Swansea would remain a place for home comforts. But from the mid-1930s, Thomas began a wandering life that took in London’s Fitzrovia — and in particular its pubs, the Fitzroy Tavern and the Wheatsheaf — and then (as a dysfunctionally married man) the New Forest, squalid rooms in wartime London, New Quay on Cardigan Bay, Italy, Laugharne in Carmarthenshire, and from 1950 the United States where he gained a popular student following and where he died in Manhattan, aged thirty-nine.
For all his wanderings, few of Thomas’s poems were written outside Wales. Indeed, half of the published poems for which he is known were written, in some form, while he was living at home in Swansea between 1930 and 1934. As Paul Ferris, his Oxford DNB biographer, writes, “commonplace scenes and characters from childhood recur in his writing: the park that adjoins Cwmdonkin Drive; the bay and sands that were visible from the windows; a maternal aunt he visited” – the latter giving rise to one of Thomas’s best-known poems, “Fern Hill.” In literary London, and in numerous bar rooms thereafter, Thomas’s “drinking and clowning were indispensable to him, but they were only half the story; ‘I am as domestic as a slipper’ he once observed, with some truth.”
On 27 October 1914 Dylan Thomas was born in Swansea, South Wales. He is widely regarded as one of the most significant Welsh writers of the 20th century. Thomas’s popular reputation has continued to grow after his death on 9 November 1953, despite some critics describing his work as too ‘florid’. He wrote prolifically throughout his lifetime but is arguably best known for his poetry. His poem The hand that signed the paper is taken from Jon Stallworthy’s edited collection The Oxford Book of War Poetry, and can be found below:
Autumn 2014 marked the tenth anniversary of the publication of the Oxford Dictionary of National Biography. In a series of blog posts, academics, researchers, and editors looked at aspects of the ODNB’s online evolution in the decade since 2004. In this final post of the series, Alex May—ODNB’s editor for the very recent past— considers the Dictionary as a record of contemporary history.
When it was first published in September 2004, the Oxford DNB included biographies of people who had died (all in the ODNB are deceased) on or before 31 December 2001. In the subsequent ten years we have continued to extend the Dictionary’s coverage into the twenty-first century—with regular updates recording those who have died since 2001. Of the 4300 people whose biographies have been added to the online ODNB in this decade, 2172 died between 1 January 2001 and 31 December 2010 (our current terminus)—i.e., about 220 per year of death. While this may sound a lot, the average number of deaths per year over the same period in the UK was just short of 500,000, indicating a roughly one in 2300 chance of entering the ODNB. This does not yet approach the levels of inclusion for people who died in the late nineteenth century, let alone earlier periods: someone dying in England in the first decade of the seventeenth century, for example, had a nearly three-times greater chance of being included in the ODNB than someone who died in the first decade of the twenty-first century.
‘Competition’ for spaces at the modern end of the dictionary is therefore fierce. Some subjects are certainties—prime ministers such as Ted Heath or Jim Callaghan, or Nobel prize-winning scientists such as Francis Crick or Max Perutz. There are perhaps fifty or sixty potential subjects a year about whose inclusion no-one would quibble. But there are as many as 1500 people on our lists each year, and for perhaps five or six hundred of them a very good case could be made.
This is where our advisers come in. Over the last ten years we have relied heavily on the help of some 500 people, experts and leading figures in their fields whether as scholars or practitioners, who have given unstintingly of their time and support. Advisers are enjoined to consider all the aspects of notability, including achievement, influence, fame, and notoriety. Of course, their assessments can often vary, particularly in the creative fields, but even in those it is remarkable how often they coincide.
Our advisers have also in most cases been crucial in identifying the right contributor for each new biography, whether he or she be a practitioner from the same field (we often ask politicians to write on politicians—Ted Heath and Jim Callaghan are examples of this—lawyers on lawyers, doctors on doctors, and so on), or a scholar of the particular subject area. Sadly, a number of our advisers and contributors have themselves entered the dictionary in this decade, among them the judge Tom Bingham, the politician Roy Jenkins, the journalist Tony Howard, and the historian Roy Porter.
Just as the selection of subjects is made with an eye to an imaginary reader fifty or a hundred years hence (will that reader need or want to find out more about that person?), so the entries themselves are written with such a reader in view. ODNB biographies are not always the last word on a subject, but they are rarely the first. Most of the ‘recently deceased’ added to the Dictionary have received one or more newspaper obituaries. ODNB biographies differ from newspaper obituaries in providing more, and more reliable, biographical information, as well as being written after a period of three to four years’ reflection between death and publication of the entry—allowing information to emerge and reputations to settle. In addition, ODNB lives attempt to provide an understanding of context, and a considered assessment (implicit or explicit) of someone’s significance: in short, they aim to narrate and evaluate a person’s life in the context of the history of modern Britain and the broad sweep of a work of historical reference.
The result, over the last ten years, has been an extraordinary collection of biographies offering insights into all corners of twentieth and early twenty-first century British life, from multiple angles. The subjects themselves have ranged from the soprano Elisabeth Schwarzkopf to the godfather of punk, Malcolm McLaren; the high tory Norman St John-Stevas to the IRA leader Sean MacStiofáin; the campaigner Ludovic Kennedy to the jester Jeremy Beadle; and the turkey farmer Bernard Matthews to Julia Clements, founder of the National Association of Flower Arranging Societies. By birth date they run from the founder of the Royal Ballet, Dame Ninette de Valois (born 1898, died 2001), to the ‘celebrity’ Jade Goody (born 1981, died 2009). Mention of the latter reminds us of Leslie Stephen’s determination to represent the whole of human life in the pages of his original, Victorian DNB. Poignantly, in light of the 100th anniversary of the outbreak of the First World War, among the oldest subjects included in the dictionary are three of the ‘last veterans’, Harry Patch, Henry Allingham, and Bill Stone, who, as the entry on them makes clear, reacted very differently to the notion of commemoration and their own late fame.
The work of selecting from thousands of possible subjects, coupled with the writing and evaluation of the chosen biographies, builds up a contemporary picture of modern Britain as we record those who’ve shaped the very recent past. As we begin the ODNB’s second decade this work continues: in January 2015 we’ll publish biographies of 230 people who died in 2011 and we’re currently editing and planning those covering the years 2012 and 2013, including what will be a major article on the life, work, and legacy of Margaret Thatcher.
Links between biography and contemporary history are further evident online—creating opportunities to search across the ODNB by profession or education, and so reveal personal networks, associations, and encounters that have shaped modern national life. Online it’s also possible to make connections between people active in or shaped by national events. Searching for Dunkirk, or Suez, or the industrial disputes of the 1970s brings up interesting results. Searching for the ‘Festival of Britain’ identifies the biographies of 35 men and women who died between 2001 and 2010: not just the architects who worked on the structures or the sculptors and artists whose work was showcased, but journalists, film-makers, the crystallographer Helen Megaw (whose diagrams of crystal structures adorned tea sets used during the Festival), and the footballer Bobby Robson, who worked on the site as a trainee electrician. Separately, these new entries shed light not only on the individuals concerned but on the times in which they lived. Collectively, they amount to a substantial and varied slice of modern British national life.
Headline image credit: Harry Patch, 2007, by Jim Ross. CC-BY-SA-3.0 via Wikimedia Commons.
Autumn 2014 marks the tenth anniversary of the publication of the Oxford Dictionary of National Biography. In a series of blog posts, academics, researchers, and editors look at aspects of the ODNB’s online evolution in the decade since 2004. Here the ODNB’s publication editor, Philip Carter, considers how an ever-evolving Dictionary is being transformed by new opportunities in digital research.
When it was first published in September 2004, the Oxford DNB brought together the work of more than 10,000 humanities scholars charting the lives of nearly 55,000 historical individuals. Collectively it captured a generation’s understanding and perception of the British past and Britons’ reach worldwide. But if the Dictionary was a record of scholarship within a particular timeframe, it was also seen from the outset as a work in progress. This is most evident in the decision to include in the ODNB every person who had appeared in the original, Victorian DNB. Doing so defined the 2004 Dictionary (to quote the entry on Colin Matthew, its founding editor) as ‘a collective account of the attitudes of two centuries: the nineteenth as well as the twentieth, the one developing organically from the other.’
In the decade since 2004 this notion of the ODNB as an organic ‘work in progress’ has gone a step further. This is seen, in part, in the continued extension of biographical coverage, both of the ‘recently deceased’ and of newly documented lives from earlier periods—as discussed in other articles in this 10th anniversary series. But in addition to new content there’s also been the evolution—in the form of corrections, revisions, amplifications, and re-appraisals—of a sizeable share of the ODNB’s 55,000 existing biographies, as new scholarship comes to light.
The need to ‘keep up’ with fresh research is not new. In 1908 the Victorian DNB was reprinted in an edition that collated the marginalia and correspondence born of several decades of reading. Thereafter, no further reprints were undertaken and later findings remained on file: information relating to the birthplace of the Quaker reformer Elizabeth Fry, for example—submitted by postcard in 1918—could not be addressed until the 2004 edition of the Dictionary. Such things are today unimaginable. Over the past ten years, and alongside the programme of new biographies, existing ODNB entries have been regularly updated online—with proposed amendments reviewed by the Dictionary’s academic editors in consultation with authors and reviewers. It’s worth remembering that today’s expectation of regular online updating is one that’s emerged in the lifetime of the published ODNB. Just 10 years ago, many saw online reference as a means of delivery, not a new entity in its own right. The expectation that scholarly online reference could and should keep in step with new research and publications (and could do so while maintaining academic standards) is one pioneered, in part, by works like the ODNB.
One consequence is that Dictionary editors now focus on conservation (just as museum or gallery curators care for items in their collection) as well as on commissioning. In doing so we draw heavily on an ever-growing range of digitized records that have become available in the lifetime of the published Dictionary. This has been a truly remarkable development in humanities research in the past 5 to 10 years. For British history we’ve seen the digitization of (to name a few): the census returns for England and Wales (to 1911); indexes of civil registration in England and Wales (births, marriages, and deaths from 1838); Scottish parish registers from about 1500; early modern wills and probates; and 300 years’ worth of national and provincial newspapers. And this just scratches the surface.
In 2004 there were many people in the ODNB for whom the biographical trail ran cold. Access to paper records alone once meant that certain individuals simply disappeared from the historical record. Of course, some lives remain puzzles. But with these newly digitized sources we’re now able to address many of the previously unknown and untraceable episodes that were scattered across the 2004 edition. A decade on we’ve added details of nearly 3000 previously unknown births, marriages, and deaths for ODNB subjects. Access to newly digitized sources also prompts more wide-ranging revisions. Take, for example, the traveller Eliza Fay (1755/6-1816), known for her Original Letters from India, whose Dictionary entry has recently doubled in length owing to new genealogical research that minutely plots a troubled personal life that led Fay to travel to India and the business ventures she maintained there.
The case of Eliza Fay reminds us that this boom in digitized resources is particularly valuable for better understanding the lives of nineteenth- and twentieth-century women. As a result of multiple marriages and/or multiple name changes, many such biographies are prone to obscurity. There are also many occasions when women gave false information about their age, often for professional reasons. With digital resources, and a little detective work, it’s now possible to recover these stories. One example is Valerie, Lady Meux (1852-1910), who married into one of Britain’s wealthiest brewing families. To her contemporaries, and to generations of researchers, Lady Meux appeared the epitome of high society. But recent research uncovers a very different story: that of Susie Langton, the daughter of a Devon baker who—via multiple changes of birthdate and name—worked her way into the London elite. To Susie Langton (or Lady Meux), the discovery of her true past may not have been welcome, but for modern historians it becomes a key part of her story, and a fascinating case study of late-Victorian social mobility.
A good deal of this detective work is being done from the ODNB office. But much more comes in from thousands of researchers worldwide who are also making use of digitized resources. It’s our good fortune that the ODNB online is growing up with the Who Do You Think You Are? generation—a band of genealogists from whom we’ve benefited greatly thanks to their willingness to share new information. Such discoveries obviously enhance our understanding of the ODNB’s 60,000 main subjects, but they’re similarly adding much to the Dictionary’s 300,000 ‘other’ people: the parents, children, spouses, in-laws, patrons, teachers, business partners, and lovers who also populate these biographies. Looking ahead to our second decade, we anticipate that more will be made of these hundreds of thousands of ‘extras’ in creating a richer picture of the British past—as the ODNB continues to document and add to what we know.
This year is the 250th anniversary of Horace Walpole’s The Castle of Otranto, first published on Christmas Eve 1764 as a seasonal ghost story. The Castle of Otranto is often dubbed the “first Gothic novel” because Walpole described it as a “Gothic story,” but for him the Gothic meant something very different from what it means today. While the Gothic was certainly associated with the supernatural, it was predominantly a theory of English progress rooted in Anglo-Saxon and medieval history — effectively the cultural wing of parliamentarian politics and Protestant theology. The genre of the “Gothic novel,” with all its dire associations of uncanny horror, would not come into being for at least another century. Instead, the writing that followed in the wake of Otranto was known as the German School, the ‘Terrorist System of Writing’, or even hobgobliana.
Reading Otranto today, however, it is almost impossible to forget what 250 years of Gothickry have bequeathed to our culture in literature, architecture, film, music, and fashion: everything from the great Gothic Revival design of the Palace of Westminster to none-more-black clothes for sale on Camden Town High Street and the eerie music of Nick Cave, Jordan Reyne, and Fields of the Nephilim.
And the cinema has been instrumental in spreading this unholy word. Despite being rooted in the history of the barbarian tribes who sacked Rome and the thousand-year epoch of the Dark Ages, the Gothic was also a state-of-the-art movement. Technology drove the Gothic dream, enabling, for instance, the towering spires and colossal naves of medieval cathedrals, or enlisting in nineteenth-century art and literature the latest scientific developments in anatomy and galvanism (Frankenstein), the circulation of the blood and infection (The Vampyre), or drug use and psychology (Strange Case of Dr Jekyll and Mr Hyde).
The moving image on the cinema screen therefore had an immediate and compelling appeal. The very experience of cinema was phantasmagoric — kaleidoscopic images projected in a darkened room, accompanied by often wild, expressionist music. The hallucinatory visions of Henry Fuseli and Gustave Doré arose and, like revenants, came to life.
Camera tricks, special effects, fantastical scenery, and monstrous figures combined in a new visual style, most notably in Robert Wiene’s The Cabinet of Dr Caligari (1920) and F. W. Murnau’s Nosferatu: A Symphony of Terror (1922). Murnau’s Nosferatu, the first vampire film, fed parasitically on Bram Stoker’s Dracula; it was rumored that Max Schreck, who played the nightmarish Count Orlok, was indeed a vampire himself. The horror film had arrived.
Mid-century Hollywood movie stars such as Bela Lugosi, who first played Dracula in 1931, and Boris Karloff, who played Frankenstein’s monster in the same year, made these roles iconic. Lugosi played Dracula as a baleful East European, deliberately melodramatic; Karloff was menacing in a different way: mute, brutal, and alien. Both embodied the threat of the “other”: communist Russia, as conjured up by the cinema. Frankenstein’s monster is animated by the new cinematic energy of electricity and light, while in Dracula the Count’s life and death are endlessly replayed on the screen in an immortal and diabolical loop.
It was in Britain, however, that horror films really took the cinema-going public by the throat. Britain was made for the Gothic cinema: British film-makers such as Hammer House of Horror could draw on the nation’s rich literary heritage, its crumbling ecclesiastical remains and ruins, the dark and stormy weather, and its own homegrown movie stars such as Peter Cushing and Christopher Lee. Lee in particular radiated a feral sexuality, enabling Hammer Horror to mix a heady cocktail of sex and violence on the screen. It was irresistible.
The slasher movies that have dominated international cinema since Hammer through franchises such as Hellraiser and Saw are more sensationalist melodrama than Gothic, but Gothic film does thrive and continues to create profound unease in audiences: The Exorcist, the Alien films, Blade Runner, The Blair Witch Project, and more overtly literary pictures such as Bram Stoker’s Dracula are all contemporary classics — as is Buffy the Vampire Slayer on TV.
And despite the hi-tech nature of film-making, the profound shift in the meaning of Gothic, and the gulf of 250 years, the pulse of The Castle of Otranto still beats in these films. The action of Otranto takes place predominantly in the dark in a suffocatingly claustrophobic castle and in secret underground passages. Inexplicable events plague the plot, and the dead — embodying the inescapable crimes of the past — haunt the characters like avenging revenants. Otranto is a novel of passion and terror, of human identity at the edge of sanity. In that sense, Horace Walpole did indeed set down the template of the Gothic. The Gothic may have mutated since 1764, it may now go under many different guises, but it is still with us today. And there is no escape.
September 2014 marked the tenth anniversary of the publication of the Oxford Dictionary of National Biography. Over the next month a series of blog posts explore aspects of the Dictionary’s online evolution in the decade since 2004. In this post, Sir David Cannadine describes his role as the new editor of the Oxford DNB.
Here at Princeton, the new academic year is very much upon us, and I shall soon begin teaching a junior seminar on ‘Winston Churchill, Anglo-America, and the “Special Relationship”’, which is always enormously enjoyable, not least because one of the essential books on the undergraduate reading list is Paul Addison’s marvellous brief biography, published by OUP, which he developed from the outstanding entry on Churchill that he wrote for the Oxford DNB. I’ve been away from the university for a year, on leave as a visiting professor at New York University, so there is a great deal of catching up to do. This month I also assume the editorial chair at the ODNB, as its fourth editor, in succession to the late-lamented Colin Matthew, to Brian Harrison, and to Lawrence Goldman.
As such, I shall be the first ODNB editor who is not resident in Britain, let alone living and working in Oxford, but this says more about our globalized and inter-connected world than it does about me. When I was contacted, several months ago, by a New York representative of OUP, asking me whether I might consider being the next editor, I gave my permanent residence in America as a compelling reason for not taking the job on. But he insisted that, far from being a disadvantage, this was in fact something of a recommendation. In following in the footsteps of my three predecessors (all, as it happens, personal friends) I am eager to do all I can to ensure that my occupancy of the editorial chair will not prove him (and OUP) to have been mistaken.
As must be true of any historian of Britain, the Oxford DNB and its predecessor have always been an essential part of my working life; and I can vividly recall the precise moment at which that relationship (rather inauspiciously) began. As a Cambridge undergraduate, I once mentioned to one of my supervisors that I greatly admired the zest, brio, and elan of J.H. Plumb’s brief life of the earl of Chatham, which I had been given a few years before as a school prize. ‘Oh’, he sniffily replied, ‘there’s no original research there; Plumb got it all from the DNB.’ Of course, I had heard of something called DNA; but what, I wondered, was this (presumably non-molecular) sequel called the DNB? Since I was clearly expected to know, I didn’t dare ask; but I soon found out, and so began a lifelong friendship.
During my remaining undergraduate days, as I worked away in the reading room of the Cambridge University Library, the DNB became a constant source of solace and relief: for when the weekly reading list seemed overwhelming, or the essay-writing was not going well, I furtively sought distraction by pulling a random volume of the DNB off the reference shelves. As a result, I cultivated what Leslie Stephen (founding editor of the Dictionary’s Victorian edition) called ‘the great art of skipping’ from one entry to another, and this remains one of the abiding pleasures provided by the DNB’s hard-copy successor. Once I started exploring the history of the modern British aristocracy, the DNB also became an invaluable research tool, bringing to life many a peer whose entry in Burke or Debrett was confined to the barest biographical outline.
Thus approached and appreciated, it was very easy to take the DNB for granted, and it was only when I wrote a lengthy essay on the volume covering the years 1961 to 1970, for the London Review of Books in 1981, that I first realized what an extraordinary enterprise it was and, indeed, had always been since the days when Leslie Stephen founded it almost one hundred years before. I also came to appreciate how it had developed and evolved across the intervening decades, and I gained some understanding of its strengths—and of its weaknesses, too. So I was not altogether surprised when OUP bravely decided to redo the whole Dictionary, and the DNB was triumphantly reborn as the ODNB—first published almost exactly 10 years ago—to which I contributed the biographies of George Macaulay Trevelyan and Noel Annan.
Since 2004 the Oxford DNB has continued to expand its biographical coverage with three annual online updates, the most recent of which appeared last week. In September 2013 I wrote a collective entry on the Calthorpe family for an update exploring the history of Birmingham and the Black Country, and I am eager to remain an intermittent but enthusiastic contributor now that I am editor. As we rightly mark and celebrate the tenth anniversary of the publication of the ODNB, and its successful continuation across the intervening decade, it is clear that I take over an enterprise in good spirits and an organization (as the Americans would say) in good shape. Within the United Kingdom and, indeed, around the world, the ODNB boasts an unrivalled global audience and an outstanding array of global contributors; and I greatly look forward to keeping in touch, and to getting to know many of you better, in the months and years to come.
Headline image credit: ODNB, online. Image courtesy of the ODNB editorial team.
September 2014 marks the tenth anniversary of the publication of the Oxford Dictionary of National Biography. Over the next month a series of blog posts explore aspects of the Dictionary’s online evolution in the decade since 2004. In this post, Henry Summerson considers how new research in medieval biography is reflected in ODNB updates.
Today’s publication of the Oxford Dictionary of National Biography’s September 2014 update—marking the Dictionary’s tenth anniversary—contains a chronological bombshell. The ODNB covers the history of Britons worldwide ‘from the earliest times’, a phrase which until now has meant since the fourth century BC, as represented by Pytheas, the Marseilles merchant whose account of the British Isles is the earliest known to survive. But a new ‘biography’ of the Red Lady of Paviland—whose incomplete skeleton was discovered in 1823 in Wales, and which today resides in Oxford’s Museum of Natural History—takes us back to distant prehistory. As the earliest known site of ceremonial human burial in western Europe, Paviland expands the Dictionary’s range by over 32,000 years.
The Red Lady’s is not the only ODNB biography pieced together from unidentified human remains (Lindow Man and the Sutton Hoo burial are others), while the new update also adds the fifteenth-century ‘Worcester Pilgrim’ whose skeleton and clothing are on display at the city’s cathedral. However, the Red Lady is the only one of these ‘historical bodies’ whose subject has changed sex—the bones having been found to be those of a pre-historical man, and not (as was thought when they were discovered), of a Roman woman.
The process of re-examination and re-interpretation which led to this discovery can serve as a paradigm for the development of the DNB, from its first edition (1885-1900) to its second (2004), and its ongoing programme of online updates. In the case of the Red Lady the moving force was, in its broadest sense, scientific. In this, ‘he’ is not unique in the Dictionary: the bones of the East Frankish queen Eadgyth (d.946), discovered in 2008, provide another example of human remains giving rise to a recent biography. But changes in analysis have more often originated in more conventional forms of historical scholarship. Since 2004 these processes have extended the ODNB’s pre-1600 coverage by 300 men and women, so bringing the Dictionary’s complement for this period to more than 7000 individuals.
In part, these new biographies are an evolution of the Dictionary as it stood in 2004 as we broaden areas of coverage in the light of current scholarship. One example is the 100 new biographies of medieval bishops that, added to the ODNB’s existing selection, now provide a comprehensive survey of every member of the English episcopacy from the Conquest to the Reformation—a project further encouraged by the publication of new sources by the Canterbury and York Society and the Early English Episcopal Acta series.
Taken together these new biographies offer opportunities to explore the medieval church, with reference to incumbents’ background and education, the place of patronage networks, or the shifting influence of royal and papal authority. That William Alnwick (d.1449), ‘a peasant born of a low family’, could become bishop of Norwich and Lincoln is, for example, indicative of the growing complexity of later medieval episcopal administration and its need for talented men. A second ODNB project (still in progress) focuses on late-medieval monasticism. Again, some notable people have come to light, including the redoubtable Elizabeth Cressener, prioress of Dartford, who opposed even Thomas Cromwell with success.
Away from religious life, recent projects to augment the Dictionary’s medieval and early modern coverage have focused on new histories of philanthropy—with men like Thomas Alleyne, a Staffordshire clergyman whose name is preserved by three schools—and of royal courts and courtly life. Hence first-time biographies of Sir George Blage, whom Henry VIII used to address as ‘my pig’, and at a lower social level, John Skut, the tailor who made clothes for most of the king’s wives: ‘while Henry’s queens came and went, John Skut remained.’
Alongside these are many included for remarkable or interesting lives which illuminate the past in sometimes unexpected ways. At the lowest social level, such lives may have been very ordinary, but precisely because they were commonplace they were seldom recorded. Where a full biography is possible, figures of this kind are of considerable interest to historians. One such is Agnes Cowper, a Southwark ‘servant and vagrant’ in the years around 1600; attempts to discover who was responsible for her maintenance shed a fascinating light on a humble and precarious life, and an experience shared by thousands of late-Tudor Londoners. Such light falls only rarely, but the survival of sources, and the readiness of scholars to investigate them, have also led to recent biographies of the Roman officers and their wives at Vindolanda, based on the famous ‘tablets’ found at Chesterholm in Northumberland; the early fourteenth-century anchorite Christina Carpenter, who provoked outrage by leaving her cell (but later returned to it), and whose story has inspired a film, a play and a novel; and trumpeter John Blanke, whose fanfares enlivened the early Tudor court and whose portrait image is the only identifiable likeness of a black person in sixteenth-century British art.
While people like Blanke are included for their distinctiveness, most ODNB subjects can be related to the wider world of their contemporaries. A significant component of the Dictionary since 2004 has been an interest in recreating medieval and early modern networks and associations; they include the sixth-century bringers of Christianity to England, the companions of William I, and the enforcers of Magna Carta. Each establishes connections between historical figures, sets the latter in context, and charts how appreciations of these networks and their participants have developed over time—from the works of early chroniclers to contemporary historians. Indeed, in several instances, notably the Round Table knights or the ‘Merrie Men’, it is this (often imaginative) interpretation and recreation of Britain’s medieval past that is to the fore.
The importance of medieval afterlives returns us to the Red Lady of Paviland. His biography presents what can be known, or plausibly surmised, about its subject, alongside the ways in which his bodily remains (and the resulting life) have been interpreted by successive generations—each perceptibly influenced by the cultural as well as scholarly outlook of the day. Next year sees the 800th anniversary of the granting of Magna Carta, an anniversary which can be confidently expected to bring further medieval subjects into the Oxford Dictionary of National Biography. It is unlikely that the historians responsible will be unaffected by considerations of the long-term significance of the Charter. Nor, indeed, should they be—it is the interaction of past and present which does most to bring historical biography to life.
Much of the comment on the official photographic portrait of the Queen released in April this year to celebrate her 88th birthday focussed on her celebrity photographer, David Bailey, who seemed to have ‘infiltrated’ (his word) the bosom of the establishment. Less remarked on, but equally of note, is that the very informal pose that the queen adopted showed her smiling, and not only smiling but also showing her teeth.
It is only very recently that monarchs have cracked a smile for a portrait, let alone a smile that revealed teeth. Before the modern age, monarchs embodied power – and power rarely smiles. Indeed it has often been thought worrying when it does. Prime Minister Tony Blair’s endlessly flashing teeth triggered as much suspicion as approval. The negative reaction was testimony to an unwritten law of portraiture, present until very recently in western art. According to this, an open mouth signifies plebeian status, extreme emotion, or else folly and licence, bordering on insanity. As late as the eighteenth century, an individual who liked to be depicted smiling as manifestly as Tony Blair would have risked being locked up as a lunatic.
The individual who broke this unwritten law of western portraiture was Louise Élisabeth Vigée Le Brun, whose charming smile – at once twinklingly seductive and reassuringly maternal – was displayed at the Paris Salon in 1787. It appears on the front cover of my book, The Smile Revolution in Eighteenth-Century Paris. The French capital had witnessed the emergence of modern dentistry over the course of the century – a subject that has been largely neglected. In addition, the city’s elites adopted the polite smile of sensibility that they had learned from the novels of Samuel Richardson and Jean-Jacques Rousseau. Madame Vigée Le Brun’s smile shocked the artistic establishment and the stuffy court elite out at Versailles, who still observed tradition, but it marked the advent of white teeth as a positive attribute in western art.
Yet if Vigée Le Brun’s example was followed by many of the most eminent artists of her day (David, Ingres, Gérard, etc), the white tooth smile took much longer to establish itself as a canonical and approved portrait gesture. The eighteenth century’s ‘Smile Revolution’ aborted after 1789. Politics under the French Revolution and the Terror were far too serious to accommodate smiles. The increasingly gendered world of separate spheres consigned the smile to the domestic environment. And for most of the nineteenth century, monarchs and men of power in the public sphere, following traditional modes of the expression of gravitas, invariably presented a smile-less face to the world.
Probably the first reigning monarch to have a portrait painted that revealed white teeth was Queen Victoria. This may seem surprising given her famous penchant for staying resolutely ‘unamused’. Yet in 1843, she commissioned the German portrait-painter Franz-Xaver Winterhalter to paint a delightfully informal study that showed the twenty-four-year-old monarch reclining on a sofa, revealing her teeth in a dreamy and indeed mildly aroused smile. Yet the conditions of the portrait’s commission showed that the seemly old rules were still in place. For Victoria had commissioned the portrait as a precious personal gift for her ‘angelic’ husband, Prince Albert. What she called her ‘secret picture’ was hung in the queen’s bedroom and was not seen in public throughout her reign. Indeed, its display in an exhibition in 2009, over a century after her death, marked only its second public showing since its creation. This was three years after Rolf Harris’s 2006 portrayal of the queen with a white-tooth smile, a significant precursor to David Bailey’s photograph.
If English monarchs have thus been late-comers to the twentieth-century smile-fest, their subjects have been baring their teeth in a smile for many decades. As early as the 1930s and 1940s, the practice of saying ‘cheese’ when confronted with a camera became the norm. Hollywood-style studio photography, advertising models, and more relaxed forms of sociability and subjectivity have combined to produce the twentieth century’s very own Smile Revolution. So it is worth reflecting whether the reigning monarch’s early twenty-first century acceptance of the smile’s progress will mark a complete and durable revolution in royal portraiture. Seemingly only time – and the Prince of Wales – will tell.
Over the summer of 1582 a group of English Catholic gentlemen met to hammer out their plans for a colony in North America — not Roanoke Island, Sir Walter Raleigh’s settlement of 1585, but Norumbega in present-day New England.
The scheme was promoted by two knights of the realm, Sir George Peckham and Sir Thomas Gerard, and it attracted several wealthy backers, including a gentleman from the midlands called Sir William Catesby. In the list of articles drafted in June 1582, Catesby agreed to be an Associate. In return for putting up £100 and ten men for the first voyage (forty for the next), he was promised a seignory of 10,000 acres and election to one of “the chief offices in government”. Special privileges would be extended to “encourage women to go on the voyage” and according to Bernardino de Mendoza, the Spanish ambassador in London, the settlers would “live in those parts with freedom of conscience.”
Religious liberty was important for these English Catholics because they didn’t have it at home. The Mass was banned, their priests were outlawed and, since 1571, even the possession of personal devotional items, like rosaries, was considered suspect. In November 1581, Catesby was fined 1,000 marks (£666) and imprisoned in the Fleet for allegedly harboring the Jesuit missionary priest, Edmund Campion, who was executed in December.
Campion’s mission had been controversial. He had challenged the state to a public debate and he had told the English Catholics that those who had been obeying the law and attending official church services every week — perhaps crossing their fingers, or blocking their ears, or keeping their hats on, to show that they didn’t really believe in Protestantism — had been living in sin. Church papistry, as it was known pejoratively, was against the law of God. The English government responded by raising the fine for non-attendance from 12 pence to £20 a month. It was a crippling sum and it prompted Catesby and his friends to go in search of a promised land.
The American venture was undeniably risky — “wild people, wild beasts, unexperienced air, unprovided land” did not inspire investor confidence — but it had some momentum in the summer of 1582. Francis Walsingham, Elizabeth I’s secretary of state, was behind it, but the Spanish scuppered it. Ambassador Mendoza argued that the emigration would drain “the small remnant of good blood” from the “sick body” of England. He was also concerned for Spain’s interests in the New World. The English could not be allowed a foothold in the Americas. It mattered not a jot that they were Catholic, “they would immediately have their throats cut as happened to the French.” Mendoza conveyed this threat to the would-be settlers via their priests with the further warning that “they were imperilling their consciences by engaging in an enterprise prejudicial to His Holiness” the Pope.
So Sir William Catesby did not sail the seas or have a role in the plantation of what — had it succeeded — would have been the first English colony in North America. He remained in England and continued to strive for a peaceful solution. “Suffer us not to be the only outcasts and refuse of the world,” he and his friends begged Elizabeth I in 1585, just before an act was passed making it a capital offense to be, or even to harbor, a seminary priest in England. Three years later, as the Spanish Armada beat menacingly towards England’s shore, Sir William and other prominent Catholics were clapped up as suspected fifth columnists. In 1593 those Catholics who refused to go to church were forbidden by law from traveling beyond five miles of their homes without a license. And so it went on until William’s death in 1598.
Seven years later, in the reign of the next monarch James I (James VI of Scotland), William’s son Robert became what we would today call a terrorist. Frustrated, angry and “beside himself with mindless fanaticism,” he contrived to blow up the king and the House of Lords at the state opening of Parliament on 5 November 1605. “The nature of the disease,” he told his recruits, “required so sharp a remedy.” The plot was discovered and anti-popery became ever more entrenched in English culture. Only in 2013 was the constitution weeded of a clause that insisted that royal heirs who married Catholics were excluded from the line of succession.
Every 5 November, we English and Scottish set off our fireworks and let our children foam with marshmallow, and we enjoy “bonfire night” as a bit of harmless fun, without really thinking about why the plotters sought their “sharp remedy” or, indeed, about the tragedy of the father’s failed American Dream, a dream for religious freedom that was twisted out of all recognition by the son.
Featured image: North East America, by Abraham Ortelius 1570. Public Domain via Wikimedia Commons.
A set of related satirical poems, probably written in the early thirteenth century, described an imaginary church council of English priests reacting to the news that they must henceforth be celibate. In this fictional universe the council erupted in outrage as priest after priest stood to denounce the new papal policy. Not surprisingly, the protests of many focused on sex, with one speaker, for instance, indignantly protesting that virile English clerics should be able to sleep with women, not livestock. However, other protests were focused on family. Some speakers appealed to the desire for children, and others noted their attachment to their consorts, such as one who exclaimed: “This is a useless measure, frivolous and vain; he who does not love his companion is not sane!” The poems were created for comical effect, but a little over a century earlier English priests had in fact faced, for the first time, a nationwide, systematic attempt to enforce clerical celibacy. Undoubtedly a major part of the ensuing uproar was about sex, but in reality as in fiction it was also about family.
Rules demanding celibacy first appeared at church councils in the late Roman period but were only sporadically enforced in Western Europe through the early Middle Ages and never had more than a limited impact in what would become the Eastern Orthodox Church. In Anglo-Saxon England moralists sometimes preached against clerical marriage and both king and church occasionally issued prohibitions against it, but to little apparent effect. Indeed, one scribe erased a ban on clerical marriage from a manuscript and wrote instead, “it is right that a cleric (or priest) love a decent woman and bed her.” In the eleventh century, however, a reinvigorated papacy began a sustained drive to enforce clerical celibacy throughout Catholic Europe for clerics of the ranks of priest, deacon, or subdeacon. This effort provoked great controversy, but papal policy prevailed, and over the next couple of centuries increasingly made clerical celibacy the norm.
In England, it was Anselm, the second archbishop of Canterbury appointed after the Norman Conquest, who made the first attempt to systematically impose clerical celibacy in 1102. Anselm’s efforts created a huge challenge to the status quo, for many, perhaps most, English priests were married in 1102 and the priesthood was often a hereditary profession. Indeed, Anselm and Pope Paschal II agreed not to attempt in the short term to enforce one part of the program of celibacy, the disbarment of sons of priests from the priesthood, because that would have decimated the ranks of the English clergy. Anselm, moreover, found himself trying to figure out how to allow priests to take care of their former wives, and priests who obediently separated from their wives were apparently sometimes threatened by their angry in-laws. Not surprisingly, Anselm’s efforts were deeply unpopular and faced widespread opposition.
Priests then and in subsequent generations (for Anselm’s efforts had only limited success in the short run) were often deeply attached to their families. A miracle story recorded after Thomas Becket’s death in 1170 describes a grieving priest getting confirmation from the recent martyr that his concubine, who had done good works before her death, had gone to heaven. Other miracle stories show priests and their companions lamenting the illness, misfortune, or death of a child and seeking miraculous aid. It took a long time to fully convince everyone that priestly families were ipso facto immoral. Even late in the twelfth century, the monastic writer John of Ford, in a saint’s life of the hermit Wulfric of Haselbury, could depict the family of a parish priest, Brictric, as perfectly pious, with Brictric’s wife making ecclesiastical vestments and his son and eventual successor as priest, Osbern, serving at mass as a minor cleric. John also depicted a former concubine of another priest as a saintly woman noted for her piety. Proponents of clerical celibacy had a difficult challenge not only in enforcing the rules but in convincing people that they ought to be enforced in the first place.
Inevitably, priests’ families suffered heavily from the drive for celibacy. The sons of priests lost the chance to routinely follow in their father’s professional footsteps, as most medieval men did. After priestly marriage was legally eliminated, sons and daughters both were automatically illegitimate, bringing severe legal disadvantages. However, it was the female companions of priests who suffered most. Partly this was because one of the key motives behind clerical celibacy was the belief that sexual contact with women polluted priests who then physically touched God by touching the sacrament as they performed the Eucharist. Moralists constantly preached that this was irreligious, even blasphemous, and disgusting. However, the female partners of priests also suffered because preachers constantly denigrated them as whores and used misogynistic stereotypes to try to convince priests that they should avoid taking partners. Thus preachers repeatedly attacked priests for wasting money on adorning their “whores” or for arising from having sex with their “whores” to go perform the Eucharist. It is hard to know the precise position of priests’ wives in the eleventh century but it is quite likely that most were perfectly respectable. Nonetheless, the attacks of reformers had a powerful impact. In 1137 King Stephen decided to do his part to encourage clerical celibacy, and raise money in the process, by rounding up clerical concubines and holding them in the Tower of London for ransom. Some of these were probably partners of canons of St Paul’s cathedral, who were rich and powerful men, but even so, while in the tower they were subject to physical mockery and abuse. Increasingly, it was impossible to be both the partner of a priest and a respectable member of society.
Many of the proponents of clerical celibacy were fiercely idealistic in their efforts to prevent what they saw as widespread pollution of the Eucharist, to remove the costs of families from the financial burdens of churches, to make the priesthood more distinctive from the laity, and simply to enforce church law. As the historian Christopher Brooke suggested nearly six decades ago, however, and as subsequent research has clearly demonstrated, one result of their efforts was a social revolution that resulted in broken homes and personal tragedies.
Headline image: 12th Century painters, from the Web Gallery of Art. Public domain via Wikimedia Commons.
In a week’s time, the residents of Scotland (not the Scottish people: Scots resident south of the border are ineligible to vote) will decide whether or not to destroy the UK as currently constituted. The polls are on a knife edge; and Alex Salmond, the leader of the separatists, has a track record as a strong finisher. If he gets his way, the UK will lose 8% of its citizens and a third of its land mass; and Scotland, cut off, at least initially, from every international body (the UN Security Council, NATO, the EU) and every UK institution (the Bank of England, the pound sterling, the BBC, the security services), will face a bleak and uncertain future.
In the first century BC, the Roman republic was collapsing as a result of its systemic inability to curb the ambitions of powerful politicians. Everyone could see that the end was nigh; no one could predict what would follow. The conditions were ideal for the development of political oratory, and Cicero emerged as Rome’s greatest orator, determined to save his country even at the cost of his own life. During his consulship, he suppressed the conspiracy of Catiline, denouncing that man and his deluded supporters in his four Catilinarian Speeches. He pulled no punches: unlike the supporters of the Union today, he did not hold back for fear of appearing too “negative”. So he informed the senate:
“A plot has been formed to ensure that, following a universal massacre, there should not be a single person left even to mourn the name of the Roman people or to lament the destruction of so great an empire.”
For Catiline’s supporters, he had nothing but contempt, telling the people:
“Reclining at their banquets, embracing their whores, heavy with wine, stuffed with food, wreathed with flowers, drenched with perfume, and worn out by promiscuous sex, they belch out their plans for the massacre of decent citizens and the burning of Rome.”
Cicero went straight for the jugular. Two decades later he denounced a more powerful adversary, Mark Antony, who was attempting with much greater forces to seize control of the state. Cicero attacked him in a series of speeches, the Philippics; but Antony did a deal with Octavian, got what he wanted, and had Cicero killed. Cicero’s words at the end of the Second Philippic were prophetic:
“I defended this country when I was a young man: I shall not desert it now that I am old. I faced down the swords of Catiline: I shall not flinch before yours. Yes, and I would willingly offer my body, if the freedom of this country could at once be secured by my death. Two things alone I long for: first, that when I die I may leave the Roman people free; and second, that each person’s fate may reflect the way he has behaved towards his country.”
Where is Cicero today when we need him? The debate on the future of Scotland, and hence of the UK, has been conducted in newspapers, in TV interviews and debates, and in social media. Anonymous internet trolls hurl abuse at celebrities who dare to express their affection for Britain. The Westminster Parliament stays silent. One MP, however, is free of the party whips, and has been touring Scotland delivering passionate, hard-hitting and unapologetically negative speeches in defence of the Union. This is George Galloway, and the speech he gave in Edinburgh on 24 June can be read and listened to here.
Like Cicero, Galloway pulls no punches. He compares the current crisis with 1940, the last time the UK faced an existential threat:
“And not one person asked in that summer and autumn of 1940 and into 1941 if the pilots who were spinning above us defending us from invasion from the barbaric horde were from Suffolk or Sutherland. We were people together on a small piece of rock with 300 years of common history.”
Referring to his political differences with the other supporters of the Union, he says, “We have come together but temporarily at a moment of national peril”, declaring:
“There will be havoc if you vote Yes in September. Havoc in Edinburgh and throughout the land and you will break the hearts of many others too.”
This preference for extreme, unambiguous statements, delivered with the greatest possible emotional force, and this recognition of the significance of the historical moment, is pure Cicero. But what is most Ciceronian in Galloway’s speech is the moral dimension. Galloway is not concerned with whether the new Scottish state would have to concentrate its spending on benefits or foreign embassies. Instead, he harks back repeatedly to the Second World War, that conflict of good against evil, contrasting it with Bannockburn, “a battle 700 years ago between two French-speaking kings with Scottish people on both sides”. And, as Cicero would, he judges an issue by the moral character of the people concerned: on the one side, Brian Souter, “the gay-baiting billionaire” and major donor of the SNP, and on the other, the children’s author J. K. Rowling, “one of our highest achieving women in the history of our entire country”, whose moderate and reasoned support for the Union has earned her hate mail from fanatical separatists. Morality runs like a thread all the way through Galloway’s speech.
“How come so few women are in favour of independence? Why are Scotland’s women the most resistant of all the demographics in this contest? The reason is that women simply don’t like gambling. And everything in their project is about gambling — for your future, your pension, your children and their children’s future.”
“Let it be inscribed on the forehead of every citizen what he thinks about his country”, Cicero told the senate. Next week, the future of the UK will be decided by a secret ballot. If Britain survives in a political and not merely in a geographical sense, part of the credit will be due to the Ciceronian eloquence of Mr Galloway.
I want an independent Scotland that is true to the ideals of egalitarianism articulated in some of the best poetry of Robert Burns. I want a pluralist, cosmopolitan Scotland accountable to its own parliament and allied to the European Union. My vote goes to Borgen, not to Braveheart. I want change.
Britain belongs to a past that is sometimes magnificent, but is a relic of empire. Scotland played its sometimes bloody part in that, but now should get out, and have the courage of its own distinctive convictions. It is ready to face up to being a small nation, and to get over its nostalgia for being part of some supposed ‘world power’. No better, no worse than many other nations, it is regaining its self-respect.
Yet the grip of the past is strong. Almost absurdly emblematic of the complicated state of 2014 Scottish politics is Bannockburn: seven hundred years ago Bannockburn, near Stirling in central Scotland, was the site of the greatest medieval Scottish victory against an English army. Today Bannockburn is part of a local government zone controlled by a Labour-Conservative political alliance eager to defeat any aspirations for Scottish independence. In the summer of 2014 Bannockburn was the site of a civilian celebration of that 1314 Scottish victory, and of a large-scale contemporary British military rally. The way the Labour and Conservative parties in Scotland are allied, sometimes uneasily, in the ‘Better Together’ or ‘No’ campaign to preserve the British Union makes Scotland a very different political arena from England where Labour is the opposition party fighting a Conservative Westminster government. England has no parliament of its own. As a result, the so-called ‘British’ Parliament, awash with its Lords, with its cabinet of privately educated millionaires, and with all its braying of privilege, spends much of its time on matters that relate to England, not Britain. This is a manifest abuse of power. The Scottish Parliament at Holyrood looks – and is – very different.
Like many contemporary Scottish writers and artists, I am nourished by traditions, yet I like the idea of change and dislike the status quo, especially the political status quo. National identity is dynamic, not fixed. Democracy is about vigorous debate, about rocking the boat. Operating in an atmosphere of productive uncertainty is often good for artistic work. Writers enjoy rocking the boat, and can see that as a way of achieving a more egalitarian society. That’s why most writers and artists who have spoken out are on the ‘Yes’ side. If there is a Yes vote in the Scottish independence referendum on 18 September 2014, it will be a clear vote for change. If there is a ‘No’ vote, it will be because of a strong innate conservatism in Scottish society – a sense of wanting to play it safe and not rock the boat. Whether Scotland’s Labour voters remain conservative in their allegiances and vote ‘No’, or can be swayed to vote ‘Yes’ because they see the possibility of a more egalitarian future — is a key question.
As we get nearer and nearer to the date of the Scottish independence referendum on 18 September, I expect there will be an audible closing of ranks on the part of the British establishment. Already in July we have had interventions from the First Sea Lord (who gave a Better Togetherish speech at the naming ceremony for an aircraft carrier), and a lot of money from major landowners and bankers has been swelling the coffers of those opposed to independence. In Glasgow it was good to read at an event with Liz Lochhead, Kathleen Jamie, Alasdair Gray, and other poets and novelists in support of independence. This is a very exciting time for Scotland, a time when relationships with all kinds of institutions are coming under intense scrutiny. Whatever happens, the country is likely to emerge stronger, and with an intensified sense of itself as a democratic place.
The Union of 1707 – which by uniting the English and Scottish parliaments created the new state of the United Kingdom of Great Britain – was enthusiastically sought by some Scots and grudgingly accepted by many more, even if most people would have been happier with a federal union. What until recently most historians had missed was the identification with the Union of Scottish politicians and their supporters who had suffered under the later Stuart regime. In some cases they’d been forced into exile in the Low Countries. They were backers of the Revolution (of 1688-90) in Scotland, which they saw as truly glorious. They advocated union as a means of securing the gains of the Revolution (constitutional monarchy, the re-establishment of Presbyterianism and certain civil liberties) and keeping the Jacobites’ hands off the imperial crown. This was a union based on Whig principles – religious, civic and economic. It was effected, as far as Scotland was concerned, through the persistence of a number of driven individuals some of whom had advocated closer union with England in 1688-9, and were still around in 1706-7 to vote for this in the Scottish Parliament.
I take issue with the centuries-old shibboleth that in 1707 the Scots had been, in the words of Robert Burns, ‘bought and sold for English gold’, by a ‘parcel’ of roguish politicians. The Union of 1707 was not the betrayal of the Scottish nation its critics had long asserted, a measure to be overturned if Scotland was to be set back on its rightful constitutional trajectory – not as a stateless nation within the British union state but as an independent nation state.
Yet support for the Scottish Nationalists in Scotland has grown strongly since the 1970s, along with disenchantment with the British state and Westminster. Scots’ identification with Britain has fallen sharply, with most Scots now feeling more Scottish than British.
It’s pretty clear that the Union is more vulnerable today than at any previous time since the Jacobite risings of 1715-16 and 1745-6. The props upon which it was built either no longer apply – its core purpose was to ensure that Queen Anne was succeeded by a Protestant (thereby excluding the Catholic claimant, James Edward Stuart, later the ‘Old Pretender’) – or are less important. Presbyterianism, the security of which was enshrined (in theory at least) in the first of the two acts that comprised the Union agreement, has ceased to matter for most Scots. Scotland’s economy is no longer under-developed – unhindered access to the English market and to England’s Atlantic and Caribbean colonies were attractions even for Scots who were otherwise opposed to incorporation.
In short, there is a case for saying that the Union is past its ‘sell by date’. Those who are keen to maintain the United Kingdom need to come up with a vision for a Union for the 21st century – or at the very least a rationale – of the kind that inspired Scots to push for such an arrangement in 1707. Many more rallied to defend it – sometimes by risking life and limb – against the Jacobite incursions of 1715 and 1745. Until recently the main pro-Union campaign, Better Together, has been criticized for emphasizing the negative aspects of Scottish independence – ‘project fear’ – rather than the positive virtues of the Union.
Yet support for Yes Scotland – the separatists’ campaign – is (at the time of writing) apparently no higher than around 40% of the electorate, suggesting that when the referendum vote happens, on 18 September this year, a majority of Scots will vote No. Comparison with other nations in Europe that have recently struggled for and achieved independence may tell us something – not least that Scotland’s experience of union with a bigger neighbor has been somewhat less oppressive. Like being in bed not with an elephant as some allege, but a teddy bear. And that currently, notwithstanding its failings, more Scots than the nationalists hoped for still feel comfortable within the Union. It’s a habit that’s lasted for more than three centuries. As things stand, not enough people have found compelling reasons to give it up.
This is the centenary year of the enactment of the third Home Rule Bill, as well (of course) as the year of the Scottish referendum on independence. Yet the centenary conversation in Ireland and the somewhat more vigorous debate upon Scots independence, have been conducted — for the most part — quite separately.
While it would be wrong to push the analogies too far, there are some striking similarities – and some differences – between the debate on Home Rule in 1912-14, and the current debate upon Scottish independence. These similarities (and indeed distinctions) might well give food for thought to the protagonists within the Scottish ‘Yes’ and ‘Better Together’ camps — and indeed there is evidence that both Gordon Brown and Alex Salmond have ruminated accordingly.
One critical difference between Ireland in 1914 and Scotland in 2014 is that of militancy — Ireland on the eve of the First World War being an armed camp comprising the Ulster and Irish Volunteer movements, opponents and proponents of Home Rule, as well as the British Army. The Scottish political debate has not been militarised, and there is no evidence that it will become so (the Scottish National Liberation Army, for example, has never posed a significant threat). Modern Scottish nationalism has developed as a wholly constitutional and pacific phenomenon.
Of course mainstream Scottish nationalism has only recently, through successive Holyrood elections, emerged as a majority phenomenon. But it has never had to encounter the challenge (faced by Irish nationalism a century ago) of returning a majority of elected representatives, while being lengthily resisted in London.
One aspect of the Irish experience in 1914 was that a fraught constitutional debate, heightened political expectations, and the delaying or disappointment of those expectations (with Unionist resistance and the onset of War), combined to make a highly volatile political chemistry. The hardening expectations of change across Scotland in 2014 mean that national (as well as social and economic) aspirations may need to be quickly and sensitively addressed, whatever the result of the referendum.
One critical dimension of this militancy in 1914 was the trenchant support given to Ulster Unionist paramilitarism by the British Conservative leadership — this in part a symptom of the profound divisions in British and Irish politics and society precipitated by the debate over Home Rule. It is striking that both the Home Rule issue in 1914 and the referendum in 2014 have each attracted an unusually broad range of declarations of allegiance from a complex array of interest groups and individuals. In 1914 there was a high level of ‘celebrity’ endorsement and intervention over Home Rule: taking literary figures alone, Sir Arthur Conan Doyle came out as a Home Ruler, while Rudyard Kipling was a strong Unionist. In 2014 Irvine Welsh has declared in favour of independence, while J.K. Rowling is against. Ian Rankin provides a case-study in the complexity (and profundity) of division: he is an agnostic on the issue, but is clear that his characters would have strong opinions. So, Inspector Rebus joins the unionists of 2014 (though the actor Ken Stott, most recent of the TV Rebuses, is reportedly in the ‘yes’ camp).
The analogies between Home Rule and the debate on Scottish independence extend much further than the ‘A’ list, however. The substantial strength and challenge of Home Rule sentiment produced striking intellectual movement before and in 1914 — just as the strength of the movement for Scots independence has produced similar movement a century later.
In 1912-14 the constitutional impasse over Home Rule in fact helped to stimulate support for (what was then called) ‘federalism’ among some of the Unionist elite, including even Edward Carson. In terms of the (nearly) equally weighted forces fighting over Scottish independence, Gordon Brown has now moved to embrace the idea of a federal United Kingdom; and he has been joined or preceded by others, including (for example) the Scottish Conservative journalist, David Torrance. Discussion of a possible English parliament was broached prominently in 1911-1914 and again in 2014. Both in 1914 and in 2014 it appears that the constitutional shape of the ever-malleable United Kingdom is once again in transition – but now because unionists are shifting no less than nationalists.
And indeed some Scots Nationalists have moved towards embracing at least some of the symbols of the British connection. John Redmond, the Home Rule leader, emphasised monarchy and empire in his vision of Irish autonomy during the Home Rule era, partly through personal conviction, and partly in terms of subverting unionist arguments. In similar vein, Alex Salmond (despite a strong tradition of republican sentiment within the SNP), has embraced the ‘union of the crowns’ as SNP strategy, and has in recent years referred deferentially to the Queen (‘of Scots’), and her central place in an independent nation.
Here, as elsewhere, Ireland’s century-old debate on Home Rule speaks to the current condition of Scotland. Indeed here, as elsewhere, Ireland’s wider experience of Union chimes with that of the Scots.
With Scotland voting on independence on 18 September 2014, the UK coalition government sought advice on the relevant law from two leading international lawyers, James Crawford and Alan Boyle. Their subsequent report has a central argument. An independent Scotland would be separatist, breaking away from the remainder of the UK. Therefore, the latter (known as restUK or rUK) would be the continuator state – enjoying all the rights and duties of the existing UK – while Scotland would be a new state having none of rUK’s rights and especially no membership of any international organizations it enjoys now as part of the UK. The bargaining power of rUK as to what it might concede of the UK’s rights would be complete, e.g. with respect to a common currency. This legal opinion has created a confrontational atmosphere around the referendum vote and caused anxiety among Scottish voters about to ‘jump into the unknown’.
It is essential to unpack the distracting complexity of the expert international law professionalism of this advice. Firstly, Crawford and Boyle gloss over the actual legal circumstances of the contract of union between Scotland and England, in particular that the Union was a bargain among powers equal in the eyes of international law at that time. More specifically, the England which, with Wales, concluded the Treaty of Union is exactly the same entity standing opposite to Scotland now as then (leaving aside the North of Ireland which has the option under the Belfast Agreement of leaving the UK by referendum).
There is no international standard, in the event of a dissolution of a union, which can provide any objective criterion to determine that Scotland is the breakaway entity. In international law, recognition of new states is largely a matter of the political discretion of existing states. It depends on an international consensus, or lack of it, where political preference may or may not trump any possibly objective standard of political legitimacy, e.g. self-determination by democratic consent. The vast amount of state practice which Crawford and Boyle’s legal opinion displays is misleading insofar as there is, in fact, no definitive legal marker of guidance. This is shown by the fact that England is treated as the continuator state simply because it is larger than Scotland. Legally, there has to be a continuator state. But since this obviously cannot be Scotland, it must be England. Even Scotland assumes this to be the case.
It is necessary to focus upon an international legal history of the individual states, rather than the more general international law offered by Crawford and Boyle. The Anglo-Scottish Union displays a phenomenon that Linda Colley has referred to as the composite state. This is where two or more sovereign nations agree to merge their highest governmental level institution (parliament) into a single state made up of several nations – a state-nation – but other lesser local institutions might remain. In the Europe of the 15th to the 17th century this was a common phenomenon, the most celebrated being in Scandinavia, involving Sweden, Denmark and Norway in a variety of partnerships from the Kalmar Union (1397) onwards. The logic of these partnerships was that they were always open to renegotiation. Now, this is precisely what the English generously recognize in the Edinburgh Agreement. The logic of the composite state does not cover the many cases in which a core nation forms itself into a state and then jealously guards its territorial integrity against dissident minorities, which are then regarded as separatist and destructive of national unity. It is possible that an aura of this type of scenario runs through the legal opinion of Crawford and Boyle, although they have to accept the consensual context of the advice they are being asked to give.
The real issues facing Scotland have to be confronted on a basis of equality and mutual consent in accordance with the international law established as apposite for this case. These issues are a matter of history, not merely that of the 17th-18th century, but also the evolution of the 1707 Treaty of Union (implemented through separate Acts of Union passed in the Scottish and English Parliaments) to the very recent past – especially the Thatcher years and the neo-liberal revolution in English-dominated UK politics. It has to be recognized that there are profound differences of social philosophy now between Scotland and England around the issue of neo-liberalism and the defense of community. These provide good reasons to revisit that 1707 bargain. This revisiting should be on the basis of complete equality. The sharing of common institutions of the United Kingdom, such as the currency, would have to be negotiated after reaching an agreement in which neither side – as so-called continuator state – would have a higher standing.
Scottish women are said to hold the key to independence, as they predominate in the ‘no’ camp. Poll data have repeatedly put men at roughly 50:50 for and against, while women who were sure of their intentions ran about 60% against.
This has been represented as an alarming gender divide, but a look at the history of women fighting for the vote in Scotland shows they have long been resolute in their positions, more concerned with what politics could do in real life than the grandstanding of political ideas, and much more internationalist than their sisters south of the border.
The Scottish route to women’s suffrage started in 1867 with the Edinburgh National Society for Women’s Suffrage; similar societies were established in Manchester, London, and Dublin. Later these suffragists were joined by the suffragettes, who attracted considerable publicity for arson, vandalism, and hunger-striking in the cause, to the disdain of the constitutional campaigners who thought this sort of behaviour counter-productive. This major division in tactics has served to obscure the fundamental similarity of the two campaigns: both were directed towards the same objective, for women to have the vote on the same basis as men, which was then a property-owning franchise. Both also steered away from engagement in other social activities. The vote was all-important: a millennialist objective which, once achieved, would inaugurate an era of social justice and peace. Other social activity was at best a distraction and could wait till after the advent of the franchise. For this reason English suffragists such as Millicent Fawcett were not involved in important campaigns like those against the Contagious Diseases Acts and for temperance, whatever their personal views may have been.
Scottish women took another path, with a much more inclusive vision of the purpose of political activism. For them the vote was one of a number of issues on which to campaign, and temperance was another. Using the vehicle of the Scottish Christian Union, Scottish women allied with the American Women’s Christian Temperance Union, the most powerful women’s suffrage organisation in the world.
The temperance cause was part of a set of progressive measures as disparate as anti-slavery, ‘social purity’ (sexual control), universal education, and promoting enhanced domestic skills to the poor. All had women as prime movers or playing a prominent part – the so-called ‘feminine public sphere’. Scottish women embraced this ‘woman’s mission’ with a vengeance, for example eagerly seizing on the municipal vote which was granted to Scottish women in 1881, in order to favour candidates who wanted strict alcohol licensing. Other areas of activity included such practical institutions as the Glasgow Samaritan Hospital for ‘diseases of women’ and rescue homes for ‘female inebriates.’ It has been said that alcohol more than slavery or suffrage or any other single cause politicised American women. Megan Smitley in The Feminine Public Sphere (MUP, 2009) has convincingly argued that the same can be said for Scottish women.
In the United States the Women’s Christian Temperance Union saw through enfranchisements state by state, and sent out missionaries to New Zealand (which became the first nation to enfranchise women in 1893) and to Australia (which started enfranchising with South Australia in 1894). Isabel Napier, who was National Superintendent of the Suffrage Department of the Scottish Christian Union, grew up in New Zealand and retained strong links. “When Suffrage became law in New Zealand all their influence was thrown on the side of Temperance Reform,” she said, “and so you have the advanced laws that now obtain.” WCTU speakers toured Scotland from the Shetlands to the Borders, hosted by the Scottish Christian Union.
In contrast, English women considered the US temperance campaign vulgar and did not welcome WCTU speakers; they feared the ‘Americanisation’ of their field. Nor did English and Welsh temperance organisations officially support women’s suffrage (though individual members doubtless did).
The importance of this tradition of social activism for the independence debate has been that Scottish women were not moved by the same arguments as men. The ‘Braveheart tendency’ of independence at all costs as a patriotic ideal, regardless of the consequences, has had limited feminine appeal. As Lesley Riddoch wrote in The Scotsman: “Toughing out controversy and appearing to spoil for a fight may earn respect from male commentators and small armies of cyber-angry, anonymous men. Clever dick answers, snide-sounding put downs and swaggering arrogance turn off watching women as swiftly as they appear to engage watching men.” That was the level at which most of the independence campaign was fought, however, leading to a frantic late catch-up as more ‘woman friendly’ policies were rolled out.
The issues that women took most interest in were: How would either side deal with child poverty, low pay, and poor housing? What could be done about the European-wide disgrace of poor health and low life expectancy in parts of Scotland? Finally (and in a manner that would be instantly recognisable to nineteenth century prohibitionists) how to deal with the appalling levels of alcohol abuse in Scotland which are so damaging to personal health and family life?
Such practical matters of national renewal were often drowned out by masculine bluster.
Hadrian’s Wall has been in the news again recently for all the wrong reasons. Occasional wits have pondered on its significance in the Scottish Referendum, neglecting the fact that it has never marked the Anglo-Scottish border, and was certainly not constructed to keep the Scots out. Others have mistakenly insinuated that it is closed for business, following the widely reported demise of the Hadrian’s Wall Trust. And then of course there is the Game of Thrones angle: best-selling writer George R R Martin has spoken of the Wall as an inspiration for the great wall of ice that features in his books.
Media coverage of both Hadrian’s Wall Trust’s demise and Game of Thrones’ rise has sometimes played upon and propagated the notion that Hadrian’s Wall was manned by shivering Italian legionaries guarding the fringes of civilisation – irrespective of the fact that the empire actually entrusted the security of the frontier to its non-citizen soldiers, the auxilia, rather than to its legionaries. The tendency to overemphasise the Italian aspect reflects confusion about what the Roman Empire and its British frontier were about. But Martin, who made no claims to be speaking as a historian when he spoke of how he took the idea of legionaries from Italy, North Africa, and Greece guarding the Wall as a source of inspiration, did at least get one thing right about the Romano-British frontier.
There were indeed Africans on the Wall during the Roman period. In fact, at times there were probably more North Africans than Italians and Greeks. While all these groups were outnumbered by north-west Europeans, who tend to get discussed more often, the North African community was substantial, and its stories warrant telling.
Perhaps the most remarkable tale to survive is an episode in the Historia Augusta (Life of Severus 22) concerning the inspection of the Wall by the emperor Septimius Severus. The emperor, who was himself born in Libya, was confronted by a black soldier, part of the Wall garrison and a noted practical joker. According to the account, the notoriously superstitious emperor saw in the soldier’s black skin, and in his brandishing of a wreath of cypress branches, an omen of death. And his mood was not further improved when the soldier shouted the macabre double entendre iam deus esto victor (now victor/conqueror, become a god) – for, properly speaking, a Roman emperor should first die before being divinized. The late Nigerian classicist Lloyd Thompson made a powerful point about this intriguing passage in his seminal work Romans and Blacks: ‘the whole anecdote attributes to this man a disposition to make fun of the superstitious beliefs about black strangers’. In fact we might go further, and note just how much cultural knowledge and confidence this frontier soldier needed to play the joke – he needed to be aware of Roman funerary practices, superstitions, and indeed the practice of emperor worship itself.
Why is this illuminating episode not better known? Perhaps it is because there is something deeply uncomfortable about what could be termed Britain’s first ‘racist joke’, or perhaps the problem lies with the source itself, the notoriously unreliable Historia Augusta. And yet as a properly forensic reading of this part of the text by Professor Tony Birley has shown, the detail included around the encounter is utterly credible, and we can identify places alluded to in it at the western end of the Wall. So it is quite reasonable to believe that this encounter took place.
Not only this, but according to the restoration of the text preferred by Birley and myself, there is a reference to a third African in this passage. The restoration post Maurum apud vallum missum in Britannia indicates that this episode took place after Severus had granted discharge to a soldier of the Mauri (the term from which ‘Moors’ derives). And as Birley has noted, we know that there was a unit of Moors stationed at Burgh-by-Sands on the Solway at this time.
Sadly, Burgh is one of the least explored forts on Hadrian’s Wall, but some sense of what may one day await an extensive campaign of excavation there comes from Transylvania in Romania, where investigations at the home of another Moorish regiment of the Roman army have revealed a temple dedicated to the gods of their homelands. Perhaps, too, evidence of different North African legacies would emerge. The late Vivian Swann, a leading expert in the pottery of the Wall, has presented an attractive case that the appearance of new forms of ceramics indicates the introduction of North African cuisine in northern Britain in the second and third centuries AD.
What is clear is that the Mauri of Burgh-by-Sands were not the only North Africans on the Wall. We have an African legionary’s tombstone from Birdoswald, and from the East Coast the glorious funerary stela set up to commemorate Victor, a freedman (former slave), by his former master, a trooper in a Spanish cavalry regiment. Victor’s monument now stands on display in Arbeia Museum at South Shields next to the fine, and rather better known, memorial to the Catuvellaunian Regina, freedwoman and wife of Barates from Palmyra in Syria. Together these individuals, and the many other ethnic groups commemorated on the Wall, remind us of just how cosmopolitan the people of Roman frontier society were, and of how a society that stretched from the Solway and the Tyne to the Euphrates was held together.
September 2014 marks the tenth anniversary of the publication of the Oxford Dictionary of National Biography. Over the next month a series of blog posts will consider aspects of the ODNB’s online evolution in the decade since 2004. Here the literary historian David Hill Radcliffe considers how the ODNB online is shaping new research in the humanities.
The publication of the Oxford Dictionary of National Biography in September 2004 was a milestone in the history of scholarship, not least for crossing from print to digital publication. Prior to this moment a small army of biographers, myself among them, had worked almost entirely from paper sources, including the stately volumes of the first, Victorian ‘DNB’ and its 20th-century print supplement volumes. But the Oxford DNB of 2004 was conceived from the outset as a database and published online as web pages, not paper pages reproduced in facsimile. In doing away with the page image as a means of structuring digital information, the online ODNB made an important step which scholarly monographs and articles might do well to emulate.
Database design has seen dramatic changes since 2004—shifting from the relational model of columns and rows, to semi-structured data used with XML technologies, to the unstructured forms used for linking data across repositories. The implications of these developments for the future of the ODNB remain to be seen, but there is every reason to believe that its content will be increasingly accessed in ways other than the format of the traditional biographical essay. Essays are not going away, of course. But they will be supplemented by the arrays of tables, charts, maps, and graphs made possible by linked data. Indeed, the ODNB has been moving in this direction since 2004 with the addition of thousands of curated links between individuals (recorded in biographical essays) and the social hierarchies and networks to which they belonged (presented in thematic list and group entries)—and then on to content by or about a person held in archives, museums or galleries worldwide.
Online the ODNB offers scholars the opportunity to select, group, and parse information not just at the level of the article, but also in more detailed ways—and this is where computational matters get interesting. I currently use the ODNB online as a resource for a digital prosopography attached to a collection of documents called ‘Lord Byron and his Times’, tracking relationships among more than 12,000 Byron-contemporaries mentioned in nineteenth-century letters and memoirs; of these people a remarkable 5000 have entries in the ODNB. The traditional object of prosopography was to collect small amounts of information about large numbers of persons, using patterns to draw inferences about slenderly documented lives. But when computation is involved, a prosopography can be used with linked data to parse large amounts of information about large numbers of persons. As a result, one can attend to particularities, treating individuals as members of a group or social network without reducing them to the uniformity of a class identity. Digital prosopography thus returns us to something like the nineteenth-century liberalism that inspired Sir Leslie Stephen’s original DNB (1885-1900).
The key to finding patterns in large collections of lives and documents, the evolution of technology suggests, is to atomize the data. As a writer of biographies I would select from documentary sources, collecting the facts of a life, and translating them into the form of an ODNB essay. Creating a record in a prosopography involves a similar kind of abstraction: working from (say) an ODNB entry, I abstract facts from the prose, encoding names and titles and dates in a semi-structured XML template that can then be used to query my archive, comprising data from previous ODNB abstractions and other sources. For instance: ‘find relationships among persons who corresponded with Byron (or Harrow School classmates, or persons born in Nottinghamshire, etc.) mentioned in the Quarterly Review.’ An XML prosopography is but a step towards recasting the information as flexible, concise, and extensible semantic data.
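The abstraction step described here can be sketched in miniature. The record structure and element names below are hypothetical (the actual schema used by ‘Lord Byron and his Times’ is not reproduced here); the point is only how a semi-structured XML record supports queries like ‘Harrow School classmates’ or ‘persons born in a given place’:

```python
import xml.etree.ElementTree as ET

# Hypothetical prosopography records: element names and values are
# illustrative only, not the real 'Lord Byron and his Times' data.
RECORDS = """
<prosopography>
  <person id="byron">
    <name>George Gordon Byron</name>
    <birthplace>London</birthplace>
    <school>Harrow School</school>
  </person>
  <person id="hobhouse">
    <name>John Cam Hobhouse</name>
    <birthplace>Redland</birthplace>
    <school>Westminster School</school>
  </person>
  <person id="clarke">
    <name>Hewson Clarke</name>
    <birthplace>Gateshead</birthplace>
    <school/>
  </person>
</prosopography>
"""

def persons_by(field, value):
    """Return the names of persons whose given field matches value."""
    root = ET.fromstring(RECORDS)
    return [p.findtext("name")
            for p in root.findall("person")
            if p.findtext(field) == value]

print(persons_by("school", "Harrow School"))  # ['George Gordon Byron']
```

The same pattern scales from three toy records to thousands of abstracted entries: the essay's prose is reduced to queryable fields, while the full text remains available elsewhere.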
While human readers can easily distinguish the character-string ‘Oxford’ as referring to the place, the university, or the press, this is a challenge for computation—like distinguishing ‘Byron’ the poet from ‘Byron’ the admiral. One can attack this problem by using algorithms to compare adjacent strings, or one can encode strings by hand to disambiguate them, or use a combination of both. Digital ODNB essays are good candidates for semantic analysis since their structure is predictable and they are dense with significant names of persons, places, events, and relationships that can be used for data-linking. One translates character-strings into semantic references, groups the references into relationships, and expresses the relationships in machine-readable form.
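The hand-encoding half of that combination can be sketched as an authority table that pairs an ambiguous surface string with a contextual cue. The identifiers below are invented for illustration; a real project would use stable URIs from an authority file:

```python
# Hypothetical authority table: (surface string, contextual cue) -> identifier.
# All identifiers here are made up for the sake of the example.
AUTHORITY = {
    ("Byron", "poet"):    "person/byron-george-1788",
    ("Byron", "admiral"): "person/byron-john-1723",
    ("Oxford", "place"):  "place/oxford",
    ("Oxford", "press"):  "org/oup",
}

def disambiguate(surface, cue):
    """Resolve an ambiguous string; flag it for review if unknown."""
    return AUTHORITY.get((surface, cue), f"unresolved/{surface}")

print(disambiguate("Byron", "admiral"))  # person/byron-john-1723
```

Strings the table cannot resolve are marked `unresolved/…` rather than guessed at, so an editor can review them by hand — mirroring the mixed algorithmic-and-manual workflow described above.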
A popular model for parsing semantic data is the ‘triple’: a statement in the form subject / property / object, which describes a relationship between the subject and the object: the tree / is in / the quad. The model is powerful because it can describe anything, and its statements can be yoked together to create new statements. For example: ‘Lord Byron wrote Childe Harold’ and ‘John Murray published Childe Harold’ are both triples. Once the three components are translated into semantically disambiguated machine-readable URIs (Uniform Resource Identifiers), and given a rule linking publishers to the authors of the works they publish, computation can infer that ‘John Murray published Lord Byron.’
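That inference can be sketched with a toy triple store. The rule chaining ‘wrote’ and ‘published’ through a shared work is spelled out explicitly, since the two triples alone do not license the conclusion; the statements are the ones from the example above:

```python
# Toy triple store: (subject, property, object) statements.
triples = {
    ("Lord Byron", "wrote", "Childe Harold"),
    ("John Murray", "published", "Childe Harold"),
}

def infer_published_authors(store):
    """Rule: if X published a work that Y wrote, infer (X, published, Y)."""
    inferred = set()
    for (publisher, prop1, work) in store:
        if prop1 != "published":
            continue
        for (author, prop2, work2) in store:
            if prop2 == "wrote" and work2 == work:
                inferred.add((publisher, "published", author))
    return inferred

print(infer_published_authors(triples))
# {('John Murray', 'published', 'Lord Byron')}
```

A production system would use URIs rather than name strings as the three components, and a rule language rather than hand-written loops, but the principle is the same: new statements fall out of joining existing ones on a shared term.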
Now imagine the contents of the ODNB expressed not as 60,000 biographical essays but as several billion such statements. In fact, this is far from unthinkable, given the nature of the material and progress being made in information technology. The result is a wonderful back-to-the-future moment with Leslie Stephen’s Victorian DNB wedded to Charles Babbage’s calculating machine: the simplicity of the triple and the power of finding relations embedded within them. Will the fantasies of positivist historians finally be realized? Not likely; while computation is good at questions of ‘who’, ‘what’, ‘where’, and ‘when’, it is not so good at ‘why’ and ‘how’. Biographers and historians are unlikely to find themselves out of a job anytime soon. On the contrary, once works like the ODNB are rendered machine-readable and cross-query-able, scholars will find more work on their hands than they know what to do with.
So the publication of the ODNB online in September 2004 will be fondly remembered as a liminal moment when humanities scholarship crossed from paper to digital. The labour of centuries of research was carried across that important threshold, recast in a medium enabling new kinds of investigation the likes of which—ten years on—we are only beginning to contemplate.
Most of what we hear and read about twelfth-century hottie Rosamund Clifford, aka “Fair Rosamund,” just wasn’t so. True, she was Henry II’s mistress. But that’s about it. Like so many other medieval myths, Rosamund’s legendary life and death are a later invention. Herewith, the best of (untrue) Rosamund:
Myth 1: She went to school at, lived at, had assignations with the king at, retired to, died at, or in any way hung out at Godstow Abbey.
Sadly, Rosamund never entered Godstow until she was a fair corpse. She died around the year 1176, in the midst of her affair with the king, and was buried at Godstow, probably because her mother was already buried there. Contrary to what you will read in various places, there is no evidence that the king paid for her tomb. Her tomb was placed in front of the high altar, and the king did show particular favor to the monastery because of it. Fifteen years later, Bishop Hugh of Lincoln made the nuns move the tomb out of the church because it was inappropriate for a “whore” to be buried there.
Myth 2: She and Henry went drinking at the Trout. Or the Perch.
I read this about the pubs near Godstow in a student handbook when I was doing my postgraduate work at Oxford, and I wanted to believe it. So did visiting relatives. Alas, not true. See Myth 1 above: no hanging out at Godstow. But my visitors and I did enjoy some pleasant pints at both the aforementioned hostelries.
Myth 3: She lived in a maze at Woodstock.
Of course this is a later embellishment, related to the next two myths. But a fairly elaborate pleasure garden does seem to have been incorporated into the royal residence at Woodstock in this period, adjacent to a room that just a generation later was known as “Rosamund’s Chamber.” So the maze story may have evolved from a real trysting place in a complex garden.
Myth 4: The queen found her in the maze by means of a silken thread.
See previous myth. But there is, just barely, a silken thread in Rosamund’s true story. After her burial at Godstow, King Henry wanted a special relationship with her burial place, so the nunnery’s patron deeded his patronal rights in Godstow to the king. In the ceremony he used a silk cloth that was later described as “a silken thread.”
Myth 5: She was murdered by Queen Eleanor of Aquitaine.
The earliest version of this story, from the fourteenth century, has Eleanor stabbing Rosamund; in Renaissance versions the queen makes Rosamund choose between stabbing and poison. Interestingly, even the Victorians made a sympathetic victim of poor Rosamund (the fornicating mistress) and turned Eleanor (the wronged wife) into a murderous monster. Needless to say, there’s no truth to the murder stories, which arose long after Rosamund died.
Myth 6: She was the mother of Henry II’s illegitimate son Geoffrey Plantagenet, archbishop of York, and/or his illegitimate son William Longespee, earl of Salisbury.
Rosamund was too young to be Geoffrey’s mother; his mother was apparently a woman named Ikeni. William Longespee was the son of Ida de Tosny.
Myth 7: Latin bell inscriptions all over England make reference to her.
These inscriptions read, “I who am struck am called Maria [or Katherine], the rose of the world.” Rosamund was a rare, possibly unique, name for a woman in twelfth-century England, but the phrases rosa munda (pure rose) and rosa mundi (rose of the world) were epithets for the Virgin Mary. It’s likely that Rosamund Clifford was named (creatively and, as it turned out, ironically) in honor of the Virgin, and that the bell inscriptions came from the same general cultural source.
Myth 8: Roses were spread over her tomb.
No, just a silken pall and candles, as far as we know. It’s possible, however, that the Gallica rose ‘Rosa Mundi’ was named for her, as her legend grew in the later Middle Ages. Perhaps the rose, like the bells, was named for the Virgin Mary, but the name of the rose is one bit of Rosamund lore that seems plausible.