By December 1914 the Great War had been raging for nearly five months. If anyone had really believed that it would be ‘all over by Christmas’ then it was clear that they had been cruelly mistaken. Soldiers in the trenches had gained a grudging respect for their opposite numbers. After all, they had managed to fight each other to a standstill.
On Christmas Eve there was a severe frost. From the perspective of the freezing-cold trenches the idea of the season of peace and goodwill seemed surrealistic. Yet parcels and Christmas gifts began to arrive in the trenches and there was a strange atmosphere in the air. Private William Quinton was watching:
We could see what looked like very small coloured lights. What was this? Was it some prearranged signal and the forerunner of an attack? We were very suspicious, when something even stranger happened. The Germans were actually singing! Not very loud, but there was no mistaking it. Suddenly, across the snow-clad No Man’s Land, a strong clear voice rang out, singing the opening lines of “Annie Laurie“. It was sung in perfect English and we were spellbound. To us it seemed that the war had suddenly stopped! Stopped to listen to this song from one of the enemy.
On Christmas Day itself, in some sectors of the line, there was no doubting the underlying friendly intent. Yet the men who took the initiative in starting a truce were brave – or foolish – as was witnessed by Sergeant Frederick Brown:
Sergeant Collins stood waist high above the trench waving a box of Woodbines above his head. German soldiers beckoned him over, and Collins got out and walked halfway towards them, in turn beckoning someone to come and take the gift. However, they called out, “Prisoner!” A shot rang out, and he staggered back, shot through the chest. I can still hear his cries, “Oh my God, they have shot me!”
This was not a unique incident. Yet, despite the obvious risks, men were still tempted. Individuals would climb out of the trench, then dive back in, gradually becoming bolder, as Private George Ashurst recalled:
It was grand, you could stretch your legs and run about on the hard surface. We tied an empty sandbag up with its string and kicked it about on top – just to keep warm of course. We did not intermingle. Part way through we were all playing football. It was so pleasant to get out of that trench from between them two walls of clay and walk and run about – it was heaven.
The idea that football matches were played between the British and Germans in No Man’s Land has taken a grip, but the evidence is intangible.
The truce was not planned or controlled – it just happened. Even senior officers recognised that there was little that could be done in this strange state of affairs. Brigadier General Lord Edward Gleichen accepted the truce as a fait accompli, but was keen to ensure that the Germans did not get too close to the ramshackle British trenches:
They came out of their trenches and walked across unarmed, with boxes of cigars and seasonable remarks. What were our men to do? Shoot? You could not shoot unarmed men. Let them come? You could not let them come into your trenches; so the only thing feasible was done – and our men met them half-way and began talking to them. Meanwhile our officers got excellent close views of the German trenches.
Another practical reason for embracing the truce was the opportunity it presented for burying the dead that littered No Man’s Land. Private Henry Williamson was assigned to a burial party:
The Germans started burying their dead which had frozen hard. Little crosses of ration box wood nailed together and marked in indelible pencil. They were putting in German, ‘For Fatherland and Freedom!’ I said to a German, “Excuse me, but how can you be fighting for freedom? You started the war, and we are fighting for freedom!” He said, “Excuse me English comrade, but we are fighting for freedom for our country!”
It should be noted that the truce was by no means universal, particularly where the British were facing Prussian units.
For the vast majority of the participants, the truce was a matter of convenience and maudlin sentiment. It did not mark some deep flowering of the human spirit, or signify political anti-war emotions taking root amongst the ranks. The truce simply enabled them to celebrate Christmas in a freer, more jovial, and, above all, safer environment, while satisfying their rampant curiosity about their enemies.
The truce could not last: it was a break from reality, not the dawn of a peaceful world. The gradual end mirrored the start, for any misunderstandings could cost lives amongst the unwary. For Captain Charles Stockwell it was handled with a consummate courtesy:
At 8.30am I fired three shots in the air and put up a flag with ‘Merry Christmas!’ on it, and I climbed on the parapet. He put up a sheet with, ‘Thank you’ on it, and the German captain appeared on the parapet. We both bowed and saluted and got down into our respective trenches – he fired two shots in the air and the war was on again!
In other sectors, the artillery behind the lines opened up and the bursting shells soon shattered the truce.
War regained its grip on the whole of the British sector. When it came to it, the troops went back to war willingly enough. Many would indeed have rejoiced at the end of the war, but they were still willing to accept orders, still willing to kill Germans. Nothing had changed.
The light in the Orkneys is so clear, so bright, so lucid, it feels as though you are on top of the world looking through thin clouds into heaven.
It doesn’t even feel part of the UK: when you sail off the edge of Scotland on the Scrabster to Stromness ferry, you feel you are leaving the real world to land in a magical realm.
Nowhere else on earth can you see eight thousand years of continuous history in so tiny a space.
Skara Brae is what remains of a neolithic village, older than Stonehenge and the pyramids, kept secret underground until uncovered by a severe storm in 1850. You can walk in and sit down, look around at the stone walls, stone beds, stone cupboards, dressers, seats, and storage boxes. Recognizably human people lived here, seeing this same landscape and coast, feeling the same wind on their faces that you do, their eyes resting on the doors, hearths and toilets (one in each dwelling).
This is the ‘stone age’, but talk of such ages is a misnomer in the Orkneys, which had no appreciable bronze age or iron age and so proceeded from the non-use of one metal to the non-use of another, in what is now the best-preserved neolithic site in Europe.
The Orkneys have been so fascinating for so long that even the vandalism needs to be preserved. In Maeshowe burial mound you can see where Viking tourists, visiting a monument already ancient by their time, wrote graffiti about their girlfriends on the walls – in Norse runes.
The Orkney islands were the headquarters of the Viking invasion fleets, and to this day the Orkneys are the only place in the world besides Norway where the Norwegian national day is celebrated.
The islands are filled with Tolkienesque place names like the Ring of Brodgar, the Brough of Birsay, the Standing Stones of Stenness. Sagas were born here, like that of the peaceable 12th-century Earl of Orkney, treacherously assassinated and now known as St Magnus, after whom the cathedral is named.
Sagas were created here in living memory. This is where the British home fleet was at anchor and the German fleet still lies. The battle fleet of the German Imperial Navy transferred in its entirety to Scapa Flow in 1919 to await a decision on its future. The German sailors could not bring themselves to give up their ships; they opened the seacocks and scuttled them all. At low tide you can still see the rusting hulks of Wilhelmine ambitions to dominate Europe.
If the Orkneys sound bleak and rocky, that would be the wrong impression to leave. They have rich and fertile farming land with green plains rolling on under a pearl sky. People tell folk tales around the peat fires, drinking ginger-flavoured whisky; an orange cat pads around the grain heaps in the Highland Park distillery, and the islands shimmer under the ‘simmer dim’ of nightless summer days. I should be there now.
Autumn is here again – in England, the season of mists and mellow fruitfulness, in the US also the season of Thanksgiving. On the fourth Thursday in November, schoolchildren across the country will stage pageants, some playing black-suited Puritans, others Native Americans bedecked with feathers. By tradition, Barack Obama will ‘pardon’ a turkey, but 46 million others will be eaten in a feast complete with corncobs and pumpkin pie. The holiday has a long history: Lincoln fixed the date (amended by Roosevelt in 1941), and Washington made it a national event. Its origins, of course, lay in the Pilgrim Fathers’ first harvest of 1621.
Who now remembers who these intrepid migrants were – not the early ‘founding fathers’ they became, but who they were when they left? The pageant pilgrims are undifferentiated. Who knows the name of Christopher Martin, a merchant from Billericay near Chelmsford in Essex? He took his whole family on the Mayflower, most of whom, including Martin himself, perished in New Plymouth’s first winter. They died Essex folk in a strange land: there was nothing ‘American’ about them. And as for Thanksgiving, well that habit came from the harvest festivals and religious observances of Protestant England. Even pumpkin pie was an English dish, exported then forgotten on the eastern side of the Atlantic.
Towns like Billericay, Chelmsford and Colchester were crucial to American colonization: ordinary places that produced extraordinary people. The trickle of migrants in the 1620s became a flood in the next decade, leading to some remarkable transformations. In 1630 Francis Wainwright was drawing ale and clearing pots in a Chelmsford inn when his master, Alexander Knight, decided to emigrate to Massachusetts. It was an age of austerity, of bad harvests and depression in the cloth industry. Moreover, those who wanted the Protestant Reformation to go further – Puritans – feared that under Charles I it was slipping backwards. Many thought they would try their luck elsewhere until England’s fortunes were restored, perhaps even that by building a ‘new’ England they could help with this restoration. Wainwright, aged about fourteen, went with Knight, and so entered a world of hardship and danger and wonder.
One May dawn, seven years later, Wainwright was standing by the Mystic River in Connecticut, one of seventy troops waiting to shoot at approaching Pequot warriors. According to an observer, the Englishmen ‘being bereaved of pity, fell upon the work without compassion’, and by dusk 400 Indians lay dead in their ruined encampment. The innkeeper’s apprentice had fired until his ammunition was exhausted, then used his musket as a club. One participant celebrated the victory, remarking that English guns had been so fearsome, it was ‘as though the finger of God had touched both match and flint’. Another rejoiced that providence had made a ‘fiery oven’ of the Pequots’ fort. Wainwright took two native heads home as souvenirs. Unlike many migrants, he stayed in America, proud to be a New Englander, English by birth but made different by experience. He lived a long life in commerce, through many fears and alarms, and died at Salem in 1692 during the white heat of the witch-trials.
The story poses hard historical questions. What is identity, and how does it change? Thanksgiving pageants turn Englishmen into Americans as if by magic; but the reality was more gradual and nuanced. Recently much scholarly energy has been poured into understanding past emotions. We may think our emotions are private, but they leak out all the time; we may even use them to get what we want. Converted into word and deed, emotions leave traces in the historical record. When the Pilgrim William Bradford called the Pequot massacre ‘a sweet sacrifice’, he was not exactly happy but certainly pleased that God’s will had been done.
Puritans are not usually associated with emotion, but they were deeply sensitive to human and divine behaviour, especially in the colonies. Settlers were proud to be God’s chosen people – like Israelites in the wilderness – yet pride brought shame, followed by doubt that God liked them at all. Introspection led to wretchedness, which was cured by the Holy Spirit, and they were back to their old censorious selves. In England, even fellow Puritans thought they’d lost the plot, as did most (non-Puritan) New Englanders. But godly colonists established what historians call an ‘emotional regime’ or ‘emotional community’ in which their tears and thunder were not only acceptable but carried great political authority.
John Winthrop, the leader of the fleet that carried Francis Wainwright to New England, was an intensely emotional man who loved his wife and children almost as much as he loved God. Gaunt, ascetic and tirelessly judgmental, he became Massachusetts Bay Colony’s first governor, driven by dreams of building a ‘city upon a hill’. It didn’t quite work out: Boston grew too quickly, and became diverse and worldly. And not everyone cared for Winthrop’s definition of liberty: freedom to obey him and his personal interpretation of God’s designs. But presidents from Reagan to Obama have been drawn to ‘the city upon the hill’ as an emotionally potent metaphor for the US in its mission to inspire, assist, and police the world.
Winthrop’s feelings, however, came from and were directed at England. His friend Thomas Hooker, ‘the father of Connecticut’, cut his teeth as a clergyman in Chelmsford when Francis Wainwright lived there. Partly thanks to Wainwright, one assumes, he found the town full of drunks, with ‘more profaneness than devotion’. But Hooker ‘quickly cleared streets of this disorder’. The ‘city upon the hill’, then, was not a blueprint for America, but an exemplar to help England reform itself. Indeed, long before the idea was associated with Massachusetts, it related to English towns – notably Colchester – that aspired to be righteous commonwealths in a country many felt was going to the dogs. Revellers did not disappear from Chelmsford and Colchester – try visiting on a Saturday night – but, as preachers and merchants and warriors, their people did sow the seeds from which grew the most powerful nation in the world.
So if you’re celebrating Thanksgiving this year, or you know someone who is, it’s worth remembering that the first colonists to give thanks were not just generic Old World exiles, uniformly dull until America made them special, but living, breathing emotional individuals with hearts and minds rooted in English towns and shires. To them, the New World was not an upgrade on England: it was a space in which to return their beloved country to its former glories.
Featured image credit: Signing of the Constitution, by Thomas P. Rossiter. Public domain via Wikimedia Commons
One of the ironies of the Scottish independence referendum is that Scotland is widely recognised to be a changed place despite the majority voting in favour of the union. It became clear during the course of 2014 that something significant was happening. Scotland witnessed levels of public engagement and debate never before seen. Hugh MacDiarmid’s ‘Glasgow 1960’ comes to mind. Returning to Glasgow ‘after long exile’, MacDiarmid’s narrator encounters packed trams heading for Ibrox, the home of Rangers football club, but discovers that the crowds are going to listen to a debate between ‘Professor MacFadyen and a Spainish pairty’ and that newspapers with headlines ‘Special! Turkish Poet’s Abstruse New Song’ were selling ‘like hot cakes’.
The Scottish Question may not have been debated on quite so elevated a level but debates were conducted the length and breadth of Scotland in a remarkably civil, engaging, and open manner. Those who sought to portray these debates as something sinister could do no better than refer to a professional politician who had an egg thrown at him while he addressed meetings on top of an Irn Bru crate. The dull, limited, predictable, binary debate of the conventional press contrasted with the expansive, lively, and engaging discussions that took place in often novel venues in every nook and cranny of Scotland. The Scottish Question, as debated by the public, was not restricted to a narrow constitutional question but became a genuine dialogue about what kind of place Scotland should seek to become. The referendum started a process that has not been halted by the outcome of a referendum on whether Scotland should become an independent country, the formal question that provoked this all-embracing national conversation.
The result of the referendum, and the reaction to it, have been in stark contrast to the referendum on devolution 35 years ago. In 1979, Scots had narrowly voted for a very limited form of devolution – 51.6% in favour on a turnout of 63.7% – but the measure on offer was not implemented as it failed to achieve the weighted majority demanded by Parliament at Westminster. The expectation in the run-up to that referendum had been that a decisive majority would vote for devolution. The slight numeric majority hid a defeat in expectations. Expectations were very different in the months leading up to 18 September this year. Early in 2014, opponents of independence thought that they might push support for independence below 30%, and only a few weeks before Scots went to vote they were still convinced that it would win less than 40%. In the event, 55.3% voted for the union on a record turnout of 84.6%, but it is the 45% that has been celebrated as a victory. It is the membership of the Yes parties that has increased dramatically, with the membership of the Scottish National Party now dwarfing that of the other Scottish parties. With just under 100,000 members, the SNP can claim to be the only mass party in the UK today. Politics is an expectations game, and supporters of independence knew that they had a ‘mountain to climb’, in the words of the chair of the official Yes campaign.
As opinion polls narrowed towards the end of the campaign, a ‘Vow’ was signed by the three main UK party leaders promising substantially more devolution while protecting Scotland’s share of public spending. This means that even the debate around the narrowed constitutionalist understanding of the Scottish Question will continue. More powers will be delivered with ramifications for the rest of the United Kingdom. Scotland is a changed place but an answer to the Scottish Question remains as elusive as ever.
Headline image credit: Glencoe, Scotland panorama by Gil Cavalcanti. CC BY-SA 3.0 via Wikimedia Commons.
Two hundred years ago last Friday the owner of the London Times, John Walter II, is said to have surprised a room full of printers who were preparing hand presses for the production of that day’s paper. He showed them an already completed copy of the paper and announced, “The Times is already printed – by steam.” The paper had been printed the night before on a steam-driven press, and without their labor. Walter anticipated and tried to mediate the shock and unrest with which this news was met by the now-idled printers. It was one of many scenes of change and conflict in early industrialization where the hand was replaced by the machine. Similar scenes of hand labor versus steam entered into cultural memory from Romantic poetry about framebreaking Luddites to John Henry’s hand-hammering race against the steam drill.
There were many reasons to celebrate the advent of the steam press in 1814, as well as reasons to worry about it. Steam printing brought the cost of printing down, increased the number of possible impressions per day by four times, and, in a way, we might say that it helped “democratize” access to information. That day, the Times proclaimed that the introduction of steam was the “greatest improvement” to printing since its very invention. Further down that page, which itself was “taken off last night by a mechanical apparatus,” we read why the hand press printers might have been concerned: “after the letters are placed by the compositors… little more remains for man to do, than to attend upon, and watch this unconscious agent in its operations.”
Moments of technological change do indeed put people out of work. My father, who worked at the Buffalo News for nearly his entire career, often told me about layoffs, or fears of layoffs, coming with the development of new computerized presses, print processes, and dwindling markets for print. But the narrative of the hand versus the machine, or of the movement from the hand to the machine, obscures a truth about labor, especially information labor. Forms of human labor are replaced (and often quantifiably reduced), but they are also rearranged, creating new forms of work and social relations around them. We would do well to avoid the assumption that no one worked the steam press once hand presses went mostly idle. As information production and circulation become more technologically abstracted from the hands of workers, there is an increased tendency to assume that no labor is behind them.
Two hundred years after the morning when the promise of faster, cheaper, and more accessible print created uncertainty among the workers who produced it, I am writing to you using an Apple computer made by workers in Shenzhen, China, with metals mined all over the global South. The software I am using to accomplish this task was likely written and maintained by programmers in India managed by consultants in the United States. You are likely reading this on a similar device. Information has been transmitted between us via networks of wires, servers, cable lines, and wireless routers, all with their own histories of people who labor. If you clicked over here from Facebook, a worker in a cubicle in Manila may have glanced over this link among thousands of others while trying to filter out content that violates the social network’s terms of service. Technical laborers, paid highly or almost nothing at all, and working under a range of conditions, are silently mediating this moment of exchange between us. Though they may no longer be hand-pressed, the surfaces on which we read and write are never too distant from the hands of information workers.
Like research in book history and print culture studies, the common appearance of a worker’s hand in Google Books reminds us that, despite radical changes in technology over centuries, texts are material objects and are negotiated by numerous people for diverse purposes, only some of which we would call “reading” proper. The hand pulling the lever of a hand press and the hand turning pages in a scanner may be representative of two poles on a two-century timeline, but, for me, they suggest many more continuities between early print and contemporary digital cultures than ruptures. John Walter II’s proclamation on 28 November 1814 was not a turn away from a Romantic past of artisanal labor toward a bleak and mechanized future. Rather, it was an early moment in an ongoing struggle to create and circulate words and images to ever more people while also sustaining the lives of those who produce them. Instead of assuming, two hundred years on, that we have been on a trajectory away from the hand, we must continue looking for and asking about the conditions of the hand in the machine.
Headline image credit: Hand of Google, by Unknown CC BY-SA 3.0 via Wikimedia Commons.
Like much historical research, my chapter in the Britain’s Soldiers collection came about more or less by accident. It relates to an incident that I discovered in the War Office papers in 2007. I was taking a group of History students from Northampton University to The National Archives in Kew, to help them with their undergraduate dissertations. I had a small amount of time to do some of my own research, so ordered WO 43/404. The title sounded promising: ‘A book containing copies of correspondence in 1761, relative to the Behaviour of the Duke of Richmond’s regiment & Militia at Stamford on the 15th April 1761’.
What arrived was a letter book. It immediately struck me as being unusual, as the War Office usually just kept in-letters, whereas this book copied both sides of a lengthy correspondence between the Secretary at War and various other protagonists. Something noteworthy had clearly occurred, and was therefore preserved, possibly as a precedent to inform future action. I was pushed for time, however, so I quickly photographed the whole book and returned to my students.
Four years later I finally had an opportunity to transcribe the letters. What emerged was a bizarre event. In brief, the Lincolnshire Militia was quartered in Stamford and a regiment of the regular army that was on the march approached the town. The regulars had such contempt for the militia that they marched straight in, disregarding the usual military convention that they send advance word, and proceeded to refuse any of the other courtesies that the militia attempted to offer them. The militia’s commander took such umbrage at these slights that he posted sentries at entrances to the town, ‘to prevent any armed Troops entering the Town for the future without my knowledge and consent’. When further regulars attempted to enter the town, the militia stopped them at the point of their bayonets and a fight ensued in which men were injured, and could have been killed.
This was more than a local scrap. Neither side would admit fault and so wrote to the War Office to intercede. Despite the fact that Britain was in the midst of the Seven Years War – the largest global conflict that Britain had then fought – the Secretary at War took the incident very seriously indeed, and the letter book records how the fallout preoccupied him for a further two months. The dispute even drew in the King himself, who as Commander in Chief was keen to preserve ‘that Equality and Harmony in Service which is so much to be wished and cultivated’.
I was intrigued by the story that emerged from this letter book, and a paper chase ensued as I attempted to flesh out the story with other sources. My attempts to find references to the affair in public sources such as newspapers drew a blank, but I had more luck in private family papers and local government records. This was not an incident that was publicly known about at the time, which perhaps explains why historians had overlooked it.
As a cultural historian, what drew me to this incident was that it was an example of people behaving oddly. It is often only when people are behaving abnormally that we get an insight into the normal expectations of behaviour that go unspoken – in this case, attitudes towards military precedence and masculine honour. I think that incidents like the one at Stamford in 1761 highlight the crucial significance of ‘polite’ and controlled conduct in the eighteenth-century military. To our eyes, the interpersonal conduct of Georgian army officers may seem terribly mannered – but this is clearly desirable when you have large numbers of men who are highly defensive of their status and armed to the teeth. Rather than just reconstructing micro-incidents like this for their own sake, therefore, it is helpful to think about them more anthropologically in order to shed light on the workings of a past society.
Headline image credit: Battle of Bunker Hill by E. Percy Moran. Public domain via Wikimedia Commons.
Autumn 2014 marked the tenth anniversary of the publication of the Oxford Dictionary of National Biography. In a series of blog posts, academics, researchers, and editors looked at aspects of the ODNB’s online evolution in the decade since 2004. In this final post of the series, Alex May—the ODNB’s editor for the very recent past—considers the Dictionary as a record of contemporary history.
When it was first published in September 2004, the Oxford DNB included biographies of people who had died (all in the ODNB are deceased) on or before 31 December 2001. In the subsequent ten years we have continued to extend the Dictionary’s coverage into the twenty-first century—with regular updates recording those who have died since 2001. Of the 4300 people whose biographies have been added to the online ODNB in this decade, 2172 died between 1 January 2001 and 31 December 2010 (our current terminus)—i.e., about 220 per year of death. While this may sound a lot, the average number of deaths per year over the same period in the UK was just short of 500,000, indicating a roughly one in 2300 chance of entering the ODNB. This does not yet approach the levels of inclusion for people who died in the late nineteenth century, let alone earlier periods: someone dying in England in the first decade of the seventeenth century, for example, had a nearly three-times greater chance of being included in the ODNB than someone who died in the first decade of the twenty-first century.
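For readers who want to check the arithmetic behind the "one in 2300" figure, here is a quick back-of-the-envelope calculation using only the numbers quoted in this post (the rounding in the text is approximate):

```python
# Figures as given in the post.
additions_2001_2010 = 2172      # biographies added for deaths in 2001-2010
years = 10
uk_deaths_per_year = 500_000    # "just short of 500,000" deaths per year

# About 220 inclusions per year of death.
inclusions_per_year = additions_2001_2010 / years

# Odds of a given death resulting in an ODNB entry: one in ~2300.
odds_denominator = uk_deaths_per_year / inclusions_per_year

print(round(inclusions_per_year))   # ~217, "about 220"
print(round(odds_denominator))      # ~2302, i.e. roughly one in 2300
```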
‘Competition’ for spaces at the modern end of the dictionary is therefore fierce. Some subjects are certainties—prime ministers such as Ted Heath or Jim Callaghan, or Nobel prize-winning scientists such as Francis Crick or Max Perutz. There are perhaps fifty or sixty potential subjects a year about whose inclusion no-one would quibble. But there are as many as 1500 people on our lists each year, and for perhaps five or six hundred of them a very good case could be made.
This is where our advisers come in. Over the last ten years we have relied heavily on the help of some 500 people, experts and leading figures in their fields whether as scholars or practitioners, who have given unstintingly of their time and support. Advisers are enjoined to consider all the aspects of notability, including achievement, influence, fame, and notoriety. Of course, their assessments can often vary, particularly in the creative fields, but even in those it is remarkable how often they coincide.
Our advisers have also in most cases been crucial in identifying the right contributor for each new biography, whether he or she be a practitioner from the same field (we often ask politicians to write on politicians—Ted Heath and Jim Callaghan are examples of this—lawyers on lawyers, doctors on doctors, and so on), or a scholar of the particular subject area. Sadly, a number of our advisers and contributors have themselves entered the dictionary in this decade, among them the judge Tom Bingham, the politician Roy Jenkins, the journalist Tony Howard, and the historian Roy Porter.
Just as the selection of subjects is made with an eye to an imaginary reader fifty or a hundred years hence (will that reader need or want to find out more about that person?), so the entries themselves are written with such a reader in view. ODNB biographies are not always the last word on a subject, but they are rarely the first. Most of the ‘recently deceased’ added to the Dictionary have received one or more newspaper obituaries. ODNB biographies differ from newspaper obituaries in providing more, and more reliable, biographical information, as well as being written after a period of three to four years’ reflection between death and publication of the entry—allowing information to emerge and reputations to settle. In addition, ODNB lives attempt to provide an understanding of context, and a considered assessment (implicit or explicit) of someone’s significance: in short, they aim to narrate and evaluate a person’s life in the context of the history of modern Britain and the broad sweep of a work of historical reference.
The result, over the last ten years, has been an extraordinary collection of biographies offering insights into all corners of twentieth and early twenty-first century British life, from multiple angles. The subjects themselves have ranged from the soprano Elisabeth Schwarzkopf to the godfather of punk, Malcolm McLaren; the high tory Norman St John Stevas to the IRA leader Sean MacStiofáin; the campaigner Ludovic Kennedy to the jester Jeremy Beadle; and the turkey farmer Bernard Matthews to Julia Clements, founder of the National Association of Flower Arranging Societies. By birth date they run from the founder of the Royal Ballet, Dame Ninette de Valois (born in 1898, who died in 2001), to the ‘celebrity’ Jade Goody (born in 1981, who died in 2009). Mention of the latter reminds us of Leslie Stephen’s determination to represent the whole of human life in the pages of his original, Victorian DNB. Poignantly, in light of the 100th anniversary of the outbreak of the First World War, among the oldest subjects included in the dictionary are three of the ‘last veterans’, Harry Patch, Henry Allingham, and Bill Stone, who, as the entry on them makes clear, reacted very differently to the notion of commemoration and their own late fame.
The work of selecting from thousands of possible subjects, coupled with the writing and evaluation of the chosen biographies, builds up a contemporary picture of modern Britain as we record those who’ve shaped the very recent past. As we begin the ODNB’s second decade this work continues: in January 2015 we’ll publish biographies of 230 people who died in 2011 and we’re currently editing and planning those covering the years 2012 and 2013, including what will be a major article on the life, work, and legacy of Margaret Thatcher.
Links between biography and contemporary history are further evident online—creating opportunities to search across the ODNB by profession or education, and so reveal personal networks, associations, and encounters that have shaped modern national life. Online it’s also possible to make connections between people active in or shaped by national events. Searching for Dunkirk, or Suez, or the industrial disputes of the 1970s brings up interesting results. Searching for the ‘Festival of Britain’ identifies the biographies of 35 men and women who died between 2001 and 2010: not just the architects who worked on the structures or the sculptors and artists whose work was showcased, but journalists, film-makers, the crystallographer Helen Megaw (whose diagrams of crystal structures adorned tea sets used during the Festival), and the footballer Bobby Robson, who worked on the site as a trainee electrician. Separately, these new entries shed light not only on the individuals concerned but on the times in which they lived. Collectively, they amount to a substantial and varied slice of modern British national life.
Headline image credit: Harry Patch, 2007, by Jim Ross. CC-BY-SA-3.0 via Wikimedia Commons.
Today, 27 October, sees the centenary of the birth of the poet Dylan Marlais Thomas. Born on Cwmdonkin Drive, Swansea, and brought up in the genteel district of Uplands, Thomas had a suburban and orthodox childhood — his father an aspirational but disappointed English teacher at the local grammar school.
Swansea would remain a place for home comforts. But from the mid-1930s, Thomas began a wandering life that took in London’s Fitzrovia — and in particular its pubs, the Fitzroy Tavern and the Wheatsheaf — and then (as a dysfunctionally married man) the New Forest, squalid rooms in wartime London, New Quay on Cardigan Bay, Italy, Laugharne in Carmarthenshire, and from 1950 the United States where he gained a popular student following and where he died in Manhattan, aged thirty-nine.
For all his wanderings, few of Thomas’s poems were written outside Wales. Indeed, half of the published poems for which he is known were written, in some form, while he was living at home in Swansea between 1930 and 1934. As Paul Ferris, his Oxford DNB biographer writes, “commonplace scenes and characters from childhood recur in his writing: the park that adjoins Cwmdonkin Drive; the bay and sands that were visible from the windows; a maternal aunt he visited” — the latter giving rise to one of Thomas’s best-known poems, “Fern Hill.” In literary London, and in numerous bar rooms thereafter, Thomas’s “drinking and clowning were indispensable to him, but they were only half the story; ‘I am as domestic as a slipper’ he once observed, with some truth.”
On 27 October 1914 Dylan Thomas was born in Swansea, South Wales. He is widely regarded as one of the most significant Welsh writers of the 20th century. Thomas’s popular reputation has continued to grow after his death on 9 November 1953, despite some critics describing his work as too ‘florid‘. He wrote prolifically throughout his lifetime but is arguably best known for his poetry. His poem The hand that signed the paper is taken from Jon Stallworthy’s edited collection The Oxford Book of War Poetry, and can be found below:
One hundred years ago today, far from the erupting battlefields of Europe, a small German force in the city of Tsingtau (Qingdao), Germany’s most important possession in China, was preparing for an impending siege. The small fishing village of Qingdao and the surrounding area had been reluctantly leased to the German Empire by the Chinese government for 99 years in 1898, and German colonists soon set about transforming this minor outpost into a vibrant city boasting many of the comforts of home, including the forerunner of the now-famous Tsingtao Brewery. By 1914, Qingdao had over 50,000 residents and was the primary trading port in the region. Given its further role as the base for the Far East Fleet of the Imperial German Navy, however, Qingdao was unable to avoid becoming caught up in the faraway European war.
The forces that besieged Qingdao in the autumn of 1914 were composed of troops from Britain and Japan, the latter entering the war against Germany in accord with the Anglo-Japanese Alliance. The Alliance had been agreed in 1902 amid growing anxiety in Britain regarding its interests in East Asia, and rapidly modernizing Japan was seen as a useful ally in the region. For Japanese leaders, the signing of such an agreement with the most powerful empire of the day was seen as a major diplomatic accomplishment and an acknowledgement of Japan’s arrival as one of the world’s great powers. More immediately, the Alliance effectively guaranteed the neutrality of third parties in Japan’s looming war with Russia, and Japan’s victory in the Russo-Japanese War of 1904-05 sent shockwaves across the globe as the first defeat of a great European empire by a non-Western country in a conventional modern war.
In Britain, Japan’s victory was celebrated as a confirmation of the strength of its Asian ally, and represented the peak of a fascination with Japan in Britain that marked the first decade of the twentieth century. This culminated in the 1910 Japan-British Exhibition in London, which saw over eight million visitors pass through during its six-month tenure. In contrast, before the 1890s, Japan had been portrayed in Britain primarily as a relatively backward yet culturally interesting nation, with artists and intellectuals displaying considerable interest in Japanese art and literature. Japan’s importance as a military force was first recognized during the Sino-Japanese War of 1894-95, and especially from the time of the Russo-Japanese War, Japan’s military prowess was popularly attributed to a supposedly ancient warrior spirit that was embodied in ‘bushido’, or the ‘way of the samurai’.
The ‘bushido’ ideal was popularized around the world especially through the prominent Japanese educator Nitobe Inazo’s (1862-1933) book Bushido: The Soul of Japan, which was originally published in English in 1900 and achieved global bestseller status around the time of the Russo-Japanese War (a Japanese translation first appeared in 1908). The British public took a positive view towards the ‘national spirit’ of its ally, and many saw Japan as a model for curing perceived social ills. Fabian Socialists such as Beatrice Webb (1858-1943) and Oliver Lodge (1851-1940) lauded the supposed collectivism of ‘bushido’, while Alfred Stead (1877-1933) and other promoters of the Efficiency Movement celebrated Japan’s rapid modernization. For his part, H. G. Wells’s 1905 novel A Modern Utopia included a ‘voluntary nobility’ called ‘samurai,’ who guided society from atop a governing structure that he compared to Plato’s ideal republic. At the same time, British writers lamented the supposed decline of European chivalry from an earlier ideal, contrasting it with the Japanese who had seemingly managed to turn their ‘knightly code’ into a national ethic followed by citizens of all social classes.
The ‘bushido boom’ in Britain was not mere Orientalization of a distant society, however, but was strongly influenced by contemporary Japanese discourse on the subject. The term ‘bushido’ only came into widespread use around 1900, and even a decade earlier most Japanese would have been bemused by the notion of a national ethic based on the former samurai class. Rather than being an ancient tradition, the modern ‘way of the samurai’ developed from a search for identity among Japanese intellectuals at the end of the nineteenth century. This process saw an increasing shift away from both Chinese and European thought towards supposedly native ideals, and the former samurai class provided a useful foundation. The construction of an ethic based on the ‘feudal’ samurai was given apparent legitimacy by the popularity of idealized chivalry and knighthood in nineteenth-century Europe, with the notion that English ‘gentlemanship’ was rooted in that nation’s ‘feudal knighthood’ proving especially influential. This early ‘bushido’ discourse profited from the nationalistic fervor following Japan’s victory over China in 1895, and the concept increasingly came to be portrayed as a unique and ancient martial ethic. At the same time, those theories that had drawn inspiration from European models came to be ignored, with one prominent Japanese promoter of ‘bushido’ deriding European chivalry as ‘mere woman-worship’.
In the first years of the twentieth century, the Anglo-Japanese Alliance contributed greatly to the positive reception in Britain of theories positing a Japanese ‘martial race’, and the fate of ‘bushido’ in the UK demonstrated the effect of geopolitics on theories of ‘national characteristics’. By 1914, British attitudes had begun to change amid increasing concern regarding Japan’s growing assertiveness. Even the Anglo-Japanese operation that finally captured Qingdao in November was marked by British distrust of Japanese aims in China, a sentiment that was strengthened by Japan’s excessive demands on China the following year. Following the war, Japan’s reluctance to return the captured territory to China caused British opposition to Japan’s China policy to increase, leading to the end of the Anglo-Japanese Alliance in 1923. The two countries subsequently drifted even further apart, and by the 1930s, ‘bushido’ was popularly described in Britain as an ethic of treachery and cruelty, only regaining its positive status after 1945 through samurai films and other popular culture as Japan and Britain again became firm allies in the Cold War.
Headline image credit: Former German Governor’s Residence in Qingdao, by Brücke-Osteuropa. Public domain via Wikimedia Commons.
The fifth of November is not just an excuse to marvel at sparklers, fireworks, and effigies; it is part of a national tradition that is based on one of the most famous moments in British political history. The Gunpowder Plot itself was actually foiled on the night of Monday 4 November 1605. However, throughout the following day, Londoners were asked to light bonfires in order to celebrate the failure of the assassination attempt on King James I of England. Since then, the fifth of November has been known as ‘Bonfire Night’ or even ‘Guy Fawkes Night’ – named after the most posthumously famous of the thirteen conspirators. Guy Fawkes became the symbol for the conspirators after being caught during the failed treason attempt. For centuries after 1605, boys carrying a cloaked effigy – based on Guy Fawkes’s disguised appearance in the Vaults at the House of Lords – have asked for “a penny for the Guy”.
Below is a timeline that describes the events leading up to the failed Gunpowder Plot and the execution of Guy Fawkes and his fellow conspirators. If you would like to learn more about Bonfire Night, you can explore the characters behind the Gunpowder Plot, the traditions associated with it, or simply learn how to throw the best Guy Fawkes Night party.
Feature image credit: Guy Fawkes, by Crispijn van de Passe der Ältere. Public domain via Wikimedia Commons.
Time passes quickly. As we track the progression of events hundred years ago on the Western Front, the dramas flash by. In the time it takes to answer an e-mail the anniversary of another battle has come and gone.
We have commemorated the fumbling British skirmishes at Mons and Le Cateau in late August, but largely forgotten the French triumph at the Battle of the Marne, which first stemmed and threw back the German wheeling attack through Belgium into Northern France under the Schlieffen Plan. We have already bypassed the spirited Franco-British attempts at the Battle of the Aisne in September to take the Chemin des Dames. The Race to the Sea was under way: the British and German Armies desperately trying to turn their enemy’s northern flank.
Throughout, the performance of the British Expeditionary Force has often been exaggerated. Imaginative accounts of Germans advancing in massed columns and being blown away by rapid rifle fire are common. A rather more realistic assessment is that the British infantry were steadfast enough in defence, but unable to function properly in coordination with their artillery or machine guns. The Germans seemed to have a far better grip of the manifold disciplines of modern warfare.
Yet everything changed in October. The Germans were scraping the barrel for manpower and decided to throw new reserve formations into the battle: young men with the minimum of training, incapable of sophisticated battle tactics. They were marched forward in a last gambler’s throw of the dice to try and break through to the Channel Ports. To do that they needed first to capture the small Belgian city of Ypres.
One might have thought that Ypres was some fabled city, fought over to secure untold wealth or a commanding tactical position. Nothing could be further from the truth. Ypres was just an ordinary town, lying in the centre of the fertile Western Flanders plain. Yet the low ridges to the east represented one of the last feasible lines of defence. The British also saw the town, not as an end in itself, but as a stepping stone to more strategically important locations pushing eastwards, such as the rail centre at Roulers or the ports of Ostend and Zeebrugge. For both sides Ypres was on the road to somewhere.
The battle opened in mid-October and soon began to boil up. Time and time again the Germans hurled themselves forward, the grey-green hordes pressing forwards and being shot down in their hundreds. The British had learnt many lessons and this was where they finally proved themselves worthy adversaries for the German Army. On the evening of 23 October young Captain Harry Dillon was fighting for his life:
A great grey mass of humanity was charging, running for all God would let them, straight on to us not 50 yards off. Everybody’s nerves were pretty well on edge as I had warned them what to expect, and as I fired my rifle the rest all went off almost simultaneously. One saw the great mass of Germans quiver. In reality some fell, some fell over them, and others came on. I have never shot so much in such a short time, could not have been more than a few seconds and they were down. Suddenly one man – I expect an officer – jumped up and came on. I fired and missed, seized the next rifle and dropped him a few yards off. Then the whole lot came on again and it was the most critical moment of my life. Twenty yards more and they would have been over us in thousands, but our fire must have been fearful, and at the very last moment they did the most foolish thing they possibly could have done. Some of the leading people turned to the left for some reason, and they all followed like a great flock of sheep. We did not lose much time, I can give you my oath. My right hand is one huge bruise from banging the bolt up and down. I don’t think one could have missed at the distance and just for one short minute or two we poured the ammunition into them in boxfuls. My rifles were red hot at the finish. The firing died down and out of the darkness a great moan came. People with their arms and legs off trying to crawl away; others who could not move gasping out their last moments with the cold night wind biting into their broken bodies and the lurid red glare of a farm house showing up clumps of grey devils killed by the men on my left further down. A weird awful scene; some of them would raise themselves on one arm or crawl a little distance, silhouetted as black as ink against the red glow of the fire. [p. 287-288, Fire & Movement, by Peter Hart]
Some of the Germans had got within 25 yards of Dillon’s line. It had been a close-run thing, and after Dillon’s men had been relieved later that night the French reported that some 740 German corpses littered the ground in front of his trenches. This was the real war: not a skirmish like the earlier battles, but the real thing.
The German attacks continued, followed, as day follows night, by French and British counter-attacks to restore the situation. The Germans nibbled at the Allied line but were unable to achieve anything of importance. Yet for all the sound and fury, over the next few days the front line stayed relatively static. The German troops were flagging in their efforts. After one last effort on 11 November the Germans threw in the towel. They would not break through the Allied lines in 1914. The British and French lines had held. Battered, bruised, but unbroken. The First Battle of Ypres had confirmed the strategic victory gained by the French at the Marne. The German advance in the west had been blocked; if they sought victory in 1915 they would have to look to the east and attack Russia.
The 1914 campaign would prove decisive to the war. The utter failure of the Schlieffen Plan, designed to secure the rapid defeat of France, meant that Germany would be condemned to ruinous hostilities on two fronts. This was the great turning-point of the whole war. The pre-war predictions from the German strategists that they could not prevail in a long drawn-out war against the combined forces of France and Russia proved accurate, especially when the British Empire and United States joined the fight. The German Army fought with a sustained skill and endurance, but after 1914, the odds really were stacked against them.
Remembrance Day is a memorial day observed in Commonwealth of Nations member states since the end of the First World War to remember those who have died in the line of duty. It is observed by a two-minute silence at the ’11th hour on the 11th day of the 11th month’, in accordance with the armistice signed by representatives of Germany and the Entente on 11 November 1918. The First World War officially ended with the signing of the Treaty of Versailles on 28 June 1919. In the UK, Remembrance Sunday occurs on the Sunday closest to 11 November, and is marked by ceremonies at local war memorials in most villages, towns, and cities. The red poppy has become a symbol for Remembrance Day due to the poem In Flanders Fields, by Lieutenant Colonel John McCrae.
You can discover more about the history behind the First World War by exploring the free resources included in the interactive image above.
Feature image credit: Poppy Field, by Martin LaBar. CC-BY-NC-2.0 via Flickr.
Alan Mathison Turing (1912-1954) was a mathematician and computer scientist, remembered for his revolutionary Automatic Computing Engine, on which the first personal computer was based, and his crucial role in breaking the ENIGMA code during the Second World War. He continues to be regarded as one of the greatest scientists of the 20th century.
We live in an age that Turing both predicted and defined. His life and achievements are starting to be celebrated in popular culture, largely with the help of the newly released film The Imitation Game, starring Benedict Cumberbatch as Turing and Keira Knightley as Joan Clarke. We’re proud to publish some of Turing’s own work in mathematics, computing, and artificial intelligence, as well as numerous explorations of his life and work. Use our interactive Enigma Machine below to learn more about Turing’s extraordinary achievements.
Image credits: (1) Bletchley Park Bombe by Antoine Taveneaux. CC-BY-SA-3.0 via Wikimedia Commons. (2) Alan Turing Aged 16, Unknown Artist. Public domain via Wikimedia Commons. (3) Good question by Garrett Coakley. CC-BY-SA 2.0 via Flickr.
The Union of 1707 – which by uniting the English and Scottish parliaments created the new state of the United Kingdom of Great Britain – was enthusiastically sought by some Scots and grudgingly accepted by many more, even if most people would have been happier with a federal union. What until recently most historians had missed was the identification with the Union of Scottish politicians and their supporters who had suffered under the later Stuart regime. In some cases they’d been forced into exile in the Low Countries. They were backers of the Revolution (of 1688-90) in Scotland, which they saw as truly glorious. They advocated union as a means of securing the gains of the Revolution (constitutional monarchy, the re-establishment of Presbyterianism and certain civil liberties) and keeping the Jacobites’ hands off the imperial crown. This was a union based on Whig principles – religious, civic and economic. It was effected, as far as Scotland was concerned, through the persistence of a number of driven individuals some of whom had advocated closer union with England in 1688-9, and were still around in 1706-7 to vote for this in the Scottish Parliament.
I take issue with the centuries-old shibboleth that in 1707 the Scots had been, in the words of Robert Burns, ‘bought and sold for English gold’, by a ‘parcel’ of roguish politicians. The Union of 1707 was not the betrayal of the Scottish nation its critics had long asserted, a measure to be overturned if Scotland was to be set back on its rightful constitutional trajectory – not as a stateless nation within the British union state but as an independent nation state.
Yet support for the Scottish Nationalists in Scotland has grown strongly since the 1970s, along with disenchantment with the British state and Westminster. Scots’ identification with Britain has fallen sharply, with most Scots now feeling more Scottish than British.
It’s pretty clear that the Union is more vulnerable today than at any previous time since the Jacobite risings of 1715 and 1745-6. The props upon which it was built either no longer apply – its core purpose was to ensure that Queen Anne was succeeded by a Protestant (thereby excluding the Catholic claimant, James Edward Stuart, later the ‘Old Pretender’) – or are less important. Presbyterianism, the security of which was enshrined (in theory at least) in the first of the two acts that comprised the Union agreement, has ceased to matter for most Scots. Scotland’s economy is no longer under-developed – unhindered access to the English market and to England’s Atlantic and Caribbean colonies were attractions even for Scots who were otherwise opposed to incorporation.
In short, there is a case for saying that the Union is past its ‘sell by date’. Those who are keen to maintain the United Kingdom need to come up with a vision for a Union for the 21st century – or at the very least a rationale – of the kind that inspired Scots to push for such an arrangement in 1707. Many more rallied to defend it – sometimes by risking life and limb – against the Jacobite incursions of 1715 and 1745. Until recently the main pro-Union campaign, Better Together, has been criticized for emphasizing the negative aspects of Scottish independence – ‘project fear’ – rather than the positive virtues of the Union.
Yet support for Yes Scotland – the separatists’ campaign – is (at the time of writing) apparently no higher than around 40% of the electorate, suggesting that when the referendum vote happens, on 18 September this year, a majority of Scots will vote No. Comparison with other nations in Europe that have recently struggled for and achieved independence may tell us something – not least that Scotland’s experience of union with a bigger neighbor has been somewhat less oppressive. Like being in bed not with an elephant as some allege, but a teddy bear. And that currently, notwithstanding its failings, more Scots than the nationalists hoped for still feel comfortable within the Union. It’s a habit that’s lasted for more than three centuries. As things stand, not enough people have found compelling reasons to give it up.
This is the centenary year of the enactment of the third Home Rule Bill, as well (of course) as the year of the Scottish referendum on independence. Yet the centenary conversation in Ireland and the somewhat more vigorous debate upon Scots independence have been conducted — for the most part — quite separately.
While it would be wrong to push the analogies too far, there are some striking similarities – and some differences – between the debate on Home Rule in 1912-14, and the current debate upon Scottish independence. These similarities (and indeed distinctions) might well give food for thought to the protagonists within the Scottish ‘Yes’ and ‘Better Together’ camps — and indeed there is evidence that both Gordon Brown and Alex Salmond have ruminated accordingly.
One critical difference between Ireland in 1914 and Scotland in 2014 is that of militancy — Ireland on the eve of the First World War being an armed camp comprising the Ulster and Irish Volunteer movements, opponents and proponents of Home Rule, as well as the British Army. The Scottish political debate has not been militarised, and there is no evidence that it will become so (the Scottish National Liberation Army, for example, has never posed a significant threat). Modern Scottish nationalism has developed as a wholly constitutional and pacific phenomenon.
Of course mainstream Scottish nationalism has only recently, through successive Holyrood elections, emerged as a majority phenomenon. But it has never had to encounter the challenge (faced by Irish nationalism a century ago) of returning a majority of elected representatives, while being lengthily resisted in London.
One aspect of the Irish experience in 1914 was that a fraught constitutional debate, heightened political expectations, and the delaying or disappointment of those expectations (with Unionist resistance and the onset of War), combined to make a highly volatile political chemistry. The hardening expectations of change across Scotland in 2014 mean that national (as well as social and economic) aspirations may need to be quickly and sensitively addressed, whatever the result of the referendum.
One critical dimension of this militancy in 1914 was the trenchant support given to Ulster Unionist paramilitarism by the British Conservative leadership — this in part a symptom of the profound divisions in British and Irish politics and society precipitated by the debate over Home Rule. It is striking that both the Home Rule issue in 1914 and the referendum in 2014 have each attracted an unusually broad range of declarations of allegiance from a complex array of interest groups and individuals. In 1914 there was a high level of ‘celebrity’ endorsement and intervention over Home Rule: taking literary figures alone, Sir Arthur Conan Doyle came out as a Home Ruler, while Rudyard Kipling was a strong Unionist. In 2014 Irvine Welsh has declared in favour of independence, while J.K. Rowling is against. Ian Rankin provides a case-study in the complexity (and profundity) of division: he is an agnostic on the issue, but is clear that his characters would have strong opinions. So, Inspector Rebus joins the unionists of 2014 (though the actor Ken Stott, most recent of the TV Rebuses, is reportedly in the ‘yes’ camp).
The analogies between Home Rule and the debate on Scottish independence extend much further than the ‘A’ list, however. The substantial strength and challenge of Home Rule sentiment produced striking intellectual movement before and in 1914 — just as the strength of the movement for Scots independence has produced similar movement a century later.
In 1912-14 the constitutional impasse over Home Rule in fact helped to stimulate support for (what was then called) ‘federalism’ among some of the Unionist elite, including even Edward Carson. In terms of the (nearly) equally weighted forces fighting over Scottish independence, Gordon Brown has now moved to embrace the idea of a federal United Kingdom; and he has been joined or preceded by others, including (for example) the Scottish Conservative journalist, David Torrance. Discussion of a possible English parliament was broached prominently in 1911-1914 and again in 2014. Both in 1914 and in 2014 it appears that the constitutional shape of the ever-malleable United Kingdom is once again in transition — but because unionists are now shifting no less than nationalists.
And indeed some Scots Nationalists have moved towards embracing at least some of the symbols of the British connection. John Redmond, the Home Rule leader, emphasised monarchy and empire in his vision of Irish autonomy during the Home Rule era, partly through personal conviction, and partly in terms of subverting unionist arguments. In similar vein, Alex Salmond (despite a strong tradition of republican sentiment within the SNP), has embraced the ‘union of the crowns’ as SNP strategy, and has in recent years referred deferentially to the Queen (‘of Scots’), and her central place in an independent nation.
Here, as elsewhere, Ireland’s century-old debate on Home Rule speaks to the current condition of Scotland. Indeed here, as elsewhere, Ireland’s wider experience of Union chimes with that of the Scots.
With Scotland voting on independence on 18 September 2014, the UK coalition government sought advice on the relevant law from two leading international lawyers, James Crawford and Alan Boyle. Their subsequent report has a central argument. An independent Scotland would be separatist, breaking away from the remainder of the UK. Therefore, the latter (known as restUK or rUK) would be the continuator state – enjoying all the rights and duties of the existing UK – while Scotland would be a new state having none of rUK’s rights and especially no membership of any international organizations it enjoys now as part of the UK. The bargaining power of rUK as to what it might concede of the UK’s rights would be complete, e.g. with respect to a common currency. This legal opinion has created a confrontational atmosphere around the referendum vote and caused anxiety among Scottish voters about to ‘jump into the unknown’.
It is essential to unpack the distracting complexity of this expert international-law advice. Firstly, Crawford and Boyle gloss over the actual legal circumstances of the contract of union between Scotland and England, in particular that the Union was a bargain among powers equal in the eyes of international law at that time. More specifically, the England which, with Wales, concluded the Treaty of Union is exactly the same entity standing opposite to Scotland now as then (leaving aside the North of Ireland, which has the option under the Belfast Agreement of leaving the UK by referendum).
There is no international standard, in the event of a dissolution of a union, which can provide any objective criterion to determine that Scotland is the breakaway entity. In international law, recognition of new states is largely a matter of the political discretion of existing states. It depends on an international consensus, or lack of it, where political preference may or may not trump any possibly objective standard of political legitimacy, e.g. self-determination by democratic consent. The vast amount of state practice which Crawford and Boyle’s legal opinion displays is misleading insofar as there is, in fact, no definitive legal marker of guidance. This is shown by the fact that England is taken to be the continuator state simply because it is larger than Scotland. Legally, there has to be a continuator state. But since this obviously cannot be Scotland, it must be England. Even Scotland assumes this to be the case.
It is necessary to focus upon an international legal history of the individual states, rather than the more general international law offered by Crawford and Boyle. The Anglo-Scottish Union displays a phenomenon that Linda Colley has referred to as the composite state. This is where two or more sovereign nations agree to merge their highest governmental level institution (parliament) into a single state made up of several nations – a state-nation – but other lesser local institutions might remain. In the Europe of the 15th to the 17th century this was a common phenomenon, the most celebrated being in Scandinavia, involving Sweden, Denmark and Norway in a variety of partnerships from the Kalmar Union (1397) onwards. The logic of these partnerships was that they were always open to renegotiation. Now, this is precisely what the English generously recognize in the Edinburgh Agreement. The logic of the composite state does not cover the many cases in which a core nation forms itself into a state and then jealously guards its territorial integrity against dissident minorities, which are then regarded as separatist and destructive of national unity. It is possible that an aura of this type of scenario runs through the legal opinion of Crawford and Boyle, although they have to accept the consensual context of the advice they are being asked to give.
The real issues facing Scotland have to be confronted on a basis of equality and mutual consent in accordance with the international law established as apposite for this case. These issues are a matter of history, not merely that of the 17th-18th century, but also the evolution of the 1707 Treaty of Union (implemented through separate Acts of Union passed in the Scottish and English Parliaments) to the very recent past – especially the Thatcher years and the neo-liberal revolution in English-dominated UK politics. It has to be recognized that there are profound differences of social philosophy now between Scotland and England around the issue of neo-liberalism and the defense of community. These provide good reasons to revisit that 1707 bargain. This revisiting should be on the basis of complete equality. The sharing of common institutions of the United Kingdom, such as the currency, would have to be negotiated after reaching an agreement in which neither side – as so-called continuator state – would have a higher standing.
Scottish women are said to hold the key to independence, as they predominate in the ‘no’ camp. Men have been repeatedly estimated from poll data to be around 50:50 for and against, while those women who were sure of their intentions were 60% against.
This has been represented as an alarming gender divide, but a look at the history of women fighting for the vote in Scotland shows they have long been resolute in their positions, more concerned with what politics could do in real life than the grandstanding of political ideas, and much more internationalist than their sisters south of the border.
The Scottish route to women’s suffrage started in 1867 with the Edinburgh National Society for Women’s Suffrage; similar societies were established in Manchester, London, and Dublin. Later these suffragists were joined by the suffragettes, who attracted considerable publicity for arson, vandalism, and hunger-striking in the cause, to the disdain of the constitutional campaigners who thought this sort of behaviour counter-productive. This major division in tactics has served to obscure the fundamental similarity of both campaigns, as both sides were directed towards the same objective: for women to have the vote on the same basis as men, which was then a property-owning franchise. They also both steered away from engagement in other social activities. The vote was all-important: a millennialist objective which, once achieved, would inaugurate an era of social justice and peace. Other social activity was at best a distraction and could wait till after the advent of the franchise. For this reason English suffragists such as Millicent Fawcett were not involved in important campaigns like those against the Contagious Diseases Acts and for temperance, whatever their personal views may have been.
Scottish women took another path, with a much more inclusive vision of the purpose of political activism. For them the vote was one of a number of issues on which to campaign, and temperance was another. Using the vehicle of the Scottish Christian Union, Scottish women allied with the American Women’s Christian Temperance Union, the most powerful women’s suffrage organisation in the world.
The temperance cause was part of a set of progressive measures as disparate as anti-slavery, ‘social purity’ (sexual control), universal education, and promoting enhanced domestic skills to the poor. All had women as prime movers or playing a prominent part – the so-called ‘feminine public sphere’. Scottish women embraced this ‘woman’s mission’ with a vengeance, for example eagerly seizing on the municipal vote which was granted to Scottish women in 1881, in order to favour candidates who wanted strict alcohol licensing. Other areas of activity included such practical institutions as the Glasgow Samaritan Hospital for ‘diseases of women’ and rescue homes for ‘female inebriates.’ It has been said that alcohol more than slavery or suffrage or any other single cause politicised American women. Megan Smitley in The Feminine Public Sphere (MUP, 2009) has convincingly argued that the same can be said for Scottish women.
In the United States the Women’s Christian Temperance Union saw through enfranchisements state by state, and sent out missionaries to New Zealand (which became the first nation to enfranchise women in 1893) and to Australia (which started enfranchising with South Australia in 1894). Isabel Napier, who was National Superintendent of the Suffrage Department of the Scottish Christian Union, grew up in New Zealand and retained strong links. “When Suffrage became law in New Zealand all their influence was thrown on the side of Temperance Reform,” she said, “and so you have the advanced laws that now obtain.” WCTU speakers toured Scotland from the Shetlands to the Borders, hosted by the Scottish Christian Union.
In contrast, English women considered the US temperance campaign vulgar and did not welcome WCTU speakers; they feared the ‘Americanisation’ of their field. Nor did English and Welsh temperance organisations officially support women’s suffrage (though individual members doubtless did).
The importance of this tradition of social activism for the independence debate has been that Scottish women were not moved by the same arguments as men. The ‘Braveheart tendency’ of independence at all costs as a patriotic ideal, regardless of the consequences, has had limited feminine appeal. As Lesley Riddoch wrote in The Scotsman: “Toughing out controversy and appearing to spoil for a fight may earn respect from male commentators and small armies of cyber-angry, anonymous men. Clever dick answers, snide-sounding put downs and swaggering arrogance turn off watching women as swiftly as they appear to engage watching men.” That was the level at which most of the independence campaign was fought, however, leading to a frantic late catch-up as more ‘woman friendly’ policies were rolled out.
The issues that women took most interest in were: How would either side deal with child poverty, low pay, and poor housing? What could be done about the European-wide disgrace of poor health and low life expectancy in parts of Scotland? Finally (and in a manner that would be instantly recognisable to nineteenth century prohibitionists) how to deal with the appalling levels of alcohol abuse in Scotland which are so damaging to personal health and family life?
Such practical matters of national renewal were often drowned out by masculine bluster.
Hadrian’s Wall has been in the news again recently for all the wrong reasons. Occasional wits have pondered on its significance in the Scottish Referendum, neglecting the fact that it has never marked the Anglo-Scottish border, and was certainly not constructed to keep the Scots out. Others have mistakenly insinuated that it is closed for business, following the widely reported demise of the Hadrian’s Wall Trust. And then of course there is the Game of Thrones angle: best-selling writer George R R Martin has spoken of the Wall as an inspiration for the great wall of ice that features in his books.
Media coverage of both Hadrian’s Wall Trust’s demise and Game of Thrones’ rise has sometimes played upon and propagated the notion that Hadrian’s Wall was manned by shivering Italian legionaries guarding the fringes of civilisation – irrespective of the fact that the empire actually trusted the security of the frontier to its non-citizen soldiers, the auxilia, rather than to its legionaries. The tendency to overemphasise the Italian aspect reflects confusion about what the Roman Empire and its British frontier were about. But Martin, who made no claims to be speaking as a historian when he spoke of how he took the idea of legionaries from Italy, North Africa, and Greece guarding the Wall as a source of inspiration, did at least get one thing right about the Romano-British frontier.
There were indeed Africans on the Wall during the Roman period. In fact, at times there were probably more North Africans than Italians and Greeks. While all these groups were outnumbered by north-west Europeans, who tend to get discussed more often, the North African community was substantial, and its stories warrant telling.
Perhaps the most remarkable tale to survive is an episode in the Historia Augusta (Life of Severus 22) concerning the inspection of the Wall by the emperor Septimius Severus. The emperor, who was himself born in Libya, was confronted by a black soldier, part of the Wall garrison and a noted practical joker. According to the account, the notoriously superstitious emperor saw in the soldier’s black skin, and his brandishing of a wreath of cypress branches, an omen of death. And his mood was not further improved when the soldier shouted the macabre double entendre iam deus esto victor (now victor/conqueror, become a god). For of course, properly speaking, a Roman emperor should first die before being divinized. The late Nigerian classicist, Lloyd Thompson, made a powerful point about this intriguing passage in his seminal work Romans and Blacks: ‘the whole anecdote attributes to this man a disposition to make fun of the superstitious beliefs about black strangers’. In fact we might go further, and note just how much cultural knowledge and confidence this frontier soldier needed to play the joke – he needed to be aware of Roman funerary practices, superstitions, and indeed the practice of emperor worship itself.
Why is this illuminating episode not better known? Perhaps it is because there is something deeply uncomfortable about what could be termed Britain’s first ‘racist joke’, or perhaps the problem lies with the source itself, the notoriously unreliable Historia Augusta. And yet as a properly forensic reading of this part of the text by Professor Tony Birley has shown, the detail included around the encounter is utterly credible, and we can identify places alluded to in it at the western end of the Wall. So it is quite reasonable to believe that this encounter took place.
Not only this, but according to the restoration of the text preferred by Birley and myself, there is a reference to a third African in this passage. The restoration post Maurum apud vallum missum in Britannia indicates that this episode took place after Severus had granted discharge to a soldier of the Mauri (the term from which ‘Moors’ derives). And as Birley has noted, we know that there was a unit of Moors stationed at Burgh-by-Sands on the Solway at this time.
Sadly, Burgh is one of the least explored forts on Hadrian’s Wall, but some sense of what may one day await an extensive campaign of excavation there comes from Transylvania in Romania, where investigations at the home of another Moorish regiment of the Roman army have revealed a temple dedicated to the gods of their homelands. Perhaps, too, evidence of different North African legacies would emerge. The late Vivian Swann, a leading expert in the pottery of the Wall, has presented an attractive case that the appearance of new forms of ceramics indicates the introduction of North African cuisine in northern Britain in the second and third centuries AD.
What is clear is that the Mauri of Burgh-by-Sands were not the only North Africans on the Wall. We have an African legionary’s tombstone from Birdoswald, and from the East Coast the glorious funerary stela set up to commemorate Victor, a freedman (former slave), by his former master, a trooper in a Spanish cavalry regiment. Victor’s monument now stands on display in Arbeia Museum at South Shields next to the fine, and rather better known, memorial to the Catuvellaunian Regina, freedwoman and wife of Barates from Palmyra in Syria. Together these individuals, and the many other ethnic groups commemorated on the Wall, remind us of just how cosmopolitan the people of Roman frontier society were, and of how a society that stretched from the Solway and the Tyne to the Euphrates was held together.
September 2014 marks the tenth anniversary of the publication of the Oxford Dictionary of National Biography. Over the next month a series of blog posts consider aspects of the ODNB’s online evolution in the decade since 2004. Here the literary historian, David Hill Radcliffe, considers how the ODNB online is shaping new research in the humanities.
The publication of the Oxford Dictionary of National Biography in September 2004 was a milestone in the history of scholarship, not least for crossing from print to digital publication. Prior to this moment a small army of biographers, myself among them, had worked almost entirely from paper sources, including the stately volumes of the first, Victorian ‘DNB’ and its 20th-century print supplement volumes. But the Oxford DNB of 2004 was conceived from the outset as a database and published online as web pages, not paper pages reproduced in facsimile. In doing away with the page image as a means of structuring digital information, the online ODNB made an important step which scholarly monographs and articles might do well to emulate.
Database design has seen dramatic changes since 2004—shifting from the relational model of columns and rows, to semi-structured data used with XML technologies, to the unstructured forms used for linking data across repositories. The implications of these developments for the future of the ODNB remain to be seen, but there is every reason to believe that its content will be increasingly accessed in ways other than the format of the traditional biographical essay. Essays are not going away, of course. But they will be supplemented by the arrays of tables, charts, maps, and graphs made possible by linked data. Indeed, the ODNB has been moving in this direction since 2004 with the addition of thousands of curated links between individuals (recorded in biographical essays) and the social hierarchies and networks to which they belonged (presented in thematic list and group entries)—and then on to content by or about a person held in archives, museums or galleries worldwide.
Online the ODNB offers scholars the opportunity to select, group, and parse information not just at the level of the article, but also in more detailed ways—and this is where computational matters get interesting. I currently use the ODNB online as a resource for a digital prosopography attached to a collection of documents called ‘Lord Byron and his Times’, tracking relationships among more than 12,000 Byron-contemporaries mentioned in nineteenth-century letters and memoirs; of these people a remarkable 5000 have entries in the ODNB. The traditional object of prosopography was to collect small amounts of information about large numbers of persons, using patterns to draw inferences about slenderly documented lives. But when computation is involved, a prosopography can be used with linked data to parse large amounts of information about large numbers of persons. As a result, one can attend to particularities, treating individuals as members of a group or social network without reducing them to the uniformity of a class identity. Digital prosopography thus returns us to something like the nineteenth-century liberalism that inspired Sir Leslie Stephen’s original DNB (1885-1900).
The key to finding patterns in large collections of lives and documents, the evolution of technology suggests, is to atomize the data. As a writer of biographies I would select from documentary sources, collecting the facts of a life, and translating them into the form of an ODNB essay. Creating a record in a prosopography involves a similar kind of abstraction: working from (say) an ODNB entry, I abstract facts from the prose, encoding names and titles and dates in a semi-structured XML template that can then be used to query my archive, comprising data from previous ODNB abstractions and other sources. For instance: ‘find relationships among persons who corresponded with Byron (or Harrow School classmates, or persons born in Nottinghamshire, etc.) mentioned in the Quarterly Review.’ An XML prosopography is but a step towards recasting the information as flexible, concise, and extensible semantic data.
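As a rough illustration of this abstraction step, the sketch below encodes a few facts from a biographical essay in a semi-structured XML record and then queries it. The record, its element names, and the facts chosen are invented for illustration; they are not the actual ‘Lord Byron and his Times’ schema.

```python
import xml.etree.ElementTree as ET

# A hypothetical semi-structured record abstracted from a biographical
# essay; the tags and attributes are invented for illustration.
record = ET.fromstring("""
<person id="byron-george-gordon">
  <name>George Gordon Byron</name>
  <born year="1788" place="London"/>
  <school>Harrow</school>
  <correspondent>John Murray</correspondent>
</person>
""")

# Once the facts are encoded, a question like 'which schools?' becomes
# a simple path query rather than a close reading of prose.
schools = [s.text for s in record.findall("school")]
birth_year = record.find("born").get("year")
```

Queries across many such records (‘Harrow classmates mentioned in the Quarterly Review’) then reduce to combining these path expressions over the whole archive.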
While human readers can easily distinguish the character-string ‘Oxford’ as referring to the place, the university, or the press, this is a challenge for computation—like distinguishing ‘Byron’ the poet from ‘Byron’ the admiral. One can attack this problem by using algorithms to compare adjacent strings, or one can encode strings by hand to disambiguate them, or use a combination of both. Digital ODNB essays are good candidates for semantic analysis since their structure is predictable and they are dense with significant names of persons, places, events, and relationships that can be used for data-linking. One translates character-strings into semantic references, groups the references into relationships, and expresses the relationships in machine-readable form.
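A crude version of the combined approach (comparing adjacent strings against hand-encoded cues) can be sketched in a few lines. The cue vocabularies below are invented for illustration and stand in for what a real gazetteer or trained model would supply.

```python
# Toy cue vocabularies for two candidate referents of the surname
# 'Byron'; a real system would learn or curate these at scale.
CUES = {
    "poet":    {"wrote", "poem", "verse", "Childe"},
    "admiral": {"fleet", "ship", "navy", "voyage"},
}

def disambiguate(tokens, index, window=3):
    """Guess which 'Byron' a mention refers to by scoring the words
    near the mention against each candidate's cue vocabulary."""
    context = set(tokens[max(0, index - window): index + window + 1])
    scores = {who: len(context & cues) for who, cues in CUES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

tokens = "Byron wrote the poem while abroad".split()
disambiguate(tokens, 0)  # 'poet'
```

Hand-encoding would then override the heuristic wherever the surrounding text gives it too little to work with.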
A popular model for parsing semantic data is via ‘triples’: statements in the form subject / property / object, which describe a relationship between the subject and the object: the tree / is in / the quad. The model is powerful because it can describe anything, and its statements can be yoked together to create new statements. For example: ‘Lord Byron wrote Childe Harold’ and ‘John Murray published Childe Harold’ are both triples. Once the three components are translated into semantically disambiguated machine-readable URIs (Uniform Resource Identifiers), computation can infer that ‘John Murray published Lord Byron.’
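The inference just described can be sketched as a simple join over a toy triple store. The `ex:` identifiers and the store itself are invented for illustration, not the ODNB’s actual data model.

```python
# A toy triple store: each statement is (subject, property, object),
# with short URI-like strings standing in for disambiguated entities.
triples = [
    ("ex:LordByron",  "ex:wrote",     "ex:ChildeHarold"),
    ("ex:JohnMurray", "ex:published", "ex:ChildeHarold"),
]

def infer_published_authors(triples):
    """Join 'wrote' and 'published' statements on their shared work
    to derive new publisher-author statements."""
    wrote = {(s, o) for s, p, o in triples if p == "ex:wrote"}
    published = {(s, o) for s, p, o in triples if p == "ex:published"}
    return {
        (pub, "ex:publishedAuthor", author)
        for pub, work in published
        for author, work2 in wrote
        if work == work2
    }

inferred = infer_published_authors(triples)
# {('ex:JohnMurray', 'ex:publishedAuthor', 'ex:LordByron')}
```

Scaled up to billions of statements, the same kind of join is what lets linked data answer questions no single biographical essay contains.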
Now imagine the contents of the ODNB expressed not as 60,000 biographical essays but as several billion such statements. In fact, this is far from unthinkable, given the nature of the material and progress being made in information technology. The result is a wonderful back-to-the-future moment with Leslie Stephen’s Victorian DNB wedded to Charles Babbage’s calculating machine: the simplicity of the triple and the power of finding relations embedded within them. Will the fantasies of positivist historians finally be realized? Not likely; while computation is good at questions of ‘who’, ‘what’, ‘where’, and ‘when’, it is not so good at ‘why’ and ‘how’. Biographers and historians are unlikely to find themselves out of a job anytime soon. On the contrary, once works like the ODNB are rendered machine-readable and cross-query-able, scholars will find more work on their hands than they know what to do with.
So the publication of the ODNB online in September 2004 will be fondly remembered as a liminal moment when humanities scholarship crossed from paper to digital. The labour of centuries of research was carried across that important threshold, recast in a medium enabling new kinds of investigation the likes of which—ten years on—we are only beginning to contemplate.
Most of what we hear and read about twelfth-century hottie Rosamund Clifford, aka “Fair Rosamund,” just wasn’t so. True, she was Henry II’s mistress. But that’s about it. Like so many other medieval myths, Rosamund’s legendary life and death are a later invention. Herewith, the best of (untrue) Rosamund:
Myth 1: She went to school at, lived at, had assignations with the king at, retired to, died at, or in any way hung out at Godstow Abbey.
Sadly, Rosamund never entered Godstow until she was a fair corpse. She died around the year 1176, in the midst of her affair with the king, and was buried at Godstow, probably because her mother was already buried there. Contrary to what you will read in various places, there is no evidence that the king paid for her tomb. Her tomb was placed in front of the high altar, and the king did show particular favor to the monastery because of it. Fifteen years later, Bishop Hugh of Lincoln made the nuns move the tomb out of the church because it was inappropriate for a “whore” to be buried there.
Myth 2: She and Henry went drinking at the Trout. Or the Perch.
I read this about the pubs near Godstow in a student handbook when I was doing my postgraduate work at Oxford, and I wanted to believe it. So did visiting relatives. Alas, not true. See Myth 1 above: no hanging out at Godstow. But my visitors and I did enjoy some pleasant pints at both the aforementioned hostelries.
Myth 3: She lived in a maze at Woodstock.
Of course this is a later embellishment, related to the next two myths. But a fairly elaborate pleasure garden does seem to have been incorporated into the royal residence at Woodstock in this period, adjacent to a room that just a generation later was known as “Rosamund’s Chamber.” So the maze story may have evolved from a real trysting place in a complex garden.
Myth 4: The queen found her in the maze by means of a silken thread.
See previous myth. But there is, just barely, a silken thread in Rosamund’s true story. After her burial at Godstow, King Henry wanted a special relationship with her burial place, so the nunnery’s patron deeded his patronal rights in Godstow to the king. In the ceremony he used a silk cloth that was later described as “a silken thread.”
Myth 5: She was murdered by Queen Eleanor of Aquitaine.
The earliest version of this story, from the fourteenth century, has Eleanor stabbing Rosamund; in Renaissance versions the queen makes Rosamund choose between stabbing and poison. Interestingly, even the Victorians made a sympathetic victim of poor Rosamund (the fornicating mistress) and turned Eleanor (the wronged wife) into a murderous monster. Needless to say, there’s no truth to the murder stories, which arose long after Rosamund died.
Myth 6: She was the mother of Henry II’s illegitimate son Geoffrey Plantagenet, archbishop of York, and/or his illegitimate son William Longespee, earl of Salisbury.
Rosamund was too young to be Geoffrey’s mother; his mother was apparently a woman named Ikeni. William Longespee was the son of Ida de Tosny.
Myth 7: Latin bell inscriptions all over England make reference to her.
These inscriptions read, “I who am struck am called Maria [or Katherine], the rose of the world.” Rosamund was a rare, possibly unique, name for a woman in twelfth-century England, but the phrases rosa munda (pure rose) and rosa mundi (rose of the world) were epithets for the Virgin Mary. It’s likely that Rosamund Clifford was named (creatively and, as it turned out, ironically) in honor of the Virgin, and that the bell inscriptions came from the same general cultural source.
Myth 8: Roses were spread over her tomb.
No, just a silken pall and candles, as far as we know. It’s possible, however, that the Gallica rose ‘Rosa Mundi’ was named for her, as her legend grew in the later Middle Ages. Perhaps the rose, like the bells, was named for the Virgin Mary, but the name of the rose is one bit of Rosamund lore that seems plausible.
September 2014 marks the tenth anniversary of the publication of the Oxford Dictionary of National Biography. Over the next month a series of blog posts explore aspects of the Dictionary’s online evolution in the decade since 2004. In this post, Henry Summerson considers how new research in medieval biography is reflected in ODNB updates.
Today’s publication of the Oxford Dictionary of National Biography’s September 2014 update—marking the Dictionary’s tenth anniversary—contains a chronological bombshell. The ODNB covers the history of Britons worldwide ‘from the earliest times’, a phrase which until now has meant since the fourth century BC, as represented by Pytheas, the Marseilles merchant whose account of the British Isles is the earliest known to survive. But a new ‘biography’ of the Red Lady of Paviland—whose incomplete skeleton was discovered in 1823 in Wales, and which today resides in Oxford’s Museum of Natural History—takes us back to distant prehistory. As the earliest known site of ceremonial human burial in western Europe, Paviland expands the Dictionary’s range by over 32,000 years.
The Red Lady’s is not the only ODNB biography pieced together from unidentified human remains (Lindow Man and the Sutton Hoo burial are others), while the new update also adds the fifteenth-century ‘Worcester Pilgrim’ whose skeleton and clothing are on display at the city’s cathedral. However, the Red Lady is the only one of these ‘historical bodies’ whose subject has changed sex—the bones having been found to be those of a pre-historical man and not, as was thought when they were discovered, those of a Roman woman.
The process of re-examination and re-interpretation which led to this discovery can serve as a paradigm for the development of the DNB, from its first edition (1885-1900) to its second (2004), and its ongoing programme of online updates. In the case of the Red Lady the moving force was in its broadest sense scientific. In this, ‘he’ is not unique in the Dictionary. The bones of the East Frankish queen Eadgyth (d.946), discovered in 2008, provide another example of human remains giving rise to a recent biography. But changes in analysis have more often originated in more conventional forms of historical scholarship. Since 2004 these processes have extended the ODNB’s pre-1600 coverage by 300 men and women, so bringing the Dictionary’s complement for this period to more than 7000 individuals.
In part, these new biographies are an evolution of the Dictionary as it stood in 2004 as we broaden areas of coverage in the light of current scholarship. One example is the 100 new biographies of medieval bishops that, added to the ODNB’s existing selection, now provide a comprehensive survey of every member of the English episcopacy from the Conquest to the Reformation—a project further encouraged by the publication of new sources by the Canterbury and York Society and the Early English Episcopal Acta series.
Taken together these new biographies offer opportunities to explore the medieval church, with reference to incumbents’ background and education, the place of patronage networks, or the shifting influence of royal and papal authority. That William Alnwick (d.1449), ‘a peasant born of a low family’, could become bishop of Norwich and Lincoln is, for example, indicative of the growing complexity of later medieval episcopal administration and its need for talented men. A second ODNB project (still in progress) focuses on late-medieval monasticism. Again, some notable people have come to light, including the redoubtable Elizabeth Cressener, prioress of Dartford, who opposed even Thomas Cromwell with success.
Away from religious life, recent projects to augment the Dictionary’s medieval and early modern coverage have focused on new histories of philanthropy—with men like Thomas Alleyne, a Staffordshire clergyman whose name is preserved by three schools—and of royal courts and courtly life. Hence first-time biographies of Sir George Blage, whom Henry VIII used to address as ‘my pig’, and at a lower social level, John Skut, the tailor who made clothes for most of the king’s wives: ‘while Henry’s queens came and went, John Skut remained.’
Alongside these are many included for remarkable or interesting lives which illuminate the past in sometimes unexpected ways. At the lowest social level, such lives may have been very ordinary, but precisely because they were commonplace they were seldom recorded. Where a full biography is possible, figures of this kind are of considerable interest to historians. One such is Agnes Cowper, a Southwark ‘servant and vagrant’ in the years around 1600; attempts to discover who was responsible for her maintenance shed a fascinating light on a humble and precarious life, and an experience shared by thousands of late-Tudor Londoners. Such light falls only rarely, but the survival of sources, and the readiness of scholars to investigate them, have also led to recent biographies of the Roman officers and their wives at Vindolanda, based on the famous ‘tablets’ found at Chesterholm in Northumberland; the early fourteenth-century anchorite Christina Carpenter, who provoked outrage by leaving her cell (but later returned to it), and whose story has inspired a film, a play and a novel; and trumpeter John Blanke, whose fanfares enlivened the early Tudor court and whose portrait image is the only identifiable likeness of a black person in sixteenth-century British art.
While people like Blanke are included for their distinctiveness, most ODNB subjects can be related to the wider world of their contemporaries. A significant component of the Dictionary since 2004 has been an interest in recreating medieval and early modern networks and associations; they include the sixth-century bringers of Christianity to England, the companions of William I, and the enforcers of Magna Carta. Each establishes connections between historical figures, sets the latter in context, and charts how appreciations of these networks and their participants have developed over time—from the works of early chroniclers to contemporary historians. Indeed, in several instances, notably the Round Table knights or the ‘Merrie Men’, it is this (often imaginative) interpretation and recreation of Britain’s medieval past that is to the fore.
The importance of medieval afterlives returns us to the Red Lady of Paviland. The biography presents what can be known, or plausibly surmised, about its subject, alongside the ways in which his bodily remains (and the resulting life) have been interpreted by successive generations—each perceptibly influenced by the cultural as well as scholarly outlook of the day. Next year sees the 800th anniversary of the granting of Magna Carta, an anniversary which can be confidently expected to bring further medieval subjects into the Oxford Dictionary of National Biography. It is unlikely that the historians responsible will be unaffected by considerations of the long-term significance of the Charter. Nor, indeed, should they be—it is the interaction of past and present which does most to bring historical biography to life.
September 2014 marked the tenth anniversary of the publication of the Oxford Dictionary of National Biography. Over the next month a series of blog posts explores aspects of the Dictionary’s online evolution in the decade since 2004. In this post, Sir David Cannadine describes his role as the new editor of the Oxford DNB.
Here at Princeton, the new academic year is very much upon us, and I shall soon begin teaching a junior seminar on ‘Winston Churchill, Anglo-America, and the “Special Relationship”’, which is always enormously enjoyable, not least because one of the essential books on the undergraduate reading list is Paul Addison’s marvellous brief biography, published by OUP, which he developed from the outstanding entry on Churchill that he wrote for the Oxford DNB. I’ve been away from the university for a year, on leave as a visiting professor at New York University, so there is a great deal of catching up to do. This month I also assume the editorial chair at the ODNB, as its fourth editor, in succession to the late-lamented Colin Matthew, to Brian Harrison, and to Lawrence Goldman.
As such, I shall be the first ODNB editor who is not resident in Britain, let alone living and working in Oxford, but this says more about our globalized and inter-connected world than it does about me. When I was contacted, several months ago, by a New York representative of OUP, asking me whether I might consider being the next editor, I gave my permanent residence in America as a compelling reason for not taking the job on. But he insisted that, far from being a disadvantage, this was in fact something of a recommendation. In following in the footsteps of my three predecessors (all, as it happens, personal friends) I am eager to do all I can to ensure that my occupancy of the editorial chair will not prove him (and OUP) to have been mistaken.
As must be true of any historian of Britain, the Oxford DNB and its predecessor have always been an essential part of my working life; and I can vividly recall the precise moment at which that relationship (rather inauspiciously) began. As a Cambridge undergraduate, I once mentioned to one of my supervisors that I greatly admired the zest, brio, and elan of J.H. Plumb’s brief life of the earl of Chatham, which I had been given a few years before as a school prize. ‘Oh’, he sniffily replied, ‘there’s no original research there; Plumb got it all from the DNB.’ Of course, I had heard of something called DNA; but what, I wondered, was this (presumably non-molecular) sequel called the DNB? Since I was clearly expected to know, I didn’t dare ask; but I soon found out, and so began a lifelong friendship.
During my remaining undergraduate days, as I worked away in the reading room of the Cambridge University Library, the DNB became a constant source of solace and relief: for when the weekly reading list seemed overwhelming, or the essay-writing was not going well, I furtively sought distraction by pulling a random volume of the DNB off the reference shelves. As a result, I cultivated what Leslie Stephen (founding editor of the Dictionary’s Victorian edition) called ‘the great art of skipping’ from one entry to another, and this remains one of the abiding pleasures provided by the DNB’s hard-copy successor. Once I started exploring the history of the modern British aristocracy, the DNB also became an invaluable research tool, bringing to life many a peer whose entry in Burke or Debrett was confined to the barest biographical outline.
Thus approached and appreciated, it was very easy to take the DNB for granted, and it was only when I wrote a lengthy essay on the volume covering the years 1961 to 1970, for the London Review of Books in 1981, that I first realized what an extraordinary enterprise it was and, indeed, had always been since the days when Leslie Stephen first founded it almost one hundred years before. I also came to appreciate how it had developed and evolved across the intervening decades, and I gained some understanding of its strengths—and of its weaknesses, too. So I was not altogether surprised when OUP bravely decided to redo the whole Dictionary, and the DNB was triumphantly reborn as the ODNB—first published almost exactly 10 years ago—to which I contributed the biographies of George Macaulay Trevelyan and Noel Annan.
Since 2004 the Oxford DNB has continued to expand its biographical coverage with three annual online updates, the most recent of which appeared last week. In September 2013 I wrote a collective entry on the Calthorpe family for an update exploring the history of Birmingham and the Black Country, and I am eager to remain an intermittent but enthusiastic contributor now that I am editor. As we rightly mark and celebrate the tenth anniversary of the publication of the ODNB, and its successful continuation across the intervening decade, it is clear that I take over an enterprise in good spirits and an organization (as the Americans would say) in good shape. Within the United Kingdom and, indeed, around the world, the ODNB boasts an unrivalled global audience and an outstanding array of global contributors; and I greatly look forward to keeping in touch, and to getting to know many of you better, in the months and years to come.
Headline image credit: ODNB, online. Image courtesy of the ODNB editorial team.
This year is the 250th anniversary of Horace Walpole’s The Castle of Otranto, first published on Christmas Eve 1764 as a seasonal ghost story. The Castle of Otranto is often dubbed the “first Gothic novel” because Walpole described it as a “Gothic story,” but for him the Gothic meant something very different from what it means today. While the Gothic was certainly associated with the supernatural, it was predominantly a theory of English progress rooted in Anglo-Saxon and medieval history — effectively the cultural wing of parliamentarian politics and Protestant theology. The genre of the “Gothic novel,” with all its dire associations of uncanny horror, would not come into being for at least another century. Instead, the writing that followed in the wake of Otranto was known as the German School, the ‘Terrorist System of Writing’, or even hobgobliana.
Reading Otranto today, however, it is almost impossible to forget what 250 years of Gothickry have bequeathed to our culture in literature, architecture, film, music, and fashion: everything from the great Gothic Revival design of the Palace of Westminster to none-more-black clothes for sale on Camden Town High Street and the eerie music of Nick Cave, Jordan Reyne, and Fields of the Nephilim.
And the cinema has been instrumental in spreading this unholy word. Despite being rooted in the history of the barbarian tribes who sacked Rome and the thousand-year epoch of the Dark Ages, the Gothic was also a state-of-the-art movement. Technology drove the Gothic dream, enabling, for instance, the towering spires and colossal naves of medieval cathedrals, or enlisting in nineteenth-century art and literature the latest scientific developments in anatomy and galvanism (Frankenstein), the circulation of the blood and infection (The Vampyre), or drug use and psychology (Strange Case of Dr Jekyll and Mr Hyde).
The moving image on the cinema screen therefore had an immediate and compelling appeal. The very experience of cinema was phantasmagoric — kaleidoscopic images projected in a darkened room, accompanied by often wild, expressionist music. The hallucinatory visions of Henry Fuseli and Gustave Doré arose and, like revenants, came to life.
Camera tricks, special effects, fantastical scenery, and monstrous figures combined in a new visual style, most notably in Robert Wiene’s The Cabinet of Dr Caligari (1920) and F. W. Murnau’s Nosferatu: A Symphony of Terror (1922). Murnau’s Nosferatu, the first vampire film, fed parasitically on Bram Stoker’s Dracula; it was rumored that Max Schreck, who played the nightmarish Count Orlok, was indeed a vampire himself. The horror film had arrived.
Mid-century Hollywood movie stars such as Bela Lugosi, who first played Dracula in 1931, and Boris Karloff, who played Frankenstein’s monster in the same year, made these roles iconic. Lugosi played Dracula as a baleful East European, deliberately melodramatic; Karloff was menacing in a different way: mute, brutal, and alien. Both embodied the threat of the “other”: communist Russia, as conjured up by the cinema. Frankenstein’s monster is animated by the new cinematic energy of electricity and light, while in Dracula the Count’s life and death are endlessly replayed on the screen in an immortal and diabolical loop.
It was in Britain, however, that horror films really took the cinema-going public by the throat. Britain was made for the Gothic cinema: British film-makers such as Hammer House of Horror could draw on the nation’s rich literary heritage, its crumbling ecclesiastical remains and ruins, the dark and stormy weather, and its own homegrown movie stars such as Peter Cushing and Christopher Lee. Lee in particular radiated a feral sexuality, enabling Hammer Horror to mix a heady cocktail of sex and violence on the screen. It was irresistible.
The slasher movies that have dominated international cinema since Hammer through franchises such as Hellraiser and Saw are more sensationalist melodrama than Gothic, but Gothic film does thrive and continues to create profound unease in audiences: The Exorcist, the Alien films, Blade Runner, The Blair Witch Project, and more overtly literary pictures such as Bram Stoker’s Dracula are all contemporary classics — as is Buffy the Vampire Slayer on TV.
And despite the hi-tech nature of film-making, the profound shift in the meaning of Gothic, and the gulf of 250 years, the pulse of The Castle of Otranto still beats in these films. The action of Otranto takes place predominantly in the dark in a suffocatingly claustrophobic castle and in secret underground passages. Inexplicable events plague the plot, and the dead — embodying the inescapable crimes of the past — haunt the characters like avenging revenants. Otranto is a novel of passion and terror, of human identity at the edge of sanity. In that sense, Horace Walpole did indeed set down the template of the Gothic. The Gothic may have mutated since 1764, it may now go under many different guises, but it is still with us today. And there is no escape.
Autumn 2014 marks the tenth anniversary of the publication of the Oxford Dictionary of National Biography. In a series of blog posts, academics, researchers, and editors look at aspects of the ODNB’s online evolution in the decade since 2004. Here the ODNB’s publication editor, Philip Carter, considers how an ever-evolving Dictionary is being transformed by new opportunities in digital research.
When it was first published in September 2004, the Oxford DNB brought together the work of more than 10,000 humanities scholars charting the lives of nearly 55,000 historical individuals. Collectively it captured a generation’s understanding and perception of the British past and Britons’ reach worldwide. But if the Dictionary was a record of scholarship within a particular timeframe, it was also seen from the outset as a work in progress. This is most evident in the decision to include in the ODNB every person who had appeared in the original, Victorian DNB. Doing so defined the 2004 Dictionary (to quote the entry on Colin Matthew, its founding editor) as ‘a collective account of the attitudes of two centuries: the nineteenth as well as the twentieth, the one developing organically from the other.’
In the decade since 2004 this notion of the ODNB as an organic ‘work in progress’ has gone a step further. This is seen, in part, in the continued extension of biographical coverage, both of the ‘recently deceased’ and of newly documented lives from earlier periods—as discussed in other articles in this 10th anniversary series. But in addition to new content there’s also been the evolution—in the form of corrections, revisions, amplifications, and re-appraisals—of a sizeable share of the ODNB’s 55,000 existing biographies, as new scholarship comes to light.
The need to ‘keep up’ with fresh research is not new. In 1908 the Victorian DNB was reprinted in an edition that collated the marginalia and correspondence born of several decades of reading. Thereafter, no further reprints were undertaken and later findings remained on file: information relating to the birthplace of the Quaker reformer Elizabeth Fry, for example—submitted by postcard in 1918—could not be addressed until the 2004 edition of the Dictionary. Such things are today unimaginable. Over the past ten years, and alongside the programme of new biographies, existing ODNB entries have been regularly updated online—with proposed amendments reviewed by the Dictionary’s academic editors in consultation with authors and reviewers. It’s worth remembering that today’s expectation of regular online updating is one that’s emerged in the lifetime of the published ODNB. Just 10 years ago, many saw online reference as a means of delivery, not a new entity in its own right. The expectation that scholarly online reference could and should keep in step with new research and publications (and could do so while maintaining academic standards) is one pioneered, in part, by works like the ODNB.
One consequence is that Dictionary editors now focus on conservation (just as museum or gallery curators care for items in their collection) as well as on commissioning. In doing so we draw heavily on an ever-growing range of digitized records that have become available in the lifetime of the published Dictionary. This has been a truly remarkable development in humanities research in the past 5 to 10 years. For British history we’ve seen the digitization of (to name a few): the census returns for England and Wales (to 1911); indexes of civil registration in England and Wales (births, marriages, and deaths from 1837); Scottish parish registers from about 1500; early modern wills and probates; and 300 years’ worth of national and provincial newspapers. And this just scratches the surface.
In 2004 there were many people in the ODNB for whom the biographical trail ran cold. Access to paper records alone once meant that certain individuals simply disappeared from the historical record. Of course, some lives remain puzzles. But with these newly digitized sources we’re now able to address many of the previously unknown and untraceable episodes that were scattered across the 2004 edition. A decade on we’ve added details of nearly 3000 previously unknown births, marriages, and deaths for ODNB subjects. Access to newly digitized sources also prompts more wide-ranging revisions. Take, for example, the traveller Eliza Fay (1755/6-1816), known for her Original Letters from India, whose Dictionary entry has recently doubled in length owing to new genealogical research that minutely plots a troubled personal life that led Fay to travel to India and the business ventures she maintained there.
The case of Eliza Fay reminds us that this boom in digitized resources is particularly valuable for better understanding the lives of nineteenth and twentieth-century women. As a result of multiple marriages and/or multiple name changes many such biographies are prone to obscurity. There are also many occasions when women gave false information about their age, often for professional reasons. With digital resources, and a little detective work, it’s now possible to recover these stories. One example is Valerie, Lady Meux (1852-1910), who married into one of Britain’s wealthiest brewing families. To her contemporaries, and to generations of researchers, Lady Meux appeared the epitome of high society. But recent research uncovers a very different story: that of Susie Langton, the daughter of a Devon baker who—via multiple changes of birthdate and name—worked her way into the London elite. To Susie Langton (or Lady Meux), the discovery of her true past may not have been welcome, but for modern historians it becomes a key part of her story, and a fascinating case study of late-Victorian social mobility.
A good deal of this detective work is being done from the ODNB office. But much more comes in from thousands of researchers worldwide who are also making use of digitized resources. It’s our good fortune that the ODNB online is growing up with the Who Do You Think You Are? generation—a band of genealogists from whom we’ve benefited greatly thanks to their willingness to share new information. Such discoveries obviously enhance our understanding of the ODNB’s 60,000 main subjects, but they’re similarly adding much to the Dictionary’s 300,000 ‘other’ people: the parents, children, spouses, in-laws, patrons, teachers, business partners, and lovers who also populate these biographies. Looking ahead to our second decade, we anticipate that more will be made of these hundreds of thousands of ‘extras’ in creating a richer picture of the British past—as the ODNB continues to document and add to what we know.