Viewing: Blog Posts Tagged with: Early Bird, Most Recent at Top
Results 1 - 25 of 37
1. China: Behind the bamboo curtain

By Patrick Wright


On 1 October 1954, Sir Hugh Casson, the urbane professor of interior design who had been director of architecture at the Festival of Britain, found himself standing by the Tiananmen Gate in the ancient and still walled city of Peking. In China to present a statement of friendship signed by nearly 700 British scientists and artists, he was watching a parade that the reporter James Cameron reckoned to be “the greatest show on earth”. First came the troops and the “military ironwork”, grinding past for a full hour. This was followed by a much longer civil parade in which the people marched by in barely imaginable numbers, beaming with joy at their elevated leaders who gazed back with the slightly “subdued” expression of still unaccustomed new emperors.

The spectacle with which China celebrated the fifth anniversary of the communist liberation was brilliantly organised, as Casson felt obliged to admit. He was less impressed by the admiring expressions worn by many of the other international guests: “Gold-rimmed spectacles misted with emotion, cheeks creased with years of well-meant service in this cause or in that, shirts defiantly open at the neck, badges in lapels, and there in the middle – could it have been? – an MCC tie.” That particular specimen was Ivor Montagu, a cricket-loving friend and translator of the great Soviet film-maker Sergei Eisenstein.

Sickened by the rapture of the communist regime’s ardent western friends, Casson quickly retreated to the shaded “rest room” beneath the viewing stand. Here he lingered among yellow-robed Tibetan lamas, sipping tea and exchanging impressions with other doubtful Britons: the classically minded and no longer Marxist novelist and poet Rex Warner, and AJ Ayer, the high-living logical positivist who would come home to tell the BBC that China’s parade had reminded him of the Nuremberg rallies.

Enraptured or appalled, none of these British witnesses appears to have regretted the absence of Stanley Spencer. The 63-year-old painter, so famously associated with the little Berkshire village of Cookham, had managed to escape the entire show – thanks, he later explained, to “some Mongolians”, whose timely arrival at the hotel that morning had provided the cover under which he retreated upstairs to his room.

It was the discovery that Spencer had been to China that persuaded me to look further into this forgotten episode. I soon realised that an extraordinary assortment of Britons had made their way to China in 1954, nearly two decades before 1972, when President Nixon made the stage-managed and distinctly operatic visit that has gone down in history as the moment when the west entered rapprochement with the People’s Republic of China. Were these motley British visitors just credulous idiots, for whom “Red China” was another version of the legendary Cathay? That is what the 24-year-old Douglas Hurd and the other diplomats in the British embassy compound in Peking appear to have suspected of these unwelcome freeloaders. Or was something more significant going on?

Nowadays, the rapidly increasing number of British travellers to China think nothing of getting on a plane to fly directly there. Yet Spencer had good reason to feel “trembly” as he and the five other members of his entirely unofficial cultural delegation approached the runway at Heathrow on 14 September 1954. Though Britain had recognised China a few months after the liberation, it had yet to establish proper diplomatic relations with the communist-led government, and the embarking Britons couldn’t pick up a visa until they had reached Prague. That meant crossing the iron curtain dividing Europe. “Did you go under or over it?” one joker would later ask, making light of a passage that was

2. Happy 75th Birthday Monopoly!

By Philip Carter


This month is the 75th anniversary of the London version of the popular board game, Monopoly. To mark the anniversary, editors at the Oxford DNB wondered what a historical version of the game might look like. The Oxford DNB includes the stories of more than 57,000 men and women from British history, of whom nearly half had ties to the capital city.

So who would you have met if you’d made your way around a Monopoly board in the 1400 years since Mellitus (d.624), our first definite capital dweller and, incidentally, the first ever bishop of London? Throw a 3 and you’re rubbing shoulders with pugilist Daniel Mendoza on the Whitechapel Road, while a 10 has you ‘just visiting’ a London jail, alongside Elizabeth Fry. (Perhaps you’re there to see Dr Crippen, who spent his last days in Pentonville prison before his execution 100 years ago this month.) Another 3 gets you to the more salubrious Whitehall (the ODNB has articles on over 1700 civil servants); an 11 sees you on the Strand, developed for real by the 17th-century property tycoon Nicholas Barbon after the Great Fire. Shake a 7 and it’s the Water Works (how about Hugh Myddelton?). Follow this with a 6 and you can browse in Bond Street, perhaps stopping at no. 123, where the Italian confectioner William Jarrin set up shop in 1822. Posh Park Lane (126 residents) and swanky Mayfair (232) beckon, not to mention £200 on passing ‘Go’. But, oh no! A 4 and it’s ‘Super Tax, Pay £100’: welcome to the ODNB’s 54 accountants.

If you’d like to play on, you can. Online you can search the Oxford DNB by city, town, and street, as well as profession.

Dr Philip Carter is Publication Editor of the Oxford Dictionary of National Biography. In the UK the ODNB is available free via nearly all public libraries and you can log-in at home by adding your library card number here. The ODNB is also available in libraries worldwide—leaving you a little bit more for that hotel on the Old Kent Road.

3. London Labour and the London Poor

By Robert Douglas-Fairhurst


It was an ordinary enough London winter’s evening: chilly, damp, and churning with crowds. I’d arranged to meet a friend at the Curzon Mayfair cinema, and after my packed tube had been held up between stations – ten sweaty minutes during which my fellow passengers had fumed silently, tutted audibly, and in one or two cases struck up tentative conversations with the person whose shopping was digging into their shins – I was late. Coming out of the entrance to the station, I nimbly side-stepped a beggar with a cardboard sign – sorry, bit of a rush, direct debit to Shelter, can’t stop – and hurried on my way to the cinema.

The film was Slumdog Millionaire: a nerve-shredding if ultimately cheering investigation into the hidden lives of the Indian slums. Coming out of the cinema, though, it was impossible to avoid the realization that equally vivid stories lay much closer to home. I retraced my steps to the tube station, and this time, instead of brushing the beggar off, I listened to what he had to say. It was a sadly familiar account of alcohol, a broken marriage, and homelessness, but as he told it the events took on a vividly personal colouring that was new and strange. He made me look again at what I thought I already knew.

The idea that what takes place under our noses can be hard to see clearly is hardly an original one; indeed, anyone who lives in a city soon learns to recognize the sensation of life being jolted out of its familiar routines, and assumptions being rearranged by new experiences. However, this idea took on a new resonance a few weeks later, when I was asked to edit a new selection of London Labour and the London Poor, Henry Mayhew’s mammoth set of interviews with the street-sellers, beggars, entertainers, prostitutes, thieves, and all the rest of the human flotsam and jetsam that had washed up in the capital during the 1840s and 1850s.

Ask most readers – and not a few critics – who Henry Mayhew was, and the result is likely to be at best a puzzled stare. Though his voice pops up occasionally in recent work, from Philip Larkin’s poem ‘Deceptions’ to novels such as Michel Faber’s The Crimson Petal and the White, for the most part he has become the Invisible Man of Victorian culture. And like H. G. Wells’s hero, usually he is detectable only by the movements of his surroundings, from Charles Kingsley’s jeremiad against the exploitation of cheap tailors in Alton Locke, to the strange echoes of his interview subjects in characters like Jo in Dickens’s Bleak House.

In some ways these literary aftershocks and offshoots of London Labour and the London Poor accurately reflect the work’s own generic hybridity. Opening Mayhew’s pages, it is hard to escape the feeling that you are encountering a writer who has one foot in the world of fact, one foot in the world of fiction, and hops between them with a curious mixture of uncertainty and glee. Sober tables of research are interrupted by facts of the strange-but-true variety: ‘Total quantity of rain falling yearly in the metropolis, 10,686,132,230,400 cubic inches’, or ‘The drainage of London is about equal in length to the diameter of the earth itself’. Even cigar-ends don’t escape his myth-making tendencies. Not content with calculating the number thrown away each week (30,000) and guessing at the proportion picked up by the

4. How to Read a Word

By Elizabeth Knowles


When I began working for Oxford Dictionaries over thirty years ago, it was as a library researcher for the Supplement to OED. Volume 3, O–Scz, was then in preparation, and the key part of my job was to find earlier examples of the words and phrases for which entries were being written. Armed with a degree in English (Old Norse and Old English a speciality) and a diploma in librarianship, I was one of a group of privileged people given access to the closed stacks of the Bodleian Library. For several years the morning began with an hour or so consulting the (large, leather-bound volumes of the) Bodleian catalogue, followed by descent several floors underground to track down individual titles, or explore shelves of books on particular topics. Inevitably, you ended up sitting on the floor leafing through pages, looking for that particular word. The hunt could sometimes be frustrating—occasionally you reached a point where it was clear that you had exhausted all the obvious routes, and only chance (or possibly six months’ reading) would take you further. But it was never dull, and the excitement of tracking down your quarry was only enhanced by the glimpses you had on the way of background information, or particular contexts in which a word had been used. Serendipity was never far removed.

The purpose, of course, was to supply the lexicographers working on the Supplement with the raw material on which the finished entry in its structured and polished form would be based. Not all the information you gained during the search, therefore, would appear in the finished entry, and some of the contextual information (for example, other names for the same thing at a particular period, or even the use of the word by a particular person) was not necessarily directly relevant. But that did not mean it wasn’t often interesting and thought-provoking for the researcher.

At the time (the late 1970s) research of this kind was carried out in what we would now call hard copy. Entries in the library catalogue might lead to a three-volume eighteenth-century novel, or the yellowed pages of a nineteenth-century journal or newspaper. It followed, therefore, that someone who wanted to research words in this way needed what I had the luck to have: access to the shelves of a major library. At the end of the first decade of the twenty-first century, all that has changed. We still (of course, and thankfully) have excellent dictionaries which can be our first port of call, and we still have library catalogues to guide us. But these resources, and many others, are now online, allowing us to sit in our own homes and carry out the kind of searches for which I had to spend several hours a day underground. With more and more early printed sources becoming digitally available, we can hope to scan the columns of early newspapers, or search the texts of long-forgotten, once popular novels and memoirs. Specialist websites offer particular guidance in areas such as regional forms of English.

The processes for searching in print and online are at once similar, and crucially different. In both cases, we need to formulate our question precisely: what exactly do we want to know? What clues do we already have? A systematic search by traditional means might be compared with climbing a ladder towards an objective—and occasionally finding that the way up is blocked. There are no further direct steps. Online searching always has the possibility that a search will bring up the key term in association with something (a name, another expression), which can start you off down another path—perhaps the equivalent to stepping across to a parallel ladder which will then take you higher.

There has never been a time at which there have been richer resources for the would-be word hunter to explore, and there are no limits to the questions that can

5. After Yemen, what now for al-Qaeda? 2010 Place of the Year

By Alia Brahimi


The air freight bomb plot should be understood as part of al-Qaeda’s pervasive weakness rather than its strength. The intended targets – a synagogue in Chicago, or a UPS plane that would explode over a western city – were chosen as part of the attempt to re-focus al-Qaeda’s violence back towards western targets and pull the jihad away from the brink.

Indeed, things haven’t worked out the way Osama bin Laden hoped they would.

Quoting such diverse sources as Carl von Clausewitz, Mao Zedong, Vo Nguyen Giap and Peter Paret, al-Qaeda strategists had repeatedly emphasised the pivotal importance of attracting the support of the Muslim masses to the global jihad. For Abu Ubeid al-Qurashi, the absence of popular support meant that the mujahidin would be no more than a criminal gang. ‘It is absolutely necessary that the resistance transforms into a strategic phenomenon’, argued Abu Mus’ab al-Suri, time and time again.

However, despite the open goal handed to bin Laden by the US-led invasion of Iraq and the increased relevance and resonance of his anti-imperial rhetoric from 2003-2006, he failed to find the back of the net. His crow to Bush about Iraq being an ‘own goal’ was decidedly premature. The credibility of bin Laden’s claim to be acting in defence of Muslims exploded alongside the scores of suicide bombers dispatched to civilian centres with the direct intention of massacring swathes of (Muslim) innocents.

Moreover, where al-Qaeda in Iraq gained control over territory, as in the Diyala and Anbar provinces, the quality of life offered to the Iraqi people was a source of further alienation: music, smoking and shaving were banned, women were forced to take the veil, punishments for disobedience included rape, the chopping of hands and the beheading of children. Brutality was blended with farce as female goats were killed because their parts were not covered and their tails turned upward.

In the end, bin Laden’s ideology, which relied first and foremost on a (poetic) narrative of victimhood, became impossible to sustain. Bin Laden’s project is profoundly moral. He casts himself as the defender of basic freedoms. He eloquently portrays his jihad as entirely defensive and al-Qaeda as the vanguard group acting in defence of the umma. He maintains that all the conditions for a just war have been met.

In reality, however, all of his just war arguments – about just cause, right authority, last resort, necessity, the legitimacy of targeting civilians – are based on one fundamental assumption: that al-Qaeda is defending Muslims from non-Muslim aggressors. As such, it is essential that (1) al-Qaeda stops killing Muslims and (2) al-Qaeda starts hitting legitimate western targets and the regimes which enable the alleged western encroachment.

The emergence of al-Qaeda in the Arabian Peninsula in January 2009 can be viewed as part of this end (much as the al-Qaeda-affiliated GSPC in Algeria formed in opposition to the moral bankruptcy of the GIA). Their publications favour targeted violence such as political assassinations and attacks within US military barracks such as that perpetrated by Major Nidal Hasan at Fort Hood. Their most high-profile operations have been an assault on the US embassy in Sana’a, an attempt to assassinate the Saudi security chief Mohammed bin Nayef, and the bid by the ‘underpants bomber’ to blow up a flight from Amsterdam to Detroit.

In Yemen, al-Qaeda in the Arabian Peninsula (AQAP) have internalised lessons from Iraq and are seeking to keep the population and the tribes on side. Their statements articulate the political and social discontent of the populace. The leadership seems to subscribe to bin Laden’s argument that violence must be used strategically and not w

6. Lend Me Your Ears

In recognition of the US midterm elections, I decided to have a browse through Lend Me Your Ears: The Oxford Dictionary of Political Quotations and share with you a few entries that have come from the American political world. Enjoy…

“I will seek the presidency with nothing to fall back on but the judgment of the people and with nowhere to go but the White House or home.”
Robert Dole 1923-, American Republican politician, announcing his decision to relinquish his Senate seat and step down as majority leader.

“One of the uses of history is to free us of a falsely imagined past. The less we know of how ideas actually took root and grew, the more apt we are to accept them unquestioningly, as inevitable features of the world in which we move.”
Robert H. Bork 1927-, American judge and educationalist, from The Antitrust Paradox (1978)

“The American people have spoken – but it’s going to take a little while to determine exactly what they said.”
Bill Clinton 1946-, 42nd President of the United States 1993-2001, on the US presidential election of 2000.

“We are a nation of communities, of tens and tens of thousands of ethnic, religious, social, business, labour union, neighbourhood, regional and other organizations, all of them varied, voluntary, and unique… a brilliant diversity spread like stars, like a thousand points of light in a broad and peaceful sky.”
George Bush Sr. 1924-, 41st President of the United States, acceptance speech at the Republican National Convention, 18 August 1988.

“No sane local official who has hung up an empty stocking over the municipal fireplace, is going to shoot Santa Claus just before a hard Christmas.”
Alfred Emanuel Smith 1873-1944, American politician, comment on the New Deal in New Outlook, Dec 1933

“I suggested [in 1966] that we use the panther as our symbol and call our political vehicle the Black Panther Party. The panther is a fierce animal, but he will not attack until he is backed into a corner; then he will strike out.”
Huey Newton 1942-1989, American political activist, from Revolutionary Suicide (1973)

“Although we weren’t able to shatter that highest, hardest glass ceiling this time, thanks to you, it has about 18 million cracks in it.”
Hillary Rodham Clinton 1947-, American lawyer and Democratic politician, speech to her supporters, conceding the Democratic party presidential nomination to Barack Obama, 7 June 2008.

“The oldest, wisest politician grows not more human so, but is merely a grey wharf-rat at last.”
Henry David Thoreau 1817-1862, American writer, from Journal (1853)

“On my arrival in the United States I was struck by the degree of ability among the governed and the lack of it among the governing.”
Alexis de Tocqueville 1805-1859, French historian and politician, from Democracy in America (1835-40) vol. 1

7. What has become of genius?

By Andrew Robinson


In the early 21st century, talent appears to be on the increase, genius on the decrease. More scientists, writers, composers, and artists than ever before earn a living from their creative output. During the 20th century, performance standards and records continually improved in all fields—from music and singing to chess and sports. But where is the Darwin or the Einstein, the Mozart or the Beethoven, the Chekhov or the Shaw, the Cézanne or the Picasso or the Cartier-Bresson of today? In the cinema, the youngest of the arts, there is a growing feeling that the giants—directors such as Charles Chaplin, Akira Kurosawa, Satyajit Ray, Jean Renoir, and Orson Welles—have departed the scene, leaving behind the merely talented. Even in popular music, genius of the quality of Louis Armstrong, The Beatles, or Jimi Hendrix seems to be a thing of the past. Of course, it may be that the geniuses of our time have yet to be recognized—a process that can take many decades after the death of a genius—but sadly this seems unlikely, at least to me.

In saying this, I know I am in danger of falling into a mindset mentioned by the great 19th-century German explorer of South America and polymath Alexander von Humboldt, ‘the Albert Einstein of his day’ (writes a recent biographer), in volume two of his five-volume survey Cosmos. ‘Weak minds complacently believe that in their own age humanity has reached the culminating point of intellectual progress,’ wrote Humboldt in the middle of the century, ‘forgetting that by the internal connection existing among all the natural phenomena, in proportion as we advance, the field to be traversed acquires additional extension, and that it is bounded by a horizon which incessantly recedes before the eyes of the inquirer.’ Humboldt was right. But his explorer’s image surely also implies that as knowledge continues to advance, an individual will have the time to investigate a smaller and smaller proportion of the horizon with each passing generation, because the field will continually expand. So, if ‘genius’ requires breadth of knowledge, a synoptic vision—as it seems to—then it would appear to become harder to achieve as knowledge advances.

The ever-increasing professionalisation and specialisation of education and domains, especially in the sciences, is undeniable. The breadth of experience that feeds genius is harder to achieve today than in the 19th century, if not downright impossible. Had Darwin been required to do a PhD in the biology of barnacles, and then joined a university life sciences department, it is difficult to imagine his having the varied experiences and exposure to different disciplines that led to his discovery of natural selection. If the teenaged Van Gogh had gone straight to an art academy in Paris, instead of spending years working for an art dealer, trying to become a pastor, and teaching himself art while dwelling among poor Dutch peasants, would we have his late efflorescence of great painting?

A second reason for the diminution of genius appears to be the ever-increasing commercialisation of the arts, manifested in the cult of celebrity. True originality takes time—at least ten years, as I show in my book Sudden Genius?—to come to fruition; and the results may well take further time to find their audience and market. Few beginning artists, or scientists, will be fortunate enough to enjoy financial support, like Darwin and Van Gogh, over such an extended period. It is much less challenging, and more remunerative, to make a career by producing imitative, sensational, or repetitious work, like Andy Warhol, or any number of professional scientists who, as Einstein remarked, ‘take a board of wood, look for its thinnest part, and drill a great number of holes when the drilling is easy.’

Thirdly, if less obviously, our expectations of modern genius have become more sophisticated and discriminating since the time of the 19th-century Romantic movement

8. What is Energy?

By Jennifer Coopersmith


Energy is the go of things, the driver of engines, devices and all physical processes. It can come in various forms (electrical, chemical, rest mass, curvature of spacetime, light, heat and so on) and change between these forms, but the total is always conserved. Newton missed energy, and it was Leibniz who discovered kinetic energy (he called it vis viva). The idea was promoted on the continent in the first half of the 18th century, chiefly by the Bernoullis, a Swiss family of feuding mathematicians. The more subtle concept, potential energy, slipped in over a hundred years, uninvited, like the 13th fairy at the party.

In Feynman’s profound allegory (‘Dennis the Menace’ playing with blocks), energy is defined by its property of being conserved. But this doesn’t answer all our intuitions about energy. Why does it change smoothly between its various forms? For example, when a child swings on a swing, her kinetic energy decreases as the swing climbs (and gains gravitational potential energy) and then, as the swing descends, she goes faster and faster.

A different approach holds the answer. Consider the walk to the shops. You could take the shortest route, or you could optimize other aspects, e.g. take a longer but less hilly route, or one more shady or with fewer road-crossings. Nature also works in this optimizing way: it tries to minimize the total ‘action’ between a starting place and a final destination. ‘Action’ is defined as ‘energy’ times ‘time’, and, in order to minimize action, the energy must be able to change in a prescribed way, smoothly and continuously, between its two forms, kinetic and potential energy. (The Principle of Least Action was discovered by an eccentric Frenchman, Pierre-Louis Moreau de Maupertuis, while head of the Berlin Academy of Science, in the mid 18th century.)
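In standard notation (not spelled out in the post), the ‘action’ is the time integral of the Lagrangian – kinetic minus potential energy – and the path a system actually follows is the one that makes the action stationary; a sketch:

```latex
% Action: the quantity Nature minimizes (more precisely, makes stationary)
S[q] \;=\; \int_{t_1}^{t_2} L\big(q(t), \dot{q}(t)\big)\, \mathrm{d}t,
\qquad L = T - V
% e.g. for the child on the swing: T = \tfrac{1}{2} m v^2 (kinetic),
% V = m g h (gravitational potential)
% A stationary S forces the Euler--Lagrange equation, which is what
% guarantees the smooth, continuous exchange between T and V along the path:
\frac{\mathrm{d}}{\mathrm{d}t}\, \frac{\partial L}{\partial \dot{q}}
\;-\; \frac{\partial L}{\partial q} \;=\; 0
```

The swing’s smooth trade between speed and height is exactly the behaviour this stationarity condition enforces.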

What are kinetic and potential energy? Kinetic energy is the energy of motion of an individual body whereas potential energy is the energy of interaction of parts within a system. Potential energy must be specified for each new scenario, but kinetic energy comes in one essential form and is more fundamental in this sense. However, as potential energy relates to internal aspects (of a system), it doesn’t usually change for differently moving ‘observers’. For example, the game of billiards in the lounge of the ocean liner continues unaffected, whether that liner is coasting smoothly at 30 kph or whether it’s moored to a buoy. The kinetic energy of the liner is vastly different in the two cases.

But sometimes potential energy and even mass do change from one ‘reference frame’ to another. The more fundamental quantity is the ‘least action’, as this stays the same, whatever the (valid) ‘observer’.

Heat energy is the sum of the individual microscopic kinetic energies. But the heat energy and the kinetic energy of an everyday object are very different (e.g. the kinetic energy of a kicked football and the heat energy of a football left to warm in the sun). In fact, for the early 19th century natural philosophers, considering heat as a form of energy was like committing a category error. The slow bridging of this error by people like Daniel Bernoulli, Count Rumford, Julius Robert Mayer and James Joule makes a very interesting tale.
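To put rough numbers on the football comparison – a minimal sketch in which the mass, kick speed, specific heat, and temperature rise are all illustrative assumptions, not figures from the post – the bulk kinetic energy and the microscopic heat energy can be compared directly:

```python
# Compare the bulk kinetic energy of a kicked football with the extra
# microscopic (heat) energy it holds after warming in the sun.

def kinetic_energy(mass_kg, speed_m_s):
    """Bulk kinetic energy, E = (1/2) m v^2, in joules."""
    return 0.5 * mass_kg * speed_m_s ** 2

def thermal_energy_gain(mass_kg, specific_heat_j_kg_k, delta_t_k):
    """Extra heat energy from a temperature rise, E = m c dT, in joules."""
    return mass_kg * specific_heat_j_kg_k * delta_t_k

mass = 0.43     # kg   – a regulation football (assumed)
speed = 30.0    # m/s  – a hard kick (assumed)
c = 1200.0      # J/(kg K) – rough specific heat of leather/rubber (assumed)
delta_t = 10.0  # K    – warming in the sun (assumed)

ke = kinetic_energy(mass, speed)              # ~194 J
heat = thermal_energy_gain(mass, c, delta_t)  # ~5160 J

print(f"kinetic: {ke:.0f} J, heat gained: {heat:.0f} J")
```

Even a modest temperature rise stores far more energy in disordered molecular motion than a hard kick gives the ball as a whole, which is why the two kinds of ‘kinetic energy’ feel so different in everyday life.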

With regard to the looming energy crisis and global warming, here are the things we must remember:

1. Nature always counts the true cost, even if we don’t
2. There is no such thing as safe energy – it is energetic, after all
3. As the sink of all our activities becomes warmer, so all our ‘engines’ (cars, humans, and so on) will run less efficiently
4. We must consider not only energy but also ‘least action’ – and take action.

Jenn

9. Are We Masters of Our Own Destiny?

By Jeremy Taylor


On Friday, 20th August, I joined the panel for a Great Debate entitled “Are We Masters Of Our Own Destiny?” at the University of Newcastle, organized as part of the Green Phoenix Festival, 2010. My fellow panelists were science writer Rita Carter, most famous for her books on neuroscience: The Brain Book and Mapping The Mind, and local philosopher David Large. The debate was chaired by Caspar Hewett. As we suspected, this pitted two biology-oriented commentators against a more conventional philosopher who answered the question in the affirmative because he believed we can control our own destiny in the sense that Joyce could write his masterpiece Ulysses and Wittgenstein formulate his idiosyncratic theories. The nature of Joyce-ness, Ulysses-ness, Wittgenstein-ness, and the product of the mind and skill of great artists – Rembrandt-ness if you like – transcended “mere” functional explanations of what the mind is. He took umbrage at psychology, which, he claimed, pretends its functional explanation of how the mind works is the explanation. It isn’t.

Rita Carter saw things very much from the bottom up rather than the top down. The mind is made up (literally!) by myriads of tiny, unconscious neuro-chemical events in our brains. She therefore believed free will is an illusion deeply wired into the brain as a set of mechanisms which automatically create the sense of self and agency to make it feel as though we decide what our acts will be – that we are responsible for them – rather than merely responding to stimuli.

I agreed strongly with Rita, suggesting that – much as free will may be an illusion – a large school of modern neuroscientists believe that our moral behaviour is produced not by moral reasoning but by the input of extremely simple neurochemical data from our sense organs and receptors, which is turned into moral intuitions in our brains by processes of which we are oblivious – the intuition simply pops into our heads. We then apply moral reasoning to our intuitions in a post-hoc sense in order to justify these instinctive beliefs. I agreed with one prominent such neuroscientist who claims that the conscious mind is like the mahout on an elephant. The elephant is the other 99% of what is going on in our minds – things that are unconscious and automatic. If free will and morality are the unconscious products of the way our brains work, thought a number of members of the audience, what, then, is the advantage to us of the illusion that we are in control? Carter argued that without the illusion that we are responsible for our own actions, and that we are therefore accountable for them, no society could possibly function; while I argued that the illusion of moral responsibility is a social phenomenon which evolved as a sort of social glue holding human groups together by commonly agreed norms and principles “outsiders” do not share. In that sense it is similar to the evolution of theory of mind – by which we explain other people’s actions by inferring the hidden states of mind – their wants, beliefs and knowledge – that must be guiding them. If a teacher had no inkling that he possessed knowledge his pupil lacked, and the pupil could not learn unless that knowledge was efficiently transferred from one brain to another, no culture could thrive and be

10. What is the point of agnosticism?

By Robin Le Poidevin


Do we really need agnosticism nowadays? The inventor of the name ‘agnosticism’, the Victorian evolutionist Thomas Henry Huxley, certainly found it useful to have a word describing his lack of certainty when he was surrounded by those who seemed to have no such doubt. But then he lived in a period of transition. Science, and in particular biology, appeared to undermine old certainties. On the one hand, churchmen were promoting the importance of unshakeable faith. On the other, there were philosophers advocating a materialist and anti-religious outlook. Huxley felt he couldn’t identify with either side. If the Gnostics were those who claimed to have access to a special route to religious knowledge, then Huxley would be an a-gnostic, one who does not profess to know. But perhaps agnosticism served only as a temporary stopping point en route to a more satisfactory position, a stepping stone from faith to atheism.

For Richard Dawkins, a scientist, writer and perhaps today’s most vocal atheist, we have already crossed that river. It was perhaps reasonable to be an agnostic in Huxley’s time, when it was not yet clear how science could answer some of the awkward questions posed by believers: How, if there is no divine designer, could intelligence have developed? What is the source of our moral conscience? Why was the universe so congenial to the emergence of life? Now that we have some detailed answers, the idea of God is de trop. And so too is agnosticism, apparently.

What is Dawkins’ thinking here? First, the agnostic’s point – that we can’t know for certain whether or not God exists – is not a very interesting one. There are lots of things we don’t know for sure. We don’t know that Mars isn’t populated by fairies. Of course, we are not remotely inclined to believe that it is, but still we don’t have conclusive proof. Nevertheless, we don’t describe ourselves as agnostics about Martian fairies. Similarly, atheists can admit that they don’t have conclusive proof of God’s non-existence.

Second, not having conclusive proof does not make God’s existence just as probable as his non-existence. Moving from ‘not certain’ to ‘50/50 chance either way’ is what we might call the agnostic fallacy.

Third, a necessary feature of God makes his existence highly improbable, namely his complexity. Of course, the world itself is complex – unimaginably so – but then science has an explanation of this complexity in terms of a series of gradual evolutionary steps from simpler states. In contrast there is no evolutionary account of God’s complexity: his nature is supposed to be eternal. And that there should just exist such complexity, with no explanation, is highly improbable.

That’s a very plausible line of thought. The conclusion is that, unless you think you have overwhelming evidence for God, the rational thing is to be an atheist. But it rests on a questionable assumption. There is still room for an interesting form of agnosticism. Take a look at the third point above: that God must be complex, and so improbable. It is a part of traditional theology that God is in fact simple. Dawkins finds this incredible: how can something responsible for the creation of the world, and who has perfect knowledge of it, be less complex than that creation? There are, however, different kinds of complexity. A language is complex in one sense, in that it contains a virtually limitless range of possible expressions. But those expressions are generated from a finite number of letters, and a finite number of rules concerning the construction of sentences. A language may be complex in its variety but (relatively) simple with respect to the components and principles that give rise to that complexity. When the philosopher Gottfried Wilhelm Leibniz opined that God had created ‘the best of all possible worlds’, his

11. Science, religion, and magic

By Alec Ryrie


My book started out as a bit of fun, trying to tell a rollicking good story. I did that, I hope, but I also ended up somewhere more controversial than I expected: caught in the ongoing crossfire between science and religion. What I realised is that you can’t make sense of their relationship without inviting a third ugly sister to the party: magic.

The links between science and magic are pretty obvious. Science, basically, is magic that works. A lot of things that look pretty scientific to us were labelled ‘magic’ in the pre-modern period: chemistry, magnetism, even hydraulics – to say nothing of medicine. The only real difference is that modern science has a rigorous experimental basis. Arthur C. Clarke famously said that any sufficiently advanced technology is indistinguishable from magic. But to the novice, all science is indistinguishable from magic. Try showing a magnet to an astonished four-year-old and asking them how you did it.

Of course, science and magic are supposed to be enemies nowadays. Scientists despise magic, but still read their children fairy tales. Modern pagans dislike ‘scientism’ but they love information technology.

Religion and magic have the same sort of ambiguous relationship. They’re obviously connected: both try to bring humanity in touch with supernatural powers. And they hate each other: the Abrahamic religions, at least, have always seen magic as heretical if not diabolical, and the view in the other direction isn’t much more complimentary. But the line between the two is pretty fuzzy. The theory is that magic is about trying to manipulate supernatural powers (with the magician in charge of the process) while religion is about submitting to or petitioning those powers (with God in charge). In practice, that breaks down, as magicians seek transcendent experiences and priests promulgate infallible books or sacraments.

In Christianity, though, this kind of talk has a confessional edge to it. Protestants have always argued that their (OK, full disclosure: our) form of Christianity is less tainted by magic, while Catholicism is riddled with superstition, obscurantism and priestcraft. Writing this book convinced me that this is nonsense.

Yes, Catholicism is more ritualistic. But early Protestantism was up to its neck in magic too. How could it not be? The best minds of the sixteenth century all took magic immensely seriously. It’s true that Protestants were uneasy about the way astrology (say) was being used, but they found it easier to mock it than to prove it wrong. And when they did mock it they sounded crude, like flat-earthers denying the moon landings, or creationists using what Richard Dawkins calls ‘the argument from personal incredulity’ to deny evolution.

The truth was that, in the sixteenth century, only a fool would deny that magic was real. The Renaissance was turning the world upside down, sending the Earth round the Sun; explorers were discovering whole new continents. As I say in the book:

In our own age, scepticism and disbelief seem intellectually sophisticated; in the sixteenth century, they seemed self-limiting and perverse. It was unmistakable that there were more things in heaven and earth than had been dreamed of in the old philosophies. Credulity, or at least a willingness to believe, was the only sensible way of looking at the world. And when you have adopted a new mathematics, a new astronomy, a new geography and a new religion, why balk at a new magic?

So I hope the story I’m telling in this book has a serious point to make. I’m not trying to persuade anyone to be a magician (heaven forbid), but to recognise that one of the reasons science and religion have been so antagonistic is that they have a third sibling: this is a family quarrel. And both of them could do with hearing their sister’s wa

12. How do you write a Very Short Introduction to English Literature?

By Jonathan Bate

 
My last three books have been a 670-page life of the agricultural labouring poet John Clare, a two-and-a-half-thousand-page edition of the complete works of Shakespeare, and a 500-page “intellectual biography” of Shakespeare in the context of his age. So how could I resist an invitation from OUP to write a VERY SHORT book! Mind you, it was a ludicrous proposition to introduce a subject the size of English Literature in a mere 50,000 words (I pushed them up from the standard 40k limit for the series by cunningly asking for 60k and splitting the difference…). But the series guidelines were very helpful: “The text should not read like an encyclopedia entry or a textbook; depending on the topic, it may be more comprehensive or more idiosyncratic in its coverage. Don’t be afraid to express a point of view or to inject some style into the prose. Focus on issues, details, and context that make the subject interesting; you should draw your reader in with examples and quotations. Give the reader a sense both of your subject’s contours and of the debates that shape it.” Good principles, which have made for a great series – so many people have said how much they like these little books.

So how did I set about the task? Being a Literary History Man, I began by looking for literary historical precedent.

In 1877 a chaplain to Queen Victoria called the Reverend Stopford A. Brooke published a primer for students and general readers called English Literature. By the time of his death, half a million copies were in print. 160 pages long and produced in handy pocket format, it is the Victorian equivalent of a VSI. Brooke surveyed a vast terrain, from Beowulf and Caedmon to Charlotte Brontë and Alfred Tennyson, with admirable tenacity and vigour, if a little too much patriotic uplift and Anglo-Saxon prejudice for modern taste. But his even-paced chronological march and his desire to give at least a name-check to every author he considered significant meant that his little book too often reduced itself to a parade of the greatest (and not so great) hits of English literature. Faced with a similar task to Brooke’s, and more than one hundred further years’ literary production to cover, I adopted a more varied and selective approach. I made no attempt to offer a historical survey of English poets, novelists, playwrights and non-fiction writers. Frequently I skip over generations in a single step; I loop forward and back in time as I identify key themes.

I devote a good deal of attention to questions of origin. From where do we get the idea of literature as a special kind of writing? What could justifiably be described as the first work of English literature and when did the conception of a body of national literature emerge? Which practising novelist wrote the first self-conscious defence of the art of the novel? These are some of the questions I have tried to answer.

Sometimes, I slow the pace and tighten the focus, exploring, for example, a scene from Shakespeare’s King Lear, an instance of the technique of “free indirect discourse” in Jane Austen’s Emma, a poignant stanza of nonsense by Edward Lear, a compositional change of mind on the part of Wilfred Owen, and Seamus Heaney’s preoccupation with prehistoric bodies excavated from Danish peat bogs. I make no apology for these moments of “close reading”: if the study of English Literature is to be true to its object, it must attend to particular words and phrases, verse lines and sentences, movements of thought and structures of writing. My sampling of passages, works, and forms of attention is eclectic – deliberately so, for there is no other body of writing upon earth more varied and inexhaustible than English Literature. That thought makes any attempt to write a “very short introduction” to the subject both deeply

13. Questioning Alternative Medicine

By Roberta Bivins


As a historian who writes about the controversial topic of ‘alternative medicine’, I get a lot of questions about whether this or that therapy ‘works’. Sometimes, these questions are a test of my objectivity as a researcher. My questioners want to know whether I am ‘believer’, or a fan of alternative medicine, or have any stake in promoting or disdaining a given medical system. Other people are asking simply for advice: is it worth trying acupuncture, say, or homeopathy for a particular condition? From either angle, such questions ask me to take a stand on whether homeopathy is quackery, or whether I believe in acupuncture channels, or chiropractic manipulation.

My instinctive – if perhaps unhelpful – response to such questions is, more or less, to shrug my shoulders and reply that I don’t really care: the issue of therapeutic efficacy isn’t at the heart of my research on this fascinating subject. Instead, I want to know what lies behind the enduring popularity of alternative medicine, what is (or is not) really ‘alternative’ about it, and why so many of biomedicine’s current crop of ‘alternatives’ have been imported from very different global medical cultures. These are questions that a historian can answer. They are also questions that shed more light on the persistence of alternative medicine than would a yes or no answer about the efficacy of any given technique. After all, we know that once-respected mainstream therapies like bloodletting and purging enjoyed centuries of popularity despite being uncomfortable, potentially dangerous and (in light of today’s medical knowledge) ineffective. Even today, patients prescribed antibiotics for a nasty cold often report feeling better after taking them – despite knowing that most colds are actually caused by viruses, and thus immune to antibiotic therapy.

My position has not always been popular with my fellow authors writing on the topic. They are often passionately committed supporters or opponents of alternative therapies, and demand that I become one or the other as well. But history studies the interplay of light and shadow, not the boundaries between black and white. So I am happy to let the healers fight it out in the battle to prove or disprove the efficacy of their chosen treatments. My job as a historian is to remind them — and to remind us all as consumers — that even the most objective evidence remains historically contingent: no medical experiment can escape from its social milieu, since both its designers and its subjects are shaped by their own historical and cultural context and beliefs.

For example, in contemporary biomedicine, it is conventional to separate the mind and the body when designing a medical experiment: hence the rise of the double-blinded random controlled trial as medicine’s ‘gold-standard’ of proof. Yet physicians and researchers simultaneously acknowledge the impact of the mind on bodily processes. They call it the ‘placebo effect’. As understandings of the mind-body relationship become more sophisticated, it is possible that the blinded RCT will fall from favour, as a limited test of therapeutic activity which obscures an important variable. Such changes have happened in the past, as evidenced by the shifting balance between deductive and inductive reasoning in scientific experimentation since the Scientific Revolution, or the changing status of ‘empiricism’ in western medicine since the 18th century. Then again, it may not: history is not a predictive science! My point is that today’s objective truths are neither value-free nor future-proof.

More practically, it is also my task to point out that the arguments used on either side — for instance, ‘homeopathy is bunk; no trace of the medicinal substance remains in a homeopathic dilution’, or ‘biomedicine reduces h

14. Dressing Up, Then and Now

By Ulinka Rublack


I will never forget the day when a friend’s husband returned home to Paris from one of his business trips. She and I were having coffee in the huge sunlit living room overlooking the Seine. We heard his key turn in the big iron door. Next a pair of beautiful, shiny black shoes flew through the long corridor with its elegant parquet floor. Finally the man himself appeared. “My feet are killing me!”, he exclaimed with a veritable sense of pain. The shoes were by Gucci.

We might think that these are the modern follies of fashion, which only now beset men as much as women. My friend too valued herself partly in terms of the wardrobe she had assembled and her accessories of bags, sunglasses, stilettos and shoes. She had modest breast implants and a slim, sportive body. They were moving to Dubai. In odd hours when she was not looking after children, going shopping, walking the dog, or jogging, she would write poems and cry.

Yet, surprisingly, neither my friend nor her husband would seem very much out of place around 1450. Men then wore long pointed Gothic shoes, which hardly look comfortable and made walking down stairs a special skill. In a German village, a wandering preacher once got men to cut off their shoulder-length hair and slash the tips of their pointed shoes. Men and women aspired to an elongated, delicate and slim silhouette. Very small people seemed deformed and were given the role of grotesque fools. Italians were already writing medical books on cosmetic surgery.

We therefore need to unlock an important historical problem: how and why have looks become more deeply embedded in how people feel about themselves or others? I see the Renaissance as a turning point. Tailoring was transformed by new materials, cutting, and sewing techniques. Clever merchants created wide markets for such new materials, innovations, and chic accessories, such as hats, bags, gloves, or hairpieces, ranging from beards to long braids. At the same time, Renaissance art depicted humans on an unprecedented scale. This means that many more people were involved in the very act of self-imaging. New media – medals, portraits, woodcuts, genre scenes – as well as the diffusion of mirrors enticed more people into trying to imagine what they looked like to others. New consumer and visual worlds conditioned new emotional cultures.

A young accountant of a big business firm, called Matthäus Schwarz, for instance, could commission an image of himself as fashionably slim and precisely note his waist measurements. Schwarz worried about gaining weight, which to him would be a sign of ageing and diminished attractiveness. While he was engaged in courtship, he wore heart-shaped leather bags as accessories. They were green, the colour of hope. Hence the meaning of dress could already become intensely emotionalized. The material expression of such new emotional worlds – heart-shaped bags for men, artificial braids for women, or red silk stockings for young boys – may strike us as odd. Yet their messages are all still familiar, to do with self-esteem, erotic appeal, or social advancement, as are their effects, which ranged from delight in wonderful crafting to worries that you had not achieved a look, or that someone had just deceived you with theirs. In these parts of our lives the Renaissance becomes a mirror which leads us back in time to disturb the notion that the world we live in was made in a modern age.

Ever since the Renaissance, we have had to deal with clever marketing as well as the vexing questions of what images want, and what we want from images, as well as whether clothes wear us or we wear them.

Ulinka Rublack is Senior Lecturer in early modern European history at Cambridge University and a Fellow of St John’s College. Her latest

15. On Eavesdropping

Eavesdropping has a bad name. It is a form of human communication in which the information gained is stolen, and where such words as cheating and spying come into play. But eavesdropping may also be an attempt to understand what goes on in the lives of others so as to know better how to live one’s own. John L. Locke’s entertaining and disturbing new book, Eavesdropping: An Intimate History, explores everything from sixteenth-century voyeurism to Facebook and Twitter. Below is a short excerpt from the book’s prologue, explaining why he finds eavesdropping so fascinating.

On a flight from Milan to London I was slumped down in my aisle seat, deep in thought as I reviewed an early draft of the manuscript that has become this book. Unbeknownst to me, I was being watched by a woman in the middle seat of the row immediately in front. After we had landed and the passengers were commencing the customary disembarking ritual, the woman startled me by looking over her headrest and pointedly asking if I was writing a book. I answered that I was. What’s it about, she asked. I said my book concerned the intense desire of members of our species to know what is going on in the personal lives of others. At this, the woman burst into ironic laughter since first in watching, and then in asking, she had just expressed two different forms of that very desire.

Watching and asking produce a form of intimate experience, which can be enjoyable in its own right, as well as intimate images, which may be re-experienced when privately brought to mind or – as information – shared with others. Intimacies tend to circulate preferentially among people who know and trust each other, and they usually move swiftly. Since many of these “secrets” ultimately become public knowledge, a look at how intimate material travels enables us to understand the social foundations of scandal, rough justice, and the “news,” even “history.”

I smiled in response to the lady on the plane but I could just as well have laughed, too, for here I was, writing a book about a subject on which there was little in the way of directly relevant research. Indeed, until I began to study eavesdropping – one of the more important ways that ordinary people express the desire at issue – I had never, in many years of research, encountered a behavior whose actual significance was so greatly at variance with its recognized importance. Look for books on social behavior with the word “eavesdropping” in the index and you are likely to be severely disappointed. Enter the same word in computerized literature searches and your screen will display a list of books on wiretapping and other forms of electronic surveillance. But the word was coined centuries before telephones and recording equipment were invented, and the practice of eavesdropping was documented nearly a thousand years earlier, when people were happy to entrust to unaided senses the question of who was doing what to whom.

Just after I began my studies of eavesdropping, a colleague asked me why I had chosen to address this particular subject. It must have seemed a radical departure from my previous work on the psychology of language. I told him that I had come across Marjorie McIntosh’s analysis of court records indicating that five and six centuries ago, English citizens had, in impressive numbers, been arrested for eavesdropping. I wondered what, in the medieval mind, would have caused this behavior to be criminalized, and what the “criminals” themselves were doing, or thought they were doing, when they went out at night and listened to their neighbors’ conversations.

I had also begun to study ethology, a field that deals with behavior in a broad range of species, and had encountered the work of Peter McGregor. He pointed out that birds increase their chances of survival by monitoring the long-distance calls of other bir

16. London Place Names: Some Origins

From Garlick Hill to Pratt’s Bottom, London is full of weird and wonderful place names. We’ve just published the second edition of A.D. Mills’s A Dictionary of London Place Names, so I thought I would check out the roots of some of London’s most famous addresses.

Abbey Road (in St. John’s Wood): Developed in the early 19th century from an earlier track, and so named from the medieval priory at Kilburn to which it led. Chiefly famous of course as the name of the 1969 Beatles album recorded here at the EMI studios.

Baker Street: Recorded thus in 1794, named after the builder William Baker who laid out the street in the second half of the 18th century on land leased from the estate in Marylebone of Henry William Portman. Remarkably enough, Baker Street’s most famous resident (at No. 221B) was a purely fictional character, the detective Sherlock Holmes created by Sir Arthur Conan Doyle in 1887!

Buckingham Palace: The present palace stands on the site of Buckingham House 1708, so named after John Sheffield, Duke of Buckingham, who had it built in 1702 (on land partly leased from the Crown) and whose heir sold it to George III in 1762. This building was rebuilt and much enlarged in the 1820s and 1830s according to the designs of John Nash and Edward Blore, becoming Queen Victoria’s favourite town residence when she came to the throne in 1837. The site was earlier known as Mulbury Garden feild 1614, The Mulbury garden 1668, the walled garden having been planted with thousands of mulberry trees by James I who apparently had the grand idea of establishing a silk industry in London.

Canary Wharf: The grand commercial development with its massive 850ft tower (the highest building in the country), begun in 1987, takes its name from a modest fruit warehouse! Canary Wharf was the name given to a warehouse built in 1937 for the Canary Islands and Mediterranean fruit trade of a company called ‘Fruit Lines Ltd’. The name of the Spanish island of Canary (i.e. Gran Canaria, this giving its name to the whole group of ‘Canary Islands’) is of course also of interest: it is derived (through French and Spanish) from Latin Canaria insula, that is ‘isle of dogs’ (apparently with reference to the large dogs found here).

Drury Lane: Recorded thus in 1598, otherwise Drewrie Lane in 1607, named from Drurye house 1567, the home of one Richard Drewrye 1554. The surname itself is interesting; it derives from Middle English druerie ‘a love token or sweetheart’. The lane was earlier called Oldewiche Lane 1393, street called Aldewyche 1398, that is ‘lane or street to Aldewyche (‘the old trading settlement’).

Knightsbridge: Cnihtebricge c.1050, Knichtebrig 1235, Cnichtebrugge 13th century, Knyghtesbrugg 1364, that is ‘bridge of the young men or retainers,’ from Old English cniht (genitive case plural -a) and brycg. The bridge was where one of the old roads to the west crossed the Westbourne stream. The allusion may simply be to a place where cnihtas congregated: bridges and wells seem always to have been favourite gathering places of young people. However there is possibly a more specific reference to the important cnihtengild (‘guild of cnihtas’) in 11th-century London and to the limits of its jurisdiction (certainly Knightsbridge was one of the limits of the commercial jurisdiction of the City in the 12th century).

Piccadilly: This strange-looking stre

17. Some pictures from Hay

A couple of weeks ago I brought you a post on the Hay Festival by OUP UK’s Head of Publicity Kate Farquhar-Thomson. Today, for those of you who couldn’t make it to the Festival (like me), here are some of Kate’s photos from the few days she spent there.

The festival site from on high

Priya Gopal, author of The Indian English Novel, speaks to a festival-goer

Scientists Steve Jones and Jerry Coyne. Coyne’s book Why Evolution is True was published by OUP in the UK.

Festival-goers on site. Doesn’t it look glorious?

Simon Baron-Cohen, author of Autism and Asperger Syndrome: The Facts, signs books.

18. What Makes Civilization?

In What Makes Civilization?, archaeologist David Wengrow provides a vivid new account of the ‘birth of civilization’ in ancient Egypt and Mesopotamia (today’s Iraq). These two regions, where many foundations of modern life were laid, are usually treated in isolation. This book aims to bring them together within a unified history of how people first created cities, kingdoms, and monumental temples to the gods. In the original blog post below, David Wengrow writes about that isolated view of the Near and Middle East.

To talk of civilizations is not just to describe the past. It is also to reflect on what is different about the societies we live in, how they relate to one another, and the extent to which their futures are bound up with traditions inherited from previous ages. The ancient Near East—including Mesopotamia (today’s Iraq) and Egypt—occupies a uniquely paradoxical place in our understanding of civilization. We freely acknowledge that many foundations of modern civilization were laid there, along the banks of the Euphrates, the Tigris, and the Nile. Yet those same societies have come to symbolise the remote and the exotic: the world of walking mummies, possessive demons, unfathomable gods, and tyrannical kings. What is the source of this paradox? For answers we usually look to the legacy of the Old Testament, and the literature of ancient Greece and Rome. But as part of a generation that was no longer obliged to read the ‘Classics’ at school, I find something unsatisfying about the idea that we have simply inherited the cultural prejudices of the ancients, as though by osmosis.

Most people today, I would have thought, are more likely to encounter the ancient Near East through the lens of Hollywood than through the biblical and Greco-Roman literature that informed the views of earlier generations. Still, when the Iraq Museum in Baghdad was looted in 2003, eight decades after its foundation by the British diplomat and archaeologist Gertrude Bell, our newspapers proclaimed ‘the death of history’. The headlines, for once, were in my opinion proportionate to the truth. Ancient Mesopotamia and surrounding parts of the Middle East were the setting for some of the most momentous turning points in human history: the origins of farming, the invention of the first writing system, of mechanised transport, the birth of cities and centralised government, but also—and no less importantly—familiar ways of cooking food, consuming alcohol, branding commodities, and keeping our homes and bodies clean. That is what archaeologists and ancient historians mean when they talk (a little coyly, these days) about ‘the birth of civilization’, 5000 years ago, on the banks of the Tigris and Euphrates.

As somebody who researches and teaches the archaeology of the Middle East for a living, I have often been struck by how little Mesopotamia is discussed outside a small circle of academics, by contrast with its ever-popular neighbour on the Nile. Even less widely known are the other great urban centres of the Bronze Age: in the Indus Valley, the oases of Central Asia, on the Iranian Plateau, and along the shores of the Persian Gulf. Contrary to what most people think, the discovery of ‘lost civilizations’ did not end with the Victorian era. It has been going on, quietly and steadily, amid the turmoil of the 20th century, through fieldwork in remote and sometimes dangerous areas, and through the equally important work of analysis and translation that takes place in universities and museums. Why are the results of this steady increase in our knowledge about the ancient world not better known?

Academics and curators must themselves carry a c

19. King Arthur: Most Successful Brand in English Literature?

Helen Cooper edited and abridged the Oxford World’s Classics edition of Le Morte Darthur by Sir Thomas Malory, which is arguably the definitive English version of the stories of King Arthur. Completed in 1467-70, it charts the tragic disintegration of the fellowship of the Round Table, destroyed from within by warring factions. It also recounts the life of King Arthur, the knightly exploits of Sir Lancelot du Lake, Sir Tristram, Sir Gawain, and the quest for the Holy Grail. In the original blog post below, Helen Cooper states the case for King Arthur being the most successful commercial brand in English Literature (even more so than Shakespeare) and explains what Malory did that was so remarkable.

King Arthur has some claim to be the most successful commercial brand in the history of English literature, ahead even of Shakespeare. He has certainly been famous for much longer: his reputation has been growing for some fifteen centuries, against Shakespeare’s mere four. The historical Arthur, if he ever existed, was most likely to have been the leader of a war-band trying to hold at bay the invading Saxons in the wake of the withdrawal of the Roman armies, perhaps early in the sixth century. His fame was preserved in oral traditions for the next few hundred years, and only occasionally reached the written record; but after a Norman-Welsh cleric, Geoffrey of Monmouth, invented a full biography for him in the 1130s, stories about him have spawned and expanded, until by now we have a deluge of retellings, historical or unashamed fantasy, for adults and children; films, television series, and wargames; parodies at all levels, not least from the Monty Python team; a tourist industry, and consumer items from toy swords to T-shirts. There is even a fast-food shop in Tintagel named Excaliburgers.

Geoffrey wrote in Latin, and the story he invented remains just about plausible in historical terms: his Arthur is a great conqueror who unites Britain under his rule, overruns much of Europe and reaches the very gates of Rome. The first overtly fictional accounts of his court, not least the knights of the Round Table, were written in French. Magic begins to creep into these new stories, and so does love: there is no Lancelot in the historical tradition. For a long time, Arthurian material in English kept largely to the quasi-historical account as outlined by Geoffrey, and anyone who wanted a detailed acquaintance with the romance elaborations of the story still had to read them in French. It was not until the late fifteenth century that a Warwickshire knight, Sir Thomas Malory, distilled the full story of the Round Table into a single English version. The result, the Morte Darthur, is one of the great works of English literature, and it underlies, directly or indirectly, almost every version of the legend produced in the anglophone world since then. Greg Doran’s 2010 production of the Morte with the Royal Shakespeare Company is the latest of these, and its script, by Mike Poulton, is impressively (and exceptionally) faithful to its original.

The qualities that make Malory so remarkable are the same ones that have made most of his literary descendants want to change him. For him, actions speak not only more loudly than words but often instead of them. Causes are often missing and motives have to be deduced, in a way that sets the imagination buzzing. Morality is carried by a few adjectives: noble, worshipful, faithful, against recreant or cowardly. The love of Lancelot and Guinevere is good because it is faithful: ‘she was a true lover, and therefore she had a good end’, as Malory puts it in one of his rare authorial interventions, cutting through all the questions about

20. Ethiopia Since Live Aid, Part III: On Africa, aid, and the West

Peter Gill is a journalist specialising in developing world affairs, and first travelled to Ethiopia in the 1960s. He has made films in and reported from Gaza, Lebanon, Afghanistan, South Africa, Uganda, and Sudan, as well as Ethiopia. He recently led BBC World Service Trust campaigns on leprosy and HIV/AIDS in India. His new book is Famine and Foreigners: Ethiopia Since Live Aid, which is the story of what has happened in the country since the famous music and television events 25 years ago.

This third and final part of our ‘Ethiopia Since Live Aid’ blog feature is an original post by Peter Gill, in which he discusses the West’s view of aid and Africa. If you missed it, on Tuesday we read an excerpt from the book, and yesterday we ran an exclusive Q&A with Peter.

This 2010 ‘Summer of Africa’ has been promoted as a moment of transformation – an acknowledgment that the continent may at last be on the move, that it may be beginning to cast off its image as a global basket case, ceasing to be a ‘scar on the conscience of humanity,’ in the phrase of former Prime Minister Tony Blair.

It was 25 years ago in July that a great Ethiopian famine and the Live Aid concert which it inspired underlined the physical and moral enormity of mass death by starvation. These events defined popular outrage at the human cost of extreme poverty and began to build an extraordinary consensus around the merits of aid. A generation later, in the teeth of financial gales in the rich world, this consensus is under increasing scrutiny.

Of course aid works and it works at many levels. Charity is an essential characteristic of social relationships. It saves lives and it helps individuals, families, sometimes whole communities to improve their existence. What the big aid flows – from governments and charities – have not done is to change the face of poor societies, to overcome the disgrace of extreme poverty.

Now the western world may have missed its opportunity to fix the problem. It may no longer have the means. It is also far too preoccupied with addressing the processes of how best to deliver aid, and has failed to sort out whether it had the right strategy in the first place.

What went wrong, I believe, is that we kept seeing Africa in our own image – as we would like it to be, rather than as it was. The colonial period may have become history, but the colonial mindset of ‘we-know-best’ has surely persisted. We compounded the error by allowing our hearts to rule our heads in how we spend the aid money. We have been more troubled by the symptoms of poverty than careful to see where our help was most needed.

Our fortunate way of life in the West – prosperity allied with liberal democratic forms of government – may be the envy and the aspiration of many in the poor world, but did that give us the right, in the name of ‘good governance’, to insist that there are quick and easy steps to achieving it? In the decades after Europe’s helter-skelter decolonisation, was it realistic to ignore the lessons of our own tortured political evolution and demand swift democratic reform as a condition of aid?

Our rich world sensibilities have, rightly, been offended by deaths from preventable diseases and we have, again rightly, poured money into ever more ambitious health initiatives. But we have made little corresponding effort to help African women plan their families by plugging the huge gap in contraceptive needs. Aid expenditure on family planning has actually fallen in the past de

21. Walter Lord: Story-teller or Social Historian?

John Welshman is the author of Churchill’s Children: The Evacuee Experience in Wartime Britain. He is currently working on a book provisionally entitled Titanic: The Last Night of a Small Town (forthcoming, 2012). Below he talks about Walter Lord, who wrote the acclaimed book A Night to Remember about the Titanic. You can read his previous OUPblog posts here.

It was Walter Lord, in A Night to Remember (1955), who described the sinking of the Titanic as ‘the last night of a small town’. Lord was born on 8 October 1917, in Baltimore, the only son of a prominent lawyer. As a boy, he had enjoyed a transatlantic cruise on the Olympic, during which he had fantasised about what it must have been like to be aboard the Titanic. He attended private schools in Baltimore, and then read History at Princeton, graduating in 1939. Lord was at the Yale Law School at the outbreak of the Second World War. He then went to work for the Office of Strategic Services, the forerunner of the Central Intelligence Agency, first as a code clerk in Washington, and later as an intelligence analyst in London. In 1945, he returned to Yale and completed his law degree. However, he decided that he did not want to practise, and instead wrote business newsletters and books.

Shortly after going to work for the J. Walter Thompson advertising agency in New York, Lord published The Fremantle Diary. It was reasonably successful on its publication in 1954. But it was A Night to Remember for which Lord was best known. Published in November 1955, the book had sold 60,000 copies by January 1956, and it stayed on the best seller list for six months. Condensed versions appeared in the Ladies Home Journal and Reader’s Digest, and it was the first of Lord’s several ‘Book of the Month Club’ selections, in June 1956. A successful television adaptation directed by George Roy Hill and narrated by Claude Rains was broadcast on 28 March 1956; it attracted 28m viewers. The British-made film of the same name, directed by Roy Baker, and starring Kenneth More and David McCallum, came out in 1958. The book has never been out of print.

On its publication, the New York Times said that the book was ‘stunning … one of the most exciting books of this or any other year’, while the Atlantic Monthly declared ‘a magnificent job of re-creative chronicling, enthralling from the first word to the last’. The magazine USA Today said that the book was ‘the most riveting narrative of the disaster’, and Entertainment Weekly declared it ‘seamless and skilful … it’s clear why this is many a researcher’s Titanic bible’. In the New York Herald Tribune, reviewer Stanley Walker drew attention to Lord’s technique as being ‘a kind of literary pointillism, the arrangement of contrasting bits of fact and emotion in such a fashion that a vividly real impression of an event is conveyed to the reader’.

No books on the Titanic had been published between 1913 and 1955. Cultural historian Steven Biel has noted that the book was well marketed, but also explains its resonance through the highly visual and aural nature of the narrative. Lord blurred history into news and drama, collapsing ‘historical duration into intense moments of lived experience’. The book

0 Comments on Walter Lord: Story-teller or Social Historian? as of 7/13/2010 11:40:00 PM
22. Psychopathy and Beyond

David Canter is Professor of Psychology at the University of Huddersfield. Widely known for developing systemic offender profiling in Britain and creating the emerging field of Investigative Psychology, he also provides evidence to government enquiries and major court cases. His new book is Forensic Psychology: A Very Short Introduction, and in the short excerpt below he looks at what a ‘psychopath’ really is.

There are many individuals who commit crimes who understand perfectly what they do and its illegality but who have no obvious mental problems. They are lucid and coherent with no signs of any learning disability or psychotic symptoms. Some of them can be superficially charming and are intelligent enough to be very plausible on first acquaintance. They do not hear voices or think that they are commanded by forces beyond their power to commit crimes. Yet, over and over again, they abuse people, lie without any compunction or remorse, can be unpredictably violent, and seem unable to relate effectively to others over any extended period. Various forms of criminality are almost inevitably an aspect of the lifestyles of these individuals. In the jargon of mental health professionals, such people may be given a diagnosis that implies that their ‘personality’ is somehow disordered.

In psychiatric medicalization of human activity, a whole set of ‘personality disorders’ has been identified that attempts to distinguish different ways in which individuals may have difficulty in relating to others. The one that has found its way into popular discourse is ‘psychopathic disorder’. There are complications here because the term ‘psychopathic disorder’ is not a medical diagnosis but a legal term under English and Welsh law that refers to a ‘persistent disorder or disability of the mind’, not that far removed from the McNaughton rule that first emerged over a century and a half ago. Thus there is some debate as to which of the psychiatric diagnoses of personality disorder are closest to the legal definition of ‘psychopathic disorder’, and whether any of them relates to the popular conception of a psychopath.

The Hollywood portrayal of the psychopath is someone who is inevitably a merciless serial killer, often some sort of cross between Dracula and Frankenstein’s monster. Films from the silent era’s The Cabinet of Dr Caligari in the 1920s to the more recent Kalifornia, or No Country for Old Men, never really provide any psychological insights into the actions of the monsters who are the anti-heroes of their dramas. They are presented as pure evil. The rather more psychologically interesting films such as Psycho or The Boston Strangler provide pseudo-Freudian explanations for the nastiness of their villains, but still present them as rather alien individuals who can appear unthreatening but deep down are malevolent.

Until you have met someone whom you know has committed horrific violent crimes but can be charming and helpful, it is difficult to believe in the Hollywood stereotype of the psychopath. Without doubt, there are people who can seem pleasant and plausible in one situation but can quickly turn to viciousness. There are also people who just never connect with others and are constantly, from an early age, at war with those with whom they come into contact. If we need a label for these people, we can distinguish them as type 1 and type 2 psychopaths. The former have superficial charm and are pathological liars, callous and manipulative. The clearest fictional example of this sort of psychopath is Tom Ripley, who has the central role in

23. Philosophy Bites: A Podcast

What does Simon Blackburn have to say about morality? What does A.C. Grayling think about atheism? Alain de Botton about the aesthetics of architecture? Adrian Moore about infinity? Will Kymlicka about minority rights? For the last three years, David Edmonds and Nigel Warburton have challenged some of the world’s leading philosophers to hold forth on their favourite topics for the highly successful Philosophy Bites podcast. Now 25 of these entertaining, personal, and illuminating conversations are presented in print for the first time.

Sticking with the podcast theme, David Edmonds interviewed Nigel Warburton about the book and the podcast, as well as what it’s like to speak to all of these fantastic philosophers. You can hear the interview in the podcast below.

We will also be running a very exciting Twitter competition in conjunction with Waterstone’s around the time of the book’s UK publication, 9 August. Keep an eye on Waterstone’s Twitter feed for further details.

David Edmonds is an award-winning documentary maker for the BBC World Service, and is the author of Wittgenstein’s Poker (with John Eidinow), Bobby Fischer Goes to War, and Rousseau’s Dog. He is currently a Research Associate at the Uehiro Centre for Practical Ethics at Oxford University and a Contributing Editor for Prospect Magazine.

Nigel Warburton is Senior Lecturer in Philosophy at the Open University and author of bestselling books Philosophy: The Basics and Philosophy: The Classics. He also recently wrote Free Speech: A Very Short Introduction. He regularly teaches courses on aesthetics at Tate Modern and writes a monthly column ‘Everyday Philosophy’ for Prospect Magazine. He runs several blogs including Virtual Philosopher and Art and Allusion.

24. On the Practitioners of Science

By Jennifer Coopersmith

There is a Jane Austen-esque phrase in my book: “it is a ceaseless wonder that our universal and objective science comes out of human – sometimes all too human – enquiry”. Physics is rather hard to blog, so I’ll write instead about the practitioners of science – what are they like? Are there certain personality types that do science? Does the science from different countries end up being different?

Without question there are fewer women physicists than men physicists and, also without question, this is a result of both nature and nurture. Does it really matter how much of the ‘blame’ should be apportioned to nature and how much to nurture? Societies have evolved the way they have for a reason, and they have evolved to have fewer women pursuing science than men (at present). Perhaps ‘intelligence’ has even been defined in terms of what men are good at?

Do a disproportionate number of physicists suffer from Asperger Syndrome (AS)? I deplore the fashion for retrospectively diagnosing the most famous physicists, such as Newton and Einstein, as suffering in this way. However, I’ll jump on the bandwagon and offer my own diagnosis: these two had a different ‘syndrome’ – they were geniuses, period. Contrary to common supposition, it would not be an asset for a scientist to have AS. Being single-minded and having an eye for detail – good, but having a narrow focus of interest and missing too much of the rich tapestry of social and worldly interactions – not good, and less likely to lead to great heights of creativity.

In the late 18th and early 19th centuries, the science of energy was concentrated in two nations, England and France. The respective scientists had different characteristics. In England (strictly, Britain) the scientists included an undue number of lone eccentrics, such as the rich Gentleman-scientists carrying out researches in their own, privately-funded laboratories (e.g. Brook Taylor, Erasmus Darwin, Henry Cavendish and James Joule), and also religious non-conformists of average or modest financial means (e.g. Newton, Dalton, Priestley and Faraday). This contrasts with France, where, post-revolution, the scientist was a salaried professional and worked on applied problems in the new state institutions (e.g. the French Institute and the École Polytechnique). The quality and number of names concentrated into one short period and one place (Paris), particularly in applied mathematics, has never been equalled: Lagrange, Laplace, Legendre, Lavoisier and Lamarck – and these are only the L’s. As the historian of science Henry Guerlac remarked, science wasn’t merely a product of the French Revolution, it was the chief cultural expression of it.

There was another difference between the English and French scientists, as sloganized by the science historian Charles Gillispie: “the French…formulate things, and the English do them.” For example, Lavoisier developed a system of chemistry, including a new nomenclature, while James Watt designed and built the steam engine.

From the mid-19th century onwards German science took a more leading role, and especially noteworthy was the rise of new universities and technical institutes. While many German scientists had religious affiliations (for example, Clausius was a Lutheran), their science was neutral with regard to religion, and this was different to the trend in Britain. For example, Thomson (later Lord Kelvin) spoke of the Earth ‘waxing old’ and drew on other quotations from the Bible, and, although he was not explicit, appears to have had religious objections to Darwin’s Theory of Evolution (at any rate, he wanted his ‘age of the Earth’ calculations to contradict Darwin’s Theory).

Whereas personal, cultural, social, economic and political factors will undoubtedly influence the course of science, the ultimate laws must be free of all such associations. Presumably the laws of Thermodynamics would still

25. Norman Names

I couldn’t help noticing this story, which states that many of the names still popular in English-speaking countries originate from the Normans, who won control of England in 1066. Meanwhile, names that were popular in England at the time – such as Aethelred, Eadric, and Leofric – have disappeared. With that in mind, I turned to Babies’ Names, by Patrick Hanks and Kate Hardcastle, to find out more about Norman names. Below is a selection, along with their meanings.

Adele This was borne by a 7th-century saint, a daughter of the Frankish King Dagobert II. It was also the name of William the Conqueror’s youngest daughter (c. 1062-1137), who became the wife of Stephen of Blois. The name went out of use in England in the later Middle Ages, and was revived in the 19th century. It is the stage name of English singer-songwriter Adele Laurie Blue Adkins (b. 1988).

Alison From a very common medieval name, a Norman French diminutive of Alice. It virtually died out in England in the 15th century, but survived in Scotland, with the result that until its revival in England in the 20th century it had a strongly Scottish flavour. The usual spelling in North America is Allison.

Bernard Norman and Old French name of Germanic (Frankish) origin, meaning ‘bear-hardy’. This was borne by three famous medieval churchmen: St Bernard of Menthon (923-1008), founder of a hospice on each of the Alpine passes named after him; the monastic reformer St Bernard of Clairvaux (1090-1153); and the scholastic philosopher Bernard of Chartres.

Emma Old French name, of Germanic (Frankish) origin, originally a short form of compound names such as Ermintrude, containing the word erm(en), irm(en) ‘entire’. It was adopted by the Normans and introduced by them to Britain. Its popularity in medieval England was greatly enhanced by the fact that it had been borne by the mother of Edward the Confessor, herself a Norman.

Hugh From an Old French name, Hugues, of Germanic (Frankish) origin derived from hug ‘heart’, ‘mind’, ‘spirit’. It was originally a short form of various compound names containing this element. This was borne by the aristocracy of medieval France, adopted by the Normans, and introduced by them to Britain.

Leonard From an Old French personal name of Germanic origin, derived from leon ‘lion’ + hard ‘hardy’, ‘brave’, ‘strong’. This was the name of a 5th-century Frankish saint, the patron of peasants and horses. Although it was introduced into Britain by the Normans, Leonard was not a particularly common name during the Middle Ages. It was revived in the 19th century and became very popular. The spelling Lennard is also found.

Rosalind From an Old French personal name of Germanic (Frankish) origin, from hros ‘horse’ + lind ‘weak’, ‘tender’, ‘soft’. It was adopted by the Normans and introduced by them to Britain. Its popularity as a given name owes much to its use by Edmund Spenser for the character of a shepherdess in his pastoral poetry, and by Shakespeare as the name of the heroine in As You Like It.

William Probably the most successful of all the Old French names of Germanic origin that were introduced to England by the Normans. It is derived from Germanic wil ‘will’, ‘desire’ + helm ‘helmet’, ‘protection’. The fact that it was borne by the Conqueror himself does not seem to have inhibited its favour with

