There’s a reason I got stuck on Patty’s Motor Car when I was reviewing the Patty Fairfield books. A couple of reasons, I guess. And if you want to look at it that way, the reasons’ names are Philip Van Reypen and Christine Farley.
I’m a weirdo who spends a lot of time thinking about things like how Patty Fairfield’s suitors fit into the structure of the series, and I think there’s a turning point here, a two-book transition between the first seven books of the series and the last eight. Everything through Patty’s Pleasure Trip is about Patty the kid. Then, in Patty’s Success, Wells pushes Patty into the real world by making her deal with the job market. Then she introduces Christine and Phil, apparently for the purpose of splitting up Patty and Mr. Hepworth. This book brings Christine and Phil closer–and for the record, I don’t actually dislike Christine, just what she represents–and moves Patty further into the world by giving her mobility, in the form of an electric car.
I wonder a lot whether Wells seriously considered Phil as a possible endgame suitor for Patty. I find him so consistently awful, but I can’t find any sign that Wells agrees, unless writing him as a reckless, selfish manipulator who thinks he can get away with anything because he always has before counts.
Um, so, yeah. I hate Phil Van Reypen so much. You can take that as a given, although I have no doubt I’ll manage to remind you. Anyway, the next book changes the trajectory of the series a little, but I find it difficult to read these two books that push Patty towards Phil, because he is the worst. I started keeping a journal again shortly before I started rereading this book and now it’s full of “WORST”s in relation to Phil. In fact, if you looked at my journal, you’d think the whole book was instances of Phil being awful alternating with wordless conversations between Patty and Mr. Hepworth. And it is, kind of, but some other stuff happens, too.
So, this car company holds a contest: they put out a book of puzzles and riddles and things, and the person who sends in the most complete and correct set of answers by the deadline wins an electric car. Patty, with a bit of help from Kenneth Harper, a lot of help from Phil, and a bit of important last minute help from Mr. Hepworth, submits a set of answers and–you noticed the title, right?–wins the car.
The Fairfields move to the Jersey shore for the summer, and Patty gets to drive her car around a bunch, and we’re introduced to Mona Galbraith, who Wells never actually describes as nouveau riche. Instead Wells calls her “pushing,” and says her house and her clothes are unnecessarily fancy, but it’s cool, we all know what she means.
But yeah, other than that it’s all Phil getting Patty into scrapes, which he sometimes also gets her out of, and also there’s a delightfully uncomfortable conversation between Patty and Christine where Christine tries to get Patty to acknowledge that Mr. Hepworth is in love with her and Patty says some stuff that’s one step removed from repeating “I’m not having this conversation,” over and over again. It’s pretty great.
Anyway, I hate Phil Van Reypen, but the rest of this book is pretty fun.
About half a century ago, an MIT professor set up a summer project for students to write a computer programme that could “see,” or interpret, objects in photographs. Why not! After all, seeing must be some smart manipulation of image data that can be implemented in an algorithm, and so should be good practice for smart students. Decades have passed, and we still have not fully reached the aim of that summer student project, but a worldwide computer vision community has been born.
We think of being “smart” as including the intellectual ability to do advanced mathematics, complex computer programming, and similar feats. It was shocking to realise that this is often insufficient for recognising objects such as those in the following image.
Image credit: Fig 5.51 from Li Zhaoping,
Understanding Vision: Theory, Models, and Data
Can you devise a computer code to “see” the apple from the black-and-white pixel values? A pre-school child could of course see the apple easily with her brain (using her eyes as cameras), despite lacking advanced maths or programming skills. It turns out that one of the most difficult issues is a chicken-and-egg problem: to see the apple it helps to first pick out the image pixels for this apple, and to pick out these pixels it helps to see the apple first.
A more recent shocking discovery about vision in our brain is that we are blind to almost everything in front of us. “What? I see things crystal-clearly in front of my eyes!” you may protest. However, can you quickly tell the difference between the following two images?
Image credit: Alyssa Dayan, 2013 Fig. 1.6 from Li Zhaoping
Understanding Vision: Theory, Models, and Data. Used with permission.
It takes most people more than several seconds to see the (big) difference – but why so long? Our brain gives us the impression that we “have seen everything clearly”, and this impression is consistent with our ignorance of what we do not see. This makes us blind to our own blindness! How we survive in our world given our near-blindness is a long, and as yet incomplete, story, with a cast including powerful mechanisms of attention.
Being “smart” also includes the ability to use our conscious brain to reason and make logical deductions, using familiar rules and past experience. But what if most brain mechanisms for vision are subconscious and do not follow the rules or conform to the experience known to our conscious parts of the brain? Indeed, in humans, most of the brain areas responsible for visual processing are among the furthest from the frontal brain areas most responsible for our conscious thoughts and reasoning. No wonder the two examples above are so counter-intuitive! This explains why the most obvious near-blindness was discovered only a decade ago despite centuries of scientific investigation of vision.
Another counter-intuitive finding, discovered only six years ago, is that our attention or gaze can be attracted by something we are blind to. In our experience, only objects that appear highly distinctive from their surroundings attract our gaze automatically. For example, a lone red flower in a field of green leaves does so, unless we are colour-blind. Our impression that gaze capture occurs only with highly distinctive features turns out to be wrong. In the following figure, a viewer perceives an image which is a superposition of two images, one shown to each of the two eyes using the equivalent of spectacles for watching 3D movies.
Image credit: Fig 5.9 from Li Zhaoping,
Understanding Vision: Theory, Models, and Data
To the viewer, it is as if the perceived image (containing only the bars but not the arrows) is shown simultaneously to both eyes. The uniquely tilted bar appears most distinctive from the background. In contrast, the ocular singleton appears identical to all the other background bars, i.e. we are blind to its distinctiveness. Nevertheless, the ocular singleton often attracts attention more strongly than the orientation singleton (so that the first gaze shift is more frequently directed to the ocular rather than the orientation singleton) even when the viewer is told to find the latter as soon as possible and ignore all distractions. It is as if this ocular singleton were uniquely coloured and distracting like the lone red flower in a green field, except that we are “colour-blind” to it. Many vision scientists find this hard to believe without experiencing it themselves.
Are these counter-intuitive visual phenomena too alien to our “smart”, intuitive, and conscious brain to comprehend? In studying vision, are we like Earthlings trying to comprehend Martians? Landing on Mars rather than glimpsing it from afar can help the Earthlings. However, are the conscious parts of our brain too “smart” and too partial to “dumb” down suitably to the less conscious parts of our brain? Are we ill-equipped to understand vision because we are such “smart” visual animals possessing too many conscious pre-conceptions about vision? (At least we will be impartial in studying, say, electric sensing in electric fish.) Being aware of our difficulties is the first step to overcoming them – then we can truly be smart rather than smarting at our incompetence.
Headline image credit: Beautiful woman eye with long eyelashes. © RyanKing999 via iStockphoto.
The anniversaries of conflicts seem to be more likely to capture the public’s attention than any other significant commemorations. When I first began researching the nurses of the First World War in 2004, I was vaguely aware of an increase in media attention: now, ten years on, as my third book leaves the press, I find myself astonished by the level of interest in the subject. The Centenary of the First World War is becoming a significant cultural event. This time, though, much of the attention is focussed on the role of women, and, in particular, of nurses. The recent publication of several nurses’ diaries has increased the public’s fascination for the subject. A number of television programmes have already been aired. Most of these trace journeys of discovery by celebrity presenters, and are, therefore, somewhat quirky – if not rather random – in their content. The BBC’s project, World War One at Home, has aired numerous stories. I have been involved in some of these – as I have, also, in local projects, such as the impressive recreation of the ‘Stamford Military Hospital’ at Dunham Massey Hall, Cheshire. Many local radio stories have brought to light the work of individuals whose extraordinary experiences and contributions would otherwise have remained hidden – women such as Kate Luard, sister-in-charge of a casualty clearing station during the Battle of Passchendaele; Margaret Maule, who nursed German prisoners-of-war in Dartford; and Elsie Knocker, a fully-trained nurse who established an aid post on the Belgian front lines. One radio story is particularly poignant: that of Clementina Addison, a British nurse, who served with the French Flag Nursing Corps – a unit of fully trained professionals working in French military field hospitals. Clementina cared for hundreds of wounded French ‘poilus’, and died of an unnamed infectious disease as a direct result of her work.
The BBC drama, The Crimson Field, was just one of a number of television programmes designed to capture the interest of viewers. I was one of the historical advisers to the series. I came ‘on board’ quite late in the process, and discovered just how difficult it is to transform real, historical events into engaging drama. Most of my work took place in the safety of my own office, where I commented on scripts. But I did spend one highly memorable – and pretty terrifying – week in a field in Wiltshire working with the team producing the first two episodes. Providing ‘authentic background detail’, while, at the same time, creating atmosphere and constructing characters who are both credible and interesting is fraught with difficulty for producers and directors. Since its release this spring, The Crimson Field has become quite controversial: whilst many people appear to have loved it, others complained vociferously about its lack of authentic detail. Of course, it is hard to reconcile the realities of history with the demands of popular drama.
I give talks about the nurses of the First World War, and often people come up to me to ask about The Crimson Field. Surprisingly often, their one objection is to the fact that the hospital and the nurses were ‘just too clean’. This makes me smile. In these days of contract-cleaners and hospital-acquired infection, we have forgotten the meticulous attention to detail the nurses of the past gave to the cleanliness of their wards. The depiction of cleanliness in the drama was, in fact, one of its authentic details.
One of the events I remember most clearly about my work on set with The Crimson Field is the remarkable commitment of director, David Evans, and leading actor, Hermione Norris, in recreating a scene in which Matron Grace Carter enters a ward which is in chaos because a patient has become psychotic and is attacking a padre. The matron takes a sedative injection from a nurse, checks the medication and administers the drug with impeccable professionalism – and this all happens in the space of about three minutes. I remember the intensity of the discussions about how this scene would work, and how many times it was ‘shot’ on the day of filming. But I also remember with some chagrin how, the night after filming, I realised that the injection technique had not been performed entirely correctly. I had to tell David Evans that I had watched the whole sequence six times without noticing that a mistake had been made. Some historical adviser! The entire scene had to be re-filmed. The end result, though, is an impressive piece of hospital drama. Norris looks as though she has been giving intramuscular injections all her life. I shall never forget the professionalism of the director and actors on that set – nor their patience with the absent-minded professor who was their adviser for the week.
In a centenary year, it can be difficult to distinguish between myths and realities. We all want to know the ‘facts’ or the ‘truths’ about the First World War, but we also want to hear good stories – and it is all the better if those elide facts and enhance the drama of events – because, as human beings, we want to be entertained as well. The important thing, for me, is to fully realise what it is we are commemorating: the significance of the contributions and the enormity of the sacrifices made by our ancestors. Being honest to their memories is the only thing that really matters – the thing that makes all centenary commemoration projects worthwhile.
Image credit: Ministry of Information First World War Collection, from Imperial War Museum Archive. IWM Non Commercial Licence via Wikimedia Commons.
The post The real story of allied nursing during the First World War appeared first on OUPblog.
Rebecca Solnit has gone on my list of authors whose work I’d like to own and read all of. It started off with her newest essay collection Men Explain Things To Me and was cemented by A Field Guide to Getting Lost. Field Guide was on my TBR list for years but I just never got around to it. Why did I take so long? I am a believer that every book has the right time and for whatever reason the right time wasn’t until now.
How to describe the book? Essays? Yes but not really because each one is connected. But it isn’t straight up nonfiction either because there is no real “plot” other than the theme of getting lost. Which makes it very much a long meditation. But yet there is a direction of sorts because four of the chapters/essays are called “The Blue of Distance” and these alternate with chapters called things like “Abandon” and “One-Story House.” The blue chapters all tend to be outward facing, about someone — the artist Yves Klein for instance — or about something — a certain color of blue or country western music. The other chapters tend to be more personally reflective and wide-ranging, discussing things like leaving the door open for Elijah during Passover dinner, hiking in the wilderness, and family history. But even the distinction between the blue chapters and the named chapters blurs as Solnit includes personal reflection in the blue chapters and quotes Meno, Simone Weil, and a Tibetan sage in the personal chapters. I found all this intermingling to be satisfying and wanted the book to be longer than it is. A good sign, right?
A Field Guide to Getting Lost is about many things, but at its core it is about stories:
A story can be a gift like Ariadne’s thread, or the labyrinth, or the labyrinth’s raving Minotaur; we navigate by stories, but sometimes we only escape by abandoning them.
Stories anchor us, tell us who we are or point to who we want to be. We can become lost in our stories. We can also be oppressed by our stories and only find out who we are by giving them up and losing ourselves. Trouble is, we think of being lost as a bad thing, but when we are lost we are more open to possibility than we are when we are sure of ourselves and our stories:
Never to get lost is not to live, not to know how to get lost brings you to destruction, and somewhere in the terra incognita in between lies a life of discovery.
Even when we are sure of our stories, we still change over time and lose the person we used to be. When it happens so slowly that we don’t even notice, we are not bothered by it until we are startled into awareness by an old photograph or letter, or a person we haven’t seen in many years. Sometimes, of course, loss happens very fast and unexpectedly and we are thrown for a loop. Not only do we write the story of our past but we write it well into the future, and a sudden loss throws us into uncertainty, a place in which we do not feel comfortable spending time. And so we worry:
Worry is a way to pretend that you have knowledge or control over what you don’t — and it surprised me, even in myself, how much we prefer ugly scenarios to the pure unknown. Perhaps fantasy is what you fill up maps with rather than saying they too contain the unknown.
In the last chapter there is a beautiful piece of a lecture Solnit shares that she heard given at the Zen Center in San Francisco. Zen, you may know, is all about mindfulness, paying attention, living in the here and now, not dwelling on the past or projecting into the future. And this lecture, coming as it does nearly at the end of the final chapter, serves to sum up much of the whole book. It is such a wonderful story it is hard to pick out an exact summary quote, but this might give you an idea:
‘Maybe if I really paid attention I’d notice that I don’t know what’s going to happen this afternoon and I can’t be fully confident that I am competent to deal with it. Maybe we’re willing to let in that thought. It has some reasonableness to it, I can’t exactly know, but chances are, possibilities are, it’s not going to be much different than what I’ve usually experienced and I’ll do just fine, so we close up that unsettling possibility with a reasonable response. The practice of awareness takes us below the reasonableness that we’d like to think we live with and then we start to see something quite fascinating, which is the drama of our inner dialogue, of the stories that go through our minds and the feelings that go through our heart, and we start to see in this territory it isn’t so neat and orderly and, dare I say it, safe or reasonable.’
The story goes on to remind us that it is okay to not know; okay to be uncertain; okay to run into a barrier and ask for help. It is okay to be lost. Because we can only really find what we need if we are lost:
That thing the nature of which is totally unknown to you is usually what you need to find, and finding it is a matter of getting lost.
As can be guessed from the above title, my subject today is the derivation of the word road. The history of road has some interest not only because a word that looks so easy for analysis has an involved and, one can say, unsolved etymology but also because it shows how the best scholars walk in circles, return to the same conclusions, find drawbacks in what were believed to be solid arguments, and end up saying: “Origin unknown (uncertain).” The public should know about the effort it takes to recover the past of the words we use. I am acutely aware of the knots language historians have to untie and of most people’s ignorance of the labor this task entails. In a grant application submitted to a central agency ten or so years ago, I promised to elucidate (rather than solve!) the etymology of several hundred English words. One of the referees divided the requested number of dollars by the number of words and wrote an indignant comment about the burden I expected taxpayers to carry (in financial matters, suffering taxpayers are always invoked: they are an equivalent of women and children in descriptions of war; those who don’t pay taxes and men do not really matter). Needless to say, my application was rejected, the taxpayers escaped with a whole skin, and the light remained under the bushel I keep in my office. My critic probably had something to do with linguistics, for otherwise he would not have been invited to the panel. In light of that information I am happy to report that today’s post will cost taxpayers absolutely nothing.
According to the original idea, road developed from Old Engl. rad “riding.” Its vowel was long, that is, similar to a in Modern Engl. spa. Rad belonged with ridan “to ride,” whose long i (a vowel like ee in Modern Engl. fee) alternated with long a by a rule. In the past, roads existed for riding on horseback, and people distinguished between “a road” and “a footpath.” But this seemingly self-evident etymology has to overcome a formidable obstacle: in Standard English, the noun road acquired its present-day meaning late (one can say very late). It was new or perhaps unknown even to Shakespeare. A Shakespeare glossary lists the following senses of road in his plays: “journey on horseback,” “hostile incursion, raid,” “roadstead,” and “highway” (“roadstead,” that is, “harbor,” needn’t surprise us, for ships were said to ride at anchor.) “Highway” appears as the last of the four senses because it is the rarest, but, as we will see, there is a string attached even to such a cautious statement. Raid is the Scots version of road (“long a,” mentioned above, developed differently in the south and the north; hence the doublets). In sum, road used to mean “raid” and “riding.” When English speakers needed to refer to a road, they said way, as, for example, in the Authorized Version of the Bible.
No disquisition, however learned, will answer in a fully convincing manner why about 250 years ago road partly replaced way. But there have been attempts to overthrow even the basic statement. Perhaps, it was proposed, road does not go back to Old Engl. rad, with its long vowel! This heretical suggestion was first put forward in 1888 by Oxford Professor of Anglo-Saxon John Earle. In his opinion, the story began with rod “clearing.” The word has not made it into the Standard, but we still rid our houses of vermin and get rid of old junk. Rid is related to Old Engl. rod.
Earle’s command of Old English was excellent, but he did not care much about phonetic niceties. In his opinion, if meanings show that certain words are allied, phoneticians should explain why something has gone wrong in their domain rather than dismissing an otherwise persuasive conclusion as invalid. This type of reasoning cut no ice with the etymologists of the last quarter of the nineteenth century. Nor does it thrill modern researchers, even though at all times there have been serious scholars who refused to bow to the tyranny of so-called phonetic laws. Such mavericks face a great difficulty, for, if we allow ourselves to be guided by similarity of meaning in disregard of established sound correspondences, we may return to the fantasies of medieval etymology. Earle tried to posit long o in rod, though not because he had proof of its length but because he needed it to be long. A. L. Mayhew, whom I mentioned in the post on qualm, and Skeat dismissed the rod-road etymology as not worthy of discussion. Surprisingly, it was revived ten years ago (without reference to Earle), now buttressed by phonetic arguments. It appears that rod with a long vowel did exist, but, more probably, its length was due to a later process. In any case, Earle would have been thrilled. I have said more than once that etymology is a myth of eternal return.
Whatever the origin of road, we still wonder why its modern sense emerged so late. In 1934, this question was the subject of a lively exchange in the pages of The Times Literary Supplement. In response to that discussion the German scholar Max Deutschbein showed that Shakespeare never used road “way” without making it clear what he meant. Once he used the compound roadway. Elsewhere some road is followed by as common as the way between…. We read about the even road of a blank verse, easy roads (for riding), and a thievish living on the common road. The word way helps us understand what is meant in You know the very road (= “journey”: OED) into his kindness, / and cannot lose your way (Coriolanus). Deutschbein concluded that Shakespeare hardly knew our sense of road.
This sense had become universally understood only by the sixteen-seventies (Shakespeare died in 1616), and Milton (1608-1674) used it “unapologetically.” So how did it arise? Extraneous influences—Scottish and Irish—have often been considered; the arguments for their role are thin. The anonymous initiator of the discussion in The Times Literary Supplement (I am sure the author’s name is known) spun a wonderful yarn about how Shakespeare met a group of Scotsmen, learned something about the Scots, and picked up a new word. The story is clever but not particularly trustworthy. The Irish connection is even less likely. Deutschbein noted that, according to the OED, the compound roadway reached the peak of its popularity in the seventeenth century and disappeared once road established itself. Is it possible that this is where we should look for the solution of the riddle? Etymological riddles are always hard, while solutions are usually simple, and the simpler they are, the higher the chance that they are correct.
No citations for the noun roadway antedating 1600 have been found. We don’t know how early in the sixteenth century it arose, but in this case an exact date is of little consequence. The OED suggests that the earliest meaning of roadway was “riding way,” and so it must have been. At some time, speakers probably reinterpreted this noun as a tautological compound (which it was not), a word like pathway, apparently a sixteenth-century coinage, and many others like them. Words having this meaning are prone to be made up of two near-synonyms (way-way, road-road, path-path); see my old post on such compounds. Roadway could have continued its existence for centuries, but at some time the second element was dumped as superfluous. For a relatively short period road coexisted with way as its equal partner, but then they divided their spheres of influence: road began to refer to physical reality and way to more abstract situations. We speak of impassable roads and road maps, as opposed to the way of all flesh and ways and means committees. Extraneous influences were not needed for such a process to happen.
I often complain that the scholarly literature on some words is meager. By contrast, the literature on road is extensive. A long paper devoted to it was published as recently as a year ago, whence an extremely detailed etymological introduction to the entry road in the OED online. Even if I failed to discern the complexity of the problem and untie or cut the knot, my intentions were good.
The post An etymological journey paved with excellent intentions appeared first on OUPblog.
I recently had the pleasure of meeting funny man and children’s book author, Adam Wallace, creator of titles including Mac O’Beasty, The Negatees, The Pete McGee series, Jamie Brown is Not Rich, and Better Out Than In. I am even more fortunate that he has agreed to answer some of my questions! Firstly, congratulations on being […]
The discovery of the periodic system of the elements and the associated periodic table is generally attributed to the great Russian chemist Dmitri Mendeleev. Many authors have indulged in the game of debating just how much credit should be attributed to Mendeleev and how much to the other discoverers of this unifying theme of modern chemistry.
In fact the discovery of the periodic table represents one of a multitude of multiple discoveries which most accounts of science try to explain away. Multiple discovery is actually the rule rather than the exception and it is one of the many hints that point to the interconnected, almost organic nature of how science really develops. Many, including myself, have explored this theme by considering examples from the history of atomic physics and chemistry.
But today I am writing about a subaltern who discovered the periodic table well before Mendeleev and whose most significant contribution was published on 20 August 1864, or precisely 150 years ago. John Reina Newlands was an English chemist who never held a university position and yet went further than any of his contemporary professional chemists in discovering the all-important repeating pattern among the elements which he described in a number of articles.
Newlands came from Southwark, a suburb of London. After studying at the Royal College of Chemistry he became the chief chemist at the Royal Agricultural Society of Great Britain. In 1860, when the leading European chemists were attending the Karlsruhe conference to discuss such concepts as atoms, molecules, and atomic weights, Newlands was busy volunteering to fight in the Italian revolutionary war under Garibaldi. This is explained by the fact that his mother was of Italian descent, which also explains his having the middle name Reina. In any case he survived the fighting and set about thinking about the elements on his return to London to become a sugar chemist.
In 1863 Newlands published a list of elements which he arranged into 11 groups. The elements within each of his groups had analogous properties and displayed weights that differed by eight units or some factor of eight. But no table yet!
Nevertheless he even predicted the existence of a new element, which he believed should have an atomic weight of 163 and should fall between iridium and rhodium. Unfortunately for Newlands, neither this element nor the few others he predicted ever materialized, but it does show that the prediction of elements from a system of elements is not something that only Mendeleev invented.
In the first of three articles of 1864 Newlands published his first periodic table, five years before Mendeleev incidentally. This arrangement benefited from the revised atomic weights that had been announced at the Karlsruhe conference he had missed and showed that many elements had weights differing by 16 units. But it only contained 12 elements ranging between lithium as the lightest and chlorine as the heaviest.
Then came another article, on 20 August 1864, with a slightly expanded range of elements, in which he dropped the use of atomic weights for the elements and replaced them with an ordinal number for each one. Historians and philosophers have amused themselves over the years by debating whether this represents an anticipation of the modern concept of atomic number, but that’s another story.
More importantly Newlands now suggested that he had a system, a repeating and periodic pattern of elements, or a periodic law. Another innovation was Newlands’ willingness to reverse pairs of elements if their atomic weights demanded this change as in the case of tellurium and iodine. Even though tellurium has a higher atomic weight than iodine it must be placed before iodine so that each element falls into the appropriate column according to chemical similarities.
The following year, Newlands had the opportunity to present his findings in a lecture to the London Chemical Society but the result was public ridicule. One member of the audience mockingly asked Newlands whether he had considered arranging the elements alphabetically since this might have produced an even better chemical grouping of the elements. The society declined to publish Newlands’ article although he was able to publish it in another journal.
In 1869 and 1870 two more prominent chemists, both holding university positions, published more elaborate periodic systems: the German Julius Lothar Meyer and the Russian Dmitri Mendeleev. They essentially rediscovered what Newlands had found and made some improvements. Mendeleev in particular made a point of denying Newlands' priority, claiming that Newlands had not regarded his discovery as representing a scientific law. These two chemists were awarded the lion's share of the credit, and Newlands was reduced to arguing for his priority for several years afterwards. In the end he did gain some recognition when the Davy Medal, the closest equivalent of a Nobel Prize for chemistry at the time, which had already been awarded jointly to Lothar Meyer and Mendeleev, was finally accorded to Newlands in 1887, twenty-three years after his article of August 1864.
But there is a final word to be said on this subject. In 1862, two years before Newlands, a French geologist, Émile Béguyer de Chancourtois, had already published a periodic system that he arranged in a three-dimensional fashion on the surface of a metal cylinder. He called this the "telluric screw," from tellus, Latin for "earth": a fitting name, since as a geologist he was classifying the elements of the earth.
The post The 150th anniversary of Newlands’ discovery of the periodic system appeared first on OUPblog.
Dmitri Mendeleev believed he was a great scientist, and indeed he was, though he was not widely recognized as such until his periodic table achieved worldwide diffusion and began to appear in textbooks of general chemistry and in other major publications. When Mendeleev died in February 1907, the periodic table was established well enough to stand on its own and perpetuate his name for upcoming generations of chemists.
The man died, but the myth was born.
Mendeleev as a legendary figure grew with time, aided by his own well-organized promotion of his discovery. Well-versed in foreign languages and with a sort of overwhelming desire to escape his tsar-dominated homeland, he traveled the length and breadth of Europe, attending many conferences in England, Germany, Italy, and central Europe, his only luggage seemingly his periodic table.
Mendeleev had succeeded in creating a new tool that chemists could use as a springboard to new and fascinating discoveries in the fields of theoretical, mineral, and general chemistry. But every coin has two faces, even the periodic table. On the one hand, it lighted the path to the discovery of still-missing elements; on the other, it led some unfortunate individuals into the fatal error of announcing the discovery of false or spurious new elements. Even Mendeleev, who considered himself the Newton of the chemical sciences, fell into this trap, announcing the discovery of imaginary elements that we now know to have been mere self-deception or illusion.
It is probably not well known that Mendeleev predicted the existence of a large number of elements, actually more than ten. Their discoveries were sometimes the result of lucky guesses (as in the famous cases of gallium, germanium, and scandium), and at other times they were erroneous. Historiography has kindly passed over the latter, forgetting the long line of imaginary elements that Mendeleev proposed, among which were two with atomic weights lower than that of hydrogen: newtonium (atomic weight = 0.17) and coronium (atomic weight = 0.4). He also proposed the existence of six new elements between hydrogen and lithium, which could not possibly exist.
Mendeleev represented a sort of tormented genius who believed in the universality of his creation and dreaded the possibility that it could be eclipsed by other discoveries. He did not live long enough to see the seed that he had planted become a mighty tree. He fought with equal and fierce indignation both the priority claims of others and the advent of new discoveries that appeared to menace his own.
In the end, his table was enduring enough to accommodate atomic number, isotopes, radioisotopes, the noble gases, the rare earth elements, the actinides, and the quantum mechanics that endowed it with a theoretical framework, allowing it to appear fresh and modern even after a scientific journey of 145 years.
Image: Nursery of new stars by NASA, Hui Yang University of Illinois. Public domain via Wikimedia Commons.
Martin Partington discussed a range of careers in his podcasts yesterday. Today, he tackles how new legal issues and developments in the professional environment have in turn changed organizational structures, rules and regulations, and aspects of legal education.
Co-operative Legal Services: An interview with Christina Blacklaws
Co-operative Legal Services was the first large organisation to be authorised by the Solicitors Regulatory Authority as an Alternative Business Structure. In this podcast, Martin talks to Christina Blacklaws, Head of Policy of Co-operative Legal Services.
The role of chartered legal executives: An interview with Diane Burleigh
The Chartered Institute of Legal Executives sets standards for and regulates the activities of legal executives, who play an important role in the delivery of legal services. In this podcast Martin talks with Diane Burleigh, the Chief Executive of CILEX, about the challenges facing the legal profession and the opportunities provided for Legal Executives in the rapidly developing legal world.
Educating Judges and the Judicial College: An interview with Lady Justice Hallett
The Judicial College was created by bringing together separate arrangements that had previously existed for training judicial office-holders in the courts (the Judicial Studies Board) and Tribunals Service (through the Tribunals Judicial Training Group). In this podcast Martin talks to its Chairman, Lady Justice Hallett, about the reasons for the change and ways in which the College is developing new ideas about judicial education.
Headline image credit: Law student and lecturer or academic. © Palto via iStockphoto.
I recently began reading Far from the Madding Crowd on my Kindle. I am so glad I am finally getting around to reading Hardy. Why did I wait so long? Please don’t answer that.
Anyway, after work today on the train I was reading, and Oak, the main character thus far, was playing Peeping Tom, watching an older woman and a young lady he had just seen for the first time earlier that day feed a cow and take care of her new calf. The hour was late, somewhere around 1 a.m. by the stars, Hardy tells us. The young lady yawns (but not in an inappropriately large way, she does have manners) and Oak, peeping through the gap in the barn boards, is overwhelmed and suddenly yawns too. And I, reading the book, found myself attacked by a yawn.
Has this ever happened to you before? You are made to yawn by a character in the book yawning?
Or what about when a character is really thirsty, have you ever suddenly found yourself thirsty too? Or hungry? Books make me hungry all the time, and there doesn't even have to be a description of a great meal that makes my mouth water. I am currently reading The Memory Garden and there is an amazing dinner scene. I was doing fine, until they had blueberry sorbet. Oh, that sounded good, give me some please! I could even taste it and feel the cold in my mouth even though the author didn't spend any time actually describing it. But what has really gotten me is the chocolate cake that was mentioned a couple of times. I was struck by a sudden craving. I came really close to asking Bookman if he would make one.
Other times while reading I have felt hot or cold or found myself squinting along with the character in an imagined bright sun. And of course tears. There have also been tears springing to my eyes as quickly as they spring to the eyes of the character in the book.
Being so affected probably has something to do with an active imagination and mirror neurons. When you see someone pick up a cup, for instance, mirror neurons supposedly fire in your brain in the same areas that would go to work if you were actually picking up the cup yourself. So I'm wondering: if I start reading books in which people get lots of exercise, does that mean I am exercising too? Wouldn't that be nice? Reading about someone running a marathon does not equal me actually running one. Very much wishful thinking, but you can't blame a girl for trying.
On 19 August 1692, George Burroughs stood on the ladder and calmly made a perfect recitation of the Lord’s Prayer. Some in the large crowd of observers were moved to tears, so much so that it seemed the proceedings might come to a halt. But Reverend Burroughs had uttered his last words. He was soon “turned off” the ladder, hanged to death for the high crime of witchcraft. After the execution, Reverend Cotton Mather, who had been watching the proceedings from horseback, acted quickly to calm the restless multitude. He reminded them among other things “that the Devil has often been transformed into an Angel of Light” — that despite his pious words and demeanor, Burroughs had been the leader of Satan’s war against New England. Thus assured, the executions would continue. Five people would die that day, one of the most dramatic and important days in the course of the Salem witch trials. For the audience on 19 August realized that if a Puritan minister could hang for witchcraft, then no one was safe. Their tears and protests were the beginning of the public opposition that would eventually bring the trials to an end. Unfortunately, by the time that happened, nineteen people had been executed, one had been pressed to death, and five had perished in the wretched squalor of the Salem and Boston jails.
The fact that a Harvard-educated Puritan minister was considered the ringleader of the largest witch hunt in American history is one of the many striking oddities of the Salem trials. Yet a close look at Burroughs reveals that his character and his background personified virtually all the fears and suspicions that ignited witchcraft accusations in 1692. There was no single cause, no simple explanation for why the Salem crisis happened. Massachusetts Bay faced a confluence of events that produced the fears and doubts that led to the crisis. Likewise, a wide range of people faced charges for having supposedly committed diverse acts of witchcraft against a broad swath of the populace. Yet there were many reasons people were suspicious of George Burroughs; indeed, he was the perfect witch.
In 1680, when Burroughs was hired as the minister of Salem Village, he quickly became a central figure in the on-going controversy over religion, politics, and money that would span more than thirty years and result in the departure of the community’s first four ministers. One of Burroughs’s parishioners wrote to him, complaining that “Brother is against brother and neighbors against neighbors, all quarreling and smiting one another.” After a little over two years in office, the Salem Village Committee stopped paying Burroughs’s salary, so he wisely left town to return to his old job as minister of Falmouth (now Portland, Maine).
George Burroughs spent most of his career in Falmouth, a town on the edge of the frontier. He was fortunate to escape the bloody destruction of the settlement by Native Americans in 1676 (during King Philip’s War) and 1690 (during King William’s War). The latter conflict brought a string of disastrous defeats to Massachusetts, and as many historians have noted, the ensuing war panic helped trigger the witch trials. The war was a spiritual defeat for the Puritan colony, as they were losing to French Catholics allied with people they considered to be “heathen” Indians. It seemed Satan’s minions would end the Puritans’ New England experiment. Burroughs was one of many refugees from Maine who were either afflicted by or accused of witchcraft. In addition, most of the judges were military officers as well as speculators in Maine lands that the war had made worthless. Some of the afflicted refugees were suffering what today would be considered post-traumatic stress. Used to the manual labor of the frontier, Burroughs was so incredibly strong that several witnesses would testify in 1692 to his feats of supernatural strength. The minister’s seemingly miraculous escapes from Falmouth in 1676 and 1690 also brought him under suspicion. Perhaps he had escaped with the help of the devil, or of the Indians.
Tainted by his frontier ties, the twice-widowed Burroughs found his personal life and perceived religious views amplifying fears of the minister. At his trial, several testified to his secretive ways, his seemingly preternatural knowledge, and his strict rule over his wives. He forbade his wives to speak about him to others, and even censored their letters to family. Meanwhile the afflicted said they saw the specters of Burroughs’s late wives, who claimed he murdered them. The charges were groundless. However, his controlling ways and the spectacular testimony against him at least raised the question of domestic abuse. Such perceived abuse of authority, at the family, community, or colony-wide level, is a common thread linking many of Salem’s accused.
Some observers believed Burroughs was secretive because they suspected he was a Baptist. This Protestant sect had legal toleration but, like the Quakers, was considered dangerous by most Massachusetts Puritans because of its belief in adult baptism and adult-only membership in the church. Burroughs admitted to the Salem judges that he had not recently received Puritan communion and had not baptized his younger children (both signs that he might be a Baptist). His excuse was that he was never ordained and hence could not lead the communion service or baptize children. However, Burroughs admitted that since leaving his post in Maine he had visited Boston and Charlestown, and had failed to take advantage of these rites there.
Even if he was not a Baptist, as a Puritan minister he was at risk. Burroughs was just one of five ministers cried out upon in 1692. Fully 30 percent of the people accused were ministers, their immediate family members, or extended kin. In many ways, the witch trials were a critique of the religious and political policies of the colony. But that is another story.
Header image taken by Emerson W. Baker.
What range of career options are out there for those attending law school? In this series of podcasts, Martin Partington talks to influential figures in the law about topics ranging from restorative justice to legal journalism.
Restorative Justice: An interview with Lizzie Nelson
The Restorative Justice Council is a small charitable organisation that exists to promote the use of restorative justice, not just in the court (criminal justice) context, but in other situations of conflict as well (e.g. schools). In this podcast Martin talks to Lizzie Nelson, Director of the Restorative Justice Council.
Handling complaints against lawyers: An interview with Adam Sampson
In this podcast, Martin talks to Adam Sampson, Chief Legal Ombudsman. They discuss the work of the Legal Ombudsman, how it operates, the kinds of issues it deals with, and some of the limitations on the office's ability to deal with matters raised by dissatisfied clients.
Reporting the law: An interview with Joshua Rozenberg
Joshua Rozenberg is one of a very small number of specialist journalists who cover legal issues in a serious and thoughtful way. He has worked in a wide variety of media, including the BBC, The Daily Telegraph, and The Guardian. In this interview, he describes how he decided to become a journalist rather than a practising lawyer and comments on the challenges of devising ways to enable legal issues to be raised in mass media.
Headline image credit: Law student and lecturer or academic. © Palto via iStockphoto.
The post Law careers from restorative justice, to legal ombudsman, to media appeared first on OUPblog.
I promised to post every Monday this year and I'd been doing so well, but I kinda burned myself out in July doing the weekday posts. Summer is so wonderful! I love having the kids home! But at the same time, I have the kids home. Their presence makes it harder to get my work done. Summer is glorious and yet killer on word count.
This past week my spare attention has been absorbed in what's going on in Ferguson. Last Thursday I felt a disconnect between what the media was reporting and what the people on the ground were reporting through twitter, so I storified Antonio French's account. Feeling distant and helpless, all I feel I can do is help signal boost what the people of Ferguson are saying. I'm frequently on twitter if you are too.
I've also been closely following the Amazon-Hachette news. As you may know, the two are in negotiations for new terms, and because Hachette hasn't relented to the changes Amazon wants, Amazon in turn is not stocking Hachette titles. My Ever After High books are published by Little, Brown, a division of Hachette. Authors are caught in the middle of this feud and many are hurting a lot. A Wonderlandiful World (my third Ever After High book) publishes a week from tomorrow. Amazon won't sell preorders of it. As Amazon accounts for 40-50% of book sales, its choice not to sell certain books is significant. I hope people who normally shop exclusively from Amazon will use this opportunity to support bookstores that are stocking these titles. This article links to an email Amazon sent to many of its customers, as well as the Hachette president's response.
I promise to have marvelous things to say here next week. And going forward there will be much book news and hopefully plenty of good discussions. Stay tuned!
Making the leap between school and university can be a stretch at the best of times, but for UK law students it can be a real struggle. As there is no requirement to study law at school before beginning an undergraduate programme, many new law students have a very limited knowledge of how the law works and what they can expect from their studies.
We asked a group of 77 law students from around the UK about how they prepared for their courses. It turns out only a third of them did any reading before starting, but a vast majority would have, if only their university had given them a bit of advice.
On 28 June 1914, Archduke Franz Ferdinand and his wife Sophie, Duchess of Hohenberg, were assassinated in Sarajevo, setting off a six-week diplomatic battle that resulted in the start of the First World War. The horrors of that war, from chemical weapons to civilian casualties, led to the first forays into modern international law. The League of Nations was established to prevent future international crises, and a Permanent Court of International Justice was created to settle disputes between nations. While these measures did not prevent the Second World War, this vision of a common law for all humanity was essential to international law today. To mark the centenary of the start of the Great War, and to better understand how international law arose from it, we’ve compiled a brief reading list.
The Oxford Handbook of the History of International Law, Edited by Bardo Fassbender, Anne Peters, and Simone Peter
How did international law develop from the 15th century until the end of World War II? This 2014 ASIL Certificate of Merit winner looks at the history of international law in relation to themes such as peace and war, the sovereignty of states, hegemony, and the protection of the individual person. It includes Milos Vec’s ‘From the Congress of Vienna to the Paris Peace Treaties of 1919′ and Peter Krüger’s ‘From the Paris Peace Treaties to the End of the Second World War’.
A detailed study into the 1922-34 exchange of minorities between Greece and Turkey, supported by the League of Nations, in which two million people were forcibly relocated. Check out the specific chapters on: Wilson and international law; US jurisprudence and international law in the wake of WWI; and the failed marriage of the US and the League of Nations and America’s reaction of isolationism through WWII.
How could the world repress aggressive war, war crimes, terrorism, and genocide in the wake of the First World War? Mark Lewis examines attempts to create specific criminal justice courts to address these crimes, and the competing ideologies behind them.
A History of Public Law in Germany 1914-1945 by Michael Stolleis, Translated by Thomas Dunlap
How did the upheaval of the first half of the 20th century impact the creation of public law within and across states? Germany offers an interesting case given its central role in many of the events.
“Neutrality and Multilateralism after the First World War” by Aoife O’ Donoghue in the Journal of Conflict and Security Law
What exactly did ‘neutrality’ mean before, during, and after the First World War? The newly independent Ireland exemplified many of the debates surrounding neutrality and multilateralism.
“What is Aggression? Comparing the Jus ad Bellum and the ICC Statute” by Mary Ellen O’Connell and Mirakmal Niyazmatov in the Journal of International Criminal Justice
The Treaty of Versailles marked the first significant attempt to hold an individual — Kaiser Wilhelm — accountable for unlawful resort to major military force. Mary Ellen O’Connell and Mirakmal Niyazmatov discuss the prohibition on aggression, the Jus ad Bellum, the ICC Statute, the prospects for successful prosecution, the Kampala compromise, and protecting the right to life of millions of people.
“Delegitimizing Aggression: First Steps and False Starts after the First World War” by Kirsten Sellars in the Journal of International Criminal Justice
Following the First World War, there was a general movement in international law towards the prohibition of aggressive war. So why is there an absence of legal milestones marking the advance towards the criminalization of aggression?
“The International Criminal Tribunal for the Former Yugoslavia: The Third Wang Tieya Lecture” by Mohamed Shahabuddeen in the Chinese Journal of International Law
What is the bridge between the International Military Tribunal, formed following the Treaty of Versailles, and the International Criminal Tribunal for the former Yugoslavia? Mohamed Shahabuddeen examines the first traces of the development of international criminal justice before the First World War and today’s ideas of the responsibility of the State and the criminal liability of the individual.
“Collective Security, Demilitarization and ‘Pariah’ States” by David J. Bederman in the European Journal of International Law
When are sanctions doomed to failure? David J. Bederman analyzes the historical context of the demilitarization sanctions imposed against Iraq in the aftermath of the Gulf War of 1991 from the 1919 Treaty of Versailles through to the present day.
“Peace Treaties after World War I” by Randall Lesaffer and Mieke van der Linde in the Max Planck Encyclopedia of Public International Law
How did legal terminology and provisions concerning hostilities, prisoners of war, and other wartime-related concerns change following the introduction of modern warfare during the First World War?
“League of Nations” by Christian J Tams in the Max Planck Encyclopedia of Public International Law
What lessons does the first body of international law hold for the United Nations and individual nations today?
“Alliances” by Louise Fawcett in the Max Planck Encyclopedia of Public International Law
Peace was once ensured through a complex web of diplomatic alliances. However, those same alliances proved fatal as they ensured that various European nations and their empires were dragged into war. How did the nature of alliances between nations change following the Great War?
“International Congress of Women (1915)” by Freya Baetens in the Max Planck Encyclopedia of Public International Law
In the midst of tremendous suffering and loss, suffragists continued to march and protest for the rights of women. How did the First World War hinder the women’s suffrage movement, and how did it change many of the demands and priorities of the suffragists?
“History of International Law, World War I to World War II” by Martti Koskenniemi in the Max Planck Encyclopedia of Public International Law
A brief overview of the development of international law during the interwar period: where there was promise, and where there was failure.
Headline image credit: Stanley Bruce chairing the League of Nations Council in 1936. Joachim von Ribbentrop is addressing the council. Bruce Collection, National Archives of Australia. Public domain via Wikimedia Commons.
The post The First World War and the development of international law appeared first on OUPblog.
Oxford University Press and BPP Law School are proud to co-sponsor this national mooting competition, which provides law students from around the country with the opportunity to practise and hone their advocacy skills. The event is now one of the most prestigious mooting competitions in the UK, where student advocates debate a fictitious case in a mock court of appeal in front of a judge. Over 140 law students embark on the contest each October; run on a knock-out basis, the field is whittled down over four rounds to the four who compete in the nail-biting final.
The final of the OUP and BPP National Mooting Competition 2013-2014 took place on Thursday 10th July, and proved to be a very enjoyable night of mooting indeed. Teams from Aston University, the London School of Economics, Kaplan Law School and Queen Mary, University of London battled it out for the top prize, with Theodore Anthony Meddick Dyson and Darren Low of Queen Mary, University of London emerging as worthy moot champions.
His Honour Judge Charles Gratwicke of Chelmsford Crown Court presided over the final and kept the students on their toes with some keen questioning. In his summing up, Judge Gratwicke praised the hard work and depth of knowledge the students demonstrated, saying: “You have displayed an exceptionally high standard of advocacy skills and the differences between the teams are paper-thin. You will all be successful because people of quality always find their niche”.
All photos by Arnaud Stephenson.
For the most part, Buddhists have historically been less concerned with explaining the world than with generating personal peace and enlightenment. However, the emergence of “engaged Buddhism,” especially in the West, has emphasized a powerful commitment to environmental protection, based in no small part on a fundamental ecological awareness that lies at the heart of Buddhist thought and practice.
People who follow ecological thinking (including some of our hardest-headed scientists) may not realize that they are also embracing an ancient spiritual tradition, just as many who espouse Buddhism — succumbing, perhaps, to its chic, Hollywood appeal — may not realize that they are also endorsing a worldview with political implications that go beyond bumper stickers and trendy, feel-good support for a “free Tibet.”
Biologists readily acknowledge that living processes are connected; after all, we breathe and eat in order to metabolize, and biogeochemical cycles are fundamental to life (and not merely to ecology courses). Nonetheless, biology — like most Western science — largely seeks to reduce things to their simplest components. Although such reductionism has generally paid off (witness the deciphering of DNA, advances in neurobiology, etc.), ecologists in particular have also emphasized the stunningly complex reality of organism-environment interconnection as well as the importance of biological “communities” — which doesn’t refer to the human inhabitants of a housing development.
Although “community ecology” and complicated relationships among its living and nonliving components has become a crucial part of ecological research, recognizing the existence — not to mention the importance — of such interconnectedness nonetheless requires constant struggle and emphasis, probably because the Western mind deals poorly with boundary-less notions. This isn’t because Westerners are genetically predisposed to roadblocks that don’t exist for our Eastern colleagues, but simply because, for reasons that no one seems as yet to have unraveled, the latter’s predominant intellectual traditions have accepted and embraced the absence of such boundaries.
In The Jungle Book, Rudyard Kipling captured the power of such recognition in the magical phrase by which Mowgli the human boy gained entrance into the life of animals: “We be of one blood, you and I.” Being of one blood, and acknowledging it, is also a key Buddhistic concept, reflected as well in the biochemical reality that human beings share more than 99% of their genes with each other. At the same time, there is no reason why Mowgli’s meet-and-greet should be limited to what transpires between human beings. After all, just as the jungle-boy interacted with other creatures — wolves, monkeys, an especially benevolent snake, panther, and bear, as well as a malevolent tiger — everyone’s relationship to the rest of the world, living and even nonliving, is equally intense. Thus, we share fully 98% of our genes with chimpanzees, and more than 92% with mammals generally; modern genetics confirms that we literally are of one blood, just as modern ecology — along with modern Buddhism — confirms that the alleged distinction between organism and environment is an arbitrary error of misperception, and not the way the world really is.
The interpenetration of organism and environment also leads both ecologists and Buddhists to a more sophisticated — and often paradoxical — rejection of simple cause-and-effect relationships. Thus, the absence of clear-cut boundaries among natural systems, plus the multiplicity of relevant factors means that no one can be singled out as the cause — and indeed, the impact of these factors is so multifaceted that no single “effect” can be recognized either. Systems exist as a whole, not as isolated causative sequences. Are soils the cause or effect of vegetation? Is the prairie the cause or effect of grazing mammals? Is the speed of a gazelle the cause or effect of the speed of a cheetah? Do cells create DNA or does DNA create cells? Chickens and eggs, anyone? “Organism” and “environment” interconnect and interpenetrate such that neither can truly be labeled a “cause” or “effect” of the other.
It has long been known, for example, that organisms generate environments: beavers create wetlands, ungulates crop grasses and thereby maintain prairies, while lowly worms — as Darwin first demonstrated — are directly responsible for creating rich, loamy soil. On the other hand (or rather, by the same token) it can equally be concluded that environments generate organisms: the ecology of North America’s grass prairie was responsible for the existence of bison genes, just as causation proceeds in the other direction, too. Even as ecologists have no doubt that organism and environment are inseparable, ethologists — students of animal behavior — are equally unanimous that it is foolhardy to ask whether behavior is attributable to nature or nurture, i.e. environment or genotype. Such dichotomies are wholly artificial … something that Buddhists would call maya.
Western images are typically linear: a train, a chain, a ladder, a procession of marchers, a highway unrolling before one’s speeding car. By contrast, images derived from Indian thought (which gave rise to both Hinduism and Buddhism) are more likely to involve circularity: wheels and cycles, endlessly repeating. Although there is every reason to think that evolution proceeds as an essentially one-way street, Eastern cyclicity is readily discernible not only in ecology — a discipline that is intensely aware of how every key element and molecule relevant to life has its own cycling pattern — but also in the immediacy of cell metabolism, reflected, for example, in the Krebs cycle, or the wheel of ATP, the basic process whereby energy is released for the metabolism of living cells.
At the same time, and as we have noted earlier, there is no single entity labeled “Buddhism,” just as there is no single phenomenon identifiable as “Christianity,” “Judaism,” or “Islam.” And certain schools of Buddhism (e.g. Zen) are more sympathetic to ecological ethics than are others (e.g. Theravada, which remains more committed to personal enlightenment). To be sure, the science of ecology is partitioned as well, to some extent between theoreticians (fond of mathematical models) and field workers (more inclined to get their hands dirty in the real world), but also between ecology as a hard science and ecology in the broader sense of ethical responsibility to a complex natural world. Most spiritual traditions have some sort of moral relationship to the natural world built into them, from Christian stewardship to shamanic identification. Yet another reality, and a regrettable one, is that for the Abrahamic religions in particular (Judaism, Christianity, and Islam), separateness — of soul from body, individuals from each other, heaven from hell, human beings from the rest of the natural world, and so forth — is the primary operating assumption. This is assuredly not the case with Buddhism.
For me (and I assuredly am not alone in this), Buddhism is not a religion but rather, a practice system and philosophical perspective. And it is with pleasure and optimism that I point to the convergence between Buddhism and biology generally — and ecology in particular — as something that is not only fascinating but also deeply reassuring.
My career began in the 1970s in the field of cancer immunology, a subject which is nowadays at the forefront of cancer research, holding the promise of delivering new therapies for patients suffering from a wide range of cancers. Many scientists working in the field may not be aware that the very first research papers documenting immunity against cancer were published in 1955 in the British Journal of Cancer by Robert (Bob) Baldwin, working in Nottingham, England. Fifty years on from his pioneering work, we have a greater understanding of how the body’s immune system acts as a surveillance mechanism, constantly patrolling the body to destroy newly formed tumor cells, and of the conditions under which tumors escape immune detection and progress unchecked. In the 1960s and 1970s there was an explosion in research aimed at stimulating host immunity, though without the in-depth knowledge we now have of the genetic basis of cancer, of malignant progression, and of the spread of cancer beyond the confines of the primary tumor.
Robert Baldwin pioneered research into immune stimulants, using, for example, bacteria such as the attenuated BCG organism used to immunize against tuberculosis, and led the field in the production and clinical application of monoclonal antibodies; this work has resulted in several antibody-based therapies being introduced into the clinic. For example, Herceptin, an antibody that recognizes a protein known as Her2/neu, is used to treat a subgroup of breast cancer patients. Because other tumors express this protein, vaccine therapy against Her2/neu in prostate as well as breast cancer is currently undergoing trials. I was fortunate enough to work at the Nottingham Cancer Research Campaign Laboratories in the 1970s, an experience that acted as a springboard for my continuing interest and career in immunology, principally cancer immunology.
Having subsequently worked at Sheffield Medical School and at the National Cancer Institute in Bethesda, and in Philadelphia, I found myself back in Nottingham in 1996, heading my own team of scientists, now established as The John van Geest Cancer Research Centre. Our research interests are clearly focused on (1) identifying new cancer genes and antigens for developing personalized medical care through the use of cancer biomarkers, and (2) developing new forms of immunotherapy, in which the combination of therapeutic vaccination and treatments designed to counter tumor escape from immune detection is truly “center stage.” The work at our institute and at others will, I believe, make a real impact on patient survival and clinical management in the next decade. All of the major pharmaceutical companies have entered the race to develop immunotherapy products, and a variety of approaches are now undergoing clinical trials or are already approved for use by the FDA.
What makes us believe in the immune system as a means of treating cancer is the fact that where immunity is compromised, either through the use of immunosuppressive drugs or by infection with, for example, HIV, cancers arise. We now have an in-depth understanding of how cancer occurs and progresses, of the immune cells that mediate anti-cancer activity, and of the molecules that switch immunity off, causing a “depression” in immune function. The controversial research into “cancer stem cells” will require considerable resources and research input over the next decade or so. It will be important to define their role in tumor progression and determine their susceptibility to immunotherapy. Their mere existence in many cancers has been difficult to establish, since they represent a minority population within tumors, although where they have been identified they appear to represent a highly aggressive, therapy-resistant cell type with the ability to self-renew and give rise to differentiated cells within the tumor mass. This finding, together with our increasing understanding of “tumor cell plasticity,” will be important to consider as we aim to harness the power of the immune system to treat cancer successfully.
The following is an extract from Comedy: A Very Short Introduction, by Matthew Bevis. It explores the relationship between laughter and aggression.
‘Laughter is men’s way of biting,’ Baudelaire proclaimed. The sociologist Norbert Elias offered a rejoinder: ‘He who laughs cannot bite.’ So does laughter embody or diffuse aggression? One theory, offered by the neuroscientist Vilayanur Ramachandran, is that the laugh may be an aborted cry of concern, a way of announcing to a group that there has been a false alarm. The smile could operate in a similar way: when one of our ancestral primates saw another individual from a distance, he perhaps initially bared his canines as a threatening grimace before recognizing the individual as friend, not foe. So his grimace was abandoned halfway to produce a smile, which in turn may have evolved into a ritualized human greeting. Another researcher, Robert Provine, notes that chimp laughter is commonly triggered by physical contact (biting or tickling) or by the threat of such contact (chasing games) and argues that the ‘pant-pant’ of apes and the ‘ha-ha’ of humans evolved from the breathlessness of physical play. This, together with the show of teeth necessitated by the play face, has been ritualized into the rhythmic pant of the laugh. Behind the smile, then, may lie a socialized snarl; and behind the laugh, a play fight. But behind both of these facial expressions lie real snarls and real fights.
People often claim to be ‘only joking’, but many a true word is spoken in jest. Ridicule and derision are both rooted in laughter (from ridere, to laugh). The comic may loiter with shady intent on the borders of aggression; ‘a joke’, Aristotle suggested, ‘is a kind of abuse’. And comedy itself can be abused as well as used—racist and sexist jokes point to its potential cruelty. As Waters says of Price’s stand-up act in Trevor Griffiths’s The Comedians (1975): ‘Love, care, concern, call it what you like, you junked it over the side.’ Comedy is clearly at home in the company of insults, abuse, curses, and diatribes, but the mode can also lend an unusual inflection to these utterances. From Greek iambi to the licensed raillery of the Roman Saturnalia, from Pete and Dud on the implications of being called a fucking cunt to the game of The Dozens, in which numerous aspersions are cast upon Yo Mama’s character, something strange happens to aggression when it is stylized or performed. W. H. Auden pondered choreographed exchanges of insult—from Old English flyting to the modern-day exchanges of truck drivers— and observed that ‘the protagonists are not thinking about each other but about language and their pleasure in employing it inventively … Playful anger is intrinsically comic because, of all emotions, anger is the least compatible with play.’ From this perspective, comedy is the moment at which outrage becomes outrageous. Some kinds of ferocity can be delectable.
‘Playful anger’ sounds like a contradiction in terms, yet in Plato’s Philebus, Socrates notes ‘the curious mixture of pleasure and pain that lies in the malice of amusement’. Descartes suggests in The Passions of The Soul (1649) that ‘Derision or scorn is a sort of joy mingled with hatred.’ This chapter examines such curious mixtures and minglings of feeling by considering modes of comedy that seem to have a target in their sights—versions of satire, mock-heroic, parody, and caricature. We might turn first to the satirist; Walter Benjamin identified him as ‘the figure in whom the cannibal was received into civilization’. So the satirist is at once savage and civilized; he cuts us up after having been granted permission (perhaps even encouraged) to take that liberty. What is it, then, that we need this cannibal to do for us? The satirist, it would initially appear, is the comedian who allows audiences to join him on a mission. Satire is a scourge of vice, a spur to virtue; Horace imagines his ideal listener as ‘baring his teeth in a grin’. So far so good, but the listener may also get bitten from time to time: ‘What are you laughing at?’ the poet asks us, ‘Change the name and you are the subject of the story.’ Indeed, as Hamlet would later quip, ‘use every man after his desert, and who should scape whipping?’
Image credit: Business team laughing, © YanC, via iStock Photo.
Today, I'm celebrating the publication of It Happens: A Guide to Contemporary Realistic Fiction for the YA Reader by Kelly Jensen, a fellow blogger and book reviewer. We share an appreciation for literature and libraries, and I've been following her blog for a long while. It was fun to conduct this interview and learn more about her academic background and literary inspirations.
How old were you when you started reading teen fiction?
I was a teenager when I was reading teen fiction. Speak by Laurie Halse Anderson and The Perks of Being a Wallflower by Stephen Chbosky came out in 1999. I was in 8th/9th grade then, and I remember picking both up somewhere around my sophomore or junior year of high school. I read what was out there then, and I carried on reading teen fiction through college and grad school. It wasn't always the first thing I picked up -- I read a lot of adult fiction and non-fiction -- but it was always there.
What was the first YA book (or series) that you read over and over? Have you re-read it as an adult? If so, did your opinion of it change?
I don't really reread. It's not because I'm opposed to it. It's just that there's so much out there I want to read, so it's not the first thing I think to do.
That said, I've really been wanting to reread Megan McCafferty's Jessica Darling series, especially with the release of the middle grade novels in the series. I read those books starting in high school and I looked forward to picking up each one as they published. Jessica and I went through the same life stages at the same time, and even though we didn't have any actual life similarities, I always related to and "got" her.
Congratulations on the release of IT HAPPENS! How did you land that book deal?
I approached my editor about doing an article on contemporary realistic YA for VOYA, and then on a whim, I asked if she thought maybe it was something she'd be interested in seeing as a full manuscript. It happened really fast. I was asked to put together a proposal and outline, which took me about a week. I sent those to her on a Thursday and had a contract on Saturday (I woke up to it on a vacation at a friend's house at 5 am and it was hard not to wake everyone up and share).
Had you always wanted to write a book guide?
It wasn't always a plan, but it made sense. What I'd envisioned for an article was something much bigger and after I did the research of what was out there, I saw there was a gaping hole in solid resources for contemporary realistic YA fiction.
Did anything get cut from the book?
I'd included book talks with a number of my book annotations, but I ended up cutting them all. I didn't keep them since many felt like they were just variations on the annotations themselves.
Should readers keep their eyes peeled for outtakes/bonus content at your blog?
There likely won't be outtakes or bonus content but that doesn't mean there won't be updates to some of the things I talked about in the book that show up on Stacked.
Any other books up your sleeve?
Last month, I turned in an essay that will be part of Amber Keyser's The V-Word, out with Beyond Words/Simon & Schuster in spring 2016. I'm also putting together a Q&A for the same book that looks at the representations of virginity and female sexuality in teen media.
I'm working on a chapter for another library reader's advisory resource with Liz Burns about "New Adult" fiction, being edited by Jessica Moyer. There's also a possibility of another chapter on a topic I'm supremely passionate about from a professional-development standpoint, but that's a project that's not completely set in stone yet.
There is a novel in me. I've been picking at it bit by bit. I'm really not good at committing to long-term fiction projects, but it's something I really want to do, and I think this story might be the one that gets me to follow through.
How did your college education/college experience prepare you for the jobs you've had?
I can't cite specific examples of how my education prepared me for my jobs, but I can say the experiences I had outside the classroom were what helped shape my career. I went to a non-traditional undergraduate college, which trained me how to think differently about time management and project management. I spent 4 years working on the school's newspaper -- first as a writer, then 2 years as an Arts and Entertainment editor, then finally as a Co-Editor-in-Chief. I spent three years working on the school's literary magazine, too, as both a reader and an editor. Those experiences taught me a lot about working with other people and rallying for things I care deeply about (the newspaper faced budget cuts during my last year, but my co-editor and I went to student senate budget meetings and fought hard to keep our money -- and we did).
While in undergrad, I worked at the library and I did an internship at my college library. The college library doubled as the town's public library, so I got to see both sides of the picture and knew working with the public -- and teens, especially -- was something I wanted to do.
What classes would you recommend for those who plan on becoming librarians?
I went to grad school immediately after undergrad and took many classes across the board in librarianship. If I'm being honest, though, few of the public library/teen services classes did a lot for me preparation wise. My YA fiction course was bad -- I knew more from my own reading and research than I got out of the class. But the one good thing that came of it was meeting my co-blogger Kimberly...and here we are, still blogging about YA at Stacked. Hopefully we're helping people learn about YA in a way we didn't get to.
But if I were to offer suggestions for those who want to go into libraries, it's this: work in libraries. Figure out where you want to work. Figure out how you work. Then read, read, read. And if you feel inclined, write. Blogging can give you a leg up if for no other reason than you have a record you can point to showing that you're willing to learn, explore, and create.
I'm not working in libraries now, since I took on a job at Book Riot as an editor/community manager. But my experiences in libraries, in a variety of good and less-than-good work environments, helped prep me for it, too. The best preparation for any job is working the job and understanding how you work, not what you'll get out of the classroom or your homework.
You've spoken about contemporary literature at a variety of conferences. Have you always felt comfortable with public speaking? Any advice for folks reading this interview who need a confidence booster before their next professional event or school presentation?
I still get nervous all the time about speaking. But I like pushing myself out of my comfort zone, and this works for me.
The reason it works for me doubles as my advice/confidence booster: you aren't invited to speak unless you know your stuff. So when you're at the front of the room, you are the expert. There's something in that knowledge that helps me feel better -- people are listening to me because they believe I can teach them something or I can make them rethink how they look at an idea.
In undergrad, I once spoke at a college-sponsored feminist symposium. I had written a paper in my Harlem Renaissance Lit class about the main character in Zora Neale Hurston's Their Eyes Were Watching God -- how her name changes and shifts throughout the story and what that meant about her power in those situations. Little did I know that a renowned Hurston scholar was on campus for the symposium; I only found out while I was presenting, because she was in the room listening to me.
I have never felt more nervous than I did then. She asked me some tough questions during the Q&A portion, and I thought I was going to die right there. But afterward, she came up to me and said she came because she was curious to hear my take on this, and she asked me those questions because she knew I could think about them and articulate a response. That may have been the presentation that turned things around for me, knowing that even if someone in the room is smarter than me on a topic, they don't have the same take on it that I do, and they're there because they're interested, not because they want to bring me down.
You've been blogging at stackedbooks.org for five years now. What do you enjoy writing and sharing the most -- a book review, a list of books with similar themes, general book news, or a completely unplanned but suddenly inspired post?
If it's a book I love and want people to read immediately, then it's a book review. I love writing fun booklists. But the most fun are those unplanned and inspired posts, for sure.
Kimberly and I believe we'll do this as long as it's still fun. When it stops being fun, we stop. And at this point? It's still a lot of fun. If I don't like what I'm writing, I just stop and do something else.
How did you become a contributor for Book Riot?
Rebecca asked me! She and I have been following each other on Twitter for years, and so we've always sort of known what's going on with each other in the book world. Last June she approached me and said if I ever wanted to be a contributor, then I should apply. I did and the rest is history.
When you read a book summary, what are the magic words? What immediately makes you think, "I've got to read this book!"?
Dark, gritty, and edgy are three words I love. They don't have to be in relation to realistic fiction. I'll read most genres, especially when those words are involved.
Other things that grab me: dancing, a midwest setting outside of Chicago, anything feminist or that sounds like it's going to focus on navigating girlhood.
The words "magical realism" can catch my eye, but I approach those a little more cautiously/critically.
What are your top ten favorite books?
This is the worst question. The WORST. And the reason this is the worst question is because my favorite books are all favorites for different reasons -- it can be about the story or about the writing as much as it can be about the sensory experience of where I was or what that particular book brought to my life.
I'm not going to give you ten. Instead, here are three of my favorite books, off the top of my head, as I am writing this answer: The Magician's Assistant by Ann Patchett, The Girl in the Flammable Skirt by Aimee Bender, and All the Rage by Courtney Summers.
Little Willow adds: I also love books by Courtney Summers -- check out my interview with Courtney Summers!
Visit Kelly at kellybjensen.com and stackedbooks.org and get IT HAPPENS from your retailer of choice today.
On any given day, a Google search finds the word “intransigent” deployed as though it automatically destroyed an opponent’s position. Charles Blow of the New York Times and Jacob Weisberg (no relation to the present writer) of Slate are only two of many, especially on the political left, who label Republicans “intransigent” and thereby assume they have won the argument against them.
The first intransigents, however, were on the extreme left. The Oxford English Dictionary dates the usage “intransigent” to 1873, when an extreme left party in the Spanish Cortes called themselves “los intransigentes.” Interestingly, the Spaniards did not think their self-description worked any harm to their political positions, which they felt deserved to be stated forthrightly, without compromise, and passionately.
By the early 1880s, Democrats in the United States reversed the political origins of the word when they pinned “intransigence” (as a noun) on “an uncompromising republican”. Since then, the left more than the right has mapped “intransigence” onto “extremism,” often assuming without substantive analysis that an unwillingness to compromise a position makes it not only untenable but also bizarre.
Of course, we live in a world that unhesitatingly accepts flexibility and compromise as basically good and even as a goal unto itself. However, a willingness – too often made into a norm – to compromise strongly held viewpoints has repeatedly brought on destruction and death. Wouldn’t the outcome have been better if Neville Chamberlain at Munich in 1938 had dealt obstinately and inflexibly with Adolf Hitler? Shouldn’t post-9/11 Americans have been less elastic in their willingness to negotiate away our country’s prior taboo on torture?
So it turns out that intransigence is not always bad and should not be used as a pejorative until the writer defines substantively the position he is attacking. Holding firm is sometimes good and sometimes bad, exactly as being flexible can be terribly wrong if what we give away to prove we are flexible is actually something that was good.
I define intransigence as “a resistance to the urge to shift malleably from positions thought to be sound.” This definition is neutral as to the merits or demerits of the deeply held viewpoint. That part is up to all of us, who should think through what is vitally important to us individually, stick to it, fight for it, and abandon the fallacy that even those whose positions we detest are clearly wrong because they, too, are intransigent. If we are right and they are wrong, the matter will be decided because of the position we take and not our inflexibility in propounding it.
President Obama has just publicly recognized that we should not have collectively caved in on the practice of torture. Those few people who adamantly refused from 9/12 onwards to compromise that taboo deserve to be called both correct and intransigent.
Headline image: Fist. Photo by George Hodan. Public domain via PublicDomainPictures.net.
Although I haven’t seen the exhibit “Gustave Doré (1832-1883): Master of Imagination,” currently at the National Gallery of Canada, I can say that the catalog is beautiful, informative, and opened up Doré’s career in ways I had not anticipated.
On 1 July 2014, the Grand Chamber of the European Court of Human Rights (ECHR) announced its latest judgment affirming France’s ban on the full-face veil (the “burqa law”) in public (SAS v. France). Almost a decade after the Grand Chamber’s controversial 2005 decision to uphold Turkey’s headscarf ban in universities (Leyla Sahin v. Turkey), the ECHR made it clear that Muslim women’s individual right to religious freedom (Article 9) will not be protected. Although the Court’s main arguments were not the same in each case, both judgments are equally questionable from the point of view of protecting religious freedom and of the exclusion of Muslim women from public space.
The recent judgment was brought to the ECHR by an unnamed French woman known only as “SAS” against the law introduced in 2011 that makes it illegal for anyone to cover their face in a public place. Although the legislation includes hoods, face-masks, and helmets, it is understood to be the first legislation against the full-face veil in Europe. A similar ban was also passed in Belgium after the French law. France was also the first country to ban the wearing of “conspicuous religious symbols” – directed at the wearing of the headscarf in public high schools — in 2004. Since then several European countries have established policies restricting Muslim religious dress.
The French law applies to all public places, defined as anywhere that is not the home. Penalties for violating the law include fines and citizenship lessons designed to remind the offender of the “republican values of tolerance and respect for human dignity, and to raise awareness of her penal and civil responsibility and duties imposed by life in society.”
SAS argued the ban on the full-face veil violated several articles of the European Convention and was “inhumane and degrading, against the right of respect for family and private life, freedom of thought, conscience and religion, freedom of speech and discriminatory.” She did not challenge the requirement to remove scarves, veils and turbans for security checks, also upheld by the ECHR. The ECHR rejected her argument and accepted the main argument made by the government: that the state has a legitimate interest in promoting a certain idea of “living together.”
By now, it is clear that Article 9 of the European Convention does not protect freedom of religion when the subject is a woman and the religion is Islam. While this may seem harsh, consider the ECHR’s 2011 judgment in Lautsi v. Italy, which found the display of the crucifix in Italian state schools compatible with secularism.
In the Lautsi case, the Court argued that the symbol did not significantly impact the denominational neutrality of Italian schools because the crucifix is part of Italian culture. Human rights scholars have not missed the contrast between the Italian case and the earlier 2005 decision in Leyla Sahin v. Turkey, where the Court found that the wearing of the headscarf by students was not compatible with the principle of laicité, or secularism.
The Court did not make a value judgment in the SAS case about Islam, women’s rights in Islamic societies, or gender equality, as it did in earlier cases where it upheld bans on the wearing of the headscarf by teachers and students in France, Turkey, and Switzerland. In all cases involving Islamic dress codes, the ECHR has emphasized the “margin of appreciation” rule, which permits the court to defer to national laws.
The ECHR acted politically and opportunistically in declining to challenge France’s strong republicanism and principles of laicité, sacrificing the rights of the small minority of Muslim women who wear the full-face veil. Rather than protecting the individual freedom of the 2,000 women concerned, the ECHR protected the majority view of France.
The ECHR is the most powerful supranational human rights court, and its decisions have widespread impact. Several countries in Europe, such as Denmark, Norway, Spain, Austria, and even the UK, have already started to discuss whether to create similar laws banning the burqa in public places. This raises concerns that cases related to the cultural behavior and religious practices of minorities could shift public opinion dangerously away from the principles of multiculturalism, democracy, human rights, and religious tolerance.
The most recent law bans the full-face veil, but tomorrow the prohibitions may be against halal food, circumcision, the location of a mosque, or the visibility of a minaret; even religious education might be banned for reasons of public health, security, or cultural integration. Muslims, Roma, and to some extent Jews and Sikhs are already struggling to be accepted as equal citizens in a Europe where right-wing extremism is rising amid economic crisis.
The ECHR should be extremely careful in its decisions, given the growth of nationalism, xenophobia, and anti-immigrant sentiment in Europe. Considering this context, the ECHR’s main argument in this latest judgment is worrisome, since it accepted France’s view that covering the face in public runs counter to the society’s notion of “living together,” even though this is not one of the principles of the European Convention.
The Court itself recognized that the concept of “living together” is problematic (Para 122). And even in applying the “wide margin of appreciation” rule, the Court acknowledged that it should “engage in a careful examination” to avoid the majority’s subordination of minorities. Measured against the Court’s own rules, its main reasoning for the full-face veil ban (“living together”) seems inconsistent with its own jurisprudence.
Further concerns were raised about Islamophobic remarks made during the debate over the adoption of the French burqa law, and about evidence that prejudice and intolerance against Muslims in French society influenced the law’s adoption. These concerns were raised more strongly still in the two dissenting opinions. The dissenters found that the Court’s insensitivity to what is needed to ensure tolerance between the vast majority and a small minority could increase tensions (Para 14). The dissenting opinion was especially critical of prioritizing “living together,” which is not even a Convention principle, over “concrete individual rights” guaranteed by the Convention.
While the integration of Muslims and other immigrants across Europe is a legitimate concern, it is vitally important that the ECHR play a constructive role. The decision in SAS v. France is a dangerous jurisprudential opening for future cases involving the religious and cultural practices of minorities. The French burqa law has created discomfort among Muslims, and by upholding the law, the European court deepens the mistrust between the majority of citizens and religious minorities.
Headline image credit: Arabic woman in Muslim religious dress, © Vadmary, via iStock Photo.