Stevie Nicks with ewok friend. (Via.)
The Reconstruction era was a critical moment in the history of American race relations. Though Abraham Lincoln’s Emancipation Proclamation made great strides towards equality, its aftermath was a newly but incompletely integrated society, deeply conflicted and rife with racial tension. At the height of Radical Reconstruction, in June 1870, seventeen-month-old Irish-American Mollie Digby was kidnapped from her home in New Orleans — allegedly by two Afro-Creole women. In The Great New Orleans Kidnapping Case: Race, Law, and Justice in the Reconstruction Era, Michael A. Ross offers the first full account of this historic event and of the subsequent investigation that electrified the South. The following images set the scene of New Orleans during this period of racial amalgamation, social friction, and tremendous unease.
Featured image: The City of New Orleans, Louisiana, Harper’s Weekly, May 1862. Public Domain via Wikimedia Commons.
The post Setting the scene of New Orleans during Reconstruction appeared first on OUPblog.
I was excited to recently see the fascinating-sounding variation-on-Camus by Kamel Daoud, Meursault, contre-enquête, make the first cuts of both the Goncourt and the Renaudot -- the two leading French literary prizes.
Warming up for those, the book has now picked up two prizes in quick succession: the Prix François Mauriac (not to be confused with ... the Prix François Mauriac (seriously, guys ? I mean ... seriously ?)) and the Prix des 5 continents (see also, for example, Algerian writer wins world French literature prize).
If a US/UK publisher hasn't pre-empted this yet ... more fools you be -- this property is hot, and the price is only going up. The concept alone should be enough to sell it, but apparently it's actually good, too. And now: prize-winning, too.
Although Canada-based, the folks behind the Cundill Prize in Historical Literature know a loonie is just a loonie and so offer their pay-out in real money: US dollars [I kid, I kid; please -- no e-mails/protests/boycotts] -- and, at 75,000 of them, they lay claim to the title of: "the most lucrative international award for a nonfiction book" -- well, according to the Toronto Star, where they announce this year's shortlist (which isn't yet available at the official site, sigh ...).
Six titles -- and one of them is actually one I've been making my way through (though haven't managed to review yet) -- a title in translation, no less: David Van Reybrouck's Congo (see the HarperCollins publicity page, or get your copy at Amazon.com or Amazon.co.uk).
In the current issue of The New Yorker Masha Gessen profiles Lyudmila Ulitskaya -- surely also one of the maybe two dozen authors in the serious running for the Nobel Prize.
As Gessen notes, Daniel Stein, Interpreter was a huge success in Russia(n) -- but: "the English translation flopped in the United States" and was "barely noticed" (I noticed, and, yeah, I was disappointed).
Still, I am certainly looking forward to/curious about The Big Green Tent (pre-order your copy at Amazon.com or Amazon.co.uk).
The most recent addition to the complete review is my review of Open Letter's tribute-volume to Michael Henry Heim & A Life in Translation, The Man Between, edited by Esther Allen, Sean Cotter, and Russell Scott Valentino.
(It's a nice (and well-priced !) volume -- but I do note with some amusement that a misspelling of Stieg Larsson's name slipped through (yup, they called him 'Steig', p.276) -- something I'm encountering almost as frequently as misspellings of Edgar Allan Poe's middle name and the mistakenly apostrophized version of Joyce's Finnegans Wake -- a slip that feels almost Freudian coming from translators (who surely have a very uncomfortable relationship with the mega-success of that translation and its notorious history).)
Austin City Limits is the longest-running musical showcase in the history of television, spanning over four decades and showcasing the talents of musicians from Willie Nelson and Ray Charles to Arcade Fire and Eminem. The show is a testament to the evolution of media and popular music, to the audience’s relationship to that music, and to the city of Austin, Texas. In Austin City Limits: A History, author Tracey E. W. Laird takes us behind the scenes with interviews, anecdotes, and personal photographs to pay homage to this landmark program. In doing so, she also illuminates the broader story of US public media and its influence on the broadcasting and funding of music and culture. This year the show celebrates its 40th anniversary with guests such as Bonnie Raitt, Jimmie Vaughan, Sheryl Crow, and Alabama Shakes, in a celebration that will air on PBS on 3 October at 9pm ET.
Featured image: Night view of Austin skyline and Lady Bird Lake as seen from Lou Neff Point. Photo by LoneStarMike. CC BY 3.0 via Wikimedia Commons.
Edith Wharton with bat friends.
For most of my life, I worked in the gravel pit as an overseer. There had been gravel there for a long time, but there wasn’t much left. Mostly, we spent our days trying to decide where to set off dynamite. We didn’t have a lot of dynamite, so we wanted to be precise. We would go for weeks and even months without lighting a single stick. I spent my days – ten-, eleven-hour days – telling the workers to try over here, to look over there, to dig here, to prod there. We sought the best rock, the least sand.
Of all the beverages favored by Oxford University Press staff, coffee may be the lifeblood of our organization. From the coffee bar in the Fairway of our Oxford office to the coffee pots on every floor of the New York office, we’re wired for work. Here’s a brief gallery of employees with their preferred roast — grabbed from a street cart, made to order, or part of an elaborate weekly routine.
Oxford staff are ready for their next meeting!
“My cappuccino from the OUP coffee bar in my Shakespeare insults mug – so I can fire creative insults and keep caffeinated at the same time…you canker-blossom!”
— Hannah Charters, Senior Marketing Executive, Online Product Marketing
“An Americano from the OUP espresso bar. The mug shows the mantra I like living by!”
— Rachel Fenwick, Associate Marketing Manager, Online Product Marketing
“Tall Pike in a Grande cup from Starbucks”
— Jennifer Bernard, Assistant Online Marketing Manager
“A Vacuum Pot Coffee at Edison Food and Drink Lab, Tampa Florida”
— Erin Rabbit, Designer, Creative Services, Marketing
“Grabbing coffee with a friend is one of my favorite pastimes. Good conversation over an even better coffee is the best! I’m a huge fan of locally owned coffee shops, so I always find myself recommending the Stumptown Coffee on E 8th downtown. I splurge and get the largest latte I can—iced or hot, depending on the season. The flavor is so strong! It’s a kick in the face. Otherwise, my typical go-to is a cup (or 2… or 3) of any flavored Keurig coffee in the OUP office. No match for Stumptown, but it does the job. I grew up in the south, so I like my coffee southern-style—lots of sugar and cream. Props to Mom and Dad for the sweet mug!”
— Ryan Cury, Assistant Marketing Manager
“Always opt for an espresso mid-afternoon for two equally important reasons. Firstly, it provides the boost I need to conquer the remains of the day and, secondly, it makes me feel like a giant when drinking.”
— Dan Parker, Social Media Marketing Executive
“I enjoy a standard Americano with cold milk. Because: I can’t be done with faffy coffee.”
— Kirsty Doole, Publicity Manager
“Mine was a salted caramel mochaccino.”
— Simon Thomas, Content Marketing Executive, Dictionaries
“Mine was a lovely frothy milky latte – filling and delicious!”
— Kate Farquhar-Thomson, Publicity Director
“Decaf filter coffee, for those times when you think three coffees in three hours might be too much.”
— Nicola Burton, Publicity Manager
“Put that pungent brew to your lips and feel the satisfaction.”
— Sam Blum, Publicity Assistant and member of the Fresh Pots Society
Sixty years ago today, the visionary convention establishing the European Organization for Nuclear Research – better known by its French acronym, CERN – entered into force, marking the beginning of an extraordinary scientific adventure that has profoundly changed science, technology, and society, and that is still far from over.
Like other pan-European institutions established in the late 1940s and early 1950s — the Council of Europe and the European Coal and Steel Community — CERN was founded to coordinate the efforts of European countries after the devastating losses and large-scale destruction of World War II. Europe had, in particular, lost its scientific and intellectual leadership, and many scientists had fled to other countries. The time had come for European researchers to join forces to create a world-leading laboratory for fundamental science.
Sixty years after its foundation, CERN is today the largest scientific laboratory in the world, with more than 2000 staff members and many more temporary visitors and fellows. It hosts the most powerful particle accelerator ever built. It also hosts exhibitions, lectures, shows, meetings, and debates, providing a forum for discussion where science meets industry and society.
What has happened in these six decades of scientific research? As a physicist, I should probably first mention the many ground-breaking discoveries in Particle Physics, such as the discovery of some of the most fundamental building blocks of matter, like the W and Z bosons in 1983; the measurement of the number of neutrino families at LEP in 1989; and of course the recent discovery of the Higgs boson in 2012, which led to the 2013 Nobel Prize in Physics for Peter Higgs and François Englert.
But looking back at the glorious history of this laboratory, much more comes to mind: the development of technologies that found medical applications, such as PET scans; computer science applications such as globally distributed computing, which finds use in fields ranging from genetic mapping to economic modeling; and the World Wide Web, which was developed at CERN as a network to connect universities and research laboratories.
If you’ve ever asked yourself what such a laboratory may look like – especially if you plan to visit it in the future and expect buildings with a distinctive sleek, high-tech look – let me warn you that the first impression may be slightly disappointing. When I first visited CERN, I couldn’t help noticing the old buildings, dusty corridors, and the overall rather grimy look of the section hosting the theory institute. It was only when an elevator brought me down to the accelerator that I realized what was actually happening there, as I took in the colossal size of the detectors and the incredible sophistication of the technology. ATLAS, for instance, is 25 meters high, 25 meters wide, and 45 meters long, and it weighs about 7,000 tons!
The 27-km long Large Hadron Collider is currently shut down for planned upgrades. When new beams of protons circulate in it at the end of 2014, they will have almost twice the energy reached in the previous run. There will be about 2800 bunches of protons in its orbit, each containing several hundred billion protons, separated by 250 billionths of a second – as in a car race, the distance between bunches can be expressed in units of time. The energy of each proton will be comparable to that of a flying mosquito, but concentrated in a single elementary particle. And the energy of an entire bunch of protons will be comparable to that of a medium-sized car launched at highway speed.
Why these high energies? Einstein’s E=mc² tells us that energy can be converted to mass, so by colliding two protons at very high energy we can in principle produce very heavy particles – possibly new particles that we have never observed before. You may wonder why we would expect such new particles to exist. After all, we have already created Higgs bosons through very high-energy collisions; what can we expect to find beyond that? Well, that’s where the story becomes exciting.
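The arithmetic behind these comparisons is simple enough to check for yourself. The back-of-envelope sketch below is my own illustration, not CERN's: it assumes a nominal 6.5 TeV per proton for the post-upgrade beam and a textbook mosquito of 2.5 mg flying at 1 m/s, then uses E=mc² to ask how much mass a head-on 13 TeV collision could in principle create.

```python
# Back-of-envelope arithmetic for LHC energies (illustrative round numbers only).

EV_TO_JOULE = 1.602176634e-19    # one electron-volt in joules
C = 2.99792458e8                 # speed of light, m/s
PROTON_MASS_KG = 1.6726219e-27   # proton rest mass

# One proton at an assumed 6.5 TeV post-upgrade beam energy:
proton_energy_j = 6.5e12 * EV_TO_JOULE

# Kinetic energy of a small flying insect: assume 2.5 mg moving at 1 m/s.
mosquito_ke_j = 0.5 * 2.5e-6 * 1.0**2

# E = m c^2: mass obtainable, in principle, from a 13 TeV head-on collision.
collision_energy_j = 2 * proton_energy_j
available_mass_kg = collision_energy_j / C**2
mass_in_proton_masses = available_mass_kg / PROTON_MASS_KG

print(f"One proton:         {proton_energy_j:.2e} J")
print(f"Mosquito in flight: {mosquito_ke_j:.2e} J")
print(f"Collision energy is equivalent to ~{mass_in_proton_masses:.0f} proton masses")
```

Under these assumptions the two energies come out within roughly 20% of each other, which is the sense in which a single LHC proton carries "mosquito" energy; and by E=mc² the same collision energy corresponds to many thousands of proton masses, which is why smashing together two light particles can yield far heavier ones.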
Some of the best-motivated theories currently under scrutiny in the scientific community – such as Supersymmetry – predict not only that new particles should exist, but that they could explain one of the greatest mysteries in Cosmology: the presence of large amounts of unseen matter in the Universe – Dark Matter – which seems to dominate the dynamics of all structures in the Universe, including our own Milky Way galaxy.
Identifying in our accelerators the substance that permeates the Universe and shapes its structure would represent an important step forward in our quest to understand the Cosmos, and our place in it. CERN, 60 years old and still going strong, is rising to the challenge.
Headline image credit: An example of simulated data modeled for the CMS particle detector on the Large Hadron Collider (LHC) at CERN. Image by Lucas Taylor, CERN. CC BY-SA 3.0 via Wikimedia Commons.
2014 marks not just the centenary of the start of World War I and the 75th anniversary of the start of World War II: on 29 September it is also 60 years since the establishment of CERN, the European Organization for Nuclear Research – whose remit today is better described as particle physics. Less than a decade after European nations had been fighting one another in a terrible war, 12 of those nations had united in science. Today, CERN is a world laboratory, famed for having been the home of the world wide web, brainchild of then-CERN scientist Tim Berners-Lee; of several Nobel Prizes for physics, although not (yet) for Peace; and most recently, for the discovery of the Higgs Boson. The origin of CERN, and its political significance, are perhaps no less remarkable than its justly celebrated status as the greatest laboratory of scientific endeavour in history.
Its life has spanned a remarkable period in scientific culture. The paradigm shifts in our understanding of the fundamental particles and the forces that control the cosmos, which have occurred since 1950, are in no small measure thanks to CERN.
In 1954, the hoped-for simplicity in matter, where the electron and neutrino partner a neutron and proton, had been lost. Novel relatives of the proton were proliferating. Then, exactly 50 years ago, the theoretical concept of the quark was born, which explains the multitude as bound states of groups of quarks. By 1970 the existence of this new layer of reality had been confirmed by experiments at Stanford, California, and at CERN.
During the 1970s our understanding of quarks and the strong force developed, thanks partly to theory and partly to experiments at CERN’s Intersecting Storage Rings: the ISR. Head-on collisions between counter-rotating beams of protons produced sprays of particles which, instead of flying in all directions, tended to emerge in sharp jets. The properties of these jets confirmed the predictions of quantum chromodynamics – QCD – the theory that the strong force arises from the interactions among the fundamental quarks and gluons.
CERN had begun in 1954 with a proton synchrotron, a circular accelerator with a circumference of about 600 metres, which was vast at the time, although trifling by modern standards. This was superseded by a super-proton synchrotron, or SPS, some 7 kilometres in circumference. This fired beams of protons and other particles at static targets, its precision measurements building confidence in the QCD theory and also in the theory of the weak force – QFD, quantum flavourdynamics.
QFD brought the electromagnetic and weak forces into a single framework. This first step towards a possible unification of all forces implied the existence of W and Z bosons, analogues of the photon. Unlike the massless photon, however, the W and Z were predicted to be very massive, some 80 to 90 times more than a proton or neutron, and hence beyond the reach of experiments at that time. This changed when the SPS was converted into a collider of protons and anti-protons. In 1983 experiments at the novel accelerator discovered the W and Z bosons, in line with what QFD predicted. This led to Nobel Prizes for Carlo Rubbia and Simon van der Meer in 1984.
The confirmation of QCD and QFD led to a marked change in particle physics. Where hitherto it had sought the basic templates of matter, from the 1980s it turned increasingly to understanding how matter emerged from the Big Bang. For CERN’s very high-energy experiments replicate conditions that were prevalent in the hot early universe, and theory implies that the behaviour of the forces and particles in such circumstances is less complex than at the relatively cool conditions of daily experience. Thus began a period of high-energy particle physics as experimental cosmology.
This raced ahead during the 1990s with LEP – the Large Electron Positron collider, a 27-kilometre ring of magnets underground, which looped from CERN towards Lake Geneva, beneath the airport and back to CERN, via the foothills of the Jura Mountains. Initially designed to produce tens of millions of Z bosons, in order to test QFD and QCD to high precision, by 2000 it was able to produce pairs of W bosons. The precision was such that small deviations were found between these measurements and what theory implied for the properties of these particles.
The explanation involved two particles, whose subsequent discoveries have closed a chapter in physics. These are the top quark, and the Higgs Boson.
As gaps in Mendeleev’s periodic table of the elements in the 19th century had identified new elements, so at the end of the 20th century a gap in the emerging pattern of particles was discerned. To complete the menu required a top quark.
The precision measurements at LEP could be explained if the top quark exists, too massive for LEP to produce directly, but nonetheless able to disturb the measurements of other quantities at LEP courtesy of quantum theory. Theory and data would agree if the top quark mass were nearly two hundred times that of a proton. The top quark was discovered at Fermilab in the USA in 1995, its mass as required by the LEP data from CERN.
As the 21st century dawned, all the pieces of the “Standard Model” of particles and forces were in place, but one. The theories worked well, but we had no explanation of why the various particles have their menu of masses, or even why they have mass at all. Adding mass into the equations by hand is like a band-aid, capable of allowing computations that agree with data to remarkable precision. However, we can imagine circumstances, where particles collide at energies far beyond those accessible today, where the theories would predict nonsense — infinity as the answer for quantities that are finite, for example. A mathematical solution to this impasse had been discovered fifty years ago, and implied that there is a further massive particle, known as the Higgs Boson, after Peter Higgs who, alone of the independent discoverers of the concept, drew attention to some crucial experimental implications of the boson.
Discovery of the Higgs Boson at CERN in 2012 following the conversion of LEP into the LHC – Large Hadron Collider – is the climax of CERN’s first 60 years. It led to the Nobel Prize for Higgs and Francois Englert, theorists whose ideas initiated the quest. Many wondered whether the Nobel Foundation would break new ground and award the physics prize to a laboratory, CERN, for enabling the experimental discovery, but this did not happen.
CERN has been associated with other Nobel Prizes in Physics, such as to Georges Charpak, for his innovative work developing methods of detecting radiation and particles, which are used not just at CERN but in industry and hospitals. CERN’s reach has been remarkable. From a vision that helped unite Europe, through science, we have seen it breach the Cold War, with collaborations in the 1960s onwards with JINR, the Warsaw Pact’s scientific analogue, and today CERN has become truly a physics laboratory for the world.
With the next General Election on the horizon, the Conservatives’ proposed European Union ‘In/Out’ referendum, slated for 2017, has become a central issue. Scotland chose to stay part of a larger union – would the same decision be taken by the United Kingdom?
In the first of a pair of posts, some key legal figures share their views on why Britain should leave the European Union.
* * * * *
“[The] EU as I see it is an anti-democratic system of governance that steadily drains decision-making power from the people and their elected national and sub-national representatives and re-allocates it to a virtually non-accountable Euro-elite. In many ways, this is its purpose. … Important policy decisions in sensitive areas of civil liberties … [are] taken by government officials and ministers with minimal input from parliaments and virtually unremarked by the media and general public.
“The European Commission started life as a regulatory agency attached to a trade bloc, which rapidly turned its regulatory powers on the Member States that had put it in place. Much the same can be said of the centralising Court of Justice, a significant policy-maker in the EU system.
“Democracy, in Robert Dahl’s sense of popular control over governmental policies and decisions or as a broad array of the rights, freedoms and – most important – the opportunities that tend to develop among people who govern themselves democratically, is out of the reach of the EU system of regulatory governance.”
* * * * *
“There is little if any direct trade advantage for remaining a member of the EU on the present terms. The direct financial burden of EU membership is some £17bn gross (£11bn net) and rising. … 170 countries in the world now operate in a global market based on trade according to “rules of origin”, and the UK now trades mostly with them, not with the EU.
“Advocates of the EU always present the “single market” as indispensable to the UK. Is this really so? The EU Single Market is the never ending pretext for the EU’s harmonisation of standards and laws across the EU. EU Single Market rules now extend well beyond what was the Single European Act 1986, and far beyond what is necessary to enable borderless trading within the EU.
“As the sixth largest trading nation in the world, were the UK to leave the EU Single Market, we would be joining the 170 other nations who trade freely in the global single market. We would regain control of our own markets and over our trade with the rest of the globe.”
– Bernard Jenkin, MP for Harwich and North Essex and Chair of the House of Commons Public Administration Select Committee
* * * * *
“The single currency is the crux. We did opt out of the Euro, but we can’t escape the Euro. The deflationary bias in the Eurozone, the catastrophic effects of a single monetary policy across such disparate economies and societies, culminating in banking and government debt crises, all continue to bear down on our exports and our overall economic performance.
“As the eighteen Eurozone countries meet apart from the non-Euro members of the EU to determine major issues of financial and fiscal policy, so we are increasingly marginalized within the EU, while having to live with the consequences of decisions in which we’ve had no part. … The EU will continue to be dominated by the Eurozone countries. They will do their best to salvage the single currency and will probably succeed, at least for some years to come. If British policy is to be characterized by more than passivity and fatalism, we will either have to establish new terms of membership of the EU (well-nigh impossible to achieve on a meaningful basis when the unanimous agreement of the EU is required), or find a way to split the existing EU into two unions of different kinds, or leave altogether.”
– Alan Howarth, Baron Howarth of Newport, former Member of Parliament
* * * * *
“Whatever the merits of the European Union from an economic or political perspective, its legal system is unfit for purpose. In the United Kingdom we expect our statutory laws to be clear and the means by which these laws are made to be transparent. We equally expect our court processes to be efficient and to deliver unambiguous judgments delineating clear legal principles. We expect there to be a clear demarcation between those who make the laws and those who interpret them. … All [matters] are quite absent within the legal institutions of the EU.
“It is perhaps understandable that an institution which is seeking to unite 28 divergent legal traditions, with multiple different languages, struggles to produce an effective legal system. However, the EU legal system sits above and is constitutionally superior to the domestic UK one. … The failures of the EU legal system are so fundamental that they constitute a flagrant violation of the rule of law. Regardless of the position of the UK within the EU, these institutions should be radically and urgently reformed.”
– Dan Tench, Partner, Olswang LLP and co-founder of UKSCblog – the Supreme Court blog
* * * * *
“The Euro has faced serious difficulties for the last five years. The economic crisis exposed flaws in the basic design, while the effort to save the currency union has led to recession and high unemployment, especially youth unemployment, in the weaker nations. This has moved the focus of the EU from the single market to the economically more important project of saving the Euro.
“The effect of this on the UK is that the direction of the EU has become more integrationist and has subordinated the interests of the non-Euro states. Currently this covers ten countries but only the UK and Denmark have a permanent opt out. The protocol being developed to ensure that the eighteen do not force their will on the ten will need revising when it becomes twenty-six versus two. The EU will not be willing to give the UK and Denmark a veto over all financial regulation. Inevitably, this will need some form of renegotiation as the UK has a disproportionately valuable banking sector which cannot be expected to accept rules designed entirely for the advancement of the Euro.”
– Jacob Rees-Mogg, MP, North East Somerset
* * * * *
“The reluctance of the European Court of Human Rights (ECtHR) to find violations of human rights in sensitive matters affecting States’ interests raises the question whether subscribing to the European Convention on Human Rights (‘ECHR’) should be a pre-requisite of European Union membership, as is now expected under the Treaty of Lisbon. … [T]he decisions of the ECtHR are accorded a special significance in the EU by the European Court of Justice because the ECHR is part of the EU’s legal system.
“This was recently demonstrated in S.A.S. v France, concerning an unnamed 24-year-old French woman of Pakistani origin who wore both the burqa and the niqab. In 2011, France introduced a ‘burqa ban’, arguing that facial coverings interfere with identification, communication, and women’s freedoms. … A British Judge has said, “I reject the view … that the niqab is somehow incompatible with participation in public life.” The ECtHR held [that] France’s burqa ban encouraged citizens to “live together” this being a “legitimate aim” of the French authorities. … Britain could leave the ECHR and make its own decisions but then, insofar as the EU continues to accord special significance to ECtHR decisions, still effectively be bound by them.”
– Satvinder Juss, Professor of Law, King’s College London and Barrister-at-Law, Gray’s Inn
Last I checked, the Nobel Prize in Literature page still had a blank in the space for specifying: "2014 Literature Prize will be announced at the earliest in:"; if the Swedish Academy has decided to announce the winner this Thursday they'll let us know today -- but it's much more likely that the announcement will only come next Thursday (or later).
Not only would a 2 October announcement not fit in with that nice banner the Nobel organization has, promising the prizes will be announced between 6 and 13 October, but as the Swedish Academy's pointman Peter Englund notes at his weblog, a fair number of Nobel-deciding academicians (including him) have been busy this weekend at the Göteborg Book Fair -- leaving them little time for deliberations.
So presumably they'll only really be getting down to business today, trying to settle on a winner from their ca. five-author strong list of finalists in time to reveal the winner next week.
So for now it's premature to wonder about the winner -- they very likely haven't settled on one yet. There is, however, room and reason to speculate on what authors are left in the running. Most years, the bookies' favorites are a pretty good indicator -- the future winner tends to be one of the top five or so (so also Alice Munro last year, even a month before the announcement came).
So what are the current odds ? The bookmakers' offerings have been disappointingly limited this year -- but here are the ones of interest:
Best known as a poet, Dannie Abse has passed away; see, for example, Dannie Abse, Welsh poet and author, dies aged 91 in The Guardian and Cardiff-born poet Dannie Abse dies aged 91 (with more thorough/thoughtful coverage surely to come).
The only Abse-title under review at the complete review is his novel, The Strange Case of Dr. Simmonds and Dr. Glas -- inspired by Hjalmar Söderberg's classic, Doctor Glas.
At her love german books weblog Katy Derbyshire proposes A Women's Prize for Translated Books -- which, given the sad situation that far more fiction written by men gets translated into English than fiction by women, may be a helpful way of attracting attention and perhaps nudging publishers (et al.) towards rectifying the ridiculous imbalance.
Ravenclaw has won the latest Pottermore House Cup and set a record in the process: they are the first House to ever win the Cup twice in a row. Congratulations Ravenclaws!
It was a close competition, and all four houses deserve recognition for their excellent efforts! Hufflepuff took the lead for much of the championship, but Ravenclaw snuck ahead of Gryffindor into second place just before the house points were concealed in the Great Hall on Saturday, September 20. The eagles’ triumphs earned them 1,192,309 points in the last six days of the championship, allowing them to soar to victory with a final total of 32,279,991 points.
You can read more here.
They've been handing out the საბა literary awards in (former Soviet, not US) Georgia annually for over a decade now, and they seem pretty well established as a leading local literary prize; they've now announced this year's winners; see also SABA Announces Literary Winners at Georgia Today.
ფორმა N100, by Zviad Kvaratskhelia (ზვიად კვარაცხელია), took best novel -- beating out five-time winner (and Journey to Karabakh-author) Aka Morchiladze.
And while there was an award for best translation into Georgian, impressively they also had one for best translation from the Georgian -- and Donald Rayfield's translation of Georgian great Otar Chiladze's Avelum took that prize; I actually have a copy but shamefully haven't managed to get to it yet (but I will -- meanwhile, get your copy at Amazon.com or Amazon.co.uk).
It's good also to see that a couple of new titles are due soon in Dalkey Archive Press' Georgian Literature Series -- including one in translation by Rayfield.
Dalkey's Georgian Literature Series and the holding of the 80th PEN International Congress (right now !) in Bishkek (Kyrgyzstan) help, but this Central Asian/Caucasian stretch of former Soviet states is among the worst represented in (English) translation of any region in the world (only South East Asia comes close) -- and that extends to a non-former-Soviet area, too, Kurdistan.
Now at Sampsonia Way Tarık Günersel has a Q & A with Selim Temo, discussing The Rise of Kurdish Literature -- a decent introductory overview.
Unfortunately, almost nothing is available in English translation, but it's good to hear that things seem to be looking up again.
In The Japan Times Kris Kosaka takes a look at several English-language Japanese literary magazines, in Read up on books about books about Japan. They have limited material accessible online, but are all of some interest.
Research transparency is a hot topic these days in academia, especially with respect to the replication or reproduction of published results.
Many initiatives to improve transparency have recently sprung into operation, and in this regard political scientists are taking the lead. Research transparency has long been a focus of the Society for Political Methodology, and of the journal that I co-edit for the Society, Political Analysis. More recently the American Political Science Association (APSA) has launched an important initiative in Data Access and Research Transparency. It’s likely that other social sciences will be following closely what APSA produces in terms of guidelines and standards.
One way to increase transparency is for scholars to “preregister” their research. That is, they can write up their research plan and publish that prior to the actual implementation of their research plan. A number of social scientists have advocated research preregistration, and Political Analysis will soon release new author guidelines that will encourage scholars who are interested in preregistering their research plans to do so.
However, concerns have been raised about research preregistration. In the Winter 2013 issue of Political Analysis, we published a Symposium on Research Registration. This symposium included two longer papers outlining the rationale for registration: one by Macartan Humphreys, Raul Sanchez de la Sierra, and Peter van der Windt; the other by Jamie Monogan. The symposium included comments from Richard Anderson, Andrew Gelman, and David Laitin.
In order to facilitate further discussion of the pros and cons of research preregistration, I recently asked Jamie Monogan to write a brief essay that outlines the case for preregistration, and I also asked Joshua Tucker to write about some of the concerns that have been raised about how journals may deal with research preregistration.
* * * * *
Study registration is the idea that a researcher can publicly release a data analysis plan prior to observing a project’s outcome variable. In a Political Analysis symposium on this topic, two articles make the case that this practice can raise research transparency and the overall quality of research in the discipline (Humphreys, de la Sierra, and van der Windt 2013; Monogan 2013).
Together, these two articles describe seven reasons that study registration benefits our discipline. To start, preregistration can curb four causes of publication bias, or the disproportionate publishing of positive, rather than null, findings:
Additionally, there are three advantages of study registration beyond the issue of publication bias:
Overall, there is an array of reasons why the added transparency of study registration can serve the discipline, chiefly the opportunity to reduce publication bias. Whatever you think of this case, though, the best way to form an opinion about study registration is to try it by preregistering one of your own studies. Online study registries are available, so you are encouraged to try the process yourself and then weigh in on the preregistration debate with your own firsthand experience.
* * * * *
I want to make one simple point in this blog post: I think it would be a mistake for journals to come up with any set of standards that involves publicly recognizing some publications as having “successfully” followed their pre-registration design while identifying other publications as not having done so. This could include a special section for articles that matched their pre-registration design, an A, B, C type rating system for how faithfully articles had stuck with the pre-registration design, or even an asterisk for articles that passed a pre-registration faithfulness bar.
Let me be equally clear that I have no problem with the use of registries for recording experimental designs before those experiments are implemented. Nor do I believe that these registries should not be referenced in published works featuring the results of those experiments. On the contrary, I think authors who have pre-registered designs ought to be free to reference what they registered, as well as to discuss in their publications how much the eventual implementation of the experiment might have differed from what was originally proposed in the registry and why.
My concern is much narrower: I want to prevent some arbitrary third party from being given the authority to “grade” researchers on how well they stuck to their original design and then being able to report that grade publicly, as opposed to simply allowing readers to make up their own minds in this regard. My concerns are three-fold.
First, I have absolutely no idea how such a standard would actually be applied. Would it count as violating a pre-design registry if you changed the number of subjects enrolled in a study? What if the original subject pool was unwilling to participate for the planned monetary incentive, and the incentive had to be increased, or the subject pool had to be changed? What if the pre-registry called for using one statistical model to analyze the data, but the author eventually realized that another model was more appropriate? What if a survey question that was registered on a 1-4 scale was changed to a 1-5 scale? Which, if any of these, would invalidate the faithful application of the registry? Would all of them together? It seems that the only truly objective way to rate compliance is to have an all or nothing approach: either you did exactly what you said you would, or you didn’t follow the registry. Of course, then we are lumping “p-value fishing” in the same category as applying a better statistical model or changing the wording of a survey question.
This brings me to my second point, which is a concern that giving people a grade for faithfully sticking to a registry could lead to people conducting sub-optimal research — and stifle creativity — out of fear that it will cost them their “A” registry-faithfulness grade. To take but one example, those of us who use survey experiments have long been taught to pre-test questions precisely because sometimes the ideas we have when sitting at our desks don’t work in practice. So if someone registers a particular technique for inducing an emotional response and then runs a pre-test and figures out their technique is not working, do we really want the researcher to use the sub-optimal design in order to preserve their faithfulness to the registered design? Or consider a student who plans to run a field experiment in a foreign country that is based on the idea that certain last names convey ethnic identity. What happens if the student arrives in the field and learns that this assumption was incorrect? Should the student stick with the bad research design to preserve the ability to publish in the “registry faithful” section of JEPS? Moreover, research sometimes proceeds in fits and starts. If as a graduate student I am able to secure funds to conduct experiments in country A but later as a faculty member can secure funds to replicate these experiments in countries B and C as well, should I fear including the results from country A in a comparative analysis because my original registry was for a single country study? Overall, I think we have to be careful about assuming that we can have everything about a study figured out at the time we submit a registry design, and that there will be nothing left for us to learn about how to improve the research — or that there won’t be new questions that can be explored with previously collected data — once we start implementing an experiment.
At this point a fair critique to raise is that the points in the preceding paragraph could be taken as an indictment of registries generally. Here we venture more into a matter of opinion, but I believe that there is a difference between asking people to document what their original plans were and giving them a chance in their own words — if they choose to do so — to explain how their research project evolved, as opposed to having to deal with a public “grade” of whatever form that might take. In my mind, the former is part of producing transparent research, while the latter — however well intentioned — could prove paralyzing in terms of making adjustments during the research process or following new lines of interesting research.
This brings me to my final concern, which is that untenured faculty would end up feeling the most pressure in this regard. For tenured faculty, a publication without the requisite asterisks noting registry compliance might not end up being too big a concern — although I’m not even sure of that — but I could easily imagine junior faculty being especially worried that publications without registry asterisks could be held against them during tenure considerations.
The bottom line is that registries bring with them a host of benefits — as Jamie has nicely laid out above — but we should think carefully about how best to maximize those benefits while minimizing new costs. Even if we could agree on how to rate a proposal in terms of faithfulness to registry design, I would suggest caution in trying to integrate ratings into the publication process.
The views expressed here are mine alone and do not represent either the Journal of Experimental Political Science or the APSA Organized Section on Experimental Research Methods.
Heading image: Interior of Rijksmuseum research library. Rijksdienst voor het Cultureel Erfgoed. CC-BY-SA-3.0-nl via Wikimedia Commons.
Ten years ago my dear friend Stephen Mitchelmore started his superb book blog This Space. It remains a vital inspiration, and the most essential book blog out there.
Recent mounting media coverage of the Islamic State in Iraq and Syria (ISIS) has evoked fear of impending terrorist threats in the minds of many. I spoke with Colin Beck, Assistant Professor of Sociology at Pomona College, to gain his thoughts on the group, as well as the designations and motivations of terrorism. Beck is the author of “Who Gets Designated a Terrorist and Why,” recently published in Social Forces. His article examines formal terrorism designations by governments through the lens of organization studies research on categorization processes.
Why are we so concerned with ISIS now, relative to other terrorist groups and terrorist threats?
While there are a number of militant groups in Syria that foreign governments could focus on, ISIS has three things that make it appear as a pressing threat. First, ISIS’s sudden advances in Iraq were an unanticipated event, and consequently created a media spectacle. No one really expected the Iraqi central government or Kurdish authorities to lose control of major cities and sites so quickly. Once they did, there was a major story there. Second, and related, the group has territorial control. While ISIS had controlled territory in Syria and Iraq previously, the declaration of an Islamic State in late June creates a clear target. There is little evidence that the Islamic State intends to directly attack outside of Iraq and Syria, but territorial control signals capability and threat, in the same way that aviation attacks do, as Miner and I argued in our study. Finally, ISIS engages in classically “terrorist” behavior—beheadings of captives and attacks on civilian populations. In essence, it’s the combination of sudden success, territorial control, and markers of terrorism that brings attention to the Islamic State.
None of these are sufficient explanations by themselves. For instance, compare ISIS to the recent reports about the Khorasan militant group located in Syria; in the media and even government accounts it takes on a secondary importance even though it has been suggested that Khorasan was planning direct attacks against Euro-American targets. And other Syrian Islamist groups, like al-Nusra Front, control territory but have not expanded their area so dramatically.
To what extent does the release of videos showing the beheading of victims help define ISIS as a terrorist group in the eyes of Americans?
Even before the video-taped beheadings, the attacks on Yezidis and other religious minorities seemed to signify international terrorism to the American public. There’s a seemingly odd confusion here in public opinion. While the Taliban in Afghanistan never carried out international terrorism, they were the target of the American response to September 11th just as much as Al-Qaeda was. Similarly, in Iraq, various militant groups were seen as international terrorists even without action beyond the context of the Iraqi Insurgency. Americans have thus learned to think of any militant Islamic group as terrorists; all the group needs to do is reveal its Islamicness. Attacks on religious minorities certainly do that. In this environment, beheading hostages is just another marker, especially as it echoes the acts of militants previously defined as terrorists—Al Qaeda’s beheading of Daniel Pearl in 2002 or the frequent beheadings of captives by Al Qaeda in Iraq during the Insurgency.
Why do you think that ISIS beheaded Americans so publicly? To what extent is this an attempt to consolidate their strength in their region?
I am speculating here, but I wonder if the beheadings are actually more a product of cross-militant competition than a message to the outside world. The Islamic State’s leadership is not imprudent, so they must have known that attacking the citizens of western countries would create a response. (This, in fact, was one of the non-surprising results of our study of terrorism designations.) So why do it? The Islamic State could believe that the response will not actually imperil their organization and its gains. Or, possibly, that beheadings would encourage other governments to pay ransoms for hostages, which have been a lucrative source of funding in recent years. More importantly, I think it is also likely that the Islamic State was trying to prove its bona fides. ISIS has been fighting with other Islamist groups among the Syrian rebels, and, in 2013, struggled with Zawahiri of Al-Qaeda over who best represents Islamist interests in the Syrian conflict. This sort of cross-Islamist conflict is quite typical, as Charles Kurzman discusses in his book The Missing Martyrs. So, perhaps, beheading hostages is a way to establish their credibility with other militants.
Is there anything else you think we can learn about terrorism from the case of ISIS?
ISIS really demonstrates the large amount of variation there is among “terrorist” groups. There are lots of different ideologies, lots of different goals, and lots of different types of groups among militants. While policymakers and the public tend to view certain forms, such as transnational networks of Islamists, as threatening, organizational forms might be best seen as different ways of solving resource dilemmas and meeting goals. I take this point up extensively in my book Radicals, Revolutionaries, and Terrorists that will be published next year by Polity Press. Also, ISIS illustrates that groups might strategically seek to conform to, or avoid, perceptions of what constitutes terrorism for various reasons. I am exploring how this might work for media labeling of terrorism in my next project.
Headline image credit: Map of the claim to power of the organization Islamic State. Created by Fiver, der Hellseher. CC BY-SA 4.0 via Wikimedia Commons.