Viewing: Blog Posts Tagged with: Politics, Most Recent at Top
Results 1 - 25 of 1,013
1. Prime Minister’s Questions

By Andrew Dobson


“Noisy and aggressive,” “childish,” “over the top,” “pointless.” These are just a few recent descriptions of Prime Minister’s Questions – the most watched event in the Parliamentary week.

Public dismay at PMQs has led the Speaker, John Bercow, to consult with party leaders over reform. The Hansard Society asked focus groups what they thought of PMQs as part of its annual look at public engagement. Nearly half said the event is “too noisy and aggressive”, the same proportion as those who felt that MPs behave unprofessionally. Meanwhile, by a margin of 33% to 27%, respondents said that it put them off politics. Only 12% said it made them “proud of our Parliament”.

John Bercow. By Office of John Bercow CC-BY-SA-3.0

Both the Deputy Prime Minister, Nick Clegg, and the Labour leader, Ed Miliband, agreed that the baying and screeching gave politics and politicians a bad name, and while Prime Minister David Cameron was a little more guarded, he too thought that Mr Bercow’s ideas were interesting and worth looking at.

So would it help if politicians listened to each other a little bit more and shouted at each other a little bit less? The fact that PMQs is simultaneously the most watched and the least respected Parliamentary event is significant. No doubt we watch it precisely because we enjoy the barracking and the bawling, and there is always the possibility of grudging admiration for a smart bit of wordplay by one or other of the combatants. Parliamentary sketch writers nearly always judge the winner of PMQs on the basis of which of the party leaders has bested the other in terms of quips and ripostes – and very rarely on the basis of political substance.

So it’s hardly an informative occasion. Indeed, the main gripes of the Hansard Society’s respondents are that questions are scripted, and that there are too many planted questions and too few honest answers.

Once again, though, maybe this misses the point. Some will say that the civilised and serious political work is done behind the scenes in committee rooms, where party loyalty is less obviously on display, and where considered debate often takes place. On this account, PMQs occupy a very small amount of parliamentary time, and anyway, the sometimes angry jousting that takes place between party leaders on Wednesdays is as much a part of politics as the polite exchange of views we find in Parliamentary committees. Where would politics be without disagreement? Would it be politics at all?

But then there are different ways of disagreeing – and some ways could turn out to be exclusionary. One of the ideas floated by John Bercow was that the flight of women from the House of Commons was in part a result of the way in which debate is conducted there.

David Cameron. By World Economic Forum/Moritz Hager (Flickr) CC-BY-SA-2.0

And it’s a fact that although good listening is much prized in daily conversation, it’s been almost completely ignored in the form of political conversation we know as democracy. While PMQs show that politicians aren’t always very good at listening to each other, they’re not much better at listening to the public either. Politicians instinctively know that listening in a democracy is vital to legitimacy. That’s why when they’re in trouble they reach for the listening card and initiate a “Big Conversation,” like the one Tony Blair started in late 2003, not so many months after the million-strong march against the Iraq war.

But won’t a government that listens hard and changes its mind just be accused of that ultimate political crime, the U-turn? In 2012, the Secretary of State for Education, Michael Gove, announced some radical changes in UK secondary school education, including a return to an older-style assessment regime. Then in February 2013 he suddenly announced that the changes wouldn’t take place after all. Predictably, the Opposition spokesman called this a ‘humiliating climbdown’. Equally predictably, Gove’s supporters played the listening card for all it was worth, with Nick Clegg saying effusively that “There is no point having a consultation if you’ve already made up your mind what you’re going to do at the end of it.”

So it looks as though, as far as listening goes, governments are damned if they do and damned if they don’t: accused of weakness if they change their mind and of pig-headedness and a failure to listen if they don’t. On balance, I’d rather have them listening more – both to each other and to us. John Dryzek is surely right to say that “the most effective and insidious way to silence others in politics is a refusal to listen.”

As the ancient Greek philosopher Epictetus says: “Nature hath given men one tongue but two ears, that we may hear from others twice as much as we speak.”

Andrew Dobson is Professor of Politics at Keele University, UK. His most recent book is Listening for Democracy: recognition, representation, reconciliation (OUP, 2014). He is a member of the England and Wales Green Party and he co-wrote the Green Party General Election Manifesto in 2010. He is a founder member of the thinktank Green House.

Image credits: (1) John Bercow, by the Office of John Bercow, CC-BY-SA-3.0 via Wikimedia Commons. (2) David Cameron, by World Economic Forum/Moritz Hager (Flickr), CC-BY-SA-2.0 via Wikimedia Commons.

2. Does torture really (still) matter?

By Rebecca Gordon


The US military involvement in Iraq has more or less ended, and the war in Afghanistan is limping to a conclusion. Don’t the problems of torture really belong to the bad old days of an earlier administration? Why bring it up again? Why keep harping on something that is over and done with? Because it’s not over, and it’s not done with.

Torture is still happening. Shortly after his first inauguration in 2009, President Obama issued an executive order forbidding the CIA’s “enhanced interrogation techniques” and closing the CIA’s so-called “black sites.” But the order didn’t end “extraordinary rendition”—the practice of sending prisoners to other countries to be tortured. (This is actually forbidden under the UN Convention against Torture, which the United States ratified in 1994.) The president’s order didn’t close the prison at Guantánamo, where to this day, prisoners are held in solitary confinement. Periodic hunger strikes are met with brutal force feeding. Samir Naji al Hasan Moqbel described the experience in a New York Times op-ed in April 2013:

I will never forget the first time they passed the feeding tube up my nose. I can’t describe how painful it is to be force-fed this way. As it was thrust in, it made me feel like throwing up. I wanted to vomit, but I couldn’t. There was agony in my chest, throat and stomach. I had never experienced such pain before.

Nor did Obama’s order address the abusive interrogation practices of the Joint Special Operations Command (JSOC) which operates with considerably less oversight than the CIA. Jeremy Scahill has ably documented JSOC’s reign of terror in Iraq in Dirty Wars: The World Is a Battlefield. At JSOC’s Battlefield Interrogation Facility at Camp NAMA (which reportedly stood for “Nasty-Ass Military Area”) the motto—prominently displayed on posters around the camp—was “No blood, no foul.”

Torture also continues daily, hidden in plain sight, in US prisons. It is no accident that the Army reservists responsible for the outrages at Abu Ghraib worked as prison guards in civilian life. As Spec. Charles A. Graner wrote in an email about his work at Abu Ghraib, “The Christian in me says it’s wrong, but the corrections officer in me says, ‘I love to make a grown man piss himself.’” Solitary confinement and the ever-present threat of rape are just two forms of institutionalized torture suffered by the people who make up the world’s largest prison population. In fact, the latter is so common that on TV police procedurals like Law & Order, it is the staple threat interrogators use to prevent a “perp” from “lawyering up.”

We still don’t have a full, official accounting of how the United States has used torture in the “war on terror.” This is partly because so many different agencies, clandestine and otherwise, have been involved in one way or another. The Senate Intelligence Committee has written a 6,000-page report just on the CIA’s involvement, which has never been made public, although recent days have seen moves in this direction. Nor has the Committee been able to shake loose the CIA’s own report on its interrogation program. Most of what we do know is the result of leaks, and the dogged work of dedicated journalists and human rights lawyers. But we have nothing official on the level of, say, the 1975 Church Committee report on the CIA’s activities during the Vietnam War era.

Frustrated because both Congress and the Obama administration seemed unwilling to demand a full accounting, the Constitution Project convened a blue-ribbon bipartisan committee, which produced its own damning report. Members included former DEA head Asa Hutchinson, former FBI chief William Sessions, and former US Ambassador to the United Nations Thomas Pickering. The report reached two important conclusions: (1) “[I]t is indisputable that the United States engaged in the practice of torture,” and (2) “[T]he nation’s highest officials bear some responsibility for allowing and contributing to the spread of torture.”

No high-level officials have been held accountable for US torture. Only enlisted soldiers like Charles Graner and Lynndie England have done jail time for prisoner abuse in the “war on terror.” None of the “highest officials” mentioned in the Detainee Task Force report (people like Donald Rumsfeld, Dick Cheney, and George W. Bush) have faced any consequences for their part in a program of institutionalized state torture. Early in his first administration, President Obama argued that “nothing will be gained by spending our time and energy laying blame for the past,” but this is not true. Laying blame for the past (and the present) is a precondition for preventing torture in the future, because it would represent a public repudiation of the practice. What “will be gained” is the possibility of developing a public consensus that the United States should not practice torture any longer. Such a consensus about torture does not exist today.

Tolerating torture corrupts the moral character of the nation. We tend to think of torture as a set of isolated actions—things desperate people do under desperate circumstances. But institutionalized state torture is not an action. It is an ongoing, socially-embedded practice. It requires an infrastructure and training. It has its own history, traditions, and rituals of initiation. And—importantly—it creates particular ethical habits in those who practice it, and in any democratic nation that allows it.

Since the brutal attacks of 9/11/2001, people in this country have been encouraged to be afraid. Knowing that our government has been forced to torture people in order to keep us safe confirms the belief that each of us must be in terrible danger—a danger from which only that same government can protect us. We have been encouraged to accept any cruelty done to others as the price of our personal survival. There is a word for the moral attitude that sets personal safety as its highest value: cowardice. If as a nation we do not act to end torture, if we do not demand a full accounting from and full accountability for those responsible, we ourselves are responsible. And we risk becoming a nation of cowards.

Rebecca Gordon received her B.A. from Reed College and her M.Div. and Ph.D. in Ethics and Social Theory from Graduate Theological Union. She teaches in the Department of Philosophy and for the Leo T. McCarthy Center for Public Service and the Common Good at the University of San Francisco. She is the author of Letters From Nicaragua; Cruel and Usual: How Welfare “Reform” Punishes Poor People; and Mainstreaming Torture: Ethical Approaches in the Post-9/11 United States.

3. The political economy of skills and inequality

By Marius R. Busemeyer and Torben Iversen


Inequality has been on the rise in all the advanced democracies in the past three or four decades; in some cases dramatically. Economists already know a great deal about the proximate causes. In the influential work by Goldin and Katz on “The Race between Education and Technology”, for example, the authors demonstrate that the rate of “skill-biased technological change” — economist-speak for changes that disproportionately increase the demand for skilled labor — has far outpaced the supply of skilled workers in the US since the 1980s. This rising gap, however, is not due to an acceleration of technological change, but rather to a slowdown in the supply of skilled workers. Most importantly, a cross-national comparison reveals that other countries have continued to expand the supply of skills, and the trend towards rising inequality is accordingly less pronounced in those cases.

The narrow focus of economists on the proximate causes is not sufficient, however, to fully understand the dynamic of rising inequality and its political and institutional foundations. In particular, skill formation regimes and cross-country differences in collective wage bargaining influence the quantity and quality of skills, and hence also differences in inequality. Generally speaking, countries with coordinated wage-setting and highly developed vocational education and training (VET) systems respond more effectively to technology-induced changes in demand than countries without such institutions.

Yet, there is a great deal of variance in the extent to which this is true, and one needs to be attentive to the broader organization of political institutions and social relations to explain this variance. One of the recurrent themes is the growing socioeconomic differentiation of educational opportunity. Countries with significant private financing of education, for example, induce high-income groups to opt out of the public system and into high-quality but exclusive private education. As they do, some public institutions try to compete by raising tuition and fees, and with middle- and upper-middle classes footing more of the bill for their own children’s education, support for tax-financed public education declines.

Laptop in classic library

This does not happen everywhere. In countries that inherited an overwhelmingly publicly-financed system, only the very rich can opt out, and the return on private education is lower because of a flatter wage structure. In this setting the middle and upper-middle classes, deeply concerned with the quality of education, tend to throw their support behind improving the public system. Yet, they will do so in ways that may reproduce class-based differentiation within the public system. Based on an analysis of the British system, one striking finding is that a great deal of differentiation happens because highly educated, high-income parents, who are most concerned with the quality of the education of their children, move into good school districts and bid up housing prices in the process. As property prices increase, those from lower socio-economic strata are increasingly shut out from the best schools.

Even in countries with less spatial inequality, in part because of a more centralized provision of public goods, socioeconomic inequality may be reproduced through early tracking of students into vocational and academic lines. This is because the choice of track is known to be heavily dependent on the social class of parents. This is reinforced by the decisions of firms to offer additional training to their best workers, which disadvantages those who start at the bottom. There is also evidence that such training decisions discriminate against women, because firm-based training requires long tenures and women are less likely to have uninterrupted careers. So strong VET systems, although they tend to produce less wage inequality, can undermine intergenerational class mobility and gender equality.

The rise of economic inequality also has consequences for politics. While democratic politics is usually seen as compensating for market inequality, economic and political inequality in fact tend to reinforce each other. Economic and educational inequality destroy social networks and undermine political participation in the lower half of the distribution of incomes and skills, and this undercuts the incentives of politicians to be attentive to their needs. Highly segmented labor markets with low mobility also undermine support for redistribution because pivotal “insiders” are not at risk. Labor market “dualism” therefore delimits welfare state responsiveness to unemployment and rising inequality. In a related finding, the winners of globalization often oppose redistribution, in part because they are more concerned with competitiveness and how bloated welfare states may undermine it.

Economic, educational, and political inequalities thus also tend to reinforce each other. But the extent and form of such inequality vary a great deal across countries. This special issue helps explain why, and suggests the need for an interdisciplinary approach that is attentive to national institutional and political context.

Marius R. Busemeyer is Professor of Political Science at the University of Konstanz, Germany. Torben Iversen is Harold Hitchings Burbank Professor of Political Economy at Harvard University. They are Guest Editors of the Socio-Economic Review April 2014 special issue on The Political Economy of Skills and Inequality which is freely available online until the end of May 2014.

Socio-Economic Review aims to encourage work on the relationship between society, economy, institutions and markets, moral commitments and the rational pursuit of self-interest. The journal seeks articles that focus on economic action in its social and historical context. In broad disciplinary terms, papers are drawn from sociology, political science, economics and management, and policy sciences.

Image credit: Laptop in classic library. By photogl, via iStockphoto.

4. Twenty years after the Rwandan Genocide

By Scott Straus


We are now entering the month of April 2014—a time for reflection, empathy, and understanding for anyone in or involved with Rwanda. Twenty years ago, Rwandan political and military leaders initiated a series of actions that quickly turned into one of the 20th century’s greatest mass violations of human rights.

As we commemorate the genocide, our empathy needs to extend first to survivors and victims. Many families were destroyed in the genocide. Many survivors suffered enormous hardships to survive. Whatever our stand on the current state of affairs in Rwanda, we have to be enormously cognizant of the pain many endured.

In this brief post, I address three issues that speak to Rwanda today. I do so with trepidation, as discussions about contemporary Rwanda are often polarized and emotionally charged. Even though I am critical, I shall try to raise concerns with respect and recognition that there are few easy solutions.

My overall message is one of concern. At one level, Rwanda is doing remarkably and surprisingly well—in terms of security, the economy, and non-political aspects of governance. However, deep resentments and ethnic attachments persist, and hardships and significant inequality remain. While it is difficult to know what people really feel, my general conclusion is that the social fabric remains tense beneath a veneer of good will. A crucial issue is that the political system is authoritarian and designed for control rather than dialogue. It is also a political system that many Rwandans believe is structured to favor particular groups over others. Fostering trust in such a political context is highly unlikely.

I also conclude that a “genocide lens” has limits for the objective of social repair. The genocide lens has been invaluable for achieving international recognition of what happened in 1994. But that lens leads to certain biases about Rwanda’s history and society that limit long-term social repair in Rwanda.

Rwandan Genocide Memorial. 7 April 2011. El Fasher: The Rwandan community in UNAMID organized the 17th Commemoration of the 1994 Genocide against Tutsi, held in Super Camp – RWANBATT 25 Military Camp (El Fasher). Photo by Albert Gonzalez Farran / UNAMID. CC-BY-NC-ND-2.0 via UNAMID Flickr.

During the past 20 years, a sea change in international recognition has occurred. Fifteen years ago, very few people around the world knew that genocide had taken place in Rwanda. Today, the “Rwandan Genocide” is widely recognized as a world historical event. That global recognition is an achievement. We also know a great deal more about the causes and dynamics of the genocide itself.

However, several important controversies and unanswered questions remain. One is who killed President Habyarimana on 6 April 1994. Another is how to conceptualize when the plan for genocide began. Some date the plan for genocide to the late 1950s; others to the 1990s; still others to April 1994. A third question is how one should conceptualize RPF responsibility. Some depict the former rebels as saviors who stopped the genocide. Others argue that their actions were integral to the dynamics that led to genocide. And there are other issues as well, including how many were killed. Each of these issues remains intensely debated and hopefully will be the subject of open-minded inquiry in the years to come.

Contemporary Rwanda is at one level inspiring. The government is visionary, ambitious, and accomplished. The plan is to transform the society, economy, and culture—and to wean the state from foreign aid. The government has successfully introduced major reforms. The tax system is much improved. Public corruption is virtually absent. Remarkable results in public health and the economy have been achieved. Public security is also dramatically improved.

But there is a dark side. Most importantly, the government is repressive. The government seeks to exercise control over public space, especially around sensitive topics—in politics, in the media, in the NGO sector, among ordinary citizens, and even among donors. The net impact is the experience of intimidation and, as a friend aptly put it, many silences.

That brings me to the delicate question of reconciliation. Reconciliation is an imprecise concept for what I mean. What matters is the quality of the social fabric in Rwanda—the trust between people—and the quality of state-society relations.

Jean Baptiste and Freda reconciliation. Photo by Trocaire. CC BY 2.0 via Trocaire Flickr.

A central pillar in Rwanda’s social reconstruction process has been justice. Much is written on gacaca, the government’s extraordinary program to transform a traditional dispute settlement process into a country-wide, decade-long effort to account for genocide crimes. Gacaca brought some survivors satisfaction at finally seeing the guilty punished. Gacaca spawned some important conversations, led to important revelations, and prompted some sincere apologies.

But there were also a lot of problems. There were lies on all sides. There were manipulations of the system. Some apologies were pro forma. And there were weak protections for witnesses and defendants alike. In many cases, justice was not done. But to my mind the bigger issue is that gacaca reinforced the idea that post-genocide Rwanda is an environment of winners and losers.

The entire justice process excluded non-genocide crimes, in particular atrocities that the RPF committed as it took power, in the northwest in the late 1990s, and in Congo, where a lot of violence occurred. This meant that whole categories of suffering in the long arc of the 1990s and 2000s were neither recognized nor accounted for. Justice was one-sided. Many Rwandans therefore experience it as political justice that serves the RPF goal of retaining power.

The second issue is the scale. A million citizens, primarily Hutu, were accused. The net effect is that the legal process served to politically demobilize many Hutus, as Anu Chakravarty has written. Having watched the process of rebuilding social cohesion and state-society relations after atrocity in several places, I come to the conclusion that inclusion is vitally important.

If states privilege justice as a mechanism for social healing, judicial processes should recognize the multi-sided nature of atrocity. All groups that suffered from atrocity should be able to give voice to their experiences and, if punitive measures are on the table, seek accountability. Otherwise, in the long run, justice looks like a charade, one that ultimately may undermine the memories it is designed to preserve.

Here is where the “genocide lens” did not serve Rwanda well. A genocide lens narrates history as a story between perpetrators and victims. Yet the Rwandan reality is much more complicated.

Scott Straus is Professor of Political Science and International Studies at UW-Madison. Scott specializes in the study of genocide, political violence, human rights, and African politics. His published work includes several books on Rwanda and articles in African Affairs. A longer version of this article was presented at the “Rwanda Today: Twenty Years after the Genocide” event at Humanity House in The Hague on 3 April 2014. The author wishes to thank the organizers of that event.

To mark the 20th anniversary of the genocide, African Affairs is making some of their best articles on Rwanda freely available. Don’t miss this opportunity to read about the legacy of genocide and Rwandan politics under the RPF.

African Affairs is published on behalf of the Royal African Society and is the top ranked journal in African Studies. It is an inter-disciplinary journal, with a focus on the politics and international relations of sub-Saharan Africa. It also includes sociology, anthropology, economics, and, to the extent that articles inform debates on contemporary Africa, history, literature, art, music, and more.

5. A brief history of ethnic violence in Rwanda and Africa’s Great Lakes region

By J. J. Carney


A few years ago an American Catholic priest asked me about my dissertation research. When I told him I was studying the intersection of Catholicism, ethnicity, and violence in Rwandan history, he responded, “Those people have been killing each other for ages.”

Such is the common if misguided popular stereotype. But even the better informed are often unaware of the longer historical trajectories of violence in Rwanda and the broader Great Lakes region. Although the 1994 genocide in Rwanda has garnered the most scholarly and popular attention–and rightfully so–it did not emerge out of a vacuum. As the world commemorates the 20th anniversary of the genocide, it is important to locate this epochal humanitarian tragedy within a broader historical and regional perspective.

Northwestern Rwanda by CIAT. CC BY-SA 2.0 via Flickr.

First, explicitly “ethnic” violence has a relatively recent history in Rwanda. Although precolonial Rwanda was by no means a utopian paradise, the worst political violence occurred in the midst of intra-elite dynastic struggles, such as the one that followed the death of Rwanda’s famous Mwami Rwabugiri in 1895. Even after the hardening of Hutu and Tutsi identities under the influence of German and Belgian colonial rule, there was no explicit Hutu-Tutsi violence throughout the first half of the 20th century.

This all changed in the late 1950s. As prospects for decolonization advanced, Hutu elites began to mobilize the Rwandan masses on the grounds of “Hutu” identity; Tutsi elites in turn encouraged a nationalist, pan-ethnic paradigm. The latter vision might have carried the day but for the sudden July 1959 death of Rwanda’s long-serving king, Mwami Mutara Rudahigwa. Mutara’s death opened up a political vacuum, emboldening extremists on all sides. After an escalating series of incidents in October 1959, a much larger wave of ethnic violence broke out in November 1959. Hutu mobs burned Tutsi homes across northern Rwanda, killing hundreds and forcing thousands to flee. Scores of Hutu political leaders were killed in retaliatory attacks. Even here, however, motivations could be more complicated than an ethnic zero-sum game. For example, many Hutu militia leaders later claimed that they were defending Rwanda’s Tutsi king, Mwami Kigeli V, from a cabal of Tutsi chiefs. In other cases Hutu and Tutsi self-defense forces collaborated to defend their communities.

Supported by key figures in the Catholic hierarchy and the Belgian colonial administration, Hutu political leaders like Gregoire Kayibanda soon gained the upper hand in the political struggle that followed the November 1959 violence. In turn, political violence took on increasingly ethnic overtones during election cycles in 1960 and 1961; hundreds of mostly Tutsi civilians were killed in a series of local massacres between March 1960 and September 1961. Marginalized inside Rwanda, Tutsi exile leaders launched raids into Rwanda in early 1962, sparking further retaliatory violence against Tutsi civilians in the northern town of Byumba. For their part, European missionaries and colonial officials deplored the violence even as they blamed much of it on Tutsi exile militias, attributing the Hutu reactions to uncontrollable “popular anger.”

If these earlier episodes could be classified as “ethnic massacres,” a larger genocidal event unfolded in December 1963 and January 1964. Shortly before Christmas, a Tutsi exile militia invaded Rwanda from neighboring Burundi. The incursion was quickly repulsed by a combined force of Belgian and Rwandan army units. In the immediate aftermath, the Rwandan government launched a vicious repression of Tutsi opposition political leaders. In the weeks that followed, local government “self-defense” units executed upwards of 10,000 Tutsi civilians in the southern Rwandan province of Gikongoro. Vatican Radio, among other media sources, deplored “the worst genocide since World War II.” Local religious leaders like Archbishop André Perraudin stood by the government, however, calling the invocation of “genocide” language “deeply insulting for a Catholic head of state.”

Rwanda’s “ethnic syndrome” spread to neighboring Burundi during the 1960s. After a failed Hutu coup d’état in April-May 1972, Burundi’s Tutsi-dominated military launched a fierce repression known locally as the “ikiza” (“curse”). Over 200,000 mostly educated Hutu were killed that summer. In Rwanda, anti-Tutsi violence broke out in February 1973. Although the number of deaths was much lower than in 1963-64, hundreds of Tutsi elites were driven into exile as pogroms broke out at Rwanda’s national university, several Catholic seminaries, and a multitude of secondary schools and parishes.

Rwanda and Burundi were both dominated by one-party military dictatorships during the 1970s and 1980s. For some years each regime paid lip service to a pan-ethnic ideal. However, as economic and political conditions worsened in the late 1980s, ethnic violence flared again in 1988 in the northern Burundian provinces of Ntega and Marangara. In October 1990, the Tutsi-dominated Rwanda Patriotic Front invaded northern Rwanda, sparking a three-year civil war that profoundly destabilized Rwandan society. Following the pattern of the early 1960s, Hutu militias responded by targeting Tutsi civilians in six separate local massacres between October 1990 and February 1994. In turn, the October 1993 assassination of Melchior Ndadaye, Burundi’s first Hutu president, sparked a massive outbreak of ethnic violence and civil war in Burundi that would ultimately take the lives of over half a million.

In turn, one should not forget the post-1994 violence that continued to plague the region. Not only did Rwanda suffer more massacres (some directed at Hutu) between 1995 and 1998, but Burundi’s civil war continued until 2006. Perhaps worst of all, Eastern Congo after 1996 became the epicenter of what many scholars have dubbed “Africa’s World War.” The precipitating cause of the conflict was Rwanda’s invasion of Congo in October 1996, ostensibly to clear Hutu refugee camps that were serving as staging grounds for cross-border raids into Rwanda. Upwards of four million Congolese died from war-related causes over the next six years. Over a decade later, Rwandan-backed militias continue to dominate Congo’s Kivu provinces. The “afterlife” of the Rwanda genocide thus continues even in 2014.

The 1994 genocide took the lives of an estimated 800,000 Rwandans, the vast majority of them Tutsi. This genocide–and the world’s utter abandonment of the Rwandan people–should never be forgotten. But nor should we overlook the political and ethnic violence that preceded and followed the genocide, whether in Rwanda, Burundi, or the Democratic Republic of the Congo. One can only hope that the next 20 years will be kinder to a region that has suffered so much over the past generation.

J. J. Carney is Assistant Professor of Theology at Creighton University, Omaha, Nebraska. His research and teaching interests engage the theological and historical dimensions of the Catholic experience in modern Africa. He has published articles in African Ecclesial Review, Modern Theology, Journal of Religion in Africa, and Studies in World Christianity. He is author of Rwanda Before the Genocide: Catholic Politics and Ethnic Discourse in the Late Colonial Era.

6. Two difficult roads from empire

By Martin Thomas


Britain’s impending withdrawal from Afghanistan and France’s recent dispatch of troops to the troubled Central African Republic are but the latest indicators of a long-standing pattern. Since 1945 most British and French overseas security operations have taken place in territories with current or past empire connections. Most of these actions occurred in the context of the contested end of imperial rule – or decolonization. Some were extraordinarily violent; others, far less so.

Historians, investigative journalists and leading intellectuals, especially in France, have pointed to extra-judicial killing, systematic torture, mass internment and other abuses as evidence of just how dirty decolonization’s wars could be. Some have gone further, blaming the dismal human rights records of numerous post-colonial states on their former imperial rulers. Others have pinned responsibility on the nature of decolonization itself by suggesting that hasty, violent or shambolic colonial withdrawals left a power vacuum filled by one-party regimes hostile to democratic inclusion.

Whatever their accuracy, the extent to which these accusations have altered French and British public engagement with their recent imperial past remains difficult to assess. The readiness of government and society in both countries to acknowledge the extent of colonial violence indicates a mixed record. In Britain, media interest in such events as the systematic torture of Mau Mau suspects in 1950s Kenya sits uncomfortably with the enduring image of the British imperial soldier as hot, bothered, but restrained. Recent Foreign and Commonwealth Office releases of tens of thousands of decolonization-related documents, apparently ‘lost’ hitherto, may present the opportunity for a more balanced evaluation of Britain’s colonial record.

In France, by contrast, the media furores and public debates have been more heated. In June 2000 Le Monde published the searing account by a young Algerian nationalist fighter, Louisette Ighilahriz, of her three months of physical, sexual and psychological torture at the hands of Jacques Massu’s 10th Parachutist Division in Algiers at the height of Algeria’s war of independence from France. Ighilahriz’s harrowing story helped trigger years of intense controversy over the need to acknowledge the wrongs of the Algerian War. After years in which difficult Algerian memories were either interiorized or swept under capacious official carpets, big questions were at last being asked. Should there be a formal state apology? Should decolonization feature in the school curriculum? Should the war’s victims be memorialized? If so, which victims? Although the soul-searching ran deep, official responses could still be troubling. On 5 December 2002 French President Jacques Chirac, himself a veteran of France’s bitterest colonial war, unveiled a national memorial to the Algerian conflict and the concurrent ‘Combats’ (using the word ‘war’ remained intensely problematic) in former French Morocco and Tunisia. On France’s first computerized military monument, the names of some 23,000 French soldiers and Algerian auxiliaries who died fighting for France scroll down vertical screens running the length of the memorial columns.

Paris monument to Algerian War dead: author’s own photograph.

No mention of the war’s Algerian victims, but at least a start. Yet, seven months later, on 5 July 2003, another unveiling took place. This one, in Marignane on Marseilles’ outer fringe, was less official to be sure. A plaque to four activists of the pro-empire terror group, the Organisation de l’Armée secrète (OAS), carries the inscription ‘fighters who fell so that French Algeria might live’. Among those commemorated were two of the most notorious members of the OAS. One was Roger Degueldre, leader of the ‘delta commandos’, who, among other killings, murdered six school inspectors in Algeria days before the war’s final ceasefire. The other was Jean-Marie Bastien-Thiry, organizer of two near-miss assassination attempts on Charles de Gaulle, first President of France’s Fifth Republic. Equally troubling, it took the threat of an academic boycott in 2005 before France’s Council of State advised President Chirac to withdraw a planned stipulation that French schoolchildren must be taught the ‘positive role of the French colonial presence, notably in North Africa’.

One explanation for the intensity of these history wars is that few of France’s and Britain’s colonial fights since the Second World War were definitively won or lost at identifiable places and times. The fall of the French fortress complex at Dien Bien Phu in May 1954, the climax of an eight-year colonial war over Vietnam’s independence from France, was the exception, not the rule. Not surprisingly, its anniversary has been regularly celebrated by the Vietnamese Communist authorities ever since.

Elsewhere it was harder for people to process victory or defeat as a specific event, as a clean break offering new beginnings, rather than as an inconclusive process that settled nothing. Officials in British Kenya reported that the Mau Mau rebellion, rooted among the colony’s Kikuyu majority, was ‘all but over’ by the end of 1955. Yet emergency rule continued almost five years more. To the East, in British Malaya, a larger and more long-standing Communist insurgency was in almost incessant retreat from 1952. Surrender terms were laid down in September 1955. Two years later British aircraft peppered the Malayan jungle, not with bombs but with thirty-four million leaflets offering an amnesty-for-surrender deal to the few hundred guerrillas who remained at large. Even so Malaya’s ‘Emergency’ was not finally lifted until 1960.

In the two decades that followed, the Cold War migrated ever southwards, acquiring a more strongly African and Asian dimension. The contest between liberal capitalism and diverse models of state socialism became a battle increasingly waged in regions adjusting to a post-colonial future. Some of the bitterest conflicts of the 1960s to the 1990s originated in fights for decolonization that morphed into intractable proxy wars in which civilians often counted amongst the principal victims. In the late twentieth century France and Britain avoided the worst of all this. Should we, then, celebrate the fact that most of the hard work of ending the British and French empires was done by the dawn of the 1960s? I would suggest otherwise. For every instance of violence avoided, there were instances of conflict chosen, even positively embraced. Often these choices were made in the light of lessons drawn from other places and other empires. Just as the errors made sometimes caused worse entanglements, so their original commission reflected entangled colonial pasts. Often messy, always interlocked, these histories remind us that Britain and France travelled their difficult roads from empire together.

Martin Thomas is Professor of Imperial History at the University of Exeter. This post is partially extracted from Fight or Flight: Britain, France, and their Roads from Empire.

7. Western (and other) perspectives on Ukraine

By Robert Pyrah


Untangling recent and still-unfolding events in Ukraine is not a simple task. The western news media has been reasonably successful in acquainting its consumers with events, from the fall of Yanukovich on the back of intensive protests in Kiev by those angry at his venality and at his signing a pact with Russia over one with the EU, to the very recent moves by Russia to annex Crimea.

However, as is perhaps inevitable where space is compressed, messages brief and time short, a habit of talking about Ukraine in binaries seems to be prevalent. Superficially helpful, it actually hinders a deeper understanding of the issues at hand – and any potential resolution. Those binaries, encouraged to some extent by the nature of the protests themselves (‘pro-Russian’ or ‘pro-EU/Western’), belie complex and important heterogeneities.

Ironically, the country’s name, taken by many to mean ‘borderland’, is one such index of underlying complexity. Commentators outside the mainstream news, including specialists like Andrew Wilson, have long been vocal in pointing out that the East-West divide is by no means a straightforward geographic or linguistic diglossia, drawn with a compass or ruler down the map somewhere east of Kiev, with pro-Western versus pro-Russian sentiment ‘mapped’ accordingly. Being a Russian-speaker is not automatically coterminous with following a pro-Russian course for Ukraine; and the reverse is also sometimes true. In a country with complex legacies of ethnic composition and ruling regime (western regions, before incorporation into the USSR, were ruled at different times in the modern period by Poland, Romania and Austria-Hungary), local vectors of identity also matter, beyond (or indeed, within) the binary ethnolinguistic definition of nationality.

The Bridge to the European Union from Ukraine to Romania. Photo by Madellina Bird. CC BY-NC-SA 2.0 via madellinabird Flickr.

Just as slippery is the binary used in Russian media, which portrays the old regime as legitimately elected and the new one as basically fascist, owing to its incorporation of Ukrainian nationalists of different stripes. First, this narrative supposes that being legitimately elected negates Yanukovich’s anti-democratic behaviours since that election, including the imprisonment of his main political opponent, Yulia Tymoshenko (whatever the ambivalence of her own standing in the politics of Ukraine). Second, the warnings about Ukrainian fascism call to mind George Bernard Shaw’s comment about half-truths being especially dangerous. As well-informed Ukraine watchers like Andreas Umland and others have noted, overstating the presence of more extreme elements sets up another false binary as a way of delegitimising the new regime in toto. This is certainly not to say that Ukraine’s nationalist elements should escape scrutiny, and here we have yet another warning against false binaries: EU countries may themselves be far from immune to voting in the far right at the fringes, but they still may want to keep eyes and ears open as to exactly what some of Ukraine’s coalition partners think and say about its history and heroes, the Jews, and much more.

So much for seeing the bigger picture, but events may well still take turns that few historians could predict with detailed accuracy. What we can see, at least, from the perspective of a maturing historiographic canon in the west, is that Ukraine is a country that demands a more sophisticated take on identity politics than the standard nationalist discourse allows – a discourse that has been in existence since at least the late nineteenth century, and yet one which the now precarious-seeming European idea itself was set up to moderate.

Robert Pyrah is author of the recent review article, “From ‘Borderland’ via ‘Bloodlands’ to Heartland? Recent Western Historiography of Ukraine” (available to read for free for a limited time) in the English Historical Review. He is a Member of the History Faculty and a Research Associate at the Faculty of Medieval and Modern Languages at the University of Oxford.

First published in January 1886, The English Historical Review (EHR) is the oldest journal of historical scholarship in the English-speaking world. It deals not only with British history, but also with almost all aspects of European and world history since the classical era.

8. The economics of sanctions


By Richard S. Grossman


Russia’s seizure of Crimea from Ukraine has left its neighbors—particularly those with sizable Russian-speaking populations such as Kazakhstan, Latvia, Estonia, and what is left of Ukraine—looking over their shoulders and wondering if they are next on Vladimir Putin’s list of territorial acquisitions. The seizure has also left Europe and the United States looking for a coherent response.

Neither the Americans nor the Europeans will go to war over Crimea. Military intervention would be costly, unpopular at home, and not necessarily successful. Unless a fellow member of the North Atlantic Treaty Organization (which includes Latvia and Estonia) were attacked by Russia, thereby requiring a military response under the terms of the NATO treaty, the West will not go to war to check Putin’s land grabs.

So far, the West’s response—aside from harsh rhetoric—has been economic, not military. Both the United States and Europe have imposed travel and financial sanctions on a handful of close associates of Putin (which have had limited effect), with promises of escalation should Russia continue on its expansionist path.

What is the historical record on sanctions? And what are the chances for success if the West does escalate?

The earliest known use of economic sanctions was Pericles’s Megarian decree, enacted in 432 BCE, in which the Athenian leader “…banished [the Megarians] both from our land and from our markets and from the sea and from the continent” (Aristophanes, The Acharnians). The result of these sanctions, according to Aristophanes, was starvation among the Megarians.

Hufbauer, Schott, Elliott, and Oegg (2008) catalogue more than 170 instances of economic sanctions between 1910 and 2000. They find that only about one third of all sanctions efforts were even partially successful, although success depends critically on the sanction’s goal. Limited goals (e.g. the release of a political prisoner) have been successful about half of the time; more ambitious goals (e.g. disruption of a military adventure, military impairment, regime change, or democratization) are successful between a fifth and a third of the time. Of course, these figures depend crucially on a whole host of additional factors, including the cost borne by the country imposing sanctions, the resilience of the country being sanctioned, and the necessity of international cooperation for the sanctions to be fully implemented.

Despite these cautionary statistics, sanctions can sometimes be effective. According to the US Congressional Research Service, recent US sanctions reduced Iranian oil exports by 60% and led to a decline in the value of the Iranian currency by 50%, forcing Iranian leaders to accept an interim agreement with the United States and its allies in November 2013. On the other hand, US sanctions against Cuba have been in place for more than 50 years and, although they have helped to impoverish the island, they have not brought about the hoped-for regime change.

Current thinking on sanctions favors what are known as “targeted” or “smart” sanctions. That is, rather than embargoing an entire economy (e.g. the US embargo of Cuba), targeted sanctions aim to hit particular individuals or sectors of the economy via travel bans, asset freezes, arms embargoes, etc. Russian human rights campaigner and former World Chess Champion Garry Kasparov suggested in a Wall Street Journal opinion piece that the way to get to Putin is through such smart sanctions, writing:

“If the West punishes Russia with sanctions and a trade war, that might be effective eventually, but it would also be cruel to the 140 million Russians who live under Mr. Putin’s rule. And it would be unnecessary. Instead, sanction the 140 oligarchs who would dump Mr. Putin in the trash tomorrow if he cannot protect their assets abroad. Target their visas, their mansions and IPOs in London, their yachts and Swiss bank accounts. Use banks, not tanks.”

If such sanctions were technically and legally possible, and if the expansionist urge comes from Putin himself and would not be echoed by his successor, this could be the quickest and most effective way to solve the problem.

A slower, but nonetheless sensible, course is to squeeze Russia’s most important economic sector: energy. Russian energy exports in 2012 accounted for half of all government revenues. Sanctions that restrict Russia’s ability to export oil and gas would deal a devastating blow to the economy, which has already suffered from the uncertainty surrounding Russian intervention in Ukraine. By mid-March the Russian stock market was down over 10% for the year; the ruble was close to its record low against the dollar; and 10-year Russian borrowing costs were nearly 10%, more than three percentage points higher than those of the still-crippled Greek economy, indicating that international lenders are already wary of the Russian economy.

A difficulty in targeting the Russian energy sector, aside from the widespread pain it would impose on ordinary Russians, is that the Europeans are heavily dependent on it, importing nearly one third of their energy from Russia. Given the precarious position of its economy at the moment, an energy crisis is the last thing Europe needs. Although alternative energy sources will not appear overnight, old and new sources could eventually fill the gap, including greater domestic production and a rethinking of Germany’s plans to close its nuclear plants. Loosening export restrictions on the now-booming US natural gas industry would provide yet another alternative energy source for Europe and increase the effectiveness of sanctions. Freeing the industrialized world from dependence on dictators to fulfill its energy needs can only help the West’s long-term growth prospects and make it less susceptible to threats from rogue states.

If we are patient, squeezing Russia’s energy sector might work. In the short run, however, sanctioning the oligarchs may be the West’s best shot.

Richard S. Grossman is Professor of Economics at Wesleyan University and a Visiting Scholar at the Institute for Quantitative Social Science at Harvard University. He is the author of WRONG: Nine Economic Policy Disasters and What We Can Learn from Them and Unsettled Account: The Evolution of Banking in the Industrialized World since 1800. His homepage is RichardSGrossman.com, he blogs at UnsettledAccount.com, and you can follow him on Twitter at @RSGrossman. You can also read his previous OUPblog posts.

Image credits: (1) Vladimir Putin. Russian Presidential Press and Information Office. CC BY 3.0 by kremlin.ru. (2) Abrakupchinskaya oil exploration drilling rig in Evenkiysky District. Photo by ShavPS. CC-BY-SA-3.0 via Wikimedia.

9. Politics to reconnect communities


By Matthew Flinders


Why do art and culture matter in the twenty-first century? What do they actually deliver in terms of social benefits? An innovative participatory arts project in South Yorkshire is examining the ‘politics of art’ and the ‘art of politics’ from a number of new angles.

“The general value of arts and culture to society has long been assumed,” a recent report from the Arts Council acknowledges, “while the specifics have just as long been debated.” It is this focus on ‘the specifics’ that is most interesting, because in times of relative prosperity there was little pressure from either public or private funders to demonstrate the broader social impact or relevance of the arts. In times of austerity, however, the situation is very different. A focus on the STEM subjects (science, technology, engineering, and mathematics) risks eviscerating the funding for the arts and humanities unless these more creative and less tangible intellectual pursuits can demonstrate their clear social value. The vocabulary of ‘social return’, ‘intellectual productive capacity’, and ‘economic generation’ may well grate against the traditional values and assumptions of the arts and culture community, but it is a shadow that cannot be ignored.

The publication of The Impact of the Social Sciences (Sage, 2014) provides more than a sophisticated analysis of the value of the social sciences across a range of economic, cultural, and civic dimensions. It provides a political treatise and a strategic piece of evidence-based leverage that may play an important role in future debates over the distribution of diminishing public funds. I have no doubt that the impact of the arts and humanities is equally significant. But the problem is that the systematic creation of an evidence base remains embryonic. My personal belief that the arts and humanities are educationally critical is, in many quarters, meaningless without demonstrable evidence to support it. The methodological and epistemological challenges of delivering that research are clearly significant, but, as the Arts Council emphasizes, ‘it is something that arts and culture organizations will have to do in order to secure funding from both public and private sources’.

As a political scientist I have always been fascinated by the relationship between art and politics. Though it may be heretical to suggest to the arts community, I have often thought that the roles of the professional politician and the professional artist (and indeed of the amateur politician and the amateur artist) were more similar than is often acknowledged. Both seek to express values and visions, to inspire hope and disgust, and both wish to present a message. It is only the medium through which that message is presented that differs (and relationships of co-option, patronage, and dependency are common between these professions). But having (crudely) established a relationship between art and politics, could it be that the true value of the arts lies not in how they respond to the needs of the economy but in how they respond to the rise of ‘disaffected democrats’ and the constellation of concerns that come together in the ‘why we hate politics’ narrative?


In a time of increasing social anomie and political disengagement, especially amongst the young and the poor, can participatory arts projects provide a way of reconnecting communities?

François Matarasso’s Use or Ornament (1997) provides one of the most systematic explorations of this question and concluded that “one of the most important outcomes of [the public’s] involvement in the arts was finding their own voice, or perhaps, the courage to use it.” More recently, the New Economics Foundation’s report Diversity and Integration (2013) suggested that young people who participated in arts programmes were more likely to see themselves as “holding the potential to do anything I want to do” and being “able to influence a group of people to get things done.” Other studies tentatively offer similarly positive conclusions, but with little analytical depth in terms of distinguishing between political reconnection, civic reconnection, and personal reconnection (in terms of personal understanding, confidence, and aspiration). To return to the Arts Council’s recent report – The Wider Benefits of Art and Culture to Society – the existing research base is light on ‘the specifics’.

It is for exactly this reason that the Sir Bernard Crick Centre for the Public Understanding of Politics has joined forces with ‘Art in the Park’ as part of the Arts and Humanities Research Council’s ‘Civic Value’ programme. Young people from across South Yorkshire will be brought together to participate in an eight-week arts project that uses music, film making, dance, writing, painting, or whatever medium the young people select to explore social and political issues. Artists are embedded in the research, and current and former politicians can be brought into the project to facilitate sessions if that is something the young people request. Surveys, focus groups, and interviews will capture how participating in the project affects political attitudes and understandings – positive, negative, political, civic, or personal – with the aim of answering whether the arts can breathe life back into politics and reconnect communities. Now that really would be a wider benefit for society.

Matthew Flinders is Founding Director of the Sir Bernard Crick Centre for the Public Understanding of Politics at the University of Sheffield and also Visiting Distinguished Professor in Governance and Public Policy at Murdoch University, Western Australia. He is the author of Defending Politics (2012). He was recently a winner in the ‘This is Democracy’ International Photography Competition – but his wife now claims she took the picture. Malaika Cunningham is the Research Officer for the project discussed in this article.

Subscribe to the OUPblog via email or RSS.
Subscribe to only politics articles on the OUPblog via email or RSS.
Image credit: Parliament at sunset, public domain via WikiCommons.

The post Politics to reconnect communities appeared first on OUPblog.

10. Elinor and Vincent Ostrom: federalists for all seasons

By John Kincaid


When Elinor Ostrom visited Lafayette College in 2010, the number of my non-political science colleagues who announced familiarity with her work astonished me. Anthropologists, biologists, economists, engineers, environmentalists, historians, philosophers, sociologists, and others flocked to see her.

Elinor’s work cut across disciplines and fields of governance because she deftly employed and developed interrelated concepts having applications in multiple settings. A key foundation of these concepts is federalism—an idea central also to the work of her mentor and husband, Vincent Ostrom.

Vincent understood federalism to be a covenantal relationship that establishes unity for collective action while preserving diversity for local self-governance by constitutionally uniting separate political communities into a limited but encompassing political community. Power is divided and shared between concurrent jurisdictions—a general government having certain nationwide duties and multiple constituent governments having broad local responsibilities. These jurisdictions both cooperate and compete. The arrangement is non-hierarchical and animated by multiple centers of power, which, often competing, exhibit flexibility and responsiveness.

From this foundation, one can understand why the Ostroms embraced the concept of polycentricity advanced in Michael Polanyi’s The Logic of Liberty (1951), namely, a political or social system consisting of many decision-making centers possessing autonomous, but limited, powers that operate within an encompassing framework of constitutional rules.

This general principle can be applied to the global arena where, like true federalists, the Ostroms rejected the need for a single global institution to solve collective action problems such as environmental protection and common-pool resource management. They advocated polycentric arrangements that enable local actors to make important decisions as close to the affected situation as possible. Hence, the Ostroms also anticipated the revival of the notion of subsidiarity in European federal theory.


But polycentricity also applies to small arenas, such as irrigation districts and metropolitan areas. Elinor and Vincent worked on water governance early in their careers, and both argued that metropolitan areas are best organized polycentrically because urban services have different economies of scale, large bureaucracies have inherent pathologies, and citizens are often crucial in co-producing public services, especially policing (the subject of empirical studies by Elinor and colleagues).

The Ostroms valued largely self-organizing social systems that border on but do not topple into sheer anarchy. Anarchy is a great bugaboo of centralists, who de-value the capacity of citizens to organize for self-governance. Without expert instructions from above, the thinking goes, citizens are headless chickens. But this centralist notion exposes citizens to the depredations of vanguard parties and budget-maximizing bureaucrats.

This is why Vincent placed Hamilton’s famous statement in Federalist No. 1 at the heart of his work, namely, “whether societies of men are really capable or not, of establishing good government from reflection and choice” rather than “accident and force.” The Ostroms expressed abiding confidence in the ability of citizens to organize for self-governance in multi-sized arenas if given opportunities to reflect on their common dilemmas, make reasoned constitutional choices, and acquire resources to follow through with joint action.

Making such arrangements work also requires what Vincent especially emphasized as covenantal values, such as open communication, mutual trust, and reciprocity among the covenanted partners. Thus, polycentric governance, like federal governance, requires both good institutions and healthy processes.

As such, the Ostroms also placed great value on Alexis de Tocqueville’s notion of self-interest rightly understood. Indeed, it is the process of self-organizing and engaging one’s fellow citizens that helps participants to understand their self-interest rightly so as to act in collectively beneficial ways without central dictates.

Consequently, another major contribution of the Ostroms was to point out that governance choices are not limited to potentially gargantuan government regulation or potentially selfish privatization. There is a third way grounded in federalism.

John Kincaid is the Robert B. and Helen S. Meyner Professor of Government and Public Service at Lafayette College and Director of the Meyner Center for the Study of State and Local Government. He served as Associate Editor and Editor of Publius: The Journal of Federalism, and has written and lectured extensively on federalism and state and local government.

More on the applications of and reflections on the work of Elinor and Vincent Ostrom can be found in this recently released special issue of Publius: The Journal of Federalism. In addition, Publius has also just released a free virtual collection of the most influential articles written by the Ostroms and published in Publius over the past 23 years.

Publius: The Journal of Federalism is the world’s leading journal devoted to federalism. It is required reading for scholars of many disciplines who want the latest developments, trends, and empirical and theoretical work on federalism and intergovernmental relations.

Subscribe to the OUPblog via email or RSS.
Subscribe to social sciences articles on the OUPblog via email or RSS.
Image credit: Social network background. © bekir gürgen via iStockphoto.

The post Elinor and Vincent Ostrom: federalists for all seasons appeared first on OUPblog.

11. Is there a “cyber war” between Ukraine and Russia?

By Marco Roscini


Alarming headlines have recently started to appear in the media (see, for example, CNN’s “Cyberwar hits Ukraine”). This, however, is sensationalism. What has actually happened so far is limited disruption of mobile communications through Distributed Denial of Service (DDoS) attacks. In addition, certain state-run news websites and social media have been defaced and their content replaced with pro-Russian propaganda. In the months that preceded the current crisis, Ukrainian computer systems were also allegedly targeted by “cyberspies”.

If the above scenario sounds familiar, it is because it isn’t the first time that cyber operations have occurred during a military crisis involving the Russian Federation. In 2008, immediately before and after Russian troops entered the secessionist Georgian province of South Ossetia, several Georgian governmental websites were defaced and their content replaced with anti-Georgian propaganda, while DDoS attacks crippled the Caucasian nation’s ability to disseminate information. Estonia was also the target of severe DDoS attacks in 2007, although in the context of a political, and not military, confrontation with Russia. In neither case has it been convincingly demonstrated that Russia (or any other state) was responsible for the cyber operations. The same can be said of the cyber operations against Ukrainian computer systems and websites, which have also been, at least until now, far less severe than those on Georgia and Estonia, leading some to suggest that Russia is exercising restraint in the use of its cyber capabilities.

Does international law apply in this scenario?


While the DDoS attacks and the defacement of websites obviously don’t establish an armed conflict between Russia and Ukraine on their own, the fact that they have been conducted in the context of kinetic exchanges of fire and a situation of occupation may potentially lead to the application of the law of armed conflict (jus in bello). Two points are important from this perspective. First, although there have been no extensive armed hostilities between Ukraine and Russia yet, it has been reported that at least one Ukrainian soldier has been killed and another wounded, allegedly by Russian military forces or pro-Russian militias. Unlike in non-international armed conflicts, the jus in bello applies to any shot fired between states, regardless of intensity thresholds. The Commentary to Article 2 common to the 1949 Geneva Conventions on the Protection of the Victims of War clearly states that “[i]t makes no difference how long the conflict lasts, or how much slaughter takes place, or how numerous are the participating forces” (p. 23). Secondly, the fact that Crimea is now under the control of Russian forces creates a situation of occupation that also falls under the scope of the law of armed conflict (Article 2(2) of the Geneva Conventions).

However, the law of armed conflict would extend to the DDoS attacks and other cyber operations against Ukraine only if these have a “belligerent nexus” with the hostilities and the occupation. Otherwise, they would be mere cyber crimes and would fall under the scope of domestic criminal laws. To have a belligerent nexus, the cyber operations must have been designed to cause a certain threshold of harm to one belligerent (Ukraine) in support of another (Russia) (see Recommendation V(3) of the International Committee of the Red Cross (ICRC)’s Interpretive Guidance on the Notion of Direct Participation in Hostilities). Harm must be either death or injury to civilian persons, destruction of civilian objects, or military harm, whether physical or not (Recommendation V(1)). Even though they didn’t result in material damage to protected persons and property, then, the threshold of harm would have been crossed if the DDoS attacks and other cyber operations had at least aimed at affecting the Ukrainian government’s ability to communicate with its armed forces and the operability of those forces, so as to disrupt Ukraine’s military operations or military capacity. From the information available, we don’t know whether this is the case.

Do the DDoS operations against Ukraine amount to “attacks” under the law of armed conflict? The question is important because the rules on targeting and protecting civilians, including the principles of distinction and proportionality and the duty to take precautions, only apply to “attacks”, defined in Article 49(1) of Protocol I Additional to the Geneva Conventions as “acts of violence against the adversary, whether in offence or in defence”. I have argued elsewhere that a cyber operation is an “attack” in this sense whenever it employs cyber capabilities that produce or are reasonably likely to produce “violent” consequences in the form of loss of life or injury of persons, more than minimal material damage to property, or loss of functionality of infrastructures. From the available information, this doesn’t seem to be the case with the DDoS attacks against the Ukrainian communication systems and, even less, with the defacement operations. Cyber “espionage” also doesn’t normally affect the functionality of the accessed system or amend/delete the data resident therein. It doesn’t have “violent” consequences and is therefore not an “attack”, although it may be an act of hostilities.

To conclude, we can’t establish for sure whether the international law of armed conflict applies to the cyber operations conducted so far against Ukraine, because we don’t know whether they were designed to militarily support Russia to the detriment of Ukraine. What we do know is that the operations in question are not “attacks”, and therefore the rules on targeting don’t apply to them, whether or not they have a belligerent nexus.

Dr. Marco Roscini is Reader in International Law at the University of Westminster. He has written extensively in international security law, including cyber warfare and nuclear non-proliferation law. His most recent book, Cyber Operations and the Use of Force in International Law, has just been published by OUP. He is also the author of ‘Cyber Operations as Nuclear Counterproliferation Measures’, published in the Journal of Conflict and Security Law (2014). Dr. Roscini regularly blogs at Arms Control Law and can be followed on Twitter at @marcoroscini.

Oxford University Press is a leading publisher in international law, including the Max Planck Encyclopedia of Public International Law, latest titles from thought leaders in the field, and a wide range of law journals and online products. We publish original works across key areas of study, from humanitarian to international economic to environmental law, developing outstanding resources to support students, scholars, and practitioners worldwide. For the latest news, commentary, and insights follow the International Law team on Twitter @OUPIntLaw.

Subscribe to the OUPblog via email or RSS.
Subscribe to only law articles on the OUPblog via email or RSS.
Image credit: Fingers on a keyboard, via iStockphoto.

The post Is there a “cyber war” between Ukraine and Russia? appeared first on OUPblog.

12. The political economy of policy transitions

By Michael Trebilcock


The long fight to end slavery, led by William Wilberforce, among many others, culminated in Britain with the enactment of the Slavery Abolition Act in 1833. This Act made provision for a payment of £20 million (almost 40% of the British budget at the time) in compensation to plantation owners in many British colonies — about US$21 billion in present-day value. Moreover, only slaves below the age of six were initially freed, while others were re-designated as “apprentices” to be freed in two stages, in 1838 and 1840. Wilberforce and many other abolitionists accepted that compensation and phased implementation were required to ensure enactment of the legislation, particularly by the House of Lords, where plantation owners were strongly represented among the aristocracy.

Whenever governments change policies — whether tax, expenditure, or regulatory policies — even when the changes are on net socially beneficial, there will typically be losers. There will be people who have made investments predicated on or even deliberately induced by the pre-reform set of policies. Very few policy changes make somebody better off and nobody worse off according to their own subjective valuations (the economist’s concept of Pareto efficiency). The issue of whether and when to mitigate the costs associated with policy changes — whether through explicit government compensation, grandfathering, or phased or postponed implementation — is ubiquitous across the policy landscape.

Changes in land use regulations often exempt existing non-conforming structures. Environmental regulations, such as energy efficiency requirements for motor vehicles, are often phased in over time. More stringent requirements for qualification for entry into various professions often grandfather existing members of these professions. Stricter gun control laws often grandfather existing gun owners. In post-conflict nation building exercises, a qualified line in the sand is often drawn under past atrocities committed by antagonists.

Unfinished portrait of the MP and abolitionist William Wilberforce by the English artist Thomas Lawrence, dated 1828. National Portrait Gallery, London.

The need to take transition cost mitigation strategies seriously, as a matter of political economy, stands in relatively sharp contrast to two long-standing traditions in economics which tend to marginalize this issue. Economists, from a normative welfare economics perspective, often publish academic studies that document the gross inefficiencies associated with various existing public policies. However, it is unrealistic to assume that once these inefficiencies are revealed, well-intentioned but unenlightened political representatives will immediately espouse the proposed reforms, or that alternatively an aroused citizenry will appropriately discipline venal political leaders that have been captured by rent-seeking interest groups.

An alternative positive tradition in economics — public choice theory — does take politics seriously but tends to view the existing policy outcomes of the political process as the best we can achieve in a world not populated by angels. An austere version of this theory offers few prospects that existing political equilibria can be disrupted; the iron triangle of incestuous relationships between politicians, regulators/bureaucrats, and rent-seeking interest groups is largely impermeable to change. This view is hard to square with the privatization of many state-owned enterprises and the deregulation of many industries in many countries from the 1980s onwards, and the dramatic growth in environmental, health and safety, and other forms of social regulation over this period, often over the opposition of concentrated interests.

I view the political process as much more fluid and malleable. Significant policy reforms are politically feasible with political leadership committed to judicious combinations of transition cost mitigation policies and astute framing of issues so as to engage not only the interests but also the values of a broad cross-section of a country’s citizens.

Michael J. Trebilcock is Professor of Law and Economics at the University of Toronto School of Law and the author of Dealing with Losers: The Political Economy of Policy Transitions. He specializes in law and economics, international trade law, competition law, economic and social regulation, and contract law and theory. He has won awards for his work, including the 1989 Owen Prize by the Foundation for Legal Research for his book, The Common Law of Restraint of Trade, which was chosen as the best law book in English published in Canada in the past two years.

Subscribe to the OUPblog via email or RSS.
Subscribe to only law articles on the OUPblog via email or RSS.

The post The political economy of policy transitions appeared first on OUPblog.

13. Unlearned lessons from the McMartin Preschool case

By Ross E. Cheit


It was the longest criminal trial in American history and it ended without a single conviction. Five people were charged with child sexual abuse based on extremely flimsy evidence. Some parents came to believe outlandish stories about ritual abuse and tunnels underneath the preschool. It is no wonder that the McMartin Preschool case, once labeled the largest “mass molestation” case in history, has come to be called a witch-hunt. In a commentary on a Retro Report in the New York Times earlier this month, Clyde Haberman, a former Times reporter, repeated the view that the McMartin case was a witch-hunt that spawned a wave of other cases of “dubious provenance.” But does that description do justice to the facts?

A careful examination of court records reveals that the witch-hunt narrative about the McMartin case is a powerful but not entirely accurate story. For starters, critics have obscured the facts surrounding the origins of the case. Richard Beck, quoted as an expert in the Retro Report story, recently asserted that the McMartin case began when Judy Johnson “went to the police” to allege that her child had been molested. Debbie Nathan, the other writer quoted by Retro Report, went even further, asserting that “everyone overlooked the fact that Judy Johnson was psychotic.”

Both of these claims are false.

Judy Johnson did not bring her suspicions to the police; she brought them to her family doctor who, after examining the boy, referred him to an Emergency Room. That doctor recommended that the boy be examined by a child-abuse specialist. The pediatric specialist is the one who reported to the Manhattan Beach Police Department that “the victim’s anus was forcibly entered several days ago.”

Although Judy Johnson died of alcohol poisoning in 1986, making her an easy target for those promoting the witch-hunt narrative, there is no evidence that she was “psychotic” three years earlier. A profile in the now-defunct Los Angeles Herald-Examiner, published after Johnson died, made it clear that she was “strong and healthy” in 1983 and that she “jogged constantly and ate health food.” The case did not begin with a mythical crazy woman.


Retro Report also disposed of the extensive medical evidence in the McMartin case with a single claim that there was no “definitive” evidence. But defense lawyer Danny Davis allowed that the genital injuries on one girl were “serious and convincing.” (His primary argument to the jury was that much of the time that this girl attended McMartin was outside the statute of limitations.) The vaginal injuries on another girl, one of the three involved in both McMartin trials, were described by a pediatrician as proving sexual abuse “to a medical certainty.” Were the reporter and fact-checkers for Retro Report aware of this evidence?

None of this is to defend the charges against five (possibly six) teachers in the case. Nor is it an endorsement of claims, made by some parents, that scores of children had been ritually abused. Rather, it is a plea to treat the case as something that unfolded over time and the children as individuals, not as an undifferentiated mass. As it turns out, there are credible reasons that jurors in both trials voted in favor of a guilty verdict on some counts. Those facts do not fit the witch-hunt narrative. Instead, they portray the reality of a complicated case.

When the story of prosecutorial excess overshadows all of the evidence in a child sexual abuse case, children are the ones sold short by the media. That is precisely what Retro Report did earlier this month. The injustices in the McMartin case were significant, most of them were to defendants, and the story has been told many times. But there was also an array of credible evidence of abuse that should not be ignored or written out of history just because it gets in the way of a good story.

The witch-hunt narrative has replaced any complicated truths about the McMartin case, and Retro Report, whose mission is to bust media myths, just came down solidly on the side of the myth. It wasn’t all a witch-hunt.

Ross E. Cheit is professor of political science and public policy at Brown University. He is an inactive member of the California bar and chair of the Rhode Island Ethics Commission. His forthcoming book, The Witch-Hunt Narrative: Politics, Psychology, and the Sexual Abuse of Children (OUP 2014), includes a 70-page chapter on the McMartin case.

Subscribe to the OUPblog via email or RSS.
Subscribe to only politics articles on the OUPblog via email or RSS.
Image credit: “Face In The Shadow” by George Hodan c/o PublicDomainPictures. Public domain via pixabay.

The post Unlearned lessons from the McMartin Preschool case appeared first on OUPblog.

14. Who signed the death warrant for the British Empire?

By W. David McIntyre


The rapid dissolution of the European colonial empires in the middle decades of the 20th century was a key formative event in the background to the contemporary global scene. As the British Empire was the greatest of the imperial structures to go, it is worth considering who signed the death warrant. I suggest there are five candidates.

The first is the Earl of Balfour, Prime Minister 1902-1905, who later penned the exquisite ‘status formula’ of 1926 to describe the relations of Britain and its self-governing white settler colonies, then known as Dominions. They were ‘equal in status’ though ‘freely co-operating’ under a single Crown. The implications were that they were as independent as they wanted to be, and this was marked in the preamble to the Statute of Westminster five years later. Some visionaries at this time suggested that places like India or Nigeria might be Dominions too, indicating that here was an agreed exit route from empire.

India, indeed, took that route in 1947 when Clement Attlee, the second candidate, announced that the Raj would end. The jewel in the imperial crown was removed when the Raj was partitioned into the Dominions of India and Pakistan. They were followed into independence a year later by Ceylon (Sri Lanka) and Burma (Myanmar). With the ending of the Raj it was evident that the empire’s days were numbered.


Harold Macmillan, our third candidate, extended independence more widely to Africa. His celebrated ‘Wind of Change’ speech to the South African Parliament in 1960 marked a firm foot on the decolonization accelerator pedal. Macmillan’s Conservative governments, 1957-1963, granted independence to fourteen colonies (eleven in Africa).

It looked as if the process would be completed by the fourth candidate, Harold Wilson, whose ‘Withdrawal from East of Suez’ and abolition of the Colonial Office ran in parallel with Britain’s preparations to enter the European Communities. Although the Conservative government of Edward Heath, 1970-1974, delayed the withdrawal for a few years, Heath announced to the Commonwealth Heads of Government Meeting in Singapore in 1971 that the empire was ‘past history’. And, as part of his application of the baleful disciplines of business management to government, he ordered a ‘Programme Analysis and Review’ of all the remaining dependent territories. This process, conducted over 1973-74, concluded that the dependencies were liabilities rather than assets and that a policy of ‘accelerated decolonization’ should be adopted in many small island countries in the Pacific, Indian Ocean, and Caribbean previously deemed too small and incapable of being sovereign states.

Although the Heath government never managed to approve this policy before it was ejected from office after the miners’ strike in 1974, the second Wilson government went ahead. It was Jim Callaghan, the fifth candidate, who, as Secretary of State for Foreign and Commonwealth Affairs, signed the death warrant on 13 June 1975 in the form of a despatch to administrators suggesting that the dependencies had been ‘acquired for historical reasons that were no longer valid’. To avoid the charge of colonialism being made by the anti-imperialist majority in the United Nations, Britain adopted ‘accelerated decolonization’ in the Pacific Islands. During the late 1970s and the 1980s most of the remaining dependent territories moved rapidly to independence. An empire acquired over half a millennium was dissolved in less than half a century.

W. David McIntyre was educated at Peterhouse, Cambridge, the University of Washington, Seattle, and the School of Oriental and African Studies, University of London. After teaching for the Universities of Maryland, British Columbia, and Nottingham, he became Professor of History at the University of Canterbury New Zealand between 1966 and 1997. As Honorary Special Correspondent of The New Zealand International Review he reported on Commonwealth Heads of Government Meetings from 1987 to 2011. His latest book is Winding up the British Empire in the Pacific Islands.

Subscribe to the OUPblog via email or RSS.
Subscribe to only British history articles on the OUPblog via email or RSS.
Image credit: Harold Macmillan. By Vivienne (Florence Mellish Entwistle) [Open Government Licence], via Wikimedia Commons

The post Who signed the death warrant for the British Empire? appeared first on OUPblog.

15. The Ukraine crisis and the rules great powers play by

By Michael H. Hunt


Amidst all the commentary occasioned by Russia fishing in troubled Ukrainian waters, one fundamental point tends to get lost from sight. Like many other recent points of international tension, this one raises the question of what rules great powers play by.

The United States has championed a values-based approach with a strong missionary impulse behind it. Woodrow Wilson provided its first full-blown articulation, and post-World War II policy saw to its application. Holding a dominant global position, Washington sought with varying degrees of urgency and determination to advance a basket of ideological goods. US leaders have articulated these goods in a variety of ways, such as “democracy,” “free-market capitalism,” and “human rights.” But underlying all these formulations is a strong and distinctly American belief in the autonomy of the individual and a commitment to political liberty and limited state power. In the rhetoric of American statecraft these notions are a leitmotiv. They have generally set the direction of US policy responses to problems of the sort that Ukraine poses.

This American approach contrasts with a core dictum of classic realism: great powers have fundamental security interests most often manifested territorially. The venerable term to describe this situation is “spheres of influence.” What happens near borders matters considerably more than what happens half a world away. Globalization has perhaps qualified the dictum but hardly repealed it.

Even American policymakers observe this territorial imperative in their own neighbourhood. Consider the continuing importance of the proximate in US policy: the persistent neuralgia over a defiant Cuba; military interventions in Grenada, Panama, and Haiti; recurrent covert meddling against troublesome governments south of the border; and the intense attention given to Mexico. No US leader these days invokes the Monroe Doctrine (or at least the robust Teddy Roosevelt version of it), but the pattern of US action reveals what US leaders can’t afford to say.

Swallow’s Nest castle, near Yalta. Photo by Traroth, under GFDL. CC-BY-SA-3.0 via Wikimedia Commons.

To be sure, Russian leaders would also like to have it both ways. They too have championed their own set of values, though with less enthusiasm than did Soviet leaders, who themselves fell short of the Americans in their commitment to missionary projects.

But Russian and Soviet leaders alike have given clear priority to the near frontier. The consolidation of control over eastern Europe after World War II reflected this concern. So too did the dramatic interventions of 1956 and 1968 to crush unrest, and the constant string-pulling by members of the Politburo assigned to keep a hawk-like watch over clients in the East bloc. The intervention in Afghanistan, shaped by a fear of Islamist unrest spreading into nearby Soviet territories, fits within this pattern. That Putin would now respond to, even exploit, the political disintegration in Ukraine, just as he took advantage of the disputes along the Georgian border, can come as a shock only to observers oblivious to the dictates of realist statecraft.

The Ukraine crisis is a striking reminder of the continuing, fundamental division over the rules of the international game. Do major powers have special regional interests, or are they tightly constrained by far-reaching standards posited and defended by the United States? The American answer doesn’t have to be the latter. FDR in his conception of the postwar order and Nixon in moving toward detente and normalization — to take two striking exceptions — recognized the need for some degree of accommodation among the leading powers. They accorded diplomacy a central role in identifying areas of accord while setting to one side knotty issues connected to lands that adjoined the major powers.

But on the whole US policy has downplayed diplomacy as a regulator of great-power relations, often making capitulation the precondition for any opponent entering into talks. Real diplomacy would get in the way of the overriding preoccupation with holding in check regional powers, whether China, Iran, Russia, or India, that might pose a challenge to the United States. (The EU occupies an ambiguous position in this list of regional powers, as a powerhouse that hasn’t yet figured out how to realize its potential and for the moment speaks through Germany.) This US approach, most forcefully articulated in the Cheney doctrine at the end of the George H. W. Bush administration, is a prescription for unending tension, with US policy a source of constant discord at one point after another around the world.

It is hard to imagine a more misguided basis for policy, especially for a once dominant power steadily slipping in clout. The foundations for a better managed, more peaceful, and even more humane international order are more likely to emerge from great-power negotiations and compromise. Promoting a sense of security and comity among the dominant states may in the bargain discourage rough stuff in their neighbourhoods far better than confrontation and high-minded if hypocritical blustering.

This article originally appeared on Michael H Hunt’s website.

Michael Hunt is the Everett H. Emerson Professor of History Emeritus at the University of North Carolina at Chapel Hill. He is the author of The World Transformed, 1945 to the Present. A leading specialist on international history, Hunt is the author of several prize-winning books, including The Making of a Special Relationship: The United States and China to 1914. His long-term concern with US foreign relations is reflected in several broad interpretive, historiographical, and methodological works, notably Ideology and U.S. Foreign Policy and Crises in U.S. Foreign Policy: An International History Reader.

Subscribe to the OUPblog via email or RSS.
Subscribe to only current affairs articles on the OUPblog via email or RSS.

The post The Ukraine crisis and the rules great powers play by appeared first on OUPblog.

16. 35 years: the best of C-SPAN

By Kate Pais


The Cable-Satellite Public Affairs Network, better known as C-SPAN, has been airing the day-to-day activities of the US Congress since 1979 — 35 years as of this week. Now across three different channels, C-SPAN has provided the American public easy access to politics in action, and created a new level of transparency in public life. Inspired by Tom Allen’s Dangerous Convictions: What’s Really Wrong with the U.S. Congress, let’s take a look at some of the most notable events C-SPAN has captured on film, to be remembered and reviewed.

Jimmy Carter opposes the invasion of Afghanistan

In January 1980, President Carter denounces the Soviet Union’s invasion of Afghanistan, as a warning to others in Southwest Asia.

The start of Reaganomics

Ronald Reagan, known for his economic influence, gives his first address to both houses of Congress in February 1981.

Bill Clinton: “I did not have sexual relations with that woman”

This clip, from a January 1998 speech on children’s education, shows President Clinton addressing allegations about his affair with Monica Lewinsky for the first time.

Al Gore’s Concession Speech

After the long and controversial recount following the 2000 Presidential Election, candidate and sitting vice-president Al Gore concedes to George W. Bush on December 13, 2000.

George W. Bush addresses 9/11

President Bush speaks to a joint session of Congress on 20 September 2001 about the terrorist attacks on the World Trade Center and Pentagon nine days prior.

Kate Pais joined Oxford University Press in April 2013 and works as an online marketing coordinator.

Subscribe to the OUPblog via email or RSS.
Subscribe to only American history articles on the OUPblog via email or RSS.

The post 35 years: the best of C-SPAN appeared first on OUPblog.

17. Gloomy terrors or the most intense pleasure?

By Philip Schofield


In 1814, just two hundred years ago, the radical philosopher Jeremy Bentham (1748–1832) began to write on the subject of religion and sex, and thereby produced the first systematic defence of sexual liberty in the history of modern European thought. Bentham’s manuscripts have now been published for the first time in authoritative form. He pointed out that ‘regular’ sexual activity consisted in intercourse between one male and one female, within the confines of marriage, for the procreation of children. He identified the source of the view that only ‘regular’ or ‘natural’ sexual activity was morally acceptable in the Mosaic Law and in the teachings of the self-styled Apostle Paul. ‘Irregular’ sexual activity, on the other hand, had many variations: intercourse between one man and one woman, when neither of them was married, or when one of them was married, or when both of them were married, but not to each other; between two women; between two men; between one man and one woman but using parts of the body that did not lead to procreation; between a human being and an animal of another species; between a human being and an inanimate object; and between a living human and a dead one. In addition, there was the ‘solitary mode of sexual gratification’, and innumerable modes that involved more than two people.

Bentham’s point was that, given that sexual gratification was for most people the most intense and the purest of all pleasures and that pleasure was a good thing (the only good thing in his view), and assuming that the activity was consensual, a massive amount of human happiness was being suppressed by preventing people, whether through the sanction of the law, religion, or public opinion, from engaging in such ‘irregular’ activities as suited their taste.


Bentham was writing at a time when homosexuals, those guilty of ‘the crime against nature’, were subject to the death penalty in England (in fact being executed at a rate of about two per year) and were vilified and ridiculed in the press and in literature. Bentham had argued as early as the 1770s and 1780s that if an activity did not cause harm, then it should not be subject to legal punishment, and he had called for the decriminalization of homosexuality. By the mid-1810s he was prepared to link the problem not only with law, but with religion. The destruction of Sodom and Gomorrah was taken by ‘religionists’, as Bentham called religious believers, to prove that God had issued a universal condemnation of homosexuality. Bentham pointed out that what the Bible story condemned was gang rape. Paul’s injunctions against homosexuality were also taken to be authoritative by the Church. Bentham pointed out that not only did Jesus never condemn homosexuality, but that the Gospels presented evidence that Jesus engaged in sexual activity, and that he had male lovers — the disciple whom he loved, traditionally said to be John, and the boy, probably a male prostitute, who remained with Jesus in the Garden of Gethsemane after all the disciples had fled (for a more detailed account see ‘Not Paul, but Jesus’).

Bentham was writing after Malthus had in 1798 put forward his argument that population growth would always tend to outstrip food supply, resulting in starvation and death until an equilibrium was restored, whereupon the process would recommence. Bentham had been convinced by Malthus, but Malthus’s solution to the problem, that people should abstain from sex, was not acceptable to him. He pointed out that one advantage of non-procreative sex was that it would not add to the increase of population. Bentham also took up the theme of infanticide. He had considerable sympathy for unmarried mothers who, because of social attitudes, were ostracized and had little choice but to become prostitutes, with the inevitable descent into drink, disease, and premature death. It would be far better, argued Bentham, to destroy the child, rather than the woman. Moreover, it was kinder to kill an infant at birth than allow it to live a life of pain and suffering.

Bentham looked to ancient Greece and Rome, where certain forms of homosexual activity were not only permitted but regarded as normal, as more appropriate models for sexual morality than that which existed in modern Christian Europe. Bentham attacked the notion, still propagated by religious apologists, that homosexuality was ‘unnatural’. All that ‘unnatural’ meant, argued Bentham, was ‘not common’. The fact that something was not common was not a ground for condemning it. Neither was the fact that something was not to your taste. It was a form of tyranny to say that, because you did not like to do a particular thing, you were going to punish another person for doing it. Because you thought something was ‘disgusting’ did not mean that everyone else thought it was disgusting. You might not want to have sex with a sow, but the father of her piglets thought differently.

These writings were, for Bentham, a critical part of a much broader attack on religion and the ‘gloomy terrors’ inspired by the religious mentality. By putting forward the case for sexual liberty, he was undermining religion in one of the areas where, in his view, it was most pernicious. Bentham did not dare publish this material. He believed that his reputation would have been ruined had he done so. He died in 1832. He would have been saddened that it still retains massive relevance in today’s world.

Philip Schofield is Professor of the History of Legal and Political Thought in the Faculty of Laws, University College London, Director of the Bentham Project, and General Editor of the new authoritative edition of The Collected Works of Jeremy Bentham. The latest volume in the edition, Of Sexual Irregularities, and other writings on Sexual Morality, was published on 30 January 2014. The research that led to the preparation of the volume was funded by the Leverhulme Trust. The Bentham Project is responsible for Transcribe Bentham, the prize-winning scholarly crowdsourcing initiative, where volunteers transcribe previously unread Bentham manuscripts.

Subscribe to the OUPblog via email or RSS.
Subscribe to only philosophy articles on the OUPblog via email or RSS.
Image credit: Jeremy Bentham, aged about 80. Frontispiece to Jeremy Bentham, Principles of Legislation, edited by John Neal, Boston: Wells and Lilly, 1830. Public domain

The post Gloomy terrors or the most intense pleasure? appeared first on OUPblog.

18. America and the politics of identity in Britain

By David Ellwood


“The Americanisation of British politics has been striking this conference season,” declared The Economist last autumn. “British politicians and civil servants love freebies to the US ‘to see how they do things,’” reported Simon Jenkins in The Guardian in November. Among the keenest such travellers is Michael Gove, the Education Secretary. Talking to the Daily Telegraph in February 2014, Gove spoke of the entrepreneurial spirit he found in California, and how his contacts with Microsoft and Google were helping him bring the skills of Silicon Valley to Britain. Perhaps Gove simply didn’t know how many UK governments of recent decades have journeyed along the same road, with the same aims. When Chancellor, Gordon Brown was tireless in his efforts to get British business “to rival America’s entrepreneurial dash,” as he told the same Daily Telegraph in December 2003, with speeches, conferences, educational programmes and other gestures, including visits from stars such as Bill Gates and Alan Greenspan.

One of the resources the British governing class most often turns to in its search for a successful, competitive identity for its country is ‘America’. Not American policy or money, of course, not even that ‘Special Relationship’ which London clings on to so forlornly. Instead it’s an inspirational version of the United States: a source of models, examples, energies, ideas, standards; an invoked America whose soft-power influence and prestige never fade. It is a form of virtual political capital which governments from Thatcher to Cameron feel they can draw on to compensate them for all their frustrations in Europe, their humiliations in the wider world, and the intractability of their problems at home.


Overlooked by all the commentators without exception, an American question has long figured in Britain’s identity debate. It has not been put there by artists, experts, army officers, sports personalities, or even Rupert Murdoch. It has been imported systematically and with great persistence by the governments of the last thirty years, and with it they have brought a series of possible answers. The underlying purpose has been to solve the identity crisis by way of ceaseless efforts to ‘modernise’ the nation: to renew its democracy, but also to raise its ranking in those league tables of world competitiveness which the land of Darwin takes so seriously, and — of course — to distinguish it from everything supposedly going on in the European Union. Where better than America to find inspiration and encouragement for this permanent revolution of change the governing class repeatedly insists on?

A visionary image of the United States was central to Margaret Thatcher’s political revolution of the 1980s. As she told a Joint Session of Congress in 1985: “We are having to recover the spirit of enterprise which you never lost. Many of the policies you are following are the ones we are following.” Employment policy was one of the first examples, with reforms explicitly modelled on Reaganite ideology and experience. Even the wording of legislation was directly copied. Under Thatcher, Blair and Brown, certain public sectors, in particular the school and university systems, were reformed again and again in the hope of hooking them up to the motor of economic growth in the way their equivalents were thought to function in the United States. Since the 1980s, the Home Office has been the most zealous of departments in importing American methods and innovations. Simon Jenkins says: “An American friend of mine spent much of his time showing British officials around New York’s police department after its recent success in cutting crime.” Labour’s recent prime ministers were both enthralled by America’s examples. Gordon Brown proposed that school children should swear allegiance to the Union Jack, that there be a British Fourth of July, and a museum celebrating great documents of British democracy. Blair and Brown were the ones who started introducing American private health care firms into the running of the NHS.

David Cameron has followed the American path laid down by Thatcher, Blair, and Brown with zeal and ambition at least equal to theirs. The Tory foreign policy platform for the 2010 election was written by a Brit sitting in the Heritage Foundation in Washington. Just as the outgoing Labour administration created a British Supreme Court, the incoming coalition has set up a new National Security Council, with a National Security Adviser. In November 2012 the country was called upon to elect its first police commissioners, and there was talk of a single school commissioner. Now the Prime Minister talks of life sentences really meaning life in prison. All of this is based on American precedents. But Cameron’s affiliations in America seem to be deepest in parts of California that even Tony Blair did not reach, in particular the Google Corporation. A featured speaker at Google Zeitgeist conferences, Cameron is said to believe that the internet revolution as configured by Google “meshes with the modern conservative mission – flattening hierarchies and empowering people…” Across Silicon Valley, Cameron and his strategists see a land where “a dynamic economy meets the family-friendly work-place…where hard-headed businessmen drink fruit smoothies and walk around in recycled trainers,” as an admiring journalist put it.

The evidence of the last 40-odd years suggests that, in its failure to invent a generally agreed moral theme or narrative of change for its society, the British governing class clings to the America of its imagination to fill the void. Not because the creed of Americanism as such, far less American politics as currently displayed, can provide the cohesiveness required, but simply because US experience over time appears to show how a uniquely powerful machine of national pride and aspiration, embodied in institutions, rituals, stories, and proclaimed values, can keep a multicultural nation glued together and provide everlasting hope of renewal. With its exceptional levels of child poverty, social inequality, and numbers of people in jail, the governments of the last 30 years may not have got the Americanised Britain they dreamed of. But this has not discouraged them. After all, Ministers know that their enthusiasms can always count on a far warmer reception across the Atlantic than anywhere else in the world, including in Britain itself.

David Ellwood is Senior Adjunct Professor of European Studies at Johns Hopkins University School of Advanced International Studies, Bologna Center. He is the author of The Shock of America: Europe and the Challenge of the Century. His first major book was Italy 1943-1945: The Politics of Liberation (1985) then came Rebuilding Europe: Western Europe, America and Postwar Reconstruction (1992). The fundamental theme of his research — the function of American power in contemporary European history — has shifted over the years to emphasize cultural power, particularly that of the American cinema industry. He was President of the International Association of Media and History 1999-2004 and a Fellow of the Rothermere America Institute, Oxford, in 2006. Read more from David Ellwood on OUPblog.

Subscribe to the OUPblog via email or RSS.
Subscribe to only current affairs articles on the OUPblog via email or RSS.
Image credit: Supreme Court of the United States. By US Capitol. Public domain via Wikimedia Commons

The post America and the politics of identity in Britain appeared first on OUPblog.

19. A crisis of European democracy?

By Sara B Hobolt and James Tilley


During November 2012 hundreds of thousands of people across Europe took to the streets. The protesters were, by and large, complaining about government policies that increased taxes and lowered government spending. This initially sounds like a familiar story of popular protests against government austerity programmes, but there is a twist to the tale. Many of the people protesting were not aiming their ire at the national governments making the cuts in spending, but rather at the European Union. In Portugal, people carried effigies of their prime minister on strings and claimed he was a ‘puppet of the EU’; in Greece people burned the EU flag and shouted ‘EU out’; and in Italy people threw stones at the European Parliament offices. It was, at least for some people on the streets, not the incumbent national politicians in Lisbon, Athens, and Rome who were to blame for the problem of the day, but rather politicians and bureaucrats thousands of miles away in Brussels.

The economic crisis in Europe has illustrated that citizens are increasingly blaming not just their national governments, but also ‘Europe’ for their woes. This raises the question of whether citizens can hold European politicians to account for the outcomes for which they are thought to be responsible. The notion of democratic accountability relies on the critical assumption that voters are able to assign responsibility for policy decisions and outcomes, and sanction the government in elections if it is responsible for outcomes not seen to be ‘in their best interest’. This process, however, is clearly complicated in the multilevel system of the European Union where responsibility is not only dispersed across multiple levels of government, but there are also multiple mechanisms for sanctioning governments.


Democratic accountability in multilevel systems can be viewed as a two-step process, where specific requirements need to be met at each step to allow voters to hold governments to account. The first step is one where voters decide which level of government, if any, is responsible for specific policy outcomes and decisions. This depends on the clarity of the institutional division of powers across levels of government, and on the information available about those responsibilities. The second step is one where voters are able to sanction the government in an election on the basis of its performance. This depends on government clarity: that is, the ability of voters to identify a cohesive political actor that they can sanction accordingly.

Both of these steps are important. Assignment of responsibility to a particular level of government is a necessary, but not sufficient, condition to be able to punish an incumbent at the polls. To do so, voters also need to know which party or individual to vote for or against. Yet, the EU lacks a clear and identifiable government. Executive power is shared between the European Council and the European Commission, and legislative power is shared between the Council of the EU and the European Parliament. The primary mechanism through which citizens can hold EU institutions to account is via elections to the European Parliament. Unlike in national parliamentary systems, the majority in the European Parliament does not ‘elect’ the EU executive, however. Despite the formal powers of the European Parliament over the approval and dismissal of the European Commission there is only a tenuous link between the political majority in the Parliament and the policies of the Commission, not least since there is no clear government-opposition division in the Parliament. Despite current attempts to present rival candidates for the post of Commission president prior to the European Parliament elections in May, there is still no competition between candidates with competing policy agendas and different records at the EU level. Without this kind of politicised contest it is simply not possible for voters to identify which parties are responsible for the current policy outcomes and which parties offer an alternative.

As a consequence, the classic model of electoral accountability cannot be applied to European Parliament elections. Even if citizens think the EU is responsible for poor policy performance in an area, they find it difficult to identify which parties are ‘governing’ and punish, or reward, them at the ballot box. This has broader implications for trust and legitimacy. When people hold the EU responsible for poor performance, but cannot hold it accountable for that performance, they become less trusting of the EU institutions as a whole. Thus the danger for the EU is that every time the system fails to deliver — such as during the Eurozone crisis — the result is declining levels of trust and a crisis of confidence in the regime as a whole, because voters lack the opportunity to punish an incumbent and elect an alternative. In other words, the lack of mechanisms to hold EU policymakers to account may lead to a more fundamental legitimacy crisis in the European Union.

Sara Hobolt and James Tilley are co-authors of Blaming Europe? Responsibility without accountability in the European Union. Sara Hobolt is the Sutherland Chair in European Institutions at the European Institute of the London School of Economics and Political Science. James Tilley is a university lecturer at the Department of Politics and International Relations at the University of Oxford and a fellow of Jesus College, Oxford.

Image credit: Photo credit © European Union, 2014 via EC Audiovisual Service.

20. Look beneath the vote


By Matthew Flinders


Hands up if you’ve heard of National Voter Registration Day? And in the somewhat unlikely event that you have, did you realise that it took place last month?

If this momentous milestone passed you by, you’re not alone. Whatever 5 February means to the people of the United Kingdom, it’s safe to assume that electoral participation doesn’t figure prominently. This is not a surprise; it reflects a deep-seated public disengagement from politics, as indicated by the fact that only two thirds of eligible voters in the 2010 general election actually voted. Throughout the twentieth century, general election turnouts almost always exceeded 70%, but that’s a level of participation that has not been seen since 1997. Incidentally, the highest turnout since 1900 was 86.8% in January 1910, though only rate-paying men over the age of 21 could vote.

Low voter turnout is clearly a problem, but arguably a much greater worry is the growing inequality of that turnout. As a recent report from the Institute for Public Policy Research makes clear, the United Kingdom is very much a 'divided democracy', with electoral participation among the young and the poor declining dramatically. In the 1987 general election, for example, the turnout rate for the poorest income group was 4 points lower than for the wealthiest. By 2010 the gap had grown to a staggering 23 points. A similar pattern is observable in relation to age groups. In 1970 there was an 18-point gap in turnout rates between 18–24-year-olds and those aged over 65; by 2005 this gap had more than doubled to over 40 points, before narrowing slightly to 32 points in 2010. "If we focus on participation within these age-groups," the IPPR report concludes, "we can see that at the 2010 general election the turnout rate for a typical 70-year-old was 36 percentage points higher than that of a typical 20-year-old."

If this isn’t bad enough there is little evidence that young people will simply start voting as they get older. On the contrary, the IPPR’s research suggests that “younger people today are less likely than previous generations to develop the habit of voting as they move into middle age.” These trends mean that politicians tend to address themselves to the older and richer sections of society – the people, in other words, that are most likely to vote. This, in turn, reinforces the views of the young and the poor that politicians don’t care about them. And that, naturally, leads to even greater political estrangement.

So what’s the solution? How do we re-establish a connection between ordinary people and politicians? In particular, how do we persuade the young and the poor that the political system really does have something to offer them?


The answers lie not in quick fixes or technological solutions – such as the introduction of compulsory voting, changing the ballot paper, or promoting 'digital democracy' – but in adopting a fundamentally deeper, richer, and more creative approach to democratic engagement. People will only vote – be they young or old, rich or poor – when they understand why democratic politics matters and what it can deliver. To increase electoral participation, therefore, we must promote the public understanding of politics from all perspectives (conservative, traditional, radical, etc.) in a way that demonstrates that individual responses to collective social challenges are rarely likely to be effective. It's this deeper understanding – the notion of political literacy promoted by Sir Bernard Crick and defined as 'a compound of knowledge, skills and attitudes' – that citizens can use to navigate the complex social and political choices that face us all. Political literacy can be seen as a basic social requirement that empowers people to become politically aware, effective, and engaged while also being respectful of differences of opinion or belief.

In this regard, the message from survey after survey is a dismal one. Large sections of the British public appear to know very little about the political system. Even relatively basic questions such as “What do MPs do?” or “What’s the difference between Parliament and the Executive?” tend to elicit a mixture of mild embarrassment and complete bafflement.

Given that levels of political literacy are so low, it's little surprise that many people choose not to vote. They're unaware of the very real benefits the political system delivers for them (clean water, social protection, healthcare, education, etc.) and they no longer believe that they can become the engine of real social change. And yet they can. Worse, by opting out of elections they risk diminishing their representation as politicians focus their messages on the groups that do vote. Young people are constantly reminded that to be 'uneducated' – let alone innumerate or illiterate – is to risk deprivation and vulnerability, but in many ways to be politically illiterate brings with it exactly the same risks. Moreover, the impact of declining political literacy isn't only felt at the individual level. With so many people in society alienated from politics, democracy itself is weakened.

Such arguments are by no means abstract concerns. On 7 May 2015, a general election will be held on the basis of individual voter registration rather than the previous system of household voter registration. Research suggests that although this transition is likely to increase electoral security, it may also result in a considerable decline in levels of electoral participation amongst – yes, you've guessed it – the young and the poor. This is not a reason to turn back from individual registration, but it is a reason to step back and acknowledge that if we're really serious about healing a divided democracy, then we need to focus on promoting engaged citizenship through different channels and processes. We need to take some risks and stir things up, but most of all we need a long-term plan for fostering political literacy.

Matthew Flinders is Founding Director of the Sir Bernard Crick Centre for the Public Understanding of Politics at the University of Sheffield and also Visiting Distinguished Professor in Governance and Public Policy at Murdoch University, Western Australia. He is the author of Defending Politics (2012).

Image credit: Blue checkmark, photo by NFSphoto, via iStockphoto.

21. Collective emotions and the European crisis

By Mikko Salmela and Christian von Scheve


Nationalist, conservative, and anti-immigration parties and political movements have risen or grown stronger all over Europe in the aftermath of the EU's financial crisis and its alleged solution, the politics of austerity. This development has been similar in countries like Greece, Portugal, and Spain, where radical cuts to public services such as social security and health care have been implemented as a precondition for the bailout loans arranged by the European Central Bank and International Monetary Fund, and in countries such as Finland, France, and the Netherlands that have contributed to the bailout while struggling with the crisis themselves. Together, the downturn initiated by the crisis and its management through austerity politics have created an enormous potential for discontent, despair, and anger among Europeans. These collective emotions have fueled protests against governments held responsible for unpopular decisions.


Protests in Greece after austerity cuts in 2008

However, the financial crisis alone cannot fully explain these developments, since they have also gained momentum in countries like Britain, Denmark, Norway, and Sweden that do not belong to the Eurozone and have not directly participated in the bailout programs. Another unresolved question is why protests channel (once again) through the political right rather than the left, which has benefited from such dissatisfaction over recent decades. And how is it that political debate across Europe makes increasing use of stereotypes and populist arguments, fueling nationalist resentments?


A protester with Occupy Wall Street

One way to look at these issues is through the complex affective processes intertwining with personal and collective identities as well as with fundamental social change. A particularly obvious building block consists of fear and insecurity regarding environmental, economic, cultural, or social changes. At the collective level, both are constructed and shaped in discourse with political parties and various interest groups strategically stirring the emotions of millions of citizens. At the individual level, insecurities manifest themselves as fear of not being able to live up to salient social identities and their inherent values, many of which originate from more secure and affluent times, and as shame about this anticipated or actual inability, especially in competitive market societies where responsibility for success and failure is attributed primarily to the individual. Under these conditions, many tend to emotionally distance themselves from the social identities that inflict shame and other negative feelings, instead seeking meaning and self-esteem from those aspects of identity perceived to be stable and immune to transformation, such as nationality, ethnicity, religion, language, and traditional gender roles – many of which are emphasized by populist and nationalist parties.

The urgent need to better understand the various kinds of collective emotions and their psychological and social repercussions is evident not only in the European crisis and the re-emergence of nationalist movements throughout Europe. Across the globe, collective emotions have been at the center of major social movements and political transformations, Occupy Wall Street and the Arab Spring being just two further vivid examples. Unfortunately, our knowledge of the collective emotional processes underlying these developments is still sparse. This is partly because the social and behavioral sciences have only recently begun to systematically address collective emotions in both individual and social terms. The relevance of collective emotions to recent political developments in Europe and around the globe suggests that it is time to extend the "emotional turn" of the sciences to these affective phenomena as well.

Christian von Scheve is Assistant Professor of Sociology at Freie Universität Berlin, where he heads the Research Area Sociology of Emotion at the Institute of Sociology. Mikko Salmela is an Academy Research Fellow at the Helsinki Collegium for Advanced Studies and a member of Finnish Center of Excellence in the Philosophy of Social Sciences. Together they are the authors of Collective Emotions published by Oxford University Press.

Subscribe to the OUPblog via email or RSS.
Subscribe to only psychology articles on the OUPblog via email or RSS.
Image credits: (1) Protests in Greece after austerity cuts in 2008. Photo by Joanna. CC-BY-2.0 via Wikimedia Commons. (2) A protester with Occupy Wall Street. Photo by David Shankbone. CC-BY-3.0 via Wikimedia Commons.

22. Sovereignty disputes in the South and East China Sea

By Merel Alstein


Tensions in the South and East China Seas are high and likely to keep on rising for some time, driven by two powerful factors: power (in the form of sovereignty over and influence in the region) and money (from the rich mineral deposits that lurk beneath the disputed waters). Incidents, such as the outcry over China's recently announced Air Defence Identification Zone, have come thick and fast in the last few years. One country's historic right is another country's attempt at annexation. Every new episode in turn prompts a wave of scholarly soul-searching as to the lawfulness of actions taken by the different countries and the ways that international law can, or cannot, help resolve the conflicts.


Maritime claims in the South China Sea by Goran tek-en. CC-BY-SA-3.0 via Wikimedia Commons.

In order to help keep track of debate in blogs, journals, and newspapers on the international law aspects of the various disputes, we have created a debate map which indexes who has said what and when. It follows on from our previous maps on the use of force against Syria and the prosecution of heads of state and other high-profile individuals at the International Criminal Court. Blog posts in particular have a tendency to disappear off the page once they are a few days old, which often means that their contribution to the debate is lost. The debate maps reflect a belief that these transient pieces of analysis and commentary deserve to be remembered, both as a reflection of the zeitgeist and as important scholarly contributions in their own right.

To help readers make up their own minds about the disputes, the map also includes links to primary documents, such as the official positions of the countries involved and their submissions to the UN Commission on the Limits of the Continental Shelf.

One striking aspect of the map is how old some of the articles are: several date from the early 1970s. Controversies which seem new now actually go back some 40 years. In conflicts such as these, which cannot be understood without their history, and where grievances often go back centuries, this awareness is key.

Another surprising feature is the uncertainty surrounding the legal basis of China’s claim to sovereignty over most of the South China Sea—its famous nine-dash line. Semi-official or unofficial statements by Chinese civil servants, or in one case by the Chinese Judge at the International Court of Justice, are seized on as indications of what China’s justifications are for its expansive maritime claims. A clearer official position, and more input from Chinese scholars, would significantly improve the debate.

Ultimately, the overlapping maritime claims and sovereignty disputes in the South and East China Seas are unlikely to be solved any time soon, and will keep commentators busy for years to come. We will keep the map up to date to facilitate and archive the debate. Your help is indispensable: please get in touch if you have any suggestions for improvements or for new blog posts and articles we can link to.

Merel Alstein is a Commissioning Editor for international law titles at Oxford University Press. She recently compiled a debate map on disputes in the South and East China Seas. Follow her on Twitter @merelalstein.

Oxford Public International Law is a comprehensive, single location providing integrated access across all of Oxford’s international law services. Oxford Reports on International Law, the Max Planck Encyclopedia of Public International Law, and Oxford Scholarly Authorities on International Law are ground-breaking online resources working to speed up research and provide easy access to authoritative content, essential for anyone working in international law.


23. When art coaxed the soul of America back to life

By Sheila D. Collins


Writing in the New York Times recently, art critic Holland Cotter lamented the fact that the current billionaire-dominated market system "is shaping every aspect of art in the city; not just how artists live, but also what kind of art is made, and how art is presented in the media and in museums." "Why," he asks, "in one of the most ethnically diverse cities, does the art world continue to be a bastion of whiteness? Why are African-American curators and administrators, and especially directors, all but absent from our big museums? Why are there still so few black — and Latino, and Asian-American — critics and editors?"

It wasn’t always like this. During the 1930s under the New Deal, the arts were democratized, made accessible to ordinary people who lacked the means to buy paintings worth hundreds of thousands of dollars or to attend Broadway shows at over $100 a ticket. The New Deal’s support for the arts is one of the most interesting and unique episodes in the history of American public policy.

The federal arts programs initiated in the 1930s were intended to alleviate the economic hardships of unemployed cultural workers, to popularize art among a much wider segment of the population, and to boost public morale during a time of deep stress and pessimism, or as New Deal artist Gutzon Borglum remarked, to “coax the soul of America back to life.”


WPA Federal Art Project Poster, 1936. Public Domain via Wikimedia Commons.

The best known of all the programs that were enacted during the Depression was the WPA (Works Progress Administration) Art Project. It consisted of four distinct projects: a Federal Art Project, a Federal Writers’ Project, a Federal Theatre Project, and a Federal Music Project.

Paintings were given to government offices, while murals, sculptures, bas-reliefs, and mosaics were seen on the walls of schools, libraries, post offices, hospitals, courthouses, and other public buildings. Over the course of its eight years, the WPA commissioned over five hundred murals for New York City's public hospitals alone. Among the now well-known artists supported by these programs were painters such as Thomas Hart Benton, Jackson Pollock, Willem de Kooning, and Raphael and Moses Soyer, and the sculptor Louise Nevelson.

The print workshops set up by the WPA prepared the ground for the flowering of the graphic arts in the United States, which until that time had been limited in both media and expression. Moreover, since prints were portable and cheap, they became a vehicle for broadening the public’s understanding and appreciation of the creative arts.

Some 100 community art centers, which included galleries, classrooms, and community workshops, were established in twenty-two states, particularly where opportunities to experience and make art were scarce. Through this effort, individuals who may never have seen a large painted scene or a piece of sculpture were given the opportunity not only to experience a finished work of art but to participate in the creative process. In the New York City area alone, an estimated 50,000 people participated in classes under Federal Art Project auspices each week. According to Smithsonian author David A. Taylor, "the effect was electric. It jump-started people beginning careers in art amid the devastation."

The Federal Writers' Project provided employment and experience for editors, art critics, researchers, and historians, a number of whom later became famous for their novels and poetry, such as Richard Wright, Ralph Ellison, Studs Terkel, and Saul Bellow. They were put to work writing state and regional guidebooks portraying the social, economic, industrial, and historical background of the country. These guidebooks represented a vast treasury of Americana from the ground up, including facts and folklore, history and legend, and histories of the famous, the infamous, and the excluded. There were also seventeen volumes of oral histories of the last people who had lived under slavery. An additional set of folklore and oral histories of 10,000 people from all regions, occupations, and ethnic groups was collected and is now held in the American Folklife Center of the Library of Congress.


Federal Theater Project poster, 1938. Public Domain via Wikimedia Commons.

The Federal Theatre Project was the first and only attempt to create a national theatre in the United States, producing all genres of theater, including classical plays, circuses, puppet shows, musical comedies, vaudeville, dance performances, children's theatre, and experimental plays. They were performed wherever people could gather—not only in theaters, but in parks, hospitals, convents, churches, schools, armories, circus tents, universities, and prisons. Touring companies brought theater to parts of the country where drama had been non-existent, and provided training and experience for thousands of aspiring actors, directors, stagehands, and playwrights, among them Orson Welles, Eugene O'Neill, and John Houseman.

The program emphasized preserving and promoting minority cultural forms. At a time of strict racial segregation, when arts funding was non-existent in African American communities, black theatre companies were established in many cities. Foreign-language companies performed works in French, German, Italian, Spanish, and Yiddish.

The Federal Theatre Project also brought controversial issues to the foreground, making it one of the most embattled of all the New Deal programs. Its “Living Newspaper” section produced plays about labor disputes, economic inequality, racism, and similar issues, which infuriated a growing chorus of conservative critics who succeeded in eliminating the program in 1939.

The Federal Music Project employed 15,000 instrumentalists, composers, vocalists, and teachers, while also providing financial assistance for existing orchestras and creating new ones in places that had never had an orchestra. Many other musical forms—opera, band concerts, choral music, jazz, and pop—were also performed. Most of the concerts were either free to the public or offered at very low cost, and free music classes were open to people of all ages and abilities.

In addition to the arts programs, the Farm Security Administration's photography program oversaw the production of more than 80,000 photographs as part of the effort to make the nation aware of the plight of displaced rural populations. These images—produced by photographers such as Walker Evans, Gordon Parks, and Dorothea Lange—helped humanize the verbal and statistical reports of the terrible poverty and turmoil in the agricultural sector of the economy and brought documentary photography into the cultural pantheon of the nation.

Between 1933 and 1942 ten thousand artists produced some 100,000 easel paintings, 18,000 sculptures, over 13,000 prints, 4,000 murals, over 1.6 million posters, and thousands of photographs. Over a thousand towns and cities now boasted federal buildings embellished with New Deal murals and sculpture. Some 6,686 writers produced more than a thousand books and pamphlets, and the Federal Theatre Project staged thousands of plays. More important than the quantity of the output, however, is the way in which these programs shaped Americans' understanding of who they were as a people and of their country's possibilities. Before the New Deal, the notion that government should support the arts was unheard of; thanks to the New Deal, art was democratized and, for a time, de-commodified, made accessible to the great majority of the American people.

Perhaps Roosevelt himself best summed up the significance of the New Deal arts programs:

A few generations ago, the people of this country were often taught . . . to believe that art was something foreign to America and to themselves . . . But . . . within the last few years . . . they have discovered that they have a part. . . . They have seen in their own towns, in their own villages, in schoolhouses, in post offices, in the back rooms of shops and stores, pictures painted by their sons, their neighbors—people they have known and lived beside and talked to . . . some of it good, some of it not so good, but all of it native, human, eager, and alive—all of it painted by their own kind in their own country, and painted about things that they know and look at often and have touched and loved. The people of this country know now . . . that art is not something just to be owned but something to be made: that it is the act of making and not the act of owning that is art. And knowing this they know also that art is not a treasure in the past or an importation from another land, but part of the present life of all the living and creating peoples—all who make and build; and, most of all, the young and vigorous peoples who have made and built our present wide country.

New Deal support for the arts had coaxed the soul of America back to life, but we are in danger of losing it again. Amid the obsession with deficits, arts programs in the public schools are being cut, federal funding for the arts has dropped dramatically, and even private funding has been reduced. Without art, we are ill-equipped as a people to summon the collective imagination needed to resolve the enormous challenges that confront us in the twenty-first century. Who or what will there be to coax this generation back to life?

Sheila D. Collins is Professor of Political Science Emerita, William Paterson University and editor/author with Gertrude Schaffner Goldberg of When Government Helped: Learning from the Successes and Failures of the New Deal. She serves on the speakers’ bureau of the National New Deal Preservation Association, the Research Board of the Living New Deal and the board of the National Jobs for All Coalition, is a member of the Global Ecological Integrity Group and co-chairs two seminars at Columbia University.


24. Why the lobbying bill is a threat to the meaning of charity

By Matthew Hilton


On 30 January 2014 the UK government's lobbying bill received the Royal Assent. Known more formally as the Transparency of Lobbying, Non-Party Campaigning and Trade Union Administration Act, it seeks to curb excesses in election campaign expenditure, as well as restricting the influence of the trade unions.

However, as various groups pointed out throughout its controversial parliamentary journey, Part 2 of the legislation will also have implications for charities, voluntary societies, and non-governmental organisations once it comes into effect. Specifically, by restricting the amount that non-party political bodies can spend ahead of a general election, it will severely curtail the lobbying, campaigning, and advocacy work that has been a standard feature of their activities for some decades.

Understandably the sector has not welcomed the Act. The problem is that the legislation conflates general political lobbying with campaigning for a specific cause that is central to the charitable mission of an organisation. Sector leaders have critiqued the Bill as ‘awful’, ‘an absolute mess’ and ‘a real threat to democracy’.

It is not difficult to see why. The impact of charities on legislation in Britain has been profound, and the examples run into many hundreds of specific Acts of Parliament. To mention but a few: a whole range of environmental groups successfully lobbied for the Climate Change Act 2008; homelessness charities such as Shelter and the Child Poverty Action Group fought a battle for many years that resulted in the Housing Act 1977; the 1969 abolition of the death penalty can be partly attributed to the National Campaign for the Abolition of Capital Punishment; and two pieces of legislation in 1967, the Sexual Offences Act and the Abortion Act, were very much influenced by the work of the Homosexual Law Reform Society and the Abortion Law Reform Association.


A group of campaigners from Christian Aid lobbying for Trade Justice, Oxford, 2005. By Kaihsu Tai. CC-BY-SA 3.0 via Wikimedia Commons.

The list could run on and on, but the impact of advocacy by charities on the policy process has become far more extensive than the straightforward lobbying of MPs. Charities have been key witnesses in Royal Commissions, for instance: from the 1944 Royal Commission on Equal Pay through to the 1993 Royal Commission on Criminal Justice, voluntary organisations contributed over 1,000 written submissions. At Whitehall, they have sought a continued presence along the corridors of power in much the same manner as commercial lobbying firms. They have achieved much through the often hidden and usually imprecise, unquantifiable, and unknowable interpersonal relationships fostered with key civil servants, both senior and junior.

In more recent years, charities have taken advantage of early day motions in the House of Commons. Once infrequently employed, these had become commonplace by the first decade of the 21st century, averaging 1,875 in each parliamentary session. The most notable have managed to secure over 300 signatures, and it is here that the influence of charities is particularly apparent. The topics that obtain such general — and cross-party — support have tended to be in the fields of disability, drugs, rights, public health, the environment, and road safety: all subjects on which charities have been particularly effective campaigners.

Not all of these lobbying activities have been successful. Leaders of charities have often expressed their frustration at being unable to influence politicians who refuse to listen, or at being outgunned and out-voiced by lobbyists with greater financial muscle behind them. But the important point is that charities have had to engage in the political arena, and it is the norm for them to do so. To restrict these activities now — even if only in the year running up to a general election — actually serves to turn back a dominant trend in democratic participation that has come increasingly to the fore in contemporary Britain.

Having explored the history of charities, voluntary organisations, and NGOs, tracing their growing power, influence, and support, we found that rather than a decline in democracy over the last few decades there have actually been substantial shifts in how politics takes place. While trade unions, political parties, and traditional forms of associational life have witnessed varying rates of decline, support for environmental groups, humanitarian agencies, and a whole range of single-issue campaigning groups has actually increased. Whether these groups represent a better or worse form of political engagement is not really the issue. The point is that the public has chosen to support charities — and charitable activity in the political realm — because ordinary citizens have felt these organisations are better placed to articulate their concerns, interests, and values. As such, charities, often working at the frontier of social and political reform, but often alongside governments and the public sector, have become a crucial feature of modern liberal democracy.

One might have expected a government supposedly eager to embrace the 'Big Society' to be particularly keen to free these organisations from the bureaucracy of the modern state. But it is quite clear that the Coalition has held a highly skewed, and rather old-fashioned, view of appropriate charitable activity. The Conservatives imagined a world of geographically specific, community self-help groups that might pick up litter on the roadside in their spare time at the weekend, and who would never imagine that their role might be, for instance, to demand that local government obtain sufficient resources to ensure that the public sector — acting on behalf of all citizens and not just a select few — continues to maintain and beautify the world around us. There are clearly very different views on what charity is and what it should do.

Indeed, it is remarkable that when government spokespeople did comment on the nature of the charitable sector, they were quick to condemn the work of the bigger organisations. Lord Wei, the 'Big Society tsar', even went so far as to criticise the larger charities for being 'bureaucratic and unresponsive to citizens'. With such attitudes it is no wonder the Big Society soon lost any pretence of adherence from the many thousands of bodies connected to the National Council for Voluntary Organisations.

It is tempting to see the particular form the Conservatives hoped the Big Society would take as part and parcel of a policy agenda connected to the lobbying bill. That is, there has never been an embrace of charities by Cameron and his ministers as the solution to society's – and the state's – ills. Rather, viewed alongside the huge cuts in public sector funding (cuts which often trickled down to national and community-based charities), these developments amount to a sustained attack on the very nature of charity, or at least on the sector as it has developed in recent decades. It is no wonder that many charity leaders and CEOs, feeling cut off at the knees by the slashes to their budgets and damaged by sustained abuse in the press over their supposedly inflated salaries, now feel the Lobbying Act is seeking to gag their voice as well.

Matthew Hilton is Professor of Social History at the University of Birmingham, and the author of The Politics of Expertise: How NGOs Shaped Modern Britain, along with James McKay, Nicholas Crowson and Jean-François Mouhot. Together they also compiled 'A Historical Guide to NGOs in Britain: Charities, Civil Society and the Voluntary Sector since 1945' (Palgrave, 2012). All the data contained in these two volumes, as well as that found above, is freely available on their project website.


25. In memoriam: Tony Benn

By Jad Adams


Tony Benn has left an enduring monument: one of the great diaries of the twentieth century, running from 1940, when he was fifteen, to 2009, when illness forced him to stop.

They are published as nine volumes, but these represent perhaps ten per cent of the 15 million words in the original diary. I am one of the few people to have had access to the manuscript diary, in the course of writing my biography of Benn. For this I received every assistance from him and his staff in the jumbled, chaotic office in the basement of his Holland Park Avenue home.

The diaries are of course of historic interest because they reveal the work of a cabinet minister and member of parliament over more than fifty years. The diaries chart post-war hope, the rise of the Labour militants, the battle of Orgreave, and the decline of the Left. The books also contain descriptions of constituents' experiences at his weekly surgery, an opportunity to meet the people and sample their woes which is hated by some MPs but was embraced by Benn.

They also show the development of Benn's family of four children and twelve grandchildren, and the grief of losing parents and partner. One would be hard put to find anywhere in literature a more poignant description of death and continuing loss than Benn's account of Caroline, his partner of more than fifty years, whose illness and death he described in remorseless detail in the manuscript, some of which was published in More Time for Politics (2007).

Benn had always felt he ought to be writing a diary, as a part of the non-conformist urge to account for every moment of life as a gift from God. He explained at one of our last formal conversations: ‘It’s very self obsessed. I must admit it worries me that I should spend so much time on myself, I saw it as an account, accountability to the Almighty, when I die give him 15 million words and say: there, you decide. I think there is a moral element in it, of righteousness.’


This need to see time as a precious resource to be accounted for went back to his father, William Wedgwood Benn (later Lord Stansgate), who expected the boy Benn to fill in a time chart showing how he had made use of his days. Benn senior had read an early self-help book by Arnold Bennett, How to Live on 24 Hours a Day, on making the best use of time. 'Father became obsessed with it,' Benn said.

Tony Benn had been keeping a diary sporadically since childhood. It had always been his ambition to keep one, and early fragments of diary exist, including one from his time in the services, where diary-keeping was forbidden for security reasons, so he put key words relating to places or equipment in code. In the 1950s he began keeping a political diary and wrote at least some parts of a diary for every year from 1953. The emotional shock of his father's death in 1960 and subsequent political upsets stopped his diary writing in 1961 and 1962 but, with a return to the House of Commons in sight, he resumed it in 1963.

He started dictating the diary to a tape recorder in 1966, when he joined the cabinet, because he could not dictate accounts of cabinet meetings to a secretary who was not covered by the Official Secrets Act. Benn would store the tapes, not knowing when he would transcribe them, or indeed whether they would be transcribed in his lifetime. His daughter Melissa spoke of arriving at their home late at night when she was a teenager and hearing her father's voice dictating the diary, followed by snoring as he fell asleep at the microphone.

Benn stopped writing the diary after he fell ill in 2009 in what was probably the first stroke he was to suffer. He explained to me:

‘You can’t not be a diarist some of the time. One day is much the same as the other and it is a lot of effort. You really do have to be very conscientious and keep it up in detail and keep up the recordings and so on and it took over my life, also I’m not sure now that I’m not in a position on the inside on anything where my reflections would be interesting. I think my reflections might be as interesting as anybody else’s but whether it constitutes a diary when I’m not at the heart of anything…

‘I never thought of it as an achievement, just something I did, it’s been a bit of a burden to have to write it all down every night. It began as a journal where I put down things that interested me during the war, I drew a little bit on that for Years of Hope (1994). You can say you’ve achieved a reasonably accurate daily account of what has happened to you and since people are always shaped by what has happened to them so if you have a diary you get three bites at your own experience: when it happens, when you write it down and when you read it later and realise you were wrong.’

Benn did not think he would publish it in his lifetime, but in about 1983 he decided to type up six months and have a look at it. He invited Ruth Winstone to help with the diary in 1985 and found they worked so well together that she stayed and edited all the diaries.

His final thought on the long labour of the Benn Diaries was: ‘I couldn’t recommend anyone to keep a diary without warning them that it does take over your life.’

Jad Adams' Tony Benn: A Biography is published by Biteback. His next book, Women and the Vote: A World History, will be published by OUP in the autumn.

Image credit: Portrait of Tony Benn. By I, Isujosh. CC-BY-SA-3.0 via Wikimedia Commons

