What is JacketFlap

  • JacketFlap connects you to the work of more than 200,000 authors, illustrators, publishers and other creators of books for Children and Young Adults. The site is updated daily with information about every book, author, illustrator, and publisher in the children's / young adult book industry. Members include published authors and illustrators, librarians, agents, editors, publicists, booksellers, publishers and fans.

Viewing: Blog Posts Tagged with: philosophy, Most Recent at Top [Help]
Results 1 - 25 of 352
1. Thoughts in the necropolis

One of Glasgow’s best-known tourist highlights is its Victorian Necropolis, a dramatic complex of funerary sculpture in all its grandeur and variety. Christian and pagan symbols, obelisks, urns, broken columns and overgrown mortuary chapels in classical, Gothic, and Byzantine styles convey the hope that those who are buried there—the great and the good of 19th-century Glasgow—will not be forgotten.

But, of course, they are mostly forgotten and even the conspicuous consumption expressed in this extraordinary array of great and costly monuments has not been enough to keep their names alive. And, of course, we, the living, will soon enough go the same way: ‘As you are now, so once was I’, to recall a once-popular gravestone inscription.

Is this the last word on human life? Religion often claims to offer a different perspective on death since (it is said) the business of religion is not with time, but with eternity. But what, if anything, does this mean?

‘Eternal love’ and ‘eternal memory’ are phrases that spring to the lips of lovers and mourners. Even in secular France, some friends of the recently murdered journalists talked about the ‘immortality’ of their work. But surely that is just a way of talking, a way of expressing our especially high esteem for those described in these terms? And even when talk of eternity and immortality is meant seriously, what would a human life that had ‘put on immortality’ be like? Would it be recognizably human at all? As to God, can we really conceive of what it would be for God (or any other being) to somehow be above or outside of time? Isn’t time the condition for anything at all to be?

Entrance to the Necropolis. Photo by George Pattison. Used with permission.

If we really take seriously the way in which time pervades all our experiences, all our thinking, and (for that matter) the basic structures of the physical universe, won’t it follow that the religious appeal to eternity is really just a primitive attempt to ward off the spectre of transience, whilst declarations of eternal love and eternal memory are little more than gestures of feeble defiance and that if, in the end, there is anything truly ‘eternal’ it is eternal oblivion—annihilation?

Human beings have a strong track record when it comes to denying reality.

One fashionable book of the post-war period, Ernest Becker’s dramatically entitled The Denial of Death, argued that our entire civilization was built on the inevitably futile attempt to deny the ineluctable reality of death. But if there is nothing we can do about death, must we always think of time in negative terms—the old man with the hour-glass and scythe, so like the figure of the grim reaper?

And instead of thinking of eternity as somehow beyond or above time, might not time itself offer clues as to the presence of eternity, as in the experiences that mystics and meditators report as momentary experiences of eternity in, with, and under the conditions of time? But such experiences, valuable as they are to those who have them, remain marginal unless they can be brought into fruitful connection with the weave of past and future.

From the beginnings of philosophy, recollection has been valued as an important clue to finding the tracks of eternity in time, as in Augustine’s search for God in the treasure-house of memory. But the past can only ever give us so much (or so little) eternity.

A recent French philosopher has proposed that time cannot undo our having-been and that the fact that the unknown slave of ancient times or the forgotten victim of the Nazi death-camps really existed means that the tyrants have failed in their attempt to make them non-human. But this is a meagre consolation if we have no hope for the future and for the flourishing of all that is good and true in time to come. Really affirming the enduring value of human lives and loves therefore presupposes the possibility of hope.

One Jewish sage taught that ‘In remembering lies redemption; in forgetfulness lies exile’, but perhaps what it is most important to remember is the possibility of hope itself and of going on saying ‘Yes’ to the common, shared reality of human life and of reconciling the multiple broken relationships that mortality leaves unresolved.

Pindar, an ancient poet of hope, wrote that ‘modesty befits mortals’ and if we cannot escape time (which we probably cannot), it is maybe time we have to thank for the possibility of hope and for visions of a better and more blessed life. And perhaps this is also the message that a contemporary graffiti-artist has added to one of the Necropolis’s more ruined monuments. ‘Life goes on’, either extreme cynicism or, perhaps, real hope.

Featured image credit: ‘Life goes on.’ Photo by George Pattison. Used with permission.

The post Thoughts in the necropolis appeared first on OUPblog.

0 Comments on Thoughts in the necropolis as of 1/26/2015 5:15:00 AM
2. Immoral philosophy

I call myself a moral philosopher. However, I sometimes worry that I might actually be an immoral philosopher. I worry that there might be something morally wrong with making the arguments I make. Let me explain.

When it comes to preventing poverty-related deaths, it is almost universally agreed that Peter Singer is one of the good guys. His landmark 1972 article, “Famine, Affluence and Morality” (FAM), not only launched a rich new area of philosophical discussion, but also led to millions in donations to famine relief. In the month after Singer restated the argument from FAM in a piece in the New York Times, UNICEF and OXFAM claimed to have received about $660,000 more than they usually took in from the phone numbers given in the piece. His organisation, “The Life You Can Save”, used to keep a running estimate of total donations generated. When I last checked the website on 13th February 2012, this figure stood at $62,741,848.

Singer argues that the typical person living in an affluent country is morally required to give most of his or her money away to prevent poverty related deaths. To fail to give as much as you can to charities that save children dying of poverty is every bit as bad as walking past a child drowning in a pond because you don’t want to ruin your new shoes. Singer argues that any difference between the child in the pond and the child dying of poverty is morally irrelevant, so failure to help must be morally equivalent. For an approachable version of his argument see Peter Unger, who developed and refined Singer’s arguments in his 1996 book, Living High and Letting Die.

I’ve argued that Singer and Unger are wrong: failing to donate to charity is not equivalent to walking past a drowning child. Morality does – and must – pay attention to features such as distance, personal connection and how many other people are in a position to help. I defend what seems to me to be the commonsense position that while most people are required to give much more than they currently do to charities such as Oxfam, they are not required to give the extreme proportions suggested by Singer and Unger.

Saving lives, by Oxfam East Africa, CC-BY-2.0 via Wikimedia Commons

So, Singer and Unger are the good guys when it comes to debates on poverty-related death. I’m arguing that Singer and Unger are wrong. I’m arguing against the good guys. Does that make me one of the bad guys? It is true that my own position is that most people are required to give more than they do. But isn’t there still something morally dubious about arguing for weaker moral requirements to save lives? Singer and Unger’s position is clear and easy to understand. It offers a strong call to action that seems to actually work – to make people put their hands in their pockets. Isn’t it wrong to risk jeopardising that given the possibility that people will focus only on the arguments I give against extreme requirements to aid?

On reflection, I don’t think what I do is immoral philosophy. The job of moral philosophers is to help people to decide what to believe about moral issues on the basis of reasoned reflection. Moral philosophers provide arguments and critique the arguments of others. We won’t be able to do this properly if we shy away from attacking some arguments because it is good for people to believe them.

In addition, the Singer/Unger position doesn’t really offer a clear, simple conclusion about what to do. For Singer and Unger, there is a nice simple answer about what morality requires us to do: keep giving until giving more would cost us something more morally significant than the harm we could prevent; in other words, keep giving till you have given most of your money away. However, this doesn’t translate into a simple answer about what we should do, overall. For, on Singer’s view, we might not be rationally required or overall required to do what we are morally required to.

This need to separate moral requirements from overall requirements is a result of the extreme, impersonal view of morality espoused by Singer. The demands of Singer’s morality are so extreme it must sometimes be reasonable to ignore them. A more modest understanding of morality, which takes into account the agent’s special concern with what is near and dear to her, avoids this problem. Its demands are reasonable so cannot be reasonably ignored. Looked at in this way, my position gives a clearer and simpler answer to the question of what we should do in response to global poverty. It tells us both what is morally and rationally required. Providing such an answer surely can’t be immoral philosophy.

Headline image credit: Devil gate, Paris, by PHGCOM (Own work). CC-BY-SA 3.0 via Wikimedia Commons.

The post Immoral philosophy appeared first on OUPblog.

0 Comments on Immoral philosophy as of 1/25/2015 5:41:00 AM
3. Seeing things the way they are

A few really disastrous mistakes have dominated Western philosophy for the past several centuries. The worst mistake of all is the idea that the universe divides into two kinds of entities, the mental and the physical (mind and body, soul and matter). A related mistake, almost as bad, is in our philosophy of perception. All of the great philosophers of the present era, beginning with Descartes, made the same mistake, and it colored their account of knowledge and indeed their account of pretty much everything. By ‘great philosophers’, I mean Locke, Berkeley, Hume, Descartes, Leibniz, Spinoza, and Kant. I am prepared to throw in Hegel and Mill if people think they are great philosophers too. I called this mistake the “Bad Argument”. Here it is: We never directly perceive objects and states of affairs in the world. All we ever perceive are the perceptual contents of our own mind. These are variously called ‘ideas’ by Descartes, Locke, and Berkeley, ‘impressions’ by Hume, ‘representations’ by Kant, and ‘sense data’ by twentieth century theorists. Most contemporary philosophers think they have avoided the mistake, but I do not think they have. It is just repeated in different versions, especially by a currently fashionable view called ‘Disjunctivism’.

But that leaves us with a more interesting problem: What is the correct account of the relation of perceptual experience and the real world? The key to understanding this relation is to understand the intentionality of perception. ‘Intentionality’ is an ugly word, but we can pretty much make clear what it means; a mental state is intentional if it represents, or is about, objects and states of affairs in the world. So beliefs, hopes, fears, desires are all intentional in this sense. ‘Intending’ in the ordinary sense just names one kind of intentionality, along with beliefs, desires, etc. Such intentional states are representations of how things are in the world or how we would like them to be, etc., and we might say therefore that they have “conditions of satisfaction” — truth conditions in the case of belief, fulfillment conditions in the case of intentions, etc.

The biologically most basic and gutsiest forms of intentionality are those where we don’t have mere representations but direct presentations of objects and states of affairs in the world, and part of intentionality is that these must be causally related to the conditions in the world that they present. Perception and intentional action are direct presentations of their conditions of satisfaction. In the case of perception, the conditions of satisfaction have to cause the perceptual experience. In the case of action, the intention in action has to cause the bodily movement. So the key to understanding perception is to see the special features of the causal presentational intentionality of perception. The tough philosophical question is to state how exactly the character of the visual experience, its phenomenology, determines the conditions of satisfaction.

How then does the intentional content fix the conditions of satisfaction? The first step in the answer is to see that perception is hierarchical. In order to see higher level features, such that an object is my car, I have to see such basic features as color and shape. The key to understanding the intentionality of the basic perceptual experience is to see that the feature itself is defined in part by its ability to cause a certain sort of perceptual experience. Being red, for example, consists in part in the ability to cause this sort of experience. Once the intentionality of the basic perceptual features is explained, we can then ask the question of how the presentation of the higher level features, such as seeing that it is my car or my spouse, can be explained in terms of the intentionality of the basic perceptual experiences together with collateral information.

How do we deal with the traditional problems of perception? How do we deal with skepticism? The traditional problem of skepticism arises because exactly the same type of experience can be common to both the hallucinatory and the veridical cases. How are we supposed to know which is which?

Image Credit: Marmalade Skies. Photo by Tom Raven. CC by NC-ND 2.0 via Flickr

The post Seeing things the way they are appeared first on OUPblog.

0 Comments on Seeing things the way they are as of 1/18/2015 10:31:00 AM
4. Why causality now?

Head hits cause brain damage, but not always. Should we ban sport to protect athletes? Exposure to electromagnetic fields is strongly associated with cancer development. Should we ban mobile phones and encourage old-fashioned wired communication? The sciences are getting more and more specialized and it is difficult to judge whether, say, we should trust homeopathy, fund a mission to Mars, or install solar panels on our roofs. We are confronted with questions about causality on an everyday basis, as well as in science and in policy.

Causality has been a headache for scholars since ancient times. The oldest extensive writings may have been those of Aristotle, who made causality a central part of his worldview. Then we jump 2,000 years until causality again became a prominent topic with Hume, who was a skeptic in the sense that he believed we cannot think of causal relationships as logically necessary, nor can we establish them with certainty.

The next major philosophical figure after Hume was probably David Lewis, who proposed quite a controversial account saying, roughly, that something was a cause of an effect in this world if, in other nearby possible worlds where that cause didn’t happen, the effect didn’t happen either. From there we come to current work in computer science, originated by Judea Pearl and by Spirtes, Glymour, and Scheines and their collaborators.

All of this is highly theoretical and formal. Can we reconstruct philosophical theorizing about causality in the sciences in simpler terms than this? Sure we can!

One way is to start from scientific practice. Even though scientists often don’t talk explicitly about causality, it is there. Causality is an integral part of the scientific enterprise. Scientists don’t worry too much about what causality is – a chiefly metaphysical question – but are instead concerned with a number of activities that, one way or another, bear on causal notions. These are what we call the five scientific problems of causality:

Phrenology: causality, mirthfulness, and time. Photo by Stuart, CC-BY-NC-ND-2.0 via Flickr.
  • Inference: Does C cause E? To what extent?
  • Explanation: How does C cause or prevent E?
  • Prediction: What can we expect if C does (or does not) occur?
  • Control: What factors should we hold fixed to understand better the relation between C and E? More generally, how do we control the world or an experimental setting?
  • Reasoning: What considerations enter into establishing whether/how/to what extent C causes E?
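The gap between the first two problems – finding that C and E go together versus establishing that C produces E – can be made concrete with a toy simulation (my own illustration, not from the post; the variables C, E, and the hidden common cause Z are hypothetical). Observation shows a strong association between C and E, yet intervening on C reveals no causal effect at all:

```python
import random

random.seed(0)

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

N = 10_000

# Observational world: a hidden common cause Z drives both C and E,
# but C has no causal influence on E whatsoever.
Z = [random.gauss(0, 1) for _ in range(N)]
C_obs = [z + random.gauss(0, 0.5) for z in Z]
E_obs = [z + random.gauss(0, 0.5) for z in Z]

# Interventional world: we set C by fiat (randomly), severing its
# link to Z; E is generated exactly as before.
C_do = [random.gauss(0, 1) for _ in range(N)]
E_do = [z + random.gauss(0, 0.5) for z in Z]

print(round(pearson(C_obs, E_obs), 2))  # strong association
print(round(pearson(C_do, E_do), 2))    # association vanishes
```

The intervention plays the role of Pearl’s do-operator: by fixing C independently of everything else, it breaks the confounding path through Z, which is why randomized experiments can answer the inference question where passive observation cannot.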

This does not mean that metaphysical questions cease to be interesting. Quite the contrary! But by engaging with scientific practice, we can work towards a timely and solid philosophy of causality.

The traditional philosophical treatment of causality is to give a single conceptualization, an account of the concept of causality, which may also tell us what causality in the world is, and may then help us understand causal methods and scientific questions.

Our aim, instead, is to focus on the scientific questions, bearing in mind that there are five of them, and build a more pluralist view of causality, enriched by attention to the diversity of scientific practices. We think that many existing approaches to causality, such as mechanism, manipulationism, inferentialism, capacities and processes can be used together, as tiles in a causal mosaic that can be created to help you assess, develop, and criticize a scientific endeavour.

In this spirit we are attempting to develop, in collaboration, complementary ideas of causality as information (Illari) and variation (Russo). The idea is that we can conceptualize in general terms the causal linking or production of effect by the cause as the transmission of information between cause and effect (following Salmon); while variation is the most general conceptualization of the patterns of difference-making we can detect in populations where a cause is acting (following Mill). The thought is that we can use these complementary ideas to address the scientific problems.

For example, we can think about how we use complementary evidence in causal inference, tracking information transmission, and combining that with studies of variation in populations. Alternatively, we can think about how measuring variation may help us formulate policy decisions, as might seeking to block possible avenues of information transmission. Having both concepts available assists in describing this, and reasoning well – and they will also be combined with other concepts that have been made more precise in the philosophical literature, such as capacities and mechanisms.

Ultimately, the hope is that sharpening up the reasoning will assist in the conceptual enterprise that lies at the intersection of philosophy and science. And help decide whether to encourage sport, mobile phones, homeopathy and solar panels aboard the mission to Mars!

The post Why causality now? appeared first on OUPblog.

0 Comments on Why causality now? as of 1/18/2015 5:27:00 AM
5. What is it like to be depressed?

How are we to understand experiences of depression? First of all, it is important to be clear about what the problem consists of. If we don’t know what depression is like, why can’t we just ask someone who’s depressed? And, if we want others to know what our own experience of depression is like, why can’t we just tell them? In fact, most autobiographical accounts of depression state that the experience or some central aspect of it is difficult or even impossible to describe. Depression is not simply a matter of the intensification of certain familiar aspects of experience and the diminution of others, such as feeling more sad and less happy, or more tired and less energetic. First-person accounts of depression indicate that it involves something quite alien to what — for most people — is mundane, everyday experience. The depressed person finds herself in a ‘different world’, an isolated, alien realm, adrift from social reality. There is a radical departure from ‘everyday experience’, and this is not a localized experience that the person has within a pre-given world; it encompasses every aspect of her experience and thought – it is the shape of her world. It is the ‘world’ of depression that people so often struggle to convey.

My approach involves extracting insights from the phenomenological tradition of philosophy and applying them to the task of understanding depression experiences. That tradition includes philosophers such as Edmund Husserl, Edith Stein, Martin Heidegger, Maurice Merleau-Ponty and Jean-Paul Sartre, all of whom engage in ‘phenomenological’ reflection – that is, reflection upon the structure of human experience. Why turn to phenomenology? Well, these philosophers all claim that human experience incorporates something that is overlooked by most of those who have tried to describe it — what we might call a sense of ‘belonging to’ or ‘finding oneself in’ a world. This is something so deeply engrained, so fundamental to our lives, that it is generally overlooked. Whenever I reflect upon my experience of a chair, a table, a sound, an itch or a taste, and whenever I contrast my experience with yours, I continue to presuppose a world in which we are both situated, a shared realm in which it is possible to encounter things like chairs and to experience things like itches. This sense of being rooted in an interpersonal world does not involve perceiving a (very big) object or believing that some object exists. It’s something that is already in place when we do that, and therefore something that we seldom reflect upon.

The Concern by Alex Proimos. CC BY 2.0 via Wikimedia Commons

Depression, I suggest, involves a shift in one’s sense of belonging to the world. We can further understand the nature of this once we acknowledge the role that possibilities play in our experience. When I get up in the morning, feel very tired, stop at a café on the way to work, and then look at a cup of coffee sitting in front of me, what do I ‘experience’? On one account, what I ‘see’ is just what is ‘present’, an object of a certain type. But it’s important to recognize that my experience of the cup is also permeated by possibilities of various kinds. I see it as something that I could drink from, as something that is practically accessible and practically significant. Indeed, it appears more than just significant – it is immediately enticing. Rather than, ‘you could drink me’, it says ‘drink me now’. Many aspects of our situation appear significant to us in some way or other, meaning that they harbor the potentiality for change of a kind that matters. We can better appreciate what experiences of depression consist of once we construe them in terms of shifts in the kinds of possibility that the person has access to. Whereas the non-depressed person might find one thing practically significant and another thing not significant, the depressed person might be unable to find anything practically significant. It is not that she doesn’t find anything significant, but that she cannot. And the absence is very much there, part of the experience – something is missing, painfully lacking, and nothing appears quite as it should do. In fact, many first-person accounts of depression explicitly refer to a loss of possibility. Here are some representative responses to a questionnaire study that I conducted with colleagues two years ago, with help from the mental health charity SANE:

“I remember a time when I was very young – 6 or less years old. The world seemed so large and full of possibilities. It seemed brighter and prettier. Now I feel that the world is small. That I could go anywhere and do anything and nothing for me would change.”

“It is impossible to feel that things will ever be different (even though I know I have been depressed before and come out of it). This feeling means I don’t care about anything. I feel like nothing is worth anything.”

“The world holds no possibilities for me when I’m depressed. Every avenue I consider exploring seems shut off.”

“When I’m not depressed, other possibilities exist. Maybe I won’t fail, maybe life isn’t completely pointless, maybe they do care about me, maybe I do have some good qualities. When depressed, these possibilities simply do not exist.”

By emphasizing the experience of possibility, we can understand a great deal. Suppose the depressed person inhabits an experiential world from which the possibility of anything ever changing for the better is absent; nothing offers the potential for positive change and nothing draws the person in, solicits action. This lack permeates every aspect of her experience. Her situation seems strangely timeless, as no future could differ from the present in any consequential way. Action seems difficult, impossible or futile, because there is no sense of any possibility for significant change. Her body feels somehow heavy and inert, as it is not drawn in by situations, solicited to act. She is cut off from other people, who no longer offer the possibility of significant kinds of interpersonal connection. Others might seem somehow elsewhere, far away, given that they are immersed in shared goal-directed activities that no longer appear as intelligible possibilities for the depressed person. We can thus see how the kind of ‘hopelessness’ or ‘despair’ that is central to so many experiences of depression differs in important respects from more mundane feelings that might be described in similar ways. I might lose hope in a certain project, but I retain the capacity for hope — I can still hope for other things. Some depression experiences, in contrast, involve erosion of the capacity for hope. There is no sense that anything of worth could be achieved or that anything good could ever happen — the attitude of hope has ceased to be intelligible; the person cannot hope.

Of course, it should also be conceded that depression is a heterogeneous, complicated, multi-faceted phenomenon; no single approach or perspective will yield a comprehensive understanding. Even so, I think phenomenological research has an important role to play in solving a major part of the puzzle, thus feeding into a broader understanding of depression and informing our response to it.

Heading image: Depression. Public Domain via Pixabay.

The post What is it like to be depressed? appeared first on OUPblog.

0 Comments on What is it like to be depressed? as of 1/17/2015 6:41:00 AM
6. I am Henry Finch by Alexis Deacon and Viviane Schwarz

Are you ever too old for a picture book?

Walk into a bookshop, and you’ll rarely find a picture book on the shelves labelled 5-8, 9-12 or Teenage/Young Adult (the age bandings used in the most widespread chain of bookshops in the UK), implicitly telling buyers that picture books are only for those under 5.

But what if you have a picture book about Descartes’s philosophical statement “Je pense donc je suis” – or, to put it another way, “Cogito ergo sum”: “I think, therefore I am”?

A book which explores not only learning to listen to yourself and to trust your own instincts, but also what it feels like when you think you have failed, and how to fight against the dark thoughts that then crowd in.

Gosh, if only we all knew everything we needed to know about these issues by the time we were five! Wouldn’t life be much simpler?

I am Henry Finch, written by Alexis Deacon and illustrated by Viviane Schwarz, is a new picture book which makes readers and listeners think about every one of these big concepts and more. It’s about being brave, about being independent, about feeling secure enough to not follow the crowd (though also being happy to be part of a community).

It’s also about totally adorable little birds and one terribly monstrous beast who wants to eat them all up.

Henry is just one of a huge flock of finches. They make a racket all day long, doing the same as each other over and again but one day Henry starts thinking for himself. He starts to have his own dreams, his own vision of who he could be, independent from the community he’s grown up in.

Alexis Deacon has written (although not specifically about Henry Finch):

“It seems to me that if every character in your story is entirely on message and engaged with the world you have created it can be very off-putting for the reader. I find that I am drawn to stories where not every character follows the grain: Reluctant characters, perverse characters, selfish characters, irreverent characters. They are often the catalysts for action too.”

And Henry Finch does indeed go against the grain, doing things differently to those around him, daring to be different. But he’s not selfish. In fact, his ability to think for himself gives him the courage to tackle the monster who threatens his family and friends.

Danger, doubt and darkness beset Henry, but he survives and shares what he has learned with his fellow finches, sparking a cascade of individual ideas and wishes as they each set off to explore the world, though not before reassuring each other that “We will come back“; the finches are thinking for themselves, but individuality doesn’t have to lead to the destruction of their community.


Deacon’s story is full of food for thought, opportunities for discussion and debate, whether you’re 4 or 40 or more. The meaty issues explored never become overwhelming, not least because Viviane Schwarz’s illustrations bring so much humour, delight and simplicity into the story.

The use of fingerprints to illustrate a narrative about what it means to be an individual is a stroke of genius; is there a more powerful symbol of individual human identity than the imprint left by the small ridges on the tips of our fingers? They also bring massive child appeal; mucky fingerprints on walls and furniture are unavoidable aspects of life with children, and so there is nothing like these marks to proudly proclaim, “Hey, I’m here, me, this child, and I can make a mark on the world around me!”.


I really like how Schwarz sometimes brings her real life community into her artwork. In her graphic novel The Sleepwalkers there are crowd scenes filled with real people she knows, and in I am Henry Finch, she’s included fingerprints from friends as well as her own. The joy she’s had in creating these images can be seen in the hugely expressive faces and wings of the finches, and that seeped into us: we just had to make our own flock of finches using the same technique.

We started out with inkpads, paper and lots of messy fingerprints…


…but soon we were experimenting with other sorts of prints too…


Then we added beaks and wings…


And soon we had our very own chattering of finches:


One or two elephants interloped! (These were made from prints using the side of our fists – click here to see what Viv Schwarz created with similar prints.)


These finches were born from toe-prints, whilst the beasts were heel-prints:


They just kept on coming, causing havoc, and just getting on with doing their own thing.


Whilst fingerprinting and making our own flock of birds we listened to:

  • Fingerprints by I Am Kloot
  • All Around the World or the Myth of Fingerprints by Paul Simon with Los Lobos
  • Fingerprints by Patsy Cline

Other activities which could work well alongside reading I am Henry Finch include:

  • Going to hear Alexis talk about this book at Discover (in London) on March 8.
  • Making up your own body organs, from watercolour blobs. You’ll see both why this is relevant and how you could do it if you check out this post from Viviane Schwarz.
  • Learning how to dust for fingerprints, using these helpful (teacher/technician/student) notes from Creative Chemistry.
I’ve more philosophy in the form of illustrated books coming up soon on the blog, with offerings from the Netherlands and Spain. What are your favourite picture books which deal with the big issues in life?

    Disclosure: I received a free review copy of I am Henry Finch from the publishers.

    4 Comments on I am Henry Finch by Alexis Deacon and Viviane Schwarz, last added: 1/14/2015
    7. Accusation breeds guilt

    One of the central tasks when reading a mystery novel (or sitting on a jury) is figuring out which of the characters are trustworthy. Someone guilty will of course say they aren’t guilty, just like the innocent – the real question in these situations is whether we believe them.

    The guilty party – let’s call her Annette – can try to convince us of her trustworthiness by only saying things that are true, insofar as such truthfulness doesn’t incriminate her (the old adage of making one’s lies as close to the truth as possible applies here). But this is not the only strategy available. In addition, Annette can attempt to deflect suspicion away from herself by questioning the trustworthiness of others – in short, she can say something like:

    “I’m not a liar, Betty is!”

    However, accusations of untrustworthiness of this sort are peculiar. The point of Annette’s pronouncement is to affirm her innocence, but such protestations rarely increase our overall level of trust. Either we don’t believe Annette, in which case our trust in Annette is likely to drop (without affecting how much we trust Betty), or we do believe Annette, in which case our trust in Betty is likely to decrease (without necessarily increasing our overall trust in Annette).

    Thus, accusations of untrustworthiness tend to decrease the overall level of trust we place in those involved. But is this reflective of an actual increase in the number of lies told? In other words, does the logic of such accusations make it the case that, the higher the number of accusations, the higher the number of characters that must be lying?

    Consider a group of people G, and imagine that, simultaneously, each person in the group accuses one, some, or all of the other people in the group of lying right at this minute. For example, if our group consists of three people:

    G = {Annette, Betty, Charlotte}

    then Betty can make one of three distinct accusations:

    Scales of justice, photo by Michael Coghlan CC-BY-SA-2.0 via Flickr

    “Annette is lying.”

    “Charlotte is lying.”

    “Both Annette and Charlotte are lying.”

    Likewise, Annette and Charlotte each have three choices regarding their accusations. We can then ask which members of the group could be, or which must be, telling the truth, and which could be, or which must be, lying by examining the logical relations between the accusations made by each member of the group. For example, if Annette accuses both Betty and Charlotte of lying, then either (i) Annette is telling the truth, in which case both Betty and Charlotte’s accusations must be false, or (ii) Annette is lying, in which case either Betty is telling the truth or Charlotte is telling the truth (or both).

    This set-up allows for cases that are paradoxical. If:

    Annette says “Betty is lying.”

    Betty says “Charlotte is lying.”

    Charlotte says “Annette is lying.”

    then there is no coherent way to assign the labels “liar” and “truth-teller” to the three. Since we are here interested in investigating results regarding how many lies are told (rather than scenarios in which the notion of lying versus telling the truth breaks down), we shall restrict our attention to those groups, and their accusations, that are not paradoxical.
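    This consistency check is easy to automate. Here is a minimal sketch in Python (the representation and function name are illustrative, not from the post): a person’s statement is true exactly when everyone they accuse is lying, so a labeling of the group is consistent exactly when each person is labeled truthful precisely in that case.

```python
from itertools import product

def consistent_assignments(accusations):
    """Return every liar/truth-teller labeling consistent with the accusations.

    `accusations` maps each person to the set of people they accuse of lying.
    A labeling is consistent when each person is labeled truthful exactly
    when everyone they accuse is labeled a liar.
    """
    people = sorted(accusations)
    solutions = []
    for labels in product([True, False], repeat=len(people)):
        truthful = dict(zip(people, labels))
        if all(truthful[p] == all(not truthful[q] for q in accusations[p])
               for p in people):
            solutions.append(truthful)
    return solutions

# The circular case from the text: Annette -> Betty -> Charlotte -> Annette.
cycle = {"Annette": {"Betty"}, "Betty": {"Charlotte"}, "Charlotte": {"Annette"}}
print(consistent_assignments(cycle))  # -> [], i.e. paradoxical
```

    Brute force suffices here: a group of n people has only 2^n labelings to check.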

    The following are two simple results that constrain the number of liars, and the number of truth-tellers, in any such group (I’ll provide proofs of these results in the comments after a few days).


    Result 1: If, for some number m, each person in the group accuses at least m other people in the group of lying (and there is no paradox) then there are at least m liars in the group.

    Result 2: If, for any two people in the group p1 and p2, either p1 accuses p2 of lying, or p2 accuses p1 of lying (and there is no paradox), then exactly one person in the group is telling the truth, and everyone else is lying.

    These results support an affirmative answer to our question: Given a group of people, the more accusations of untrustworthiness (i.e., of lying) are made, the higher the minimum number of people in the group that must be lying. If there are enough accusations to guarantee that each person accuses at least n people, then there are at least n liars, and if there are enough to guarantee that there is an accusation between each pair of people, then all but one person is lying. (Exercise for the reader: show that there is no situation of this sort where everyone is lying).
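    Both results, and the reader’s exercise, can be verified exhaustively for a small group. The sketch below (illustrative, not from the post) enumerates all 27 accusation profiles for a three-person group, discards the paradoxical ones, and asserts Results 1 and 2 on the rest.

```python
from itertools import product, combinations

def consistent_assignments(accusations):
    """Labelings where each person is truthful exactly when all they accuse lie."""
    people = sorted(accusations)
    solutions = []
    for labels in product([True, False], repeat=len(people)):
        truthful = dict(zip(people, labels))
        if all(truthful[p] == all(not truthful[q] for q in accusations[p])
               for p in people):
            solutions.append(truthful)
    return solutions

people = ["Annette", "Betty", "Charlotte"]
# Each person picks a nonempty subset of the others to accuse (3 options each).
options = {p: [set(c)
               for r in (1, 2)
               for c in combinations([q for q in people if q != p], r)]
           for p in people}

for profile in product(*(options[p] for p in people)):
    accusations = dict(zip(people, profile))
    solutions = consistent_assignments(accusations)
    if not solutions:
        continue  # paradoxical profiles are excluded, as in the text
    m = min(len(accused) for accused in accusations.values())
    min_liars = min(sum(not t for t in sol.values()) for sol in solutions)
    assert min_liars >= m  # Result 1: at least m liars
    # Exercise: no consistent labeling makes everyone a liar.
    assert all(sum(sol.values()) >= 1 for sol in solutions)
    if all(q in accusations[p] or p in accusations[q]
           for p, q in combinations(people, 2)):
        # Result 2: exactly one truth-teller in every consistent labeling
        assert all(sum(sol.values()) == 1 for sol in solutions)
print("Results 1 and 2 hold for all non-paradoxical three-person profiles")
```

    The same search scales to slightly larger groups, though the number of profiles grows quickly with group size.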

    Of course, the set-up just examined is extremely simple, and rather artificial. Conversations (or mystery novels, or court cases, etc.) in real life develop over time, involve all sorts of claims other than accusations, and can involve accusations of many different forms not included above, including:

    “Everything Annette says is a lie!”

    “Betty said something false yesterday!”

    “What Charlotte is about to say is a lie!”

    Nevertheless, with a bit more work (which I won’t do here) we can show that, the more accusations of untrustworthiness are made in a particular situation, the more of the claims made in that situation must be lies (of course, the details will depend both on the number of accusations and the kind of accusations). Thus, it’s as the title says: accusation breeds guilt!

    Note: The inspiration for this blog post, as well as the phrase “Accusation breeds guilt”, comes from a brief discussion of this phenomenon – in particular, of ‘Result 2’ above – in ‘Propositional Discourse Logic’, by S. Dyrkolbotn & M. Walicki, Synthese 191: 863–899.

    The post Accusation breeds guilt appeared first on OUPblog.

    0 Comments on Accusation breeds guilt as of 1/11/2015 4:38:00 AM
    8. Jill Maxick of Prometheus Books: The Powells.com Interview

    For decades, Prometheus Books has put out titles we both love and respect. Prometheus is the leading publisher in the United States of books on free thought, humanism, and atheism — as well as many more titles that serve to fire up the human mind. In fact, that almost seems to be the sole reason [...]

    0 Comments on Jill Maxick of Prometheus Books: The Powells.com Interview as of 1/9/2015 8:56:00 PM
    9. Speak of the Devil: Satan in imaginative literature

    Al Pacino is John Milton. Not John Milton the writer of Paradise Lost, although that is the obvious in-joke of the movie The Devil’s Advocate (1997). No, this John Milton is an attorney and — in what thus might be another obvious in-joke — he is also Satan, the Prince of Darkness. In the movie, he hires a fine young defense attorney, Kevin Lomax (Keanu Reeves), and offers him an escalating set of heinous — and high-profile — cases to try, a set of ever-growing temptations if you will. What will happen to Kevin in the trials to come?

    The Devil is a terrifying foe in this film, which should not surprise us. The poet Rainer Maria Rilke wrote in the Duino Elegies that “Every angel is terrifying.” We sometimes forget that our devils were angels first. Tales of angels fallen from goodness particularly bother us, and Satan’s rebellion is supposed to have inspired the most terrible of conflicts. In The Prophecy (1995), Simon (Eric Stoltz) describes the conflict in Heaven and its consequences: “I remember the First War, the way the sky burned, the faces of angels destroyed. I saw a third of Heaven’s legion banished and the creation of Hell. I stood with my brothers and watched Lucifer Fall.”

    The Doctor Who episode “The Satan Pit” (2006) also retells the story of this conflict. The Doctor (David Tennant) encounters The Beast (voiced by Gabriel Woolf) deep within a planet. The Beast tells The Doctor that he comes from a time “Before time and light and space and matter. Before the cataclysm. Before this universe was created.” In this time before Creation, The Beast was defeated in battle by Good and thrown into the pit, an origin that clearly matches that of the Satan whose legend he is said to have inspired: “The Disciples of the Light rose up against me and chained me in the pit for all eternity.”

    Satan, photo by Adrian Scottow, CC by 2.0 via Flickr

    A majority of Americans believe in Satan, a personified cosmic force of evil, but why? The Hebrew and Christian testaments say almost nothing about the Devil. As with Heaven, Hell, Purgatory, angels, and other topics related to the afterlife, most of what we know — or believe we know — about Satan comes from human imagination, not from holy scripture.

    We have used stories, music, and art to flesh out the scant references to the Devil in the Bible. We find Satan personified in medieval mystery plays and William Langland’s Piers Plowman (ca. 1367), and described in horrifying—and heartbreaking—detail in Dante’s Inferno: “If he was fair as he is hideous now, / and raised his brow in scorn of his creator, / he is fit to be the source of every sorrow.” (Inferno 34.34-36)  We find the Devil represented in the art of Gustave Dore and William Blake, and in our own time, represented graphically in the comics The Sandman, Lucifer, and disguised as “The First of the Fallen” in Hellblazer. We watch Satan prowling the crowds for the entirety of Mel Gibson’s The Passion of the Christ (2004), and arriving for an earthly visit at the end of Constantine (2005).

    And we are terrified. Like him or not, the Devil is the greatest villain of all time. Who else stands for every quality and condition that we claim to despise? Who else helps us to understand why the world contains evil — and why we are ourselves sometimes inclined toward it?

    The Inferno Canto 22, Gustave Dore, Public Domain via WikiArt

    We also work out these questions through characters who are not explicitly Satan, but who embody supernatural or preternatural evil. If writers and artists can be said to create “Christ figures,” then it makes sense that they might also create “Satan figures.” Professor Weston in C.S. Lewis’s Space Trilogy, Sauron in The Lord of the Rings trilogy of books and films, Darkseid (the ruler of the hellish planet Apokolips in DC Comics), Lord Voldemort (The Dark Lord of the Harry Potter mythos), and Thomas Harris’s Hannibal Lecter all fit this profile. Such characters — dark, scheming, and because of their tremendous capacity for evil, all but all-powerful — may tell us as much about evil as our stories of Satan do. In fact, Mads Mikkelsen, who plays Lecter in the television series Hannibal, makes that comparison explicit:

    “I believe that Hannibal Lecter is as close as you can come to the devil, to Satan. He’s the fallen angel. His motives are not banal reasons, like childhood abuse or junkie parents. It’s in his genes. He finds life is most beautiful on the threshold to death, and that is something that is much closer to the fallen angel than it is to a psychopath. He’s much more than a psychopath, and there is a fascination for us.”

    In our consumption of narratives and images of the Devil, we are trying to work out what — if anything — the devil means. Even if we don’t believe in an actual fallen angel who rules this world and contends with God, most of us have come to accept that Satan is an emotionally-satisfying explanation for all that goes wrong in real life. The stories in which Satan chills us prove this beyond doubt. What could be more frightening than Al Pacino’s John Milton plotting the destruction of our hero in The Devil’s Advocate, his schemes only moments away from coming to fruition?

    Evil is real, and has real power. We see that in the daily headlines and history books, in our own lives and even in ourselves. To find out where that evil comes from — to understand why human beings do things that are so clearly wrong — perhaps we do need to wrestle with the Devil, even if the only way we encounter him is as a character in a story.

    The post Speak of the Devil: Satan in imaginative literature appeared first on OUPblog.

    0 Comments on Speak of the Devil: Satan in imaginative literature as of 1/7/2015 11:49:00 PM
    10. Atheism: Above all a moral issue

    The New Atheists – Richard Dawkins, Sam Harris, Dan Dennett, and the late Christopher Hitchens – are not particularly comfortable people. The fallacies in their arguments beg to be used in classes on informal reasoning. The narrowness of their perspectives is remarkable even by the standards of modern academia. Their prejudices against those of other cultures would be breathtaking even in the era when Britannia ruled the waves. But there is a moral fervor unknown outside the pages of the Old Testament. And for this, we can forgive much.

    Atheism is not just a matter of the facts – does God exist or not? It is as much, if not more, a moral matter. Does one have the right to believe in the existence of God? If one does, what does this mean morally and socially? If one does not, what does this mean morally and socially?

    Now you might say that there has to be something wrong here. Does one have the right to believe that 2+2=4? Does one have the right to believe that the moon is made of green cheese? Does one have the right to believe that theft is always wrong? Belief or non-belief in matters such as these is not a moral issue. Even though it may be that how you decide is a moral issue or something with moral implications. How should one discriminate between a mother stealing for her children and a professional burglar after diamonds that he will at once pass on to a fence?

    But the God question is rather different, because, say what you like, it is nigh impossible to be absolutely certain one way or the other. Even Richard Dawkins admits that although he is ninety-nine point many nines certain that there is no god, to quote one of the best lines of that I-hope-not-entirely-forgotten revue, Beyond the Fringe, there is always that little bit in the bottom that you cannot get out. There could be some kind of deity of a totally unimaginable kind. As the geneticist J. B. S. Haldane used to say: “My own suspicion is that the Universe is not only queerer than we suppose, but queerer than we can suppose.”

    (Clockwise from top left) Richard Dawkins, Christopher Hitchens, Daniel Dennett, and Sam Harris. “Four Horsemen” by DIREKTOR. Licensed under CC BY-SA 4.0 via Wikimedia Commons.

    So in some ultimate sense the God question is up for grabs, and how you decide is a moral issue. As the nineteenth-century English philosopher, William Kingdom Clifford, used to say, you should not believe anything except on good evidence. But the problem here is precisely what is good evidence – faith, empirical facts, arguments, or what? Decent, thoughtful people differ over these and before long it is no longer a simple matter of true or false, but of what you believe and why; whether you should or should not believe on this basis; and what are going to be the implications of your beliefs, not only on your own life and behavior but also on the lives and behaviors of other people.

    If you go back to Ancient Greece, you find that above all it is the moral and social implications of non-belief that worried people like Plato. In the Laws, indeed, he prescribed truly horrendous restrictions on those who failed to fall in line – and this from a man who himself had very iffy views about the traditional Greek views on the gods and their shenanigans. You are going to be locked up for the rest of your life and receive your food only at the hands of slaves and when you die you are going to be chucked out, unburied, beyond the boundaries of the state.

    Not that this stopped people from bringing up a host of arguments against God and gods, whether or not they thought that there truly is nothing beyond this world. Folk felt it their duty to show the implausibility of god-belief, however uncomfortable the consequences. And this moral fervor, either in favor or against the existence of a god or gods, continues right down through the ages to the present. Before Dawkins, in England in the twentieth century the most famous atheist was the philosopher Bertrand Russell. His moral indignation against Christianity in particular – How dare a bunch of old men in skirts dictate the lives of the rest of us? — shines out from every page. And so it is to the present. No doubt, as he intended, many were shocked when, on being asked in Ireland about sexual abuse by priests, Richard Dawkins said that he thought an even greater abuse was bringing a child up Catholic in the first place. He is far from the first to think in this particular way.

    Believers think they have found the truth and the way. Non-believers are a lot less sure. What joins even – especially – the most ardent of partisans is the belief that this is not simply a matter of true and false. It is a matter of right and wrong. Abortion, gay marriage, civil rights – all of these thorny issues and more are moral and social issues at the heart of our lives and what you believe about God is going to influence how you decide. Atheism, for or against, matters morally.

    Featured image credit: “Sky clouds” by 12345danNL. CC0 via Wikimedia Commons.

    The post Atheism: Above all a moral issue appeared first on OUPblog.

    0 Comments on Atheism: Above all a moral issue as of 1/7/2015 5:01:00 AM
    11. Nine pieces of thought-provoking philosophy

    Despite what some may believe, philosophy remains prevalent and important in today’s society. It allows us to examine the most fundamental issues that we face as self-aware beings and apply them to a variety of different topics, from free will to politics to interpretation. To get you started on your own road of contemplation, we’ve compiled some thought-provoking ideas from David Edmonds and Nigel Warburton’s Philosophy Bites Again:

    Headline image credit: The Thinker. Photo by Dan McKay. CC BY 2.0 via Flickr. All slideshow background images CC0/public domain via Pixabay or CC BY 2.0 via Flickr.

    The post Nine pieces of thought-provoking philosophy appeared first on OUPblog.

    0 Comments on Nine pieces of thought-provoking philosophy as of 1/4/2015 3:17:00 AM
    12. Adderall and desperation

    “Butler Library smells like Adderall and desperation.”

    That note from a blogger at Columbia University isn’t exactly scientific. But it speaks to the atmosphere that settles in around exam time here, and at other competitive universities. For some portion of the students whose exams I’m grading this week, study drugs, stimulants, and cognitive enhancement are as much a part of finals as all-nighters and bluebooks. Exactly how many completed exams are coming to me via Adderall or Provigil is impossible to pin down. But we do know that studies have found past-year, nonprescribed stimulant use rates as high as 35% among students. We know, according to HHS, that full-time students use nonprescribed Adderall at twice the rate of non-students. We can suspect, too, that academics aren’t so different in this regard from their students. In an unscientific poll, 20% of the readers of Nature acknowledged off-label use of cognitive enhancement drugs (CEDs).

    If this sounds like the windup to a drug-panic piece, it’s not. The use of cognitive enhancement drugs concerns me much less than the silence surrounding their use. At universities like Columbia, cognitive enhancement exists in something of an ethical gray zone: technically against rules that are mostly unenforced; an open conversation topic among students in the library at 2 a.m., but a blank spot in “official” academic culture. That blank in itself is worth our concern. CEDs aren’t going away–but more openness about their use could teach us something valuable about the kind of work we do here, and anywhere else focus-boosting pills are popped.

    In fact, much of the anti-cognitive enhancement drug literature dwells on the ethics of work, on the question of how much credit we can and should take for our “enhanced” accomplishments. (In focusing on these arguments, I’m setting to one side any health concerns raised by off-label drug use. I’m doing that not because those concerns are unimportant, but because the most challenging bioethics writing on the topic is less about one drug or another than about the promises and limits of cognitive enhancement in general–up to and including drugs that haven’t been invented yet.) In Beyond Therapy, the influential 2003 report on enhancement technologies from the President’s Council on Bioethics, the central argument against CED use had to do with the kind of work we can honestly claim as our own: “The attainment of [excellence] by means of drugs…looks to many people (including some Members of this Council) to be ‘cheating’ or ‘cheap.’” Work done under the influence of CEDs “seems less real, less one’s own, less worthy of our admiration.”

    Is that a persuasive argument for keeping cognitive enhancement drug use in the closet, or even for taking stronger steps to ban it on campus? I’m not so sure it is. This kind of anti-enhancement case rests on an assumption about authorship, which I call the individual view. It claims that the dignity and authenticity of our accomplishments lie largely in our ability to claim individual credit for our work. In a word, it’s producer-focused, not product-focused.

    That’s a reasonable way to think about authorship–but much of the weight of the anti-cognitive enhancement drug case rests on the presumption that it’s the only way to think about authorship. In fact, there’s another view that’s just as viable: call it the collaborative view. It’s an impersonal way of seeing accomplishment; it’s a product-focused view; it’s less concerned with allocating ownership of our accomplishments and it’s less likely to emphasize originality as the most important mark of quality. It is founded on the understanding that all work, even the most seemingly original, is subject to influences and takes place in a social context.

    You can’t tell the history of accomplishments in the arts and sciences without considering those who thought about their work in this way. We can see it in the “thefts” of content that led passages from Plutarch, via Shakespeare, to T.S. Eliot’s poetry, or in the constant musical borrowing that shapes jazz or blues or classical music. We can see it in the medieval architects and writers who, as C.S. Lewis observed, practiced a kind of “shared authorship,” layering changes one on top of the other until they produced cathedrals or manuscripts that are the product of dozens of anonymous hands. We can see it again in the words of writers like Mark Twain, who forcefully argued that “substantially all ideas are second hand,” or Eliot, who advised critics that “to divert interest from the poet to the poetry is a laudable aim.” We can even see it in the history of our language. Consider the evolution of words like genius (from the classical idea of a guardian spirit, to a special ability, to a talented person himself or herself), invent (from a literal meaning of “to find” to a secondary meaning of “to create”), and talent (from a valuable coin to an internal gift). As Owen Barfield has argued, these changes are marks of the way our understanding of accomplishment has become “internalized.” Where earlier writers tended to imagine inspiration as a process that happens from without, we’re more likely to see it as something that happens from within.

    The collaborative view is valuable even for those of us who aren’t, say, producing historically-great art. It might relieve us of the anxiety that the work we produce is a commentary on our personal worth. It’s well-tailored to the creative borrowing and sampling that define the “remix culture” celebrated by writers like Lawrence Lessig. And it is, I think, a tonic against the kind of “callous meritocracy” that John Rawls cogently warned us about.

    Female college student stressed and overwhelmed and trying to study at the school library. © Antonio_Diaz via iStock.

    That’s not to suggest that the collaborative view is the one true perspective on accomplishment. I’d call it one of a range of possible emphases that have struggled or prospered with the times. But if that’s the case, then we’re free to think more critically about the view of work we want to emphasize at any given time.

    What does any of this have to do with cognitive enhancement? The collaborative view I’ve outlined and a culture of open cognitive enhancement share some important links. It’s certainly not true that one has to use CEDs to take that view, but there are strong reasons why an honest and thoughtful CED user ought to do so.

    Consider the case of a journalist like David Plotz, who kept a running diary of his two-day experiment with Provigil: “Today I am the picture of vivacity. I am working about twice as fast as usual. I have a desperate urge to write…. These have been the two most productive days I’ve had in years.”

    How might such a writer account for the boost in his performance? Would he chalk it up to his inherent skill or effort, or to the temporary influence of a drug? If someone singled out his enhanced work for praise, would he be right in taking all the credit for himself and leaving none for the enhancement?

    I don’t think he would be. There is a dishonesty in failing to acknowledge the enhancement, because that failure willingly creates a false assumption: it allows us to believe that the marginal improvement in performance reflects on the writer’s efforts, growing skill, or some other personal quality, when the truth seems to be otherwise. In other words, I don’t think enhancement is dishonest in itself–it’s failing to acknowledge enhancement that’s dishonest.

    There’s nothing objectionable in collaborative work, forthrightly acknowledged. When we take an impersonal view of our work, we share credit and openly recognize our influences. And we can take a similar attitude to work done under the influence of cognitive enhancement drugs. When we speak of creative influences and working “under the influence” of CEDs, I think we’re exposing a similarity that runs deeper than a pun. Of course, one does not literally “collaborate” with a drug. But whether we acknowledge influences that shape our work or acknowledge the influence of a drug that helped us accomplish that work by improving our performance, we are forgoing full, personal credit. We are directing observers toward the quality of the work, rather than toward what the work may say about our personal qualities. We are, in a sense, making less of a “property claim” on the work. Given the history of innovators who willingly made this more modest claim, and given the benefits of the collaborative view that I’ve discussed, I don’t think that’s such bad news.

    But could a culture of open cognitive enhancement drug use really one day change the way we think about work? There are no guarantees, to be sure. When I read first-person accounts of CED use, I’m struck by the way users perceive fast, temporary, and often surprising gains in focus, processing speed, and articulateness. With that strong subjective experience comes the experience of leaving, and returning to, an “unenhanced” state. The contrast seems visceral and difficult to overlook; the marginal gains in performance seem especially difficult to take credit for. The subjective experience of CED use looks like short-term growth in our abilities, arising from an external source, to which we cannot permanently lay claim. For just that reason, I have trouble agreeing with those, like Michael Sandel, who associate cognitive enhancement with “hubris.” Why not humility instead? Of course, I don’t claim that CEDs will inspire the same reflections in all of their users. It’s certainly possible to be unreflective about the implications of CED use. I only argue that it’s a little harder to be unreflective.

    But that reflectiveness, in turn, requires openness about the enhancement already going on. As long as students fear job-market ramifications for talking on the record about their cognitive enhancement drug use, I wouldn’t nominate them as martyrs to the cause. But why not start with professors and academics–with, say, those 20% of respondents to the Nature poll? What’s tenure for anyway?

    We simply can’t separate enhancement, of any kind, from the ends we ask of it and the work we do with it. So I sympathize with the New Yorker’s Margaret Talbot when she writes that “every era, it seems, has its own defining drug. Neuroenhancers are perfectly suited for the anxiety of white-collar competition in a floundering economy…. They facilitate a pinched, unromantic, grindingly efficient form of productivity.” Yet that’s giving the drug too much credit. I’d look instead to the culture that surrounds it. Our culture of cognitive enhancement is furtive, embarrassed, dedicated to one-upping one another on exams or on the tenure track. But a healthier culture of enhancement is conceivable, and it begins with a greater measure of honesty. Adderall and desperation don’t have to be synonymous, but as long as they are, I’d blame the desperation, not the drug.

    The post Adderall and desperation appeared first on OUPblog.

    0 Comments on Adderall and desperation as of 12/31/2014 10:51:00 AM
    13. Group belief

    Groups are often said to believe things. For instance, we talk about PETA believing that factory farms should be abolished, the Catholic Church believing that the Pope is infallible, and the U.S. government believing that people have the right to free speech. But how can we make sense of a group believing something?

    This is an important question, from both a theoretical and a practical point of view. If we don’t understand what group belief is, then we won’t be able to grasp what it means to say that a group knows or should have known something. This matters a great deal, given that belief, knowledge, and culpable ignorance are intimately connected to moral and legal responsibility.

    For instance, if the Bush Administration believed that Iraq did not have weapons of mass destruction, then not only did the Administration lie to the public in saying that it did, but it is also fully culpable for the hundreds of thousands of lives needlessly lost in the Iraq war. And if BP should have known that its Deepwater Horizon oil rig was in need of repairs, then it is responsible for the vast quantity of oil that spilled into the Gulf of Mexico.

    Despite the importance of this question, the topic of group belief has received surprisingly little attention in the philosophical literature. So far, the majority of those who have addressed it favor an inflationary approach where groups are treated as entities with “minds of their own.” That is to say, groups are something more than the mere collection of their members, and group belief is something more than their individual beliefs. This rather bold view is typically motivated by arguments that claim a group can be properly said to believe something even when not a single one of its members believes it. A classic example of this sort of case is where a group decides to let a view “stand” as what the group thinks, despite the fact that none of its members actually holds the view in question.

    For instance, suppose that the Philosophy Department at a university is deliberating about the final candidate to whom it will extend admission to its graduate program. After hours of discussion, all of the members jointly agree that Jane Smith is the most qualified candidate from the remaining pool of applicants. However, not a single member of the department actually believes this; instead, they all think that Jane Smith is the candidate who is most likely to be approved by the administration. Here, it is argued that the Philosophy Department itself believes that Jane Smith is the most qualified candidate for admission, even though none of the members holds this belief. This attribution of belief to the group is supported by looking at its actions: the group asserts that Jane Smith is the most qualified candidate, it defends this position in conversation with administrators, it heavily recruits her to join the department, and so on. Why does the group do all of this? The most natural explanation of how the group behaves is that it really does think Jane is the best candidate—and this can be true even if each group member would deny it individually.

    Berlin Street Scene by Ernst Ludwig Kirchner. Public domain via Wikimedia Commons.

    This argument has led some philosophers to say that a group’s believing something should be understood in terms of the members of the group intentionally and openly jointly accepting it, where it is possible to accept something without believing it. The Philosophy Department above, then, believes that Jane Smith is the most qualified candidate for admission, even though none of the members holds this belief, precisely because they jointly agree to let this position stand as the group’s.

    There is, however, what I take to be a decisive objection to this way of thinking about group belief. Groups lie, and they do so with some frequency: a cursory review of recent news pulls up stories about the lies of Halliburton, Enron, the Bush Administration, and various pharmaceutical companies. And no matter how we understand group lies, a minimum condition is that a group must state what it believes to be false.

    Here is a paradigmatic group lie, slightly fictionalized from a real case: Philip Morris, one of the largest tobacco companies in the world, is aware of the massive amounts of scientific evidence revealing not only the addictiveness of smoking, but also its links with lung cancer. While all of the members of the board of directors of the company believe this conclusion, they all openly decide and then jointly accept that, because of what is at stake financially, the official position of Philip Morris is that smoking is neither highly addictive nor detrimental to one's health. This claim is then published in all of the company's advertising materials and defended against objections.

    Herein lies the problem with the joint acceptance account: an adequate view of group belief should be able to tell the difference between a group's stating its belief and a group's lying. On the joint acceptance account, however, the actions of Philip Morris in the case above make it the case that the group believes that smoking is neither highly addictive nor detrimental to one's health. The relevant members of the company—namely, the board of directors—not only jointly accept this proposition, but also support it through their public statements, actions, planning, and so on. But surely the most natural way to think of what the company is doing is that it is lying about the health risks of smoking. Philip Morris says what it does, not because the company genuinely believes that smoking isn't dangerous, but because it wants to deceive others into believing this.

    Because the joint acceptance account confuses group belief with group lying, we should look for a new way to think about group belief.

    The post Group belief appeared first on OUPblog.

    0 Comments on Group belief as of 12/31/2014 10:48:00 AM
    14. Preparing for APA Eastern Meeting 2014

    Look out Philadelphia! Oxford University Press has been attending the American Philosophical Association (APA) Eastern Division Meeting for decades. The conference has been held in various cities including Baltimore, MD, Newark, DE, New York, NY, and Boston, MA. This year, we're gearing up to travel to Philadelphia on Saturday 27th December, and we've asked staff across various divisions what they are most looking forward to.

    Clare Cashen, Higher Education Marketing:
    I’m really looking forward to the APA this year. We, in the Higher Education division, publish the majority of our new books in the fall, and the Eastern meeting is the first time we get to display them all at once. It’s always fun to connect with instructors and share what we’ve been working on. I’m also looking forward to a good Philly cheesesteak and maybe a jog up the steps of the Philadelphia Art Museum!

    Joy Mizan, Marketing:
    This will be my first time attending a conference for Oxford University Press. I’m very excited to be representing the company! I’ll be managing the booth from set up to tear down, and it’ll be a very big job. I’m looking forward to putting faces to the names of authors that I’ve been working with. I’m also excited to see what other products the various exhibitors will have. On a personal note, I’m a big fan of Philly and can’t wait to visit it again. I love the historical sites and delicious (albeit, greasy) foods!

    LOVE statue by Robert Indiana
    Image Credit: Love in Philadelphia by Liana Jackson. CC-BY-SA 2.0 via Flickr

    Peter Ohlin, Editorial:
    I look forward to Eastern to see a lot of familiar faces: authors and friends in philosophy, as well as colleagues at other publishers. It's also a great time to take stock of what we've published over the last year and get feedback from readers about those books at the book display. Lastly, it's good to hear about interesting projects that will hopefully turn into OUP books by the time future APAs roll around.

    Emily Sacaharin, Editorial:
    I’m excited to be attending my first APA this year! It will be great to meet so many of our authors in person, especially those I’ve already gotten to know via phone and email.

    We hope to see you at the Oxford University Press booth! We'll be offering the chance to browse and buy our new titles on display at a 20% conference discount, along with free trial access to online products, including Electronic Enlightenment. Electronic Enlightenment is the most wide-ranging online collection of edited correspondence of the early modern period, linking people across Europe, the Americas, and Asia from the early 17th to the mid-19th century. You can access correspondence sent between important figures of the period, such as David Hume and Adam Smith. Pop by and say hello; you can also pick up sample copies of our latest philosophy journals and browse free articles from the British Journal of Aesthetics, Mind, and The Philosophical Quarterly.

    We look forward to seeing you there!

    Featured image credit: Benjamin Franklin Bridge, Philadelphia, by Khush. CC BY-NC-ND 2.0 via Flickr

    The post Preparing for APA Eastern Meeting 2014 appeared first on OUPblog.

    0 Comments on Preparing for APA Eastern Meeting 2014 as of 12/25/2014 4:28:00 AM
    15. Almost paradise: heaven in imaginative literature

    Paradise, a 1982 knock-off of the movie Blue Lagoon, stars Phoebe Cates and Willie Aames as teenagers who find themselves alone in a place of natural beauty and experiencing the ultimate joy together. Ann Wilson of Heart and Mike Reno of Loverboy can see forever in each other's eyes in "Almost Paradise," their Top Ten hit from the Footloose soundtrack ("Almost paradise / We're knocking on Heaven's door"). Ridley Scott's Gladiator (2000) references the Elysian Fields, a paradise beyond this one where the blessed go when they die. And the Sports Illustrated swimsuit issue has more than once run a story, or titled an entire issue, "Paradise Found." Literature and popular culture are awash with references to or appropriations of Heaven.

    The Baylor Survey of Religion determined in 2011 that the vast majority of Americans (two thirds of us, and over ninety per cent of Americans who identify as “very religious”) believe that Heaven exists. Something about the idea of a heavenly realm — call it Zion, call it Paradise, call it Elysium, call it Shangri-La, call it Nirvana — meets a deep-seated need of human beings to hope for something more after this life. Whether because it fits our sense of justice that the good should be rewarded, or because it appeals to our ingrained hope that this sometimes difficult existence isn’t all that we will ever experience, the idea of Heaven has helped to dry the tears of the suffering and offered the possibility of some greater meaning in many earthly lives.

    But Heaven is as much a concept as an actual place, even for those who believe in the actual place. The human imagination has served a vital role in helping us to imagine what Heaven might be. Dante and Milton, for example, crucially shaped our conceptions of a paradisiacal realm beyond human speech and reckoning. In Canto XXX of the Paradiso, Dante offers us a vision of light and joy, describing the saints in Heaven arranged as a rose with the Virgin Mary at its center even as he speaks at length about his inability to speak of what he has seen.

    Heaven, photo by Martin Andersson. CC BY 2.0 via Flickr

    John Milton shows us God enthroned, and in glorious language supplies the dignity and beauty most human descriptions of Heaven would necessarily leave lacking:

    Now had the Almighty Father from above,
    From the pure Empyrean where he sits
    High Thron'd above all highth, bent down his eye,
    His own works and their works at once to view:
    About him all the Sanctities of Heaven
    Stood thick as Stars, and from his sight receiv'd
    Beatitude past utterance; on his right
    The radiant image of his Glory sat,
    His onely Son; (Paradise Lost, Book III, 56-64)

    We require this sort of imaginative view of Heaven partly because the Bible (whether in the Hebrew or Christian testaments) contains very little teaching about Heaven as a place for the faithful departed. N. T. Wright notes in the book Surprised by Hope that most Christians assume that when the Bible speaks of something called heaven it is talking about the place where Christians go after death. Because they start with that belief, they misread Jesus’ teachings about the Kingdom of God or, in the Gospel of Matthew, the Kingdom of Heaven. Assuming that Jesus “is indeed talking about how to go to heaven when you die” may make us feel secure about the afterlife, but, says Wright, it “is certainly not what Jesus or Matthew had in mind.” (18) So, barring those mentions of Heaven in Jesus’ cryptic kingdom teachings, we are left with some references to a heavenly realm in apocalyptic writings like Daniel and Revelation, and some few sayings of Jesus. (The Paradise of Islam is mentioned considerably more often in the Qur’an and in the hadiths and other teachings).

    “How we live now may be shaped by what we believe is happening to us in a next life”

    Many Christians formed their understanding of Heaven from one of Jesus' teachings in the Gospel of John: "In my Father's house there are many dwelling places. If it were not so, would I have told you that I go to prepare a place for you? And if I go and prepare a place for you, I will come again and will take you to myself, so that where I am, there you may be also." (John 14:2-3, NRSV) This teaching has entered into our thinking from the King James Version, where "dwelling places" is translated as "mansions," prompting many to think of Heaven as a place where believers will have their own mansions (although the Greek monai has no such denotative or connotative meaning; it simply means "a place where one may remain or live").

    But don’t tell those believers who have taken those expected mansions, shaken them with the Book of Revelation’s streets of gold, and served themselves a heavenly gated community where every occupant has a holy-water Jacuzzi with diamond handles. For many who have suffered in this life, it seems only just and right that they spend eternity in luxury. What is Paradise if it isn’t better than the world we know?

    And if, like them, your image of Heaven is of a place where you will walk streets of gold and pluck a harp while holding forth with the saints, then you are certainly not in the minority. Jon Meacham notes in a recent TIME magazine cover story that this version of Heaven appears across Christian history, and is tied up in “culture, politics, economics, class, and psychology.” How we live now may be shaped by what we believe is happening to us in a next life, and can affect everything from how we vote to how we give. But more importantly, our stories about Heaven offer us consolation; they assure us that a just God will surely reward the faithful and punish the faithless, no matter what happens to us in this life. For that reason, those stories are vital to our peace of mind.

    The post Almost paradise: heaven in imaginative literature appeared first on OUPblog.

    0 Comments on Almost paradise: heaven in imaginative literature as of 12/25/2014 3:28:00 AM
    16. Giving the gift of well-being

    In the film A Christmas Story, Ralphie desperately wants “an official Red Ryder, carbine action, 200 shot range model air rifle.” His mom resists because she reckons it will damage his well-being. (“You’ll shoot your eye out!”) In the end, though, Ralphie gets the air rifle and deems it “the greatest Christmas gift I ever received, or would ever receive.”

    This Christmas, why not give your friends and family the gift of well-being? Even removing an air rifle and the possibility of eye injury from the mix, that’s easier said than done.

    Well-being is tough to pin down. It takes many forms. A college student, a middle-aged parent, and a spritely octogenarian might all lead very different lives and still have well-being. What’s more, you can’t wrap up well-being and tuck it under the tree. All you can do is give gifts that promote it. But what kind of gift promotes well-being?

    One that establishes or strengthens the positive grooves that make up a good life. You have well-being when you’re stuck in a “positive groove” of:

    • emotions (e.g., pleasure, contentment),
    • attitudes (e.g., optimism, openness to new experiences),
    • traits (e.g., extraversion, perseverance), and
    • success (e.g., strong relationships, professional accomplishment, fulfilling projects, good health).

    Your life is going well for you when you're entangled in a success-breeds-success cycle composed of states you find (mostly) valuable and pleasant.

    Well-being as a positive groove

    Some gifts do this by producing what psychologists call flow. They immerse you in an activity you find rewarding. Flow gifts are easy to spot. They’re the ones, like Ralphie’s air rifle, that occupy you all day.

    A flow gift promotes well-being by snaring you into a pleasure-mastery-success loop. A flow gift turns you inward, toward a specific activity and away from the rest of the world. It involves an activity that’s fun, that you get better at with practice, and that rewards you with success, even if that “success” is winning a video game car race.

    The Flow Gift

    Flow is important to a good life. It feels good, and it fosters excellence. It’s the difference between the piano-playing wiz and the kid (like me) who fizzled out. But there’s more to well-being than flow and excellence.

    A bonding gift turns you outward, toward other people. A bonding gift shows how someone thinks and feels about you. In O. Henry’s short story The Gift of the Magi, a young couple, Jim and Della, sacrifice their “greatest treasures” to buy each other Christmas gifts. Della sells her luxurious long hair to buy a chain for Jim’s gold watch. And Jim sells his gold watch to buy the beautiful set of combs Della yearned for.

    Bonding gifts change people’s relationships. The chain and the combs strengthen and deepen Jim and Della’s love, affection and commitment. This is why “of all who give gifts these two were the wisest.”

    The bonds of love and friendship are not just emotional. They’re causal. We’re tangled up with the people we care about in self-sustaining cycles of positive feelings, attitudes, traits and accomplishments. Good relationships are shared, interpersonal positive grooves. This is why they make us better and happier people. Bonding gifts strengthen the positive groove you share with a person you care about.

    A good relationship as an interpersonal positive groove

    You’re probably wondering whether you can find something that’s an effective bonding and flow gift. I must admit, I’ve never managed it. A tandem bike? Alas, no. Perhaps you can do better.

    So this holiday season, why not give “groovy” gifts – gifts that “keep on giving” by ensnaring your loved ones in cascading cycles of pleasure and value.

    Image credit: Stockphotography wrapping paper via Hubspot.

    The post Giving the gift of well-being appeared first on OUPblog.

    0 Comments on Giving the gift of well-being as of 12/17/2014 8:22:00 AM
    17. Excusing torture

    We have plenty of excuses for torture. Most of them are bad. Evaluating these bad excuses, as ethical philosophers are able to do, should disarm them. We can hope that clear thinking about excuses will prevent future generations–for the sake of their moral health–from falling into the trap.

    Ignorance. Senator John McCain knows torture at first hand and condemns it unequivocally. Most of the rest of us don’t have his sort of experience. Does that give us an excuse to condone it or cover it up? Not at all. We can easily read accounts of his torture, along with his heroic response to it. Literature about prison camps is full of tales of torture. With a little imagination, we can feel how torture would affect us. Reading and imagination are crucial to moral education.

    Anger and fear. In the grip of fear and anger, people do things they would never do in a calm frame of mind. This is especially true in combat. After heart-rending losses, soldiers are more likely to abuse prisoners or hack up the bodies of enemies they have killed. That’s understandable in the heat of battle. But in the cold-blooded context of the so-called war on terror this excuse has no traction. Of course we are angry at terrorists and we fear what they may do to us, but these feelings are dispositions. They are not the sort of passions that disarm the moral sense. So they do not excuse the torture of detainees after 9/11.

    Even in the heat of battle, well-led troops hold back from atrocities. A fellow Vietnam veteran once told me that he had in his power a Viet Cong prisoner, who, he believed, had killed his best friend. He was raging to kill the man, and he could have done it. “What held you back?” I asked. “I knew if I shot him, and word got out, my commander would have me court-martialed.” He was grateful for his commander’s leadership. That saved him from a burden on his conscience.

    Saving lives. Defenders of torture say that it has saved American lives. The evidence does not support this, as the Feinstein Committee has shown, but the myth persists. In military intelligence school in 1969 I was taught that torture is rarely effective, because prisoners tell you what they think you want to hear. Or they tell you what they want you to hear. In the case of the Battle of Algiers, one terrorist group gave the French information that led them to wipe out competing groups.

    Enhanced Interrogation by Drewdlecam. CC BY-NC-SA 2.0 via Flickr

    Suppose, however, that the facts were otherwise, that torture does save lives. That is no excuse. Suppose I go into hospital for an appendectomy and the next day my loved ones come to collect me. What they find is a cadaver with vital organs removed. “Don’t fret,” they are told. “We took his life painlessly under anesthetic and saved five other lives with his organs. A good bargain don’t you think?” No. We all know it is wrong to kill one person merely to save others. What would make it right in the case of torture?

    The detainees are guilty of terrible crimes. Perhaps. But we do not know this. They have not had a chance for a hearing. And even if they were found guilty, torture is not permitted under ethics or law.

    The ad hominem. The worst excuse possible, but often heard: Criticism of torture is politically motivated. Perhaps so, but that is irrelevant. Attacking the critics is no way to defend torture.

    Bad leadership: the "pickle-barrel" excuse. Zimbardo has argued that we should excuse the guards at Abu Ghraib because they had been plunged into a situation that we know turns good people bad. His prison experiment at Stanford proved the point. He compares the guards to cucumbers soaked in a pickle barrel. If the cucumbers turn into pickles, don't blame them. This is the best of the excuses so far; the bipartisan Schlesinger Commission cited a failure of leadership at Abu Ghraib. Still, this is a weak excuse; not all the guards turned sour. They had choices. But good leadership and supervision would have prevented the problem, as it would at the infamous Salt Pit of which we have just learned.

    We need to disarm these bad excuses, and the best way to do that is through leadership and education. Torture is a sign of hubris–of the arrogant feeling that we have the power and knowledge to carry out torture properly. We don’t. The ancient Greeks knew that the antidote to hubris is reverence, a quality singularly missing in modern American life.

    Headline image credit: ‘Witness Against Torture: Captive Hands’ by Justin Norman. CC BY-NC-ND 2.0 via Flickr 

    The post Excusing torture appeared first on OUPblog.

    0 Comments on Excusing torture as of 12/17/2014 8:22:00 AM
    18. Parenting during the holidays

    The holiday season can be an insanely stressful time. Looking for presents, wrapping them, cooking, getting the house ready for visitors, cleaning before and after. Nothing like a normal Saturday night on the couch in front of the TV or with a couple of close friends. The holidays demand perfection. You see it all around you: friends are talking about how stressed out they are, how much they still have to do in just a couple of days. Hyper-decorated stores are talking in their own way. As you approach the 25th of December you still haven't bought half the gifts on your list for family members, the house looks like a bomb crater, and you occasionally wish yourself back in the office with piles of work on your desk waiting to be completed. There are even times when you would exchange this nerve-racking time that's supposed to be happy, fun, and merry for a chilly Monday morning and an 8 o'clock meeting.

    What many rattled folks forget in the midst of buying last-minute gifts for loved ones or checking on the unhappy-looking beast in the oven minutes before guests arrive, wishing themselves far away, is that as many as half of the population face a holiday season without their dearest family members. There are people who have lost their loved ones in gruesome ways. I can't even begin to imagine how they must feel as they approach each new holiday season. There are people who have lost their parents to old age, people who have gone through heartbreaking divorces, separations, and breakups, and people who are overseas defending their country because they have no other choice. The holidays will not be what they once were for any of them. And then there are the single parents, many of whom have decent custody agreements that are "in the best interest of the children." According to the US Census Bureau, there are more than 10 million single parents in the United States today. Each year millions among them can look forward to days of loneliness because the little ones they really want to spend time with are with the other parent.

    When sane parents separate, many judges, thankfully, divide custody equally. Each parent gets his or her fair share of custody, if at all possible. Even when it’s not possible to share the time with the children equally, judges will usually attempt to divide up the holidays evenly. The kids spend every other holiday with mom and every other holiday with dad. It certainly is in the children’s best interest to get to spend some time with each parent. Most kids, with decent moms and dads, would prefer to spend every holiday with both parents. The precious little ones secretly hope for the impossible: That their divorced or separated parents will get back together. But despite their wishes, they adjust to the situation. They have no other choice.

    The array of Christmas presents, by SheepGuardingLlama. CC-BY-NC-ND-2.0 via Flickr.

    Nor do the parents. As we face the holidays many single parents face a very lonely time. They may be with dear family members: parents, brothers, sisters, nieces, nephews, aunts and uncles. Yet they may nonetheless feel a profound pain in their hearts, even as they watch close relatives savor the pecan pie or scream in delight when they rip open their Christmas presents. Their own children are far away. In most cases the youngsters are in a safe place elsewhere, stuffing their faces with goodies or breaking out laughing when the other grandpa makes a funny face. In most cases single parents know that their children are enjoying themselves in the company of the other caregiver and his or her extended family.

    Yet the children are missing from the scenery. Their absence is felt. “It hurts. It hurts every other Christmas when my kids are with their dad during the holidays,” says Wendy Thomas, a St. Louis, Missouri single mother of two girls ages 8 and 5. Thomas shares custody with the girls’ father, who lives in Illinois. “The first year was the hardest but I don’t think I will ever get used to it. Shopping malls and Silent Night make me shiver,” says the 38-year-old entrepreneur. This is her third Christmas and New Year’s without her children.

    Each holiday a single parent truly misses his or her children on that one day that is supposed to bring delight to everyone. “It’s going to be a lonely, lonely Christmas without you” may just be tedious background music for the families that didn’t break apart. Each year, however, the oldie is causing a tiny tear to run quietly down the cheek of some single caregiver.

    But could some of the reported agony over absent children during the holidays be the result of what psychologists call cognitive dissonance, a psychological mechanism we use to justify our choices and reconcile conflicting beliefs? For example, you choose to volunteer three hours a week at the local children's hospital. It's killing you. You can barely fit in everything else you have to do. But you tell everyone, including yourself, that volunteer work is truly rewarding and every (wo)man's duty. Making irrational decisions seem rational is a way to preserve your sense of self-worth.

    Studies show that the hardship involved in raising children makes us idealize parenthood and consider it an enormously rewarding enterprise. In a study published in the January 2011 issue of the journal Psychological Science, researchers primed 80 parents with at least one child in two different ways. One group was asked to read a document reporting the costs of raising a child. The other parents read the same document as well as a script reporting on the benefits, in old age, of having raised children. The participants were then given a psychological test assessing their beliefs about parenting. The team found what they expected. Parents who had read only about the financial costs of parenthood initially felt more discomfort than the other group. But they went on to idealize parenthood much more than the other participants, and when they were interviewed later their negative feelings were gone.

    “How do single parents get through Christmas as painlessly as possible?”

    Could cognitive dissonance explain why single parents feel empty-handed and depressed during holiday seasons without their children? St. Charles, MO, family counselor Deborah Miller doesn't think that's what's going on. "This year it's my turn to be one of those parents. I'll be the first to admit that raising a child is not always a blessing. There are countless times when I feel more like a chauffeur or a waitress or a slave than a free agent with some real me-time." She thinks the lonely-parent phenomenon is not a manifestation of cognitive dissonance, because we don't idealize away the pain of being without our children on Christmas or New Year's. The heartache often doesn't go away until we see our kids again in January and abruptly remember just how draining it is to raise a child. "I'll finally get some time to myself, and I know my son will have a blast. But I'll miss him immensely," says Miller.

    How do single parents get through Christmas as painlessly as possible? The solution is not necessarily to have a huge family gathering with your side of the family to ease the sorrow. A gala dinner on Christmas Day may have its advantages. You can hug your little nieces and nephews and maybe feel a bit of comfort as they open their presents in a way only children can approach surprises. You may feel a teensy bit of wonder (or is it jealousy) as you view your siblings and their spouses exchange loving smiles and their young ones take delight in the simplest of things. “It may work for some but there is a sense in which you will only be a spectator,” says Miller. She recalls her Christmas two years ago. “I felt gratified to be part of a functional family, and it was good to see my siblings interact with their children. I also remember being thankful that my parents were still alive and healthy and that they got one more holiday season with some of their grandchildren. But I also felt great sadness, because the dearest thing in my life wasn’t with me. I really missed my son that day.” This Christmas, Miller is getting together with a few friends. “Sure, we will still have Christmas dinner but there won’t be any children or presents or sacred family traditions. So hopefully I won’t be reminded of what I’m missing out on.”

    Featured image credit: Christmas Decorations, by Ian Wilson. CC-BY-2.0 via Flickr.

    The post Parenting during the holidays appeared first on OUPblog.

    19. Vampires and life decisions

    Imagine that you have a one-time-only chance to become a vampire. With one swift, painless bite, you’ll be permanently transformed into an elegant and fabulous creature of the night. As a member of the Undead, your life will be completely different. You’ll experience a range of intense new sense experiences, you’ll gain immortal strength, speed and power, and you’ll look fantastic in everything you wear. You’ll also need to drink the blood of humanely farmed animals (but not human blood), avoid sunlight, and sleep in a coffin.

    Now, suppose that all of your friends, people whose interests, views and lives were similar to yours, have already decided to become vampires. And all of them tell you that they love it. They encourage you to become a vampire too, saying things like: “I’d never go back, even if I could. Life has meaning and a sense of purpose now that it never had when I was human. It’s amazing! But I can’t really explain it to you, a mere human. You’ll have to become a vampire to know what it’s like.”

    In this situation, how could you possibly make an informed choice about what to do? For, after all, you cannot know what it is like to become a vampire until you become one. The experience of becoming a vampire is transformative. What I mean by this is that it is an experience that is both radically epistemically new, such that you have to have it in order to know what it will be like for you, and one that will change your core personal preferences.

    “You’ll have to become a vampire to know what it’s like”

    So you can’t rationally choose to become a vampire, but nor can you rationally choose to not become one, if you want to choose based on what you think it would be like to live your life as a vampire. This is because you can’t possibly know what it would be like before you try it. And you can’t possibly know what you’d be missing if you didn’t.

    We don’t normally have to consider the choice to become Undead, but the structure of this example generalizes, and this makes trouble for a widely assumed story about how we should make momentous, life-changing choices for ourselves. The story is based on the assumption that, in modern western society, the ideal rational agent is supposed to take charge of her own destiny, mapping out the subjective future she hopes to realize by rationally evaluating her options from her authentic, personal point of view. In other words, when we approach major life decisions, we are supposed to introspect on our past experiences and our current desires about what we want our futures to be like in order to guide us in determining our future selves. But if a big life choice is transformative, you can’t know what your future will be like, at least, not in the deeply relevant way that you want to know about it, until you’ve actually undergone the life experience.

    Transformative experience cases are special kinds of cases where important ordinary approaches that people try to use to make better decisions, such as making better generalizations based on past experiences, or educating themselves to better evaluate and recognize their true desires or preferences, simply don’t apply. So transformative experience cases are not just cases involving our uncertainty about certain sorts of future experiences. They are special kinds of cases that focus on a distinctive kind of ‘unknowability’—certain important and distinctive values of the lived experiences in our possible futures are fundamentally first-personally unknowable. The problems with knowing what it will be like to undergo life experiences that will transform you can challenge the very coherence of the ordinary way to approach major decisions.

    ‘Vampire Children,’ by Shawn Allen. CC-BY-2.0 via Flickr

    Moreover, the problem with these kinds of choices isn’t just with the unknowability of your future. Transformative experience cases also raise a distinctive kind of decision-theoretic problem for these decisions made for our future selves. Recall the vampire case I started with. The problem here is that, before you change, you are supposed to perform a simulation of how you’d respond to the experience in order to decide whether to change. But the trouble is, who you are changes as you become a vampire.

    Think about it: before you become a vampire, you should assess the decision as a human. But you can’t imaginatively put yourself in the shoes of the vampire you will become and imaginatively assess what that future lived experience will be. And, after you have become a vampire, you’ve changed, such that your assessment of your decision now is different from the assessment you made as a human. So the question is, which assessment is the better one? Which view should determine who you become? The view you have when you are human? Or the one you have when you are a vampire?

    The questions I’ve been raising here focus on the fictional case of the choice to become a vampire. But many real-life experiences and the decisions they involve have the very same structure, such as the choice to have one’s first child. In fact, in many ways, the choice to become a parent is just like the choice to become a vampire! (You won’t have to drink any blood, but you will undergo a major transition, and life will never be the same again.)

    In many ways, large and small, as we live our lives, we find ourselves confronted with a brute fact about how little we can know about our futures, just when it is most important to us that we do know. If that’s right, then for many big life choices, we only learn what we need to know after we’ve done it, and we change ourselves in the process of doing it. In the end, it may be that the most rational response to this situation is to change the way we frame these big decisions: instead of choosing based on what we think our futures will be like, we should choose based on whether we want to discover who we’ll become.

    The post Vampires and life decisions appeared first on OUPblog.

    20. Top 10 Turkey-Dumping Day breakup songs

    On the last Thursday of November, freshmen return home to reunite with their high school sweethearts. Except not all are as sweet as they once were. Your old flame may show up with a new admirer or give you trouble because you didn’t spend enough time on Skype on Saturday nights while away at college. Be prepared: pack an arsenal of tunes that capture the sad and sometimes mixed feelings you may have after Turkey-Dumping Day. For your convenience, a list of 10 great breakup songs for a post-Turkey recovery:

    10. Pink’s “Blow me (One Last Kiss)”
    One of the more lighthearted tracks to make the list, Pink’s lead single from her sixth studio album The Truth About Love (2012) nonetheless gets the message across: After too much fighting, tears, and sweaty palms, the time comes when turkey is not the only thing you have finally had enough of.

    9. Passenger’s “Let Her Go”
    Passenger’s second single from the album All the Little Lights (2012) made the list not only because of the soul-wrenching, melodic tune but also because of its spot-on content. Looking into the heart of a dumper, the lyrics forcefully delineate the paradox of love: you don’t really know whether or how much you love someone, until he or she is gone.

    8. Christina Perri’s “Human”
    The lead single from Perri’s second studio album Head or Heart (2014), this pop power ballad features almost no drumsticks (pun intended). Instead it showcases the American singer’s ethereal voice. And the lyrics hit the nail on the head: Being happier and hotter without your ex may be the best way to get even. But don’t worry if you fail spectacularly, ’cause you’re only a little human.

    7. Hilary Duff’s “Stranger”
    Tapping into the style and sound of Middle Eastern belly-dance music, Hilary Duff’s single, recorded for her fourth studio album Dignity (2007), is a bouncy yet husky song about suddenly seeing an unkind stranger in the torso of your beloved. After listening to this tune, put on the dumper’s apron before slicing the turkey.

    6. Jaymes Young’s “Parachute”
    Despite its blunt language, Seattle-born singer Jaymes Young’s fragile ballad made the list because of its lyrics about being lied to and instantly knowing that it’s time to take the “l” out of “lover.”

    5. Taylor Swift’s “I knew you were trouble”
    Taylor Swift’s bass-heavy dubstep drop, recorded for her fourth studio album Red (2012), aptly warns us about the trouble-makers–those types that make you fall in love only to leave you behind.

    4. Sam Smith’s “Stay with me”
    Although it’s not quite a turkey-dumping song but rather a desperate-for-love ballad, this gospel-inspired hit from British songwriter Sam Smith’s debut studio album In the Lonely Hour (2014) still made the list. Critics deemed it overly sentimental, but “brutally honest” is arguably a better description.

    3. David Guetta’s “Titanium”
    French DJ and music producer David Guetta is hard to pass over when it comes to ferocious breakup songs. This 2012 hit from his album Nothing But the Beat gives you relationship hardship and a shot of resilience to help take the pain out of Turkey-Dumping Day.

    2. Fefe Dobson’s “Stuttering”
    “Dobson can sing,” say the critics. Yes, indeed. The tune and the debated music video leave you stuttering and wondering: Can the green-eyed monster make you that crazy? Yes, it can, not least when the cheater isn’t your man.

    1. David Guetta’s “She Wolf”
    Katy Perry’s “Part of Me” gets an honorary mention for its heartening lyrics but it’s David Guetta who takes first place with another ballad, featuring vocals from Australian recording artist Sia. Reflecting on the most poignant of breakups, this impassioned chorus on the feeling of being replaced takes us inside the mind of someone who is “falling to pieces.”

    Headline image credit: Broken Heart Grunge by Nicolas Raymond. CC BY 2.0 via Flickr 

    The post Top 10 Turkey-Dumping Day breakup songs appeared first on OUPblog.

    21. The impossible painting

    Supposedly, early 20th century packaging for Quaker Oats depicted the eponymous Quaker holding a package of the oats, where the art on this package depicted the Quaker holding a package of the oats, which itself depicted the Quaker holding a package of the oats, ad infinitum. I have not been able to locate a photograph of the packaging, but more than one philosopher and mathematician has attributed an early interest in the nature of the infinite to childhood contemplation of this image. Here, however, I want to examine a different phenomenon: whether artwork that depicts itself in this way can lead to paradoxes.

    Let’s begin with two well-known puzzles. The older of the two– the Liar paradox – was known to ancient Greek philosophers, and challenges the following platitudes about truth:

    (T1)           A sentence is true if and only if what it says is the case.

    (T2)           Every sentence is exactly one of true and false.

    Consider the Liar sentence:

    This sentence is false.

    Is the Liar sentence true or false? If it is true, then what it says must be the case. It says it is false, so this means it is false. If it’s false, then, since it says it is false, what it says is indeed the case. But this would make it true. So the Liar sentence is true if and only if it is false, violating the platitudes.
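    The dead end can even be checked mechanically. The following tiny sketch (an illustration added here for concreteness, not part of the original argument) encodes (T1) as the requirement that a truth value v for the Liar sentence must equal what the sentence says is the case, namely not v, and (T2) as the restriction to the two candidates True and False:

    ```python
    # The Liar sentence says of itself that it is false. By (T1), a truth
    # value v is consistent only if v equals what the sentence says is the
    # case, i.e. v == (not v). By (T2), only True and False are candidates.
    consistent_values = [v for v in (True, False) if v == (not v)]

    print(consistent_values)  # -> [] : neither candidate is consistent
    ```

    The empty result is just the platitude-violation restated: no assignment of "true" or "false" to the Liar sentence satisfies both (T1) and (T2).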

    The second puzzle is the Russell paradox, discovered by Bertrand Russell at the beginning of the 20th Century. This paradox involves collections, or sets, of objects, and two central theses:

    (S1)      Given any property P, there is a set of objects containing all and only the objects that have P.

    (S2)      Sets are themselves objects, and can be contained in sets.

    Given (S2), we can divide sets into two types: those that contain themselves (such as the set containing all sets whatsoever) and those that do not contain themselves (such as the set of all kittens). Thus, “is a set that does not contain itself” picks out a perfectly good property, and so by (S1) there should be a set – let’s call it R – containing exactly those things that have this property. So:

    A set is a member of R if and only if it is not a member of itself.

    Now, is R a member of itself? Either it is or it isn’t. If R is a member of itself then R isn’t a member of itself. And if R isn’t a member of itself then R is a member of itself. Either way, R both is and isn’t a member of itself. Again, a contradiction.
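    The same derivation can be played out in a short Python sketch (again an illustration added here, not part of the original puzzle): model a ‘set’ as its membership predicate, so that R becomes a function asking whether a given ‘set’ fails to contain itself. Asking whether R contains itself then never settles on an answer:

    ```python
    def contains(s, x):
        # Model a 'set' as its membership predicate: s 'contains' x when s(x) is True.
        return s(x)

    def russell(s):
        # R contains exactly those sets that do not contain themselves.
        return not contains(s, s)

    # Is R a member of itself? Evaluating the question merely restates it:
    # russell(russell) computes `not russell(russell)`, so the evaluation
    # never bottoms out, and Python reports unbounded recursion.
    try:
        russell(russell)
        verdict = "settled"
    except RecursionError:
        verdict = "no consistent answer"

    print(verdict)  # -> no consistent answer
    ```

    The runaway recursion is the computational shadow of the contradiction: R’s membership in itself is defined only in terms of itself, so no stable answer exists.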

    ‘Drawing Hands’ (1948) by M.C. Escher. Public domain via WikiArt.org

    There is another puzzle that seems intimately connected to these two paradoxes, however, that has not (as far as I know) been noticed or studied – the paradox of the impossible painting. This paradox stems from two principles governing the notion of depiction (or representation) rather than truth or set-theoretic membership.

    First, it seems, at least at first glance, that we can paint anything that we can describe – if I tell you to paint a forest with exactly 28 trees, then you can produce a painting fitting that description.  Thus:

    (D1)     Given any description D, we can create a painting that depicts things exactly as described in D.

    Second, there is nothing to prevent a painting from being depicted within another painting – for example, Velazquez’s Las Meninas depicts the painter working on another painting. Thus:

    (D2)     Paintings can be depicted in paintings.

    If some paintings can depict other paintings, then it seems like we can divide paintings into two types: those that depict themselves (such as the artwork on old Quaker Oats packaging) and those that do not. Thus, “a scene depicting all and only the paintings that do not depict themselves” is a perfectly good description, and so by (D1) it should be possible to produce a painting – let’s call it I – that depicts things as described. So:

    A painting is depicted in I if and only if it does not depict itself.

    Should I depict itself? In other words, if you are creating this painting, should you include a depiction of I itself within the scene? If you include I in the painting, then I is a painting that depicts itself, so it should not be depicted in I after all. But if you don’t include I in the painting, then I is a painting that does not depict itself, so it should have been included. Either way, you can’t create a painting that depicts things exactly as described.

    The paradox of the impossible painting is distinct from both the Liar paradox and the Russell paradox, since it involves depiction rather than truth or set-membership. But it has features in common with each. Most obviously, circularity plays a central role in all three paradoxes: the Liar paradox involves sentences that say something about themselves, the Russell paradox involves sets that are members of themselves, and the paradox of the impossible painting involves paintings that depict themselves.

    “Who knew oats could be so deep?”

    Nevertheless, the paradox of the impossible painting has features not shared by the Liar paradox, and other features not shared by the Russell paradox. First, the Liar paradox involves a sentence that clearly exists (and is grammatical, etc.) that must be accounted for, while the Russell paradox can be seen in different terms, as a sort of proof that the Russell set R just doesn’t exist, and that we need to revise (S1) accordingly. The proper response regarding the paradox of the impossible painting is more like the latter – we are not tempted to think that the paradoxical painting does or could exist, but instead conclude that there is something wrong with (D1).

    There is another sense, however, in which the paradox of the impossible painting is more like the Liar paradox than the Russell paradox. The Liar paradox arguably arises because of circularity of reference: the Liar sentence refers to, or ‘picks out’, itself. And the paradox of the impossible painting arises because of circularity of depiction – that is, paintings that depict, or ‘pick out’, themselves. Reference and depiction are different, but the fact that they are both ways of ‘picking out’, while set-theoretic membership is not, suggests that, in this respect at least, the paradox of the impossible painting has more in common with the Liar paradox than with the Russell paradox.

    Thus, the paradox of the impossible painting ‘lies between’, or is a sort of hybrid of, the Liar paradox and the Russell paradox, with some features in common with the former and others in common with the latter. As a result, studying this puzzle further seems likely to reward us with deeper insights into these two much older and more well-known conundra. Who knew oats could be so deep?

    The post The impossible painting appeared first on OUPblog.

    22. Edward O. Wilson: What I’m Giving

    At Powell's, we feel the holidays are the perfect time to share our love of books with those close to us. For this special blog series, we reached out to authors featured in our Holiday Gift Guide to learn about their own experiences with book giving during this bountiful time of year. Today's featured giver [...]

    23. The development of peace

    The story of peace is as old as the story of humanity itself, and certainly as old as war. It is a story of progress, often in very difficult circumstances. Historically, peace has often been taken to imply an absence of overt violence or war between or sometimes within states–in other words, a negative peace. War is often thought to be the natural state of humanity, peace of any sort being fragile and fleeting. I would challenge this view. Peace in its various forms has been by far humanity’s more common experience—as the archaeological, ethnographic, and historic records indicate. Much of history has been relatively peaceful and orderly, while frameworks for security, law, redistribution of resources, and justice have constantly been advancing. Peace has been at the centre of the human experience, and a sophisticated version of peace has become widely accepted in modernity, representing a more positive form of peace.

    Peace has been organized domestically within the state, internationally through global organizations and institutions, or transnationally through actors whose ambit covers all of these levels. Peace can be public or private. Peace has often been a hidden phenomenon, subservient to power and interests.

    The longer term aspiration for a self-sustaining, positive peace via a process aimed at a comprehensive outcome has rarely been attained, however, even with the combined assistance—in recent times—of international donors, the United Nations, World Bank, military forces, or international NGOs.

    Peace is also a rather ambiguous concept. Authoritarian governments and powerful states have, throughout history, had a tendency to impose their version of peace on their own citizens as well as those of other states, as with the Soviet Union’s suppression of dissent amongst its own population and those of its satellite states, such as East Germany or Czechoslovakia. Peace and war may be closely connected, such as when military force is deployed to make peace, as with North Atlantic Treaty Organization (NATO) airstrikes in Bosnia-Herzegovina in 1995 and in Yugoslavia in 1999.

    CND badge. Photo by Gerald Holtom. CC0 via Wikimedia Commons

    Both George Orwell (1903–50), in his novel 1984, and the French social theorist Michel Foucault (1926–84) noted the dangers of the relationship between war and peace in their well-known aphorisms: ‘peace is war’, ‘war is peace’. Nevertheless, peace is closely associated with a variety of political, social, economic, and cultural struggles against the horrors of war and oppression. Peace activism has normally been based on campaigns for individual and group rights and needs, for material and legal equality between groups, genders, races, and religions, disarmament, and to build international institutions. This has required the construction of local and international associations, networks, and institutions, which coalesced around widely accepted agendas. Peace activism supported internationally organized civil society campaigns against slavery in the 18th century, and for basic human dignity and rights ever since. Various peace movements have struggled for independence and self-determination, or for voting rights and disarmament (most famously perhaps, the Campaign for Nuclear Disarmament).

    Ordinary people can, and often have, mobilized for peace in societal terms using peaceful methods of resistance. A wealth of historical and contemporary evidence supports a popular desire for a broad, positive form of peace. Recent research indicates that its development will tend to be hybrid. A hybrid peace framework must ultimately represent a wide range of social practices and identities, as well as accommodate the coexistence of different forms of state and a widely pluralist international community.

    Featured image credit: Dove. CC0 via Pixabay.

    The post The development of peace appeared first on OUPblog.

    24. Christmas for a nonbeliever

    As a small boy in the 1920s, my father sang in the choir of the parish church, St Matthews, in Walsall in the British Midlands. Twenty years later, he was married with a couple of children and our small, tight family belonged to the Religious Society of Friends, the Quakers. Friends do not have church services. There is no hymn singing. But every Christmas Eve, religiously as one might say, at three o’clock in the afternoon, the family gathered around the radio to listen to the broadcast of carols and lessons from King’s College, Cambridge.

    That was long ago and for me, since I now live in Florida, far away. I have long since lost my faith in the Christian religion. Even if this were not so, I doubt that I would much enjoy Christmas overall. When the kids were little, it was a lot of fun. But now, it strikes me as appallingly commercialized and an occasion when you spend way too much on presents no one really wants, eat and drink to excess, and end by quarreling with people that you have not seen for a year, by which time you both realize why you have not seen each other for a year.

    But every Christmas Eve I track down the broadcast of the King’s service and listen to it, even though because of time-zone differences it is now for me in the morning. Music spurs emotions as does no other art form, and I find listening an almost-melancholic experience as memories of my childhood come flooding in and I recall with huge gratitude the loving family into which I was born. I remember also my dedicated teachers recreating civilized life after the horrendous conflicts of the first part of the century. How can one speak except with respect of a man who spent the first half of the decade driving a tank over North Africa and Western Europe, and the second half explaining to nine-year-olds why Pilgrim’s Progress is such a tremendous story and something of vital relevance to us today?

    Christmas lights, by shannonpareil, CC-by-2.0 via Flickr

    So Christmas remains very important for me, as does the other great highlight of the Christian calendar. As a teenager, having failed O level German miserably, I was packed off one Easter vacation to stay with a family in Germany, so I could (as I did) succeed on the second attempt. Music again. On Good Friday, German radio stations played Bach’s Matthew Passion, and listening to that – even though in respects I prefer the dramatic intensity of the St John Passion – has remained a life-long practice.

    Perhaps because it is all so German, I find myself focusing on the dreadful events of the Third Reich, but also – and obviously the theme of Christ’s sacrifice is all-important here – on those who showed super-human qualities in the face of absolute evil and terror. Above all, Sophie Scholl, at twenty-one years old a member of the White Rose group in Munich who started handing out anti-Nazi pamphlets in the middle of the war. Inevitably discovered and condemned to death, as she was led to the guillotine, she said: “How can we expect righteousness to prevail when there is hardly anyone willing to give himself up individually to a righteous cause. Such a fine, sunny day, and I have to go, but what does my death matter, if through us, thousands of people are awakened and stirred to action?”

    I would not for anything relinquish the experience of Easter and the moments when I contemplate the truly good people – I think of those combating Ebola in West Africa – who stand so far above me and who inspire me, even though I am not worthy to clean their shoes. You don’t have to have religious faith to have these all-important emotions. You do have to be a human being.

    “How can we expect righteousness to prevail when there is hardly anyone willing to give himself up individually to a righteous cause.”

    And so finally to the third festival, that of Thanksgiving. Growing up in England, it was something unknown to me until, to go to graduate school, I crossed the Atlantic in 1962. In the early years, in both Canada and America, people invited me into their homes to share the occasion with their family and friends. This is something that has stayed with me for over fifty years, and now at Thanksgiving – by far my favorite festival overall – my wife and I hugely enjoy filling the table with folk who are away from home or for one reason or another would not otherwise have a place to be. No special music this time – although I usually manage to drive everyone crazy by playing opera at full blast – but for me an equally poignant occasion when I reflect on the most important thing I did in my life – to move from the Old World to the New – and on the significance of family and friends and above all of giving. In the Republic, Plato says that only the good man is the happy man. Well, that’s a bit prissy applied to me, but I know what he means. People were kind to me and my wife and I try to be kind to people. That is a wonderful feeling.

    Three festivals – memories and gratitude; sacrifice and honor; giving and friendship. That is why, although I have not a scrap of religious belief and awful though the music in the mall may be, I look forward to Christmas, and then to Easter, and then to Thanksgiving, and to the cycle all over again, many times!

    The post Christmas for a nonbeliever appeared first on OUPblog.

    25. John Boyd and Sun Tzu’s The Art of War

    Renowned US military strategist John Boyd is famous for his signature OODA (Observe-Orient-Decide-Act) loop, which significantly affected the way that the West approached combat operations and has since been appropriated for use in the business world and even in sports. Boyd wrote to convince people that the Western military doctrine and practice of his day was fundamentally flawed. With this goal in mind, he naturally turned to the East to seek an alternative.

    Sun Tzu: The Art of War happened to be the only theoretical book on war that Boyd did not find imperfect; it became his Rosetta stone. Boyd eventually owned seven translations of The Art of War, each with long passages underlined and with copious marginalia. He was at the same time familiar with Taoism (Lao Tzu mainly) and Miyamoto Musashi (a famous Japanese swordsman who practiced Samurai Zen). With this extensive knowledge of Eastern thought, Boyd aimed for an almost full adoption of Sun Tzu’s theory into the Western strategic framework. The theory of Sun Tzu was foreign to his audience’s way of thinking, so in order to convince them of its value he repackaged, rationalized, and modernized Eastern theories using various scientific theories from the West.

    Why couldn’t such an adoption take place using existing translations of The Art of War? Boyd understood that he could get nowhere close to the heart of Chinese strategy without first understanding the cognitive and philosophical foundations behind Chinese strategic thought. These foundations are usually lost in translation, causing an impasse in understanding the Chinese strategy that remains today. Hence Boyd made use of new sciences to illuminate what the West had been unable to illuminate before.

    For instance, Boyd recreated the naturalistic worldview of Chinese strategy in the Western framework. From this perspective, the OODA loop encompasses much more than a four-phase decision-making model: its real significance is that it reconstructs mental operations based on intuitive thinking and judgment. This kind of intuition is pivotal to strategy and strategic thinking, but was lost as the West embraced a more rational scientific mindset. It is an open secret that the speed and success of the OODA loop comes from a deep intuitive understanding of one’s relationship to the rapidly changing environment. This understanding of one’s environment comes directly from Chinese strategic thought.

    Chinese warrior. CC0 via Pixabay.

    Another aspect of Chinese strategic thought that Boyd insisted on capturing and incorporating into the Western strategic framework is yin-yang (yin and yang). Yin-yang has been commonly misunderstood as the Chinese equivalent of “contradictions” in the West. Yin-yang, however, is not considered contradictory or paradoxical by the Chinese, but is actively used to resolve real-life contradictions and paradoxes—the key is to see yin-yang (such as win-lose, enemy-friend, strong-weak) as one concept or continuum, not two opposites. It is this Chinese philosophical and logical concept that forms the strategic chain linking Sun Tzu, Lao Tzu, and Mao Zedong.

    Once this “oneness” of things is realized, a strategist will then be able to tap into the valuable strategic information it carries, including the dynamics of situations and relationships between things, resulting in a more complete grasp of a situation, particularly in complex and multifaceted phenomena like war. In short, yin-yang provides an intuitive means for understanding the essence of reality, opening a new door to strategic insights and forecasts that were once inaccessible by using Western methods.

    Boyd’s thesis is not a general theory of war but, as one of his biographers noted, a general theory of the strategic behavior of complex adaptive systems in adversarial conditions. It is ironic that the scientific terminology used illustrates the systemic thinking behind Chinese strategic thought applied by Sun Tzu 2,500 years ago, as the terminology of complex adaptive systems and non-linearity did not exist then.

    Boyd opened a crucial window of opportunity for Western thought by repackaging and rationalizing Eastern thought. His attempt to adopt Sun Tzu into the Western strategic framework was far from being successful, and many of his proposals have gone unnoticed, but nonetheless Boyd made very significant progress in “synchronizing” Chinese and Western strategy. Once the West grasps the significance behind this unprecedented opportunity to directly absorb and adopt elements of Chinese strategy, it will open many new avenues for the development and self-rectification of Western strategic thought and practices.

    The post John Boyd and Sun Tzu’s The Art of War appeared first on OUPblog.

