Blog Posts Tagged with: philosophy
In the film A Christmas Story, Ralphie desperately wants “an official Red Ryder, carbine action, 200 shot range model air rifle.” His mom resists because she reckons it will damage his well-being. (“You’ll shoot your eye out!”) In the end, though, Ralphie gets the air rifle and deems it “the greatest Christmas gift I ever received, or would ever receive.”
This Christmas, why not give your friends and family the gift of well-being? Even removing an air rifle and the possibility of eye injury from the mix, that’s easier said than done.
Well-being is tough to pin down. It takes many forms. A college student, a middle-aged parent, and a sprightly octogenarian might all lead very different lives and still have well-being. What’s more, you can’t wrap up well-being and tuck it under the tree. All you can do is give gifts that promote it. But what kind of gift promotes well-being?
One that establishes or strengthens the positive grooves that make up a good life. You have well-being when you’re stuck in a “positive groove” of:
emotions (e.g., pleasure, contentment),
attitudes (e.g., optimism, openness to new experiences),
traits (e.g., extraversion, perseverance), and
success (e.g., strong relationships, professional accomplishment, fulfilling projects, good health).
Your life is going well for you when you’re entangled in a success-breeds-success cycle composed of states you find (mostly) valuable and pleasant.
Some gifts do this by producing what psychologists call flow. They immerse you in an activity you find rewarding. Flow gifts are easy to spot. They’re the ones, like Ralphie’s air rifle, that occupy you all day.
A flow gift promotes well-being by snaring you into a pleasure-mastery-success loop. A flow gift turns you inward, toward a specific activity and away from the rest of the world. It involves an activity that’s fun, that you get better at with practice, and that rewards you with success, even if that “success” is winning a video game car race.
Flow is important to a good life. It feels good, and it fosters excellence. It’s the difference between the piano-playing wiz and the kid (like me) who fizzled out. But there’s more to well-being than flow and excellence.
A bonding gift turns you outward, toward other people. A bonding gift shows how someone thinks and feels about you. In O. Henry’s short story The Gift of the Magi, a young couple, Jim and Della, sacrifice their “greatest treasures” to buy each other Christmas gifts. Della sells her luxurious long hair to buy a chain for Jim’s gold watch. And Jim sells his gold watch to buy the beautiful set of combs Della yearned for.
Bonding gifts change people’s relationships. The chain and the combs strengthen and deepen Jim and Della’s love, affection and commitment. This is why “of all who give gifts these two were the wisest.”
The bonds of love and friendship are not just emotional. They’re causal. We’re tangled up with the people we care about in self-sustaining cycles of positive feelings, attitudes, traits and accomplishments. Good relationships are shared, interpersonal positive grooves. This is why they make us better and happier people. Bonding gifts strengthen the positive groove you share with a person you care about.
You’re probably wondering whether you can find something that’s an effective bonding and flow gift. I must admit, I’ve never managed it. A tandem bike? Alas, no. Perhaps you can do better.
So this holiday season, why not give “groovy” gifts – gifts that “keep on giving” by ensnaring your loved ones in cascading cycles of pleasure and value.
Image credit: Stockphotography wrapping paper via Hubspot.
We have plenty of excuses for torture. Most of them are bad. Evaluating these bad excuses, as ethical philosophers are able to do, should disarm them. We can hope that clear thinking about excuses will prevent future generations–for the sake of their moral health–from falling into the trap.
Ignorance. Senator John McCain knows torture at first hand and condemns it unequivocally. Most of the rest of us don’t have his sort of experience. Does that give us an excuse to condone it or cover it up? Not at all. We can easily read accounts of his torture, along with his heroic response to it. Literature about prison camps is full of tales of torture. With a little imagination, we can feel how torture would affect us. Reading and imagination are crucial to moral education.
Anger and fear. In the grip of fear and anger, people do things they would never do in a calm frame of mind. This is especially true in combat. After heart-rending losses, soldiers are more likely to abuse prisoners or hack up the bodies of enemies they have killed. That’s understandable in the heat of battle. But in the cold-blooded context of the so-called war on terror this excuse has no traction. Of course we are angry at terrorists and we fear what they may do to us, but these feelings are dispositions. They are not the sort of passions that disarm the moral sense. So they do not excuse the torture of detainees after 9/11.
Even in the heat of battle, well-led troops hold back from atrocities. A fellow Vietnam veteran once told me that he had in his power a Viet Cong prisoner, who, he believed, had killed his best friend. He was raging to kill the man, and he could have done it. “What held you back?” I asked. “I knew if I shot him, and word got out, my commander would have me court-martialed.” He was grateful for his commander’s leadership. That saved him from a burden on his conscience.
Saving lives. Defenders of torture say that it has saved American lives. The evidence does not support this, as the Feinstein Committee has shown, but the myth persists. In military intelligence school in 1969 I was taught that torture is rarely effective, because prisoners tell you what they think you want to hear. Or they tell you what they want you to hear. In the case of the Battle of Algiers, one terrorist group gave the French information that led the French to wipe out competing groups.
Suppose, however, that the facts were otherwise, that torture does save lives. That is no excuse. Suppose I go into hospital for an appendectomy and the next day my loved ones come to collect me. What they find is a cadaver with vital organs removed. “Don’t fret,” they are told. “We took his life painlessly under anesthetic and saved five other lives with his organs. A good bargain, don’t you think?” No. We all know it is wrong to kill one person merely to save others. What would make it right in the case of torture?
The detainees are guilty of terrible crimes. Perhaps. But we do not know this. They have not had a chance for a hearing. And even if they were found guilty, torture is not permitted under ethics or law.
The ad hominem. The worst excuse possible, but often heard: Criticism of torture is politically motivated. Perhaps so, but that is irrelevant. Attacking the critics is no way to defend torture.
Bad leadership: the “pickle-barrel” excuse. Zimbardo has argued that we should excuse the guards at Abu Ghraib because they had been plunged into a situation that we know turns good people bad. His prison experiment at Stanford proved the point. He compares the guards to cucumbers soaked in a pickle barrel. If the cucumbers turn into pickles, don’t blame them. This is the best of the excuses so far; the bipartisan Schlesinger Commission cited a failure of leadership at Abu Ghraib. Still, this is a weak excuse; not all the guards turned sour. They had choices. But good leadership and supervision would have prevented the problem, as it would at the infamous Salt Pit of which we have just learned.
We need to disarm these bad excuses, and the best way to do that is through leadership and education. Torture is a sign of hubris–of the arrogant feeling that we have the power and knowledge to carry out torture properly. We don’t. The ancient Greeks knew that the antidote to hubris is reverence, a quality singularly missing in modern American life.
Headline image credit: ‘Witness Against Torture: Captive Hands’ by Justin Norman. CC BY-NC-ND 2.0 via Flickr
The holiday season can be an insanely stressful time. Looking for presents, wrapping them, cooking, getting the house ready for visitors, cleaning before and after. Nothing like a normal Saturday night on the couch in front of the TV or with a couple of close friends. The holidays demand perfection. You see it all around you: friends are talking about how stressed out they are, how much they still have to do in just a couple of days. Hyper-decorated stores are talking in their own way. As you approach the 25th of December you still haven’t bought half the gifts you need for family members, the house looks like a bomb crater, and you occasionally wish yourself back in the office with piles of work on your desk waiting to be completed. There are even times when you would exchange a chilly Monday morning and an 8 o’clock meeting for this nerve-racking time that’s supposed to be happy, fun and merry.
What many rattled folks forget in the midst of buying last-minute gifts for loved ones or checking on the unhappy-looking beast in the oven minutes before guests arrive, wishing themselves far away, is that as many as half of the population faces a holiday season without their dearest family members. There are people who have lost their loved ones in gruesome ways. I can’t even begin to imagine how they must feel as they approach each new holiday season. There are people who have lost their parents to old age, people who have gone through heartbreaking divorces, separations and breakups, and people who are overseas defending their country because they have no other choice. The holidays will not be what they once were for any of them. And then there are the single parents, many of whom have decent custody agreements that are “in the best interest of the children.” According to the US Census Bureau, there are more than 10 million single parents in the United States today. Each year millions among them can look forward to days of loneliness because the little ones they really want to spend time with are with the other parent.
When sane parents separate, many judges, thankfully, divide custody equally. Each parent gets his or her fair share of custody, if at all possible. Even when it’s not possible to share the time with the children equally, judges will usually attempt to divide up the holidays evenly. The kids spend every other holiday with mom and every other holiday with dad. It certainly is in the children’s best interest to get to spend some time with each parent. Most kids, with decent moms and dads, would prefer to spend every holiday with both parents. The precious little ones secretly hope for the impossible: That their divorced or separated parents will get back together. But despite their wishes, they adjust to the situation. They have no other choice.
Nor do the parents. As we face the holidays many single parents face a very lonely time. They may be with dear family members: parents, brothers, sisters, nieces, nephews, aunts and uncles. Yet they may nonetheless feel a profound pain in their hearts, even as they watch close relatives savor the pecan pie or scream in delight when they rip open their Christmas presents. Their own children are far away. In most cases the youngsters are in a safe place elsewhere, stuffing their faces with goodies or breaking out laughing when the other grandpa makes a funny face. In most cases single parents know that their children are enjoying themselves in the company of the other caregiver and his or her extended family.
Yet the children are missing from the scenery. Their absence is felt. “It hurts. It hurts every other Christmas when my kids are with their dad during the holidays,” says Wendy Thomas, a St. Louis, Missouri single mother of two girls ages 8 and 5. Thomas shares custody with the girls’ father, who lives in Illinois. “The first year was the hardest but I don’t think I will ever get used to it. Shopping malls and Silent Night make me shiver,” says the 38-year-old entrepreneur. This is her third Christmas and New Year’s without her children.
Each holiday a single parent truly misses his or her children on that one day that is supposed to bring delight to everyone. “It’s going to be a lonely, lonely Christmas without you” may just be tedious background music for the families that didn’t break apart. Each year, however, the oldie causes a tiny tear to run quietly down the cheek of some single caregiver.
But could some of the reported agony over absent children during the holidays be the result of what psychologists call cognitive dissonance, a psychological mechanism we use to justify our choices and conflicting belief sets? For example, you choose to volunteer three hours a week at the local children’s hospital. It’s killing you. You can barely fit in everything else you have to do. But you tell everyone, including yourself, that volunteer work is truly rewarding and every (wo)man’s duty. Making irrational decisions seem rational is a way to preserve your sense of self-worth.
Studies show that the hardship involved in raising children makes us idealize parenthood and consider it an enormously rewarding enterprise. In a study published in the January 2011 issue of the journal Psychological Science, researchers primed 80 parents, each with at least one child, in two different ways. One group was asked to read a document reporting the costs of raising a child. The other parents read the same document as well as a script reporting on the benefits of having raised children when you reach old age. The participants were then given a psychological test assessing their beliefs about parenting. The team found what they expected. Parents who had only read about the financial costs of parenthood initially felt more discomfort than the other group. But they went on to idealize parenthood much more than the other participants, and when interviewed later their negative feelings were gone.
“How do single parents get through Christmas as painlessly as possible?”
Could cognitive dissonance explain why single parents feel empty-handed and depressed during holiday seasons without their children? St. Charles, MO, family counselor Deborah Miller doesn’t think that’s what’s going on. “This year it’s my turn to be one of those parents. I’ll be the first to admit that raising a child is not always a blessing. There are countless times when I feel more like a chauffeur or a waitress or a slave than a free agent with some real me-time.” She thinks the lonely-parent phenomenon is not a manifestation of cognitive dissonance, as we don’t idealize away the pain of being without our children on Christmas or New Year’s. The heartache often doesn’t go away until we see our kids again in January and abruptly remember just how draining it is to raise a child. “I’ll finally get some time to myself, and I know my son will have a blast. But I’ll miss him immensely,” says Miller.
How do single parents get through Christmas as painlessly as possible? The solution is not necessarily to have a huge family gathering with your side of the family to ease the sorrow. A gala dinner on Christmas Day may have its advantages. You can hug your little nieces and nephews and maybe feel a bit of comfort as they open their presents in a way only children can approach surprises. You may feel a teensy bit of wonder (or is it jealousy) as you view your siblings and their spouses exchange loving smiles and their young ones take delight in the simplest of things. “It may work for some but there is a sense in which you will only be a spectator,” says Miller. She recalls her Christmas two years ago. “I felt gratified to be part of a functional family, and it was good to see my siblings interact with their children. I also remember being thankful that my parents were still alive and healthy and that they got one more holiday season with some of their grandchildren. But I also felt great sadness, because the dearest thing in my life wasn’t with me. I really missed my son that day.” This Christmas, Miller is getting together with a few friends. “Sure, we will still have Christmas dinner but there won’t be any children or presents or sacred family traditions. So hopefully I won’t be reminded of what I’m missing out on.”
Featured image credit: Christmas Decorations, by Ian Wilson. CC-BY-2.0 via Flickr.
Renowned US military strategist John Boyd is famous for his signature OODA (Observe-Orientation-Decision-Action) loop, which significantly affected the way that the West approached combat operations and has since been appropriated for use in the business world and even in sports. Boyd wrote to convince people that the Western military doctrine and practice of his day was fundamentally flawed. With this goal in mind, he naturally turned to the East to seek an alternative.
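The OODA cycle can be read as a simple feedback loop over an environment. The sketch below is purely illustrative: the four phase names are Boyd's, but the toy pursuit scenario and every function name are my own assumptions, not anything from his briefings.

```python
# A toy agent running Observe -> Orient -> Decide -> Act repeatedly,
# closing the gap to a fixed target. Only the phase names come from Boyd.

def observe(world):
    return world["target"]                    # Observe: raw sensing

def orient(observation, memory):
    memory.append(observation)                # Orient: fold data into a model
    return sum(memory) / len(memory)          # here, a running estimate

def decide(estimate, position):
    return 1 if estimate > position else -1   # Decide: choose a move

def act(world, move):
    world["position"] += move                 # Act: change the environment

world = {"target": 10, "position": 0}
memory = []
for _ in range(5):                            # five full OODA cycles
    estimate = orient(observe(world), memory)
    act(world, decide(estimate, world["position"]))

print(world["position"])  # -> 5: each cycle moved the agent one step closer
```

The point of the loop form is that orientation is updated on every pass, which is why Boyd stressed cycling through it faster than an adversary can.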
Sun Tzu: The Art of War happened to be the only theoretical book on war that Boyd did not find imperfect; it became his Rosetta stone. Boyd eventually owned seven translations of The Art of War, each with long passages underlined and with copious marginalia. He was at the same time familiar with Taoism (Lao Tzu mainly) and Miyamoto Musashi (a famous Japanese swordsman who practiced Samurai Zen). With this extensive knowledge of Eastern thought, Boyd aimed for an almost full adoption of Sun Tzu’s theory into the Western strategic framework. The theory of Sun Tzu was foreign to his audience’s way of thinking, so in order to convince them of its value he repackaged, rationalized, and modernized Eastern theories using various scientific theories from the West.
Why couldn’t such an adoption take place using existing translations of The Art of War? Boyd understood that he could get nowhere close to the heart of Chinese strategy without first understanding the cognitive and philosophical foundations behind Chinese strategic thought. These foundations are usually lost in translation, causing an impasse in understanding the Chinese strategy that remains today. Hence Boyd made use of new sciences to illuminate what the West had been unable to illuminate before.
For instance, Boyd recreated the naturalistic worldview of Chinese strategy in the Western framework. From this perspective, the OODA loop encompasses much more than a four-phase decision-making model: its real significance is that it reconstructs mental operations based on intuitive thinking and judgment. This kind of intuition is pivotal to strategy and strategic thinking, but was lost as the West embraced a more rational scientific mindset. It is an open secret that the speed and success of the OODA loop comes from a deep intuitive understanding of one’s relationship to the rapidly changing environment. This understanding of one’s environment comes directly from Chinese strategic thought.
Another aspect of Chinese strategic thought that Boyd insisted on capturing and incorporating into the Western strategic framework is yin-yang (yin and yang). Yin-yang has been commonly misunderstood as the Chinese equivalent of “contradictions” in the West. Yin-yang, however, is not considered contradictory or paradoxical by the Chinese, but is actively used to resolve real-life contradictions and paradoxes—the key is to see yin-yang (such as win-lose, enemy-friend, strong-weak) as one concept or continuum, not two opposites. It is this Chinese philosophical and logical concept that forms the strategic chain linking Sun Tzu, Lao Tzu, and Mao Zedong.
Once this “oneness” of things is realized, a strategist will then be able to tap into the valuable strategic information it carries, including the dynamics of situations and relationships between things, resulting in a more complete grasp of a situation, particularly in complex and multifaceted phenomena like war. In short, yin-yang provides an intuitive means for understanding the essence of reality, opening a new door to strategic insights and forecasts that were once inaccessible by using Western methods.
Boyd’s thesis is not a general theory of war but, as one of his biographers noted, a general theory of the strategic behavior of complex adaptive systems in adversarial conditions. It is ironic that the scientific terminology used illustrates the systemic thinking behind Chinese strategic thought applied by Sun Tzu 2,500 years ago, as the terminology of complex adaptive systems and non-linearity did not exist then.
Boyd opened a crucial window of opportunity for Western thought by repackaging and rationalizing Eastern thought. His attempt to adopt Sun Tzu into the Western strategic framework was far from being successful, and many of his proposals have gone unnoticed, but nonetheless Boyd made very significant progress in “synchronizing” Chinese and Western strategy. Once the West grasps the significance behind this unprecedented opportunity to directly absorb and adopt elements of Chinese strategy, it will open many new avenues for the development and self-rectification of Western strategic thought and practices.
As a small boy in the 1920s, my father sang in the choir of the parish church, St Matthews, in Walsall in the British Midlands. Twenty years later, he was married with a couple of children and our small, tight family belonged to the Religious Society of Friends, the Quakers. Friends do not have church services. There is no hymn singing. But every Christmas Eve, religiously as one might say, at three o’clock in the afternoon, the family gathered around the radio to listen to the broadcast of carols and lessons from King’s College, Cambridge.
That was long ago and for me, since I now live in Florida, far away. I have long since lost my faith in the Christian religion. Even if this were not so, I doubt that I would much enjoy Christmas overall. When the kids were little, it was a lot of fun. But now it strikes me as appallingly commercialized: an occasion when you spend way too much on presents no one really wants, eat and drink to excess, and end by quarreling with people you have not seen for a year, by which time you both realize why that is.
But every Christmas Eve I track down the broadcast of the King’s service and listen to it, even though because of time-zone differences it now falls in the morning for me. Music spurs emotions as does no other art form, and I find listening an almost-melancholic experience as memories of my childhood come flooding in and I recall with huge gratitude the loving family into which I was born. I remember also my dedicated teachers recreating civilized life after the horrendous conflicts of the first part of the century. How can one speak except with respect of a man who spent the first half of the decade driving a tank over North Africa and Western Europe, and the second half explaining to nine-year-olds why Pilgrim’s Progress is such a tremendous story and something of vital relevance to us today?
So Christmas remains very important for me, as does the other great highlight of the Christian calendar. As a teenager, having failed O level German miserably, I was packed off one Easter vacation to stay with a family in Germany, so I could (as I did) succeed on the second attempt. Music again. On Good Friday, German radio stations played Bach’s Matthew Passion, and listening to that – even though in respects I prefer the dramatic intensity of the St John Passion – has remained a life-long practice.
Perhaps because it is all so German, I find myself focusing on the dreadful events of the Third Reich, but also – and obviously the theme of Christ’s sacrifice is all-important here – on those who showed super-human qualities in the face of absolute evil and terror. Above all, Sophie Scholl, at twenty-one years old a member of the White Rose group in Munich who started handing out anti-Nazi pamphlets in the middle of the war. Inevitably discovered and condemned to death, as she was led to the guillotine, she said: “How can we expect righteousness to prevail when there is hardly anyone willing to give himself up individually to a righteous cause. Such a fine, sunny day, and I have to go, but what does my death matter, if through us, thousands of people are awakened and stirred to action?”
I would not for anything relinquish the experience of Easter and the moments when I contemplate the truly good people – I think of those combating Ebola in West Africa – who stand so far above me and who inspire me, even though I am not worthy to clean their shoes. You don’t have to have religious faith to have these all-important emotions. You do have to be a human being.
“How can we expect righteousness to prevail when there is hardly anyone willing to give himself up individually to a righteous cause.”
And so finally to the third festival, that of Thanksgiving. Growing up in England, it was something unknown to me until, to go to graduate school, I crossed the Atlantic in 1962. In the early years, in both Canada and America, people invited me into their homes to share the occasion with their family and friends. This is something that has stayed with me for over fifty years, and now at Thanksgiving – by far my favorite festival overall – my wife and I hugely enjoy filling the table with folk who are away from home or for one reason or another would not otherwise have a place to be. No special music this time – although I usually manage to drive everyone crazy by playing opera at full blast – but for me an equally poignant occasion when I reflect on the most important thing I did in my life – to move from the Old World to the New – and on the significance of family and friends and above all of giving. In the Republic, Plato says that only the good man is the happy man. Well, that’s a bit prissy applied to me, but I know what he means. People were kind to me, and my wife and I try to be kind to people. That is a wonderful feeling.
Three festivals – memories and gratitude; sacrifice and honor; giving and friendship. That is why, although I have not a scrap of religious belief and awful though the music in the mall may be, I look forward to Christmas, and then to Easter, and then to Thanksgiving, and to the cycle all over again, many times!
The story of peace is as old as the story of humanity itself, and certainly as old as war. It is a story of progress, often in very difficult circumstances. Historically, peace has often been taken to imply an absence of overt violence or war between or sometimes within states–in other words, a negative peace. War is often thought to be the natural state of humanity, peace of any sort being fragile and fleeting. I would challenge this view. Peace in its various forms has been by far humanity’s more common experience—as the archaeological, ethnographic, and historic records indicate. Much of history has been relatively peaceful and orderly, while frameworks for security, law, redistribution of resources, and justice have constantly been advancing. Peace has been at the centre of the human experience, and a sophisticated version of peace has become widely accepted in modernity, representing a more positive form of peace.
Peace has been organized domestically within the state, internationally through global organizations and institutions, or transnationally through actors whose ambit covers all of these levels. Peace can be public or private. Peace has often been a hidden phenomenon, subservient to power and interests.
The longer term aspiration for a self-sustaining, positive peace via a process aimed at a comprehensive outcome has rarely been attained, however, even with the combined assistance—in recent times—of international donors, the United Nations, World Bank, military forces, or international NGOs.
Peace is also a rather ambiguous concept. Authoritarian governments and powerful states have, throughout history, had a tendency to impose their version of peace on their own citizens as well as those of other states, as with the Soviet Union’s suppression of dissent amongst its own population and those of its satellite states, such as East Germany or Czechoslovakia. Peace and war may be closely connected, such as when military force is deployed to make peace, as with North Atlantic Treaty Organization (NATO) airstrikes in Bosnia-Herzegovina in 1995 and in Yugoslavia in 1999.
Both George Orwell (1903–50), in his novel 1984, and the French social theorist Michel Foucault (1926–84) noted the dangers of the relationship between war and peace in their well-known aphorisms: ‘peace is war’, ‘war is peace’. Nevertheless, peace is closely associated with a variety of political, social, economic, and cultural struggles against the horrors of war and oppression. Peace activism has normally been based on campaigns for individual and group rights and needs, for material and legal equality between groups, genders, races, and religions, disarmament, and to build international institutions. This has required the construction of local and international associations, networks, and institutions, which coalesced around widely accepted agendas. Peace activism supported internationally organized civil society campaigns against slavery in the 18th century, and for basic human dignity and rights ever since. Various peace movements have struggled for independence and self-determination, or for voting rights and disarmament (most famously perhaps, the Campaign for Nuclear Disarmament).
Ordinary people can, and often have, mobilized for peace in societal terms using peaceful methods of resistance. A wealth of historical and contemporary evidence supports a popular desire for a broad, positive form of peace. Recent research indicates that its development will tend to be hybrid. A hybrid peace framework ultimately must represent a wide range of social practices and identities, as well as indicating the coexistence of different forms of state and a widely pluralist international community.
At Powell's, we feel the holidays are the perfect time to share our love of books with those close to us. For this special blog series, we reached out to authors featured in our Holiday Gift Guide to learn about their own experiences with book giving during this bountiful time of year. Today's featured giver [...]
Supposedly, early 20th-century packaging for Quaker Oats depicted the eponymous Quaker holding a package of the oats, where the art on this package depicted the Quaker holding a package of the oats, which itself depicted the Quaker holding a package of the oats, ad infinitum. I have not been able to locate a photograph of the packaging, but more than one philosopher and mathematician has attributed an early interest in the nature of the infinite to childhood contemplation of this image. Here, however, I want to examine a different phenomenon: whether artwork that depicts itself in this way can lead to paradoxes.
Let’s begin with two well-known puzzles. The older of the two – the Liar paradox – was known to ancient Greek philosophers, and challenges the following platitudes about truth:
(T1) A sentence is true if and only if what it says is the case.
(T2) Every sentence is exactly one of true and false.
Consider the Liar sentence:
This sentence is false.
Is the Liar sentence true or false? If it is true, then what it says must be the case. It says it is false, so this means it is false. If it’s false, then, since it says it is false, what it says is indeed the case. But this would make it true. So the Liar sentence is true if and only if it is false, violating the platitudes.
The second puzzle is the Russell paradox, discovered by Bertrand Russell at the beginning of the 20th century. This paradox involves collections, or sets, of objects, and two central theses:
(S1) Given any property P, there is a set of objects containing all and only the objects that have P.
(S2) Sets are themselves objects, and can be contained in sets.
Given (S2), we can divide objects into two types: Those that contain themselves (such as the set containing all sets whatsoever) and those that do not contain themselves (such as the set of all kittens). Thus, “is a set that does not contain itself” picks out a perfectly good property, and so by (S1) there should be a set – let’s call it R – containing exactly those things that have this property. So:
A set is a member of R if and only if it is not a member of itself.
Now, is R a member of itself? Either it is or it isn’t. If R is a member of itself then R isn’t a member of itself. And if R isn’t a member of itself then R is a member of itself. Either way, R both is and isn’t a member of itself. Again, a contradiction.
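The self-membership twist can also be sketched computationally. If we model a “set” as a membership predicate – a function that answers whether a given object is a member (an illustrative encoding of my own, not part of Russell’s original argument) – then the Russell set becomes a function that negates self-membership, and the question of whether it contains itself never bottoms out:

```python
# Model a "set" as a membership predicate: a function that, given an
# object, answers whether that object is a member of the "set".
# (This encoding is illustrative only; it is not standard set theory.)

def russell(s):
    """Russell's set R: contains s exactly when s does not contain itself."""
    return not s(s)

# "Is R a member of itself?" becomes russell(russell), which immediately
# asks the same question again -- mirroring "R is in R iff R is not in R".
try:
    russell(russell)
except RecursionError:
    print("no stable answer: the membership question never terminates")
```

In Python the regress surfaces as a `RecursionError`; in pure logic there is no runtime to blow up, so the regress becomes an outright contradiction – which is why (S1) has to be restricted.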
There is another puzzle that seems intimately connected to these two paradoxes, however, that has not (as far as I know) been noticed or studied – the paradox of the impossible painting. This paradox stems from two principles governing the notion of depiction (or representation) rather than truth or set-theoretic membership.
First, it seems, at least at first glance, that we can paint anything that we can describe – if I tell you to paint a forest with exactly 28 trees, then you can produce a painting fitting that description. Thus:
(D1) Given any description D, we can create a painting that depicts things exactly as described in D.
Second, there is nothing to prevent a painting from being depicted within another painting – for example, Velazquez’s Las Meninas depicts the painter working on another painting. Thus:
(D2) Paintings can be depicted in paintings.
If some paintings can depict other paintings, then it seems like we can divide paintings into two types: those that depict themselves (such as the artwork on old Quaker Oats packaging) and those that do not. Thus, “a scene depicting all and only the paintings that do not depict themselves” is a perfectly good description, and so by (D1) it should be possible to produce a painting – let’s call it I – that depicts things as described. So:
A painting is depicted in I if and only if it does not depict itself.
Should I depict itself? In other words, if you are creating this painting, should you include a depiction of I itself within the scene? If you include I in the painting, then I is a painting that depicts itself, so it should not be depicted in I after all. But if you don’t include I in the painting, then I is a painting that does not depict itself, so it should have been included. Either way, you can’t create a painting that depicts things exactly as described.
The paradox of the impossible painting is distinct from both the Liar paradox and the Russell paradox, since it involves depiction rather than truth or set-membership. But it has features in common with each. Most obviously, circularity plays a central role in all three paradoxes: the Liar paradox involves sentences that say something about themselves, the Russell paradox involves sets that are members of themselves, and the paradox of the impossible painting involves paintings that depict themselves.
Nevertheless, the paradox of the impossible painting has features not shared by the Liar paradox, and other features not shared by the Russell paradox. First, the Liar paradox involves a sentence that clearly exists (and is grammatical, etc.) that must be accounted for, while the Russell paradox can be seen in different terms, as a sort of proof that the Russell set R just doesn’t exist, and that we need to revise (S1) accordingly. The proper response regarding the paradox of the impossible painting is more like the latter – we are not tempted to think that the paradoxical painting does or could exist, but instead conclude that there is something wrong with (D1).
There is another sense, however, in which the paradox of the impossible painting is more like the Liar paradox than the Russell paradox. The Liar paradox arguably arises because of circularity of reference: the Liar sentence refers to, or ‘picks out’, itself. And the paradox of the impossible painting arises because of circularity of depiction – that is, paintings that depict, or ‘pick out’, themselves. Reference and depiction are different, but both are ways of ‘picking out’, while set-theoretic membership is not. This suggests that, in this respect at least, the paradox of the impossible painting has more in common with the Liar paradox than with the Russell paradox.
Thus, the paradox of the impossible painting ‘lies between’, or is a sort of hybrid of, the Liar paradox and the Russell paradox, with some features in common with the former and others in common with the latter. As a result, studying this puzzle further seems likely to reward us with deeper insights into these two much older and more well-known conundra. Who knew oats could be so deep?
On the last Thursday of November, freshmen return home to reunite with their high school sweethearts. Except not all are as sweet as they once were. Your old flame may show up with a new admirer or give you trouble because you didn’t spend enough time on Skype on Saturday nights while away at college. Be prepared: pack an arsenal of tunes that catch the sad and sometimes mixed feelings you may have after Turkey-Dumping Day. For your convenience, here is a list of 10 great breakup songs for a post-Turkey recovery:
10. Pink’s “Blow Me (One Last Kiss)”
One of the more lighthearted tracks to make the list, Pink’s lead single from her sixth studio album The Truth About Love (2012) nonetheless gets the message across: After too much fighting, tears, and sweaty palms, the time comes when turkey is not the only thing you have finally had enough of.
9. Passenger’s “Let Her Go”
Passenger’s second single from the album All the Little Lights (2012) made the list not only because of the soul-wrenching, melodic tune but also because of its spot-on content. Looking into the heart of a dumper, the lyrics forcefully delineate the paradox of love: you don’t really know whether or how much you love someone, until he or she is gone.
8. Christina Perri’s “Human”
The lead single from Perri’s second studio album Head or Heart (2014), this pop power ballad features almost no drumsticks (pun intended). Instead it showcases the American singer’s ethereal voice. And the lyrics hit the nail on the head: Being happier and hotter without your ex may be the best way to get even. But don’t worry if you fail spectacularly, ’cause you’re only a little human.
7. Hilary Duff’s “Stranger”
Tapping into the style and sound of Middle Eastern belly-dance music, Hilary Duff’s single, recorded for her fourth studio album Dignity (2007), is a bouncy yet husky song about suddenly seeing an unkind stranger in the torso of your beloved. After listening to this tune, put on the dumper’s apron before slicing the turkey.
6. Jaymes Young’s “Parachute”
Despite its blunt language, Seattle-born singer Jaymes Young’s fragile ballad made the list because of its lyrics about being lied to and instantly knowing that it’s time to take the “l” out of “lover.”
5. Taylor Swift’s “I Knew You Were Trouble”
Taylor Swift’s bass-heavy dubstep drop, recorded for her fourth studio album Red (2012), aptly warns us about the troublemakers – those types who make you fall in love only to leave you behind.
4. Sam Smith’s “Stay with Me”
Although it’s not quite a turkey-dumping song but rather a desperate-for-love ballad, this gospel-inspired hit from British songwriter Sam Smith’s debut studio album In the Lonely Hour (2014) still made the list. Critics deemed it overly sentimental, but “brutally honest” is evidently a better description.
3. David Guetta’s “Titanium”
French DJ and music producer David Guetta is hard to pass over when it comes to ferocious breakup songs. This 2012 hit from his album Nothing But the Beat gives you relationship hardship and a shot of resilience to help take the pain out of Turkey-Dumping Day.
2. Fefe Dobson’s “Stuttering”
“Dobson can sing,” say the critics. Yes, indeed. The tune and the debated music video leave you stuttering and wondering: Can the green-eyed monster make you that crazy? Yes, it can, not least when the cheater isn’t your man.
1. David Guetta’s “She Wolf”
Katy Perry’s “Part of Me” gets an honorary mention for its heartening lyrics, but it’s David Guetta who takes first place with another ballad, featuring vocals from Australian recording artist Sia. Reflecting on the most poignant of breakups, this impassioned chorus on the feeling of being replaced takes us inside the mind of someone who is “falling to pieces.”
Imagine that you have a one-time-only chance to become a vampire. With one swift, painless bite, you’ll be permanently transformed into an elegant and fabulous creature of the night. As a member of the Undead, your life will be completely different. You’ll experience a range of intense new sense experiences, you’ll gain immortal strength, speed and power, and you’ll look fantastic in everything you wear. You’ll also need to drink the blood of humanely farmed animals (but not human blood), avoid sunlight, and sleep in a coffin.
Now, suppose that all of your friends, people whose interests, views and lives were similar to yours, have already decided to become vampires. And all of them tell you that they love it. They encourage you to become a vampire too, saying things like: “I’d never go back, even if I could. Life has meaning and a sense of purpose now that it never had when I was human. It’s amazing! But I can’t really explain it to you, a mere human. You’ll have to become a vampire to know what it’s like.”
In this situation, how could you possibly make an informed choice about what to do? For, after all, you cannot know what it is like to become a vampire until you become one. The experience of becoming a vampire is transformative. What I mean by this is that it is an experience that is both radically epistemically new, such that you have to have it in order to know what it will be like for you, and moreover, will change your core personal preferences.
So you can’t rationally choose to become a vampire, but nor can you rationally choose to not become one, if you want to choose based on what you think it would be like to live your life as a vampire. This is because you can’t possibly know what it would be like before you try it. And you can’t possibly know what you’d be missing if you didn’t.
We don’t normally have to consider the choice to become Undead, but the structure of this example generalizes, and this makes trouble for a widely assumed story about how we should make momentous, life-changing choices for ourselves. The story is based on the assumption that, in modern western society, the ideal rational agent is supposed to take charge of her own destiny, mapping out the subjective future she hopes to realize by rationally evaluating her options from her authentic, personal point of view. In other words, when we approach major life decisions, we are supposed to introspect on our past experiences and our current desires about what we want our futures to be like in order to guide us in determining our future selves. But if a big life choice is transformative, you can’t know what your future will be like, at least, not in the deeply relevant way that you want to know about it, until you’ve actually undergone the life experience.
Transformative experience cases are special kinds of cases where important ordinary approaches that people try to use to make better decisions, such as making better generalizations based on past experiences, or educating themselves to better evaluate and recognize their true desires or preferences, simply don’t apply. So transformative experience cases are not just cases involving our uncertainty about certain sorts of future experiences. They are special kinds of cases that focus on a distinctive kind of ‘unknowability’—certain important and distinctive values of the lived experiences in our possible futures are fundamentally first-personally unknowable. The problems with knowing what it will be like to undergo life experiences that will transform you can challenge the very coherence of the ordinary way to approach major decisions.
Moreover, the problem with these kinds of choices isn’t just with the unknowability of your future. Transformative experience cases also raise a distinctive kind of decision-theoretic problem for these decisions made for our future selves. Recall the vampire case I started with. The problem here is that, before you change, you are supposed to perform a simulation of how you’d respond to the experience in order to decide whether to change. But the trouble is, who you are changes as you become a vampire.
Think about it: before you become a vampire, you should assess the decision as a human. But you can’t imaginatively put yourself in the shoes of the vampire you will become and imaginatively assess what that future lived experience will be. And, after you have become a vampire, you’ve changed, such that your assessment of your decision now is different from the assessment you made as a human. So the question is, which assessment is the better one? Which view should determine who you become? The view you have when you are human? Or the one you have when you are a vampire?
The questions I’ve been raising here focus on the fictional case of the choice to become a vampire. But many real-life experiences and the decisions they involve have the very same structure, such as the choice to have one’s first child. In fact, in many ways, the choice to become a parent is just like the choice to become a vampire! (You won’t have to drink any blood, but you will undergo a major transition, and life will never be the same again.)
In many ways, large and small, as we live our lives, we find ourselves confronted with a brute fact about how little we can know about our futures, just when it is most important to us that we do know. If that’s right, then for many big life choices, we only learn what we need to know after we’ve done it, and we change ourselves in the process of doing it. In the end, it may be that the most rational response to this situation is to change the way we frame these big decisions: instead of choosing based on what we think our futures will be like, we should choose based on whether we want to discover who we’ll become.
World Philosophy Day was created by UNESCO in 2005 in order to “win recognition for and give strong impetus to philosophy and, in particular, to the teaching of philosophy in the world”. To celebrate World Philosophy Day, we have compiled a list of what we consider to be the most essential philosophy titles. We are also providing free access to several key journal articles and online products in philosophy so that you can explore this discipline in more depth. Happy reading!
Free: Why Science Hasn’t Disproved Free Will by Alfred R. Mele
Does free will exist? The question has fueled heated debates spanning from philosophy to psychology and religion. The answer has major implications, and the stakes are high. To put it in the simple terms that have come to dominate these debates, if we are free to make our own decisions, we are accountable for what we do, and if we aren’t free, we’re off the hook.
Philosophy Bites Again by David Edmonds and Nigel Warburton
This is really a conversation, and conversations are the best way to see philosophy in action. It offers engaging and thought-provoking conversations with leading philosophers on a selection of major philosophical issues that affect our lives. Their subjects include pleasure, pain, and humor; consciousness and the self; free will, responsibility, and punishment; the meaning of life and the afterlife.
Killing in War by Jeff McMahan
This is a highly controversial challenge to the consensus about responsibility in war. Jeff McMahan argues compellingly that if the leaders are in the wrong, then the soldiers are in the wrong.
Reason in a Dark Time by Dale Jamieson
In this book, philosopher Dale Jamieson explains what climate change is, why we have failed to stop it, and why it still matters what we do. Centered in philosophy, the volume also treats the scientific, historical, economic, and political dimensions of climate change.
Poverty, Agency, and Human Rights edited by Diana Tietjens Meyers
Collects thirteen new essays that analyze how human agency relates to poverty and human rights respectively as well as how agency mediates issues concerning poverty and social and economic human rights. No other collection of philosophical papers focuses on the diverse ways poverty impacts the agency of the poor.
Aha! The Moments of Insight That Shape Our World by William B. Irvine
This book incorporates psychology, neurology, and evolutionary psychology to take apart what we can learn from a variety of significant “aha” moments that have had lasting effects. Unlike other books on intellectual breakthroughs that focus on specific areas such as the arts, Irvine’s addresses aha moments in a variety of areas including science and religion.
On What Matters: Volume One by Derek Parfit
Considered one of the most important works in the field since the 19th century, it is written in the uniquely lucid and compelling style for which Parfit is famous. This is an ambitious treatment of the main theories of ethics.
‘What should I do?: Plato’s Crito’ in Philosophy: A Very Short Introduction by Edward Craig
Plato, born around 427 BC, is not the first important philosopher, with the Vedas of India, the Buddha, and Confucius all pre-dating him. However, he is the first philosopher to have left us with a substantial body of complete works that are available to us today, all of which take the form of dialogues. This chapter focuses on the dialogue called Crito, in which Socrates asks ‘What should I do?’
A biography of John Locke in the Oxford Dictionary of National Biography
A philosopher regarded as one of the most influential of Enlightenment thinkers, John Locke was born on 29th August 1632 in Somerset, England. In the late 1650s he became interested in medicine, which led easily to natural philosophy after he was introduced to the new ideas of mechanical philosophy by Robert Boyle. Discover what happened next in Locke’s life with this biography.
‘Computing Machinery and Intelligence’ from Mind, published in 1950.
In this seminal paper, celebrated mathematician and pioneer Alan Turing attempts to answer the question, ‘Can machines think?’, and thus introduces his theory of ‘the imitation game’ (now known as the Turing test) to the world. Turing skilfully debunks theological and ethical arguments against computational intelligence: he acknowledges the limitations of a machine’s intellect, while boldly exposing those of man, ultimately laying the groundwork for the study of artificial intelligence – and the philosophy behind it.
‘Phenomenology as a Resource for Patients’ from The Journal of Medicine and Philosophy, published in 2012
Patient support tools have drawn on a variety of disciplines, including psychotherapy, social psychology, and social care. One discipline that has not so far been used to support patients is philosophy. This paper proposes that a particular philosophical approach, phenomenology, could prove useful for patients, giving them tools to reflect on and expand their understanding of their illness.
Do you have any philosophy books that you think should be added to this reading list? Let us know in the comments below.
Headline image credit: Rays at Burning Man by foxgrrl. CC-BY-NC-SA-2.0 via Flickr.
In the October 9th edition of the New York Review of Books, philosopher John Searle criticized Luciano Floridi’s The Fourth Revolution, noting that Floridi “sees himself as the successor to Copernicus, Darwin, and Freud, each of whom announced a revolution that transformed our self-conception into something more modest.” In the response below, Floridi disputes this claim and many others made by Searle in his review of The Fourth Revolution.
John Searle’s review of The Fourth Revolution – How the Infosphere is Reshaping Human Reality (OUP, 2014) is astonishingly shallow and misguided. The silver lining is that, if its factual errors and conceptual confusions are removed, the opportunity for an informed and insightful reading can still be enjoyed.
The review erroneously ascribes to me a fourth revolution in our self-understanding, which I explicitly attribute to Alan Turing. We are not at the center of the universe (Copernicus), of the biological kingdom (Darwin), or of the realm of rationality (Freud). After Turing, we are no longer at the center of the world of information either. We share the infosphere with smart technologies. These are not some unrealistic AI, as the review would have me suggest, but ordinary artefacts that outperform us in ever more tasks, despite being no cleverer than a toaster. Their abilities are humbling and make us re-evaluate our unique intelligence. Their successes largely depend on the fact that the world has become an IT-friendly environment, where technologies can replace us without having any understanding or semantic skills. We increasingly live onlife (think of apps tracking your location). The pressing problem is not whether our digital systems can think or know, for they cannot, but what our environments are gradually enabling them to achieve. Like Kant, I do not know whether the world in itself is informational, a view that the review erroneously claims I support. What I do know is that our conceptualization of the world is. The distinction is trivial and yet crucial: from DNA as code to force fields as the foundation of matter, from the mind-brain dualism as a software-hardware distinction to computational neuroscience, from network-based societies to digital economies and cyber conflicts, today we understand and deal with the world informationally. To be is to be interactable: this is our new “ontology”.
The review denounces dualisms yet uncritically endorses a dichotomy between relative (or subjective) vs. absolute (or objective) phenomena. This is no longer adequate, because today we know that many phenomena are relational. For example, whether some stuff qualifies as food depends on the nature both of the substance and of the organism that is going to absorb it. Yet relativism is mistaken, because not just any stuff can count as food: sand never does. Likewise, semantic information (e.g. a train timetable) is a relational phenomenon: it depends on the right kind of message and receiver. Insisting on mapping information as either relative or absolute is as naïve as pretending that a border between two nations must be located in one of them.
The world is getting more complex. We have never been so much in need of good philosophy to understand it and take care of it. But we need to upgrade philosophy into a philosophy of information of our age for our age if we wish it to be relevant. This is what the book is really about.
Feature image credit: Macro computer citrcuit board, by Randy Pertiet. CC-BY-2.0 via Flickr.
I read more philosophy books than books on any other topic – and, to be honest, it's probably more than time that RSB reflected that a little more clearly. It's a little difficult suddenly to begin a "justification" for my interests, but I want to start shaping up to providing one, not least because it will help me (I hope) articulate what I find lacking in a number of the works that have been fascinating me of late.
Like many, my head has been (somewhat) turned by the vibrant SR/OOO blogging community. And if you spend any time in this particular pond you soon come across the work of Graham Harman – one of the big fishes.
I find Harman's work... problematic. And I'll come back to that later. But I also find it profoundly engaging, subtle and intellectually exciting. For now, I'll just mention Harman's notion of withdrawal as an example of a technical term that, I think, is particularly fecund.
Levi Bryant defines withdrawal like this: "Withdrawal is a protest against all ambitions of domination, mastery, and exploitation. What withdrawal says is that all entities harbor – as Graham likes to put it – scarcely imagined volcanic cores bubbling beneath the surface that we are never completely able to master or control. It is this from whence his profound respect for things – human and nonhuman – indeed his indignation against those that would try to reduce things to signifiers, concepts, sensations, lived experiences, intuitions, etc., arises. Harman seldom talks about politics or ethics, but who can fail to hear an ethical refrain throughout all his work..."
Harman proposes that no object is ever exhausted by its relations; that an object's real properties are hidden and can never fully be grasped. I find that a fascinating and productive thought. And I find I read so much philosophy because of a love of – and a quest to find – words and phrases, constructions and contortions, that help me form new thought-words and new thought-worlds. Philosophy, for me, is simply a search for better ways to think about the world, and if that means working through some pretty dreadful prose every now and again, so be it. So, when I ask myself why I'm pushing through pages and pages of dry, technical, definitional analysis and forbidding, unforgivable academicese, it's because of the diamonds in the dirt. A term like withdrawal opens something new up for me.
Shaviro's book is useful because it is telling me that process thought is able to shed light on Harman's philosophy, that a dialogue between those two thinkers is helpful to understanding both. It is also bringing my attention back to how very Deleuzian such thought is... all is becoming, nothing is static being, and so you can, I think, map onto that a constant 'flow' between the 'real' and the 'fictional' which doesn't bespeak a 'reality hunger' but more a constant lack in reality which is already always 'over-filled' by the fictive, the constructed. Reality is gappy, and thought is real. There are similarities here to Miguel de Beistegui's Proust as Philosopher. (And the car crash of scare quotes in this paragraph is evidence, of course, that further thought is needed!)
What I'm finding missing in Wolfendale's admirable volume, however, is such food for thought. Wolfendale's Kantian/Sellarsian takedown of Harman was waiting to be written. (You can read a 77-page "taster" in Speculations.) And despite Nick Land's recent comments that Wolfendale's book "deserves to be absorbed in very different terms to those it superficially invites," I'm afraid I find myself amongst the superficial. Wolfendale scores some knockout blows, but Harman bounces back up like a weeble. Wolfendale himself writes: "Whatever else can be said about Harman’s presentation of OOP, it is certainly compelling. On the one hand, it attempts to reveal the inherent oddness of the world we live in, by painting us a landscape of a reality in which everything is radically individual, cut off from everything else in almost every respect, connected only by fleeting glimmers of phenomenal appearance. On the other, it attempts to humble humanity by seeing humans as just one more disparate association of objects within the universal diaspora." Like good fiction, philosophy, for me, doesn't have to prove facts – it doesn't need to limit itself to a theory of knowledge – it needs to open up our minds and make us epistemologically astute. And that starts with fascination, with an aesthetics perhaps. In such a struggle, Wolfendale can't help but come off sounding like something of a humourless pedant. His book does have virtues, however, and, as I said, I do find Harman problematic... but I've written enough for one day.
The elections, thankfully, are finally over, but America’s search for security and prosperity continues to center on ordinary politics and raw commerce. This ongoing focus is perilous and misconceived. Recalling the ineffably core origins of American philosophy, what we should really be asking these days is the broadly antecedent question: “How can we make the souls of our citizens better?”
To be sure, this is not a scientific question. There is no convincing way in which we could possibly include the concept of “soul” in any meaningfully testable hypotheses or theories. Nonetheless, thinkers from Plato to Freud have understood that science can have substantial intellectual limits, and that sometimes we truly need to look at our problems from the inside.
Pierre Teilhard de Chardin, the Jesuit philosopher, inquired, in The Phenomenon of Man: “Has science ever troubled to look at the world other than from without?” This is not a silly or superficial question. Earlier, Ralph Waldo Emerson, the American Transcendentalist, had written wisely in The Over-Soul: “Even the most exact calculator has no prescience that something incalculable may not balk the next moment.” Moreover, he continued later on in the same classic essay: “Before the revelations of the soul, Time, Space, and Nature shrink away.”
That’s quite a claim. What, precisely, do these “phenomenological” insights suggest about elections and consumerism in the present American Commonwealth? To begin, no matter how much we may claim to teach our children diligently about “democracy” and “freedom,” this nation, whatever its recurrent electoral judgments on individual responsibility, remains mired in imitation. More to the point, whenever we begin our annual excursions to Thanksgiving, all Americans are aggressively reminded of this country’s most emphatically soulless mantra.
“You are what you buy.”
This almost sacred American axiom is reassuringly simple. It’s not complicated. Above all, it signals that every sham can have a patina, that gloss should be taken as truth, and that any discernible seriousness of thought, at least when it is detached from tangible considerations of material profit, is of no conceivably estimable value.
Ultimately, we Americans will need to learn an altogether different mantra. As a composite, we should finally come to understand, every society is basically the sum total of individual souls seeking redemption. For this nation, moreover, the favored path to any such redemption has remained narrowly fashioned by cliché, and announced only in chorus.
Where there dominates a palpable fear of standing apart from prevailing social judgments (social networking?), there can remain no consoling tolerance for intellectual courage, or, as corollary, for any reflective soulfulness. In such circumstances, as in our own present-day American society, this fear quickly transforms citizens into consumers.
While we are still citizens, our “education” starts early. From the primary grades onward, each and every American is made to understand that conformance and “fitting in” are the reciprocally core components of individual success. Now, the grievously distressing results of such learning are very easy to see, not just in politics, but also in companies, communities, and families.
Above all, these results exhibit a debilitating fusion of democratic politics with an incessant materialism. Or, as once clarified by Emerson himself: “The reliance on Property, including the reliance on governments which protect it, is the want of self-reliance.”
Nonetheless, “We the people” cannot be fooled all of the time. We already know that nation, society, and economy are endangered not only by war, terrorism, and inequality, but also by a steadily deepening ocean of scientifically incalculable loneliness. For us, let us be candid, elections make little core difference. For us, as Americans, happiness remains painfully elusive.
In essence, no matter how hard we may try to discover or rediscover some tiny hints of joy in the world, and some connecting evidence of progress in politics, we still can’t manage to shake loose a gathering sense of paralyzing futility.
Tangibly, of course, some things are getting better. Stock prices have been rising. The economy — “macro,” at least — is improving.
Still, the immutably primal edifice of American prosperity, driven at its deepest levels by our most overwhelming personal insecurities, remains based upon a viscerally mindless dedication to consumption. Ground down daily by the glibly rehearsed babble of politicians and their media interpreters, we the people are no longer motivated by any credible search for dignity or social harmony, but by the dutifully revered buying expectations of patently crude economics.
Can anything be done to escape this hovering pendulum of our own mad clockwork? To answer, we must consider the pertinent facts. These unflattering facts, moreover, are pretty much irrefutable.
For the most part, we Americans now live shamelessly at the lowest common intellectual denominator. Cocooned in this generally ignored societal arithmetic, our proliferating universities are becoming expensive training schools, promising jobs, but less and less of a real education. Openly “branding” themselves in the unappetizing manner of fast food companies and underarm deodorants, these vaunted institutions of higher education correspondingly instruct each student that learning is just a commodity. Commodities, in turn, learns each student, exist solely for profit, for gainful exchange in the ever-widening marketplace.
Optimally, our students exist at the university in order, ultimately, to be bought and sold. Memorize, regurgitate, and “fit in” the ritualized mold, instructs the college. Then, all be praised, all will make money, and all will be well.
But all is not well. In these times, faced with potentially existential threats from Iran, North Korea, and many other conspicuously volatile places, we prefer to distract ourselves from inconvenient truths with the immense clamor of imitative mass society. Obligingly, America now imposes upon its already-breathless people the grotesque cadence of a vast and over-burdened machine. Predictably, the most likely outcome of this rhythmically calculated delirium will be a thoroughly exhausted country, one that is neither democratic, nor free.
Ironically, we Americans inhabit the one society that could have been different. Once, it seems, we still had a unique opportunity to nudge each single individual to become more than a crowd. Once, Ralph Waldo Emerson, the quintessential American philosopher, had described us as a unique people, one motivated by industry and “self-reliance,” and not by anxiety, fear, and a hideously relentless trembling.
America, Emerson had urged, needed to favor “plain living” and “high thinking.” What he likely feared most was a society wherein individual citizens would “measure their esteem of each other by what each has, and not by what each is.”
No distinctly American philosophy could possibly have been more systematically disregarded. Soon, even if we can somehow avoid the unprecedented paroxysms of nuclear war and nuclear terrorism, the swaying of the American ship will become unsustainable. Then, finally, we will be able to make out and understand the phantoms of other once-great ships of state.
Laden with silver and gold, these other vanished “vessels” are already long forgotten. Then, too, we will learn that those starkly overwhelming perils that once sent the works of Homer, Goethe, Milton, and Shakespeare to join the works of more easily forgotten poets are no longer unimaginable. They are already here, in the newspapers.
In spite of our proudly heroic claim to be a nation of “rugged individuals,” it is actually the delirious mass or crowd that shapes us, as a people, as Americans. Look about. Our unbalanced society absolutely bristles with demeaning hucksterism, humiliating allusions, choreographed violence, and utterly endless political equivocations. Surely, we ought finally to assert, there must be something more to this country than its fundamentally meaningless elections, its stupefying music, its growing tastelessness, and its all-too-willing surrender to near-epidemic patterns of mob-directed consumption.
In an 1897 essay titled “On Being Human,” Woodrow Wilson asked plaintively about the authenticity of America. “Is it even open to us,” inquired Wilson, “to choose to be genuine?” This earlier American president had answered “yes,” but only if we would first refuse to stoop so cowardly before corruption, venality, and political double-talk. Otherwise, Wilson had already understood, our entire society would be left bloodless, a skeleton, dead with that rusty death of machinery, more unsightly even than the death of an individual person.
“The crowd,” observed the 19th century Danish philosopher, Søren Kierkegaard, “is untruth.” Today, following recent elections, and approaching another Thanksgiving, America’s democracy continues to flounder upon a cravenly obsequious and still soulless crowd. Before this can change, we Americans will first need to acknowledge that our institutionalized political, social, and economic world has been constructed precariously upon ashes, and that more substantially secure human foundations now require us to regain a dignified identity, as “self-reliant” individual persons, and as thinking public citizens.
Heading image: Boxing Day at the Toronto Eaton Centre by 松林 Ｌ. CC-BY-2.0 via Wikimedia Commons.
A former colleague of mine once said that the problem with theology is that it has no subject-matter. I was reminded of Nietzsche’s (unwittingly self-damning) claim that those who have theologians’ blood in their veins see all things in a distorted and dishonest perspective, but it was counterbalanced a few years later by a comment of another philosopher – on hearing of my appointment to Heythrop College – that it was good that I’d be working amongst theologians because they are more open-minded than philosophers.
Can one be too open-minded? And isn’t the limit traversed when we start talking about God, or, even worse, believe in Him? Presumably yes, if atheism is true, but it is not demonstrably true, and it is unclear in any case what it means to be either an atheist or a theist. (Some think that theists make God in their own image, and that the atheist is in a better position to relate to God.)
The atheist with whom we are most familiar likewise takes issue with the theist, and A.C. Grayling goes so far as to claim that we should drop the term ‘atheist’ altogether because it invites debate on the ground of the theist. Rather, we should adopt the term ‘naturalist’, the naturalist being someone who accepts that the universe is a natural realm, governed by nature’s laws, and that it contains nothing supernatural: ‘there is nothing supernatural in the universe – no fairies or goblins, angels, demons, gods or goddesses’.
I agree that the universe is a natural realm, governed by nature’s laws, and I do not believe in fairies or goblins, angels, demons, gods or goddesses. However, I cannot accept that there is nothing supernatural in the universe until it is made absolutely clear what this denial really means.
The trouble is that the term ‘naturalism’ is so unclear. To many it involves a commitment to the idea that the scientist has the monopoly on nature and explanation, in which case the realm of the supernatural incorporates whatever is not natural in this scientific sense.
Others object to this brand of naturalism on the ground that there are no good philosophical or scientific reasons for assigning the limits of nature to science. As John McDowell says: ‘scientism is a superstition, not a stance required by a proper respect for the achievements of the natural sciences’.
McDowell endorses a form of naturalism which accommodates value, holding that it cannot be adequately explained in purely scientific terms. Why stick with naturalism? In short, the position – in its original inception – is motivated by sound philosophical presuppositions.
It involves acknowledging that we are natural beings in a natural world, and gives expression to the demand that we avoid metaphysical flights of fancy, ensuring that our claims remain empirically grounded. To use the common term of abuse, we must avoid anything spooky.
The scientific naturalist is spooked by anything that takes us beyond the limits of science; the more liberal or expansive naturalist is not. However, the typical expansive naturalist stops short of God. Understandably so, given his wish to avoid metaphysical flights of fancy, and given the assumption that such a move can be criticised on this score.
Yet what if his reservations in this context can be challenged in the way that he challenges the scientific naturalist’s reluctance to accept his own position? (The scientific naturalist thinks that McDowell’s values are just plain spooky, and McDowell challenges this complaint on anti-scientistic grounds.)
McDowell could object that the two cases are completely different – God is spooky in the way that value is not. Yet this response simply begs the question against the alternative framework at issue – a framework which challenges the assumption that God must be viewed in these pejorative terms.
The idea that there is a naturalism to accommodate God does not mean that God is simply part of nature – I am not a pantheist – but it does mean that the concept of the divine can already be understood as implicated in our understanding of nature, rather than being thought of as entirely outside it.
So I am rejecting deism to recuperate a form of theistic naturalism which will be entirely familiar to the Christian theist and entirely strange (and spooky) to the typical atheist who is a typical naturalist. McDowell is neither of these things – that’s why his position is so interesting.
The inherent significance of bioethics and social science in medicine is now widely accepted… at least on the surface. Despite an assortment of practical problems—limited curricular time compounded by increased concern for “whitespace”—few today deny outright that ethical practice and humanistic patient engagement are important and need to be taught. But public acknowledgements all too often are undercut by a different reality, a form of hidden curriculum that overpowers institutional rhetoric and the best-laid syllabi. Most medical schools now make an effort to acknowledge that ethics and humanities training is part of their mission and we have seen growing inclusion of bioethics and medical humanities in medical curricula. However, more curricular time, in and of itself, is not enough.
Even with increases in contact hours, the value of medical ethics and humanities can be undercut by problems of frequency and duration. Many schools have dedicated significant time to bioethics when measured in contact hours, but in the form of intensive seminars that are effectively quarantined from the rest of the curriculum. While this is a challenge for modular curricula in general, it can be harder for students to integrate ethics and humanities content into biomedical contexts. Irrespective of the number of contact hours, placing bioethics in a curricular ghetto risks sending a message that it simply is a hoop to jump through, something to eventually be set aside as one returns to the real curriculum.
While partitioning ethics and humanities content presents problems, the integration of ethics into systems-based curricula poses different challenges. While case-based formats make integration easier, they limit the extent to which one can teach core concepts themselves. For organ systems curricula, where ethics lectures often are “sprinkled in,” the linkages with the biomedical components of the course are underspecified or inherently weak. Medical ethics and humanities are diffused in actual practice such that attempts at thematic alignment with organ systems curricula often are noticeably artificial. In turn, there is an unintentional but palpable message that ethics is an interruption to medical learning. Anyone who has delivered an ethics lecture sandwiched between two pathology lectures in a GI course knows this feeling only too well.
Finally, there is a misalignment of goals and assessment in bioethics that remains a significant challenge. Certainly, one goal of ethics and humanities education in medical curricula is to provide concrete information about legal directives and consensus opinions. Most of us, however, want to go beyond a purely instrumental approach to ethics and promote the ability to empathize with patients and think critically about ethical and humanistic features of patient care. These issues are much more important than an instrumental approach. While there are a variety of ways to assess these higher-order capacities within a course, board exams loom large in the medical student consciousness (and rightfully so). On a multiple choice exam, being reflexive about one’s ethical framework and exploring the large supply of contingencies surrounding a particular case is a recipe for disaster. In turn, I often find myself encouraging students to pursue interesting and creative lines of thought or to challenge consensus statements from professional bodies, only to end the discussion by warning that they should abandon all such efforts on board exams. Most would agree that ethics is a dialogical activity, yet the examinations with the highest stakes send hidden messages that it is formulaic and instrumental. When “assessment drives learning,” it is difficult for students to set aside concerns about gateway exams and engage the genuine complexity of ethics.
While these challenges are curricular, pedagogical, and even cultural, I think there are practical ways that medical schools, and even individual instructors, can destabilize the messages of this hidden curriculum. First, with regard to assessment, we can teach both complex and instrumental ethical methodologies. While this may appear a rather dismal prospect, it can be made respectable by explicating the conditions under which each way of thinking is useful (e.g. the former in real life, the latter on exams). Students then learn not only to turn particular test-taking strategies on and off; the practice also bolsters their ability to be critical and reflexive—in this case about instrumental processes of ethical decision-making that are problematic, but nonetheless widespread, even in practice.
Second, we need to move beyond simply including more bioethics education and toward addressing its rhythms within our curricula. I have been fortunate enough to recently join a new medical school unencumbered by a historical (read: petrified) curriculum. In addition to an institutional culture genuinely amenable to ethics and humanities, our curriculum utilizes longitudinal courses that run in parallel to the biomedical systems courses. Instructors therefore have the ability to build the sort of conceptual complexity that truly attends ethics and students have the spaced practice that is key to their development. This structure therefore avoids the problems both of quarantining and random inclusion.
Finally, bioethics curricula need to place less emphasis on information and make greater use of “threshold concepts”. No medical curriculum affords enough time to exhaust the terrain of bioethics and medical humanities. Certainly we need to accept the reality that we typically are not training ethics and humanities scholars but, at a minimum, physicians with those competencies and, more ideally, physicians who embody those values. However, where the idea of delivering ethics at an appropriate level for physicians often serves as a call for simplicity, I believe it supplies a warrant for focusing on our most complex concepts, which also are the most generative and useful. When training practitioners, epistemological concepts—for example, integrative and differentiating ways of thinking—often are eschewed in favor of simpler kinds of information that promote instrumental applications to situations and a limited ability to engage the messy nuances of real-world situations. Richer, more complex threshold concepts—like the sociological imagination (the ability to see the interweaving of macro- and micro-level phenomena)—are broadly relevant and transposable to any number of complex situations.
In the contemporary landscape, few deny outright the significance of ethics and humanities in medicine. But the explicit messaging about their importance remains outmatched by implicit messages hidden in curricula. Having just returned from the annual meeting of the American Society for Bioethics and Humanities, I cannot help but feel that we are spending too much time fighting old battles by repetitiously announcing the relevance of bioethics and too little time confronting the more insidious, hidden messages nestled deeper in the trenches of curriculum and pedagogy. This is a critical challenge.
Please find below a pastiche of Alice’s Adventures in Wonderland that illustrates what it means to choose rationally:
‘Sit down, dear’, said the White Queen.
Alice perched delicately on the edge of a chair fashioned from oyster-shells.
‘Coffee, or tea, or chocolate?’, enquired the Queen.
‘I’ll have chocolate, please.’
The Queen turned to the Unicorn, standing, as ever, behind the throne: ‘Trot along to the kitchen and bring us a pot of chocolate if you would. There’s a good Uni.’
Off he trots. And before you can say ‘jabberwocky’ he is back: ‘I’m sorry, Your Majesty, and Miss Alice, but we’ve run out of coffee.’
‘But I said chocolate, not coffee’, said a puzzled Alice.
The Unicorn was unmoved: ‘I am well aware of that, Miss. As well as a horn I have two good ears, and I’m not deaf’.
Alice thought again: ‘In that case’, she said, ‘I’ll have tea, if I may?’
‘Of course you may,’ replied the Queen. ‘But if you do, you’ll be violating a funny little thing that in the so-called Real World is known as the contraction axiom; in Wonderland we never bother about such annoyances. In the Real World they claim that they do, but they don’t.’
‘Don’t they?’ asked Alice.
‘No. I’ve heard it said, though I can scarce believe it, that their politicians ordain that a poor girl like you when faced with the choice between starving or taking out a payday loan is better off if she has only the one option, that of starving. No pedantic worries about contraction there (though I suppose your waist would contract, now I come to think of it). But this doesn’t bother me: like their politicians, I am rich, a Queen in fact, as my name suggests’.
‘On reflection, I will revert to chocolate, please. And do they have any other axes there?’
‘Axioms, child, not axes. And yes, they do. They’re rather keen on what they call their expansion axiom – the opposite, in a sense, of their contraction axiom. What if Uni had returned from the kitchen saying that they also had frumenty – a disgusting concoction, I know – and you had again insisted on tea? Then as well making your teeth go brown you’d have violated that axiom.’
‘I know I’m only a little girl, Your Majesty, but who cares?’
‘Not I, not one whit. But people in the Real World seem to. If they satisfy both of these axiom things they consider their choice to be rational, which is something they seem to value. It means, for example, that if they prefer coffee to tea, and tea to chocolate, then they prefer coffee to chocolate.’
‘Well, I prefer coffee to tea, tea to chocolate, and chocolate to coffee. And why shouldn’t I?’
‘Because, poor child, you’ll be even poorer than you are now. You’ll happily pay a groat to that greedy little oyster over there to change from tea to coffee, pay him another groat to change from coffee to chocolate, and pay him yet another groat to change from chocolate to tea. And then where will you be? Back where you started from, but three groats the poorer. That’s why if you’re not going to be rational you should remain in Wonderland, or be a politician.’
This little fable illustrates three points. The first is that rationality is a property of patterns of choice rather than of individual choices. As Hume famously noted in 1739, ‘it is not contrary to reason to prefer the destruction of the whole world to the scratching of my finger; it is not contrary to reason for me to chuse [sic] my total ruin to prevent the least uneasiness of an Indian’. However, it seems irrational to choose chocolate when the menu comprises coffee, tea, and chocolate; and to choose tea when it comprises just tea and chocolate. It also seems irrational to choose chocolate from a menu that includes tea; and to choose tea from a larger menu. The second point is that making consistent choices (satisfying the two axioms) and having transitive preferences (not cycling, as does Alice) are, essentially, the same thing: each is a characterisation of rationality. And the third point is that people are, on the whole, rational, for natural selection weeds out the irrational: Alice would not lose her three groats just once, but endlessly.
These three points are equally relevant to the trivia of our daily lives (coffee, tea, or chocolate) and to major questions of government policy (for example, the regulation of the loan market).
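The money-pump arithmetic behind the Queen's warning can be made concrete in a few lines of code. The sketch below is my own illustration, not part of the fable: the function name, the graph encoding, and the one-groat fee per swap are all assumptions chosen for the example. It detects a cycle in a strict preference relation and totals what a trader could extract on each lap around it.

```python
# A minimal sketch of the money pump: cyclic preferences let a trader
# charge a fee for each swap, returning the chooser to where she
# started, poorer. The encoding and the one-groat fee are illustrative.

def find_cycle(prefers):
    """Return one preference cycle as a list of goods, or None.

    `prefers` is a set of (better, worse) pairs: (a, b) means the
    agent strictly prefers a to b, so she will pay to swap b for a.
    """
    graph = {}
    for better, worse in prefers:
        graph.setdefault(worse, []).append(better)  # edge: worse -> better

    def dfs(node, path, seen):
        if node in path:                     # revisited a node on this path
            return path[path.index(node):]   # -> the cycle itself
        if node in seen:
            return None
        seen.add(node)
        for nxt in graph.get(node, []):
            cycle = dfs(nxt, path + [node], seen)
            if cycle:
                return cycle
        return None

    seen = set()
    for start in graph:
        cycle = dfs(start, [], seen)
        if cycle:
            return cycle
    return None

# Alice's intransitive preferences from the fable:
alice = {("coffee", "tea"), ("chocolate", "coffee"), ("tea", "chocolate")}
cycle = find_cycle(alice)
fee_per_swap = 1  # one groat per trade
print(cycle)                       # a cycle through all three drinks
print(len(cycle) * fee_per_swap)   # 3 groats lost per lap, endlessly
```

With transitive preferences (say, coffee over tea, tea over chocolate, and coffee over chocolate) the function returns `None`: there is no lap for the greedy little oyster to charge for, which is exactly the sense in which consistency and transitivity come to the same thing.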
Featured image credit: ‘Drink me Alice’, by John Tenniel. Public domain via Wikimedia Commons
One of the most interesting trends in recent philosophy is what is sometimes called Speculative Realism. The name comes from a conference in 2007 at the University of London that brought together four very different philosophers who nevertheless were united in their efforts to resurrect realist metaphysics: Quentin Meillassoux, Ray Brassier, Graham Harman, and Iain Hamilton Grant. Each of them holds a quite different metaphysical position, but all four critique what they name "philosophies of correlation." As a theologian and not a philosopher, I can't help but make a connection to my field here. Just as the Radical Orthodox movement identifies a key moment in the history of philosophy (for RO, this is Duns Scotus' univocity) that leads to its destructive decline, the Speculative Realists point back to Kant's apparently disastrous argument that the thing-in-itself is unknowable.
I've just started this myself. Lots of Whitehead, and lots of good sense so far...
Marcus Aurelius’s Meditations is a remarkable phenomenon, a philosophical diary written by a Roman emperor, probably in 168-80 AD, and intended simply for his own use. It offers exceptional insights into the private thoughts of someone who had a very weighty public role, and may well have been composed when he was leading a military campaign in Germany. What features might strike us today as being especially valuable, bearing in mind our contemporary concerns?
At a time when the question of public trust in politicians is constantly being raised, Marcus emerges, in this completely personal document, as a model of integrity. Not only does he define for himself his political ideal (“a monarchy that values above all things the freedom of the subject”) and spell out what this ideal means in his reflections on the character and lifestyle of his adoptive father and predecessor as emperor, Antoninus Pius, but he also reminds himself repeatedly of the triviality of celebrity, wealth and status, describing with contempt the lavish purple imperial robe he wore as stained with “blood from a shellfish”. Of course, Marcus was not a democratic politician and, with hindsight, we can find things to criticize in his acts as emperor — though he was certainly among the most reasonable and responsible of Roman emperors. But I think we would be glad if we knew that our own prime ministers or presidents approached their role, in their most private hours, with an equal degree of thoughtfulness and breadth of vision.
Another striking feature of the Meditations, and one that may well resonate with modern experience, is the way that Marcus aims to combine a local and universal perspective. In line with the Stoic philosophy that underpins his diary, Marcus often recalls that the men and women he encounters each day are fellow-members of the brotherhood of humanity and fellow-citizens in the universe. He uses this fact to remind himself that working for his brothers is an essential part of his role as an emperor and a human being. This reminder helps him to counteract the responses of irritation and resentment that, he admits, the behavior of other people might otherwise arouse in him. At a time when we too are trying to bridge and negotiate local and global perspectives, Marcus’s thoughts may be worth reflecting on. Certainly, this seems to me a more balanced response than ignoring the friend or partner at your side in the café while engrossed in phone conversations with others across the world.
More broadly, Marcus, again in line with Stoic thinking, underlines that the ethics of human behavior need to take account of the wider fact that human beings form an integral part of the natural universe and are subject to its laws. Of course, we may not share his confidence that the universe is shaped by order, structure and providential care — though I think it is worth thinking seriously about just how much of that view we have to reject. But the looming environmental crisis, along with the world-wide rise in obesity and the alarming healthcare consequences, represent for us a powerful reminder that we need to rethink the ethics of our relationship to the natural world and re-examine our understanding of what is natural in human life. Marcus’s readiness to see himself, and humanity, as inseparable parts of a larger whole, and to subordinate himself to that whole, may serve as a striking example to us, even if the way we pursue that thought is likely to be different from that of Stoicism.
Another striking theme in the Meditations is the looming presence of death, our own and those of others we are close to. This might seem very alien to the modern secular Western world, where death is often either ignored or treated as something too terrible to mention. But the fact that Marcus’s attitude is so different from our own may be precisely what makes it worth considering. He not only underlines the inevitability of death and the fact that death is a wholly natural process, and for that reason something we should accept. He couples this with the claim that knowledge of the certainty of death does not undermine the value of doing all that we can while alive to lead a good human life and to develop in ourselves the virtues essential for this life. Although such ideas have often formed part of religious responses to death (which have lost their hold over many people today), Marcus puts them in a form that modern non-religious people can accept. This is another reason, I think, why Marcus’s philosophical diary can speak to us today in a language we can make sense of.
Featured image: Marcus Aurelius’s original statue in Rome, by Zanner. Public domain via Wikimedia Commons.
Hi, folks, this week is another response blog. I heard a song called Constellations by Brendan James and it resonated with me. This is a long ramble, a thought journey, inspired by that song, and I hope that you find something to take with you.
I feel like I don't really understand the world, and it makes me cry. I feel so out of step with the seasons and times. I can't stand reading the news, or even checking my Facebook half the time. There are too many wars. Nation against nation. Neighbor against neighbor. Here inside me, I hunger to see people come together, to take a deep breath and just figure out where to go from here. I hope bridges are built, coalitions are made, and every voice is heard. I dream that we would all listen and find better ways. I don't want to join the madding crowd that wants to heckle the stupid, drop bombs, and dehumanize others, all in the name of a better world.
I see the Universe at night and how it is able to spin out wondrous things and at the same time wreak great destruction. I feel the transience of life and yet eternity hums in my heart. Everyone I know is trying to get through the day without dwelling on the darkness. Some take the "be positive about everything" route. Some take the "find a cause" route. I swing between the route of despair and the route of hope, that I might be the voice that breaks through the noise and says something helpful.
I have had unshakable confidence throughout my life that if I got a chance on a stage I would move the hearts of those shivering on the edges. I have believed that I would grow like a wild weed, but now see so clearly that my life is just a breath and is gone. A Monarch butterfly was caught between the window and the screen in my house. Some hapless caterpillar had crawled between the window and screen and formed a chrysalis. The butterfly emerged and would die if I did not figure out a gentle way to remove the screen and let it go on its way to the graveyards of Mexico for the Day of the Dead. When I figured out a way to set the butterfly free, it occurred to me that all of my life might be just for that. Perhaps those beautiful wings have more purpose than I will ever have.
This brings me to the heart of this thought journey. I have hungered for purpose. I have believed all my life that a day was coming that the gifts within me would become visible, like the span over us -- Orion, the Pleiades, the evening star, the moon, and the swath of the Milky Way. I have believed my gifts would come clear like those lights in the heavens. But here I am making less than minimum wage and imploding under the stress of another miss in terms of my intended goal.
In the end we are not in control of our story, and hence I must embrace the days given us. I find embracing the smallness of who I am difficult. Megalomania is expected in rock stars, but not here in Suburbia. I have to laugh at myself a little and laugh at my little dramas. There is certainly a ridiculousness to me.
Ah, you are just an onion flower in the yard. Most folks will pass by the onion flower but, hey, go ahead and bloom. Touch ten hearts, fifty hearts. A copper star for you. Not the silver, not the gold. That's all, dear. Work it out.
Thank you for dropping by and remember every little thing shines. See you next week.
Causation is now commonly supposed to involve a succession that instantiates some lawlike regularity. This understanding of causality has a history that includes various interrelated conceptions of efficient causation that date from ancient Greek philosophy and that extend to discussions of causation in contemporary metaphysics and philosophy of science. Yet the fact that we now often speak only of causation, as opposed to efficient causation, serves to highlight the distance of our thought on this issue from its ancient origins. In particular, Aristotle (384-322 BCE) introduced four different kinds of “cause” (aitia): material, formal, efficient, and final. We can illustrate this distinction in terms of the generation of living organisms, which for Aristotle was a particularly important case of natural causation. In terms of Aristotle’s (outdated) account of the generation of higher animals, for instance, the matter of the menstrual flow of the mother serves as the material cause, the specially disposed matter from which the organism is formed, whereas the father (working through his semen) is the efficient cause that actually produces the effect. In contrast, the formal cause is the internal principle that drives the growth of the fetus, and the final cause is the healthy adult animal, the end point toward which the natural process of growth is directed.
From a contemporary perspective, it would seem that in this case only the contribution of the father (or perhaps his act of procreation) is a “true” cause. Somewhere along the road that leads from Aristotle to our own time, material, formal and final aitiai were lost, leaving behind only something like efficient aitiai to serve as the central element in our causal explanations. One reason for this transformation is that the historical journey from Aristotle to us passes by way of David Hume (1711-1776). For it is Hume who wrote: “[A]ll causes are of the same kind, and that in particular there is no foundation for that distinction, which we sometimes make betwixt efficient causes, and formal, and material … and final causes” (Treatise of Human Nature, I.iii.14). The one type of cause that remains in Hume serves to explain the producing of the effect, and thus is most similar to Aristotle’s efficient cause. And so, for the most part, it is today.
However, there is a further feature of Hume’s account of causation that has profoundly shaped our current conversation regarding causation. I have in mind his claim that the interrelated notions of cause, force and power are reducible to more basic non-causal notions. In Hume’s case, the causal notions (or our beliefs concerning such notions) are to be understood in terms of the constant conjunction of objects or events, on the one hand, and the mental expectation that an effect will follow from its cause, on the other. This specific account differs from more recent attempts to reduce causality to, for instance, regularity or counterfactual/probabilistic dependence. Hume himself arguably focused more on our beliefs concerning causation (thus the parenthetical above) than, as is more common today, directly on the metaphysical nature of causal relations. Nonetheless, these attempts remain “Humean” insofar as they are guided by the assumption that an analysis of causation must reduce it to non-causal terms. This is reflected, for instance, in the version of “Humean supervenience” in the work of the late David Lewis. According to Lewis’s own guarded statement of this view: “The world has its laws of nature, its chances and causal relationships; and yet — perhaps! — all there is to the world is its point-by-point distribution of local qualitative character” (On the Plurality of Worlds, 14).
Admittedly, Lewis’s particular version of Humean supervenience has some distinctively non-Humean elements. Specifically — and notoriously — Lewis has offered a counterfactual analysis of causation that invokes “modal realism,” that is, the thesis that the actual world is just one of a plurality of concrete possible worlds that are spatio-temporally disconnected. One can imagine that Hume would have said of this thesis what he said of Malebranche’s occasionalist conclusion that God is the only true cause, namely: “We are got into fairy land, long ere we have reached the last steps of our theory; and there we have no reason to trust our common methods of argument, or to think that our usual analogies and probabilities have any authority” (Enquiry concerning Human Understanding, §VII.1). Yet the basic Humean thesis in Lewis remains, namely, that causal relations must be understood in terms of something more basic.
And it is at this point that Aristotle re-enters the contemporary conversation. For there has been a broadly Aristotelian move recently to re-introduce powers, along with capacities, dispositions, tendencies and propensities, at the ground level, as metaphysically basic features of the world. The new slogan is: “Out with Hume, in with Aristotle.” (I borrow the slogan from Troy Cross’s online review of Powers and Capacities in Philosophy: The New Aristotelianism.) Whereas for contemporary Humeans causal powers are to be understood in terms of regularities or non-causal dependencies, proponents of the new Aristotelian metaphysics of powers insist that regularities and dependencies must be understood rather in terms of causal powers.
Should we be Humean or Aristotelian with respect to the question of whether causal powers are basic or reducible features of the world? Obviously I cannot offer any decisive answer to this question here. But the very fact that the question remains relevant indicates the extent of our historical and philosophical debt to Aristotle and Hume.
In July 2014, the Ukrainian President, Petro Poroshenko, claimed that Ukraine wasn’t fighting a civil war in the east of the country but rather was “defending its territory from foreign mercenaries.” Conversely, rumours abounded earlier in the year that Academi, the firm formerly known as Blackwater, was operating in support of the Ukrainian government (which Academi strongly denied). What is interesting is not simply whether these claims are true, but also their rhetorical force. Being a mercenary, and using mercenaries, is seen as one of the worst moral failings in a conflict.
Regardless of the accuracy of the claims and counterclaims about their use in Ukraine, the increased use of mercenaries or ‘private military and security companies’ is one of the most significant transformations of military force in recent times. In short, states now rely heavily on private military and security companies to wage wars. In the First Gulf War, there was a ratio of roughly one contractor to every 100 soldiers; by 2008 in the Second Gulf War, that ratio had risen to roughly one to one. In Afghanistan, the ratio was even higher, peaking at 1.6 US-employed contractors per soldier. The total number of Department of Defense contractors (including logistical contractors) reached approximately 163,000 in Iraq in September 2008 and approximately 117,000 in Afghanistan in March 2012. A lot of the media attention surrounding the use of private military and security companies has been on the use of armed foreign contractors in conflict zones, such as Blackwater in Iraq. But the vast majority of the industry provides much more mundane logistical services, such as cleaning and providing food for regular soldiers.
Does this help to remove the pejorative mercenary tag? The private military and security industry has certainly made a concerted effort to rid itself of the tag, given its rhetorical force. Industry proponents claim private military and security companies are different to mercenaries because of their increased range of services, their alleged professionalism, their close links to employing states, and their corporate image. None of these alleged differences, however, provides a clear—i.e. an analytically necessary—distinction between mercenaries and private military and security companies. After all, mercenaries could offer more services, could be professional, could have close links to states, and could have a flashy corporate image. Despite the proclamations of industry proponents, private military and security companies might still then be mercenaries.
But what, if anything, is morally wrong with being a mercenary or a private contractor? Could one be an ethical mercenary? In short, yes. To see this, suppose that you go to fight for a state that is trying to defend itself against attack from a genocidal rebel group, which is intent on killing thousands of innocent civilians. You get paid handsomely for this, but this is not the reason why you agree to fight—you just want to save lives. If fighting as a private contractor will, in fact, save lives, and any use of force will only be against those who are liable, is it morally permissible to be a contractor? I think so, given the import of saving lives. As such, mercenaries/private contractors might behave ethically sometimes.
Does this mean that we are incorrect to view mercenaries/private contractors as morally tainted? This would be too quick. We need to keep in mind that, although the occasional mercenary/private contractor might be fully ethical, it seems unlikely that they will be in general. There are at least two reasons to be sceptical of this. First, although there may be exceptions, it seems that financial considerations will often play a greater role in the decision for mercenaries/private contractors to take up arms than for regular soldiers. And, if we think that individuals should be motivated by concern for others rather than self-interest (manifest through the concern for financial gain), we should worry about the increased propensity for mercenary motives. Second, although it may be morally acceptable to be a mercenary/private contractor when considered in isolation, there is a broader worry about upholding and contributing to the general practice of mercenarism and the private military and security industry. One should be wary about contributing to a general practice that is morally problematic, such as mercenarism.
To elaborate, the central ethical problems surrounding private military force do not concern the employees, but rather the employers of these firms. The worries include the following:
that governments can employ private military and security companies to circumvent many of the constitutional and parliamentary—and ultimately democratic—constraints on the decision to send troops into action;
that it is questionable whether these firms are likely to be effective in the theatre, because, for instance, contractors and the firms can more easily choose not to undertake certain operations; and
that there is an abrogation of a state’s responsibility of care for those fighting on its behalf (private contractors generally don’t receive the same level of support after conflict as regular soldiers since political leaders are often less concerned about the deaths of private contractors).
There are also some more general worries about the effects of a market for private force on the international system. Such a market makes it harder to maintain the current formal constraints (e.g. existing international laws) on the frequency and awfulness of warfare, which are designed for the statist use of force. And a market for force can be expected to increase international instability by enabling more wars and unilateralism, as well as by increasing the ability of state and non-state actors to use military force.
These are the major problems of mercenarism and the increased use of private military force. To that extent, I think that behind the rhetorical force of the claims about mercenaries in Ukraine there are good reasons to be worried about their use, if not in Ukraine (where the facts are still to be ascertained), then more generally elsewhere. Despite the increased use of private military and security companies and the claims that they differ from mercenaries, we should be wary of the use of private military and security companies as well.
According to standard interpretations of 19th-century European philosophy, a stark ’either / or’ divided Hegel and Kierkegaard, and this divide profoundly shaped the subsequent development of Continental philosophy well into the 20th century. While left Hegelians carried on the legacy of Hegel’s rationalism and universalism, existentialists and postmodernists found inspiration, at least in part, in Kierkegaard’s critique of systematic philosophy, rationality, and socially integrated subjectivity. In Kierkegaard’s Relation to Hegel Reconsidered, Jon Stewart provides a detailed historical argument which challenges the standard assumption that Kierkegaard’s position was developed in opposition to Hegel’s philosophy, and as such is antithetical to it. (It is worth noting that, in Hegel: Myths and Legends, Stewart criticized the ’either / or’ from the other direction, arguing that Hegel is not the arch-rationalist he is often taken to be.) Without denying the existence of a certain “metalevel” dispute between Hegel and Kierkegaard, Stewart argues that (a) many of Kierkegaard’s central ideas, such as the theory of stages, are creatively, i.e., not uncritically, adopted from Hegel, and (b) the true target of Kierkegaard’s critique is not Hegel per se, but prominent Danish Hegelians of his time. According to Stewart, ignorance of Kierkegaard’s intellectual milieu, coupled with a distorted and inadequate understanding of Hegel, has led many English-speaking critics to adopt the overly simple ’either / or’. Stewart seeks to correct this problem by showing how Kierkegaard’s writing rose out of, and responded primarily to, debates in Denmark in the 1830s and ’40s surrounding Hegel’s philosophy and its implications for theology.
With the public trial of 'Casino Capitalism' underway, Collapse VIII examines a pervasive image of thought drawn from games of chance. Surveying those practices in which intellectual resources are most acutely concentrated on the production of capitalizable risk, the volume uncovers the conceptual underpinnings of methods developed to extract value from contingency - in the casino, in the markets, in life.
The ancient writers of Greece and Rome are familiar to many, but what do their voices really tell us about who they were and what they believed? In Twelve Voices from Greece and Rome, Christopher Pelling and Maria Wyke provide a vibrant and distinctive introduction to twelve of the greatest authors from ancient Greece and Rome, writers whose voices still resonate across the centuries. Below is an infographic that shows how each of the great classical authors would describe their voice today, if they could.