Sarah Noonan, Intern
Yesterday David L. Bosco blogged for us about Obama’s speech in front of the UN General Assembly. Below is an excerpt from his new book, Five to Rule Them All: The UN Security Council and the Making of the Modern World, which tells the inside story of this remarkable diplomatic creation, illuminating the role of the Security Council in the postwar world and making a compelling case for its enduring importance. In the excerpt below we are introduced to the Security Council.
The Security Council is like no other body in history. Its five permanent members (China, France, Russia, the United Kingdom, and the United States) account for nearly 30 percent of the world’s population and more than 40 percent of global economic output. In military affairs, their dominance is even more overwhelming. They control more than 26,000 nuclear warheads, 99 percent of all those in existence. They have a combined 5.5 million men and women in arms. When the Council is united, its members can wage war, impose blockades, unseat governments, and levy sanctions, all in the name of the international community. There are almost no limits to the body’s authority.
The council usually meets in the UN headquarters complex on New York’s East River, but it has greater power and authority than the rest of the sprawling organization. The council is a creature of great-power politics, not international bureaucracy. It is built on the assumption that five of the strongest nations have the right and duty to safeguard the globe. Most of the UN structure insists that member states are equal; the council, by contrast, grants the most powerful countries special rights and responsibilities.
The idea that the great powers should chaperone the world is not new. Coalitions of powerful nations in the eighteenth and nineteenth centuries (including the Congress of Vienna and the Holy Alliance) tried before. The Geneva-based League of Nations, inaugurated in 1920, was the world’s answer to the horror of the First World War. It constituted the first fully developed world political organization, and it had a council of major powers charged with preserving the peace. The league and its council died prematurely when they failed to prevent an even more devastating war, but the idea of a world organization endured.
During its almost seven decades of operation, the UN Security Council has launched a broad range of diplomatic, legal, and even military initiatives to provide order. Since the late 1980s, its activities have increased dramatically. The council has blessed armed interventions in places like Bosnia, Somalia, Haiti, and Kuwait. It has imposed sanctions on the regimes in Serbia, Libya, and Sudan; launched war crimes courts to try sitting heads of state; and targeted terrorist finances. During the Cold War, the United States usually felt comfortable exercising its military power without the council’s permission. No longer. Even the George W. Bush administration, with its deep skepticism of the United Nations, worked to get the council’s approval for its policies. For many, the 2003 U.S.-led invasion of Iraq demonstrated the perils of operating without the council’s blessing, and the body has emerged from the imbroglio active and relevant. In 2007, the council authorized peacekeeping missions that involved more than 100,000 troops from dozens of nations. From nuclear proliferation to the global war against terrorism to genocide in Africa, the council is often the cockpit for global politics.
Yet even the council’s vigorous post-Cold War activity has fallen well short of effective global governance. Atrocities and crimes against humanity still plague many parts of the globe. Entire countries have collapsed, and in so doing they have exported refugees, drugs, and radicalism. Since the 1980s, Pakistan, India, and North Korea have tested nuclear weapons while the council watched. These shortcomings have led to frequent and angry charges that it is feckless, impotent, and unprincipled. More than a few commentators have charged that the United Nations and its council are an impediment rather than an aid to world order.
The council’s new activism has stirred hopes that it will assure world order, stop atrocities, and counter global threats like terrorism and weapons proliferation. Yet it exists in a world of realpolitik. Its members are, above all, powerful states with their own diverging interests. Time and again, the council’s performance has dashed hopes that its members would somehow rise above their narrow interests and work together to establish a more peaceful and just world.
Megan Branch, Intern
The only foods that I can think of as being as “American as apple pie” are recipes that have been lifted from other countries: pizza, sushi and, of course, Chinese food. College in New York has meant that I eat a lot of Chinese food. In his new book, Chop Suey: A Cultural History of Chinese Food in the United States, Andrew Coe chronicles Chinese food’s journey across the ocean and into the hearts of Americans everywhere. Below, I’ve excerpted a passage from Chop Suey in which Coe details the earliest written account of an American’s experience eating Chinese food for the first time almost 200 years ago.
Nevertheless, the first account we have of Americans eating Chinese food does not appear until 1819, thirty-five years after Shaw’s visit. It was written by Bryant Parrott Tilden, a young trader from Salem who acted as supercargo on a number of Asia voyages. In Guangzhou, he was befriended by Paunkeiqua, a leading merchant who cultivated good relations with many American firms. Just before Tilden’s ship was set to sail home, Paunkeiqua invited the American merchants to spend the day at his mansion on Honam island. Tilden’s account of that visit, which was capped by a magnificent feast, is not unlike the descriptions Shaw or even William Hickey wrote a half century earlier. First, he tours Paunkeiqua’s traditional Chinese garden and encounters some of the merchant’s children yelling “Fankwae! Fankwae!” (“Foreign devil! Foreign devil!”). Then Paunkeiqua shows him his library, including “some curious looking old Chinese maps of the world as these ‘celestials’ suppose it to be, with their Empire occupying three quarters of it, surrounded by ‘nameless islands & seas bounded only by the edges of the maps.’” Finally, his host tells him: “Now my flinde, Tillen, you must go long my for catche chow chow tiffin.” In other words, dinner was served in a spacious dining hall, where the guests were seated at small tables.
“Soon after,” Tilden writes, “a train of servants came in bringing a most splendid service of fancy colored, painted and gilt large tureens & bowls, containing soups, among them the celebrated bird nest soup, as also a variety of stewed messes, and plenty of boiled rice, & same style of smaller bowls, but alas! No plates and knives and forks.” (By “messes,” Tilden probably meant prepared dishes, not unsavory jumbles.)
The Americans attempted to eat with chopsticks, with very poor results: “Monkies [sic] with knitting needles would not have looked more ludicrous than some of us did.” Finally, their host put an end to their discomfort by ordering western-style plates, knives, forks, and spoons. Then the main portion of the meal began:
Twenty separate courses were placed on the table during three hours in as many different services of elegant china ware, the messes consisting of soups, gelatinous food, a variety of stewed hashes, made up of all sorts of chopped meats, small birds cock’s-combs, a favorite dish, some fish & all sorts of vegetables, rice, and pickles, of which the Chinese are very fond. Ginger and pepper are used plentifully in most of their cookery. Not a joint of meat or a whole fowl or bird were placed on the table. Between the changing of the courses, we freely conversed and partook of Madeira & other European wines—and costly teas.
After fruits, pastries, and more wine, the dinner finally came to an end. Tilden and his friends left glowing with happiness (and alcohol) at the honor Paunkeiqua had shown them with his lavish meal. Nowhere, however, does Tilden tell us whether the Americans actually enjoyed these “messes” and “hashes.”
Dr. David Kilcullen is one of the world’s leading experts on guerrilla warfare. He has served in every theater of the “War on Terrorism” since 9/11 as Special Advisor for Counterinsurgency to Secretary of State Condoleezza Rice, Senior Counterinsurgency Advisor to General David Petraeus in Iraq, and chief counterterrorism strategist for the U.S. State Department. In his new book, The Accidental Guerrilla: Fighting Small Wars in the Midst of a Big One, Kilcullen takes us on the ground to uncover the face of modern warfare, illuminating both the global challenges and small wars across the world. In the excerpt below we learn why Afghanistan is so very difficult and so very important in this struggle.
People often speak of “the Iraq War” and “the war in Afghanistan” as if they were separate conflicts. But as we have seen, Afghanistan is one theater in a larger confrontation with transnational takfiri terrorism, not a discrete war in itself. Because of commitments elsewhere (principally Iraq), the United States and its allies have chosen to run this campaign as an “economy of force” operation, with a fraction of the effort applied elsewhere. Most of what has happened in Afghanistan results from this, as much as from local factors. Compared to other theaters where I have worked, the war in Afghanistan is being run on a shoestring. The country is about one and a half times the size of Iraq and has a somewhat larger population (32 million, of whom about 6 million are Pashtun males of military age), but to date the United States has resourced it at about 27 percent of the funding given to Iraq, and allocated about 20 percent of the troops deployed in Iraq (29 percent counting allies). In funding terms, counting fiscal year 2008 supplemental budget requests, by 2008 operations in Iraq had cost the United States approximately $608.3 billion over five years, whereas the war in Afghanistan had cost about $162.6 billion over seven years: in terms of overall spending, about 26.7 percent of the cost of Iraq, or a monthly spending rate of about 19.03 percent that of Iraq. In addition to lack of troops and money, certain key resources, including battlefield helicopters, construction and engineering resources, and intelligence, surveillance, and reconnaissance (ISR) assets, have been critically short.
Resource allocation in itself is not a sign of success (arguably in Iraq we have spent more than we can afford for limited results), but expenditure is a good indicator of government attention. Thus the international community’s failure to allocate adequate resources for Afghanistan bespeaks an episodic strategic inattention, a tendency to focus on Iraq and think about Afghanistan only when it impinges on public opinion in Western countries, NATO alliance politics, global terrorism, or the situation in Pakistan or Iran, while taking ultimate victory in Afghanistan for granted. Two examples spring to mind: the first was when Admiral Michael G. Mullen, chairman of the U.S. Joint Chiefs of Staff, remarked in congressional testimony in December 2007 that “in Afghanistan, we do what we can. In Iraq, we do what we must,” implying that Afghan issues by definition play second fiddle to Iraq, receiving resources and attention only as spare capacity allows. The reason for Admiral Mullen’s remark emerges from the second, larger illustration of this syndrome: by invading Iraq in 2003, the United States and its allies opened a second front before finishing the first, and without sufficient resources to prosecute both campaigns effectively. Western leaders committed this strategic error primarily because of overconfidence and a tendency to underestimate the enemy: they appear to have taken for granted that the demise of the Taliban, scattered and displaced but not defeated in 2001, was only a matter of time.
These leaders would have done well to remember the words of Sir Olaf Caroe, a famous old hand of the North-West Frontier of British India, ethnographer of the Pashtuns, and last administrator of the frontier province before independence, who wrote in 1958 that “unlike other wars, Afghan wars become serious only when they are over; in British times at least they were apt to produce an after-crop of tribal unrest [and]…constant intrigue among the border tribes.” Entering Afghanistan and capturing its cities is relatively easy; holding the country and securing the population is much, much harder: as the Soviets (with “assistance,” and a degree of post-Vietnam schadenfreude, from Washington) discovered to their cost, like the British, Sikhs, Mughals, Persians, Mongols, and Macedonians before them. In Afghanistan in 2001, as in Iraq in 2003, the invading Western powers confused entry with victory, a point the Russian General Staff lost no time in pointing out. The Taliban movement’s phenomenal resurgence from its nadir of early 2002 underlines this point: the insurgents’ successes seem due as much to inattention and inadequate resourcing on our part as to talent on theirs.
Afghanistan is also a very different campaign from Iraq, though the two conflicts are linked through shared Western political objectives and cooperation between enemy forces. The Iraq campaign is urban, sectarian, primarily internal, and heavily centered on Baghdad. The Afghan campaign is overwhelmingly rural, centered on the Pashtun South and East, with a major external sanctuary in Pakistan and, as of 2008, greater support for the effort in Afghanistan than for Iraq (though rhetoric often does not translate into action). Afghanistan is seen as a war of necessity, “the good war,” the “real war on terrorism.” This gives the international community greater freedom of action than in Iraq.
Perhaps counterintuitively, events in Afghanistan also have greater proportional impact than those in Iraq: effort there has greater effect than equal effort in Iraq. A brigade (3,000 people) in Afghanistan is worth a division or more (10,000-12,000) in Iraq, in terms of its proportionate effect on the ground. Regardless of the outcome in Iraq, Afghanistan still presents an opportunity for a positive long-term legacy for Western intervention, if it results in an Afghan state capable of effectively responding to its people’s wishes and meeting their needs.
Conversely, although the American population and the international community are inured to negative media reporting about Iraq, they are less used to downbeat reporting about Afghanistan. Most people polled in successive opinion surveys have tended to assume that the Afghan campaign is going reasonably well, hence Taliban successes or sensational attacks in Afghanistan may actually carry greater political weight than equivalent events in Iraq, a campaign that is so unpopular and about which opinion is so polarized that people tend to assume it is going less well than is actually the case.
In the article below, written several weeks ago before Obama was President-elect, scholar Steven Niven, Executive Editor of the African American National Biography and the forthcoming Dictionary of African Biography, examined the historic candidacy of Barack Obama within the context of the civil rights movement and the changing nature of black politics. This article originally appeared on The Oxford African American Studies Center.
Barack Obama Jr., the first African American presidential nominee of a major political party, was born in Honolulu, Hawaii, on August 4, 1961. His birth coincided with a crucial turning point in the history of American race relations, although like many turning points it did not seem so at the time. Few observers believed that Jim Crow was in its death throes. Seven years after the Supreme Court’s Brown ruling, less than 1 percent of southern black students attended integrated schools. Southern colleges had witnessed token integration at best. In early 1961 Charlayne Hunter-Gault and Hamilton Holmes integrated the University of Georgia, but James Meredith’s application to enter Ole Miss that same year was met by Mississippi authorities with a “carefully calculated campaign of delay, harassment, and masterly inactivity,” in the words of federal judge John Minor Wisdom. Despite the Civil Rights Acts of 1957 and 1960 and promises from the new administration of President John F. Kennedy, the voting rights of African Americans remained virtually nonexistent in large swathes of Mississippi, Alabama, and Louisiana.
However, the Freedom Rides that began in the summer of 1961 and the voting rights campaign that Robert P. Moses initiated in McComb County, Mississippi, in the very week of Obama’s birth, signaled a hardening of African American resistance. There was among a cadre of activists a new determination to confront both segregation and the extreme caution of the Kennedy administration on civil rights. Later that fall, Bob Moses wrote a note from the freezing drunk-tank in Magnolia, Mississippi, where he and eleven others were being held for attempting to register black voters. “This is Mississippi, the middle of the iceberg. This is a tremor in the middle of the iceberg from a stone that the builders rejected.”
Over the next three years, Moses, Stokely Carmichael, James Farmer, James Forman, John Lewis, Diane Nash, Marion Barry, James Bevel, Bob Zellner, and thousands of activists devoted their lives to shattering that iceberg. Some, including Jimmy Lee Jackson, James Cheney, Andrew Goodman, and Michael Schwerner gave their lives in that cause. They took the civil rights struggle to the heart of the segregationist South: to McComb, Jackson, and Philadelphia, Mississippi, to Albany, Georgia, and to Birmingham, Alabama. By filling county jails and prison farms, by facing fire hoses, truncheons, and worse, they ultimately made segregation and disfranchisement untenable, paving the way for the 1964 Civil Rights Act and the 1965 Voting Rights Act.
Obama’s childhood experience of the dramatic changes wrought by the 1960s, seen from the vantage point of Hawaii and Indonesia, necessarily differed from that of most African American contemporaries in rural Mississippi or urban Detroit. But it would be a mistake to argue that he was untouched by those developments. His black Kenyan father, Barack Obama Sr., met his white Kansan mother, Ann Dunham, at the University of Hawaii, where the older Obama had gone to study on a program founded by his fellow Luo, Tom Mboya. Mboya’s program received financial support from civil rights stalwarts, including Jackie Robinson, Harry Belafonte, and Sidney Poitier. After Obama Sr. left his wife and child behind in 1963, Ann Dunham became the dominant figure in young Barry Obama’s formative years, and Obama has argued that the values his mother taught greatly shaped his worldview. Those values were largely secular, but grounded in the church-based interracial idealism of the early 1960s civil rights movement—the beloved community inspired by Martin Luther King Jr.’s rhetoric, Fannie Lou Hamer’s heroic activism, and Mahalia Jackson’s gospel singing.
After Obama returned from Indonesia in 1971 to live with his white grandparents and attend high school in Hawaii, his formal education was abetted by his friendship with “Frank,” an African American drinking buddy of his grandfather, who tutored the young Obama in the history of black progressive struggles. The scholar Gerald Horne has speculated that Frank may have been Frank Marshall Davis, a pioneering radical journalist in the 1930s whose jazz criticism and poetry was influential in the Black Arts Movement of the 1960s. Davis, a Kansas native, moved to Hawaii in the late 1940s.
Obama’s work as an anti-poverty activist in Chicago in the 1980s likewise built on the legacy of Arthur Brazier and other 1960s community organizers influenced by Saul Alinsky. Arriving in Chicago in the era of Harold Washington also helped school Obama in the ways of Chicago politics. As the director of a major “get-out-the-vote” drive in Illinois in the 1992 elections, he helped elect both Bill Clinton to the presidency and Carol Moseley Braun to the U.S. Senate. Connections through his wife, Michelle Robinson Obama, who lived in Chicago’s working-class black South Side, was a schoolfriend of Santita Jackson (daughter of Jesse Jackson), and was an aide to Mayor Richard Daley, certainly helped Obama win friends and influence the right people in Chicago’s Democratic Party. The luster of his fame as the first African American president of the prestigious Harvard Law Review, as well as his self-evident political and rhetorical skills, undoubtedly marked Obama out from the general pack of political hopefuls. In 1996 he easily won a seat in the Illinois Senate, representing a district that encompassed the worlds of both “Obama the University of Chicago Law Professor”—liberal, wealthy, and cosmopolitan Hyde Park—and “Obama the community organizer”—the district’s poorer neighborhoods, which housed the headquarters of Operation Breadbasket.
Obama’s achievements in the Illinois legislature were solid, though not spectacular. His cool demeanor, cerebral approach, and links to Hyde Park liberalism irked established black leaders in Springfield, veterans of the civil rights struggles of the 1960s and 1970s, who viewed Obama as a Johnny-come-lately who had not paid his dues. The charge that he was somehow “not black enough” came to the fore in his unsuccessful primary challenge for the U.S. congressional seat of the former Black Panther, Bobby Rush, in 2000. Although Obama secured a majority of white primary voters, Rush won the vast majority of black voters and defeated Obama by a margin of 2 to 1, successfully depicting him as a Harvard-educated, Hyde Park elitist at odds with the more prosaic values of the mainly black working-class district.
Despite that setback, Obama stunned political observers four years later by winning the 2004 Democratic primary for U.S. Senate in Illinois, and then crushing his (admittedly very weak) Republican opponent in the general election, Alan Keyes. Keyes, a black, ultraconservative, fundamentalist pro-lifer who had been a minor diplomat in the Ronald Reagan administration and had few direct links to Illinois, was placed on the ballot after the primaries because the Republican primary winner had dropped out following a sex scandal. Obama also benefited from a well-received keynote address to the 2004 Democratic National Convention (DNC) in Boston. It was at the DNC that most Americans first heard and saw the self-described “skinny kid with a funny name,” who urged his fellow citizens to look beyond the fierce partisanship that had characterized politics since the 1990s.
“The pundits like to slice and dice our country into Red [Republican] States and Blue [Democratic] States,” he told the watching millions.
But I’ve got news for them, too. We worship an awesome God in the Blue States, and we don’t like federal agents poking around in our libraries in the Red States. We coach Little League in the Blue States and yes, we’ve got some gay friends in the Red States. . . . We are one people, all of us pledging allegiance to the Stars and Stripes, all of us defending the United States of America.
Obama’s arrival in the Senate in January 2005 provoked significant media interest, verging on what some have called Obama-mania. He was, after all, only the fifth African American to serve in that body in 215 years, following Hiram Rhodes Revels, Blanche Bruce, Edward Brooke, and Moseley-Braun. But media scrutiny and the popular interest in the new candidate went far beyond the attention given to Moseley-Braun in 1992. In part, this was because Obama’s election symbolized a broader generational shift in African American politics. Black political gains in the 1970s, 1980s, and 1990s were largely achieved by a generation of politicians who came of age in the southern civil rights movement, like John Lewis, Eva Clayton, Vernon Jordan, and Andrew Young, or in urban Democratic politics, such as Charles Rangel in New York and Willie Brown in San Francisco. Obama was not the only Ivy League–educated black politician to emerge in the early 2000s. In 2002, Artur Davis, like Obama a Harvard Law School graduate, won election to the House of Representatives from Alabama; in 2006, Deval Patrick, another Harvard Law graduate, became only the second African American elected governor of a state (Massachusetts) since Reconstruction; and Cory Booker (Yale Law School and the Queen’s College, Oxford) and Michael Nutter (University of Pennsylvania) were elected mayors of Newark and Philadelphia in 2006 and 2007, respectively. Harold Ford Jr., a Penn grad and Tennessee congressman, narrowly lost a U.S. Senate race in Tennessee in 2006. Patrick and Nutter are a few years older than Obama, while Davis (b. 1967), Booker (b. 1969), and Ford (b. 1970) are slightly younger.
In terms of ideology, there are also similarities in these politicians’ commitment to post-partisanship, although Ford, now leader of the Democratic Leadership Council, and Davis have been more willing to adopt socially as well as economically conservative positions, so as to broaden their appeal as possible statewide candidates in the South.
But perhaps the most remarkable facet of Obama-mania is the rapidity with which the freshman Senator was discussed as a possible presidential candidate in 2012 or 2016. Or rather that would have been the most remarkable facet, had Obama not sought and then won the Democratic nomination for president in 2008! It is hard to think of a comparable American politician whose rise has been so swift, dramatic, or unforeseen, except maybe that other, most famous Illinois politician, Abraham Lincoln.
Whether Obama follows Lincoln as the second U.S. president from Illinois is unknown at this time of writing, five weeks from Election Day, 2008. At the very least, Obama’s candidacy marks another tremor in the iceberg that Bob Moses faced in that Magnolia drunk-tank in the fall of 1961, and that James Meredith faced down while integrating Ole Miss in the face of a full-force white riot a year later. It is, then, all too fitting—and a reasonable marker of American progress in race relations—that forty-six years later Barack Obama became the first African American to participate in a presidential debate, not just in Mississippi, but at Ole Miss itself, the hallowed symbol of segregationist resistance.
Cliff Kincaid and his colleagues have accused Frank Marshall Davis (1905-1987) of being a Stalinist, a lifetime member of the Communist Party, and “Obama’s Communist Mentor.” Kincaid heads “Accuracy In Media” (A.I.M.), an organization dedicated to “fairness, balance and accuracy in news reporting.” Kincaid and his colleagues are all honorable men.
Having asked Kincaid to substantiate some of his accusations, and having received no reply, one can only conclude that Kincaid must be preoccupied with more important questions, because Kincaid is an honorable man.
Although the ironically named “Accuracy In Media” has yet to substantiate that Davis actually mentored Obama (a claim specifically rejected by the Obama campaign), such a relationship could have provided a bi-racial teenager with the key to success in mainstream America. To minimize criticism and maximize their potential, bi-racial African-Americans must walk a narrow identity path between group expectations. Davis was uniquely qualified to show the way. He may have significantly facilitated Obama’s vision of an inclusive society.
Davis may have advised Obama that to succeed in mainstream America, African-Americans must consider worst-case scenarios without wearing a chip on their shoulders, even though this normally requires judgment borne through actual experience. They must learn to give others (such as Cliff Kincaid) the benefit of the doubt. Success within mainstream America requires that abusive behavior should be attributed to bias only when there is no other plausible explanation.
Growing up in explicitly racist America, Davis learned never to immediately trust anybody white, but professional and personal familiarity nevertheless produced many warm interracial relationships. While one could plausibly argue that such stereotype activation is inherently racist, the recent expansion of “racist” to include such thought processes (as opposed to differential treatment) renders virtually everyone similarly culpable.
Although his experience with Jim Crow may have “incurably” limited his expectations of contemporary America, he shared Dr. King’s dream of a color-blind society. He unequivocally rejected racism, and worked tirelessly in support of equal rights for all Americans. He recognized that although victims of racism may have a reason to hate their oppressors, such reasons do not become rights – a distinction often lost on his critics. Collective responsibility is a double-edged sword.
Davis’s crusty radicalism may have perfectly counterbalanced Hawaii’s laid-back lifestyle for an African-American teenager destined for greatness. He provided coherent insight on African-American history, politics, and culture vis-à-vis mainstream America. Davis recognized the folly of cultural nationalism, including Black Separatist movements, long before meeting Obama. Obama’s grandfather may not have recognized the true value of his gift.
Although he may have used CPUSA periodicals as a publishing tool in the 1930s and 1940s, along with contemporaries Richard Wright and Langston Hughes, he rejected communist ideology in general and specifically attacked Stalin. He supported a fully integrated mixed economy, because neither laissez-faire capitalism nor collectivism provides the greatest benefit for the greatest number. He also had a libertarian streak that may have made Ron Paul proud.
Davis retired from activism by the 1970s. His civil rights agenda had become the law of the land. He wrote little. Even if he had remained prolific, the burgeoning black publishing world obviated the need for CPUSA periodical support for African-American writers. Further, the barbarity of communist regimes discredited the CPUSA and Marxist ideology. Newly divorced, he entered his golden years with glee.
As an honorable man, Kincaid must be unaware that by the 1970s, the twin forces of Hawaiian and hippie cultures had mellowed Davis to the point that “Stalinist” charges are especially absurd. By the early 1970s, Davis had become a virtual teddy bear, a permanent fixture of the Koa Cottages in the “Waikiki Jungle,” noted for its counterculture residents. Davis was known as a kindly old man, usually sitting on his porch a few steps from Kuhio Avenue, waving at all who passed. Although he had little money, he was always willing to share with those in deeper need.
As an honorable man, Cliff Kincaid must also be unaware that his portrayal of a raving Stalinist could not be further from the truth. Davis deeply loved the United States, despite his occasional flirtation with radical ideology. He recognized, perhaps belatedly, that the United States offers a unique combination of economic opportunity and personal freedom, thus providing sufficient strength and moral authority to champion human rights worldwide. Had he indeed been so lucky, Barack Obama could not have found a finer mentor anywhere.