Viewing: Blog Posts Tagged with: higher education, Most Recent at Top
Results 1 - 25 of 49
What does international law truly mean in the world today? For the publication of Malcolm Evans’s International Law, Fourth Edition, we asked several leading figures that question. Ralph Zacklin, the former UN Assistant Secretary General for Legal Affairs, provides his personal perspective on international law in the edited essay below. A full version of his essay can be found on the textbook’s Online Resource Centre, along with five other personal perspectives.
By Ralph Zacklin
I have been privileged to work for almost thirty years as an international lawyer in the United Nations and from this vantage point international law is neither the omnipotent solution to the world’s problems nor is it an illusion that only die-hard pacifists cling to. It is, in fact, for the practitioner a very real and pragmatic discipline. That it may be uncertain, incomplete, and difficult to enforce does not lessen the need for the rule of law on the international plane nor does it mean that the efforts to codify the law and develop its institutions should cease or be diminished.
At the core of contemporary international law is the Charter of the United Nations. It is a tribute to its drafters in the San Francisco Conference that this instrument has retained its essential validity as a set of fundamental principles which have guided the community of States for more than fifty years. It is the basis for the development of much of international law as we know it today in such key areas as human rights, the environment, and the law of the sea and outer space, not to mention the vast array of multilateral treaties in numerous technical, economic, and scientific areas.
International law provides a common legal vocabulary within which States and other actors operate. It provides a framework for conceptions of what is ‘legal’ or ‘right’. For the author personally, the most striking lesson of the last thirty years is not the quantitative and qualitative development of international law, which has been substantial, but the degree to which States have come to accept the existence of international law as a standard that must be observed or by which their actions must be justified.
There is another dimension to international law which is sometimes overlooked in an era of globalization. International law, however inchoate it may be, represents the expectations and claims of substantial segments of humanity. It cannot be dismissed merely because of its perceived weakness. This dimension is of particular relevance to the member States of the United Nations, the overwhelming majority of whom rely on international law-making processes in international forums to weave together the fabric of the rule of law.
This accounts for the persistence of the United Nations in the holding of major conferences or summits––much derided in some quarters––which have produced soft law Declarations on the environment, human rights, advancement of women and a panoply of economic and social rights. These fora move from agenda-setting gradually towards normative outcomes and have undeniably altered the international legal landscape over the past twenty-five years.
Law, whether domestic or international, is by nature a conservative discipline. Its evolution is slow, even laborious. International law is not, nor should it be, viewed as an ideal state in which harmony prevails. Like any other system of law, its rules and institutions mature over time. When one compares the international law of today with that of a mere three decades ago, one cannot but marvel at the advances that have been made both normatively and institutionally. The path of advancement is by no means uneventful but it continues.
I have been fortunate in my own career to have had the opportunity to contribute to significant developments in international law, such as the establishment of ad hoc criminal tribunals for Yugoslavia and Rwanda as well as, more recently, the Special Court in Sierra Leone. Over the years I have provided legal advice which has helped to shape much of the contemporary law of UN peace-keeping and, like many of my colleagues, have rejoiced in the completion of UN mandates which have resulted in the independence of countries such as Namibia and Timor-Leste. There have also been tragic failures in Rwanda, Bosnia, and Somalia.
At the outset of my career I was motivated, like many young people of the time, by an idealistic determination to make the world a safer and a better place. Over the years my idealism has certainly been tested, but I believe that the role and impact of international law have grown, and they continue to grow.
Ralph Zacklin is the former UN Assistant Secretary General for Legal Affairs. Malcolm Evans is a Professor of Public International Law at the University of Bristol. Malcolm Evans is the editor of International Law, which provides wide-ranging analysis of all the key issues and themes in public international law and brings together an outstanding collection of interesting and diverse writings from the leading scholars in the field.
Oxford University Press is a leading publisher in international law, including the Max Planck Encyclopedia of Public International Law, latest titles from thought leaders in the field, and a wide range of law journals and online products. We publish original works across key areas of study, from humanitarian to international economic to environmental law, developing outstanding resources to support students, scholars, and practitioners worldwide. For the latest news, commentary, and insights follow the International Law team on Twitter @OUPIntLaw.
Subscribe to the OUPblog via email or RSS.
Subscribe to only law articles on the OUPblog via email or RSS.
How do you hear the call of the poet to the Muse that opens every epic poem? The following is an extract from Barry B. Powell’s new free verse translation of The Odyssey by Homer. It is accompanied by two recordings: one of the first 105 lines in Ancient Greek, the other of the first 155 lines in the new translation. How does your understanding change in each of the different versions?
Sing to me of the resourceful man, O Muse, who wandered
far after he had sacked the sacred city of Troy. He saw
the cities of many men and he learned their minds.
He suffered many pains on the sea in his spirit, seeking
to save his life and the homecoming of his companions.
But even so he could not save his companions, though he wanted to,
for they perished of their own folly—the fools! They ate
the cattle of Helios Hyperion, who took from them the day
of their return. Of these matters, beginning where you want,
O daughter of Zeus, tell to us.
Now all the rest
were at home, as many as had escaped dread destruction,
fleeing from the war and the sea. Odysseus alone
a queenly nymph, Kalypso, a shining one among the goddesses,
held back in her hollow caves, desiring that he become
her husband. But when, as the seasons rolled by, the year came
in which the gods had spun the threads of destiny
that Odysseus return home to Ithaca, not even then
was he free of his trials, even among his own friends.
All the gods pitied him, except for Poseidon.
Poseidon stayed in an unending rage at godlike Odysseus
until he reached his own land. But Poseidon had gone off
to the Aethiopians who live far away—the Aethiopians
who live split into two groups, the most remote of men—
some where Hyperion sets, and some where he rises.
There Poseidon received a sacrifice of bulls and rams,
sitting there and rejoicing in the feast.
The other gods
were seated in the halls of Zeus on Olympos. Among them
the father of men and gods began to speak, for in his heart
he was thinking of bold Aigisthos, whom far-famed Orestes,
the son of Agamemnon, had killed. Thinking of him,
he spoke these words to the deathless ones: “Only consider,
how mortals blame the gods! They say that from us
comes all evil, but men suffer pains beyond what is fated
through their own folly! See how Aigisthos pursued
the wedded wife of the son of Atreus, and then he killed
Agamemnon when he came home, though he well knew
the end. For we spoke to him beforehand, sending Hermes,
the keen-sighted Argeïphontes, to say that he should not kill
Agamemnon and he should not pursue Agamemnon’s wife.
For vengeance would come from Orestes to the son of Atreus,
once Orestes came of age and wanted to reclaim his family land.
So spoke Hermes, but for all his good intent he did not persuade
Aigisthos’ mind. And now he has paid the price in full.”
Then the goddess, flashing-eyed Athena, answered him:
“O father of us all, son of Kronos, highest of all the lords,
surely that man has fittingly been destroyed. May whoever
else does such things perish as well! But my heart
is torn for the wise Odysseus, that unfortunate man,
who far from his friends suffers pain on an island surrounded
by water, where is the very navel of the sea. It is a wooded
island, and a goddess lives there, the daughter of evil-minded
Atlas, who knows the depths of every sea, and himself
holds the pillars that keep the earth and the sky apart.
Kalypso holds back that wretched, sorrowful man.
Ever with soft and wheedling words she enchants him,
so that he forgets about Ithaca. Odysseus, wishing to see
the smoke leaping up from his own land, longs to die. But your
heart pays no attention to it, Olympian! Did not Odysseus
offer you abundant sacrifice beside the ships in broad Troy?
Why do you hate him so, O Zeus?”
Zeus who gathers the clouds
then answered her: “My child, what a word has escaped the barrier
of your teeth! How could I forget about godlike Odysseus,
who is superior to all mortals in wisdom, who more than any other
has sacrificed to the deathless gods who hold the broad heaven?
But Poseidon who holds the earth is perpetually angry with him
because of the Cyclops, whose eye he blinded—godlike
Polyphemos, whose strength is greatest among all the Cyclopes.
The nymph Thoösa bore him, the daughter of Phorkys
who rules over the restless sea, having mingled with Poseidon
in the hollow caves. From that time Poseidon, the earth-shaker,
does not kill Odysseus, but he leads him to wander from
his native land. But come, let us all take thought of his homecoming,
how he will get there. Poseidon will abandon his anger!
He will not be able to go against all the deathless ones alone,
against their will.”
Then flashing-eyed Athena, the goddess,
answered him: “O our father, the son of Kronos, highest
of all the lords, if it be the pleasure of all the blessed gods
that wise Odysseus return to his home, then let us send Hermes
Argeïphontes, the messenger, to the island of Ogygia, so that
he may present our sure counsel to Kalypso with the lovely tresses,
that Odysseus, the steady at heart, may now return home.
And I will journey to Ithaca in order that I may the more
arouse his son and stir strength in his heart to call the Achaeans
with their long hair into an assembly, and give notice to all the suitors,
who devour his flocks of sheep and his cattle with twisted horns,
that walk with shambling gait. I will send him to Sparta and to sandy
Pylos to learn about the homecoming of his father, if perhaps
he might hear something, and so that he might earn a noble fame.”
So she spoke, and she bound beneath her feet
her beautiful sandals—immortal, golden!—that bore her
over the water and the limitless land together with the breath
of the wind. She took up her powerful spear, whose point
was of sharp bronze, heavy and huge and strong,
with which she overcomes the ranks of warriors when she is angry
with them, the daughter of a mighty father. She descended
in a rush from the peaks of Olympos and took her stand
in the land of Ithaca in the forecourt of Odysseus, on the threshold
of the court. She held the bronze spear in her hand, taking on
the appearance of a stranger, Mentes, leader of the Taphians.
There she found the proud suitors. They were taking their pleasure,
playing board games in front of the doors, sitting on the skins
of cattle that they themselves had slaughtered. Heralds
and busy assistants mixed wine with water for them
in large bowls, and others wiped the tables with porous sponges
and set them up, while others set out meats to eat in abundance.
Godlike Telemachos was by far the first to notice
her as he sat among the suitors, sad at heart, his noble
father in his mind, wondering if perhaps he might come
and scatter the suitors through the house and win honor
and rule over his own household. Thinking such things,
sitting among the suitors, he saw Athena. He went straight
to the outer door, thinking in his spirit that it was a shameful thing
that a stranger be allowed to remain for long before the doors.
Standing near, he clasped her right hand and took the bronze
spear from her. Addressing her, he spoke words that went
like arrows: “Greetings, stranger! You will be treated kindly
in our house, and once you have tasted food, you will tell us
what you need!”
So speaking he led the way, and Pallas Athena
followed. When they came inside the high-roofed house,
Telemachos carried the spear and placed it against a high column
in a well-polished spear rack where were many other spears
belonging to the steadfast Odysseus. He led her in and sat her
on a chair, spreading a linen cloth beneath—beautiful,
elaborately-decorated—and below was a footstool for her feet.
Beside it he placed an inlaid chair, apart from the others,
so that the stranger might not be put off by the racket and fail
to enjoy his meal, despite the company of insolent men.
Also, he wished to ask him about his absent father.
A slave girl brought water for their hands in a beautiful golden
vessel, and she set up a polished table beside them.
The modest attendant brought out bread and placed it before them,
and many delicacies, giving freely from her store. A carver
lifted up and set down beside them platters with all kinds
of meats, and set before them golden cups, while a herald
went back and forth pouring out wine for them.
In came the proud suitors, and they sat down in a row
on the seats and chairs, and the heralds poured out water for
their hands, and women slaves heaped bread by them in baskets,
and young men filled the wine-mixing bowls with drink.
The suitors put forth their hands to the good cheer lying before them,
and when they had exhausted their desire for drink and food,
their hearts turned toward other things, to song and dance.
For such things are the proper accompaniment of the feast.
A herald placed the very beautiful lyre in the hands of Phemios,
who was required to sing to the suitors. And he thrummed the strings
as a prelude to song.
Barry B. Powell is Halls-Bascom Professor of Classics Emeritus at the University of Wisconsin, Madison. His new free verse translation of The Odyssey was published by Oxford University Press in 2014. His translation of The Iliad was published by Oxford University Press in 2013.
Chafing from four centuries of rule by the Ottoman Empire and taking advantage of the Ottoman army’s need to suppress a rebellious local official, the Greek organization Filike Etaireia (“Friendly Brotherhood”) launched revolts across Greece on March 25, 1821. While it took years for the Greeks to win independence, the day the revolt began is still celebrated as Greek Independence Day.
While a rebel Greek army under Alexandros Ipsilantis met an early defeat, other Greek efforts succeeded. By late 1821, the Greeks controlled the Peloponnesian peninsula, and in January of the next year a coalition of rebels formally declared independence. More territory was taken from Ottoman hands in 1822.
Soon, however, infighting among different factions plagued the Greek effort, though the struggle attracted liberals across Europe—including the British noble and poet George Gordon, Lord Byron—who flocked to the Greek cause. By the middle 1820s, the Ottomans had regained control of parts of Greece, and the independence movement was reeling.
In 1826, however, Britain, France, and Russia inserted themselves into the conflict, seeking to restore stability. Their combined fleets defeated an Ottoman and Egyptian force at the battle of Navarino in 1827. The battle was a major victory, though fighting continued until 1832. That year the Ottomans signed a treaty recognizing Greek independence.
Independence was tarnished for some Greeks by the terms of the treaty. The European powers imposed a constitutional monarchy, placed the German prince Otto of Bavaria on the throne, and insisted on maintaining a protectorate over the new Greek state. Nevertheless, a new Greek state had come into being.
Poet T.S. Eliot might still be right — the world might end with a whimper. But on April 1, 1948, physicists George Gamow and Ralph Alpher first proposed the now prevailing idea of how the universe began — with a big bang.
Gamow worked closely in the 1930s and 1940s with Edward Teller to understand beta decay — a kind of nuclear decay that results in the loss of electrons — and to understand the makeup of red giant stars.
From this work, Gamow and Alpher — one of his students — developed the idea that the universe was highly compressed until a vast thermonuclear explosion occurred. The explosion released neutrons, protons, and electrons. As the universe cooled, it became possible for neutrons to combine with other neutrons or with protons to form chemical elements.
Time Line of the Universe. Source: NASA/WMAP Science Team.
Gamow and Alpher published their findings in the journal Physical Review on April 1, 1948. The title of the paper — “The Origin of Chemical Elements” — suggests the link between cosmology and particle physics that the big-bang theory represents.
The paper’s authorship showed a bit of Gamow’s whimsy. Thinking it wrong for a paper on particle physics to carry authors whose names began with A (as in positively charged alpha particles) and G (as in gamma rays) but none with B (as in negatively charged beta particles), Gamow asked his friend Hans Bethe to add his name to the byline. Bethe agreed, and thereby became part of history.
Just five years later, Gamow made a brilliant addition to a wholly different field. After learning of James Watson and Francis Crick’s discovery of the double helical structure of DNA, Gamow wrote Crick suggesting that the genetic code was made up of three-part segments. Gamow’s suggestion set Watson, Crick, and other researchers to investigate the possibility, which turned out — in essence (though not in the details Gamow had suggested) — to be true.
The basic problem for anyone wanting to understand contemporary world politics is the amount of material that is out there. Where on earth would you start if you wanted to explain the most important political processes? How, for example, would you explain the reasons behind 9/11, the War in Iraq, the recent global financial crisis, or the ongoing Syrian Civil War?
Whether you are aware of it or not, whenever you are faced with such issues, you have to resort to theories. A theory is not just a formal model with hypotheses and assumptions, rather, it is a kind of simplifying device that allows you to decide what the most important factors are.
Students often feel that the theoretical side of international relations is daunting, but think of it this way: imagine you own several pairs of sunglasses with different-coloured lenses. Put on the red pair and the world looks red; put on a yellow pair and it looks yellow. The world is not any different, it just looks different. So it is with theories.
In the following video, Professor Sir Steve Smith, author of The Globalization of World Politics, discusses different theories behind the Syrian Civil War, how to interpret them, and why they matter.
Sir Steve Smith is Vice-Chancellor of the University of Exeter. He was President of Universities UK from 2009 to 2011, and President of the International Studies Association for 2003-4. He is Editor of International Relations Theories: Discipline and Diversity (with Tim Dunne and Milja Kurki) and Foreign Policy: Theories, Actors, Cases (with Tim Dunne and Amelia Hadfield), as well as author of many academic papers and chapters in major international journals and edited collections.
Trusted by over 300,000 students in over 120 countries, The Globalization of World Politics is the most authoritative and complete introduction to international relations available, making it the go-to text for students of international relations. You can view more related videos at the Online Resource Centre.
Political controversies over immigration are intensifying across Europe. Commonly, the arguments centre on its economic costs and benefits, reducing the public perception of immigrants to that of a cheap workforce. Yet, increasingly, these workers are highly skilled professionals, international students, and academics. Their presence transforms not only labour markets but also the production of knowledge and, in the end, it changes the way we all perceive immigrants and immigration.
American scholars were the first to draw attention to how immigrant scholars influence the academic field. The historian of migration Nancy Foner claimed a decade ago that the growing group of students and faculty who study and work abroad, themselves immigrants to the United States, has profoundly changed the way immigration is perceived in the social sciences. Immigrant scholars, according to Herbert J. Gans, a German-born American sociologist, contributed to the paradigm shift in American migration studies from an assimilationist to a retentionist approach. They did so because they were ‘insiders’ to the groups they studied; they were immigrant researchers researching immigrants.
A century ago, public interest (and funds) fueled studies on immigration by sociologists, demographers, economists, and historians. The results of those studies were widely disseminated by journalists, novelists, and the mass entertainment industry. Now, budget cuts in higher education and the European Union’s growing emphasis on impact-driven funding have heightened concern about the societal benefits of the social sciences. Paradoxically, public interest in research on immigrants seems to be falling, and academics are apparently losing their ability to influence broad publics and politics in Europe, the boats off Lampedusa being a symbol of this problem.
For scholars who respond to the short-term concerns of national public policy, the urgent question is the effectiveness of the transfer of knowledge between academia and other systems, driven by the hope of formulating better policies. Yet some scholars are reluctant to participate actively in public debates because they see their scientific objectivity endangered. The position of those immigration researchers who are themselves immigrants is no less ambivalent: they may play the ‘ethnic card’ to secure funding for research and access to the people they want to study. Financial reasons may compel many to do research in their native country, and they also face suspicion from fellow academics that they lack scientific distance and objectivity.
What societal roles are available to immigrant researchers researching immigrants? Too often we look for answers to this question by tracking the processes of policy decision-making, by investigating ‘big-P’ politics. We are used to thinking of the production of ideas and texts as separate from the impact we think they will have. Yet the way knowledge is negotiated during the production of texts is key to understanding the role that migration researchers studying immigrants play in society.
Let us imagine a research situation, an interview, which is undoubtedly the most widely applied technique for conducting systematic social inquiry: a researcher typically asks questions and listens carefully to the stories the respondent tells. While one of them may say less and the other more, they interact. Interviews are interactional, and during this situation, both the researcher and the researched subject negotiate the meaning they assign to norms, values, ideas, other people, their behaviour, etc. Let’s assume both parties in this situation are immigrants. From my personal experience as an interviewer and immigrant, I recall multiple research encounters during which my interview partners prompted me to confirm their views: “you surely know, you are also an immigrant” or “you do understand me, you are also from Poland”. They presume that because of our common origin, we have a lot in common, that being an immigrant might bring us together, foster mutual understanding of problems, or even make us share the same norms and values.
But common origin does not produce ‘common individuals’, and each migration trajectory is different. It matters that I was born in Warsaw to a middle-class family, have a university education, and work as a professor at a German university, while my research subjects come from rural areas in Poland, left school early, and perform manual jobs in the United Kingdom. Each time I ask a question and they answer it, each time I prompt them, seemingly impersonally and in a highly controlled fashion, to continue narrating, my interview partners and I question the latent national and ethnic categories of commonality. Unintentionally, in the course of such research encounters, when confronted with misunderstandings or incomprehension, we revisit our gendered, ethnic, class, or professional identities.
For most researchers, such experiences are common and obvious. But they reflect on them in a self-referential fashion, addressing the issue only to colleagues who subscribe to journals on the methodology of qualitative research. They aim to improve the quality of research, but the meaning of this self-reflection is deeper and should be communicated to wider audiences.
It matters when the researcher is an immigrant herself: it influences the research process, the access to research subjects and funding, and the way the results of studies are interpreted (because the researcher is sympathetic, or empathetic, to the particular problems of her respondents). More importantly, immigrant immigration researchers are capable of revealing, and predisposed to reveal, the artificiality of fixed categorisations that assign people to places on the map and positions in social hierarchies. When they do so, they show us a possibility for new, better modes of societal integration. In countries like Germany, long shaped by low-skilled immigration and the public discourses around it, there is a minor but growing interest in the perspectives of immigrant researchers. Through stronger engagement in dialogue with wider audiences, immigrant researchers can accelerate this trend. This much-needed change of perspective has a chance of becoming mainstream if immigrant researchers talk about their work and research experiences with more self-confidence.
Migration Studies is an international refereed journal dedicated to advancing scholarly understanding of the determinants, processes and outcomes of human migration in all its manifestations, and gives priority to work presenting methodological, comparative or theoretical advances.
Image credit: Visa application. By VIPDesignUSA, via iStockphoto.
On February 28, 1983, at the end of its eleventh season, M*A*S*H said goodbye to television. More than 105 million Americans in about 51 million homes watched the series finale, a two-and-a-half-hour movie directed by star Alan Alda that featured the show’s characteristic blend of comedy and drama.
M*A*S*H debuted in 1972, two years after the release of the Robert Altman movie of the same name and four years after the publication of the Richard Hooker novel on which both were based. Set in a mobile army surgical hospital during the Korean War, the show featured an ensemble cast that included three regulars — Alda as doctor “Hawkeye” Pierce, Loretta Swit as chief nurse Major Margaret Houlihan, and William Christopher as chaplain Father Mulcahy — who appeared in all eleven seasons.
In its first few seasons, the show’s Korean War setting made it a commentary of sorts on the Vietnam War. Even after Vietnam ended, the series examined the tragic personal cost of war and the lengths to which people will go to maintain sanity in wartime. The last episode, set around the close of the Korean War, included storylines reinforcing those themes.
The show made several innovations, including use of multiple storylines in an episode, the mixture of comedy and drama, the way the camera was used to shoot scenes, and the fact that the characters developed over time.
M*A*S*H remains one of the most highly regarded of all television series. Though the records the final episode once held for number of households tuning in and total number of viewers have since been surpassed by Super Bowl broadcasts, that last show remains the single most watched episode of a television series in US history. Its Nielsen rating of 60.2, together with a 77 share (meaning more than three-quarters of the televisions in use at the time were tuned to it), makes it the highest-rated single television broadcast of any kind.
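As a rough back-of-the-envelope check (the household total below is derived from the post’s own figures, not taken from Nielsen’s data), a rating point represents one percent of all US TV households, so the numbers cited here imply a 1983 universe of roughly 85 million TV households:

```python
# A Nielsen rating point equals one percent of all US TV households.
# Working backward from the figures cited in the post:
homes_watching = 51_000_000  # households that tuned in to the finale
rating = 60.2                # percent of all TV households watching

tv_households = homes_watching / (rating / 100)
print(f"implied universe: {tv_households / 1e6:.1f} million TV households")
# about 84.7 million
```

The 77 share, by contrast, is measured against only those sets switched on at the time, which is why it is the larger of the two percentages.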
Aleksandr II Imperator Vseross. Source: New York Public Library.
When his father, Nicholas I, died of pneumonia, Alexander Nikolayevich Romanov succeeded to the throne of emperor of Russia, becoming Czar Alexander II. While his 26-year rule was marked by substantial reforms, it was also dogged by unrest and several assassination attempts.
Two strong influences stamped Alexander’s character. One was the autocratic personality and rule of his father; the other was his education, tinged with the principles of liberalism and romanticism. He ascended to the throne with Russia in a crisis, fighting the Crimean War against the Ottoman Empire, which had the support of Britain and France. The fighting continued for nearly a year, but Alexander had to sign a treaty making concessions.
Russia’s defeat convinced him that he had to modernize the nation and spurred a program of industrialization and liberal reform. New railway lines were built, universities and courts were reformed, and there were even some steps taken to reduce censorship. The signal achievement of Alexander’s reign was the emancipation of the serfs, as tens of millions of peasants were released from centuries-old feudal bonds and even given land allotments. The reform failed to produce a viable class of small farmers, however.
Another liberalization, with Russia lightening its grip on Poland, led to nationalist revolts there and to the growth of radicalism both in Poland and in Russia. Alexander responded by strengthening the secret police, which produced more unrest, further suppression, and several assassination attempts against the emperor. In 1881, he leaned once more toward liberalization, signing a decree on March 1 that would create a new constitution. That very day, he was fatally wounded in a terrorist attack, dying one day short of the anniversary of the day he took the throne.
On March 6, 1869, Dmitri Mendeleev’s breakthrough discovery was presented to the Russian Chemical Society. The chemist had determined that the known elements — 63 at the time — could be arranged by their atomic weights into a table revealing that their physical properties followed regular patterns. He had invented the periodic table of elements.
In his early twenties, Mendeleev had intuited that the elements followed some kind of order, and he spent thirteen years trying to discover it. In developing his system, he drew on the data and ideas of scientists around the world. Two of them — German chemist Lothar Meyer and British chemist John Alexander Reina Newlands — had published ideas about the periodicity of elements. But Mendeleev’s system addressed every known element, which theirs had not.
His system also surpassed the others because he accounted for gaps in the sequence of elements. Mendeleev said that an element would be discovered to fill each gap and even predicted the properties of those elements. The discovery of one of these missing elements — gallium, in 1875 — helped spur wide acceptance of Mendeleev’s system.
Later work showed that Mendeleev’s reliance on atomic weight to determine periodicity was not completely correct. While atomic weight tends to increase as one moves from element to element, there are exceptions. Mendeleev also lacked the theoretical understanding to explain why the elements exhibited these periodic characteristics. Nevertheless, his achievement marked an important milestone in the understanding of the physical world.
Mendeleev did not personally present his breakthrough to the Chemical Society. Ill on the day of the meeting, he asked a colleague to deliver the report.
Interestingly, the date celebrated for this event reflects Russia’s use of the “Old Style” Julian calendar. According to the “New Style” Gregorian calendar — not adopted in Russia until 1918 — Mendeleev’s periodic table was presented twelve days later, on March 18.
Earthquake, Tsunami, and Nuclear Disaster Strike Japan
Japan, situated on the Ring of Fire at the edge of the Pacific Ocean, has suffered many major earthquakes over the years. Nothing before, however, compared to the triple disaster of March 11, 2011: a massive earthquake followed by powerful tsunamis, which in turn triggered a serious nuclear accident.
The horrors began shortly before three in the afternoon local time with a 9.0-magnitude earthquake. Its epicenter was nearly 20 miles below the floor of the Pacific Ocean, about 80 miles east of the Japanese city of Sendai. The quake was one of the most powerful ever recorded, and the strongest ever to hit Japan.
Map prepared by the U.S. National Oceanic and Atmospheric Administration depicting the tsunami wave height model for the Pacific Ocean following the March 11, 2011, earthquake off Sendai, Japan. Source: NOAA Center for Tsunami Research.
The quake unleashed several tsunamis that moved as fast as 500 miles an hour in all directions — most destructively toward the nearby northeastern coast of Japan’s Honshu island. Waves as high as 33 feet crashed into towns and cities along the coast, washing away anything in their path. One wave reportedly reached as far as 6 miles inland.
Several nuclear plants are located in northern Honshu. Most shut down automatically when the quake occurred, but the powerful tsunamis damaged the backup power systems at a plant in Fukushima. As a result, cooling systems failed, and nuclear fuel overheated and later caught fire. The fire released radiation into the air, and the use of seawater to try to cool the reactors led to radioactive contamination of the sea. Japanese officials had to evacuate a zone extending up to 18 miles around the damaged reactors, the scene of the second-worst nuclear accident in history.
In the end, the triple disaster cost Japan nearly 20,000 dead — mostly from the tsunamis — more than 130,000 forced from their homes, more than $300 billion in damage, and a severe jolt to the economy.
Members of the French Academy, the Académie française, after a session, crossing the Pont des Arts from the Institute / A. Castaigne. Source: Library of Congress.
For five years, beginning in 1629, a small group of writers gathered in Paris to discuss literary topics. The group soon came to the attention of Cardinal Richelieu, the power behind the French throne and a wealthy patron of the arts. He suggested that the body become official, an idea the group grudgingly accepted. On March 13, 1634, they formally constituted themselves as the Académie Française; the following year they received a charter from the king. The Academy has been in operation ever since, except for a ten-year hiatus during the French Revolution.
The Academy was formed to act as the official authority on the French language, aiming to keep it pure and free of vulgar usage or foreign influence. That task has become increasingly difficult in recent years. The Academy has decried use of the verb “impacter,” Frenchifying the English verb “impact.” It has blanched when French magazine covers proclaim the presence of “le best of” lists inside. It can do nothing to compel the usage it endorses, however.
As part of its task of maintaining language standards, the Academy was charged with creating a French dictionary. The first edition of that dictionary did not appear until 1694. The most recent edition — the ninth — was published in 1992.
The Academy has a maximum membership of forty. Membership is for life, and those chosen for the honor are called “Immortals.” Many of France’s leading writers over the years have served as members, including Pierre Corneille, Jean Racine, Voltaire, and Victor Hugo — though the list of those not selected (which includes Molière, Honoré de Balzac, Marcel Proust, and Jean-Paul Sartre) is equally distinguished. The first woman member was admitted in 1980. The current secretary — one of three officers of the Academy — is a woman.
On March 16, 1521, Portuguese navigator Ferdinand Magellan, attempting to sail around the world for Spain, reached the Philippine archipelago. Magellan and his expedition were the first Europeans to reach the Philippines, a stop on the first circumnavigation of the globe, though Magellan’s portion of that journey would soon end.
The expedition of five ships and 250 men had left Spain on September 20, 1519. Magellan sought a western route — avoiding the southern tip of Africa, which Portugal controlled — to the Spice Islands (the Moluccas) of Southeast Asia. Magellan survived two mutinies before sailing around the southern tip of South America, finding the strait named for him, in November of 1520. Reaching calm waters after a dangerous passage, Magellan named the ocean west of South America “the Pacific Ocean.”
As the ships continued sailing west, supplies dwindled, the crew was forced to eat leather and drink a mixture of saltwater and freshwater, and men began dying of scurvy. Fortified by provisions secured at island stops along the way, the ships reached the Philippines in March 1521.
Magellan spent more than a month in the area, trading with local leaders and trying to convert them to Christianity. He grew angry at one chief who refused to cooperate, however, and ordered an attack on his village. Wounded in the fighting, Magellan bravely held his ground while the rest of his men escaped back to the ship, but then received more wounds and died on the beach.
It took until September of 1522 for the remains of the expedition, 18 survivors under the command of Juan Sebastián de Elcano, to reach Spain. Though he did not complete this voyage, Magellan is often credited as the first person to circumnavigate the globe because earlier in his career he had sailed an eastern route from Portugal to Southeast Asia, the same region he reached on his last, fatal voyage by sailing west.
Nadir Shah enters Delhi and captures the Peacock Throne
On March 21, 1739, Nādir Shāh, leading Persian (modern Iranian) and Turkish forces, completed his conquest of the Mughal Empire by capturing Delhi, India, its capital. He seized vast stores of wealth, and among the prizes he carried away was the fabled Peacock Throne.
Nādir Shāh Afshār. Source: Victoria & Albert Museum.
Born in 1688, Nadr Qoli Beg belonged to a Turkish people loyal to the Safavid rulers of Iran. He became a military leader and helped Shah Tahmasp II regain the throne that had been lost to Afghan invaders. Soon after, however, he was angered by the Shah’s surrender to the Ottoman Turks. In response, he deposed the Shah and placed the Shah’s son on the throne, naming himself regent. That arrangement lasted only a few years; in 1736, he deposed the boy and assumed rule as Nādir Shāh.
The new ruler was bent on conquest. He built a navy and captured Bahrain and Oman before launching himself overland against the Mughals. His conquest of that empire went quickly, giving him the prized throne. Built originally by the Mughal ruler Shah Jahan, it reportedly had silver steps set on golden feet. The back showed two open peacock tails. The whole was studded with precious gems.
The throne became the symbol of the Iranian monarchy, though it only remained in Nādir Shāh’s hands for a short time. He was defeated in battle by the Kurds, who seized the throne and apparently dismantled it. A modern Peacock Throne was made in the early 1800s. That splendid but less spectacular model served as the throne of Iran’s Shahs until the Iranian Revolution of 1979.
Nādir Shāh did not fare much better than his magnificent throne. He continued his warring ways, building an empire that was plagued by financial problems and frequent revolts against his cruel rule. In 1747, he was killed by members of his own army.
On March 23, 1743, composer George Frideric Handel directed the first London performance of his sacred oratorio, Messiah. While the composition has since become revered as a magnificent choral work — and a staple of the Christmas holiday season — it met with some controversy when it first appeared.
Remarkably, Handel needed only three weeks in the summer of 1741 to write Messiah. As his text, he used a libretto compiled by Charles Jennens from verses of the Bible and from the Church of England’s Book of Common Prayer. Jennens was apparently upset that Handel composed the work so quickly; he thought so sacred a subject deserved more care.
Jennens was also annoyed because Handel debuted the work in Dublin in the spring of 1742 rather than reserving it for a London premiere. Prominent Irish clerics (led by Jonathan Swift) insisted that, if their church choirs were to sing the oratorio, ticket sales had to go to charity. That precedent established a longstanding tradition of charity performances of Messiah.
When Handel finally prepared to present the work in London, more controversy arose. Some people objected to a work on a sacred theme being performed in a secular setting — London’s Covent Garden Theater. The controversy disappeared with the popular acceptance of Handel’s music, however. Even Jennens became reconciled to the composer, in part because Handel rewrote some sections his collaborator considered poor.
Today’s performances do not reflect the original scores. Handel revised the piece often, and current productions use one or another of his later versions. The full Messiah tells not only the Christmas story but also the story of Jesus’s crucifixion and resurrection. Groups that perform the oratorio at Christmas generally perform only the first part.
Elizabeth Blackwell Becomes First Woman to Receive a Medical Degree
Source: National Library of Medicine
On January 23, 1849, Elizabeth Blackwell strode to the front of the Presbyterian church in Geneva, New York, to receive her diploma from Benjamin Hale, president of Geneva Medical College. The ceremony made Blackwell — who graduated first in her class — the first woman in the modern world to receive a medical degree.
Blackwell was born to a wealthy and progressive-minded English family that moved to the United States in the 1830s, when she was around ten. She became a teacher, though that profession did not engage her. One day, a dying friend told her that she might have endured her disease better if she had been attended by a female physician. The conversation planted the idea of becoming a doctor in Blackwell’s mind.
She received some rudimentary training in medicine in the home of a local physician and began applying to medical school. Geneva accepted her, in part because the student body — to whom the question of her admission had been put — treated the idea of a female medical student as a joke. Blackwell faced the hostility of some teachers, students, and townspeople, though she eventually disarmed critics with her dedication and seriousness.
Prejudice made it difficult for Blackwell to establish a practice after her graduation. In 1853, she opened a clinic for women in New York City. She was eventually joined by her sister Emily and by Marie E. Zakrzewska, both of whom she had encouraged to earn medical degrees. The clinic grew and in 1857 was renamed the New York Infirmary for Women and Children. Eleven years later, Blackwell opened the Woman’s Medical College associated with the infirmary. In 1869, she returned to England, where she lived and worked for the rest of her life.
On January 25, 1971, General Idi Amin took advantage of the absence of President Milton Obote to stage a coup and seize power in Uganda. Amin’s turbulent rule lasted only eight years, but in that time he earned the nickname the “Butcher of Uganda.”
Obote had led Uganda’s independence movement in 1962 and had served as its first prime minister. In 1966, though, he deposed Uganda’s king and had a new constitution written that created a republic with himself as president. Amin was an ally whom Obote named as head of the army and air force at that time.
Amin decided to move against Obote when he himself came under investigation for his leadership of a gang of thugs. His brutality emerged quickly. Prominent Ugandans — including the police official who had been investigating him — were killed, some by armed toughs and others in mysterious circumstances. Several thousand soldiers were killed on Amin’s orders, decimating the armed forces but putting them firmly under his control.
Amin formed four different security organizations, which he used to carry out his harsh rule. Estimates suggest that as many as 300,000 people were killed during his reign.
Amin’s leadership was also marked by actions based on fleeting moods. Late in 1972, he ordered all Asians expelled from Uganda. The departure of some 35,000 people, many of whom owned businesses, crippled Uganda’s economy. A Muslim, Amin was extreme in his condemnation of Israel and once praised Adolf Hitler’s murder of millions of Jews.
Amin survived several assassination attempts between his coup and 1979. That year, he sent troops into neighboring Tanzania to harass some villagers. In response, Tanzania’s leader, Julius Nyerere, ordered a counterattack that was joined by thousands of Ugandans. Within weeks, the rebels had seized power and Amin had fled to Libya. He died in Saudi Arabia in 2003.
On January 26, 1500, Spanish sailor Vicente Yáñez Pinzón spotted land, naming the cape Cabo de Santa María de la Consolación. The site was near modern-day Recife, Brazil, making Pinzón the first European to explore Brazil.
Pinzón was an accomplished navigator who had taken part in the famous 1492 voyage of Christopher Columbus. Pinzón commanded the Niña while his brother Martín commanded the Pinta (a third brother, Francisco, was Martín’s chief officer on that ship). It was not until 1499, however, that Pinzón set out on a new expedition.
In November of that year, he sailed from Palos, Spain, reaching the South American coast by the next January. He spent several months exploring the coast, reaching as far north as the mouth of the Amazon River. Pinzón noticed that the color of the water had changed and, after sampling it, found it to be fresh, not salt. He named the body the Mar Dulce, or Sweetwater Sea, and, riding the strength of the outflowing current, sailed for the West Indies before returning to Spain.
Records and maps from the Age of Exploration are not always clear or without controversy. Pinzón’s sighting of Brazil is subject to these uncertainties. Some historians think that he landed in Venezuela, not Brazil, and encountered the Orinoco River, not the Amazon. They believe that Portuguese explorer Pedro Álvares Cabral—who certainly reached Brazil in April of 1500—was the first European to land there. At any rate, Portugal, not Spain, gained possession of Brazil and made it the cornerstone of its American empire.
The 78-year-old man was walking to a prayer meeting with the support of two grandnieces. A man stepped out of the crowd and greeted him. The old man returned the salutation when, suddenly, the other man pulled out a pistol and shot three times. Half an hour later, Mohandas Gandhi—the leading figure of India’s independence movement and the leading exponent of nonviolent resistance—was dead.
Born in India, Mohandas Gandhi was trained as a lawyer and first began a movement for social change in South Africa, where he had lived and worked for a time. That campaign aimed at overturning laws that limited the rights of Indians living in South Africa. The effort, based on his belief in nonviolent resistance, won some concessions from the government in 1913.
He launched his first civil disobedience movement in India in 1919, protesting the British Rowlatt Acts, which allowed the authorities to imprison Indians suspected of sedition without trial. For most of the next three decades, Gandhi was the spiritual and political leader of India, pushing for reform, boycotting British goods, protesting violence between Hindus and Muslims, and eventually pressuring Britain to grant Indian independence.
That campaign finally succeeded in 1947, though Gandhi’s hope for a united India was dashed when Britain, bowing to pressure from the Muslim League, split the area into two states—the chiefly Hindu India and the mainly Muslim Pakistan.
Religious violence followed, as members of the two faiths attacked and killed each other. Gandhi pleaded for an end to the violence and for the Hindu majority to grant tolerance to Muslims. That plea led his assassin, a Hindu fanatic, to kill the Mahatma, or “Great Soul.” A reporter who had been Gandhi’s friend wrote, “Just an old man in a loincloth in distant India: yet when he died, humanity wept.”
Iceland’s Sigurðardóttir Becomes the First Openly Gay World Leader
On February 1, 2009, Jóhanna Sigurðardóttir made double history: she became the first woman to serve as Iceland’s prime minister and the first openly gay person to lead any nation.
Sigurðardóttir’s rise to the premiership resulted from several factors. She had a long career in politics and was the longest-serving member of Iceland’s parliament, the Althing, having first been elected in 1978. She also had experience in government, serving four times as Minister of Social Affairs and overseeing Iceland’s social welfare programs. Sigurðardóttir came from Iceland’s middle class, working as both a flight attendant and an office worker before entering politics. Her understanding of the basic concerns of ordinary people appealed to many Icelanders.
The other factor contributing to her achievement was Iceland’s economic turmoil. The island nation’s banking industry collapsed in 2008 and 2009, a crisis that brought down the conservative government of Prime Minister Geir H. Haarde and led Icelanders to favor the leftist views of the socialist Sigurðardóttir.
Two years into her term, her government seems to have stabilized Iceland’s economy. Inflation had been surging above 18 percent a year at the end of 2008, just before she took office; by 2011, it had fallen below 4 percent. The growth rate of the nation’s gross domestic product, which had been negative in 2009 and 2010 in the wake of the economic collapse, was expected to reach 2.5 percent in 2011. The banking sector has been overhauled as well.
Success was not complete, however. Icelandic voters rejected a government-backed plan to reimburse British and Dutch depositors in Icelandic banks for their lost deposits. Voters also seem not to favor Sigurðardóttir’s desire to enter the European Union.
Sigurðardóttir did enjoy a great personal moment during her premiership. When Iceland’s new law allowing gay marriage took effect in June 2010, she married her longtime partner, the writer Jónína Leósdóttir.
On February 2, 1536, Spanish explorer Pedro de Mendoza founded the city he named Nuestra Señora Santa María del Buen Aire—Buenos Aires, Argentina. The new town was meant to spearhead the Spanish effort to colonize the interior of South America. It came less than two years after conquistadors had returned to Spain from Peru with treasures seized from the Inca empire.
Spain’s Charles I was spurred by the vast Inca wealth to seek further riches in South America. He also wanted to block any effort by Portugal to expand its foothold in Brazil. Accordingly, he commissioned Mendoza to mount an expedition to explore and settle the Río de la Plata, a vast estuary in southern South America that had been sighted back in 1516.
Mendoza set out in August 1535 in command of 800 to 1,700 men (accounts vary) in around a dozen ships. The expedition — the largest sent from Spain to the Americas to date — was ill fated, however. A fierce storm blew the ships off course, and after regrouping, Mendoza decided that one of his lieutenants was a rebel and had him executed. Troubles continued after the founding of Buenos Aires. At first the Spaniards received gifts of food from the indigenous locals, but fighting soon broke out between the two groups. That conflict cut off the chief source of food, and the Spaniards began to starve. Mendoza sent a lieutenant upriver in search of a friendlier site; that lieutenant founded Asunción, now the capital of Paraguay.
Mendoza himself headed back to Spain in 1537. He was seriously ill — perhaps from syphilis — and died on the return trip. His settlement continued to struggle, and in 1541 the remaining colonists abandoned it, heading for Asunción. Not until 1580, when Juan de Garay returned to the scene, was a permanent Spanish presence established at Buenos Aires.
Japanese Attack Port Arthur, Starting Russo-Japanese War
On February 8, 1904, just before midnight, Japanese destroyers entered the harbor of Port Arthur (now Lü-shun, China). Soon after, they unleashed torpedoes against Russian ships in a surprise attack that began the Russo-Japanese War.
The conflict grew out of competition between Russia and Japan for territory in both Korea and Manchuria, in northern China. Japan had won Port Arthur, at the tip of the Liaotung Peninsula, from China in an 1894–1895 war. Russia joined with other European powers to force Japan to relinquish the port, however — and then, three years later, compelled China to grant the city to Russia itself. These actions rankled Japan, as did Russia’s refusal to honor a promise to withdraw troops from Manchuria. Japan decided to go to war.
The attack on Port Arthur resumed in the late morning of February 9, when bigger Japanese ships began shelling the Russian fleet and nearby forts. The Russians put up more resistance than expected, however, and the Japanese ships withdrew.
The attack on Port Arthur was inconclusive, but the rest of the war went largely Japan’s way. The Japanese enjoyed several victories in 1904, seizing Korea in March, and defeating Russian forces twice in Manchuria during the summer. More success followed in 1905, with the surrender of Port Arthur in January, a victory over a large Russian army in Manchuria in March, and a decisive naval battle at Tsushima Strait in May that destroyed the Russian fleet. Russia’s government, facing unrest at home, was forced to seek peace.
The Russo-Japanese War marked the first victory of a non-European nation against a European one in modern times. It also contributed to unrest in Russia that would lead, more than a decade later, to the Russian Revolution.
On February 11, 1889, Japan’s Emperor Meiji furthered his plan to modernize and westernize his nation by promulgating a new constitution. The new plan of government created a western-style two-house parliament, called the Diet, and a constitutional monarchy — though one with a Japanese character.
When Prince Mutsuhito became emperor and took the ruling name Meiji (“enlightened ruler”) in 1867, he was determined to break with his late father’s traditionalist policies and embrace western ways. He took several steps in this direction. Along with creating a public school system and enacting land reforms, the Meiji emperor created government ministries.
The crowning governmental reform was the new constitution, which embraced the idea of citizen participation — though the public was given no vote on the document, either as a whole or in its details. The emperor declared that the new constitution arose from his desire “to promote the welfare of, and to give development to the moral and intellectual faculties of Our beloved subjects.”
The constitution was modeled chiefly on the Prussian constitution, a fairly conservative document that subjected parliamentary rule to the power of the monarchy. Thus, the Meiji constitution began by declaring the emperor to be sovereign and “sacred and inviolable.” The emperor was named commander of the armed forces and given the power to declare war or make peace without needing to consult with the Diet.
The constitution was chiefly written by Itō Hirobumi, one of the elder statesmen who effectively ran the Japanese government. Itō and his colleagues assumed that they would be chiefly responsible for running the government and making policy and the emperor would not become involved except occasionally.
The Meiji constitution remained in force in Japan until after World War II, when a new constitution creating a stronger parliamentary system was adopted.
Galileo arrives in Rome for trial before Inquisition
Source: Library of Congress.
Sixty-nine years old, wracked by sciatica, weary of controversy, Galileo Galilei entered Rome on February 13, 1633. He had been summoned by Pope Urban VIII to face an Inquisition investigation of his Dialogue Concerning the Two Chief World Systems. The charge was heresy. The cause was Galileo’s support of the Copernican theory that the planets, including Earth, revolve around the sun.
Nicolaus Copernicus had published his heliocentric theory in 1543. His ideas were condemned by religious leaders — not only Catholic ones but also the Protestants Martin Luther and John Calvin — because they contradicted the Bible. Slowly, though, astronomers began to accept the sun-centered universe.
Galileo’s own acceptance, forged in the 1590s, grew stronger in 1609, when he used a new invention, the telescope, to study the planets. Discovering that the Moon had craters, Jupiter was orbited by moons, and Venus had phases like the Moon, he rejected the accepted belief that the heavens were fixed, perfect, and revolving around Earth.
Church authorities, however, objected to a 1613 letter he wrote supporting the Copernican theory. At a hearing, he was told not to actively promote Copernican ideas. A document placed in the records of the proceeding went further, saying he was ordered never to discuss the theory in any way; evidence suggests, however, that this document was planted after the meeting by Galileo’s enemies.
By the late 1620s, Galileo believed that Pope Urban would be more open to his ideas than earlier popes had been. He wrote the Dialogue as a conversation between a Copernican and an adherent of the Church’s geocentric theory, hoping to escape condemnation by presenting both views. The ploy failed, and he was summoned. The panel of cardinals decided to ban his book, force him to abjure Copernican ideas, and sentence him to imprisonment. A few months later, the old man was allowed to return home, where he lived under house arrest until his death in 1642.
Fidel Castro arrives at MATS Terminal, Washington, D.C., 15 April 1959.
Dressed in army fatigues and surrounded by supporters and reporters, 32-year-old Fidel Castro took the oath of office as Cuba’s prime minister on February 16, 1959. He would remain in power for nearly fifty years.
In 1953, Castro had led an attack on a Cuban army barracks hoping to launch a revolt against the government of Fulgencio Batista. That attack failed and he was arrested and imprisoned, though later released in an amnesty of political prisoners. Castro and his brother Raúl formed a small rebel group and hid in Cuba’s eastern mountains as they gathered more supporters, trained them to fight, and connected with other anti-Batista groups. By late 1958, the rebel forces were advancing westward. On January 1, 1959, Batista fled the country and Castro entered Havana triumphant.
The initial provisional government included leaders from several rebel factions, not just Castro’s. At first, Castro refrained from taking any political power, although he was commander of the armed forces. In six weeks, though, the provisional prime minister—not a Castro ally—resigned, and Castro took the office.
During 1959, Castro supporters, including Raúl, filled more and more top-level positions. Meanwhile, hundreds of former Batista officials were tried and executed, and Castro began sending signals that he was a Communist. An exodus of thousands of Cubans began, some fearing for their lives because of links to Batista, others angered by Castro’s refusal to restore the 1940 constitution and hold promised elections. Cuban relations with the United States worsened when Castro seized the assets of several American companies and tilted toward the Soviet Union; they fractured when the U.S. government cancelled trade agreements and backed an invasion by anti-Castro Cubans, which failed miserably. By early 1962, Castro had announced that his revolution was socialist, and the United States had placed an embargo on trade with the island.
On February 26, 1924, Adolf Hitler and nine associates stood trial in a Munich courtroom. The charge was treason — they were accused of trying to overthrow the German republic. That day, Hitler turned the tables to accuse the German leaders who had surrendered in 1918, ending World War I, and created the republican government he so despised: “There is no such thing as high treason against the traitors of 1918,” he proclaimed.
Germany in the early 1920s was deeply divided. Right-wing nationalists like Hitler bitterly opposed both the republican government and the leftists and Communists who struggled with them for power. These nationalists were also inspired by the example of fascist Benito Mussolini, who had seized power in Italy. Perhaps, they thought, they too could gain power with forceful action.
Hitler’s hopes to launch a national revolt were buttressed by the apparent support of three Bavarian officials. Hoping to force them to join his cause, he staged a putsch, or coup, at a political meeting in a Munich beer hall. Declaring “The revolution has begun,” he had armed thugs from his National Socialist (Nazi) party use the threat of force to convince the three to join him. The next day, however, the three had police fire on a Nazi march and had Hitler and others arrested.
The trial received coverage across Germany, which Hitler used to his advantage. He denounced the republican government. He denounced the three Bavarian leaders for cowardice. He remained defiant down to the guilty verdict. In his closing speech, Hitler offered a prophetic call: “The man who is born to be a dictator is not compelled: he wills it.”
Sympathetic judges gave Hitler a sentence of only five years, of which he served just eight months. He spent his time in prison writing the first half of Mein Kampf, his political manifesto, which detailed his anger at “the traitors of 1918” and set forth his extreme racial views. He also used his time in prison to plan a second — and more successful — takeover of Germany’s government.