Globalization

In the visionary 1960s, media philosopher Marshall McLuhan wrote that the immense speed of electronic communication would soon make the globe “no more than a village.” The subsequent arrival of personal computers and the Internet, not to mention portable radios, fax machines, and cellphones, seems to validate his prediction. We live in a global economy and a global culture. And coming from our several civilizational vantage points, the peoples of today’s world together face global problems of environmental degradation, overpopulation, famine, and disease.


Economics

In both Europe and the Americas, the quest for greater prosperity through barrier-free trade has helped bring nations together. That quest has also driven the other nations of the world to embrace free market systems. It has bound them, if reluctantly at times, to the Western model of technological modernization, economic growth, and democratic values.

This process, called globalization, means that the separate markets of the world—local, regional, and continental—are now increasingly linked. Business is international, carried on by itinerant entrepreneurs and speeded by modern technologies of transportation, communication, and computerization. In the global economy, the processes of industrialization and urbanization that began in Europe before 1800 have become the international model. But progress has been irregular and incomplete.

The free trade consensus among the leaders of wealthy nations has fueled fierce protests from groups claiming to represent the world’s poor and disenfranchised. Drawn from an enormous array of organizations, from community groups to anarchist cells, these protesters have expressed themselves vigorously and sometimes violently against globalization. They have especially targeted the WTO (World Trade Organization), which, they argue, has endorsed policies that exploit the world’s poor. But while anti-globalization protesters have made clear what kind of world economy they oppose, they have yet to offer a vision that promises greater prosperity to the Latin American, African, and Asian peoples on whose behalf they purport to speak.

On the other hand, the imperatives of free trade, open markets, fiscal discipline, and currency stability have been embraced, in different ways and to different degrees, by many Asian, African, and Latin American nations. With a well-educated and, among its professional classes, largely English-speaking workforce, India has profited enormously from the expansion of high technology. The South Asian nation has also been a major recipient of “outsourcing”: jobs, such as computer technical support and customer service positions, transferred overseas where wages are lower. The Indian experience illustrates what The Economist has described as the “death of distance,” in which technology and globalization have rendered geography meaningless. Meanwhile, Latin America’s leading nations—Mexico, Brazil, Chile, and Argentina—have aggressively embraced policies designed to attract foreign business and investment. As of 2005, Mexico, Chile, and the Central American nations had also entered into free trade agreements with the United States.

Technology

As manufacturing tasks have shifted to the “developing” zones of the world, the most highly developed economies have assumed the task of producing information and services rather than things. Rapid progress in cybernetics, with its origins in the military competition of World War II, has transformed how business, government, and cultural institutions operate.

War stimulated the creation of systems for the electronic storage and manipulation of data. The first computers were designed to help decipher intercepted messages during World War II. Such were Britain’s pioneering Colossus, which could process 5,000 characters per second, and the massive ENIAC (Electronic Numerical Integrator and Computer), unveiled at the University of Pennsylvania in 1946, which weighed some 30 tons, used about 18,000 vacuum tubes, and could store the equivalent of just twenty words in memory. Thereafter, the United States led the development of computer technology in the new field of cybernetics. By the late 1990s, computers based on the silicon chip (about ¼ inch across) could store in memory the equivalent of more than 50 million words. This powerful chip gave its name to “Silicon Valley,” the region of northern California known for intense technological innovation. Silicon Valley was home to Apple, an early and innovative personal computer company whose Macintosh helped revolutionize the industry. Further north on the Pacific Coast, Washington-based Microsoft became one of the world’s largest corporations by developing the software used by most personal computers around the globe.

The design and implementation of hardware and software systems require highly trained experts. Even the operation of these systems demands workers skilled beyond the level of the manual laborers who were once the core of the “working class.” Those needs burden the educational systems of societies that aspire to excel in the enterprise of managing information. “In an economy dominated by information and knowledge,” British prime minister Tony Blair pointed out, “education is king.”


Culture

Perhaps nothing unifies the peoples of the world as much as their participation in a common culture—a mass culture of television shows and popular music, of youth fashion, sport, and film. Although these trends first appeared in the free, exuberant 1920s, they developed their present form during the Sixties, an era marked by political protest, sexual liberation, and countercultural transformations (see Chapter 29). American patterns continue to dominate contemporary mass cultural style. These trends, in turn, have evoked protest from those who perceive in them a threat to traditional religious values and models of the family.

International sport has played an important role in this process of cultural globalization. Over the last 15 years, the Olympic Games have grown in every respect—in the number of athletes, of participating countries, of contested sports, and of television viewers. The 2004 Summer Olympics, held in Athens, Greece, featured such feel-good stories as the men’s soccer team from recently liberated Iraq reaching the tournament’s semi-finals and the Argentine men’s basketball team upsetting the heavily favored U.S. squad en route to capturing the gold medal. Individual athletes, such as British soccer sensation David Beckham, American cyclist Lance Armstrong, and Argentine basketball player Manu Ginobili, have marketed themselves as global figures.

Technological, economic, and cultural changes have combined in the explosive growth of the Internet. The Internet traces its roots to the U.S. Defense Department, which wanted a tool to facilitate the exchange of information within the national security bureaucracy. As recently as January 1993, only 130 websites existed. Then a series of developments paved the way for the Internet of today. The Uniform Resource Locator (URL) gave every web page a unique address. Hypertext Markup Language (HTML) allowed web pages to display different fonts, sizes, pictures, and colors. Graphical web browsers made it easy to find and view individual sites. Netscape’s creators distributed their browser for free; by the end of 1994, users had downloaded one million copies. Microsoft’s Internet Explorer and, later, Mozilla’s Firefox subsequently displaced Netscape as the dominant browsers.
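
To make the idea of a URL as a unique address more concrete, the minimal Python sketch below uses the standard library to split an address into its parts; the address shown is a made-up example, not a real site.

```python
# A minimal sketch of how a URL's parts identify a single resource.
# The address below is hypothetical, used only for illustration.
from urllib.parse import urlparse

url = "http://www.example-library.org/catalog/history/globalization.html"
parts = urlparse(url)

print(parts.scheme)  # "http" -- the protocol the browser speaks
print(parts.netloc)  # "www.example-library.org" -- the host, located through the domain name system
print(parts.path)    # "/catalog/history/globalization.html" -- the specific page on that host
```

Because no two registered hosts share the same domain name, and no two pages on a host share the same path, the combination points to exactly one document anywhere on the web.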

In recent years, the number of websites and Internet users has grown beyond anything imaginable a generation before. Worldwide online commerce developed from essentially zero in 1994 to an estimated $6.8 trillion in 2004. In early 2001, more than 500 million people, disproportionately residents of the United States or Europe, used the Internet. Of this group, 35.8 percent spoke English, and another 37.9 percent spoke non-English European languages. But Chinese, at 14.1 percent, was the second most common single language among people who surfed the World Wide Web.

Beyond commerce, the Internet has a plethora of uses. In January 2005, the World Wide Web contained 317,646,084 separate sites, with the number growing every day. Nearly every European government, along with the EU and UN, places official documents and other information on the web. E-mail and instant messaging bring people from all parts of the globe into immediate personal contact. The ability to download audio and video files through peer-to-peer networks is changing the music and movie industries. Blogs have transformed how Americans and Europeans receive information about current events. And since the Internet can now be accessed from laptop computers, personal digital assistants, and cellular telephones, users no longer need to be tied to their desks.

Although it sits on the technological cutting edge, the Internet is culturally neutral. While it represents a way to spread Western ideas and culture to other parts of the world, virulently anti-Western Islamist movements have also used it effectively. The Internet’s proliferation has also raised questions about its governance. Since 1998, the Internet Corporation for Assigned Names and Numbers (ICANN), a U.S.-based nonprofit corporation operating under an agreement with the U.S. government, has overseen the assignment of domain names. In 2005, however, the UN convened the World Summit on the Information Society, which demanded an increased role for the international body in overseeing the Internet. The United States and most countries in Western Europe have opposed such an arrangement, citing fears of politicizing the administration of the Internet.

The environment

As the West has become more integrated with the rest of the world, it increasingly cannot avoid that world’s problems. The environment provides a good example. In 1992, the UN Conference on Environment and Development (the “Earth Summit”) met in Rio de Janeiro. Delegates from 178 countries focused on issues such as pollution, the depletion of the stratospheric ozone layer, and global warming. Certain pollutants, especially chlorofluorocarbons, have thinned the ozone layer, the protective gaseous zone of the earth’s atmosphere that blocks much of the harmful radiation from the sun. The emission of carbon dioxide and other gases, an accompaniment of industrial processes, has also contributed to “global warming”: these gases trap the sun’s heat close to the surface of the earth, and temperatures rise as a result.

As a follow-up to the 1992 Rio Conference, diplomats met in Kyoto, Japan, in 1997 to hammer out a treaty addressing global warming. The resulting document committed 37 industrialized nations to cut greenhouse gas emissions. The treaty assigned the most significant cut—an estimated 33 percent reduction—to the United States, the world’s largest producer of greenhouse gases. Japan and the EU nations were similarly targeted for significant cuts.

Reflecting the strength of environmentalism on the continent, the Kyoto treaty enjoyed broad support in most of Europe. Shortly after coming to office in 2001, however, President George W. Bush (1946- ) withdrew U.S. support for the treaty (which the Senate in any case had declined to approve). Administration officials contended that the treaty would harm the U.S. economy and that it insufficiently addressed the rising greenhouse gas emissions of the developing world, especially China and India. Critics countered that the administration was politicizing science to protect big business interests in the United States.

The U.S. response to the Kyoto treaty represented a major setback for an environmental movement that had gained widespread popular support in part by portraying nature as a beneficent, pacific force. Yet throughout history, nature has also been a killer—as seen in countless hurricanes, floods, tornadoes, droughts, and earthquakes. The world received a reminder of nature’s darker side in December 2004, when a tsunami (a series of massive ocean waves caused by the displacement of water after a large undersea earthquake) struck the coasts of Indonesia, Thailand, India, and Sri Lanka. With its epicenter located around 100 miles west of the Indonesian island of Sumatra, the December 2004 earthquake, which measured between 9.0 and 9.3 in magnitude, was either the second or third most powerful in recorded history. The resulting waves killed around 300,000 people, the majority in Indonesia’s Aceh province. But the tragedy was an international one: several thousand tourists, mostly from western and northern Europe, died when the tsunami struck resort areas on the Thai coast.

Population, disease, and famine

Demographic developments in Western Europe over the past generation have run very much against worldwide patterns, which have featured a dramatic surge in population. More than 5.8 billion people lived on the earth in 1996. This total was more than twice the 1950 figure of 2.5 billion; more than three times the 1900 figure of 1.6 billion; and more than 30 times the approximately 170 million of the heyday of the Roman Empire. For most of the twentieth century, moreover, the rate of population increase itself rose. Some experts fear a “population bomb”: population levels so great that they threaten the welfare of all.

The most populous regions of the world experience deficits in the quality of life, visible in lowered life expectancy, harsh living conditions, and poor medical care. Life expectancy is greater than 75 years in Western Europe and the United States. But it sinks to less than 65 in India, Indonesia, Egypt, and Peru, among others; to less than 55 in most of Africa; and to less than 45 in a few desperate states, such as Afghanistan and Uganda. High rates of child mortality correlate with decreased longevity. In 2002, in Germany and France, five children per 1,000 live births died before the age of five; in Sierra Leone, at the other extreme, 283 children per 1,000 births did not reach their fifth birthday. Nongovernmental organizations (NGOs) and UNICEF, the United Nations Children’s Fund, have argued passionately that in an era of globalization, the West has a special obligation to protect the most vulnerable around the world.

Famine. The gap between the daily lives of citizens in the West’s wealthier nations and those in the poorer nations of Africa, Asia, and parts of Latin America is also seen in malnutrition figures. This principal cause of human misery still afflicts more than one in ten people worldwide. According to United Nations figures, less than 2.5 percent of the population is undernourished in the United States, Canada, and every European country outside the former Soviet Bloc. In Africa, by contrast, more than 35 percent of the population is undernourished in 16 separate countries. Low agricultural productivity remains the principal cause of hunger for the more than 850 million people worldwide whom the UN considers undernourished.

Disease. Africa has also suffered disproportionately from the deadliest plague of recent times, AIDS (Acquired Immune Deficiency Syndrome). The disease, which develops in those infected with HIV (Human Immunodeficiency Virus), originated in Africa, perhaps as a mutation of a virus endemic to a species of chimpanzee. It appeared in the Americas around 1980, and for several years thereafter affected mostly gay men and intravenous drug users. In the United States, Ronald Reagan’s administration responded exceptionally slowly to the public health crisis that AIDS caused. The President himself did not publicly utter the word “AIDS” until September 1985, three years after the disease had received its name.

The reasons for Reagan’s inaction related, at least in part, to the changing nature of U.S. conservatism. In sharp contrast to more secular Europe, the United States experienced a revival of religious fervor after 1980. The expansion of evangelical Christianity, especially in the South and Midwest, created sizeable blocs of lower- and middle-income voters who preferred the Republican Party on issues of “moral values.” Opposition to homosexuality and support for a hard-line approach to crime formed two key tenets of this new conservative philosophy. And so, even though the United States was the world’s leader in biomedical research, the Reagan administration delayed government-funded research on AIDS for several precious years.

By 2004, more than 20 million people had died of AIDS worldwide, 3.1 million in that year alone. An HIV-infected person may not develop the disease itself for several years. In the United States and Europe, drug combinations called “cocktails,” developed during the 1990s, have effectively delayed the disease’s progress and allowed most people with AIDS to live relatively normal lives. Because of its high cost, however, that therapy is not available to most people in the developing world, where the number of AIDS cases is soaring. About 65 percent of the nearly 40 million people worldwide with HIV or AIDS in 2004 lived in Africa.

The rate of heterosexual transmission of AIDS is higher in Africa than anywhere else in the world, and the disease there has disproportionately infected women. A total of 17.6 million women were carrying HIV as of 2004, and women will likely soon exceed men among African victims of AIDS. In the tiny nations of Rwanda and Burundi, 20 percent of pregnant women test positive for HIV. These women in turn transmit the virus to their offspring in 25 to 35 percent of cases. (By contrast, the transmission of HIV from pregnant women to their babies has been all but eliminated in the United States and Western Europe.) In South Africa, President Thabo Mbeki (1942- ) has, perplexingly, disputed the overwhelming scientific consensus that AIDS is caused by HIV. Mbeki has refused to make anti-retroviral drugs available to HIV-positive pregnant women in state hospitals.


New battles

Anti-war protesters of the 1960s sang out to “give peace a chance.” Many in the next generation hoped that with the close of the Cold War, the age-old monster of war would be slain. International organizations could supervise the peaceful negotiation of differences, and disruptive “rogue” states could be disciplined by economic sanctions and collective disapproval. This hopeful vision has proved false. Banned weapons of mass destruction have proliferated, and new deadly conflicts have erupted.



Genocide. On a scale far more grisly than anything that Bosnia experienced, ethnic strife triggered genocide in mid-1990s Africa. Defined by a 1948 UN convention as “acts committed with intent to destroy, in whole or in part, a national, ethnical, racial, or religious group,” modern genocide began early in the 20th century with Turkey’s persecution of its Armenian minority (see Chapter 25). More recently, genocidal warfare overwhelmed the neighboring African states of Rwanda and Burundi.

After taking control of the two territories from Germany at the end of World War I, Belgian colonial administrators encouraged a pattern of domination of the Hutu majority (about 80 percent of the population) by the Tutsi minority. When the two nations achieved independence in 1962, the Hutus received political rights. Yet tensions erupted violently in 1965, 1969, 1972, and 1988. Full-scale civil war broke out in 1994, after a plane carrying the presidents of both Rwanda and Burundi was shot down. In the chaos following the assassinations, Rwandan Hutu hard-liners targeted moderate Hutu leaders, UN peacekeepers then stationed in Rwanda (10 Belgians were among the first slaughtered), and, most tragically, ethnic Tutsis. Hutus massacred as many as 800,000 Tutsi men, women, and children, or just over one-tenth of a population of 7.5 million.

With assistance from neighboring Tanzania and Uganda, Tutsi forces fought back and eventually conquered both countries. Three million Hutus (including many of the killers) took refuge in Zaire and Tanzania, where, corralled in refugee camps, they suffered from starvation and disease. In 1998, Bill Clinton traveled to Rwanda and apologized for not sending American troops to stop the violence. As with Bosnia, the unwillingness of the international community to act more assertively cost hundreds of thousands of lives.

Allegations of genocide have also arisen regarding recent events in Sudan. There, the Arab-dominated government of General Omar al-Bashir (1944- ) has waged a war against the black African population, largely Christian and animist, in the nation’s south. In recent years, al-Bashir’s regime has backed private Arab militias that have expanded the conflict to the western region of Darfur. These government-sponsored militias have waged a jihad (“holy war”) in which they have looted villages, raped women, destroyed food supplies, and created hundreds of thousands of refugees or victims of famine. The UN Security Council formally condemned the Sudanese government in late 2004.


Islamist terrorism

Al-Bashir’s regime first generated international concern in the mid-1990s, when Sudan emerged as a leading state sponsor of terrorism. Beyond the zone of the Western democratic states, and even beyond the structures of the nation-state, there has emerged in recent years a new source of explosive violence. The center of international terror has shifted to what one observer has termed the “virtual caliphate,” a shadowy network of militant Islamists that has funneled arms and planned terrorist attacks. Bitterly hostile to secularism and modernity, the Islamists have targeted the United States, Western Europe, and the foundations of Western civilization itself.



Al Qaeda. The critical player in these processes has been a Saudi self-exile, Osama Bin Laden (1957- ), leader of a group called Al Qaeda. The organization traces its roots to the Soviet intervention in Afghanistan. In the 1980s, radical Muslims from around the Middle East, usually financed by Saudi Arabia or Pakistan, waged jihad there against the Soviet occupiers. Once the war ended, these “Afghan Arabs” turned their attention elsewhere. To Bin Laden, the stationing of U.S. forces in Saudi Arabia following the Gulf War threatened Islam’s most holy sites—Mecca and Medina—with America’s corrupt, decadent values. U.S. support for Israel further justified opposing America. But, as the U.S. government’s special commission examining the 9/11 attacks convincingly maintained, Al Qaeda has targeted the West not because of specific policy disputes but because of what the organization perceives the West to represent—democracy and cultural tolerance.

Bin Laden’s power increased dramatically in 1996, when the Taliban regime, which imposed a doctrinaire version of Islam, assumed control of most of Afghanistan. The new government went to such extremes as ending all schooling for girls and destroying ancient Buddhist monuments, ignoring worldwide protests against this barbarous strike at both religion and the world’s artistic heritage. As first Sudan and then the Taliban gave Bin Laden’s terrorist efforts state protection, he grew more aggressive. In August 1998, Al Qaeda executed the near-simultaneous truck bombings of the American embassies in Kenya and Tanzania. In Nairobi, the site of the more severe attack, 213 people died, including 12 Americans, and some 4,000 were wounded. Just over two years later, a U.S. naval vessel, the USS Cole, was the victim of an Al Qaeda suicide bombing in Yemen’s Aden harbor, where the ship had stopped to refuel. The attack, which left a huge hole in the Cole’s hull, killed 17 sailors and injured 39 more.

During his time as president, Bill Clinton struggled to develop a coherent response to the Bin Laden threat. Some in the administration championed the anti-Taliban Northern Alliance, an ethnically mixed coalition that controlled the northern sliver of Afghanistan. The close ties between the Taliban and the intelligence agencies of Pakistan and Saudi Arabia, however, discouraged an aggressive U.S. response. Key European powers, notably France and Germany, also opposed any military action against the Taliban. They worried that a Western war against an Islamic state could harm their efforts to expand European diplomatic and economic influence in the Arab world. And so most in the administration hoped to persuade the Afghan regime to surrender Bin Laden voluntarily. Since Bin Laden was helping to finance the Taliban, this hope had little chance of realization.

The 9/11 attacks and the “War on Terror”

When George W. Bush took office in January 2001, confronting Al Qaeda became a lower priority. In international affairs, the new administration focused on the threats posed by North Korea and Iraq and on its plans to construct a missile defense shield. Despite intelligence briefings in August 2001 highlighting the Al Qaeda threat, Bush appears to have been caught by surprise on September 11, 2001, when 19 operatives committed the deadliest act of terrorism in history—unprecedented both in innocent lives lost and in the sophistication of its planning and execution.

On that clear late-summer morning, Al Qaeda teams hijacked four commercial flights originating in Boston, Newark, and Washington, D.C. They then assumed the controls and used the planes as flying bombs. One jet struck the North Tower of New York City’s World Trade Center; a second hit the WTC’s South Tower; a third rammed into the Pentagon. Passengers in the fourth aircraft, which apparently was intended for the White House or the Capitol, stormed the cockpit, forcing the plane to crash in a Pennsylvania field. The attacks killed 2,985 civilians: 2,595, including 343 New York City firefighters, in the two towers of the World Trade Center, which collapsed less than two hours after they were hit; 265 on the planes; and 125 at the Pentagon.

NATO responded to the terrorism with unprecedented boldness. For the first time in its history, the organization invoked Article 5 of its charter, which terms an attack on one NATO member an attack on all. The UN also endorsed action against regimes that supported international terrorism. Within weeks of the attacks, the United States led a coalition of nations in a “War against Terrorism” that targeted the Taliban. The campaign featured the highly successful use of “smart” air power to achieve pinpoint military objectives with minimal casualties. By the end of 2001, the war had effectively disabled Al Qaeda operations in Afghanistan and ousted the Taliban regime. In late 2004, Afghanistan held a free presidential election, which the interim president, Hamid Karzai (1957- ), won. Nonetheless, as of 2005, Bin Laden continued to elude capture, and his network continued to pose a threat.

According to Richard Clarke, the Bush administration’s then-director of counterterrorism, the President initially responded to the WTC attacks by speculating about Saddam Hussein’s involvement in the atrocity. Bush’s remarks previewed his controversial decision to go to war with Iraq in 2003. The President previously had branded Iraq—along with North Korea and Iran, two other nations on the State Department’s list of state sponsors of terrorism—as a component of the “axis of evil.” Bush argued that in the post-9/11 world, the United States could not wait for terrorists to strike. Instead, it would engage in “pre-emptive war,” lest Saddam stockpile weapons of mass destruction.

Bush struggled, however, to build international backing for the effort, and his policy generated intense controversy in Europe. Along with token support from several former communist nations, only Australia, Britain, and Spain sent considerable numbers of troops to Iraq—and the latter two nations acted despite strong public opposition to the war. The conflict, as a result, placed deep strains on the Western alliance. France, part of the coalition in the first Gulf War, resolutely opposed the U.S. military action. Breaking with Germany’s traditional pro-American tilt, Foreign Minister Joschka Fischer fumed that the United States needed to understand that “alliance partners are not satellites.” Capturing public sentiment in the German capital, the Berliner Zeitung contended that the war showed that the United States had misused the 9/11 attacks “to strengthen its selfish superpower position.”

Unlike in the 1991 Gulf War, at least, coalition forces had a clear mission in Iraq: they toppled Saddam’s government within a matter of weeks. When prewar claims that Iraq possessed chemical, biological, or nuclear weapons proved false, however, the war’s unpopularity in Europe intensified, threatening the position of pro-war politicians, especially Britain’s Tony Blair. Moreover, the occupiers were poorly prepared to handle the postwar situation. As a result, an ill-defined insurgency, operating with Al Qaeda support, expanded its reach. By late 2005, insurgents had killed more than 2,000 U.S. soldiers, more than 100 British soldiers, and tens of thousands of Iraqi civilians.

Europe and the Iraq war. In 2004 and 2005, Al Qaeda operatives targeted other countries that had participated in the Iraq operation. On March 11, 2004, bombs planted on Madrid commuter trains killed 191 people and injured nearly 1,500. The attack was timed to coincide with Spain’s parliamentary elections. The pro-war Popular Party faced a strong challenge from the Socialist Party, whose leader, Jose Luis Rodriguez Zapatero (1960- ), opposed the Iraq involvement. Polls showed the Popular Party headed for re-election until the attack, but on Election Day the Socialists scored an upset. The victorious Zapatero pulled Spanish troops out of Iraq shortly after assuming office.

Despite the war’s unpopularity in Britain, in May 2005 Tony Blair became the first Labour prime minister in history to win a third term, in large part because of the weakness of the main opposition, the Conservative Party. The Labour majority was nonetheless severely reduced, and the anti-war Liberal Democrats had their strongest showing ever. Two months after Blair’s re-election, terror arrived in London. On July 7, 2005, Islamist suicide bombers killed 52 people and injured more than 700 when four bombs exploded on three subway trains and a bus. Revelations that the four killers had grown up in Britain compounded the shock of the attack. Blair’s government quickly enacted a sweeping law giving it the power to deport Islamic radicals who preached support for terrorist activities.

Militarism, fanaticism, torture, genocide, and terror—these overshadowed the last decades of the twentieth century and the dawn of the twenty-first, even as the powerful nations of the developed world, the Cold War behind them, settled down to the business of building a prosperous global village based on technological innovation.

Conclusion: what’s next?

Observers of the tangled web of the past, historians prudently refrain from speculating about the future—and we, with them, do not presume to answer the question “what’s next” for Western civilization. But two starkly opposite possibilities present themselves:



The first is that the West is in its “sunset” stage, in contrast to the “sunrise” Islamic states on the intellectual front, countries such as India in economic affairs, and China in the strategic realm; and that the United States in particular, in the words of the newly elected (June 2005) Iranian president Mahmoud Ahmadinejad (1956- ), “is in its last throes.”

The second is that the West is still expanding, less through military domination than through economic and cultural innovation. Its languages, especially English and French, are heard worldwide; its fads and fashions captivate world consumers, especially the young; while its liberal political traditions call to reformers around the globe, suggesting that, as the White House boldly proclaimed in February 2005, “democracy and freedom” may be “on the march.”

