Wednesday, July 9, 2008

American Exceptionalism?

Introduction
We are engaged in a national conversation about the role of history in furthering democracy and civic engagement. The White House, members of Congress, and several major foundations and interest groups have accelerated the long-running discussion of how to strengthen history education in America’s schools and use history to promote democratic values and patriotism. The language they are using, the decisions they are making, and the money they are spending promise to dramatically affect how U.S. and world history will be taught.
Several broader issues as well as an increasing number of federal legislative efforts form the context of this discussion. Defining America’s superpower role in the post-Cold War world and at the start of the new millennium and comprehending the attacks of September 11, rising terrorism, and the war in Iraq—these have caused many to turn to history and to attempt to use the study of the past to define and inculcate patriotism, American values, and civic engagement. Meanwhile, and for similar reasons, vast amounts of federal money have been flowing into history education. In 2001, Congress approved Senator Robert Byrd’s (D-WV) Teaching American History (TAH) grant program. The Department of Education’s TAH program has already directed a quarter of a billion dollars to fund collaborative three-year projects between precollegiate schools and historians in history departments, museums, historical societies, and elsewhere who can help improve teachers’ knowledge, understanding, and appreciation of American history. Senator Byrd’s program emphasized a return to teaching history as a separate subject and specifically excluded social studies projects. This program has funded some 289 three-year projects around the country, each of which involves dozens of teachers. Congress is considering an additional $120 million for TAH in fiscal 2004.
In fall 2002 the Bush administration launched its We the People initiative to further improve history education. It began with a National Endowment for the Humanities (NEH) essay contest for high school students and an annual NEH “Heroes in History” lecture. The We the People launch also included a White House forum in spring 2003 to explore, in the words of the president, “new policies to improve the teaching of history and civics in elementary and secondary schools, and in our colleges and universities.” Unfortunately, few of the panelists were historians, and professional and academic associations of historians were invited as spectators rather than as participants. A smaller forum in July 2003 was an improvement, with panelists and audience agreeing on the need for more history content for students and more history training for teachers. We the People also involved a National History Day (NHD) and National Archives and Records Administration (NARA) joint effort to identify and promote classroom use of the nation’s “one hundred milestone documents.” For the current fiscal year NEH received $10 million in additional We the People funding. This substantial increase to the NEH budget will be used for projects on American history across all NEH program areas. Applicants are encouraged to “submit grant applications that […].” The president’s fiscal 2005 budget would increase NEH funding by $26.7 million, primarily to strengthen support for We the People.
Other major bills introduced this Congress demonstrate a growing concern about the state of history education. Senator Lamar Alexander (R-TN), a former U.S. secretary of education, introduced his “American History and Civics Education Act” in 2002, which would create summer academies for students and for K-12 teachers of history and civics. Alexander called for an emphasis on traditional American history, which in recent versions of his bill is defined as “key events, key persons, key ideas, and key documents that shaped the institutions and democratic heritage of the United States of America.” Judd Gregg (R-NH), another senator who has long taken an interest in the teaching of American history, introduced the “Higher Education for Freedom Act.” This bill would “establish and strengthen postsecondary programs and courses in the subjects of traditional American history, free institutions, and Western civilization.” Representative Pete Hoekstra (R-MI) introduced the “Graduate Opportunities in Higher Education Act of 2003,” which would provide millions of dollars for establishing college and university courses, seminars, research programs, new teaching materials, and other means of support for the teaching of “traditional American history (including significant constitutional, political, intellectual, economic, diplomatic, and foreign policy trends, issues, and documents; the history, nature, and development of democratic institutions of which American democracy is a part; and significant events and individuals in the history of the United States).”
This debate about using history to teach citizenship, democracy, and patriotism is also taking place across the nation in places where state social studies and history standards for the schools are being reconsidered. Minnesota, for example, is in the middle of a contentious process of revising its state social studies standards. Commentators around the country, including parents, teachers, journalists, curriculum specialists, state officials, and politicians, are casting a new eye at how American history is treated in the schools in their states.
Much of the recent attention has been productive and, for example, has resulted in a sudden abundance of funds for the NEH and the Department of Education to be spent on promoting the understanding and teaching of American history. While most members of the Organization of American Historians would support much of what is being said and done, these developments have been accompanied by a rhetoric of antagonism directed at the way historians have been practicing their craft for the past few decades. The new national debate over history pits what is being called “traditional history” against “revisionism,” or the continual exploration and reinterpretation of the past. Revisionist history is blamed for being critical of the United States and uninterested in conveying the singularity of the American experience and the significance of our nation’s values. In 2003 the Albert Shanker Institute, Thomas B. Fordham Foundation, and American Council of Trustees and Alumni published reports that attempt to strengthen this notion of “traditional history.” (See bibliography below for web links to these reports.) Whether they are employed in colleges, museums, schools, research universities, historical societies, publishing, government, historic sites, or in the private sector, historians seek new information about and new understandings of the past. They continually revise what we know about history: they sharpen and fine-tune, bring new approaches and ideas to bear, and reinterpret in light of the present. This does not necessarily change the facts of history, but neither the facts nor our interpretations of them are static.
The Executive Board of the Organization of American Historians urges more professional historians and history teachers to become familiar with this debate. At its fall 2003 meeting, the board directed OAH staff to present a compendium of views that summarizes some of the major reports published last year, places them in the larger context of the culture wars of the mid-1990s, and offers several old and new responses to the proponents of “traditional history.” This summary, largely the work of Laura Micheletti Puaca, is offered as a starting point for all those interested in joining the conversation so far.
“Traditional History” vs. “Revisionist History”
A Summary of the Debate over History’s Role in Teaching Citizenship and Patriotism
Laura Puaca
April 1, 2004
PROPONENTS OF “TRADITIONAL HISTORY”
Why history matters
Proponents of “traditional history” argue that in the aftermath of 9/11, it is more important than ever for students to “learn the history of their nation, the principles on which it was founded, the workings of its government, the origins of our freedoms, and how we’ve responded to past threats from abroad.” History also teaches students how to be citizens, to understand their world, and to comprehend America’s relationships to other nations.[1] The Shanker Institute’s report, Education for Democracy, concurs: “[T]he mastery of a common core of history binds us together, creates a common civic identity based on a patriotism of principles, and unites us in the shared undertaking that is both our past and our future.” As more than one person has noted in reference to 9/11, the Shanker Institute adds, “We were attacked for being American. We should at least know what being American means.”[2]
Students don’t know history
Schools are responsible for teaching students about America’s past, but many studies have shown “that history is the core subject about which young Americans know least.”[3] According to the Shanker Institute, “our students are woefully lacking in a knowledge of our past, of who we are as Americans.”[4] Sheldon Stern, writing for the Thomas B. Fordham Institute, also notes that secondary students are no longer expected to write research and term papers in U.S. history and that many students come to college with no experience writing papers.[5] J. Martin Rochester cites historian Sean Wilentz who says that educators “pose as courageous progressives dedicated to liberating schoolchildren from the tyranny of rote instruction…But if they have their way, the widely lamented historical illiteracy of today’s students will only worsen in the generations to come.”[6]
Stern links this phenomenon to low voter turnout and contempt for the democratic process, as does Rochester in his essay on the “training of idiots.” The Shanker Institute adds that, as a result, students have been left to flounder in a state of moral confusion and do not seem interested in people outside their immediate circle of friends and relatives (except for entertainment/sport figures).[7]
And what they do know is skewed
Additionally, proponents of traditional history have charged that the “serious issues…such as what constitutes appropriate history” have been altered beyond recognition by “political correctness,” if not neglected completely.[8] In their introduction to Where Did Social Studies Go Wrong?, James Leming and Lucien Ellington use David McCullough’s testimony to support this assertion. In his NEH Jefferson lecture, they write, McCullough “decried the way in which the kind of political correctness exposed in this volume [Social Studies] has stripped the American history that today’s students study of any messages as to why we should appreciate the ideals and sacrifices that have made this country great. He called the emerging national historical amnesia, rooted squarely in vapid politically correct accounts of our history, a threat to liberty: ‘Something is eating away at our national memory…For a free, self-governing people, something more than a vague familiarity with history is essential if we are to hold on to and sustain our freedom.’”[9]
Proponents of “traditional history” also link this issue to the politicization of the university by “tenured radicals” who have taken over the university and indoctrinated their students with their leftist agendas. Education has become political, and objectivity, traditional values, and academic discipline have gone out the door.[10]
Teachers don’t know history either
Perhaps part of the problem, Fordham Institute President Chester Finn, Jr. suggests, is that too many teachers of history have never seriously studied the subject. Rather, they “have been certified as ‘social studies’ teachers after majoring in sociology, psychology, or social-studies pedagogy.”[11] Stern agrees, and notes that many history teachers were education majors with little or no background in history. Moreover, these teachers are “rarely encouraged, evaluated or rewarded for their knowledge of subject matter.” This leads to bad habits, such as over-dependence on textbooks, and “promotes simplistic or inaccurate history teaching.”[12] Leming and Ellington add that the field of social studies brings with it its own problems. Social studies theorists—aka education professors—portray American society as morally bankrupt, advocate using the classroom for societal transformation, and are hostile toward the kinds of basic knowledge ordinary Americans want their children to have. “The theorists’ passion for radical social change and their propensity to use the public schools as a tool to do so…. has resulted in a field that eschews substantive content and subordinates a focus on effective practice to educational and political correctness.”[13] Stern proposes that states must raise the bar by requiring that new teachers of the subject possess a bachelor’s degree in history. He also advocates that they earn a master’s in history within a contractually agreed-upon number of years. Degrees in education should be unacceptable. Furthermore, as Stern notes in his recommendations, history should be removed from social studies.[14]
The need for state standards
The Shanker Institute applauds the “standards movement,” or “the long overdue idea that a common core and orderly sequence of learning in each of the major subject fields, including history/social studies, should be set forth in specific terms as a guide for curricular materials and teaching.”[15] Most states, though, do not have high-quality standards for the teaching and learning of history. While supportive of President Bush, Finn argues that the No Child Left Behind Act of 2001 might actually worsen this condition. The act requires such standards for reading, math, and science, but allows other subjects such as history to “fly beneath the federal radar.”[16] According to Stern, though, the act should be considered “a floor, not a ceiling.” Forty-eight states and D.C. have already established social studies standards, which are “necessarily and properly the starting point for determining what America actually intends its young people to know about their nation’s history.” Standards will also shape teacher preparation, textbook selection, etc.[17]
Challenges in implementing state standards, including broader concerns about what constitutes history, how history should be taught, and what students should know
Stern argues that “[s]tate history standards must acknowledge the key issues and events that comprise the whole American story, including both the inspiring and terrible events in our past.” But he also notes that education does not exist in a vacuum, and history standards have become entangled with the “profound realities of American life,” namely “the anti-educational values promoted in popular culture and the bitter turf wars, culture wars, and legitimacy wars among interest groups at all levels of American society.”[18]
One important challenge, Stern argues, is presentism, examples of which include high school students charging Columbus with genocide and elementary school students rewriting the Constitution the way women might have, had they had the opportunity. Stern writes that while the injustices of the past should not be expunged, they should be contextualized: “It is the task of honest history education to be anchored in context and to reject corrosive and meaningless presentism.”[19]
Another consideration is the post-9/11 “history education crisis.” The big question was what to teach children about the 9/11 attacks. Many in the social studies field as well as the education establishment in general advocated tolerance, forgiveness, diversity, and the possibility that America was responsible for the events. Teachers were not encouraged to explain to their students “why some bad people abhor freedom and seek to obliterate democracy; why America, because of what it stands for, is abhorrent to those who would enslave minds, subjugate women, and kill those who differ from themselves; why the United States is worth preserving and defending; and how our forebears responded to previous attacks upon their country in particular and freedom in general.”[20] So at a time when the country needs—more than ever—its future citizens to learn why America is worth defending, its greatest source of would-be help has turned into a hindrance.[21] Rochester adds that the NEA’s 9/11 lesson plans “were a textbook example of the trends toward not only the nonjudgmental classroom but also the therapeutic, fact-free classroom.”[22]
For Stern and others, the 9/11 debate recalled the debates surrounding the proposed 1994 National Standards for United States History. They argue that both sides “force inconvenient new facts through handy ideological filters” and neither will ever be satisfied.[23] The inclusion of those previously excluded had resulted in the exclusion of those previously included. New (revisionist) histories, curricula, and state standards do not provide balance, but rather, replace old distortions with new ones. Stern writes, “Today’s students can readily identify Sacajawea and Harriet Tubman but often can barely discuss Washington or Jefferson—except as slave owners….The once well-known story of the growth and expansion of American democracy and human rights is barely perceptible in many state standards and curricula.”[24] Rochester sees this emphasis on inclusion as impractical since there are only 180 days in a school year. Not everyone can be included. “Even the imperative to give equal time to women alongside men can lead to silliness.” “For better or worse,” he concludes, “DWEMS [dead white European males] dominated much of the political history of the world, certainly the history of the United States.”[25] Rochester also cites Arthur Schlesinger, Jr., who argues that those intellectuals advocating “inclusion” are actually “promoting the ‘balkanization’ of America by legitimizing divisive identity politics over the ‘melting pot’ metaphor.”[26]
Stern admits that democracy is not well served by teaching a sterilized history devoid of conflict but adds that democratic institutions will not flourish if students “swallow the distortions and half-truths promoted by leftist ideologues…who dominate the social studies establishment in our schools, the faculty in our graduate schools of education, and the history and ‘studies’ departments in our colleges and universities. Young Americans are being consciously taught to hate and be ashamed of their nation’s history and to believe that America is a uniquely evil and oppressive society.”[27] The Shanker Institute adds, “It is not just that we are flawed, the account goes, but that we are irredeemably flawed. Such an interpretation is distorted, harmful to students, and strongly counter to the views of parents.”[28]
On the role of multiculturalism, Lucien Ellington and Jana Eaton distinguish two visions of multiculturalism: “cultural pluralism” and “critical separatism.” According to Ellington and Eaton, most multicultural theorists (textbook writers, standards creators) espouse the critical separatist view. They cite studies showing that social studies education professors lean far more to the left than other teachers and fault them for their ideas about white privilege, knowledge/power, impossibility of objective truth, etc.[29] Multicultural theorists’ “postmodern perspective” also challenges what is history, what can be taught as history, and what is evidence. “If there are always ‘multiple truths,’” Ellington and Eaton write, “then what is taught as content becomes simply a matter of competing opinions.”[30]
The trend toward global education exacerbates these problems by generating suspicions about American institutions while uncritically celebrating the institutions of most other societies. For example, Diane Ravitch’s textbook study, cited in Education for Democracy, notes that world history texts present all cultures as “great and glorious,” sugarcoat non-Western practices that would otherwise be condemned if practiced by Europeans and Americans, and only portray Europeans and Americans as imperialistic.[31] In Where Did Social Studies Go Wrong?, Jonathan Burack identifies three global history “contradictions,” including: “a multiculturalism that is neither ‘multi’ nor ‘cultural’”; “the unbearable blandness of diversity”; and “tolerating the intolerance of the ‘other.’”[32] Burack also views global history as problematic insofar as it diminishes the role of the nation-state. More specifically, Burack criticizes the movement to internationalize the study and teaching of U.S. history, as proposed in the OAH La Pietra Report. The report’s agenda, he argues, is based on political advocacy, not historical scholarship.[33]
Other standards issues include: the “dumbing-down” of academics, which is itself long-standing (Rochester traces it to 1893) but must now also reckon with the Internet, the shift in cultural values (“the excesses of the Woodstock nation”), and constructivism, or the idea that students should construct their own historical interpretations (which “assumes that uninformed students can make informed judgments”). The result is a collapse of standards. And the “educational have-nots” are hurt the most. Rochester also surveys a number of social studies texts and finds: emphasis on critical thinking; focus on controversy; America-trashing; the anti-intellectualism of constructivist learning theories; cynicism (vs. skepticism); and progressive groupthink.[34]
The Fordham Institute conducted its own review of state standards in the 48 states and Washington D.C. which have them in order to determine whether students would be “adequately educated in American history—particularly in the origins and development of democratic institutions and values.”[35] The study used three broad criteria: 1) comprehensive historical content; 2) sequential development; and, 3) balance. The results were six “outstanding” grades, five “very good” marks, seven Cs, eight Ds and 23 Fs (including the District of Columbia). Model states are Indiana, California, and Alabama.
Proposed Solutions
For Stern, the most decisive step toward achieving strong U.S. history standards in all states would be “to emancipate this subject from the miasma of social studies.” Stern depicts social studies as a “nebulous, anti-historical, and a-historical invention.”[36] Ravitch traces the history of social studies and shows that when social studies was first introduced in schools in the early twentieth century, history was at its core. But over the years, and especially in the latter decades of the twentieth century, “many social studies professionals disparaged history with open disdain, suggesting that the study of the past was a useless exercise in obsolescence that attracted antiquarians and hopeless conservatives.”[37]
Rochester offers his own solutions for correcting the imbalances in the teaching of history/civics/social studies. These include teaching American history in its own right (rather than as part of world history), not being bashful about the political achievements of the American political system, insisting on facts, hiring informed teachers, and emphasizing complexity instead of relativism. [38]
Burack proposes several approaches for the teaching of global education. They include re-centering the West, approaching other cultures “honestly,” “warts and all,” “noting the contradictions of global education ideology,” “stressing the superficiality, inaccuracy, and blandness” of world cultures/history materials, and encouraging “stronger narrative history with a focus on moral and political action.” [39]
Ellington and Eaton suggest that social studies teachers should “reject critical separatist multiculturalism because it is misleading, attacks ideals integral to American success, fosters ethnic discord, promotes extreme relativism, and is objectionable on educational, evidentiary, and political grounds.”[40] They want teachers to embrace a kinder, gentler version of multiculturalism which they call “cultural pluralism.” Multicultural education should be based “on evidence and sound scholarship, instead of the ideological and affective perspectives that the theorists espouse.”[41] Ellington and Eaton recommend that: teachers should “develop American history courses that fairly describe the experiences” of minority groups; social studies instruction “should reject the theorists’ idea that all cultures are equal”; teachers should not be negative-minded social activists; and policy makers and the general public should be made aware “that radical leftist multicultural ideas have been institutionalized in teacher education programs.”[42]
The Shanker Institute advocates critical thinking that rests “on a solid basis of factual knowledge.” Content and facts—“central ideas, events, people, and works that have shaped our world, for good and ill”—should be emphasized over “learning skills.” Facts are real, important, and should be learned.[43] The Shanker Institute proposes four essentials for teaching young democrats: 1) a “robust” history/social studies curriculum, to be taught every year beginning in elementary school; 2) “a full and honest teaching of the American story”; 3) “an unvarnished account” of life in nondemocratic societies; and 4) “a cultivation of the virtues essential to democracy.”[44]
COUNTERARGUMENTS
Short summary
In response to the charge that students don’t know history and that the little they do know has been corrupted by “tenured radicals” and the like, many historians point out that: 1) debates over history teaching and interpretation are long-standing, and cannot be pinned on some imagined recent takeover of schools by hyperliberal teachers and professors; 2) no history is “objective”—it never has been, it never will be, and it should not pretend to be; 3) revising historical narratives is not a bad thing, a new thing, or an unnecessary thing; and 4) understanding American history in its entirety (for better, for worse) is not “unpatriotic” or “un-American” as critics have charged. “Truthful,” inclusive history is essential to understanding this country’s past and present.
Re: Students don’t know history.
True, many students don’t know “history.” But this is not a recent phenomenon, nor is students’ resistance to “facts.” In his article, “Don’t Know Much About History—Never Did,” Richard J. Paxton urges readers to situate recent claims about poor student performance in a historical context. After examining nearly a century of history surveys, Paxton concludes that students have consistently scored low on history surveys. In 1917, for example, a history survey conducted by J. Carleton Bell and D.F. McCollum revealed that elementary school students answered questions correctly at a 16% rate, while high school students, normal school students, and university students scored somewhat higher, at 33%, 43%, and 49%, respectively. This outcome, the authors explained, “does not show a very thorough mastery of basic historical facts.”[45]
That students in 1917, as well as another group surveyed in 1943, scored low on history surveys did not mean, however, that they were in danger of losing their “civic memory.” Nor did they jeopardize America’s civic identity or commitment to democracy, helping instead to win two world wars. Although critics continually issue such charges against contemporary students with low scores on history surveys, it is important to point out that Tom Brokaw's “Greatest Generation” performed no better on history surveys than the high school students of the 1980s, labeled by Diane Ravitch and Chester Finn as the “Generation at Risk.”[46] Therefore, to blame students’ low survey scores on recent changes in the composition of the faculty, unorthodox teaching methods, multiculturalism, or broader social changes is ineffectual, artificial, and anachronistic.
Moreover, Paxton explains, students perform roughly the same on history surveys as those testing other subjects. While critics might look to these results as evidence of American students’ intellectual decay, Paxton argues that they reveal much more about the surveys themselves. Paxton questions the reliability of “recall-on-demand” surveys and wonders, “If standardized tests do a poor job of capturing the full spectrum of student ability and knowledge, then what can be said of surveys in which a telephone rings and an interviewer quickly begins asking unexpected questions?” “In reality,” Paxton explains, “these surveys say much more about the nature of out-of-the-blue questions than they do about students’ knowledge of history. Even then, they say more about what students don’t know than about what they do.”[47]
Roy Rosenzweig and David Thelen also take up this issue in The Presence of the Past, which explores the paradox in which the so-called crisis of historical amnesia coexists alongside a clear and growing public interest in history, as evidenced in museum attendance, historical tourism, festival participation, etc. Instead of focusing on what Americans don't know about history, then, Thelen and Rosenzweig—along with other colleagues—set out to understand what Americans do know about history, how they use it, and what makes it meaningful. That Americans are lacking in “factual knowledge,” Thelen and Rosenzweig argue, does not mean that they are either uninterested in or indifferent to the past.[48]
Thelen and Rosenzweig find that people generally become interested in history through family and personal experiences. Through photographs, conversations with relatives, movies, books, museums, family reunions, etc., Americans actively engage history. To be more specific, informants ranked the following in order of importance: family gatherings; visiting museums/historical sites; celebrating holidays; watching movies/TV about the past; studying history in school. (When asked which accounts they considered most credible, respondents ranked them in this order: museums; personal accounts from relatives; conversations with those who participated in or witnessed events; college history professors; high school teachers; nonfiction books; and movies/TV. People wanted to get as close to the past as possible; thus the less mediation, the higher the credibility assigned to each.)[49] As a result, Rebecca Conard explains of Thelen and Rosenzweig’s findings, community, state, and national histories “are most effectively approached through doors of perception that link personal experience(s) to broader patterns of history.”[50] Thelen and Rosenzweig show that when folks think of history as “dull” and “irrelevant” (as most respondents characterized studying history in school), it is because they feel unconnected to the past as they learn it in the classroom, largely because “they don’t recognize themselves in the version of the past presented there.”[51] Respondents’ rejection of the nation-centered accounts they were forced to memorize in school, though, is not the same as an overall rejection of national history. Asked about the event that most affected them, Thelen explains in his Public Historian article, two-fifths of respondents “named public events like wars or political movements but also identified how they had experienced it as individuals. And they described how they felt both swept along by and fighting against trends in the larger society.”[52] Thelen and Rosenzweig show that as they “build bridges between personal past and larger historical stories, Americans—especially white Americans—tend to personalize the public past.”[53] People want to approach the broader past on their own terms. “Only by getting close to experience could they see the ambiguities, multiple perspectives, and transformative potential they had learned to expect in their intimate worlds.”[54]
In effect, Rosenzweig writes, “What respondents told us runs counter to the narrative of declension that says Americans are disengaged from history because cultural radicals have captured the schools (and museums) and are teaching gloomy stories about our nation—stories about McCarthyism rather than America’s triumph in the cold war, about Harriet Tubman rather than the Founding Fathers, about destroying Indians rather than taming the West. If only we would get back to the good old facts of American triumph (and the old-fashioned methods of teaching those facts), they maintain, then Americans would be reengaged. The people we interviewed said that they are already quite involved with the past….They liked history in museums and didn’t like history in schools—not because Harriet Tubman has been added, but because the schools require dry recitation of facts instead of inspiring direct engagement with the ‘real’ stuff of the past and its self-evident relationship to the present.”[55]
Re: The need for state standards
In October 2003, a group of historians at the University of Minnesota submitted to the state Department of Education a formal statement concerning the proposed Minnesota Academic Standards in History and Social Studies. “As teachers,” they argue, “we share the concerns expressed by many teachers throughout the state that the standards are unwieldy and impractical, and as historians we believe that many of the proposed standards are inaccurate, misleading, and represent an oversimplified view of American and World History.”[56]
The standards, they maintain, fail to reflect the breadth and depth of contemporary historical knowledge. This failure, they add, “leads to important omissions, misplaced and misleading emphases, and, in a number of instances, clear factual errors.”[57] The U.S. standards downplay or suppress the history of dissent, for example, while the world standards advocate the discredited characterization of global development as a “clash of civilizations.” Additionally, by overemphasizing European history and by “focusing on American ‘roots’ in the guise of world history, the standards perpetuate a myopic and misleading understanding of other civilizations.”[58] Both standards generally depict the United States as isolated or distinctly different from the rest of the world and fail to capture the complexities of America’s role as a superpower.[59]
“We do not point out these problems in order to diminish anyone’s pride in their own history,” they write. “But we are convinced that only by admitting, exploring, and analyzing these vital faults of American history alongside America’s triumphs and by more fully addressing World History, past and present, will we be enabled to learn from our shared past and resolve its complicated legacies. If not, then what is historical study for?”[60]
Similarly, in February 2004, the Georgia State University Department of History passed a resolution protesting the proposed state standards in their field. The proposed standards attempt to revise an earlier version, which had received a “B” from the Fordham Institute in part because they overemphasized social history. According to the State Superintendent and the architects of the new standards, the existing curriculum is “bloated” and based on “mediocrity and shallow standards.”[61] While the Georgia State history educators agree that the curriculum is in need of improvement, the group argues that the proposed standards are too narrow in scope, emphasize memorization over understanding, omit important information and chunks of history, ignore recent scholarship, fail to prepare students for college, and ensure continued mediocrity. Moreover, the proposed standards reinforce a conventional narrative that equates national history with a monolithic nation-state, even though most Americans do not recognize their own lives or the lives of their families in this account. And it is this disconnect that leads many people to view the history that they learn in school as “boring.”[62]
Re: Schools aren’t what they used to be, and they’re only getting worse.
In The Opening of the American Mind, Lawrence W. Levine explores the familiar grumblings of critics: “Never before has there been such disorder, such lack of discipline, such disregard for tradition. Never before have the young shown similar contempt for good sense and for their elders (the repositories of Good Sense). Never before have educators dared to challenge the canons of learning with such abandon and lack of reverence for our cultural heritage. Never before has everything in the realm of culture become so uncompromisingly politicized—so Politically Correct—that teachers and students fear to articulate their views openly and freely. Never before has the educational canon become so diluted by the forced addition of works chosen not for their quality but because of the race or gender of their creators. Never before has the study of the Significant been so dwarfed by the pursuit of the Trivial. Never before have we lived in such a fragmented and inchoate condition in which immigrants and minorities manifest group consciousness and an unwillingness to learn the language, adhere to the traditions, or enter the structures of the larger society.” But these developments are not new ones, and by treating them as though they were, we are in effect avoiding, if not ignoring, the past.[63]
Re: abandonment of the educational “core”
Critics’ call for a return to the educational “core,” Levine argues, results from an artificial nostalgia and a misunderstanding of American educational history. As he explains, “We are told again and again that until the 1960s university education was ruled by the study of Western Civilization and a canon of the Great Books. In fact, Great Books and Western Civilization courses enjoyed only a brief ascendance: they emerged largely after World War I and declined in the decades after World War II.”[64] The inclusion of “modern” writers such as Shakespeare and Walt Whitman came only after divisive and heated disagreements. In other words, there is no essential, immutable core as critics imagine there to be. There have always been arguments about what works to study, and the controversies that we are experiencing today are neither new nor radical.
With regard to the charge that history has been politicized, Levine claims that current emphases on social and cultural history are no more permanent, or any more politicized, than were past emphases on political, intellectual, economic, or diplomatic history. “It also reflects the fact that history today is written, as it has always been written, by human beings who are part of their own societies and cultures.”[65] History is not skewed, but it is selective. “If we think of the American past as a text,” Levine writes, “any reading of U.S. history will be conditioned by who is reading the text and what is transpiring in the society at the time. This has always been the case. George Bancroft, James Ford Rhodes, Henry Adams, Frederick Jackson Turner, Charles Beard, and their peers, were all conditioned in their reading of the text of the American past by who they were, by what was going on in their times, by the prisms through which they chose to look at the past. This is no less—and no more—true of contemporary historians.”[66]
Contemporary critics have charged that the push for “political correctness” has served to stifle students, impose conformity, and harm classroom interactions. As a result, students become afraid to share dissenting views. Again, Levine points out that this has long been the case (although without the “politically correct” label). When was it not risky to go up against the professor on controversial issues? He writes, “Students have always had to learn to accommodate to the whims and prejudices of professors, to the attitudes and sensitivities of fellow students, and to the values and beliefs of the larger society; to, that is, the complex of considerations that today is referred to much too simply as ‘political correctness.’”[67]
What is particularly problematic, though, is that critics “have made this long-standing condition in the academe a partisan one (unique to the Left) and an exceptional one (unique to our time).”[68] The condition is neither. According to Levine, “The trouble with the widespread apocalyptic view of the sudden takeover of the university by forces essentially alien to its basic spirit is that this vision removes the American university from the context of its own extended history and transforms long-term processes of change and development into short-term accidents.”[69]
Re: “revisionism”
In the September 2003 Perspectives, James McPherson expresses concern and confusion over the charge of revisionism being levied against historians. Historians “know that revision is the life blood of historical scholarship. History is a continuing dialogue between the present and the past.” According to McPherson, “Interpretations of the past are subject to change in response to new evidence, new questions asked of the evidence, new perspectives gained by the passage of time. There is no single, eternal, and immutable ‘truth’ about past events and their meaning. The unending quest of historians for understanding the past—that is, ‘revisionism’—is what makes history vital and meaningful.”[70]
McPherson notes that without revisionism, we might still be stuck with images of Reconstruction as they appear in Birth of a Nation, to give one example. He adds that Supreme Court decisions have also been “revisionist” and doubts that President Bush would want to be associated with southern political leaders of the 1950s who condemned Earl Warren et al. for their Brown decision, which was based, in part, on the research of historian John Hope Franklin and others.[71]
In History on Trial: Culture Wars and the Teaching of the Past, Gary B. Nash, Charlotte Crabtree, and Ross E. Dunn add that the writing of history requires revision and reinterpretation—and always has—because “the past is necessarily embedded in the present human condition.” They quote Carl Becker, who wrote, “In the history of history a myth is a once valid but now discarded version of the human story, as our now valid versions will in due course be relegated to the category of discarded myths.”[72] Indeed, interpretations of historical events have evoked disagreement—and “revision”—for centuries. The meaning of the American Revolution was first revised the day after the Paris Peace Treaty was signed on September 3, 1783. By the mid-1790s, Americans were so divided over the legacy of the Revolution that men who had fought side by side found it necessary to organize separate events “and raise glasses to astoundingly conflicting toasts.”[73]
Nash et al. also raise the question of whether history that looks at the dark side of the American past is really “unpatriotic.” They maintain that “nothing can serve patriotism worse than suppressing dark chapters of our past, smoothing over clearly documentable examples of shameful behavior in public places high and low, and airbrushing disgraceful violations of our national credo such as the actions of the Ku Klux Klan or the internment of the Japanese Americans during World War II. If events like these are seen as mere footnotes to history, America’s youth are unlikely to swallow the story, especially when they see around them systemic problems that eat at the national fabric. Sooner or later they will discover that a self-congratulatory version of American history sheds little light on how we got to the place we now occupy.”[74] Moreover, they add, it is in authoritarian regimes that those in power routinely represent the national past in any way they like; in a democracy, students and citizens think for themselves.[75]
Furthermore, Nash et al. maintain, the same critics who are “contemptuous of American historians who reexamine and reinterpret the American past are jubilant that Russia, Germany, and Japan have revised their history textbooks so that students will learn about such things as the Russian appeasement of Hitler, Stalin’s slaughter of twenty-one thousand Polish army officers, the Ukrainian genocide against Kiev’s Jews, the Japanese enslavement of Korean women for wartime sexual services, and the Holocaust. Almost all Americans would agree that children in other nations must look squarely at the dark side of their history. Is the same wisdom not applicable in the world’s leading democracy?”[76]
Lastly, Nash et al. implore us to consider science education and the “recently” released National Science Standards, which were produced in a way similar to the National History Standards. Although the science standards have generated some debate about how much science students should learn, in what grades, etc., they “have not evoked attacks on science educators, charging them with ‘bastardizing’ their discipline because they draw on new research. No one so far is protesting that scientific revisionism is subversive or that teachers and scholars are ‘science thieves’ or ‘science bandits’ because they built standards on new knowledge—the existence of uranium, radon, and the isotope carbon 14, for example—that was unavailable several generations ago. Nor would American parents want their college-bound children to major in science if they were not going to learn E=mc², a formula unknown two generations ago.”[77]
Re: multiculturalism
As Levine notes in his introduction, “The United States has always been a multicultural, multiethnic, multiracial society, but in our own time these truths—and their implications for higher education—have become increasingly difficult to ignore.” As education becomes more open to and representative of the diverse people, backgrounds, and cultures that make up America, its impulse to understand those parts of our history and culture that have been given short shrift grows as well.[78]
In response to critics’ claims that Western European civilization has been dominant in this country and as a result, we must pay proper attention to it, Levine notes that this assertion refers only to origins. “Western Europe was indisputably the point of origin of some of our most influential national values, attitudes, practices, and institutions. But as anyone who studies culture seriously should know, the point of origin is only part of the story; it has to be balanced by a comprehension of what happened to the values, practices, and institutions after they arrived. For they came not to an empty continent but to a peopled one; they came not to a homogenous land but to an increasingly diverse one. Nor is it accurate to speak of “Western European” culture as if it were a unified whole when in fact it comprised a host of different peoples and a series of cultures—languages, religions, nationalities, worldviews, political systems, folkways—that often were in tension with and ran counter to one another. It is simply not sufficient to speak about Western European culture, which was really a heterogeneous complex of related cultures, as if it continued to exist in some pristine form once it arrived in the United States.”[79] Therefore, Levine suggests that we discuss transformations instead of dominance or purity. The transformations that occurred as these cultures came into contact with each other and shaped each other are what define American culture.[80]
In The Presence of the Past, Thelen and Rosenzweig consider how different groups understand history. They explain that “African Americans, American Indians, and evangelical Christians sometimes construct a wider set of usable pasts building ties to their communities as well as their families. Mexican Americans occupy a figurative—as well as geographical—borderland. Like white European Americans, they rely on family pasts as they work through multiple allegiances and sort out fundamental issues of identity, but they also draw on their ethnic and national roots. Unlike white European Americans, Mexican Americans tell a version of the traditional national narrative of progress: they talk about getting closer to owning a piece of the American dream.”[81] African Americans and American Indians whom Thelen and Rosenzweig interviewed, for example, tended to link their personal histories more closely with broader history than did white Americans. According to Thelen and Rosenzweig, “All Americans use the past to build on and affirm primary relationships; African Americans and American Indians also use the past to affirm and build ties to their communities. They not only see themselves as sharing a collective past, they sometimes use these collective pasts to construct the sort of progressive narratives—history with a capital 'H'—that seem harder to find among white Americans. And in some ways American Indians and black Americans also connect their narratives much more explicitly to the American national story than most white Americans do, even while they dissent sharply from its traditional formulation.”[82]
When asked about people from the past who had affected them most, African Americans and American Indians chose national figures, as did white Americans. But the figures were different ones: Martin Luther King, Jr. for African Americans, Crazy Horse for American Indians, and Kennedy, Lincoln, and Christ for white Americans.[83] African Americans and American Indians also constructed different timelines of American history, thus decentering the “traditional” narrative. African Americans, though, were more likely than American Indians to place their experiences within “American” history.[84] As Thelen and Rosenzweig explain, both groups “offered sophisticated counternarratives of U.S. history. African Americans, however, most often saw themselves as part of the traditional story, which they told in conventional Americanist terms of emancipation and progress; they demanded inclusion in the basic narrative and complained of white failures to live up to the nation’s principles. The Sioux seemed to reject the traditional narrative structure altogether, defining themselves as a separate nation with a history that followed a dramatically different trajectory.” But, at the same time, it was the Sioux who most often invoked events and people from conventionally defined U.S. history. “In contrast to the indifference with which white respondents viewed textbook narratives of American history, the Sioux spoke with the passionate interest of the outside critic.”[85]
Thelen and Rosenzweig write, “A collective voice comes easily to these two groups….The ‘we’ they invoke stands in sharp opposition to the triumphal American ‘we’…. The narrative of the American nation-state—the story often told by professional historians—is most alive for those who feel most alienated from it. This departure from conventional wisdom, like so many other insights that emerged during survey interviews, eloquently supports the hunch…[that] professional history practitioners have much to learn from listening to Americans talk about how they use and understand the past.”[86]
More Solutions
Nash et al. propose five concrete solutions:
1) Commit ourselves to a history education that is fit for a democratic society. Information should not be considered off-limits, nor should it be considered sacrosanct and indisputable. We should give up all projects to write the final word—“Leave such stipulations to authoritarian states, which have always imposed that kind of curriculum.” “Our collective memory is bound to change as the issues that matter to us as a nation change. Historical research will continue to yield new information and interpretations….To invoke historical revisionism as a form of foul play serves democracy poorly.”
2) Recognize that the debate over whether to teach more “historical content” or more “historical thinking” sets up a false dichotomy. The two are interrelated.
3) Nurture the flourishing alliances between schools and universities. Legislatures should also insist that history teachers be trained in the subject.
4) Aim for a history curriculum that embraces yet goes beyond the goal of representing a diverse variety of groups and cultures. In other words, identification is not enough. Lessons should help students “explore the broader landscapes in which groups, societies, and peoples interact. If pursued honestly, such an approach would produce unequivocally inclusive history.”
5) Reconcile the differing views of “committed multiculturalists and those Americans who believe classrooms should emphasize the study of democracy’s evolution and the Western heritage.”[87]
On a broad scale, Levine writes, “We need to integrate learning more fully and to have more sequential courses that build on one another. We need to minimize the use of inaccessible jargon wherever possible, particularly in those fields where jargon has become a way of life. We need to make a greater effort to communicate with colleagues in other disciplines, with students, and with the general public. We need to ensure that teaching ability is considered seriously in all faculty personnel decisions. We need to learn how to respond to the considerable challenge of teaching the most wide-ranging and heterogeneous body of students in the history of American higher education.”[88] Rosenzweig advocates “a historical practice that is somehow simultaneously more local and intimate and more global and cosmopolitan, more shaped by popular concerns and more enriched by insights based on systematic and detailed study of the past.”[89]
More specifically, as Thelen and Rosenzweig have shown us, history becomes interesting to people when it draws on their personal experiences. By incorporating these lessons in schools, teachers can help reconnect students with the past and make history meaningful. In his conclusion, Thelen envisions what he calls a “participatory historical culture” in which using the past could be a shared human experience and opportunity for understanding, instead of a breeding ground for divisiveness.[90] Even if they did share conservative critics’ desire to cram more facts into students, their survey results indicate that “the revival of traditional stories and traditional teaching methods…isn’t the way to do it.”[91]
In his conclusion to The Presence of the Past, Rosenzweig discusses the excitement generated when teachers incorporate primary documents in classroom lessons and projects. Although the past is still mediated, Thelen and Rosenzweig’s findings do indicate that Americans want to get as close to the past as possible, and thus, this approach helps to make students feel ‘connected.’[92] According to Thelen in his History News article, many of the study’s respondents were very much aware that the past is mediated; it was precisely because they understood that different people could mediate the past very differently that they wanted to experience it for themselves. Thelen writes, “They would rather engage an artifact for themselves than be told about it…. Participants in experiences certainly mediated what they reported, but their participation in the original experience gave them a better opportunity to see and report a wide range of possibilities that existed in that experience than those who reconstructed it later for other agendas.”[93] In his article in the Indiana Magazine of History, Thelen also advocates “re-enactment” as a way of making history open up.[94]
In his review of The Presence of the Past, Robert Archibald notes that both Thelen and Rosenzweig are aware that in order to connect professional and popular historymaking, history professionals would need “to relinquish exclusive authority over the significance and meanings assigned to the past—a disorienting but necessary experience.”[95] The extent to which they did so in their study, by taking seriously their respondents’ understandings of the past, is lauded by Ronald Grele as “democratic.” “Historians should take seriously The Presence of the Past’s initial claim about the importance of historical activity and begin to understand how we as public historians can collaborate in the building of a more democratic history, and, we hope, a more democratic society.”[96] Could this approach be an example of what critics call “education for democracy”?
SOME POINTS OF AGREEMENT
There are certainly points of agreement between the traditionalists and their critics that suggest the possibility of working toward some common goals. Perhaps the most important point of agreement between these two schools of thought is the need for better training of future history teachers in the subject of history itself. Secondary school teachers should have a major, or at least a minor, in history at the college level. Yet, at this time, no state requires a major in history to teach that subject in its public schools.
The Shanker Institute acknowledges that the study of the past should…“give youngsters a sense of historical consciousness—a connection and continuity with those who came before. This feeling, which is one of both belonging and responsibility, begins with knowledge but touches something that knowledge cannot reach: the mystic chords of memory that Lincoln immortalized. In feeling the presence of the past in their lives, students begin to see that there is a path that has been made ready for them, one on which they can find their place, extend into uncharted territory, and leave their footprint.” To avoid turning history instruction into a parade of facts, the Shanker Institute also stresses the importance of selectivity and meaningfulness to memory.[97]
The Shanker Institute is right: “We should at least know what being American means.”[98] How we go about doing that, and how we define “American” of course, is where the contentions lie.

American exceptionalism (cf. "exceptionalism") refers to the belief that the United States differs qualitatively from other developed nations, because of its national credo, historical evolution, or distinctive political and religious institutions. The difference is often expressed in American circles as some categorical superiority, to which is usually attached some alleged proof, rationalization or explanation that may vary greatly depending on the historical period and the political context.
However, the term can also be used in a negative sense by critics of American policies to refer to a willful nationalistic ignorance of faults committed by the American government.[1]

Overview
Dorothy Ross, in Origins of American Social Science (1991), argued that there are three generic varieties of American exceptionalism:
supernaturalist explanations which emphasize the causal potency of God in selecting America as a "city on a hill" to serve as an example for the rest of the world,
genetic interpretations which emphasize racial traits, ethnicity, or gender, and
environmental explanations such as geography, climate, availability of natural resources, social structure, and type of political economy.
The term was first used in respect of the United States by Alexis de Tocqueville in 1831.[2] American exceptionalism is close to the idea of Manifest Destiny, a term used by Jacksonian Democrats in the 1840s to promote the annexation of much of what is now the Western United States (the Oregon Territory, the Texas Annexation, and the Mexican Cession). The term was later used in the 1890s by Republicans as a theoretical justification for U.S. expansion outside of North America.
The term has also come to describe the belief that the United States has an exceptional position among countries, and should not be bound by international law except where it serves American interests. This position is driven by a (usually implicit) premise that the United States cannot violate international law (and in particular international human rights norms) because of the view that America itself was largely responsible for instigating those norms in the first place. This view has come under stress due to international condemnation of U.S. human rights practices under the doctrine of War on Terror. (Also see: Human rights and the United States.)
The basis most commonly cited for American exceptionalism is the idea that the United States and its people hold a special place in the world, by offering opportunity and hope for humanity, derived from a unique balance of public and private interests governed by constitutional ideals that are focused on personal and economic freedom[citation needed]. It is therefore used by United States citizens to indicate a moral superiority of America or Americans. Others use it to refer to the American concept as itself an exceptional ideal which gives the country a privileged position, and which may or may not always be upheld by the actual people and government of the nation. Researchers and academics, however, generally use the term to strictly mean sharp and measurable differences in public opinion and political behavior between Americans and their counterparts in other developed democracies.
Opponents of the concept of American exceptionalism believe it to be little more than ethnocentrism and propaganda.[3] [4] In their arguments, they often compare the US to other countries that have claimed an exceptional nature or destiny. Examples in more recent times include Great Britain at the height of the British Empire, Israel, the USSR and Nazi Germany, while many historic empires such as Ancient Rome, China, and a wide range of minor kingdoms and tribes have also embraced exceptionalism. In each case, a basis was presented as to why the country was exceptional compared to all other countries, drawing upon circumstance, cultural background and mythos, and self-perceived national aims.

Causes in their historical context
In essence, American exceptionalism characterizes the course of American history as a "deliberate choice" of "freedom over tyranny" which was properly made, and which was the central reason why American society developed "successfully."[citation needed] With this in mind, American exceptionalism is just one of many national exceptionalist movements.

Puritan roots
The earliest ideologies of English colonists in the country were embodied by the Protestantism of Puritan settlers of New England. Many Puritans with Arminian leanings embraced a middle ground between strict Calvinist predestination and a less restricting theology of Divine Providence. They believed God had made a covenant with their people and had chosen them to lead the other nations of the Earth. One Puritan leader, John Winthrop, metaphorically expressed this idea as a "City upon a Hill" — that the Puritan community of New England should serve as a model community for the rest of the world. This metaphor is often used by proponents of exceptionalism.
Although the world-view of the New England Puritans changed dramatically over time, and despite the strong influence of other Protestant traditions in the Middle Colonies and the South, the Puritans' deep moralistic values remained part of the national identity of the United States for centuries and remain influential to the present day. Parts of American exceptionalism can be traced to American Puritan roots.

American Revolution and Republicanism
A milestone in the history of American exceptionalism is the American Revolution. The ideas that created the Revolution were derived from a tradition of republicanism that had been repudiated by the British mainstream. Thomas Paine's Common Sense for the first time expressed the belief that America was not just an extension of Europe but a new land, a country of nearly unlimited potential and opportunity that had outgrown the British mother country. These sentiments laid the intellectual foundations for the Revolutionary concept of American exceptionalism and were closely tied to republicanism, the belief that sovereignty belonged to the people, not to a hereditary ruling class.
Alexis de Tocqueville stressed the advanced nature of democracy in America, arguing that it infused every aspect of society and culture, at a time (1830s) when democracy was not in fashion anywhere else.

Immigration
A core argument of exceptionalism is that America is unusually attractive to immigrants from all parts of the world for two reasons. First, advocates of American exceptionalism say that economic and political opportunities are unusually high, and that the United States possesses a high degree of social mobility. Since its founding, immigrants such as Andrew Carnegie and Carl Schurz have risen to the top layers of the economic and political system. The "American Dream" describes the perceived abundance of opportunities in the American system. Second, unlike many old world countries, immigrants can become Americans by adopting American culture and values.
Critics point out that America is now hardly unique in its appeal to immigrants, and that many countries like Australia, Canada and New Zealand are at least as popular and welcoming to immigrants.[5][6]

Cold War
American exceptionalism during the Cold War was often cast by the mass media as the American Way of Life personifying liberty engaged in a battle with tyranny as represented by communism. These attributions made use of the residual sentiment that had originally formed to differentiate the United States from the 19th century European powers and had been applied multiple times in multiple contexts before it was used to differentiate capitalist democracies (with the United States as a leader) from communist nations. American exceptionalism during this period also manifested itself in an anti-internationalist streak as part of which the United States rejected participation in international institutions which it could not control. The Bricker Amendment movement, for instance, rejected the adoption of international human rights conventions by the United States.

Aspects of arguments

Republican ethos and ideas about nationhood
Proponents of American exceptionalism argue that the United States is exceptional in that it was founded on a set of republican ideals, rather than on a common heritage, ethnicity, or ruling elite. In the formulation of President Abraham Lincoln in his Gettysburg Address, America is a nation "conceived in liberty, and dedicated to the proposition that all men are created equal". In this view, America is inextricably connected with liberty and equality. It is claimed that America has often acted to promote these ideals abroad, most notably in the First and Second World Wars, in the Cold War and today in the Iraq War. Critics argue that American policy in these conflicts was more motivated by economic or military self-interest than an actual desire to spread these ideals, and point to an extensive history of using South American nations as slave economies, suppressing democratic revolutions against US-backed dictators when necessary.
The United States' policies have been characterized since their inception by a system of federalism and checks and balances, which were designed to prevent any person, faction, region, or government organ from becoming too powerful. Some American exceptionalists argue that this system and the accompanying distrust of concentrated power prevent the United States from suffering a "tyranny of the majority", and also that it allows citizens to live in a locality whose laws reflect that citizen's values. A consequence of this political system is that laws can vary greatly across the country. Critics of American exceptionalism maintain that this system merely replaces the power of the national majority over states with power by the states over local entities. On balance, the American political system arguably allows more local dominance but prevents more national dominance than does a more unitary system.

Frontier spirit
Proponents of American exceptionalism often claim that the "American spirit" or the "American identity" was created at the frontier (following Frederick Jackson Turner's Frontier Thesis), where rugged and untamed conditions gave birth to American national vitality. However, this 'frontier spirit' was not unique to the United States - other nations such as Canada, South Africa, Argentina and Australia had long frontiers that were similarly settled by pioneers, shaping their national psyches. In fact, all of the British Imperial domains involved pioneering work. Although each nation had slightly different frontier experiences (for example, in Australia "mateship" and working together was valued more than individualism was in the United States), the characteristics arising from the British attempting to "tame" a wild and often hostile landscape against the will of the original population remained common to many such nations. Of course, at the limit, all of mankind has been involved, at one time or another, in extending the boundaries of its territory.

Mobility
For most of its history, especially from the mid-19th to early 20th centuries, the United States was exceptional in its occupational and physical mobility. America is known as the "land of opportunity" and in this sense, it prided and promoted itself on providing individuals with the opportunity to escape from the contexts of their class and family background. Examples of this social mobility include:
Occupational - children could easily choose careers which were not based upon their parents' choices.
Physical - geographical location was not seen as static, and citizens often relocated freely over long distances without barrier.
Status - As in most countries, family standing and riches were often a means to remain in a higher social circle. America was notably unusual due to an accepted wisdom that anyone - from impoverished immigrants upwards - who worked hard could aspire to similar standing, regardless of circumstances of birth. This aspiration is commonly called living the American dream. Birth circumstances were not taken as a social barrier to the upper echelons or to high political status in American culture. This stood in contrast to other countries where many higher offices were socially determined, and usually hard to enter without being born into the suitable social group.
The United States still has class mobility; however, a 2005 study showed that children born into poverty in Europe and Canada were more likely to find prosperity than children born into poverty in the United States.[7]

American Revolution
The American Revolutionary War is the claimed ideological territory of "exceptionalists". The intellectuals of the Revolution, such as Thomas Paine and Thomas Jefferson, arguably shaped America into a nation fundamentally different from its European ancestry, creating modern constitutional republicanism as we know it. Others counter that there is nothing unique about the revolution — the English "Glorious Revolution" was nearly a century prior to the American revolution and led to constitutional monarchy. The French Revolution also arguably led to a form of modern democracy.

Eurocentrism is the practice of viewing the world from a European perspective, with an implied belief, whether conscious or subconscious, in the preeminence of European (and, more generally, of Western) culture. The term Eurocentrism implies criticism of this privileging of European concerns and values at the expense of non-Europeans, and is not used by those who consider the preeminence factually justified.
The Eurocentrism prevalent in international affairs in the 19th to 20th centuries has its historical roots in European colonialism and imperialism from the Early Modern period (16th to 18th centuries). Many international standards (such as the Prime Meridian, the Dionysian Era or the worldwide spread of the Latin alphabet) have their roots in this period.
In both Europe and North America, the heyday of Eurocentrism was in the 19th century; today it is much less prevalent, owing to developments in popular culture and teaching.[1]
Alternatively, Eurocentric and Eurocentrist are occasionally used in British political discourse to describe supporters of European integration and the European Union, in other words as an antonym of Eurosceptic.

Origins
Further information: The European miracle, Age of Exploration, Colonialism, and Western World
Further information: Spanish Golden Age and Britain's Imperial Century
Early Eurocentrism can be traced to the European Renaissance, in which the revival of learning based on classical sources was focused on the ancient Greek and Roman civilizations, due to their being a significant source of contemporary European civilization.
The effects of these assumptions of European superiority increased during the period of European imperialism, which started slowly in the 15th century, accelerated in the 16th, 17th and 18th centuries, and reached its zenith in the 19th century. The progressively mechanised character of European culture was contrasted with traditional hunting, farming and herding societies in many of the areas of the world being newly conquered and colonised by Europeans, such as the Americas, most of Africa, and later the Pacific and Australasia. Even the complex civilizations of Arabia, Persia, India, China, Mexico, Peru and Japan were counted as underdeveloped when compared to Europe, and were often characterised as static.[citation needed] Many European writers of this time construed the history of Europe as paradigmatic for the rest of the world. Other cultures were identified as having reached a stage through which Europe itself had already passed – primitive hunter-gatherer; farming; early civilisation; feudalism; and modern liberal-capitalism. Only Europe was considered to have achieved the last stage.
For some writers, such as Karl Marx, the centrality of Europe to an understanding of world history did not imply any innate European superiority; Marx nevertheless assumed that Europe provided a model for the world as a whole. Others looked forward to the expansion of modernity throughout the world through trade, imperialism or both. By the late 19th century, the theory that European achievements arose from innate racial superiority became widespread, justifying race-based slavery, genocide, colonisation and other forms of political and economic exploitation.
The colonising period involved the widespread settlement of parts of the Americas and Australasia with European people, and the establishment of outposts and colonial administrations in parts of Asia and Africa. As a result, the majority populations of the Americas, Australia and New Zealand typically trace their ancestry to Europe. Despite their geographic isolation from Europe, such countries retain many European cultural traditions and teach a Eurocentric history.

The European Miracle
The European miracle theory, developed initially by Eric Jones in 1987,[2] describes his position that Europe was more advanced and progressive than all other civilizations prior to the year 1492, allowing it to develop capitalism, reach the New World first, and dominate world trade and politics. The theory is criticised for Eurocentrism, with critics pointing out that progress in the rest of the world paralleled Europe before 1492, and that more than half of the world's population living in urban settlements (over 10,000 people) lived in China during the fourteenth and fifteenth centuries. Jones' work can be seen as building on a Eurocentric view of earlier European thinkers, with Max Weber's idea of the Protestant work ethic and Georg Hegel's Spirit providing a rationale for claiming that the European mind (and European religion) is inherently superior to that of all other continents. Immanuel Wallerstein's ideas of a world-economy and world-system originating in Europe also appear to come through in European miracle theory.

Examples of Eurocentrism

[Image: world map showing Europe vertically centered]

Cartography
The division of the landmass of Eurasia into the separate continents of Asia and Europe is an anomaly with no basis in physical geography. An alternative view is that Eurasia is a single continent, one of six continents in total. This view is held by some geographers[who?] and is preferred[citation needed] in Russia (which spans Asia and Europe). The separation is maintained for historical and cultural reasons.
Arno Peters highlighted the political implications of map design by promoting the Gall-Peters projection as a contrasting world map to the Mercator projection, a commonly used world map projection at the time. The Mercator projection distorts areas further from the equator, making Europe and North America appear disproportionately large compared to similarly sized areas closer to the equator, such as Africa, Central America and Australia. Alaska, for example, is presented as being similar or even slightly larger in size than Brazil, when Brazil's area is actually almost five times that of Alaska.
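To see roughly why the map misleads here, note that the Mercator projection inflates local area by approximately the square of the secant of the latitude, which is about fivefold at Alaskan latitudes and almost nothing near the equator. The following is a minimal sketch of that arithmetic, not part of the original text; the area figures and representative latitudes are approximate assumptions chosen only for illustration.

import math

# Rough illustration of Mercator area distortion; all figures are approximate assumptions.
# Mercator stretches both east-west and north-south scale by sec(latitude),
# so local area is inflated by roughly sec^2(latitude).

def mercator_area_inflation(lat_deg):
    """Approximate factor by which the Mercator projection inflates area at a latitude."""
    return 1.0 / math.cos(math.radians(lat_deg)) ** 2

# Approximate true areas (millions of square kilometres) and representative central latitudes.
alaska_area, alaska_lat = 1.72, 64.0
brazil_area, brazil_lat = 8.51, -10.0

alaska_apparent = alaska_area * mercator_area_inflation(alaska_lat)
brazil_apparent = brazil_area * mercator_area_inflation(brazil_lat)

print(f"True area ratio (Brazil/Alaska): {brazil_area / alaska_area:.1f}")           # about 5
print(f"Apparent ratio on a Mercator map: {brazil_apparent / alaska_apparent:.1f}")  # about 1

Under these assumed figures the sketch prints a true ratio of about five and an apparent ratio of about one, which matches the distortion described above.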
World maps based on the prime meridian, which places Greenwich, London at the centre, have been in use since 1851. Various other prime meridians were in use during the Age of Exploration. The current prime meridian has the advantage that it places the International Date Line in the Pacific, inconveniencing the smallest number of people.
A residual effect of the European origin of the English language is terms like 'Middle East', which describes an area slightly east of Europe, and the 'Far East'. Similarly, the 'Western World' or 'Western civilisation' are terms that group culturally similar countries: not only Central and Western Europe, but also the former European colonies of North America, Australia and New Zealand.

Education
Eurocentrism was once embedded in the study of Greek classic literature.
In the 1960s a reaction against the priority given to a canon of "Dead White European Males" provided a slogan which neatly sums up the charge of Eurocentrism (alongside other important -centrisms)[citation needed].
Garry Wills, the journalist and professor of American Studies at Northwestern University, writes that Eurocentrism created a false picture of the classics themselves.[3]
Since the 1970s, the indebtedness of Classical Greece to "the Orient" (notably the Neo-Assyrian Empire) at the time of its formation during the Early Iron Age has been given more prominence.[4]

World languages
As a direct consequence of the "European miracle" and the colonial empires, languages of Europe are over-represented among the current world languages: of the eight languages generally considered "world languages", five are of European origin (English, Spanish, Russian, Portuguese, French), alongside Mandarin Chinese, Arabic and Hindi-Urdu. The asymmetry is even more pronounced in the distribution of the Nobel Prize in Literature, a clear majority of which has gone to authors writing in languages of European origin. In the period 1901 to 1950, Rabindranath Tagore was the only author writing in a non-European language (Bengali) who received a Nobel Prize (in 1913). In the period 1951 to 2000, there were six laureates writing in non-European languages.
Of the six official languages of the UN (Chinese, English, French, Russian, Spanish, and Arabic), four are European. All six directly reflect historical imperialism: the European Spanish Empire (16th to 17th centuries), British Empire (16th to 20th centuries), Russian Empire (19th century) and French Empire (19th to 20th century), besides the Chinese Empire (3rd century BC to 19th century) and the Arab Caliphates (7th to 13th centuries).

By region

Eurocentrism in Africa
During the colonisation of Africa by European nations, Eurocentric systems of second-class citizenship were often set up in order to give Europeans political power far in excess of their numbers in those nations that had substantial European populations, such as Rhodesia (now Zimbabwe) and South Africa. The Congo Free State was claimed as the personal property of King Leopold II of Belgium, with subsequent inhumane treatment and forced labour of the native population.
Eurocentrism has been said to deny Africans agency in the creation of their own history. For example, until recently, in Western scholarship cities such as Dakar, Banjul (Bathhurst), Abidjan, Conakry and others were assumed to be creations of Western colonisers. However, though they were transformed in both negative and positive ways by colonisation, these cities predate colonisation as did many of the economic and institutional patterns found in Africa.[5]

Eurocentrism in Australia
Main article: White Australia policy
The Immigration Restriction Act 1901 was one of the first Acts of Australia's Federal Parliament after nationalism led to the country's independence from European (British) rule. The Act placed "certain restrictions on immigration and... for the removal... of prohibited immigrants". Edmund Barton, the prime minister, argued in support of the Bill with the following statement: "The doctrine of the equality of man was never intended to apply to the equality of the Englishman and the Chinaman." The White Australia Policy was gradually dismantled after World War Two, ending in the 1970s.
Less overtly Eurocentric was the view that the Indigenous people of Australia did not require any compensation or consideration when their land was claimed as a British colony, as their use of the land did not fit with recognised (European) views of land ownership. They were later specifically denied citizenship in the Australian constitution, and traditional European views of appropriate lifestyles and attitudes led to a policy of cultural assimilation, designed to eradicate the race through measures including the forced removal of children.

Eurocentrism in Argentina
In Argentina an extensive racist ideology has been built on the notion of European supremacy.[6] This ideology advances the idea that Argentina is a country populated by European immigrants "bajados de los barcos" (straight off the boat), frequently referred to as "our grandfathers", who founded a special type of "white" and European society that is not Latin-American.[7] In addition, this ideology holds that cultural influences from other communities, such as the Aborigines, Africans, Latin-Americans, or Asians, are not relevant and are even undesirable. White-European racism in Argentina shares similarities with the White Australia policy practiced at the beginning of the 20th century.
White-European racism in Argentina has a history of government participation. The ideology even has a legal foundation that was set forth in Article 25 of the National Constitution sponsored by Juan B. Alberdi. The article establishes a difference between European immigration (which should be encouraged) and non-European immigration.
Article 25: The Federal Government will encourage European immigration; and will not restrict, limit, nor tax the entry of any foreigner into the territory of Argentina who comes with the goal of working the land, bettering industry, or introducing or teaching sciences or the arts.

Eurocentrism in the United States
Further information: Culture wars and Afrocentrism
During the 17th century, British settlers and immigrants from across Europe brought their Eurocentrism with them to America. After the American Revolution, the colonists' Eurocentrism morphed into the Americentrism that was epitomised in the zeitgeist of the Jacksonian Era and Manifest Destiny.

Decline
Even in the 19th century, anti-colonial movements had developed claims about national traditions and values that were set against those of Europe. In some cases, as with China, where local ideology was even more exclusionist than the Eurocentric one, Westernisation did not overwhelm long-established Chinese attitudes to its own cultural centrality.[8]
In contrast, countries such as Australia defined their nationhood entirely in terms of an overseas extension of European history. It was, until recently, thought to have had no history or serious culture before colonisation. The history of the native inhabitants was subsumed by the Western disciplines of ethnology and archaeology. In Central America and South America a merger of immigrant and native histories was constructed. Nationalist movements appropriated the history of native civilizations such as the Mayans and Incas, to construct models of cultural identity that claimed a fusion between immigrant and native identity.
At the same time, the intellectual traditions of Eastern cultures were becoming more widely known in the West, mediated by figures such as Rabindranath Tagore. By the early 20th century some historians such as Arnold J. Toynbee were attempting to construct multi-focal models of world civilizations.
Since the end of World War II, the former worldwide dominance of European culture has waned drastically (see Decolonization). The change has been most drastic in the USA, triggered by the civil rights movement of the 1950s and 1960s and perpetuated by the political correctness of the 1970s and 1980s. Today, Eurocentrism remains a topic in the US "culture wars", notably when juxtaposed to Afrocentrism, but its prominence is limited compared to topics of religion or social issues.

Andre Gunder Frank asks us to ReOrient our views away from Eurocentrism--to see the rise of the West as a mere blip in what was, and is again becoming, an Asia-centered world. In a bold challenge to received historiography and social theory he turns on its head the world according to Marx, Weber, and other theorists, including Polanyi, Rostow, Braudel, and Wallerstein. Frank explains the Rise of the West in world economic and demographic terms that relate it in a single historical sweep to the decline of the East around 1800. European states, he says, used the silver extracted from the American colonies to buy entry into an expanding Asian market that already flourished in the global economy. Resorting to import substitution and export promotion in the world market, they became Newly Industrializing Economies and tipped the global economic balance to the West. That is precisely what East Asia is doing today, Frank points out, to recover its traditional dominance. As a result, the "center" of the world economy is once again moving to the "Middle Kingdom" of China. Anyone interested in Asia, in world systems and world economic and social history, in international relations, and in comparative area studies, will have to take into account Frank's exciting reassessment of our global economic past and future.
For centuries, the Asians (Chinese, Indians, Muslims, and others) have been bystanders in world history. Now they are ready to become co-drivers. Asians have finally understood, absorbed, and implemented Western best practices in many areas: from free-market economics to modern science and technology, from meritocracy to rule of law. They have also become innovative in their own way, creating new patterns of cooperation not seen in the West. Will the West resist the rise of Asia? The good news is that Asia wants to replicate, not dominate, the West. For a happy outcome to emerge, the West must gracefully give up its domination of global institutions, from the IMF to the World Bank, from the G7 to the UN Security Council. History teaches that tensions and conflicts are more likely when new powers emerge. This, too, may happen. But they can be avoided if the world accepts the key principles for a new global partnership spelled out in The New Asian Hemisphere.
The idea that India is a poor country is a relatively recent one. Historically, South Asia was always famous as the richest region of the globe. Ever since Alexander the Great first penetrated the Hindu Kush, Europeans fantasized about the wealth of these lands, where the Greek geographers said that gold was dug up by gigantic ants and guarded by griffins, and where precious jewels were said to lie scattered on the ground like dust.
At their height during the 17th century, the subcontinent's fabled Mughal emperors were rivaled only by their Ming counterparts in China. For their contemporaries in distant Europe, they were potent symbols of power and wealth. In Milton's Paradise Lost, for example, the great Mughal cities of Agra and Lahore are revealed to Adam after the Fall as future wonders of God's creation. This was hardly an overstatement. By the 17th century, Lahore had grown even larger and richer than Constantinople and, with its two million inhabitants, dwarfed both London and Paris.
What changed was the advent of European colonialism. Following Vasco da Gama's discovery of the sea route to the East in 1498, European colonial traders — first the Portuguese, then the Dutch and finally the British — slowly wrecked the old trading network and imposed with their cannons and caravels a Western imperial system of command economics. It was only at the very end of the 18th century, after the East India Company began to cash in on the Mughal Empire's riches, that Europe had for the first time in history a favorable balance of trade with Asia. The era of Indian economic decline had begun, and it was precipitous. In 1600, when the East India Company was founded, Britain was generating 1.8% of the world's GDP, while India was producing 22.5%. By 1870, at the peak of the Raj, Britain was generating 9.1%, while India had been reduced for the first time to the epitome of a Third World nation, a symbol across the globe of famine, poverty and deprivation.
In hindsight, what is happening today with the rise of India and China is not some miraculous novelty — as it is usually depicted in the Western press — so much as a return to the traditional pattern of global trade in the medieval and ancient world, where gold drained from West to East in payment for silks and spices and all manner of luxuries undreamed of in the relatively primitive capitals of Europe.
It is worth remembering this as India aspires to superpower status. Economic futurologists all agree that China and India during the 21st century will come to dominate the global economy. Various intelligence agencies estimate that China will overtake the U.S. between 2030 and 2040 and India will overtake the U.S. by roughly 2050, as measured in dollar terms. Measured by purchasing-power parity, India is already on the verge of overtaking Japan to become the third largest economy in the world.
Looking back at the role Europeans have played in South Asia until their departure in August 1947, there is certainly much that the West can be said to have contributed to Indian life: the Portuguese brought the chili pepper, while the British brought that other essential staple, tea — as well as the arguably more important innovations including democracy and the rule of law, railways, cricket and the English language. All contributed to India's economic resurrection. But the British should keep their nostalgia and self-satisfaction surrounding the colonial period within strict limits. For all the irrigation projects, the great engineering achievements and the famous imperviousness to bribes of the officers of the Indian Civil Service, the Raj nevertheless presided over the destruction of India's political, cultural and artistic self-confidence as well as the impoverishment of the Indian economy.
Today, things are slowly returning to historical norms. Last year the richest man in the U.K. was for the first time an ethnic Indian, Lakshmi Mittal, and Britain's largest steel manufacturer, Corus, has been bought by an Indian company, Tata. Extraordinary as it is, the rise of India and China is nothing more than a return to the ancient equilibrium of world trade, with Europeans no longer appearing as gun-toting, gunboat-riding colonial masters but instead reverting to their traditional role: that of eager consumers of the much celebrated manufactures, luxuries and services of the East.
William Dalrymple's latest book, The Last Mughal: The Fall of a Dynasty, Delhi 1857, has just been awarded the Duff Cooper Prize for History and Biography




Afrocentrism, or Afrocentricity, is a world view that emphasizes the importance of African people in culture, philosophy, and history.[1] Fundamental to Afrocentrism is the assumption that approaching knowledge from a Eurocentrist perspective, as well as certain mainstream assumptions in the application of information in the West, has led to injustices and also to inadequacies in meeting the needs of Black Africans and the peoples of the African diaspora.[citation needed] The Afrocentrist paradigm seeks to discover and also reinterpret (or reinvent) information through African eyes.[citation needed]
As an ideology and scholarly and social movement, the Afrocentrist paradigm has its beginnings in activism among Black intellectuals, political figures and historians. However, as part of a broader, multicultural movement, it is in use today beyond these contexts across a number of disciplines, among them religion, education, sociology, psychiatry, medicine and public health, and in the delivery of government and social services.[citation needed] Molefi Kete Asante describes "Afrocentricity" as a "systematic nationalism"; however, it focuses more on one's consciousness than on changing the black nation. The term no longer represents a "coherent political ideology" but appears instead to be a set of tactics in the struggle for cultural survival.[2]
According to its critics, Afrocentrism is a mythology that exaggerates the contributions of African peoples to culture, philosophy, and history. They point to disproved claims of Egyptians being black.[citation needed] Some claim that it is an attempt to assert black superiority over other races.[citation needed] Also often pointed out are the alleged pseudo-scientific methods used by Afrocentrists, who sometimes invent evidence or rearrange dates and even geography to fit their theories. Finally, critics of Afrocentrism have often found themselves accused of racism, reducing the number of scholars prepared to respond to what they feel is not science to begin with. A famous exception is Mary Lefkowitz.

History

[Image: a 1911 copy of the NAACP journal The Crisis depicting an Afrocentric artist's interpretation of "Ra-Maat-Neb, one of the kings of the Upper Nile"]
Afrocentrists commonly contend that Eurocentrism has led to the neglect or denial of the contributions of African people and focused instead on a generally European-centered model of world civilization and history. Therefore, Afrocentrism is a paradigm shift from a European-centered history to an African-centered history. More broadly, Afrocentrism is concerned with distinguishing African achievements apart from the influence of European peoples.[3] Some Western mainstream scholars have assessed some Afrocentric ideas as pseudohistorical, especially claims regarding Ancient Egypt as contributing directly to the development of Greek and Western culture.[4][5] Contemporary Afrocentrists may view the movement as multicultural rather than ethnocentric.[6] According to US professor Victor Oguejiofor Okafor, concepts of Afrocentricity lie at the core of disciplines such as African American studies.[7]
Modern afrocentricity has its origins in the work of African and African diaspora intellectuals in the late nineteenth and early twentieth centuries. Afrocentricity has changed over time. Aspects have been hotly debated both outside and within Afrocentric circles.
Afrocentrism developed first as an argument among leaders and intellectuals in the Western Hemisphere. It arose following social changes in the United States and Africa due both to the end of slavery and expansion of British colonialism. Wanting to further establish their own identities in freedom, African Americans left white-dominated churches to establish their own. They pulled together in communities and often migrated to restore their families. African Americans eagerly sought education. They withdrew women and children from fieldwork as much as possible, the men received the right to vote and participate in public office, and their leaders took more active public roles despite severe discrimination and segregation.[8][9]
By the late 19th century, Great Britain had become a world power. Through the century, the governments, travelers, scholars, artists and writers of Great Britain and France increasingly turned their attention to Africa and the Near East as places of exploration (both physical and intellectual), settlement, exploitation of new resources, and the playing out of their longstanding rivalries. They completed the Suez Canal in 1869, simplifying ship passage between Europe and the Far East. Based on their self-appraisal of the value of technology, industrialization, Western infrastructure, and culture, these European nations assumed their superiority to the peoples and cultures they encountered in Africa.

19th and early 20th century
Edward Wilmot Blyden, an Americo-Liberian educator and diplomat active in the pan-Africa movement, perceived a change in perception taking place among Europeans towards Africans in his 1908 book African Life and Customs, which originated as a series of articles in the Sierra Leone Weekly News.[10] In it, he proposed that Africans were beginning to be seen simply as different and not as inferior, in part because of the work of English writers such as Mary Kingsley and Lady Lugard, who traveled and studied in Africa.[10] Such an enlightened view was fundamental to refute prevailing ideas among Western peoples about African cultures and Africans.
Blyden used that standpoint to show how the traditional social, industrial, and economic life of Africans untouched by "either European or Asiatic influence", was different and complete in itself, with its own organic wholeness.[10] In a letter responding to Blyden's original series of articles, Fante journalist and politician J.E. Casely Hayford commented, "It is easy to see the men and women who walked the banks of the Nile" passing him on the streets of Kumasi.[10] Hayford suggested building a University to preserve African identity and instincts. In that university, the history chair would teach
"...universal history, with particular reference to the part Ethiopia has played in the affairs of the world. I would lay stress upon the fact that while Ramses II was dedicating temples to "the God of gods, and secondly to his own glory", the God of the Hebrews had not yet appeared unto Moses in the burning bush; that Africa was the cradle of the world's systems and philosophies, and the nursing mother of its religions. In short, that Africa has nothing to be ashamed of in its place among the nations of the earth. I would make it possible for this seat of learning to be the means of revising erroneous current ideas regarding the African; of raising him in self-respect; and of making him an efficient co-worker in the uplifting of man to nobler effort."[10]
The exchange of ideas between Blyden and Hayford embodied the fundamental concepts of Afrocentricism.
In the United States, writers and editors of publications such as The Crisis and The Journal of Negro History sought to counter the prevailing view that Sub-Saharan Africa had contributed nothing of value to human history that was not the result of incursions by Europeans and Arabs.[11] Authors in these journals theorized that Ancient Egyptian civilization was the culmination of events arising from the origin of the human race in Africa. They investigated the history of Africa from that perspective.
Afrocentrists claimed The Mis-Education of the Negro (1933) by Carter G. Woodson, an African-American historian, as one of their foundational texts. Woodson critiqued education of African Americans as "mis-education" because he held that it denigrated the black while glorifying the white. For these early Afrocentrists, the goal was to break what they saw as a vicious cycle of the reproduction of black self-abnegation. In the words of The Crisis editor W.E.B. Du Bois, the world left African Americans with a "double consciousness," and a sense of "always looking at one's self through the eyes of others, of measuring one's soul by the tape of a world that looks on in amused contempt and pity."[12]
In his early years, W.E.B. Du Bois researched West African cultures and attempted to construct a pan-Africanist value system based on West African traditions. In the 1950s Du Bois envisioned and received funding from Ghanaian president Kwame Nkrumah to produce an Encyclopedia Africana to chronicle the history and cultures of Africa. Du Bois died before being able to complete his work. Some aspects of Du Bois's approach are evident in work by Cheikh Anta Diop in the 1950s and 1960s. Diop identified a pan-African protolanguage and presented evidence that ancient Egyptians were, indeed, Africans.
Du Bois inspired a number of authors, including Drusilla Dunjee Houston. After reading his work The Negro (1915), Houston embarked upon writing her Wonderful Ethiopians of the Ancient Cushite Empire (1926). The book was a compilation of evidence related to the historic origins of Cush and Ethiopia, and assessed their influences on Greece.

1960s and 1970s

The 1960s and 1970s were times of social and political ferment which gave rise in the U.S. to the Black Nationalist, Black Power and Black Arts Movements, all driven to some degree by a rejection of Western values and an identification with "Mother Africa." Afrocentric scholars and Black youth also challenged Eurocentric ideas in academia. 1968 signaled a new era in student unrest in the U.S. when Howard University became the first major university to be shut down by student protests, in part over demands for a more Afrocentric orientation of the institution.[citation needed]
The work of Cheikh Anta Diop became very influential. In the following decades, histories related to Africa and the diaspora gradually would incorporate a more African perspective. Since that time, Afrocentrists have increasingly seen African peoples as the makers and shapers of their own histories.[1]
You have all heard of the African Personality; of African democracy, of the African way to socialism, of negritude, and so on. They are all props we have fashioned at different times to help us get on our feet again. Once we are up we shan't need any of them any more. But for the moment it is in the nature of things that we may need to counter racism with what Jean-Paul Sartre has called an anti-racist racism, to announce not just that we are as good as the next man but that we are much better.—Chinua Achebe, 1965[13]
Tejumola Olaniyan writes that Chinua Achebe easily might have included Afrocentrism in his list of "props." In this context, ethnocentric Afrocentrism was not intended to be essential or permanent. It was a consciously fashioned strategy of resistance to the Eurocentrism of the time.[12] Afrocentric scholars adopted two approaches: a deconstructive rebuttal of what they called "the whole archive of European ideological racism" and a reconstructive act of writing new self-constructed histories.[12] At a 1974 UNESCO symposium in Cairo titled "The Peopling of Ancient Egypt and the Decipherment of Meroitic Script", Cheikh Anta Diop brought together scholars of Egypt from around the world.[14]
Key texts from this period include:
The Destruction of Black Civilization (1971) by Chancellor Williams
The African Origins of Civilization: Myth or Reality (1974) by Cheikh Anta Diop
They Came Before Columbus: The African Presence in Ancient America (1976) by Ivan Van Sertima
Some Afrocentric writers focused on study of indigenous African civilizations and peoples, to emphasize African history separate from European or Arab influence. Primary among them was Chancellor Williams, whose book The Destruction of Black Civilization: Great Issues of a Race from 4500 B.C. to 2000 A.D. set out to determine a "purely African body of principles, value systems (and) philosophy of life".[15]

Pre-Columbian Africa-Americas contact theories
Main article: Pre-Columbian Africa-Americas contact theories
In the 1970s, several scholars[citation needed] advanced theories that the complex civilizations of the Americas were the result of trans-oceanic influence from the Egyptians or other African civilizations. Such a claim is the primary thesis of Ivan van Sertima's book They Came Before Columbus, published in 1976. These hyper-diffusionist writers seek to establish that the Olmec people, who built the first highly complex civilization in Mesoamerica and are considered by some to be the mother civilization for all other civilizations of Mesoamerica, were deeply influenced by Africans. Van Sertima himself contended that the Olmec civilization was a hybrid one of Africans and Native Americans. His book, published by a major publishing house, received broad exposure. While Van Sertima rejected[citation needed] the notion that his findings were driven by Afrocentrism, the book received a friendly reception among Afrocentrist proponents. His theory of pre-Columbian American-African contact has met with opposition in academia, with some Mesoamericanists charging Van Sertima with "doctoring" and twisting data to fit his conclusions, and with inventing evidence.[5] However, archaeological finds over the last two decades in South America of rock art and human skeletal remains suggest to some scholars and academicians an ancient, pre-Columbian presence of "Australoid" or "Negroid" peoples in the New World[16] who came from Australia and Melanesia earlier than the Asian ancestors of current Native American populations.[17][18]

1980s and 1990s
In the 1980s and 1990s, Afrocentrism increasingly became seen as a tool for addressing social ills and a means of grounding community efforts toward self-determination and political and economic empowerment.
In his 1992 article "Eurocentrism vs. Afrocentrism", US anthropologist Linus A. Hoskins wrote:
The vital necessity for African people to use the weapons of education and history to extricate themselves from this psychological dependency complex/syndrome as a necessary precondition for liberation. [...] If African peoples (the global majority) were to become Afrocentric (Afrocentrized), ... that would spell the ineluctable end of European global power and dominance. This is indeed the fear of Europeans. ... Afrocentrism is a state of mind, a particular subconscious mind-set that is rooted in the ancestral heritage and communal value system. [19]
Although Afrocentricity is often associated with liberal or left-wing politics, the movement is not homogeneous. During the 1980s and 1990s, sociological research became increasingly preoccupied with the problem of the "black underclass". Some Afrocentric scholars began[citation needed] to frame Afrocentric values as a remedy for what they perceived to be the social ills of poor African Americans. American educator Jawanza Kunjufu made the case that hip hop culture, rather than being creative expression of the culture, was the root of many social ills.[20] For some Afrocentrists, the contemporary problems of the ghetto stemmed not from race and class inequality, but rather from a failure to inculcate Black youth with Afrocentric values.[21]
Afrocentric ideas also received a considerable boost from the cultural shift known as postmodernism and its privileging of difference, micro-struggles, and the politics of identity. Postmodernism's general assault on the authority and universalist claims of Western "culture" is also a mainstay in many Afrocentric agendas. In turn, postmodern pluralism has begun to permeate Afrocentric thought.[12]
In the West and elsewhere, the European, in the midst of other peoples, has often propounded an exclusive view of reality; the exclusivity of this view creates a fundamental human crisis. In some cases, it has created cultures arrayed against each other or even against themselves. Afrocentricity’s response certainly is not to impose its own particularity as a universal, as Eurocentricity has often done. But hearing the voice of African American culture with all of its attendant parts is one way of creating a more sane society and one model for a more humane world. -Asante, M. K. (1988)[22]
By the end of the 1990s, the ethnocentric Afrocentrism of the '50s, '60s and '70s[citation needed] had largely fallen out of favor.[citation needed] In 1997, US cultural historian Nathan Glazer described Afrocentricity as a form of multiculturalism. He wrote that its influence ranged from sensible proposals about inclusion of more African material in school curricula to what he called senseless claims about African primacy in all major technological achievements. Glazer argued that Afrocentricity had become more important due to the failure of mainstream society to assimilate all African Americans. Anger and frustration at their continuing separation gave black Americans the impetus to reject traditions that excluded them.[23]

Contemporary
Today, Afrocentricity takes many forms, including serving as a tool for creating a more multicultural and balanced approach to the study of history and sociology. Afrocentricity contends that race still exists as a social and political construct.[21] It argues that for centuries in academia, Eurocentric ideas about history were dominant: ideas such as blacks having no civilizations, no written languages, no cultures, and no histories of any note before coming into contact with Europeans. Further, according to the views of some Afrocentrists, European history has commonly received more attention within the academic community than the history of sub-Saharan African cultures or those of the many Pacific Island peoples. Afrocentrists contend it is important to divorce the historical record from past racism. Molefi Kete Asante's book Afrocentricity (1988) argues that African-Americans should look to African cultures "as a critical corrective to a displaced agency among Africans." Less concerned about specific claims about the race of the Egyptians or other controversial topics, some Afrocentrists believe that the burden of Afrocentricity is to define and develop African agency in the midst of the cultural wars debate. By doing so, Afrocentricity can support all forms of multiculturalism.[24]
Afrocentrists argue that Afrocentricity is important for people of all ethnicities who want to understand African history and the African diaspora. For example, the Afrocentric method can be used to research African indigenous culture. Queeneth Mkabela writes in 2005 that the Afrocentric perspective provides new insights for understanding African indigenous culture, in a multicultural context. According to Mkabela and others, the Afrocentric method is a necessary part of complete scholarship and without it, the picture is incomplete, less accurate, and less objective.[25]
Contemporary Afrocentrists may view the movement as multicultural rather than ethnocentric.[6] They see Afrocentricity as one part of a larger multicultural movement that has begun to shift the focus of historical and cultural studies away from Eurocentrism.[26] Studies of African and African-diaspora cultures have shifted understanding and created a more positive acceptance of influence by African religious, linguistic and other traditions, both among scholars and the general public. For example, Lorenzo Dow Turner's seminal 1949 study of the Gullah language, a dialect spoken by black communities in Georgia and South Carolina, demonstrated that its idiosyncrasies were not simply incompetent command of English, but incorporated West African linguistic characteristics in vocabulary, grammar, sentence structure, and semantic system.[27] Likewise, religious movements such as Vodou are now less likely to be characterized as "mere superstition", but understood in terms of links to African traditions. Scholars who adopt such approaches may or may not see their work as Afrocentrist in orientation.[citation needed]
In recent years Africana Studies or Africology[1] departments at many major universities have grown out of the Afrocentric "Black Studies" departments formed in the 1970s. Rather than focusing on black topics in the African diaspora (often exclusively African American topics), these reformed departments aim to expand the field to encompass the whole of the African diaspora. They also seek to align themselves better with other university departments and to find continuity and compromise between the radical Afrocentricity of past decades and the multicultural scholarship found in many fields today.[28]

Eurocentrism
Main article: Eurocentrism
In part in response to the pressure of Afrocentrists, the study of history and sociology has changed, gradually incorporating Afrocentric ideas as part of a broader push toward multiculturalism in academia. Afrocentricity has had an impact on the disciplines of African studies, Black studies and Africana studies, as well as anthropology, sociology, and the study of history as a whole. Adisa A. Alkebulan writes that the Afrocentric idea has been a guiding paradigm in postcolonial African studies and Africana studies.[29] These changes were necessary due to the limits of Eurocentrism, especially in earlier Western scholarship. For example:
I am apt to suspect the Negroes to be naturally inferior to the Whites. There scarcely ever was a civilized nation of that complexion, nor even any individual, eminent either in action or speculation. No ingenious manufactures amongst them, no arts, no sciences. ...[In] our colonies, there are Negro slaves dispersed all over Europe, of whom none ever discovered the symptoms of ingenuity; though low people, without education, will start up amongst us, and distinguish themselves in every profession. In Jamaica, indeed, they talk of one Negro as a man of parts and learning; but it is likely he is admired for slender accomplishments, like a parrot who speaks a few words plainly. - David Hume, 18th-century Scottish historian, philosopher and essayist.[30]
By the mid-20th century many such overtly derogatory ideas had been rejected, but Afrocentrists contended that the denial, denigration and appropriation of black historical and cultural achievements made it important to study world history from a new perspective. Thus, Afrocentric scholars have worked to engage the biased methods and approaches used by some European scholars and the European-dominated intellectual community, in relation to all the people of Africa and the diaspora.
Because of bias due to Eurocentrism, scholars sometimes overlooked or denied Africans' agency in the creation of their own histories. For example, until recently Western scholars believed cities such as Dakar, Banjul (Bathurst), Abidjan, Conakry and others were created by Western colonizers. Although the cities were transformed by colonization (in both negative and positive ways), each of them predated colonization. Similarly, many of the existing economic and institutional patterns in Africa had origins well before colonialism.[31]
Lynn Meskell writes that archaeologists working in Egypt have rarely considered the local and global ramifications of their interpretations of ancient history. According to Meskell, many continue to operate under the residual effects of colonialism.[32] In 1991 Wyatt MacGaffey wrote that the bulk of scholarly work about Africa took for granted a Eurocentric distinction between "savage" and "civilized" peoples calculated to flatter the European and white audience for which it was intended. MacGaffey writes that it has only been since the 1960s that the possibility of writing any history for Africa has been generally admitted.[33]
Nathan Glazer acknowledges that Afrocentricity and multiculturalism have played a role in shaping trends in the teaching of history and the social sciences, but he also stresses that they are not the only cultural movements responsible for the move away from now increasingly obsolete forms of Eurocentrism.[23]

Definitions of Pan-African identity

The indigenous Papuans of New Guinea have Australoid and Negroid physical characteristics[34] and are considered black in some cultures, despite being genetically closer to Southeast Asians than to Africans.
Afrocentric scholars have struggled to reconcile the relationships among racial, cultural and continental identities. Some authors have used the concept of black racial identity to gather under the umbrella of "African" peoples even widely dispersed populations traditionally classified and thought of as non-Africans. These include the Dravidians of India, the people of the rest of the Indian subcontinent, and the Australoid aboriginal peoples of Australia and New Guinea.
Some Afrocentric writers also include in the African diaspora the "Negritos" of Southeast Asia (Thailand, the Philippines and Malaysia) and the Africoid aboriginal peoples of Melanesia, Micronesia, and Polynesia.
Some Afrocentrists claim that the Olmecs of Mexico were a hybrid society of Native American peoples and Africans. Mainstream historians of Mesoamerica do not share that view.[5]
Afrocentrists who adopt this approach contend that such peoples are African in a racial sense, just as the white inhabitants of modern Australia may be said to be European. In doing so, they ignore the drastically different time frames involved: whites migrated from Europe to Australia within the last 200 years, while ancient peoples moved from the African continent to India or Polynesia tens of thousands of years ago.
In 2003, geneticist Spencer Wells' findings confirmed a clear DNA link between indigenous Africans and the Australoid peoples of India, Australia and Southeast Asia, tracing the DNA of San bushmen from southeast Africa to India and on to Australia. Earlier studies showed that some of these darker-skinned ethnic groups cluster genetically more closely with neighboring East Asians than with indigenous Africans, due to millennia of intermingling with one another in relative isolation.
Afrocentrists have adopted a pan-Africanist perspective that such people of color are all "African people" or "diasporic Africans," citing physical characteristics they exhibit in common with Black Africans. Afrocentric scholar Runoko Rashidi writes that they are all part of the "global African community."
Critics of Afrocentrism note that the Southeast Asian and Melanesian peoples did not emigrate out of Africa within any time span that relates them closely to ancient African civilizations. Wells' work indicates that the ancestors of Southeast Asian and Melanesian peoples migrated out of Africa before the ancestors of modern Europeans did. The Afrocentric designation of Southeast Asians and Melanesians as "African diaspora" is also made without reference to the self-identities of the peoples in question, who may not generally consider themselves African.

Views on race
Afrocentricity contends that race exists primarily as a social and political construct. That is, that race is important because of its cultural rather than its biological significance.[21] Many Afrocentrists seek to challenge concepts such as white privilege, so-called color-blind perspectives, and race-neutral pedagogies. There are strong ties between Afrocentricity and Critical race theory.[35]
Afrocentrists hold that Africans exhibit a range of types and physical characteristics, and that such elements as wavy hair or aquiline facial features are part of a continuum of African types that do not depend on admixture with Caucasian groups. They cite work by Hiernaux [36] and Hassan [37] which they believe demonstrates that populations could vary based on microevolutionary principles (climate adaptation, drift, selection), and that such variations existed in both living and fossil Africans.[38]
Afrocentrists have condemned what they consider to be attempts at dividing African peoples into racial clusters as new versions of what they deem older, discredited theories, such as the "Hamitic Hypothesis" and the Dynastic Race Theory. These theories, they contend, attempted to identify certain African ethnicities, such as Nubians, Ethiopians and Somalis, as "Caucasoid" groups that entered Africa to bring civilization to the natives.
Afrocentrists have also charged that a double standard exists and that Western academics have made limited attempts at defining a "true white".[39] They believe that Western academics have traditionally limited the peoples they defined as "Black" Africans, but used broader "Caucasoid" or related categories to classify peoples of Egypt or certain other African ethnicities.
Afrocentric writer C.A. Diop expressed this belief in a double standard as follows in 1964:
"But it is only the most gratuitous theory which considers the Dinka, the Nouer and the Masai, among others, to be Caucasoids. What if an African ethnologist were to persist in recognising as white only the blond, blue-eyed Scandinavians, and systematically refused membership to the remaining Europeans, and Mediterraneans in particular--the French, Italians, Greek, Spanish, and Portuguese? Just as the inhabitants of Scandinavia and the Mediterranean countries must be considered as two extreme poles of the same anthropological reality, so should the Negroes of East and West Africa be considered as the two extremes in the reality of the Negro world. To say that a Shillouk, a Dinka, or a Nouer is a Caucasoid is for an African as devoid of sense and scientific interest as would be, to a European, an attitude which maintained that a Greek or a Latin were not of the same race."[40]
Afrocentrists believe that European scholars define Black people as narrowly as possible, labeling as the extreme "true Negro" only those peoples living south of the Sahara. They add that all Africans who do not meet the definition of this extreme are allocated to "Caucasoid" groupings, including Ethiopians, Somalis, Egyptians and Nubians (C. G. Seligman's Races of Africa, 1966)[41]. Afrocentrists also believe strongly in the work of certain anthropologists who have suggested that there is little evidence to support that these populations are closely related to "Caucasoids" of Europe and western Asia.[36]
For example, French historian Jean Vercoutter has claimed that selective grouping was common among scholars assessing the ethnicity of the ancient Egyptians. He has said that workers routinely classified Negroid remains as "Mediterranean", even though archaeological workers found such remains in substantial numbers with ancient artifacts (Vercoutter 1978, The Peopling of Ancient Egypt).[42]
More recent work by geneticists, however, provides evidence that Eurasians likely are descended from populations who migrated north and east out of the Horn of Africa. Hence, certain shared genetic and phenotypical characteristics among Eurasians and Northeast African groups such as Ethiopians and Somalis.[43] Some phenotypical similarities among Somalis and Eurasians exist at a higher structural level, such as orthognathism[44], tooth size[45], keen facial features and skull shape and size. According to anthropologist Loring Brace:
When the nonadaptive aspects of craniofacial configuration are the basis for assessment, the Somalis cluster with Europeans before showing a tie with the people of West Africa or the Congo Basin.[46]
Genetic analyses of male DNA in the 21st century have also indicated that Somalis carry considerable E1b1b, a Y chromosome haplogroup characteristic of Northeast African, Berber, Arab, Jewish, Mediterranean and Balkan populations.[47]
Afrocentrists argue against the classification of people they deem indigenous, "Black" Africans as Caucasoid and instead advocate use of the term Africoid to encompass the varying phenotypes of both Negroid and proto-Caucasoid African populations, as well as phenotypically Negroid Australasian populations. They contend that it is more appropriate to name Africans in a manner that reflects their geographical origin, just as Asians are named Mongoloids and Europeans Caucasians.

Role of Ancient Egypt
See also: Race of Ancient Egyptians
Several Afrocentrists have said that important cultural characteristics of ancient Egypt were indigenous to Africa and that these features were present in other African civilizations.[48] Critical of much of mainstream Egyptology, Afrocentrists wrote that the study of ancient Egyptian culture had been artificially disconnected from other early African civilizations, such as Kerma and the Meroitic civilizations of Nubia — particularly in light of the fact that archaeological evidence clearly indicated a confluence among this cultural triad.[49] This perspective, championed by the Senegalese scholar Cheikh Anta Diop in the 1960s, was known formally as the Cultural Unity Theory. These related theories had proponents in the 1980s outside Afrocentric circles, among them Bruce Williams of the Oriental Institute, Chicago.[50]
Mainstream archaeologists and Egyptologists such as Frank J. Yurco and Fekri Hassan have stated that ancient Egyptian peoples comprised a mix of North and sub-Saharan African peoples that have typified Egyptians ever since. They said that the Egyptian people were generally coextensive with other Africans in the Nile valley.[51]
Early Afrocentrists pointed to the work in the 1960s of Czech anthropologist Eugene Strouhal, which described physical, cultural and material links of ancient Egypt with the peoples of Nubia and the Sahara (Strouhal 1968, 1971, 'Evidence of the early penetration of Negroes into prehistoric Egypt'),[52] and to the analyses of Falkenburger (1947), which show a clear Negroid element, especially in the southern population, sometimes predominating in the predynastic period.[53] In 1993 C. Loring Brace et al. wrote: "The attempt to force the Egyptians into either a 'black' or a 'white' category has no biological justification. Our data show only that Egypt clearly had biological ties to the north and to the south, but that it was intermediate between populations to the east and the west, and that Egypt was basically Egyptian from the Neolithic right on up to historic times."[54]
Research by archaeologist Bruce Williams argued for Nubian cultural influence on the formation of the Egyptian kingship.[55]
Egyptians themselves called for the inclusion of Egypt in Du Bois's early drafts of the Encyclopedia Africana. The director of the Egyptian Cultural Center in Accra wrote to praise Du Bois for having "maintained faith in the African character of Egypt's achievement," urging that the Encyclopedia Africana keep Egypt within its Afrocentric focus.[56]
Afrocentrists have claimed a growing scholarly acceptance of Egypt as an African culture with its own unique elements. They cite mainstream scholars like Bruce Trigger, who in 1978 decried approaches of the past as 'marred by a confusion of race, language, and culture and by an accompanying racism',[57] and Egyptologist Frank Yurco, who in the late 1990s viewed the Egyptians, Nubians, Ethiopians, Somalis, and others as one localized Nile valley population that need not be artificially clustered into racial percentages.[58] Afrocentrists have cited 1990s mainstream studies that confirmed the varied physical character of the Egyptian people and the influence on them of other peoples of the Nile (Nilotic influence).[59]
Afrocentrists also claimed that the ancient Egyptians made significant contributions to ancient Greece[60] and Rome during their formative periods. They also claimed that Egyptians were black, as discussed above.
This early Afrocentric view is at odds with the conclusions of mid-20th-century Eurocentric scholars such as British historian Arnold Toynbee, and harkens back to the findings of earlier historians. Toynbee believed the ancient Egyptian cultural sphere had died out without leaving a successor. He regarded as "myth" the idea that Egypt was the "origin of Western civilization."
There are accounts in the historical record dating back several centuries, in which writers noted Egypt's contributions to Mediterranean civilizations.[61]

Criticism

Critics contend that some Afrocentric historical research lacks merit and that it essentially replaces one form of racism with another, rather than attempting to arrive at the truth. Among these critics is Mary Lefkowitz, who contends that Afrocentric historical claims are grounded in identity politics and myth rather than sound scholarship.[62] Lefkowitz rejects George G. M. James's views on Egypt, on the grounds that his sources predated the deciphering of Egyptian hieroglyphs and that his theories were overturned by later findings. She contends that actual ancient Egyptian texts showed little similarity to Greek philosophy and that Bernal underestimated the distinctiveness of Greek intellectual culture. Lefkowitz has criticized Afrocentricity as "an excuse to teach myth as history."[62] Her most recent book, History Lesson (Yale University Press, April 2008), is a personal account of the way she was attacked for simply stating the facts. For example, her pointing out that Aristotle could not have stolen his ideas from the great Library at Alexandria because the library was not built until after his death was countered by Afrocentrists not by disproving her statements but by accusing her of being racist.
Other critics of the Afrocentric approach in the study of history include the late Egyptologist Frank Yurco[63], and African-American history professor Clarence E. Walker who has stated that Afrocentrism is: "a mythology that is racist, reactionary, essentially therapeutic and is eurocentrism in black face."[64]
Cain Hope Felder, a Professor of New Testament Language and Literature at Howard University and supporter of Afrocentric ideas, has warned Afrocentrists to avoid certain pitfalls.[65] These include:
Demonizing categorically all white people, without careful differentiation between persons of goodwill and those who consciously perpetuate racism.
Adopting multiculturalism as a curricular alternative that eliminates, marginalizes, or vilifies European heritage to the point that Europe epitomizes all the evil in the world.
Making gross over-generalizations and using factually incorrect material, which amounts to bad history and bad scholarship.[65]
Nathan Glazer writes that although Afrocentricity can mean many things, the popular press has generally given most attention to its most outlandish theories.[23] Glazer supports many of the findings in Mary Lefkowitz's book Not Out of Africa but also recognizes that Afrocentricity may, at times, take the form of legitimate and relevant scholarship.[23]
Often, the work that critics of Afrocentricity call "bad scholarship" is also rejected by Afrocentrists. Adisa A. Alkebulan writes that critics have used claims of what she calls "a few non-Afrocentrists" as "an indictment against Afrocentricity."[29]
Robert Todd Carroll in The Skeptic's Dictionary refers to Afrocentrism as "pseudohistorical", and argues that the prime goal of Afrocentrism is to encourage black nationalism as well as ethnic pride in order to effectively combat the destructive consequences of cultural and universal racism.[66][67]
Gaëlle Sévenier, Final paper - Freedom of expression


American Cultural Imperialism: Gift or Threat?
Cultural imperialism is a very old phenomenon. For centuries, countries have imposed their cultural values on other nations. Today, as a global economic and political power, the United States is inevitably intruding into the cultures of other countries of the world. Some believe that the American spread of culture is beneficial to the entire planet, while others consider this cultural imperialism a threat.
Social context

During the past five hundred years, European countries colonized southern countries in the name of "spreading" Christian civilization to the "primitive" people in other parts of the world, as well as securing resources and workers for economic production. When cultural imperialism occurs, it is said to be for the good of the conquered civilization, to spread universal values, rights and standards of development. The United States is not the only cultural imperialist today, but the spread of American values around the world is at the leading edge of a wave of Western goods and consumerist culture. The phenomenon now takes a different form, as it is far more subtle and less brutal than European colonization: it is being done in the name of freedom of the market and freedom of expression.
The propagation of American culture seen as unavoidable and beneficial to the world

Through the media, the United States is spreading some universal values and human rights. To some authoritarian countries, it spreads ideas of freedom of expression, democracy, equality, and rights - concepts that should be, in some people's opinion, universal. Universality of some values may be possible - human nature is not that different from one culture to another, and many values are shared across cultures. However, the majority of the world's cultures undervalue women and children in practice if not in ethos. Finally, the majority of the world's people, regardless of the names given to governmental regimes by those with authority, continue to live without real participatory democracy. American ideals of equality, freedom, and democracy now available in the world may give more freedom to women, children, and minorities in all cultures, and will promote anti-racist, anti-sexist or anti-authoritarian messages and regimes.

Irving Kristol, in "The Emerging American Imperialism," presents imperialism as an unintended consequence of market expansion rather than a conscious goal: "one of these days, the American people are going to awaken to the fact that we have become an imperial nation." But he later argues that it is not something unintentional, and that in fact many nations have facilitated and welcomed American cultural values along with American products and ways of life: "it happened because the world wanted it to happen." To him, the American missionaries live in Hollywood, which makes this different from the old European imperialism, which was based on bureaucratic colonial governments and resource extraction. Christopher Dunkley, in "American Cultural Imperialism: No Bad Thing," says that "America provides some of the best available anywhere in the world." One of the reasons that American series are so successful in the world is that "thanks to its immigration policies, the US has a population with a mixture of Anglo Saxons, Scandinavians, Asians and so on that provides American broadcasters with a domestic audience which is, to all intents and purposes, international. Please the American audience and you can guarantee you will please the world."

Some theories of globalization see, instead of cultural imperialism, the movement of products and ideas across national and cultural borders in ways that produce real changes even in cultures like that of the United States. In 1994, MacQuail wrote in his book Mass Communication Theory that not only was the United States influencing other cultures, but other cultures were also influencing the US: "While one-way flow may be evident in terms of information flows on an information theory quantitative estimate, the reality is that as media technology and economies become more intertwined, this seemingly one-way flow reverses itself into a two-way flow in which what sells abroad influences what Americans see at home." In that perspective, we can talk about an interpenetration of cultures instead of an invasion of American culture across the world.

American cultural imperialism as a threat to other cultures

We should not forget that the differences in cultures make the world a rich and diverse place. Every individual of each country should have the right to express his or her own culture. Cultural uniformity would lead to the extinction of cultures, and that would represent a great loss. However, American culture is intruding on most cultures in the world, in many cases threatening their existence. Superman, Spider-Man, and Batman replace local heroes; Pepsi and Coke replace local fruit drinks; and "trick or treat" begins to replace Dia de los Muertos. Perhaps more insidiously, to compete with American cultural imports, local varieties and products begin to mimic American products. All the exportation of goods and information from the United States to the entire planet contributes to the exportation of American culture. Today, the spread of American culture goes through every communication medium: 90% of the information available on the Internet is in English, CNN is seen in 120 countries, and Stephen King is the number one best seller in the world. Obviously, there is already a process of cultural uniformity going on, and this can be seen as a great loss.

The rise of English as an international language of trade and politics has been one of the strongest vehicles for the transmission of American culture. The place of English in the world has crystallized in the past decades - you can read signs in English in every capital, and fluency in English has become a taken-for-granted prerequisite for upper-level positions in international trade and politics. While the forces leading to the rise of an international language differ greatly from cultural imperialism, it would be difficult to separate the two. As English becomes a global language, it becomes clear that language and culture cannot be separated. The AP National Writer Anthony Ted says that "every one from the French to the Indonesians worry that where English goes, America will follow." Scholars Nye and Owen admitted that it is the goal of the United States to have English as the international language: "It is in the economic and political interests of the United States to ensure that, if the world is moving to a common language, it be English; that if the world is becoming linked by television, radio and music, the programming be American; and that, if common values are being developed, they be values with which Americans are comfortable." According to them, not only is it intentional, but it is also a "developing reality." If this spread of values, language, and information occurs purely because of economic and political interest for the United States, then the well-being of other cultures and their freedom of expression are not taken into consideration except instrumentally - can they be bought and sold for a profit, or can they be used to political advantage - to the profit and advantage of the US.

We know that the United States is the leader in exporting its information. One problem is that the United States sells its information and media products so cheaply that it is impossible for the rest of the world to compete. American producers budget to cover their costs within the US market and can consequently sell at unbeatable prices internationally. A consequence is that it is much cheaper to buy, for example, a blockbuster Hollywood movie made in the United States than to make a less expensive local production in another country.

Orientalism refers to the way in which non-Western (specifically Asian) cultures are perceived in the West, by scholars, writers, thinkers, politicians and society at large. Orientalism first appeared during the 19th century, when many scholars felt that a better knowledge of Asia was necessary to further the West's colonial aspirations. Edward Said developed the notion of orientalism and argued that this form of thought tells more about the values and biases of western society than about the Far East. Said observed that in Orientalism, Asian men are widely depicted as feminine and weak, but still dangerous to white, western women. Interestingly, this was also how African American men were portrayed in most American books and films, in the period prior to the Civil Rights movement, as well as aboriginal men in both the US and Canada. Orientalism involves the use of generalizations and stereotypes to describe what Orientalist thinkers understood as the exotic nature of the Far East. Orientalism also portrayed Asia as a bizarre, eccentric and backward place, utterly different from the "rational," orderly and generally democratic western world.

The Protestant Ethic and the Spirit of Capitalism is a book by Max Weber, a German economist and sociologist, written in 1904 and 1905 as a series of essays. The original edition was in German and was entitled Die protestantische Ethik und der 'Geist' des Kapitalismus. An English translation was made in 1930 by Talcott Parsons, and several editions have been released since.
Weber wrote that capitalism evolved when the Protestant (particularly Calvinist) ethic influenced large numbers of people to engage in work in the secular world, developing their own enterprises and engaging in trade and the accumulation of wealth for investment. In other words, the Protestant ethic was a force behind an unplanned and uncoordinated mass action that led to the development of capitalism. This idea is also known as "the Weber thesis".


Book contents
The book is not a detailed study of Protestantism but rather an introduction to Weber's later studies of the interaction between various religious ideas and economics.
In The Protestant Ethic and the Spirit of Capitalism, Weber argues that Puritan ethics and ideas influenced the development of capitalism. Religious devotion, however, usually accompanied a rejection of worldly affairs, including the pursuit of wealth and possessions. Why was that not the case with Protestantism? Weber addresses this apparent paradox in the book.
He defines the spirit of capitalism as the ideas and esprit that favour the rational pursuit of economic gain. Weber points out that such a spirit is not limited to Western culture if one considers it as the attitude of individuals, but that such individuals — heroic entrepreneurs, as he calls them — could not by themselves establish a new economic order (capitalism). The most common tendencies were greed for profit with minimum effort and the idea that work was a curse and a burden to be avoided, especially when it exceeded what was enough for a modest life. As he wrote in his essays:
In order that a manner of life so well adapted to the peculiarities of capitalism… could come to dominate others, it had to originate somewhere, and not in isolated individuals alone, but as a way of life common to whole groups of men.
After defining the "spirit of capitalism," Weber argues that there are many reasons to find its origins in the religious ideas of the Reformation. Many observers like William Petty, Montesquieu, Henry Thomas Buckle, John Keats, and others have commented on the affinity between Protestantism and the development of commercialism.
Weber shows that certain types of Protestantism favoured rational pursuit of economic gain and that worldly activities had been given positive spiritual and moral meaning. It was not the goal of those religious ideas, but rather a byproduct — the inherent logic of those doctrines and the advice based upon them both directly and indirectly encouraged planning and self-denial in the pursuit of economic gain.
Weber traced the origins of the Protestant ethic to the Reformation. The Roman Catholic Church assured salvation to individuals who accepted the church's sacraments and submitted to clerical authority. However, the Reformation had effectively removed such assurances. From a psychological viewpoint, the average person had difficulty adjusting to this new worldview, and only the most devout believers or "religious geniuses" within Protestantism, such as Martin Luther, were able to make this adjustment, according to Weber.
In the absence of such assurances from religious authority, Weber argued that Protestants began to look for other "signs" that they were saved. Calvin and his followers taught a doctrine of double predestination, in which from the beginning God chose some people for salvation and others for damnation. The inability to influence one's own salvation presented a very difficult problem for Calvin's followers. It became an absolute duty to believe that one was chosen for salvation, and to dispel any doubt about that: lack of self-confidence was evidence of insufficient faith and a sign of damnation. So, self-confidence took the place of priestly assurance of God's grace.
Worldly success became one measure of that self-confidence. Luther made an early endorsement of Europe's emerging labor divisions. Weber identifies the applicability of Luther's conclusions, noting that a "vocation" from God was no longer limited to the clergy or church, but applied to any occupation or trade.
However, Weber saw the fulfillment of the Protestant ethic not in Lutheranism, which he dismissed as a rather servile religion, but in Calvinistic forms of Christianity. The "paradox" Weber found was, in simple terms:
According to the new Protestant religions, an individual was religiously compelled to follow a secular vocation with as much zeal as possible. A person living according to this world view was more likely to accumulate money.
The new religions (in particular, Calvinism and other more austere Protestant sects) effectively forbade wastefully using hard-earned money and identified the purchase of luxuries as a sin. Donations to an individual's church or congregation were limited due to the rejection by certain Protestant sects of icons. Finally, donation of money to the poor or to charity was generally frowned upon, as it was seen as furthering beggary. Begging was perceived as laziness, a burden on one's fellow man, and an affront to God; by not working, one failed to glorify God.
The manner in which this paradox was resolved, Weber argued, was the investment of this money, which gave an extreme boost to nascent capitalism.
By the time Weber wrote this essay, he believed that the religious underpinnings of the Protestant ethic had largely gone from society. He cited the writings of Benjamin Franklin, which emphasized frugality, hard work and thrift, but were mostly free of spiritual content. Weber also attributed the success of mass production partly to the Protestant ethic. Only after expensive luxuries were disdained, could individuals accept the uniform products, such as clothes and furniture, that industrialization offered.
In his remarkably prescient conclusion to the book, Weber lamented that the loss of the religious underpinnings of capitalism's spirit had led to a kind of involuntary servitude to mechanized industry.
The Puritan wanted to work in a calling; we are forced to do so. For when asceticism was carried out of monastic cells into everyday life, and began to dominate worldly morality, it did its part in building the tremendous cosmos of the modern economic order. This order is now bound to the technical and economic conditions of machine production which today determine the lives of all the individuals who are born into this mechanism, not only those directly concerned with economic acquisition, with irresistible force. Perhaps it will so determine them until the last ton of fossilized coal is burnt. (Page 181, 1953 Scribner's edition.)
Weber maintained that while Puritan religious ideas had had a major influence on the development of the economic order in Europe and the United States, they were not the only factor (others included rationalism in scientific pursuit, merging observation with mathematics, the science of scholarship and jurisprudence, and the rational systematisation of government administration and economic enterprise). In the end, the study of the Protestant ethic, according to Weber, merely explored one phase of the emancipation from magic, that disenchantment of the world that he regarded as the distinguishing peculiarity of Western culture.
In the final endnotes Weber states that he abandoned research into Protestantism because his colleague Ernst Troeltsch, a professional theologian, had initiated work on the book The Social Teachings of the Christian Churches and Sects. Another reason for Weber's decision was that Troeltsch's essay had provided the perspective for a broad comparison of religion and society, which he continued in his later works (the study of Judaism and the religions of China and India).
This book is also Weber's first brush with the concept of rationalization. His idea of modern capitalism as growing out of the religious pursuit of wealth meant a change to a rational means of existence: wealth. That is to say, at some point the Calvinist rationale informing the "spirit" of capitalism ceased to rely on the underlying religious movement behind it, leaving only rational capitalism. In essence, then, Weber's "spirit of capitalism" is effectively and more broadly a spirit of rationalization.
The essay can also be interpreted as one of Weber's criticisms of Karl Marx and his theories. While Marx's historical materialism held that all human institutions - including religion - were based on economic foundations, The Protestant Ethic turns this theory on its head by implying that a religious movement fostered capitalism, not the other way around.

Table of contents (from the 1958 Scribner's edition, translated by Parsons)
Part 1. The Problem
I. Religious Affiliation and Social Stratification
II. The Spirit of Capitalism
III. Luther's Conception of the Calling. Task of the Investigation.
Part 2. The Practical Ethics of the Ascetic Branches of Protestantism.
IV. The Religious Foundations of Worldly Asceticism
A. Calvinism
Predestination; Elimination of Magic; Rationalization of the World; Certainty of Salvation; Lutheranism vs. Calvinism; Catholicism vs. Calvinism; Monasticism vs. Puritanism; Methodical Ethic; Idea of Proof.
B. Pietism
Emotionalism; Spener; Francke; Zinzendorf; German Pietism.
C. Methodism
D. The Baptist Sects
Baptist and Quaker; Sect Principle; Inner Worldly Asceticism; Transformation of the World.
V. Asceticism and the Spirit of Capitalism
Richard Baxter; Meaning of Work; Justification of Profit; Jewish vs. Puritan Capitalism; Puritanism and Culture; Saving and Capital; Paradox of Asceticism and Rich; Serving Both Worlds; Citizenry Capitalistic Ethic; Iron Cage of Capitalism.

Spirit of capitalism
The first, and probably most vital, feature of the spirit of capitalism was that it invested “economizing” with high moral significance. The individual engages in capitalistic economizing not only for the expediency of making a living, but in the expectation that such activity would test his inner resources and thus affirm his moral worth. In this regard, the American novelist Walker Percy observed, “As long as I am getting rich, I feel well. It is my Presbyterian blood.”
A major effect of this spirit, as Durkheim noted, is that the entrepreneur performs his tasks with an earnestness of purpose that places them at the center of his life and endows them with intrinsic dignity. There is nothing degrading about them. Such an approach to monetary gain is markedly different from the sordid passion of greed, for monetary gain was not to be used for luxury or self-indulgent bodily comfort, but rather was to be saved and accumulated. Neither could the resulting frugality be mistaken for miserliness, as the accumulated resources were to be reinvested in worthy enterprises. The spirit of capitalism constituted a sort of moral habitus which burdened the possessor of money with a steward's obligation toward his own possessions.
Likewise, the individual entrepreneur is not allowed to become overly absorbed in or preoccupied with himself. His existence revolves around an objective concern outside himself, which unceasingly demands his devotion and thus becomes a test of his self-worth. By their very nature, these economic practices require reference to a goal; however, increase in capital becomes the ultimate point of reference.
Ultimately, the point of the spirit of capitalism is to attribute moral significance to entrepreneurial activity and lend meaning to the existence of those committed to it.

The Protestant work ethic, sometimes called the Puritan work ethic, is a Calvinist value emphasizing the necessity of constant labor in a person's calling as a sign of personal salvation. Protestants beginning with Martin Luther had reconceptualised work as a duty in the world for the benefit of the individual and society as a whole. The Catholic idea of good works was transformed into an obligation to work diligently as a sign of grace.

History
The term was coined by Max Weber, a member of the "youngest" German Historical School of economics, in his The Protestant Ethic and the Spirit of Capitalism. The Protestant work ethic is often credited with helping to define the societies of Northern Europe and other countries where Protestantism was strong, such as Scandinavia, northern Germany, the United Kingdom and the United States of America. In such societies it is regarded by many observers as one of the cornerstones of national prosperity. On this view, people in countries with Protestant roots tend to be more materialistic, perfectionist, and more focused on work than people in many Catholic countries, such as Spain, Italy and France, where people have a more relaxed attitude towards work and are less materialistic.

Criticism
Proponents of the notion of the "Protestant work ethic" claim that the term refers to its Protestant origin and does not require Protestantism itself. Since Ireland was ruled by a Protestant nation, and Japan modeled its modernization on largely Protestant nations such as the United States, Great Britain and Germany, both could have received the secularized ethic from Protestants without accepting any religious underpinning to it. Similarly, successful capitalist countries with relatively large Catholic minorities, such as the United States, Australia, the United Kingdom and New Zealand, tend to be ignored in the analysis and lumped together as Protestant, despite the strong influence and "capitalist outlook" of Catholics in the business community in all of these countries. Catholics also make up the majority in much of southern Germany (Bavaria has the highest GDP of all German states, though this is a rather recent, post-WWII development).
The notion of the Protestant work ethic faced some criticism in the twentieth century. The strongest of such criticism was that it revolved mostly around the culture and history of Europe and did not take into account societies that had never been Christian. Examples often cited are East Asian nations like Japan which have a strong work ethic but never had more than a small minority of Protestants. Others feel that the recent economic progress of Catholic nations like Ireland and Brazil makes the term at best of historical use.
The capitalist development of Catholic northern Italy and southwestern Germany before and during the Protestant Reformation is also cited as a counterargument: other factors, including geographical and political ones, were the main drivers of capitalist development, not Protestantism per se. Similarly, the deep economic factors that gave rise to capitalist accumulation and development existed in Europe prior to the Reformation of 1517 and owe little to any religious factor, but more to the unraveling of feudalism and the functioning of governance institutions that strengthened property rights and lowered transaction costs.

White Anglo-Saxon Protestant, commonly abbreviated to the acronym WASP, is a sociological and cultural ethnonym that originated in the United States.
The term originated in reference to White Americans of Anglo-Saxon descent, who were Protestant in religious affiliation. However, the term does not have a precise definition, and can be used to describe greatly differing groups.[1] It initially applied to people with histories in the upper class Northeastern establishment, who were alleged to form a powerful elite. Working class whites in the U.S. are generally not referred to as "WASPs", even if they are Protestants of Anglo-Saxon descent.[2] The word white is redundant, since Anglo-Saxons — whether in the strict or popular sense of the term — are always white.
WASP is gradually being replaced in U.S. liberal circles by "white Christian" as a result of diminished exclusion of Catholics and other non-WASP whites.[1]
Strictly speaking, many people now referred to as "WASPs" are not Anglo-Saxon – that is, the descendants of some Germanic peoples, who settled in Britain between the 5th century and the Norman Conquest.[3] However, in modern North American usage, WASP may include Protestants, from Dutch, German, Huguenot (French Protestant), Scandinavian, Scottish, Scots-Irish and Welsh backgrounds.[4] Therefore, the term WASP is sometimes applied to individuals who are technically non-Anglo-Saxons, including people with:
Dutch origins, such as the Vanderbilt and Roosevelt families
German descent, such as the Rockefeller and Astor families.[5]
French descent, such as the Du Pont family
Scots origins, such as the Carnegie family.
Scots-Irish origins, such as the Mellon family.

The term WASP has many meanings. In sociology it reflects that segment of the U.S. population that founded the nation and traced their heritages to ... Western Europe... The term has largely negative connotations... Today... less than 25 percent of the U.S. population [is WASP]. Nevertheless they continue to... have disproportionate influence over... American institutions. The term... has become more inclusive. To many people, WASP now include most 'white' people who are not... members of any minority group (William Thompson & Joseph Hickey, 2005, Society in Focus).[1]

Usage of the term WASP has grown in other English-speaking countries, such as Canada and Australia, which were settled by members of similar ethnic groups. Beyond the English-speaking world, the term is sometimes used in a metaphorical sense, to refer to perceived elite social groups.

Usage
The term was popularized by sociologist and University of Pennsylvania professor E. Digby Baltzell in his 1964 book The Protestant Establishment: Aristocracy & Caste in America. However, its first recorded use was by Andrew Hacker in 1957.[6]
The original use of WASP denoted either an ethnic group, or the culture, customs, and heritage of early Western European settlers in what is today the United States. The New England Yankee elite were almost exclusively of English extraction, although some early German immigrants, largely Protestant, arrived in the Dutch colony of New Netherland.
Protestant Christianity is considered the dominant religious affiliation among WASPs, particularly mainline denominations such as Presbyterianism, Congregationalism, Episcopalianism, and Unitarianism.
U.S. Northeasterners who may be labeled as "WASPs" may refer to themselves as "Yankees". In the South, whites formed a cultural and social structure distinct from that of the Northeast and are generally not considered WASPs. Unlike the North, the South never formed a large "white ethnic" population of non-Protestant immigrants, and as a result in traditional Southern society, the only real distinctions made were between whites and blacks. By and large, most Southern whites are of Scots-Irish and French ancestry. English-American Southerners historically lived mostly in the Upper South, particularly in Virginia.
In the Southwestern United States, "Anglo" is often used to contrast white Americans of European ancestry with Hispanics. It has a broader meaning than WASP, as it is sometimes used to include all non-Hispanic English-speaking whites, regardless of their religion or ethnicity.
When using the term, speakers vary widely in terms of which ethnic group they mean to designate, and some even apply it to all Protestants of European descent. For that reason, use of the term WASP has broadened significantly since its first use. Some people use it to refer to any powerful elite, with little regard to actual ethnicity or religion. Others use it only to refer to an ethnic group and its culture.
In the United States, it is most prevalently used today to contrast early arriving, Western European, "old stock" Americans with the descendants of later arriving groups from Southern and Eastern Europe, Catholic Ireland and other parts of the world. The term WASP is also often used in a way which is synonymous with "The Establishment" or for the privilege that white Protestants in America allegedly enjoy. It is frequently used today in a derogatory fashion. In fact, many dictionaries warn the term is often "derogatory" or "insulting".

Culture attributed to WASPs
The original WASP establishment created and dominated the social structure of the United States and its significant institutions from the time that structure took shape in the 17th century until the 20th century. Many scholars, including researcher Anthony Smith, argue that nations tend to be formed on the basis of a pre-modern ethnic "core" that provides the myths, symbols, and memories for the modern nation, and that WASPs were indeed that core.[7] Many associate America's elite institutions only with WASPs, although the elite has always been a wider, more diverse group. The class is still imagined to dominate America's prep schools and older universities, including those in the Ivy League and small liberal arts colleges such as the NESCAC schools (see the "Little Ivies"). It is true that these elite institutions were important to a certain portion of WASPs, who were taught skills, habits, and attitudes and formed connections which carried over to the influential spheres of finance, culture, and politics. While people labeled as "WASPs" were not a truly insular society, well into the 20th century prominent families preserved an attitude toward marriage carried over from the British aristocracy: a desire to marry was carefully scrutinized by the potential groom's and potential bride's families. Marriage was often influenced by the desire to maintain each party in their social and cultural milieu. This is something that occurs in other cultures as well.
WASP families are sometimes stereotyped as pursuing traditional British diversions such as squash, golf, tennis, badminton, riding, polo, and yachting, pursuits that served as markers of affluence. Social registers and society pages listed the privileged, who mingled in the same private clubs, attended the same churches, and lived in neighborhoods — Philadelphia's Main Line and Chestnut Hill neighborhoods, New York City's Upper East Side, and Boston's Beacon Hill are notable examples — governed by covenants designed to separate the well-bred from the merely wealthy.
It was not until after World War II that the networks of privilege and power in the old Protestant establishment began to lose significance. Many reasons have been attributed to the WASP decline and books have been written detailing it.[8] Among the reasons often cited is increased competitive pressure as the WASPs themselves opened the doors to competition. The GI Bill and government-supported mortgage programs brought higher education to the children of poor European immigrants, and the postwar era created ample economic opportunity for a growing new middle class. Nevertheless, white Protestants remain represented in the country's cultural, political, and economic élite.[9]
While the white Protestant establishment is no longer the sole elite group in American society, it remains a significant presence throughout the nation. WASPs are still predominantly upper middle to upper class and well educated, as well as occasional members of the elite. Some white Protestant families have jettisoned the notion of marriage as a way to maintain culture, and marriages between WASPs and Jews or Catholics are not altogether rare; marriages between WASPs and other races are less common but are not necessarily frowned upon.
WASPs in the Northeast, Midwest, and West were once dominant in the Republican Party. Catholics in the Northeast, generally recent Irish or Italian immigrants, populated that region's Democratic party politics. Catholic, or "white ethnic," voters and politicians failed to find favor among WASP voters even in the liberal Northeast.[10] A popular example was the 1952 Senate election in Massachusetts between John F. Kennedy and Henry Cabot Lodge, Jr., decisively split along sectarian lines (despite JFK's WASPish associations such as Choate, Harvard, the Spee Club, and Hyannisport). While affluent, white, Protestant Northerners tended at one point toward temperamental conservatism (or noblesse oblige progressivism), trends and demographics have changed these realities. The old-style Rockefeller Republican wing of the party favored by WASPs has weakened, as most recent successful Republican politicians in the Northeast have been Catholics, such as George Pataki. Five of the six New England states have recently become reliably Democratic in their presidential voting, the exception being New Hampshire. White Protestants in the South are largely Republicans. Liberalism or progressivism has also come to define a certain portion of WASP politics, especially in the Northeast.[11] Prominent WASPs such as Howard Dean and Ned Lamont have become visible leaders of the contemporary Democratic Party.
The WASP population, at least its religiously active membership, does not appear to be growing; among Episcopalians, for instance, one church leader was quoted in 2006 as estimating the church's national membership at 2.2 million and attributing a low birth rate to members' higher level of education.[12] Notwithstanding, white Protestants are still the largest group of Americans, with over half of Americans claiming to be Protestant versus about 25% Catholic.[13]

Criticism
Some object to the expression because of its inaccuracy and because the term is bandied about in a casual manner by people who may not understand its full meaning or its imprecision. As noted above, many people now referred to as "WASPs" are not Anglo-Saxon in the sense of being descendants of the Germanic settlers of Britain. In addition, some see it as a racial, ethnic, and religious slur showing contempt for European Americans and an attempt to smother European American diversity, since European Americans trace their origins to a large number of European countries with a diverse history where a variety of religions are practiced: It is therefore difficult to apply a single catch-all term.
