Three days before the world “changed forever,” U.N. High Commissioner for Human Rights Mary Robinson tried to put a pretty face on a lackluster summit that had just ended in Durban, South Africa.  The nine-day conference, designed to address “racism, racial discrimination, xenophobia and related intolerance,” was doomed when the United States and Israel withdrew over objections to two items on the agenda.  One item under consideration was reparations for slavery, an institution that officially ended in the United States more than a century ago yet continues in many countries today.  The second was a push by a bloc of Arab nations to declare Israeli militancy in Palestine racist.  That proposal was absurd by all scientific standards and would only have escalated violence by perpetuating an ancient myth of human hatred.

In her closing remarks, Robinson admitted that “The issues have been addressed, not answered.  But we have a framework.  We have made a start and that is what counts.”  Then, while mentioning moving testimony from people “on the receiving end of racism and discrimination,” she claimed racism is the motivation for slavery.  By strict definition, however, slavery is an economic institution; it is not necessarily racist.

Certainly, when slavery was legal in parts of the United States, the vast majority of slaveowners were white, while virtually all of the slaves were black; still, there were instances, particularly in Louisiana, of free blacks owning black slaves.  In parts of the world where slavery exists today, there is generally no racial differentiation between slaves and slaveholders.

Robinson credited the conference with putting “the gender dimension of racism on the map.”  One might also conclude, then, that “male chauvinism” is, in some way, racist.

Racism may be as ancient as mankind itself, but scientific racism dates from the mid-18th century.  In the beginning, the effort to define the races included ranking them in a hierarchy.  In 1758, Swedish botanist Carolus Linnaeus named our species the Latin equivalent of “wise (or thinking) man,” Homo sapiens.  He showed little wisdom, though, by ascribing derogatory characteristics to each of the four races except his own.  Native Americans were “ill-tempered, obstinate, contented . . . free,” he wrote, while Asians were “severe, haughty, desirous”; black Africans were “crafty, slow (and) foolish,” but whites were “active, very smart (and) inventive.”

The study of race has come a long way, especially because of recent breakthroughs in the science of DNA and the human genome.  With the possible exception of the Basques, whose obscure biological origins lend racial authenticity to their brand of separatist terrorism, there are still only four races, and both sides in the Palestinian conflict belong to one: Caucasian, or white.  Ongoing studies increasingly demonstrate that all the races have far more similarities than differences.

One observer has noted that the state of Israel was founded by Western European Jews displaced by the genocidal practices of World War II, who are now pitted against native Middle Easterners, even though both Islam and Judaism (and Christianity, for that matter) originated in the same area.

As Steve Olson noted in the April 2001 issue of the Atlantic Monthly, however, geneticists have proved that

Europeans are descended largely from populations of farmers who started migrating out of the Middle East 9,000 years ago.  As the sons and daughters of farming families left their parents’ farms and moved into new territory, they interbred with the existing hunter-gatherer populations, which produced gradients of genetic change radiating from the Middle East.  Only in mountainous areas unattractive to farmers—the Pyrenees homelands of the Basques, for example—were the genes of the indigenous peoples comparatively intact.

Olson laments that it appears to be human nature to concentrate on the differences rather than the similarities between the races.  It is far more dangerous, though, to ascribe a racial character to cultural, religious, or ethnic differences.

For example, the term “sand nigger” became more widely used in the American vernacular after September 11, 2001.  Attacks on Muslims were reported in many areas of the United States even though, of the approximately 1.1 billion adherents of Islam worldwide, only about 15 percent are of Arab descent.

“Raghead” also became suddenly popular, and Sikhs were reportedly attacked, even though they are not Muslim.  Sikhism, a 15th-century offshoot of Hinduism, has nothing to do with Islam.  The only similarity between the two religions is that male adherents of both may wear turbans, though of different styles.  However, attacking people because they look, dress, or pray differently is xenophobic, not racist.

Some believe that warfare between identifiably different groups of human beings is endemic to the human species.  The oldest archeological evidence suggests, however, that modern human beings did not wage war against other hominid species, even when they coexisted in the same areas for several millennia.  As Olson wrote,

The cave paintings of Europe, some of which date from the period when modern people were replacing Neanderthals, evince plenty of violence against animals but not against other people.

The U.S. delegation’s withdrawal from the Durban conference no doubt had more to do with the tragic American experience with race, a drama that is still unfolding in our society, than with any scientific dogma.  The American Anthropological Association addressed this in May 1998: “Racial beliefs constitute myths about the diversity in the human species and about the abilities and behavior of people homogenized into ‘racial’ categories.”  Furthermore, the association argued,

The myths fused behavior and physical features together in the public mind, impeding our comprehension of both biological variations and cultural behavior, implying that both are genetically determined.  Racial myths bear no relationship to the reality of human capabilities or behavior. . . .

At the end of the 20th century, we now understand that human cultural behavior is learned, conditioned into infants beginning at birth, and always subject to modification.  No human is born with a built-in culture or language.  Our temperaments, dispositions, and personalities, regardless of genetic propensities, are developed within sets of meanings and values that we call “culture.”

Still, scientific treatises do not always capture the full extent or complexity of any human endeavor.  One example of racial warfare, which occurred at Sand Creek, Colorado, in the fall of 1864, should give anyone pause.

Four years before Gen. Philip Sheridan told a Comanche chief named Tosawi that “the only good Indian I ever saw was dead” (the line famously misquoted as “The only good Indian is a dead Indian”), the sentiment behind it was already firmly embedded in the minds of European adventurers in the American West.

John Chivington was a former Methodist minister and a dedicated antislavery hero of the War Between the States.  His moment of glory came in 1862, when he led Union forces to victory over the Confederates at the Battle of Glorieta Pass, protecting access to the Colorado goldfields.  His moment of infamy came afterward, and it was motivated by greed meeting an impediment: Arapaho and Cheyenne guarding ancestral lands, guaranteed to them by an 1851 treaty with the United States, stood squarely in the path of prospectors itching for gold.

Another famous enemy of black slavery played a part in this drama of racial iniquity: Colorado Territorial Gov. John Evans, appointed by President Abraham Lincoln largely for his support of emancipation.

In August 1864, a band of Arapaho warriors massacred settlers illegally squatting on their land, an action similar to Palestinian attacks on Jewish settlers on the West Bank and elsewhere in Palestine.  Evans immediately raised a company of volunteers, recruited primarily from among frustrated prospectors, for the stated purpose of killing Indians.  He put Chivington in charge.  Within a month, Cheyenne and Arapaho leaders met with Maj. Edward Wynkoop, the ranking Union Army officer in the area, declaring their desire to end all hostilities.  Both Evans and Chivington were unconvinced—or unwilling to be convinced—of the chiefs’ sincerity.

Five weeks later, Chivington arranged to have the sympathetic Wynkoop replaced by Maj. Scott Anthony, a cousin of Susan B. Anthony.  Chivington convinced Anthony to make the Indians good by making them dead.  In one of the most tragic moments in U.S. history, two established antislavery heroes and a cousin of the founder of the women’s rights movement joined forces in a race war.

Religious and cultural components exacerbated the explosive situation.  One biographer speculated that Chivington viewed Indians more harshly than slaves because the vast majority of the latter had accepted Christianity and the English language.

The mix exploded on November 29 at an encampment of Cheyenne and Arapaho noncombatants.  Most of the young men were away on a hunting trip, leaving about 150 unarmed people in the camp; two-thirds of them were women and children.  The elderly included two chiefs in their 70s who had visited the White House the year before at Lincoln’s invitation and had their picture taken with the First Lady.

Though a white handkerchief and an American flag fluttered from the main teepee, some of Chivington’s 700 militiamen dismounted and opened fire.  The bloodlust was so intense that, in the cavalry charge that followed, several of the soldiers were wounded by friendly fire.

Fewer than a dozen victims escaped.  All of the corpses were scalped.  A rope spliced with about a hundred scalps was displayed onstage in Denver a couple of weeks later.  The audience cheered.

Most, if not all, of the bodies were mutilated.  Symbolic of the genocidal nature of racial conflicts, the soldiers ripped out genitalia for souvenirs.  Female organs were displayed on a stick.  Soldiers decorated their hats with uteruses.  Scrotums were made into tobacco pouches.  The remains were left to rot or be consumed by scavenging animals.

Both the UNESCO constitution, adopted in 1945, and the UNESCO “Declaration on Race and Racial Prejudice,” approved in 1978, condemn racism by individuals and states.  The 1978 declaration, especially, equates the severity of racism with that of other forms of intolerance, including those based on ethnic or religious bias.  Both documents adopt the single-species view of humanity.

Article 1 of the 1945 constitution directs UNESCO

to contribute to peace and security by promoting collaboration among the nations through education, science and culture in order to further universal respect for justice, for the rule of law and for the human rights and fundamental freedoms which are affirmed for the peoples of the world, without distinction of race, sex, language or religion . . .

The proposal before the Durban conference to declare Israeli militancy racially motivated contradicted all scientific evidence and, therefore, was not in keeping with the U.N. mandate to promote “collaboration among the nations through education, science and culture.”

Given the United States’ long, bloody history of racial conflict and the vigorous civil-rights movement of the past half-century, the U.S. delegation was completely justified in withdrawing from the Durban conference.

Much of the world viewed the action as simply one more instance of American solidarity with Israeli Zionism, a political and religious movement, not a racist one.  Then the atrocities of September 11 occurred, and on October 19, less than two weeks after the United States began dropping bombs on Afghanistan, the U.N. Human Rights Commission narrowly approved a resolution condemning Israel for three weeks of violence against Palestinians.  The approved text had been presented by the Arab bloc.  The United States and the European Union opposed the resolution because it made no mention of Palestinian abuses.

The resolution amounted to different words, sung to the same tune.