
Did anyone in Europe predict the existence of the Americas?

According to the answer to this question, Columbus was pretty much unique in his belief that the world was small enough and Asia was big enough to make sailing west to Asia possible. Everyone else expected (correctly) that it was WAY too far away to be a practical option.

Of course, it ended up being moot, because there was a completely new landmass in between Europe and Asia: the Americas. Columbus landed there instead, and the rest was history.

But if the Europeans had anything even approaching an accurate idea of how far away Asia was to the west, they must have realized that that would be a COLOSSAL ocean. (About 11,000 miles, we can now estimate.) Did anyone propose the idea that there was more than water out there? That an ocean that big might hold a whole new continent, or even more than one, before you got back around to "India"?

Especially given the Viking discoveries of America (and of Iceland and Greenland), were there rumors or discussions of other lands to the West? Or did the European powers just assume it was empty ocean all the way to Asia, right up until they discovered an enormous landmass blocking their way?


As fate would have it, the first known globe of the Earth was created in 1492, the same year as Columbus' voyage. As such, it is also the only known globe to depict the area between Western Europe and East Asia prior to the discovery of the New World. None of the earlier flat maps I could find made any kind of legitimate effort at depicting this area.

The globe's maker was working for the King of Portugal at the time. Given this as the state of the art of European cartographical knowledge in 1492, we can see that the intervening ocean was thought to contain numerous small islands. The Azores, Canaries, and Cape Verde islands were known and depicted. In the East, Japan (Cipangu), Java, and other "spice" islands of the SE Asian archipelago are depicted.

There are some interesting "unknown" islands. There are several placed up in the Arctic Circle, perhaps as a nod to Iceland and its legends. There was also one island roughly the size of England smack dab in the middle of the ocean named "Saint Brandan". Likely this is a reference to the story of the Irish monk St. Brendan, who was said to have traversed the ocean and found an island paradise. The interesting thing is that this rather tall tale was apparently given much more credence in mainstream European thought than the Icelandic discoveries.

Even there though, it was clearly just thought to be a large island, not an entire new continent. So it seems fair to say the folks in the best position to speculate, the Portuguese navigational community, didn't think there was anything in the middle but islands.


Europeans, perhaps not; someone in the Old World, yes.

Al-Biruni (973-1050) lived in Khwarezm (modern Uzbekistan). Among other works in mathematics, astronomy, physics, mineralogy, history and geography, he calculated the circumference of Earth with a precision higher than his predecessors, and made some precise maps of known lands. In his work Codex Masudicus he conjectured that there should be an inhabited continent between Europe and Asia:

But was three fifths of the Earth's circumference really nothing but water? Biruni considered this possibility but rejected it on the grounds of both observation and logic. From his study of specific gravity he knew that most solid minerals were heavier than water. Would so watery a world not give rise to serious imbalances to which the planet would have had to adjust over time? And why, he asked, would the forces that had given rise to land on two fifths of the earth's belt not also have had an effect on the other three fifths as well? Biruni concluded that somewhere in the vast expanses of ocean between Europe and Asia there must be one or more unknown land masses or continents.

(Source: S. Frederick Starr)

His reasoning was unsound from the modern point of view, since the same arguments would imply, for example, that Pangea could not exist. But the conclusion was correct.

There is no evidence that this work was known to Columbus, but, interestingly enough, he did know a work of the Central Asian geographer Ahmad al-Farghani (c. 800-c. 870), who computed the circumference of Earth more accurately than the Greeks (but less accurately than al-Biruni). There is a theory that it is a confusion about the units used by al-Farghani that led Columbus to underestimate the circumference of the Earth, which underlay the whole rationale for his trip.
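To make the unit-confusion theory concrete, here is a rough back-of-the-envelope calculation. The specific figures (al-Farghani's 56 2/3 miles per degree of latitude, an Arabic mile of roughly 1.97 km, and a Roman/Italian mile of roughly 1.48 km) are the values commonly cited in modern retellings and are assumed here for illustration, not stated in the answer above.

    # Sketch of the unit-confusion theory (figures assumed for illustration)
    DEGREE_IN_MILES = 56 + 2 / 3   # al-Farghani's estimate of one degree of latitude, in miles
    ARABIC_MILE_KM = 1.97          # approximate length of an Arabic mile
    ROMAN_MILE_KM = 1.48           # approximate length of a Roman/Italian mile

    circumference_farghani = 360 * DEGREE_IN_MILES * ARABIC_MILE_KM  # ~40,200 km, close to the true ~40,075 km
    circumference_columbus = 360 * DEGREE_IN_MILES * ROMAN_MILE_KM   # ~30,200 km, about a quarter too small

    print(f"al-Farghani's figure read in Arabic miles: {circumference_farghani:,.0f} km")
    print(f"The same figure read in Roman miles:       {circumference_columbus:,.0f} km")

On these assumed numbers, simply reading al-Farghani's degree length in the shorter mile shrinks the Earth by roughly a quarter, which is the kind of error that would make a westward voyage to Asia look feasible.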


This is far from proven science, maybe not even at the level of a hypothesis. Supposedly the king of Mali in Africa sailed to the Americas around 1300 or earlier. I don't remember the exact years. Some people have also suggested that some of the statues found in Central and South America look African, and this could indicate a voyage from the Old World that we don't know about.

I also like to read Graham Hancock once in a while and in one of his books he said that supposedly Columbus spent a long time on some island that had a tradition of sailing and navigation going back thousands of years and they had secret maps that depicted the Americas.

King of Mali

There is some evidence for this, but it's not proven. I've seen the statues on TV and they do look African. Also, some of the artwork from ancient Mesoamerica seems to be based on the same constellations as that from the Middle East and Asia. Some of this will take time. I don't follow Hancock religiously, but some of the things he talked about in his 1993 book are coming to light.

Fingerprints of the Gods was written before Göbekli Tepe had been studied in detail.

Magicians of the Gods: No, I don't believe there were 7 magical people who fled Atlantis or whatever and sailed the world. What I do think after reading this is that there is something beyond mere coincidence to explain why artwork is based on the same constellations around the world. And that people may have studied the stars and mathematics earlier than we currently know.

None of this stuff is close to being any kind of proof, but you have to wonder sometimes. Sailing then wasn't like it is today. It was extremely dangerous, and doing so into the unknown even more so. It was also insanely expensive, and why would the Spanish monarchs give Columbus so much money for such a risky venture?

When I was in school we were taught Columbus discovered America. Today we know it's not entirely true. In 50 years we might find enough evidence that people have sailed across the oceans for thousands of years.


Muslims in Europe: The Construction of a “Problem”

The presence of some 25 million Muslims in the 28 countries of the European Union is currently sparking debate, controversy, fear and even hatred. Never before have we witnessed such a climate of mutual suspicion between Muslims and mainstream European societies. Public opinion surveys in Europe show increasing fear and opposition to European Muslims, who are perceived as a threat to national identity, domestic security and the social fabric. Muslims, on the other hand, are convinced that the majority of Europeans reject their presence and vilify and caricaturise their religion.

Surveys show increasing fear towards European Muslims, who believe that Europeans caricaturise their religion

Such a misunderstanding is worrisome as it fuels dangerous Islamophobia, on the one hand, and radicalisation, on the other. European states are alarmed by these developments since they place harmonious cohabitation in jeopardy. Consequently, they have taken measures and enacted laws to combat extremist forces, curb radicalisation and improve Muslims’ integration into the receiving countries.

However, the situation is not simple. How could Europe encourage Muslim integration into secular states? Are radicalisation and extremism linked to economic marginalisation? Are they a product of a narrative that divides the world into two camps: us and them? Is extremism only faith-based? If so, why did an extremist Norwegian kill, in 2011, dozens of his compatriots who were not Muslims? European states continue to grapple with these thorny questions without being able to devise a coherent response.

My arguments are that Muslims are settling permanently in Europe, that the vast majority want to live in peace, that European integration policies have been erratic and inconsistent and that only a tiny minority of Muslims are engaged in radical activities. I also argue that in addition to faith-based radicalisation (religiously-motivated groups or individuals), there is an identity-based extremism (far-right parties), which is no less dangerous, and Europe should confront both problems by drying up the ideological sources of extremism. Finally, I make the point that Islamist radicalism in Europe remains marginal. This radicalism is not the result of failed integration, but rather local-global connections, which are linked to identity rupture and the exposure of young European Muslims to the unbearable images of destruction and violence in many Muslim countries, mainly those in the Middle East. Whether this violence is the result of Western intervention, such as the invasion of Iraq and the Israeli offensives in Gaza, or the result of the assault of Muslim regimes on their own populations, such as in Iraq or Syria, is irrelevant.


10 predictions that Watkins got right.

1. Digital colour photography

Watkins did not, of course, use the word "digital" or spell out precisely how digital cameras and computers would work, but he accurately predicted how people would come to use new photographic technology.

"Photographs will be telegraphed from any distance. If there be a battle in China a hundred years hence, snapshots of its most striking events will be published in the newspapers an hour later. photographs will reproduce all of nature's colours."

This showed major foresight, says Mr Nilsson. When Watkins was making his predictions, it would have taken a week for a picture of something happening in China to make its way into Western papers.

People thought photography itself was a miracle, and colour photography was very experimental, he says.

"The idea of having cameras gathering information from opposite ends of the world and transmitting them - he wasn't just taking a present technology and then looking to the next step, it was far beyond what anyone was saying at the time."

Patrick Tucker from the World Future Society, based in Maryland in the US, thinks Watkins might even be hinting at a much bigger future breakthrough.

"'Photographs will be telegraphed' reads strikingly like how we access information from the web," says Mr Tucker.

2. The rising height of Americans

"Americans will be taller by from one to two inches."

Watkins had unerring accuracy here, says Mr Nilsson - the average American man in 1900 was about 66-67ins (1.68-1.70m) tall and by 2000, the average was 69ins (1.75m).

Today, it's 69.5ins (1.76m) for men and 64ins (1.63m) for women.

"Wireless telephone and telegraph circuits will span the world. A husband in the middle of the Atlantic will be able to converse with his wife sitting in her boudoir in Chicago. We will be able to telephone to China quite as readily as we now talk from New York to Brooklyn."

International phone calls were unheard of in Watkins' day. It was another 15 years before the first call was made, by Alexander Graham Bell, even from one coast of the US to the other. The idea of wireless telephony was truly revolutionary.

"Ready-cooked meals will be bought from establishment similar to our bakeries of today."

The proliferation of ready meals in supermarkets and takeaway shops in High Streets suggests that Watkins was right, although he envisaged the meals would be delivered on plates which would be returned to the cooking establishments to be washed.

5. Slowing population growth

"There will probably be from 350,000,000 to 500,000,000 people in America [the US]."

The figure is too high, says Nilsson, but at least Watkins was guessing in the right direction. If the US population had grown by the same rate it did between 1800 and 1900, it would have exceeded 1 billion in 2000.

"Instead, it grew just 360%, reaching 280m at the start of the new century."

6. Hothouse vegetables

Winter will be turned into summer and night into day by the farmer, said Watkins, with electric wires under the soil and large gardens under glass.

"Vegetables will be bathed in powerful electric light, serving, like sunlight, to hasten their growth. Electric currents applied to the soil will make valuable plants to grow larger and faster, and will kill troublesome weeds. Rays of coloured light will hasten the growth of many plants. Electricity applied to garden seeds will make them sprout and develop unusually early."

Large gardens under glass were already a reality, says Philip Norman of the Garden Museum in London, but he was correct to predict the use of electricity. Although coloured lights and electric currents did not take off, they were probably experimented with.

"Electricity certainly features in plant propagation. But the earliest item we have is a 1953 booklet Electricity in Your Garden detailing electrically warmed frames, hotbeds and cloches and electrically heated greenhouses, issued by the British Electrical Development Association.

"We have a 1956 soil heater, used in soil to assist early germination of seeds in your greenhouse."

"Man will see around the world. Persons and things of all kinds will be brought within focus of cameras connected electrically with screens at opposite ends of circuits, thousands of miles at a span."

Watkins foresaw cameras and screens linked by electric circuits, a vision practically realised in the 20th Century by live international television and latterly by webcams.

"Huge forts on wheels will dash across open spaces at the speed of express trains of today."

Leonardo da Vinci had talked about this, says Nilsson, but Watkins was taking it further. There weren't many people that far-sighted.

"Strawberries as large as apples will be eaten by our great-great-grandchildren."

Lots of larger varieties of fruit have been developed in the past century, but Watkins was over-optimistic with regard to strawberries.

10. The Acela Express

"Trains will run two miles a minute normally. Express trains one hundred and fifty miles per hour."

Exactly 100 years after writing those words, to the very month, Amtrak's flagship high-speed rail line, the Acela Express, opened between Boston and Washington, DC. It reaches top speeds of 150mph, although the average speed is considerably less than that. High-speed rail in other parts of the world, even in 2000, was considerably faster.

"There will be no C, X or Q in our everyday alphabet. They will be abandoned because unnecessary."

This was obviously wrong, says Patrick Tucker of the World Future Society, but also remarkable in the way that it hints at the possible effects of mass communication on communication itself.

2. Everybody will walk 10 miles a day

"This presents a rather generous view of future humanity but doesn't seem to consider the popularity and convenience of the very transportation breakthroughs [moving sidewalks, express trains, coaches] forecast elsewhere in the article," says Mr Tucker.

3. No more cars in large cities

"All hurry traffic will be below or above ground when brought within city limits."

However, many cities do have pedestrian zones in their historic centres. And he correctly forecast elevated roads and subways.

4. No mosquitoes or flies

"Mosquitoes, house-flies and roaches will have been exterminated."

Watkins was getting ahead of himself here. Indeed the bed bug is making a huge comeback in the US and some other countries.

Maybe the end of the mosquito and the house fly is something to look forward to in 2100?



From 30,000 years ago until around 11,000 B.C., the earth was subjected to a cold snap that sucked up the sea into glaciers and ice sheets extending from the poles. This period is known as the Last Glacial Maximum, when the reach of the most recent Ice Age was at its fullest. By drilling mud cores out of the seabed, we can reconstruct a history of the land and the seas, notably by measuring concentrations of oxygen, and looking for pollen, which would have been deposited on dry ground from the flora growing there. We think therefore that sea level was somewhere between 60 and 120 meters lower than today. So it was terra firma all the way from Alaska to Russia, and all the way down south to the Aleutians—a crescent chain of volcanic islands that speckle the north Pacific.

The prevailing theory about how the people of the Americas came to those lands is via that bridge. We refer to it as a land bridge, though given its duration and size, it was simply continuous land, thousands of miles from north to south; it's only a bridge if we view it in comparison to today's straits. The area is called Beringia, and the first people across it the Beringians. These were harsh lands, sparse with shrubs and herbs; to the south, there were boreal woodlands, and where the land met the sea, kelp forests and seals.

Though these were still tough terrains, according to archaeological finds Western Beringians were living near the Yana River in Siberia by 30,000 B.C. There’s been plenty of debate over the years as to when exactly people reached the eastern side, and therefore at what point after the seas rose they became isolated as the founding peoples of the Americas. The questions that remain—and there are many—concern whether they came all at once or in dribs and drabs. Sites in the Yukon that straddle the U.S.-Alaskan border with Canada give us clues, such as the Bluefish Caves, 33 miles southwest of the village of Old Crow.

The latest radio-dating analysis of the remnants of lives in the Bluefish Caves indicates that people were there 24,000 years ago. These founding peoples spread over 12,000 years to every corner of the continents and formed the pool from which all Americans would be drawn until 1492. I will focus on North America here, and what we know so far, what we can know through genetics, and why we don’t know more.

Until Columbus, the Americas were populated by pockets of tribal groups distributed up and down both north and south continents. There are dozens of individual cultures that have been identified by age, location, and specific technologies—and via newer ways of knowing the past, including genetics and linguistics. Scholars have hypothesized various patterns of migration from Beringia into the Americas. Over time, it has been suggested that there were multiple waves, or that a certain people with particular technologies spread from north all the way south.

Both ideas have now fallen from grace. The multiple-waves theory has failed as a model because the linguistic similarities used to show patterns of migration are just not that convincing. And the second theory fails because of timing. Cultures are often named and known by the technology that they left behind. In New Mexico there is a small town called Clovis, population 37,000. In the 1930s, projectile points resembling spearheads and other hunting paraphernalia were found in an archaeological site nearby, dating from around 13,000 years ago. These were knapped on both sides—bifaced with fluted tips. It had been thought that it was the inventors of these tools who had been the first people to spread up and down the continents. But there's evidence of humans living in southern Chile 12,500 years ago without Clovis technology. These people are too far away to show a direct link with the Clovis in a way that would indicate that the Clovis were the original people of South America.

Today, the emerging theory is that the people up in the Bluefish Caves some 24,000 years ago were the founders, and that they represent a culture that was isolated for thousands of years up in the cold north, incubating a population that would eventually seed everywhere else. This idea has become known as Beringian Standstill. Those founders had split from known populations in Siberian Asia some 40,000 years ago, come across Beringia, and stayed put until around 16,000 years ago.

Analysis of the genomes of indigenous people shows 15 founding mitochondrial types not found in Asia. This suggests a time when genetic diversification occurred, an incubation lasting maybe 10,000 years. New gene variants spread across the American lands, but not back into Asia, as the waters had cut them off. Nowadays, we see lower levels of genetic diversity in modern Native Americans—derived from just those original 15—than in the rest of the world. Again, this supports the idea of a single, small population seeding the continents, and—unlike in Europe or Asia—these people being cut off, with little admixture from new populations for thousands of years, at least until Columbus.

In Montana, 20 miles or so off Highway 90, lies the minuscule conurbation of Wilsall, population 178 as of 2010. Though stacks of material culture in the Clovis tradition have been recovered throughout North America, only one person from this time and culture has risen from his grave. He’s acquired the name Anzick-1, and was laid to rest in a rock shelter in what would become—around 12,600 years later—Wilsall. He was a toddler, probably less than two years old, judging from the unfused sutures in his skull. He was laid to rest surrounded by at least 100 stone tools, and 15 ivory ones. Some of these were covered in red ochre, and together they suggest Anzick was a very special child who had been ceremonially buried in splendor. Now he’s special because we have his complete genome.

And there’s the woeful saga of Kennewick Man. While attending a hydroplane race in 1996, two locals of Kennewick, Washington, discovered a broad-faced skull inching its way out of the bank of the Columbia River. Over the weeks and years, more than 350 fragments of bone and teeth were eked out of this 8,500-year-old grave, all belonging to a middle-aged man, maybe in his 40s, deliberately buried, with some signs of injuries that had healed over his life—a cracked rib, an incision from a spear, a minor depression fracture on his forehead. There were academic squabbles about his facial morphology, with some saying it was most similar to Japanese skulls, some arguing for a link with Polynesians, and some asserting he must have been European.

With all the toing and froing about his morphology, DNA should be a rich source of conclusive data for this man. But the political controversies about his body have severely hampered his value to science for 20 years. For Native Americans, he became known as the Ancient One, and five clans, notably the Confederated Tribes of the Colville Reservation, wanted to have him ceremonially reburied under guidelines determined by the Native American Graves Protection and Repatriation Act (NAGPRA), which affords custodial rights to Native American artifacts and bodies found on their lands. Scientists sued the government to prevent his reburial, some claiming that his bones suggested he was European, and therefore not connected with Native Americans.

To add an absurd cherry on top of this already distasteful cake, a Californian pagan group called the Asatru Folk Assembly put in a bid for the body, claiming Kennewick Man might have a Norse tribal identity, and if science could establish that the body was European, then he should be given a ceremony in honor of Odin, ruler of the mythical Asgard, though what that ritual entails is not clear.

His reburial was successfully blocked in 2002, when a judge ruled that his facial bones suggested he was European, and therefore NAGPRA guidelines could not be invoked. The issue was batted back and forth for years, in a manner in which no one came out looking good. Nineteen years after this important body was found, the genome analysis was finally published.

Had he been European (or Japanese or Polynesian), it would’ve been the most revolutionary find in the history of U.S. anthropology, and all textbooks on human migration would have been rewritten. But of course he wasn’t. A fragment of material was used to sequence his DNA, and it showed that lo and behold, Kennewick Man—the Ancient One—was closely related to the Anzick baby. And as for the living, he was more closely related to Native Americans than to anyone else on Earth, and within that group, most closely related to the Colville tribes.

Anzick is firm and final proof that North and South America were populated by the same people. Anzick’s mitochondrial genome is most similar to people of central and south America today. The genes of the Ancient One most closely resemble those of tribes in the Seattle area today. These similarities do not indicate that either were members of those tribes or people, nor that their genes have not spread throughout the Americas, as we would expect over timescales of thousands of years. What they show is that the population dynamics—how ancient indigenous people relate to contemporary Native Americans—is complex and varies from region to region. No people are completely static, and genes less so.

In December 2016, in one of his last acts in office, President Barack Obama signed legislation that allowed Kennewick Man to be reburied as a Native American. Anzick was found on private land, so not subject to NAGPRA rules, but was reburied anyway in 2014 in a ceremony involving a few different tribes. We sometimes forget that though the data should be pure and straightforward, science is done by people, who are never either.

Anzick and Kennewick Man represent narrow samples—a tantalizing glimpse of the big picture. And politics and history are hampering progress. The legacy of 500 years of occupation has fostered profound difficulty in understanding how the Americas were first peopled. Two of the doyennes of this field—Connie Mulligan and Emőke Szathmáry—suggest that there is a long cultural tradition that percolates through our attempts to deconstruct the past.

Europeans are taught a history of migration from birth, of Greeks and Romans spreading over Europe, conquering lands, and interloping afar. Judeo-Christian lore puts people in and out of Africa and Asia, and the silk routes connect Europeans with the East and back again. Many European countries have been seafaring nations, exploring and sometimes belligerently building empires for commerce or to impose a perceived superiority over other people. Even though we have national identities, and pride and traditions that come with that sense of belonging, European culture is imbued with migration.

For Native Americans, this is not their culture. Not all believe they have always been in their lands, nor that they are a static people. But for the most part, the narrative of migration does not threaten European identity in the same way that it might for the people we called the Indians. The scientifically valid notion of the migration of people from Asia into the Americas may challenge Native creation stories. It may also have the effect of conflating early modern migrants from the 15th century onward with those from 24,000 years earlier, with the effect of undermining indigenous claims to land and sovereignty.

Deep among the lakes of the Grand Canyon are the Havasupai. Their name means “people of the blue-green waters,” and they’ve been there for at least 800 years. They’re a small tribe, around 650 members today, and they use ladders, horses, and sometimes helicopters to travel in and out of—or rather, up and down—the canyon. The tribe is rife with type 2 diabetes, and in 1990, the Havasupai people agreed to provide Arizona State University scientists with DNA from 151 individuals with the understanding that they would seek genetic answers to the puzzle of why diabetes was so common. Written consent was obtained, and blood samples were taken.

An obvious genetic link to diabetes was not found, but the researchers continued to use their DNA to test for schizophrenia and patterns of inbreeding. The data was also passed on to other scientists who were interested in migration and the history of Native Americans. The Havasupai only found this out years later, and eventually sued the university. In 2010, they were awarded $700,000 in compensation.

Therese Markow was one of the scientists involved, and insists that consent was on the papers they signed, and that the forms were necessarily simple, as many Havasupai do not have English as a first language, and many did not graduate from high school. But many in the tribe thought that they were being asked only about their endemic diabetes. A blood sample contains an individual’s entire genome, and with it, reams of data about that individual, their family, and evolution.

This isn’t the first time this has happened. In the 1980s, before the days of easy and cheap genomics, blood samples were taken with consent to analyze the unusually high levels of rheumatic disease in the Nuu-chah-nulth people of the Pacific Northwest of Canada. The project, led by the late Ryk Ward, then at the University of British Columbia, found no genetic link in their samples, and the project petered out. By the ’90s, though, Ward had moved to the University of Utah, and then Oxford in the U.K., and the blood samples had been used in anthropological and HIV/AIDS studies around the world, which turned into grants, academic papers, and a PBS–BBC jointly produced documentary.

The use of the samples for historical migration indicated that the origins of the Havasupai were from ancient ancestors in Siberia, which is in accordance with our understanding of human history by all scientific and archaeological methods. But it is in opposition to the Havasupai religious belief that they were created in situ in the Grand Canyon. Though nonscientific, it is perfectly within their rights to preclude investigations that contradict their stories, and those rights appear to have been violated. Havasupai Vice Chairman Edmond Tilousi told The New York Times in 2010 that “coming from the canyon ... is the basis of our sovereign rights.”

Sovereignty and membership of a tribe is a complex and hard-won thing. It includes a concept called “blood quantum,” which is effectively the proportion of one's ancestors who are already members of a tribe. It's an invention of European Americans in the 19th century, and though most tribes had their own criteria for tribal membership, most eventually adopted blood quantum as part of the qualification for tribal status.
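As a minimal illustration of how blood quantum is usually reckoned (a child's quantum is conventionally the average of the parents' quanta; the pedigree below is invented purely for illustration):

    # Toy blood-quantum arithmetic: a child's quantum is the mean of the parents' quanta
    def blood_quantum(parent_a: float, parent_b: float) -> float:
        return (parent_a + parent_b) / 2

    child = blood_quantum(1.0, 0.0)         # one full-quantum parent, one non-member parent -> 1/2
    grandchild = blood_quantum(child, 0.0)  # another out-marriage halves it again -> 1/4
    print(child, grandchild)                # 0.5 0.25

Under many tribes' enrollment rules there is a minimum threshold, often one quarter, which is why the arithmetic of successive out-marriages matters so much.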

DNA is not part of that mix. With our current knowledge of the genomics of Native Americans, there is no possibility of DNA being anywhere near a useful tool in ascribing tribal status to people. Furthermore, given our understanding of ancestry and family trees, I have profound doubts that DNA could ever be used to determine tribal membership. While mtDNA (which is passed down from mothers to children) and the Y chromosome (passed from fathers to sons) have both proved profoundly useful in determining the deep ancestral trajectory of the first peoples of the Americas into the present, these two chromosomes represent a tiny proportion of the total amount of DNA that an individual bears. The rest, the autosomes, comes from all of one’s ancestors.
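To put "a tiny proportion" into rough numbers, here is a back-of-the-envelope calculation using approximate sizes assumed for illustration (mtDNA about 16.6 kb, the Y chromosome about 57 Mb, the whole genome about 3.2 Gb):

    # Approximate share of the genome represented by mtDNA plus the Y chromosome
    MTDNA_BP = 16_600
    Y_CHROMOSOME_BP = 57_000_000
    GENOME_BP = 3_200_000_000

    share = (MTDNA_BP + Y_CHROMOSOME_BP) / GENOME_BP
    print(f"mtDNA + Y chromosome: about {share:.1%} of the genome")  # roughly 1.8%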

Some genetic genealogy companies will sell you kits that claim to grant you membership to historical peoples, albeit ill-defined, highly romanticized versions of ancient Europeans. This type of genetic astrology, though unscientific and distasteful to my palate, is really just a bit of meaningless fantasy; its real damage is that it undermines scientific literacy in the general public.

Over centuries, people have been too mobile to have remained genetically isolated for any significant length of time. Tribes are known to have mixed before and after colonialism, which should be enough to indicate that some notion of tribal purity is at best imagined. Of the genetic markers that have been shown to exist in individual tribes so far, none is exclusive. Some tribes have begun to use DNA as a test to verify immediate family, such as in paternity cases, and this can be useful as part of qualification for tribal status. But on its own, a DNA test cannot place someone in a specific tribe.

That hasn’t stopped the emergence of some companies in the United States that sell kits that claim to use DNA to ascribe tribal membership. Accu-Metrics is one such company. On its web page, it states that there are “562 recognized tribes in the United States, plus at least 50 others in Canada, divided into First Nation, Inuit, and Metis.” For $125 the company claims that it “can determine if you belong to one of these groups.”

The idea that tribal status is encoded in DNA is both simplistic and wrong. Many tribespeople have non-native parents and still retain a sense of being bound to the tribe and the land they hold sacred. In Massachusetts, members of the Seaconke Wampanoag tribe identified European and African heritage in their DNA, due to hundreds of years of interbreeding with New World settlers. Attempting to conflate tribal status with DNA denies the cultural affinity that people have with their tribes. It suggests a kind of purity that genetics cannot support, a type of essentialism that resembles scientific racism.

The specious belief that DNA can bestow tribal identity, as sold by companies such as Accu-Metrics, can only foment further animosity—and suspicion—toward scientists. If a tribal identity could be shown by DNA (which it can’t), then perhaps reparation rights afforded to tribes in recent years might be invalid in the territories to which they were moved during the 19th century. Many tribes are effective sovereign nations and therefore not necessarily bound by the laws of the state in which they live.

When coupled with cases such as that of the Havasupai, and centuries of racism, the relationship between Native Americans and geneticists is not healthy. After the legal battles over the remains of Kennewick Man were settled, and it was accepted that he was not of European descent, the tribes were invited to join in the subsequent studies. Out of five, only the Colville Tribes did. Their representative, James Boyd, told The New York Times in 2015, “We were hesitant. Science hasn’t been good to us.”

Data is supreme in genetics, and data is what we crave. But we are the data, and people are not there for the benefit of others, regardless of how noble one’s scientific aims are. To deepen our understanding of how we came to be and who we are, scientists must do better, and invite people whose genes provide answers to not only volunteer their data, but to participate, to own their individual stories, and to be part of that journey of discovery.

This is beginning to change. A new model of engagement with the first people of the Americas is emerging, albeit at a glacial pace. The American Society of Human Genetics meeting is the annual who's who in genetics, and has been for many years, where all of the newest and biggest ideas in the study of human biology are discussed. In October 2016 it met in Vancouver, hosted by the Squamish Nation, a First Nations people based in British Columbia. They greeted the delegates with song, and passed the talking stick to the president for the proceedings to begin.

The relationship between science and indigenous people has been one characterized by a range of behaviors from outright exploitation to casual insensitivity to tokenism and lip service. Perhaps this time is coming to an end and we might foster a relationship based on trust, genuine engagement, and mutual respect, so that we might work together and build the capacity for tribes to lead their own research into the histories of these nations.

Though the terms Native American and Indian are relative, the United States is a nation of immigrants and descendants of slaves who have overwhelmed the indigenous population. Less than 2 percent of the current population defines itself as Native American, which means that 98 percent of Americans are unable to trace their roots, genetic or otherwise, beyond 500 years on American soil. That is, however, plenty of time for populations to come and breed and mix and lay down patterns of ancestry that can be enlightened with living DNA as our historical text.

A comprehensive genetic picture of the people of postcolonial North America was revealed at the beginning of 2017, drawn from data submitted by paying customers to the genealogy company AncestryDNA. The genomes of more than 770,000 people born in the United States were filtered for markers of ancestry, and revealed a picture of mishmash, as you might expect from a country of immigrants.

Nevertheless, genetic clusters of specific European countries are seen. Paying customers supply spit harboring their genomes, alongside whatever genealogical data they have. By aligning these as carefully as possible, a map of post-Columbus America can be summoned with clusters of common ancestry, such as Finnish and Swedish in the Midwest, and Acadians—French-speaking Canadians from the Atlantic seaboard—clustering way down in Louisiana, close to New Orleans, where the word Acadian has mutated into Cajun. Here, genetics recapitulates history, as we know the Acadians were forcibly expelled by the British in the 18th century, and many eventually settled in Louisiana, then under Spanish control.

In trying to do something similar with African Americans, we immediately stumble. Most black people in the United States cannot trace their genealogy with much precision because of the legacy of slavery. Their ancestors were seized from West Africa, leaving little or no record of where they were born. In 2014, the genetic genealogy company 23andMe published its version of the population structure of the United States. In that portrait we see a similar pattern of European admixture, and some insights into the history of the postcolonial United States.

The Emancipation Proclamation—a federal mandate to change the legal status of slaves to free—was issued by President Lincoln in 1863, though the effects were not necessarily immediate. In the genomic data, there’s admixture between European DNA and African that begins in earnest around six generations ago, roughly in the mid-19th century. Within these samples we see more male European DNA and female African, measured by Y chromosome and mitochondrial DNA, suggesting male Europeans had sex with female slaves. Genetics makes no comment on the nature of these relations.
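For a rough sense of how "six generations ago" maps onto calendar time, here is the conversion with a generation length of 25 to 30 years assumed for illustration (the study itself is not quoted on this figure, and ~2000 is used as a stand-in birth era for the sampled customers):

    # Converting "six generations ago" into approximate calendar years
    for generation_years in (25, 30):
        years_back = 6 * generation_years
        print(f"{generation_years}-year generations: about {years_back} years before ~2000, i.e. around {2000 - years_back}")

Both assumptions land in the early-to-mid 19th century, consistent with the article's "roughly in the mid-19th century."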


The Most Important Predictions Of Nostradamus

History is filled with stories about people who could supposedly see into the future. From Biblical figures like Isaiah and Elijah to more recent seers like Edgar Cayce, every era in the history of humankind seems to have its prophets. They are said to have foretold future events that, in many cases, occurred a long time after their deaths. In fact, some of the events they foretold have supposedly yet to happen. In 16th-century Europe, during the time of the Renaissance, there was one particular person who gained notoriety for his predictions. His name was Michel de Nostredame, but he came to be better known by his Latinized name, Nostradamus.

The Physician Who Became A Prophet

Nostradamus was born in Saint-Rémy-de-Provence, France, in 1503. He initially gained recognition not for his prophecies but for his work as a physician, treating victims of the plague, which was ravaging Europe at the time. It was later in his life that Nostradamus began writing down his predictions for the future. In 1555, he published his most famous work, known as Les Prophéties (The Prophecies). When Nostradamus wrote about his predictions, however, he did not do so in a simple fashion. He wrote his prophecies in several languages, including Greek, Italian, and Latin. Moreover, he did not write them in such a way that they were easy to understand.

He wrote his prophecies as quatrains, which are four-line rhyming verses. In doing this, he made his predictions very difficult to interpret. There is still debate today among experts as to what Nostradamus was trying to say in his writings. But why would he disguise his prophecies in such a way? The reason was that in the era in which Nostradamus lived, trying to predict the future could lead to persecution at the hands of the Roman Catholic Church, as prophesying was considered to be heresy and the work of the devil.

Nostradamus wrote about future events that he believed would come to pass over the next two thousand years. Some of his prophecies supposedly came to pass during his lifetime or shortly after his death, while others have yet to occur and may not happen for centuries to come. So what did the 16th-century physician-turned-prophet predict? He allegedly foresaw all sorts of different events, ranging from natural disasters to armed conflicts.

Nostradamus' First Predictions

Among his first predictions were events that occurred when he was still alive. He is said to have predicted that a monk he met on his travels would be a future pope. He was correct, as the monk eventually became Pope Sixtus V in 1585. Nostradamus is also alleged to have predicted the death of King Henri II of France, saying in one of his quatrains, “The young lion will overcome the older one, on the field of combat in a single battle; he will pierce his eyes through a golden cage, two wounds made one, then he dies a cruel death.” King Henri II ultimately died during a ceremonial jousting match when he was stabbed through his mask in the face by his opponent, who was six years younger than him, hence the reference to the “young lion” in Nostradamus’ quatrain. Beforehand, Nostradamus had tried to warn the king not to take part in any ceremonial jousting, but to no avail. His last prediction was said to have been his own death: on the evening of July 1, 1566, he allegedly told his secretary that he would not be alive the next morning. The next morning, his secretary found him lying dead next to his bed.

Nostradamus’ most significant predictions, however, supposedly took place or will take place, centuries after the prophet’s passing. Experts suggest that he predicted the French Revolution of 1789. They attribute this prediction to one of his quatrains, which reads, “From the enslaved populace, songs, chants and demands while princes and lords are held captive in prisons. These will be in the future by headless idiots be received as divine prayers.” The French Revolution began with the storming of the Bastille Prison. The masses, which Nostradamus had referred to as “the enslaved populace,” rose to overthrow the French monarchy and establish the French Republic. During the years of upheaval that followed, many Frenchmen, Nostradamus’ supposed “headless idiots,” were executed by being decapitated via the guillotine.

The "Antichrists"

Another event that the French seer may have predicted was the rise of Napoleon Bonaparte, also referred to by Nostradamus experts as the First Antichrist. In one of his quatrains, Nostradamus used the words, “Pau, Nay, Loron,” which students of the prophet suggest is an anagram for Napaulon Roy, or Napoleon, the King (Roy) of France. Napoleon would, of course, go on to conquer nearly all of Europe before his ultimate defeat and death in exile.

Nostradamus is also thought to have predicted the rise of the second Antichrist, Adolf Hitler. The prophet wrote, “From the depths of Western Europe, a young child will be born to poor people, he who by his tongue will seduce a great troop; his fame will increase towards the realms of the East.” Hitler was born in Austria, which could be argued is part of “the depths of Western Europe.” He was able to persuade his supporters, the “great troop,” to follow him in his campaigns of conquest and genocide, much of which occurred in Eastern Europe, “the realms of the east.” Nostradamus also mentioned the word Hister in another one of his quatrains, which, of course, is similar to Hitler, though it may also refer to the ancient name of the Danube River near to where Hitler was born.

Did Nostradamus Predict 9/11?

Some believe that the French prophet even predicted the terrorist attacks of September 11, 2001. He wrote, “Earthshaking fire from the center of the earth will cause tremors around the New City. Two high rocks will war for a long time, and then Arethusa will redden a new river.” Proponents of Nostradamus’ prophecies suggest that the New City he refers to is New York City and that the two high rocks and the center of the earth refer to the two towers that constituted the World Trade Center.

The other most essential predictions of Nostradamus are arguably the ones that have not yet come to pass. Among these prophecies is the emergence of a third Antichrist, a third world war, and the exact year the world will come to an end. The story of Nostradamus has many people asking to this day how a person born in Renaissance-era France could have predicted events that would purportedly occur centuries after his life and death. It is likely that as long as humanity seeks to know what the future holds, figures like Nostradamus will continue to intrigue us.


Race/history/evolution notes

This spurious quotation attributed to George Washington has been promoted recently by a poster at Majority Rights: “I am a citizen of the greatest Republic of Mankind. I see the human race united like a huge family by brotherly ties. We have made a sowing of liberty which will, little by little, spring up across the whole world. One day, on the model of the United States of America, a United States of Europe will come into being. The United States will legislate for all its nationalities.” Variations have also been repeated by Belgian Prime Minister Guy Verhofstadt, the half-Japanese founder of the Pan-European Union Coudenhove-Kalergi, Eric Voegelin, and others.

For the search engines, I'm reposting my reply from Majority Rights showing the source of this bogus Washington quotation below (continue reading):

Washington wrote to Lafayette that he considered himself a "citizen of the great republic of humanity," adding: "I see the human race a great family, united by fraternal bonds." Elsewhere he wrote prophetically: "We have sown a seed of liberty and union that will gradually germinate throughout the earth. Some day, on the model of the United States of America, will be constituted the United States of Europe."

Title: The people of action: an essay on American idealism
Authors: Gustave Rodrigues, James Mark Baldwin
Translated by: Louise Seymour Houghton
Publisher: C. Scribner's Sons, 1918
http://books.google.com/books?id=b8Y9AAAAYAAJ

Altho' I pretend to no peculiar information respecting commercial affairs, nor any foresight into the scenes of futurity; yet as the member of an infant empire, as a Philanthropist by character, and (if I may be allowed the expression) as a Citizen of the great republic of humanity at large, I cannot help turning my attention sometimes to this subject. I would be understood to mean, I cannot avoid reflecting with pleasure on the probable influence that commerce may hereafter have on human manners and society in general. On these occasions I consider how mankind may be connected like one great family in fraternal ties. I indulge a fond, perhaps an enthusiastic idea, that as the world is evidently much less barbarous than it has been, its melioration must still be progressive; that nations are becoming more humanized in their policy, that the subjects of ambition and causes for hostility are daily diminishing, and, in fine, that the period is not very remote, when the benefits of a liberal and free commerce will, pretty generally, succeed to the devastations and horrors of war.

Some of the late treaties which have been entered into, and particularly that between the King of Prussia and the Ud. States, seem to constitute a new era in negotiation, and to promise the happy consequences I have just now been mentioning. But let me ask you my Dr. Marquis, in such an enlightened, in such a liberal age, how is it possible the great maritime powers of Europe should submit to pay an annual tribute to the little piratical States of Barbary? Would to Heaven we had a navy able to reform those enemies to mankind, or crush them into non-existence.

I forbear to enter into a discussion of our domestic Politics, because there is little interesting to be said upon them, and perhaps it is best to be silent, since I could not disguise or palliate where I might think them erroneous. The British still hold the frontier Posts, and are determined to do so. The Indians commit some trifling ravages, but there is nothing like a general or even open war. You will have heard what a loss we have met with by the death of poor Genl. Greene. General McDougal and Colo. Tilghman are also dead.

This ["citizen of humanity" talk] is typical enlightenment pablum [and nothing more]. No talk of a "United States of Europe" here, nor "elsewhere" from Washington. The other quotation Rodrigues attributes to Washington is merely the 19th-century French biographer putting words into the mouths of "Washington and his friends" without citing any source, Washington having never written anything like this:

The United States guarantees to every state admitted into the Union a republican form of government; it protects it against invasion; it defends it, at the request of its representatives, against all domestic violence; it makes it a sharer in the advantages of the common society; and it secures to all citizens the essential rights of the human person.

Washington and his friends said:

"Our example will prove to men that they are not condemned to receive their government forever from chance and force, and that they are capable of giving themselves good institutions by reflection and choice.

"We have sown a seed of liberty and union, which will germinate little by little throughout the whole earth.

"One day, on the model of the United States of America, the United States of Europe will be constituted."

The constitution adopted by the American Convention began to be applied in 1789.

The threefold election of representatives, senators, and president took place peacefully. Washington was unanimously named president of the United States.


Dr. Luciana Borio of the former White House National Security Council (NSC) team responsible for pandemics has previously warned of a pandemic flu threat.

According to CNN's Dale, Borio, the council's director of medical and biodefense preparedness, said in 2018: "The threat of pandemic flu is the number one health security concern. Are we ready to respond? I fear the answer is no."

John Bolton, Trump's national security adviser at the time, later disbanded the team while reorganizing the NSC.



The specifics of Paleo-Indians' migration to and throughout the Americas, including the exact dates and routes traveled, are subject to ongoing research and discussion. [1] For years, the traditional theory has been that these early migrants moved onto the Beringia land bridge between eastern Siberia and present-day Alaska around 40,000–17,000 years ago, when sea levels were significantly lowered due to the Quaternary glaciation. [1] [4] These people are believed to have followed herds of now-extinct Pleistocene megafauna along ice-free corridors that stretched between the Laurentide and Cordilleran ice sheets. [5] Another route proposed is that, either on foot or using primitive boats, they migrated down the Pacific coast to South America. [6] Evidence of the latter would since have been covered by a sea level rise of well over a hundred meters following the last ice age. [7]

Archaeologists contend that Paleo-Indian migration out of Beringia (eastern Alaska) ranges from 40,000 to around 16,500 years ago. [8] [9] [10] [11] This time range is a hot source of debate and promises to continue as such for years to come. The few points of agreement reached to date are an origin in Central Asia, with widespread habitation of the Americas during the end of the last glacial period, or more specifically what is known as the late glacial maximum, around 16,000–13,000 years before present. [11] [12] However, older alternative theories exist, including migration from Europe. [13]

Stone tools, particularly projectile points and scrapers, are the primary evidence of early human activity in the Americas. Crafted lithic flaked tools are used by archaeologists and anthropologists to classify cultural periods. [14] Scientific evidence links indigenous Americans to Asian peoples, specifically eastern Siberian populations. Indigenous peoples of the Americas have been linked to North Asian populations by linguistic dialects, the distribution of blood types, and genetic composition as reflected by molecular data, such as DNA. [15] Around 8,000–7,000 BCE (10,000–9,000 years ago), the climate stabilized, leading to a rise in population and advances in lithic technology, resulting in a more sedentary lifestyle.

Pre-Columbian era

Before contact with Europeans, the indigenous peoples of North America were divided into many different polities, from small bands of a few families to large empires. They lived in numerous culture areas, which roughly correspond to geographic and biological zones. Societies adapted their subsistence strategies to their homelands, and some societies were hunter-gatherers, some horticulturists, some agriculturalists, and many a mix of these. Native groups can also be classified by their language family (e.g. Athapascan or Uto-Aztecan). It is important to note that people with similar languages did not always share the same material culture, nor were they always allies.

The Archaic period in the Americas saw a changing environment featuring a warmer, more arid climate and the disappearance of the last megafauna. [16] The majority of population groups at this time were still highly mobile hunter-gatherers, but now individual groups started to focus on resources available to them locally; thus, with the passage of time, there is a pattern of increasing regional generalization, as seen, for example, in the Southwest, Arctic, Poverty Point culture, Plains Archaic, Dalton, and Plano traditions. [17] [18] Many groups continued as big game hunters, but their hunting traditions became more varied, and meat procurement methods more sophisticated. [19] [20] The placement of artifacts and materials within an Archaic burial site indicated social differentiation based upon status in some groups. [21]

Agriculture was invented independently in two regions of North America: the Eastern Woodlands [22] and Mesoamerica. The more southern cultural groups of North America were responsible for the domestication of many common crops now used around the world, such as tomatoes and squash. Perhaps most importantly they domesticated one of the world's major staples, maize (corn). During the Plains Village period, agriculture and bison-hunting were important to Great Plains tribes.

As a result of the development of agriculture in the south, many important cultural advances were made there. For example, the Maya civilization developed a writing system, built huge pyramids, had a complex calendar, and developed the concept of zero 500 years before anyone in the Old World. The Mayan culture was still present when the Spanish arrived in Central America, but political dominance in the area had shifted to the Aztec Empire further north.

In the Southwest of North America, Hohokam and Ancestral Pueblo societies had been engaged in agricultural production with ditch irrigation and a sedentary village life for at least two millennia before the Spanish arrived in the 1540s. [23] Upon the arrival of the Europeans in the "New World", native peoples found their culture changed drastically. As such, their affiliation with political and cultural groups changed as well; several linguistic groups went extinct, and others changed quite quickly. The names and cultures that Europeans recorded for the natives were not necessarily the same as the ones they had used a few generations before, or the ones in use today.

Early contact

There was limited contact between North American people and the outside world before 1492. Several theoretical contacts have been proposed, but the earliest physical evidence comes from the Norse or Vikings. Erik the Red founded a colony on Greenland in 985 CE. Erik's son Leif Eriksson is believed to have reached the Island of Newfoundland circa 1000, naming the discovery Vinland. The only Norse site outside of Greenland yet discovered in North America is at L'Anse aux Meadows, Newfoundland and Labrador in Canada. All of the Norse colonies were eventually abandoned.

The Norse voyages did not become common knowledge in the Old World. Even the permanent settlement in Greenland, which persisted until the early 1400s, received little attention and Europeans remained largely ignorant of the existence of the Americas until 1492. As part of a general age of discovery, Italian sailor Christopher Columbus proposed a voyage west from Europe to find a shorter route to Asia. He eventually received the backing of Isabella I and Ferdinand II, Queen and King of newly united Spain. In 1492 Columbus reached land in the Bahamas.

Almost 500 years after the Norse, John Cabot explored the east coast of what would become Canada in 1497. Giovanni da Verrazzano explored the East Coast of North America from Florida to, presumably, Newfoundland in 1524. Jacques Cartier made a series of voyages on behalf of the French crown beginning in 1534 and explored the St. Lawrence River.

Successful colonization

In order to understand what constitutes successful colonization, it is important to understand what colonization means. Colonization refers to large-scale population movements in which the migrants maintain strong links with their or their ancestors' former country, gaining significant advantages over other inhabitants of the territory by such links. When colonization takes place under the protection of clearly colonial political structures, it may most handily be called settler colonialism. This often involves the settlers' entirely dispossessing earlier inhabitants, or instituting legal and other structures which systematically disadvantage them. [24]

Initially, European activity consisted mostly of trade and exploration. Eventually Europeans began to establish settlements. The three principal colonial powers in North America were Spain, England, and France, although eventually other powers such as the Netherlands and Sweden also acquired holdings on the continent.

Settlement by the Spanish started the European colonization of the Americas. [25] [26] They gained control of most of the largest islands in the Caribbean and conquered the Aztec empire, gaining control of present-day Mexico and Central America. This was the beginning of the Spanish Empire in the New World. The first successful Spanish settlement in continental North America was Veracruz founded by Hernán Cortés in 1519, followed by many other settlements in colonial New Spain, including Spanish Florida, Central America, New Mexico, and later California. The Spanish claimed all of North and South America (with the exception of Brazil), and no other European power challenged those claims by planting colonies until over a century after Spain's first settlements.

The first French settlements were Port Royal (1604) and Quebec City (1608) in what is now Nova Scotia and Quebec. The fur trade soon became the primary business on the continent, and as a result it transformed the indigenous North American ways of life.

The first permanent English settlements were at Jamestown (1607) (along with its satellite, Bermuda in 1609) and Plymouth (1620), in what are today Virginia and Massachusetts respectively. Further to the south, plantation slavery became the main industry of the West Indies, and this gave rise to the beginning of the Atlantic slave trade.

By the year 1663 the French crown had taken over control of New France from the fur-trading companies, and the English charter colonies gave way to more metropolitan control. This ushered in a new era of more formalized colonialism in North America.

Rivalry between the European powers created a series of wars on the North American landmass that would have great impact on the development of the colonies. Territory often changed hands multiple times. Peace was not achieved until French forces in North America were vanquished at the Battle of the Plains of Abraham at Quebec City, and France ceded most of her claims outside of the Caribbean. The end of the French presence in North America was a disaster for most Native nations in Eastern North America, who lost their major ally against expanding Anglo-American settlement. During Pontiac's Rebellion from 1763 to 1766, a confederation of Great Lakes-area tribes fought a somewhat successful campaign to defend their rights over their lands west of the Appalachian Mountains, which had been "reserved" for them under the Royal Proclamation of 1763.

The Viceroyalty of New Spain (centered on present-day Mexico) was the name of the viceroy-ruled territories of the Spanish Empire in North America and its peripheries, as well as in Asia, from 1535 to 1821.

The coming of the American Revolution had a great impact across the continent. Most importantly, it led directly to the creation of the United States of America, and the associated American Revolutionary War touched all corners of the region. The flight of the United Empire Loyalists led to the creation of English Canada as a separate community.

Meanwhile, Spain's hold on Mexico was weakening. Independence was declared in 1810 by Miguel Hidalgo, starting the Mexican War of Independence. In 1813, José María Morelos and the Congress of Anáhuac signed the Solemn Act of the Declaration of Independence of Northern America, the first legal document to proclaim the separation of New Spain from Spain. Spain finally recognized Mexico's independence in 1821.

From the time of independence of the United States, that country expanded rapidly to the west, acquiring the massive Louisiana territory in 1803. Between 1810 and 1811 a Native confederacy under Tecumseh fought unsuccessfully to keep the Americans from pushing them out of the Great Lakes. Tecumseh's followers then went north into Canada, where they helped the British to block an American attempt to seize Canada during the War of 1812. Following the war, British and Irish settlement in Canada increased dramatically.

US expansion was complicated by the division between "free" and "slave" states, which led to the Missouri Compromise in 1820. Likewise, Canada faced a division between French and English communities that led to the outbreak of civil strife in 1837. Mexico faced constant political tensions between liberals and conservatives, as well as the rebellion of the English-speaking region of Texas, which declared itself the Republic of Texas in 1836. In 1845 Texas joined the United States, which led in 1846 to the Mexican–American War, a conflict often viewed as the start of American imperialism. As a result of that conflict, the United States made further territorial gains in California and the Southwest.

The secession of the Confederate States and the resulting civil war rocked American society. It eventually led to the end of slavery in the United States, the destruction and later reconstruction of most of the South, and tremendous loss of life. From the conflict, the United States emerged as a powerful industrialized nation.

Partly as a response to the threat of American power, four of the Canadian colonies agreed to federate in 1867, creating the Dominion of Canada. The new nation was not fully sovereign, but enjoyed considerable independence from Britain. With the addition of British Columbia, Canada would expand to the Pacific by 1871 and establish a transcontinental railway, the Canadian Pacific, by 1885.

In Mexico, conflicts like the Reform War left the state weak and open to foreign influence. This led the Second French Empire to invade Mexico.

In both Canada and the United States, the second half of the 19th century witnessed massive inflows of immigration to settle the West. These lands were not uninhabited, however: in the United States the government fought numerous Indian Wars against the native inhabitants. In Canada, relations were more peaceful as a result of the Numbered Treaties, but two rebellions broke out on the prairies, in 1870 and 1885. The British colony of Newfoundland became a dominion in 1907.

In Mexico, the entire era was dominated by the dictatorship of Porfirio Díaz.

World War I

As a part of the British Empire, Canada was immediately at war in 1914. Canada bore the brunt of several major battles during the early stages of the war, including the poison gas attacks at Ypres. Losses became grave, and the government eventually brought in conscription, despite the wishes of the majority of French Canadians. In the ensuing Conscription Crisis of 1917, riots broke out on the streets of Montreal. In neighboring Newfoundland, the new dominion suffered a devastating loss on July 1, 1916, the first day of the Battle of the Somme.

The United States stayed apart from the conflict until 1917, joining the Entente powers. The United States was then able to play a crucial role at the Paris Peace Conference of 1919 that shaped interwar Europe.

Mexico was not part of the war as the country was embroiled in the Mexican Revolution at the time.

Interwar years

The 1920s brought an age of great prosperity in the United States, and to a lesser degree Canada. But the Wall Street Crash of 1929 combined with drought ushered in a period of economic hardship in the United States and Canada.

The Cristero War (1926–1929) was a popular uprising against the anti-Catholic Mexican government of the time, set off specifically by the anti-clerical provisions of the Mexican Constitution of 1917.

World War II

Once again Canada found itself at war before her neighbors; even so, Canadian contributions were slight before the Japanese attack on Pearl Harbor. The entry of the United States into the war helped to tip the balance in favor of the Allies. On August 19, 1942, a force of some 6,000 infantry, largely Canadian, was landed near the French Channel port of Dieppe. The German defenders under General von Rundstedt destroyed the invaders: 907 Canadians were killed and almost 2,500 captured (many of them wounded). Lessons learned in this abortive raid were put to good use two years later in the successful Normandy invasion.

Two Mexican tankers, transporting oil to the United States, were attacked and sunk by the Germans in the Gulf of Mexico in 1942. The incident happened in spite of Mexico's neutrality at the time. This led Mexico to declare war on the Axis nations and enter the conflict.

The destruction of Europe wrought by the war vaulted all North American countries to more important roles in world affairs. The United States especially emerged as a "superpower".

The early Cold War era saw the United States as the most powerful nation in a Western coalition of which Mexico and Canada were also a part. At home, the United States witnessed convulsive change especially in the area of race relations. In Canada this was mirrored by the Quiet Revolution and the emergence of Quebec nationalism.

Mexico experienced an era of huge economic growth after World War II, a heavy industrialization process, and a growth of its middle class, a period known in Mexican history as "El Milagro Mexicano" (the Mexican miracle).

The Caribbean saw the beginnings of decolonization, while on the largest island the Cuban Revolution introduced Cold War rivalries into Latin America.

In 1959 the non-contiguous US territories of Alaska and Hawaii became US states.

During this time the United States became involved in the Vietnam War as part of the global Cold War. The war would later prove to be highly divisive in American society, and American involvement in Indochina ended in 1975 with the Khmer Rouge's capture of Phnom Penh on April 17, the Vietnam People's Army's capture of Saigon on April 30, and the Pathet Lao's takeover of Vientiane on December 2.

Canada during this era was dominated by the leadership of Pierre Elliott Trudeau. In 1982, near the end of his tenure, Canada received a new constitution.

Both the United States and Canada experienced stagflation, which eventually led to a revival in small-government politics.

Mexican presidents Miguel de la Madrid, in the early 1980s, and Carlos Salinas de Gortari, in the late 1980s, began implementing liberal economic strategies that were generally seen as a positive move. However, Mexico experienced a strong economic recession in 1982, and the Mexican peso suffered a devaluation. The presidential elections held in 1988 were forecast to be very competitive, and they were. The leftist candidate Cuauhtémoc Cárdenas, son of Lázaro Cárdenas, one of the most beloved Mexican presidents, ran a successful campaign and was reported as the leader in several opinion polls. On July 6, 1988, the day of the elections, the IBM AS/400 system that the government was using to count the votes shut down, presumably by accident. The government simply stated that "se cayó el sistema" ("the system crashed"). When the system was finally restored, the PRI candidate Carlos Salinas was declared the official winner. It was the first time since the Revolution that a non-PRI candidate had come so close to winning the presidency.

In the United States, President Ronald Reagan attempted to move the country back toward a hard anti-communist line in foreign affairs, in what his supporters saw as an attempt to assert moral leadership (compared to the Soviet Union) in the world community. Domestically, Reagan attempted to bring in a package of privatization and trickle-down economics to stimulate the economy.

Canada's Brian Mulroney ran on a similar platform to Reagan, and also favored closer trade ties with the United States. This led to the Canada-United States Free Trade Agreement in January 1989.

The end of the Cold War coincided with the beginning of an era of sustained economic expansion during the 1990s. On January 1, 1994, the North American Free Trade Agreement between Canada, Mexico and the United States came into effect, creating the world's largest free trade area. Quebec held a referendum on national sovereignty in 1995, in which 50.6% voted no and 49.4% voted yes. In 2000, Vicente Fox became the first non-PRI candidate to win the Mexican presidency in over 70 years.

The optimism of the 1990s was shattered by the 9/11 attacks of 2001 on the United States, which prompted a military intervention in Afghanistan in which Canada also participated. Canada and Mexico did not support the United States' later move to invade Iraq.

In 2006 the drug war in Mexico escalated into an actual military conflict, with each year deadlier than the last.

In the winter of 2007, a financial crisis began in the United States; it eventually triggered a worldwide recession in the fall of 2008.

In 2009, Barack Obama was inaugurated as the first African American President of the United States. Two years later, Osama bin Laden, the mastermind behind the 9/11 attacks, was found and killed. The Iraq War was formally declared over on December 18, 2011, once the troops had pulled out. The war in Afghanistan was likewise declared over on December 28, 2014, when most troops were withdrawn, though some stayed behind for the next phase of the conflict.


'It's Not That the Story Was Buried.' What Americans in the 1930s Really Knew About What Was Happening in Germany

Few are as aware that the news is the first draft of history as is the team behind a recently opened exhibition at the United States Holocaust Memorial Museum (USHMM). To put together Americans and the Holocaust, they combed through the German news column in more than a decade’s worth of issues of TIME magazine — and parallel sections from many other magazines and newspapers — and what they found refuted a persistent, though oft-debunked, myth about World War II and the Holocaust: the idea that, as the museum puts it, “Americans lacked access to information about the persecution of Jews as it was happening.”

Looking at the news that publications like TIME ran in the 1930s and ’40s shows that, in fact, Americans had lots of access to news about what was happening to Europe’s Jewish population and others targeted by the Nazi regime. But it also highlights a central truth about this period — and human beings in general. Reading or hearing something is not the same as understanding what it truly means, curator Daniel Greene tells TIME, and there’s a wide “gap between information and understanding.”

Case in point: Dr. Paul Joseph Goebbels, Nazi propaganda minister, on the cover of the July 10, 1933, issue of TIME Magazine, from 85 years ago this Tuesday.

Though TIME’s 1933 article, about Hitler’s new cabinet, didn’t yet treat Hitler with complete seriousness — he was referred to as a “Vegetarian Superman” — it didn’t pull punches on the ideas behind his ascent. The article presented as fact that the consolidation of Nazi rule had lifted the spirit of the German people, even as the world watched warily, and explained that one tactic above all was helping Hitler and Goebbels with that uplift: “explaining away all Germany’s defeats and trials in terms of the Jew.”

It would have been impossible to read the story and miss the danger Hitler presented to his country’s Jewish citizens:

True Germans were not defeated in the War, so runs the Nazi tale for grown-up children. They were betrayed by Jewish pacifists. Marx was a Jew! In the welter of German revolution the Jews fomented a German Republic essentially Marxist. Under inflation “which only the Jews understood,” they bled true Germans white by their scheming speculation. Somehow or other they had something to do with the mountain of debt the Allies piled on Germany. All these “facts” are profoundly important in the Germany of today. They are at the root of national resurgence. By blaming everything on their Jewish fellow men, other Germans are escaping from their mental prison of inferiority. Louder and louder the Minister of Propaganda dins with clenched and pounding fists the exhortation he has thundered from half the platforms and over all the radios in Germany: “Never forget it, comrades, and repeat it a hundred times so you will say it in your dreams — THE JEWS ARE TO BLAME!”


That chilling last line also appeared on the front cover of the issue.

The magazine reported that sterilization of Jewish citizens had been discussed, and explained the economic consequences Jewish businesses were already facing. As Greene points out, these already horrifying concepts gain an extra layer of terror with the help of hindsight, as we know now that they were only the beginning.


The End of White America?

The election of Barack Obama is just the most startling manifestation of a larger trend: the gradual erosion of “whiteness” as the touchstone of what it means to be American. If the end of white America is a cultural and demographic inevitability, what will the new mainstream look like—and how will white Americans fit into it? What will it mean to be white when whiteness is no longer the norm? And will a post-white America be less racially divided—or more so?

“Civilization’s going to pieces,” he remarks. He is in polite company, gathered with friends around a bottle of wine in the late-afternoon sun, chatting and gossiping. “I’ve gotten to be a terrible pessimist about things. Have you read The Rise of the Colored Empires by this man Goddard?” They hadn’t. “Well, it’s a fine book, and everybody ought to read it. The idea is if we don’t look out the white race will be—will be utterly submerged. It’s all scientific stuff; it’s been proved.”

He is Tom Buchanan, a character in F. Scott Fitzgerald’s The Great Gatsby, a book that nearly everyone who passes through the American education system is compelled to read at least once. Although Gatsby doesn’t gloss as a book on racial anxiety—it’s too busy exploring a different set of anxieties entirely—Buchanan was hardly alone in feeling besieged. The book by “this man Goddard” had a real-world analogue: Lothrop Stoddard’s The Rising Tide of Color Against White World-Supremacy, published in 1920, five years before Gatsby. Nine decades later, Stoddard’s polemic remains oddly engrossing. He refers to World War I as the “White Civil War” and laments the “cycle of ruin” that may result if the “white world” continues its infighting. The book features a series of foldout maps depicting the distribution of “color” throughout the world and warns, “Colored migration is a universal peril, menacing every part of the white world.”

As briefs for racial supremacy go, The Rising Tide of Color is eerily serene. Its tone is scholarly and gentlemanly, its hatred rationalized and, in Buchanan’s term, “scientific.” And the book was hardly a fringe phenomenon. It was published by Scribner, also Fitzgerald’s publisher, and Stoddard, who received a doctorate in history from Harvard, was a member of many professional academic associations. It was precisely the kind of book that a 1920s man of Buchanan’s profile—wealthy, Ivy League–educated, at once pretentious and intellectually insecure—might have been expected to bring up in casual conversation.

As white men of comfort and privilege living in an age of limited social mobility, of course, Stoddard and the Buchanans in his audience had nothing literal to fear. Their sense of dread hovered somewhere above the concerns of everyday life. It was linked less to any immediate danger to their class’s political and cultural power than to the perceived fraying of the fixed, monolithic identity of whiteness that sewed together the fortunes of the fair-skinned.

From the hysteria over Eastern European immigration to the vibrant cultural miscegenation of the Harlem Renaissance, it is easy to see how this imagined worldwide white kinship might have seemed imperiled in the 1920s. There’s no better example of the era’s insecurities than the 1923 Supreme Court case United States v. Bhagat Singh Thind, in which an Indian American veteran of World War I sought to become a naturalized citizen by proving that he was Caucasian. The Court considered new anthropological studies that expanded the definition of the Caucasian race to include Indians, and the justices even agreed that traces of “Aryan blood” coursed through Thind’s body. But these technicalities availed him little. The Court determined that Thind was not white “in accordance with the understanding of the common man” and therefore could be excluded from the “statutory category” of whiteness. Put another way: Thind was white, in that he was Caucasian and even Aryan. But he was not white in the way Stoddard or Buchanan were white.

The ’20s debate over the definition of whiteness—a legal category? a commonsense understanding? a worldwide civilization?—took place in a society gripped by an acute sense of racial paranoia, and it is easy to regard these episodes as evidence of how far we have come. But consider that these anxieties surfaced when whiteness was synonymous with the American mainstream, when threats to its status were largely imaginary. What happens once this is no longer the case—when the fears of Lothrop Stoddard and Tom Buchanan are realized, and white people actually become an American minority?

Whether you describe it as the dawning of a post-racial age or just the end of white America, we’re approaching a profound demographic tipping point. According to an August 2008 report by the U.S. Census Bureau, those groups currently categorized as racial minorities—blacks and Hispanics, East Asians and South Asians—will account for a majority of the U.S. population by the year 2042. Among Americans under the age of 18, this shift is projected to take place in 2023, which means that every child born in the United States from here on out will belong to the first post-white generation.

Obviously, steadily ascending rates of interracial marriage complicate this picture, pointing toward what Michael Lind has described as the “beiging” of America. And it’s possible that “beige Americans” will self-identify as “white” in sufficient numbers to push the tipping point further into the future than the Census Bureau projects. But even if they do, whiteness will be a label adopted out of convenience and even indifference, rather than aspiration and necessity. For an earlier generation of minorities and immigrants, to be recognized as a “white American,” whether you were an Italian or a Pole or a Hungarian, was to enter the mainstream of American life; to be recognized as something else, as the Thind case suggests, was to be permanently excluded. As Bill Imada, head of the IW Group, a prominent Asian American communications and marketing company, puts it: “I think in the 1920s, 1930s, and 1940s, [for] anyone who immigrated, the aspiration was to blend in and be as American as possible so that white America wouldn’t be intimidated by them. They wanted to imitate white America as much as possible: learn English, go to church, go to the same schools.”

Today, the picture is far more complex. To take the most obvious example, whiteness is no longer a precondition for entry into the highest levels of public office. The son of Indian immigrants doesn’t have to become “white” in order to be elected governor of Louisiana. A half-Kenyan, half-Kansan politician can self-identify as black and be elected president of the United States.

As a purely demographic matter, then, the “white America” that Lothrop Stoddard believed in so fervently may cease to exist in 2040, 2050, or 2060, or later still. But where the culture is concerned, it’s already all but finished. Instead of the long-standing model of assimilation toward a common center, the culture is being remade in the image of white America’s multiethnic, multicolored heirs.

For some, the disappearance of this centrifugal core heralds a future rich with promise. In 1998, President Bill Clinton, in a now-famous address to students at Portland State University, welcomed the coming demographic shift, assuring his audience that an America in which no racial group held a majority would be a better, more diverse country.

Not everyone was so enthused. Clinton’s remarks caught the attention of another anxious Buchanan—Pat Buchanan, the conservative thinker. Revisiting the president’s speech in his 2001 book, The Death of the West, Buchanan wrote: “Mr. Clinton assured us that it will be a better America when we are all minorities and realize true ‘diversity.’ Well, those students [at Portland State] are going to find out, for they will spend their golden years in a Third World America.”

Today, the arrival of what Buchanan derided as “Third World America” is all but inevitable. What will the new mainstream of America look like, and what ideas or values might it rally around? What will it mean to be white after “whiteness” no longer defines the mainstream? Will anyone mourn the end of white America? Will anyone try to preserve it?

Another moment from The Great Gatsby: as Fitzgerald’s narrator and Gatsby drive across the Queensboro Bridge into Manhattan, a car passes them, and Nick Carraway notices that it is a limousine “driven by a white chauffeur, in which sat three modish negroes, two bucks and a girl.” The novelty of this topsy-turvy arrangement inspires Carraway to laugh aloud and think to himself, “Anything can happen now that we’ve slid over this bridge, anything at all …”

For a contemporary embodiment of the upheaval that this scene portended, consider Sean Combs, a hip-hop mogul and one of the most famous African Americans on the planet. Combs grew up during hip-hop’s late-1970s rise, and he belongs to the first generation that could safely make a living working in the industry—as a plucky young promoter and record-label intern in the late 1980s and early 1990s, and as a fashion designer, artist, and music executive worth hundreds of millions of dollars a brief decade later.

In the late 1990s, Combs made a fascinating gesture toward New York’s high society. He announced his arrival into the circles of the rich and powerful not by crashing their parties, but by inviting them into his own spectacularly over-the-top world. Combs began to stage elaborate annual parties in the Hamptons, not far from where Fitzgerald’s novel takes place. These “white parties”—attendees are required to wear white—quickly became legendary for their opulence (in 2004, Combs showcased a 1776 copy of the Declaration of Independence) as well as for the cultures-colliding quality of Hamptons elites paying their respects to someone so comfortably nouveau riche. Prospective business partners angled to get close to him and praised him as a guru of the lucrative “urban” market, while grateful partygoers hailed him as a modern-day Gatsby.

“Have I read The Great Gatsby?” Combs said to a London newspaper in 2001. “I am the Great Gatsby.”

Yet whereas Gatsby felt pressure to hide his status as an arriviste, Combs celebrated his position as an outsider-insider—someone who appropriates elements of the culture he seeks to join without attempting to assimilate outright. In a sense, Combs was imitating the old WASP establishment; in another sense, he was subtly provoking it, by over-enunciating its formality and never letting his guests forget that there was something slightly off about his presence. There’s a silent power to throwing parties where the best-dressed man in the room is also the one whose public profile once consisted primarily of dancing in the background of Biggie Smalls videos. (“No one would ever expect a young black man to be coming to a party with the Declaration of Independence, but I got it, and it’s coming with me,” Combs joked at his 2004 party, as he made the rounds with the document, promising not to spill champagne on it.)

In this regard, Combs is both a product and a hero of the new cultural mainstream, which prizes diversity above all else, and whose ultimate goal is some vague notion of racial transcendence, rather than subversion or assimilation. Although Combs’s vision is far from representative—not many hip-hop stars vacation in St. Tropez with a parasol-toting manservant shading their every step—his industry lies at the heart of this new mainstream. Over the past 30 years, few changes in American culture have been as significant as the rise of hip-hop. The genre has radically reshaped the way we listen to and consume music, first by opposing the pop mainstream and then by becoming it. From its constant sampling of past styles and eras—old records, fashions, slang, anything—to its mythologization of the self-made black antihero, hip-hop is more than a musical genre: it’s a philosophy, a political statement, a way of approaching and remaking culture. It’s a lingua franca not just among kids in America, but also among young people worldwide. And its economic impact extends beyond the music industry, to fashion, advertising, and film. (Consider the producer Russell Simmons—the ur-Combs and a music, fashion, and television mogul—or the rapper 50 Cent, who has parlayed his rags-to-riches story line into extracurricular successes that include a clothing line; book, video-game, and film deals; and a startlingly lucrative partnership with the makers of Vitamin Water.)

But hip-hop’s deepest impact is symbolic. During popular music’s rise in the 20th century, white artists and producers consistently “mainstreamed” African American innovations. Hip-hop’s ascension has been different. Eminem notwithstanding, hip-hop never suffered through anything like an Elvis Presley moment, in which a white artist made a musical form safe for white America. This is no dig at Elvis—the constrictive racial logic of the 1950s demanded the erasure of rock and roll’s black roots, and if it hadn’t been him, it would have been someone else. But hip-hop—the sound of the post-civil-rights, post-soul generation—found a global audience on its own terms.

Today, hip-hop’s colonization of the global imagination, from fashion runways in Europe to dance competitions in Asia, is Disney-esque. This transformation has bred an unprecedented cultural confidence in its black originators. Whiteness is no longer a threat, or an ideal: it’s kitsch to be appropriated, whether with gestures like Combs’s “white parties” or the trickle-down epidemic of collared shirts and cuff links currently afflicting rappers. And an expansive multiculturalism is replacing the us-against-the-world bunker mentality that lent a thrilling edge to hip-hop’s mid-1990s rise.

Peter Rosenberg, a self-proclaimed “nerdy Jewish kid” and radio personality on New York’s Hot 97 FM—and a living example of how hip-hop has created new identities for its listeners that don’t fall neatly along lines of black and white—shares another example: “I interviewed [the St. Louis rapper] Nelly this morning, and he said it’s now very cool and in to have multicultural friends. Like you’re not really considered hip or ‘you’ve made it’ if you’re rolling with all the same people.”

Just as Tiger Woods forever changed the country-club culture of golf, and Will Smith confounded stereotypes about the ideal Hollywood leading man, hip-hop’s rise is helping redefine the American mainstream, which no longer aspires toward a single iconic image of style or class. Successful network-television shows like Lost, Heroes, and Grey’s Anatomy feature wildly diverse casts, and an entire genre of half-hour comedy, from The Colbert Report to The Office, seems dedicated to having fun with the persona of the clueless white male. The youth market is following the same pattern: consider the Cheetah Girls, a multicultural, multiplatinum, multiplatform trio of teenyboppers who recently starred in their third movie, or Dora the Explorer, the precocious bilingual 7-year-old Latina adventurer who is arguably the most successful animated character on children’s television today. In a recent address to the Association of Hispanic Advertising Agencies, Brown Johnson, the Nickelodeon executive who has overseen Dora’s rise, explained the importance of creating a character who does not conform to “the white, middle-class mold.” When Johnson pointed out that Dora’s wares were outselling Barbie’s in France, the crowd hooted in delight.

Pop culture today rallies around an ethic of multicultural inclusion that seems to value every identity—except whiteness. “It’s become harder for the blond-haired, blue-eyed commercial actor,” remarks Rochelle Newman-Carrasco, of the Hispanic marketing firm Enlace. “You read casting notices, and they like to cast people with brown hair because they could be Hispanic. The language of casting notices is pretty shocking because it’s so specific: ‘Brown hair, brown eyes, could look Hispanic.’ Or, as one notice put it: ‘Ethnically ambiguous.’”

“I think white people feel like they’re under siege right now—like it’s not okay to be white right now, especially if you’re a white male,” laughs Bill Imada, of the IW Group. Imada and Newman-Carrasco are part of a movement within advertising, marketing, and communications firms to reimagine the profile of the typical American consumer. (Tellingly, every person I spoke with from these industries knew the Census Bureau’s projections by heart.)

“There’s a lot of fear and a lot of resentment,” Newman-Carrasco observes, describing the flak she caught after writing an article for a trade publication on the need for more-diverse hiring practices. “I got a response from a friend—he’s, like, a 60-something white male, and he’s been involved with multicultural recruiting,” she recalls. “And he said, ‘I really feel like the hunted. It’s a hard time to be a white man in America right now, because I feel like I’m being lumped in with all white males in America, and I’ve tried to do stuff, but it’s a tough time.’”

“I always tell the white men in the room, ‘We need you,’” Imada says. “We cannot talk about diversity and inclusion and engagement without you at the table. It’s okay to be white!

“But people are stressed out about it. ‘We used to be in control! We’re losing control!’”

If they’re right—if white America is indeed “losing control,” and if the future will belong to people who can successfully navigate a post-racial, multicultural landscape—then it’s no surprise that many white Americans are eager to divest themselves of their whiteness entirely.

For some, this renunciation can take a radical form. In 1994, a young graffiti artist and activist named William “Upski” Wimsatt, the son of a university professor, published Bomb the Suburbs, the spiritual heir to Norman Mailer’s celebratory 1957 essay, “The White Negro.” Wimsatt was deeply committed to hip-hop’s transformative powers, going so far as to embrace the status of the lowly “wigger,” a pejorative term popularized in the early 1990s to describe white kids who steep themselves in black culture. Wimsatt viewed the wigger’s immersion in two cultures as an engine for change. “If channeled in the right way,” he wrote, “the wigger can go a long way toward repairing the sickness of race in America.”

Wimsatt’s painfully earnest attempts to put his own relationship with whiteness under the microscope coincided with the emergence of an academic discipline known as “whiteness studies.” In colleges and universities across the country, scholars began examining the history of “whiteness” and unpacking its contradictions. Why, for example, had the Irish and the Italians fallen beyond the pale at different moments in our history? Were Jewish Americans white? And, as the historian Matthew Frye Jacobson asked, “Why is it that in the United States, a white woman can have black children but a black woman cannot have white children?”

Much like Wimsatt, the whiteness-studies academics—figures such as Jacobson, David Roediger, Eric Lott, and Noel Ignatiev—were attempting to come to terms with their own relationships with whiteness, in its past and present forms. In the early 1990s, Ignatiev, a former labor activist and the author of How the Irish Became White, set out to “abolish” the idea of the white race by starting the New Abolitionist Movement and founding a journal titled Race Traitor. “There is nothing positive about white identity,” he wrote in 1998. “As James Baldwin said, ‘As long as you think you’re white, there’s no hope for you.’”

Although most white Americans haven’t read Bomb the Suburbs or Race Traitor, this view of whiteness as something to be interrogated, if not shrugged off completely, has migrated to less academic spheres. The perspective of the whiteness-studies academics is commonplace now, even if the language used to express it is different.

“I get it: as a straight white male, I’m the worst thing on Earth,” Christian Lander says. Lander is a Canadian-born, Los Angeles–based satirist who in January 2008 started a blog called Stuff White People Like (stuffwhitepeoplelike.com), which pokes fun at the manners and mores of a specific species of young, hip, upwardly mobile whites. (He has written more than 100 entries about whites’ passion for things like bottled water, “the idea of soccer,” and “being the only white person around.”) At its best, Lander’s site—which formed the basis for a recently published book of the same name (reviewed in the October 2008 Atlantic)—is a cunningly precise distillation of the identity crisis plaguing well-meaning, well-off white kids in a post-white world.

“Like, I’m aware of all the horrible crimes that my demographic has done in the world,” Lander says. “And there’s a bunch of white people who are desperate—desperate—to say, ‘You know what? My skin’s white, but I’m not one of the white people who’s destroying the world.’”

For Lander, whiteness has become a vacuum. The “white identity” he limns on his blog is predicated on the quest for authenticity—usually other people’s authenticity. “As a white person, you’re just desperate to find something else to grab onto. You’re jealous! Pretty much every white person I grew up with wished they’d grown up in, you know, an ethnic home that gave them a second language. White culture is Family Ties and Led Zeppelin and Guns N’ Roses—like, this is white culture. This is all we have.”

Lander’s “white people” are products of a very specific historical moment, raised by well-meaning Baby Boomers to reject the old ideal of white American gentility and to embrace diversity and fluidity instead. (“It’s strange that we are the kids of Baby Boomers, right? How the hell do you rebel against that? Like, your parents will march against the World Trade Organization next to you. They’ll have bigger white dreadlocks than you. What do you do?”) But his lighthearted anthropology suggests that the multicultural harmony they were raised to worship has bred a kind of self-denial.

Matt Wray, a sociologist at Temple University who is a fan of Lander’s humor, has observed that many of his white students are plagued by a racial-identity crisis: “They don’t care about socioeconomics; they care about culture. And to be white is to be culturally broke. The classic thing white students say when you ask them to talk about who they are is, ‘I don’t have a culture.’ They might be privileged, they might be loaded socioeconomically, but they feel bankrupt when it comes to culture … They feel disadvantaged, and they feel marginalized. They don’t have a culture that’s cool or oppositional.” Wray says that this feeling of being culturally bereft often prevents students from recognizing what it means to be a child of privilege—a strange irony that the first wave of whiteness-studies scholars, in the 1990s, failed to anticipate.

Of course, the obvious material advantages that come with being born white—lower infant-mortality rates and easier-to-acquire bank loans, for example—tend to undercut any sympathy that this sense of marginalization might generate. And in the right context, cultural-identity crises can turn well-meaning whites into instant punch lines. Consider ego trip’s The (White) Rapper Show, a brilliant and critically acclaimed reality show that VH1 debuted in 2007. It depicted 10 (mostly hapless) white rappers living together in a dilapidated house—dubbed “Tha White House”—in the South Bronx. Despite the contestants’ best intentions, each one seemed like a profoundly confused caricature, whether it was the solemn graduate student committed to fighting racism or the ghetto-obsessed suburbanite who had, seemingly by accident, named himself after the abolitionist John Brown.

Similarly, Smirnoff struck marketing gold in 2006 with a viral music video titled “Tea Partay,” featuring a trio of strikingly bad, V-neck-sweater-clad white rappers called the Prep Unit. “Haters like to clown our Ivy League educations / But they’re just jealous ’cause our families run the nation,” the trio brayed, as a pair of bottle-blond women in spiffy tennis whites shimmied behind them. There was no nonironic way to enjoy the video; its entire appeal was in its self-aware lampooning of WASP culture: verdant country clubs, “old money,” croquet, popped collars, and the like.

“The best defense is to be constantly pulling the rug out from underneath yourself,” Wray remarks, describing the way self-aware whites contend with their complicated identity. “Beat people to the punch. You’re forced as a white person into a sense of ironic detachment. Irony is what fuels a lot of white subcultures. You also see things like Burning Man, when a lot of white people are going into the desert and trying to invent something that is entirely new and not a form of racial mimicry. That’s its own kind of flight from whiteness. We’re going through a period where whites are really trying to figure out: Who are we?”

The “flight from whiteness” of urban, college-educated, liberal whites isn’t the only attempt to answer this question. You can flee into whiteness as well. This can mean pursuing the authenticity of an imagined past: think of the deliberately white-bread world of Mormon America, where the ’50s never ended, or the anachronistic WASP entitlement flaunted in books like last year’s A Privileged Life: Celebrating WASP Style, a handsome coffee-table book compiled by Susanna Salk, depicting a world of seersucker blazers, whale pants, and deck shoes. (What the book celebrates is the “inability to be outdone,” and the “self-confidence and security that comes with it,” Salk tells me. “That’s why I call it ‘privilege.’ It’s this privilege of time, of heritage, of being in a place longer than anybody else.”) But these enclaves of preserved-in-amber whiteness are likely to be less important to the American future than the construction of whiteness as a somewhat pissed-off minority culture.

This notion of a self-consciously white expression of minority empowerment will be familiar to anyone who has come across the comedian Larry the Cable Guy—he of “Farting Jingle Bells”—or witnessed the transformation of Detroit-born-and-bred Kid Rock from teenage rapper into “American Bad Ass” southern-style rocker. The 1990s may have been a decade when multiculturalism advanced dramatically—when American culture became “colorized,” as the critic Jeff Chang put it—but it was also an era when a very different form of identity politics crystallized. Hip-hop may have provided the decade’s soundtrack, but the highest-selling artist of the ’90s was Garth Brooks. Michael Jordan and Tiger Woods may have been the faces of athletic superstardom, but it was NASCAR that emerged as professional sports’ fastest-growing institution, with ratings second only to the NFL’s.

As with the unexpected success of the apocalyptic Left Behind novels, or the Jeff Foxworthy–organized Blue Collar Comedy Tour, the rise of country music and auto racing took place well off the American elite’s radar screen. (None of Christian Lander’s white people would be caught dead at a NASCAR race.) These phenomena reflected a growing sense of cultural solidarity among lower-middle-class whites—a solidarity defined by a yearning for American “authenticity,” a folksy realness that rejects the global, the urban, and the effete in favor of nostalgia for “the way things used to be.”

Like other forms of identity politics, white solidarity comes complete with its own folk heroes, conspiracy theories (Barack Obama is a secret Muslim! The U.S. is going to merge with Canada and Mexico!), and laundry lists of injustices. The targets and scapegoats vary—from multiculturalism and affirmative action to a loss of moral values, from immigration to an economy that no longer guarantees the American worker a fair chance—and so do the political programs they inspire. (Ross Perot and Pat Buchanan both tapped into this white identity politics in the 1990s; today, its tribunes run the ideological gamut, from Jim Webb to Ron Paul to Mike Huckabee to Sarah Palin.) But the core grievance, in each case, has to do with cultural and socioeconomic dislocation—the sense that the system that used to guarantee the white working class some stability has gone off-kilter.

Wray is one of the founders of what has been called “white-trash studies,” a field conceived as a response to the perceived elite-liberal marginalization of the white working class. He argues that the economic downturn of the 1970s was the precondition for the formation of an “oppositional” and “defiant” white-working-class sensibility—think of the rugged, anti-everything individualism of 1977’s Smokey and the Bandit. But those anxieties took their shape from the aftershocks of the identity-based movements of the 1960s. “I think that the political space that the civil-rights movement opens up in the mid-1950s and ’60s is the transformative thing,” Wray observes. “Following the black-power movement, all of the other minority groups that followed took up various forms of activism, including brown power and yellow power and red power. Of course the problem is, if you try and have a ‘white power’ movement, it doesn’t sound good.”

The result is a racial pride that dares not speak its name, and that defines itself through cultural cues instead—a suspicion of intellectual elites and city dwellers, a preference for folksiness and plainness of speech (whether real or feigned), and the association of a working-class white minority with “the real America.” (In the Scots-Irish belt that runs from Arkansas up through West Virginia, the most common ethnic label offered to census takers is “American.”) Arguably, this white identity politics helped swing the 2000 and 2004 elections, serving as the powerful counterpunch to urban white liberals, and the McCain-Palin campaign relied on it almost to the point of absurdity (as when a McCain surrogate dismissed Northern Virginia as somehow not part of “the real Virginia”) as a bulwark against the threatening multiculturalism of Barack Obama. Their strategy failed, of course, but it’s possible to imagine white identity politics growing more potent and more forthright in its racial identifications in the future, as “the real America” becomes an ever-smaller portion of, well, the real America, and as the soon-to-be white minority’s sense of being besieged and disdained by a multicultural majority grows apace.

This vision of the aggrieved white man lost in a world that no longer values him was given its most vivid expression in the 1993 film Falling Down. Michael Douglas plays Bill Foster, a downsized defense worker with a buzz cut and a pocket protector who rampages through a Los Angeles overrun by greedy Korean shop-owners and Hispanic gangsters, railing against the eclipse of the America he used to know. (The film came out just eight years before California became the nation’s first majority-minority state.) Falling Down ends with a soulful police officer apprehending Foster on the Santa Monica Pier, at which point the middle-class vigilante asks, almost innocently: “I’m the bad guy?”

But this is a nightmare vision. Of course most of America’s Bill Fosters aren’t the bad guys—just as civilization is not, in the words of Tom Buchanan, “going to pieces” and America is not, in the phrasing of Pat Buchanan, going “Third World.” The coming white minority does not mean that the racial hierarchy of American culture will suddenly become inverted, as in 1995’s White Man’s Burden, an awful thought experiment of a film, starring John Travolta, that envisions an upside-down world in which whites are subjugated to their high-class black oppressors. There will be dislocations and resentments along the way, but the demographic shifts of the next 40 years are likely to reduce the power of racial hierarchies over everyone’s lives, producing a culture that’s more likely than any before to treat its inhabitants as individuals, rather than members of a caste or identity group.

Consider the world of advertising and marketing, industries that set out to mold our desires at a subconscious level. Advertising strategy once assumed a “general market”—“a code word for ‘white people,’” jokes one ad executive—and smaller, mutually exclusive, satellite “ethnic markets.” In recent years, though, advertisers have begun revising their assumptions and strategies in anticipation of profound demographic shifts. Instead of herding consumers toward a discrete center, the goal today is to create versatile images and campaigns that can be adapted to highly individualized tastes. (Think of the dancing silhouettes in Apple’s iPod campaign, which emphasizes individuality and diversity without privileging—or even representing—any specific group.)

At the moment, we can call this the triumph of multiculturalism, or post-racialism. But just as whiteness has no inherent meaning—it is a vessel we fill with our hopes and anxieties—these terms may prove equally empty in the long run. Does being post-racial mean that we are past race completely, or merely that race is no longer essential to how we identify ourselves? Karl Carter, of Atlanta’s youth-oriented GTM Inc. (Guerrilla Tactics Media), suggests that marketers and advertisers would be better off focusing on matrices like “lifestyle” or “culture” rather than race or ethnicity. “You’ll have crazy in-depth studies of the white consumer or the Latino consumer,” he complains. “But how do skaters feel? How do hip-hoppers feel?”

The logic of online social networking points in a similar direction. The New York University sociologist Dalton Conley has written of a “network nation,” in which applications like Facebook and MySpace create “crosscutting social groups” and new, flexible identities that only vaguely overlap with racial identities. Perhaps this is where the future of identity after whiteness lies—in a dramatic departure from the racial logic that has defined American culture from the very beginning. What Conley, Carter, and others are describing isn’t merely the displacement of whiteness from our cultural center; they’re describing a social structure that treats race as just one of a seemingly infinite number of possible self-identifications.

