Thursday, July 30, 2009

Genetic pacification? Part II

Natural selection has altered at least 7% of our genome over the last 40,000 years. And it has been doing so at an accelerating rate, particularly after agriculture replaced hunting and gathering less than 10,000 years ago. At that time, the rate of genetic change may have risen over a hundred-fold (Hawks et al., 2007).

By then, our species had colonized almost every ecological niche on the planet—savanna, tropical rain forest, temperate woodland, boreal forest, and arctic tundra. It wasn’t because we were entering new ecological environments that genetic change speeded up. It was because we were entering new cultural environments.

One of them arose with the emergence of the State and its monopoly on the use of violence. This marked a sea change in human relations. Previously, men often used violence for their own advancement, and not simply in self-defense. The goal was to become a ‘big man’—someone who could dominate the local community through bluster, bullying, and charisma. Such men were more successful not only socially but also reproductively. They tended to attract more mates and sire more children.

The tables were turned with the rise of State societies. Over large territories, power increasingly fell into the hands of a few big men, often only one, and violence became a privileged instrument of their power. This left all other men with three options:

1. Forsake violence, or at least keep it under the radar screen of State detection.
2. Subordinate it to State goals, i.e., join the army.
3. Embrace it and become outlaws.

Thus, within the borders of State societies, survival and reproduction came to depend on one’s willingness to comply with the State, including its monopoly on the use of violence. Successful individuals were now those with a higher threshold for the expression of violent behavior, especially when acting on their own initiative. They also tended to be individuals whose relative inhibition of violence could be released only by the voice of authority. (In modern times, the American psychologist Stanley Milgram found that people are far more willing to inflict violence when told to do so by an authority figure. Whether this finding holds true in all human societies remains an open question; see Milgram, 1974.)

This change in cultural environment is described by Liebeschuetz (2006) when he discusses Roman and barbarian societies:

In Roman law violence against individuals was treated as an offense that concerned the community. It was open to every citizen to launch a prosecution. The state provided the courts that established whether an injury had been inflicted, decided the punishment, and inflicted it. Moreover the imperial mandate to provincial governors stated, “The man in charge of a province must see to it that he clears the province of criminals.” Ulpian explains this as meaning:

“It is the duty of the … governor to see that the province he rules is peaceful. … This he will achieve if he takes careful measures [to ensure] that the province is free from criminals and searches them out. He should search out persons guilty of sacrilege, brigands, kidnappers, and thieves and punish them according to their offenses, and he should also repress them that harbor them.” (Liebeschuetz, 2006, p. 40)


In contrast, barbarians took the law into their own hands. Although law courts existed in Germanic society, their rulings had to be enforced by the aggrieved party. There was no State enforcement:

The injury was treated as an offense against the injured and his kin and it was left to the injured and/or his kin, not to the community, to compel the person who had caused the injury to give compensation for the damage he had inflicted. Unless the perpetrator or his kin paid compensation, it was the duty of the victim or his kin to take vengeance on the perpetrator or his kin. But the use of force was likely to start a chain of retaliation, in fact a feud. (Liebeschuetz, 2006, p. 39)

The threat of the feud might well have secured a reasonably stable society. But it certainly depended on a widespread readiness to answer violence with violence. It was, I would suggest, a society in which the private individual would have been more likely to have to act violently or to have experienced violence inflicted by others than he would have under Roman administration. (Liebeschuetz, 2006, p. 46)

Historiography often assumes that the Romans saw the divide between themselves and barbarians as a matter of learning and education. To a large degree, this assumption reflects a 20th-century view that such a divide had to be culturally programmed. After all, many of the barbarians went on to become civilized Europeans. The picture is less clear, however, if we go back and read what Roman writers had to say. Many considered the divide to be rooted in nature:

Both explicitly and implicitly late antique writers created a generic barbarian identity that was intimately associated with violent behavior. This was only consistent with a classical literary tradition in which barbarians were associated with several violence-related traits, including crudelitas (cruelty), feritas (wildness), immanitas (savagery), inhumanitas (inhumanity), impietas (impiety), ferocitas (ferocity), furor (fury), and discordia (discord). (Mathisen, 2006, p. 28)

Their violent nature also meant that barbarians were thought to be governed by their emotions rather than by their intellect. Seneca could claim that grief particularly affected “barbarians more than persons of a peaceful and learned people” and that barbarians were more likely to become angry. He also commented on barbarian lack of self-control: “Whom does one admire more than one who controls himself, who has himself under control. It is easier to rule barbarian nations and those impatient of alien rule than to contain and control one’s own mind.” Finally, Libanius suggested, “In this regard in particular I find the Greeks also to be superior to barbarians. The latter are akin to beasts in despising pity, while the Greeks are quick to pity and get over their wrath.” (Mathisen, 2006, p. 30)

To be sure, some believed that the barbarians could assimilate to Roman norms of behavior. In the 4th and 5th centuries, large numbers of them were allowed into the Empire and confident predictions were made that they would turn their swords into ploughs and scythes (Mathisen, 2006, p. 33). Events proved otherwise.

Like many of my readers, I am largely descended from barbarians who destroyed Roman Britain in the 5th century (Shaw, 2009). Maybe “destroyed” is the wrong word. They didn’t intend to destroy anything; they just wanted the same sort of things that the Romans had. Unfortunately, by their very presence, they made the continuation of those things impossible. Civilization was eventually rebuilt, but on a new foundation.

How did they become me? It was a long process stretching over some sixty generations. Like others before them, they too went on to create their own States. These States then strove to monopolize the use of violence, thus setting in motion the same behavioral evolution that has happened elsewhere.
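To get a feel for the timescale, here is a minimal sketch of how selection against a violence-predisposing allele might play out over those sixty generations. The allele, its starting frequency, and its 2% fitness disadvantage are illustrative assumptions, not estimates drawn from the sources cited above.

```python
# Minimal sketch: decline of a hypothetical allele under steady selection.
# All parameters (starting frequency, selection coefficient) are illustrative.

def next_generation(p, s):
    """One round of genic selection: carriers of the allele (frequency p)
    have relative fitness 1 - s compared with non-carriers."""
    mean_fitness = p * (1.0 - s) + (1.0 - p)
    return p * (1.0 - s) / mean_fitness

p = 0.50   # hypothetical starting frequency of the allele
s = 0.02   # hypothetical 2% reproductive disadvantage per generation

for generation in range(60):
    p = next_generation(p, s)

print(f"Allele frequency after 60 generations: {p:.2f}")
# With these numbers, the frequency falls from 0.50 to about 0.23, a large
# shift produced by a per-generation disadvantage too small for any
# contemporary observer to notice.
```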

References

Hawks, J., Wang, E.T., Cochran, G.M., Harpending, H.C., & Moyzis, R.K. (2007). Recent acceleration of human adaptive evolution, Proceedings of the National Academy of Sciences (USA), 104, 20753-20758.

Liebeschuetz, W. (2006). Violence in the barbarian successor kingdoms, in: Drake, H.A. (ed.) Violence in Late Antiquity. Perceptions and Practices, pp. 37-46, Burlington (Vermont) and Aldershot: Ashgate.

Mathisen, R.W. (2006). Violent behavior and the construction of barbarian identity in Late Antiquity, in: Drake, H.A. (ed.) Violence in Late Antiquity. Perceptions and Practices, pp. 27-35, Burlington (Vermont) and Aldershot: Ashgate.

Milgram, S. (1974). Obedience to Authority. New York: Harper & Row.

Shaw, J. (2009). Who killed the men of England? The written record of history meets genomics, evolution, demography, and molecular archaeology, Harvard Magazine, July-August. http://harvardmagazine.com/2009/07/who-killed-the-men-england?page=0,1

Thursday, July 23, 2009

Genetic pacification?

Steven Pinker has an article up on the secular decline in violence (hat tip to Mangan’s):

But from the Middle Ages to modern times, we can see a steady reduction in socially sanctioned forms of violence. Many conventional histories reveal that mutilation and torture were routine forms of punishment for infractions that today would result in a fine. In Europe before the Enlightenment, crimes like shoplifting or blocking the king's driveway with your oxcart might have resulted in your tongue being cut out, your hands being chopped off, and so on. Many of these punishments were administered publicly, and cruelty was a popular form of entertainment.

We also have very good statistics for the history of one-on-one murder, because for centuries many European municipalities have recorded causes of death. When the criminologist Manuel Eisner scoured the records of every village, city, county, and nation he could find, he discovered that homicide rates in Europe had declined from 100 killings per 100,000 people per year in the Middle Ages to less than one killing per 100,000 people in modern Europe.

Pinker concludes: “our ancestors were far more violent than we are today. Indeed, violence has been in decline over long stretches of history, and today we are probably living in the most peaceful moment of our species' time on earth.”

For starters, I dislike hearing the first person plural and the present tense when neither is intended. By ‘we’, Steve Pinker seems to mean the European world. And by ‘modern times’ and ‘modern Europe’ he seems to mean the postwar era—not London, Paris, and Amsterdam as they exist today. Beyond this singularity in space and time, ‘we’ enter another world where people—usually young males—still turn violent for reasons ‘we’ find strange, even pathological.

This point is, in fact, raised by Pinker:

… Manuel Eisner attributes the decline in European homicide to the transition from knightly warrior societies to the centralized governments of early modernity. And today, violence continues to fester in zones of anarchy, such as frontier regions, failed states, collapsed empires, and territories contested by mafias, gangs, and other dealers of contraband.

In addition to the emergence of central authority, Pinker considers other explanations: the increasing value placed on human life; the rise of the market economy and the interdependency it creates; and the ‘expanding moral circle’—“The more one knows and thinks about other living things, the harder it is to privilege one's own interests over theirs.”

These other explanations are not so much independent causes as related effects. The market economy has expanded because we’ve behaved in ways that make expansion possible. For instance, we no longer see violence as a legitimate way to settle disputes. We no longer use theft and intimidation as means of self-aggrandizement. And we no longer look up to violent charismatic ‘big men’ as role models.

And yes, we value human life more for the same sort of reason that we’ve become less violent. Ditto for our expanding moral circle.

Oops, that ‘we’ again. For most humans, little has changed since time immemorial. ‘They’ trust only close kin and long-time friends. ‘They’ kill over questions of honor and loss of face. And 'they' admire men whom we consider to be thugs.

But there has been change in some regions, like the European world, East Asia, and parts of South Asia. For the economic historian Gregory Clark, the ultimate reason is the rise of the State and its monopoly on the use of violence. This monopoly created a new set of selection pressures. What had once been rewarded in the struggle for existence was now penalized. And vice versa.

Clark points out that aggressive males are rewarded with reproductive success in simple clan-based societies. Among the Yanomamö, a horticulturalist people of Amazonia, significantly more children are fathered by men who have committed homicide than by those who have not. Among the Ache, a hunter-gatherer people of Paraguay, ‘homicidal’ men do not have more offspring but more of their offspring survive.

In contrast, aggressive males are penalized in settled societies with central authority, either through lower reproductive success or through removal from the population, e.g., through imprisonment, execution, or banishment. Such societies have much lower rates of violent death from all causes, including war.

Clark documents this secular decline in violence with respect to England. In the centuries after the imposition of central authority, the male homicide rate fell steadily from 1150 to 1800, with a parallel decline in blood sports and other violent practices (cock fighting, bear and bull baiting, public executions) that nonetheless remained legal throughout almost the whole period. Clark ascribes this behavioral change to the reproductive success of upper- and middle-class individuals whose heritable characteristics differed statistically from those of the general population, particularly with respect to male violence. Although initially a small minority in medieval England, these individuals grew in number, and their descendants gradually replaced the lower classes through downward mobility. By 1800, such lineages accounted for most of the English population (Clark, 2007, pp. 124-129, 182-183; Clark, 2009).
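The arithmetic behind this kind of demographic replacement is easy to sketch. In the toy model below, the 10% starting share and the 20% advantage in surviving offspring per generation are hypothetical round numbers chosen for illustration; they are not Clark's estimates.

```python
# Toy model of lineage replacement through differential reproduction.
# The starting share and offspring ratio are hypothetical, for illustration only.

def descendant_fraction(generations, start_fraction, offspring_ratio):
    """Share of the population descended from the favored lineages after a
    given number of generations. offspring_ratio is their surviving offspring
    per family relative to everyone else; intermarriage is ignored."""
    favored = start_fraction * offspring_ratio ** generations
    rest = 1.0 - start_fraction
    return favored / (favored + rest)

# Roughly 26 generations of about 25 years each separate 1150 from 1800.
share = descendant_fraction(generations=26, start_fraction=0.10, offspring_ratio=1.2)
print(f"Share descended from the favored lineages by 1800: {share:.0%}")
# With these illustrative numbers, about 93% of the population.
```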

This pacification of society did not occur uniformly throughout England. Endemic violence persisted until the 18th century in the northern border regions, where any encounter with non-kin, however innocent, could lead to violence. “In a world of treachery and danger, blood relationships became highly important. Families grew into clans, and kinsmen placed fidelity to family above loyalty to the crown itself.” Disputes were settled through payment of blood money or turned into long-running feuds (Fischer, 1989, p. 628).

Clark has been criticized for failing to explain why the market economy spread so easily from England to other parts of Europe and then to the whole world. The answer is that many of these other regions had undergone the same behavioral evolution for the same reason: the emergence of strong states that monopolize the use of violence. Elsewhere, where this evolution has begun more recently, or not at all, the market economy has been less successful. It works only when strong-armed regimes ensure respect for life and property.

This is something that economic libertarians fail to grasp. Yes, the market economy is generally associated with peaceful and respectful human relations. But the line of causality doesn’t run in the direction they think it does.

References

Clark, G. (2007). A Farewell to Alms. A Brief Economic History of the World, Princeton University Press, Princeton and Oxford.

Clark, G. (2009). The indicted and the wealthy: surnames, reproductive success, genetic selection and social class in pre-industrial England,
http://www.econ.ucdavis.edu/faculty/gclark/Farewell%20to%20Alms/Clark%20-Surnames.pdf

Fischer, D.H. (1989). Albion’s Seed. Four British Folkways in America, Oxford University Press, New York and Oxford, pp. 621-632.

Pinker, S. (2009). Why is there peace? Greater Good Magazine, April.
http://greatergood.berkeley.edu/greatergood/2009april/Pinker054.php

Thursday, July 16, 2009

Who gets to write history?

History is written by the survivors – Max Lerner

I prefer Lerner’s version to Churchill’s “History is written by the victors.” Opinions will survive as long as the group that holds them, and such a group may disappear for reasons besides defeat through conflict. Often, the group is temporary by nature.

Remember this when reading about how the sunshine movement eradicated rickets and tuberculosis—the two long-running epidemics of the Western world. This is a history written by those who kept making their case long after others had lost interest. To quote a Sailerism: History is written by those who write … and write … and write.

Here is how the sunshine movement is usually presented:

In the nineteenth century, various physicians advocated ‘heliotherapy’ as a means of treating rickets and killing the microbe that causes tuberculosis. They were ignored, except by a few nudists and health faddists. Then, two events broke the barrier to public acceptance. One was the Spanish flu of 1918-1919, which showed the need for strong measures to keep microbes from spreading. The other, in 1919, was the discovery that ultraviolet light can cure rickets by releasing a chemical into the bloodstream, later identified as vitamin D. Heliotherapy thus gained in popularity, and by the late 1920s sunbathing had become widespread. Meanwhile, tuberculosis and rickets steadily declined until both virtually disappeared.

This narrative leaves out a key detail. The sunshine movement had little effect on the prevalence of either tuberculosis or rickets. Through the 1920s and 1930s, there was only a gradual decline in the incidence of tuberculosis, part of a downward trend that began in the nineteenth century and was probably due to stricter segregation of tubercular patients in hospitals and sanatoria. The steep decline began later, after the Second World War (Wilson, 1990). Much of the credit goes to the discovery of antimicrobial drugs like isoniazid, rifampicin, and streptomycin. Just as important was the overall rise in the standard of living, particularly among urban workers. As the population became better nourished and as overcrowded tenements gave way to suburban homes, microbes could not spread so easily to weakened hosts.

Nor was rickets much affected. In Dundee, Scotland, its incidence among children held steady from 1925 to 1935 and then dropped abruptly, apparently after the introduction of a ‘milk in schools’ scheme in 1934 and the greater provision of free meals to needy children. This drop preceded vitamin D fortification of milk and infant cereals (Stewart et al., 1964). In the United States, rickets was deemed as late as 1940 to be “still probably the most common disease of early childhood” (Harrison, 1966). In Sweden, it still afflicted most fetuses and newborns during the early 1950s (Sydow et al., 1956).

By this time, many medical researchers had concluded that the cause could not be lack of sun. A Swedish autopsy study found that rickets in fetuses and newborns showed no significant correlation with the mother’s vitamin D status during pregnancy, as indicated by her consumption of vitamin D supplements or by her degree of sun exposure, i.e., daily walks in the sun, outdoor work, summer holiday in the country, and latitude within Sweden (Sydow et al., 1956). For several research teams, the cause seemed to be substances in commercial bread, like phytic acid, that immobilize calcium and phosphorus within the body (Bruce & Callow, 1934; Harrison & Mellanby, 1939; McCance et al., 1942; McCance & Widdowson, 1942). But fewer and fewer researchers were now interested. Like tuberculosis, rickets too went into a steep decline after the war, eventually becoming a medical oddity by the 1960s (Harrison, 1966).

Meanwhile, interest had been growing in the sunshine movement. It now extended far beyond the medical community, being influential in the fashion industry, the arts and literature, architecture, and the movie industry. It had become a cultural phenomenon, and one that would create much of the look and feel of modern life.

Fashion industry

Gabrielle (Coco) Chanel is usually credited with making suntans popular among women. Certainly, in the late 1920s she was the one who added tanned skin to a new androgynous look, la garçonne, that featured long legs, a flat chest, narrow hips, and large shoulders, like a young boy on the brink of puberty (Andrieu, 2008, p. 73; Bard, 1998; Galante, 1972; Wilson, 1985).

But Coco Chanel simply boarded a bus that was already on the road. As early as 1926, a Connecticut radio station announced: “a coat of tan seems to be the latest style in natural coloring at this season of the year. [It has] been increasing in favor during the last few years” (Nickerson, 1926). Female tanning became fashionable through women acting on their own initiative. Others then saw the market opportunities:

Cosmetics manufacturers took notice of the new acceptability of nonwhite skin and began to produce darker powders, as well as artificial bronzing lotions. By 1929, Jean Patou and Coco Chanel had introduced suntan products, and Helena Rubenstein was selling “Valaze Gypsy Tan Foundation.” Other cosmetics manufacturers were blending powders to be “creamy,” rather than white, and producing “ochre,” “dark rachel,” and “suntan” shades. (Berry, 2000, p. 188)

During this time, skin whiteners became less popular.

Golden Peacock Bleach Cream and other facial bleaches, which were advertised regularly in women’s magazines until the late 1920s, appeared only rarely after the early 1930s, although skin lighteners were still marketed to the African American community. (Berry, 2000, p. 188)

The same trend swept through women’s magazines. By the end of the 1920s, Vogue was telling its readers that “The 1929 girl must be tanned” and “A golden tan is the index of chic” (Vogue, 1929). In the early 1930s, however, these magazines were periodically predicting the end of the suntan fad (Berry, 2000, p. 188). Just as fashion leaders had failed to anticipate this fad, they also misjudged its staying power.

Arts and literature

In the mid-1920s, the sunshine movement spread to artists and literati through multiple points of entry. A key one was the French Riviera (Ash, 1974; Weightman, 1970).


Among the fashionable, “heliophobia” soon gave way to “heliophilia.” The scene of this minor revolution was the French Riviera—specifically the beach of La Garoupe at Cap d’Antibes—and its chief ideologist was Gerald Murphy [an ex-pat American artist].

… It was the Americans—Cole Porter and the Murphys—who first “discovered” the Riviera as a summer resort. The Murphys began to clear La Garoupe of its layer of seaweed and persuaded the proprietor of the Hotel du Cap to remain open during the summer months … The cultivators of the simple, the connoisseurs of the primitive, formed a new elite comprised of Americans, artists and the more unconventional members of the aristocracy. Among others, the Hemingways, the Fitzgeralds, the Picassos, the Legers and the Count and Countess Etienne de Beaumont, all joined the Murphys at Cap d’Antibes in the summer. It was this elite, with some assistance from Coco Chanel, that raised the suntan to the level of higher fashion. (Ash, 1974)

Andrieu (2008, p. 73) and Weightman (1970) describe how French writers of the 1920s placed favorable references to tanned skin in their works, associating it with lead characters and positive qualities. Similar placement appears in F. Scott Fitzgerald’s The Great Gatsby (1925). Miss Jordan Baker has "sun-strained eyes," a "slender golden arm," a "brown hand," a "golden shoulder," and a "face the same brown tint as the fingerless glove on her knee" (Fitzgerald, 1992, pp. 15, 47, 57, 84, 185).

Architecture

The sunshine movement brought a new urban landscape by moving buildings further back from the street, limiting their height, and spacing them further apart. Windows also became bigger and more numerous. Meanwhile, modernist architects looked to tuberculosis sanatoria and ocean liners to make their creations more open to the sun and air. They introduced such features as the flat roof, the balcony, and the roof or garden terrace “on which elegant Jazz age young women, dressed in patio-pyjamas, could sunbathe on chaises longues” (Campbell, 2005).

These signatures of modernism marked the design of public housing and, especially, schools, whose population was thought to be most at risk for tuberculosis and rickets:

By the late 1920s, the therapeutic qualities of sunlight were widely recognised, and its use was extended in sunshine schools and open-air clinics to the more general treatment of sickly, TB-prone and crippled children, many of them drawn from the slums. Progressive schools like Bedales early encouraged sunbathing; St Christopher School, Letchworth, installed vita glass; and pictures of Pinehurst School show the children running about naked. (Twigg, 1981)

In general, traditional architecture was seen not simply as old-fashioned but as unhealthy. This gave a revolutionary urgency to the thinking of modernists, like the Swiss architect Le Corbusier:

But when it comes to a question of demolishing rotten old houses full of tuberculosis and demoralizing, you hear them cry, “What about the iron-work, what about the beautiful old wrought-iron work.” (Campbell, 2005)

Such demolition was considered necessary to build a healthier society. “Sunlight stood for the new society of light”:

Houses in the garden cities were oriented towards the sun. Architecture in the interwar years pursued light to an almost obsessive degree. It came to be the emblem of a cluster of reforms in the 1920s aimed at making Britain a better, healthier, cleaner place to live.

… Just as the antiseptic qualities of sunlight had been observed through its action on mouldy, damp objects, so the sunlight for this post-Victorian generation could be made to shine on the dank, rotten and hidden aspects of the Victorian world. (The thirties saw the full flood of anti-Victorianism) This could mean the slum houses and sick children, but it also, very frequently, meant sexuality.
(Twigg, 1981)

Movie industry

The tanned look entered the movie industry via individual actors and actresses, notably Joan Crawford:


Joan Crawford was credited for spreading the trend among Hollywood flappers—in addition to tanning her face, Crawford browned her body and went stockingless …
(Berry, 2000, p. 188)


She was reportedly told by MGM to stop tanning because it made her look “like a lineal descendent of Sheba.” The movie industry, however, soon realized there was a market for dark skin as an item of sexual interest. The 1930s thus saw a spate of Hollywood films featuring Latin lovers, Arab sheikhs, and South Seas beauties (Berry, 2000, pp. 110-111).

No one fully understood this phenomenon. It was described as a quest for the exotic, yet one might doubt the exoticism of stars and starlets who were simply a darker version of the European phenotype. Nonetheless, through productions like South Pacific (1949), this faux interracialism would help pave the way for the real thing after the war.

Perhaps white Americans were motivated to sexualize dark skin by a ‘rare color effect’ like the one that exists with differing shades of hair color: the rarer brunettes are, the more they excite sexual interest in men (Anon, 2008; Thelen, 1983). Or perhaps the motivation lay at another level of male sexual response. If women evolved a lighter complexion and other paedomorphic features as a way to inhibit male aggression and stimulate feelings of care, a darker skin tone could exert sex appeal on a more aggressive and less empathetic level (Frost, 2007; Guthrie, 1970).

Social conformity and status competition

Cultural change involves not only leaders but also followers. As more and more people sported tans, the new look tended to spread simply through social conformity and status competition:

The appearance of medical articles that begin to deal with tanning as not directly related to a “cure” shows that the acquisition of the suntan, at least among certain sections of society, was already desirable. In addition the change toward a more positive view of the suntan was to articulate well with other social changes taking place, particularly as related to travel, in the early twentieth century. The idea of health travel was established, with many invalids seeking the sun cure in sanatoria. However, the wealthy invalid was able to travel further — to locations where they could be assured of receiving sunlight, such as the Swiss Alps or the Mediterranean. (Carter & Michael, 2003, p. 269)


The cosmetic aspects of a suntan were not originally much to the fore; during the 1920s tanning was regarded as only a side effect and not spoken of with special favour. By the 1930s, however, the naturist magazines were praising the look of bronzed skin. The fashion spread beyond these circles, to the cosmopolitan and wealthy. By the 1930s the Riviera season had reversed from being winter to summer. The seaside, from being a place for bathing and for sea air became somewhere for taking off your clothes and lying in the sun; the resorts began to publish their sunshine figures; and by the mid 1930s the major cosmetic houses were producing suntan creams. A suntan became associated with youth, health and vigour, qualities that the thirties found particularly attractive sexually. (Twigg, 1981)

During the 1930s, the popularity of suntanning and nudity reached a peak, intellectual and social benefits were said to accrue from sun-exposure, and it was felt to be “imperative” for the successful executive to be tanned, as this indicated “superior physique, intelligence, and moral character.” (Koblenzer, 1998)


Conclusion

The sunshine movement was able to impose its world view—and its version of history—by winning over the creators of modern culture: artists, writers, actors, architects, and fashion designers. This victory, if we can use the term, was achieved not in the narrow realm of medical debate but in the larger one of cultural production. In truth, there was no victory because there was no battle. The ‘other side’ lost interest in explaining the tuberculosis and rickets epidemics once these had subsided. They moved on to other things.

The ‘victors’ had an unforeseen ally: a sensual, if not sexual, fascination with dark skin. By arming young men and women with a medical alibi, the sunshine movement unwittingly opened up a dimension of sexual attraction that had lain unexploited. There had, in fact, been a taboo against sexualizing dark skin, partly because of the racial connotations and partly because dark complexions among white Americans had traditionally been viewed as unfeminine for women and as hypermasculine for men.

Another ally, widespread in the twentieth century, was a belief in change and in the urgency of change. It is probably no coincidence that many sunshine advocates, like the architect Le Corbusier, saw themselves as radicals. The movement likewise had more success in pushing its agenda in the Eastern bloc, as seen in the mass relocation of working families to modernist housing projects, in State-sponsored vacations at Black Sea resorts, and in the DDR’s mass administration of vitamin D megadoses to children.

References

Andrieu, B. (2008). Bronzage. Une petite histoire du Soleil et de la peau, CNRS Éditions.

Anon. (2008). Maxim's audience prefers brunettes; distribution is bimodal, Gene Expression, July 6, 2008.

Ash, J. (1974). The meaning of suntan, New Society, August, pp. 278-280.

Bard, C. (1998). Les garçonnes. Modes et fantasmes des Années folles. Paris: Flammarion.

Berry, S. (2000). Screen Style. Fashion and Femininity in 1930s Hollywood, Minneapolis: University of Minnesota Press.

Bruce, H. & Callow, R. (1934). Cereals and rickets. The role of inositolhexaphosphoric acid, Biochemical Journal, 28, 517-528.

Campbell, M. (2005). What tuberculosis did for modernism: The influence of a curative environment on modernist design and architecture, Medical History, 49, 463-488.

Carter, S. & Michael, M. (2003). “Here comes the sun: Shedding light on the cultural body”, in: H. Thomas & J. Ahmed (ed.) Cultural Bodies: Ethnography and Theory, pp. 260-282, Wiley-Blackwell.

Fitzgerald, F.S. (1992). The Great Gatsby. New York: Collier Books.

Frost, P. (2007). Comment on Human skin-color sexual dimorphism: A test of the sexual selection hypothesis, American Journal of Physical Anthropology, 133, 779-781.

Galante, P. (1972). Les années Chanel. Paris: Mercure de France.

Guthrie, R.D. (1970). Evolution of human threat display organs, Evolutionary Biology, 4, 257-302.

Harrison, D.C., & Mellanby, E. (1939). Phytic acid and the rickets-producing action of cereals, Biochemical Journal, 33, 1660-1680.

Harrison, H.E. (1966). The disappearance of rickets, American Journal of Public Health, 56, 734-737.

Koblenzer, C.C. (1998). The psychology of sun-exposure and tanning, Clinics in Dermatology, 16, 421-428.

McCance, R., & Widdowson, E. (1942). Mineral metabolism of dephytinized bread, Journal of Physiology, 101, 304-313.

McCance, R., & Widdowson, E. (1942). Mineral metabolism of healthy adults on white and brown bread dietaries, Journal of Physiology, 101, 44-85.

Nickerson, E.C. (1926). Nature's Cosmetics, Bulletin sanitaire 26(5), 134-140.

Stewart, W.K., Mitchell, R.G., Morgan, H.G., Lowe, K.G., & Thomson, J. (1964). The changing incidence of rickets and infantile hypercalcaemia as seen in Dundee, The Lancet, 283(7335), 679-730.

Sydow, G.V., Ranström, S., Berezin, D., & Axen, O. (1956). Histological findings characteristic of rickets in foetuses and young infants, Acta Paediatrica, 45, 114-138.

Thelen, T.H. (1983). Minority type human mate preference. Social Biology, 30, 162-180.

Twigg, J. (1981). “Sunlight & nature”, in: The Vegetarian Movement in England, 1847-1981: A study in the structure of its ideology, Doctoral thesis presented to the London School of Economics, University of London. http://www.ivu.org/history/thesis/sunlight.html

Vogue (1929). June 22, pp. 99, 100.

Weightman, J. (1970). The solar revolution. Reflections on a theme in French literature, Encounter, December, pp. 9-18.

Wilson, E. (1985). Adorned in Dreams: Fashion and Modernity, London: Virago.

Wilson, L.G. (1990). The historical decline of tuberculosis in Europe and America: Its causes and significance, Journal of the History of Medicine and Allied Sciences, 45, 366-396.

Thursday, July 9, 2009

African Americans and vitamin D

Vitamin D insufficiency is more prevalent among African Americans (blacks) than other Americans and, in North America, most young, healthy blacks do not achieve optimal 25-hydroxyvitamin D [25(OH)D] concentrations at any time of year. This is primarily due to the fact that pigmentation reduces vitamin D production in the skin. Also, from about puberty and onward, median vitamin D intakes of American blacks are below recommended intakes in every age group, with or without the inclusion of vitamin D from supplements. (Harris, 2006)

It’s well known that African Americans have low levels of vitamin D in their blood. In fact, this seems to be generally true for humans of tropical origin. In a study from Hawaii, vitamin D status was assessed in healthy, visibly tanned young adults who averaged 22.4 hours per week of unprotected sun exposure. Yet 51% had levels below the current recommended minimum of 75 nmol/L (Binkley et al., 2007). In a study from south India, levels below 50 nmol/L were found in 44% of the men and 70% of the women. The subjects are described as “agricultural workers starting their day at 0800 and working outdoors until 1700 with their face, chest, back, legs, arms, and forearms exposed to sunlight” (Harinarayan et al., 2007). In a study from Saudi Arabia, levels below 25 nmol/L were found in 35%, 45%, 53%, and 50%, respectively, of normal male university students of Saudi, Jordanian, Egyptian, and other origins (Sedrani, 1984).

These low levels are usually blamed on the darker skin of tropical humans, i.e., melanin blocks the UV-B component of sunlight, which the skin needs to make vitamin D. Actually, dark skin is not a serious constraint on vitamin D production. While it is true that a single UV-B exposure of moderate intensity will produce less vitamin D in black skin than in white skin, this difference narrows with longer exposure times, since white skin cuts back vitamin D production after only 20 minutes in the sun (Holick, 1995). Even in England, where sunlight is relatively weak, Asian, West Indian, and European adolescents show similar increases in vitamin D levels during the spring and summer (Ellis et al., 1977).

Another possible reason why tropical humans make less vitamin D is that there is no need to build up a reserve for the winter, when this vitamin cannot be produced. In contrast, such a reserve is necessary in the temperate zone. This seasonal variation is shown by a study of Nebraskan men after a summer of landscaping, construction, farming, and recreation. Their mean vitamin D level was initially 122 nmol/L. By late winter, it had fallen to 74 nmol/L (Barger-Lux & Heaney, 2002). Tropical humans may thus produce less of this vitamin because their skin doesn’t have to ‘make hay while the sun shines.’ This adaptation would then persist in those groups, like African Americans, that now inhabit the temperate zone.

Whatever the reason for this lower rate of production, tropical humans seem to compensate by converting more vitamin D into its active form. Although a single UV-B exposure produces less vitamin D3 in black subjects than in whites, the difference narrows after liver hydroxylation to 25-OHD and disappears after kidney hydroxylation to 1,25-(OH)2D. The active form of vitamin D is thus kept at a constant level, regardless of skin color (Matsuoka et al., 1991, 1995).

Robins (2009) notes that nearly half of all African Americans are classified as vitamin-D deficient and yet show no signs of calcium deficiency, which would be a logical result of vitamin D deficiency. Indeed, they “have a lower prevalence of osteoporosis, a lower incidence of fractures and a higher bone mineral density than white Americans, who generally exhibit a much more favourable vitamin D status.” He also cites a survey of 232 black (East African) immigrant children in Melbourne, Australia, among whom 87% had levels below 50 nmol/L and 44% below 25 nmol/L. None had rickets—the usual sign of vitamin-D deficiency in children (McGillivray et al., 2007).

In short, low vitamin D levels seem to be normal for African Americans and nothing to worry about. Such contrary evidence, however, doesn’t deter the vitamin D worrywarts:

Despite their low 25(OH)D levels, blacks have lower rates of osteoporotic fractures. This may result in part from bone-protective adaptations that include an intestinal resistance to the actions of 1,25(OH)2D and a skeletal resistance to the actions of parathyroid hormone (PTH). However, these mechanisms may not fully mitigate the harmful skeletal effects of low 25(OH)D and elevated PTH in blacks, at least among older individuals. Furthermore, it is becoming increasingly apparent that vitamin D protects against other chronic conditions, including cardiovascular disease, diabetes, and some cancers, all of which are as prevalent or more prevalent among blacks than whites. Clinicians and educators should be encouraged to promote improved vitamin D status among blacks (and others) because of the low risk and low cost of vitamin D supplementation and its potentially broad health benefits. (Harris, 2006)


The National Institutes of Health is now studying the benefits of giving African Americans mega-doses of vitamin D, in the hope of bringing their disease rates down to those of other Americans. "We're excited about the potential of vitamin D to reduce this health gap," says the study co-leader. "But it is important to get answers from clinical trials before recommending megadoses of this supplement." (see article)

Yes, it might be best to get a few answers first. Unfortunately, there are millions of people out there who are now taking mega-doses of vitamin D every day. The mass experiment has already begun and the results should be ready in a decade or so, particularly among African Americans.

But why wait? The same experiment was performed from the mid-1980s to 2009 on an African American. The results are now in …

Was MJ done in by the D men?

A local journalist recalled interviewing Michael Jackson three years ago and noted that this man, then in his mid-40s, had the withered look of a man far older than his years.

What was responsible? His repeated plastic surgeries? His starvation diet? His abuse of painkillers and tranquillizers? These are the usual suspects. In the shadows, however, lurks another suspect who will never be questioned.

Michael Jackson had probably been taking mega-doses of vitamin D. This regimen would have started when he began bleaching his skin in the mid-1980s to even out blotchy pigmentation due to vitiligo. Since this bleaching made his skin highly sensitive to UV light, his dermatologist told him to avoid the sun and use a parasol. At that point, his medical entourage would have recommended vitamin D supplements. How high a dose? We’ll probably never know, but there are certainly many doctors who recommend mega-doses for people who get no sun exposure.

Such a recommendation would have dovetailed nicely with Michael’s fondness for vitamins. A 2005 news release mentions vitamin therapy as part of his health program:

“He’s getting vitamin nutrients and supplements,” the source said.

This source would not elaborate on the type of supplements or the way in which they are being administered.

There is also an interview with his former producer Tarak Ben Ammar:

C'était un hypocondriaque et on savait jamais vraiment s'il était malade car il a été entouré de médecins charlatans qui vivaient de cette maladie, qui lui facturaient des milliers et des milliers de dollars de médicaments, de vitamines…

[He was a hypochondriac and one never really knew whether he was sick because he was surrounded by charlatan doctors who lived from this sickness, who billed him for thousands and thousands of dollars of medication, of vitamins …]

It’s known that Michael Jackson was receiving injections of the ‘Myers cocktail’ (a mix of vitamins and nutrients), but this mix doesn’t normally contain vitamin D. He was probably taking the vitamin in tablet form.

What effects can we expect from long-term use of vitamin D at high doses? Keep in mind that we are really talking about a hormone, not a vitamin. This hormone interacts with the chromosomes and will gradually shorten their telomeres if concentrations are either too low or too high. Tuohimaa (2009) argues that optimal levels may lie in the range of 40-60 nmol/L, well below the current recommended minimum of 75 nmol/L. Furthermore, staying within this optimal range may matter even more for populations of tropical origin, like African Americans, since their bodies have not adapted to the wide seasonal variation experienced by non-tropical humans.

If this optimal range is continually exceeded, the long-term effects may look like those of aging:

Recent studies using genetically modified mice, such as FGF23-/- and Klotho-/- mice that exhibit altered mineral homeostasis due to a high vitamin D activity showed features of premature aging that include retarded growth, osteoporosis, atherosclerosis, ectopic calcification, immunological deficiency, skin and general organ atrophy, hypogonadism and short lifespan.

… after the Second World War in Europe especially in Germany and DDR, children received extremely high oral doses of vitamin D and suffered hypercalcemia, early aging, cardiovascular complications and early death suggesting that hypervitaminosis D can accelerate aging.
(Tuohimaa, 2009)

Have we opened a Pandora’s box? Far from being a panacea, vitamin D could be an angel of death that will make millions of people old before their time.

Poor Michael. He looked to his doctors for eternal youth and they gave him premature old age.

References

Barger-Lux, J., & Heaney, R.P. (2002). Effects of above average summer sun exposure on serum 25-hydroxyvitamin D and calcium absorption, The Journal of Clinical Endocrinology & Metabolism, 87, 4952-4956.

Binkley, N., Novotny, R., Krueger, D., et al. (2007). Low vitamin D status despite abundant sun exposure, Journal of Clinical Endocrinology & Metabolism, 92, 2130-2135.

Ellis, G., Woodhead, J.S., & Cooke, W.T. (1977). Serum-25-hydroxyvitamin-D concentrations in adolescent boys, Lancet, 1, 825-828.

Harinarayan, C.V., Ramalakshmi, T., Prasad, U.V., Sudhakar, D., Srinivasarao, P.V.L.N., Sarma, K.V.S., & Kumar, E.G.T. (2007). High prevalence of low dietary calcium, high phytate consumption, and vitamin D deficiency in healthy south Indians, American Journal of Clinical Nutrition, 85, 1062-1067.

Harris, S.S. (2006). Vitamin D and African Americans, Journal of Nutrition, 136, 1126-1129.

Holick, M.F. (1995). Noncalcemic actions of 1,25-dihydroxyvitamin D3 and clinical applications, Bone, 17, 107S-111S.

Matsuoka, L.Y., Wortsman, J., Chen, T.C., & Holick, M.F. (1995). Compensation for the interracial variance in the cutaneous synthesis of vitamin D, Journal of Laboratory and Clinical Medicine, 126, 452-457.

Matsuoka, L.Y., Wortsman, J., Haddad, J.G., Kolm, P., & Hollis, B.W. (1991). Racial pigmentation and the cutaneous synthesis of vitamin D. Archives of Dermatology, 127, 536-538.

McGillivray, G., Skull, S.A., Davie, G., Kofoed, S., Frydenberg, L., Rice, J., Cooke, R., & Carapetis, J.R. (2007). High prevalence of asymptomatic vitamin-D and iron deficiency in East African immigrant children and adolescents living in a temperate climate. Archives of Disease in Childhood, 92, 1088-1093.

Robins, A.H. (2009). The evolution of light skin color: role of vitamin D disputed, American Journal of Physical Anthropology, early view.

Sedrani, S.H. (1984). Low 25-hydroxyvitamin D and normal serum calcium concentrations in Saudi Arabia: Riyadh region, Annals of Nutrition & Metabolism, 28, 181-185.

Tuohimaa, P. (2009). Vitamin D and aging, Journal of Steroid Biochemistry and Molecular Biology, 114, 78-84.

Thursday, July 2, 2009

Why are Europeans white?

Why are Europeans so pale-skinned? The most popular explanation is the vitamin-D hypothesis. Originally developed by Murray (1934) and Loomis (1967), it has been most recently presented by Chaplin and Jablonski (2009). It can be summarized as follows:

1. To absorb calcium and phosphorus from food passing through the gut, humans need vitamin D. This vitamin is either produced in the skin through the action of UV-B light or obtained from certain food sources, notably fatty fish.

2. Humans are often vitamin-D deficient, even in tropical regions where UV-B exposure is intense and continual. This deficiency has led to high frequencies of rickets in many populations, particularly western Europeans and North Americans during the great rickets epidemic from c. 1600 to the mid-20th century. This epidemic occurred in areas where human skin was already producing sub-optimal levels of vitamin D because of the naturally weak sunlight at northern latitudes. These levels then fell even further wherever the Industrial Revolution had reduced sun exposure through air pollution, tall buildings, and indoor factory life.

3. If ancestral humans were often sub-optimal for vitamin D, natural selection should have favored lighter skin color, as a way to produce more of this vitamin by allowing more UV-B into the skin. Such selection, however, would have been counterbalanced in the tropical zone by selection for darker skin, to prevent sunburn and skin cancer.

4. This equilibrium would have broken down once ancestral humans left the tropical zone. On the one hand, selection for darker skin would have relaxed, since sunburn and skin cancer were less of a threat. On the other, selection for lighter skin would have intensified, since less UV-B was available for vitamin-D production.

Ancestral humans thus began to lighten in skin color once they had entered Europe’s northern latitudes. This selection pressure eventually drove European skin color almost to the limit of depigmentation.


Were ancestral Europeans deficient in vitamin D?

There are several problems with the vitamin-D hypothesis. First, if lack of this vitamin created the selection pressure that led to white European skin, why are Europeans genetically polymorphic in their ability to maintain blood levels of vitamin D? At least two alleles reduce the effectiveness of the vitamin-D binding protein, and their homozygotes account for 9% and 18% of French Canadians (Sinotte et al., 2009). If lack of this vitamin had been so chronic, natural selection would surely have weeded out these alleles. And why does European skin limit vitamin-D production after only 20 minutes of UV-B exposure (Holick, 1995)? Why is such a limiting mechanism necessary?
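On the first point, a quick Hardy-Weinberg back-calculation shows just how common these binding-protein alleles must be. Assuming the population is near Hardy-Weinberg equilibrium (a simplifying assumption of mine, not something stated by Sinotte et al.), homozygote frequencies of 9% and 18% imply allele frequencies of roughly 30% and 42%:

```python
from math import sqrt

# Under Hardy-Weinberg equilibrium, a homozygote frequency of q^2 implies an
# allele frequency of q = sqrt(q^2). The homozygote figures are those cited
# above from Sinotte et al. (2009); the equilibrium assumption is mine.
for homozygote_freq in (0.09, 0.18):
    allele_freq = sqrt(homozygote_freq)
    print(f"Homozygotes at {homozygote_freq:.0%} imply an allele frequency near {allele_freq:.0%}")
```

Alleles at those frequencies are hardly what one would expect if chronic vitamin-D deficiency had been weeding them out.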

There is also little evidence that ancestral Europeans suffered from vitamin-D deficiency. Before the 17th century, we have only sporadic evidence of rickets in skeletal remains and even these cases may be false positives, as Wells (1975) notes:

It is likely that these low frequencies of rickets should be even lower because some of the authors quoted above have based their diagnoses on such features as plagiocrany (asymmetry of the skull), which may occur merely from cradling habits and other causes (Wells, 1967a) or on irregularities of the teeth, which probably result from many adverse factors in foetal life as well as in infancy.

On this point, Chaplin and Jablonski (2009) affirm: “Despite taphonomic biases, it [rickets] has been recognized in early archeological and Neolithic materials at the rate of 1-2.7% (a reasonably high selective value).” In fact, Wells (1975) reports no cases from Paleolithic Europe and only sporadic cases from Neolithic Europe. The range of 1-2.7% seems to apply to “a gradual, albeit slow, increase of the disease during the European Middle Ages” (Wells, 1975). Wells (1975) cites a series of Hungarian remains that indicate an increase in frequency from 0.7 to 2.5% between the 10th and 13th centuries. As Wells notes, even this low incidence is probably inflated by false positives.

Why is skin white only among Europeans?

The vitamin-D hypothesis raises a second problem. Why is white skin an outlier among the skin tones of indigenous human populations north of 45° N? Skin is much darker among people who are native to these latitudes in Asia and North America and who receive similar levels of UV-B at ground level. Murray (1934) attributes their darker skin to a diet rich in vitamin D:

One of the chief difficulties up to now in accounting for the origin of the white or unpigmented race has been the existence of the darkly pigmented Eskimo in these same dark sunless Arctic regions which we have been discussing as the probable original habitat of the white race. The unravelling of the causes of rickets has fully explained this anomaly. The Eskimo though deeply pigmented and living in a dark habitat, nevertheless is notoriously free from rickets. This is due to his subsisting almost exclusively on a fish oil and meat diet. Cod liver oil, as has been stated, is fully as efficient as sunlight in preventing rickets. Now the daily diet of the Eskimo calculated in antirachitic units of cod liver oil equals several times the minimum amount of cod liver oil needed to prevent rickets. Because of his diet of antirachitic fats, it has been unnecessary for the Eskimo to evolve a white skin in the sunless frigid zone. He has not needed to have his skin bleached by countless centuries of evolution to admit more antirachitic sunlight. He probably has the same pigmented skin with which he arrived in the far north ages ago.

This argument fails to explain why skin is equally dark among inland natives of northern Asia and North America who consume little fatty fish and yet show no signs of rickets. One might also point out that fatty fish has long been a major food source for the coastal inhabitants of northwestern Europe. According to carbon isotope analysis of 7,000-6,000 year old human remains from Denmark, the diet must have been 70-95% of marine origin (Tauber, 1981). Yet Danes are very pale-skinned.

Some have suggested that enough vitamin D could have been obtained from the meat of land animals, if eaten in sufficient quantities (Sweet, 2002). This has led to a revised version of the vitamin-D hypothesis: ancestral Europeans lightened in color when they made the transition from hunting and gathering to agriculture 8,000 to 5,000 years ago, and not when they first arrived some 35,000 years ago.

Do we know when Europeans became white? This change has been roughly dated at two gene loci. At SLC45A2 (AIM1), Soejima et al. (2005) have come up with a date of ~ 11,000 BP. At SLC24A5, Norton and Hammer (2007) suggest a date somewhere between 12,000 and 3,000 BP. These are rough estimates but it looks like Europeans did not turn white until long after their arrival in Europe. As a Science journalist commented: “the implication is that our European ancestors were brown-skinned for tens of thousands of years” (Gibbons, 2007). Thus, the original version of the vitamin-D hypothesis no longer seems plausible.

Of course, the revised vitamin-D hypothesis is still plausible, i.e., Europeans became pale-skinned after giving up hunting and gathering for agriculture. But this scenario does raise problems. For one thing, it would mean that many Europeans turned white at the threshold of history. In the case of Norway, agriculture did not arrive until 2400 BC and fatty fish, rich in vitamin D, have always been a mainstay of the diet (Prescott, 1996).

References

Chaplin, G., & Jablonski, N.G. (2009). Vitamin D and the evolution of human depigmentation, American Journal of Physical Anthropology, early view

Gibbons, A. (2007). American Association of Physical Anthropologists meeting: European skin turned pale only recently, gene suggests, Science, 316(5823), 364. doi:10.1126/science.316.5823.364a
http://www.sciencemag.org/cgi/content/summary/316/5823/364a

Holick, M.F. (1995). Noncalcemic actions of 1,25-dihydroxyvitamin D3 and clinical applications, Bone, 17, 107S-111S.

Loomis, W.F. (1967). Skin-pigment regulation of vitamin-D biosynthesis in Man, Science, 157, 501-506.

Murray, F.G. (1934). Pigmentation, sunlight, and nutritional disease, American Anthropologist, 36, 438-445.

Norton, H.L. & Hammer, M.F. (2007). Sequence variation in the pigmentation candidate gene SLC24A5 and evidence for independent evolution of light skin in European and East Asian populations. Program of the 77th Annual Meeting of the American Association of Physical Anthropologists, p. 179.

Prescott, C. (1996). Was there really a Neolithic in Norway? Antiquity, 70, 77-87.

Robins, A.H. (2009). The evolution of light skin color: role of vitamin D disputed, American Journal of Physical Anthropology, early view.

Sinotte, M., Diorio, C., Bérubé, S., Pollak, M., & Brisson, J. (2009). Genetic polymorphisms of the vitamin D binding protein and plasma concentrations of 25-hydroxyvitamin D in premenopausal women, American Journal of Clinical Nutrition, 89, 634-640.

Soejima, M., Tachida, H., Ishida, T., Sano, A., & Koda, Y. (2005). Evidence for recent positive selection at the human AIM1 locus in a European population. Molecular Biology and Evolution, 23, 179-188.

Sweet, F.W. (2002). The paleo-etiology of human skin tone.
http://backintyme.com/essays/?p=4

Tauber, H. (1981). 13C evidence for dietary habits of prehistoric man in Denmark, Nature, 292, 332-333.

Wells, C. (1975). Prehistoric and historical changes in nutritional diseases and associated conditions, Progress in Food and Nutrition Science, 1(11), 729-779.