The Dopaminergic Mind in Human Evolution and History
What does it mean to be human? There are many theories of the evolution of human behavior which seek to explain how our brains evolved to support our unique abilities and personalities. Most of these have focused on the role of brain size or specific genetic adaptations of the brain. In contrast, Fred Previc presents a provocative theory that high levels of dopamine, the most widely studied neurotransmitter, account for all major aspects of modern human behavior. He further emphasizes the role of epigenetic rather than genetic factors in the rise of dopamine. Previc contrasts the great achievements of the dopaminergic mind with the harmful effects of rising dopamine levels in modern societies and concludes with a critical examination of whether the dopaminergic mind that has evolved in humans is still adaptive to the health of humans and to the planet in general.
Fred H. Previc is currently a science teacher at the Eleanor Kolitz Academy in San Antonio, Texas. For over twenty years, he was a researcher at the United States Air Force Research Laboratory, where he researched laser bioeffects, spatial disorientation in flight, and various topics in sensory psychology, physiological psychology, and cognitive neuroscience. Dr. Previc has written numerous articles on the origins of brain lateralization, the neuropsychology of 3-D space, the origins of human intelligence, the neurochemical basis of performance in extreme environments, and the neuropsychology of religion.
This book is dedicated to mati and oce (in memoriam) and to Nancy, Andrew and Nicholas.
The Dopaminergic Mind in Human Evolution and History
Fred H. Previc
CAMBRIDGE UNIVERSITY PRESS
Cambridge, New York, Melbourne, Madrid, Cape Town, Singapore, São Paulo
Cambridge University Press
The Edinburgh Building, Cambridge CB2 8RU, UK
Published in the United States of America by Cambridge University Press, New York
© Fred H. Previc 2009
This publication is in copyright. Subject to statutory exception and to the provision of relevant collective licensing agreements, no reproduction of any part may take place without the written permission of Cambridge University Press.
First published in print format 2009
Cambridge University Press has no responsibility for the persistence or accuracy of URLs for external or third-party internet websites referred to in this publication, and does not guarantee that any content on such websites is, or will remain, accurate or appropriate.
Contents

1 What makes humans special?
1.1 Myths concerning the origins of human behavior
1.1.1 Was human intelligence genetically selected?
1.1.2 Did our larger brains make us more intelligent?
1.2 The evolution of human intelligence: an
1.2.1 Dopamine and advanced intelligence
1.2.2 The rise of dopamine during human evolution
2 Dopamine in the brain
2.1 The neurochemistry of dopamine
2.2 The neuroanatomy of dopamine
2.3 Dopamine and the left hemisphere
2.4 Dopamine and the autonomic nervous system
3 Dopamine and behavior
3.1 Dopamine and distant space and time
3.1.1 Dopamine and attention to spatially and temporally distant cues
3.1.2 Dopamine and goal-directedness
3.1.3 Dopamine and extrapersonal experiences
3.2 Dopamine and intelligence
3.2.1 Motor programming and sequencing
3.2.2 Working memory
3.2.3 Cognitive flexibility
3.2.4 Abstract representation
3.2.5 Temporal analysis/processing speed
3.3 Dopamine and emotion
3.4 The dopaminergic personality
3.4.1 Ventromedial dopaminergic traits
3.4.2 Lateral-dopaminergic traits
3.4.3 Dopamine and the left-hemispheric (masculine) style
4 Dopamine and mental health
4.1 The “hyperdopaminergic” syndrome
4.2 Disorders involving primary dopamine dysfunction
4.2.1 Attention-deficit/hyperactivity disorder
4.2.3 Huntington’s disease
4.2.4 Mania (bipolar disorder)
4.2.5 Obsessive-compulsive disorder
4.2.6 Parkinson’s disease
4.2.9 Tourette’s syndrome
5 Evolution of the dopaminergic mind
5.1 The importance of epigenetic inheritance
5.2 Evolution of the protodopaminergic mind
5.2.1 Environmental adaptations in the “cradle of humanity”
5.2.2 Thermoregulation and its consequences
5.3 The emergence of the dopaminergic mind in later evolution
5.3.1 The importance of shellfish consumption
5.3.2 The role of population pressures and cultural exchange
6 The dopaminergic mind in history
6.1 The transition to the dopaminergic society
6.2 The role of dopaminergic personalities in human history
6.2.1 Alexander the Great
6.2.2 Christopher Columbus
6.2.3 Isaac Newton
6.2.4 Napoleon Bonaparte
6.2.5 Albert Einstein
6.2.6 Dopaminergic personalities in history – reprise
6.3 The modern hyperdopaminergic society
7 Relinquishing the dopaminergic imperative
7.1 Reaching the limits of the dopaminergic mind
7.2 Tempering the dopaminergic mind
7.2.1 Altering dopamine with individual behavior
7.2.2 Knocking down the pillars of the hyperdopaminergic society
7.3 Toward a new consciousness
Figures

2.1 The chemical structure of dopamine and
2.2 The dopamine neuron and synapse
2.3 The cardinal directions and nomenclature used in
2.4 Some of the major dopamine systems, shown in a mid-sagittal view
3.1 The realms of interaction in 3-D space and
3.3 The dopaminergic exploration of distant space
3.4 An axial (horizontal) section of a human brain showing reduced dopamine D2 receptor binding (increased dopamine activity) in the left and right caudate nuclei in a reversal shift memory task
5.1 The hypothesized direction of modern human
6.1 Five famous dopaminergic minds in history: Alexander the Great, Christopher Columbus, Isaac Newton, Napoleon Bonaparte, and Albert Einstein
6.2 The progression of the dopaminergic mind
7.1 Restoring balance to the dopaminergic mind
Tables

3.1 Features of the two dopaminergic systems
4.1 Features of the major hyperdopaminergic disorders
4.2 Co-morbidity of the major hyperdopaminergic disorders
6.1 Dopaminergic traits in famous men of history
Acknowledgments

I wish to thank the many scientists who shared their ideas or findings with me and especially those who reviewed either large sections of this book or the book in its entirety (Dr. Britt Bousman, Mr. Jeff Cooper, Dr. Michael Corballis, Dr. Jaak Panksepp, and Dr. Julie Sherman). I also wish to thank Andrew Peart for his support in making the publication of this work possible.
Between two and three million years ago, a small creature hardly larger than a pygmy chimpanzee but with a much larger brain relative to its body weight began a remarkable journey. The initial part of that journey didn’t involve much by today’s standards, merely the ability to scavenge and possibly chase-hunt the creatures of the sub-Saharan African savannahs, to make some rather modest stone-flaked tools for that purpose, and eventually to migrate over the African and possibly the Eurasian land mass. This little creature, arguably our first unequivocally human ancestor, was known as Homo habilis (“handy” man). How the modest abilities of this first human emerged and were transformed into the prodigious human achievements and civilization that exist today is arguably the most important scientific mystery of all. The solution to this mystery will not only help to explain where and why we evolved as we did – it will additionally shed light on how we may continue to evolve in the future.
But, first, some basic questions must be asked, including: what is human nature and what is the basis of it? How much of human nature is related to our genes? Is human nature related to the size and shape or lateralization of our brain? How did human nature evolve? Although our hairless skin and elongated body make our appearance quite different from our primate cousins, it is not our anatomy but our unique brain and behavior that most people consider special. Typical behaviors considered uniquely human include propositional (grammatical) language, mathematics, advanced tool use, art, music, religion, and judging the intent of others. However, outside of religion, which has yet to be documented in any other extant species, at least one other – and, in some cases, several – advanced species have been shown to possess one or more of the above traits. For example, dolphins understand and can use simple grammar in their contact with humans (Herman, 1986) and probably use even more sophisticated grammar in their own ultrasonic communications. Certain avian species such as parrots can count up to ten (Pepperberg, 1990) and, like apes, use mathematical concepts such
as similarity and transitivity (Lock and Colombo, 1996). Orangutans display highly advanced tool use, including the preparation of tools for use in procuring food (van Schaik, 2006). As regards music and art, singing is a highly developed and plastic form of communication in songbirds (Prather and Mooney, 2004), apes have proven to be adept musical instrumentalists in their drumming (Fitch, 2006), and elephants and chimpanzees have been known to create realistic and abstract paintings.1 Finally, chimpanzees (but not monkeys) are able to determine the mental states of others and to engage in mirror self-recognition (Lock and Colombo, 1996), attributes normally considered part of a general mental capability known as the “theory of mind” (see later chapters).
What mostly defines humans, then, is not a unique ability to engage in a particular behavior but rather the way in which we perform it. Three features of human behavior are particularly salient: its context-independence, its generativity, and its degree of abstraction. Context-independent cognition, emphasized in the comparative analysis of Lock and Colombo (1996), refers to the ability to perform mental operations on new and different types of information in different settings. The behavior of chimpanzees may be viewed as much more contextually dependent than that of humans because it differs considerably depending on whether they are in the wild or in captivity; in the wild, for example, chimpanzees are relatively more likely to use tools but less likely to use symbols (Lock and Colombo, 1996). Generativity refers to the incredible amount and variety of human cognitive output – whether it be in the tens of thousands of words in a typical language’s lexicon, the almost limitless varieties of song and paintings, or the incredible technological progress that has continued largely unabated from the end of the Middle Stone Age to the present. Finally, the abstract nature of human cognition, similar to what Bickerton (1995) has referred to as “off-line” thinking and what Suddendorf and Corballis (1997) term “mental time travel,” strikingly sets humans apart from all other species, which engage largely in the present. While some species can use symbols, only humans can create abstract ones like numbers, words, and religious icons, and it is difficult to conceive of even such advanced creatures as chimpanzees and dolphins as going beyond a simple emotional concept of death or the fulfillment of a current motivationally driven state to such spatially and temporally distant religious concepts as heaven and eternity. Indeed, apes spend the vast majority of their waking lives in immediate, nearby activities (eating and grooming) (see Bortz, 1985; Whiten, 1990), and even Neanderthals
1 In fact, three paintings by a chimpanzee named Congo sold for 12,000 British pounds (over $20,000 US) in 2005 (http://news.bbc.co.uk/2/hi/entertainment/4109664.stm).
appear to have been more constrained in their spatial and temporal mental spheres (Wynn and Coolidge, 2004).
There are two major features that characterize all of the advanced cognitive skills in humans:
- they all appear to have first emerged between 50,000 and 80,000 years ago, first in Africa and later in Europe and elsewhere; and
- the context-independent, generative, and abstract expressions of these skills require high levels of a critical neurotransmitter in the brain known as dopamine.
Hence, the emergence of intellectually modern humans around 80,000 years ago arguably represented the beginning of what I will refer to as the “dopaminergic mind.” How that mind depends on dopamine, how it came to evolutionary fruition, and the dangers its continued evolution poses for the denizens of industrialized societies in particular will all be discussed in later chapters of this book. First, however, I attempt to refute commonly held explanations (myths) of how human nature evolved. The first myth is that the evolution of human intelligence was primarily a product of genetic selection, while the second is that the specific size, shape, or lateralization of our brain is critical for us to be considered human.
1.1 Myths concerning the origins of human behavior

1.1.1 Was human intelligence genetically selected?
There are many reasons to believe that the origin of advanced human behavior was at least partly controlled by genetic evolution. For one, estimates of the heritability of intelligence, based largely on twin studies that compare the concordance (similarity) of identical twins (which share the same genome) to fraternal twins (which only share the same genetic makeup as regular siblings), are around 0.50 (see Dickens and Flynn, 2001). There are also genetic differences between chimpanzees and modern humans on the order of about 1.2 percent (Carroll, 2003), which in principle could allow for selection for particular genes that may have helped produce the intellectual capabilities of modern humans. Certainly, advanced intelligence should help members of a species survive and reproduce, which according to Darwinian mechanisms should allow that trait to be passed on genetically to offspring. Indeed, it is highly likely that some genetic changes at least indirectly helped to advance human intelligence, although I will argue in Chapter 5 that most of these were probably associated with an overall physiological adaptation that occurred with the dawn of Homo habilis.
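The logic by which twin studies turn concordances into a heritability estimate can be sketched with Falconer’s classic formula, h² = 2(r_MZ − r_DZ). The concordance values below are hypothetical round numbers chosen for illustration, not figures from Dickens and Flynn (2001) or any other study cited here:

```python
# A minimal sketch of Falconer's formula for estimating heritability
# from identical (monozygotic, MZ) and fraternal (dizygotic, DZ) twin
# correlations. The inputs below are hypothetical illustrations.

def falconer_heritability(r_mz, r_dz):
    # MZ twins share ~100% of their genes, DZ twins ~50%, so doubling
    # the MZ-DZ gap isolates the genetic share of the variance.
    return 2.0 * (r_mz - r_dz)

# Hypothetical concordances chosen so the estimate lands near the
# ~0.50 heritability of intelligence cited in the text:
h2 = falconer_heritability(0.85, 0.60)
print(h2)
```

Note that the critique in the next paragraph amounts to saying that r_MZ is inflated for nongenetic reasons (shared prenatal environment, more similar rearing), which by this formula would inflate the heritability estimate itself.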
There are more compelling reasons, though, to believe that advanced human intellectual abilities are not primarily due to genetic selection. First of all, genetic expression and transmission have been documented to be modifiable at many levels by a wide variety of influences (especially maternal) that can themselves be passed to offspring in a mode known as “epigenetic inheritance” (Harper, 2005). Indeed, there are ongoing major increases in intelligence (Dickens and Flynn, 2001) and various clinical disorders (Previc, 2007) in the industrialized societies that are occurring despite stable or even opposing genetic influences. For example, the prevalence of autism, characterized by severely deficient social and communication skills, is dramatically increasing despite the fact that most individuals with autism never marry and thereby pass on their genes (see Chapter 4). Second, heritability estimates for intelligence and many other normal and abnormal traits may be overblown because fraternal twins do not share as similar a prenatal environment (a major source of epigenetic inheritance) as most identical twins due to the lack of a shared blood supply (Prescott et al., 1999) and because of the greater similarity of rearing in identical twins (Mandler, 2001). Third, dramatic changes in physiology, anatomy, and behavior are believed to occur when the timing of gene expression is affected by disturbances in key regulatory or hormonal centers such as the thyroid (Crockford, 2002; McNamara, 1995). Fourth, anatomical findings (McDougall et al., 2005) and genetic clock data (Cann et al., 1987; Hammer, 1995; Templeton, 2002; von Haeseler et al., 1996) clearly place the most recent ancestor common to all modern humans at around 200,000 years ago,2 yet there is little or no evidence of art, music, religion, beads, bone tools, fishing, mining, or any other advanced human endeavors until more than 100,000 years later (McBrearty and Brooks, 2000; Mellars, 2006; Shea, 2003). One hundred thousand years may not seem like a large amount of time, in that it only constitutes about 5 percent of the total time elapsed from the appearance of Homo habilis, but it is more than ten times longer than from the dawn of the most ancient civilization to the present. Finally, there is no convincing evidence that genetic factors have played any role whatsoever in one of the most striking of all human features – the functional lateralization of the brain (Previc, 1991).
Although it still remains to be determined exactly how many genes humans actually have, the current best estimate is around 20,000–25,000. Given the 1.2 percent genetic divergence between chimpanzees (our genetically closest living relative) and modern humans, there would first
2 Genetic clock estimates can be derived from the rates of mutation of various types of DNA (mitochondrial, y-chromosomal, etc.) and the known variations among extant human populations.
appear to be a sufficient amount of discrepant genetic material to account for neurobehavioral differences between us and our nearest primate relation. However, the vast majority of our genome appears to be nonfunctional “junk” DNA, and most of the remaining DNA is involved in gene regulation, with only a tiny percentage of the total DNA (<1.5 percent) actually used in transcribing the amino acid sequences that create proteins (Carroll, 2003). The “coded” sections of the human genome also appear to show less variation between humans and apes than the “non-coded” sections (Carroll, 2003; Mandel, 1996), and much of that difference relates to genes for the protein-intensive olfactory system.3 In fact, there is no evidence that any proteins, receptors, neurotransmitters, or other components of our basic neural machinery do not exist in chimpanzees (Rakic, 1996). Rather, most of the different genetic sequencing between chimpanzees and humans is in regulatory sections of the genome that affect gene expression (Carroll, 2003), presumably including those that affect brain and body development conjointly. As but one example, there are many genes that affect calcium production, which in turn helps regulate skeletal growth as well as the production of key brain transmitters (see Previc, 1999). Also, there are many genes that can affect the thyroid gland, which has an important influence on body metabolism, body growth, brain activity, and brain size and is arguably a major force for speciation during evolution (Crockford, 2002) and one of the few endocrine structures known to have altered its function during human evolution (Gagneux et al., 2001; Previc, 2002). It is likely, therefore, that changes in regulatory-gene activity and other factors that influence gene expression played some role in the evolution of humans, most probably in its earliest stages (see Chapter 5).4
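The scale of these percentages can be made concrete with some back-of-envelope arithmetic. The ~3.2-billion-base-pair genome length used below is an assumed round figure, not a number taken from the text:

```python
# Rough arithmetic on the genome figures discussed above. The total
# genome length (~3.2 billion base pairs) is an assumed round number.

GENOME_BP = 3.2e9        # assumed human genome length, in base pairs
DIVERGENCE = 0.012       # ~1.2% human-chimpanzee divergence (Carroll, 2003)
CODING_FRACTION = 0.015  # <1.5% of DNA transcribes proteins (Carroll, 2003)

divergent_bp = GENOME_BP * DIVERGENCE    # raw human-chimp sequence difference
coding_bp = GENOME_BP * CODING_FRACTION  # generous upper bound on coding DNA

print(f"divergent base pairs: {divergent_bp:.2e}")
print(f"coding base pairs (upper bound): {coding_bp:.2e}")
```

A 1.2 percent divergence thus amounts to tens of millions of differing base pairs, but because the coding portion of the genome is itself such a small slice, most of that divergence necessarily falls in non-coding (largely regulatory or “junk”) regions, which is the point of the paragraph above.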
To say that there may have been some influences on gene regulation in humans during the course of our evolution is more defensible than the notion that specific genes or sets of genes determine advanced human capabilities. Rarely does a single gene or small set of genes affect a major brain or non-brain disease, and higher cognitive capacities involve even more genes (Carroll, 2003). For example, the combined variance
3 The olfactory system of humans, for example, is believed to express ~500 receptor genes (Ressler et al., 1994), which is far fewer than in other mammalian species that rely on olfaction to a greater extent.
4 It has recently been claimed that the general mutation rate of genes related to brain growth has increased in humans relative to other primates faster than genes controlling general cellular function (Dorus et al., 2004), but the significance of this preliminary finding is unclear because it is not known whether the genes in question are specific to brain growth as opposed to body growth in general. Indeed, body height correlates with intelligence by roughly the same amount as brain size, and both relationships are subject to environmental influences (Nagoshi and Johnson, 1987; Schoenemann et al., 2000).
accounted for by several key genes known to contribute to intelligence and to various clinical disorders in humans is less than 10 percent (Comings et al., 1996). The polygenic nature of higher cognition is not surprising when one considers the many cognitive skills – discussed in much greater detail by Previc (1999) and in Chapter 3 – that are required for listening to, comprehending, and responding appropriately to a simple sentence such as “Build it and they will come.” First, a motor system must recreate in our own minds what is being said; second, an incredibly rapid auditory processor must decode a multitude of acoustic transients and phonemes; third, a capability for abstraction serves to link the spoken words to their correct meaning; fourth, working memory is required to keep the first clause of the sentence in mind as we await the final one; fifth, cognitive flexibility is needed to realize that, after hearing the second part of the sentence, the first part isn’t about construction but expresses a more profound thought; sixth, an ability to judge speaker intent aids in further recognizing that this sentence as spoken by a particular individual (e.g. a philosopher) is not about construction; and finally, there must be an ability to assemble and correctly sequence a collection of phonemes that provides a spoken response that we (or any other individual) may have never uttered before. Despite all of this, some researchers such as Pinker and Bloom (1990) have postulated that a single gene or small set of genes may have mutated to create specific language capabilities (e.g. grammar) only found in humans. Indeed, there was great excitement in the scientific world that a “grammar gene” had been identified in a small English family of supposedly grammar-deficient individuals (Gopnik, 1990), who were later shown to have a mutation of a gene known as “Foxp2” (Lai et al., 2001). There eventually turned out to be several major problems with this finding, however.
The first was that the affected family members did not have a selective loss of grammar but rather exhibited many other language problems as well as severe speech articulation difficulties, an inability to carry out simple facial gestures (like winking), behavioral disorders such as autism, and even nonlinguistic cognitive deficits (their average nonverbal intelligence quotient was found to be only eighty-six, or fourteen points below their unaffected relatives) (Vargha-Khadem et al., 1995). Moreover, the Foxp2 gene mutation turns out not to be associated with the deficits exhibited by most individuals with specific language impairments (Newbury et al., 2002), nor does the human Foxp2 gene resemble that of other species (e.g. avians and dolphins) that possess advanced vocal communication skills (Webb and Zhang, 2005). The final factor mitigating the importance of the Foxp2 gene in human linguistic evolution comes from a recent DNA finding in
Neanderthals, from whom the ancestors of modern humans diverged nearly 400,000 years ago. At least one of the two major variants of the modern human Foxp2 gene relative to that of chimpanzees was once thought to have occurred as recently as 10,000 years ago (Enard et al., 2002), or long after the emergence of the common human genome. However, an analysis of the DNA of Neanderthals shows that they, too, possessed both modern human Foxp2 variants (Krause et al., 2007), which indicates that these variants must be at least 400,000 years old given the estimated date of divergence of the Neanderthal and modern human lineages (Chapter 5).
Another phenomenon tied to the evolution of humans is the lateralization of the human brain for advanced cognitive functions. Two of the most well-known manifestations of cerebral lateralization are the overwhelming and universal preponderance of right-handedness in humans – about 85–90 percent of individuals in Western societies exhibit some form of right motor dominance – and the greater likelihood of suffering serious speech and language deficits (known as aphasias) following damage to the left hemisphere in adulthood.5 Although brain lateralization of some sort or another is common in the animal world, the degree of functional lateralization of the human brain is remarkable compared to that of other mammalian brains and especially that of the chimpanzee. Indeed, one of the great triumphs of modern neuroscience was the demonstration, mainly through studies of “split-brain” patients in which the connections between the hemispheres were severed to relieve epilepsy (Gazzaniga, 2005), that the left and right hemispheres of most humans not only differ in their linguistic capabilities but also possess very different personalities (the left is more active, controlling, and emotionally detached while the right is more earthy and emotional) and intellects (the left is more analytical, abstract, and future-oriented while the right is more concrete, better at judging emotional and mental states, and better at visual manipulations, especially 3-D geometrical ones in body space). Indeed, the cognitive and personality differences between the left and right hemispheres of most humans are greater than between almost any two humans, and the specialized functions of the left hemisphere arguably render it almost as dissimilar to the right hemisphere as human intellectual functions in general are to those of chimpanzees.6
5 It is generally accepted that about 90–95 percent of right-handers and about 70 percent of left-handers possess left-hemispheric dominance for speech (see Previc, 1991).
6 Indeed, Gazzaniga (1983) has even gone so far as to describe the cognitive skills of an isolated right hemisphere as “vastly inferior to the cognitive skills of a chimpanzee” (p. 536).
Although many theorists such as Annett (1985) and Crow (2000) have posited that left-hemispheric language dominance is largely determined by a single gene – and despite evidence that, at least in some species, the overall direction of body asymmetry is subject to genetic influences (Ruiz-Lozano et al., 2000) – the evidence is strongly against a genetic explanation for brain lateralization in humans. First, the likelihood of one member of a twin pair having the same hand dominance as the other is no greater for identical than for fraternal twins (Reiss et al., 1999),7 and speech dominance for monozygotic twin pairs shows a similarly weak concordance (Jancke and Steinmetz, 1994). Second, neither handedness nor speech lateralization (see Tanaka et al., 1999; Woods, 1986) appears to be related to the genetically influenced asymmetrical position of the major body organs such as the heart, which, in any case, is the same in humans as in chimpanzees. Third, there does not appear to be any evolutionary advantage conferred by the typical pattern of left-hemispheric dominance for handedness, as left-handers and right-handers on average do not differ in academic or athletic achievement or any other personality variables (see Hardyck et al., 1976; Peters et al., 2006; Previc, 1991), although there may be very slight deficits for some individuals with ambiguous dominance (Peters et al., 2006).8 Fourth, the development of cerebral lateralization is heavily dependent on both cultural and prenatal factors. As an example of cultural factors, aphasia following left-hemispheric damage was very uncommon a few centuries ago in Europe, when the vast majority of adults were illiterate and not exposed to the left–right reading and writing of Western languages, and right-handedness remains much less prevalent in existing illiterate populations (see Previc, 1991).
As an example of prenatal factors, handedness and other forms of motoric lateralization are greatly reduced in otherwise normal infants born before the beginning of the third trimester and are affected by fetal positioning in the final trimester, which may be crucial as a source of early
7 Although a greater concordance between identical twins usually (but not always) implies at least some genetic influence, the absence of a greater identical-twin concordance almost certainly rules out such an influence. In a meta-analysis by Sicotte et al. (1999), which did not include the Reiss et al. (1999) study, a significantly greater percentage of dizygotic twins was found to be discordant for handedness, but this difference averaged across twenty-eight studies was less than 2 percent (21.03 percent for monozygotic twins versus 22.97 percent for dizygotic twins) and can easily be accounted for by the different child-rearing of the two twin types.
8 Nonright-handedness does appear to be slightly more associated with both extreme giftedness and mental retardation, for largely nongenetic reasons (see Previc, 1996), but handedness certainly does not predict intelligence in the vast majority of humans (Hardyck et al., 1976).
Finally, the notion that language and language-linked brain lateralization are determined genetically is contradicted by the nature of human language as a very robust behavior that does not depend on a particular sensory modality (e.g. hearing) or motor system (e.g. speech). For example, individuals who cannot speak or move their hands can communicate with their feet, and those who cannot hear or see can use their hands to receive messages. Humans have invented languages dependent on speech sounds but also on manual signs, tactile signals, fundamental (musical) frequencies, visual icons, clicks, whistles, and probably other signals as well, all demanding many of the same skills described above for speech comprehension and production. Moreover, the mechanisms of language have expropriated the same systems used in more basic motor functions such as chewing, hand movements, and eye movements, the latter two of which accompany linguistic thought (Kelso and Tuller, 1984; Kingston, 1990; McGuigan, 1966; Previc et al., 2005). And the fact that speech is housed mostly in the left hemisphere of humans certainly doesn’t imply a causal (or, more specifically, a genetic) linkage, because the loss in early life of the left hemisphere does not affect subsequent language ability in any measurable way (see next section). Indeed, a pure “language” gene/protein would have to be a strange one in that it would have to:
- affect language at a superordinate level, independent of any particular sensorimotor modality;
- affect one hemisphere more than another, even though the lateralization process does not appear to be under genetic control and even though language proceeds just fine in the absence of the originally favored hemisphere; and
- affect no other sensorimotor or cognitive systems, even though these other systems are closely tied to language processing and output and are, in some cases, necessary for language to occur.
In summary, a direct, major role of genetic selection in language and other higher-order cognitive functions is unlikely. This is consistent with the fact that all major intellectual advances during human evolution proceeded in sub-Saharan Africa (McBrearty and Brooks, 2000; Previc, 1999), even though ancestral humans had populated wide swaths of Africa, Europe, and Asia nearly two million years ago. If cognitive ability – and not physiological and dietary adaptations, which occurred mostly in sub-Saharan Africa, for reasons to be discussed in Chapter 5 – was the major trait genetically selected for, then why were the other regions of the world, in which cognitive ability would have also proven beneficial, unable to rival sub-Saharan Africa as the cradle of human evolution?
1.1.2 Did our larger brains make us more intelligent?
The second “myth” concerning human evolution – that we got smarter mainly because our brains got bigger – remains very popular, even among researchers in the field. Yet, there are even more powerful arguments against this view than against the genetic selection theory. After all, elephants have far bigger brains than humans do, yet they are hardly considered intellectual giants; conversely, birds have very small brains (hence, the derogatory term “bird-brain”), but we now realize that some bird species (e.g. parrots) actually possess relatively advanced cognitive capacities, such as language, arithmetic, and reasoning skills (Pepperberg, 1990).
Accordingly, most brain scientists accept that a better measure than brain size for predicting intelligence is the brain-to-body weight ratio; using this measure, humans fare very well, along with other creatures that we might consider intelligent (chimpanzees, dolphins, parrots). However, there are problems even with this measure, because the lowly tree shrew – a small, energetic creature that was an early ancestor of primates such as monkeys but is hardly noted for its intellectual prowess – ranks above all others in brain–body ratio (Henneberg, 1998). Moreover, the correlation between brain/body size and intelligence in humans has generally been shown to be very modest, with a typical coefficient that is barely more than the correlation between height and intelligence (~0.3) (see Previc, 1999). Since no researchers have claimed that height is causally related to intelligence, there is no reason to assume that the equally modest relationship between brain size and intelligence is causal either. Moreover, when examining the relationship between brain size and intelligence within families, to control for dietary and other environmental differences that vary among families, the correlation becomes essentially random (Schoenemann et al., 2000). Indeed, there are, even among humans of normal body size, great variations in brain size, ranging normally from 1,000 cc to over 1,500 cc, and some of the most brilliant minds throughout history have actually had estimated brain sizes toward the low end of that range. The Nobel prize-winning novelist Anatole France had a brain size of only 1,000 g – about the same as the human ancestor Homo erectus, who lived over a million years
ago – and most individuals with microcephaly (extremely small brains) without other associated disorders such as Down’s syndrome, growth retardation etc. tend to be of normal intelligence (see Skoyles, 1999, for a review). For example, one well-studied young mother given the moniker “C2” is estimated to have a brain size of around 740 cc (at the low end of the Homo erectus range), despite an estimated intelligence quotient (IQ) of 112 (above that of the average human). Finally, the importance of brain size to intelligence is dubious from an evolutionary perspective, in that most of the increase in brain-to-body size in humans over the past million years is explained not by an increase in brain size but rather by a decrease in the size of our digestive tract, which was arguably made possible by the reduced digestive demands associated with the increased consumption of meat and the cooking of plant foods (Henneberg, 1998). Ironically, the average human brain actually shrank over the past 100,000 years or so from 1,500 cc to 1,350 cc (Carroll, 2003), despite the aforementioned explosion of human intellectual capability (see Chapter 5).
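To put the modest ~0.3 coefficient in perspective, squaring a correlation gives the proportion of variance it explains, so even taken at face value, brain size would account for less than a tenth of the variation in intelligence. A quick check of this standard statistical identity (the 0.3 figure is simply the one quoted above, not a new estimate):

```python
# Variance in intelligence "explained" by brain size, taking the
# reported correlation of ~0.3 at face value (r-squared identity).
r = 0.3
print(f"variance explained: {r**2:.0%}")  # prints "variance explained: 9%"
```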
Consequently, researchers have used yet another measure, which compares the relative size of different structures such as the cerebral cortex – the outer, mostly gray mantle or “bark” of our brains, on which most of our higher-order cognitive capacities depend (hence, the positive connotations of having lots of “gray matter”) – relative to the brain distribution of an insectivore such as the tree shrew. By this measure, also known as the progression index, the human neocortex is by some accounts effectively 2.5 times larger than the chimpanzee’s relative to the rest of the brain (Rapoport, 1990), although this has been disputed (Holloway, 2002). Even more strikingly, the area of the neocortex associated with mental reasoning – the prefrontal lobes – occupies 29 percent of the neocortex in humans but only 17 percent in the chimpanzee (Rapoport, 1990), although a more recent review argued for no difference in relative frontal-lobe size between humans and other apes (Semendeferi et al., 2002). At first glance, this suggests that the size of at least one portion of our brain may indeed account for the intellectual advancement of humans relative to the great apes.
However, the attribution of human intelligence to a larger neocortex or larger prefrontal cortex in particular is as erroneous as the notion that overall brain size is relevant to human intelligence. For one, an individual by the name of Daniel Lyon, who had a brain of only 624 cc but essentially normal intellect, is believed to have had an especially small cerebral cortex relative to the size of the rest of his brain (Skoyles, 1999). Moreover, research has shown that removal of only the prefrontal cortex in infancy produces no long-term deficits in intelligence in monkeys
(Tucker and Kling, 1969). In fact, removal of major portions of the left hemisphere in infancy produces remarkably few long-term linguistic and other intellectual decrements (de Bode and Curtiss, 2000), even though in normal adult humans the left hemisphere is the predominant hemisphere for most linguistic and analytic intelligence. One child who received a hemispherectomy even ended up with above-average language skills and intelligence, despite the fact that his entire left hemisphere had been removed at the relatively late age of five-and-a-half years and he had disturbed vision and motor skills on one side of his body due to the surgery (Smith and Sugar, 1975). Even more striking is the finding of Lorber (1983), who described many cases of children with hydrocephalus, which occurs when overproduction of cerebrospinal fluid in the center of the brain puts pressure on the cerebral cortex and, if left untreated, eventually compresses and damages it. In one of Lorber’s most dramatic cases, a child with only 10 percent of his cerebral cortical mantle remaining eventually ended up with an overall intelligence of 130 (in the genius range), with a special brilliance in mathematics (Lorber, 1983). Somewhat cheekily, Lorber went so far as to entitle his famous chapter “Is your brain really necessary?”
The failure of large-scale cortical removal in infants and young children to substantially affect subsequent intelligence is complemented by the dramatically different intelligences that exist for similarly sized brains. The most striking example of this involves the left and right hemispheres, which are almost identical in weight and shape, aside from a few minor anatomical differences that appear to have little functional significance (see Previc, 1991). Yet, as noted earlier, it would be hard to imagine two more different intellects and personalities. The left hemisphere is impressive at abstract reasoning, mathematics, and most language functions, yet it has difficulty in interpreting simple metaphors and proverbs, in judging the intent of others, and in performing other simple social tasks. By contrast, the right hemisphere is poor at most language functions (it has the grammar of a six-year-old and the vocabulary of an eleven-year-old) and does poorly on logical reasoning tests, yet it is superior to the left hemisphere at proverb interpretation, understanding the intent of others, self-awareness, emotional processing, social interaction, certain musical tasks, and 3-D geometry (Gazzaniga, 2005). Another important example of the stark contrast between an anatomically normal brain and severe abnormalities in higher mental functioning involves the disorder known as phenylketonuria. In this genetic disorder, the enzyme phenylalanine hydroxylase is absent, so phenylalanine cannot be converted to tyrosine (the precursor to the neurotransmitter dopamine), resulting in a buildup of phenylpyruvic acid and a
decrease in tyrosine (as well as dopamine). Because these problems only emerge after birth, when the basic size and shape of the brain has already been established, the brains of persons suffering from PKU appear grossly normal, even though those with PKU suffer severe mental retardation if their excess phenylalanine is not treated by dietary restrictions.
In conclusion, there are compelling reasons to reject as myth the standard view that the evolution of human intelligence and other advanced faculties was determined by direct genetic influences that conspired to change the size and shape of the human brain. On the basis of his own findings with hydrocephalic children, Lorber (1983: 12) concluded that there is an “urgent need to think afresh and differently about the function of the human brain that would necessitate a major change in the neurological sciences.” Unfortunately, the revolution in the perspective of the neuroscientific community at large has yet to occur.
1.2 The evolution of human intelligence: an alternative view
1.2.1 Dopamine and advanced intelligence
The pervasiveness of the myth that the ability of humans to think and create in advanced ways depends on the overall size of our brains is surprising, in that few of us would automatically conclude that a bigger computer is a better one. Indeed, some of the massive early computers had less than one-billionth the speed and capacity of current notebook computers. Rather, it is how the system works – a collective product of such functions as internal speed, amount of parallel processing etc., known as its “functional architecture” – that largely determines its performance. In fact, by any stretch of the imagination, our brain is far larger than it would ever have to be to perform the advanced intellectual functions that we evolved. For example, the number of nerve cells in it (100 billion, as a generally accepted estimate) times the average number of connections per nerve cell (10,000, another generally accepted estimate) times the number of firings per second (up to 1,000, for rapidly firing neurons) allows our brain to perform a number of calculations per second (10^18) comparable to our very best state-of-the-art computers. While using but a fraction of their hardware capabilities, such computers can crunch massive numbers in microseconds, generate real-world scenes in milliseconds, understand the complex syntax of language, and play chess better than the greatest of humans.
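The back-of-envelope estimate above can be reproduced in a few lines (a sketch only; the neuron, connection, and firing-rate numbers are the rough, generally accepted estimates quoted in the text, not measured values):

```python
# Rough upper bound on the brain's computational throughput,
# multiplying the three estimates quoted in the text.
neurons = 100e9                # ~100 billion nerve cells
connections_per_neuron = 1e4   # ~10,000 synaptic connections each
max_firings_per_second = 1e3   # up to ~1,000 firings per second

operations_per_second = neurons * connections_per_neuron * max_firings_per_second
print(f"{operations_per_second:.0e}")  # prints "1e+18"
```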
What, then, is the essence of why we are so intelligent relative to the rest of the animal world, and especially other primates? Perhaps the most
important clue – indeed, the “Rosetta Stone” of the brain – lies in the differences between the left and right hemispheres of the human brain.9 As already noted, it is our left hemisphere and its grammatical, mathematical, and logical reasoning skills that most obviously differentiate our intellectual capability from that of chimpanzees, despite the comparable size and overall shape of its right hemispheric counterpart and the lack of a known gene underlying its advanced intellect. The right hemisphere is marginally heavier and the left hemisphere may have slightly more gray matter, but the left–right differences are far smaller than those between the brains of two different humans. There also typically exist larger right frontal (anterior) and left occipital (posterior) protrusions, as if the brain were slightly torqued to the right, as well as some differences in the size, shape, and neural connections of the ventral prefrontal and temporal-parietal regions of cortex that house, among other things, the anterior and posterior speech centers (Previc, 1991). However, there is evidently no functional significance to the torque to the right, and the other changes do not appear to be causally linked to the functional lateralization of the brain, since they generally occur after the process of cerebral lateralization is well under way (see Previc, 1991). A much more likely candidate for why the left and right hemispheres differ so much in their functions is the relative predominance of four major neurotransmitters that are used to communicate between neurons at junctures known as synapses. The four most important lateralized neurotransmitters are dopamine, norepinephrine, serotonin, and acetylcholine.10 On the basis of a wide variety of evidence (see Flor-Henry, 1986; Previc, 1996; Tucker and Williamson, 1984), it is generally accepted that dopamine and acetylcholine predominate in the left hemisphere and that norepinephrine and serotonin predominate in the right hemisphere.
The latter two transmitters are heavily involved in arousal, which explains why the right hemisphere is generally more involved in emotional processing. That leaves dopamine and acetylcholine as the two most likely candidates for understanding why the left hemisphere of humans has evolved such an advanced intellect, with at least five suggestions that it is dopamine rather than acetylcholine that
9 The Rosetta Stone was a tablet that contained Ancient Greek and Demotic script as well as Egyptian hieroglyphics, discovered during Napoleon’s occupation of Egypt in 1799 and initially translated by the French linguist Champollion. By matching the different symbol sets, Champollion was able to decipher the previously mysterious hieroglyphic language and in so doing provided an important (if not the most important) clue for the understanding of ancient Egyptian culture and society.
10 The lateralizations of two other important neurotransmitters – glutamate and gamma-aminobutyric acid, or GABA – are not as well-studied and are less interesting in any case from an evolutionary standpoint, for reasons to be discussed later.
underlies human intelligence. Three of these pertain to how dopamine is distributed in the brain, and the other two pertain to the known role of dopamine in cognition.
The three major arguments for a role of dopamine in advanced intelligence are:
- dopamine is highly concentrated in all nonhuman species with advanced intelligence;
- dopaminergic innervation of the brain expanded during primate evolution and has continued to expand in humans; and
- dopamine is especially rich in the prefrontal cortex, the single most important brain region involved in mathematics, reasoning, and planning.
Nonhuman brains of other advanced species such as parrots, dolphins, and primates differ in many fundamental respects, but they are similar in that all have relatively high amounts of dopamine (Previc, 1999). Most striking is the tiny overall size and almost absent cortical mass of birds, which nonetheless have a well-developed dopamine-rich striatum (Waldmann and Gunturkun, 1993) and nidopallium caudolaterale that support their impressive mathematical and linguistic competencies. Indeed, the latter structure is considered analogous to the mammalian prefrontal cortex because of its high percentage of input from subcortical dopamine regions and its critical role in cognitive shifting and goal-directed behavior (Gunturkun, 2005). As regards the second argument, the dopaminergic innervation of the cerebral cortex is known to have undergone a major expansion in primates; for example, dopamine is limited to specific brain areas in rodents but is densely represented in certain cortical layers in all regions of the monkey brain (see Gaspar et al., 1989). Moreover, the dopaminergic expansion has continued into humans, judging from the almost two-fold increase (adjusted for overall brain size) in the size of the human caudate nucleus – in which dopamine is most densely concentrated – relative to that of the chimpanzee caudate (Rapoport, 1990). By contrast, there is no evidence that any other transmitter has expanded as much, if at all, during human evolution, and the cholinergic content of the human cerebral cortex may have actually diminished (Perry and Perry, 1995). Finally, dopamine appears to be especially well-represented in the prefrontal cortex of humans, and
11 The term “hominid” has traditionally referred only to those higher primates whose lineage directly led to modern humans, but more recently this term has been extended to apes and a new term – “hominin” – has been devised to apply to the specifically human lineage. The traditional usage of “hominid” will nonetheless be retained throughout this book.
chemical removal of dopamine from an otherwise intact prefrontal cortex essentially duplicates all of the intellectual deficits produced by outright damage to this region (Brozoski et al., 1979; Robbins, 2000). One major feature of the prefrontal cortex is its ability to recruit other cortical regions in performing parallel mental operations, which is likely to be dopaminergically mediated because dopamine is well-represented in the upper layers of the cerebral cortex, where the connections to the other cortical regions mostly reside (Gaspar et al., 1989; Chapter 2).
Two other reasons for linking dopamine with advanced intelligence are its direct roles in normal intelligence and in the intellectual deficits found in clinical disorders in which dopamine levels are reduced. First, far more than acetylcholine, dopamine is involved in six major skills underlying advanced cognition: motor planning and execution; working memory (which allows us to engage in parallel processing because we can process and operate on different types of material at the same time); cognitive flexibility (mental shifting); temporal processing/speed; creativity; and spatial and temporal abstraction (see Chapter 3 for a greater elucidation). Working memory and cognitive flexibility are considered the two most important components of what is known as “executive intelligence,” and tasks that assess it have been shown by brain imaging to directly activate dopamine systems in the brain (Monchi et al., 2006). Enhanced parallel processing and processing speed, which modern computers rely on to achieve their impressive processing power, are particularly associated with high general intelligence in humans (Bates and Stough, 1998; Fry and Hale, 2000), as are dopamine levels themselves (Cropley et al., 2006; Guo et al., 2006). Second, dopamine is arguably the neurotransmitter most involved in the intellectual losses in a number of disorders, including Parkinson’s disease and even normal aging, phenylketonuria, and iodine-deficiency syndrome (see Chapter 4). In phenylketonuria, for example, the genetically mediated absence of the phenylalanine hydroxylase enzyme prevents the synthesis of tyrosine, an intermediary substance in the synthesis of dopamine by the brain (Diamond et al., 1997; Welsh et al., 1990).
Neurochemical imbalances rather than neuroanatomical abnormalities are believed to be the major cause of (and basis of treatment for) almost every brain-related psychological disorder. In particular, either elevated or diminished dopamine contributes in varying degrees to Alzheimer’s disease and normal aging, attention-deficit disorder, autism, Huntington’s disease, iodine-deficiency syndrome (cretinism), mania, obsessive-compulsive disorder, Parkinson’s disease, phenylketonuria, schizophrenia, substance abuse, and Tourette’s syndrome, all of which are associated with changes in cognition, motor function, and/or motivation (see Previc, 1999; Chapter 4).
The importance of dopamine to these disorders partly explains why dopamine is by far the most widely studied neurotransmitter in the brain. For example, in the exhaustive Medline database listing studies involving different neurotransmitters, dopamine was the subject of over 60,000 brain articles through 2008, whereas serotonin – the next most-studied neurotransmitter and a substance that has important interactions with dopamine – was the subject of about 38,000 brain articles, and acetylcholine (the other predominant left-hemispheric transmitter) was the subject of fewer than 17,000.
1.2.2 The rise of dopamine during human evolution
If the principal reason for the uniqueness of human intellectual and other behavior is that the neurochemical balance in our brains favors dopamine, the remaining great question is how dopamine ended up being so plentiful in the human brain, especially in its left hemisphere. Certainly, there are no new genes that appeared in humans that control the production of the major dopamine synaptic receptors (Previc, 1999), and no variation in these genes within humans seems to strongly correlate with variations in intelligence (Ball et al., 1998). As will be discussed further in Chapter 5, it is more likely that dopamine levels may have been indirectly altered by genetic changes that affected calcium production, thyroid hormones, or some more general physiological mechanism or adaptation, given that both calcium and thyroid hormones are involved in the conversion of tyrosine to dopa (the immediate precursor to dopamine) in the brain. The stimulatory effect of thyroid hormones on both skeletal growth and dopamine metabolism, as well as the stimulatory effect of dopamine on growth hormone, are especially attractive mechanisms for explaining the triple convergence of intelligence, brain size, and skeletal height. The importance of the thyroid hormones is also suggested by the previously noted finding that elevated levels of thyroid hormone in humans represent the first confirmed endocrinological difference between chimpanzees and humans (Gagneaux et al., 2001).
As reviewed by Previc (1999, 2007), there are even more plausible nongenetic explanations for why dopamine levels increased during human evolution. As will be discussed further in Chapters 4 and 5, the most likely candidate for a nongenetic or epigenetic inheritance of high dopamine levels is the ability of maternal factors – specifically, the mother’s neurochemical balance – to influence levels of dopamine in offspring. Not only have dopaminergic systems been shown to be influenced by a host of maternal factors, but there is also compelling evidence that the
neurochemical balance of humans has, in fact, changed in favor of increasing dopamine due to a combination of five major factors:
- a physiological adaptation to a thermally stressful environment (which requires dopamine to activate heat-loss mechanisms);
- increased meat and shellfish consumption (which led to greater supplies of dopamine precursors and conversion of them into dopamine);
- demographic pressures that increased competition for resources and rewarded dopaminergically mediated achievement motivation;
- a switch to bipedalism, which led to asymmetric vestibular exposure in fetuses during maternal locomotion and resting and ultimately elevated dopamine in the left hemisphere of most humans (see Previc, 1991); and
- major increases in the adaptive value of dopaminergic traits such as achievement, conquest, aggression, and masculinity beginning with late-Neolithic societies.
The link between bipedalism and brain lateralization exemplifies how epigenetic transmission could have become a seemingly permanent part of our inheritance during our evolution. Bipedalism, together with asymmetric prenatal positioning, creates asymmetrical gravitoinertial forces impacting the fetus, which may in turn create asymmetrical vestibular functioning and neurochemical differences between the hemispheres (Previc, 1991, 1996). Although the ultimate cause of the neurochemical lateralization may be nongenetic, the switch to bipedalism was a permanent behavioral fixture, so the resulting cerebral lateralization would continue for all future generations of humans and superficially appear as if it had become part of the genome itself.
Before addressing the changes in human brain dopamine levels during evolution and history in Chapters 5 and 6, it will first be necessary to further detail the nature of dopaminergic systems in the human brain (Chapter 2) and dopamine’s role in normal and abnormal behavior (Chapters 3 and 4). Finally, Chapter 7 will discuss the consequences of the “dopaminergic mind” not only for humans but for other species. As part of that critique, the impressive accomplishments of the dopaminergic mind throughout history will be weighed against the damage it has caused to itself, to others, and to the Earth’s ecosystems. It will be concluded that for humans to prosper (and perhaps even survive), the dopaminergic imperative that propelled humans to such great heights and more than anything else defined us as humans must in the end be relinquished.
2 Dopamine in the brain
In this chapter, I will attempt to briefly describe what dopamine consists of, where it is located in the brain, and what basic actions it has. Much of this information will be applied in Chapters 3 and 4 with reference to the roles of dopamine in normal and abnormal behavior.
2.1 The neurochemistry of dopamine
Like all brain neurotransmitters, dopamine is a chemical that contains most of the important building blocks of life – carbon, hydrogen, oxygen, and nitrogen. It is a phylogenetically old transmitter, found in primitive lizards and reptiles existing tens of millions of years ago. The chemical structure of dopamine is shown in Figure 2.1. Dopamine is known as a catecholamine, which derives from its having a catechol group and an amine group that are joined by an additional carbon pair. The catechol group consists of a hexagonal benzene (carbon-bonded) ring with two hydroxyl (oxygen and hydrogen, or OH) groups. The amine group is a molecule comprised of an atom of nitrogen and two atoms of hydrogen (NH2). The chemical structure of dopamine is not all that special, in that its atoms and molecules derive from some of the most common elements on Earth, and especially those found in organic compounds. Carbon, in particular, is essential to life on Earth, partly because it so readily makes bonds with other biological molecules. Another well-known catecholamine that is closely related to dopamine in its chemical structure is norepinephrine (also known as noradrenaline), which is synthesized from dopamine by adding an oxygen atom to one of the hydrogen atoms to form an additional hydroxyl group adjoining the carbon pair that links the catechol and amine groups (Figure 2.1).1 A third major neurotransmitter
1 In this book, “norepinephrine” will be used instead of the less widely used term “noradrenaline;” however, “noradrenergic” will be used as the adjectival form of “norepinephrine,” as is again the custom.
Figure 2.1 The chemical structure of dopamine (left) and norepinephrine (right).
that has an amine group but lacks the catechol structure is serotonin (also known as 5-hydroxytryptamine, or 5-HT), which is considered part of the indoleamine class. Other important neurotransmitters like acetylcholine and glutamate have neither a catechol nor an amine group. A good review of the neurochemistry of dopamine and other neurotransmitters is provided in Cooper et al. (2002).
The exact chemical structure of dopamine (or any other neurotrans- mitter) is not necessarily indicative of what it does in the brain. For example, despite their nearly identical chemical structures and overall stimulant effects on behavior, dopamine and norepinephrine have very different behavioral and physiological effects, and often (and perhaps even mostly) inhibit each other’s actions, as will be discussed later. Similarly, apomorphine mimics the action of dopamine in the synapse and also in its behavioral effects (e.g. increasing exploration, motor activity etc.), but it also closely resembles the chemical structure of the drug morphine, which is a powerful analgesic.
Dopamine and all other neurotransmitters engage in a process of neural transmission that in many ways resembles a military campaign
(see Figure 2.2). In the region near the dopaminergic cell body, the neuron
creates the substance dopa from tyrosine (a substance found in proteins) using the enzyme tyrosine hydroxylase, which adds a hydroxyl (OH) group to tyrosine; the final substance, dopamine, is then created by breaking apart a carboxyl (C-O-O-H) group and removing the CO2 from it via the actions of dopa decarboxylase (Step 1). All of this is akin to the assemblage and storage of armaments and other military materiel. Once created, dopamine’s movement resembles a logistical supply line as it traverses the axon of the neuron and eventually reaches the end of the axon (Step 2). At this point, dopamine is loaded into vesicles until being off-loaded at the membrane of the first neuron (known as the presynaptic membrane) (Step 3). One well-known drug that helps “unload” dopamine at the presynaptic membrane is amphetamine. Once they reach the presynaptic membrane, dopamine molecules cross the synapse, a tiny fluid-filled space of only
20–40 nanometers (less than 1/10,000th of a millimeter), and quickly make contact with the next-in-line neuron. This contact, which is similar to establishing a beachhead, is made at a postsynaptic receptor site, which contains a protein molecule that dopamine attaches to in much the same way as a key enters a lock (Step 4). Once occupied, a typical receptor eventually alters the flow of ions across the neuron’s membrane and so creates a tiny electrical signal. Dopamine has at least five types of receptors: D1, D2, D3, D4, and D5, although the D1 and D2 receptors are by far the most numerous and best-studied, with the D2 receptor appearing to be of greater importance to clinical disorders such as schizophrenia and the D1 receptor better studied in relation to working memory (Civelli, 2000). Just as in a military campaign, oppositional (“antagonistic”) drugs and neurotransmitters can to varying degrees block dopamine transmission by their occupation of dopamine receptor sites on the postsynaptic membrane, thereby preventing stimulation of the postsynaptic cell. Perhaps the best known of the dopamine antagonists is the widely used clinical agent haloperidol, while apomorphine is the most prominent of the “agonist” drugs that stimulate the dopamine postsynaptic receptor and are used to mimic the effects of dopamine. Other substances may affect how the dopamine molecule is broken down and rejoined in the presynaptic neuron for yet another attempt to cross the synapse, akin to the military tactic of “falling back and regrouping” (Step 5). A substance that breaks dopamine apart is monoamine oxidase, whose inhibition is used clinically to increase the supply of dopamine and norepinephrine at the synapse. Well-known drugs that block the re-uptake of dopamine back into the presynaptic neuron are cocaine and amphetamine, which render dopamine temporarily in high supply at the synapse, thereby increasing transmission.
Once a receptor site on a membrane is occupied, future occupation of that site is often made more effective (or in some cases, less effective) because of changes in the sensitivity of the membrane (which allows learning and tolerance to drugs to occur).
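The synthesis chemistry of Step 1 can be double-checked with a toy atom-bookkeeping sketch. The molecular formulas used are the standard ones (tyrosine C9H11NO3, dopa C9H11NO4, dopamine C8H11NO2); note that representing hydroxylation as a bare “+O” glosses over the enzyme and cofactor chemistry involved:

```python
from collections import Counter

tyrosine = Counter({"C": 9, "H": 11, "N": 1, "O": 3})  # C9H11NO3

# Step 1a: tyrosine hydroxylase adds a hydroxyl group to the ring
# (net change in the molecular formula: one extra oxygen atom)
dopa = tyrosine + Counter({"O": 1})                    # C9H11NO4

# Step 1b: dopa decarboxylase splits off the carboxyl group as CO2
dopamine = dopa - Counter({"C": 1, "O": 2})            # C8H11NO2

assert dopamine == Counter({"C": 8, "H": 11, "N": 1, "O": 2})
```

The atom counts balance exactly, which is a reassuring sanity check on the verbal description of Step 1.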
Other neurotransmitters may further modulate the activity of dopaminergic transmission. Generally, well-known transmitters such as serotonin, norepinephrine, and even acetylcholine all alter dopamine release in various regions of the brain, typically by inhibiting it (Gervais and Rouillard, 2000; Previc, 1996, 2006, 2007; Pycock et al., 1975). Indeed, the serotonergic inhibition of dopamine release is arguably the most clinically important neurochemical interaction in the brain (Damsa et al., 2004), particularly in the ventromedial and limbic subcortical and cortical areas (Winstanley et al., 2005), and it will receive special mention in Chapters 3 and 4. Most of the interactions of dopamine with
Figure 2.2 The dopamine neuron and synapse: (1) dopamine synthesis;
(2) dopamine transport and storage in vesicles; (3) dopamine release from the presynaptic terminals; (4) acquisition of dopamine molecules at dopamine receptors in the postsynaptic neuron; and (5) re-uptake of dopamine back into the presynaptic neuron.
other transmitters probably do not occur at the dopaminergic synapse per se but rather at different synapses on the same neuron. Because of spatial and temporal summation of electrical signals from a variety of synapses on the end-structures (known as dendrites) of the receiving neurons, modulatory influences at nondopaminergic synapses can easily affect the final output of neurons that handle dopaminergic as well as other types of chemical transmission. For example, an inhibitory electrical potential (e.g. from a serotonergic neuron) at one synapse could cancel out an excitatory potential (e.g. from a dopaminergic neuron) at another, thereby preventing excitation of the next neuron in line from occurring.
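The cancellation just described can be caricatured in a few lines (a toy threshold model; the potential values and threshold are arbitrary illustrative numbers, and real dendritic integration is continuous and nonlinear):

```python
# Toy summation at the dendrites: the neuron fires only if the summed
# postsynaptic potentials (in arbitrary units) cross a threshold.
FIRING_THRESHOLD = 15.0

def neuron_fires(postsynaptic_potentials):
    return sum(postsynaptic_potentials) >= FIRING_THRESHOLD

# Excitatory (e.g. dopaminergic) inputs alone drive the cell to fire...
print(neuron_fires([10.0, 8.0]))         # True
# ...but an inhibitory (e.g. serotonergic) potential can cancel that out.
print(neuron_fires([10.0, 8.0, -6.0]))   # False
```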
There is nothing remarkable about the general structure of the dopaminergic synapse shown in Figure 2.2. However, the types of substances that affect dopamine, and their effects on behavior, tell us a great deal
about dopamine’s role in behavior. For example, two substances are especially noteworthy in the synthesis of dopamine – tyrosine (derived from protein-rich food sources such as meat and fish) and phenylalanine, whose failure to be converted to tyrosine partly contributes to the serious mental impairments found in phenylketonuria (see Chapter 4). Tyrosine is then converted to dopa, which is the most effective drug used in the treatment of the severe voluntary motor impairments found in Parkinson’s disease, in which the nigrostriatal dopaminergic system in
the brain (see next section) suffers massive degeneration. Amphetamine, which enhances the release of dopamine at the presynaptic membrane, generally stimulates behavior and, in low doses at least, improves mental
focus, which is why a related drug known as methylphenidate is helpful in treating attention-deficit disorder, even as chronic over-use of amphetamine can cause serious psychological problems, including delusions of grandiosity. At the postsynaptic level, antagonists such as haloperidol are known as “typical” antipsychotics in that they have been (and still are) widely used drugs to combat schizophrenia, a hyperdopaminergic mental disorder characterized by delusions and bizarre and disorganized thoughts (i.e. psychosis). Finally, cocaine is a powerful stimulant that prevents the re-uptake of dopamine back into the presynaptic neuron, thereby increasing the supply of dopamine in the synapse. The fact that cocaine and amphetamine are highly addictive reveals how some dopaminergic systems are linked to the process of reward craving (see Chapter 3).
2.2 The neuroanatomy of dopamine
Equally important in understanding what dopamine does in the brain is where it is located. To understand more about its localization, it is necessary to understand a few important points about how the brain is organized
and described. The view of the brain in Figure 2.3 shows the principal axes of the brain. The longitudinal axis is known as the anterior (frontal)-posterior one, the vertical axis is known as the dorsal (top)-ventral (bottom)
axis, and the lateral axis is directed left-to-right and broken into the outer (lateral) regions or inner (medial) regions. In addition, the brain can be divided into its most superficial portion – termed the cortex, from the Latin word for bark – and the subcortical regions lying below. For example, if one identifies a particular region as the left dorsolateral frontal cortex, this means it is located towards the upper front of the outer part of the left half (hemisphere) of the brain.
It is also important to recognize that, as shown in Figure 2.3, the cortical and subcortical portions of the brain are each composed of many subregions.

Figure 2.3 The cardinal directions and nomenclature used in brain anatomical localization.

The neocortex is generally partitioned into four major “lobes”:
- the occipital lobe, which is located towards the very back of the head and is the only lobe devoted to a single sensory function (vision);
- the parietal lobe, which is located toward the top back of the head and is involved primarily in spatial relations and the integration of sensory and motor systems in near-body space;
- the temporal lobe, which is located below a large fissure known as the Sylvian in the lower posterior portion of the brain and is involved in recognition of objects, memory, and language and other functions tied to more distant space; and
- the frontal lobe, which lies in front of the major fissure known as the Rolandic and is important in motor control and cognition.
A final type of cortex is located medially in the brain and surrounds the thalamus, hypothalamus, and corpus callosum, the last of which connects the two hemispheres. This older cortex is known as the limbic cortex; it includes such structures as the hippocampus and cingulate gyrus and has major connections to emotion-regulating centers such as the amygdala and hypothalamus. In terms of the subcortex, the most important regions include:
- the brainstem, containing the midbrain and hindbrain and their centers for basic bodily functions, including sleep and arousal;
- the diencephalon, which contains the major relay and integration center known as the thalamus (which merges projections of the major sensory and motor pathways with inputs from the arousal centers of the brainstem) and the hypothalamus (which is located just below the thalamus and maintains most of the hormonal connections with the rest of the body); and
- the subcortical forebrain, which contains the basal ganglia, long known to be involved in higher-order motor control but also more recently accorded an important role in cognition.
The midbrain and brainstem are of great importance in that they contain the cell bodies that produce a variety of neurotransmitters, including the four major ones: acetylcholine, dopamine, norepinephrine, and serotonin. The projections of these cell bodies can extend all the way to the cortex, although in the case of acetylcholine, the major input to the cortex emanates from the nucleus basalis of Meynert, located in the ventral forebrain. The major noradrenergic and serotonergic systems originate in the locus coeruleus and dorsal raphe nuclei, respectively. The number of neurons directly included in these important neurochemical systems is surprisingly small – in humans, only 200,000 carry the serotonergic projections from the midbrain and only about one million carry all the dopaminergic fibers from this region (Rosenzweig et al., 2002). These are, of course, but a tiny fraction of the estimated 100 billion or so neurons in the brain, suggesting that major alterations in brain function can, in some cases at least, directly involve only a small percentage of brain cells.
The major dopamine systems originate from several different cell groups near the top of the midbrain in a region known as the tegmentum (covering) (Fallon and Loughlin, 1987; Iversen, 1984). In rodents, two major projections run ventrally from the tegmentum – the nigrostriatal pathways and the mesolimbic. A smaller dopaminergic projection in rodents known as the mesocortical system parallels the mesolimbic pathway but is superseded in primates by reciprocal dopaminergic loops between the dorsal striatum and lateral prefrontal cortex, on the one hand, and between the mesolimbic-linked ventral striatum (containing the nucleus accumbens) and the ventromedial prefrontal pathways, on the other (Cummings, 1995; Mega and Cummings, 1994; Tekin and Cummings, 2002). There are other less-studied dopaminergic systems that are shorter in length and do not originate in the two classic midbrain sources, such as the tuberoinfundibular system running from the hypothalamus to the pituitary and the projections from the mesolimbic system to the medial preoptic nucleus of the hypothalamus, both involved in regulating physiological interactions with the rest of the body (see Section 2.4). A depiction of some of the major dopaminergic pathways is contained in Figure 2.4, minus the extensive lateral and medial cortical loops to the dorsal striatal and mesolimbic structures, respectively.
The nigrostriatal pathway will hereafter be mostly referred to as the lateral dopaminergic pathway, because it courses laterally relative to the mesolimbic system. It originates from the A9 subgroup of cells in the tegmentum and links the substantia nigra of the midbrain with the dorsal portion of the corpus striatum, which includes the putamen and caudate nucleus. These structures, along with the globus pallidus, are collectively known as the basal ganglia and have the highest concentrations of dopamine found anywhere in the brain. The nigrostriatal pathways are heavily involved in most types of motor actions and are considered part of the extrapyramidal motor system.2 As reviewed by Iversen (1984), unilateral damage to the nigrostriatal dopaminergic pathways results in a severe impairment in motor behavior, including reduced motor activity in the limbs opposite (or contralateral) to the side of the brain damage as well as various postural and turning asymmetries in the direction of the damaged striatum (e.g. rightward turning with right striatal damage). Damage to the lateral dopaminergic system also results in additional higher-order difficulties in the planning, timing, and switching of behaviors (Previc, 1999). Although it is sometimes claimed that the caudate is more entwined with cognitive than motor functions and vice versa for the putamen, brain imaging has shown that both structures are involved in both motor and cognitive functions (Monchi et al., 2006; Mozley et al., 2001). Striatal damage can also create deficits in attending and responding to stimuli (e.g. hemispatial neglect), and the association and prediction of stimuli leading to rewards may also be impaired (Schultz et al., 1997). One indication of the role of the nigrostriatal dopaminergic pathways is what happens when these pathways are severely damaged, as occurs in Parkinson’s disease. Persons suffering from Parkinson’s disease have a serious inability to perform voluntary motor acts, although they may do remarkably well in more visually elicited, ballistic, or arousal-generated motor behaviors. For example, Parkinson’s patients may be able to catch a ball if it is suddenly thrown to them (Glickstein and Stein, 1991) or, as in the case of a former Parkinsonian neighbor of mine, move quite well when a swarm of bees starts attacking. The nigrostriatal dopaminergic system also has important connections with the prefrontal cortex, particularly its lateral region, which may account for the cognitive deficits found in Parkinson’s disease, especially those involving attention, working memory, and shifting strategies (see Previc, 1999). Indeed, disruption of the lateral dopaminergic neurochemical system by substances that destroy dopaminergic neurons in an otherwise anatomically intact prefrontal region mimics the effect of neuroanatomical destruction to this region (Brozoski et al., 1979).

2 By contrast, the more direct pyramidal motor system – with cortical axons descending directly into the spinal cord – originates from the giant pyramidal-shaped cells in the primary motor cortex located just anterior to the Rolandic fissure.
One drug that may boost lateral prefrontal activity to a greater extent than activity in other regions is amphetamine (Chudasama et al., 2005), which as already noted helps stimulate the release of dopamine and tends to improve mental focus, at least in low doses.
The mesolimbic pathway, which will hereafter be mostly referred to as the ventromedial or medial dopamine system, emanates slightly more medially from the A10 cell group in the ventral tegmentum and courses medially through the limbic system and limbic cortex. One of the key subcortical structures in this pathway is the nucleus accumbens, which has a high concentration of dopamine and is involved in exploratory, orienting, and motivational/reward behavior. The mesolimbic system also interacts with important medial cortical and subcortical elements, including the amygdala and hippocampus (located adjacent to the medial temporal cortex), the anterior cingulate cortex, olfactory cortex, and the ventromedial frontal cortex. This last region is believed to hold two major subregions that interact extensively with each other – the orbitofrontal cortex, so named because of its proximity to the eye socket or orbit, and the medial frontal cortex (Ongur and Price, 2000). The extended mesolimbic dopaminergic network is believed to be the one responsible for providing most of our motivational drive and creative impulses and even aggressive behavior (de Almeida et al., 2005), and the most salient behavioral loss following damage to it is a loss of motivational drive. This may be especially true of the ventromedial portion of the mesolimbic system, interacting with the medial shell of the accumbens (Ikemoto, 2007). For example, lack of lever-pressing for food and drink in rats may initially occur following mesolimbic damage (Iversen, 1984), even though the animal is capable of making the proper limb movements (unlike in the case of nigrostriatal damage). The mesolimbic system is also more affected by emotions and stress (Finlay and Zigmond, 1997) and is believed to be the more seriously disturbed in a host of clinical disorders, including schizophrenia, obsessive-compulsive disorder, attention-deficit disorder, and substance abuse. Many of these disorders reflect the inability of the ventromedial and lateral prefrontal cortical centers to inhibit our primitive motivational drives and thought processes. A well-known drug that affects the mesolimbic system preferentially is cocaine (Fibiger et al., 1992), which is highly addictive and, when used chronically, produces compulsive and psychotic behavior resembling schizophrenia (Rosse et al., 1994).
The classic distinction between nigrostriatal dopaminergic involvement in motor control/planning/cognition and mesolimbic dopaminergic involvement in motivational drive and reward is widely held, with considerable justification. In terms of dopaminergically mediated male sexual behavior, for example, damage to the nigrostriatal system impairs the motor aspects of copulation (e.g. mounting of the female) more than precopulatory (motivational) behaviors in the presence of a receptive female, whereas the reverse is true for mesolimbic lesions (Hull et al., 2004). In reality, however, the two systems have extensive cross-talk between them, especially from the mesolimbic motivation regions to the dorsal striatal areas involved in motor programming (Ikemoto, 2007), and they are very heterogeneous, even in nonprimates. Both the lateral and medial frontal regions project to the striatum (albeit to segregated regions), both D1 and D2 receptors are prominently located in each system (Civelli, 2000), most drugs interact to varying degrees with both systems, both systems are involved in exploration and behavioral switching, and overactivation of both systems may occur in various psychoses. Moreover, each of them in its own way appears to be involved in distant space, whether it be in providing the basic motivational drive toward distant goals (in the case of the mesolimbic system) or in the prediction of environmental contingencies and the execution of motor strategies required to achieve those goals (as in the case of the nigrostriatal/lateral prefrontal system) (see Chapter 3).
However, the nigrostriatal system is clearly more aligned with the dopamine-rich lateral prefrontal cortex, whereas the mesolimbic system has a greater affinity with medial-frontal cortical structures. This accounts for why combined anatomical damage of the dopamine-rich caudate nucleus and the lateral prefrontal cortex is so devastating for cognition (Tucker and Kling, 1969) and why it is less important than mesolimbic overactivation in obsessive-compulsive and substance-abuse disorders, which involve a derangement of motivational, reward, and inhibitory mechanisms in the brain (Abbruzzese et al., 1995; Adler et al., 2000; Bunney and Bunney, 2000; Rosenberg and Keshavan, 1998; Volkow et al., 2005). The lateral prefrontal cortex may even exert substantial inhibitory control over the mesolimbic system, whether it be in controlling impulsive and compulsive behavior, in controlling the racing thoughts and loosened thought associations in schizophrenia (Bunney and Bunney, 2000), or in dampening the mesolimbic dopaminergic response to stress, particularly in the shell of the nucleus accumbens (Finlay and Zigmond, 1997).
Besides its relevance to explaining various hyperdopaminergic clinical disorders, the relative activation of the medial and lateral dopaminergic systems can explain the normal variation in dopaminergic personalities, with lateral dopaminergic types being more serious-minded and focused and medial dopaminergic types being more impulsive and creative (see Chapter 3). It can also explain many altered states such as dreaming, in which the ventromedial dopaminergic system may be unleashed (Solms, 2000), or depersonalization reactions, in which the lateral prefrontal dopaminergic system may be more active (see Sierra and Berrios, 1998), resulting in a state of “thinking without feeling.” By contrast, an underactive mesolimbic system can leave an intellectually capable individual lacking in motivation – a condition known as abulia, adynamia, or apathy (Al-Adawi et al., 2000; Solms, 2000; Tekin and Cummings, 2002). This motivational loss is essentially what occurred when prefrontal leucotomies – cuts through the white matter at the base of the frontal lobe – were once performed as accepted practice in the treatment of various neuropsychiatric disorders. Such surgeries have now been replaced by more benign anti-dopaminergic pharmacological treatments that similarly reduce motivation level (Solms, 2000).
Another important facet of the neuroanatomy of dopamine in humans is that dopamine is not confined to a few cortical areas, as it is in lower mammals (Berger et al., 1991; Gaspar et al., 1989). In lower animals, dopamine is located mostly in frontal areas containing primary and association motor cortex and in ventromedial motivational centers such as the anterior cingulate and nucleus accumbens, as befits the important role of dopamine in goal-directed motor activity. In primates and humans, however, dopamine is found in high concentration in most cortical regions as part of the large overall expansion of dopaminergic systems (Gaspar et al., 1989). Still, dopamine is not evenly distributed throughout the brain, even in humans. Dopamine is found in greater concentration in frontal and prefrontal regions and is denser in ventral as opposed to dorsal posterior regions. The anterior-posterior and ventral-dorsal gradients of dopamine have functional significance, in that anterior regions are involved in the voluntary initiation of motor behavior and ventral posterior regions are more involved in attention to distant space, both of which are consistent with the role of dopamine in mediating goal-directed actions toward extrapersonal space (see Chapter 3). By contrast, dorsal parietal-occipital areas of the brain that are involved in visual–manual interactions and consummatory behavior in near-body space have a relatively higher ratio of norepinephrine to dopamine (Brown et al., 1979). Other areas in which dopamine is reduced are posterior brain areas containing the primary sensory representations of the tactile, auditory, and particularly the visual modalities. Indeed, dopamine may actually be inhibited in many parts of the brain during sensory stimulation (Previc, 2006), which is why aberrant dopaminergic phenomena such as hallucinations are more likely to occur during sensory isolation, dreaming, or other external sensory disruptions in which more anterior brain regions begin to “invent” their own percepts (see Section 3.1.3 as well as Previc, 2006).
The predominance of dopamine in association cortical areas, in which higher-order sensory processing or cross-modal sensory interactions occur, indicates that dopamine is especially well-suited to making connections among stimuli and events and organizing them into mental plans. This is beneficial in stimulating creativity and in “off-line” thinking and strategizing, important components of abstract reasoning. But creative thinking divorced from all external reality (e.g. sensory feedback from the body and environment) can also be dangerous, in the form of the bizarre thought associations that excessive dopaminergic activity is known to create (see Chapter 4).
Another important change in the functional role of dopamine from lower mammals to primates is reflected in the different distribution of dopamine across cortical layers. A typical patch of gray-matter cortex in
humans has six layers, of which the second from the bottom (Layer 5) contains mostly descending projections back to subcortical regions, the third from the bottom (Layer 4) contains mostly ascending subcortical projections to the cortex, and the top layer (Layer 1) contains smaller numbers of cells that have enormous dendritic branching and a large number of connections within and across the entire cortex. The different sizes, neuronal densities, and shapes of these layers in different regions of cortex are the basis for the anatomical classification of different brain areas that, in the widely used scheme of Brodmann, number over fifty (see Rosenzweig et al., 2002). For example, cortical motor regions sending motor commands downstream generally have a large Layer 5, whereas cortical regions involved in sensory processing generally have a large Layer 4, since they receive substantial projections from subcortical relay stations, principally in the thalamus. The major layering difference between primates and most other mammals is the much greater density of dopaminergic neurons in Layer 1 of the former (Gaspar et al., 1989). In rodents, the dopaminergic content of the brain is more uniform across cortical layers, although it is somewhat greater in the lower two layers (which carry a great deal of motor command signals). By contrast, the dopaminergic content of the primate brain, besides being much greater overall, is especially predominant in Layer 1, which has most of the connections with other cortical areas as well as with the striatum and is more involved in coordinating activity during cognitive operations.
It is interesting in this regard that the Layer 1 dopamine neurons have larger branching and connectivity, are more likely to receive inputs from the nigrostriatal pathways, and are less likely to be co-located with neurotensin (Berger et al., 1991), a neurochemical involved in motivational behaviors such as feeding and drinking. This finding suggests that the less motivationally dependent lateral dopaminergic system may have selectively expanded in humans relative to the mesolimbic one. Another indication of the disproportionate expansion of the lateral dopaminergic system during human evolution is the enormous increase of the human striatum, which contains the highest concentration of dopamine in the brain. As noted in Chapter 1, the striatum nearly doubled from chimpanzees to humans in relative terms (Rapoport, 1990), making it the largest proportionate increase in brain area outside of the neocortex, itself very rich in dopamine.
2.3 Dopamine and the left hemisphere
The distribution of dopamine in humans resembles that of other primates, although it has continued to expand in both relative and absolute terms and is much more highly lateralized than in other primates. Indeed, the lateralization of dopamine activity – greater in the left hemisphere in the majority of humans, particularly in its ventral regions (Previc, 1996, 2007) – may turn out to be the single most important neurobiological factor accounting for the intellectual abilities and unique personality of that hemisphere. The lateralization of dopamine has been found in postmortem measures of dopamine activity and in lateralized brain activity following ingestion of dopaminergic drugs (see Flor-Henry, 1986; Previc, 1996; Tucker and Williamson, 1984, for reviews) as well as more recently in brain imaging studies measuring D2 binding potential (Larisch et al., 1998; Vernaleken et al., 2007). Greater dopamine in the striatum of one hemisphere induces rotation toward the other side and a contralateral (i.e. opposite) paw preference in animals, which presumably accounts for the predominance of rightward rotation and right-handedness in humans (de la Fuente-Fernandez et al., 2000; Previc, 1991). Dyskinesias (uncontrollable motor outbursts) associated with compensatory excessive dopamine activity after chronic use of dopamine antagonists are greater on the right side of the body (i.e. in the left hemisphere) (Waziri, 1980), whereas motor rigidity in Parkinson’s disease (associated with reduced dopamine activity) typically first appears on the left side of the body (i.e. in the right hemisphere) (Mintz and Myslobodsky, 1983). The left hemisphere also predominates in psychological disorders associated with excessive dopamine, such as mania and schizophrenia (see Chapter 4), and it is also more important in dreaming, hallucinations, meditation, and other altered states in which dopamine prevails (see Previc, 2006).
Other indirect evidence in favor of a dopaminergic predominance in the left hemisphere is that hemisphere’s superiority in voluntary motor skills and its greater grammatical, reasoning, working memory, and other abstract intellectual skills, all of which are dependent on dopamine (Previc, 1999). Also, the left hemisphere in humans may be more important in the initiation of violence (Andrew, 1981; Elliott, 1982; Mychack et al., 2001; Pillmann et al., 1999), although left cortical lesions near the frontal pole may actually increase aggression due to a release in lateral dopaminergic inhibition of the mesolimbic system in that same hemisphere (Paradiso et al., 1996). Conversely, the left hemisphere is deficient in most social and emotional behavior (Borod et al., 2002; Perry et al., 2001; Weintraub and Mesulam, 1983), which can be disrupted by excessive dopamine (see Previc, 2007).
How dopamine evolved to predominate in the left hemisphere of most humans is still not completely understood. However, one leading hypothesis (see Previc, 1996) is that the greater concentration of
dopamine in the left hemisphere is an indirect consequence of a greater concentration of serotonin and norepinephrine in the right hemisphere, which, as mentioned earlier, serve to inhibit dopamine in the same hemisphere (e.g. Pycock et al., 1975; Previc, 1996). The greater serotonergic and noradrenergic activity in the right hemisphere may result from the predominance of the left vestibular organ and its predominantly contralateral projections that terminate in the right hemisphere (Previc, 1991, 1996). As noted in Chapter 1, it is likely that asymmetry of the otolith organs – which are important in sensing gravity and in maintaining postural control – derives from an asymmetrical gravitoinertial environment in the womb, created by lateralized fetal positioning (Previc, 1991). The otoliths have long been known to influence the sympathetic neurotransmitters norepinephrine and serotonin, because sympathetic arousal is crucial in order to maintain a normal supply of blood to the brain during changes in the position of our body relative to gravity and to aid the body in righting itself during a fall (see Previc, 1993; Yates, 1996; Yates and Bronstein, 2005). The vestibular basis for neurochemical and functional lateralization is consistent with the fact that the absence of normal vestibular function greatly reduces the prevalence of right-handedness, which reflects the greater dopaminergic content of the left hemisphere (Previc, 1996). The vestibular theory of cerebral lateralization is further supported by the importance of vestibular inputs to body-centered perceptual and motor networks that mediate the right hemisphere’s greater orientation toward peripersonal as opposed to distant space (Previc, 1998). Whether any other primordial factors influence the lateralization of dopamine to the left hemisphere – along with the major cognitive capabilities it supports – is unclear.
As noted in Chapter 1, however, any other causes of brain lateralization are unlikely to be genetic, given the lack of differences between identical and fraternal twins in their concordances for brain lateralization and the lack of any known gene influencing the direction of cerebral lateralization in humans.
2.4 Dopamine and the autonomic nervous system
One of the most important aspects of dopamine’s role in the brain pertains to its involvement with the autonomic nervous system. The two major portions of the autonomic system are the sympathetic system, which increases body arousal and metabolism, and the parasympathetic system, which generally quiets the body and reduces metabolism (Rosenzweig et al., 2002). Key sympathetic functions, directly regulated by brainstem and hypothalamic structures under the influence of other brain regions and systems such as the vestibular one, include increased cardiac output, vascular constriction (to shunt blood to the muscles and brain), increased oxygen utilization, conversion of fat stores to sugar, and elevated body temperature. By contrast, the parasympathetic system is involved in decreasing cardiac output, decreasing oxygen utilization, increasing vasodilation and heat loss, and promoting digestion.
Dopamine is believed to exert a mostly parasympathetic action in the brain, in that it helps to:
- lower body temperature, partly by stimulating sweating (Lee et al., 1985);
- reduce respiration during hypoxia, partly by lowering temperature (Barros et al., 2004);
- increase peripheral vasodilation, which is important in erectile responses during dopamine-mediated male sexual behavior (Hull et al., 2004); and
- reduce blood pressure (hypertension) (Murphy, 2000).
In most of these situations, the actions of dopamine are similar to those of acetylcholine but opposite to those of norepinephrine and serotonin, as the latter two neurotransmitters generally serve to increase metabolism, temperature, vasoconstriction, and arousal (Previc, 2004) but tend to inhibit male sexual behavior (Hull et al., 2004; Robbins and Everitt, 1982). The physiological and hormonal actions of dopamine are carried out by various systems, one of which is the tuberoinfundibular pathway into the hypothalamus and the anterior pituitary, which inhibits the release of the female nursing hormone prolactin from the pituitary gland. Other dopaminergic routes involve connections to the medial preoptic nucleus of the hypothalamus (which mediates male sexual behavior as well as parasympathetic thermal and cardiovascular actions) and the vagus nerve (which controls parasympathetic signals to the rest of the body). Dopamine agonists are clinically beneficial in treating a variety of autonomic dysfunctions, including hyperprolactinemia (Gillam et al., 2004), hypertension (Murphy, 2000), and male erectile dysfunction (Giuliano and Allard, 2001). The close physiological linkage between male behaviors and dopamine has its behavioral counterpart in the effect of testosterone to increase dopamine in the brain, in the preponderance of males in many hyperdopaminergic disorders (see Chapter 4), in the incorporation of many masculine behaviors into the dopaminergic personality (see Chapters 3 and 5), and in the masculine behavioral mode (consciousness) of the dopamine-rich left hemisphere (see Chapter 3).
The physiological role of dopamine in dampening physiological arousal and reducing oxygen utilization and metabolism complements its behavioral role in reducing emotional arousal (i.e. increasing detachment) and in maintaining motivation and attentional focus in uncertain and stressful environments (see Chapter 3). While dopaminergic systems may mediate some transient positive emotional states such as elation and even euphoria, dopaminergic circuits may be more instrumental in shutting down the negative emotional arousal contributing to fear and anxiety through their inhibition of such centers as the amygdala (Delaveau et al., 2005; Mandell, 1980). Behavioral repertoires based primarily on emotional responsiveness would be counterproductive over an extended period of time, just as sustained sympathetic arousal is unsustainable physiologically.3 Up to a certain point, then, dopaminergic stimulation enables us to maintain control, or at least to believe that we ourselves rather than fate control our destiny (Declerck et al., 2006), a trait known as
“internal locus of control” (see Section 3.4.2). However, too much dopaminergic stimulation in response to psychological and physiological stress, particularly in the nucleus accumbens and other medial structures, can lead to hallucinations, delusions of grandeur (exaggerated internal control), and even psychosis (Gottesmann, 2002).
It should finally be noted that the role of dopamine (as well as acetylcholine) in facilitating parasympathetic activity is consistent with the well-documented parasympathetic actions of the left hemisphere, in which dopamine and acetylcholine predominate (Previc, 1996). By contrast, the right hemisphere, dominated by noradrenergic and serotonergic activity, is more important in sympathetic arousal and emotional perception and production (see Previc, 1996; Wittling et al., 1998).
This brief review of the neurochemistry and neuroanatomy of the major dopaminergic systems clearly demonstrates how important dopamine is to the human brain. In terms of pharmacology, many of the most important drugs to combat psychological disorders have their major effect at the dopaminergic synapse. In terms of anatomy, dopaminergic systems extend throughout the entire cortex and are densest in cortical layers that predominantly connect with other cortical regions. The
3 The metaphor “cooler head” symbolizes the twin roles of dopamine in decreasing physiological arousal (e.g. lower temperature) and promoting analytical thinking; conversely, being “hot-headed” implies more than a mere elevation of cranial temperature.
widespread intracortical projections of the prefrontal dopaminergic system of primates (and especially humans) testify to its being a major driver of the multitude of parallel processing circuits used in higher mental functions. Dopaminergic dominance is also greatest in regions such as the lateral prefrontal cortex that are known to play a crucial role in higher cognition. The great expansion of dopaminergic systems and their widespread distribution across the cortex in primates reverses the situation in rodents, in which other neurotransmitters are more plentiful and widely distributed across the cerebral cortex and dopamine is confined mainly to motor and prefrontal regions. In addition to the overall expansion of dopaminergic systems during human evolution, the prefrontal/striatal dopaminergic system appears to have expanded relative to the mesolimbic/mesocortical pathways, allowing for the sublimation of more basic and impulsive mesolimbic drives by our rational intellect, although the mesolimbic system continues to play an important role in motivation and creativity in modern humans. The further lateralization of dopamine to the left hemisphere has helped to make that hemisphere the citadel of human reasoning, with abilities far beyond the capability of any other species, as well as the main source of dreaming, hallucinations, and other more chaotic extrapersonal experiences. Finally, dopamine’s role in maintaining controlled, goal-directed behavior over emotionally charged behavior, particularly during stress, complements its physiological role in dampening the physiological stress response, which allows highly dopaminergic individuals to function well in extreme environments (Previc, 2004).
3 Dopamine and behavior
The role of dopamine in normal and abnormal behavior has been the subject of a massive amount of research since dopamine was first shown to be linked to Parkinson’s disease, schizophrenia, and motivation and reward mechanisms in the 1960s and 1970s. Based on the number of publications that have dealt with it, dopamine is arguably the most important, or at least most intriguing, neurotransmitter in the brain. It has been implicated in a very large number of behaviors, with a theoretical ubiquitousness that has led it to be facetiously referred to as “everyman’s transmitter because it does everything” (Koob, cited in Blum, 1997).
There is one general conclusion regarding dopamine and behavior that a clear majority of neuroscientists would agree on: dopamine enables and stimulates motor behavior. Increased dopamine transmission, at least up to a certain point, leads to behavioral activation (e.g. increased locomotion, vocalizations, and movements of the face and upper extremities) and speeds up motor responses; conversely, diminished dopamine transmission leads to akinetic syndromes, including, in the extreme, mutism (Beninger, 1983; Salamone et al., 2005; Tucker and Williamson, 1984). One of the distinguishing features of excessive dopamine in the brain is motor behavior known in lower animals as "stereotypy" – constant, repetitive movements that would resemble compulsive behavior in humans (Ridley and Baker, 1982). A dramatic example of dopamine's role in motor behavior is the "stargazer" rat, which has very high dopamine levels and activity levels four to five times those of normals (Brock and Ashby, 1996; Truett et al., 1994). Well-known drugs that increase dopamine transmission such as amphetamine and cocaine are considered stimulants, whereas drugs that block dopaminergic activity such as haloperidol are regarded as major tranquilizers. For example, the dopaminergic agonist quinpirole is known to produce a six-fold increase in locomotion distance in an open-field test (Szechtman et al., 1993), and dopamine is the transmitter most involved in manic episodes in humans, during which activity levels are dramatically elevated (see Chapter 4).
Although degradation of the dopaminergic nigrostriatal system can, if severe enough, affect almost all types of motor activity, the general role of dopamine in stimulating motor behavior belies a considerable degree of specificity in its motor actions. For example, at normal arousal levels, dopamine stimulates:
- exploratory (seeking) behavior more than proximal social grooming;
- sexual activity more than feeding;
- active male sexual behavior (mounting) more than receptive female behavior (lordosis);
- saccadic (ballistic) eye movements more than smooth-pursuit eye movements; and
- upward movements more than downward ones.
What do all of the motor behaviors that dopamine selectively stimulates have in common? For a satisfactory explanation, it is crucial to first understand the fundamental role of dopamine in mediating our actions in distant space and time.
3.1 Dopamine and distant space and time
Our interactions in 3-D space are ubiquitous and represent perhaps the single most important factor in shaping the major pathways of the human brain (Previc, 1998). All of our sensory and motor systems can be linked to various regions of 3-D space, and there are virtually no behaviors (including language) that cannot be linked to particular spatial systems. For example, even language is more closely aligned with auditory and visual systems used in distant space than with tactile and kinesthetic systems involved in reaching activity in nearby space. Likewise, all of our behaviors can be defined by their temporal sphere – i.e. seeking food or a mate or a secondary goal like a good job involves the future, whereas touch and the enjoyment of sensual pleasures are associated with the present.
There are four general brain systems that handle our behavioral interactions in 3-D space (see Figure 3.1). One system, known as the peripersonal, is mainly used to execute visually guided manipulations in near-body space. This system is biased toward the lower visual field, where the arms usually lie during reaching for and transporting objects to the mouth. Another system, known as the ambient extrapersonal, is involved in postural control and locomotion in earth-fixed space and is also biased more toward the lower field, where the ground plane
[Figure 3.1: panels labeled Peripersonal and Ambient Extrapersonal]
Figure 3.1 The realms of interaction in 3-D space and their cortical representations.
From Previc, F. H. (1998). The neuropsychology of 3-D space. Psychological Bulletin, 124, 123–164, with permission.
predominates. Both of these systems make extensive use of vestibular, tactile, and other "body" systems; they run dorsally through the parietal lobe and rely on neurotransmitters such as norepinephrine and, to a lesser extent, serotonin, which are crucial to physiological (and emotional) arousal systems (Previc, 1998). By contrast, dopamine appears to be little involved in these systems or in their representations. As mentioned in Chapter 2, serotonin and norepinephrine are also predominantly localized to the right hemisphere, which is more important both in peripersonal operations and in orienting our postural and perceptual systems with gravity.
Dopamine, on the other hand, is the primary neurotransmitter for two systems that deal primarily with extrapersonal space. One of these is the focal-extrapersonal system, which is involved in search and scanning of the environment and recognition of objects within it. This system operates at a distance because objects are rarely brought into our personal space unless we have already recognized them. This system uses central vision and predominantly saccadic eye movements to carry out the visual search process, and it is localized mainly to the ventrolateral temporal lobe, the lateral prefrontal cortex, and to a lesser extent the region surrounding the parietal eye-fields. It supports detailed visual processing, effortful and controlled interactions with the extrapersonal environment, and the focused executive control functions associated with the lateral dopaminergic personality/intellect (see Section 3.4.2). A second, more peripherally located system, known as
the action-extrapersonal one, gives us a sense of "presence" in the environment and aids in exploration, navigation, and orientation to salient stimuli and landmarks. This system extends ventromedially through the medial temporal lobe and hippocampus and limbic areas and on into the ventromedial frontal areas, largely paralleling the medial dopamine cortical system described in Chapter 2. Relative to the focal-extrapersonal system, this system is less focused and concerned with details, and it supports the more creative and impulsive functions associated with the ventromedial dopaminergic personality/intellect (see Section 3.4.1).
One consequence of dopamine's association with distant space is that it is important in "exteroceptive" senses such as sight (far vision), hearing, and smell, whereas its counterpart norepinephrine is more associated with "interoceptive" (tactile and bodily) signals (Crow, 1973). Indeed, it is now recognized that there are very close anatomical connections between the olfactory system and the medial dopaminergic motivation systems (Ikemoto, 2007). Another consequence of the dopaminergic association with distant space is that, because the upper field comprises the most distant part of our visual world due to the slope of the ground plane as it recedes from us, the focal- and action-extrapersonal systems are biased toward the upper field, as is dopamine itself. A third correlate of dopamine's involvement with distant space is its involvement with distant (especially future) time. Unlike peripersonal activities, in which the eliciting stimulus or reward may be nearby and requires little effort to reach or procure, pursuit of distant or delayed rewards requires considerable effort and delay, both of which dopamine normally helps to overcome (Denk et al., 2005; Salamone et al., 2005).
The above links to distant space account for why dopamine is selectively involved in a myriad of behaviors that are more likely to be performed outside our immediate space and time. For example, dopamine is critical for:
- most if not all upwardly directed behavior (Previc, 1998);
- saccadic eye movements, which are used to explore the distant world and are more upwardly biased than are smooth vergence and pursuit eye movements (Previc, 1998);
- "male"-type sexual behavior, which in contrast to its female counterpart is more active than receptive and depends more on distant cues such as odors and visual cues than on internal body cues for its elicitation (Robbins and Everitt, 1982).
Conversely, dopamine is less involved in and/or in some cases actually inhibits more peripersonally linked behaviors such as:
- social grooming and other affiliative behaviors, which are important for group cohesion and for emotional health and are more affected by oxytocin, opioid, and tactile communication (Schlemmer et al., 1980);
- consummatory aspects of feeding, which are more dependent on opioid (mu-receptor) function (Baldo and Kelley, 2007; Blackburn et al., 1992); and
- receptive female sexual and maternal behavior such as lordosis (the arching of the back) and lactation that are stimulated by tactile cues and noradrenergic and opioid mu receptors (Depue and Morrone-Strupinsky, 2005; Robbins and Everitt, 1982).
It can even be argued that dopamine’s well-documented role in goal- directed behavior (seeking) is a consequence of its primordial orienta- tion toward distant space and time (see next section).
The role of dopamine in 3-D space and time will be further explored in the next few sections. Section 3.1.1 will detail the specific role of dopamine in attending to distant space and time, while Section 3.1.2 will describe dopamine's role in goal-directed/reward behavior and Section 3.1.3 will review the role of dopamine in altered states and other experiences in which extrapersonal as opposed to peripersonal themes preside.
3.1.1 Dopamine and attention to spatially and temporally distant cues
A wealth of data has accumulated over the past several decades to show how dopamine in the brain is critical for attending to distant space and time.1 Dopamine-deficient animals do not easily orient or make associations to distant stimuli (Blackburn et al., 1992), which contrasts with the effects of dopaminergic drugs to increase behaviors controlled by distal cues (Szechtman et al., 1993). Indeed, only animals with normal dopamine levels will attend (orient) to novel distal stimuli if they are engaged in consummatory behavior like feeding (Hall and Schallert, 1988). The effect of dopamine in shifting the balance of attention toward distant space and away from peripersonal space is illustrated by its role in hoarding behavior. If an animal is presented with a food object, it can either eat the object on the spot or bring it back to the nest and save it for
1 Distance in this case does not necessarily imply a precise spatial or temporal metric (e.g. meters or minutes) but rather at the very least a departure from the immediate space surrounding the body and the immediate present.
later consumption. Usually hoarding occurs mostly for large, nearby objects (smaller ones can be eaten on the spot), but it can also occur for more distant objects. If an animal is deprived of dopamine, however, the distance over which hoarding occurs is reduced to about two meters (Dringenberg et al., 2000). A similar hoarding failure with distal food objects occurs in animals with damage to the hippocampus (Whishaw, 1993), a key ventromedial structure involved in processing and remembering information from distant space (see Previc, 1998) and part of a more general system that is involved in orientation to distant time (Botzung et al., 2007; Fellows and Farah, 2005; Okuda et al., 2003).
Other examples of dopamine's involvement with distant space are the upward movement and attentional biases produced by dopaminergic activation, which derive from the previously noted association of upward space with the more distant portions of our visual world (Previc, 1998). Examples of these upward biases include rearing on the hind legs, vertical sniffing, dorsiflexion (raising) of the head, and vertical climbing (as on the walls of the animal's cage). One of the most dramatic examples of such upwardly biased behavior is the nearly continuous dorsiflexion of the head in the hyperactive, dopamine-rich "stargazer" rat (Brock and Ashby, 1996; Truett et al., 1994) (see Figure 3.2a). This behavior has interesting parallels with the upward head and eye movements that have consistently been shown to accompany higher mental activity in humans (Figure 3.2b) and that presumably emanate from the dopamine-rich lateral prefrontal cortex (Previc et al., 2005). The upper-field behavioral biases in animals are further consistent with the upward eye movements and upper-field visual hallucinations that are commonly found in schizophrenia (Bracha et al., 1985; Previc, 2006), which is caused by excessive dopamine. Reduced dopaminergic transmission due to damage to the nigrostriatal pathways or the ingestion of dopaminergic antagonists conversely results in fewer upward eye movements, attentional neglect of the upper field, and nose-diving behavior (somersaulting from high places) (see Previc, 1998, for a review). For example, hypodopaminergic patients with Parkinson's disease tend to reach below the target in memory-dependent reaching (Poizner et al., 1998), and they show fewer upward saccades (Corin et al., 1972; Hotson et al., 1986) and even compress (neglect) the upper field (Lee et al., 2002).
By contrast, serotoninergic activation tends to decrease upwardly oriented behaviors (Blanchard et al., 1997), consistent with serotonin's normal inhibition of dopamine in various regions of the brain.
Another aspect of dopamine's involvement in distant space is the well-documented tendency of animals with normal dopamine levels to explore a novel environment by sniffing around its perimeter and poking
Figure 3.2 Upward dopaminergic biases.
The Stargazer rat (top) from Truett et al. (1994). Stargazer (stg), new deafness mutant in the Zucker rat. Laboratory Animal Science, 44, 595–599, with permission from AALAS; humans during mental activity (bottom), photo courtesy of Carolyn Declerck.
Figure 3.3 The dopaminergic exploration of distant space across mammals.
From Previc et al. (2005). Why your "head is in the clouds" during thinking: the relationship between cognition and upper space. Acta Psychologica, 118, 7–24, with permission from Elsevier.
their heads in various holes. In fact, normal dopamine levels are essential for animals to prefer a novel to a familiar environment (Fink and Smith, 1980; Ikemoto and Panksepp, 1999; Pierce et al., 1990). By contrast, dopaminergic systems actually diminish exploratory behavior in a familiar environment, as animals deprived of dopamine will continue to explore a familiar environment, and dopamine seems less critically involved in habitual responses (Ikemoto and Panksepp, 1999). The enhanced response to novel environments has led some researchers to conclude that dopamine (in particular, the D4 receptor) mediates novelty-seeking in humans (Bardo et al., 1996; Cloninger et al., 1993; Dulawa et al., 1999). Unfortunately, novelty-seeking in humans often connotes thrill-seeking or sensation-seeking that elevates arousal, which may actually have little to do with the desire to explore distant environments in a controlled, systematic manner; indeed, sensation-seeking appears to be more dependent on transmitters such as norepinephrine that are more involved in sympathetic arousal (Zuckerman, 1984). It is not surprising, then, that the relationship between dopamine genes and novelty-seeking in humans has not proven to be very robust (Kluger et al., 2002), although at least one measure of exploratory tendency does appear to be correlated with dopaminergically mediated creativity (Reuter et al., 2005).
Even though dopamine may mediate exploration in all mammalian species, the instrument by which the world is explored differs across species (Figure 3.3). In rodents, dopaminergically mediated whole-body locomotion and crude distal sensory systems such as olfaction are used
to explore a distant world that can be defined by a few meters or tens of meters at most. Primates engage in mostly visual exploration of a greatly expanded distant environment using dopaminergically mediated saccadic eye movements, which are made at a rate of about two to three per second and can locate small food objects at over 25 meters in the distance. In humans, the concept of space goes beyond even the here-and-now, so as to include the ability to imagine even more distant worlds and concepts by means of off-line, abstract thinking and imagination (Bickerton, 1995; Suddendorf and Corballis, 1997). This type of thought is believed to have only recently fully evolved, reaching fruition with the ancient civilizations (see Taylor, 2005; Chapter 6). Despite the vast difference between physically distant space and abstract space, physical and cognitive exploration possess several commonalities, such as dopaminergic involvement and upward biases. Not surprisingly, direct links between spatial foraging and cognitive foraging (Hills, 2006; Hills et al., 2007) and even physical and cognitive number space (Vuilleumier et al., 2004) have been documented in humans.
As for the dopaminergic role in attending to distant (especially future) time, a prime example is its role in reward prediction. Schultz and colleagues (Schultz et al., 1997) have extensively investigated the role of dopamine in reward prediction and have found that dopaminergic neurons in the ventral tegmentum and substantia nigra are active during reward learning. Dopaminergic neurons respond to stimuli paired with rewards during the process of learning, although they stop responding to the reward itself over time, and they are actually inhibited if the reward fails to be presented after the cues. Hence, the dopaminergic neurons appear to be sensitive to the predictability and timing of the cue–reward relationship. Moreover, dopamine neurons are not highly responsive even in obtaining rewards if sufficient attention to environmental contingencies is not required (i.e. the reward is too predictable) (Horvitz, 2000; Ikemoto and Panksepp, 1999). That dopaminergic systems are concerned with distant time is also illustrated by the behavior of both rats and humans, who prefer a smaller but immediate reward over a larger delayed one after receiving drugs that block dopamine synthesis or transmission (Denk et al., 2005; Sevy et al., 2006). The role of dopamine in allowing for delayed reward gratification in animals is consistent with the relatively greater activation of lateral prefrontal regions in humans by decisions involving delayed as opposed to immediate rewards (McClure et al., 2004) and with the effect of lesions of the dopamine-rich ventromedial prefrontal and temporal regions in creating what has been termed "myopia for the future" (Bechara et al., 2000; Botzung et al., 2008; Fellows and Farah, 2005; Okuda et al., 2003).
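The cue–reward dynamics described above are commonly formalized in computational neuroscience as a temporal-difference (TD) prediction error. The toy simulation below is a minimal illustrative sketch of that formalization, not the Schultz et al. model itself; the trial structure, cue and reward times, and learning rate are all assumptions chosen for clarity. It shows how the error signal (often identified with phasic dopamine firing) migrates from the reward to the predictive cue over training, and dips below baseline when an expected reward is omitted.

```python
# Toy temporal-difference (TD) sketch of the reward-prediction findings:
# delta = r + V(next step) - V(current step) is the prediction-error signal.
# All numbers (trial length, cue/reward times, learning rate) are assumed.

ALPHA = 0.3            # learning rate
T = 10                 # time steps per trial
CUE, REWARD = 2, 8     # cue onset and reward delivery times

V = [0.0] * (T + 1)    # learned value of each post-cue time step

def run_trial(rewarded=True):
    """Run one trial, update V, and return {time step: prediction error}."""
    deltas = {}
    for t in range(CUE - 1, T):
        r = 1.0 if (rewarded and t + 1 == REWARD) else 0.0
        delta = r + V[t + 1] - V[t]     # TD error (discount factor = 1)
        deltas[t + 1] = delta
        if t >= CUE:                    # the pre-cue baseline state stays at
            V[t] += ALPHA * delta       # value 0: cue timing is unpredictable
    return deltas

for _ in range(200):                    # training: cue reliably predicts reward
    run_trial(rewarded=True)

trained = run_trial(rewarded=True)
omitted = run_trial(rewarded=False)

print(round(trained[CUE], 2))      # 1.0  -> burst at the now-predictive cue
print(round(trained[REWARD], 2))   # 0.0  -> fully predicted reward: no burst
print(round(omitted[REWARD], 2))   # -1.0 -> omitted reward: firing dips
```

The three printed values mirror the three experimental observations in the text: after learning, the error appears at the cue rather than the reward, and an omitted reward drives the signal below baseline.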
Goal-directed action has several functional requirements, including:
- keeping the spatial location and representation of the goal object and landmarks in immediate memory while working toward the goal (working memory);
- understanding causal relationships in order to predict or control the occurrence of rewards;
- developing a temporal representation (serial ordering) of the action; and
- altering behavior on the basis of environmental feedback, as in switching strategies.
Dopamine is the principal neurotransmitter involved in all of these skills (Previc, 1999), and dopaminergic systems – particularly the ventromedial ones – provide the motivational drive for what are often quite complex and effortful actions with delayed gratification. Indeed, as noted in Chapter 2, destruction of the medial dopaminergic systems in the brain can lead to a profound motivational apathy, despite otherwise normal behavior (Salamone et al., 2005; Solms, 2000; Tekin and Cummings, 2002).2
The role of dopamine in goal-directed activity would be much more limited if it did not mediate our attention to distant space and time. Predicting reward typically involves the ability to make associations between distant cues and reward objects, and dopaminergic systems have been shown to be especially important when goal-directed activity relies on distal rather than proximal visual or non-visual (e.g. tactile) cues (Blackburn et al., 1992; Szechtman et al., 1993; Whishaw and Dunnett, 1985). And, as already noted, extended behavioral sequences to achieve more distant goals also require an appreciation of the tem- poral distance to the goal. There even appears to be a close relationship between the extent of one’s future consciousness and one’s motivation level, with ventromedial frontal lesions diminishing both (see Fellows and Farah, 2005; see also Section 3.4.1).
Dopamine has been widely studied in conjunction with different aspects of goal-directed behavior in animals and in conjunction with addiction, apathy, and other motivational disturbances in humans. The
2 For example, patients receiving medial prefrontal lobectomies and leucotomies in the 1940s to treat psychiatric illness often retained intelligence levels within presurgical limits (Rosvold and Mishkin, 1950), despite profoundly impaired motivational drives.
dopaminergic motivational drive may in many cases be directed toward obtaining rewards necessary for physical survival, such as locating food or water, but in other cases it is not. For example, procurement of sexual rewards, sweet-tasting items, recreational drugs, and rewards with acquired value such as money and knowledge does not satisfy any immediate physiological need. In these cases, the rewards must acquire an incentive value that can justify the sometimes large behavioral expenditures required to achieve them. Most leading theories have posited that dopamine is crucially involved in incentive motivation – the motivation necessary to seek and acquire goals/rewards, even if they are not required for immediate physiological survival (Beninger, 1983; Berridge and Robinson, 1998; Blackburn et al., 1992; Depue and Collins, 1999; Horvitz, 2000; Ikemoto and Panksepp, 1999; Salamone et al., 2005).
The specific role of dopamine in performing goal-directed behavior has been studied most in feeding and sexual behavior. Based on the classic distinction between appetitive and consummatory behavior, dopamine is viewed by most researchers as much more involved in appetitive/seeking/foraging behavior (Alcaro et al., 2005; Ikemoto and Panksepp, 1999). Disruption of dopamine usually does not prevent an animal from consuming food that is already nearby, but it does affect its ability to initiate the goal-directed response, learn the behavioral con- tingencies that lead to the obtaining of the reward, and maintain responding when gratification is delayed (Baldo and Kelley, 2007; Berridge and Robinson, 1998; Blackburn et al., 1992; Dringenberg et al., 2000; Denk et al., 2005; Ikemoto and Panksepp, 1999; Salamone et al., 2005). For example, dopamine neurons are stimulated during the pre- paratory (goal-directed) behaviors prior to food-ingestion, especially for palatable food, but they are not highly active during the actual con- sumption of foods, even palatable ones (Baldo and Kelley, 2007; Blackburn et al., 1992). Dopamine is even more critically involved in male sexual behavior, which in the rodent involves a very complex sequence of goal-directed actions, including locomotion to the receptive female, mounting of the female, and copulation. Intact dopaminergic systems, driven by the male hormone testosterone (Hull et al., 2004), are necessary for the first two of these preparatory/seeking behaviors, but they are not necessary for copulation per se (Blackburn et al., 1992). Nevertheless, copulation per se does activate dopaminergic neurons in the nucleus accumbens and other medial structures to a greater extent than does the consumption of regular foods (Blackburn et al., 1992), whose value is more survival-based than incentive-based. As noted earlier, dopamine’s facilitation of sexual behavior is primarily confined
to males, as most female-type sexual behavior is dependent on other transmitters such as norepinephrine and is actually inhibited by dopamine agonists such as apomorphine (Robbins and Everitt, 1982).3 Some researchers have interpreted the role of dopamine in male sexual behavior and other "pleasurable" activities as due to a general involvement of dopamine in hedonic activity. As Berridge and Robinson (1998) note, however, there is a difference between "liking" (actually receiving pleasure from) a reward and "wanting" (craving) it. Liking implies a positive emotional reaction, whereas "wanting" implies a motivational drive that may or may not be associated with a positive emotion or even any emotion at all. A good example of this is addiction to drugs, gambling, sex, etc. – the reward itself may be less significant than the process of obtaining the reward. It has repeatedly been shown that medial dopaminergic systems are more involved in "wanting" than "liking" (Berridge and Robinson, 1998; Blackburn et al., 1992; Evans et al., 2006; Ikemoto and Panksepp, 1999; Salamone and Correa, 2002). The greater role of dopamine in motivational drive rather than in experiencing sensual pleasure per se may relate to its involvement in obsessive-compulsive and psychologically addictive behavior (see Chapter 4), both of which involve intense behavioral drives for aims that are not necessarily pleasurable or beneficial. Powerful dopaminergic behavioral drives can be maintained just to avoid aversive events or to obtain incentives (drugs, sex, money, etc.) that may over time become very addictive or self-destructive. In the extreme, humans may even choose completely abstract religious and political ideals over basic survival. This can be positively expressed in Martin Luther King Jr.'s proclamation that "if a man hasn't discovered something he will die for, he isn't fit to live."4 But it is also expressed in the shocking wave of suicide bombings in the Middle East in recent years, the tragic absurdity of the Crusades, and the horrific consequences of a multitude of political and religious
ideological conflicts throughout history (see Chapter 6).
A few general statements can be made concerning the role of different dopaminergic systems in motivation and goal-directedness. First, the ventromedial systems in the limbic and basal forebrain areas provide the major motivational drive or impulse for the goal-directed action. The
3 The different characteristics of male and female sexual behaviors (“active” and “receptive,” respectively) and the differential involvement of dopamine in them is consistent with other evidence that dopamine is more generally involved in “active” as opposed to “passive” cognitive, attentional and behavioral states (Tucker and Williamson, 1984).
4 Speech in Detroit, MI, June 27, 1963. Online, available at: www.quotationspage.com/
orbitofrontal and anterior cingulate cortices then help to organize and focus the goal-directed behavior into meaningful goal-directed sequences. Ultimately, the lateral prefrontal cortex, in conjunction with striatal systems, regulates these goal-directed actions in terms of prediction and feedback from the environment (e.g. by updating the original target behavior in memory on the basis of current contextual information). The lateral prefrontal network also provides the capacity to shift actions or strategies, which it does by inhibiting previous responses. The involvement of the lateral dopaminergic systems in working memory, cognitive shifting, and other "executive"-type behavior ultimately underlies the larger dopaminergic role in higher intelligence, as discussed in Section 3.2.
One interesting manifestation of the role of dopamine in mediating both upward and distant space and time is its involvement in altered states such as dreaming and hallucinations and in religious experiences in humans. As I recently described (Previc, 2006), all of these activities involve:
- a preponderance of extrapersonal (distant) sensory inputs (e.g. auditory and visual);
- a dearth of peripersonal ones (e.g. tactile and kinesthetic);
- an emphasis on extrapersonal themes (e.g. flying or moving outside of one's body); and
- the presence of upward eye movements and other upper-field biases.
Dopaminergic activation, caused either by altered physiological states (e.g. hypoxia/near-death, sensory deprivation/isolation, extreme stress) or other neurochemical factors (reduced glutamatergic, cholinergic, and serotonergic activation to varying degrees), appears to be the common denominator in all of these experiences (Previc, 2006). One of the most dramatic manifestations of dopaminergic activation is the out-of-body experience, which can occur during hallucinations, dreaming, sensory deprivation, hypoxia, psychotic disorders, hypnosis, transcendence, and psychologically traumatic events such as rape. Reduced sensory input releases the normal inhibition exerted on dopamine by serotonergic, cholinergic, and noradrenergic sensory outputs, dopamine itself being poorly represented in sensory processing areas. By contrast, hypoxic "near-death" experiences, dreaming, and depersonalization reactions during traumatic experiences may serve to elevate dopamine as part of a general quieting of the body that conserves oxygen during hypoxic episodes, combats the
rise in temperature before sleep (our temperature is lowered during sleep and dreaming), and lowers emotional arousal by reducing the sympa- thetic hormones involved in the response to psychologically stressful events (see Chapter 2).
Many researchers have posited a fundamental similarity between dreams and hallucinations (Hobson, 1996; Rotenberg, 1994; Solms, 2000), which even share an etymologic root.5 Hallucinations bear a close relationship with dreaming, as indicated by the fact that the former frequently occur in normal individuals following sleep deprivation, during or just prior to sleep onset ("hypnogogic hallucinations"), or just after awakening ("hypnopompic hallucinations") (Cheyne and Girard, 2004; Girard and Cheyne, 2004; Girard et al., 2007). Moreover, hallucinations can even occur in individuals while awake when accompanied by bursts of rapid-eye-movement activity, which is normally associated with dreaming (Arnulf et al., 2000). When ingested before sleep, hallucinogens such as LSD are known to potentiate dreaming (Muzio et al., 1966), and schizophrenics describe their hallucinations as dream-like (Gottesmann, 2002). Dreaming and hallucinations are also closely tied to religious traditions or linked to religious experience (Batson and Ventis, 1982: Chapter 4; Gunter, 1983; Koyama, 1995; Pahnke, 1969; Saver and Rabin, 1997). Dreams have been considered a means of receiving messages from the supernatural (e.g. Joseph's in the Book of Genesis) and meeting ancestors, as exemplified by the construction of many ancient Japanese religious temples in order to foster dreaming (Koyama, 1995). Ingestion of hallucinogenic drugs leads to mystical experiences and religious imagery (see review by Batson and Ventis, 1982: Chapter 4; Goodman, 2002; Pahnke, 1969; Saver and Rabin, 1997), and hallucinations during epilepsy and paranoid schizophrenia are hypothesized to have led to experiences that inspired many of the world's leading religions, including those of St. Paul (the founder of Catholicism), Mohammed (the founder of Islam), and Joseph Smith (the founder of the Mormon religion) (see Saver and Rabin, 1997).
Indeed, epilepsy and schizophrenia, both of which involve intense activation of the ventromedial dopaminergic pathways, are two disorders clearly linked to hyper-religiosity (see Previc, 2006); epilepsy was termed the "sacred" disease by the Greeks (Saver and Rabin, 1997), and "insanity" was originally a Hebrew word referring to those carried away by religious visions (see Previc, 2006).
5 Both derive from “to wander” in some languages – e.g. hallucination from the Latin “alucinari” and dream (“reve” and “reverie” in French) from the French “resver.”
In both dreams and hallucinations, serotonergic neurons in the raphe nuclei shut down, helping to unleash dopaminergic activity in normally inhibited medial frontal pathways (Gottesmann, 2002). One interesting difference between dreams and hallucinations is that whereas dreams are most likely to occur in rapid-eye-movement sleep, which is accompanied by cholinergic activation (Hobson et al., 2000), hallucinations are more likely to involve reduced cholinergic inhibition of dopaminergic activity, since anticholinergic drugs like atropine are potent hallucinogens. As noted earlier, one reason why dopamine is so involved in dreams and hallucinations is that dopamine is ordinarily not involved in the basic processing of sensory information but rather is much more plentiful in higher-order perceptual and associative regions. When the primary sensory systems – particularly those providing feedback concerning tactile and kinesthetic sensations – are shut down during sensory deprivation, dreaming, and other altered states, dopaminergic systems begin to make spurious and chaotic associations among internally generated stimuli that can be confused with associations among actual stimuli emanating from the environment (Previc, 2006).
It is interesting to note that during the dopaminergic activation in dreams, anesthesia, hypoxia, hypnosis, meditation, and mystical and delusional states in which out-of-body experiences are common, the eyes tend to roll upward (see Previc, 2006). Moreover, out-of-body illusions and hallucinations like flying are more likely to occur in the upper field and beyond arm's reach (Cheyne and Girard, 2004; Girard et al., 2007), and religious beliefs and experiences in humans are generally biased toward upward, distant space (e.g. heaven) (Previc, 2006). Religious practices such as meditation require a focus on upper space (e.g. focusing on the "third eye," a region on the forehead between the two eyes) or entail a diminution of bodily signals, allowing for transcendence (loss of self). Furthermore, many religious symbols and temples are located in or protrude into upper, even astrological space (e.g. spires and domes, pyramids, mountainside monasteries, sacred mountains, angelic figures, Stonehenge). Similarly, great religious moments and themes are entwined with upper space (Moses on Mt. Sinai, Jesus on the mountaintop, Mohammed transported by the angel Gabriel on a chariot into the sky, shamans sending spirits into the sky, etc.). Positive (approach) elements of religion such as heaven and angels are more likely to be associated with upper space, whereas negative (avoidance) elements such as hell and serpents are more likely to be associated with lower space (Meier et al., 2007; Previc, 2006), in line with a general bias, at least in Western cultures, to value upper space more highly (e.g. uplifted versus downtrodden, exalt versus debase, etc.) (Meier and Robinson, 2004; Previc, 2006).
Finally, religious (or at least spiritual) experiences are usually associated with activation of the ventral brain regions – especially the medial and superior temporal lobe and the medial prefrontal cortex – whereas peripersonal regions such as the posterior parietal lobe appear to be quieted during religious activity (Previc, 2006). The occurrence of symptoms similar to those found in medial-temporal seizures – such as olfactory and visual illusions and feelings of being spatially lost – has been frequently noted in normal persons with paranormal or spiritual beliefs and experiences (e.g. Britton and Bootzin, 2004; Fenwick et al., 1985; MacDonald and Holland, 2002; Morneau et al., 1996; Persinger, 1984; Persinger and Fisher, 1990; Persinger and Markarec, 1987). As already noted, the ventromedial cortical regions are among the most dopamine-rich regions of the cortex, as is consistent with the fact that all major drugs that create mystical experiences – glutamate antagonists, serotonin antagonists, acetylcholine antagonists, and dopamine agonists – ultimately serve to tilt the neurochemical balance toward dopamine (Previc, 2006). Due to its facilitation of associations and predictive relationships among distal stimuli, dopamine is believed to mediate superstitious associations, i.e. beliefs that events are not merely coincidental but are rather causally related. Superstitious behavior is linked to a distorted view of chance, which is a hallmark of paranormal thought (Brugger et al., 1991), and the underestimation of chance and randomness can lead highly dopaminergic persons to overestimate their control over events, either directly through their own ability or indirectly by tapping into spiritual forces through prayer and mediums. The heightened attempt to control events, to be discussed further in Sections 3.4.2 and 3.4.3 in connection with the "locus-of-control" concept and the left-hemisphere style, underlies many religious rituals as well as nonreligious behaviors such as gambling. In fact, both religious rituals
and gambling are associated with obsessive-compulsive (heightened control) tendencies and obsessive personality styles (Previc, 2006), which are dependent on activation of the medial dopaminergic pathways (see Chapter 4).6
Delusions, hallucinations, and dreaming are relatively more likely to
involve activation of the left hemisphere of humans (Previc, 2006). The left hemisphere also appears predominant in most religious experiences
6 Coincidentally (or perhaps not), for many years the only endowed chair in paranormal studies – the Bigelow Chair for Consciousness Studies – was located in the heart of gambling country at the University of Nevada at Las Vegas. It is also worth noting that male sexual titillation, fast-paced excitement, and copious amounts of alcohol that characterize Las Vegas further stimulate the dopaminergic drive and heighten the proclivity to gamble.
and behaviors, albeit to a lesser extent (Previc, 2006). The left-hemispheric predominance in dreams and hallucinations and other altered states runs counter to the common notion that the more intuitive right hemisphere is the source of such states. In reality, the "earthier" right hemisphere is the anchor for our bodily senses, emotions, and sense of self. Damage to the right hemisphere is more likely to produce a neglect of the body (as in dressing apraxia), peripersonal manipulative disorders (as in constructional apraxia), an altered body image and even denial of bodily handicaps (termed "anosognosia"), impaired body orientation in space, reduced self-awareness, and impaired emotional recognition and expression (see Cutting, 1990; Hecaen and Albert, 1978). In extreme cases, damage to the body/self systems of the right hemisphere can lead to actual depersonalization and somatoparaphrenic delusions, in which the patient may attribute sensations on their own body to an alien entity.7
Even more specifically, the altered states in which extrapersonal inputs prevail result from activation of the posterior (temporal) and anterior (prefrontal) portions of the ventromedial action-extrapersonal system, as opposed to the laterally based focal-extrapersonal system. In essence, ventromedial dopaminergic activation results in the “triumph” of extrapersonal brain activity over the body systems that anchor our self-concept and our body orientation as well as a triumph over the more “rational” executive intelligence maintained in the lateral dopaminergic systems (Previc, 2006). Conversely, the lateral (focal-extrapersonal) dopaminergic systems may be more responsible for the enormous power of the human intellect – particularly its abstract intelligence – as discussed in the next section.
- 3.2 Dopamine and intelligence
From what has been written above about dopamine's role in goal-directed and exploratory behavior, it would be surprising if dopamine did not play an important role in intelligence. However, the role of dopamine in intelligence clearly goes beyond its motivational role and can be tied to at least six specific cognitive skills, all of which can be linked to extrapersonal space. Before describing these skills, I will briefly review the general evidence for a relationship between dopamine and intelligence.
7 A vivid example of the loss of connectedness with one's own self or body following right-hemispheric damage is the tendency to see the other person rather than oneself in morphed images of oneself and a famous celebrity (Keenan et al., 2000). By contrast, the isolated right hemisphere sees the morphed image as more resembling oneself than the well-known person.
Figure 3.4 An axial (horizontal) section of a human brain showing reduced dopamine D2 receptor binding (increased dopamine activity) in the left and right caudate nuclei in a reversal shift memory task.
From Monchi et al. (2006). Striatal dopamine release during performance of executive functions: A [11C] raclopride PET study.
Neuroimage, 33, 907–912, with permission from Elsevier.
There are several direct pieces of evidence for a paramount role of dopaminergic systems in intelligence. First, reduced prefrontal dopamine levels are believed to be a prime cause of the working memory and other executive disturbances that occur in phenylketonuria (Diamond et al., 1997; Welsh et al., 1990). As previously noted, loss of just the dopaminergic content of the prefrontal cortex reproduces most of the symptoms of outright removal of this region (Brozoski et al., 1979). Second, dopamine levels are diminished due to extreme iodine deficiency during prenatal brain development because inadequate levels of thyroid hormones limit the conversion of tyrosine to dopa, contributing to widespread mental retardation (Previc, 2002). Third, dopamine binding and other measures have been shown to predict verbal, analytical, and executive intelligence in humans (Cropley et al., 2006; Guo et al., 2006; Reeves et al., 2005). Indeed, dopamine release in the dorsal striatum increases during executive tasks involving cognitive shifting (Monchi et al., 2006), as illustrated in Figure 3.4. Fourth, dopaminergic systems are among the most profoundly affected during the cognitive
decline of aging, with the number of dopamine receptors correlating significantly with performance on abstract reasoning, mental flexibility, and a variety of other cognitive tests (Bäckman et al., 2006; Braver and Barch, 2002; Volkow et al., 1998). Finally, neonatal damage to both the prefrontal cortex and caudate nucleus destroys most of the striatal/lateral-prefrontal dopaminergic system and produces permanent cognitive impairments in monkeys (Tucker and Kling, 1969), although this is not as true when one of the structures is spared.
There is also much indirect evidence that supports the crucial role of dopamine in intelligence. From a clinical perspective (see Chapter 4), cognitive function is altered to varying degrees in most diseases in which dopaminergic activity is abnormal, including attention-deficit disorder (where lateral prefrontal dopamine may be insufficient relative to ventromedial dopamine), autism (where lateral dopamine may be excessive), bipolar disorder (where dopamine is elevated during the manic state), Huntington's disease (in which dopamine is relatively elevated in the striatum), Parkinson's disease (where striatal dopamine is reduced due to damage to the nigrostriatal pathways), and schizophrenia (where dopamine in the ventromedial/mesolimbic systems is elevated). The symptom profiles in these disorders are far from identical, which is not surprising given that they affect different dopaminergic systems and, to varying degrees, other key neurotransmitters. However, one deficit common to virtually all of these disorders is impaired "executive" intelligence (e.g. working memory, planning, shifting strategies), in line with the importance of dopaminergic systems housed in the lateral prefrontal cortex for working memory, cognitive shifting, and other components of fluid and executive intelligence (Braver and Barch, 2002; Diamond et al., 1997; Kane and Engle, 2002; Nieoullon, 2002; Previc, 1999). Other indirect evidence of dopamine's role in intelligence is the relatively high amount of dopamine in nonhuman orders (parrots, cetaceans, and primates) that are believed to possess advanced intelligence, despite their very different brain shapes and sizes (see Previc, 1999).8 Finally, as noted earlier, the high level of dopamine in the left hemisphere is commensurate with that hemisphere's leading role in abstract reasoning and executive intelligence (see following sections).
Some theorists (e.g. Nieoullon, 2002) argue that dopaminergic systems contribute less to standardized intelligence measures than to fluid/
8 Interestingly, at least one of these species (parrots) may also exhibit excessive tics and other stereotyped movements that characterize hyperdopaminergic disorders such as autism, obsessive-compulsive disorder, and Tourette's syndrome in humans (Garner et al., 2003).
executive intelligence. But even the prefrontal cortex as a whole is not believed to be crucial for scores on standardized intelligence tests (Kane and Engle, 2002; Rosvold and Mishkin, 1950), which typically measure "crystallized" or knowledge-based intelligence to a greater extent than fluid intelligence. Nonverbal intelligence tests such as the Raven's Progressive Matrices Test that measure abstract reasoning are, by contrast, primarily designed to measure fluid intelligence, which underlies operations such as planning, working memory, shifting strategies, and abstract concepts that are more closely related to the concept of "g", or the general intellectual factor (Carpenter et al., 1990; Kyllonen and Christal, 1990). One particular dopaminergically mediated executive skill – working memory, the ability to store items in memory and perform multiple parallel operations on them – is even considered by some researchers to be the single-most important element of "g" (Kyllonen and Christal, 1990). Of course, dopamine is not the only neurotransmitter involved in intelligence. Acetylcholine is also involved in working memory, attention, and motor programming and is similarly lateralized to the left hemisphere (see also Ellis and Nathan, 2001; Previc, 1999, 2004). Norepinephrine is also involved in working memory and certain motor operations and, since it is derived from dopamine, it is also affected by the failure to develop normal tyrosine and dopamine levels in iodine-deficiency disorder and phenylketonuria. Certainly, the increase in cardiac output and temperature with mental effort (Al-Absi et al., 1997) points to a contribution of the noradrenergically stimulated sympathetic nervous system in mental activity (see Previc, 2004). In contrast to the above neurotransmitters, only serotonin does not appear to play a major role in cognition – indeed, it tends to decrease orienting, vigilance, and working memory (Dringenberg et al., 2003; Luciana et al., 1998; Schmitt et al., 2002), in line with its general oppositional role to dopamine.
As will be described in the following sections, dopamine's contribution to intelligence can be traced to its involvement in six primary cognitive skills: motor programming, working memory, cognitive flexibility, abstract representation, temporal analysis/sequencing, and generativity/creativity (see Previc, 1999). These skills are important not only to general intelligence but also to language, which many theorists consider to be the most advanced talent in humans. Linguistic competence would certainly collapse without these skills, all of which depend in most adults on a normally functioning left hemisphere, and it is also difficult to imagine language not being invented by individuals who possess these skills. By contrast, the notion of special linguistic abilities, processing centers, or even genes (e.g. Pinker and Bloom, 1990) that act independently of overall intelligence has yet to be demonstrated. This is not to deny that someone with above-average verbal skills may in some cases be deficient in visuospatial abilities, and vice versa – obviously, each of us has abilities and/or interests that allow us to perform better in some skill areas than others. But overall there is a high correlation between linguistic abilities and intelligence (Daneman and Merikle, 1996).
What is important from the standpoint of the larger role of dopamine in brain function is that all of these skills can be tied directly or indirectly to processing in distant space and time. Motor programming and planning are necessary for carrying out the sequential actions needed to achieve distant goals, while working memory allows us to receive and hold new information while updating our goal-directed behavior, and cognitive flexibility allows us to change course as we move along toward the distant goal. Abstract representation allows our cognitive apparatus to escape immediate perceptual images, and rapid sequential processing is needed in processing the complex and rapidly changing signals (particularly auditory ones) in extrapersonal space as we move through it. Finally, the process of generativity/creativity is important in stimulating new associations among various stimuli and responses. That these cognitive skills can be invoked during pure mental thought or when complex goal-directed behavior is required in nearby space does not negate the fact that under natural conditions, humans and other animals would have little need of them if goals were easily obtainable in immediate, nearby space.
- 3.2.1 Motor programming and sequencing
As already noted, dopamine is crucially involved in voluntary, sequential motor behavior. Dopamine is richly represented in frontal motor areas and is especially involved in motor learning, timing, and programming. Dopaminergic systems in the caudate nucleus are also important in controlling the rhythm of motor outputs (Aldridge et al., 1993), with reduced dopaminergic levels affecting voluntary movements more than reflexive ones and complicated movements more than basic ones. Orofacial movements are particularly facilitated by dopamine, which in excess leads to tics and other orofacial stereotypical behaviors (see Chapter 4). The sequencing of speech is particularly affected by dopaminergic alterations, whether that be overactivation, as in stuttering (Brady, 1991), or underactivation, as in Parkinson's disease (Lieberman et al., 1992; Pickett et al., 1998).9 As previously noted, severe depletion
9 Indeed, spoken language is dominated by the same basic motor rhythm (5 Hz) as are chewing and other orofacial behaviors (Kelso and Tuller, 1984).
of brain dopamine results in the loss of all voluntary motor behavior, including speech.
The notion that motor programming and motor behavior are closely linked to intelligence has been somewhat contentious. Since the early part of the twentieth century, leading behavioral theorists such as Watson and Skinner have viewed linguistic thought as involving motor circuits, especially those involved in speech. Certainly, speech circuits are activated during verbal thought (McGuigan, 1966), and speech bears considerable relation to other orofacial movements in terms of its timing, duration, and musculature (Kelso and Tuller, 1984). Eye movements, usually upward, are also activated during mental arithmetic and other complex tasks, and even relatively simple mental calculations become quite difficult when subjects are forced to look downward or not make any eye movements (Previc et al., 2005). As but another example, topological operations like mental rotation involve activation of the hand-movement representations in the dorsal parietal and premotor regions of cerebral cortex (Cohen et al., 1996; Vingerhoets et al., 2002), which makes sense in that unusual rotations of objects primarily occur in peripersonal space, in conjunction with manual activity (Previc, 1998). More generally, executing motor actions requires many of the same operations as intelligence, in that in both cases we must develop hierarchical, sequential organizational strategies for achieving goals – for example, sub-goals must be established that have to be achieved before obtaining the main goal. We must flexibly adjust our course of action depending on obstacles or events in the environment, and we must maintain goals, target positions, etc. in working memory. Finally, active involvement using motor circuits is known to improve learning and memory relative to passive learning environments (Hein, 1974).
The link between motor behavior and intelligence is further strengthened by the involvement of dopamine-rich motor association cortex in higher-order mental operations. The area most involved in grammatical comprehension – known as area 45 or Broca's area10 – may be considered part of orofacial motor association cortex, while the area most involved in mental calculations and working memory – area 46 – may be considered part of oculomotor association cortex (Previc et al., 2005). The link between intelligence and motor behavior is further supported by the greater importance of the dopamine-rich left hemisphere relative to the right one in voluntary motor behavior and motor
10 This general area was first recognized by the nineteenth-century neurologist Paul Broca as being important to speech and language.
- 3.2.2 Working memory
As already noted, working memory is arguably the single-most important skill required for general intelligence (Carpenter et al., 1990; Kyllonen and Christal, 1990). Working memory refers to the ability to store, retrieve, and operate on items in memory on a short-term basis (typically, a few seconds). It is difficult to imagine that we could carry out mental operations well if we could not maintain information in one register while at the same time operating on and retrieving information from other registers. During decision-making, which activates lateral prefrontal dopamine neurons (McClure et al., 2004; Sevy et al., 2006), we must hold two or more alternatives in memory while weighing our course of action. During language comprehension, we must likewise be able to process the last part of sentences while simultaneously performing high-level interpretation based on clauses encountered several seconds earlier. Not surprisingly, working memory scores are highly predictive of overall language comprehension abilities (Daneman and Merikle, 1996).
Both human and animal experiments point to the important role of the lateral prefrontal cortex in working memory (see Goldman-Rakic, 1998), along with an assortment of areas with strong interconnections to it (e.g. the parietal eye-fields and lateral temporal lobe). These regions are important not only to working memory but also to mathematical and other complex reasoning tasks. All of these brain areas tend to be very rich in dopamine, which has perhaps more than any other neurotransmitter been implicated in working memory (Ellis and Nathan, 2001).
There is also evidence of an overall left-hemispheric bias in working memory, although it is less compelling and more constrained than in the case of motor programming. The left-hemispheric advantage in working memory is greater when semantic processing is required, but it is also present during difficult and/or novel spatial working memory tasks (see Previc, 1999).
- 3.2.3 Cognitive flexibility
The ability to alter one's cognitive and motor strategy based on new information, much like an executive constantly strategizing about his
company’s future direction, is intuitively the cognitive skill most closely synonymous with the notion of executive intelligence. To arrive at optimal problem-solving and decision-making, it is necessary but not sufficient to know and process what information has been provided, as the ability to act on information (feedback) that is discrepant with one’s current approach or strategy is also critical. This is especially true with language, where ongoing comprehension requires a constant updating of a sentence’s context based on the content of previous words and phrases.
The paramount role of the lateral prefrontal cortex in cognitive shifting has been conclusively established on the basis of numerous tests, the most famous of which is the Wisconsin Card-Sorting Test. In this test, cards with different forms, colors, and numbers (e.g. two red squares) are presented, and the patient is required to sort by one category (e.g. color) and then shift to a different one (e.g. form) without explicit instructions. Sorting on the basis of the first category merely becomes "nonrewarded" while sorting on the basis of the second and later categories becomes "rewarded." Patients with prefrontal damage, particularly in the left hemisphere, have great difficulty in performing this task even though they may be aware that the rewarded cue has changed (Barcelo et al., 1997; Smith et al., 2004). The same basic result, including the left-hemispheric dominance, holds true for other cognitive shifting tests, such as the trail-making test in which patients must connect a pattern of alternating numbers and letters (e.g. A-1-B-2-C-3, etc.) (Moll et al., 2002).
The lateral prefrontal cortex's role in shifting is linked to its rich dopamine concentration in that dopaminergic neurons are sensitive to reward contingencies and are part of an inhibitory network that can stop ongoing activity and adjust one's responses on the basis of prior context (Braver and Barch, 2002). Moreover, dopamine activity in the dorsal striatum, which is reciprocally connected to the lateral prefrontal system, is highly altered during shifting (Monchi et al., 2006). Lateral-prefrontal dopamine neurons specifically signal a discrepancy in expected versus actual outcome (Rodriguez et al., 2006). Destruction of prefrontal dopamine systems disrupts behavioral switching in animals, and dopamine antagonists do the same in humans (see Koob et al., 1978; Nieoullon, 2002; Simon et al., 1980). One example of this is when one receives a cue pointing in one direction (e.g. leftward) to make an eye movement in the opposite direction (e.g. rightward); without the lateral prefrontal cortex and a normal dopaminergic system, performance in this so-called "anti-saccade" task suffers severely (Pierrot-Deseilligny et al., 2003). Another example of dopaminergically mediated contextual
switching is latent inhibition, the tendency to ignore cues that have previously been made irrelevant to a task. Schizophrenics and even highly creative individuals with reputedly low prefrontal activity relative to medial subcortical activity attend to previously rewarded and nonrewarded cues similarly and do not show the latent inhibition context effect (see Section 3.2.6 and Section 4.2.8). Similarly, reduced dopamine in Parkinson's disease and even normal aging may underlie
- 3.2.4 Abstract representation
The ability to create and maintain abstract representations (e.g. in symbol use and concept formation) is present only in the most intellectually advanced animal species and is considered another component of fluid intelligence in humans (Carpenter et al., 1990). To engage in such behavior requires that the symbol and event be divorced from immediate space and time, as the symbol is spatially displaced from its referent and the concept is divorced from a specific place and time. The most abstract space–time concepts are cosmological and existential ones, such as universe, infinity, heaven, and the afterlife. Without abstract symbols (letters, words, etc.), advanced language skills would not be possible – indeed, the ability to traverse the past, present, and future as well as to use completely arbitrary symbols are all critical design features of language (Bickerton, 1995; Suddendorf and Corballis, 1997; Wilkins and Wakefield, 1995). Language, therefore, represents one of the most salient examples of off-line human thought that has no immediate impact on survival (Bickerton, 1995; Suddendorf and Corballis, 1997), in contradistinction to the predominantly present-oriented thought of other primates.
I have already noted how dopaminergic systems originally used for interacting in distant “real” space were expropriated during human evolution for use in even more cognitively distant spatial and temporal realms, including abstraction, mental time traveling, and religious activity. One piece of direct evidence for dopamine’s involvement in abstract concept formation and abstract reasoning comes from the Raven’s Progressive Matrices Test, which requires an understanding of the conceptual relationship among a series of abstract designs. In this test, a sample of up to nine abstract forms, arranged in a logical order with one form missing, is presented; a set of up to eight alternatives is also presented and the person must choose the correct one to replace the blank one. Dopamine levels are positively correlated with
performance on the Raven's test (Nagano-Saito et al., 2004), consistent with:
- the poor performance of Parkinsonian, aging, and other dopamine-deficient populations on this test (Nagano-Saito et al., 2004; Ollat, 1992);
- the superior performance of persons with the hyperdopaminergic autistic spectrum disorder (Dawson et al., 2007; Hayashi et al., 2008); and
- the crucial involvement of the dopamine-rich lateral prefrontal cortex (Gray et al., 2003; Prabhakaran et al., 1997), especially in the dopamine-rich left hemisphere (Berman and Weinberger, 1990).
Indeed, many studies using different tests, including deductive logic, have shown that lateral prefrontal involvement in abstract reasoning is lateralized to the dopamine-rich left hemisphere (Deglin and Kinsbourne, 1996; Gray et al., 2003; Prabhakaran et al., 1997; Villardita, 1985), even if the tests are nonverbal in nature. In fact, the distinction between left-hemispheric "abstract/analytical/propositional" and right-hemispheric "concrete/wholistic" thought is one of the most widely accepted and important functional lateralizations in the human brain (Bradshaw and Nettleton, 1981; Nebes, 1974; Ornstein, 1972). By contrast, the earthier, more emotionally astute and practical right hemisphere is more adept at proverb interpretation, judging a speaker's intent, inferring other people's thoughts, and detecting lying, as well as at certain 3-D geometrical skills linked to peripersonal operations. What all of these right-hemispheric skills have in common is that they rely on bodily-centered or emotional signals from the body (e.g. proverbs convey subtle emotional messages, judging a speaker's intent involves processing facial expressions or the emotional quality/prosody of the voice, and geometrical processing requires the 3-D visualization of objects, which primarily occurs in body space during manipulations).
- 3.2.5 Temporal analysis/processing speed
A key element allowing both computers and humans the capacity for advanced intelligence is a high information processing speed or "baud rate." Colloquial expressions about "being slow" or "quick-thinking" are often used to describe intellectual capabilities. Moreover, there is strong scientific evidence linking general intelligence to the speed of information processing and memory access, using measures of reaction time and event-related brain activity (Bates and Stough, 1998; Fry and Hale, 2000). A high "baud rate" is essential for the rapid delivery and
comprehension of spoken language, which involves transient acoustic patterns in the millisecond range. Indeed, many developmental language impairments are believed to be caused by underlying deficits in temporal processing (Farmer and Klein, 1995; Tallal et al., 1993).
Dopaminergic systems are very important to the timing of our short-duration internal "clock." Dopaminergic agonists speed up the clock, whereas dopaminergic antagonists slow it down (Meck, 1996). These effects are consistent with the slower information processing in Parkinson's disease (Sawamoto et al., 2002), which is not causally dependent on the slowed reaction-times in this disorder. The influence of dopamine in increasing processing speed is also evidenced by the faster reaction-times following intake of drugs that increase dopaminergic transmission, such as piribedil (Schuck et al., 2002) and levodopa (Rihet et al., 2002).
The role of dopamine in speeding up cognitive processing is further consistent with the greater involvement of the left hemisphere in rapid temporal processing. For example, the left hemisphere is more involved in rhythmic aspects of language and music, speech, upper-limb movements, and even tactile perception (Bradshaw and Nettleton, 1981). The paramount role of the left hemisphere in the perception and production of both basic (e.g. articulation) and higher-order (e.g. syntactical) aspects of language appears to be highly dependent on its role in rapid sequential processing and programming in the auditory modality (Tallal et al., 1993).
- 3.2.6 Generativity/creativity
Even with the first five essential skills, our advanced intelligence would nonetheless be greatly limited were it not for the final dopaminergically mediated skill – the ability to generate new solutions and to create and express novel ideas and associations. The incredible generativity of human language, symbol use, musical expression, engineering, etc. is unquestionably one of the most distinguishing features of the human intellect relative to those of other advanced species (Corballis, 1992). The ability to generate an almost limitless number of unique sentences using only a few dozen phonemes and symbols, known as "generative grammar," is an essential feature of human spoken and written communication (Chomsky, 1988; Corballis, 1992; Hockett, 1960).
As noted earlier, dopamine promotes associations among events and helps to stimulate mental fluency (the ability to generate words, associations, etc.). One example of this is the diminished verbal and semantic fluency found in dopamine-deficiency conditions such as aging, Huntington's disease, Parkinson's disease, and phenylketonuria (see
Previc, 1999), which can be overcome when drugs that boost dopamine activity are provided (Barrett and Eslinger, 2007). Another interesting example of the role of dopamine in creativity comes from the link between creativity and the previously described phenomenon known as “latent inhibition.” Like highly dopaminergic schizophrenics, creative individuals also show a markedly reduced tendency to filter out irrelevant stimuli in the latent inhibition paradigm (Carson et al., 2003). Given the link between creativity and reduced latent inhibition caused by excessive ventromedial dopaminergic activity (Weiner, 2003), as well as the generally inhibitory role of the lateral frontal system, the creative drive appears to involve primarily activation of the more “impulsive” ventromedial dopaminergic pathways (see Flaherty, 2005; Reuter et al., 2005). Another link between dopamine and creativity is that, when trying to generate mental images, words, solutions, or otherwise think creatively, our eyes tend to move upward in the direction of distant space (Falcone and Loder, 1984; Previc et al., 2005). In the extreme, the dopaminergic drive to make stimulus associations leads to superstitious behavior (in the case when the associated events are purely random), a heightened sense of relatedness during mystical states (Previc, 2006), and a “loosening of associations” in schizophrenics, who are more likely to make unusual, remote, and even bizarre associations to verbal stimuli. Seeing unusual relationships is also a hallmark of creativity, so it is not surprising that creativity is anecdotally associated with ventromedially driven states like dreaming (Horner, 2006) or that schizophrenia, mania, attention-deficit hyperactivity disorder, and other hyperdopaminergic disorders are all associated with greater creativity in affected individuals and immediate family members (Abraham et al., 2006; Goodwin and Jamison, 1990: Chapter 4; Karlsson, 1974; Previc, 1999).
Corballis (1992) argues that the dopamine-rich left hemisphere contributes more to generativity than does the right hemisphere, and this is certainly true in the case of verbal fluency, sign language, and mental imagery (see Previc, 1999). However, visuoconstructive tasks are usually better performed by the right hemisphere, and it is generally recognized that creativity probably involves a complex set of interactions involving both the left and right hemispheres (Bogen and Bogen, 1988; Hoppe, 1988).
3.3 Dopamine and emotion
While both dopaminergic systems are concerned with either initiating or guiding voluntary (goal-directed) behavior in extrapersonal space, the
role of dopamine in emotion is more complex and system-specific. As noted earlier, dopamine is tied to the incentive-based pleasurable sexual act and even promotes some “positive” activational states such as grandiosity, elation, and even euphoria (Alcaro et al., 2005; Burgdorf and Panksepp, 2006; Ikemoto, 2007), but there is less convincing evidence that dopaminergic systems are involved in most emotional arousal or interactions and such social tendencies as warmth, playfulness, and empathy. Rather, proximal socio-emotional affiliative interactions and most emotions depend more on serotonin, norepinephrine, opioids, and oxytocin (Depue and Morrone-Strupinsky, 2005; Nelson and Panksepp, 1998; Previc, 2004), which is why boosting serotonin and norepinephrine is the preferred pharmacological option in treating most mood disorders.
Because of its parasympathetic action, elevated lateral dopaminergic activity in particular may actually serve to dampen emotional arousal (Delaveau et al., 2005; Deutch et al., 1990; Finlay and Zigmond, 1997; Mandell, 1980), which is why emotional stress can frequently precipitate hyperdopaminergic clinical states (see Chapter 4). As already noted, dopamine has been associated with extraversion – but more in the case of “agentic” extraversion (characterized by the use of other people to achieve goals) than “affiliative” extraversion (characterized by genuine empathy for other people) (Depue and Morrone-Strupinsky, 2005; Panksepp, 1999). Indeed, the trait of emotional detachment has consistently been associated with high levels of dopamine and certain dopaminergic receptor genes such as DRD2 (Breier et al., 1998; Farde et al., 1997), and the ability to regulate (or even inhibit) one’s emotions is a characteristic of extremely goal-oriented individuals, whose traits most typify the lateral dopaminergic personality (see Section 3.4.2). In monkeys and other animals, dopamine agonists decrease social grooming and increase social isolation (Palit et al., 1997; Ridley and Baker, 1982; Schlemmer et al., 1980), just as social withdrawal occurs in hyperdopaminergic disorders such as autism and schizophrenia (see Chapter 4). Heightened aggressiveness has also been shown to occur following dopaminergic stimulation (de Almeida et al., 2005; Miczek et al., 2002).
The restricted involvement of dopaminergic systems in emotional arousal is consistent with the social and emotional deficiencies of the dopamine-rich left hemisphere in most humans. When tested in isolation, the left hemisphere has difficulty in understanding the emotional content of communications, in processing the intonations (prosody) of speech that convey its affective component, in judging speakers’
intent (in what has been generally termed “theory of mind”),11 in producing facial expressions, and in accurately gauging the emotion of another individual from facial expressions (Borod et al., 2002; Happe et al., 1999; Heilman and Gilmore, 1998; Kucharska-Pietura et al., 2003; Perry et al., 2001; Sabbagh, 1999; Weintraub and Mesulam, 1983). Other derivative emotional deficits of the left hemisphere include the reduced ability following right-hemispheric lesions to discern the emotionally laden meaning of proverbs (Bryan, 1988) and to determine if someone is lying from his or her vocal and facial behavior. Although a predominance of left-hemispheric activity can frequently lead to mania (Cummings, 1997; Cutting, 1990; Joseph, 1999), the increased activity level in mania is not always accompanied by improved mood. Moreover, superficially improved mood following right-hemispheric damage can also stem from the left hemisphere’s lack of awareness (anosognosia) of the damage to the other hemisphere (Cummings, 1997; Cutting, 1990).
3.4 The dopaminergic personality
As with emotion, the role of dopamine in defining our individual personalities is more complex and controversial than is the dopaminergic contribution to intelligence. Dopamine excess has been associated with numerous types of normal and subclinical personalities, including impulsive (Comings and Blum, 2000), detached (Farde et al., 1997), extroverted (Depue and Collins, 1999), novelty-seeking (Cloninger et al., 1993), schizotypal (Siever, 1994), and telic or serious-minded (Svebak, 1985). While many of these personalities may seem at first glance to be mutually incompatible, the differences among them may be more apparent than real because of the differential involvement of the major dopaminergic systems.
As already discussed, the lateral and ventromedial dopaminergic systems appear to mediate different functions, and it seems reasonable to propose that they mediate different personality traits as well (see Table 3.1). Most evidence suggests that the ventromedial dopaminergic
11 In one well-known example of judging the intent of others using the “theory of mind,” the patient is first shown a picture of a girl trying to find a lost dog after losing it in one place and then is shown a picture of the dog in another place after the girl has left the room. Patients with right-hemispheric damage will be more likely to incorrectly predict that the girl who lost the dog will look for it in its new place, because they cannot take on the mental perspective of the girl (who never actually saw the dog placed elsewhere).
Table 3.1 Features of the two dopaminergic systems.

Associated with focal-extrapersonal system | Associated with action-extrapersonal system
Future-oriented (event prediction)         | Future-oriented (exploration)
Distal goal-oriented (strategic)           | Distal goal-oriented (initial drive)
Rational (abstract)                        | Creative (paranormal experiences)
Focused, controlled (internal locus)       | Hyperactive, impulsive
Unemotional                                | May mediate emotions such as euphoria and aggression
system, largely associated with the action-extrapersonal brain pathways, has an intense motivational drive and connections with the limbic system and is more likely to be involved in addiction, aggression, impulsive and compulsive behavior, sexual activity, and creative and even paranormal thought. By contrast, the lateral prefrontal cortex, associated more with the focal-extrapersonal brain pathways, is involved in planning and strategizing and provides a major inhibitory control over the intense drives of the ventromedial dopaminergic system. The Freudian analogy (Freud, 1927) of the “ego” guiding and even overriding the “id” is an apt description of the relationship of the lateral to ventromedial dopaminergic systems, although the “id” should not be viewed merely in terms of Freud’s “pleasure principle” but rather as a set of impulses, primitive drives, and loosened thought processes.12 In reality, the two systems considerably overlap in their functions and largely work together, and no one individual has a purely lateral or ventromedial personality. But the relative balance between them may account for the variation in personality traits even among normal individuals with the same overall dopamine levels.
12 The “ego” should also not be confused with the “self” – whereas both the “ego” and the “id” are outwardly directed and dopaminergically mediated, the self implies an awareness of one’s body, which explains why it is more dependent on right-hemispheric functioning (e.g. Keenan et al., 2000). Freud also postulated the existence of a “superego” that is responsible for guiding both the “ego” and “id” toward socially altruistic behavior and may be considered somewhat analogous to one’s “conscience.” The “superego” is more difficult to pin down neuroanatomically: although certain religious obsessions akin to a hyperactive conscience may be associated with orbitofrontal activity, especially on the left side (see Previc, 2006), the right hemisphere of most individuals appears to be the site of most prosocial and empathetic behavior (see Section 3.3).
3.4.1 Ventromedial dopaminergic traits
The ventromedial dopaminergic systems provide us with an intense but unconstrained and even aggressive motivational drive directed toward distant, incentive-laden goals. It is the inability to control the ventromedial drives and thought patterns that underlies most of the hyperdopaminergic mental disorders to be described in Chapter 4. The nucleus accumbens, a dopamine-rich structure with important connections to the limbic system and ventral prefrontal lobe, is the best-studied structure in this regard. Dopaminergic neurons in the medial shell of the accumbens are highly active during goal-directed activities, especially for incentive-laden goals (Alcaro et al., 2005; Berridge and Robinson, 1998; Blackburn et al., 1992; Ikemoto, 2007; Ikemoto and Panksepp, 1999). By contrast, ventromedial regions do not appear to play as much of a role in executive intelligence as does the lateral system (Cummings, 1995).
As noted in Section 3.1.2, opinion among most researchers has shifted from the view that the ventromedial-dopamine system mediates pleasure sensations (“liking”) to the view that this system is responsible for motivational drive (“wanting”) (Alcaro et al., 2005; Berridge and Robinson, 1998; Blackburn et al., 1992; Ikemoto and Panksepp, 1999). Functional brain imaging in humans has shown that widespread areas of the ventromedial dopaminergic system, including the orbitofrontal cortex and anterior cingulate, are intensely involved in the cravings of substance abusers (Evans et al., 2006; Goldstein and Volkow, 2002). The medial-dopaminergic motivational drive can range from the impulsive (as in attention-deficit disorder and various risk-taking behaviors including gambling, sexual promiscuity, and some drug addictions) to the compulsive (workaholism and obsessive-compulsive rituals) (see Chapter 4), with the latter involving cortical areas to a greater extent. However, it should be stressed that while the medial-dopaminergic system is implicated in most types of psychological addictions, other neurochemical systems such as the opioid and serotonergic ones are also involved –
e.g. serotonergic deficits may promote some types of substance abuse (e.g. alcoholism), partly due to release of serotonergic inhibition on the ventromedial dopaminergic system (Johnson, 2004).
The ventromedial dopaminergic system, including the nucleus accumbens and its shell, is also involved in increasing the salience of extrapersonal stimuli and in making associations between them (Blackburn et al., 1992; Kapur, 2003). As noted earlier, ventromedial dopaminergic over-activity is believed to result in loosened associations
among distal stimuli and/or internally generated stimuli, as in the loosened associations and bizarre thought patterns of schizophrenics. Dreaming is a particularly vivid example of what happens when the lateral prefrontal system is quieted and the medial systems (including the anterior cingulate and ventromedial frontal lobe) are unleashed (Solms, 2000; Previc, 2006). The extrapersonal themes, bizarre associations, and often-intense repetitive and anxiety-laden drives represent the epitome of irrationality and closely resemble the psychotic state of schizophrenics (Gottesmann, 2002; Hobson, 1996; Previc, 2006). However, the bizarre, chaotic associations created by the unleashing of ventromedial dopaminergic activity can more positively be associated with creative genius, which relies partly on the ability to generate or detect unusual associations among stimuli. Not only are there phenomenological, evolutionary, and familial links between genius and madness (Horrobin, 1998; Karlsson, 1974), but a link between genius and madness clearly exists in the countless number of famous individuals who have experienced both (Karlsson, 1974), including the Nobel prize-winning mathematician John Nash, who suffered from paranoid schizophrenia. The following conversation between Nash and a colleague, Mackey, is particularly revealing in this regard:
“How could you,” began [Harvard professor George] Mackey, “how could you, a mathematician, a man devoted to reason and logical proof ... how could you believe that extraterrestrials are sending you messages? How could you believe that you are being recruited by aliens from outer space to save the world? How could you ...?” ... “Because,” Nash said slowly in his soft, reasonable southern drawl, as if talking to himself, “the ideas I had about supernatural beings came to me the same way that my mathematical ideas did.” (Nasar, 1998: 11)
3.4.2 Lateral-dopaminergic traits
The lateral prefrontal dopaminergic system is crucial to maintaining control over behavior and thought. In animals, it has been shown that dopamine levels are very high when stressors are controllable but less so when they are uncontrollable (Anisman and Zacharko, 1986; Coco and Weiss, 2005). When animals are required to actively control their environment by making cost–benefit decisions about rewards –
e.g. choosing a larger but delayed reward over a smaller but immediate one – lateral prefrontal dopaminergic systems are also stimulated (Denk et al., 2005; McClure et al., 2004; Sevy et al., 2006). These findings are generally consistent with dopamine’s role in mediating active exploration, achievement motivation, and goal-seeking, particularly in males
(Previc, 2004). The lateral dopaminergic system – originating from the focal-extrapersonal system involved in oculomotor search, scanning, and memory – essentially channels the less-constrained medial-dopaminergic systems into organized plans and motor sequences to achieve the intended motivational goal. It is the lateral dopaminergic system that is most closely tied to the executive skills reviewed in Section 3.2 and is presumably the basis of the “telic” (intellectual and serious-minded) personality described by Svebak (1985) and the “agentic” extrovert personality (Depue and Collins, 1999).
As noted previously, lateral dopaminergic systems may also be crucial in maintaining what has been termed an internal locus-of-control (see review by Declerck et al., 2006). Locus-of-control refers to the tendency to view one’s choices and experiences as being under one’s own control (known as “internal” control) or as determined by fate or others (known as “external” control). Having a high internal locus-of-control is believed to be an important predictor of success in life and in combating stress and disease (De Brabander and Declerck, 2004; Kushner et al., 1993; Regehr et al., 2000) and in generally surviving extreme environments (Previc, 2004). It is also greater in males (De Brabander and Boone, 1990), is associated with left-hemispheric function (De Brabander et al., 1992), and is correlated with high goal-directedness, executive intelligence, achievement motivation, and a future (as opposed to present) temporal perspective (Declerck et al., 2006; Markus and Nurius, 1986; Murrell and Mingrone, 1994). High dopamine levels support an internal locus-of-control belief (Declerck et al., 2006), as exemplified by the effect of dopamine-boosting drugs such as amphetamine in the lateral prefrontal cortex to increase perceptions of internal control in those suffering attention-deficit disorder and certain medical diseases (Pelham et al., 1992). Because of its ability to inhibit the limbic stress response (Deutch et al., 1990; Finlay and Zigmond, 1997) and to regulate or even inhibit emotionality (Declerck et al., 2006), the lateral prefrontal dopaminergic system is especially useful for clear thinking under stress. Accordingly, lateral-dopaminergic traits frequently predominate in successful military leaders, who, because of their intelligence and planning skills, belief in their ability to control events, ability to function under stress, and reduced emotional attachments, may be especially well-suited to exert bold leadership in dangerous situations (Previc, 2004).
The two greatest weaknesses of the lateral prefrontal system are:
- … (see Chapter 4); and
- an exaggerated belief in one’s own power to control people and events.
Too much dopamine can propel one’s internal control sense to abnormal limits, as illustrated by the delusions of grandiosity following administration of amphetamine (Krystal et al., 2005), which is the preferred treatment to combat under-focusing and lack of control in attention-deficit disorder. Extreme dopamine activation may lead individuals to believe that they can control thoughts and events at great distances and that they are in the midst of cataclysmic and cosmic forces acting through them, as in mania and schizophrenia (see Chapter 4), but even mild dopamine elevations can lead to schizotypy and magical ideation, in which individuals may feel that they have special powers and/or the ability to control random events (Previc, 2006).
As already noted, most individuals do not exhibit exclusively one set of dopaminergic personality traits. As will be highlighted in Chapter 6, most great historical figures have possessed both lateral and ventromedial traits – i.e. a high dopaminergic intelligence and strategic (distal or far-sighted) vision along with creative, impulsive, mystical, and even irrational dopaminergic tendencies.
3.4.3 Dopamine and the left-hemispheric (masculine) style
As noted throughout this chapter, dopaminergic traits such as distal orientation, abstract intelligence, paranormal beliefs, internal locus-of-control, and reduced emotionality characterize the dopamine-rich left hemisphere of most humans, as determined by unilateral stimulation of it, isolated testing of it after severance of the connections between it and the right hemisphere (i.e. the split-brain patient),13 and following damage to the right hemisphere. General descriptions of the left hemisphere in most individuals as more analytical, goal-oriented, linear-thinking, and controlled are widely accepted (Bradshaw and Nettleton, 1981; Nebes, 1974; Ornstein, 1972; Tucker and Williamson, 1984). The left hemisphere also appears to be more involved in aggression (Andrew, 1981; Elliott, 1982; Mychack et al., 2001; Pillmann et al., 1999) and sexual behavior (Braun et al., 2003). Hyperdopaminergic disorders such as autism, mania, obsessive-compulsive disorder, and schizophrenia are all more likely to occur following right-hemispheric than left-hemispheric damage and thereby reflect left-hemispheric overactivation (see Chapter 4). With its extrapersonal drives, the left hemisphere is not only more
13 The split-brain patient has his or her corpus callosum connecting the right and left hemispheres severed, which is performed in rare cases to alleviate otherwise treatment-resistant epilepsy and allows researchers using appropriate tests to assess the function of the isolated left or right hemisphere.
likely to house our dreams and religious experiences (Previc, 2006; Solms, 2000) but also the hypothesis-testing scientist within us (Wolford et al., 2000). Conversely, as already reviewed, the right hemisphere – richer in norepinephrine and serotonin – is less involved in extrapersonal space, analytical intelligence, sexual behavior, and aggression but is more inclined toward peripersonal activity, self-awareness, and social empathy.
It is worth noting that the active, analytical, controlled, less emotional, sexually driven, and even aggressive style of the left hemisphere is more typical of male behavior in general, and it is popular to view male behavior as predominantly “left-brained” (Gray, 1992; Ornstein, 1972). Indeed, the left- and right-hemispheric modes are similar to the ancient Chinese concepts of “yang” (masculine, upward-seeking, and active) versus “yin” (feminine, downward-seeking, and passive), which, in turn, have their parallels in many other myths and concepts (Ornstein, 1972). While males and females alike possess left-hemispheric dominance for language and other functions, the left hemisphere’s dopamine-related active, controlling, and even aggressive style suggests a greater “masculinity,” whereas the right hemisphere’s greater emotional sensitivity and empathy, mediated by its relatively greater noradrenergic and serotonergic concentrations, suggest a greater “femininity.” As previously noted, testosterone increases overall dopaminergic activity and, as will be reviewed in Chapter 4, males are more prone to almost every hyperdopaminergic disorder.14 By contrast, estrogen inhibits dopamine activity and promotes a more passive, receptive style – for example, classic female sexual receptive postures such as lordosis are stimulated by norepinephrine but inhibited by dopamine (Robbins and Everitt, 1982). Not surprisingly, most studies have shown that males around the world are generally more likely to adopt internal locus-of-control and futuristic perspectives (Bentley, 1983; de Brabander and Boone, 1990; Greene and Wheatley, 1992; Sundberg et al., 1983). The
14 Geschwind and Galaburda (1985) were among the first to emphasize the possible relationship between the left hemisphere and male-dominated disorders. They argued that testosterone had a direct effect on delaying the development of the left hemisphere, which is incorrect because it is damage to the right hemisphere that mimics most of the male-biased psychological disorders (see Chapter 4). In reality, the typical male brain is more characteristic of the cognitive and emotional style of the left hemisphere, not necessarily because the male brain consists of an overactive left hemisphere but because the dopamine content in both of its hemispheres is exaggerated. It must be stressed, however, that the male bias in dopamine function is highly variable and hardly immutable. Indeed, as females achieve greater success in modern societies, the dopaminergic content of their brains may be increasing, which could be a factor in the rising incidence of autism (Previc, 2007).
particular relationship between masculinity and the left hemisphere may have been of great historical significance in the rise of male-dominated dopaminergic societies at the end of the Neolithic period, as discussed further in Chapter 6.
The combined role of dopamine in reward prediction, stimulus associations, motivation, and control may help us understand the intriguing finding of Wolford et al. (2000), whose paradigm tested the responses of the isolated left and right hemispheres in split-brain patients to cues that randomly signaled future reward at a higher or lower probability, depending on whether the lights were on the left or right side. Using only the dopamine-rich left hemisphere, Wolford et al.’s patients futilely attempted to derive an underlying pattern and ended up doing poorly. This strange left-hemispheric behavior reflects its intense need to predict and control future rewards along with a desire to find abstract patterns or relationships – all of which are manifestations of an orientation beyond immediate space and time. By contrast, the same patients using only their earthier right hemisphere used a simple strategy and merely pressed the more highly rewarded key, thereby performing much better in the end. In fact, the behavior of the isolated right hemisphere was more similar to that of birds and nonhuman mammals, whose overall brain dopamine content is much lower than that of humans. This need to predict and control external events paradoxically underlies scientific knowledge and superstitious behavior, both of which are products of the left hemisphere’s highly dopaminergic mind (Previc, 2006).
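The advantage of the right hemisphere’s simple “maximizing” strategy over the left hemisphere’s pattern-seeking can be made concrete with a little expected-value arithmetic: if one side lights up with probability p = 0.75, always choosing that side is correct 75 percent of the time, whereas “probability matching” (guessing each side at its underlying frequency, as pattern-seekers tend to do) is correct only p² + (1 − p)² = 62.5 percent of the time. The following minimal simulation is purely illustrative; the probability, trial count, and two-strategy framing are assumptions for this sketch, not the actual parameters of Wolford et al.’s experiment:

```python
import random

def run_trials(strategy, p=0.75, n=100_000, seed=42):
    """Guess which side a light will appear on over n trials.

    'maximize' always picks the likelier side (the simple strategy);
    'match' guesses each side at its underlying frequency
    (probability matching, the pattern-seeking style).
    Returns the fraction of correct guesses.
    """
    rng = random.Random(seed)
    correct = 0
    for _ in range(n):
        outcome = rng.random() < p          # True = likelier side lights up
        if strategy == "maximize":
            guess = True                    # always bet on the likelier side
        else:
            guess = rng.random() < p        # match the observed frequencies
        correct += guess == outcome
    return correct / n
```

With these assumed parameters, maximizing converges toward p (0.75) while matching converges toward p² + (1 − p)² (0.625), which is why the “simpler” strategy wins in the long run.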
Based on the preceding review, it can be concluded that every major behavior or trait associated with dopamine can either directly or indirectly be tied to a more primordial link between dopamine and distant space and time. Based on the neuroscientific literature, it may be predicted that highly dopaminergic minds are:
- above-average in intelligence, with a particularly impressive working memory and strategic ability;
- very achievement- (goal-) oriented, almost to the point of obsessiveness;
- always seeking new goals, especially incentive-laden ones such as money, fame, power, or idealistic achievements, yet becoming restless once those goals are achieved;
- very confident in their ability to control their destiny and dominate over other people, sometimes to the point of grandiosity and recklessness;
- more aggressive than compassionate or nurturing toward other people; and
- above-average in sexual desire, especially if male, but not necessarily overly hedonistic in other respects.
If a person’s high dopaminergic content were biased toward the lateral prefrontal dopaminergic systems, he or she would tend to be more analytic, controlled, and highly fixated on objects or ideas. If that person’s high dopaminergic content were concentrated in the ventromedial dopaminergic system, and especially its subcortical elements, he or she would tend to be more creative, active, animated, restless and, in the extreme, aggressive and delusional.
Do you know any such individuals with high dopamine contents? Do you know societies that are dominated by such individuals? If not, I will introduce them to you in Chapter 6, because their influence on the course of human history has been profound. Not only did many of these individuals produce huge discoveries, conquests, and achievements, but they almost invariably suffered from a dark side that wreaked havoc on close family members or, in some cases, entire populations. The other
4 Dopamine and mental health
4.1 The “hyperdopaminergic” syndrome
Despite the many positive dopaminergic traits described in Chapter 3, the dopaminergic story has another, darker side. Whereas too little dopaminergic transmission in disorders such as Parkinson’s disease and phenylketonuria is debilitating to motor and intellectual functioning, excessive dopamine activity in one or more brain systems has been implicated in an even larger number of prominent neuropsychological disorders – including attention-deficit disorder (also known as attention-deficit/hyperactivity disorder when accompanied by hyperactivity), autism, Huntington’s disease, mania (also known as hypomania and bipolar disorder when it alternates with depression), obsessive-compulsive disorder, schizophrenia, and Tourette’s syndrome. Other hyperdopaminergic disorders include substance abuse (highly associated with attention-deficit/hyperactivity disorder and bipolar disorder) and stuttering (linked to Tourette’s syndrome). All of the hyperdopaminergic disorders are closely related to one another in terms of their co-morbidities and symptoms, and more than one set of hyperdopaminergic symptoms is surprisingly often found in the same individual (e.g. autism with obsessive-compulsive and Tourette’s features; mania with schizophrenic-like psychosis and obsessive-compulsive behavior) or within families. The various hyperdopaminergic disorders are also highly amenable to the same pharmacological interventions (principally D2 receptor-blocking drugs). In fact, drugs that decrease dopamine levels either in the brain generally or in a specific system (e.g. the ventromedial one) are the main or secondary pharmacological treatment of choice in every hyperdopaminergic disorder except for Huntington’s disease.
Dopamine excess contributes to the motor symptoms (e.g. hyperactivity, tics, motor stereotypies), delusions and hallucinations, and social withdrawal found to varying degrees in the above disorders. Like the hypodopaminergic disorders, the hyperdopaminergic disorders also affect intellectual performance in a mostly negative way. However, it has
already been noted how disorders such as schizophrenia, mania, and less severe types of autism are often associated with intellectual genius in those afflicted, how milder versions of these disorders are even more associated with creative genius, and how superior intellects are more frequently found among first-degree unaffected relatives (Karlsson, 1974). As reviewed in the individual sections to follow, most of the hyperdopaminergic disorders have risen in prevalence during the past few decades and now arguably constitute collectively the greatest threat to mental health in the industrialized world (8–10 percent prevalence for attention-deficit/hyperactivity disorder; ~1–3 percent each for bipolar disorder, obsessive-compulsive disorder, schizophrenia, and Tourette’s syndrome; and >0.5 percent for autism).1 Finally, a male excess is found in these disorders, consistent with the link between testosterone and dopamine.
There are several models, embracing both genetic and nongenetic influences, of how dopamine can deleteriously affect mental health. Genetic influences have long been suspected to play an important role in the hyperdopaminergic disorders, based on the greater concordance rates for monozygotic (identical) twins relative to dizygotic (fraternal) twins and other siblings. (Concordance refers to the percentage of twin pairs sharing a disorder – e.g. a 60 percent concordance rate would mean that there is a 60 percent likelihood that one member of a twin pair has a particular trait if the other one does.) It is widely believed that a higher concordance rate for identical twins (which have the same genetic makeup) than for same-sexed fraternal twins (which derive from separate embryos and are no different from regular siblings in their genetic material held in common) implies a genetic influence in a given disorder. However, the fact that two-thirds of identical twins – but no dizygotic twins – also share the same chorion (placental blood supply) is a major problem for genetic estimates based on twin studies (see Prescott et al., 1999). Monochorionic twins have been shown, to varying degrees, to be more similar than dichorionic twins on a host of behavioral and physiological measures, including intelligence, birthweight, and risk for psychopathology (Davis et al., 1995; Melnick et al., 1978; Scherer, 2001). The effects of prenatal exposure to drugs, stress, and infection are all highly influenced by placental type (Gottlieb and Manchester, 1986; Sakai et al., 1991), and this chorionic-genetic
1 Although major depression is the single largest psychological disorder in the United States with a prevalence of ~15 percent, the combined lifetime prevalence of the hyperdopaminergicdisordersapproachesthatfigureandthechroniccostsmaybemuch greater.Forexample,adultswithautismsufferanumemploymentrateof70percentand ameanannualincomeforthoseemployedoflessthan$4,000(BelliniandPratt,2003).
- 1The “hyperdopaminergic” syndrome77
masquerade appears to be greatest when prenatal effects are strongest (Prescott et al., 1999), as in disorders such as autism andschizophrenia (Davis et al., 1995). Moreover, identical twins are much more similarly reared by parents than are fraternal twins (Mandler, 2001). Nevertheless,geneticheritabilityestimatesfromtwinstudiesgreaterthan 50percent,evenifinflated,pointtoatleastsomegeneticinfluenceina particular mental disorder. For example, it has been shown that alter- ationstokeydopaminegenessuchasdopamine-beta-hydroxylase–which converts dopamine to norepinephrine and whose deletion results in too muchdopamineandtoolittlenorepinephrine–createsagreaterlikelihood ofacquiringattention-deficit/hyperactivitydisorder,Tourette’ssyndrome, and other dopamine-related disorders (Comings et al., 1996). Other clinical studies have shown effects due to genetic disruption of dopa- mine receptor genes (e.g. polymorphisms) and dopamine transport genes.
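The concordance measure defined above can be made concrete with a short illustrative sketch; the data and the function name here are hypothetical, purely to show the arithmetic, and are not drawn from the twin literature:

```python
def pairwise_concordance(pairs):
    """Among twin pairs with at least one affected member, the fraction in
    which BOTH members are affected -- i.e., the likelihood that one twin
    has the trait given that the other one does (illustrative only).
    """
    affected = [p for p in pairs if p[0] or p[1]]   # pairs with >= 1 affected twin
    both = [p for p in affected if p[0] and p[1]]   # pairs with both twins affected
    return len(both) / len(affected) if affected else 0.0

# Hypothetical sample of 10 twin pairs: 6 concordant, 4 discordant.
sample = [(True, True)] * 6 + [(True, False)] * 4
print(pairwise_concordance(sample))  # 0.6, i.e. a "60 percent concordance rate"
```

In an actual twin study, a higher value of this statistic for monozygotic than for dizygotic pairs is what is conventionally taken as evidence of genetic influence, subject to the chorion-sharing caveat discussed in the text.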
However, there are well-documented prenatal and perinatal disturbances that also affect the risk for one or more of these disorders, including maternal drug use (autism, attention-deficit/hyperactivity disorder, bipolar disorder), maternal fever (autism, schizophrenia), and hypoxia at birth (attention-deficit/hyperactivity disorder, autism, schizophrenia). As discussed in Chapter 2, these effects in most cases are associated with elevated dopamine: e.g., immune reactions and associated fever require dopamine release to decrease temperature; known teratogenic agents like thalidomide, various stimulants, and anti-seizure medications increase dopaminergic transmission; and hypoxia stimulates dopaminergically mediated parasympathetic mechanisms to reduce oxygen consumption. Another example of the contribution of prenatal factors to the hyperdopaminergic disorders is deletion of the dopamine beta-hydroxylase gene in mothers, which increases dopamine relative to norepinephrine in the placental blood supply and increases the risk of autism in offspring even more than does the absence of that same gene in the offspring themselves (Robinson et al., 2001). Nongenetic/prenatal factors that contribute to dopamine elevation – including indirect ones such as demographic status and societal pressures (see later discussion) – are more suspect in disorders that recently have been on the rise, such as attention-deficit/hyperactivity disorder, autism, bipolar disorder (mania), and possibly obsessive-compulsive disorder and Tourette’s syndrome, since our genetic makeup has not substantially changed in the past few decades. It is especially difficult for genetic factors to explain the stable or rising incidences of disorders such as autism and schizophrenia, since afflicted individuals are unlikely to marry and pass on their genes because of their social inadequacies (see Shapiro and Hertzig, 1991).
In contrast to drugs that increase dopamine, drugs that increase serotonin and norepinephrine (the right-hemispheric neurotransmitters involved in emotion and peripersonal sensory processing) not only improve mood and relieve anxiety (Nelson et al., 2005) but are therapeutically beneficial against the hyperdopaminergic disorders. Conversely, individuals with chronically low serotonin and norepinephrine levels associated with an underlying trait anxiety (see Tucker and Williamson, 1984), or transient depletion of serotonin and noradrenaline due to sleep deprivation and other psychological stressors (see Previc, 2004), are more prone to develop hyperdopaminergic psychopathologies such as schizophrenia. It is important in this context to understand that both norepinephrine and serotonin – but especially the latter – have reciprocal, inhibitory interactions with dopamine, such that greater dopamine concentrations in the brain may produce less noradrenaline and serotonin, and vice versa. Indeed, the inhibitory action of serotonergic systems over dopaminergic ones is extremely well-documented (see Damsa et al., 2004; Previc, 2006, 2007) and is arguably the most significant neurochemical interaction in the entire brain from the standpoint of clinical neuropsychology. In concert with the notion of hyperdopaminergic disorders, there is also a “serotonergic dysfunction disorder” stemming from reduced serotonergic function (Petty et al., 1996). Although these two categories are not identical, there is a strong overlap between them. In fact, the use of dopamine-blocking drugs, serotonin-boosting drugs, or their combination is the preferred treatment in almost every major psychological disorder (e.g. Petty et al., 1996).
The previously described rise of dopamine during stress is, up to a certain point, beneficial. In terms of physiology, dopamine helps to dampen the arousal response by activating parasympathetic circuits, which tend to reduce heart rate, blood pressure, and oxygen consumption (see Chapter 2). In terms of behavior, dopamine helps us by promoting active coping with stress – whether that means stimulating escape behavior in a rat exposed to intermittent shock (Anisman and Zacharko, 1986), performing problem-solving on a battlefield (Previc, 2004), or merely helping us to cope with an uncertain socio-economic environment in which layoffs, divorces, emotional separations etc. are extremely common. Indeed, dopamine may often help to transform our underlying stress, such as a creative person’s internal tension, into an intense motivational drive required to achieve a goal and in so doing at least temporarily reduce the anxiety. Elevated dopamine levels, along with norepinephrine and serotonin, may in their stress-dampening roles be crucial ingredients of what is known as
the “hardy” or “tough” personality (see Previc, 2004). Dopamine, in particular, instills a belief in individuals that they, not fate, are in control of their destiny – i.e. the internal locus-of-control trait described in Chapter 3 (Declerck et al., 2006). If carried too far, however, active coping, high achievement motivation, excessive internal locus-of-control beliefs, and other such traits associated with dopaminergic overactivation can be extremely debilitating, both for the coping individual (e.g. delusions, social detachment, etc.) and for the mental health of their offspring. Autism, in particular, is a hyperdopaminergic disorder that is much more likely to be found in offspring of highly successful parents (Previc, 2007).
In the remainder of this chapter, I will briefly review the etiology and neural basis of the major dopaminergic disorders, with special reference to their pharmacological imbalances, genetic/environmental influences, developmental time-courses, and possible hemispheric asymmetries. Two hypodopaminergic disorders (Parkinson’s disease and phenylketonuria) will be reviewed, along with six clearly hyperdopaminergic disorders (autism, Huntington’s disease, obsessive-compulsive disorder, mania, schizophrenia, Tourette’s syndrome) and another disorder – attention-deficit/hyperactivity disorder – that may involve a selective overactivation of the ventromedial dopaminergic system in conjunction with reduced lateral prefrontal dopaminergic activity. Superficial dissimilarities among the hyperdopaminergic disorders will be accounted for on the basis of different genetic and developmental influences (e.g. autism reflects both genetic and early to mid-gestational influences, schizophrenia is influenced by genetic and mid-to-late prenatal effects, Huntington’s disease is the most genetically determined etc.) and by the brain region affected. Furthermore, subcortical dopaminergic systems are more affected in some disorders (e.g. autism, Tourette’s syndrome), whereas varying degrees of disturbance to specific cortical dopaminergic systems may be present in other disorders (e.g. mania).
- 4.2 Disorders involving primary dopamine dysfunction
- 4.2.1 Attention-deficit/hyperactivity disorder
Attention-deficit/hyperactivity disorder is the most prevalent of the learning disabilities, afflicting at least 8 percent of all children in the United States, with a 2.5-fold greater prevalence in males than females (Biederman and Faraone, 2005; Centers for Disease Control and Prevention, 2005). This disorder is characterized by boredom, distractibility, and impulsivity and is frequently associated with
childhood depression (Brumback, 1988). According to Papolas and Papolas (2002), attention-deficit/hyperactivity disorder is likely to be co-morbid in the majority of those diagnosed with bipolar disorder, whereas the rarer bipolar diagnosis can be applied to about 30 percent of those with attention-deficit/hyperactivity disorder (Geller et al., 2004). Attention-deficit/hyperactivity disorder is also highly associated (10–30 percent co-morbidity) with autism, obsessive-compulsive disorder, substance abuse, and Tourette’s syndrome (Geller et al., 2004; Gillberg and Billstedt, 2000; Kalbag and Levin, 2005; Stahlberg et al., 2004) and, to a lesser extent, with schizophrenia/psychosis (Geller et al., 2004). Attention-deficit/hyperactivity disorder is clearly on the rise, having increased almost three-fold between 1991 and 1998 alone (Robison et al., 2002).
The consensus of researchers is that attention-deficit/hyperactivity disorder is primarily caused by alterations in brain norepinephrine and dopamine (Biederman and Faraone, 2006; Pliszka, 2005), although it has long been an issue as to whether dopamine is elevated or deficient in this disorder (see Pliszka, 2005). In favor of the elevated dopamine hypothesis, attention-deficit/hyperactivity disorder has many features in common with highly co-morbid disorders such as autism, obsessive-compulsive disorder, and Tourette’s syndrome that are more definitively linked to excessive dopamine. Also, elevated dopamine in animals either leads to or is associated with hyperactivity, especially in animal models such as the spontaneous hypertensive rat and the Naples High Excitability rat (Pliszka, 2005; Viggiano et al., 2003; see Chapter 3). On the other hand, stimulants such as methylphenidate – a drug similar to but somewhat milder than amphetamine – that increase dopamine along with norepinephrine are the currently preferred treatment for attention-deficit/hyperactivity disorder, at least in childhood. One leading hypothesis is that the ventromedial dopamine systems are overactive in this disorder whereas the lateral prefrontal pathways that provide inhibitory, executive control to focus our intellectual drive are underactive (Viggiano et al., 2003). This hypothesis is consistent with evidence of dysregulated frontal-striatal circuits in attention-deficit/hyperactivity disorder (Biederman and Faraone, 2005) and with the high co-morbidities of this disorder with obsessive-compulsive disorder, substance abuse, and various impulse-control behavioral states in which the ventromedial dopaminergic system is particularly active (Galvan et al., 2007). It may also help to explain why, as the prefrontal association regions fully mature in adulthood, attention-deficit/hyperactivity disorder begins to wane (Faraone et al., 2006).
Attention-deficit/hyperactivity disorder is believed to have a mostly genetic etiology, with heritability coefficients ranging up to 80 percent (Biederman and Faraone, 2005). Most research to date has focused on various dopamine genes, including the D2, D4, and D5 receptor genes, the dopamine transport gene, and the dopamine-beta-hydroxylase gene (Biederman and Faraone, 2005; Comings et al., 1996; Li et al., 2006). However, attention-deficit/hyperactivity disorder has also been linked to a variety of prenatal and perinatal influences that are believed to elevate offspring dopamine levels, including maternal smoking, maternal psychosocial stress, and hypoxia at birth (Biederman and Faraone, 2005). A particularly revealing aspect of attention-deficit/hyperactivity disorder is its well-documented association with right-hemispheric dysfunction, particularly in the context of right-hemispheric deficits and childhood depression (e.g. Brumback, 1988; Heilman et al., 1991). This relationship is consistent with a variety of left-sided tactile and visuospatial deficits (known as “soft” neurological deficits) that mimic the attentional disturbances produced by actual damage to the right hemisphere. Right-hemispheric dysfunction would, of course, be expected to shift the balance of hemispheric activity toward the left hemisphere and its already predominant dopaminergic mode.
- 4.2.2 Autism

Autism and related disorders such as Asperger’s syndrome represent the fastest growing of all neurodevelopmental disorders (Previc, 2007). They are characterized by impaired social and emotional relationships and by stereotyped behaviors that take the form of rocking, whirling, head-banging etc. in extreme cases, and obsessive verbal behavior (such as repetitively focusing on trivia) in higher-functioning cases. These disorders are usually apparent by two to three years of age and often earlier. Autism exhibits the most extreme male bias (4:1) of all the major neuropsychological/neurodevelopmental disorders, and this bias is still greater when autism is accompanied by otherwise normal intellectual
functioning. Autism was reported in <0.05 percent of the population in industrialized countries as little as three decades ago, but its growth rate has been exponential and recent surveys now estimate its prevalence at slightly greater than 0.5 percent (Previc, 2007).2 In addition to its high
co-morbidity with attention-deficit/hyperactivity disorder, autism is also likely to be associated with Tourette’s syndrome in about 10–20 percent of high-functioning cases (Gillberg and Billstedt, 2000) and may even more frequently present with obsessive-compulsive symptoms. In fact, there are as many obsessions and compulsions in high-functioning autistics as there are in obsessive-compulsive patients themselves (Russell et al., 2005), leading some researchers to argue for a combined “obsessive-compulsive/autistic” syndrome (Gross-Isseroff et al., 2001). Autism is also co-morbid with bipolar disorder and schizophrenia in slightly less than 10 percent of cases (Stahlberg et al., 2004).
As reviewed by Previc (2007), the etiology and neural substrate of autism remain somewhat unclear, although a great deal of evidence is accumulating that excessive dopamine is a major correlate of autism. There have been a large number of brain areas implicated in autism, but the only consistent findings to date point to various subcortical abnormalities involving the brainstem and possibly the cerebellum and basal ganglia (Miller et al., 2005; Nayate et al., 2005). The involvement of the basal ganglia partly explains the over-focusing and attention to visual details, which along with an impressive fluid intelligence (Dawson et al., 2007; Hayashi et al., 2008) is consistent with an overactivation of the lateral dopaminergic (focal-extrapersonal) pathways in autism. Most of the affected regions – and the brainstem in particular – develop early in gestation, which is consonant with the predominantly first-trimester influence of prenatal teratogenic substances that increase the risk of autism, such as the sleep agent thalidomide and the anti-seizure medication valproic acid (Miller et al., 2005; Stromland et al., 1994). Prenatal exposure to valproic acid in particular is known to increase frontal cortical dopamine levels in a rodent model for autism (Nakasato et al., 2008). In terms of neurochemistry, most research into the causes of autism has focused on dopamine and serotonin. There is little doubt that stereotypical behaviors in autism are the result of dopaminergic overactivation (Ridley and Baker, 1982) and that drugs such as risperidone that block dopamine action are the most widely used and effective against autistic symptoms (Nagaraj et al., 2006; Volkmar, 2001), whereas drugs such as amphetamine that increase dopamine levels exacerbate autistic behavior (Volkmar, 2001).
Although many persons with autism have higher blood levels of serotonin, serotonin levels in the autistic brain actually tend to be reduced relative to normals (Previc, 2007). Excessive dopamine is known to impair social behavior in monkeys (Palit et al., 1997; Schlemmer et al., 1980), whereas serotonin reduces social anxiety and promotes social interaction (e.g. Young and Leyton, 2002). Moreover, an imbalance of dopaminergic over serotonergic activity is suggested
by subtle disturbances in the concept of personal space in autistics (Parsons et al., 2004). It is not clear whether the emotional and social deficits in autism are caused primarily by the dopaminergic or serotonergic abnormalities, but even the latter may be expressed indirectly through dopaminergic overactivation because of the previously described reciprocal inhibitory interactions involving dopamine and serotonin.
Although autism is reputedly one of the most genetic of all neuropsychological disorders, its risk has also been shown to be increased by a wide range of transient prenatal and perinatal factors that elevate dopamine, including maternal stress, illness, and drug use during pregnancy and hypoxia and cesarean delivery at birth (Previc, 2007). Chronic influences such as maternal age, intelligence, personality, work stress, coping styles, etc. may exert an even greater impact on prenatal dopamine levels. Autistic children are four times more likely to be born to highly educated mothers than to poorly educated ones (Croen et al., 2002), because the former presumably have higher dopamine levels associated with high achievement motivation levels and a greater ability to cope with the uncertainties and pressures of modern societies (Previc, 2007). Some genetic disorders in which autism is especially prevalent (e.g. tuberous sclerosis) are linked to genes that control dopaminergic transmission, but it may be the mother’s genes via their indirect effects on prenatal transmission that exert the greatest genetic influence over the offspring’s dopamine levels. For example, the risk for autism is doubled when there is a complete absence in mothers of the dopamine beta-hydroxylase gene (Robinson et al., 2001).
One final piece of evidence suggesting that an imbalance between dopamine and serotonin is critical to the etiology of autism is that autistic symptoms generally resemble those following right-hemispheric brain damage (McKelvey et al., 1995). Whereas the serotonin-rich right hemisphere is much better at emotional processing, empathy, social interactions, “theory of mind,” and processing global concepts rather than local details, the isolated dopamine-rich left hemisphere behaves much more like that of a mildly autistic person of normal or above-average intelligence (Ozonoff and Miller, 1996; Previc, 2007; Sabbagh, 1999). However, right-hemispheric damage alone cannot as easily simulate the extreme, presumably subcortically mediated stereotypical behaviors found in severe autism.
- 4.2.3 Huntington’s disease

Huntington’s disease is one of only a few neurological disorders traceable to a single gene, in this case the HD gene (see Purdon et al., 1996). This
disease was first described by George Huntington in 1872 and in North America is primarily found in descendants of a small group of Anglicans who immigrated to Long Island, New York, in 1642. Huntington’s disease is characterized by involuntary movements of the upper limbs and facial musculature, known as choreic or dancing movements (hence, the frequently used term Huntington’s chorea). Huntington’s disease is much rarer than the other hyperdopaminergic disorders reviewed here, with a prevalence of only five per 100,000, with most cases developing between thirty and forty-five years of age. It is not primarily a neurochemical disorder per se, as it involves neurodegeneration in the corpus striatum, beginning with the caudate nucleus and later extending into the putamen, globus pallidus, and subthalamic nucleus. Most of the striatal degeneration occurs in small neurons with spiny dendrites, which rely mostly on gamma-aminobutyric acid (GABA) transmission. While dopaminergic neurons and receptors (principally D1 and D2) are also damaged (Bäckman and Farde, 2001), the net result of disabling GABA transmission appears to be an overall increase in dopaminergic activity (Klawans, 1987). The choreic movements and subsequent psychotic symptoms associated with Huntington’s disease can be mimicked by dopaminergic drugs such as L-dopa and reduced by dopaminergic antagonists as well as drugs that increase GABA and acetylcholine, both of which inhibit dopamine in the striatum (Klawans, 1987; Purdon et al., 1996). It is less clear, however, whether cognitive symptoms in Huntington’s disease reflect excessive dopaminergic activity or hypodopaminergia (Bäckman and Farde, 2001), as both imbalances can disrupt cognitive performance (Bäckman et al., 2006).
Other choreas such as Sydenham’s chorea are nongenetic in origin and are brought about by fevers and other indirect catalysts of increased dopaminergic activity. In general, choreic symptoms overlap hyperdopaminergic motor symptoms found in attention-deficit/hyperactivity disorder, obsessive-compulsive disorder, and Tourette’s syndrome (Maia et al., 2005), especially in the early-onset phase. However, unlike in the other hyperdopaminergic disorders, there is no evidence for a male predominance in the choreiform disorders.
- 4.2.4 Mania (bipolar disorder)
Mania and its milder version (hypomania) are characterized by heightened verbal, motor, sexual, intellectual, and other goal-directed behavior. Mania is referred to as bipolar disorder when it fluctuates with depression. Cycling between mania and depression is known as Bipolar I disorder, cycling between less extreme hypomania and depression is
known as Bipolar II disorder, and long-term cycling of even less severe mood states is known as cyclothymia. Mania is considered part of seasonal affective disorder when it is triggered more in the summer months (when temperature and light levels are higher in moderate-to-extreme latitudes) and depression is triggered more in the winter months (when temperature and light levels are lower in those same latitudes). In addition to elevated motor activity, mania is also characterized by heightened cognitive activity ranging from increased creativity to delusions and other psychotic features (Goodwin and Jamison, 1990). Indeed, of all of the major mental disorders, mania and hypomania are most associated with heightened creativity and intelligence (Goodwin and Jamison, 1990), and bipolar disorder is suspected in many famous persons in history, as will be discussed in Chapter 6. Mania and schizophrenia are also the two disorders most convincingly associated with dopamine-mediated hyper-religiosity (Previc, 2006).
Bipolar I and II disorder together conservatively afflict 1–2 percent of the adult population (Bauer and Pfennig, 2005; Berrettini, 2000; Narrow et al., 2002), although estimates range as high as 6.5 percent with less restrictive diagnostic criteria. Although no overall gender differences exist in the prevalence of bipolar disorder, it is more likely to present as mania in males, especially before twenty-five years of age (Arnold, 2003; Kennedy et al., 2005). Bipolar disorder appears to be on the rise in adolescents (Narrow et al., 2002), with its onset occurring at ever-earlier ages, and it is associated with substance abuse in over 50 percent of cases (Papolas and Papolas, 2002). Mania and schizophrenia are linked both epidemiologically (Berrettini, 2000; Papolas and Papolas, 2002) and in terms of their symptoms, as they can be extremely difficult to tell apart in their acute phase and can in combination receive the diagnosis “schizoaffective.” Bipolar disorder also exhibits a fairly high co-morbidity with attention-deficit/hyperactivity disorder, obsessive-compulsive disorder and substance abuse, with estimates ranging from 15 percent to over 50 percent (Angst et al., 2005; Freeman et al., 2002; Geller et al., 2004; Goodwin and Jamison, 1990; Stahlberg et al., 2004). Bipolar disorder is believed to occur in close to 10 percent of autistic children and is frequently present in family members of autistic children (Papolas and Papolas, 2002; Stahlberg et al., 2004). It is also more frequently found in Tourette’s syndrome (5–10 percent) but to a lesser extent than in the other hyperdopaminergic disorders (Kerbeshian et al., 1995).
Bipolar disorder is believed to have a strong genetic component, with heritability estimates ranging from 65 to 85 percent (Bauer and Pfennig, 2005; Berrettini, 2000). Its largest genetic overlap appears to be with
schizophrenia, with several putative genetic loci shared in common (Berrettini, 2000). Prenatal and perinatal factors similar to those found in schizophrenia are also implicated in the incidence of bipolar disorder (Buka and Fan, 1999), including both direct ones (e.g. fetal alcohol exposure) and indirect ones (e.g. an excess of winter births) (Torrey et al., 1997). The excess of winter births can plausibly be explained by the higher levels of daylight and heat during the summer months – both of which stimulate dopamine – that coincide with the peak of the second trimester, in which prenatal influences on many mental disorders are greatest (Watson et al., 1999). There is a suggestion that increasing societal stress during adolescence may account for some of the putative rise in bipolar disorder in recent decades, but parental observations suggest predisposing traits are present in bipolar individuals even in infancy.
From a neural perspective, mania is best characterized as a neurochemical disorder caused by an elevation primarily of dopamine (responsible for the elevated goal-directed and other activity) and secondarily of norepinephrine (responsible for, in some cases, an elevated mood). Dopamine, in particular, has been implicated by leading theorists such as Depue and Iacono (1989), Goodwin and Jamison (1990), and Swerdlow and Koob (1987). This is largely based on the propensity of L-dopa and various dopamine agonists to produce hypomania in bipolar patients and the therapeutic benefits of dopamine antagonists and lithium, the leading treatment for bipolar disorder, which is believed to stabilize dopamine receptor activity (Goodwin and Jamison, 1990; Swerdlow and Koob, 1987). The stimulation of dopamine by light and high temperatures (Arbisi et al., 1994) helps to explain the cyclicity of bipolar symptoms (mania more common in summer) as well as the aforementioned greater prevalence of winter births. There is no specific anatomical locus for the production of mania, although functional imaging studies suggest that activation of the anterior cingulate gyrus and frontal lobes is most likely to accompany it (Adler et al., 2000; Blumberg et al., 2000). Mania is also more likely to occur during activation of the dopamine-rich left hemisphere (particularly in frontal areas), based on lesion studies involving either the left or right frontal lobes (Blumberg et al., 2000; Cummings, 1997; Cutting, 1990; Goodwin and Jamison, 1990; Joseph, 1999; Previc, 2006).
- 4.2.5 Obsessive-compulsive disorder

Obsessive-compulsive disorder is a relatively common major neuropsychological disorder, with a lifetime prevalence estimated at 2–3
percent (Angst et al., 2005). Obsessions take the form of recurrent and anxiety-producing thoughts while compulsions consist of repetitive and stereotyped behaviors (e.g. hand-washing) and mental acts (e.g. excessive praying) to ward off obsessions and other disturbing events. Obsessive-compulsive traits are positively associated with dopaminergic hyper-religiosity (Previc, 2006), but whether compulsions manifest themselves as religious or not depends to a great extent on the religiosity of the society as a whole. Obsessive-compulsive disorder is moderately to strongly linked to all of the other hyperdopaminergic disorders except Huntington’s disease. It has high to very high (>30 percent) co-morbidities
with autism, attention-deficit/hyperactivity disorder, bipolar disorder
(see earlier sections) and Tourette’s syndrome (Como et al., 2005; Faridi and Suchowersky, 2003), and it also has a substantial co-morbidity with schizophrenia at ~15 percent (Eisen and Rasmussen, 1993; Fabisch et al., 2001).
In contrast to the other hyperdopaminergic disorders reviewed in this chapter, females are slightly more likely to be afflicted by obsessive-compulsive disorder, although symptoms in males tend to develop sooner and are more chronic (Castle et al., 1995; Noshirvani et al., 1991). Males also tend to be generally better represented among those with obsessive-compulsive spectrum disorders, a loosely defined group of disorders that according to some theorists also includes body dysmorphic disorders (dissatisfaction with body appearance), trichotillomania (hair-pulling), and impulse-control disorders (e.g. pathological gambling and sexual addictions) (see Angst et al., 2005; McElroy et al., 1994). However, even in the obsessive-compulsive spectrum disorders, the male bias is not as prominent as in attention-deficit/hyperactivity disorder, autism, schizophrenia, and Tourette’s syndrome. As already reviewed, obsessive-compulsive disorder and autism bear an especially strong association with each other, and the relationship between obsessive-compulsive and bipolar disorder is also very strong. Some researchers even argue that impulse-control disorders, which are typically distinguished from classic obsessive-compulsive symptoms by less planning and anti-anxiety intent, are almost always found in mania (Moeller et al., 2001). Despite its rather substantial lifetime prevalence, there is surprisingly little firm evidence concerning the etiology of obsessive-compulsive disorder and whether its prevalence is changing, although it does appear to be more common than previously believed (Szechtman et al., 1999). While there appears to be a heritable component, especially in patients with overlapping Tourette’s syndrome, no specific genes have been identified, and there is also little evidence of a prenatal or birth influence on the incidence of this disorder.
There are no gross anatomical pathologies consistently found in obsessive-compulsive disorder, although there does appear to be overactivation in the ventromedial dopaminergic system, including the anterior cingulate and portions of ventromedial (orbital) frontal cortical regions (Adler et al., 2000; Rosenberg and Keshavan, 1998). In fact, removal of the anterior cingulate in a procedure known as a cingulotomy is the leading surgical technique used to control intractable obsessive-compulsive disorder. Prefrontal cortical regions, including the lateral prefrontal cortex, function fairly normally in obsessive-compulsive disorder (Abbruzzese et al., 1995), but they do not appear to provide sufficient inhibition over the subcortical and medial dopaminergic systems, which are primarily responsible for the debilitating perseverative and ritualistic behaviors.
As in schizophrenia, the average age of onset of obsessive-compulsive symptoms is in the early twenties, and obsessive-compulsive symptoms may be precipitated by stress and anxiety, which deplete serotonin and elevate dopamine in the mesolimbic areas (Finlay and Zigmond, 1997). The dopaminergic excess may be especially important in accounting for the broader obsessive-compulsive spectrum, including sexual, gambling, video-game, and other psychological addictions. For example, video-game playing stimulates striatal dopamine release (Koepp et al., 1998), while dopamine treatment for Parkinson’s disease increases pathological gambling (Dodd et al., 2005). The dopaminergic excess is also consistent with evidence for a left-hemispheric overactivation in obsessive-compulsive disorder (Otto, 1992), as brain imaging studies have shown that D2 receptor binding – indicating the amount of receptors not already occupied by dopamine – is significantly reduced in the left hemisphere of obsessive-compulsive patients relative to controls (Denys et al., 2004). As in other hyperdopaminergic disorders, the excessive behavioral activity in obsessive-compulsive disorder is best blocked by a combination of anti-dopaminergic drugs and serotonin re-uptake blockers (Carpenter et al., 1996; Szechtman et al., 1999), which may be most effective in reducing the specific left-hemispheric activation (Benkelfat et al., 1990).
Parkinson's disease, first described by James Parkinson in 1817, is a neurological disorder in which degeneration of nigrostriatal dopaminergic neurons leads to a dramatic reduction in dopamine levels in the striatum (Litvan, 1996). Behaviorally, Parkinson's disease is characterized primarily by a progressive loss of voluntary motor functions along with tremors and secondarily by a set of intellectual deficits initially confined to executive-type ones (Previc, 1999). Parkinson's disease is one of the most common neurological disorders, with a prevalence of ~1–2 per 1,000 and ten times that in those over sixty-five. Parkinson's disease is believed to be mainly caused by the malfunction of Parkin and various other neuroprotective genes along with exposure to environmental toxins (Corti et al., 2005). Although dopamine loss is a cardinal feature of aging human brains in general (Bäckman et al., 2006), the selective dopamine reduction is much more severe in Parkinson's disease.
The selectivity of this dopamine loss has led researchers to use it as a model for understanding the role of striatal/lateral-frontal dopamine in motor behavior and cognition. The deficits in voluntary motor behavior extend to all types of movements but, as noted earlier, they do not include automatic or reflexive behaviors. The cognitive impairments initially appear restricted mainly to executive intelligence, including deficits in:
- maintaining and operating on items in working memory (as opposed to short-term memory per se, since digit span is not impaired);
- shifting conceptual sets, as required by the Wisconsin Card-Sorting Test described in Chapter 3;
- sequential ordering of responses.
The motor deficits are more severe in upper space, as Parkinson's patients have trouble making upward eye movements and display downward errors in their arm trajectories in the absence of vision (see Corin et al., 1972; Poizner and Kegl, 1993). The upward deficits are, of course, consistent with the role of dopamine in mediating interactions in the most distal portions of 3-D space (Previc, 1998). It should be noted that the symptoms of Parkinson's disease initially appear on the left side of the body (controlled by the right hemisphere) in most patients (Previc, 1991), which is presumably due to the fact that dopamine levels are lower to begin with in the right hemisphere of most individuals.
The principal treatment for Parkinson's disease is the administration of drugs that increase dopamine output, mainly the dopamine precursor L-dopa. But, while L-dopa obviously helps to ameliorate or at least retard the progression of classic Parkinson's deficits involving motor initiation and sequencing and working memory, it also leads to motor problems (dyskinesias), affective disturbances (e.g. depression) and even psychotic (hallucinatory) behavior in many Parkinson's patients (Jankovic, 2002). The dyskinesias produced by L-dopa can resemble the choreas and dystonias found in the hyperdopaminergic state associated with Huntington's disease. Disturbances of sleep and of cardiovascular and temperature control (increased sweating) after L-dopa therapy (Quadri et al., 2000) are consistent with the mostly parasympathetic actions of dopamine, as reviewed in Chapter 2.
Phenylketonuria is a disorder involving an absence or mutation of a single gene (PAH) responsible for production of the enzyme phenylalanine hydroxylase, which converts phenylalanine to tyrosine (Kenealy, 1996). In the absence of a normal PAH gene, excessive phenylalanine and diminished tyrosine occur and combine to produce severe mental retardation. The major cause of the subnormal intelligence is the excessive phenylalanine, which disrupts basic brain and skeletal development, including the formation of myelin. Studies have shown a decrease of one-half standard deviation in intelligence scores for each 300 μmol/L of phenylalanine in the blood, with levels >1,200 μmol/L producing severe retardation (Burgard, 2000).
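The dose–response relation stated above can be made concrete with a minimal sketch. This assumes the reported relation is linear across the stated range and uses the conventional 15 IQ points per standard deviation; the function names are illustrative, not from the text.

```python
def iq_decrement_sd(phe_umol_per_l: float) -> float:
    """Illustrative only: 0.5 SD of intelligence lost per 300 umol/L of
    blood phenylalanine, assuming the linear relation reported by
    Burgard (2000) holds across the range considered."""
    return 0.5 * phe_umol_per_l / 300.0

def iq_point_loss(phe_umol_per_l: float) -> float:
    """Convert the SD decrement to IQ points (15 points per SD)."""
    return iq_decrement_sd(phe_umol_per_l) * 15.0

# At the >1,200 umol/L threshold cited for severe retardation:
# 0.5 * 1200 / 300 = 2.0 SDs, i.e. roughly 30 IQ points.
```

On this reading, the >1,200 μmol/L threshold for severe retardation corresponds to a deficit of about two standard deviations.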
Although abnormalities in the PAH gene itself are not rare, afflicting ~2 percent of the population, the recessive nature of the gene tends to diminish the actual incidence of the phenotype to about 1 in 10,000 among Caucasians. Because high concentrations of phenylalanine result in detectable increases in its metabolite, phenylpyruvic acid, newborn screening for the disorder is straightforward, and untreated phenylketonuria is now quite rare in the developed world. The accepted treatment for phenylketonuria is dietary restriction of phenylalanine in early childhood, and tyrosine supplements serve to compensate for the failure to convert phenylalanine to tyrosine. Restricted phenylalanine alone does not prevent all intellectual deficits – especially those associated with executive functions like working memory and cognitive-shifting – but neither does supplemental tyrosine alone prevent the majority of mental retardation. It has also been shown that phenylketonuria during pregnancy (known as "maternal phenylketonuria") produces intellectual deficits in offspring not even genetically prone to this disorder (Hanley et al., 1996; Kenealy, 1996).
The detrimental lack of tyrosine during early development is consistent with the importance of tyrosine to the synthesis of dopamine and norepinephrine. In particular, the executive deficits in phenylketonuria have been attributed to dysfunction of the lateral prefrontal dopaminergic systems (Diamond et al., 1997; Welsh et al., 1990), presumably resulting from the decreased tyrosine. This is consistent with the effects of tyrosine restriction during pregnancy on dopamine levels and behavior in animals (Santana et al., 1994) and with the deleterious effect of maternal iodine restriction and hypothyroidism on the conversion of tyrosine to dopa and on offspring intelligence in humans (see Previc, 2002). Because the altered neurochemistry in phenylketonuria is less specific for dopamine than is the nigrostriatal degeneration in Parkinson's disease, many of the motor and other deficits found in the latter disease are not manifested in phenylketonuria, although a tendency toward hyperthermia may be present in both disorders (Blatteis et al., 1974; Quadri et al., 2000).
Although it afflicts only about 1 percent of the population, schizophrenia is the best-studied of all neuropsychological disorders, partly because of the bizarre thought patterns and delusions associated with it and the fact that so many great minds have at least temporarily succumbed to it or similar psychoses, including the physicists Newton and Faraday (see Karlsson, 1974), the playwright August Strindberg, the novelist Franz Kafka, the poet Ezra Pound, and the Nobel laureate and mathematician, John Nash. Recent meta-analyses have determined that there is an overall male excess in schizophrenia of ~40 percent (Aleman et al., 2003), but early-onset schizophrenia (the most severe form) is about twice as likely to occur in males as in females. By contrast, the later-onset, milder version is more likely to occur in females in middle age as the inhibitory influence of estrogen over dopamine begins to wane with approaching menopause (Castle, 2000). As previously reviewed, schizophrenia is strongly associated with obsessive-compulsive disorder and mania, and its psychotic symptoms are similar to those found in Huntington's disease. Schizophrenia also bears a smaller but still greater-than-expected co-morbidity with attention-deficit/hyperactivity disorder and autism and probably Tourette's syndrome (Muller et al., 2002), although little formal data exist concerning the last connection. The major diagnostic signs in schizophrenia are categorized as either "positive symptoms" (e.g. hallucinations, delusions, thought disorder) or "negative symptoms" (affective disturbances such as poor social interaction, depressed mood, and anhedonia, the loss of pleasurable sensations). The "split" conveyed by the name schizophrenia does not refer to a divided self, as is often popularly conveyed, but rather to an inner world divorced from external reality. Some of the specific thought deficits in schizophrenia include:
- insensitivity to situational context/feedback, reflected in the lack of effects of previous reward contingencies (e.g. latent inhibition);
- remote (loosened) associations between seemingly unrelated stimuli; and
- a reduced ability to understand the intent of others.
All of these symptoms reflect to varying degrees a hyperdopaminergic state comprised of insufficiently grounded and controlled mental activity (Previc, 2006; Swerdlow and Koob, 1987; Weiner, 2003). While certain features of schizophrenia are also similar to those of autism (deficits in global processing and theory of mind), mania (delusions), and Huntington's disease (psychosis), these and other hyperdopaminergic disorders are also characterized by increases and/or abnormalities in motor output (e.g. stereotypical movements), whereas disorganized thoughts are more salient than aberrant motor behavior in schizophrenia, at least early on in the illness. In the acute phase, schizophrenia cannot easily be distinguished from mania, but the latter is more transient and is not associated with the negative schizophrenic symptoms, which tend to develop more chronically. Without neurological testing, schizophrenic psychosis is also difficult to distinguish from the psychosis found in temporal-lobe epilepsy, in which dopamine is elevated (Previc, 2006). Similarly, milder delusional tendencies and unusual sensory experiences are found in a normal personality variant known as "schizotypy," which is often associated with mild neurological abnormalities of the temporal lobe (Dinn et al., 2002). Psychosis is also found in 10–20 percent of hyperthyroid patients (Benvenga et al., 2003), which is interesting in view of the evolutionary link between thyroid output and dopamine that will be discussed in Chapter 5.
Prominent thought disorders in schizophrenics include paranoia and delusions of control, ranging from the "Messiah complex" to a belief that one is being controlled by external forces such as aliens. Religious themes also figure prominently among the schizophrenic delusions, but these are replaced by nonreligious (e.g. sexual or grandiosity) themes in less religious societies (Previc, 2006). Schizophrenic delusions are often cosmic in nature, as evidenced in the following description by Jaspers (1964: 295):
The cosmic experience is characteristic of schizophrenic experience. The end of the world is here, the twilight of the gods. A mighty revolution is at hand in which the patient plays a major role. He is the center of all that is coming to pass. He has immense tasks to perform, of vast powers. Fabulous distant influences, attractions, and obstructions are at work. "Everything" is always involved: all the peoples of the earth, all men, all the gods, etc. The whole of human history is experienced at once. The patient lives through infinite millennia. The instant is an eternity to him. He sweeps through space with immense speed, to conduct mighty battles; he walks safely by the abyss.
The schizophrenic emphasis on religious and cosmic themes reflects a fundamental disturbance in their interactions with 3-D space and in the systems that deal with 3-D space (see Previc, 1998, 2006). Relative to normals, schizophrenics tend to show a bias toward extrapersonal space and a deficit in peripersonal spatial operations. This extrapersonal bias is reflected in numerous ways, including:
- a flattened 3-D appearance of the world, which indicates a lack of depth perception ordinarily provided by our peripersonal system;
- an upper-field predominance of visual hallucinations;
- deficits in pursuit tracking of objects, which is used mostly in peripersonal space;
- deficits in body imaging and awareness;
- loss of prosody, emotional perception, and other functions that rely on our body-arousal system; and
- reduced sensitivity to bodily signals, such as from the vocal musculature, which leads to the erroneous attribution of self-generated internal thoughts to external voices (Bick and Kinsbourne, 1987).
Even the loosened, remote thought associations characteristic of schizophrenics may be a manifestation of a more fundamental tendency to connect spatiotemporally distant stimuli (Previc, 2006).
The emphasis on extrapersonal themes is consistent with the functional neuroanatomical profile in schizophrenia. Schizophrenia is generally believed to reflect overactivity in the ventral cortical and limbic pathways – particularly the medial-prefrontal and medial-temporal areas and associated subcortical regions comprising the action-extrapersonal system – along with diminution of parietal and occipital inputs (see Previc, 2006). The hallucinations, delusions, and other positive symptoms of schizophrenia emanate primarily from activity in the medial dopaminergic pathways, while the negative symptoms tend to reflect reduced activity in the parietal lobe and other posterior areas (Buchanan et al., 1990). However, even positive symptoms such as hallucinations depend to some extent on diminished posterior sensory inputs. For example, all of us tend to hear or imagine things in our brains, but the discrimination of internally generated auditory and visual signals from external reality would be more difficult if we could not feel ourselves talk or if we did not receive normal auditory and visual signals from the environment.
Although some models of schizophrenia posit that medial-temporal inputs are reduced relative to overactive medial-frontal structures (e.g. Weiner, 2003), overactivation of the temporal lobes is more likely, as suggested by the aforementioned resemblance of schizophrenia to temporal-lobe epilepsy, in which the medial temporal lobe is hyperexcitable (Sachdev, 1998). As noted in Chapter 3, lateral prefrontal regions may be less active relative to medial subcortical circuits (Abi-Dargham and Moore, 2003; Bunney and Bunney, 2000; Davidson and Heinrichs, 2003), thereby preventing the latter's ego-control mechanisms from overcoming the chaotic thoughts inspired by medial dopaminergic activity. Overactivation of the left temporal lobe in schizophrenia has also been well-documented (Cutting, 1990; Gruzelier, 1999; Gur and Chin, 1999; Rotenberg, 1994), consistent with the general role of the dopamine-rich left hemisphere in delusions and hallucinations and with its exaggerated emphasis on extrapersonal space (Previc, 1998, 2006). Few if any of these functional imbalances in brain activity directly result from neuroanatomical pathology; indeed, aside from a possible reduction of volume in the temporal lobe, there is little evidence that actual neuroanatomical damage contributes to or even correlates with schizophrenia (Chua and McKenna, 1995). Hence, the consensus of researchers is that the schizophrenic behavioral syndrome is mostly a consequence of functional changes in specific neurochemical circuits, principally those involving dopamine.
The theory of dopamine overactivation in schizophrenia is one of the longest standing and most widely accepted in neuropsychology (see Kapur, 2003; Swerdlow and Koob, 1987). Dopaminergic elevation can account for every positive symptom of schizophrenia, ranging from the loosened thought associations to the saccadic intrusions to the delusions and hallucinations and even the bias toward distant space. Drugs that elevate dopamine like L-dopa and amphetamine create or worsen psychotic symptoms, whereas drugs that block dopamine activity (mostly D2 receptors) have long been the treatment of choice in this disorder. Other neurochemical systems, especially serotonin and glutamate, are also implicated in schizophrenia (Vollenweider and Geyer, 2001). Both of these systems inhibit dopamine and, when blocked, are believed to contribute to hallucinations and other positive symptoms (see Previc, 2006). Their involvement in schizophrenia is attested to by the greater therapeutic effectiveness of newer "atypical" antipsychotic drugs such as clozapine, which not only block dopamine D4 receptors but also affect a variety of other neurochemical systems. As noted in earlier chapters, serotonin appears to be mostly involved in bodily arousal and peripersonal functions, so its reduction would also be expected to tip the balance toward extrapersonal activity (Previc, 1998, 2006).
Like most other neurodevelopmental disorders involving dopamine, schizophrenia is caused by a combination of genetic, prenatal, perinatal, and postnatal influences (Lewis and Levitt, 2002). The concordance among identical twins is about 50 percent, which suggests a considerable genetic influence, but no single gene or genetic factor has consistently been shown to be linked to schizophrenia (Lewis and Levitt, 2002). Along with hypoxia at birth, maternal infection/fever and malnutrition – especially in the second trimester of pregnancy – represent two of the best-documented prenatal factors (Watson et al., 1999), with maternal infection leading to as much as a seven-fold increase in the risk of schizophrenia (Brown, 2000). All of these maternal factors elevate brain dopamine, and the delayed prenatal influence (second trimester and beyond, consistent with a well-established excess of winter births; Torrey et al., 1997) suggests that schizophrenia, even more than autism, may involve dopaminergic abnormalities in the later-developing cerebral cortex. Postnatal psychosocial and other stressors that deplete the brain of norepinephrine and serotonin and thereby shift the neurochemical balance further in favor of dopamine are believed to help precipitate most cases of schizophrenia. For example, thermal stress, which leads to elevated dopamine levels (see Previc, 1999), can exacerbate schizophrenia (Hare and Walter, 1978) as well as mania and Tourette's syndrome (Lombroso et al., 1991; Myers and Davies, 1978). However, although various postnatal stressors may serve as catalysts, the fundamental predisposition to schizophrenia clearly arises from genetic as well as early neurodevelopmental influences during the prenatal and perinatal periods (Lewis and Levitt, 2002).
Tourette's syndrome is a chronic neurodevelopmental disorder characterized by motor tics involving mainly the orofacial region. These tics consist of shoulder shrugging, grimacing, blinking, grunting, and even more complex vocal behavior such as echolalia (repeating others' words) or coprolalia (socially inappropriate verbal utterances) (Faridi and Suchowersky, 2003). Tourette's syndrome is associated with poor academic achievement and social adjustment, but it is not nearly as debilitating as other hyperdopaminergic disorders such as autism and schizophrenia. Tourette's syndrome afflicts up to 2 percent of the population (Faridi and Suchowersky, 2003; Robertson, 2003), which is much more common than once believed, and it has a moderate male bias (at least 1.5:1). As reviewed earlier, Tourette's syndrome has extremely high co-morbidities with attention-deficit/hyperactivity disorder (estimates range from 8 percent to 80 percent), autism (up to 80 percent of autistic persons have tics, according to Gillberg and Billstedt, 2000), and obsessive-compulsive disorder (50 percent of Tourette's patients have obsessive-compulsive symptoms, according to Como et al., 2005). Tourette's syndrome exhibits a smaller but still greater-than-expected association with mania and schizophrenia (<10 percent).
Finally, Tourette’s syndrome is closely related to stuttering, another
hyperdopaminergic disorder characterized by vocal dystonias and a male predominance (Abwender et al., 1998). For example, stuttering is accompanied by motor tics in about 50 percent of cases, and it has a similarly high co-morbidity with obsessive-compulsive disorder (Abwender et al., 1998).
Tourette's syndrome has strong familial links and is generally viewed as having a substantial genetic component, with a concordance rate of >50 percent for identical twins (Faridi and Suchowersky, 2003). Genes that regulate dopamine (such as dopamine beta-hydroxylase) have been implicated (Comings et al., 1996), although no particular gene has been identified as critical, and so a polygenic influence is suspected (Faridi and Suchowersky, 2003; Nomura and Segawa, 2003). Prenatal factors may also be involved, as suggested by greater maternal than paternal transmission (Faridi and Suchowersky, 2003), but they do not seem to be as influential as in autism and schizophrenia. While tic expression in Tourette's patients can be exacerbated by both physical and psychosocial stress, postnatal/environmental factors also do not seem to play as important an etiological role as in mania and schizophrenia.
Tourette's syndrome is generally viewed as a disorder of the dopamine-rich basal ganglia and the prefrontal, orbitofrontal, and limbic cortical areas. There are no striking neuroanatomical abnormalities as in Huntington's disease, although volume changes in the basal ganglia have occasionally been reported. Although some dopamine-rich brain areas, particularly those in the basal ganglia, may be underactive in Tourette's patients, these areas may paradoxically be overactivated when actual symptoms such as tics are exhibited (see Nomura and Segawa, 2003). The transient dopaminergic overactivation may arise from a supersensitivity of dopaminergic receptors, possibly caused by chronically low striatal dopamine levels (Nomura and Segawa, 2003). The leading treatment for Tourette's syndrome consists of dopamine-blocking drugs such as the typical or atypical neuroleptics (Faridi and Suchowersky, 2003), although other transmitter systems that indirectly affect dopamine levels have also been the subject of treatments. Dopamine agonists at low doses may be of some benefit (Nomura and Segawa, 2003), although amphetamine and similar dopamine-activating drugs tend to increase tic frequency and severity at higher dosages.
The preceding review of nine dopamine-related disorders illustrates a very different set of symptoms in disorders that elevate dopamine versus those that reduce it. In the ones in which dopamine is overactive, increased motor and/or mental activity is present, whereas slower motor and/or mental activity is found in phenylketonuria and Parkinson's disease. The hyperdopaminergic disorders tend to show strikingly high co-morbidities in most cases, except for Huntington's disease, which is caused by a single-gene mutation. These disorders tend not to be causally related to any overriding neuroanatomical pathology, again with the exception of Huntington's disease. Rather, they
- mostly reflect serotonergic underactivation versus dopaminergic overactivation in the ventral cortical areas and in the already dopamine-rich left hemisphere;
- are triggered by stress or anxiety, which is known to increase activity in the ventromedial dopaminergic pathways (see Chapter 2); and
- are preferentially treated by a regimen that involves serotonin boosters or dopamine blockers.
The hyperdopaminergic disorders mostly exhibit a mild to strong male prevalence, at least in early-onset cases, and their overall prevalence is either definitely or possibly rising except in schizophrenia. Consequently, it may be appropriate to think of the hyperdopaminergic disorders not as separate syndromes but rather as overlapping symptom sets with the same underlying neurochemical imbalances.
As Table 4.1 indicates, the hyperdopaminergic disorders are hardly monolithic in either their symptoms or etiology. Whereas Huntington's disease is a single-gene disorder and five others (autism, attention-deficit/hyperactivity disorder, bipolar disorder, schizophrenia, and Tourette's syndrome) have varying degrees of genetic inheritance associated with them, four of those with genetic etiologies also show substantial prenatal/perinatal inheritance. And, at least two of the later-onset disorders – schizophrenia and obsessive-compulsive disorder – may be
Table 4.1 Features of the major hyperdopaminergic disorders.
| Mania (bipolar) | ↑ | ↑ | ? | + | earlier onset, <2:1 | + | * | ↑ |
+ for LH indicates predominant symptoms are those of the left hemisphere.
+ for hyperkinetic indicates this symptom is present.
* indicates mild prenatal influence.
** indicates strong prenatal influence.
↑ indicates elevation or increase.
x indicates no change.
? indicates association meriting further research. Mild male bias refers to the ratio of males to females.
Table 4.2 Co-morbidity of the major hyperdopaminergic disorders.
Schizophrenia – *
* represents mild or familial linkages only;
** represents substantial co-morbidity of 10–30 percent;
*** represents strong co-morbidity of over 30 percent.
precipitated by underlying trait anxiety or stress that depletes the brain of serotonin and norepinephrine. Some of the disorders (specifically, Huntington's disease and Tourette's syndrome) result from subcortical dopaminergic activation, one (autism) bears some similarities to the over-focusing associated with prefrontal dopaminergic activation, and two others (schizophrenia and attention-deficit/hyperactivity disorder) arguably result from an excess of ventromedial dopaminergic activity relative to lateral prefrontal dopaminergic activity. In two other disorders – obsessive-compulsive disorder and mania/hypomania – overactivation of the ventromedial pathways may co-exist with normal or even elevated lateral prefrontal activity, resulting in heightened behavioral output involving planning and strategy. Another way of viewing the hyperdopaminergic disorders is that they lie on a continuum ranging from the impulsive (e.g. attention-deficit/hyperactivity disorder, mania, some addictions, and schizophrenia) to the compulsive (e.g. autism, obsessive-compulsive disorder). In the former disorders, the medial dopaminergic drive appears greater than the lateral dopaminergic inhibition, resulting in impulsive, disordered but also creative thoughts, while in the latter disorders the lateral and other prefrontal dopaminergic inhibition is relatively intact or even elevated, resulting in over-focusing of behavior and stereotypy.
It is interesting to note that of the various hyperdopaminergic disorders, obsessive-compulsive disorder is the most interrelated, exhibiting substantial or strong co-morbidity with every other hyperdopaminergic disorder except Huntington's disease (Table 4.2). Its high co-morbidities
are not surprising in that obsessive-compulsive symptoms most clearly epitomize dopaminergic overactivation – heightened motor or mental activity, goal-directed behavior, concern about spatially and temporally distant events (e.g. the future), and attempts to control the environment in elaborate and extreme ways to ward off emotional or other types of aversive consequences, sometimes to the point of bodily neglect. Furthermore, both the lateral and ventromedial dopaminergic systems are either active or overactive in obsessive-compulsive disorder, and an excess of left-hemispheric activity occurs in it as in almost all of the other hyperdopaminergic disorders. And, while obsessive-compulsive disorder itself shows only a minor gender difference, many obsessive-compulsive spectrum disorders such as gambling, sexual addiction, and excessive video-game use are much greater in males, in line with the general male predominance in the hyperdopaminergic disorders.
Perhaps the most troubling aspect of the hyperdopaminergic disorders is that, except for Huntington's disease and schizophrenia, they are either definitely or possibly rising, which cannot be reconciled with the notion that they are mostly genetically determined. Because prenatal factors clearly influence the risk for developing most hyperdopaminergic disorders, the increasing prevalence of these disorders in modern industrialized nations may more properly be ascribed to demographic changes, deleterious environmental exposures, and/or psychological pressures in modern society that have pushed maternal dopamine levels to dangerously high levels (Previc, 2007), which I will further expound upon in Chapter 7.
5 Evolution of the dopaminergic mind
It is customary, albeit limiting, to view human brain evolution in terms of the events leading up to genetically and anatomically modern humans, now believed to have emerged approximately 200,000 years ago. All subsequent changes in humans are attributed to the effects of "culture," and human "history" is relegated to even more modern events beginning with the formation of agricultural societies. If, however, the expansion of dopamine was not due mainly to genetically mediated changes in our neuroanatomy but rather to epigenetic changes in our neurochemistry, then the physical brain evolution of modern humans has continued all the way to the present. This chapter will focus on the evolution of the dopaminergic mind leading up to, and including, the cultural explosion in Homo sapiens that has been termed the "Big Bang" (Mithen, 1996), which occurred first in Southern Africa between 70,000 and 80,000 years ago and later appeared in Europe around 40,000–50,000 years ago, while Chapter 6 will focus on the changes in the dopaminergic mind since the dawn of history. I will highlight two major events in human evolution – the evolution of the "protodopaminergic" mind beginning around two million years ago and the emergence of the later dopaminergic mind with its distinctly human intellectual abilities less than 100,000 years ago. First, however, it is necessary to describe further the contribution of epigenetic influences to inheritance, given the crucial role they appear to have played in our intellectual evolution.
5.1 The importance of epigenetic inheritance
As discussed in Chapter 1, there are strong reasons to believe that the evolution of human intelligence did not depend on changes in brain size or on changes in the genome, at least in the later stages, but rather on an expansion of dopaminergic systems in the brain. What, then, caused the dopaminergic expansion that led to the modern human mind? As alluded to in Chapter 1 and detailed in an earlier theory (Previc, 1999), the evolution of the dopaminergic mind depended on physiological influences and dietary changes that together led to increased dopamine in the brain. Whereas the genome was once believed to almost exclusively determine our inheritance, it is now widely accepted that epigenetic influences, especially those occurring in the womb, affect and sometimes even override gene expression at all levels and thereby modify brain development (Gottlieb, 1998; Harper, 2005; Keller, 2000; Lickliter and Honeycutt, 2003; Nathanielsz, 1999; Petronis, 2001). The maternal environment affects immune function, heart disease, diabetes, and cancer risk (Nathanielsz, 1999; Petronis, 2001) and even gross physical appearance in some cases (Gottlieb, 1998); but, most relevant to this thesis, it has been shown to have an especially powerful influence on brain development and behavior, and may even be considered a source of speciation (Lickliter and Honeycutt, 2003). What is critical from the standpoint of human brain evolution is that maternal effects are trans-generational in that, for example, a mother with high dopamine levels can prenatally pass those levels on to her children and, in turn, they to their children (Harper, 2005; Lickliter and Honeycutt, 2003; Nathanielsz, 1999).
Three of the most important examples of epigenetic influences on brain development are the effects of prenatal iodine-deficiency (Boyages and Halpern, 1993; DeLange, 2000), the effects of poorly controlled phenylalanine and tyrosine levels in phenylketonuria (Hanley et al., 1996), and the effects of high maternal dopamine levels on the increased risk for autism (Previc, 2007). Prenatal iodine-deficiency syndrome is more likely to affect brain development than skeletal growth and is estimated to place over one billion humans at risk worldwide for reduced intelligence (DeLange, 2000). In autism and phenylketonuria, the maternal risk involves both genetic and nongenetic factors, but even the genetic risk is partly manifested in an aberrant prenatal neurochemical environment. For example, double allelic deletion of the gene responsible for creating the enzyme dopamine beta-hydroxylase, which converts the neurotransmitter dopamine to norepinephrine, should theoretically produce an individual with no norepinephrine and high levels of dopamine, which not only would disturb mental health (see Chapter 4) but would also be catastrophic physiologically, since norepinephrine is the key neurotransmitter maintaining normal sympathetic cardiac output. Even in this case, however, normal maternal norepinephrine levels can overcome the offspring's own lack of norepinephrine (Thomas et al., 1995), whereas conversely an abnormally high maternal dopamine-to-norepinephrine ratio can create excessive dopamine levels even in offspring whose own dopamine beta-hydroxylase gene is normal (Robinson et al., 2001).
Dopaminergic systems in humans are highly susceptible to a myriad of prenatal influences, based on direct and indirect manipulations of maternal dopamine levels during pregnancy. Many studies have shown that maternal ingestion of tyrosine (the precursor to dopa and dopamine), cocaine (which blocks the re-uptake of dopamine), amphetamine (which both stimulates dopamine release and blocks dopamine re-uptake), and haloperidol (which blocks the action of dopamine postnatally) all affect postnatal dopamine levels and various dopamine-mediated behaviors in offspring (e.g. Archer and Fredriksson, 1992; Santana et al., 1994; Zhang et al., 1996). The general finding from these and other studies is that stimulation of maternal dopamine systems prenatally results in enhanced postnatal dopaminergic activity (Santana et al., 1994), whereas reduced maternal dopamine activity diminishes postnatal dopaminergic activity (Zhang et al., 1996).
As noted in Chapter 4, the importance of the prenatal environment to brain development has even challenged the basic assumptions of behavioral genetics. It is assumed that monozygotic and dizygotic twins share the same prenatal environment and that the difference between their concordance rates is genetic, whereas a difference in concordance rates for dizygotic twins versus regular siblings is caused by the greater shared prenatal and/or postnatal environments of the former. Using these methods, the genetic influence has been estimated at ~50 percent in the case of intelligence (Dickens and Flynn, 2001) and even higher in some of the major psychological disorders like attention-deficit/hyperactivity disorder, autism, bipolar disorder, and schizophrenia. In reality, though, the prenatal and postnatal environments of monozygotic and dizygotic twins are not identical, and the importance of shared zygosity is probably greatly exaggerated (Mandler, 2001; Prescott et al., 1999).
It is not clear exactly how much of the dopaminergic expansion during hominid evolution can be ascribed to epigenetic factors versus how much can be ascribed to genetic adaptations. But the recent large increases in many industrialized nations in intelligence (Dickens and Flynn, 2001) and in autism (Previc, 2007) – two phenomena clearly tied to dopamine, as discussed in Chapters 3 and 4 – suggest that population-wide changes in dopaminergic activity can occur and have occurred without changes in the genome. In the remainder of this chapter, I will propose a set of scenarios that provide a plausible and comprehensive explanation for the rise of the dopaminergic mind. These scenarios, which involve both genetic and epigenetic inheritance, will be divided into two main portions – evolution of the protodopaminergic mind from a few million years ago to approximately 200,000 years ago, and evolution of the later dopaminergic mind leading to the cultural "Big Bang" ~70,000–80,000 years ago.
5.2 Evolution of the protodopaminergic mind
5.2.1 Environmental adaptations in the “cradle of humanity”
Somewhere between five and six million years ago, the first hominids – the Australopithecines – began to appear in sub-Saharan Africa (Arsuaga, 2003; Coppens, 1996). This date has been established on the basis of both fossil evidence and mitochondrial DNA evidence.1 There were a series of Australopithecine lineages, only one of which (Australopithecus afarensis) is generally accepted as having directly led to humans.
The divergence of humans and chimpanzees occurred within one to two million years of a reactivation of rifting that eventually resulted in the East African plateau becoming considerably drier than the West African tropical forest (Arsuaga, 2003; Coppens, 1996). Fossil evidence now conclusively demonstrates that the early hominids were confined to an open, arid savanna environment that may have surrounded streams, lakes, or other sources of water (Brunet et al., 1995). Australopithecus was clearly bipedal, albeit with retained arboreal capabilities (Arsuaga, 2003; Coppens, 1996). Bipedalism would allow the early hominids to exploit the open savanna environment to a much greater extent than the great apes because of their greater ease of locomotion (Carrier, 1984) and because a bipedal posture is much less likely to absorb the heat of the sun (Wheeler, 1985). In turn, the greater exploitation of the savanna niche led to less of a reliance on a frugivorous diet than in the case of the forest-dwelling chimpanzee (Grine and Kay, 1988).2 Other adaptations to a thermal environment, to be discussed shortly, additionally contributed to the advantage of the early hominids over the great apes in the increasingly arid savannas of East Africa, an advantage that is borne out by the absence of any co-mingled ape fossil remains in the locales of the early hominids (Brunet et al., 1995; Coppens, 1996).
Despite their bipedalism, the australopithecines were short-statured creatures with brains no larger than that of the chimpanzee. A continuing divergence of gracile australopithecines (Australopithecus
1 Mitochondrial DNA is, unlike the normal DNA contained in the cell's nucleus, transmitted only through the mother. Based on a known rate of mutation, systematic deviations in DNA among individuals and species can provide reasonably good estimates concerning the time-course of biological evolution (see Cann et al., 1987).
2 It is worth noting, as evidence of their limited locomotory prowess, that modern apes traverse a total of only about one-half mile per day (Bortz, 1985).
afarensis) from robust australopithecines is believed to have eventually led to the emergence of the genus Homo around 2–2.5 million years ago in East Africa (Arsuaga, 2003; Coppens, 1996; Falk, 1990). Homo habilis is believed to have represented the first major advance over chimpanzees in relative brain size, with its brain estimated to be about 50 percent larger than that of Australopithecus in allometric (i.e. brain-to-body) terms (Arsuaga, 2003; Falk, 1990). Homo habilis also represented a major advance culturally, in that a large and diverse array of stone-flaked tools have been found near its remains (Coppens, 1996). These finds have often been located at a distance from where skeletal remains were found, suggesting a transport to animal carcasses from which meat and marrow were removed (Arsuaga, 2003). Such behavior would require a certain amount of foresight and, in turn, an expanded emphasis on more distant space and time. The evolutionary scenario leading from Australopithecus to the emergence of Homo is not entirely clear, although Falk (1990) for one argues that the gracile australopithecines may have occupied a more open savanna niche than did the robust australopithecines.
What is clear is that the appearance of Homo habilis occurred in an era of additional climate change during the transition from the late Pliocene to early Pleistocene geologic epochs, in which the aridity of the sub-Saharan East African plateau further increased (Arsuaga, 2003; Coppens, 1996). A worldwide cooling took place between two and three million years ago, resulting in a lowering of humidity and a dramatic decrease in vegetation; for example, the ratio of tree to grass pollens decreased from
0.4 to 0.01 in the Omo valley of Ethiopia during this period (Coppens, 1996). The increasing dryness of the East African plains favored the inclusion of meat in the diet, which is believed to have occurred around two million years ago (Eaton, 1992; Leonard and Robertson, 1997), as attested to by the animal bones and cutting tools typically found with Homo habilis remains and the more efficient chewing capability of its teeth and jaws (Coppens, 1996). It should be noted that present-day hunter-gatherers now consume about one-third of their calories from animal sources (Bortz, 1985; Eaton, 1992; Lee, 1979), which is comparable to that of hunter-gatherers in the Early Stone Age (Eaton, 1992), but more importantly, meat consumption can rise to 100 percent of the diet in times of drought (Bortz, 1985; Lee, 1979).
The emergence of Homo habilis was followed by several waves of human migration out of Africa and onto the Eurasian land mass, the first of which may have occurred as early as 1.9 million years ago (Wanpo et al., 1995). By the early Pleistocene era, a relatively advanced Homo erectus and its exclusively African cousin (Homo ergaster), with much larger brain sizes (~1,000 cc) and height than Homo habilis, had populated
wide areas of Africa, Europe, and Asia. Eventually, Homo erectus evolved into archaic humans around 500,000 years ago, including Homo helmei and Homo heidelbergensis (found in Africa) and Homo sapiens neanderthalensis (also known as the Neanderthals, who later dominated Europe for ~200,000 years and lived contemporaneously with modern humans). The continuing African evolution of modern humans is further supported by numerous fossil and archaeological findings, including:
- the earliest mixture of Homo erectus and Homo sapiens cranial features, found in East African skulls dating back approximately one million years;
- the earliest fossil remains of archaic Homo sapiens in sub-Saharan Africa, slightly less than 500,000 years ago;
- the origin and continued evolution, primarily in Africa during the Early Stone Age (Early Paleolithic Era), of primitive stone tools such as bifacial stone flakes and hand-axes; and
- the first appearance of blade technology in southern Africa at the beginning of the Middle Stone Age (~250,000–300,000 years ago).
The beginning of the Middle Stone Age appears to have occurred slightly after the split between Neanderthals and modern humans, which is believed to have occurred around 400,000 years ago, based in part on the slight difference (0.5 percent) between the Neanderthal and modern human genomes (Noonan et al., 2006).
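The molecular-clock reasoning behind datings of this kind can be sketched symbolically (a simplified model for illustration only; the symbols below are not taken from the text):

```latex
% Two lineages separated for time t each accumulate substitutions
% independently at an average rate \mu per site per year, so the
% observed pairwise divergence d between their genomes is roughly
d \;\approx\; 2\mu t
\quad\Longrightarrow\quad
t \;\approx\; \frac{d}{2\mu}
```

In practice, such estimates also depend on calibration points, generation times, and which portion of the genome is compared, which is why different genetic markers can yield different dates, as noted below for mitochondrial versus other DNA markers.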
It is generally accepted by most anthropologists that modern humans are mostly if not exclusively the product of Africa, which is often referred to as the "cradle of humanity." Recent archaeological and DNA evidence independently confirm that anatomically and genetically modern humans first emerged in Africa sometime around 200,000 years ago. Based on the rate of random mutations in mitochondrial DNA and Y-chromosomal DNA and the greater diversity in modern-day DNA in sub-Saharan Africa, divergence analyses suggest that all Homo sapiens sapiens evolved from a small gene pool of less than 10,000 in Africa approximately 200,000 years ago (Cann et al., 1987; Hammer, 1995; Templeton, 2002; von Haesler et al., 1996).3 Although until recently the oldest fully anatomically modern human skull from Klasies River in South Africa dated to 130,000 years ago, a more recent skull from
3 Although specific polymorphisms and other DNA markers point to older dates for the occurrence of the most recent common ancestor of all humans, they are more difficult to interpret in that they are based on much less data and are much more variable, whereas the mitochondrial DNA and Y-chromosomal data collectively derive from over 1,500 individuals each and are remarkably consistent in their dating (Templeton, 2002).
Omo Kibish in Ethiopia dating back to 195,000 years ago shows essentially modern features (McDougall et al., 2005), and it is believed that around 150,000 years ago descendants of the earliest Homo sapiens sapiens migrated from this region to South Africa (Behar et al., 2008).
It may be concluded, therefore, that conditions in Africa spearheaded virtually every major evolutionary advance in the path from chimpanzees to modern humans. But many issues pertaining to the evolution of humans have not been conclusively settled, such as how gradual or punctuated that evolution was or how closely our anatomical evolution matched our cognitive evolution. There is much more evidence for gradualism in our anatomical evolution, as continual progression toward the modern human skull occurred from two million to two hundred thousand years ago, in contrast to the rather meager advances in cultural output before 100,000 years ago (Coppens, 1996). Around 200,000 years ago, a striking divergence between our anatomical and cultural evolution occurred in that the modern human genome and craniofacial structure were essentially finalized, whereas over 100,000 years more were required to produce the first clear evidence of art, beads, advanced tools, commerce, and other indicants of a modern-like intellectual capability. Indeed, the cultural distinctions between Neanderthals and genetically modern humans were not all that dramatic in many places such as the Middle East even as late as 90,000 years ago (Arsuaga, 2003; Mellars, 2006; Shea, 2003; Wynn and Coolidge, 2004), and later cultural advances may have occurred in Neanderthals either independently (Zilhão et al., 2006) or after interacting with modern humans in Europe (Amos, 2003; Arsuaga, 2003), despite the former's clearly different genetic makeup and craniofacial anatomy (e.g. larger brain and brow). There is even evidence that modern humans may have been temporarily displaced by Neanderthals in the Levant region (present-day Israel) around this time (Arsuaga, 2003; Mellars, 2006; Shea, 2003).
Whereas modern humans 100,000 years ago did not demonstrate a huge, if any, intellectual advantage over their Neanderthal cousins, this was not true 30,000–40,000 years later. Beginning around 65,000 years ago, there was a rapid littoral expansion of modern humans across the Red Sea into Southwest, South, and Southeast Asia – about four kilometers per year, according to Macaulay et al. (2005) – followed by a fairly rapid replacement of the Eurasian Neanderthal populations. A major biologically driven behavioral change presumably occurred in this period to decisively set the stage for the modern human intellect (Shea, 2003). Heretofore, theorists have attempted to understand the origins of the modern human mind in terms of various differences between us and our anatomically and genetically distinct Neanderthal cousins, but a
much greater clue to the origins of the human mind lies in a comparison of modern human behavior of 70,000 years ago with that of the genetically and anatomically modern humans of 130,000 years earlier in Africa.
The next section will attempt to explain why sub-Saharan Africa was so pivotal in the evolution of humans during the early-to-middle Pleistocene geologic era spanning the period from two million to 200,000 years ago, particularly regarding the environmental and dietary influences that evidently led to the first major rise of dopamine in hominid evolution.
5.2.2 Thermoregulation and its consequences
The drying of the savannas over the course of millions of years forced a retreat of chimpanzees and gorillas to the lush forests on the western side of the Eastern rift valley. For a small and relatively defenseless creature such as Homo habilis that could tolerate heat, a drier climate offered a huge opportunity in that it could exploit the midday environment in which the great apes and many dangerous predators would be relatively inactive, while at night it could retreat to caves and even arboreal safety. Despite our ability to thrive in a wide variety of tropical environments, humans are still at risk in hyperthermic environments, as illustrated by the large number of heat stroke deaths that occur in the elderly during urban heatwaves and in the young during extreme physical exertion in which sweating may be prevented because of heavy clothing (Figa-Talamanca and Gualandi, 1989). Humans also face a special thermal challenge that nonhuman primates do not, in that very large brains generate much more heat than do smaller ones. However, humans have evolved several highly efficient heat-loss mechanisms that arguably provide us with a greater thermal tolerance than any other species in the animal kingdom (Bortz, 1985; Carrier, 1984). The combination of such traits as a bipedal posture, hairless skin, and an extraordinary number of sweat glands was arguably of much greater survival value to the early hominids than were any of the intellectual advances that occurred in this era.
The adoption of a bipedal posture, while primarily beneficial in terms of locomotion and freedom of the hands, also reduces the radiant heat load of humans when the sun is at its peak (Wheeler, 1985). The optimization of bipedalism, especially for endurance running, required various improvements in the ability of lower-limb tendons, joints, and
4 Actually, the skin of humans is far from hairless, but the individual hairs are much smaller and finer than in other primates.
bones in humans to be efficient and durable (Bramble and Lieberman, 2004), while elongating the body produced a concomitant increase in the body's surface-to-mass ratio, which facilitates evaporative heat loss by having relatively more of the human body exposed to air. A large surface-to-mass ratio is a characteristic of all tropical human populations, as predicted by "Allen's rule" of mammalian thermal physiology (Jablonski, 2004). The hairless skin of humans – unique among primate species4 – further increases the efficiency of heat dissipation through the skin (Wheeler, 1985; Zihlman and Cohn, 1988). The most important element of our extraordinarily efficient heat-loss capability is the enormous number (~2–4 million) and efficiency of eccrine sweat glands found on our bodies. Unlike furry animals that have a large number of apocrine or oily sweat glands (40 percent, in the case of apes), eccrine or watery sweat glands vastly predominate in humans (Jablonski, 2004). Because of these glands, our physical exertion in a desert environment may result in the loss of twelve liters or more of sweat per day (Bortz, 1985), with each liter of sweat carrying away ~600 kcal of heat (Nunneley, 1996). To maintain normal hydration, as much as 15–20 liters of water must be consumed daily while exercising in a hot, dry environment (Nunneley, 1996), which is consistent with the restriction of early hominid fossil remains to the shores of what in the early Pleistocene era were thriving rivers and lakes. For the most part, the heat loss achieved through sweating is under central control rather than being determined by local skin temperatures. This factor is critical because, under exertion in heat, the gradient between environmental temperature and core temperature may limit the heat loss from passive mechanisms like convection and conduction (Jablonski, 2004).
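The daily evaporative heat budget implied by these figures can be made explicit with a back-of-the-envelope calculation, using only the sweat-output and heat-removal values cited above:

```latex
% ~12 L of sweat per day, each liter carrying away ~600 kcal of heat:
Q_{\text{evap}} \;\approx\; 12~\tfrac{\text{L}}{\text{day}}
                 \times 600~\tfrac{\text{kcal}}{\text{L}}
                 \;\approx\; 7{,}200~\tfrac{\text{kcal}}{\text{day}}
```

The matching requirement of 15–20 liters of daily water intake follows directly, since nearly all of the fluid lost as sweat must be replaced to maintain hydration.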
It should be noted that eccrine sweating is an efficient means of heat loss only when fur is not present, which strongly suggests the co-evolution of hairlessness, an increase in eccrine sweat glands, and probably bipedalism as well (Folk and Semken, 1991; Jablonski, 2004; Zihlman and Cohn, 1988).
The functional utility of the exceptional adaptation of hominids to heat stress lies in the ability to engage in persistence (chase) hunting and midday scavenging (Bortz, 1985; Bramble and Lieberman, 2004; Carrier, 1984; Krantz, 1968; Shipman, 1986; Zihlman and Cohn, 1988). Chase hunting involves pursuing an animal in the hot sun until it enters a hyperthermic condition and dies ("chase-myopathy"); for instance, zebras and cheetahs will grow hyperthermic to the point of collapse after an all-out chase of about one kilometer (Bortz, 1985; Taylor and Rowntree, 1973). Persistence hunting depends not on a sophisticated hunting technology (which the early hominids almost
certainly did not have) but only on physical endurance, and it is still occasionally pursued by modern Bushmen in Africa (Bortz, 1985; Carrier, 1984). The wildebeest of Eastern and Southern Africa is especially prone to chase-myopathy because it is reluctant to leave its area (Taylor, 1980, cited in Bortz, 1985), which is noteworthy because wildebeests are believed to have comprised the single largest source of meat in the diets of the East African hominids (Bortz, 1985). The only tools needed for chase hunting are primitive stone-cutting instruments used to extricate the meat from the dead animal. Such tools could also have been used in scavenging meat/marrow from carcasses (Shipman, 1986), a thermally demanding task given that traveling long distances to and from the carcass and the extrication of the marrow presumably occurred during midday hours, when nocturnal predators would have been less of a threat. Our hominid ancestors evidently engaged in both chase-hunting and scavenging, though probably more of the latter (Shipman, 1986).5
While the human capacity to dissipate heat is impressive relative to other animals, the use of this capacity to achieve the physical endurance necessary for successful scavenging and chase-hunting requires that several conditions be met. First, the ambient environment must be arid for optimal sweating to occur (Nunneley, 1996). Second, a fluid supply must be readily available, given the loss of hydration caused by sweating during the endurance activity (Carrier, 1984). Finally, the thermoregulatory system must be able to kick in rapidly during extreme physical exertion, which creates at least twenty times the heat load that is accrued while we are at rest (Nunneley, 1996). The first two of these conditions were fulfilled by the early hominid environments of Eastern and Southern Africa, which were essentially arid savannas that are believed to have contained either nearby lakes or rivers.6 The last condition could have been met by an expansion of dopaminergic systems in the brain, particularly the dopaminergic nigrostriatal pathway. As noted
5 Regardless of whether the animal was killed or scavenged, the extraction of meat was made easier by an additional increase in the length and flexibility of the thumb (Carroll, 2003), which probably took place as part of the general physiological/anatomical adaptation during the late Pliocene/early Pleistocene transition.
6 The fact that our early hominid ancestors as well as prehistoric modern humans in sub-Saharan Africa tended to congregate near sources of water does not imply that our ancestors once spent large amounts of time in an actual aquatic environment, as proposed by Morgan (1997) and others. Much of the "aquatic" theory has been discredited (see Moore, www.aquaticape.org), and all of the supposedly aquatic physiological adaptations either do not exist (e.g. humans do not have a "diving reflex" like other marine mammals) or were adaptive for other purposes (e.g. a hairless skin helped to dissipate heat and a descended larynx promoted vocal communication).
in Chapter 2, the nigrostriatal system mediates voluntary motor activity, so it is ideally suited to lowering body temperature after the onset of activity (Cox, 1979; Lee et al., 1985). The nigrostriatal stimulation of dopaminergic heat-loss mechanisms evidently begins within minutes of activation of the striatum during motor activity and continues until core temperatures reach about 2–3°C above normal, after which dopamine levels and temperature are no longer correlated (Fukumura et al., 1998). The dopaminergic pathways leading from the striatum to the anterior hypothalamus (Kiyohara et al., 1984) are especially well positioned to relay a feed-forward signal to the hypothalamus to activate peripheral mechanisms before it receives feedback concerning the rise in core temperature.
A crucial role for dopaminergic mechanisms in heat-loss generation is suggested by a large body of evidence from animals and human clinical patients (see Previc, 1999). For example, reduced sweating and heat loss – sometimes reaching dangerous levels – are found in hypo-dopaminergic disorders such as Parkinson's disease and phenylketonuria and following use of anti-dopaminergic medication in schizophrenia (Figa-Talamanca and Gualandi, 1989). Also, anti-fever agents are known to block the pyrogenic actions of the prostaglandins, which normally inhibit dopamine (Schwarz et al., 1982). There are many other neurophysiological mechanisms that contribute to thermal tolerance, but even these other mechanisms may be dependent on dopaminergic regulation. One such neuroendocrine mechanism is growth hormone, which is higher in heat-tolerant individuals (Niess et al., 2003) but reduced in those with less plentiful eccrine sweat glands and sweat output (Hasan et al., 2001; Lange et al., 2001). Dopamine is a major stimulant for growth hormone, especially when growth hormone levels are low (Bansal et al., 1981; Boyd et al., 1970; Huseman et al., 1986; Jacoby et al., 1974), although it inhibits growth hormone production when it is excessive, as in pituitary adenomas (Bansal et al., 1981).
As discussed in Chapter 2, expansion of dopaminergic systems represents the major neurochemical difference between primates and many other mammals such as rodents, a trend that evidently continued in humans. The precise mechanism through which dopamine levels became elevated is unclear, but the role of at least some genetic selection is suggested by the 1.4 percent genetic divergence of humans and apes and the co-occurrence of so many important changes that affected body shape and size as well as a host of physiological systems that increased thermal efficiency. The constellation of physiological changes described above need not have involved separate mutations for genes controlling each of these functions. For example, dopamine is known to increase
growth hormone and in so doing contributes to elongating the body as well as promoting heat loss. An elongated body and greater thermal tolerance could also have led to greater bipedal locomotion and exercise, which in turn further increased dopamine levels by, among other actions, increasing calcium production and the calcium-dependent activity of tyrosine hydroxylase, which converts tyrosine into dopa (Chaouloff, 1989; Gilbert, 1995; Heyes et al., 1988; Sutoo and Akiyama, 1996). Similarly, the enhancement of bipedalism for running and heat loss ensured, through chase-hunting and scavenging, a greater supply of protein-rich meat sources containing rich amounts of tyrosine (Previc, 1999), which further increased the supply of dopamine (see Chapter 2). Increased protein from meat consumption is known to increase height (Suzuki, 1981)7 and, because meat is easier to digest than raw plant food, a meat-based diet would permit the digestive tract to grow smaller (Henneberg, 1998), thereby reducing body mass relative to brain size and body surface and further aiding in heat dissipation.8 It is not clear whether the other major physiological adaptation – namely, the reduction in hair growth – was due to a separate set of genetic mutations or associated with the above hormonal and neurochemical changes. Hair growth is controlled by a complex mixture of genetic and hormonal influences, and sometimes a given hormone may have the same hair-stimulating effect when in severe excess or deficiency (Alonso and Rosenfield, 2003). To date, no genetic factors have been found to underlie the differences between humans and chimpanzees in hair growth.
The consequences of bipedalism and the dopaminergic expansion may, in combination, account for most of the prominent brain features that distinguished Homo habilis and later humans from the Australopithecines. These include:
- the aforementioned larger brain-to-body mass ratio, due to the effect of dopaminergically stimulated growth hormone to increase brain size relative to a stable or even decreasing digestive tract (see above);
- a greater convolutedness of the cortex, due to the higher concentrations of dopamine in the upper cortical layers, which when enlarged relative to the lower layers tend to buckle and form fissures (Richman et al., 1975); and
7 For example, the height of the average Japanese was increased by a remarkable eight centimeters from 1949 to 1979 due to the increased consumption of meat (Suzuki, 1981).
8 Possibly coinciding with the switch to a carnivorous diet, a genetic mutation specific to
myosin and affecting jaw muscle strength and chewing strength has now been traced to the human lineage beginning about 2.5 million years ago (Stedman et al., 2004).
- lateralization of cortical function, most likely related to the switch to an upright stance that produced asymmetrical prenatal inertial forces during motion. The latter ultimately may be presumed to have led to a predominance of vestibular processing in the right hemisphere of most humans (Previc, 1991), a bias towards right-handedness, and the creation of a dopamine-rich left hemisphere that functionally bears little resemblance to the brain of the chimpanzee (Gazzaniga, 1983).9
As noted in Chapter 1, there is no evidence that the changes in brain size, convolutedness, and lateralization were in and of themselves important to the evolution of human intelligence and individually selected for genetically, so they may be considered mostly epiphenomena of the larger physiological/skeletal adaptation. Although one must always be cautious when only limited fossil evidence from the past is available, it is more plausible to infer from the dopaminergically enhanced physiology and diet of modern-day hunter-gatherers and the critical role of dopamine in advanced intelligence that it was the increase in dopamine that elevated the intellectual capacity of Homo habilis above its predecessors. Two of the many consequences of this increased physiological and intellectual capacity were the much more widespread and sophisticated tool use of Homo habilis – which gave rise to its Latin name meaning "handy" – and its putative migration clear into Asia as early as 1.9 million years ago (Wanpo et al., 1995).
More fundamentally, the increased distances traversed by Homo habilis in its scavenging and hunting would contribute to an increased emphasis on distant space and time – hallmarks of dopaminergic thought. But despite acquiring many modern-like features, the intellectual capacity of Homo habilis and its immediate successors was still quite inferior to that of modern humans. And, although the brains of Homo erectus and archaic Homo sapiens expanded greatly in absolute size, they grew much more modestly in relative brain-to-body size since humans also grew in height. Although they continued to migrate extensively throughout Eurasia, these protodopaminergic humans never achieved the intellectual level of their modern counterparts, as they used modifications of the same basic stone-flake tool design for hundreds of thousands of years. There is some evidence that slight improvements in blade technology did occur before 200,000 years ago (McBrearty and
9 The belief that bipedalism contributed to cerebral lateralization in Homo habilis is shared by other theorists such as Corballis (1989) and is consistent with evidence of right- handedness (e.g. asymmetrical stone-flaking) in the Homo habilis archaeological record and an even greater dextrality in the Homo erectus record (Toth, 1985).
Brooks, 2000), and it is possible that the immediate descendants of Homo erectus may even have possessed a primitive language (Bickerton, 1995; Corballis, 1992), perhaps even some capacity for speech (Kay et al., 1998). Moreover, the modest but significant correlation between body height and intelligence in modern humans (Johnson, 1991) suggests that at least some of the dopamine-related physiological adaptations that elongated the body in the archaic humans may have been associated with increased intelligence. But despite the increasing physical resemblance of archaic humans to modern humans, the final leap forward to the modern human intellect required yet another confluence of events that occurred long after the ancestral genome and anatomy common to all modern humans were crystallized.
5.3 The emergence of the dopaminergic mind in later evolution
For at least 100,000 years after the emergence of a genome and anatomy common to all modern humans, the pace of human intellectual evolution continued to lag. This is somewhat remarkable in that humans clearly had the physical capacity for speech and language, but that capacity did not appear to be in and of itself sufficient to create the great intellect found in modern humans. The period from 100,000 to 200,000 years ago is part of the late Pleistocene geologic epoch and includes the beginnings of the Middle Stone Age, which most experts date at around 100,000 years ago but which has been pushed back by some to around 250,000–300,000 years ago (McBrearty and Brooks, 2000). While it is still debatable whether humans before 100,000 years ago had developed the beginnings of advanced blade and point technology, certainly the greatest technological achievements of the Middle Stone Age (shaped bone tools, barbed points, microliths, mining, fishing, beads, art, music, etc.) were all delayed until less than 100,000 years ago. Although a recent finding points to an isolated example of primitive bead production (shells with small holes) as far back as 90,000 years ago (Vanhaeren et al., 2006), widespread and systematic advances in tool-making technology and art did not occur until around 70,000–80,000 years ago (Henshilwood et al., 2001; McBrearty and Brooks, 2000), primarily along the coast of Southern Africa. To what extent many of these cultural advances appeared gradually has been the subject of intense debate. Mithen (1996) has been a leading advocate of a cultural "Big Bang" (although he argues it occurred as late as 50,000 years ago in Europe), whereas McBrearty and Brooks (2000) posit a more gradual advance (even though their own Figure 13 shows evidence of an abrupt
jump in many technological areas around 80,000–100,000 years ago). Although some cultural achievements like musical instruments and animal artwork may date back to as little as 40,000–50,000 years ago (Fitch, 2006; McBrearty and Brooks, 2000), tally marks from 77,000-year-old artifacts found in Blombos Cave on the coast of South Africa suggest an earlier mathematical capability. An even more remarkable but still-controversial finding from Rhino Cave in Botswana suggests that animal sculpture and even worship occurred 70,000 years ago, marking the earliest evidence of religious ritual in modern humans (Handwerk, 2006). So, it would appear that not only was there indeed a “cultural Big Bang” but that it can be first traced to Southern Africa around 70,000–80,000 years ago. Moreover, recent genetic and other data now suggest that these humans quickly advanced from South Africa to the horn of Africa beginning around 70,000 years ago (see Figure 5.1). This relatively small population of humans, possibly after merging with a small population of East Africans, later crossed into the Middle East no later than 65,000 years ago and are believed to have become the ancestors of all present-day non-Africans as well as, because of back-migration, the vast majority of present-day Africans. The Khoisan of south-central Africa may be one of the few populations to contain remnants of the anatomically modern humans who first migrated from East Africa to South Africa 150,000 years ago (Behar et al., 2008).
For those who firmly believe that human intelligence was the result of genetic adaptation, the >100,000 year gap between the emergence of genetically modern versus culturally modern humans is highly problematic, even if one accepts the youngest limit for the common ancestor from
most genetic datings (150,000 years ago) (Hammer, 1995). And if older modern artifacts are eventually found, this would not negate the fact that the overwhelming majority of human artifacts as recently as 100,000 years ago fail to manifest the advanced characteristics that became widespread just 30,000 years later. It is hard to imagine the majority of present-day humans exhibiting so little a tendency to create art or fishing barbs or bone tools for that long. All of these arguments support an epigenetic explanation for the explosion of culture within the past 100,000 years. This would be consistent with the extreme unlikelihood that there was a direct genetic selection for any specific cognitive abilities required to support these cultural achievements (see Wynn and Coolidge, 2004;
Chapter 1, this volume). Indeed, if cognitive selection had such adaptive value, why did it take over one million years from the appearance of Homo erectus to create the first modern human artifacts and, even then, only in a single region of the world (i.e. Southern Africa)?
Figure 5.1 The hypothesized direction of modern human origins and migration.
Notes and sources:
- modern humans first emerged in East Africa around 200,000 years ago (McDougall et al., 2005);
- part of the East African population migrated to Southern Africa beginning around 150,000 years ago (Behar et al., 2008);
- “modern” human intelligence occurs along the coast of South Africa around 70,000–80,000 years ago (Henshilwood et al., 2001);
- migration of one portion of the South African population back to East Africa occurs shortly thereafter (Behar et al., 2008);
- the merged East African and South African populations migrate along the South Asian coast all the way to Australia, beginning around 65,000–70,000 years ago (Macaulay et al., 2005), as well as to other parts of Africa (Behar et al., 2008).
One plausible explanation for the intellectual advances in Southern Africa during the late Pleistocene era is the change in diet – specifically, the widespread consumption of shellfish – that occurred around 100,000 years ago in coastal areas of South Africa and possibly elsewhere. A second, more speculative explanation relates to demographic pressures that increased population density and cultural exchange. As
reviewed in the next sections, both of these changes could have greatly elevated brain dopamine to a level capable of supporting the newly advanced intellect and, through epigenetic inheritance, be passed on to subsequent generations of humans.
5.3.1 The importance of shellfish consumption
Some researchers (e.g. Erlandson, 2001) believe that human consumption of marine fauna began 150,000 years ago, possibly due to a coastal migration to escape increasingly arid inland regions, and isolated evidence for possible shellfish consumption has been found in Eritrea dating back 125,000 years (Walter et al., 2000) and at Pinnacle Point in South Africa around 160,000 years ago (Marean et al., 2007). However, it is generally accepted that widespread transport and consumption of shellfish occurred less than 100,000 years ago along the South African coastline (Broadhurst et al., 1998; Dobson, 1998; McBrearty and Brooks, 2000). Shellfish are rich in iodine and essential fatty acids (Broadhurst et al., 1998; Dobson, 1998), both of which have been shown to increase dopamine activity and intellectual development (DeLange, 2000; Wainwright, 2002). Iodine, found in red meat but especially in marine animals, is converted by the thyroid gland into thyroxine, which in turn stimulates tyrosine hydroxylase and the production of dopa (see Previc, 2002), while essential fatty acids are known to increase dopamine receptor binding and dopamine levels (Delion et al., 1996; Wainwright, 2002). As noted earlier, inadequate iodine, leading to reduced levels of dopamine and norepinephrine, is the single largest preventable source of mental retardation in the world today (Boyages and Halpern, 1993; DeLange, 2000). Conversely, hyperthyroidism is frequently associated with dopaminergically mediated psychosis (Benvenga et al., 2003). More generally, the timing and amount of thyroid output is an important influence on mammalian speciation, and it has been specifically implicated in the process of domestication (Crockford, 2002).
As reviewed by Gagneux et al. (2001) and Previc (2002), there appears to have been a major increase in thyroid hormone production (particularly of T3, which is formed from iodine) with the advent of modern humans. This finding is consistent with the similarity of many superficial physical features of chimpanzees and those of iodine-deficient humans (O’Donovan, 1996) and the relatively enlarged human thyroid gland relative to that of the chimpanzee (Crile, 1934). By contrast, chimpanzees and other mammals have relatively larger adrenal glands, which are important for transient arousal, a function that brain dopamine inhibits. One important difference between Neanderthals and modern
humans appears to have been the amount of iodine in their bodies, since Neanderthals exhibit many skeletal features (large femurs, short stature, extended brows, larger brains) characteristic of iodine-deficient modern humans (Dobson, 1998) and because prehistoric modern humans are believed to have consumed relatively more shellfish than big game (Dobson, 1998; McBrearty and Brooks, 2000; Richards et al., 2001). The large amount of shellfish consumed is especially noteworthy in Klasies River excavations, but our modern human ancestors generally tended to inhabit coastal areas, rivers, and large lakes where aquatic fauna would be easy to obtain. There is some tentative evidence that Neanderthals, at least early on, were less likely to inhabit coastal or otherwise iodine-rich regions than modern humans (Dobson, 1998), even as later Neanderthals began to consume more marine fauna. It has long been debated why modern humans so quickly replaced Neanderthals in Europe after 50,000 years ago, but population replacement has been widespread even among modern human populations, and the entering population usually possesses greater dietary breadth in these instances (O’Connell, 2007). The Neanderthals’ over-reliance on large game animals may have been particularly disastrous when the last Ice Age approached its peak around 30,000 years ago, as large game became less plentiful and subject to competition with growing populations of modern humans. The relative inability of Neanderthals to successfully switch to more diverse sources of food may not have had anything to do with their brains, in that modern human populations frequently show resistance to changing their food procurement strategies as their territory is encroached upon by other human groups, which can mainly be attributed to socio-cultural factors (O’Connell, 2007).
Given the importance of the thyroid gland in human brain development and function, it is reasonable to conclude that the consumption of shellfish was a major impetus for bringing the dopaminergic mind to fruition. Indeed, advanced tool-making artifacts in the Middle Stone Age coincide both geographically and temporally with the consumption of shellfish at various coastal sites in Eastern and Southern Africa (McBrearty and Brooks, 2000; Walter et al., 2000). However, increased shellfish consumption alone could not have led to advanced intelligence, because crab-eating macaque monkeys would then rank among the most intelligent of species. It may be presumed that the later dietary-induced rise in dopamine in humans proved so effective partly because it was built on the earlier increase in dopamine that was part of the general physiological adaptation for endurance activity. Moreover, the increase in shellfish consumption was probably not the only nongenetic factor responsible for the dramatic cultural advance beginning less than 100,000
years ago, because even current iodine-deficient humans, despite their reduced intellect, are capable of art and religion. Moreover, since humans probably began to consume marine fauna in large quantities at least several millennia before the explosion of advanced intellectual artifacts around 80,000 years ago, shellfish consumption may have contributed to but not guaranteed the emergence of the dopaminergic mind.
Although the consumption of marine fauna directly contributed to improved intellectual functioning, the evolution of human intelligence and behavior was also aided by a more general benefit of that consumption – namely, that it greatly improved and stabilized the human diet and thereby promoted a longer lifespan (McBrearty and Brooks, 2000). The increased longevity, in turn, is believed to have led to population pressures, increased migration, and increased exchanges among small groups of humans (McBrearty and Brooks, 2000; Mellars, 2006). Although it has been argued that genetically modern humans experienced a major population decline due to drought and other climatic conditions (Ambrose, 1998), this claim has been refuted (Hawks et al., 2000) and was unlikely to be true in South Africa, which had a plentiful supply of marine fauna. Certainly, the remarkable number of major archaeological sites from Cape Town on the west to Port Elizabeth on the east (Henshilwood et al., 2001) suggests that South African coastal regions were well-populated in the Middle Stone Age. And, skeletal evidence indicates that modern humans in Europe had a slightly longer lifespan than their Neanderthal cousins (Arsuaga, 2003). Longevity-induced population pressures would presumably have followed the rise of shellfish consumption with a lag, and environmental changes in sub-Saharan Africa between 70,000 and 80,000 years ago are also believed to have further increased the competition for resources and sped up migration and interchange among populations (Mellars, 2006). These twin factors could explain the lag between the initial widespread consumption of shellfish and the remarkable cultural advances 75,000 years ago.
Increased social exchange is known to be a catalyst for intellectual advances in other advanced primates, such as Sumatran orangutans (Van Schaik, 2006), and the role of cities in promoting social exchange and intellectual progress has long been recognized.10
10 A fascinating recent example of the role of social factors in cognitive development is that of deaf Nicaraguan children who had previously lived at home in linguistic isolation
Moreover, increasing population growth and exchanges could make the identification with one’s group even more important. The growth of art and possibly music in sub-Saharan Africa toward the end of the Middle Stone Age may both be manifestations of this trend, given their roles in social identification and group cohesion (Brown, 2000; Shea, 2003). Moreover, the slower cultural advancement in South Asia relative to Europe toward the end of the Middle Stone Age (60,000–30,000 years ago) as well as the slower cultural progression of Neanderthals relative to modern humans have both been partly attributed to the higher population densities of modern humans in Europe (James and Petraglia, 2005; Wynn and Coolidge, 2004).
Increased population and increased social and cultural exchanges may not only have benefited from increasing intellectual prowess but may, in turn, have contributed to it biologically. Cultural evolution is known to alter biological inheritance (Laland et al., 1999), and the competitive stresses and achievement drives of modern urban society are believed to contribute to elevated dopamine levels (Pani, 2000; Previc, 2007). Indeed, hyperdopaminergic disorders such as schizophrenia and autism are more likely to occur in urban areas (Lauritsen et al., 2005; Marcellis et al., 1999; Palmer et al., 2006), and a milder form of these pressures could have been present to a certain extent even in sub-Saharan Africa in the late Pleistocene. On the other hand, it is difficult to imagine a current human even in a sparsely populated region not producing art, ornaments, or other advanced objects, and certainly no densely populated primate population has ever managed to even approach the intellectual output of the average human. Nevertheless, increased dopamine due to dietary improvements and increased social interaction, building upon an already protodopaminergic mind, may have collectively been sufficient to lay the foundations of the true dopaminergic mind.
In line with the relatively stable genome, no major anatomical change was associated with this later dopaminergic progression, although the human brain did slightly decrease in size from ~1,500 cc to the present-day 1,350 cc (Carroll, 2003). But, once hominid evolution had progressed to its later stage, genetic changes were no longer necessary for the dopaminergic increase to be expressed throughout the entire species. Cultural and dietary influences on dopamine, transmitted prenatally, would have been passed on and enhanced in successive generations and thereby effectively been rendered a permanent part of our inheritance. Even when humans moved inland and no longer relied as much on
and then were placed in a group setting and readily formed a new (emergent) language on their own (Senghas et al., 2004).
aquatic fauna, their dopaminergically mediated cultural achievements were self-sustaining. Thus, not only was the relatively stable human genome and anatomy beginning around 200,000 years ago insufficient in catalyzing the intellectual explosion that occurred over 100,000 years later, it was also unnecessary to maintain the newly advanced intellect as modern humans left Africa and spread throughout the world.
According to the scenario just presented, the dramatic expansion and increasing functional importance of dopamine systems in the human brain occurred over a long period of time, only partially paralleled by the changes in our physical appearance, and the dopamine rise continued long after our common genetic lineage was established. For those who believe that the great intelligence of humans relative to the rest of the primate world requires an exalted evolutionary progression, this theory is highly disappointing in that:
- there is no positing of specific genetic mutations for language or other advanced intellectual functions;
- no importance is ascribed to the changes in cranial size and shape; and
- most of our intelligence is regarded as a by-product of physiological adaptation, diet and population pressures.
This theory is, however, consistent with a number of crucial facts, including:
- the lack of a causal relationship between brain size and intelligence;
- the tenuous relevance of the human genome to intelligence;
- the importance of dopamine to our advanced intellect (Chapter 3);
- the specific expansion of the dopamine-rich striatum and cerebral cortex relative to the rest of the brain in humans as compared to chimpanzees (Rapoport, 1990);
- the known effects of diet and physiology on dopamine function;
- the sub-Saharan origins of humanity; and
- the >100,000 year gap between the establishment of the modern human anatomy and genome and the appearance of a variety of cultural artifacts associated with a distinctly modern human intellect.
This theory merges genetic, epigenetic and cultural factors and blends the African gradualistic theory of McBrearty and Brooks (2000) with the general intellectual explosion posited by Mithen (1996), but with his European locus of the intellectual “Big Bang” now placed at an earlier
juncture in Southern Africa. In addition, this theory clearly explains why cognitive abilities like language, religion, art, advanced tool-making, mining, long-distance exchange etc. did not emerge as separate genetic selections but rather depended on a more fundamental increase in cognitive potential as expressed in enhanced working memory, cognitive flexibility, spatial and temporal distance (off-line thinking), mental speed, and creativity that could only be found in a brain rich in dopamine (see also Wynn and Coolidge, 2004). Finally, this theory dispels the notion that the evolution of advanced human intelligence was critically dependent on the capacity for speech, since the latter was present at least 200,000 years ago in the first anatomically modern humans and probably another 100,000 or 200,000 years before that in archaic humans (Arensburg et al., 1990), without leading to any dramatic intellectual advances.
By highlighting the epigenetic inheritance of dopaminergic activity, this scenario shows why the emergence of the dopaminergic mind was not associated with a unique, immutable genetic process but rather one that continued to advance long past the establishment of our common
genome and anatomy. In fact, as described in the next chapter, the dopaminergic mind appears, at least in the industrialized nations, to have undergone at least two additional transformations during our more recent history.
6 The dopaminergic mind in history
If one believes that human evolution – especially in its intellectual aspects – did not rely exclusively or even largely on brain size and genetic transmission, then human evolution has never ceased. Hence, it would be incorrect to assume that all genetically modern humans and societies have the same neurochemistry and therefore the same intellectual abilities, personalities, goals, and propensity toward mental disorder. In particular, there is reason to believe that levels of dopamine are now much higher in members of modern industrialized societies than in more primitive societies. This chapter will focus on two major historical epochs – the transition from the hunter-gatherer societies to the ancient civilizations and the dramatic expansion of the dopaminergic consciousness and lifestyle in the twentieth century. In so doing, this chapter will highlight the role of influential individuals in history who have manifested dopaminergic traits and behaviors and played important roles in shaping our modern dopaminergic world.
6.1 The transition to the dopaminergic society
Despite a certain degree of technical proficiency, Neanderthals and even modern humans for their first 100,000 years appear to have lacked the generativity and pervasive “off-line” thinking capabilities of later humans. Once the prehistoric cultural “Big Bang” had progressed to its final stages and the last great Ice Age began to recede around 20,000 years ago, intellectual evolution proceeded at a rapid pace, with seemingly but an instant required to pass from the Neolithic Era and the beginnings of agriculture to the ancient civilizations and the Copper, Bronze, and Iron Ages. Intermixed with occasional periods of stagnation and despair, cultural advances in the form of literature, mathematics, legal codes, navigation, engineering, and architecture grew steadily over many millennia and in many places around the world, often independently of each other. Eventually, the gradual accumulation of knowledge and commerce gave way to the European Renaissance, which set the stage for the
final expansion of the dopaminergic mind. However, it was not until the second half of the twentieth century that the dopaminergic mind reached its ultimate state. Evidence of this heightened twentieth-century dopaminergic state is found in rapid increases in dopaminergically mediated intelligence and technological advances (Dickens and Flynn, 2001) and in a host of hyperdopaminergic mental disorders. Unlike the earlier increases in dopamine levels, there were no major environmental adaptations or dietary changes associated with the dopaminergic rise since prehistoric times. Rather, most of the dopamine increase occurred because higher dopamine levels became highly adaptive for individuals surviving and prospering in the increasingly stratified, complex, and competitive socio-economic systems that emerged during human history.
It may be instructive to review briefly how different the earliest hunter-gatherer societies must have been relative to our current industrial and post-industrial societies. It has been repeatedly documented that present-day hunter-gatherers work at most only about 20–30 hours a week, with little time lost for commuting. Although some authors claim the hunter-gatherers comprise what Sahlins termed “the original affluent society” (Sahlins, 1972; Taylor, 2005), their amount of leisure time has probably been somewhat exaggerated, at least in the case of females (Caldwell and Caldwell, 2003; Hawkes and O’Connell, 1981). Nevertheless, rather than compulsively arising at a certain time to begin each workday – or constantly worrying about the future – work appears to be more sporadically engaged in, and there is a much greater “present orientation.” Work is also engaged in as a communal activity, sharing of resources is the norm (Taylor, 2005), and most leisure time is also spent in communal activities such as games and music (Sahlins, 1972), the latter of which is believed to have emerged mainly as an activity to promote socio-emotional bonding (Brown, 2000). Even agrarian life is considered too demanding by many hunter-gatherers, who given the option prefer the simpler hunter-gatherer life that is mostly devoid of material possessions (Sahlins, 1972). Indeed, material rewards unrelated to immediate consumption (i.e. secondary rewards) are of limited value in these societies, since they only create logistical problems given the frequent moves to new locations required by the nomadic existence. The per capita energy consumption of the hunter-gatherer is estimated at 3,000 calories per day, or less than 1 percent of that of modern humans, who have extensive transportation and electrical power requirements (Clark, 1989: 102).
And, hunter-gatherers are largely free of the diseases and epidemics associated with urban existence, such as measles, influenza, plague, smallpox etc. Of course, there would have
been many diseases unique to the prehistoric hunter-gatherers, and their overall lifespan has been estimated at only twenty-one years (Acsadi and Nemeskeri, cited in Caldwell and Caldwell, 2003), which is considerably less than that of current hunter-gatherers (~30–35 years), who have much better tools and engage in limited cultivation (Arsuaga, 2003).
Taylor (2005) and others have reviewed evidence that early hunter-gatherer societies were relatively nonviolent and highly sexually permissive, consistent with the high rate of sexually transmitted disease among current hunter-gatherers such as the !Kung San (Caldwell and Caldwell, 2003). Certainly, there were no standing armies among the hunter-gatherers, as they lived in small bands, and many of them were undoubtedly very peaceful, based on their artwork, their skeletal remains, and the fact that many present-day hunter-gatherer societies are basically nonviolent (Taylor, 2005). But, there is certainly evidence of violent confrontations involving many prehistoric groups, such as the Caribs and Yanomamo of South America, and up to 25 percent of prehistoric adult male deaths have been ascribed to violence (Coleman, cited in Caldwell and Caldwell, 2003). It is generally agreed that prehistoric hunter-gatherer societies were much more egalitarian than were the agrarian societies that followed, with women and men sharing child-rearing and work, and most of these societies – as do a large number of remaining ones – presumably had a maternal lineage system. Their religious systems mostly did not emphasize remote, otherworldly, or high gods (e.g. “sky gods”) but rather invoked natural and ancestral spirits. Finally, the prehistoric peoples tended to live more within nature than did their agrarian descendants. This does not mean that they were environmental angels – after all, many if not most of the large mammals and marsupials outside of Africa and Asia disappeared before the dawn of agriculture, and the burning practices of prehistoric societies contributed to deforestation and erosion in many parts of the world (Roszak, cited in Taylor, 2005).
As the Earth began to warm substantially around 20,000 years ago, the hunter-gatherer lifestyle eventually gave way to the development of primitive cultivation (e.g. seeding of wild grains and fruit trees) and nomadic herding that probably began 10,000–15,000 years ago as population pressures reduced the availability of large game animals in Southwest Asia (Arsuaga, 2003; Clark, 1989; Weiss and Bradley, 2001). The mixture of hunting and gathering, migratory grazing and farming was eventually replaced by a much greater reliance on cultivation of the soil and the development of permanent agrarian settlements around 10,000 years ago, which led to an increase in energy consumption to
~15,000 calories per day. It is widely believed that the agrarian lifespan
may have initially slightly decreased (Clark, 1989; Taylor, 2005), as more confined settlement living exposed humans to more epidemics and animal diseases. However, there is also evidence that human populations may have expanded more rapidly than before, which indicates that an improved diet may have increased fertility and offset a higher mortality (Caldwell and Caldwell, 2003).
With the advent of agriculture, the psychological relationship with nature was dramatically altered – instead of living with nature, the agrarian societies were dedicated to transforming (controlling) nature through cultivation, irrigation, harvesting etc. Unlike the hunter-gatherers, who had few permanent possessions and could easily switch diets in times of drought and/or migrate, the agrarian societies were based on an economy and lifestyle that was more sedentary and cyclical, depending on storage (hoarding) and prediction of seasonal weather for planting and harvesting (Clark, 1989). Both hoarding and prediction require a greater orientation to future events, which is a hallmark of the dopaminergic personality (see Chapter 3); indeed, hoarding is a classic obsessive-compulsive symptom and is dependent on the integrity of dopaminergic systems (Kalsbeek et al., 1988). The greater sensitivity to lunar and solar cycles also led to an expanded sense of dopaminergically mediated upper space, and otherworldly religious symbols (i.e. sky gods) began to replace the mythical animal and human figures that constituted the first religious icons.
The agrarian existence led to the emergence of several “proto-civilizations” in what Taylor (2005) refers to as Saharasia, a broad stretch of territory extending from North Africa to the Indus River. During the first several thousand years after the beginnings of agriculture, the warming earth and melting glaciers produced rainfall in this broad region that was mostly sufficient to support vast savannas and even forests at the higher elevations. Food was plentiful, the settlements were not all that densely populated – probably only a hundred to a few thousand persons in the larger ones – and the societies were generally peaceful and egalitarian (most homes were of the same basic type and size). Earth goddesses and women priests were prominent in the early civilizations, and sexual behavior was still relatively permissive.
As recounted by Taylor (2005), however, global temperatures continued to rise and the last remaining glaciers in the region began to melt. Along with overgrazing and other rapacious agricultural practices, the global climate change led large swaths of Saharasia to dry out, and increasing competition for resources ensued. By 6,000 years ago, cities had grown larger, social stratification increased, warfare suddenly erupted on a large scale, and masculine values began to predominate.
The constant warring that unfolded was more intellectualized, senseless, and disturbing than in prehistoric times, as it was often a megalomaniacal display of power by rulers and was frequently accompanied by mutilation and torture (Wilson, 1985). The natural religions were replaced by otherworldly hierarchical religions in which mostly male gods and male priests reigned supreme, and a different attitude toward nature emerged (dominance rather than co-existence). Technological achievements in the Bronze and Iron Ages occurred at an astonishing pace, fueled partly (or perhaps even largely) by the need for better weapons and other armor; indeed, so great was the technological prowess of the ancient civilizations that some of their products – notably, the famous Antikythera Mechanism, which was an astronomical calculator made of thirty geared wheels – were more sophisticated than anything produced for the next 1,000 years (Charette, 2006). Daily life became much more “propositional” – i.e. regulated by complex, abstract laws and written communications. A more negative attitude toward nature, maternal symbols of nature, and the body itself emerged, and sexual permissiveness and partial nudity were replaced by body coverage and a multitude of sexual prohibitions. The glorified status of male warriors and the concentration of wealth in much more stratified societies ultimately led to the replacement of matrilineal descent and inheritance by patrilineal descent and inheritance, which may have further restricted the sexual freedom of females.1 The populations of Central and Southwest Asia may have been among the first to exhibit these new attitudes and lifestyles, but other civilizations soon followed and large fortressed city-states were constructed throughout this region and even in parts of Europe, Africa and East Asia.
The most wrenching part of the environmental change occurred just after the end of the dramatic post-glacial sea rise known as the Flandrian Transgression, when a worldwide drought of almost unimaginable proportions between 2300 and 2000 BC led to the collapse of many civilizations around the world and the emergence of large refugee populations (Bowen, 2005; Weiss and Bradley, 2001). By 1500 BC, one of the last of the relatively egalitarian, peaceful and permissive early Saharasian civilizations (the Minoan in Crete) was conquered by the more aggressive Mycenaeans.
1 As reviewed by Primavesi (1991), the co-emergence of masculine dominance and property wealth created a need for a strict patriarchal lineage that necessarily restricted female sexual freedom. Unlike in the matriarchal society, where lineage is obvious from birth, patriarchal lineage can only be presumed if female sexual behavior is limited exclusively to the husband.
The dramatic transformation of human existence into a much more competitive, inegalitarian, and ruthlessly technological one has been termed “The Fall” by Taylor (2005), and it may be the basis for the biblical story of Adam and Eve.2 It has also been associated with a major change in human consciousness. Humans before this era viewed themselves as surrounded by and controlled and even aided by various gods; indeed, what might today be considered pathological symptoms such as hallucinations and delusions of being controlled (the “alien-control” syndrome) may actually have been quite common and tolerated during this time (Jaynes, 1976). The new consciousness, however, posited humans as independent, self-conscious agents, and the gods that they had once placed such faith in but who failed to prevent several centuries of natural catastrophes became more distant and less involved in controlling their thoughts and actions (Jaynes, 1976). Many theorists, including Jaynes (1976), Taylor (2005), and Wilson (1985), have argued that this era represents the beginning of left-hemispheric dominance, inner thoughts, ego explosion, masculine aggression, otherworldliness, a linear temporal perspective, and a technological mindset. It is certainly true that these features are more characteristic of the left hemisphere, but there is no actual evidence that our anatomical or functional lateralization was altered during this epoch. Rather, what occurred was a dramatic increase in the dopaminergic mind, which is typified by its advanced cognition, competitive drive, masculine style, focus on distant (future) goals and space, and control over nature and others (“agentic extraversion”), all at the expense of empathy towards others and maintenance of the emotional self. The dopaminergic mind merely masqueraded as an increased left-hemispheric dominance because the higher dopamine content of the left hemisphere makes it most typical of that mind in general.
Furthermore, the lateral dopaminergic system may have been the part of the dopaminergic brain/mind that increased the most, because the internal control, ego-strength, technological abstraction, and diminution of outside thoughts (hallucinations)
2 Just as the first great civilization (the Sumerian) emerged in Mesopotamia (now Southern Iraq), so can the Garden of Eden be traced to this same general location (Hamblin, 1987) and timeframe (4000–5000 BC). In Eden, Adam and Eve (who was supposedly formed from the rib of Adam according to Sumerian myth) lived in a lush forest (which was present in many parts of the Middle East 6,000–8,000 years ago) and lived without shame of their bodies (sexual permissiveness), and in a state of technological ignorance. But, after disobeying God, Adam and Eve acquired knowledge, shame of their bodies, and were kicked out of lush Eden and into the desert (which started expanding around this same time), and for the first time greed and violence occurred in their offspring.
Unlike the previous expansions of the dopaminergic mind, this one was not due to a genetic evolution, physiological adaptation or transformed diet. Although brain size in genetically modern humans did decrease slightly to 1,350 cc, it had stabilized before the rise of the proto-civilizations, and the genetic variability among extant humans is remarkably low, only ~0.1 percent (Carroll, 2003). Moreover, the fact that patriarchal, stratified, sexually restrictive, warlike societies with sky-gods, priest classes, and a high level of technological capability occurred in other civilizations throughout the world (e.g. the Aztecs in Mexico, the Mayans in Central America, the Incas in South America, and the Chinese in East Asia) suggests that it was principally the emergence of
The late-Neolithic change in human consciousness associated with
the rise of the first dopaminergic societies would not be the last time in history that humans themselves would change their neurochemistry and pass it on to their offspring. Led by men with highly dopaminergic minds, the human race continued to expand in its technological and scientific sophistication in all parts of the world – and particularly in Europe, the Middle East, and East Asia. Beginning at the time of the Renaissance in Europe and the age of the great explorers and accelerating into the twentieth century, the dopaminergic mind entered a new stage and eventually came to its full (and arguably overripe) fruition.
In the next section, I will review how the dopaminergic traits of the leading (albeit flawed) figures in modern history led them to great accomplishments that would help shape the modern world, before concluding with a brief discussion of the modern hyperdopaminergic
3 A central thesis of Jaynes (1976), based on an analysis of the cultural and literary record from the ancient civilizations, is that between 2500 BC and 1500 BC the inner thoughts of humans that were once attributed to external voices (e.g. the gods), in much the same way as are the hallucinations of people with schizophrenia, began to be more attributed to the voices of inner consciousness. In addition, as evidenced in Homer’s Odyssey, humans increasingly began to be portrayed as in possession of their own thoughts and actions, and there were more literary references with abstract meaning (e.g. nouns ending with “ness”) and specific temporal connotation (e.g. “hesitate”). As noted in an earlier chapter, there is substantial evidence that the inhibitory control of the lateral dopaminergic system over the subcortical portions of the medial dopaminergic system is weakened in schizophrenia, thereby leading left-hemispheric inner speech to be perceived as auditory hallucinations.
6.2 The role of dopaminergic personalities in human history
The evolution from the hunter-gatherer and agrarian societies to our current industrialized societies occurred in but a speck of geological and even biological time. However, it did not occur overnight, and it did not necessarily occur by accident. Rather, human societies during the past 4,000 years have almost always been led or influenced by dominant individuals in the form of military rulers, religious leaders, explorers, and scientists. Highly dopaminergic individuals may be among the 2 percent or so of high-achievement individuals described by Toffler (1970), who are seemingly well-adapted to the environments of modern societies and who are in most cases its leaders. However, even highly successful individuals may be prone to hyperdopaminergic syndromes such as hypomania (Goodwin and Jamison, 1990) and obsessive-compulsive spectrum disorders such as workaholism, excessive risk-taking, and sexual addiction, and these individuals also may be part of families in which hyperdopaminergic disorders are common (Karlsson, 1974). It is also interesting to note in this regard that famous persons are, like those who suffer from schizophrenia and bipolar disorder, more likely to be born in the winter months (Kaulins, 1979), possibly due to high maternal dopamine levels in the summer months caused by higher temperatures and longer daylight hours (see Chapter 4).4
The fact that almost all famous individuals in history prior to the twentieth century were male is consistent with Carlyle’s famous quotation that “The history of the world is but the biographies of great men.”5 This phenomenon at least partly reflects the general predominance and admiration of male traits in human societies since the Bronze Age. But, most famous historical figures were not only male but also very youthful
4 Kaulins (1979) recorded the birth dates of 11,439 famous people listed in the 1974 edition of Encyclopedia Britannica. Across all fields, there was an 18 percent increase in January and February births relative to births in July and August. Of thirty-one fields categorized by Kaulins, twenty-three showed the excess winter-birth trend. (Kaulins actually had thirty-three categories, but two of them had only one person in each season.) See www.lexiline.com/lexiline/lexi118.htm.
6 Of the five individuals to be reviewed in this chapter, the only exception to this was Columbus, who was around forty-one at the time of his first voyage to the Western Hemisphere. However, his first formal petition to sail westward to the Indies was made eleven years earlier, and his dreams of sailing westward to glory probably began long before that.
geniuses are generally most productive around age forty, although this depends somewhat on:
- the particular profession, in that mathematicians and physicists and artists tend to peak younger than medical and social scientists;
- the historical era, in that the average lifespan of surviving adults was less than fifty-five years, which favored output at a younger average age; and
- the maturity of a field of endeavor.
The last factor may be increasing the peak age of productivity in modern societies, since at least ten years of experience are required in most fields today to do one’s best work (Simonton, 1994) and since most scientists today have so much to learn that they are still engaged in post-doctoral research even into their early thirties. On the other hand, many of the creative breakthroughs that led to significant achievements later on in life are known to have occurred much earlier in the person’s life, so the age of peak creativity is undoubtedly younger than the age of peak productivity. It has recently been argued that, for better (e.g. science) or worse (e.g. criminal activity), great male achievements occur early in adulthood because of high testosterone levels and the need of males to attract desirable female partners (Kanazawa, 2003). There may, indeed, be a testosterone link – as well as more mundane causes like greater family and other responsibilities, poorer health, and decreased energy levels that reduce productivity in middle age – but the testosterone link may not be as direct as implied. Testosterone increases the male sex drive, but it also elevates brain dopamine levels, which in turn increase aggressiveness, obsessiveness, and male sexual behavior. As previously discussed in Chapter 4, the testosterone–dopamine link accounts for why males are more likely to develop such hyperdopaminergic disorders as obsessive-compulsive disorder, mania, and schizophrenia in early adulthood. The decline in creativity and achievement in males with age is, therefore, highly consistent with the decline in dopamine levels in
Great historical figures cannot merely be considered obsessed, aggressive, and hypersexual, though many of them were clearly sexually promiscuous (Alias, 2000), including evidently a surprising number of great physicists such as Albert Einstein and Richard Feynman. What most of these famous men did possess was great intelligence, a superior working memory capacity, and a keen strategic vision (i.e. “far-sightedness”), in line with the dopaminergic emphasis on distant space and future time. These intellectual traits were coupled with a sense of personal (even religious) destiny and a restlessness and ambition to discover or conquer new worlds and/or promote new ideas. On the negative side, many if not most of these men were emotionally cold, somewhat deluded, and prone to other dopaminergic traits such as excessive risk-taking.
There are dozens of highly dopaminergic men of historical significance who could have been spotlighted in this chapter for their enormous working memories, incredible mental focus, and strong goal-directedness (see Alias, 2000; Simonton, 1994). The only persons to be highlighted here will be Westerners, partly due to my bias. I neglect to discuss two of the most infamous leaders of modern history (Adolf Hitler and Josef Stalin), not because they were lacking in dopamine – their clinical tendencies toward bipolar disorder and paranoia speak to the contrary – but because their negative “accomplishments” (arguably the greatest mass murderers in history) dwarf their positive ones. Nor have I highlighted any religious figures, again not because they were devoid of highly dopaminergic brains – on the contrary, religious drives and experiences are very much associated with high levels of dopamine (Previc, 2006) – but rather because their contributions are less amenable to objective historical analysis due to their perceived religious stature and, in many cases, a lack of contemporary and historically valid accounts. To illustrate the role of the dopaminergic mind in history, I have chosen to review briefly five persons who were arguably among the most influential in history – Alexander the Great, Christopher Columbus, Isaac Newton, Napoleon Bonaparte, and Albert Einstein (see Figure 6.1). Two of these rank among the greatest of military leaders, one was arguably the greatest explorer in history, and the other two arguably rank as the two most famous scientists ever. On any list of historical greatness, all would rank in the upper echelons because of their achievements. Though their personalities were different in many respects, they all shared a high degree of intelligence, a sense of personal destiny, a religious/cosmic preoccupation, an enormous focus on (obsession with) achieving supreme goals and conquests, an emotional detachment that in most cases led to ruthlessness, and a risk-taking mentality that led to consequences ranging from the merely embarrassing to the outright disastrous.7
7 Some of the “dopaminergic” also-rans include such intelligent, complex, and morally ambiguous (e.g. idealistic and ruthless) figures as Julius Caesar, Mustafa Kemal (Ataturk), and Winston Churchill. Another fascinating and influential dopaminergic mind – Howard Hughes – was the subject of the recent Hollywood blockbuster The Aviator. Hughes was a brilliant, driven, and creative aviator and movie pioneer who became the world’s first authentic billionaire, but he was also wracked by such debilitating hyperdopaminergic symptoms as obsessive-compulsive disorder, tics (a feature of Tourette’s syndrome), and delusional and paranoid thoughts (schizotypy).
Figure 6.1 Five famous dopaminergic minds in history: Alexander the Great, Christopher Columbus, Isaac Newton, Napoleon Bonaparte, and Albert Einstein.
Public domain images courtesy of the Library of Congress.
- 6.2.1 Alexander the Great
The Macedonian king Alexander (356–323 BC) ranks, alongside Genghis Khan, as one of the two greatest conquerors in history. But, like Genghis Khan, his greatness stems not just from his military victories but also from his great intellect and vision for the administration of his empire, most of which did not come to fruition because of his early death. Unfortunately, Alexander’s legacy was also marred by a ruthlessness that, while not approaching that of the Mongol leader, nevertheless is shocking by today’s standards.8
Alexander was raised as the son of Philip, king of Macedonia. He had a keen intellect, and one of his teachers was Aristotle. Upon his father’s death, he ascended to the throne and quickly conquered the other Greek city-states and extended his control to the Black Sea, and he soon began to unify the Greeks against their great rivals, the Persians. His initial goal was to avenge an earlier invasion by the Persians against Greece and to claim the Persian throne for himself. He attacked and defeated Darius at Issus and then successfully laid siege to the entire eastern Mediterranean region. He returned to Persia and fought a decisive battle at Gaugamela, in which the larger Persian army was defeated and the capture of the cities of Babylon and Persepolis ensued.
Had Alexander stopped at this point, his legacy would not have been as large in history. Alexander had, even before his conquest of Darius’s dominions, been offered the Western half of the Persian empire, an offer of which his friend and general Parmenion is said to have argued, “I would accept the proposal if I were Alexander,” and to which Alexander famously replied, “So would I, if I were Parmenion.” Nor was Alexander content to be considered “king of the Persians;” rather, he wanted to become “king of Asia” and ultimately “king of the World.” He aimed to conquer the entire extent of the world, which Aristotle believed lay as far east as the Indus River. Alexander’s armies traveled as far north as the Samarkand region of central Asia, crossed the Hindu Kush in a remarkable campaign and then delved south to cross the Indus River in what is now Pakistan. Alexander’s men eventually rebelled against going farther, and Alexander ultimately knew that he had not conquered the entire world east of Greece. But, he had built an empire greater than anyone before him and had brought about an enormous exchange of trade and culture as the Asian and European cultures made their first significant encounter.
8 Most of this account comes from Alexander the Great: The Invisible Enemy by O’Brien (1992), The Genius of Alexander the Great by Hammond (1997), and Alexander the Great, a web biography by Jona Lendering (www.livius.org/aj-al/alexander/alexander00.html).
Alexander’s greatness resulted not only from the size of his conquests, the daring and brilliance of his military strategies – reflected in the fact that his victorious army was often outnumbered by native armies in the Asian campaigns – or his ability to form alliances. Alexander also possessed a greater vision than any of his contemporaries, as he undertook a policy of fusion, both economically (through trade) and socially (through intermarriage). He was a great proponent of Greek culture and he helped to transmit it far and wide throughout Asia, and he trained Asian soldiers at a young age in Greek culture and martial arts. But, he also adopted many Asian cultural elements, let Asian satraps run much of the empire, and even wanted to make Babylon the future home of his empire. His brilliance, strategic vision, and tremendous sense of personal destiny must be regarded as some of his more “positive” dopaminergic traits.
Unfortunately, Alexander’s legacy was tarnished by the consequences of his more negative dopaminergic traits. Whatever more noble aims may have inspired his original conquests were more than offset by ruthlessness and, in later years, a paranoia fueled by alcohol. One example of this was the beachfront crucifixion of 2,000 men of Tyre (a city in southern Lebanon), which had resisted his army for several months before falling. Other examples were his wholesale destruction of Persepolis, which he later somewhat regretted, the wholesale slaughter of Greek expatriates in India for a cultural transgression, and his senseless purges of administrators following the return to Persia after his disastrous return trip across the Makran Desert, in what is now southern Iran. His daring was accompanied by a restlessness and even stubbornness, which led to the disastrous Makran crossing, in which by some accounts almost 80 percent of his army may have perished. Finally, his tremendous belief in his abilities and destiny was offset by a mystical, megalomaniacal streak in which he cultivated an almost god-like image of himself. As a member of the Macedonian royal family, he could somewhat “legitimately” claim to be descended from demigods such as Hercules, and he acquired royal lineage during conquests (such as being the heir to Ammon in Egypt), but these were mythical and titular links. Alexander’s mysticism, however, took the form of a much more grandiose self-deification, which led to ridicule and even a serious near-mutiny by his Macedonian troops in one instance, which he quelled by supposedly accepting his mortal lineage in front of them in one of his more masterful scenes.
Alexander’s early death at the age of thirty-two led to the rapid division of his physical empire, but the cultural unifications brought about by his conquests would continue centuries after his death and his legacy would especially influence the psyche of ancient Rome.
- 6.2.2 Christopher Columbus
Christopher Columbus (“Colon” in Spanish) (1451–1506) certainly ranks as one of the most influential and mysterious persons in history.9 Despite his so-called “discovery” of the Western Hemisphere – arguably the single-greatest event in all of history – Columbus’s life is riddled with enigma, including his birth date (probably late summer or early fall of 1451), birthplace (widely but not universally accepted as Genoa, Italy), and his religion (alternatively, Jewish by family with later conversion to Christianity, or pious Christianity throughout). However, as described by Wilford (1991), the driving force behind Columbus’s obsession with finding a Western passage to the Far East has befuddled historians most of all.
What is known is that Columbus received throughout the course of his life a more than adequate education for his times. He left in his early teens for a life at sea, initially as a Portuguese sailor, traveling widely along the coastlines of Europe and Africa, including the Canary Islands, the westernmost extent of the Spanish empire. Columbus gained a good knowledge of geometry, astronomy, and cartography, as would have been required of navigators, and he became quite proficient at several languages, including Latin. It is not clear when or how Columbus first entertained the belief that a westward course to Asia could be achieved. There were previous indications of a populated land mass to the west, because westerly winds and currents (so named because of their origination in the west) would occasionally carry artifacts and even occasional dead bodies to the islands lying off the western coast of Africa (i.e. the Azores, Canaries, and Madeiras). There was also the belief of the Italian scholar Toscanelli, with whom Columbus corresponded, that a westward passage could succeed, and there were even apocryphal stories of an encounter with a mariner in the Madeira Islands who secretly told Columbus of being blown off-course in a westward direction and briefly landing on a large land mass. But, most of these clues would have been available to other sailors, yet only Columbus developed the single-minded obsession to sail westward. Hence, researchers have concluded that only by understanding Columbus’ personality can his great quest and discovery be understood.
Columbus knew that, as the Portuguese methodically extended their voyages along the coast of Africa, a maritime route to the Indies and their fabulous wealth would eventually be discovered. But, he was
restless and could not wait for such an eventuality, which because of its length might in any case prove unprofitable relative to the overland routes already in existence. Columbus recognized the great consequences of sailing westward to the Indies and totally convinced himself, for what in retrospect turned out to be conveniently flawed reasons, that it was feasible to make such a voyage.10 Partly there was the allure of riches, but there was also the delusion that he was chosen by God to be the instrument by which the westward route would be discovered, and that his exploits would lead to great wealth and power that Spain would eventually use to reclaim the Holy Land. Columbus relayed these delusional and even paranoid thoughts in the Book of Prophecies, drafted in the 1499–1500 timeframe before his fourth voyage, a work that many historians dismissed as the result of Columbus’ poor mental state following the humiliations suffered after his initial voyages. But, West (cited in Wilford, 1991) as well as Wilford himself argue convincingly that the seeds of his delusions and mysticism were present long before his initial voyage to America and reflected a core element of Columbus’s personality – a delusion of grandeur combined with an intense mysticism, going far beyond normal piety (Wilford, 1991).
This delusional and mystical side of Columbus coalesced with his other great dopaminergic traits – intelligence, restlessness, and obsessiveness. As regards his intelligence, Columbus had a keen mind, a variety of intellectual interests, and a successful self-education. Columbus was prone to making creative although often outlandish and even comical hypotheses and associations, and he was a brilliant if somewhat intuitive navigator.11 His restlessness is well-documented, both before and after his initial voyage, but his dogged pursuit of the dream of sailing westward is a testament to both the strength of his delusions as well as his inordinate goal-directedness.
Like most great historical obsessions, Columbus’s vision was con- ceived in his youth. His first formal petition to mount a westward
10 Columbus assumed that East Asia extended further eastward than actually was the case, that the Earth was 25 percent smaller than in reality (because he accepted the low end of contemporary estimates of the conversion of degrees to nautical miles), and that there were likely to be as-yet undiscovered islands that would serve as stepping stones to the East Indies. His final estimate of the distance from Spain to Japan (then known as Cipangu) was only 2,400 nautical miles, or about 25 percent of the actual distance!
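The scale of the miscalculation in the footnote above can be checked with a short sketch. The 2,400-nautical-mile estimate and the "about 25 percent of the actual distance" ratio are the footnote's own figures; the implied true distance is merely derived from them here, not an independent measurement.

```python
# Figures taken from the footnote; the implied actual distance is derived.
columbus_estimate_nm = 2400   # Columbus's final Spain-to-Japan estimate
fraction_of_actual = 0.25     # footnote: ~25 percent of the real distance

implied_actual_nm = columbus_estimate_nm / fraction_of_actual
shortfall_nm = implied_actual_nm - columbus_estimate_nm

print(int(implied_actual_nm))  # -> 9600
print(int(shortfall_nm))       # -> 7200
```

In other words, taking the footnote's numbers at face value, the true westward distance was on the order of 9,600 nautical miles, leaving roughly 7,200 nautical miles unaccounted for in Columbus's plan.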
11 Morison (1983) proposed that, because of their enormous expertise at both ocean and inland sailing, Columbus and James Cook rank as the greatest navigators of all time. Columbus’ eight remarkably successful transatlantic voyages and his explorations of uncharted lands and waters in the space of a decade arguably rank as the greatest maritime feat in history. Morison provides example after example of Columbus’s genius for dead-reckoning and his amazing ability to negotiate and overcome a variety of formidable obstacles on his various sails.
expedition was made to his native countrymen (the Genoans), when Columbus had just turned thirty. He then unsuccessfully petitioned the king of Portugal for a fleet to sail westward in 1483, at the age of thirty-two. For the next decade, his determination never failed, and in 1492 he finally was successful in obtaining his first expeditionary fleet from the Spanish throne, along with a claim to enormous potential riches and titles. Columbus experienced extraordinary luck on his voyage, unknowingly hitting the easterly tradewinds on the outbound journey and the westerlies on the return sail. He and his crews made landfall after passing the 3,400 nautical mile point (roughly 1,000 nautical miles beyond his original estimate of the distance of Japan), just after he had agreed to turn back if no landfall was sighted by the very next day. Columbus eventually reached the Western Hemisphere in October of 1492 in the Bahamas chain, probably on what was formerly known as Watling Island and since rechristened San Salvador Island. Columbus would make four voyages in all, and he was the first modern European to discover Central and South America as well as every major island in the Caribbean. Although Columbus failed in his effort to find a passage to the Indies, even he eventually began to realize he had discovered a new continent in the process.12
Columbus’ discovery had an astounding psychological impact on Europe and is said to have inspired as much as any other event the transition from medieval Europe to a new age of exploration (Morison, 1983), which in turn inspired a great many scientific and mathematical discoveries. Columbus achieved an even greater iconic status in the emerging United States, with a host of American cities, rivers, universities and other landmarks named after him. For centuries, Columbus’s discovery was regarded mostly in a triumphal light, but the darker side of his dopaminergic personality led to consequences that are now regarded as tragic and even catastrophic. While his first encounter with the Native Americans of the Caribbean was relatively benign, his second one led to the formation of a colony (Isabel, on the island of Hispaniola) and enslavement of those natives who resisted the Spanish. Columbus does
12 Much is made of Columbus’s supposed “blunder” in not realizing that he was nowhere near the East Indies. Columbus did recognize that he had discovered a new continent in South America, but even after his final voyage there was no formal proof (although there were lots of suggestions) that he was not in the vicinity of the East Indies. My general opinion of Columbus echoes that of leading historians in that he had a dual personality – a medial dopaminergic one that was propelled by mystical and cosmographical ideas and an almost unshakable belief that he had reached the East Indies, along with a lateral frontal dopaminergic system that supported his impressive empirical observations and reasoning and would have eventually led him to the realization of his correct geographical location.
not bear full responsibility for the tragedy that unfolded, being himself initially fair in his treatment of the natives, whom he merely wanted to convert, not punish. However, once conflict with the natives had been initiated by others, Columbus became extremely harsh in his treatment of the natives and even somewhat harsh in commanding his own people, the latter treatment leading to a series of mutinies and his subsequent imprisonment at the end of his third voyage. The harsh tributes he demanded of the natives and their subsequent enslavement, partly to provide some measure of commercial return,13 were particularly egregious, and most of the slaves he brought back died on their way to Spain. In the end, it was estimated that of the original 300,000 natives living on the island of Hispaniola (now containing the Dominican Republic and Haiti), a third perished during the first few years of Columbus’s governance and only about 15,000 survived twenty years after the initial Spanish conquest (Morison, 1983). This was a prelude to an even more ghastly fate awaiting the natives of the American mainland, whose population after a century of Spanish rule was almost totally decimated, from a pre-conquest estimate of fifty million to but a few million in the end.
- 6.2.3 Isaac Newton

If Columbus’s discoveries sounded the beginning of the end of the Medieval period, the discoveries of Isaac Newton (1642–1727) represented a crushing, final blow. His brilliant discoveries and analysis led to the first-ever systematic understanding of the cosmos and two of its most mysterious elements – light and gravity – and cemented his rank as arguably the most influential scientist and mathematician in history.14
Newton showed brilliance as a student early on, and it was in his late teens that it became clear that he was better suited for academia than for the farm (which nonetheless provided him some practical training early on). Newton was admitted to Cambridge University at the age of eighteen and quickly mastered the principles of Euclidean geometry, but it was shortly after graduation at the age of twenty-two that he conceived his great ideas about light and gravity and even the calculus (which he
13 Columbus was aware that, because of his failure to find major gold deposits and other commercially valuable goods in Hispaniola, not only his future wealth but also his future explorations funded by the Spanish monarchs were in jeopardy. This was a major factor that led to his resorting to slave-trading as an alternative source of revenue.
14 This account of Newton is distilled from numerous sources, including Isaac Newton by Gleick (2003), Newton’s Gift by Berlinski (2000), Isaac Newton: Reluctant Genius by Ipsen (1985) and the more critical source Newton’s Tyranny: The Suppressed Discoveries of Stephen Gray and John Flamsteed by Clark and Clark (2001).
termed “fluxions”) in what has been termed a “miraculous” year. His prodigious intellectual creations occurred largely in isolation, when he had returned to his small town after the plague had once again swept through the cities of Europe. Newton, like many scientists, worked mostly alone, but the isolation throughout his early life was largely self-imposed and so extreme that many of his ideas were published long after their creation and only after much coaxing by friends, or were eventually published by someone else (e.g. the calculus and some of his Bible theories). He was personally somewhat unkempt (though less so when he was working for the British Mint) and he would get so absorbed in his distant and abstract thoughts that he would often fail to eat or sleep.15 Yet, Newton could also be very practical in that he invented numerous devices, including the sextant and the best telescope of his time (based on reflection), and engaged in a lifelong, hands-on pursuit of alchemy.
His two great works – The Mathematical Principles of Natural Philosophy (known by its Latin moniker “Principia”) and Opticks – represent his lasting achievements, although both were published long after his initial ideas (the first volume of Principia was published at the age of forty-four and Opticks at the age of sixty-two). The Principia in particular arguably represents one of the two greatest scientific triumphs in history – the other being Einstein’s general theory of relativity, whose equations were published in 1916. Principia offered a unified explanation of a host of terrestrial and astronomical phenomena in the form of the universal law of gravitation, which was a stellar example of dopaminergically mediated creative association of “remote” phenomena. Newton presented his concepts not only in scientific jargon but also in the form of mathematical equations, which had previously been mainly limited to astronomy. It required Newton’s combined genius as a rigorous experimentalist as well as the greatest mathematician of his time.16 Newton, as both scientist and mathematician, did not hesitate to tackle the abstract concept of infinity, which Descartes had shied away from. By unifying physics and mathematics, Newton’s fame spread far and wide, and he became the first scientist ever to be knighted by the English monarch.