
Summarizing Neuropsychologist Elkhonon Goldberg’s Book Entitled “The New Executive Brain: Frontal Lobes in a Complex World” (Part II)


In my December 6th, 2011, post, I began a multi-part series in which I announced my plan to summarize the book by neuropsychologist Elkhonon Goldberg entitled The New Executive Brain: Frontal Lobes in a Complex World. The summary takes the form of a series of bullet points spread over multiple posts. This post contains part II:

•  I’d like to begin part II by pointing out something rather odd about Elkhonon’s take on EF, or executive functioning (planning, mental time travel, staying focused, switching attention, mental modeling, etc.): throughout his book Elkhonon draws heavily on the world of organismic systems theory (as principally defined by Ludwig von Bertalanffy) yet never once refers to systems theory by name. I would suggest that the following quote reveals that he is working from a systems theory background even though he never points to it:

[A]ttention can best be described as a loop-like process involving complex interactions among the prefrontal cortex, ventral brain stem, and posterior cortex. Breakdown anywhere along this [dynamic feedback] loop may interfere with attention, thus producing a form of attention deficit disorder.

Just as Elkhonon cannot explain how EF gets set up and how it works without using a systems framework, so too Bowlby had to use a systems framework to make sense of how early safe and secure attachment relationships lead (if all goes well) to open and flexible Inner Working Models (which, as mentioned in part I, are part and parcel of EF). The next bullet point is another example of the systems frame that Elkhonon uses but never names directly.
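But first, to make the feedback-loop idea concrete, here is a minimal toy sketch (entirely my own illustration, not anything from Goldberg’s book) of why a breakdown anywhere along a closed loop degrades the behavior of the loop as a whole:

— begin code sketch —

# A toy feedback loop: an attention "signal" circulates through three
# stations; damage anywhere in the cycle degrades the whole.
def run_loop(gains, signal=1.0, cycles=10):
    """Circulate a signal through a closed loop of stations.

    gains maps station name -> multiplier (1.0 = intact,
    < 1.0 = partial breakdown at that station).
    """
    for _ in range(cycles):
        for station in ("prefrontal", "brainstem", "posterior"):
            signal *= gains[station]
    return signal

intact = {"prefrontal": 1.0, "brainstem": 1.0, "posterior": 1.0}
lesioned = dict(intact, brainstem=0.9)  # breakdown at ONE station only

print(run_loop(intact))    # 1.0   -- attention is sustained
print(run_loop(lesioned))  # ~0.35 -- the whole loop degrades

— end code sketch —

The point of the toy: in a closed loop there is no unimportant station. A deficit at any single node shows up as a deficit of the whole, which is exactly why a systems frame, and not simple localization, is needed to think about attention.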

•  Consider the following quote by Elkhonon:

It should come as no surprise that the features of ADD [attention deficit disorder] are usually combined with some aspects of executive deficit. When the executive deficit is severe, the diagnosis of ADD becomes superfluous. But when it is mild, when the attentional impairment stands out and the executive deficit is minimal, then the diagnosis of ADD is properly made. In most cases a biochemical disorder affecting the frontal lobe connections is present, but there is no actual structural damage to the frontal lobes. In some cases the attentional deficit is highly encapsulated and may coexist with supreme capacity for planning and foresight. Winston Churchill may have been a case in point. Numerous descriptions of his behavior are strongly evocative of ADHD [attention-deficit/hyperactivity disorder]. Yet he was the man who foresaw the danger of Lenin, then Stalin, then Hitler, then Stalin again, ahead of most other political leaders of the free world. Thus he can hardly be faulted for a lack of foresight.

Elkhonon points out that an attentional deficit can be “highly encapsulated and may coexist with supreme capacity for planning and foresight.” During a workshop several years ago, neurobiologist Louis Cozolino (workshop summary available) called this form of encapsulation “splinter cognition.” Cozolino told us that many savants suffer (or are blessed with) splinter cognition: a cognitive ability that is split off from the rest of the overall cognitive system. In his 1969 book General System Theory (summary available), Ludwig von Bertalanffy spends a bit of time talking about what he calls progressive segregation. Allow me to pull this excerpt from my summary of GST:

— begin excerpt —

At this point, von Bertalanffy mentions a systems concept that he calls progressive segregation. As von Bertalanffy puts it, progressive segregation “appears to be unusual in physical systems but is common and basic in biological, psychological and sociological systems.” Progressive segregation occurs when a “system passes from a state of wholeness to a state of independence of the elements.” As von Bertalanffy explains it, “The primary state is that of a unitary system which splits up gradually into independent causal chains.” von Bertalanffy continues, “The reason for the predominance of segregation into subordinate partial systems implies an increase of complexity in the system.” “As long as the system is a unitary whole” so says von Bertalanffy, “a disturbance will be followed by attainment of a new stationary state, due to interactions within the system. The system is self-regulating.” However, von Bertalanffy makes it clear that a system can only maintain a self-regulating state up to a certain point. If the system is shocked or traumatized beyond this point, a state of “independent causal chains” will result. “The partial processes will go on irrespective of each other,” to quote von Bertalanffy. I think this is an important point to keep in mind because a state of “independent causal chains” can be achieved via two paths (von Bertalanffy calls this equifinality): 1) increasing complexity of a system, or 2) a breakdown of the self-regulating nature of a system (e.g., the system becomes dysregulated). As an example, when a parent decides to allow his or her infant or child to “cry it out,” more than likely the system will become dysregulated and a state of “independent causal chains” will result—partial processes going on irrespective of each other. In essence, the system is allowed to devolve into a collection of mechanistic subsystems characterized by direct cause and effect. At this point, the various behavioral systems will present themselves as separate entities.

— end excerpt —
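Since progressive segregation is such a pivotal idea here, allow me a minimal toy sketch of it (my own illustration, not von Bertalanffy’s): two subsystems that regulate each other while coupled into a whole, and that follow independent causal chains once the coupling is gone.

— begin code sketch —

# Toy model of progressive segregation: two subsystems coupled into
# a self-regulating whole vs. decoupled into independent causal chains.
def step(x, y, coupling):
    # Each subsystem is pulled toward the other in proportion to how
    # strongly the whole is integrated (the coupling term).
    dx = coupling * (y - x)
    dy = coupling * (x - y)
    return x + dx, y + dy

def simulate(coupling, steps=50):
    x, y = 1.0, 0.0  # a disturbance: the two parts start in disagreement
    for _ in range(steps):
        x, y = step(x, y, coupling)
    return round(x, 3), round(y, 3)

print(simulate(coupling=0.2))  # (0.5, 0.5): the whole self-regulates
print(simulate(coupling=0.0))  # (1.0, 0.0): independent causal chains

— end code sketch —

With the coupling intact, a disturbance is followed by attainment of a new stationary state, just as von Bertalanffy describes; with the coupling gone, the partial processes simply go on irrespective of each other.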

I know what you are thinking: maybe stressing an overall cognitive system in such a way that a “splinter” or “capsule” form of cognition results, thus giving us a savant state, isn’t such a bad idea. I’m not sure this can be done. As a matter of fact, I remember a movie (made from a book) from the late 1970s, The Boys from Brazil (1978), in which the plot centered on raising a group of boys exactly as Hitler had been raised, with the intention of creating another despotic leader.

•  Even though an individual effort would in all likelihood not produce a desired savant skill, I think society as a whole can select for certain savant skill sets. Today, as evidenced by such powerful business people as Bill Gates, the late Steve Jobs, Mark Zuckerberg, and Jerry Yang (of Yahoo fame), brain workers who (it is often speculated) fall on the autism spectrum are being selected for. In much the same way Churchill was able to spot despotic leaders (and understand their intentions), the above brain workers are able to spot and develop digital opportunities. Most would agree that Steve Jobs truly was a savant in this regard. Consider this excerpt from my summary of a book by David Anderegg entitled Nerds: Who They Are and Why We Need More of Them (2007, Tarcher/Penguin).

— begin excerpt —

Anderegg observes, “Many social skills deficits will be less and less debilitating in the future.” This agrees with the work Richard Florida presents in his book The Rise of the Creative Class, as well as the work social commentator Robert Putnam presents in his book Bowling Alone: The Collapse and Revival of American Community. Here’s a “showstopper” statement by Anderegg: “We may be entering an era when people who have excellent social skills but little ability or inclination for focused work in the symbolic realm cannot [engage in Freud’s] love and work. They may be able to love [the main focus of the current compassion movement], but the work part may get harder and harder for the man whose primary skill is conversation.” In essence, Anderegg is telling us that in the near future we will witness a shift whereby “brain workers” will become increasingly successful while more and more “back workers” will continue to fail. As Anderegg puts it, “The direction of material success as well as power is all on the side of nerds and geeks, and sooner or later they will rewrite the DSM (Diagnostic and Statistical Manual of Mental Disorders) as well.”

— end excerpt —

Anderegg was correct: the diagnosis of Asperger’s syndrome (a high-functioning form of autism) will be removed from the next version of the DSM. Simply put, it’s hard to have so many high-powered corporate leaders and, at the same time, diagnose them as having a mental pathology or deficit. As society continues to select for the nerd, it is you and I who will be diagnosed as possessing a mental pathology or deficit. From an evolutionary perspective, splinter cognition could be looked at as a mutation. And if that mutation provides for greater adaptability (e.g., the adaptability that Asperger’s affords in an increasingly digitally defined world), it will be selected for. It’s tough if you’re the one (like me) being selected against. To bring this back around to Elkhonon, consider his thoughts on how certain executive functions may fit with certain environmental situations:

Strategic military planning or strategic corporate planning usually do not require unflagging, split-second alertness and thus may be successfully executed even when attention is somewhat compromised, hence the Winston Churchill phenomenon.

•  At this point, Elkhonon talks about how the mid-brain areas collectively have a role or life of their own, while the upper-brain areas (where EF skills come to life) collectively have a different role or life. Elkhonon states:

To understand the division of labor between the prefrontal cortex and the basal ganglia in a developed mammalian brain, we must consider a hierarchy of contexts and context sizes.

OK, again, talking about “a hierarchy of contexts and context sizes” conjures up visions of von Bertalanffy’s systems theory. Elkhonon continues:

The basal ganglia make their “executive decisions” (to act or not to act and how to act) on the basis of a very narrow context. By contrast, the prefrontal cortex makes its executive decisions on the basis of a much broader, richer context.

Allow me to quote Elkhonon at some length as he tries to explain the difference between the context of the mid-brain and that of the upper-brain:

We leaf through a book but we tap a computer keyboard [as I am doing right now]. Does this mean that we grab a knife, lift a cup, leaf through a book, or tap a keyboard every time we encounter one in our environment [like young children often do]? Not at all—we do it only when a larger context warrants it: when we need to cut a steak, when we are thirsty and the cup is filled and clean, when we want to extract some information from a book, or send an e-mail to a friend. In fact, most of the time we encounter any of these objects we simply ignore them. A mere encounter with an object does not automatically trigger the object-associated behavior. But in patients with severe Tourette’s syndrome or with prefrontal damage it often does, the famous “utilization behavior” described by François Lhermitte.

Why do we see “utilization behavior” in both Tourette’s syndrome and focal prefrontal lesions? Because in both conditions the basal ganglia escape from the frontal lobe control and BEGIN TO OPERATE ON THEIR OWN [my emphasis], the way they had been operating before the frontal lobes arrived on the evolutionary scene.
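The division of labor Elkhonon describes, a narrow stimulus-bound context versus a broader goal context that gates it, is easy to caricature in code. Here is a minimal sketch (my own illustration, not Goldberg’s) in which removing the “frontal gate” produces exactly the utilization behavior described above:

— begin code sketch —

# Toy sketch of the hierarchy of contexts: a narrow object->habit
# mapping (standing in for the basal ganglia) gated by a broader
# goal context (standing in for the prefrontal cortex).
AFFORDANCES = {  # narrow context: each object triggers one habitual act
    "knife": "cut",
    "cup": "drink",
    "book": "leaf through",
    "keyboard": "tap",
}

def act(obj, goals, frontal_intact=True):
    habit = AFFORDANCES.get(obj, "ignore")
    if not frontal_intact:
        # The gate is gone: the mere encounter triggers the habit,
        # i.e., Lhermitte's "utilization behavior."
        return habit
    # Intact prefrontal gate: release the habit only when the larger
    # context (the actor's current goals) warrants it.
    return habit if obj in goals else "ignore"

print(act("cup", goals=set()))                        # ignore
print(act("cup", goals={"cup"}))                      # drink (we're thirsty)
print(act("cup", goals=set(), frontal_intact=False))  # drink anyway

— end code sketch —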

•  Elkhonon brings out an important point: the mid-brain is centrally concerned with what he calls “veridical” information, whereas the upper-brain is concerned with what he calls “adaptive” or “actor-centered” information. I think this is a wildly important point to get, so, once again, I’ll let Elkhonon describe it to you:

In a nutshell, veridical decisions deal with “finding the truth,” and adaptive, actor-centered decisions involve choosing “what is good for me.” Most “executive leadership” decisions are priority based, made in ambiguous environments, and adaptive, rather than veridical, in nature. The cognitive processes involved in resolving ambiguous situations through priorities are very different from those involved in solving strictly deterministic [e.g., S-R or stimulus-response] situations. Ironically, cognitive ambiguity and priority-based decision making have been all but ignored in cognitive neuropsychology. … [T]he lack of satisfactory scientific methods [that a systems paradigm would bring] does not change the fact that priority-based, adaptive decision-making in ambiguous situations is central to our lives, and that the frontal lobes are particularly important in such decision making. So rather than brushing the problem aside as “unworthy,” the appropriate scientific methods must be found.
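As a toy illustration of that distinction (again my own sketch, not anything from the book), a veridical decision is a lookup whose answer the facts fully determine, while an adaptive, actor-centered decision is a ranking of options against the actor’s own priorities, with no single “true” answer:

— begin code sketch —

# Veridical: one correct answer, fixed by the facts.
def veridical(question):
    facts = {"2 + 2": "4", "capital of France": "Paris"}
    return facts[question]

# Adaptive, actor-centered: score the options against the actor's own
# priorities; a different actor (or the same actor with different
# priorities) will legitimately choose differently.
def adaptive(options, priorities):
    def score(option):
        return sum(weight for need, weight in priorities.items()
                   if need in option)
    return max(options, key=score)

print(veridical("2 + 2"))  # "4", for every actor, every time

jobs = [{"salary", "commute"}, {"salary", "meaning"}, {"meaning"}]
print(adaptive(jobs, {"meaning": 3, "salary": 1}))  # {'salary', 'meaning'}
print(adaptive(jobs, {"commute": 3, "salary": 1}))  # {'salary', 'commute'}

— end code sketch —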

I wish to draw the reader’s attention to the fact that if all we are concerned with is veridical information, simple cause and effect, an S-R world view, limited contexts, certainty, etc., then we can live comfortably out of the mid-brain. And many in the scientific world have gone to great lengths to reduce our existence to that of the mid-brain. I’ll end with a copy of an email I recently sent to a colleague concerning a report that just came out charging that we overmedicate kids in foster care:

— begin email —

I trust you saw the article on “drugging foster care kids.” I just finished reading what turned out to be a truly fascinating book: Artificial Happiness: The Dark Side of the New Happy Class by Ronald Dworkin. Dworkin is an M.D. (anesthesiologist) and a Ph.D. Dworkin does an amazing job tracing out the roots of the prescription drug epidemic, which, apparently, is now devastating foster care. He goes into detail concerning the turf war between psychiatrists and primary care doctors that raged back in the 1970s and 1980s. As you might expect, primary care doctors won (aided in large part by the rise of managed care). Dworkin characterizes this win as a win over mind-body dualism, a win that comes by way of the monism that primary care doctors have created. Here’s how Dworkin describes it:

“Although many doctors have never heard of monism or the mind-brain debate, they benefit from monism’s triumph. The story of Artificial Happiness is a story of competing ideologies. Because the medical profession’s ideology was rivaled only by religion, any retreat by religion affected the doctors’ position. Monism’s triumph over dualism signified not only religion’s loss but also medicine’s gain. When neuroscientists showed how the mind could spring from the brain, doctors reaped an overarching conceptual framework to supplement psychotropic drug ideology, thereby boosting Artificial Happiness’s legitimacy.”

A bit further along Dworkin writes:

“The union of cognitive psychology and neuroscience in the late 1980s made monism’s triumph official. Cognitive neuroscience merged the study of mental activity with the biology of the brain and became a symbol and an inspiration, a message to the outside world that monism was now medical science’s official creed.”

All this to say that this new monism of medical science is what is legitimating the foster care drug epidemic. To change it, you’d have to go up against this monism. Good luck. Even Bowlbian attachment theory is going the way of this monism: all of attachment functioning can be explained by neuronal firing patterns as revealed by fMRI studies.

I don’t think there’s much one can do about this new monism, but I think it behooves one to know where it came from, why, and what its entailments are. Personally, I think Dworkin does a great job with this topic. So much of the Artificial Happiness story is the Artificial Attachment story as well. Why engage in real attachment when the artificial variety is so easy, so readily available, and so endorsed by a body of cultural authority: medical doctors?

— end email —

Maybe, just maybe, the current epidemic of ADD and ADHD expresses a desire to escape the reduced monism of the mid-brain. People like von Bertalanffy and Bowlby (and many other systems thinkers) tried to break the confines of a reduced mid-brain paradigm. Maybe ADD and ADHD are nature’s way of trying to move us from the limited context of the mid-brain to the larger, richer world of the adaptive and the ambiguous. Feeding kids (and adults) psychotropic drugs amounts to an all-out frontal attack (no pun intended) on such efforts. Who knows, maybe we will witness the birth of all manner of splinter or encapsulated cognition, and who knows which forms will be selected for. The nerd splinter skill set is certainly being selected for.

Does the monism of materialism work? Yup, sure does. But it works by forcing us to live out of the mid-brain. As Dworkin alerts us, “[I]n the late 1960s, during the medical profession’s crisis, the biogenic amine theory left the realm of science and became ideology.” It is the biogenic amine theory that associates neurotransmitters with depression. That’s it. That’s the single idea that legitimates drugging foster care kids. Heck, it’s the single idea that legitimates drugging an entire culture. This is what happens when a scientific theory leaves the rails and becomes an ideology. When TV ads for antidepressants state (rather quietly) that scientists “believe” that a reduction in neurotransmitters causes depression, take the “believe” part seriously: it’s code for ideology.