Quick Look: Autistic and Non-Autistic Brain Differences Isolated for First Time

Autistic and non-autistic brain differences isolated for first time — ScienceDaily.

I recently read the article Autistic and Non-Autistic Brain Differences Isolated for First Time. The article profiles work being done at the University of Warwick. Here’s how the article starts out:

The functional differences between autistic and non-autistic brains have been isolated for the first time, following the development of a new methodology for analysing MRI scans.

Developed by researchers at the University of Warwick, the methodology, called Brain-Wide Association Analysis (BWAS), is the first capable of creating panoramic views of the whole brain and provides scientists with an accurate 3D model to study.

Here’s a brief description of the BWAS process. Suffice it to say that BWAS involves processing a staggering amount of data; clearly, highly sophisticated computer systems are required:

The researchers used BWAS to identify regions of the brain that may make a major contribution to the symptoms of autism.

BWAS does so by analysing 1,134,570,430 individual pieces of data; covering the 47,636 different areas of the brain, called voxels, which comprise a functional MRI (fMRI) scan and the connections between them.
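That billion-plus figure isn’t arbitrary: it is exactly the number of unordered pairs of voxels, i.e., every possible connection between the 47,636 brain regions. A quick sketch in Python verifies the arithmetic (the voxel count comes from the quote above):

```python
# Voxel count cited in the article for one fMRI scan
voxels = 47_636

# Each "piece of data" is one connection between two distinct voxels;
# the number of unordered pairs of n items is n * (n - 1) / 2
pairwise_connections = voxels * (voxels - 1) // 2

print(f"{pairwise_connections:,}")  # 1,134,570,430 — the article's figure
```

So BWAS is, in effect, examining the functional connectivity of every voxel with every other voxel across the whole brain.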

Yeow! That’s amazing. I’ll leave you to read more about BWAS at your leisure. In this Quick Look, I’d like to highlight the findings because I think they reveal something about autism that is quite interesting.

The article quotes Professor Jianfeng Feng, from the University of Warwick’s Department of Computer Science, when the professor states:

We identified in the autistic model a key system in the temporal lobe visual cortex with reduced cortical functional connectivity. This region is involved with the face expression processing involved in social behaviour. This key system has reduced functional connectivity with the ventromedial prefrontal cortex, which is implicated in emotion and social communication.

This finding is nothing new. Researchers have known for some time now that persons on the autism spectrum have a difficult time reading emotions displayed through facial expressions. For more on this topic, see Simon Baron-Cohen’s 1995 book entitled Mindblindness: An Essay on Autism and Theory of Mind. It’s this next finding that I find very interesting:

The researchers also identified in autism a second key system relating to reduced cortical functional connectivity, a part of the parietal lobe implicated in spatial functions.

It would seem that persons on the autism spectrum have a difficult time with what cognitive scientists call spatial cognition. But this raises the question, “Why are ‘reading facially expressed emotions’ and ‘spatial cognition’ related?” Here’s the article’s take:

The [researchers] propose that these two types of functionality, face expression-related, and of one’s self and the environment, are important components of the computations involved in theory of mind, whether of oneself or of others, and that reduced connectivity within and between these regions may make a major contribution to the symptoms of autism.

So, it would appear that reading faces (and developing a theory of mind) involves being able to read the environment. And when you think about it, this makes sense. When we see a person crying, we tend to look around to see what’s going on in the environment that may be contributing to or causing the crying. If I see a person crying at a cemetery, reading the environment allows me to put the crying into a context: the person more than likely is crying because she or he has lost a loved one. But if I see a person crying at the local lottery office, I may assume that these are tears of joy: the person just won a large sum of money.

Believe it or not, the digital age, along with social media, is changing the “facial expression – environmental context” relationship. Now that mobile devices have flooded our lives, we are able to experience “teleported contexts” almost entirely divorced from physical or spatial constraint. The new social phenomenon of taking a “self picture” (i.e., a selfie) may be a pushback against the “facial expression – environmental context” rupture. The self picture seems to say, “See, my self was actually in this place at this time: can you believe it!” (1)

The other day I got out of my car at my office. Just up the way a woman was crying at the bus stop. I immediately scanned the environment for a cause or for some context. I could not find any. I started to move toward the woman to offer help until I could see clearly: she was on her cell phone. The cause or context was not where the woman was; the cause or context was wherever that call was coming from, which could be in town, in state, or across the country, possibly the world. In that moment, the “facial expression – environmental context” connection was broken. There was no way for me to assess whether this person needed help or not. I have to admit, I felt a bit helpless. In some respects this “facial expression – environmental context” rupture rendered me functionally autistic. (2) The disconnect out in the environment mirrored the disconnect happening in my brain.

As I suggested in my last blog post, it may well be that persons on the autism spectrum are better suited to navigate digital space because their brain connectivity (as described above) allows them to readily adapt to such an environment. Sure, we try to make up for the “facial expression – environmental context” rupture expressed by digital space by using emoticons (e.g., ;->) or selfies, but I think such attempts are more for us non-autistic, analog, “neuropaths.” As David Anderegg suggests in his 2007 book entitled Nerds: Who They Are and Why We Need More of Them, in the not-too-distant future, the autistic brain will be the normal brain, and those of us saddled with analog brains will be the neuropaths. I’d say we’re almost there. Enjoy the analog world while you can.


(1) Selfies are now being used to brag about criminal activities. This makes no sense, because selfies that contain evidence of criminal activity are being used by law enforcement agents to make arrests. These selfies are also being used as evidence to bring about criminal convictions. For one bizarre example, see the article entitled Teen killed classmate and uploaded ‘selfie’ with the body to Snapchat, police say. Apparently a teen allegedly murdered his classmate and then posed for a selfie with the corpse. Yes, police did use the selfie to apprehend the alleged murderer. Taking a selfie with a corpse raises the question, “Have we entered into such a dissociative state that the very fabric of our society is beginning to unravel?”

(2) Consider this article: Psychologists Discover the Simplest Way to Boost Your Mood. According to researchers at UC Berkeley, the best way to improve your mood is to go out into nature and have an awe-inspiring experience, like viewing the Grand Canyon for the first time. This type of research also makes an interesting but somewhat unexpected finding: people who have awe-inspiring experiences out in nature tend to have increased levels of empathy that motivate them to help other people. The article puts it this way:

[A] small 2012 study found that when people were shown something new and awe-inspiring — even if they were simply recalling a past experience or reading about someone else’s — they were more willing to volunteer their time to help others compared with people who were shown something that made them feel happy.

This ties back to the main message of this post: to develop a theory of mind (which is key to developing empathy), spatial experiences—especially awe-inspiring spatial experiences—are a key ingredient. The article closes out thus:

Simply put, awe makes us feel like we’re a part of something bigger than ourselves. By getting us out of our own heads, awe-inspiring experiences encourage us to look at the world inquisitively and feel more at ease. So go on a hike or check out the stars on a clear night. It’ll be awesome. We promise.