Sam Taw

Brain-Computer Interfaces – neural enhancement or the end of mankind?


How is it that we have become the apex predator on Earth? Simple: evolution favoured our ingenuity.

We are not the fastest, strongest or most agile creatures on this planet, but collectively we are most certainly the most intelligent.

From as early as the Stone Age, humans have used the brain to think and plan ahead, to make tools, and to take step-by-step actions to complete a goal. Every new skill mastered forged new pathways of learned behaviour, allowing for further increases in dexterity. Our brain morphology adapted to the new technology, art and culture of the times, widening the gap between us and the next most intelligent life forms, such as apes and aquatic mammals.

Have we finished with the evolutionary process?

In simplistic terms of size, we probably have. Despite human stature increasing over the decades, there is nothing to suggest that a larger physical brain equates to greater intelligence, or elephants and whales would rule the world. If the brain, and consequently the skull, grew out of proportion to the human body, balance might become an issue, impairing our ability to walk and adding pressure to our spines.

Albert Einstein's brain was slightly smaller than that of the average man at the time of his death in 1955. The autopsy and subsequent analysis of his preserved neural tissue found that his brain structure differed from those used as a control group. Despite controversy over the clandestine way in which his brain was removed, and over what the morphological differences might suggest, his brain was not larger than the average person's.

There were, however, two main findings which were difficult to refute. In 1985, Professor Marian Diamond of the University of California, Berkeley, published a paper comparing the density of certain cells in Einstein's brain with that of eleven control subjects. She found that one particular area was unusually dense with glial cells. These are the crucial support cells of the brain, providing nutrients and chemical transporters and removing waste from our neurons. The area in which these cells were so densely packed was the left inferior parietal region, known for assimilating and processing information from a number of different brain regions. She concluded that this region was so well developed because of the complex nature of his work. This supposition was later criticised for its lack of rigour. Her response was that she only had access to the brain of one genius with which to compare results. It's a fair argument.

When you look at the surface of any brain, a long branching valley seems to separate the temporal lobe from the frontal and parietal lobes on each side, above the ear. This is known as the Sylvian fissure. Einstein's Sylvian fissure was different, in that a part of it was missing. It has been speculated that this allowed for greater communication between the various brain regions. Professor Sandra Witelson of McMaster University, Ontario, suggested that it could account for why Einstein thought the way he did. Although studies continue, what is clear is that our brains have the potential for enormous structural change.

How will our brains adapt in the future?

Twenty years ago, a person carrying a mobile phone was a rarity. Now the reverse is true. We are deluged with neural stimuli every waking hour, in the belief that everyone across the world should be in instant contact with everyone else around the clock. Entertainment is streamed at our convenience and runs secondary to our perpetual desire to check social media and work correspondence.

Multi-tasking is an outdated and laughable concept. We are adapting to combine input and reasoning into a stream of consciousness that spills into the digital ether. Does spreading our focus across multiple platforms make us better adapted to deal with countless tasks at once, or less able to excel at a particular goal? At present, it makes the human race prolific, but are we creating quantity at the cost of quality?

Does having a butterfly mind make us less proficient at one practised and defined skill?

Richard Yonck, a futurist based in Seattle, believes that our future lies in 'blended intelligence': the integration of more biotech into our bodies. We already enhance quality of life for some with pacemakers, cochlear implants and artificial joints. Yonck goes one step further, suggesting that we will control processors directly from our central nervous systems via brain-computer interfaces.

Prototypes and pioneers in this field have already begun an exciting new phase of artificial limbs controlled by neural implants. In 2016, a joint project between the University of Chicago, the University of Pittsburgh and the University of Pittsburgh Medical Center demonstrated how a patient could control a robotic arm using only his neural signals. The team has secured further funding to continue refining the brain-computer interface for paralysed patients using prosthetics. There are many similar projects across the world.

Yonck suggests that 'mentally accessible' interfaces will go far beyond voice-activated software or predictive search, integrating seamlessly with our lives and controlling all aspects of our surrounding environment as effortlessly as our autonomic nervous system regulates breathing. We could have cars that take us where we want to go without our having to voice the request, or so says Elon Musk on Twitter.

Will this result in robotic children?

That is the million-dollar question. Just how far do we allow artificial intelligence to take over before we stray into the realms of science-fiction disaster? After all, one of the greatest minds of the twenty-first century, Stephen Hawking, said:

‘The genie is out of the bottle. We need to move forward on artificial intelligence development but we also need to be mindful of its very real dangers. I fear that AI may replace humans altogether. If people design computer viruses, someone will design AI that replicates itself. This will be a new form of life that will outperform humans.’ (From an interview with Wired, November 2017)

Other great and controversial thinkers have expressed similar concerns. Elon Musk is quoted as saying:

"I have exposure to the most cutting edge AI, and I think people should be really concerned by it," He said at the National Governors Association. "AI is a fundamental risk to the existence of human civilization in a way that car accidents, airplane crashes, faulty drugs or bad food were not — they were harmful to a set of individuals within society, of course, but they were not harmful to society as a whole.

Their proposed solutions involve regulation, but will that be enough to stop the most powerful countries in the world from using AI to dominate?

Our lives in their hands.

Our children seem less bothered by the proliferation of machines in our day-to-day living. They have grown up in a media-rich environment where it is difficult to know where software ends and biology begins.

Co-authors Judith E. Glaser and Dr Debra Pearce-McCall put forward their hypothesis in Psychology Today that the current generation could be described as having a ‘Millennial Possibility Mindset.’ Pearce-McCall explains that the pooling of ideas and information, within an environment where a sense of self is discouraged in favour of a shared collective, serves a common goal. Less focus is placed on standing out from the crowd, and an equality of possibilities is achieved. She claims that this could lead to an upending of institutional hierarchies within the workplace.

Matthew Lieberman and colleagues showed in 2013 that a particular region of the brain, the temporoparietal junction, activates when we share with others. This millennial resistance to rigid hierarchical structures can lead to conflict: to older generations, the behaviour can come across as rude, disrespectful, or carrying ‘an air of entitlement.’

The youth of today grow up with a level of parental devotion unknown to previous generations. They have access to a vast network of communication, entertainment channels and unparalleled connectivity. With this comes a demand to be heard, and for an equal say in the direction of progress and development, without waiting to gain experience or earn their place in society.

These inclusive, sharing, caring millennials live in an uncertain world, one which can change overnight on a single viral post. For them, predicting how things will alter is less important than the certainty that they will experience constant change. Brain morphology will adapt to these new and unpredictable parameters, where order is no longer a measurable constant. Curiosity and flexibility become the attributes that favour survival.

The millennial generation marks the beginning of this new paradigm in our evolution. To achieve genuine flexibility, they must first overcome hard-wired, inherited chemical responses to change. Our central nervous systems are programmed to respond to uncertainty by releasing stress hormones, and these chemicals take a massive toll on our physical well-being. The fight-or-flight response triggered by an adrenaline spike is an enormous obstacle to overcome if change is to be viewed as an opportunity rather than something which triggers a fear reaction.

If they can truly disarm their primitive biology, how will they measure fear? How will they be able to protect themselves from the potential assault of encroaching Artificial Intelligence? Will they retain enough individualism to see beyond the superficial benefits of biotech integration before the machines have genetically engineered a new race of placid humanoid workers?

Intelligence and ingenuity may currently be the superpowers which keep us at the top of the food chain, but for how long? As Hawking warned, it only takes one AI team to create self-replicating autonomy and humankind could find itself topping the endangered species list.

The future is both exhilarating and terrifying. Can we afford to leave the biggest decisions of all time to the next generation of selfie-taking, foot-stamping Millennials?
