Last week I wrote about the resurgence in psychedelic research, and how it holds out the promise of new therapies for people with depression and addiction problems.
It might seem extreme.
I can see how you might not be comfortable with the idea of taking a 25mg dose of psilocybin and hallucinating for several hours under medical supervision.
But the medical evidence so far is compelling.
A UK study in The Lancet Psychiatry found that eight of 12 patients who participated in the trial were free of long-term depression after just one week of psilocybin treatment.
Three months later, five were still in full remission.
And the early results of research into LSD and DMT are just as compelling.
I can see these treatments becoming very popular.
It’s easy to see how the sense of self – which can be crippling for people suffering from depression and addiction – might melt away during these treatments, opening up the possibility of new patterns of behaviour once the patient comes back to reality.
And the more research I do into brain technology, the more I’m surprised at the lengths people will go to in experimenting with the condition of the brain.
Take the emerging field of “sensory enhancement”.
This is the business of developing tools to give people additional senses.
The simplest example is a hearing aid.
With a cochlear implant placed behind the ear, someone born deaf can stimulate electrical activity in the brain and learn to recognise these patterns as “sound”.
At first the experience of the cochlear implant is not like sound: it feels more like painless electrical shocks inside the head.
But after a month, these electrical patterns resolve into a tinny, distorted radio.
And in time, the brain registers this feedback as sound.
In the same way, scientists working in the field of “sensory enhancement” are programming the brain to pick up on all sorts of new senses.
These go far beyond vision, hearing, taste, touch and smell.
In basements, startup spaces, and university laboratories, scientists are collaborating on brain tools that let people experience the senses of animals, strangers and even intelligent machines.
Let me give you a few examples.
The Bottlenose is a handheld device built by amateur biohackers.
It uses ultrasound to detect the distance of nearby objects, then vibrates the user’s finger at different frequencies, giving the user the experience of echolocation.
The Bottlenose can also capture sonar, ultraviolet, Wi-Fi, and thermal data, and transmit information about these fields to implanted magnets.
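To make the idea concrete, here is a minimal sketch in Python of the kind of mapping such a device might perform: closer objects trigger faster vibration, so distance becomes something you feel. The function name, the ranges and the linear mapping are all my own illustrative assumptions, not the Bottlenose’s actual firmware.

```python
def distance_to_vibration_hz(distance_m, min_d=0.1, max_d=3.0,
                             min_hz=10.0, max_hz=250.0):
    """Map a measured distance to a vibration frequency.

    Closer objects -> faster vibration, mimicking the urgency of an
    echolocation ping. All ranges here are hypothetical, chosen only
    to illustrate the idea of a distance-to-touch mapping.
    """
    d = max(min_d, min(distance_m, max_d))  # clamp to the sensor's range
    # Linear interpolation: the nearest distance gives the highest frequency.
    t = (d - min_d) / (max_d - min_d)
    return max_hz - t * (max_hz - min_hz)
```

Run in a loop against an ultrasonic range sensor, a mapping like this is enough for the brain to start treating buzz frequency as a crude sense of space.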
In fact this is part of a whole field of hackers interested in simulating the senses of animals.
Other examples are cochlear implants that can be tuned to pick up low frequencies, such as those used by elephants, or very high ones, such as those used by dolphins.
Why would we do this?
Well the idea is that our reality is constrained by the way we filter information through our main senses. We actually pick up on very few of the stimuli that surround us.
For example, we don’t pick up on infrared light like a rattlesnake, or ultraviolet light like bees. We have only a vague sense of magnetic fields, while birds use them to migrate thousands of miles across the planet.
And we are absolutely miserable when it comes to taste and smell. Ants have ten to twenty substances that they use to organise their colonies.
We have no sense of that.
One of the leaders in this field is Professor Kevin Warwick, an engineer at Coventry University and author of I, Cyborg.
As part of his experiments, Warwick developed an implant that picks up on ultrasonic waves, giving him a bat-like ability to sense movement a few metres away.
“I was looking to see if our brains could expand, which of course they can. We also experimented with infrared, which gives you a remote sense of heat, because humans can’t detect heat from a distance. I think almost anything is possible – UV, X-ray – we can extend our abilities to start sensing whatever we like.”
2. Hive Minds
In fact, Warwick has been experimenting on himself since 1998.
Another famous experiment involved wirelessly connecting an electrode in his arm to one in his wife’s arm, so that wherever they were, they could feel the other flexing a hand.
More recently, the neuroscientist David Eagleman has taken this idea further by wirelessly connecting heart and sweat monitors to himself and his wife, so that they can sense each other’s moods.
This might sound like a terrible idea.
But Eagleman’s company NeoSensory is definitely worth watching.
Here is a fascinating talk he gave recently at Google.
And it’s not just a sense of sweat and heartbeats that we’ll soon be exchanging. According to Peter Watts, a Canadian science-fiction author and former marine biologist…
“There’s a machine in a lab in Berkeley, California, that can read the voxels right off your visual cortex and figure out what you’re looking at based solely on brain activity. One of its creators, Kendrick Kay, suggested back in 2008 that we’d eventually be able to read dreams (also, that we might want to take a closer look at certain privacy issues before that happened). His best guess was that this might happen a few decades down the road – but it took only four years for a computer in a Japanese lab to predict the content of hypnagogic hallucinations (essentially, dreams without REM) at 60 per cent accuracy, based entirely on fMRI data.”
Meanwhile, at Clemson University in South Carolina, Ganesh Venayagamoorthy is busy teaching neurons to run everything from stock markets to electric grids: connecting clouds of neural patterns in the brain to a computer simulation of these networks, then allowing the brain to learn through feedback loops between the neurons and the patterns in the market.
3. Life as a Machine
Now to be honest, that last one sounds a little far fetched to me.
Especially when you consider the quality of the feedback between implants and the machines that hobbyists are busy connecting them to.
This week, a software engineer and popular biohacker, Amie DD, released a video showing how she connected an implant to her new Tesla Model 3.
Here is the video (WARNING: there is quite a lot of blood during the procedure).
She removed the RFID chip from her Tesla Model 3 valet card using acetone.
Then she placed it into a biopolymer, which was injected through a hollow needle into her left arm.
This is more an example of body modification than “sensory enhancement”.
But other biohackers are inserting body implants that connect to internet devices. Others have implants that store hundreds of gigabytes of data, stream movies, or act as a server for an anonymous chat room.
One final area that really intrigues me is the animation of dormant senses.
These are senses that may have served our ancestors, but have weakened as we have become reliant on technology.
For the bulk of our evolution, before we could be considered ‘human’, our navigational abilities relied on using our sense organs. We take it for granted that we can see our way with our eyes.
But we also have other senses we can use to orient ourselves – more than the familiar five if we include the vestibular system, which underlies our ability to balance, and proprioception, our sense of bodily articulation and movement.
Yet some navigators in more “primitive” communities seem to rely on another sense: magnetoreception, or navigation by using the Earth’s magnetic fields.
The search for human magnetoreception was on. It was spearheaded in the late 1970s by Robin Baker, then a young British zoologist at the University of Manchester.
His early investigations, now known as the ‘Manchester experiments’, involved taking vanloads of blindfolded students on road trips along winding routes deep into the Cheshire countryside, around 50km from campus. The expectation was that, while blindfolded, the students would point randomly when asked which way home lay, and show improved accuracy once their blindfolds were removed. For an initial set of 20 subjects, the data showed exactly the opposite pattern: the blindfolded students were better at navigating, demonstrating an eerily accurate sense of direction, with 90 per cent of them pointing within 45 degrees of the true ‘home direction’.
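One way to see how striking that 90 per cent figure is: under purely random pointing, only about a quarter of subjects should land within 45 degrees of home. Here is a small Python sketch of the statistic itself – the guesses below are hypothetical numbers for illustration, not Baker’s actual data.

```python
def fraction_within(bearings_deg, home_deg, tolerance_deg=45.0):
    """Fraction of compass estimates within `tolerance_deg` of the home bearing."""
    def angular_error(a, b):
        # Smallest absolute difference between two compass bearings (0-180).
        return abs((a - b + 180.0) % 360.0 - 180.0)
    hits = sum(1 for b in bearings_deg
               if angular_error(b, home_deg) <= tolerance_deg)
    return hits / len(bearings_deg)

# Hypothetical example: home lies at 270 degrees; 18 of 20 guesses are close.
guesses = [270, 250, 290, 265, 300, 240, 275, 255, 310, 260,
           285, 245, 268, 295, 272, 230, 305, 262, 90, 120]
print(fraction_within(guesses, 270))  # 0.9
```

The modular arithmetic matters: a guess of 350 degrees is only 10 degrees off a home bearing of 0, not 350, so errors must be wrapped around the compass before comparing to the tolerance.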
In the early nineties, Joe Kirschvink, a professor at Caltech, set out to probe living brains for any signals that could be linked with magnetoreception.
Kirschvink and a team of experts, with support from the Human Frontier Science Program and the US Defense Advanced Research Projects Agency (DARPA), developed a machine for picking up on these signals.
The apparatus consists of an underground chamber just big enough for a single human. Within it are a series of coils through which the strength, polarity and inclination of magnetic field lines can be controlled. Any brain activity corresponding to field variations is measured using EEG. The chamber’s outer layer is a Faraday cage – a layer of aluminium to filter out any electromagnetic contamination from radios, computers, smartphones and elevators.
According to Kirschvink…
“Almost the first day we had it up and running, and we were looking at the EEG trace, it looked like the alpha waves were dropping. The alpha wave is the thing to look at in the brain – it’s the resting state. It monitors the senses. If a signal comes in, this alpha-wave hum sharply drops. That was the huge discovery. The magnetic field was the only signal the participants’ brains could be responding to – they were sensing it.”
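The analysis Kirschvink describes – watching for a drop in alpha-band power when the field changes – can be sketched in a few lines of Python. The periodogram estimate and the 50 per cent threshold below are my own illustrative assumptions, not the lab’s published pipeline.

```python
import numpy as np

def alpha_power(eeg, fs):
    """Power in the 8-12 Hz alpha band of a 1-D EEG trace sampled at fs Hz.

    A textbook periodogram estimate, standing in for whatever spectral
    analysis the lab actually used.
    """
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg)) ** 2 / len(eeg)
    band = (freqs >= 8.0) & (freqs <= 12.0)
    return psd[band].sum()

def alpha_dropped(baseline, trial, fs, threshold=0.5):
    """True if alpha power in `trial` fell below `threshold` x baseline.

    The 50 per cent threshold is an arbitrary illustrative choice.
    """
    return alpha_power(trial, fs) < threshold * alpha_power(baseline, fs)
```

Comparing a resting-state recording against a recording taken while the coils rotate the field is then a one-line check – the “alpha-wave hum sharply drops” moment in the quote above.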
In fact the evidence for a human magnetosensory capacity implies that our early primate ancestors, and even hunter-gatherers, had access to magnetoreception because they needed it for survival: using the feel of the Earth’s magnetic field to travel great distances over seas, or in remote, desolate areas, all without the aid of a compass or Google Maps.
Now that sounds like an escape.
Next week: a story about a supermaterial that could deliver a 5X return over the next few years.