Tim Frank Andersen
Are Brain-Computer Interfaces a thing now?
What I learned from my Charlie Tango talk with Sune Alstrup — investor in NextMind.
Imagine being able to interact with your surroundings using nothing but your thoughts. The first thing that comes to my mind is Professor Xavier and Magneto from X-Men. But is it all science fiction, or are we closer to this future form of interaction between man and machine than most of us think?
That was the key question I wanted answered when I invited Sune Alstrup for a Charlie Tango talk. Sune is a tech entrepreneur and investor. He co-founded the eye-tracking company EyeTribe, which was sold to Facebook in 2017 and whose technology has since been integrated into the VR frontrunner Oculus. Today he is an investor in the French company NextMind, which is among the world leaders in non-invasive real-time brain-computer interfaces (BCI).
First a bit about the basics and the history:
Brain-machine interfaces (BMI) and brain-computer interfaces (BCI) are devices that enable direct communication between the brain and an external device: controlling a prosthetic limb, for example, or a mobile phone.
Since the 1970s, the idea of implanting electrodes that translate brain-cell activity into data has been studied and tested as a way to treat paralysis, diseases like Parkinson's, and hearing loss. But it is still quite experimental, and you basically have to drill a hole in people's heads.
NextMind represents the next generation of non-invasive systems that combine deep neural networks (AI) with neural signals from the visual cortex to transform a user's intention into direct brain commands, thereby creating a connection to the outside world. All you have to do is wear a small piece of equipment and then focus on the thing you want to do. Of course, this comes with limitations, but it's a very interesting start.
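To make the idea of "intention into commands" concrete, here is a toy sketch of one common way visual-cortex BCIs work: each on-screen target flickers at its own frequency, and the decoder picks the target whose flicker pattern best matches the brain signal. Everything here (the target names, frequencies, and matched-filter decoder) is an illustrative assumption, not NextMind's actual SDK or algorithm.

```python
# Hypothetical sketch of a visual-attention BCI decoder (not NextMind's API).
# Each UI target flickers at a distinct frequency; attending to a target makes
# that frequency dominate the visual-cortex response, and a simple matched
# filter recovers which target the user meant.
import math

SAMPLE_RATE = 250          # samples per second, typical for consumer EEG
WINDOW = 1.0               # seconds of signal used per decision
TARGETS = {"play": 7.0, "pause": 9.0, "stop": 11.0}  # flicker frequencies (Hz)

def template(freq_hz):
    """Reference sine wave for one target's flicker frequency."""
    n = int(SAMPLE_RATE * WINDOW)
    return [math.sin(2 * math.pi * freq_hz * t / SAMPLE_RATE) for t in range(n)]

def decode(signal):
    """Return the target whose flicker template correlates best with the signal."""
    scores = {name: abs(sum(s * x for s, x in zip(template(f), signal)))
              for name, f in TARGETS.items()}
    return max(scores, key=scores.get)

# Simulate a user attending to the "pause" target: its 9 Hz flicker dominates
# the simulated brain signal, with a weaker 10 Hz background rhythm on top.
signal = [math.sin(2 * math.pi * 9.0 * t / SAMPLE_RATE)
          + 0.3 * math.sin(2 * math.pi * 10.0 * t / SAMPLE_RATE)
          for t in range(int(SAMPLE_RATE * WINDOW))]

print(decode(signal))  # → pause
```

A real system replaces the hand-built matched filter with a trained neural network and has to cope with noise, blinks, and individual differences, but the core loop is the same: measure, match, act.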
NextMind now has its DevKit out, so game creators and interface designers around the world can start to figure out not just if we can, but if we WILL. And NextMind is not the only company foreseeing a future where systems are controlled at the speed of thought. CTRL-labs built a system that detected nerve impulses; it was acquired by Facebook in late 2019.
Kernel is another neurotechnology company experimenting with letting people type on a screen using only their minds. They, and several other start-ups, were portrayed in the 2020 award-winning documentary I Am Human.
So, what are the interesting use cases for this technology if you regard it as a mass consumer product? Well, controlling all your stuff with your mind is one thing: turning off the coffee machine, dimming the lights, switching channels, and changing the volume on your TV. But that might not be the killer app. New types of game controls integrated with spatial computing sound like more fun, but gaming is a tough and crowded field.
Neuro-authentication sounds more interesting: opening your laptop, front door, or car with your mind, or using your mind as the ultimate password. Sune also mentioned retrieving a picture by imagining it, and helping disabled people in lots of different ways. The imaginable use cases are many.
So far the technology works as a one-way street: thoughts from the brain control digital interfaces. But what if we could go the other way and feed new knowledge into our brains?
Elon Musk and his company Neuralink think the future here lies in brain augmentation. He argues that this kind of empowerment is already happening through our mobile services: Google Maps, Wikipedia, Translate, and so on. Now all we have to do is connect the brain to the computer to enhance our capabilities. Musk even argues that it will be our only chance of survival once AI surpasses human intelligence. Neuralink aims to help you control your mood, memory, hunger, and thirst, and maybe even your speech, mathematical, and language skills. But then we are back to implants, and much closer to a Matrix scenario with a hardwired chip in your head that communicates with your mobile over Bluetooth, turning humans into cyborgs, which could scare the s… out of most people.
And this also leads to legitimate worries about ethics. Is this the gateway to next-level surveillance and mind-reading? And what about long-term health issues once you start to tinker with the brain?
While we wait for the opportunity to get a brain upgrade and a direct link to the web, it might be time for companies and UX departments to get their heads around this next-level form of interaction. And the NextMind DevKit might be your best, cheapest, and easiest way to get started.