Imagine opening an app on your iPad or controlling it without touching the screen or even looking at it, just by thinking. It sounds like something out of a science fiction movie, right? But it's now a reality, thanks to the development of a new brain-computer interface (BCI) technology from Apple in collaboration with Synchron. This advanced technology enables people to control their devices using neural signals. In this article, we'll explore the details of this fascinating innovation and how it has changed the life of a person with amyotrophic lateral sclerosis (ALS).

What is brain control technology in Apple devices?

In a pioneering demonstration, Synchron released a video showing, for the first time, a person, "Mark," a participant with amyotrophic lateral sclerosis (ALS) in Synchron's clinical study, controlling an iPad entirely with his thoughts. He navigated the iPad's home screen, opened apps, and typed text, all using only his mind.
The secret to this technology lies in the Stentrode BCI device, implanted in a blood vessel above the motor cortex. This device captures neural signals associated with the intention of movement and transmits them wirelessly to an external device that decodes these signals. The device then delivers commands directly to iPadOS via the BCI HID protocol developed by Apple, which enables direct and rapid communication between the brain and the iPad.
The system also shares current on-screen context with the decoder to improve performance and responsiveness. In practice, users can navigate the home screen, open apps, and even type text without any physical movement, speech, or eye tracking.
The experience relied on Apple's built-in "Switch Control" accessibility feature, allowing for an integrated solution that delivers seamless and responsive interaction. Thanks to the BCI HID protocol, contextual data is exchanged between the device and the neural technology in real time, improving performance and making the experience more natural.
Mark's Story: From Illness to Complete Control

Let's talk about the real hero here: Mark, a participant in Synchron's COMMAND clinical study. Mark has ALS, a condition that severely limits his mobility.
In the video, we see him controlling the iPad with just his mind: opening apps, browsing, and typing messages. This was made possible by the minimally invasive implantation of the Stentrode, which is delivered via a catheter inserted through the jugular vein, avoiding open brain surgery. Other technologies, such as Neuralink, instead require implanting electrodes directly into brain tissue.
Dr. Tom Oxley, Founder and CEO of Synchron, said:
“This is the first time the world has witnessed direct control of Apple devices using thought alone. It is a quantum leap in the future of human-machine interaction.”
Apple and Synchron began working together on early pilot projects with the Vision Pro headset in 2024, which Mark also controlled using his mind. Over time, support expanded to include iPhone and iPad, and Apple is expected to roll out the technology more widely later in 2025.
Advantages and the Future: Is This the Beginning of a New Era?

This technology is not just an innovation; it represents hope for millions of people with disabilities. To date, Synchron has implanted its device in 10 patients in the United States and Australia, under an investigational device designation from the US Food and Drug Administration (FDA).
The main advantage is safety: the implantation procedure is less invasive than open brain surgery, which reduces risk.
In the future, we may see this technology expanded to include everyone, not just patients. Imagine typing an email while walking without holding your phone! However, we must consider privacy and ethics when reading brain signals.
A step towards a better world
The new technology from Apple and Synchron represents the beginning of a new era that could change the lives of millions of people around the world, especially those with mobility disabilities. Mind-controlled devices could one day become commonplace, just as we use touch or voice today.