Computers are now ubiquitous; some even consider them an extension of the human body. Yet we are still at the beginning of the cybernetic era, and communication between humans and machines remains slow. To sync our biological and digital selves, we type with our fingers, a low-bandwidth channel standing in for what needs to be a high-bandwidth pipeline. This interface and data-rate problem is stifling the development of the whole human-machine symbiosis ecosystem. Enter Arctop, the AI company whose neural operating system, Neuos, translates brain activity captured by AR/VR headsets, headphones, and sensor-embedded fabric into high-resolution maps of human emotions, reactions, and intent.
Arctop was founded in June 2016 in San Francisco, CA by CEO Dan Furman, Ph.D., a neuroscientist, and CTO Eitan Kay, after the two connected earlier that year at Brainnovations, a non-equity accelerator run by Israel Brain Technologies and hosted by Google in Tel Aviv. The two co-founders are ostensibly polar opposites, but their profiles complement each other remarkably well.
Furman, a Harvard University graduate who earned his Ph.D. at the Technion for research on algorithmic decoding of brain signals, has dedicated his professional career to the intersection of neurology and brain-computer interfaces (BCI). He attributes much of his entrepreneurial inspiration to an early experience developing an innovative, non-invasive neurosurgery method with neurosurgeon Christopher Duma, MD. The procedure used brain imaging to predict the pattern in which malignant brain tumors spread, then applied targeted gamma radiation to proactively halt tumor growth along its “leading edge.” A 2016 Journal of Neurosurgery article showed the method to be a significant improvement over previous techniques, and it continues to be developed and tested in multicenter clinical trials.
At the time he was only 15 years old, but since then, as a neurobiology student and Arctop founder, he has continued to work on the front line of cutting-edge BCI technology. After graduating from Harvard, Furman worked for NeuroVigil, a company that analyzes sleep patterns to monitor and diagnose conditions such as sleep apnea, depression, and autism. There, he was tapped to work closely with the late Stephen Hawking. As Hawking’s Lou Gehrig’s disease progressed, his facial atrophy worsened and his Intel-made cheek-clicking device became less effective, so Furman adapted NeuroVigil’s sleep technology to create a new communication apparatus. Luckily, Hawking regained enough facial muscle control during the research and development phase to keep using his cheek-clicking device, but the experience inspired Furman to develop brain interfaces at a larger scale.

Co-founder Eitan Kay, by contrast, took an unconventional path to AI innovation as a true autodidact. Kay’s nonconformist spirit can be traced back to his expulsion from an elementary school computer class for trying to hack the school’s computer system. True to form, he dropped out of high school in 10th grade and, even after obtaining the diplomas needed to attend college, left his software engineering studies before finishing a semester. Since then, Kay has held a number of software engineering, mechanical engineering, and development roles in industry, with a focus on cybersecurity. He has managed and worked on simulation projects for fighter pilot and infantry weapons training, as well as online advertising fraud prevention, and eventually followed his passion to virtual reality technology for improving brain rehabilitation in stroke victims, the project that earned him a place in Brainnovations.
As the two entrepreneurs went through the Israel Brain Technologies accelerator, they recognized their shared vision and complementary skill sets. They teamed up to create a software engine that could harness real-time brain patterns and interact directly with electronics, combining Kay’s background in programmatically modulating digital content to promote plasticity in stroke rehabilitation with Furman’s expertise in neural interfaces and signal-processing algorithms.
Zoom forward to today. From their seaside office in Tel Aviv, Arctop is demonstrating an impressive stack of core software that effectively serves as an operating system between the user’s brain, their electronic devices, and the surrounding environment. Because the hardware can range from a mixed-reality headpiece like the Magic Leap One (“enveloping your head’s immediate electromagnetic orbit,” as Furman put it) to devices distributed more widely in the physical world, like smart TVs, speakers, and robots (“endowing spaces themselves with special powers, an embodied intelligence,” as Furman said), Arctop’s software is set to transform the communicative flow between humans and computers of all shapes and sizes.

Furman and Kay began by tinkering with a cryptography-based approach that they believed could help crack what Kay called the “information problem.” In short, human brain signals are noisy and messy, so reading out enough information for applications to work in the real world required a paradigm shift in how the signal was processed. Furman had already made headway on the problem in earlier research, published by IEEE, in which he was the first to show an algorithm that could accurately decode imagined finger movements. This decoding challenge had been attempted since the late 1970s and, prior to Furman’s findings, was widely considered impossible by the scientific community. Still, even with the resolution his research achieved, the “information problem” persisted in the settings where the two founders were building their company’s software to run.
The two soon identified markers of individual brains so pronounced that users could be authenticated by them, much as Face ID does with faces, and then began using these markers to filter and enhance the complex brain signals for deeper insights. In other words, they started using brain identity to solve the “information problem.” Combining rigorous research and development with AI, they created Neuos, a real-time software platform that uses brain signal patterns to authenticate individuals and decode emotional and cognitive states with fine temporal resolution. “We have the sharpest, hardest working R&D team in BCI, no question,” Furman asserted.
Accurately processing the neural dynamics that underlie delight, fear, surprise, engagement, and the other mental events Neuos analyzes is a highly complex task, both because the signal-to-noise ratio is so low and because every brain is so different. Without top-tier AI algorithms, making sense of ongoing brain signals is nearly impossible. With its state-of-the-art approach to signal processing and AI infrastructure, Neuos introduces three breakthroughs rooted in neurobiology and electrophysiology: authentication, emotional analysis, and intent analysis. The software understands recurring phenomena rather than single-occasion mental events.
Their brain identification methodology is a major breakthrough in security, introducing a new framework for identity authentication. Given how vulnerable today’s biometric authentication is to forgery, this framework might become the new standard for authentication systems. Unlike a fingerprint, a brain identity cannot be stolen or surveilled remotely, as happened in 2014 when hackers reconstructed the digital fingerprints of Germany’s Defence Minister from close-range photos of her hands. Once your fingerprints are compromised, there are no more fingers left to validate a secure key. “The brain signal signature cannot be copied,” Furman explained. “The brain is changing all the time, and there’s a tight time lock during which identity can be measured in a way that’s more secure and also more convenient.”
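To make the idea concrete, here is a minimal sketch in Python of what brain-signal authentication can look like in principle: spectral features from a short EEG window are compared against a template stored at enrollment. This is an illustrative assumption, not Arctop’s actual method; the sampling rate, frequency bands, channel count, and similarity threshold are all placeholders.

```python
# Illustrative sketch only; not Arctop's method. Assumed: multichannel EEG at 256 Hz.
import numpy as np
from scipy.signal import welch

FS = 256                              # sampling rate in Hz (assumed)
BANDS = [(4, 8), (8, 13), (13, 30)]   # theta, alpha, beta

def band_power_features(eeg_window: np.ndarray) -> np.ndarray:
    """eeg_window: (channels, samples) -> unit-norm vector of band powers per channel."""
    freqs, psd = welch(eeg_window, fs=FS, nperseg=FS)
    feats = [psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=1) for lo, hi in BANDS]
    vec = np.concatenate(feats)
    return vec / np.linalg.norm(vec)

def enroll(calibration_windows: list) -> np.ndarray:
    """Average features over several enrollment windows to form the user's template."""
    template = np.mean([band_power_features(w) for w in calibration_windows], axis=0)
    return template / np.linalg.norm(template)

def authenticate(eeg_window: np.ndarray, template: np.ndarray, threshold: float = 0.95) -> bool:
    """Accept the user if the live feature vector is close enough (cosine similarity) to enrollment."""
    return float(band_power_features(eeg_window) @ template) >= threshold
```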

After marking the brain signature, making sense of the brain dynamics requires Arctop’s AI infrastructure to learn in context and predict, absorbing electrical patterns and processing the outputs of an ensemble of algorithms. Traditionally, brain-computer interfaces applied a limited amount of neuroscience understanding, classifying a narrow type of activity or a distinct anatomical feature and hard-coding it into the interface. Neuos works differently, using specially designed deep learning architectures that characterize an individual’s brain identity and pick up on the nuanced patterns that are most meaningful and informative in a given context. Once those patterns are decoded, the system can identify the emotions and intents of an individual wearing an AR/VR headset or simply a light bandana with embedded sensors.
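As a rough illustration of that idea, the sketch below combines a learned per-user identity embedding with features pulled from a raw EEG window to predict an emotional state. The architecture, layer sizes, and shapes are assumptions for illustration only; Arctop has not published the design of Neuos’s models.

```python
# Assumed architecture for illustration; not Arctop's published design.
import torch
import torch.nn as nn

class EmotionDecoder(nn.Module):
    def __init__(self, n_channels=8, n_users=100, n_states=4, embed_dim=16):
        super().__init__()
        self.user_embedding = nn.Embedding(n_users, embed_dim)  # per-user "brain identity" vector
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.head = nn.Linear(32 + embed_dim, n_states)

    def forward(self, eeg, user_id):
        # eeg: (batch, channels, samples); user_id: (batch,)
        signal_feats = self.conv(eeg).squeeze(-1)       # (batch, 32) summary of the EEG window
        identity = self.user_embedding(user_id)         # (batch, embed_dim) identity context
        return self.head(torch.cat([signal_feats, identity], dim=-1))  # logits over states

# Example: decode a 2-second window at 256 Hz for hypothetical user 7
model = EmotionDecoder()
logits = model(torch.randn(1, 8, 512), torch.tensor([7]))
probs = logits.softmax(dim=-1)  # probabilities over emotional states
```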
With the brain’s electromagnetic signals decoded and the software infrastructure to do so at scale, Arctop is now contributing emotion and cognition analytics to a range of industries, including A/B testing for games and movies, security authorization processes, and adaptive learning environments. The ability to dynamically adjust media in real time according to an individual’s emotional responses is perhaps the most tantalizing of Arctop’s applications. The founders lightly alluded to a secret project in the works for a next-generation Netflix production. Given the success of Netflix’s recent interactive Black Mirror movie Bandersnatch, and at least one current Arctop demo experience, it isn’t hard to imagine a Neuos-enabled interactive TV show or film in the near future.

Intent analysis is the third breakthrough of Arctop’s AI platform, giving the Neuos engine the seemingly magical ability to discern an individual’s yes-or-no response to a question, in order to call up more information or dismiss it. While not every user can communicate yes/no decisions with perfect accuracy yet, many become fluent after a short calibration session and can operate the system effectively in different scenarios. While it sounds basic, this simple binary choice opens up many possibilities, like selecting answers, sending silent commands to Alexa through Bose headphones, or even confirming a payment, a brain-based equivalent of Alibaba’s smile-to-pay.
The key to Neuos intent analysis, Furman explained, lies partly in an intricately designed new-user session that calibrates an initial double loop of reinforcement learning to a specific brain: the machine’s classifications are reinforced by the classifications implied by the brain’s own actions, and vice versa. The brain changes as Neuos perceives it, and by adaptively delivering content to the brain, Neuos changes mental patterns. Once the initial calibration is complete and the new user’s model is built, Furman said, the system becomes much more empathic and in sync with the individual.
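One plausible reading of that calibration loop, sketched below under stated assumptions (it is not Arctop’s implementation), is a decoder that is first fit on prompted “yes”/“no” trials and then keeps updating itself from labels implied by the user’s subsequent, confirmed actions. The feature choice (per-channel log-variance), channel count, and window length are illustrative.

```python
# Hypothetical sketch of calibrate-then-adapt intent decoding; not Neuos code.
import numpy as np
from sklearn.linear_model import SGDClassifier

def log_var_features(window: np.ndarray) -> np.ndarray:
    """window: (channels, samples) -> one log-variance value per channel."""
    return np.log(window.var(axis=1) + 1e-12)

def calibrate(windows: np.ndarray, labels: np.ndarray) -> SGDClassifier:
    """windows: (trials, channels, samples) recorded while prompting the user to
    think 'yes' (1) or 'no' (0). Returns a decoder that can be updated online."""
    feats = np.stack([log_var_features(w) for w in windows])
    clf = SGDClassifier(loss="log_loss")
    clf.partial_fit(feats, labels, classes=np.array([0, 1]))
    return clf

def closed_loop_step(clf: SGDClassifier, window: np.ndarray, confirmed_label: int) -> int:
    """Classify one new window, then reinforce the model with the label implied by
    what the user actually went on to do (one reading of the 'double loop' idea)."""
    feats = log_var_features(window)[None, :]
    prediction = int(clf.predict(feats)[0])
    clf.partial_fit(feats, [confirmed_label])
    return prediction
```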
Arctop is championing the use of AI for brain-computer interfaces and is on track to become a core technology provider in the rapidly expanding computing hardware and device markets. Along the way, it might even enable a revolution in gaming, where its advanced analytics could transform immersive entertainment experiences. With its real-time Neuos platform, emotion analytics web portal, and integration API actively running, the startup is growing its Tel Aviv-based data science and engineering teams and expanding its business in the U.S. and Japan.