May 5, 2024

Neurable receives $2 million to explore brain-computer interfaces

One of the principal challenges of computing is how best to interface humans with our machines. Keyboards, mice, and touchpads have become second nature to most of us, and we’re starting to get accustomed to voice commands and even haptic feedback.

But every computer interface invented so far, even more exotic concepts like eye tracking, shares a single weakness: it is an external device that needs some kind of artificial middleman to translate our intentions. One way around that limitation is to interface directly with the brain, which is what brain-computer interface (BCI) developer Neurable recently received $2 million in seed money to explore.


Neurable has patent-pending technology that monitors a user’s brain activity to determine their intent, using real-time software and connected devices controlled by the human brain. The company is creating a software development kit (SDK) specifically to let developers integrate its technology into virtual and augmented reality headsets and content.
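The article does not reveal anything about Neurable’s SDK or its actual calls, so the sketch below is purely hypothetical: it only illustrates, in Python, the kind of event-driven surface a platform-agnostic intent SDK might present to a VR or AR developer. Every name in it (BciSession, IntentEvent, on_intent, and so on) is invented for illustration.

```python
# Purely illustrative sketch -- Neurable's real SDK and API are not described
# in this article; every class and method name below is hypothetical.

from dataclasses import dataclass
from typing import Callable, List


@dataclass
class IntentEvent:
    """A decoded user intent aimed at an object in the VR/AR scene."""
    action: str        # the inferred intent, such as "select" or "grab"
    target_id: str     # identifier of the scene object the intent refers to
    confidence: float  # decoder confidence in [0, 1]


class BciSession:
    """Hypothetical wrapper around a headset's EEG stream and intent decoder."""

    def __init__(self, threshold: float = 0.8) -> None:
        self.threshold = threshold
        self.handlers: List[Callable[[IntentEvent], None]] = []

    def on_intent(self, handler: Callable[[IntentEvent], None]) -> None:
        # App code registers callbacks instead of handling raw EEG itself.
        self.handlers.append(handler)

    def dispatch(self, event: IntentEvent) -> None:
        # Forward only decodes the classifier is sufficiently confident about.
        if event.confidence >= self.threshold:
            for handler in self.handlers:
                handler(event)


# Example wiring: a VR app reacts to a "select" intent on a menu item.
session = BciSession(threshold=0.8)
session.on_intent(lambda e: print(f"User intends to {e.action} {e.target_id}"))
session.dispatch(IntentEvent(action="select", target_id="menu_start", confidence=0.93))
```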

(Image credit: University of Michigan)

As Ramses Alcaide, Neurable co-founder and CEO, puts it, “Our goal is to build a new platform for human-computer interaction. Our investors share our vision for the broad potential of our technology and for creating a world without limitations. We appreciate their confidence.”

The initial seed round of funding was led by Brian Shin via Accomplice’s Boston Syndicate, along with Point Judith Capital, Loup Ventures, the Kraft Group, and others. Shin was effusive in his excitement over Neurable’s technology, saying, “The team at Neurable believe that they can enable people to easily control devices and objects with their minds. The implications would be enormous. They have a chance to completely alter the way humans interact with technology which is something that I had to be a part of.”

Neurable’s technology is derived from research conducted at the University of Michigan’s Direct Brain Interface Laboratory. While working on his Ph.D., Alcaide studied under Dr. Jane Huggins, a prominent researcher in the field. By combining new findings on how brainwaves function with new machine learning techniques for complex data analysis, Alcaide hopes to improve the speed and accuracy of determining user intent.
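Neurable has not published the details of its analysis, so the following is a generic illustration only: it trains an off-the-shelf classifier to separate simulated “intent” epochs from “no intent” epochs of EEG-style features. The data is random noise with an artificial offset, and nothing here reflects Neurable’s actual models, features, or accuracy.

```python
# Generic illustration of the underlying idea (not Neurable's method):
# classify short EEG feature epochs as "intent" vs "no intent" with a
# standard machine-learning pipeline, using simulated data.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_epochs, n_features = 400, 32  # e.g. band-power features across channels
X_noise = rng.normal(0.0, 1.0, size=(n_epochs // 2, n_features))   # "no intent"
X_signal = rng.normal(0.5, 1.0, size=(n_epochs // 2, n_features))  # "intent"
X = np.vstack([X_noise, X_signal])
y = np.array([0] * (n_epochs // 2) + [1] * (n_epochs // 2))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Held-out accuracy: {clf.score(X_test, y_test):.2f}")
```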


One of the principal applications of BCI is in VR and AR, where it would enable completely hands-free interaction and avoid the limitations of voice commands and eye-tracking technology. Users won’t need to worry about having holes drilled into their skulls to implant electrodes, however, as Neurable’s technology is wireless, non-invasive, and uses dry electrodes to sense brainwaves.

(Image credit: University of Michigan)

VR and AR companies are expected to be the primary customers of Neurable’s technology and SDK, which is platform-agnostic and will work with the Oculus Rift, HTC Vive, Microsoft HoloLens, and more. The SDK is slated for release in the second half of 2017, so while there is no specific timeline for marketable products, it shouldn’t be too long before we can simply think about what we want our computers to do.

from Planet GS via John Jason Fallows on Inoreader http://ift.tt/2hhI4lY
Mark Coppock
