Srivatsa Krishna
Have you ever wondered how many people in the world have sleep disorders? The estimates vary: one study says 30-40% of the population in every country suffers from some kind of sleep disorder or another. Another pegs the number at over a billion people worldwide, roughly 1/7th of the total population. A cutting-edge wearables startup called NextSense, incubated at X, Alphabet's moonshot laboratory, is trying to find a way to your brain through your ears, using earbuds and advanced sensing technology. Electroencephalogram (EEG)-reading earbuds, to put it simply.
NextSense is a brain health company that is trying to get to your brain and cure, or at least ameliorate, brain disorders by analysing neural activity. While Elon Musk's Neuralink gets into your brain with an implanted chip, NextSense is doing so in a less invasive way, through your ears. Given that a whole new category of smartwatches and smart rings already monitors your vitals, it is, in a way, a natural idea to get to your brain through earbuds, which people wear for several hours a day. But what impact will wearing buds for extended periods have on your hearing?
The NextSense platform uses longitudinal EEG data collected at the point of experience with biosensing earbuds that can be worn comfortably at night and as needed throughout the day. The company correlates this data with behavioural patterns collected from other smart devices to identify triggers, diagnose certain conditions, and tailor treatment and medication in ways hitherto not possible.
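To picture what that correlation step could look like, here is a minimal, illustrative Python sketch. NextSense has not published its pipeline, so the data shapes, column names, and the simple correlation used below are assumptions for illustration, not the company's actual method.

```python
# Illustrative sketch only: the column names and values below are invented
# to show how nightly EEG summaries might be lined up against behavioural
# data from other devices; this is not NextSense's published pipeline.
import pandas as pd

# Hypothetical nightly summaries derived from earbud EEG recordings.
eeg_nights = pd.DataFrame({
    "date": pd.date_range("2024-01-01", periods=5, freq="D"),
    "deep_sleep_minutes": [92, 75, 60, 110, 85],
    "apnea_events": [4, 9, 14, 2, 6],
})

# Hypothetical behavioural signals pulled from other smart devices.
behaviour = pd.DataFrame({
    "date": pd.date_range("2024-01-01", periods=5, freq="D"),
    "caffeine_mg": [120, 250, 300, 80, 150],
    "steps": [9000, 4000, 3500, 11000, 7000],
})

# Join the two longitudinal streams on the calendar date and look for
# simple relationships, e.g. whether caffeine intake tracks apnea events.
merged = eeg_nights.merge(behaviour, on="date")
corr = merged[["deep_sleep_minutes", "apnea_events",
               "caffeine_mg", "steps"]].corr()
print(corr.round(2))
```

In practice, identifying triggers would take far richer data and clinical validation; the sketch only shows the basic idea of merging two longitudinal streams by date and examining how they move together.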
NextSense’s founder Jonathan Berent started off with Computer Science at Stanford but ended up with Philosophy, which might be the perfect combination for understanding the world of sleep, dreams, and neurons. The brief demo he gives is impressive, but the company still has some way to go before commercialisation, if that happens at all. Or the product may end up as a medical device used in hospitals.
A medical EEG is cumbersome and takes time, with electrodes strapped to your skull and body. It is the best way to study sleep apnea, which many people have without even realising it. Skeptics were initially not convinced that the tiny sensors in earbuds could pick up weak electrical brain signals, and even if they did, the next challenge would be miniaturisation. NextSense has conquered both challenges and has even achieved a tight in-ear seal using Tecticoat, a super-pliable, conductive coating.
With recent dramatic improvements in artificial intelligence (AI) and machine learning (ML), and the emergence of large language models (LLMs), this could be the tipping point where AI can be used to wade through the tonnes of data gathered from your ears and come up with a medically meaningful diagnosis. What is perhaps still a few years away is FDA approval and commercialisation, leading to something like an Apple AirPods-style device that one can not only use for calls but that also becomes a companion for measuring the state of your brain and sleep. NextSense has raised about $12 million, but needs much more funding to achieve its lofty and laudable goals.
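As a purely illustrative example of the kind of machine learning alluded to above, the toy Python sketch below trains a simple classifier on synthetic per-night features to flag nights a clinician might want to review. The features, thresholds, and labels are invented for illustration and do not reflect NextSense's models or any approved diagnostic.

```python
# Toy sketch, not NextSense's method: train a simple classifier on
# synthetic "per-night EEG feature" data to illustrate how ML might
# flag nights worth a clinician's attention.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500

# Two made-up per-night features: fraction of the night in deep sleep
# and a count of apnea-like events; the label marks nights to flag.
deep_sleep_frac = rng.uniform(0.05, 0.35, n)
apnea_events = rng.poisson(6, n)
labels = ((apnea_events > 8) & (deep_sleep_frac < 0.2)).astype(int)

X = np.column_stack([deep_sleep_frac, apnea_events])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.25, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")
```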
How much usage is required to get a medical-grade EEG, and whether that trade-off makes sense, is the million-dollar question. Just as AirPods and Fitbit created whole new genres of smart buds and watches, NextSense is certainly onto something with its mission to study your brain through your ears, and a pot of gold may not be too far away. The company is young, barely five years into its exciting journey. Others in the broader healthtech space, such as Athelas and Therabody, are a unicorn and a soonicorn respectively, and one could argue they operate in a technologically less complex space than NextSense.
KEY TAKEAWAYS
- NextSense collects longitudinal EEG data using biosensing earbuds, worn at night and as needed during the day
- Raised about $12 million but needs significantly more to achieve its goals
- Jonathan Berent started off with Computer Science at Stanford but switched to Philosophy, preparing him to explore sleep, dreams, and neurons
(The author is an IAS officer. Views are personal.)