
A ball of brain cells on a chip just learned maths and speech recognition


So, you think your brain has to be in your head to analyse and learn things? Well, you might be wrong about that … and it could be the future of computing and AI.



Recently I talked about a new synthetic biological intelligence breakthrough that saw human brain cells beat some of the best Artificial Intelligences (AI) at playing Pong. And now those brain cells in a dish – which may yet be in search of a human body or a mouse to merge with – have learned some new tricks.




A tiny ball of brain cells hums with activity as it sits atop an array of electrodes. For two days, it receives a pattern of electrical zaps, each stimulation encoding the speech peculiarities of eight people. By day three, it can discriminate between speakers.

Dubbed Brainoware, the system raises the bar for biological computing by tapping into 3D brain organoids, or “mini-brains.” These models, usually grown from human stem cells, rapidly expand into a variety of neurons knitted into neural networks.

Like their biological counterparts, the blobs spark with electrical activity – suggesting they have the potential to learn, store, and process information. Scientists have long eyed them as a promising hardware component for brain-inspired computing.




This week, a team at Indiana University Bloomington turned theory into reality with Brainoware. They connected a brain organoid resembling the cortex – the outermost layer of the brain that supports higher cognitive functions – to a wafer-like chip densely packed with electrodes.




The mini-brain functioned like both the central processing unit and memory storage of a supercomputer. It received input in the form of electrical zaps and outputted its calculations through neural activity, which was subsequently decoded by an AI tool.
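This division of labour resembles reservoir computing, where a fixed, high-dimensional dynamical system transforms inputs and only a lightweight external readout is trained. A minimal sketch, with a simulated random recurrent network standing in for the organoid – the sizes, parameters, and toy task are all illustrative, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed "reservoir" standing in for the organoid: a random recurrent network.
# Its weights are never trained, mirroring how the organoid itself is not
# programmed directly.
N_IN, N_RES = 8, 200
W_in = rng.normal(scale=0.5, size=(N_RES, N_IN))
W_res = rng.normal(scale=1.0 / np.sqrt(N_RES), size=(N_RES, N_RES))

def reservoir_response(x_seq):
    """Drive the reservoir with an input sequence; return its final state."""
    state = np.zeros(N_RES)
    for x in x_seq:
        state = np.tanh(W_in @ x + W_res @ state)
    return state

# Toy task: classify which of two fixed input patterns drove the reservoir.
patterns = rng.normal(size=(2, 5, N_IN))  # 2 classes, 5 time steps each
X = np.array([
    reservoir_response(patterns[i % 2] + 0.1 * rng.normal(size=(5, N_IN)))
    for i in range(100)
])
y = np.array([i % 2 for i in range(100)])

# Only the linear readout is trained (ridge regression), analogous to the
# external AI decoder interpreting the organoid's activity.
ridge = 1e-2
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N_RES), X.T @ y)
preds = (X @ W_out > 0.5).astype(int)
accuracy = (preds == y).mean()
```

The design point is that all the "learning" visible to the outside world happens in a cheap linear readout; the heavy nonlinear transformation is delegated to the untrained dynamical system.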

When trained on soundbites from a pool of people – transformed into electrical zaps – Brainoware eventually learned to pick out the “sounds” of specific people. In another test, the system successfully tackled a complex math problem that’s challenging for AI.

The system’s ability to learn stemmed from changes to neural network connections in the mini-brain – which is similar to how our brains learn every day. Although just a first step, Brainoware paves the way for increasingly sophisticated hybrid bio computers that could lower energy costs and speed up computation.




The setup also allows neuroscientists to further unravel the inner workings of our brains.

“While computer scientists are trying to build brain-like silicon computers, neuroscientists are trying to understand the computations of brain cell cultures,” wrote Drs. Lena Smirnova, Brian Caffo, and Erik C. Johnson of Johns Hopkins University, who were not involved in the study. Brainoware could offer new insights into how we learn, how the brain develops, and even help test new therapeutics for when the brain falters.

With its roughly 86 billion neurons networked into some 100 trillion connections, the human brain is perhaps the most powerful computing hardware known.

Its setup is inherently different from that of classical computers, which have separate units for data processing and storage. Each task requires the computer to shuttle data between the two, which dramatically increases computing time and energy. In contrast, both functions unite at the same physical spot in the brain.




These junctions, called synapses, connect neurons into networks. Synapses learn by changing how strongly they connect with others – upping the connection strength with collaborators that help solve problems and storing the knowledge at the same spot.

The process may sound familiar. Artificial neural networks, an AI approach that’s taken the world by storm, are loosely based on these principles. But the energy needed is vastly different. The brain runs on 20 watts, roughly the power needed to run a small desktop fan. A comparable artificial neural network consumes eight million watts. The brain can also easily learn from a few examples, whereas AI notoriously relies on massive datasets.

Scientists have tried to recapitulate the brain’s processing properties in hardware chips. Built from exotic components that change properties with temperature or electricity, these neuromorphic chips combine processing and storage within the same location. These chips can power computer vision and recognize speech. But they’re difficult to manufacture and only partially capture the brain’s inner workings.

Instead of mimicking the brain with computer chips, why not just use its own biological components? Rest assured, the team didn’t hook living brains to electrodes. Instead, they turned to brain organoids. In just two months, the mini-brains, made from human stem cells, developed into a range of neuron types that connected with each other in electrically active networks.




The team carefully dropped each mini-brain onto a stamp-like chip jam-packed with tiny electrodes. The chip can record the brain cells’ signals from over 1,000 channels and zap the organoids using nearly three dozen electrodes at the same time. This makes it possible to precisely control stimulation while recording the mini-brain’s activity. An AI tool then translates the abstract neural outputs into human-friendly responses on a normal computer.

In a speech recognition test, the team recorded 240 audio clips of eight people speaking, each clip capturing an isolated vowel. They transformed the dataset into unique patterns of electrical stimulation and fed these into a newly grown mini-brain. In just two days, the Brainoware system learned to discriminate between different speakers with nearly 80 percent accuracy.
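The article doesn’t spell out how audio becomes electrical stimulation, but the general idea – mapping a clip’s spectral content onto per-electrode pulse amplitudes over time – can be sketched as follows. The encoder, electrode count, and bin sizes here are all hypothetical:

```python
import numpy as np

def audio_to_stim_pattern(clip, n_electrodes=32, n_bins=20):
    """Toy encoder: map an audio clip's spectral energy to per-electrode
    stimulation amplitudes over time (illustrative, not the paper's scheme)."""
    # Split the clip into time bins and take the magnitude spectrum of each.
    frames = np.array_split(np.asarray(clip, dtype=float), n_bins)
    pattern = np.zeros((n_bins, n_electrodes))
    for t, frame in enumerate(frames):
        spectrum = np.abs(np.fft.rfft(frame))
        # Pool the spectrum into one energy value per electrode.
        bands = np.array_split(spectrum, n_electrodes)
        pattern[t] = [band.mean() for band in bands]
    # Normalise to [0, 1] so amplitudes map onto safe stimulation levels.
    return pattern / (pattern.max() + 1e-12)

# Example: a synthetic two-tone "vowel", 0.25 s at an 8 kHz sample rate.
sr = 8000
t = np.arange(int(0.25 * sr)) / sr
clip = np.sin(2 * np.pi * 300 * t) + 0.5 * np.sin(2 * np.pi * 900 * t)
stim = audio_to_stim_pattern(clip)  # shape: (time bins, electrodes)
```

Each row of `stim` would then drive one round of stimulation across the electrode array, turning a sound into a spatiotemporal zap pattern the organoid can "hear."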

Using a popular neuroscience measure, the team found the electrical zaps “trained” the mini-brain to strengthen some networks while pruning others, suggesting it rewired its networks to facilitate learning.

In another test, Brainoware was pitted against AI on a challenging math task that could help generate stronger passwords. Although slightly less accurate than an AI with short-term memory, Brainoware was much faster. Without human supervision, it reached nearly comparable results in less than 10 percent of the time it took the AI.
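That math task has been reported as forecasting a Hénon map, a chaotic nonlinear system whose hard-to-predict trajectories make it relevant to generating strong random sequences – treat that identification as outside context rather than a detail from this article. Generating the map itself is straightforward:

```python
def henon_map(n, a=1.4, b=0.3, x0=0.0, y0=0.0):
    """Generate n points of the Hénon map with the classic parameters:
    x_{k+1} = 1 - a * x_k**2 + y_k,   y_{k+1} = b * x_k."""
    xs, ys = [x0], [y0]
    for _ in range(n - 1):
        x, y = xs[-1], ys[-1]
        xs.append(1 - a * x * x + y)  # quadratic term makes it nonlinear
        ys.append(b * x)
    return xs, ys

xs, ys = henon_map(1000)
```

A model only forecasts this sequence well if it captures the quadratic nonlinearity, which is why chaotic-map prediction is a standard benchmark for reservoir-style systems.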




“This is a first demonstration of using brain organoids [for computing],” study author Dr. Feng Guo told MIT Technology Review. The new study is the latest to explore hybrid biocomputers – a mix of neurons, AI, and electronics.

Back in 2020, a team merged artificial and biological neurons in a network that communicated using the brain chemical dopamine. More recently, nearly a million neurons, lying flat in a dish, learned to play the video game Pong from electrical zaps.

Brainoware is a potential step up – and a step closer to the world of cyborg beings. Compared to isolated neurons, organoids better mimic the human brain and its sophisticated neural networks. But they’re not without faults. Similar to deep learning algorithms, the mini-brains’ internal processes are unclear, making it difficult to decode the “black box” of how they compute – and how long they retain memories.

Then there’s the “wet lab” problem. Unlike a computer processor, mini-brains can only tolerate a narrow range of temperatures and oxygen levels, and they are constantly at risk of infection by disease-causing microbes. This means they have to be carefully grown in a nutrient broth using specialized equipment. The energy required to maintain these cultures may offset gains from the hybrid computing system.




However, mini-brains are increasingly easier to culture with smaller and more efficient systems – including those with recording and zapping functions built-in. The harder question isn’t about technical challenges; rather, it’s about what’s acceptable when using human brains as a computing element. AI and neuroscience are rapidly pushing boundaries, and brain-AI models will likely become even more sophisticated.

“It is critical for the community to examine the myriad of neuroethical issues that surround biocomputing systems incorporating human neural tissues,” wrote Smirnova, Caffo, and Johnson.
