News Release

Brain organoids could unlock energy-efficient AI

With $2 million in NSF support, an interdisciplinary team at Lehigh University is studying how neurons process information to design smarter, more sustainable artificial intelligence

Grant and Award Announcement

Lehigh University

Image: Neural organoid in a 3D-printed scaffold. (Credit: Courtesy of Yevgeny Berdichevsky/Lehigh University)

Our brains are masters of efficiency.

“Biology is very energy optimized,” says Yevgeny Berdichevsky, an associate professor of bioengineering and electrical and computer engineering at Lehigh University. “The amount of energy the brain uses at any given time is roughly equal to a light bulb in terms of wattage. Replicating those computations in hardware would demand orders of magnitude more power.” 

Berdichevsky and his collaborators in Lehigh’s P.C. Rossin College of Engineering and Applied Science and College of Health were recently awarded a $2 million grant from the National Science Foundation to explore the complex information processing that occurs in the brain and harness it to make artificial intelligence both more powerful and energy efficient. 

The funding comes from NSF’s Emerging Frontiers in Research and Innovation (EFRI) program, which supports research into using biological substrates (what Berdichevsky calls “wetware”) to replicate the countless computations our brains perform—such as processing sensory input to create a picture of the world and directing our muscles to act on it. 

“People have long built hardware-based neural networks to mimic the human brain,” says Berdichevsky. “But real brain circuits perform complex tasks that hardware still can’t. We want to identify those computations to inspire the next generation of AI algorithms—improving not only their efficiency, but also their capacity to process information.” 

The team will study neurons within a brain organoid, a millimeter-sized, three-dimensional structure grown in the lab from adult stem cells that resembles a developing brain.

Their first challenge is to organize the neurons to resemble the human cortex.

“In organoids, neurons connect randomly,” says Berdichevsky. “In our brains, they’re highly ordered—and we need that control for computation.”

To solve this, team member Lesley Chow, an associate professor of bioengineering and materials science and engineering, will fabricate 3D-printed biomaterial scaffolds that guide neuron placement.

“We’ve learned that we can insert neural spheroids—clusters of different neural types—into scaffold sockets, stack the layers, and essentially engineer the whole organoid from the bottom up,” Berdichevsky explains.

Next, the researchers will test whether the neurons can perform dynamic computations, such as interpreting moving images. Today’s drones and autonomous vehicles rely on an “optical flow” algorithm within computer vision software to track motion. But, says Berdichevsky, the software is less effective than hoped.
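Optical flow itself is a well-defined computation: estimating how each part of an image moves between consecutive frames. As an illustration of what the neurons would be asked to replicate, here is a minimal sketch of the classical Lucas–Kanade gradient method—not the team's implementation; the frames and window size below are invented for the example.

```python
import numpy as np

def lucas_kanade_flow(frame1, frame2, y, x, win=4):
    """Estimate the optical flow (vx, vy) at pixel (y, x) from two
    consecutive frames via the classical Lucas-Kanade least-squares fit."""
    f1 = frame1.astype(float)
    f2 = frame2.astype(float)
    Iy, Ix = np.gradient(f1)   # spatial intensity gradients
    It = f2 - f1               # temporal intensity change
    sl = np.s_[y - win:y + win + 1, x - win:x + win + 1]
    # Brightness constancy in the window: Ix*vx + Iy*vy = -It
    A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
    b = -It[sl].ravel()
    (vx, vy), *_ = np.linalg.lstsq(A, b, rcond=None)
    return vx, vy

# Synthetic example: a vertical bright stripe shifted one pixel to the right
frame1 = np.zeros((20, 20))
frame1[:, 8:12] = 1.0
frame2 = np.roll(frame1, 1, axis=1)
vx, vy = lucas_kanade_flow(frame1, frame2, 10, 10)
# vx is approximately 1.0 (one pixel per frame to the right), vy is 0
```

The least-squares fit pools intensity gradients over a small window, which is what makes the method cheap on conventional hardware yet brittle in cluttered natural scenes—the shortcoming Berdichevsky hopes neural dynamics can overcome.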

“My goal is to use the complex dynamics of cortical neurons to do this better and with less energy,” he says. 

The team will adapt methods from earlier studies where they stimulated neurons with light. They encoded an image into a sequence of optical pulses, then directed the pulses to specific neurons, allowing the cells to “see” the images. 
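One common way to encode an image as pulses is rate coding: brighter pixels trigger more pulses per unit time. The sketch below is only a plausible illustration of that idea—the pulse counts, bin structure, and mapping are assumptions, not the team's published protocol.

```python
import numpy as np

def image_to_pulse_trains(image, n_bins=20, seed=0):
    """Map each pixel's intensity to a train of optical pulses
    (rate coding): brighter pixels fire in more time bins.
    Returns a boolean array of shape (n_pixels, n_bins)."""
    rng = np.random.default_rng(seed)
    rates = image.ravel() / image.max()  # normalize to [0, 1]
    # In each time bin, a pixel's target neuron receives a pulse with
    # probability equal to that pixel's normalized brightness.
    return rng.random((rates.size, n_bins)) < rates[:, None]

image = np.array([[0.0, 0.5],
                  [1.0, 0.25]])
trains = image_to_pulse_trains(image)
# The dark pixel (0.0) never fires; the brightest pixel (1.0) fires in every bin.
```

Each row of `trains` is the pulse schedule for one neuron; in the experiments described above, those pulses would be delivered as light to optogenetically sensitized cells.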

“It’s not so different from the way it works in our brain,” he says. “Our eyes essentially transform optical information into electrical information that then travels to the neurons in the cortex. Here, we bypass the eye and stimulate the neurons directly.”

Once stimulated, input neurons can relay information to output neurons, and the team can measure neural activity using a microscope. “Based on this previous work, we know we can get information into the network. The next step is to have the network do something useful with it, which is the purpose of this new project.”

The researchers plan to implement a biological version of the optical flow algorithm by playing a movie of natural scenes (fields, animals) through optical pulses to determine if the network detects motion.

“We’ll be expressing a gene in these neurons that encodes a fluorescent protein,” he says. “The protein increases its fluorescence when the neuron is active and decreases it when it’s not. We can then take snapshots of which neurons are active and which are not.” 

Yuntao Liu, an assistant professor of electrical and computer engineering, and Berdichevsky will then develop a decoding algorithm and a computer model to interpret these patterns. By analyzing which neurons light up, the algorithm should reveal not only what the network is perceiving, but also the speed and direction of moving objects. The computer model will help the team design learning and training protocols for the engineered organoid.
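The decoder itself has not been described, but the general shape of such a step is familiar from neural-data analysis. A minimal nearest-centroid sketch, in which activity snapshots are averaged into one template per motion direction—the neuron counts, labels, and data below are entirely synthetic:

```python
import numpy as np

def train_templates(snapshots, labels):
    """Average the activity snapshots for each motion label into
    one template pattern per label (a nearest-centroid decoder)."""
    return {lab: snapshots[labels == lab].mean(axis=0)
            for lab in np.unique(labels)}

def decode(snapshot, templates):
    """Assign the label whose template is closest to the snapshot."""
    return min(templates,
               key=lambda lab: np.linalg.norm(snapshot - templates[lab]))

# Synthetic example: 6 neurons; "left" motion mostly activates the first
# three, "right" motion the last three (with noise).
rng = np.random.default_rng(1)
left = (rng.random((50, 6)) < [0.9, 0.9, 0.9, 0.1, 0.1, 0.1]).astype(float)
right = (rng.random((50, 6)) < [0.1, 0.1, 0.1, 0.9, 0.9, 0.9]).astype(float)
snapshots = np.vstack([left, right])
labels = np.array(["left"] * 50 + ["right"] * 50)

templates = train_templates(snapshots, labels)
decoded = decode(np.array([1.0, 1, 1, 0, 0, 0]), templates)
```

A real decoder would likely be a trained statistical model rather than simple templates, but the input and output are the same: binary active/inactive snapshots in, an estimate of the perceived motion out.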

Berdichevsky and his team hope to develop a proof of concept that engineered organoids can support biological computation and to use those findings to inspire more efficient and more powerful artificial neural networks. 

Ally Peabody Smith, an assistant professor of community and population health, will explore the ethical, social, and legal implications of utilizing brain organoids. “We don’t expect these organoids to be conscious—they’re far too small and simple,” Berdichevsky says. “But we recognize the ethical concerns and want to demonstrate that our work stays well below any threshold of consciousness.”

Such an ambitious undertaking demands expertise across disciplines, which Berdichevsky finds most compelling.

“I like the integrated design,” he says. “Computational algorithms, neuroscience, bioengineering, tissue engineering, and even philosophy all must work together. It’s a truly multidisciplinary effort.”
