
America’s most powerful supercomputer is building superior AIs


WHY THIS MATTERS IN BRIEF

It normally takes human data scientists, even aided by the best AIs, weeks to build new neural networks, but ORNL’s Titan supercomputer is building them in less than a day, and the AIs it produces outperform those created by human experts.

 

Last year, in a world first, we saw an Artificial Intelligence (AI), in this case Google’s AutoML, build new AIs, or as some described it, AI “parents” building AI “children,” that immediately outperformed those created by some of the world’s leading data scientists. Dystopian Hollywood script writers have long told stories about the rise of intelligent machines that try to take over and enslave humanity, with The Matrix and Terminator being perhaps the most notable examples, and while we’re hopefully still some way away from realising that future, it is, as one possible timeline, getting closer.

 

This week researchers at the US Department of Energy’s Oak Ridge National Laboratory (ORNL), using Titan, the most powerful supercomputer in the US and the fifth most powerful in the world, announced they’ve created and tested a new AI that can generate neural networks better than human data scientists can. But here’s the crux: it can do it in less than a day, and that’s a groundbreaking achievement.

At the moment it can take even the most experienced data scientists months to create new deep learning programs that send data through a complex web of mathematical algorithms, and even Google’s now famous AutoML took weeks to design its superior image recognition child AI.

Of course, the Google Brain project engineers who created the AI that created the AI “only” had access to 800 Nvidia Graphics Processing Units (GPUs), a type of computer hardware that accelerates the development of new deep learning models, whereas Titan boasts more than 18,000.

 

The ORNL research team’s algorithm, called MENNDL, which stands for “Multinode Evolutionary Neural Networks for Deep Learning,” wasn’t designed to create AIs that identify cute cat photos. Instead, it was designed to test and train thousands of potential neural networks to work on unique science problems, and that requires a different approach from the Google and Facebook AI platforms of the world, notes Steven Young, a postdoctoral research associate at ORNL who helped design MENNDL.

“We’ve discovered that often those [generic] neural networks aren’t optimal for a lot of our problems, because our data, while it can be thought of as images, is different,” he explains, “these images, and the problems, have very different characteristics from [typical] object detection.”
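
MENNDL’s actual code isn’t reproduced here, but the evolutionary idea behind it, generate a population of candidate network designs, score them, keep the best and mutate them into the next generation, can be sketched in a few lines of Python. The snippet below is only a toy illustration under assumed details: the configuration fields (layer count, width, learning rate) and the stand-in fitness function are invented, and where MENNDL farms each evaluation out to one of Titan’s GPU nodes for a real training run, this sketch just computes a made-up score.

```python
import random

# Stand-in for "train this candidate network and report its validation score".
# In a MENNDL-style system each call would be a real training run on a GPU node;
# here it is an invented score that happens to favour moderate-sized networks.
def fitness(config):
    penalty = abs(config["layers"] - 4) + abs(config["width"] - 64) / 32
    penalty += abs(config["learning_rate"] - 1e-3) * 1000
    return 1.0 / (1.0 + penalty)

def random_config():
    return {
        "layers": random.randint(1, 10),
        "width": random.choice([16, 32, 64, 128, 256]),
        "learning_rate": 10 ** random.uniform(-5, -1),
    }

def mutate(config):
    child = dict(config)
    key = random.choice(list(child))
    if key == "layers":
        child["layers"] = max(1, child["layers"] + random.choice([-1, 1]))
    elif key == "width":
        child["width"] = random.choice([16, 32, 64, 128, 256])
    else:
        child["learning_rate"] *= 10 ** random.uniform(-0.5, 0.5)
    return child

def evolve(generations=20, population_size=50, keep=10):
    population = [random_config() for _ in range(population_size)]
    for _ in range(generations):
        # Score every candidate; on a supercomputer these evaluations run in parallel.
        survivors = sorted(population, key=fitness, reverse=True)[:keep]
        # Refill the population by mutating the survivors.
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(population_size - keep)]
    return max(population, key=fitness)

print(evolve())
```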

One application of MENNDL, for example, involved a particle physics experiment at the Fermi National Accelerator Laboratory.

 

Fermilab researchers are interested in understanding neutrinos, high-energy subatomic particles that rarely interact with normal matter but that could be a key to understanding the early formation of the universe. One particular Fermilab experiment involved taking a “snapshot” of neutrino interactions, and the team wanted the help of an AI that could analyse and classify their detector data. MENNDL evaluated 500,000 neural networks in 24 hours, and its final solution proved superior to all the custom models developed by human data scientists.
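
For a rough idea of what “analyse and classify detector data” looks like in code, the PyTorch model below is a small, generic convolutional classifier for image-like snapshots. It is not the Fermilab network: the input resolution, the single input channel and the number of interaction classes are all assumptions made for the example.

```python
import torch
import torch.nn as nn

# Illustrative only: a small convolutional classifier for image-like detector
# "snapshots". The 1 x 64 x 64 input and the 5 interaction classes are assumed
# values, not details of the Fermilab experiment.
class DetectorNet(nn.Module):
    def __init__(self, num_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

model = DetectorNet()
fake_snapshots = torch.randn(8, 1, 64, 64)  # a batch of 8 fake detector images
print(model(fake_snapshots).shape)          # -> torch.Size([8, 5])
```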

Meanwhile, in another case involving a collaboration with St. Jude Children’s Research Hospital in Memphis, MENNDL reduced the error rate of a human-designed AI algorithm for identifying mitochondria inside 3D electron microscopy images of brain tissue by 30 percent.

“MENNDL is able to [build better models] than humans in a fraction of the time for these sorts of very different datasets that we’re interested in,” said Young.

 

What makes MENNDL particularly adept is its ability to identify the optimal “hyper-parameters,” the key variables, for tackling a particular dataset.

“You don’t always need a big, huge deep neural network. Sometimes you just need a small network with the right hyper-parameters,” added Young.
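
To make that point concrete, the scikit-learn snippet below runs a random search over a handful of hyper-parameters, hidden-layer size, learning rate and regularisation strength, for a deliberately small network on a synthetic dataset. It is an illustrative sketch rather than ORNL’s tooling, and the search space, dataset and library choice are all assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import RandomizedSearchCV
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in dataset; in practice this would be the science data.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

# A small search space: network size and learning rate are the "key variables"
# being tuned, and the candidate networks are deliberately modest in size.
search = RandomizedSearchCV(
    MLPClassifier(max_iter=500, random_state=0),
    param_distributions={
        "hidden_layer_sizes": [(8,), (32,), (64,), (64, 64)],
        "learning_rate_init": [1e-4, 1e-3, 1e-2],
        "alpha": [1e-5, 1e-4, 1e-3],
    },
    n_iter=10,
    cv=3,
    random_state=0,
)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```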

This year the team at ORNL expect MENNDL to make an even bigger impact when the lab’s next supercomputer, Summit, comes online. While Summit will boast only 4,600 nodes, down from Titan’s 18,000, it will sport the latest Nvidia GPU technology and CPUs from IBM, which will make it five times more powerful than Titan.

 

“We’ll be able to look at much larger problems on Summit than we were with Titan and hopefully get to a solution much faster,” Young says.

AIs building AIs, now machines building AIs… what next? Self-evolving robots? Ah, I forgot, we saw that last year.
