
Samsung uses IBM’s brain chip to build a digital eye


IBM’s TrueNorth, a cognitive “neuromorphic” computing chip whose architecture resembles that of the human brain, albeit with transistors standing in for neurons, is being trialled by Samsung to improve the Dynamic Vision Sensors (DVS) in their next generation gadgets, essentially helping them build the equivalent of a digital eye that is far superior to today’s camera-based technologies. And they’re not the only ones using the chip in anger.

IBM’s chip has been optimised for processing large amounts of data on the fly, and its 4,096 cores combine to create about a million “digital” neurons and 256 million synaptic connections. As a consequence it not only operates extremely quickly but also, most importantly for Samsung, consumes far less energy than typical processors, drawing only 300 milliwatts of power.


That’s a hundredth of the power consumption of a typical laptop and about a tenth of most smartphones. That said, despite the chip’s lofty aspirations of mimicking the architecture of the human brain, it still has some way to go before it can match the brain, which processes the same tasks using around a hundred million times less power than a conventional computer.
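Those ratios are easy to sanity-check with back-of-the-envelope arithmetic. The laptop and smartphone wattages below are assumed illustrative figures, since the article only gives the ratios:

```python
# Back-of-the-envelope check of the power comparison. The laptop and
# smartphone figures are assumed for illustration; only the 300 mW
# TrueNorth figure comes from the article.
truenorth_watts = 0.3    # 300 milliwatts
laptop_watts = 30.0      # assumed: a laptop under typical load
smartphone_watts = 3.0   # assumed: a smartphone under typical load

print(laptop_watts / truenorth_watts)      # 100x, i.e. "a hundredth"
print(smartphone_watts / truenorth_watts)  # 10x, i.e. "a tenth"
```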

By using the new chip in their gadgets Samsung has been able to demonstrate that they can push the frame rate from a paltry 120 frames per second, which is considered superb even for the most expensive gadgets, to over 2,000. At that frame rate things suddenly become interesting, because once you can begin measuring what’s termed the “time of flight” of light, that is, the amount of time it takes light to travel from a surface to the surface of the sensor, a whole host of new applications open up.
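The time-of-flight principle itself reduces to one line of arithmetic: distance is the round-trip travel time of a light pulse multiplied by the speed of light, halved. A minimal sketch of that relationship:

```python
# Illustrative sketch of the time-of-flight principle described above:
# a sensor recovers distance from how long light takes to reach a
# surface and bounce back.
C = 299_792_458  # speed of light in metres per second

def distance_from_round_trip(t_seconds):
    """Distance to a surface, given the round-trip travel time of light."""
    return C * t_seconds / 2

# A pulse returning after 10 nanoseconds implies a surface ~1.5 m away.
d = distance_from_round_trip(10e-9)
```

The timings involved are tiny, which is why raw sensor speed matters so much: at everyday indoor distances the round trip takes only a handful of nanoseconds.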

In Samsung’s case these equate to better gesture recognition at a distance of over ten feet and the ability to better track motion in 3D space, but across the way at Lawrence Livermore National Laboratory they’re using the same technology to successfully pick vehicles out of cluttered video surveillance simulations. Meanwhile some organisations are using the new chips to create better 3D maps of the world, while others are embedding them into self-driving cars and drones.

If you want to know where this technology will ultimately take us, it’s into a world of seeing machines, and, when combined with AI and all the other emerging technologies now arriving, it will be looked back on as one of the technologies that helped revolutionise the world.

Freaky.
