
Nvidia unveils Isaac, a virtual simulator to help companies create advanced robots

WHY THIS MATTERS IN BRIEF

As the world comes to rely more and more on bots and robots, we need better, and faster, ways to train them.

 

A newly expanded simulator system could be our high-speed ticket to building fully autonomous bots and robots, or so says Nvidia’s CEO.

 


 

Tech giant Nvidia recently announced the introduction of its Isaac robotics platform at this year’s GPU Technology Conference in California. Isaac is an advanced, deep learning based virtual training system for autonomous robots, and it now includes an open-source software development kit (SDK) with libraries, drivers, APIs and other tools used for building robotic applications.

“Everything that moves will be autonomous,” said NVIDIA’s founder and CEO, Jensen Huang.

The SDK is a bold step forward, as it aims to simplify the job of combining autonomous bots with artificial intelligence. With the expanded platform, developers can improve robotic movement and interaction, as well as manage data and communications in real time.

Along with program development, the Isaac SDK will also tackle the testing and training of machines, a stage that, for autonomous machines, can be extremely complex and time-consuming in the physical world. Luckily, the tools and applications within the SDK are made to pair seamlessly with one of Isaac’s other evolving features, called Isaac Sim, which provides a highly detailed virtual world where roboticists can immerse and train autonomous bots.
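To make the idea of simulation-based training concrete, here is a deliberately tiny, hypothetical sketch of the workflow the article describes: a robot policy is trained over many cheap simulated episodes before it ever touches hardware. None of the class or method names below come from the Isaac SDK; the corridor world, the Q-table and the hyperparameters are all illustrative assumptions.

```python
# A toy, hypothetical sketch of simulation-driven training. Nothing here is the
# real Isaac API: SimulatedCorridor stands in for Isaac Sim's virtual world, and
# the Q-table stands in for whatever learning algorithm a developer plugs in.
import random

class SimulatedCorridor:
    """Tiny 1-D world: the robot must drive from position 0 to the far end."""
    def __init__(self, length=10):
        self.length = length

    def reset(self):
        self.pos = 0
        return self.pos

    def step(self, action):  # action: 0 = move back, 1 = move forward
        self.pos = max(0, min(self.length, self.pos + (1 if action else -1)))
        done = self.pos == self.length
        return self.pos, (1.0 if done else 0.0), done

env = SimulatedCorridor()
q = [[0.0, 0.0] for _ in range(env.length + 1)]   # one value per (state, action)
alpha, gamma, epsilon = 0.5, 0.9, 0.1             # illustrative hyperparameters

# Hundreds of cheap simulated episodes stand in for months of physical trials.
for episode in range(200):
    state, done = env.reset(), False
    while not done:
        if random.random() < epsilon:
            action = random.randrange(2)           # occasional exploration
        else:
            action = max((1, 0), key=lambda a: q[state][a])
        next_state, reward, done = env.step(action)
        # Standard Q-learning update toward reward plus discounted future value.
        q[state][action] += alpha * (reward + gamma * max(q[next_state]) - q[state][action])
        state = next_state

# The learned table is the "brain" that would later be copied onto the real robot.
print([max((1, 0), key=lambda a: q[s][a]) for s in range(env.length)])
```

In the real workflow the simulated world is of course far richer, with physics, photorealistic sensors and whole buildings, but the shape of the loop, reset, act, observe, update, repeated thousands of times at virtual speed, is what turns months of testing into minutes.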

 


 

The algorithms trained in this simulated suite are then transferred to the robot’s brain, an AI computing module called NVIDIA Jetson, so that the fruits of that training can be applied to real-world environments.
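The article does not say which format NVIDIA uses for this hand-off, so the snippet below is only a hedged illustration of the general idea: a small policy network trained on a workstation is frozen into a single portable file, then loaded and run on the robot’s onboard computer. The network shape, the file name and the use of TorchScript are assumptions, not details taken from the Isaac platform.

```python
# Hypothetical sim-to-real hand-off: export a trained policy on the workstation,
# then load it again on the embedded computer. File name and network are made up.
import torch
import torch.nn as nn

# Pretend this small network was trained inside the simulator (weights omitted).
policy = nn.Sequential(nn.Linear(8, 64), nn.ReLU(), nn.Linear(64, 2))
policy.eval()

# Workstation side: freeze the policy into one portable, self-contained file.
example_obs = torch.zeros(1, 8)                  # e.g. 8 range/odometry readings
torch.jit.trace(policy, example_obs).save("carter_policy.pt")

# Robot side: load the same file and run it on live sensor data.
deployed = torch.jit.load("carter_policy.pt")
with torch.no_grad():
    action = deployed(torch.zeros(1, 8))         # placeholder observation
print(action)
```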

 

Watch Carter navigate its simulated environment

 

This simplified process eases some major industry-wide challenges: testing costs and the risk of physical accidents decrease, for example, and autonomous bots will no longer require months of testing, as training time is reduced to mere minutes.

NVIDIA also revealed a reference robot design that its SDK could eventually power. Although it remained static on the show floor, the small bot, known as Carter, is meant to work as a delivery machine in the service industries.

 


 

“Carter is just an assembly of various technologies, like lidar, stereo camera… but also localization, mapping and framework,” says Claire Delaunay, NVIDIA’s vice president of engineering, in a video produced by the company. “We had this robot navigating in a fully simulated environment based on an existing building… Then, we tried the robot in reality and we actually let the robot run. We discovered that we were able to localize the robot fully in reality using a simulated model, and it was a very exciting moment.”
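Delaunay doesn’t spell out the algorithm behind that moment, so the sketch below is only a generic illustration of the underlying idea: if you already have a map of the building from the simulator, you can score candidate poses by how well the ranges a sensor should see from each pose match what the real sensor reports. The floor plan, the ray-casting helper and the brute-force pose search are all hypothetical stand-ins, not NVIDIA’s implementation.

```python
# Hypothetical localization against a map that "came from the simulator".
# Everything here (floor plan, helpers, numbers) is illustrative only.
import math
import random

# 1 = wall, 0 = free space: a crude occupancy grid of the building.
floor_plan = [
    [1, 1, 1, 1, 1, 1],
    [1, 0, 0, 0, 0, 1],
    [1, 0, 1, 1, 0, 1],
    [1, 0, 0, 0, 0, 1],
    [1, 1, 1, 1, 1, 1],
]

def expected_range(x, y, heading, max_range=5.0, step=0.1):
    """Ray-cast through the map to predict what a range sensor should see."""
    d = 0.0
    while d < max_range:
        cx = int(x + d * math.cos(heading))
        cy = int(y + d * math.sin(heading))
        if not (0 <= cy < len(floor_plan) and 0 <= cx < len(floor_plan[0])):
            return d
        if floor_plan[cy][cx]:
            return d
        d += step
    return max_range

def localize(measured, headings, n_candidates=500):
    """Score random candidate poses by how well predicted ranges match the sensor."""
    best_pose, best_error = None, float("inf")
    for _ in range(n_candidates):
        x, y = random.uniform(1, 5), random.uniform(1, 4)
        if floor_plan[int(y)][int(x)]:
            continue                               # skip poses inside walls
        error = sum((expected_range(x, y, h) - z) ** 2
                    for h, z in zip(headings, measured))
        if error < best_error:
            best_pose, best_error = (x, y), error
    return best_pose

# Fake readings taken as if the real robot were standing near (1.5, 1.5).
headings = [0.0, math.pi / 2, math.pi, -math.pi / 2]
readings = [expected_range(1.5, 1.5, h) for h in headings]
print(localize(readings, headings))                # should land close to (1.5, 1.5)
```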

Huang and his team believe that the Isaac SDK has potential that reaches far beyond Carter’s delivery and service applications, and they suggest we should expect to see the platform being used in manufacturing, agriculture, construction, and other fields quite soon.

Early access to the Isaac SDK is currently open for registration.
