
Researchers unveil the first telepathic link between a robot and a human


WHY THIS MATTERS IN BRIEF

  • One of the biggest problems in manufacturing is having to re-train robots to complete new tasks, but now robot-human telepathy could make that problem disappear

 

Those of you who wish others could read your mind must think it’s Christmas, whether it’s because Zuckerberg seems committed to turning Facebook into the world’s largest telepathic network, or because people and rats have already managed to communicate with each other telepathically.

The dictionary’s definition of telepathy is “the communication of thoughts or ideas by means other than the known senses,” and now that definition can be applied to a breakthrough by a team of scientists led by Professor Thenkurussi Kesavadas in the Department of Industrial and Enterprise Systems Engineering at the University of Illinois (UI), who have created what they’re calling the world’s first telepathic link with a robot using a Brain Machine Interface (BMI).

 


I knew you’d be excited. Robots that can read our minds, who wouldn’t be?

In its third year of funding, this National Science Foundation project has proven that human operators can look at an object on an assembly line and, via the sensors in a skull cap, have a robot automatically remove it for them without the robot having to be trained first.

“The robot is actually monitoring your thinking process,” says Kesavadas, “if the robot realizes you saw something bad, it goes and takes care of it. That is the fundamental idea in manufacturing we are trying to explore.”

Kesavadas and his team have created a system that runs parts along a conveyor belt. A camera takes pictures of the objects passing beneath it while the team watch a monitor, and when they spot a faulty item that recognition creates a brainwave spike which the robot picks up on, triggering it to remove the part from the belt.
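
To make the pipeline concrete, here’s a minimal sketch of that detect-and-remove loop. The headset, camera, and robot interfaces (read_brainwaves(), capture(), remove_part()) and the threshold are hypothetical placeholders for illustration, not the UI team’s actual software.

```python
import numpy as np

SPIKE_THRESHOLD = 50.0  # illustrative power threshold, not a published value

def spike_detected(window: np.ndarray) -> bool:
    # Crude placeholder: flag a response when the mean power of the
    # latest EEG window jumps above a fixed threshold.
    return float(np.mean(np.square(window))) > SPIKE_THRESHOLD

def monitoring_loop(headset, robot, camera):
    """Watch the operator's EEG while parts pass under the camera; when a
    spike is detected, tell the robot to pull the current part off the belt."""
    while True:
        frame = camera.capture()              # part currently under the camera
        window = headset.read_brainwaves()    # latest EEG samples from the skull cap
        if spike_detected(window):            # operator noticed a defect
            robot.remove_part(frame.part_id)  # hypothetical robot command
```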


The project, funded by the NSF’s National Robotic Initiative, uses a technique based on Steady State Visually Evoked Potentials (SSVEP), the brain’s natural electrical responses to visual stimulation at specific frequencies. When the retina is excited by a visual stimulus flickering in the 3.5 Hz to 75 Hz range, the brain generates electrical activity at the same frequency as the visual stimulus.
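
As a rough illustration of how a response like that could be picked out of an EEG trace, the sketch below checks whether spectral power near an assumed 15 Hz stimulus frequency stands out from the rest of the SSVEP-responsive band. The sampling rate, stimulus frequency, and threshold ratio are all illustrative assumptions, not parameters from the UI project.

```python
import numpy as np
from scipy.signal import welch

FS = 250        # EEG sampling rate in Hz (assumed)
STIM_HZ = 15.0  # flicker frequency of the visual stimulus (assumed)

def ssvep_present(eeg: np.ndarray, tolerance: float = 0.5, ratio: float = 3.0) -> bool:
    """Return True if power near STIM_HZ clearly stands out from the baseline."""
    freqs, psd = welch(eeg, fs=FS, nperseg=FS * 2)        # 2-second windows
    band = (freqs >= STIM_HZ - tolerance) & (freqs <= STIM_HZ + tolerance)
    baseline = (freqs >= 3.5) & (freqs <= 75.0) & ~band   # rest of the responsive range
    return psd[band].mean() > ratio * psd[baseline].mean()

# Quick check: two seconds of noise with a faint 15 Hz component mixed in
t = np.arange(0, 2, 1 / FS)
print(ssvep_present(0.5 * np.sin(2 * np.pi * STIM_HZ * t) + np.random.randn(t.size)))
```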

 


It essentially creates activity in the brain at a frequency that matches the frequency of the stimulus the person is looking at.

“The signals from the brain are very similar for everybody and we know which part of the brain gives certain signals,” Kesavadas explained, “implementing that in the real world is tougher in that through BMI, you have to pick up the signal precisely.”
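
For what it’s worth, one common way the SSVEP literature picks the signal up precisely is canonical correlation analysis (CCA): correlate the multi-channel EEG against reference sinusoids at each candidate flicker frequency and choose the best match. The sketch below shows the idea with an assumed sampling rate and candidate frequencies; it isn’t necessarily the pipeline Kesavadas’ team used.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

FS = 250  # EEG sampling rate in Hz (assumed)

def cca_score(eeg: np.ndarray, freq: float) -> float:
    """Canonical correlation between EEG (samples x channels) and sine/cosine
    references at `freq` and its first harmonic."""
    t = np.arange(eeg.shape[0]) / FS
    refs = np.column_stack([f(2 * np.pi * k * freq * t)
                            for k in (1, 2) for f in (np.sin, np.cos)])
    u, v = CCA(n_components=1).fit_transform(eeg, refs)
    return abs(np.corrcoef(u[:, 0], v[:, 0])[0, 1])

def classify(eeg: np.ndarray, candidates=(10.0, 12.0, 15.0)) -> float:
    """Return the candidate flicker frequency the EEG correlates with best."""
    return max(candidates, key=lambda f: cca_score(eeg, f))
```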

Kesavadas says that in high volume manufacturing, robots such as those used in Industry 4.0 applications can be programmed to detect defects on their own, but that programming is often time consuming and expensive, and it’s arguably one of the main things holding manufacturers back from going fully autonomous in their factories. That said, we are already seeing alternative approaches to teaching old robots new tricks, such as the use of Hive Minds where, using a combination of AI and the cloud, robots are able to teach each other new things. Kesavadas’ latest breakthrough could speed that process up, complement it, or both, so either way it’s a very interesting new development.

That is unless you hate robots that learn all by themselves – in which case you’re out of luck.

“Currently programming robots takes a significant amount of time and expertise and technicians who are fully trained to use them,” he said, “in high volume manufacturing, the time for programming the robot is well spent. However, if you go into an unstructured environment, not just in manufacturing but even in agriculture or medicine, where the environment keeps changing, you don’t get nearly the return on your investment. Our goal is to take the knowledge and expertise of the operator and communicate that to a robot in certain situations. If we can prove that process is effective, it can save significant time and money, and one day we could use human operators to train the robots directly without the need for code.”

 


Kesavadas has long been at the forefront of bringing BMI technology to medicine, and he also directs the UI Health Care Engineering Systems Center on campus, so while this new breakthrough has immediate benefits in the manufacturing world he believes it can have an even greater impact in the medical field. For example, a paraplegic could tell a robot to bring them a certain object simply by generating the right signal.

Kesavadas notes that while the technology exists, it requires a surgeon to place the sensor inside the brain.

“As we devise an external system to become much more consistent and reliable, it will benefit many people,” he said, “surgically placing the sensors is a more expensive, invasive, and risky process.”

For now, Kesavadas is trying to ignite excitement in manufacturing to realise the technology’s potential and he presented his findings to the NSF in late January.

“Until now, there has been no research in using brain machine interfacing for manufacturing,” he said, “our goal at the onset was to prove these technologies can actually work and that the robots can be used in a more friendly way in manufacturing. We have done that. The next stage is to coordinate with industries that would need this kind of technology and do a demonstration in a real life environment. We want industry to know the potential of this technology, ignite the thinking process and how they can use the role of brain machine interface as a whole to bring a more competitive edge to the industry.”

 


While the breakthrough is interesting, personally I question whether you can call it telepathy, since after all the link is only one way. That said, it does appear to differ from traditional BMI demonstrations because the robot in question is picking up brain waves within a range of frequencies and using them to perform a freehand task, and one day robots may be able to communicate back using that same interface. And then we’ll see the beginning of a whole new era of human-robot communication.
