
Invisible AI helps car makers see what machines and humans do wrong

WHY THIS MATTERS IN BRIEF

Increasingly we are using machines to watch human and machine workers and autonomously monitor their performance, report on it, and make interventions.

 

Love the Exponential Future? Join our XPotential Community, future proof yourself with courses from XPotential University, read about exponential tech and trends, connect, watch a keynote, or browse my blog.

Increasingly we humans are being monitored and assessed by Artificial Intelligence (AI) – so much so that recently both Amazon and Uber were taken to the European Court of Justice (ECJ) after workers discovered that the companies’ AIs were assessing their performance on the job and firing them automatically if they didn’t hit certain criteria. And while the use of AI in the workplace can be dystopian, in Amazon’s case it’s also helping the company identify workers who are lifting boxes and other items incorrectly and then intervene to reduce the risk of injury. So, as ever, tech is a two-sided coin.

 

Over the years Tesla CEO Elon Musk has often referred to his automobile Gigafactory as “The machine that builds the machine,” but there are plenty of human workers involved in even the most highly automated plants.

They remain a key part of the exceedingly complex process that is automobile assembly but need to operate as efficiently as their mechanical counterparts to keep cars and trucks coming off the line with a combination of quality and speed.

Weeding out issues and making sure everything is running smoothly has traditionally meant sending quality control personnel up and down the lines to get eyes on the action. But now there’s a way to automate that job with better results than ever before.

 

Palo Alto-based Invisible AI was founded by veterans of the autonomous car industry who saw an alternative use for the artificial intelligence-driven machine vision technology they were working on, one that could come to market long before the mass acceptance of self-driving cars.

The company designed a network of cameras that can monitor an assembly line in real time and spot even the smallest things going wrong.

“Productivity, safety and quality are always top of mind in manufacturing, especially auto,” said Invisible AI CEO Eric Danzinger.

The self-contained units are equipped with stereoscopic vision and onboard processing that allows them to be easily set up in a factory without having to tap into the facility’s own networks.

 

“Our AI is not just about watching one workstation but about getting that view across the line about where you’re hitting production bottlenecks, where you’re seeing deviations from how the work is supposed to be done and where you’re seeing issues like bad reaches that can cause physical issues for your workers,” Danzinger said.

The cameras don’t need to be programmed with the assembly process. They only have to scan a single, correct cycle, and then the system can determine if anything deviates from it later.

“Our AI system analyzes the video, from raw pixels, to understand the pattern of work that’s happening and then compares those patterns so we can tell if someone is following a standard,” Danzinger explained. “All of that is being done by an intelligent agent in the cameras so a person doesn’t have to.

“If you have 100 cameras on one section of an assembly, you are actually seeing in 3D the living, breathing line.”
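To make that idea concrete, below is a minimal sketch of how an observed work cycle could be checked against a single recorded reference cycle. It assumes pose keypoints have already been extracted from each video frame by an off-the-shelf pose estimator, and it uses dynamic time warping so that cycles performed at different speeds still align; the helper functions, the 17-joint skeleton and the deviation threshold are illustrative assumptions, not Invisible AI’s actual implementation.

```python
# Minimal sketch: flag a work cycle that drifts from a single reference cycle.
# Each cycle is a list of pose frames, one (num_joints, 2) array of keypoint
# coordinates per video frame. The DTW comparison and threshold are illustrative.

import numpy as np


def frame_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Mean Euclidean distance between matching joints of two pose frames."""
    return float(np.linalg.norm(a - b, axis=1).mean())


def dtw_cost(reference: list[np.ndarray], observed: list[np.ndarray]) -> float:
    """Dynamic time warping cost, so cycles of different speeds still align."""
    n, m = len(reference), len(observed)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = frame_distance(reference[i - 1], observed[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m] / (n + m)  # normalise by an approximate path length


def cycle_deviates(reference, observed, threshold: float = 15.0) -> bool:
    """True if the observed cycle strays too far from the reference (pixels)."""
    return dtw_cost(reference, observed) > threshold


if __name__ == "__main__":
    # Stand-in data: a 60-frame reference cycle with a COCO-style 17-joint
    # skeleton, and an observed cycle that is the reference plus a little noise.
    rng = np.random.default_rng(0)
    reference_cycle = [rng.normal(size=(17, 2)) * 5 + 100 for _ in range(60)]
    observed_cycle = [frame + rng.normal(size=(17, 2)) for frame in reference_cycle]
    print("deviates:", cycle_deviates(reference_cycle, observed_cycle))
```

Normalising the warping cost by the path length keeps the score comparable across cycles of different lengths, which is one way a single scanned reference cycle could serve as the only “programming” the system needs.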

 

Pricing varies by application, but Danzinger said the cost is far less than bringing in a consulting team or trying to do the same work manually, which in practice can’t match the scope of what the system is capable of.

Since they’re self-contained, installing all the cameras can be done in a couple of days between shifts.

“Our system has become the place you can go to help frontline employees understand the work being done,” Danzinger said.

“There are a million things happening. People are sick, bad parts are coming from suppliers, machines are broken down. … To be able to know what’s going on, what’s the most crucial component to fix, how do I meet my numbers? That’s the most important thing.”

 

Invisible AI has collected a roster of a dozen automotive parts suppliers and four original equipment manufacturers as clients, including Toyota, which uses the system at a factory in Indiana.

Toyota declined to provide comment, but when the project was announced last year its Senior Engineer Jihad Abdul-Rahim said that “Invisible AI is not only helping us find opportunities for improvement on the assembly lines, but we’re also constantly finding new use cases for their technology, such as ergonomics analysis to proactively prevent injuries.”

Danzinger said details about its other customers and how they are using the system are confidential, and that Invisible AI can’t provide them on the customers’ behalf.

 

As far as privacy is concerned, the system doesn’t have facial recognition technology, and it can blur faces captured on video. But the point of it is to offer direct feedback, so it is not an entirely anonymized analytical tool.
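By way of illustration, blurring detected faces without identifying anyone is achievable with off-the-shelf tools; the sketch below uses OpenCV’s bundled Haar cascade face detector, and the file name, blur strength and detector choice are placeholder assumptions rather than a description of Invisible AI’s pipeline.

```python
# Illustrative face blurring on recorded video, using OpenCV's bundled
# Haar cascade face detector. Detection only, no recognition of identities.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)


def blur_faces(frame):
    """Detect faces in a BGR frame and blur each detected region in place."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(
            frame[y:y + h, x:x + w], (51, 51), 0
        )
    return frame


# Example usage: anonymise a recorded clip frame by frame ("line_cam.mp4" is a placeholder).
cap = cv2.VideoCapture("line_cam.mp4")
while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("anonymised", blur_faces(frame))
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```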

“Most of what we see is helping workers have a voice and raise their hand to say, ‘This is broken. We need help fixing it,’ and actually getting a response,” Danzinger said.
