Today, February 25th, 2018, I brought up Google and typed in “Robots.” This was one of the top three stories listed:
It seems stories like these are very popular nowadays. But given the current state of factory automation in the United States, the day when robots treat us like “guinea pigs” during an “AI Apocalypse” seems very far off. The truth is that most US factories trail those of many other countries in their degree of automation. According to the International Federation of Robotics (IFR), the US ranks 7th worldwide.
Why? While the largest US factories - assembling cars, trucks, heavy machinery, etc. - are approaching full automation, smaller factories struggle to automate many manual tasks due to limitations in their primary equipment; equipment they can’t afford to replace any time soon. If you are a manufacturer of precision aerospace parts, the 5-axis CNC lathe you are using may have cost close to $1M. And if that machine is more than a few years old, finding a method to electronically integrate it into an automated work cell can be difficult, expensive, or nearly impossible. It is just easier to pay a person to “tend” that machine than to try to replace those tending tasks with a robot. That situation may be changing, however, thanks to advancements in computer vision that reduce the need to hard-wire into older machines.
Using a robot to tend or operate machines is gaining momentum. If you have a new CNC lathe or injection molding machine with a modern interface, chances are you can buy a Collaborative Robot (Cobot) and hire a System Integrator like Olympus Controls to supply the additional hardware and/or software that lets that machine talk to the Cobot, then program the entire system. This allows the Cobot to perform tasks based on information coming from the machine. Ideally the Cobot can replace one or more operators and run the machine(s) in their place. Current statistics show about 30% of Cobots are being sold for this purpose.
But what if you have one of those older CNC machines? You may be able to track down older hardware to update the interface. You may find the programming manuals so that you can decipher the outputs of the CNC machine into the correct error codes or indicators and correctly program the robot. And then you still have to get all of this custom programmed using whatever information you can find, which can sometimes take weeks. Or never get finished at all. And then, when you make any changes to your operation, you have to program the entire system all over again. Nightmare.
At Tend.ai, we have a different approach that solves this problem. Using our own TendVision™ software and a simple camera, we allow the Cobot to read the CNC machine’s screen(s) just like an operator would. Then, our Machine Tending Application (MTA) tells the Cobot what to do next. Our MTA is an easy-to-use application that lets you map areas of the CNC display to “blocks” that can be monitored and acted upon. When the CNC machine’s cycle completes and the screen display updates, the MTA correlates that change with a program on the Cobot so it executes the next task. And, just like an operator, if the CNC indicates an error that requires the part to be discarded and a new cycle to begin, the MTA tells the Cobot to discard the part and load a new blank to begin a new cycle. The Cobot can even push buttons on the CNC as needed to start additional tasks.
In other words, TendVision™ gives the Cobot eyes on the CNC machine, and our MTA gives it the knowledge of which steps to execute based on what it reads on the CNC screen. And it works with any CNC, injection molding, or other machine that has a visual output. This allows almost ANY machine tending task to be automated with a cost-effective solution.
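To make the idea concrete, here is a minimal sketch of the screen-block pattern described above. Everything in it is hypothetical - the class and function names (`ScreenBlock`, `next_task`) and the status strings are illustrative, not the actual Tend.ai or TendVision™ API. It simply models mapping a region of the CNC display to the Cobot task that should run when that region shows a given status:

```python
# Illustrative sketch only -- not the real TendVision/MTA API.
# Models the pattern: monitor a region ("block") of the CNC screen,
# and dispatch the Cobot's next task based on the text read there.

from dataclasses import dataclass

@dataclass
class ScreenBlock:
    """A rectangular region of the CNC display to monitor (pixel coords)."""
    name: str
    x: int
    y: int
    w: int
    h: int

# Map the text recognized in a block to the Cobot program to run next.
# (Status strings and task names are made up for illustration.)
ACTIONS = {
    "CYCLE COMPLETE": "unload part, load new blank, press CYCLE START",
    "ERROR":          "discard part, load new blank, press RESET",
    "RUNNING":        "wait",
}

def next_task(block: ScreenBlock, ocr_text: str) -> str:
    """Choose the Cobot's next task from text read out of one screen block."""
    task = ACTIONS.get(ocr_text.strip().upper())
    if task is None:
        # Unknown screen state: fail safe and get a human involved.
        return "alert operator"
    return task

# Example: a status block in the corner of the display reads "cycle complete",
# so the Cobot swaps the finished part for a new blank and restarts the cycle.
status = ScreenBlock("machine status", x=40, y=20, w=200, h=30)
print(next_task(status, "cycle complete"))
```

In a real deployment the `ocr_text` would come from the camera reading the mapped block, but the dispatch logic - screen state in, robot task out - is the core of what the operator-replacement loop does.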
Take a look at this video example, showing a single robot tending to ten 3D printers with Tend.ai:
So, while a lot of robot stories are scary, we at Tend.ai are trying to help more manufacturers do more good stuff with Tend.ai. I like this top robot story Google found today much more than the first: