Industry 4.0—Learning from Mistakes
Why voice control of injection machines is not quite ready for prime time.
The trend toward “smart factories,” the Internet of Things (IoT) or Industry 4.0 is often summarized as comprising three main components: smart machines, smart processes and smart services. In the third category, equipment suppliers have been developing internet portals, remote servicing and, more recently, augmented reality (AR). The latest proposed addition to the “smart services” toolbox is voice control of injection molding machines.
Voice control is an experimental addition to the augmented-reality (AR) platform being developed by Wittmann Battenfeld. As shown here, AR features available through special goggles can include display of barrel temperatures and animation of robot movements before they occur. (Photo: Wittmann Battenfeld)
The idea of asking a “Siri” or “Alexa” to provide hands-off manual control of your press during setup may seem far-fetched, but at least one major machine supplier is working on it. During a virtual press conference for the recent Fakuma 2021 show in Germany, Wittmann Battenfeld arranged remote viewing of a live demonstration of its HoloVoice developmental project, an addition to the firm’s emerging AR platform. Although the demo did not go quite as planned, it conveyed an important lesson about implementation of advanced technologies.
The demo started just after an injection press in a Wittmann processing lab had injected a part. A technician wearing a wireless headset spoke a command to “Open clamp,” and the machine complied immediately. (Direct wireless communication from the HoloVoice headset to the machine’s B8 controller—not via the internet—eliminated any delay.) The operator next said, “Robot to takeout position,” and the robot moved accordingly. Then the operator said, “Injector forward.”
I immediately thought, “Oops. Did he just say what I thought he said?” Another PT editor who was observing the demo corroborated that he had heard the same thing.
After what was, in effect, a purge shot, the operator corrected himself and said, “Ejector forward,” which enabled the robot to grip the part. At that time, the demo was halted for some hands-on cleanup, and the hosts of the press conference conceded that voice control obviously needed some further development.
Why did the operator say, “Injector forward” instead of “Ejector forward”? Was he nervous about conducting a live demo over the internet to an international press audience? Was it related to speaking English instead of his native German? (HoloVoice is currently programmed to understand both languages.) The answer is not really important.
What is important is the hint of the work that needs to be done to make such a technology more fail-safe. Even if the operator had not misspoken, it’s easy to imagine that a voice controller could mistake “injector” for “ejector,” or vice versa, especially in a noisy plant environment. The noise factor is one reason why other machine builders express caution about voice-control technology. Engel said in its own Fakuma press conference that it is not currently pursuing voice control. Arburg stated that it has experimented with it in the past but “has currently decided not to make this technology available for practical use yet.” The emphasis on the word “currently” suggests there may indeed be a future for voice control, once the kinks are ironed out.
In the Wittmann Battenfeld demo, voice control “worked,” in a sense: The machine did exactly what it was told. Maybe that was the problem. It seems to me that one necessary safety precaution would be to imbue voice control with some sort of intelligence that would prevent it from executing voice commands that could be illogical or out of sequence. I use personal-computer software every day that asks me, “Are you sure you want to do that?” and thereby saves me from embarrassing email mistakes or other errors. That kind of embedded skepticism in an injection machine controller could prevent far worse.
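For readers who think in code, here is a minimal sketch of what such embedded skepticism might look like: a hypothetical command guard that executes only those spoken commands that fit the machine’s current state and asks the operator to confirm anything out of sequence. The states, command names and functions are illustrative assumptions on my part, not Wittmann Battenfeld’s HoloVoice implementation.

```python
# Hypothetical sketch only: a voice-command guard that checks whether a
# recognized command makes sense in the machine's current state before acting.

VALID_NEXT = {
    # current state -> commands that make sense from that state (illustrative)
    "part_molded":   {"open clamp"},
    "clamp_open":    {"robot to takeout position"},
    "robot_at_part": {"ejector forward"},
}

STATE_AFTER = {
    # command -> state the machine ends up in after executing it (illustrative)
    "open clamp": "clamp_open",
    "robot to takeout position": "robot_at_part",
    "ejector forward": "part_removed",
}

def handle_command(state: str, command: str, confirm) -> str:
    """Execute a spoken command only if it fits the current state;
    otherwise require an explicit confirmation from the operator."""
    expected = VALID_NEXT.get(state, set())
    if command not in expected:
        # e.g. "injector forward" while the robot is waiting to grip a part
        prompt = f"'{command}' is unexpected here (expected: {', '.join(expected) or 'none'}). Proceed?"
        if not confirm(prompt):
            print(f"Rejected: {command}")
            return state  # command refused, machine state unchanged
    print(f"Executing: {command}")
    return STATE_AFTER.get(command, state)

# Example: the misheard "injector forward" triggers a confirmation prompt
# instead of an immediate purge shot.
ask_operator = lambda msg: input(msg + " [y/N] ").strip().lower() == "y"
state = handle_command("robot_at_part", "injector forward", confirm=ask_operator)
```

A real controller would, of course, need many more states and a proper safety review; the point is only that a few lines of sequence checking could catch the kind of slip the demo exposed.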
If or when these safety issues can be overcome, I can see how voice control could help beleaguered technicians speed setups in this era of perpetual staff shortages. Besides putting machines and robots through their paces, Wittmann Battenfeld and Arburg envision voice control as a convenient means of manipulating screen displays—“Machine, show injection curve”—without needing a hand free to tap any keys.