Many of us are fascinated by ChatGPT, even if we already use AI in other contexts. AI seems to be on everyone's mind, and many manufacturers are feeling drawn to AI for their facilities.
In a new article in Industry Week, Rick Bohan and Ron Jacques suggest that AI creates problems if your company culture isn’t ready for it.
“Smart manufacturing depends critically on information governance: rules concerning the collection, flow and analysis of performance information, most often in digital form,” they point out. “If your company isn’t already good at these things—if it doesn’t already possess a culture of curiosity, effective data gathering and use of data in decision making and problem solving—it won’t suddenly get good at these things upon installing AI.”
The authors cite specific examples in which manufacturers had the data but failed to use it for problem-solving. Instead, companies used the data to place blame, reacted to problems with fatalism, or simply gathered the data and took no action at all.
The Sydney effect
With ChatGPT and Bing’s new Sydney chatbot, AI is having a moment. Even people who have never thought of their Netflix recommendations or predictive maintenance tools as AI are getting excited about AI.
Some journalists have managed to get Bing's Sydney to claim to have feelings, to lose its temper, and to declare love. We probably should not be amazed that humans can manipulate machines, but these episodes have reinforced the public perception of artificial intelligence as magic.
Smart machines and smart factories can be very smart. But installing a solution is not the same as solving a problem.
When you have a problem with your Rexroth electric industrial motion control systems, we can help. Call for immediate assistance, or fill out the simple form below and we will get in touch with you.