AI and Manufacturing: Nothing to Be Afraid of

An AI expert at PMMI's Annual Meeting breaks down AI and how it can benefit packaging and processing.

David McGraw

David McGraw is a seasoned professional in the field of Artificial Intelligence. As a Senior Director at Alvarez & Marsal in Miami, he specializes in applying Generative AI to enhance performance and create value across various business sectors, including manufacturing. At PMMI’s 2023 Annual Meeting, he presented “AI in Manufacturing Ops – How is your Company Using AI?” before sitting down with Sean Riley for an upcoming podcast episode of unPACKed with PMMI. The following is an excerpt of that podcast edited for content and clarity.

Sean Riley:

During your presentation at PMMI’s Annual Meeting, you outlined three advantages of generative AI versus traditional AI. What do you feel is the most significant advantage?

David McGraw:

Yeah. So right now, the biggest advantage of generative AI is that tools like ChatGPT, Bard, and Claude are available, and anyone can use them. It totally democratizes the use of AI. There are zero barriers to entry and zero cost to play with it and see what it can do. You can have generative AI right now at no cost to you. Companies like OpenAI and Google have done all the heavy lifting for you. These models exist today, and you can play with them right now. Nothing prevents you from doing it other than logging on and trying it.

Riley:

How does that run counter to traditional AI?

McGraw:

With traditional AI, many of the use cases I talked about on the plant floor are complex. You must figure out, “How will I start collecting data?”

Predictive maintenance is the use case I suggest as an entry point to AI for manufacturers. The first thing you must do is collect data. The second thing is that the dataset must contain failures, because you're trying to figure out the pattern. If you're looking at assets that don't fail often, you might collect data for months or even years before you have enough failures in your dataset.

So now you're already six months to two years in before you're doing anything of actual value or interest, and the costs are accruing the whole time. Then what do you do? You have to start writing the algorithms with your dataset. Then you start testing and say, "Okay, I can get this level of accuracy in my predictions." What if that accuracy is low? Then you must go back to the drawing board.

And it's very iterative. And because there's latency between getting the data and doing something with it, it can get costly. The time-consuming part is the hard part, because we've all been preconditioned to want everything right now.
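For readers who want to see the shape of the loop McGraw describes, here is a minimal sketch in Python using scikit-learn and synthetic sensor data: collect history that includes failures, train a model, check its accuracy, and iterate if it falls short. All column names, thresholds, and the accuracy cutoff are illustrative assumptions, not part of McGraw's presentation.

```python
# Minimal predictive-maintenance sketch: collect labeled data, train,
# evaluate, and loop back if the accuracy isn't good enough.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Stand-in for months of collected sensor history; the "failure" column
# is the label the dataset must contain before any modeling can start.
rng = np.random.default_rng(0)
n = 5000
data = pd.DataFrame({
    "vibration": rng.normal(1.0, 0.3, n),
    "temperature": rng.normal(70.0, 5.0, n),
    "runtime_hours": rng.uniform(0, 10000, n),
})
# Failures are rare, which is exactly why collecting enough of them takes time.
data["failure"] = (
    (data["vibration"] > 1.6) & (data["runtime_hours"] > 7000)
).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    data.drop(columns="failure"), data["failure"], test_size=0.2, random_state=0
)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

accuracy = accuracy_score(y_test, model.predict(X_test))
print(f"Prediction accuracy: {accuracy:.2%}")

# If the accuracy is too low, the loop starts over: collect more data,
# engineer new features, or try a different algorithm.
if accuracy < 0.95:
    print("Back to the drawing board: more data or better features needed.")
```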

Riley: