AI Adoption In Industrial Manufacturing
To the general public, artificial intelligence (AI) is an aspect of science that lives on the periphery of our understanding. For most of us, what little we do know of it was learned through the lens of a Hollywood director, so it might surprise you to learn that the idea behind AI is not a new one – in fact, the ancient Greeks and Egyptians were among the earliest societies to speculate about robots and automatons. Of course, these largely remained myths until the mid-twentieth century, when science was finally able to breathe “life” into AI. Today, artificial intelligence is a crucial part of industrial manufacturing and modern smart factories.
This article will examine the history of artificial intelligence as it evolved from concept to reality, explore how it entered the world of industrial manufacturing, and provide some real-world examples of how it is currently being used in smart factories.
History of AI
Though the idea of artificial intelligence was spoken about in the times of Socrates, Plato, Cleopatra, and Mark Antony, the science and theorems needed to pull the concept into the realm of the possible were not developed until the 1950s. More specifically, the term “artificial intelligence” wasn’t coined until the Dartmouth Summer Research Project on Artificial Intelligence, a weeks-long event hosted by Dartmouth College in Hanover, New Hampshire in 1956. It is a widely held belief that this extended brainstorming session, which was attended by a handful of mathematicians and scientists, was the founding event for artificial intelligence as a legitimate scientific field. The research project aimed to examine the following conjecture (among others): “that every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it”.
Of course, it was one thing to define the field of artificial intelligence, and quite another to achieve it. In the fledgling years of the field, some progress was made; computers were getting faster and machine learning algorithms were getting better. But several reports were published criticizing the slow pace of development, which promptly resulted in waning public interest and funding. Looking back, the criticism may have been warranted given the resources that research into artificial intelligence was consuming – in the 1950s, leasing a computer could cost projects upwards of $200,000 a month, a not insignificant amount of money even by today’s standards. And while the researchers had a very clear idea of what they wanted to teach machines, the greatest limitation to achieving any meaningful progress came in the form of computational power – computers simply weren’t able to store the vast amounts of information or process it quickly enough.
Fortunately, this was not the death knell for the development of artificial intelligence. Computers continued to get faster by leaps and bounds year over year, and in a few short decades they were able to store and sift through mind-boggling amounts of data. By the late 1990s, computers were besting humans in interesting ways. IBM’s Deep Blue defeated the reigning world chess champion, Garry Kasparov. Pitting man against machine in a chess match was no coincidence – the win solidified the viability of an artificially intelligent computer’s ability to make decisions.
AI in Industry
Fast forward to today: artificial intelligence has made its way from the fringes of science to the future of industrial manufacturing, simplifying and optimizing factory operations. Unlike the original musings around artificial intelligence, which sought to create computer systems that perform complex tasks mirroring those performed by humans, industrial artificial intelligence aims to apply technology that can identify and address manufacturing pain points in order to improve productivity, reduce costs, optimize facilities, and more. The following list digs deeper into the industrial applications of artificial intelligence.
● Predictive Maintenance. Rather than waiting for a component to wear down enough to be caught by the human eye, or until the quality of outputs begins to drop unexpectedly, AI uses machine learning algorithms to predict when a machine may fail. Having this advance knowledge allows qualified employees, armed with top-notch automation technician training, to make repairs before significant downtime occurs. Addressing issues early saves a company both time and money, and drastically reduces the additional production issues associated with an unidentified equipment deficiency.
● Quality Improvement. In addition to predicting when a piece of equipment may fail, AI is also being utilized to monitor the quality of the goods being produced. Major issues may be easy enough for a human operator to identify, but their powers of observation are much more limited than those of an AI-driven computer system. Of course, identifying production issues early saves money, but doing so also helps mitigate the likelihood of returns of products that have already been delivered to customers, and helps maintain a positive relationship with said customers.
● Generative Design. One very interesting application of AI is generative design, which allows manufacturers to input several criteria – maximum cost, production constraints, materials, and so on – into generative design software in order to generate alternatives to the current or proposed product design. The computer analyzes all of the generated variations and identifies those that are optimal for production.
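To make the predictive maintenance idea above a little more concrete, here is a deliberately simplified sketch. Real systems train machine learning models on historical sensor and failure data; this toy version (all names and numbers are hypothetical) just watches for a vibration reading whose rolling average drifts away from a known healthy baseline – the kind of early-warning signal a maintenance team would act on before a breakdown.

```python
def rolling_mean(values, window):
    """Average of the most recent `window` values."""
    recent = values[-window:]
    return sum(recent) / len(recent)

def first_alert(readings, baseline, window=5, tolerance=0.2):
    """Return the index of the reading at which the rolling mean drifts
    more than `tolerance` (as a fraction) from the healthy baseline,
    or None if the machine never looks unhealthy."""
    for i in range(window, len(readings) + 1):
        mean = rolling_mean(readings[:i], window)
        if abs(mean - baseline) / baseline > tolerance:
            return i - 1  # the reading that triggered the alert
    return None

# Hypothetical vibration data: healthy readings hover around 1.0,
# while a wearing bearing makes them creep steadily upward.
healthy = [1.0, 1.02, 0.98, 1.01, 0.99, 1.0, 1.03]
wearing = healthy + [1.1, 1.2, 1.3, 1.4, 1.5, 1.6]

print(first_alert(healthy, baseline=1.0))  # no drift detected
print(first_alert(wearing, baseline=1.0))  # alert well before total failure
```

The point of the sketch is the workflow, not the math: continuous sensor data is reduced to a health indicator, and maintenance is scheduled when that indicator crosses a threshold, rather than after a visible breakdown.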
Real World Applications
If it still sounds a little far-fetched, know that some of the biggest companies around the globe are already utilizing AI in their manufacturing operations and they are continuing to invest heavily (read: millions of dollars annually) in its development. Here are some of the most prominent examples of real-world companies making use of it.
In 2016, Siemens, the German multinational conglomerate that also has the distinction of being Europe’s largest industrial manufacturing firm, launched MindSphere, a smart cloud service that allows manufacturers to monitor equipment around the globe. That same year, Siemens integrated IBM’s Watson Analytics into the solution so that manufacturers could harness every data point – no matter how small – in their manufacturing processes in order to find issues and, just as quickly, solutions to them.
Siemens also uses neural network-based AI applications to reduce greenhouse gas (GHG) emissions and improve the performance of their gas turbines. The turbines are constantly monitored using 500 sensors tracking a host of different parameters. Machine learning is then used to make decisions on optimal fuel adjustments in order to maximize performance and reduce waste.
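This is not Siemens’ actual method (their system uses neural networks over live sensor streams), but the underlying decision can be illustrated with a toy sketch: given a set of candidate fuel settings and hypothetical models of power output and emissions, pick the setting that meets the power target with the lowest emissions.

```python
def best_fuel_setting(settings, power_of, emissions_of, power_target):
    """Grid-search the candidate fuel settings, keep the ones that meet
    the power target, and return the one with the lowest modeled
    emissions (or None if no setting is feasible)."""
    feasible = [s for s in settings if power_of(s) >= power_target]
    return min(feasible, key=emissions_of) if feasible else None

# Hypothetical models: power rises linearly with fuel,
# emissions rise faster than linearly.
power = lambda fuel: 80 * fuel       # MW produced per unit of fuel
emissions = lambda fuel: fuel ** 2   # arbitrary emissions index

settings = [0.5, 0.75, 1.0, 1.25, 1.5]
print(best_fuel_setting(settings, power, emissions, power_target=60))
```

A production system replaces the two toy lambdas with learned models of the turbine’s behavior and re-optimizes continuously as sensor readings change – but the shape of the problem, constrained optimization over operating parameters, is the same.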
Closer to home, General Electric launched the Brilliant Manufacturing Suite – an application whose primary goal is to track and process information gathered from various points throughout the manufacturing process in order to identify and resolve potential problems and failures. The first factory to deploy the application, located in India, saw its effectiveness rate increase by almost 20%. Not only does this solution monitor entire production facilities, the data its sophisticated sensors collect is processed into recommendations that operators can act on. If you’re wondering how much data is gathered and processed by GE’s application, you wouldn’t be wrong if you guessed in the one-million-terabytes-a-day ballpark.
Finally, there’s Fanuc – a Japan-based company that’s an industry leader in industrial robotics. One of its current goals is to develop technology that allows a number of robots to learn together. Eventually, the hope is that the robots will be able to share the skills they’ve learned with one another, which in turn will save time and money in the manufacturing process.
The Future is Bright
Artificial intelligence has clearly come a very long way since its initial conception. In a relatively short amount of time, AI-powered computers have gone from beating chess grandmasters and trivia experts to optimizing smart factory processes and streamlining industrial manufacturing. If you’re interested in pursuing an electronics technician certificate, be sure to check out some of the courses that George Brown College has to offer.