Ever since IBM’s Deep Blue chess computer defeated the reigning champion Garry Kasparov in 1997, humans have feared being outsmarted, overpowered, and put out of work by intelligent machines. These breakthroughs in machine learning mingled with the doomsday infatuation of the closing millennium. Even though the media still profusely tap into this dystopian storyline, a lot has changed in the last 20 years. As industrial use cases of AI algorithms and machine learning increase, we now have a much more specific picture of future human-machine collaboration. Intelligent machinery will change the way we operate machines, produce goods, and solve tasks.
In my last blog I talked about the innovation potential of industrial edge computing and its integration with cloud-based IoT platforms like MindSphere. Now I would like to answer the question of why these industrial edge-cloud solutions offer the optimal setup for bringing in AI and machine learning, and why there’s really nothing to be afraid of.
Machine intelligence: human tasks AI can now perform
More and more tasks that require higher cognitive capabilities and the ability to apply complex knowledge to a particular, predetermined situation or use case can now be performed by AI. Hopefully, machine learning will one day enable AI algorithms to adapt to dynamic changes in shop floor environments and automation processes.
While earlier applications of AI were rule-based input-output systems, machine learning gained momentum in the 1990s, as new sets of algorithms and increased computing power enabled artificial intelligence to learn effectively from experience.
The mass availability of labeled data through social networks and refined deep learning algorithms has been the driving force behind a true breakthrough in AI since late 2013. Deep learning involves neural networks performing multiple layers of non-linear transformations on huge data sets. The results are stunning. The accuracy of AI systems analyzing images has skyrocketed, and artificial intelligence now surpasses humans in complex tasks like image recognition. With deep learning gaining momentum, speech recognition and speech generation, which have been based on machine learning for quite some time, have also undergone significant improvements. We are surrounded by consumer applications of AI that have become part of our everyday life - just think of… Alexa… "What time is it, Alexa?"
In industrial settings we encounter a different picture, chiefly because the nature and availability of data differ from those of social networks, and the use cases are more diverse.
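The core idea, stacked non-linear transformations, can be sketched in a few lines of plain NumPy. The layer sizes and random weights below are purely illustrative, not a trained model:

```python
import numpy as np

def relu(x):
    # Non-linear activation: the source of a network's expressive power
    return np.maximum(0.0, x)

def forward(x, layers):
    # Each layer applies a linear map followed by a non-linearity
    for w, b in layers:
        x = relu(x @ w + b)
    return x

# Toy example: a 3-layer network mapping a 4-dimensional "image feature"
# vector to 2 class scores (weights are random, purely illustrative)
rng = np.random.default_rng(0)
layers = [(rng.standard_normal((4, 8)), np.zeros(8)),
          (rng.standard_normal((8, 8)), np.zeros(8)),
          (rng.standard_normal((8, 2)), np.zeros(2))]
scores = forward(rng.standard_normal(4), layers)
print(scores.shape)  # (2,)
```

Real image models stack dozens of such layers and learn the weights from millions of labeled examples, which is exactly why they only became practical once large data sets and computing power were available.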
Data-driven industrial use cases for AI and machine learning
In industrial applications, data usually belongs to the machine owners. It comes from sensors, cameras, and other monitoring devices on the shop floor. This data is highly sensitive and restricted, much unlike the images on public social media profiles. The chief differences between industrial and consumer use cases lie in
- the data sets and their accessibility
- the nature of tasks to solve, especially their diversity
Nevertheless, these problems have been overcome, and AI is on its way to the shop floor. But why do we need AI on the shop floor in the first place? Current automation systems are perfect for highly repetitive tasks. They can accurately cut a workpiece out of a piece of metal or assemble a system from components. But in most cases, despite the different sensors they utilize, they are still “blind” in comparison to a human being. They can hardly interpret data from complex sensors like cameras, react to unexpected situations, or make a complex judgment based on long experience.
This is where AI comes into play. Current AI systems can provide “eyes” to automation systems, improving their perception once integrated into such a system, e.g. via edge computing. Future AI systems will bring the next level of machine intelligence, involving more dynamic interaction with changing environments, for example allowing for a higher degree of customization within a fully automated production process.
At Siemens we have been working with machine learning for years, bringing different use cases of AI to industrial applications. In Motion Control, we apply machine learning to machine tools and additive manufacturing, especially in fields like
- quality control
- process optimization
- predictive maintenance
- and self-adjusting machines that can handle changes to, for example, the condition of their components.
Edge-cloud solutions and AI-implementation
Edge-cloud solutions like Siemens Industrial Edge offer enhanced shop floor functionality with full data control. Edge devices allow for local data processing close to the machinery where this data originates while maintaining cloud connectivity for centralized software updates and data analysis across multiple locations.
In my field, motion control and automation, there are several reasons why AI-solutions need to be implemented on a local level, e.g. into edge devices, rather than into cloud-operated systems:
- sensors produce enormous amounts of data, most of it of short-lived relevance, which often cannot be stored due to data security regulations
- the need for low latency between information reaching the AI system and the decisions made on the basis of this information
- the implementation of AI systems into local contexts for specialized tasks and environmental situations
AI solutions in this context form a natural extension of motion control and automation systems. While current motion control systems can quickly and accurately control repetitive tasks, the AI system can handle more complex scenarios with low predictability.
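As an illustration only, not Siemens code, here is a minimal sketch of how an edge device might handle such short-lived sensor data locally: readings stay in a small rolling buffer, decisions are made immediately on the device, and nothing needs to leave it. The window size and threshold are invented for the example:

```python
from collections import deque

# Hypothetical edge-side check: sensor readings are processed locally
# and discarded (short-lived relevance, low latency, no cloud upload).
WINDOW = 50          # readings kept in the rolling buffer
THRESHOLD = 3.0      # deviations from the mean that count as anomalous

buffer = deque(maxlen=WINDOW)  # old readings drop out automatically

def check_reading(value):
    """Return True if the reading looks anomalous versus recent history."""
    anomalous = False
    if len(buffer) >= 10:  # wait for a minimum of local history
        mean = sum(buffer) / len(buffer)
        var = sum((v - mean) ** 2 for v in buffer) / len(buffer)
        std = var ** 0.5 or 1e-9
        anomalous = abs(value - mean) / std > THRESHOLD
    buffer.append(value)
    return anomalous

# Steady readings pass; a sudden spike is flagged on the device itself
normal = [check_reading(1.0 + 0.01 * (i % 3)) for i in range(40)]
spike = check_reading(50.0)
print(any(normal), spike)  # False True
```

A production system would of course use a trained model rather than a simple statistical threshold, but the pattern is the same: the decision loop runs entirely next to the machine.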
Another major advantage of AI implementation in local edge devices is independence from cloud connectivity. So when your internet connection breaks down, the machine can still continue performing its tasks, even those that require additional intelligence and are only realized with AI support. In a system with AI running in the cloud, your machine would instantly lose the ability to react to complex and unpredictable situations.
So why not dispense with the cloud connection altogether? Firstly, you will still want the centralized software updates and broader data analysis. On top of that, some AI implementation processes in industrial applications do need the computing power of the cloud to learn.
The cloud as a learning campus for new AI applications
Think of hiring a new employee. You will not let her or him start the new job unsupervised on day one. Most companies have an onboarding process or learning campus that introduces the new employee to the company culture and work procedures and provides some general training for the upcoming job.
Take, for example, an image recognition algorithm. It will require a (deep) learning phase with a sufficiently large data set before actually ‘going to work’. This would overstrain the computing power of edge devices and is better delegated to a cloud-based data management system. Once pattern recognition has reached an adequate level of accuracy, the actual execution phase of the implemented AI solution kicks in. This is best performed within the local edge device, with greater proximity to the machinery on the shop floor and more flexibility to react to unpredictable situational changes.
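This split between a compute-heavy learning phase and a lightweight execution phase can be sketched as follows. The toy task, model, and numbers are invented for illustration; the point is only that training happens on the “cloud” side and the edge receives just a small frozen model:

```python
import numpy as np

# --- "Cloud" phase: compute-heavy learning on a large labeled set ----
# Hypothetical toy task: classify points by which side of a line they lie on
rng = np.random.default_rng(42)
X = rng.standard_normal((1000, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

w = np.zeros(2)
b = 0.0
for _ in range(200):                          # simple gradient descent
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))    # logistic prediction
    w -= 0.1 * (X.T @ (p - y)) / len(y)
    b -= 0.1 * float(np.mean(p - y))

# --- Hand-over: only the small frozen model crosses to the edge ------
model = {"w": w.tolist(), "b": b}

# --- "Edge" phase: cheap, local inference close to the machine -------
def edge_predict(sample, model):
    z = sum(wi * xi for wi, xi in zip(model["w"], sample)) + model["b"]
    return 1 if z > 0 else 0

print(edge_predict([1.0, 1.0], model))    # 1
print(edge_predict([-1.0, -1.0], model))  # 0
```

Note the asymmetry: the learning loop touches the whole data set on every pass, while the deployed model is a handful of numbers and a few arithmetic operations, well within the reach of an edge device.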
While the machines on the production level become increasingly ‘intelligent’ and capable of performing dynamic, non-repetitive tasks, they still need humans to define the tasks and come up with the algorithms that enable machines to learn through ‘experience’ with a large set of relevant data.
Limits of machine intelligence and human creativity
In my opinion, most of the fears regarding AI are unfounded. While the breakthroughs in image recognition, speech recognition, and reinforcement learning might crown new AI champions in yet another beloved traditional game humans have played for ages, and drive the everyday emergence of new AI applications, we are still very far from an artificial intelligence like Skynet from the Terminator movies. We will not have to form a human resistance army to defend ourselves against fully self-aware, power-hungry machines seeking our mass extinction.
Looking at the actual industrial use cases of machine intelligence, like some of the apps developed on Siemens Industrial Edge, one gets a far more levelheaded view of the future of machine intelligence and human-machine collaboration. We are in the midst of a new industrial revolution, often termed Industry 4.0. And we are encountering some of the same fears that plagued humans in previous industrial revolutions, like the transition from manual labor to mechanization and factory work in the 19th century.
AI developments will certainly change the nature of the work humans carry out. We will be freed from many menial, repetitive tasks and gain time for creativity, strategic planning, and innovation. Shop floors will become more efficient and flexible as AI-enabled industrial edge-cloud solutions decrease downtime caused by connectivity issues or broken equipment and allow for a higher degree of customization within industrial production processes. We can’t stop it. And there is no need to. We just need to stay tuned and make the best of it.
As Garry Kasparov, the tragic hero of the 1997 human-vs.-AI chess match, aptly put it in his recent TED talk: Don’t fear intelligent machines, work with them.
Do you have questions about industrial edge and AI implementation? Or a different opinion on the future of human-machine collaboration? Get involved and share your ideas!