Transporters find their way through factory halls on their own, plants optimize their power consumption during live operation, and machines perform quality-control checks – and make the necessary adjustments – while manufacturing is still in progress. Artificial intelligence offers tremendous potential for industry. It’s already making production more efficient, more flexible, and more reliable.
Industry is becoming increasingly digitalized, and the digital enterprise is already a reality. Data is continuously generated, processed, and analyzed. The volumes of data produced in production environments form the basis for digital representations of entire plants and systems. These digital twins have long been used to make the planning and design of products, machinery, and production operations more flexible and more efficient, while manufacturing high-quality, customized products faster and at an affordable price. But what would happen if machines and processes could gather insights from these high volumes of data by themselves and optimize their processes during live operation? The potential would be enormous. The good news is that this can already be achieved, step by step, using artificial intelligence (AI).
AI has been a focus of research for more than 30 years. During this time, major technological advances have been made: more powerful hardware and software, greater computing power, and faster data transmission. Using artificial intelligence creates entirely new opportunities for flexible, efficient production, even for complex and increasingly customized products in small batch runs. The consequences will be significant, as a study by Roland Berger shows: by 2035, intelligent, digitally networked systems and process chains could account for additional growth of roughly €420 billion in western Europe alone. According to a PwC study, AI could also contribute up to US$15.7 trillion to the global economy by 2030.
The first real applications of artificial intelligence are already finding a place in everyday industrial activities: speech recognition for basic tasks, documenting surroundings using cameras, laser beams, or X-rays, and virtual personal assistants in logistics. According to the PwC study, a total of 62 percent of large companies were already using AI technology in 2018. Siemens offers solutions in the area of service, such as predictive maintenance, as well as applications for engineering and quality testing. Cloud solutions like MindSphere and intelligent applications also support the ongoing process optimization that improves machine efficiency and availability.
Big data and AI give Industry 4.0 a huge boost. Intelligent software can use the high volumes of data generated by a factory to identify trends and patterns, which can then be used to make manufacturing processes more efficient and reduce their energy consumption. In this way, plants constantly adapt to new circumstances and optimize themselves without operator input. And as the level of networking increases, AI software can learn to “read between the lines,” uncovering complex connections in systems that are not yet, or are no longer, evident to the human eye. Sufficiently intelligent analytical software is already available. Whether data processing is performed in the cloud or locally (for example, using edge computing) depends on the user’s requirements: data on an edge platform is available more quickly and at a higher resolution, whereas considerably more computing power is available in the cloud. In many cases, combining edge and cloud computing is required to get the best of both worlds.
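As a minimal illustration of this kind of pattern detection, the sketch below flags unusual readings in a sensor stream using a rolling z-score. The sensor values, window size, and threshold are invented for illustration; they are not part of any Siemens product or API.

```python
from statistics import mean, stdev

def detect_anomalies(readings, window=5, threshold=3.0):
    """Flag readings that deviate strongly from the recent rolling average."""
    anomalies = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            anomalies.append(i)  # index of the suspicious reading
    return anomalies

# Simulated power-consumption readings (kW) with one spike at index 6
readings = [50.1, 50.3, 49.9, 50.2, 50.0, 50.1, 75.0, 50.2, 50.1]
print(detect_anomalies(readings))  # → [6]
```

In a real plant the same idea would run continuously, on an edge device for low latency or in the cloud for heavier models.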
MindSphere, the cloud-based, open IoT operating system from Siemens, can be used to link products, plants, systems, and machines. It is one of the most important foundations enabling the use of AI in industry. MindSphere performs extensive analyses to make the vast amounts of data generated by the Internet of Things (IoT) useful for optimization, simulation, and decision-making.
The digital twin enables virtual testing of a variety of scenarios and supports smart decisions in areas such as production optimization. In the future, AI will be able to use a digital representation of a machine tool and the associated manufacturing process to recognize whether the workpiece currently being manufactured meets quality requirements, and to determine which production parameters need to be adapted so that this remains the case during ongoing production. As a result, production becomes even more reliable and efficient, and companies even more competitive.
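The adjust-while-producing loop described above can be sketched as a simple proportional controller. The target diameter, gain, and process response below are illustrative assumptions, not a description of any actual digital-twin implementation.

```python
def adjust_parameter(measured, target, param, gain=0.5):
    """Nudge a production parameter toward the value that yields the target."""
    error = target - measured
    return param + gain * error

# Hypothetical process model: workpiece diameter drifts with the feed rate
target_diameter = 20.00   # mm, the quality requirement
feed_rate = 1.00          # arbitrary starting parameter
for _ in range(10):
    measured = 20.00 + 0.8 * (feed_rate - 1.2)  # illustrative process response
    feed_rate = adjust_parameter(measured, target_diameter, feed_rate)
print(round(feed_rate, 3))  # → 1.199, converging on the value that hits spec
```

The loop converges on the feed rate at which the modeled diameter matches the quality requirement, which is the essence of adapting parameters during live operation.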
A precondition for both Industry 4.0 and artificial intelligence is a state-of-the-art, end-to-end IT infrastructure, regardless of the size of the company. That’s the only way a business can become part of the digital future. But this must always be accompanied by an awareness that digitalization and cybersecurity need to go hand in hand. The risks are huge without the right safeguards in place. According to the World Economic Forum’s “Global Risks Report 2018,” business losses through cybercrime over the next five years will amount to $8 trillion, far exceeding Germany’s gross domestic product. Comprehensive protection for industrial facilities, as exemplified by the defense-in-depth concept from Siemens, will therefore play a key role in the future. After all, hackers are growing smarter all the time, and it is vital that companies stay ahead of them.
In its truest sense, artificial intelligence refers to applications in which machines perform tasks that would normally require functions of human intelligence such as learning, judging, and problem-solving. Tools and technical solutions are being developed for this purpose, enabling humans to work better by extending their abilities.
Machine learning (ML) is what underlies the actual “intelligence” in AI. Computers are trained to recognize patterns in unstructured datasets using algorithms, and to make decisions by themselves based on this “knowledge.” The goal is to have the machine learn from the data and, based on this, use the experience it acquires to constantly improve its ability to perform its tasks.
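As a toy example of this kind of pattern learning, the sketch below trains a nearest-centroid classifier on labeled sensor readings and then classifies a new reading by itself. The data, labels, and feature choices are invented purely for illustration.

```python
from math import dist
from statistics import mean

def train(samples):
    """Learn one centroid (mean feature vector) per class label."""
    centroids = {}
    for label in {lab for _, lab in samples}:
        points = [feat for feat, lab in samples if lab == label]
        centroids[label] = tuple(mean(axis) for axis in zip(*points))
    return centroids

def predict(centroids, features):
    """Assign the label whose centroid is closest to the new reading."""
    return min(centroids, key=lambda lab: dist(centroids[lab], features))

# (temperature °C, vibration mm/s) readings with known outcomes
samples = [((60, 1.0), "ok"), ((62, 1.2), "ok"),
           ((85, 4.0), "fault"), ((88, 4.5), "fault")]
model = train(samples)
print(predict(model, (84, 3.8)))  # → fault
```

The “experience” here is simply the centroids learned from past data; feeding in more labeled readings refines them, which is the core loop of machine learning at any scale.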
Deep learning (DL) relies on deep neural networks: the computer processes data across several layers of nodes in order to identify connections, draw conclusions, and make both predictions and decisions. Self-learning algorithms enable the machine to solve even complex, non-linear problems by itself, and to interact without explicit instructions.
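A neural network of this kind, scaled down to a single hidden layer, can be sketched in plain Python. The toy network below (two inputs, four hidden nodes, one output) learns the XOR function via backpropagation, a classic non-linear problem that no single linear layer can solve. The architecture, learning rate, and epoch count are illustrative choices, not part of any production system.

```python
import random
from math import exp

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + exp(-x))

# XOR training data: a non-linear problem
inputs  = [(0, 0), (0, 1), (1, 0), (1, 1)]
targets = [0, 1, 1, 0]

# Tiny 2-4-1 network: random hidden/output weights, zero biases
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(4)]
b1 = [0.0] * 4
w2 = [random.uniform(-1, 1) for _ in range(4)]
b2 = 0.0

def forward(x):
    h = [sigmoid(sum(w * xi for w, xi in zip(ws, x)) + b)
         for ws, b in zip(w1, b1)]
    y = sigmoid(sum(w * hi for w, hi in zip(w2, h)) + b2)
    return h, y

def total_error():
    return sum(abs(forward(x)[1] - t) for x, t in zip(inputs, targets))

initial_error = total_error()

lr = 0.5
for _ in range(10000):  # stochastic gradient descent with backpropagation
    for x, t in zip(inputs, targets):
        h, y = forward(x)
        dy = (y - t) * y * (1 - y)               # output-layer delta
        for j in range(4):
            dh = dy * w2[j] * h[j] * (1 - h[j])  # hidden-layer delta
            w2[j] -= lr * dy * h[j]
            for i in range(2):
                w1[j][i] -= lr * dh * x[i]
            b1[j] -= lr * dh
        b2 -= lr * dy

final_error = total_error()
print([round(forward(x)[1], 2) for x in inputs])
```

Real deep learning stacks many such layers and uses optimized frameworks, but the principle is the same: the error signal flows backward through the node layers and gradually adjusts the weights.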