Gaming system test 

New momentum for developers of autonomous systems

Reduce cycle times:

  • In the past, you had to go through the entire development cycle and test the hardware in reality to check whether a complete system worked. With digital twins and system simulations, this is now possible in a fraction of the time.

Work asynchronously:

  • Controllers used to be programmed on the real machine. Virtual commissioning now lets you develop the controller on the simulation model, in many cases before the machine hardware even exists. The same applies to AI: it can be trained on the digital twin and, where available, on real measured data.

First design phase (Virtual Prototyping)

  • Many variants are considered and simulated in this stage in order to identify the best system designs, in some cases automatically (Evaluation Framework); a sketch of such a variant sweep follows this list.
  • Once one or more variants have been selected, the customer can examine the prototype or its variants (e.g. using Virtual Reality), optionally in the later operating environment (e.g. the customer’s factory) and optionally interactively (e.g. human-machine cooperation).
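As a rough illustration of such an automated variant evaluation, the following Python sketch sweeps two hypothetical design parameters (arm length and motor power), scores each variant with a toy cost/performance model, and picks the best one. All parameter values and formulas are illustrative assumptions, not part of any actual evaluation framework.

    import itertools

    # Hypothetical design-space sweep: every variant is simulated and scored.
    # simulate_variant() stands in for a call into the system simulation.
    ARM_LENGTHS_M = [0.4, 0.5, 0.6]     # candidate arm lengths (m)
    MOTOR_POWERS_W = [200, 400, 800]    # candidate motor ratings (W)

    def simulate_variant(arm_length, motor_power):
        """Placeholder simulation run: returns cycle time (s) and unit cost."""
        cycle_time = arm_length / (motor_power / 1000.0)   # toy model
        unit_cost = 150 * arm_length + 0.5 * motor_power   # toy model
        return cycle_time, unit_cost

    def score(cycle_time, unit_cost, w_time=1.0, w_cost=0.01):
        # Lower is better: weighted sum of cycle time and cost.
        return w_time * cycle_time + w_cost * unit_cost

    variants = itertools.product(ARM_LENGTHS_M, MOTOR_POWERS_W)
    best = min(variants, key=lambda v: score(*simulate_variant(*v)))
    print("best variant (arm length m, motor power W):", best)

In a real project the placeholder would be replaced by calls into the system simulation, and the scoring weights would reflect the actual design goals.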

Virtual Commissioning

  • Develop and test the control on the simulation model.
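As a minimal sketch of the idea, the following Python example develops and tests a simple PI controller against a simulated first-order plant rather than real hardware. The plant model, gains, and time step are illustrative assumptions; in practice the simulation model would be the digital twin of the machine.

    # The controller under test runs against a simulated plant, so it can
    # be developed and debugged before the machine hardware exists.
    def simulate_plant(state, u, dt=0.01, tau=0.5):
        """First-order lag: a stand-in for the virtual machine."""
        return state + dt * (u - state) / tau

    kp, ki = 2.0, 0.5                 # illustrative PI gains
    setpoint, state, integral = 1.0, 0.0, 0.0

    for step in range(500):           # 5 s of virtual time at dt = 0.01 s
        error = setpoint - state
        integral += error * 0.01
        u = kp * error + ki * integral    # control law under test
        state = simulate_plant(state, u)  # simulation replaces the machine

    print(f"state after 5 s of virtual commissioning: {state:.3f}")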

Train and test Artificial Intelligence (AI) on the simulation model 

  • In the virtual environment it’s also possible to test scenarios that occur only rarely in the real world (e.g. particular light conditions, communication errors, sensor failures), as well as situations that are too expensive or too dangerous to test in reality (e.g. operating errors, collisions).
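The following Python sketch illustrates the principle: rare conditions such as extreme light levels or sensor dropouts are sampled deliberately and run many times in simulation, and the success rate per scenario is measured. The scenario fields and the toy success model inside run_episode() are assumptions made for this sketch.

    import random

    # Deliberately sample conditions that are rare or dangerous in reality.
    SCENARIOS = [
        {"light_lux": 5,     "sensor_dropout": 0.0},   # near darkness
        {"light_lux": 80000, "sensor_dropout": 0.0},   # direct sunlight
        {"light_lux": 500,   "sensor_dropout": 0.3},   # flaky sensor link
    ]

    def run_episode(scenario, seed):
        """Placeholder for one simulated episode; returns True on success."""
        rng = random.Random(seed)
        # Toy model: harsh light and dropouts lower the success probability.
        p_success = 0.95 - 0.2 * scenario["sensor_dropout"]
        if scenario["light_lux"] < 50 or scenario["light_lux"] > 50000:
            p_success -= 0.1
        return rng.random() < p_success

    for scenario in SCENARIOS:
        results = [run_episode(scenario, seed) for seed in range(1000)]
        print(scenario, "success rate:", sum(results) / len(results))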

Operator Training

  • Work safely with a virtual system in a virtual environment in order to learn how the system behaves and how to operate it correctly.
  • E.g. robotic arm: People can first use the simulation model to learn how to work with it safely and efficiently.
  • E.g. SIMIT-Unity coupling: Machine operators can use the simulation model to safely practice retracting a machine (directing it away from obstacles) and returning it to normal operation.

Minimize risks:

  • Simulations enable us to test controls, hardware designs, and even complete autonomous systems without risk.
  • Parallel and faster-than-real-time simulations let you simulate many hours of operation while only a fraction of that time passes in reality (a minimal sketch follows this list).
  • That means the system has already accumulated many years’ worth of virtual operating hours by the time it first goes into real operation. That’s especially important for autonomous systems, given their complexity.
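A minimal sketch of the principle, assuming a trivial stand-in for the actual system model: several simulations run in parallel, each stepping far faster than the wall clock, so virtual operating hours accumulate quickly.

    import time
    from multiprocessing import Pool

    DT = 0.01          # simulated seconds per step (illustrative)
    STEPS = 360_000    # one virtual hour per run at dt = 0.01 s

    def run_one_virtual_hour(run_id):
        state = 0.0
        for _ in range(STEPS):
            state = 0.999 * state + 0.001   # stand-in for the system model
        return STEPS * DT / 3600.0          # virtual hours simulated

    if __name__ == "__main__":
        start = time.time()
        with Pool(8) as pool:               # eight simulations in parallel
            hours = sum(pool.map(run_one_virtual_hour, range(64)))
        wall = time.time() - start
        print(f"{hours:.0f} virtual operating hours "
              f"in {wall:.1f} wall-clock seconds")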

Real-life operation

  • The simulation models can also be used during real-life operation. Two examples:
  • Navigation: You can use a CAD model of the environment to define the areas where an autonomous system is allowed to move and the locations of important waypoints. During operation, this model can then serve as a navigation map (a toy example follows this list).
  • Remote operation: (Human) operators who are not on site and are remotely operating or monitoring a robotic system must understand the spatial position of the system and which actions it is currently performing there. 3-D models of the robotic system and its environment can be used to generate a clear three-dimensional representation of the situation for the operators, with no need for additional camera systems.
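As a toy example of the navigation case, the sketch below encodes allowed zones and waypoints that would, in practice, be exported from the CAD model of the environment; the geometry here is invented for illustration.

    # Navigation map derived from a CAD floor plan: allowed zones are
    # rectangles, waypoints are named coordinates (all values invented).
    ALLOWED_ZONES = [             # (x_min, y_min, x_max, y_max) in metres
        (0.0, 0.0, 10.0, 3.0),    # main aisle
        (4.0, 3.0, 6.0, 8.0),     # corridor to the assembly cell
    ]

    WAYPOINTS = {
        "dock":     (1.0, 1.5),
        "assembly": (5.0, 7.5),
    }

    def is_allowed(x, y):
        """True if (x, y) lies inside a zone the robot may enter."""
        return any(x0 <= x <= x1 and y0 <= y <= y1
                   for x0, y0, x1, y1 in ALLOWED_ZONES)

    for name, (x, y) in WAYPOINTS.items():
        print(f"waypoint '{name}' inside an allowed zone: {is_allowed(x, y)}")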

Robotic arm design:

  • Set arm lengths and motor power values to ensure maximum speed of movement, load-bearing capacity, and service life, while minimizing costs and power consumption.
  • Design joint control to enable the robotic arm to move from A to B without impeding or clashing with itself or its surroundings (see the inverse-kinematics sketch after this list).
  • Develop an AI system that recognizes surroundings, independently identifies and prioritizes tasks, and doesn’t impede or clash with humans in the working area.
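As a small illustration of the joint-control task, the sketch below computes closed-form inverse kinematics for a toy two-link planar arm, i.e. the joint angles that place the tool tip at a target point. The link lengths stand in for exactly the kind of design parameter the simulations above would be used to choose; collision checking is omitted here.

    import math

    L1, L2 = 0.5, 0.4   # illustrative link lengths in metres

    def inverse_kinematics(x, y):
        """Joint angles (rad) placing the tip at (x, y); one of the two
        elbow configurations of a planar two-link arm."""
        c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2)
        if abs(c2) > 1:
            raise ValueError("target out of reach for these link lengths")
        theta2 = math.acos(c2)
        theta1 = math.atan2(y, x) - math.atan2(L2 * math.sin(theta2),
                                               L1 + L2 * math.cos(theta2))
        return theta1, theta2

    print(inverse_kinematics(0.6, 0.3))   # reachable target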
(Figure: finding errors early is much less expensive than finding them late.)

Identify faults at an early stage:

  • Simulations help identify faults and improvement potential at an early stage and at minimal cost. Examples include determining appropriate robotic arm lengths and verifying that no collisions are pre-programmed in the trajectory planning (a check of this kind is sketched after this list).
  • This saves costs, because the later errors are found, the more expensive they are to fix.
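Such a check might simply sample the planned trajectory densely and test every sample point against known obstacle regions, as in the sketch below; the straight-line trajectory and the obstacle geometry are assumptions for illustration.

    # Pre-commissioning check: no sample of the planned tool-tip path may
    # enter an obstacle region (all geometry invented for the sketch).
    OBSTACLES = [                 # axis-aligned boxes: (x0, y0, x1, y1)
        (0.30, 0.10, 0.45, 0.25),
    ]

    def planned_trajectory(steps=100):
        """Straight-line move from (0.1, 0.1) to (0.6, 0.4), densely sampled."""
        for i in range(steps + 1):
            t = i / steps
            yield 0.1 + t * 0.5, 0.1 + t * 0.3

    def collides(x, y):
        return any(x0 <= x <= x1 and y0 <= y <= y1
                   for x0, y0, x1, y1 in OBSTACLES)

    bad = [(x, y) for x, y in planned_trajectory() if collides(x, y)]
    print("collision-free" if not bad
          else f"{len(bad)} colliding samples, first at {bad[0]}")

Here the check fails, flagging the collision in simulation long before the trajectory could damage real hardware.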


Systems are becoming increasingly autonomous. How can you test them at an early stage, even if they haven’t been built yet? One way is to test a digital twin in a virtual environment. The real experts in virtual environments work in the games industry. That’s reason enough for some unexpected collaboration.

The world seen through VR glasses is often surprisingly realistic. “But how the environment looks isn’t the most important thing,” Martin Bischoff from the Siemens R&D division points out. “It’s much more important that the virtual environments behave the same as real environments. That physical laws like gravity or centrifugal force also apply there, for example. Special development environments used mainly to develop computer games can be utilized to quickly create virtual environments that are so realistic they don’t just impress people with VR glasses. They’re also ideally suited as test environments, for example for autonomous systems such as mobile robots or gripper arms in production facilities.”

Access to virtual environments

Autonomous systems consist of various components for which different specialists are responsible: the mechanical engineer for the technical drawings of mechanical components, the application developer for conventional controllers, and the AI expert for autonomous functions. They typically use different development environments for these tasks, and these environments are generally not compatible. The challenge is to ensure compatibility, because the components of a digital twin must be integrated into a complete (virtual) system, just like a real one. This is where the work starts for Bischoff and his team. They developed ROS#, an open-source software library that enables robotic systems with a ROS interface to be integrated directly into a virtual 3-D environment. “In the past few years we have had many positive experiences with the cross-platform game engine Unity, and we have observed that this development environment is particularly well suited to system simulations because it can be flexibly adapted and extended,” Bischoff notes. “With ROS#, we have been shaping the simulation and communication interfaces between Unity and ROS since 2017.”
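ROS# talks to ROS over the rosbridge websocket protocol. As a rough illustration of the ROS side of such a connection, the Python sketch below uses roslibpy, an independent rosbridge client library, to publish joint states that a 3-D scene, for example one built with Unity and ROS#, could subscribe to in order to animate a robot model. The topic name follows ROS convention; the joint name and motion profile are assumptions.

    import math
    import time
    import roslibpy   # rosbridge websocket client (pip install roslibpy)

    # Connect to a rosbridge server, which relays messages between ROS
    # nodes and websocket clients such as this script or a Unity scene.
    client = roslibpy.Ros(host='localhost', port=9090)
    client.run()

    joint_states = roslibpy.Topic(client, '/joint_states',
                                  'sensor_msgs/JointState')

    # Publish a slow sine sweep for one (hypothetical) joint at 20 Hz.
    for i in range(100):
        angle = 0.5 * math.sin(i / 10.0)
        joint_states.publish(roslibpy.Message({
            'name': ['joint_1'],
            'position': [angle],
        }))
        time.sleep(0.05)

    client.terminate()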

Strong collaboration with Unity

Anyone around the world can download the software free of charge from the open development platform github.com, and they can also submit suggested improvements, extensions, or use cases, either as discussion posts or as fully developed program code. “Our initiative has led to a large and steadily growing community, and we have learned a lot from the suggestions and improvements we have received. Our software has been thoroughly tested, and we’ve discovered potential applications we originally hadn’t thought of. In November 2020, Unity itself established a Robotics Hub that’s largely based on our developments. We now have a good collaborative arrangement with Unity’s robotics team: we regularly share ideas and also benefit from the open-source developments on Unity’s Robotics Hub,” comments Bischoff.

"In parallel, we’re also working on Unity interfaces for our in-house formats, especially the SIMIT-Unity coupling, which makes it possible to combine 3-D-simulations in Unity with the SIMIT software.  The new interface with Unity makes Siemens software more open, enables us to turn new applications into reality in short timeframes, and makes our products even more interesting for many customers."

Aenne Barnard, February 2021