From masses of data to simple rules
Symbolic regression with artificial intelligence
An innovative Artificial Intelligence process helps improve our understanding of complex relationships and describe them using simple equations. These equations help machine and system operators become more efficient.
Drivers know that planning ahead can mean big fuel savings. Just a few simple rules – such as accelerating slowly, slowing down before taking a corner, making use of downhill momentum and engine braking – can help, meaning far fewer trips to the gas station.
“The same applies to almost all machines and systems, including those with automated control systems: they run much more efficiently when you plan ahead and operate them correctly,” explains Dirk Hartmann of Technology, the central research unit at Siemens. “Investing in this area pays off. In most cases you gain more by optimizing how you use a system than by improving the system itself, since the options for optimizing hardware components have often already been exhausted. Of course it’s possible to develop ideal control systems, for example using Model Predictive Control (MPC). But the algorithms are so CPU-intensive that it’s almost impossible to run them on standard PLCs, or programmable logic controllers. So if we still want to improve these controllers, we need less complex procedures – a rule-based controller, for example.”
Finding good rules – at the limits of IT complexity
“Controlling with rules” means setting clearly defined rules for how a controller should respond to a specific condition. In principle, controllers should operate in the same way as the driver planning ahead in the example we mentioned at the beginning.
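A rule-based controller of this kind can be sketched as an ordered condition–action table that is checked against the current state. The state fields, thresholds, and actions below are invented for illustration and are not from the Siemens procedure:

```python
# Minimal sketch of a rule-based controller: an ordered list of
# (condition, action) pairs; the first matching rule wins.
# All names and thresholds are illustrative assumptions.

RULES = [
    (lambda s: s["gap_m"] < 20,     "brake"),       # too close: slow down
    (lambda s: s["speed_kmh"] < 80, "accelerate"),  # below target speed
    (lambda s: True,                "hold"),        # default: keep speed
]

def control(state):
    """Return the action of the first rule whose condition matches."""
    for condition, action in RULES:
        if condition(state):
            return action

print(control({"gap_m": 15, "speed_kmh": 90}))  # brake
print(control({"gap_m": 50, "speed_kmh": 60}))  # accelerate
```

Because each rule is a cheap comparison, a table like this runs comfortably on a basic PLC, which is exactly the appeal over CPU-intensive MPC.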
“Machines and systems controlled by good rules are in fact very efficient,” says Hartmann. “But we first have to find these good rules for any given machine or system. Previously, this had to be done manually, for example by having experts analyze MPC controllers – a very complicated task, of course. We’ve now advanced a step further and developed a procedure for identifying these rules automatically. In slightly more technical terms: from our objectives, such as saving energy, and the operational data a system provides – the relevant parameters like temperature, time, speed, and power consumption – we derive functions that determine which actions must be performed in response to a specific system condition.”
Mathematicians call this method of deriving functions from data “symbolic regression”; programmers call it “a huge challenge.” In general, symbolic regression over arbitrary data is what’s known as an “NP-hard problem” – a problem so computationally complex that even the most powerful systems can’t find solutions in an acceptable timeframe. In practice, NP-hard problems can only be tackled when the original problem can be simplified in a suitable way, reducing its computational complexity.
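A toy example makes the combinatorial difficulty tangible: even a tiny expression grammar forces an exhaustive search, and every additional operator or constant multiplies the number of candidates. The grammar a·xᵖ + b and the hidden law below are purely illustrative, not the Siemens procedure:

```python
import itertools

# Toy symbolic regression: exhaustively search the tiny expression
# space a * x**p + b for the formula that best fits the data.
data = [(x, 3 * x * x + 1) for x in range(-5, 6)]  # hidden law: y = 3x² + 1

def error(a, p, b):
    """Sum of squared residuals of the candidate expression on the data."""
    return sum((a * x**p + b - y) ** 2 for x, y in data)

# 5 choices for a, 3 for p, 5 for b: already 75 candidates to test.
candidates = itertools.product(range(0, 5), (1, 2, 3), range(-2, 3))
best = min(candidates, key=lambda c: error(*c))
print(best)  # (3, 2, 1), i.e. y = 3x² + 1
```

Real symbolic regression searches over arbitrary expression trees, which is where the NP-hardness bites; this is why simplifying assumptions about the form of the solution matter so much.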
“We were able to find that kind of simplification for our purposes,” Hartmann explains. “The critical push came from the latest research results of Max Tegmark.”
The physicist Max Tegmark noticed that the mathematical relationships between mutually dependent physical parameters are typically very simple: in his paper, he describes symmetries and low-order polynomials as typical features. His symbolic regression algorithm, “AI Feynman,” exploits this characteristic by deliberately searching for these typical features. The approach proved very successful: applied to data alone, it was able to derive all the formulas presented in the popular standard physics text “The Feynman Lectures on Physics” by Richard Feynman.
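Tegmark’s observation can be illustrated in miniature: if a law is suspected to be a low-order polynomial, a handful of samples suffices to recover it. Here we recover g from the free-fall law s = ½gt² using finite differences; the choice of law and of method is ours for illustration, not AI Feynman’s algorithm:

```python
# Sample the free-fall law s = ½ g t² at unit time steps (g = 9.8 m/s²).
g = 9.8
samples = [(t, 0.5 * g * t * t) for t in (0.0, 1.0, 2.0, 3.0)]

# For a degree-2 polynomial sampled at unit steps, the second finite
# differences are constant and equal 2 × (leading coefficient).
values = [s for _, s in samples]
first = [b - a for a, b in zip(values, values[1:])]
second = [b - a for a, b in zip(first, first[1:])]

leading = second[0] / 2        # recovered coefficient ½·g
print(round(2 * leading, 1))   # 9.8: g recovered from four data points
```

Low-order structure is what makes this cheap: a generic curve would need a vastly larger hypothesis space, which is exactly the simplification AI Feynman exploits.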
“The condition data from the machines and systems we want to control also describes physical parameters that interact with each other,” Hartmann explains. “We were therefore able to build on the processes of symbolic regression to develop the approach we are now putting to the test in pilot projects.”
One example comes from Digital Industries and, fittingly, once again involves driving: the search for rules for the optimal operation of a self-driving vehicle. “In the model, we simulated an automobile (ego) driving on a two-lane road,” says Theo Papadopoulos. “The automobile must never drive too close to other vehicles or leave the road. At the same time it must maintain a speed that’s as constant as possible – in other words, no heavy acceleration or braking. All we know at any given time is how fast the car is currently driving, how far away the other cars are, and where the edge of the road is. Our procedure worked very well in this example: rule-based control delivered control impulses almost identical to those of the MPC controller, which is optimal but CPU-intensive. We’re confident that these results can be transferred to other, comparably simple scenarios.”
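A continuous rule of the kind the quote describes might map the two observed quantities – current speed and gap to the car ahead – directly to a gentle acceleration command. All gains, setpoints, and limits below are invented; this is a sketch of the idea, not the pilot project’s controller:

```python
# Hypothetical driving rule: track a target speed, back off when the
# gap shrinks, and clip the result to keep acceleration/braking gentle.
TARGET_SPEED = 25.0   # m/s, illustrative setpoint
SAFE_GAP     = 30.0   # m, illustrative minimum distance
A_MAX        = 1.5    # m/s², gentle acceleration/braking limit

def accel_command(speed, gap):
    """Acceleration command from the two observable quantities only."""
    a = 0.3 * (TARGET_SPEED - speed)      # keep speed as constant as possible
    if gap < SAFE_GAP:
        a -= 0.2 * (SAFE_GAP - gap)       # back off when too close
    return max(-A_MAX, min(A_MAX, a))     # clip to gentle limits

print(accel_command(25.0, 50.0))  # 0.0: at target speed, gap is safe
```

Evaluating such a rule is a few multiplications per control step, which is why it can approximate MPC behavior on hardware that could never run MPC itself.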
Rules for the microgrid
In a second pilot project, the rule-based procedure was applied to microgrid control. Microgrids typically involve a need to coordinate various electricity generators and storage media, such as PV, wind, batteries, diesel generators, fuel cells, electrolyzers, and so on. “There’s always the question of which mix of generators is best at any given time,” says Ulrich Münz, from Technology in Princeton.
“Our office building in Princeton is a good example of this kind of microgrid: it’s equipped with a PV system, battery storage, and charging stations for electric vehicles, and at the same time it’s still supplied by the public grid. Our goal is to control the components so as to minimize the amount of power we draw from the grid while simultaneously reducing our peak consumption. A high proportion of our power bill in Princeton is based on peak consumption during the previous twelve months – in other words, a single high peak load has to be paid for over an entire year.”
So how must a controller perform in order to keep peak loads to a minimum? It’s a complex problem, since the optimal strategy depends on uncertain parameters – in other words, on how much electricity the PV system will feed in over the next few hours and how high the demand for electricity will be during the same period. Because some values can only be estimated, achieving perfect control with traditional methods like model-predictive control takes a lot of effort. That’s why the researchers in Princeton are currently using the new method to find rules that take into account the estimated PV feed-in, the estimated load, the uncertainties these involve, and the battery’s state of charge. Here, too, the first trials have already delivered very good results. “In the next step we aim to further improve and validate the process using data from the microgrid in Princeton. If that’s successful, we want to apply these rules to our microgrid by implementing them in the Siemens Microgrid Controller.”
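One simple peak-shaving rule of the kind being searched for: discharge the battery just enough to keep the grid draw at or below the billing-relevant peak. The function names, thresholds, and units below are assumptions for illustration, not the logic of the Siemens Microgrid Controller:

```python
# Illustrative peak-shaving rule for a microgrid with PV and a battery.
def grid_draw(load_kw, pv_kw, battery_kw):
    """Net power drawn from the public grid (battery_kw > 0 = discharging)."""
    return load_kw - pv_kw - battery_kw

def battery_setpoint(load_est, pv_est, peak_kw, soc, max_discharge=50.0):
    """Discharge just enough to keep grid draw at or below peak_kw."""
    excess = load_est - pv_est - peak_kw
    if excess <= 0 or soc <= 0.1:   # no peak risk, or battery nearly empty
        return 0.0
    return min(excess, max_discharge)

p = battery_setpoint(load_est=120.0, pv_est=30.0, peak_kw=70.0, soc=0.8)
print(p)                           # 20.0 kW discharge
print(grid_draw(120.0, 30.0, p))   # 70.0 kW: peak held
```

Since the load and PV estimates are uncertain, a practical rule would also hedge the setpoint against forecast error – exactly the kind of refinement the researchers are deriving automatically.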
Aenne Barnard, October 2020