Wilfried Elmenreich

Professor of Smart Grids
Alpen-Adria-Universität Klagenfurt, Austria

An Artificial Hormone-based Algorithm for Production Scheduling

Artificial hormone systems are inspired by the natural endocrine system that adjusts the metabolism of tissue cells in our body. By connecting decisions and actions in a system to the production and evaporation of artificial hormones, it is possible to create a bio-inspired self-organizing algorithm.

Application areas for such algorithms are problems with many agents to be coordinated, where existing optimization approaches come to their limit. An example of such a problem is the production of logic and power integrated circuits (ICs) in the semiconductor industry. Unlike the high-volume production of memory ICs, wafer production in the logic and power sector has a large product mix. This involves many processing steps and dynamic changes of involved machines.

Weekly workloads can involve around 100 000 operations on thousands of machines. Optimizing such a system for work in progress and flow factor is an NP-hard problem. At this scale, existing dispatching rules and linear optimization methods cannot cope with the search space and therefore cannot optimize the entire system.

To address this issue, we have modeled a production plant as a self-organizing system of agents that interact with each other in a non-linear way. As is common in the semiconductor industry, wafers are combined in groups of 25 pieces forming a so-called lot. In our approach, an artificial hormone system is used to express a lot's urgency and a machine type's need for new lots, thus providing a system that uses local information for optimization. The algorithm builds upon five principles, which are

  • (i) machines produce hormone to attract lots, 
  • (ii) hormone diffuses process-upstream, 
  • (iii) incoming lots diffuse hormone, 
  • (iv) lots are prioritized by their timing, and 
  • (v) lots are attracted by hormone. 

Via these mechanisms, machines can balance their workload by pulling the required lots towards them. The algorithm has been implemented and evaluated in a NetLogo simulation model. Simulation results indicate that the artificial hormone system improves overall production time and flow factor by around 5% compared to a baseline algorithm; in a productive system, an improvement of 5% would be highly notable. Future work will investigate whether the hormone algorithm can be used on top of existing production systems.
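To make the five principles concrete, here is a minimal, hypothetical sketch of a hormone-driven pull mechanism on a toy production line. The machine names, emission rates, and evaporation constant are illustrative assumptions, not values from the paper or the NetLogo model:

```python
import random

EVAPORATION = 0.5   # fraction of hormone consumed when a lot arrives (assumed)
DIFFUSION = 0.5     # fraction of hormone passed one step process-upstream (assumed)

class Machine:
    def __init__(self, name):
        self.name = name
        self.hormone = 0.0
        self.queue = []

    def emit(self):
        # (i) machines produce hormone to attract lots; idle machines emit more
        self.hormone += 1.0 if not self.queue else 0.1

def tick(line, waiting_lots):
    for m in line:
        m.emit()
    # (ii) hormone diffuses process-upstream, towards earlier production stages
    for downstream, upstream in zip(line[1:], line[:-1]):
        share = downstream.hormone * DIFFUSION
        downstream.hormone -= share
        upstream.hormone += share
    # (iv) lots are prioritized by their timing (earliest due date first)
    waiting_lots.sort(key=lambda lot: lot["due"])
    for lot in waiting_lots:
        # (v) each lot is attracted by the machine with the most hormone
        target = max(line, key=lambda m: m.hormone)
        target.queue.append(lot)
        # (iii) an incoming lot diffuses/consumes hormone at its machine
        target.hormone *= EVAPORATION
    waiting_lots.clear()

line = [Machine("litho"), Machine("etch"), Machine("implant")]
lots = [{"id": i, "due": random.randint(1, 10)} for i in range(5)]
tick(line, lots)
print([(m.name, len(m.queue), round(m.hormone, 2)) for m in line])
```

Because attracted lots deplete hormone locally, consecutive lots spread across machines instead of piling up at one, which is the load-balancing effect described above.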

More information can be found on the SWILT project webpage and in the paper

Wilfried Elmenreich, Alexander Schnabl, and Melanie Schranz. An artificial hormone-based algorithm for production scheduling from the bottom-up. In Proceedings of the 13th International Conference on Agents and Artificial Intelligence. SciTePress, February 2021.

Bibtex entry:

@inproceedings{elmenreich2021hormone,
  author    = {Elmenreich, Wilfried and Schnabl, Alexander and Schranz, Melanie},
  title     = {An artificial hormone-based algorithm for production scheduling from the bottom-up},
  booktitle = {Proceedings of the 13th International Conference on Agents and Artificial Intelligence},
  year      = {2021},
  month     = feb,
  publisher = {SciTePress}
}

Investigating the impact of data quality on the energy yield forecast using data mining techniques

In this paper, we analysed the impact of using an optimal combination of input variables and a low-dimensional subspace on photovoltaic (PV) production forecasting accuracy. The work was carried out in collaboration with Prof. Mussetta from Politecnico di Milano.

The main contribution presented in the paper is divided into two parts:
  1. Finding an optimal combination of input meteorological features using a feature extraction technique
  2. Constructing a low-dimensional subspace using a dimensionality reduction technique
We assess and compare the case in which the forecasting models are fed with all features against the case in which a low-dimensional subspace of the dataset is used as model input.
The simulation results reveal that, depending on the location under study and the regression method, using fewer input variables is enough to generate nearly identical results without degrading performance. However, tests under different climatic conditions are still necessary to ensure the reliability of these results.
The figures below show the results obtained by applying Pearson correlation and principal component analysis. Figure 1 represents the strength of association between pairs of variables. Figure 2 is the biplot representation of the input features' contribution to the variance along principal components PC1 and PC2.

Fig. 1: Pearson correlation map

Fig. 2: Biplot representation
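As an illustration of the two-part analysis, the following sketch computes Pearson correlations and a PCA-based low-dimensional subspace on synthetic data. The feature names and value ranges are assumptions for demonstration only; the real dataset and evaluation code are released in the repository mentioned below:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Synthetic meteorological features (illustrative, not the paper's data)
irradiance = rng.uniform(0, 1000, n)
temperature = 15 + 0.01 * irradiance + rng.normal(0, 2, n)
humidity = rng.uniform(20, 90, n)
pv_output = 0.8 * irradiance + rng.normal(0, 30, n)

X = np.column_stack([irradiance, temperature, humidity])

# Pearson correlation of each feature with the PV yield (cf. Fig. 1)
for name, col in zip(["irradiance", "temperature", "humidity"], X.T):
    r = np.corrcoef(col, pv_output)[0, 1]
    print(f"{name}: r = {r:.2f}")

# PCA via SVD on the standardized features (cf. the biplot in Fig. 2)
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
explained = S**2 / np.sum(S**2)
print("explained variance ratio:", np.round(explained, 2))

# Project onto the first two principal components (PC1, PC2)
Z = Xs @ Vt[:2].T
print("low-dimensional subspace shape:", Z.shape)
```

Feeding `Z` instead of `X` to a regression model is the "low subspace" case compared against the all-features case in the paper.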

ISGT Europe 2020 was held virtually, and we recorded the presentation. The recording is available at this link.


To support reproducibility and validation of the results, we have released the dataset used in this work along with the code.

Github repository with used dataset and evaluation code

For more information please see the paper:

Ekanki Sharma, Marco Mussetta, and Wilfried Elmenreich. Investigating the impact of data quality on the energy yield forecast using data mining techniques. In Proceedings of the IEEE PES Innovative Smart Grid Technologies Europe (ISGT-Europe). IEEE, October 2020.

Investigating the Benefit of Time-Series Imaging for Load Disaggregation

In this paper, we investigate the benefits of time-series imaging in load disaggregation, as we augment the widespread sequence-to-sequence approach by a key element: an imaging block.

A Recurrence Plot
The approach presented in this paper converts an input sequence to an image, which in turn serves as input to a modified version of a common Denoising Autoencoder architecture used in load disaggregation. Based on these input images, the Autoencoder estimates the power consumption of a particular appliance. 

The main contribution presented in this paper is a comparison study of three common imaging techniques: 

  • Gramian Angular Fields, 
  • Markov Transition Fields, 
  • Recurrence Plots.
Further, we assess the performance of our augmented networks by comparing them with two benchmark implementations, one based on Markov models and the other being a common Denoising Autoencoder.

The outcome of our study reveals that in 19 of 24 cases, the considered augmentation techniques provide improved performance over the baseline implementation. Further, the findings presented in this paper indicate that the Gramian Angular Field could be the best-suited technique, though the Recurrence Plot was observed to be a viable alternative in some cases.
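As an illustration of one of the three imaging techniques, the following sketch computes a binary Recurrence Plot from a short power window. The threshold epsilon and the toy window are illustrative assumptions, not the paper's hyperparameters:

```python
import numpy as np

def recurrence_plot(x, eps=0.1):
    """R[i, j] = 1 if |x[i] - x[j]| <= eps, else 0."""
    x = np.asarray(x, dtype=float)
    d = np.abs(x[:, None] - x[None, :])   # pairwise distance matrix
    return (d <= eps).astype(np.uint8)

# Toy consumption window: an appliance switching on and off
window = np.array([0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0, 1.0])
R = recurrence_plot(window, eps=0.1)
print(R)
```

The resulting square matrix is what the imaging block would pass as an input image to the Denoising Autoencoder instead of the raw sequence.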

Our paper is to appear at the 7th ACM International Conference on Systems for Energy-Efficient Buildings, Cities, and Transportation (BuildSys ’20):

Hafsa Bousbiat, Christoph Klemenjak, and Wilfried Elmenreich. 2020. Exploring Time Series Imaging for Load Disaggregation. In The 7th ACM International Conference on Systems for Energy-Efficient Buildings, Cities, and Transportation (BuildSys ’20), November 18–20, 2020, Virtual Event, Japan.

 We are looking forward to discussing our paper at BuildSys!

Benford’s Law and the US 2020 Presidential Election Votes

Benford’s law states that if you take a large range of real-world data and look at the leading digit of each value, you get significantly more 1s than other digits, provided the numbers span multiple orders of magnitude.

As one application, Benford’s law is used to detect fraud in accounting. There, typically the pairs of the first two digits are analyzed and plotted according to their frequency in order to detect anomalies. An anomaly can have different explanations, though.

For example, in the US 2020 presidential elections, the proportion of the digits 1 and 2 as first digits of the vote counts for Mr. Biden is lower than expected, while for Mr. Trump it is slightly higher.

In the video below, Matt Parker analyzes the situation and shows that the more densely populated areas in the US, where a majority of Mr. Biden's votes are coming from, have precincts with mostly the same size. Thus here the condition of having data spanning multiple magnitudes is not fulfilled, hence we get a distribution of first digits that deviates from the prediction by Benford’s law.

When looking at the frequency of the last digits, there is an anomaly in the vote data for Mr. Trump. Instead of a roughly uniform distribution of last digits, the lower digits occur much more frequently. This is because a majority of votes for Mr. Trump come from smaller precincts, which favors smaller numbers.
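The first-digit distribution predicted by Benford's law, P(d) = log10(1 + 1/d), can be checked with a few lines of Python. The sample here (powers of two, which span many orders of magnitude and are known to follow the law) is an illustrative stand-in for real-world data:

```python
import math
from collections import Counter

def benford_expected(d):
    # Benford's prediction for leading digit d in 1..9
    return math.log10(1 + 1 / d)

def first_digit_freq(values):
    digits = [int(str(abs(v))[0]) for v in values if v != 0]
    counts = Counter(digits)
    n = len(digits)
    return {d: counts.get(d, 0) / n for d in range(1, 10)}

# Data spanning several orders of magnitude: powers of 2
sample = [2**k for k in range(1, 200)]
freq = first_digit_freq(sample)
for d in range(1, 10):
    print(f"{d}: observed {freq[d]:.3f}  Benford {benford_expected(d):.3f}")
```

Running the same check on vote counts from similarly sized precincts would show the deviation discussed above, precisely because such counts do not span multiple magnitudes.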


Thus, the deviation of voting counts (from precincts with a standardized size) from Benford’s Law is not an indication of voter fraud but rather a phenomenon to be expected.

Further reading:
Deckert, J., Myagkov, M., & Ordeshook, P. (2011). Benford's Law and the Detection of Election Fraud. Political Analysis, 19(3), 245-268. doi:10.1093/pan/mpr014

Evolving NILM to NIAD: Non-Intrusive Activity Detection

Almost all documented practical use cases of load disaggregation rely on the analysis of appliance operational times and their impact on the monthly electricity bill. However, load disaggregation bears promising potential for other use cases. Recognizing user activities without the need to set up a dedicated sensing infrastructure is one such application, given that many household activities involve the use of electrical appliances. State-of-the-art disaggregation algorithms only provide support for the recognition of one appliance at a time, however. 

In collaboration with Andreas Reinhardt from TU Clausthal, we thus take load disaggregation to the next level and investigate to what extent it is applicable to monitoring user activities involving multiple appliances (operating sequentially or in parallel). For the evaluation of our Non-Intrusive Activity Detection (NIAD), we synthetically generate load signature data to model nine typical user activities, followed by an assessment of the extent to which they can be detected in aggregate electrical consumption data. Our results show that state-of-the-art load disaggregation algorithms are also well-suited to identify user activities, at accuracy levels comparable to (but slightly below) the disaggregation of individual appliances.

Our paper is to appear at the 2nd ACM Workshop on Device-Free Human Sensing (DFHS'20):

Andreas Reinhardt and Christoph Klemenjak. 2020. Device-Free User Activity Detection using Non-Intrusive Load Monitoring: A Case Study. In The 2nd ACM Workshop on Device-Free Human Sensing (DFHS ’20), November 15, 2020, Virtual Event, Japan.

 We are happily looking forward to pitching the concept of NIAD to the community!

Stop! Exploring Bayesian Surprise for Load Disaggregation

In our latest paper, which is the result of the ongoing collaboration between our lab and SFU's Computational Sustainability Lab, we bring the concept of Bayesian Surprise to NILM. When has enough prior training been done? When has a NILM algorithm encountered new, unseen data? We apply the notion of Bayesian surprise to answer these important questions for both supervised and unsupervised algorithms.

"Bayesian surprise quantifies how data affects natural or artificial observers, by measuring differences between posterior and prior beliefs of the observers" - ilab.usc.edu

Bayesian Surprise is measured in "wow"
We compare the performance of several NILM algorithms to establish a suggested threshold on two combined measures of surprise: postdictive surprise and transitional surprise.

We provide preliminary insights and clear evidence showing a point of diminishing returns for model performance with respect to dataset size, which can have implications for future model development, dataset acquisition, as well as aiding in model flexibility during deployment.
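To illustrate the general notion of Bayesian surprise (not the paper's exact postdictive and transitional measures, which are defined over NILM model states), the following sketch computes surprise as the KL divergence between posterior and prior for a simple Beta-Bernoulli observer:

```python
from math import lgamma

def digamma(x):
    # Numerical digamma via central difference of lgamma (adequate here)
    h = 1e-6
    return (lgamma(x + h) - lgamma(x - h)) / (2 * h)

def kl_beta(a1, b1, a2, b2):
    """KL divergence KL(Beta(a1, b1) || Beta(a2, b2))."""
    return (lgamma(a2) + lgamma(b2) - lgamma(a2 + b2)
            - lgamma(a1) - lgamma(b1) + lgamma(a1 + b1)
            + (a1 - a2) * digamma(a1)
            + (b1 - b2) * digamma(b1)
            + (a2 + b2 - a1 - b1) * digamma(a1 + b1))

a, b = 1.0, 1.0          # uniform prior Beta(1, 1)
surprises = []
for obs in [1, 1, 1, 1, 1, 0]:
    a_new, b_new = a + obs, b + (1 - obs)     # Bayesian update
    surprises.append(kl_beta(a_new, b_new, a, b))
    a, b = a_new, b_new

# Repeated identical observations become less and less surprising,
# while the unexpected 0 at the end produces a jump in surprise.
print([round(s, 3) for s in surprises])
```

The same intuition underlies the point of diminishing returns: once additional training data stops moving the posterior, its surprise (and its value for training) approaches zero.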

The paper is to appear at the 5th International Workshop on Non-Intrusive Load Monitoring (NILM'20): 

Richard Jones, Christoph Klemenjak, Stephen Makonin, and Ivan V. Bajić. 2020. Stop! Exploring Bayesian Surprise to Better Train NILM. In The 5th International Workshop on Non-Intrusive Load Monitoring (NILM ’20), November 18, 2020, Virtual Event, Japan.

An author's copy can be obtained from Christoph's personal website

We are looking forward to discussing this novel approach for NILM.

Swarm Intelligence and Cyber-Physical Systems

Swarm Intelligence (SI) is a popular multi-agent framework that has been originally inspired by swarm behaviors observed in natural systems, such as ant and bee colonies. In a system designed after swarm intelligence, each agent acts autonomously, reacts on dynamic inputs, and, implicitly or explicitly, works collaboratively with other swarm members without a central control. The system as a whole is expected to exhibit global patterns and behaviors.

When is it advantageous to use a Swarm approach?
The scaling principle depicts a range where a swarm
outperforms a linear system of the same size

Although well-designed swarms can show advantages in adaptability, robustness, and scalability, it must be noted that SI systems have not really found their way from lab demonstrations to real-world applications so far. This is particularly true for embodied SI, where the agents are physical entities, such as in swarm robotics scenarios.

In the paper 

Melanie Schranz, Gianni di Caro, Thomas Schmickl, Wilfried Elmenreich, Farshad Arvin, Ahmet Sekercioglu, and Micha Sende. Swarm Intelligence and Cyber-Physical Systems: Concepts, challenges and future trends. Swarm and Evolutionary Computation, 60, 2020. (doi:10.1016/j.swevo.2020.100762)

we start from these observations, outline different definitions and characterizations, and then discuss present challenges from the perspective of future uses of swarm intelligence. These include application ideas, research topics, and new sources of inspiration from biology, physics, and human cognition. To motivate future applications of swarms, we make use of the notion of cyber-physical systems (CPS). CPSs encompass a large spectrum of technologies including robotics, the Internet of Things (IoT), systems on chip (SoC), embedded systems, and so on. We then give concrete examples of visionary applications and their challenges, representing the physical embodiment of swarm intelligence in

  • autonomous driving and smart traffic,
  • emergency response,
  • environmental monitoring,
  • electric energy grids,
  • space missions,
  • medical applications,
  • and human networks.

In the future, swarm-based applications will play an important role when there is not enough information to solve the problem in a centralized way, when time constraints do not allow finding an analytical solution, and when the operation needs to be performed in a dynamically changing environment. With the increasing complexity of upcoming applications, this means that SI will be applied to solve a significant part of ubiquitous complex problems.