2023
-
To which Extent are Simulation Research Papers related to the Real World? – A Survey on the Use of Validation Methods
Anne Vonderheide and Matthias Becker
Proceedings of the 21st annual Industrial Simulation Conference - ISC '23
Validation is the methodology at the end of the simulation and modelling cycle that relates the insights gained by computer calculations to the real-world scenario that is the subject of study. Validation establishes credibility concerning the simulation results and their implications with respect to the real system. Validation is a very important aspect of real-world simulation studies, since important financial and security-related decisions concerning the planning, design and optimization of the real systems might rely on them. In our study, we conducted a survey of research papers in renowned simulation conferences and evaluated whether validation had been used at all and, if yes, which validation methods had been used to which extent. This reveals the relevance and trustworthiness of the simulation results of the research papers concerning the real-world problem addressed in that research. We found that a majority of research papers neglect the application of validation methodology. The validity of the results of such simulation studies, and their impact on practical problems in the real world, thus seems questionable in a vast number of cases. -
A Study of Insect Management Models in Agriculture
Matthias Becker and Kin-Woon Yeow
Proceedings of the 21st annual Industrial Simulation Conference - ISC '23
In the study of controlling insect populations within the field of agriculture, many simulation models of herbivorous insects have been proposed. These models contain various factors that determine the population development and growth of the herbivorous insects, such as temperature and light intensity. Due to these factors, intensive studies have been carried out to discover the model that best suits the population development of each specific species of herbivorous insect. -
Training Agents for Unknown Logistics Problems
Elisa Schmidt and Matthias Becker
Proceedings of the Companion Conference on Genetic and Evolutionary Computation - GECCO '23 Companion
A methodology for preparing agents to succeed on a priori unknown logistics problems is presented. The training of the agents can only be executed on a small number of test problems taken from a broad class of generalized logistics problems. The developed agents are then evaluated on unknown instances of the problem class. This work has been developed in the context of last year's AbstractSwarm Multi-Agent Logistics Competition. The most successful algorithms are presented, and all participating algorithms are discussed with respect to the features that contribute to their success. As a result, we conclude that such a broad variety of a priori unknown logistics problems can be solved efficiently if multiple different well-working approaches are used, instead of trying to find one optimal algorithm. On the test problems used, this method can outperform trivial as well as non-trivial implementations, for example algorithms based on machine learning.
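The concluding point, that a portfolio of several simple approaches can beat any single algorithm over a broad problem class, can be illustrated with a small sketch. The toy example below is not the AbstractSwarm setting: the "logistics problems" are simple job-to-machine assignments and the three "agents" are assumed greedy heuristics, invented purely for illustration.

```python
import random

random.seed(3)

# Toy stand-in for "a priori unknown logistics problems": assign jobs with given
# processing times to 3 machines; the cost of a schedule is its makespan.
def make_problem():
    return [random.randint(1, 20) for _ in range(random.randint(8, 20))]

def makespan(jobs, order):
    machines = [0, 0, 0]
    for j in order:
        machines[machines.index(min(machines))] += jobs[j]   # greedy list scheduling
    return max(machines)

# Three simple "agents" that differ only in the order in which jobs are scheduled.
heuristics = {
    "longest first": lambda jobs: sorted(range(len(jobs)), key=lambda j: -jobs[j]),
    "shortest first": lambda jobs: sorted(range(len(jobs)), key=lambda j: jobs[j]),
    "random order": lambda jobs: random.sample(range(len(jobs)), len(jobs)),
}

test_problems = [make_problem() for _ in range(200)]
single = {name: 0 for name in heuristics}
portfolio = 0
for jobs in test_problems:
    results = {name: makespan(jobs, h(jobs)) for name, h in heuristics.items()}
    for name in heuristics:
        single[name] += results[name]
    portfolio += min(results.values())     # run all agents, keep the best schedule

for name, total in single.items():
    print(f"{name:15s} total makespan = {total}")
print(f"{'portfolio':15s} total makespan = {portfolio}")
```

Running every heuristic and keeping the best result per instance can never be worse than any single heuristic, which is the essence of the portfolio argument.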
2021
-
Scheduling of Offshore Wind Farm Installation using Simulated Annealing
Shengrui Peng, Daniel Rippel, Matthias Becker and Helena Szczerbicka
17th IFAC Symposium on Information Control Problems in Manufacturing - INCOM '21
This paper focuses on the scheduling problem in the offshore wind farm installation process, which is strongly influenced by offshore weather conditions. Since offshore weather is only partially predictable and cannot be controlled, it is urgent to find a way to schedule the offshore installation process effectively and economically. For this purpose, this work presents a model of the offshore installation process based on a Timed Petri Net (TPN) approach and applies a simulated annealing algorithm to find the optimal schedule.
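As a rough illustration of the optimization step, the sketch below applies plain simulated annealing to a toy weather-constrained scheduling instance. The operation durations, weather limits and hourly forecast are invented for the example and are not the paper's TPN model or data.

```python
import math
import random

random.seed(1)

# Toy instance (all numbers invented): each operation needs `duration` consecutive
# workable hours, where an hour is workable if the forecast wave height stays
# below the operation's `limit`.
operations = [(6, 1.2), (4, 1.5), (8, 1.4), (5, 1.3), (3, 1.6)]   # (duration, limit)
forecast = [random.uniform(0.4, 1.8) for _ in range(500)]          # wave height per hour

def makespan(order):
    """Finish time when operations are executed in the given order."""
    t = 0
    for duration, limit in (operations[i] for i in order):
        # Wait until a window of `duration` consecutive workable hours starts.
        while t + duration <= len(forecast) and any(h > limit for h in forecast[t:t + duration]):
            t += 1
        t += duration
    return t

def simulated_annealing(iterations=5000, t_start=10.0, t_end=0.1):
    current = list(range(len(operations)))
    random.shuffle(current)
    cost = makespan(current)
    best, best_cost = current[:], cost
    for i in range(iterations):
        temp = t_start * (t_end / t_start) ** (i / iterations)     # geometric cooling
        a, b = random.sample(range(len(current)), 2)
        neighbour = current[:]
        neighbour[a], neighbour[b] = neighbour[b], neighbour[a]    # swap two operations
        delta = makespan(neighbour) - cost
        if delta <= 0 or random.random() < math.exp(-delta / temp):
            current, cost = neighbour, cost + delta
            if cost < best_cost:
                best, best_cost = current[:], cost
    return best, best_cost

print(simulated_annealing())
```
-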
Tire noise optimization problem: a mixed integer linear programming approach
Matthias Becker, Nicolas Ginoux, Sébastien Martin and Zsuzsanna Róka
RAIRO - Operations Research, Volume 55, Number 5, September-October 2021
We present a Mixed Integer Linear Programming (MILP) approach in order to model the non-linear problem of minimizing the tire noise function. In a recent work, we proposed an exact solution for the Tire Noise Optimization Problem dealing with an approximation of the noise (TNOP-AP). Here we study the original non-linear problem modeling the exact, or real, noise (TNOP-EX) and propose a new scheme to obtain a solution for the TNOP-EX. Relying on the solution for the TNOP-AP, we use a Branch&Cut framework and develop an exact algorithm to solve the TNOP-EX. We also take more industrial constraints into account. Finally, we compare our experimental results with those obtained by other methods. -
Decision-Support System for Offshore Wind Farm Installation
Shengrui Peng, Helena Szczerbicka, Matthias Becker, Daniel Rippel and Michael Lütjen
The 31st International Ocean and Polar Engineering Conference - ISOPE '21
This work presents a prototype of a decision-support system for Offshore Wind Farm (OWF) construction based on the Timed Petri Net (TPN) approach, which aims to help decision-makers in the industry optimize the process execution. The objective function evaluates the solutions or alternatives in order to find the optima. We also present a decoupled scheduling strategy that is computationally efficient and stable compared to the Mixed-Integer Linear Programming method, which is mathematically exact. Moreover, the decision-support system also supports the case in which multiple agents or installation vessels are used in the project. The numerical study reveals the relation between process acceleration and charter cost when using two installation vessels. Finally, historical data are used for the weather predictions.
2020
-
Prognosis of Health-Related Unavailability of Professional German Football Players
Stefan Schestakov and Matthias Becker
6th International Conference on Event-Based Control, Communication, and Signal Processing - EBCCSP '20
In this work, we statistically analyze publicly available data on the health-related absenteeism of German professional football players and, based on this, develop a machine-learning model for a prognosis of the probability that a player will have a longer health problem during the current season. That prognosis is essential for German insurance companies, since a short illness or injury is covered by the health insurance, whereas a longer absenteeism (over six weeks) is covered by a special insurance. Insurance companies have to calculate the risk for that special case in order to offer an appropriate insurance rate. Our model gives an assessment of the risk and thus plays a vital role in the calculation of the insurance rate that can be offered to a professional football player. -
Simulation-Based Scheduling for Offshore Wind Farm Installation Using Timed Petri Nets Approach
Shengrui Peng, Daniel Rippel, Michael Lütjen, Matthias Becker and Helena Szczerbicka
Proceedings of the 2020 Summer Simulation Conference - SummerSim '20
Despite the success in the development of wind energy technology, challenges remain, for example in offshore wind energy installation. Due to changeable and unstable offshore weather conditions, it is hard to schedule the installation logistics effectively. In this work, we propose a simulation-based scheduling strategy to help operators and project managers make the main decisions during the installation in order to increase efficiency, e.g. how many offshore wind turbines should be loaded onto the installation vessel. The offshore logistics concept is modeled using a timed Petri net (TPN) approach. The timed transitions in the TPN model are assigned operation times estimated by means of a discrete-time Markov chain (DTMC) approach, which uses historical weather data. In addition, operability is introduced in this work as an indicator to evaluate schedules for a certain time period.
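The idea of estimating weather-dependent operation times from a discrete-time Markov chain can be sketched in a few lines. The two-state chain and its transition probabilities below are assumptions for illustration, not values fitted from the historical weather data used in the paper.

```python
import random

random.seed(0)

# Hypothetical two-state weather chain: state 0 = workable weather, state 1 =
# non-workable; hourly transition probabilities are invented for the sketch.
P = {0: {0: 0.9, 1: 0.1},
     1: {0: 0.3, 1: 0.7}}

def sampled_duration(net_operation_hours, start_state=0):
    """Hours until `net_operation_hours` of workable weather have accumulated."""
    state, elapsed, worked = start_state, 0, 0
    while worked < net_operation_hours:
        elapsed += 1
        if state == 0:
            worked += 1
        state = 0 if random.random() < P[state][0] else 1
    return elapsed

# Monte Carlo estimate of the expected duration of a 12-hour lifting operation.
samples = [sampled_duration(12) for _ in range(10000)]
print("expected duration [h]:", sum(samples) / len(samples))
```
-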
Modeling and Simulation of Offshore Wind Farm Installation with Multi-Leveled CGSPN Approach
Shengrui Peng, Helena Szczerbicka and Matthias Becker
The 30th International Ocean and Polar Engineering Conference - ISOPE '20
This work presents a multi-leveled model for offshore wind energy installation based on the Colored Generalized Stochastic Petri Net (CGSPN) approach. The offshore logistics, which describes the organization of offshore operations, is embedded at the root level. The offshore operations, e.g., loading and sailing, are implemented at the secondary level using sub-models. The large scale of the wind turbine components and the ever-changing offshore weather conditions make the scheduling difficult. The aim is to support the project operators and managers in making decisions with the knowledge of the system behavior obtained through stochastic simulation, in which historical weather data measured on the German North Sea from 1958 to 2007 is used. The numerical results show the influence of decision variables, e.g. initial inventory, on a designed offshore wind farm with a size of 80 wind turbines.
2019
-
Approximate Distributed Discrete Event Simulation using Semi-Conservative Look-Ahead Estimation
Desheng Fu, Marcus O'Connor, Matthias Becker and Helena Szczerbicka
2019 IEEE/ACM 23rd International Symposium on Distributed Simulation and Real Time Applications - DS-RT '19
A novel way of distributed discrete event simulation, called approximate distributed discrete event simulation, is presented in this paper. Compared with classic simulation, the models for approximate simulation give the simulator a certain free margin during execution. This can be used in some cases to reduce the overhead of the simulation, especially the execution time. Since the margin can be adjusted arbitrarily within its range, a trade-off between simulation precision and execution time can be achieved this way. It is well known that the execution time of distributed discrete event simulation cannot be reduced significantly compared with a sequential simulation when the logical processes are tightly coupled and the look-ahead is very short. In this study, a framework of approximate distributed discrete event simulation with some novel algorithms is developed, which aims to provide a longer look-ahead and thereby a trade-off between simulation precision and execution time using the free margin provided by the model. -
Evaluation of Algorithms for Forecasting of Insect Populations
Matthias Becker
Proceedings of the 33rd annual European Simulation and Modelling Conference - ESM '19 -
A Review on the Planning Problem for the Installation of Offshore Wind Farms
Daniel Rippel, Nicolas Jathe, Matthias Becker, Michael Lütjen, Helena Szczerbicka and Michael Freitag
9th IFAC Conference on Manufacturing Modelling, Management and Control - MIM '19
Offshore wind farms provide a promising technology to produce renewable and sustainable energy. Nevertheless, the installation and operation of offshore wind farms pose a particular challenge to the planning and execution of operations. This article aims to identify requirements towards a decision support tool for installation planning. Therefore, it provides a review of existing research on planning approaches and summarizes the overall planning problem. Afterwards, this problem is decomposed into individual tasks according to their planning horizons. This decomposition shows a high level of interconnection between tasks across all levels. Higher levels provide constraints for the tasks on lower levels, while the results of these tasks are incorporated at higher levels. Finally, the article discusses the advantages and disadvantages of different approaches to solve these tasks.
2018
-
Improving an Evolutionary Approach to Sudoku Puzzles by Intermediate Optimization of the Population
Matthias Becker and Sinan Balci
International Conference on Information Science and Applications - ICISA '18
In this work we improve previous approaches based on genetic algorithms (GA) for solving Sudoku puzzles. Those approaches use random swap mutations and filtered mutations, where both operations result in relatively slow convergence, the latter somewhat less so. We suggest improving GA-based approaches by an intermediate local optimization step applied to the population. Compared to the previous approaches, our approach is superior in terms of convergence rate, success rate and speed. As a consequence, we find the optimum with one population member and within one generation in a few milliseconds instead of nearly one minute.
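The structure of such a memetic GA, mutation followed by an intermediate local optimization of every individual, can be sketched as follows. The 4x4 puzzle, population size and operators are simplifications chosen for brevity and are not the configuration used in the paper.

```python
import random

# A toy 4x4 Sudoku-like puzzle (0 = empty) keeps the sketch short; the same
# structure applies to 9x9 grids.
PUZZLE = [
    [1, 0, 0, 4],
    [0, 0, 1, 0],
    [0, 1, 0, 0],
    [4, 0, 0, 1],
]
N, BOX = 4, 2
FIXED = [[v != 0 for v in row] for row in PUZZLE]

def random_individual():
    # Each row is filled with the digits missing from that row, so rows are
    # always conflict-free by construction.
    grid = []
    for r in range(N):
        missing = [d for d in range(1, N + 1) if d not in PUZZLE[r]]
        random.shuffle(missing)
        it = iter(missing)
        grid.append([v if v else next(it) for v in PUZZLE[r]])
    return grid

def conflicts(grid):
    # Count duplicate digits in columns and boxes (rows are valid by construction).
    c = 0
    for j in range(N):
        c += N - len({grid[i][j] for i in range(N)})
    for bi in range(0, N, BOX):
        for bj in range(0, N, BOX):
            c += BOX * BOX - len({grid[i][j] for i in range(bi, bi + BOX) for j in range(bj, bj + BOX)})
    return c

def mutate(grid):
    # Swap two non-fixed cells within a random row.
    g = [row[:] for row in grid]
    r = random.randrange(N)
    free = [j for j in range(N) if not FIXED[r][j]]
    if len(free) >= 2:
        a, b = random.sample(free, 2)
        g[r][a], g[r][b] = g[r][b], g[r][a]
    return g

def local_opt(grid):
    # Intermediate optimization: greedily accept improving in-row swaps.
    g = [row[:] for row in grid]
    for r in range(N):
        free = [j for j in range(N) if not FIXED[r][j]]
        for a in range(len(free)):
            for b in range(a + 1, len(free)):
                ja, jb = free[a], free[b]
                before = conflicts(g)
                g[r][ja], g[r][jb] = g[r][jb], g[r][ja]
                if conflicts(g) > before:                 # undo non-improving swaps
                    g[r][ja], g[r][jb] = g[r][jb], g[r][ja]
    return g

population = [random_individual() for _ in range(20)]
for gen in range(200):
    population = [local_opt(mutate(g)) for g in population]
    population.sort(key=conflicts)
    best = population[0]
    if conflicts(best) == 0:
        break
    population = population[:10] + [mutate(random.choice(population[:10])) for _ in range(10)]

print("conflicts:", conflicts(best))
for row in best:
    print(row)
```
-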
JIS: Pest Population Prognosis with Escalator Boxcar Train
Kin-Woon Yeow and Matthias Becker
2018 IEEE International Conference on Industrial Engineering and Engineering Management - IEEM '18
Pest population prognosis helps growers in the greenhouse to keep the pest population below the threshold efficiently. INSIM is one of the recognized pest population simulators. However, the INSIM implementation is difficult to execute as a web service. Thus, we propose a Java-based web application using the mathematical model of INSIM. In addition to the known model, our implementation is able to give prognosis boundaries based on the uncertainty of the temperature development and pest count. The proposed JIS gives lower and upper estimates of the pest population with a mean accuracy of 66.67% against our experimental validation data. Furthermore, the relationship between the area coverage of each yellow sticky trap and its accuracy percentage is investigated.
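To illustrate how temperature uncertainty translates into lower and upper prognosis bounds, here is a minimal degree-day sketch in Python. The base temperature, degree-day requirement, growth factor and temperature series are invented numbers, not the INSIM parameters, and the real model tracks development stages in far more detail.

```python
# Hypothetical degree-day sketch (not the INSIM equations): development is driven
# by the daily mean temperature above a base threshold, and lower/upper prognosis
# bounds come from an assumed +/- 2 degree temperature uncertainty.
BASE_TEMP = 10.0              # development threshold [deg C] (assumption)
DD_PER_GENERATION = 150.0     # degree-days needed per pest generation (assumption)

daily_mean_temp = [12, 14, 15, 13, 17, 19, 21, 20, 18, 16, 15, 17, 19, 22, 23]

def generations(temps, offset=0.0):
    degree_days = sum(max(0.0, t + offset - BASE_TEMP) for t in temps)
    return degree_days / DD_PER_GENERATION

initial_count = 40            # pests counted on yellow sticky traps (assumption)
growth_per_generation = 3.0

for label, offset in (("lower", -2.0), ("expected", 0.0), ("upper", +2.0)):
    population = initial_count * growth_per_generation ** generations(daily_mean_temp, offset)
    print(f"{label:8s} prognosis: {population:7.1f} pests")
```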
2017
-
Indoor Positioning Solely Based on User's Sight
Matthias Becker
Information Science and Applications (ICISA) 2017, Lecture Notes in Electrical Engineering (LNEE) Series - ICISA '17
Determining one's absolute geographical position has become everyday routine using the Global Positioning System (GPS), despite the prior existence of maps. However, no equally universal solution has been developed for determining one's location inside a building, an equally relevant problem for which GPS cannot be used. Existing solutions usually involve additional infrastructure on the part of the location provider, such as beacon installations or particular configurations of wireless access points. These solutions are generally facilitated by additional native mobile applications on the client device, which connect to this infrastructure. We are aware of such solutions, but believe them to be lacking in simplicity. Our approach to indoor positioning removes the need for additional hardware on the provider side and for software installation by the user. We propose to determine the user's position inside a building using only a photo of the corridor visible to the user, uploaded to a local positioning server accessible via a browser, which classifies the photo with a neural network approach. Our results demonstrate the feasibility of our approach. One floor of the university's building, with partially very similar corridors, has been learned by a deep convolutional neural network. A person lost in the building simply accesses the positioning server's website and uploads a photo of their current line of sight. The server responds by generating and displaying a map of the building with the user's current position and direction.
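The server-side step amounts to training an image classifier whose classes are corridor locations. The PyTorch sketch below uses a small made-up network, twelve hypothetical location classes and random tensors in place of the photo data set; the paper's actual classifier is a deep convolutional network trained on real corridor photos.

```python
import torch
import torch.nn as nn

# Hypothetical setup: corridor photos resized to 64x64 RGB, one class per corridor segment.
NUM_LOCATIONS = 12

class CorridorNet(nn.Module):
    def __init__(self, num_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 64 -> 32
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16 -> 8
        )
        self.classifier = nn.Linear(64 * 8 * 8, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = CorridorNet(NUM_LOCATIONS)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Random tensors stand in for a labelled photo data set of the floor.
images = torch.rand(32, 3, 64, 64)
labels = torch.randint(0, NUM_LOCATIONS, (32,))

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()

# At query time the positioning server would classify an uploaded photo:
with torch.no_grad():
    predicted_location = model(torch.rand(1, 3, 64, 64)).argmax(1).item()
print(predicted_location)
```
-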
Implementing Real-Life Indoor Positioning Systems Using Machine Learning Approaches
Matthias Becker and Bharat Ahuja
IEEE 8th International Conference on Information, Intelligence, Systems, Applications - IISA '17
Position determination in an indoor environment has become a widely discussed problem, due to the growing complexity of building layouts and the lack of any natural heuristics for orientation compared to the outdoor case. Additionally, there is no universal standard for indoor positioning comparable to GPS, which cannot be used for this purpose. Locating oneself in a building serves an increasingly vital function, especially in time-critical scenarios such as airports. The use of expensive hardware may assist in solving this problem, which has been studied thoroughly, with different technologies being used to achieve a precision within a few meters. Nevertheless, these methods have remained in the academic realm for the most part. This is largely due to the high costs and labour of such hardware installations and the construction of software to interpret the measurements. The goal of this paper is to use existing wireless LAN access points in a building and user-provided smartphones to create a cost-effective positioning system, omitting the labour and cost of altering building infrastructure and at the same time simplifying the construction of classifiers for real-life use-cases. An alternative approach using image recognition techniques is presented for a purely web-based solution. -
Estimating performance of large scale distributed simulation built on homogeneous hardware
Desheng Fu, Matthias Becker, Marcus O'Connor and Helena Szczerbicka
2017 IEEE/ACM 21st International Symposium on Distributed Simulation and Real Time Applications - DS-RT '17
Large scale distributed simulation should be well planned before execution, since applying unnecessary hardware only wastes time and money. On the other hand, enough hardware is needed to achieve acceptable performance. Thus, it is worthwhile to estimate the performance of a large scale distributed simulation before the execution. Such an estimation also improves the efficiency of the applied hardware in many cases, due to the optimization of the simulation algorithm and of the partition of the model. In this paper, we show our approaches to estimate the performance, especially the duration of execution, of a large scale distributed simulation system built on a large set of homogeneous hardware, using a small set of hardware of the same type. Our basic idea is to simulate a distributed simulation in a sequential way for a short time, considering all the costs and benefits of the distribution. The results of our case study show that our approaches are able to provide meaningful estimations.
2016
-
Planning in Dynamic, Distributed and Non-automatized Production Systems
Matthias Becker, Michael Lütjen and Helena Szczerbicka
Information Science and Applications (ICISA) 2016
We present a framework for the realization of a decentralized decision support system for deployment in production scenarios with a low level of automation, such as multi-site construction. Furthermore, we present insights concerning the improvements of the production process when applying online simulation-based optimization methods in such scenarios. Our solution shows how to realize a central control and decision support station with special focus on easy connection to the possibly multiple construction sites and on high usability for non-experts. Our framework is able to connect to a large number of modeling, simulation and analysis tools. Regarding usability, it turned out that only dialogue-based communication with the end user is applicable in such scenarios, where often only simple devices are present. -
Improving the performance of distributed discrete event simulation by exchange of conditional look-ahead
Desheng Fu, Matthias Becker and Helena Szczerbicka
Concurrency and Computation: Practice and Experience, Wiley 2016
Distributed discrete event simulation is an important approach for enabling the modeling and analysis of the behavior of large systems. This approach presents a major problem, namely, the possible low performance due to the excessive overhead in synchronizing the distributed logical processes. To counter this, our approach to distributed discrete event simulation involves conservative synchronization and its acceleration using dynamic estimation of process-to-process look-ahead with a feedback mechanism. This mechanism allows for the estimation of a larger look-ahead, which may be invalidated and recalculated during the course of the simulation, if one of the processes obtains more detailed knowledge. In this work, we extend the dynamically estimated look-ahead, on the basis of the local state of the logical processes, by exchanging conditional look-aheads, in conjunction with the broadcast of invalidation announcements. A notable reduction in runtime in various cases is thus achieved, especially when the estimated look-ahead is stochastically too conservative. -
Optimization of Tire Noise by Solving an Integer Linear Program (ILP)
Matthias Becker, Nicolas Ginoux, Sébastien Martin and Zsuzsanna Róka
IEEE International Conference on Systems Man and Cybernetics - SMC '16
One important aim in the tire industry when finalizing a tire design is the modeling of the noise characteristics as perceived by the passengers of the car. In previous works, the problem was studied using heuristic algorithms to minimize the noise by searching for a sequence under constraints imposed by the tire industry. We present a new technique to compute the noise. We also propose an integer linear program based on that technique in order to solve this problem and find an optimal sequence. Our study shows that the integer linear programming approach yields significantly better tire designs; however, it has to be improved further to meet the calculation time restrictions for real-world problem sizes.
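To show what an ILP formulation of pitch sequencing can look like structurally, here is a toy model using the PuLP package (pip install pulp; the bundled CBC solver is used). The adjacency penalty used as the objective is a made-up surrogate for the noise function, and the instance size and pitch counts are invented; the paper's actual model and constraints differ.

```python
from pulp import LpProblem, LpMinimize, LpVariable, LpBinary, lpSum, LpStatus

N = 12                              # positions around the tire (tiny toy size)
TYPES = ["A", "B", "C"]             # pitch types
COUNT = {"A": 4, "B": 4, "C": 4}    # prescribed number of pitches of each type

prob = LpProblem("pitch_sequencing", LpMinimize)
x = {(i, t): LpVariable(f"x_{i}_{t}", cat=LpBinary) for i in range(N) for t in TYPES}
# y[i, t] = 1 if position i and its (circular) successor both carry type t.
y = {(i, t): LpVariable(f"y_{i}_{t}", lowBound=0, upBound=1) for i in range(N) for t in TYPES}

# Surrogate objective: penalize identical adjacent pitches (a stand-in for noise).
prob += lpSum(y[i, t] for i in range(N) for t in TYPES)

for i in range(N):                                    # exactly one pitch per position
    prob += lpSum(x[i, t] for t in TYPES) == 1
for t in TYPES:                                       # prescribed pitch counts
    prob += lpSum(x[i, t] for i in range(N)) == COUNT[t]
for i in range(N):                                    # linearization of x[i,t] * x[i+1,t]
    for t in TYPES:
        prob += y[i, t] >= x[i, t] + x[(i + 1) % N, t] - 1

status = prob.solve()
sequence = [t for i in range(N) for t in TYPES if x[i, t].value() > 0.5]
print(LpStatus[status], "".join(sequence))
```

The y variables linearize the product of two binaries; since they appear in a minimized objective, they only take value 1 when forced to by the constraint.
-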
Basic Algorithms for Bee Hive Monitoring and Laser-based Mite Control
Larissa Chazette, Matthias Becker and Helena Szczerbicka
IEEE Symposium Series on Computational Intelligence - IEEE SSCI '16
The work in progress described in this paper has the objective of implementing a beehive monitoring system that monitors essential parameters of a bee hive (such as temperature, sound, weight) and additionally includes an image recognition algorithm to observe the degree of infestation with Varroa mites. Mites should be detected at the entrance, and statistics about the degree of infestation should be made available via a web interface. As the ultimate approach to fighting mites without chemicals, the coordinates of the mites are to be detected and a laser will be used to kill them. This work describes approaches relevant to all steps of the aforementioned procedure; however, it is still work in progress and the components of the approach still have to be integrated into one system that is deployable in practice. -
Analysing the Cost-Efficiency of the Multi-agent Flood Algorithm in Search and Rescue Scenarios
Florian Blatt, Matthias Becker and Helena Szczerbicka
German Conference on Multiagent System Technologies
A multi-agent algorithm works by using at least two agents to create a synergistic effect, resulting in the emergence of new possibilities which are not explicitly programmed into the individual agents. To achieve this synergistic effect, the algorithm has to provide the possibility to communicate and subsequently allow cooperation between the agents. When multi-agent algorithms are used in search and rescue scenarios, the targeted effect of the emergence is either a more effective search and rescue process or, at least, an optimized rescue process. This paper examines the number of agents that is needed for the Multi-Agent Flood algorithm to yield the most beneficial ratio between the number of agents used and the time it takes to complete the search process. Our studies show that adding more robots may not be cost-efficient for the search and rescue process. This in turn allows for better planning and coordination of robotic search teams, as the number of needed agents can be anticipated and the transport logistics of the robots can be optimized. -
On the quality of graphs generated by swarm algorithms
Matthias Becker
2016 IEEE Congress on Evolutionary Computation - CEC '16
Swarm algorithms are often used for the generation of graphs. The generated graphs are mostly planar, inexpensive and fault-tolerant. In this work we evaluate the graphs produced by a swarm algorithm with respect to the properties and the quality of the found graphs and to the runtime requirements. The algorithm under consideration is essentially a simulation of the foraging of the slime mold Physarum polycephalum. The properties of the algorithm are especially emphasized, since its deployment in many other works is limited to the existence of a graph solution, without reporting the quality of the graph or the runtime requirements of the algorithm. We compare the quality of the resulting graphs and the runtime of the algorithm to classical algorithms from graph theory. Our results show that the slime mold algorithm has some interesting features; however, it is not the best means to construct graphs over large sets of nodes in terms of efficiency of the algorithm or quality of the outcome.
2015
-
Evaluating heuristic optimization, bio-inspired and graph-theoretic algorithms for the generation of fault-tolerant graphs with minimal costs
Matthias Becker, Markus Krömker and Helena Szczerbicka
Information Science and Applications
The construction of fault-tolerant graphs is a trade-off between costs and degree of fault-tolerance. Thus the construction of such graphs can be viewed as a two-criteria optimization problem. Any algorithm should therefore be able to generate a Pareto front of graphs so that the right graph can be chosen to match the application and the user's needs. In this work, algorithms from three different domains for the construction of fault-tolerant graphs are evaluated. Classical graph-theoretic algorithms, optimization and bio-inspired approaches are compared regarding the quality of the generated graphs as well as the runtime requirements. As a result, recommendations for applying the right algorithm to a certain problem class can be derived.
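The cost versus fault-tolerance trade-off can be made concrete by comparing a few candidate graphs over the same node set, using total edge length as cost and edge connectivity as robustness. The sketch below uses the networkx library on random points and is only an illustration of the two criteria, not one of the algorithms evaluated in the paper.

```python
import itertools
import math
import random
import networkx as nx

random.seed(1)
points = {i: (random.random(), random.random()) for i in range(12)}

def dist(a, b):
    return math.dist(points[a], points[b])

# Complete weighted graph over the node set.
complete = nx.Graph()
for u, v in itertools.combinations(points, 2):
    complete.add_edge(u, v, weight=dist(u, v))

def knn_graph(k):
    g = nx.Graph()
    g.add_nodes_from(points)
    for u in points:
        for v in sorted((w for w in points if w != u), key=lambda w: dist(u, w))[:k]:
            g.add_edge(u, v, weight=dist(u, v))
    return g

candidates = {
    "spanning tree": nx.minimum_spanning_tree(complete, weight="weight"),
    "3-NN graph": knn_graph(3),
    "complete graph": complete,
}

for name, g in candidates.items():
    cost = sum(d["weight"] for _, _, d in g.edges(data=True))
    robustness = nx.edge_connectivity(g)   # min. number of edge failures that disconnect g
    print(f"{name:15s} cost={cost:5.2f}  edge connectivity={robustness}")
```
-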
Advantages of Heterogeneous Agent Populations for Exploration and Pathfinding in Unknown Terrain
Florian Blatt, Matthias Becker and Helena Szczerbicka
Open-Access journal Frontiers in Sensors (FS)
In this work we demonstrate how the deployment of several types of agents increases the efficiency and the overall success of exploring unknown terrain and finding a path between a starting point and various points of interest. The agents used have different capabilities that are typically found in technical assistance systems used in search and rescue operations. In our test cases, the environments to be explored have both regular characteristics, like a maze or a building, and irregular structures. Our simulations using heterogeneous and cooperating agent populations show that this approach is superior to homogeneous populations, with a higher rate of finding the destinations and in shorter time. -
Unterst{\"u}tzung f{\"u}r eine effiziente Montagesteuerung -Simulation identifiziert und bewertet Handlungsoptionen bei St{\"o}rungen in einer Baustellmontage
Eric Hund, Sebastian Bohlmann, Helena Szczerbicka and Matthias Becker
Springer-VDI wt Werkstattstechnik online -
Universal Simulation Engine (USE): A Model-Independent Library for Discrete Event Simulation
Desheng Fu, Matthias Becker and Helena Szczerbicka
SpringSim 2015
Universal Simulation Engine (USE) is a C++ library providing a model-independent environment for Discrete Event Simulation (DES) tasks. Unlike most simulators, USE focuses on the general simulation technology and integrates many features, which are necessary to build a correct and efficient simulation system. It is aimed at providing a professional environment to reduce the cost of modeling as well as the execution time of the simulation for almost all DES models. Developers may use USE as an open framework to build models very efficiently. USE also supports many advanced features such as Distributed Discrete Event Simulation (DDES), Virtual Environments (VE), Online Simulation (OS), dynamic coupling / decoupling of sub-models, etc. This paper introduces the distinctive features of USE for practical implementation of DES, DDES and VE systems. We also evaluate the performance of USE by comparison with existing simulators. -
Optimizing the exploration efficiency of autonomous search and rescue agents using a concept of layered robust communication
Florian Blatt, Matthias Becker and Helena Szczerbicka
2015 IEEE 20th Conference on Emerging Technologies & Factory Automation (ETFA)
Robots deployed in search and rescue missions are required to act autonomously, since in disaster scenarios even remote control of the robots might not be possible anymore. Additionally, it is desirable to use several of these robots at once, since the search for survivors is a time-critical task, and the time to find victims should, in the best case, scale with the number of robots. In order to achieve this scaling, the robots have to communicate and cooperate. This paper discusses how to optimize the search and rescue mission with multiple autonomous yet cooperating robots, evaluating different communication patterns to speed up the search process. Emphasis lies on the evaluation of the diverse types of communication methods, which can be direct, indirect or a combination thereof. Our studies show that the combination of direct and indirect communication optimizes the search and rescue process. Furthermore, using a combined method of data transfer between the agents provides robust communication between the robots, improving the search efficiency. -
On the influence of state update interval length on the prediction success of decision support system in multi-site production environment
Matthias Becker and Helena Szczerbicka
2015 IEEE 20th Conference on Emerging Technologies & Factory Automation (ETFA)
Planning in a multi-site, non-mass production environment is a special challenge because of several sources of uncertainty. Unlike in mass production facilities, in our setting the current state at all sites cannot be determined easily and exactly, due to the spatial distribution of sites and the low degree of automation. For re-planning in case of failures, the possible alternative actions have to be formalized at the decision-making facility, where the possible alternatives will then be determined and evaluated. In this work, we present the necessary components for an automated evaluation of alternatives and a decision support procedure. The main challenges are the formalization of product plans including alternative steps and the non-automated collection or assessment of the distributed system state of all sites. In our experiments, we evaluate different state update intervals and their effect on prediction accuracy. It turns out that even sparse updates yield a significant improvement in production time compared to purely local, static decisions. -
On the Efficiency of Nature-Inspired Algorithms for Generation of Fault-Tolerant Graphs
Matthias Becker
IEEE Conference on Systems, Man and Cybernetics - SMC 2015
In this study, several algorithms for the generation of inexpensive and fault-tolerant graphs are evaluated with respect to the quality of the found graphs and to the runtime requirements. A special focus lies on the properties of the algorithm that is essentially a simulation of the foraging of the slime mold Physarum polycephalum, since in many other works the deployment of this algorithm does not go beyond the conclusion that the algorithm is capable of generating a graph, while the quality of the graph and the runtime requirements of the algorithm are not reported. Our results show that the slime mold algorithm has some interesting features; however, it is not the best means to construct highly efficient graphs out of large sets of nodes. -
A Framework for Decision Support in Systems with a low Level of Automation
Matthias Becker, Sinan Balci and Helena Szczerbicka
International Conference on Computers and Industrial Engineering, CIE45
We present a framework for the realization of a decentralized decision support system for deployment in production scenarios with a low level of automation, such as multi-site construction. Our solution shows how to realize a central control and decision support station. Special focus is on easy connection to the possibly multiple construction sites and on high usability for non-experts. Our framework is able to connect to a large number of modeling, simulation and analysis tools. Regarding usability, it turned out that only dialogue-based communication with the end user is applicable in such scenarios, where often only simple devices are present.
2014
-
A concept of layered robust communication between robots in multi-agent search \& rescue scenarios
Matthias Becker, Florian Blatt and Helena Szczerbicka
2014 IEEE/ACM 18th International Symposium on Distributed Simulation and Real Time Applications
This paper introduces an approach that uses direct and indirect communication between robots for a robust communication framework in a multi-agent scenario. The approach grants the robots access, via indirect communication, to the defined minimum of information needed by the underlying navigation or exploration algorithm, and creates a robust mechanism that does not suffer the drawbacks of direct communication via radio signals or light waves. It also allows the agents to access a higher level of communication by creating an ad hoc network for the exchange of additional information, which enhances the communication and cooperation between the agents. This enables the underlying algorithm to access added information which can be used to significantly boost the navigation or exploration process, while not relying on this high level of communication. -
Accelerating distributed discrete event simulation through exchange of conditional look-ahead
Desheng Fu, Matthias Becker and Helena Szczerbicka
Proceedings of the 2014 IEEE/ACM 18th International Symposium on Distributed Simulation and Real Time Applications
Distributed discrete event simulation is today a very important method to analyze the behavior of large models. We investigate the practical implementation of distributed discrete event simulation with conservative synchronization and its acceleration through dynamic estimation of process-to-process look-ahead. Since the dynamic look-ahead changes over time, we have to face the situation that the look-ahead between some logical processes temporarily decreases. The shortened look-ahead has a very negative influence on the performance of the simulation and is hard to avoid. However, this effect can be reduced by introducing some extra mechanisms in the simulation. In this paper, we present a mechanism that optimizes the simulation for the situation in which the look-ahead between some processes is very short. This mechanism is based on the exchange of conditional look-ahead and the broadcast of invalidation announcements. Our evaluation shows a reduction of the execution time for the majority of distributed simulations, especially when the estimated look-ahead is stochastically too conservative. -
Predictive simulation based decision support system for resource failure management in multi-site production environments
Matthias Becker, Sinan Balci and Helena Szczerbicka
2014 International Conference on Control, Decision and Information Technologies (CoDIT)
Planning in a multi-site, non-mass production environment is a special challenge because of several sources of uncertainty. Unlike in mass production facilities, in our setting the current state at all sites cannot be determined easily and exactly, due to the spatial distribution of sites and the low degree of automation. For re-planning in case of failures, the possible alternative actions have to be formalized at the decision-making facility, where the possible alternatives will then be determined and evaluated. In this work, we present the necessary components for an automated evaluation of alternatives and a decision support procedure. The main challenges are the formalization of product plans including alternative steps and the non-automated collection or assessment of the distributed system state of all sites.
2013
-
A Multi-agent Flooding Algorithm for Search and Rescue Operations in Unknown Terrain
Matthias Becker, Florian Blatt and Helena Szczerbicka
Multiagent System Technologies
In this paper we introduce a new multi-agent algorithm for use in search and rescue scenarios for the exploration of unknown terrain. This method combines the concept of exploration from the flood algorithm with the path-optimizing features of the ant algorithm. The first part leads to a fast exploration of the unknown terrain, while the second part constructs short paths from points of interest back to the base. Together, this enables rescue operations to start in parallel with the ongoing search. We demonstrate the feasibility of our approach by agent-based simulations. The simulations show that our approach is comparable in speed and quality with already existing algorithms, delivering the additional benefit of short paths to points of interest, and adhering to the inherent limitations of these kinds of scenarios.
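The exploration part of the idea can be pictured with a sequential stand-in for the agent wave: cells are flooded outward from the base, each explored cell keeps a marker pointing back to where the wave came from, and those markers yield a short path once a point of interest is found. The grid and the single-process, breadth-first formulation below are simplifications for illustration, not the multi-agent simulation from the paper.

```python
from collections import deque

# Toy occupancy grid: '#' = obstacle, 'S' = base, 'V' = victim (point of interest).
GRID = [
    "S..#....",
    ".#.#.##.",
    ".#...#V.",
    ".#.#.#..",
    "...#....",
]

def flood_search(grid):
    rows, cols = len(grid), len(grid[0])
    start = next((r, c) for r in range(rows) for c in range(cols) if grid[r][c] == "S")
    parent = {start: None}          # indirect "markers" left in explored cells
    frontier = deque([start])       # cells currently reached by the flood wave
    while frontier:
        r, c = frontier.popleft()
        if grid[r][c] == "V":       # victim found: walk the markers back to the base
            path, cell = [], (r, c)
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] != "#" and (nr, nc) not in parent:
                parent[(nr, nc)] = (r, c)
                frontier.append((nr, nc))
    return None

print(flood_search(GRID))
```
-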
Online simulation based decision support system for resource failure management in multi-site production environments
Sebastian Bohlmann, Matthias Becker, Sinan Balci, Helena Szczerbicka and Eric Hund
2013 IEEE 18th Conference on Emerging Technologies & Factory Automation (ETFA)
Planning in a multi-site, non-mass production environment is a special challenge because of several sources of uncertainty. Unlike in mass production facilities, in our setting the current state is not easily and exactly known when re-planning becomes necessary. The planning procedure has to account for that fact, as well as for further uncertainties concerning the effects of a plan when evaluating it. Thus, in this work we apply online simulation as a means for re-planning multi-site production in the case of resource failure. This work is a first step, where two alternatives are considered when a resource fails: either wait for the repair of the resource, or transport another instance of this resource from another site, if there is more than one available. Our study shows that planning using online simulation is superior to a static strategy such as 'always wait for repair' or 'always import resource' in case of resource failure. -
On the potential of semi-conservative look-ahead estimation in approximative distributed discrete event simulation
Desheng Fu, Matthias Becker and Helena Szczerbicka
Proceedings of the 2013 Summer Computer Simulation Conference
One major problem of distributed discrete event simulation is the poor performance due to the huge overhead for maintaining the order of causality, so that the execution time cannot be reduced significantly compared to sequential simulation. This holds especially when the processes are tightly coupled and the look-ahead is very short. On the other hand, the results of many simulations are obtained from a number of independent outputs, which are of stochastic nature, and a small deviation of a limited number of outputs is acceptable. Accepting such deviations in a controlled way enables a trade-off between simulation accuracy and execution time. The goal of our investigation is to develop a methodology to handle this trade-off. In this paper, we propose a new way of distributed simulation with semi-conservative look-ahead estimation, where we accept causality errors to a certain and limited extent. In our approach, we consider a semi-conservative estimation allowing limited over-estimation. If the look-ahead is over-estimated, unresolved causality errors will be resolved by a very efficient recovery procedure at the expense of simulation errors. Results from a case study demonstrate that our approach is able to maximize the look-ahead with respect to the predefined error bounds and can reduce the execution time of many simulations. We do, however, also point out the limitations of the mechanism and the direction of our further investigation. -
Applicability of bio-inspired and graph-theoretic algorithms for the design of complex fault-tolerant graphs
Matthias Becker, Florian Schmidt and Helena Szczerbicka
2013 IEEE International Conference on Systems, Man, and Cybernetics
Fault-tolerant networks are needed for many applications, such as telecommunication networks, electricity networks, traffic, routing, and others. Several methods for constructing fault-tolerant networks out of a given set of nodes exist, among them classical graph-theoretic ones, and recently several bio-inspired algorithms have also been proposed for this application. In this paper we study the performance of these different algorithms for that problem. Performance means that both the complexity of the algorithm for a given problem size and the quality of the generated networks are taken into account. We conclude that classical algorithms that belong to a certain complexity class are efficient for small to medium-sized problems, while at some point, for larger problems, bio-inspired solutions are more efficient at finding a solution.
2012
-
Comparison of Bio-Inspired and Graph-Theoretic Algorithms for Design of Fault-Tolerant Networks
Matthias Becker, Waraphan Sarasureeporn and Helena Szczerbicka
ICAS 2012, The Eighth International Conference on Autonomic and Autonomous Systems
Recently, several approaches have been presented that exploit the ability of Physarum polycephalum to connect several food sources via a network of pipes in order to maintain an efficient food distribution inside the organism. These approaches use the mechanisms found in nature in order to solve a technical problem, namely the construction of fault-tolerant and efficient connection networks. These works comprise experiments with the real slime mold Physarum polycephalum as well as computer simulations based on a tubular model and an agent-based approach. In this work, we study the suitability of those bio-inspired approaches and compare their performance to a graph-theoretic algorithm for the construction of fault-tolerant connection networks, the (k, t)-spanner algorithm. The graph-theoretic algorithm is able to construct graphs with a certain degree of fault tolerance as well as meet a given maximal path length between two arbitrary nodes. However, the definition of fault tolerance in previous bio-inspired works differs from that used in graph theory. Thus, in our contribution we analyze the bio-inspired approaches as well as the graph-theoretic approach for their efficiency in designing optimal fault-tolerant graphs. We demonstrate the usability of the graph-theoretic approach despite it relying on a different definition of fault tolerance. We conclude that classical, efficient computational algorithms from graph theory can be adapted and applied in the same field as the bio-inspired approaches for the problem of constructing efficient fault-tolerant networks. They often provide an easier-to-use and more direct solution than bio-inspired approaches, which need more parameter tuning before yielding satisfactory results. -
Agent-based Approaches for Exploration and Pathfinding in Unknown Environments
Matthias Becker, Florian Blatt and Helena Szczerbicka
17th IEEE International Conference on Emerging Technologies and Factory Automation
This work evaluates and improves agent-based approaches for the exploration of unknown terrain and the finding of (shortest) paths to a goal with an unknown location. This scenario is typically found in emergency incidents and in technically assisted search and rescue cases, e.g. when a building or a block of a city has been damaged by an earthquake and, as a result, former maps are not valid anymore. In our approach we simulate the exploration of the unknown environment by agents that have different capabilities typically found in the technical assistance systems used in search and rescue operations, such as cameras, sensors, and limited communication abilities. The outcome of our simulation experiments indicates which composition of a population of agents is best suited for efficient exploration of the unknown terrain, in terms of speed and path length. The results give hints as to which agent populations should be tried in a real-world setting, in a search and rescue test scenario with real robots and sensors. -
Orthogonal cut algorithm for value-based event localization in sensor networks
Desheng Fu, Matthias Becker, Sven Schaust and Helena Szczerbicka
2012 IEEE 9th International Conference on Mobile Ad-Hoc and Sensor Systems (MASS 2012)
We investigate the capability of ad-hoc sensor networks equipped with simple sensor devices to enable a more accurate spatial event localization in order to support smart camera networks. We present a novel algorithm named Orthogonal Cut which is suitable for many different event detection scenarios, especially when large interferences occur. It estimates the spatial position of a detected event by dividing the surveillance space of a sensor network into smaller areas until a threshold criterion is met.
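One possible reading of the dividing idea, purely as an illustration and not the algorithm as specified in the paper, is to halve the surveillance area along alternating axes, keep the half whose sensors report the stronger aggregate signal, and stop once the remaining area is small enough. All numbers below (area size, sensor count, signal model) are invented.

```python
import random

random.seed(0)
event = (37.0, 62.0)
sensors = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(200)]

def reading(s):
    # Value-based measurement: the signal decays with distance, plus a little noise.
    d = ((s[0] - event[0]) ** 2 + (s[1] - event[1]) ** 2) ** 0.5
    return 1.0 / (1.0 + d) + random.gauss(0, 0.001)

def orthogonal_cut(area=(0.0, 100.0, 0.0, 100.0), min_size=10.0):
    x0, x1, y0, y1 = area
    axis = 0
    while max(x1 - x0, y1 - y0) > min_size:
        if axis == 0:
            mid = (x0 + x1) / 2
            halves = [(x0, mid, y0, y1), (mid, x1, y0, y1)]
        else:
            mid = (y0 + y1) / 2
            halves = [(x0, x1, y0, mid), (x0, x1, mid, y1)]
        def strength(h):
            return sum(reading(s) for s in sensors if h[0] <= s[0] <= h[1] and h[2] <= s[1] <= h[3])
        x0, x1, y0, y1 = max(halves, key=strength)   # keep the half with the stronger signal
        axis = 1 - axis
    return ((x0 + x1) / 2, (y0 + y1) / 2)

print("estimated event position:", orthogonal_cut())
```

The achievable resolution of such a scheme is limited by the sensor density, which is why the stopping threshold is kept coarse in this sketch.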
2011
-
The Effect of Additional Information on the Prediction Quality of Wafer Fabrication Operation with a Neural Network and Clustering Approach
Matthias Becker, Helena Szczerbicka and Mei-Chen Lo
International Journal of Information -
A Simulation Model of Dictyostelium Discoideum for the Study of Evolutionary Selection Mechanisms
Matthias Becker and Helena Szczerbicka
Cybernetics and Systems: An International Journal
In this work we use a simulation model of the slime mold Dictyostelium discoideum to study mechanisms of evolutionary selection. Dictyostelium discoideum shows “altruistic” behavior: some cells sacrifice themselves in order to enable the survival of other cells. This behavior contradicts individual selection on a “survival of the fittest” basis, which is used in many computational algorithms. We show that individuals with lower fitness will not die out; on the contrary, they might be necessary for the survival of a population. As a conclusion, we should rethink the practice of selecting the best individuals on the basis of one single measure in computational algorithms that use a selection scheme. -
Design of fault tolerant networks with agent-based simulation of physarum polycephalum
Matthias Becker
2011 IEEE Congress of Evolutionary Computation (CEC)
In this work we evaluate slime-mold-inspired algorithms, which recently gained a lot of attention in renowned journals, for their ability to construct fault-tolerant connection networks. In previous work, experiments with a real slime mold Physarum polycephalum as well as computer simulations based on a tube model of the slime mold showed that the slime mold (and simulations thereof) is able to construct fault-tolerant and efficient transport networks similar to the actual Tokyo rail system. However, the quality of the solutions of the real slime mold shows large variations, and the tubular computer simulation does not seem to reproduce the natural slime mold very well, since the constructed networks do not show the variety of the naturally built ones; instead, they show a heavy dependence on one simulation parameter. Thus, in our work we present a different approach for the construction of fault-tolerant connection networks for the Tokyo rail system using an agent-based simulation of Physarum polycephalum. Analysis of the results shows that the agent-based simulation reproduces the variance in the behavior of the natural slime mold much better. Analyzing the cost-benefit ratio of bio-inspired network construction, we conclude, however, that it might be worthwhile to consider classical, efficient computational algorithms for the problem of constructing minimal fault-tolerant networks.
2010
-
A simulation study of mechanisms of group selection of the slime mold Dictyostelium discoideum
Matthias Becker
2010 IEEE 14th International Conference on Intelligent Engineering Systems
Slime molds are fascinating organisms: they can either live as an organism consisting of a single cell (protozoa) or they can form a multi-cellular organism (pseudoplasmodium). From the biological point of view, slime molds are therefore studied in order to understand the evolutionary step from a single-cell organism to a multi-cellular organism. Studies have shown that the behavior of cooperating single-cell organisms exhibits synergistic emergent intelligence, for example finding shortest paths. Recently, simulation and experiments with a real slime mold (Physarum polycephalum) have been used for traveling-salesman-like problems. In this work we present a simulation model for the slime mold Dictyostelium discoideum. In contrast to other studies, here the whole life cycle is modeled and simulated. This model is used to study the mechanism of cooperation of single cells: we compare the mechanism of altruistic group selection against individual, egoistic selection. It turns out that simple signaling mechanisms are sufficient for the complex behavior of Dictyostelium discoideum, allowing altruistic behavior at the cell level that helps the whole population to survive. -
A Data Management Framework Providing Online-Connectivity in Symbiotic Simulation
Sebastian Bohlmann, Volkhard Klinger, Helena Szczerbicka and Matthias Becker
24th EUROPEAN Conference on Modelling and Simulation (ECMS), Simulation meets Global Challenges, Kuala Lumpur, Malaysia
Symbiotic simulation in industrial applications requires efficient connectivity between industrial processes and embedded legacy simulators. One main challenge is to handle heterogeneous system parameters like bandwidth, latency, redundancy, security and data representation. Moreover, data management needs to be improved in terms of unified access, providing an interface to online, historical and corresponding simulation data. This paper proposes a framework for symbiotic simulation addressing the problem of connectivity. We introduce the Process Data Streaming Protocol (PDSP) managing distributed process data flows. Additionally, we present PDSP-based modules covering different modes of operation for data processing. The framework interacts with distributed control systems via Object Linking and Embedding for Process Control (OPC) as well as with embedded systems using different hierarchic operation modes. Historical data can be provided transparently through an integrated stream database. The framework is primarily optimized for use in Java-based simulation environments, but is not limited to these. Finally, we demonstrate the usability of the system while interacting with a simulation environment for a hybrid process and present some experimental results. -
Simulation Model For The Whole Life Cycle Of The Slime Mold Dictyostelium Discoideum.
Matthias Becker
Proceedings of the European Conference on Modelling and Simulation (ECMS)
Slime molds are fascinating organisms: they can either live as an organism consisting of a single cell or they can form a multi-cellular organism. Therefore, from the biological point of view, slime molds are studied in order to understand the evolutionary step from a single-cell organism to a multi-cellular organism. Studies have shown that the behavior of cooperating single-cell organisms exhibits synergistic emergent intelligence, for example finding shortest paths. Recently, simulation and experiments with a real slime mold (Physarum polycephalum) have been used for traveling-salesman-like problems. In this work we present a simulation model for the slime mold Dictyostelium discoideum. In contrast to other studies, here the whole life cycle is modeled and simulated. Very detailed behavioral patterns and parameters are modeled, and as a result a simulation model is obtained that shows behavior very close to that of the living slime mold. This result is consolidated by extensive verification experiments. As a consequence, this model can be used to further study the mechanisms of cooperation of single cells and of synergy and emergence; additionally, this model offers the possibility to develop more slime-mold-inspired algorithms. -
An Optimization Algorithm Similar to the Search of Food of the Slime Mold Dictyostelium Discoideum
Matthias Becker and Malte Wegener
IRAST International Congress on Computer Applications and Computational Science
2009
-
Performance Efficiency Measuring and Prediction of Wafer Fabrication Operation with a Combined Clustering and Neural Network Approach
Matthias Becker, Helena Szczerbicka and Mei-Chen Lo
Asia Pacific Industrial Engineering and Management Systems Conference
Most performance assessment of semiconductor manufacturers is based on self-appraisal or subjective judgments. The need to measure fabrication (fab) operation performance along its various dimensions has led to the development of a large number of quantitative performance indicators. An overall scheme to measure the performance of fab operation involving multiple inputs and multiple effects (outputs) has not been well established yet. In this study, we approach performance assessment and prediction by combining clustering approaches with Artificial Neural Network (ANN) approaches. We use historical data from a major Taiwanese semiconductor player, which comprise input/investment data (such as headcount, salary, cost for machines, running the fab, ...) as well as the output of each fab (such as margin, wafer output rate, step moves, number of patents). The data cover several years, during which some of the fabs have been ramped up. In the first phase of our approach, we studied several clustering algorithms (K-Means, X-Means, Kernel K-Means, SIB, and EM) on the data. We found several clusterings that were meaningful according to human experts. One of the clustering approaches clearly separated an older fab from newer fabs, and was also able to distinguish fabs in the ramp-up phase from fabs in a stable operation phase. Other approaches formed clusters according to the grade of performance (bad, medium-bad, medium-good, good) of the data sets. Second, we use the classification to let a neural network learn the status of a fab, so that for a new fab the status can be judged by the neural net. In a third step, we let a neural network learn the relationship between the multiple inputs and outputs. As a result, we found a neural net structure that is able to predict the effect of changes in the inputs of a fab on the different output factors. This enables fab management to obtain a prediction of the effect a planned measure (e.g. an increase or decrease in headcount) has on the output of the fab, that is, on the performance figures.
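The two-phase idea, clustering the fab records first and then learning the multi-input/multi-output relation with a neural network for what-if predictions, can be sketched with scikit-learn on synthetic data. The feature dimensions and the hidden linear ground truth below are stand-ins, not the confidential fab data used in the paper.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for the fab data: inputs (headcount, salary, machine cost, ...)
# and outputs (margin, wafer output rate, ...), with a hidden linear relation plus noise.
X = rng.uniform(0, 1, size=(120, 4))
true_weights = rng.uniform(0.5, 2.0, size=(4, 3))
Y = X @ true_weights + rng.normal(0, 0.05, size=(120, 3))

# Phase 1: cluster the fab records (e.g. ramp-up vs. stable operation).
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("records per cluster:", np.bincount(clusters))

# Phases 2/3: learn the multi-input/multi-output relation with a neural network.
net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=0).fit(X, Y)

# "What-if" prediction: effect of increasing the first input (e.g. headcount) by 10%.
fab = X[:1].copy()
planned = fab.copy()
planned[0, 0] *= 1.1
print("predicted output change:", net.predict(planned) - net.predict(fab))
```
-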
Tread profile optimization for tires with multiple pitch tracks
Matthias Becker, Sebastian Jaschke and Helena Szczerbicka
Proceedings of the IEEE 13th International Conference on Intelligent Engineering Systems
Reduction of noise is a growing subject of interest in the automotive industry, especially in tire manufacturing. After construction of the basic tire design, that is, the design of the material and of the basic building blocks called pitches, the last step in the noise engineering of a tire is the determination of its pitch sequence. In this step the different types of pitches are put together subject to several constraints. Since there is a combinatorial number of valid pitch sequences, the goal is to find a valid pitch sequence with optimal noise characteristics. Due to the complexity of the problem, the globally optimal pitch sequence cannot be found by exhaustive search, and intelligent algorithms, such as heuristic optimization algorithms, have to be used in order to find at least a locally optimal pitch sequence. Several successful approaches for this problem can be found in the literature for tires consisting of just one pitch sequence. In this work, tires consisting of multiple pitch sequences (several tracks) are considered. We show how algorithms for single-track optimization can be used and how they can best be combined to find noise-optimal tire designs for tires with multiple pitch tracks. -
On Classification Approaches for Misbehavior Detection in Wireless Sensor Networks
Matthias Becker, Martin Drozda, Sven Schaust, Sebastian Bohlmann and Helena Szczerbicka
Journal of ComputersAdding security mechanisms to computer and communication systems without degrading their performance is a difficult task. This holds especially for wireless sensor networks, which due to their design are especially vulnerable to intrusion or attack. It is therefore important to find security mechanisms that cope with the limited resources of such systems in terms of energy consumption, computational capabilities and memory requirements. In this document we discuss and evaluate several learning algorithms according to their suitability for intrusion and attack detection. The learning algorithms subject to evaluation include bio-inspired approaches such as Artificial Immune Systems and Neural Networks, as well as classical approaches such as Decision Trees, the Bayes classifier, Support Vector Machines, k-Nearest Neighbors and others. We conclude that, in our setup, the simpler approaches such as Decision Trees or the Bayes classifier offer reasonable performance. The performance was, however, found to depend significantly on the feature representation. -
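For orientation, a comparison of the classical classifier families named above can be sketched in a few lines of scikit-learn; the synthetic features below merely stand in for traffic-derived features, and the hyperparameters are library defaults rather than the paper's settings.

```python
# Illustrative sketch only: comparing the classifier families named in the
# abstract on a synthetic stand-in for traffic features; no real WSN data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

# Placeholder for features extracted from packet traffic.
X, y = make_classification(n_samples=500, n_features=8, n_informative=5,
                           random_state=0)

classifiers = {
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Naive Bayes":   GaussianNB(),
    "SVM":           SVC(),
    "k-NN":          KNeighborsClassifier(n_neighbors=5),
}

for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name:13s} accuracy = {scores.mean():.3f}")
```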
Quality control of a light metal die casting process using artificial neural networks
Matthias Becker
2009 IEEE International Conference on Computational Cybernetics (ICCC)In this work we present an approach that uses a neural net for online control of the cooling process in the light metal die casting industry. Normally the die casting process is controlled manually or semi-manually, and quality control is done well after the cooling process. In our approach we increase product quality during the production process by monitoring the cooling process with an infrared camera and heating or cooling different parts of the mold. The control is done using a neural net that has been trained with data from previous casting processes, in which the quality was judged by experts. We conclude that this approach is a feasible way to monitor and increase product quality online in die casting.
2008
-
Traffic analysis and classification with bio-inspired and classical algorithms in sensor networks
Matthias Becker, Sebastian Bohlmann and Sven Schaust
International Symposium on Performance Evaluation of Computer and Telecommunication Systems - SPECTS 2008In this work we evaluate the feasibility of both classical machine learning algorithms and bio-inspired algorithms for misbehavior detection in sensor networks, since recent work in the field seems to concentrate mainly on bio-inspired approaches without a convincing rationale. As a first step, we analyze the packet traffic of a simulated sensor network in order to find relevant features that distinguish normal network operation from misbehaving nodes. This kind of data analysis is often missing in previous studies. Using the features acquired by this systematic data analysis, we study the suitability of classical machine learning algorithms as well as bio-inspired learning algorithms for the given classification problem. We conclude which algorithms perform best in this particular scenario, considering both classification success and the resource-friendliness of the algorithms. As a result, we can say that classical algorithms have equal or even better detection capabilities than some bio-inspired algorithms. It turns out that it is even possible to detect different levels of misbehavior with nearly 100% accuracy. -
Comparing performance of misbehavior detection based on neural networks and ais
Matthias Becker, Martin Drozda, Sebastian Jaschke and Sven Schaust
2008 IEEE International Conference on Systems, Man and CyberneticsWe compare two approaches for misbehavior detection in wireless sensor networks based on artificial immune systems (AIS) and neural networks (NN). We conclude that both AIS and NN based misbehavior detection offer decent detection performance at a very low computational cost. However, the two approaches differ regarding the length of the preprocessing phase, memory requirements, speed of computation and the rate of false positives. Both approaches are suitable for misbehavior detection in sensor networks; the decision of which approach to choose for a specific sensor network depends on the details of the scenario. -
Optimized spike placement on tires with respect to low noise
Matthias Becker
2008 IEEE International Conference on Systems, Man and CyberneticsReducing noise is a growing subject of interest in the automotive industry, especially in tire manufacturing. For tires with spikes, the reduction of noise is a particularly urgent issue. The placement of spikes on a given tire is a combinatorial problem with constraints, since there are restrictions imposed by traffic law as well as restrictions stemming from the production process. In this work we show that the direct application of heuristic optimization algorithms is usually not feasible, because the problem size makes it impossible to find even one valid spike distribution with simple methods. We present three approaches based on the divide-and-conquer strategy that show promising results, one of which even includes an optimization strategy in the style of a genetic algorithm. As a result, our algorithms are now used successfully in the tire industry to produce valid and noise-reduced spike distributions. -
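The divide-and-conquer idea can be illustrated with the toy sketch below: the circumference is split into arcs, spikes are placed per arc, and the partial placements are concatenated. The position grid, spike counts and regularity check are invented placeholders, not the legal or production constraints from the paper.

```python
# Illustrative sketch only: divide-and-conquer placement of spikes around the
# tire circumference with simplified placeholder rules.
import numpy as np

N_POSITIONS = 360        # discretized circumferential positions
N_SPIKES = 96
N_SEGMENTS = 8           # divide: solve each arc independently

def place_segment(start, length, k, rng):
    """Conquer step: spread k spikes over one arc with a small random jitter."""
    base = np.linspace(0, length, k, endpoint=False).astype(int)
    jitter = rng.integers(0, max(1, length // (2 * k)), size=k)
    return (start + base + jitter) % N_POSITIONS

rng = np.random.default_rng(0)
seg_len = N_POSITIONS // N_SEGMENTS
per_seg = N_SPIKES // N_SEGMENTS
positions = np.sort(np.concatenate([
    place_segment(i * seg_len, seg_len, per_seg, rng) for i in range(N_SEGMENTS)
]))

# Crude check: spread of the gaps between neighbouring spikes (perfectly even
# spacing would create a tonal whine, so some irregularity is desirable).
gaps = np.diff(np.append(positions, positions[0] + N_POSITIONS))
print("spike count:", len(positions), "gap std:", float(gaps.std()))
```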
Approaches to Analyze and Optimize Inventory-Controlled Service Systems using Taguchi Method and Ant Algorithms
Matthias Becker, Helena Szczerbicka and Honam Kim
2008 International Conference on Service Systems and Service ManagementIn this paper we present a survey of several methods that are suitable for modeling service systems with inventory control. One Markovian approximation method is extended with priorities. These approaches can also be coupled with optimization algorithms; here, the Taguchi method and ant algorithms are considered, and a serial four-stage example is given. -
Performance of Security Mechanisms in Wireless Ad Hoc Networks
Matthias Becker, Martin Drozda and Sven Schaust
Workshop on Security and High Performance Computing SystemsAdding security mechanisms to computer and communication systems without degrading their performance is a difficult task. This holds especially for wireless ad hoc networks, which, due to their physical and logical openness, are easier to attack than wired networks. Additionally, such networks are expected to have fewer resources in terms of computational capabilities and must also rely on batteries for power supply. We investigate the impact of two misbehavior detection mechanisms based on Neural Networks and Artificial Immune Systems. In the performance analysis we assume that wireless devices, in many scenarios, must work with extremely limited computational resources; sensor networks are an example of such a scenario. This implies that the security system of choice has to be very efficient in order not to disturb normal wireless device operation.
2007
-
DISCRETE EVENT SYSTEMS--PETRI NET-BASED MODELING AND SIMULATION IN THEORY AND PRACTICE
Matthias Becker
EurosimThe theory of modeling formalisms for Discrete Event Systems has a long history and is well developed, and many algorithms for modeling and for the efficient analysis of the modeled systems exist. However, in many practical applications and commercial software packages the theory is not used. The reasons are manifold. The question is whether the theoretical concepts are not suited for practical applications, or whether the problem lies in the proper transfer to practice. Other problems lie in the sometimes missing flexibility of theoretical models and, to some extent, in the lack of good software that would enable the practical use of such models. In this work we review Petri net based methodologies with regard to their applicability in practice and try to understand why many aspects of the theory of modeling and simulation do not find their way into practice. We identify crucial factors such as support for complex models and hierarchical modeling capabilities. These factors not only concern the modeling methodology but also need to be implemented in a software tool. The availability of software supporting a modeling concept is another important factor. The software should also have an adequate and appealing graphical representation, because in the end the practitioners have to be convinced to 'buy' the theoretical concept, and that will only be the case if decision makers can recognize 'their' system easily. Furthermore, we survey a number of papers on the application of Petri nets to find out to what extent these applications are practical ones, i.e. whether the applications are of academic nature, proof of concept, toy size, or inside a productive environment. -
Performance of routing protocols for real wireless sensor networks
Matthias Becker, Sven Schaust and Eugen Wittmann
Proceedings of the 10th International Symposium on Performance Evaluation of Computer and Telecommunication Systems - SPECTS '07The main task of a Wireless Sensor Network (WSN) is to collect data and either send it to a base station immediately or store it locally until the data is requested by a base station. WSNs form a wireless network without specific infrastructure; thus, efficient routing protocols are necessary to let a data packet find its way from a specific sensor node through the network to the base station. Since WSNs are a relatively new technology, existing routing protocols from other types of wireless networks were employed in WSNs as a first step. However, these protocols are not well suited for WSNs, since the characteristics of the technology and the applications of other wireless networks may be quite different from those of WSNs. As a consequence, the adopted routing protocols often perform badly in the context of WSNs. In this work we study the usability of several routing protocols in a real-world environmental monitoring task and show how the performance of wireless routing protocols can be improved significantly if carefully adapted for use in WSNs. Finally, the performance of the different available routing protocols is measured and compared through an actual deployment of the WSN using Cricket motes, which have been designed by U.C. Berkeley. -
Generating Interactive 3-D Models for Discrete-Event Modeling Formalisms
Matthias Becker
International Conference on Cyberworlds - CW '07In this paper an automatic transformation of arbitrary manufacturing models (modeled as queuing nets or stochastic Petri nets) into an interactive 3D visualization and animation (realized in a game engine) is presented. The motivation behind this is to make the very useful but rather boring mathematical models formulated as queuing nets or Petri nets easily accessible to users and decision makers who are not interested in the details of the mathematical modeling formalism. Queuing networks and Petri nets have a long tradition and are a well accepted means for the modeling, simulation and analysis of discrete event systems. Although these formalisms have a graphical notation that is intuitive for experts to use, they lack a good presentation layer, which is needed for acceptance in industry, for commercial purposes, or by academic non-experts. -
NEURAL NETWORKS AND OPTIMIZATION ALGORITHMS APPLIED FOR CONSTRUCTION OF LOW NOISE TREAD PROFILES
Matthias Becker, Helena Szczerbicka and Michael Thomas
Cybernetics and Systems: An International JournalIn this article we evaluate and compare diverse methodologies for designing low-noise tread profiles. Finding a low-noise tread profile under given constraints can be described as a search in a space that is typically of the order of a 50- to 70-dimensional vector space. A complete search for the optimal tread profile is not possible even with today's computers. Thus, in this work we compare the feasibility of three classes of algorithms for tread profile construction. First, we discuss approaches for speeding up the generation and analysis of tread profiles. Second, we use two algorithms for the iterative construction of large tread profiles out of several smaller tread profiles known to be of good quality; one of these algorithms is based on Neural Networks. Third, we evaluate heuristic optimization algorithms such as Genetic Algorithms and Simulated Annealing. Finally, we compare the suitability and efficiency of our approaches.
2006
-
Genetic algorithms for noise reduction in tire design
Matthias Becker
2006 IEEE International Conference on Systems, Man and CyberneticsIn this paper we report on the deployment of genetic algorithms to optimize tread profiles for tires that produce an unobtrusive noise. Since the complexity of the problem grows exponentially (the search space is typically of the order of a 65-dimensional vector space), a complete search for the optimal tread profile is not possible even with today's computers. Thus, heuristic optimization algorithms are an appropriate means to find (near-)optimal tread profiles. We discuss approaches for speeding up the generation and analysis of tread profiles, and present results obtained using genetic algorithms. -
Intelligent reduction of tire noise
Matthias Becker and Helena Szczerbicka
International Conference on Knowledge-Based and Intelligent Information and Engineering SystemsIn this paper we report on the deployment of intelligent optimisation algorithms for noise reduction in tire manufacturing. Since the complexity of the problem grows exponentially (the search space is typically of the order of a 65-dimensional vector space), a complete search for the optimal tread profile is not possible even with today's computers. Thus, heuristic optimization algorithms such as Genetic Algorithms and Simulated Annealing are an appropriate means to find (near-)optimal tread profiles. We discuss approaches for speeding up the generation and analysis of tread profiles, and present results obtained using various optimization algorithms.
2005
-
Optimisation of buffer size in manufacturing systems using ant algorithms
Matthias Becker and Helena Szczerbicka
Foundations of Control and Management SciencesIn this article we use the Ant Colony Optimisation (ACO) algorithm in order to find optimal Kanban allocations in Kanban systems represented by Stochastic Petri Net (SPN) models. Like other optimisation algorithms inspired by nature, such as Simulated Annealing and Genetic Algorithms, the ACO algorithm contains a large number of adjustable parameters. Thus, we study the influence of these parameters on the performance of ACO on the Kanban allocation problem and identify the most important ones. -
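A bare-bones version of an ACO loop for this kind of allocation problem might look as follows; the toy throughput function stands in for the SPN evaluation used in the paper, and all parameter values (evaporation rate, ant count, iteration count) are arbitrary choices, not the ones studied.

```python
# Illustrative sketch only: a minimal Ant Colony Optimisation loop for choosing
# a Kanban count per stage. The objective is a toy placeholder, not an SPN model.
import numpy as np

N_STAGES, MAX_KANBAN = 4, 6
rng = np.random.default_rng(0)

def throughput(alloc):
    """Toy objective: more kanbans help with diminishing returns, minus a WIP penalty."""
    return sum(1.0 - 0.5 ** k for k in alloc) - 0.05 * sum(alloc)

tau = np.ones((N_STAGES, MAX_KANBAN))        # pheromone per (stage, kanban count)
evaporation, n_ants = 0.1, 20
best_alloc, best_val = None, -np.inf

for _ in range(100):                          # ACO iterations
    solutions = []
    for _ant in range(n_ants):
        probs = tau / tau.sum(axis=1, keepdims=True)
        alloc = [1 + rng.choice(MAX_KANBAN, p=probs[s]) for s in range(N_STAGES)]
        solutions.append((throughput(alloc), alloc))
    it_val, it_alloc = max(solutions)
    if it_val > best_val:
        best_val, best_alloc = it_val, it_alloc
    tau *= (1.0 - evaporation)                # evaporate pheromone
    for s, k in enumerate(best_alloc):        # reinforce the best-so-far allocation
        tau[s, k - 1] += 1.0

print("best kanban allocation:", [int(k) for k in best_alloc],
      "value:", round(best_val, 3))
```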
Approaching Ad Hoc Wireless Networks with Autonomic Computing: A Misbehavior Perspective
Martin Drozda, Helena Szczerbicka, Thomas Bessey, Matthias Becker and Rainer Barton
Proc. 2005 International Symposium on Performance Evaluation of Computer and Telecommunication Systems (SPECTS'05)We review recent approaches to dealing with misbehavior in ad hoc wireless networks. We focus on a specific class of solutions based on autonomic computing. This class is motivated by the human immune system, a very efficient and complex system that is able to protect human health against an amazing set of malicious extraneous attacks. We also provide the reader with a summary of the kinds of misbehavior currently considered in the literature. Certain aspects of machine learning and game theory with relevance to misbehavior detection are reviewed as well. Based on relevant design properties of Artificial Immune Systems motivated by human immunity, we present an outline of a four-layer architecture for ad hoc wireless networks. The purpose of this architecture is to impose a high degree of survivability against the misbehavior of nodes. -
Parameters Influencing the Performance of Ant Algorithms Applied to Optimisation of Buffer Size in Manufacturing
Matthias Becker and Helena Szczerbicka
Industrial Engineering & Management SystemsIn this article we study the feasibility of the Ant Colony Optimisation (ACO) algorithm for finding optimal Kanban allocations in Kanban systems represented by Stochastic Petri Net (SPN) models. Like other optimisation algorithms inspired by nature, such as Simulated Annealing and Genetic Algorithms, the ACO algorithm contains a large number of adjustable parameters. Thus, we study the influence of these parameters on the performance of ACO on the Kanban allocation problem and identify the most important ones.
2003
-
Modeling and simulation of a complete semiconductor manufacturing facility using Petri nets
Matthias Becker
IEEE Conference on Emerging Technologies and Factory Automation - ETFA '03Most studies employing Petri nets in semiconductor manufacturing model only one specific area (e.g. etching) in detail and represent the rest of the manufacturing process abstractly, e.g. by its input/output behavior. In our study, we show the feasibility of using Petri nets for modeling the complete production process. We use the first set of test data provided by the MASM-LAB, Arizona State University; it describes a two-product process making non-volatile memory chips. For modeling, we use our own tool PSim, which is based on a combined queuing and Petri net formalism. The integration of queues makes the modeling of parts waiting in front of a machine quite concise and intuitive. PSim offers a hierarchical and modular modeling approach, which is especially suitable for large and complex systems. We exploit the modular approach by defining the structure of a machine once as a Petri net and then instantiating as many machines as needed. We then model the operators, resources and the movement of parts between the machines as specified in the production plan. As a result, we can state that Petri nets are feasible for modeling a complete semiconductor manufacturing process. -
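The token-game mechanics underlying such Petri net models can be sketched in a few lines; this is not the PSim tool and omits queues, timing and hierarchy, and the "machine/operator" fragment below is an invented example.

```python
# Illustrative sketch only: a minimal Petri-net marking/firing mechanism.
import random

class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)          # place -> token count
        self.transitions = {}                 # name -> (input arcs, output arcs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self):
        return [t for t, (ins, _) in self.transitions.items()
                if all(self.marking.get(p, 0) >= n for p, n in ins.items())]

    def fire(self, name):
        ins, outs = self.transitions[name]
        for p, n in ins.items():
            self.marking[p] -= n
        for p, n in outs.items():
            self.marking[p] = self.marking.get(p, 0) + n

# Hypothetical fragment: lots waiting at a machine that also needs a free operator.
net = PetriNet({"lots_waiting": 3, "machine_free": 1, "operator": 1})
net.add_transition("start", {"lots_waiting": 1, "machine_free": 1, "operator": 1},
                   {"in_process": 1})
net.add_transition("finish", {"in_process": 1},
                   {"lots_done": 1, "machine_free": 1, "operator": 1})

while net.enabled():                          # play the token game to completion
    net.fire(random.choice(net.enabled()))
print(net.marking)
```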
Planning the Reconstruction of a Shiplift by Simulation of a Stochastic Petri Net Model
Matthias Becker and Thomas Bessey
European Simulation Symposium -
A STUDY OF CONTROL VIA ON-LINE SIMULATION USING STOCHASTIC PETRI NETS
Matthias Becker, Thomas Bessey and Helena Szczerbicka
European Simulation Symposium (ESS)Complex systems such as flexible manufacturing systems and traffic systems typically evolve with alternating periods of transient and nearly steady-state behavior; such systems often show suboptimal performance. Thus, it is desirable to optimize the system's performance on-line by adjusting the system's parameters properly before a performance drop occurs. To this end, the system's future evolution is repeatedly assessed in advance by means of on-line simulation. However, this approach is accompanied by several problems, in particular the demand for real-time decisions, that have not yet been solved sufficiently. Aiming at studying the dynamics of on-line control as well as its impact on the system's operation, we built a stochastic Petri net model that simulates the on-line control of a simple open queueing network as it would be performed by means of on-line simulation. The system under control is easy to study since it has known properties and can be considered part of a manufacturing system: jobs arriving at the system have to be dispatched to one of two machines, each providing a queue for jobs waiting to be processed. The processing times of the machines are deterministic or stochastic, while the jobs' arrival times are stochastic. With on-line simulation, the system's future performance is assessed by virtually dispatching a new job to either of the machines, based on the system's current state; the results are compared and thus lead to the real decision about which machine the new job should be dispatched to in order to minimize the work in progress.
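The look-ahead dispatching principle can be condensed into the following sketch, where a trivial residual-work calculation stands in for the on-line simulation of the stochastic Petri net model; service and arrival parameters are arbitrary placeholders.

```python
# Illustrative sketch only: virtually dispatch a new job to each machine,
# compare the outcomes, and take the better real decision.
import random

random.seed(0)
PROC_TIME = 1.0                        # deterministic service time per job
backlog = [0.0, 0.0]                   # residual work on each of the two machines

def dispatch_by_lookahead(backlog):
    """Assess both virtual dispatches and pick the machine with the better outcome."""
    outcomes = []
    for m in (0, 1):
        trial = list(backlog)
        trial[m] += PROC_TIME          # virtual dispatch of the new job
        outcomes.append((max(trial), m))   # proxy for resulting work in progress
    return min(outcomes)[1]

t_prev = 0.0
for _ in range(20):
    t = t_prev + random.expovariate(1.5)                 # stochastic arrival
    elapsed = t - t_prev
    backlog = [max(0.0, b - elapsed) for b in backlog]   # machines keep working
    m = dispatch_by_lookahead(backlog)
    backlog[m] += PROC_TIME                              # real dispatch
    t_prev = t

print("final residual work per machine:", [round(b, 2) for b in backlog])
```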
2002
-
Integrating Software Performance Evaluation in Software-Engineering
Matthias Becker, Lutz Twele and Helena Szczerbicka
First international conference on grand challenges for modeling and simulations at Western MultiConferenceDuring a case study we encountered some problems which could be solved manually this time, but which pose a grand challenge regarding a generally applicable methodology. The case study concerned the verification and performance evaluation of a Fault Tolerant Computer (FTC) system to be employed in the International Space Station (ISS). Four different specifications of the FTC had to be developed for different purposes (in chronological order): the informal specification (Data Flow Diagram), OCCAM code (implementation), a CSP model (deadlock analysis), and a GSPN model for performance evaluation. Each model has been derived from its predecessors, mostly manually. This was a lengthy and error-prone process, for which an integrated methodology should be developed. Why this is urgently necessary, why it is a challenge, and what steps might lead to a possible solution are the questions answered in this paper. -
A tool for distributed modeling of nets (DIMON)
Thomas Bessey and Matthias Becker
IEEE International Conference on Systems, Man and CyberneticsIn this paper, we present a tool (DIMON) for the distributed modeling of nets such as queueing networks, Petri nets, or combinations of both over an IP-based computer network, together with a 3-D graphical editor (DIM3ON) for modeling hierarchical nets using that tool. Both the tool and the editor are implemented in Java. In this implementation, the net concept is considered a superclass and can thus be extended by inheritance to any more specialized concept, e.g. to a stochastic Petri net (note that the Petri net concept itself is a subclass of the net concept). The editor is designed to be extensible in a rather easy manner in order to accommodate net classes not yet supported. The nets are specified using the Extensible Markup Language (XML) in order to facilitate storage and interchange across a computer network. -
On Modification In Petri Nets
Roger Jahns, Matthias Becker, Thomas Bessey and Helena Szczerbicka
Proc. Symposium on Performance Evaluation of Computer and Telecommunication Systems (SPECTS) -
Comparison of the modeling power of fluid stochastic Petri nets (FSPN) and hybrid Petri nets (HPN)
Matthias Becker and Thomas Bessey
IEEE International Conference on Systems, Man and CyberneticsTwo different types of timed Petri nets that contain continuous tokens have been developed separately. Fluid Stochastic Petri Nets (FSPN) are stochastic Petri nets enhanced by continuous places; continuous places can be filled from ordinary transitions, while the transitions are enabled by discrete places. Hybrid Petri Nets (HPN) are stochastic Petri nets enhanced by continuous places and continuous transitions; both kinds of transitions can be enabled by both kinds of places, and both kinds of transitions can be connected by arcs to and from both kinds of places (with some restrictions, of course). Each of the continuous Petri net formalisms provides interesting analysis methods, and both formalisms have received many extensions on the modeling level since their first introduction. In this paper, we compare the modeling power of the basic versions and of some extensions of both formalisms. As a result, we show that, in general, FSPNs can be emulated with HPNs and vice versa, depending however on the versions considered. Thus, there is no essential difference between the two formalisms. A transformation from one type of net to the other can be found if, for some reason (e.g. the use of different analysis methods), the other formalism is preferable.
2001
-
Property-Conserving Transformations in PNiQ (Petri Nets including Queueing Networks)
Matthias Becker
Computer Science and Engineering: Invited Session on Modelling and Analysis based on Petri nets -
Integration of multi-class queueing networks in generalized stochastic Petri Nets
Matthias Becker and Helena Szczerbicka
2001 IEEE International Conference on Systems, Man and CyberneticsIn this paper, we extend Petri Nets including Queueing Networks (PNiQ) with several job classes and define Multi-Class PNiQ (MC-PNiQ). An MC-PNiQ is a Generalized Stochastic Petri Net (GSPN) that contains multi-class queueing networks (MC-QNs). This approach makes it possible to combine the advantages of both concepts on the modelling level as well as on the analysis level. The use of MC-QNs permits concise modelling of waiting rooms, servers and queueing, and offers many fast analysis algorithms for large multi-class systems. Additionally, GSPNs provide the expressiveness and flexibility for modelling more complicated structures such as fork/join. The definition of MC-PNiQ is especially designed to allow a qualitative analysis based on a reduced state space, as well as automated approximate analysis by aggregation of the multi-class queueing nets and their replacement with GSPN elements. The resulting GSPN can then be analyzed with state-of-the-art methods and tools.
2000
-
Combination of queueing networks and generalized stochastic Petri nets
Matthias Becker -
Combination of Queueing Networks and Generalized Stochastic Petri Nets (Berichte Aus Der Informatik)
Matthias Becker -
PNiQ - a concept for performability evaluation
Matthias Becker and Helena Szczerbicka
System performance evaluationIntegrated performance and dependability analysis, called performability, has been receiving considerable attention in the design of complex, fault-tolerant systems. We present Petri Nets including Queueing Networks (PNiQ), a novel high-level modeling technique which is particularly appropriate for performability evaluation. The definition integrates concepts of Generalized Stochastic Petri Nets (GSPN) and queueing networks on the modeling level and specifies interfaces between them. The steady-state solution is based on aggregation of the queueing networks and their replacement with GSPN constructs that model the delays of tokens introduced by the queueing nets. The resulting GSPN model can be analyzed with state-of-the-art methods and tools, and this process can be carried out automatically. The applicability of PNiQ to the evaluation of performability is shown in an example.
1999
-
PNiQ: Integration of queuing networks in generalised stochastic Petri nets
Matthias Becker and Helena Szczerbicka
IEE Proceedings-SoftwareGeneralised stochastic Petri nets (GSPN) and queuing networks are combined at the modelling level by defining Petri Nets including Queuing Networks (PNiQ). The definition is especially designed to allow approximate analysis by aggregation of the queuing nets and their replacement with GSPN elements. Usually, the aggregation of combined GSPN and queuing network models is carried out manually, which limits the use of this technique to experts and, furthermore, may easily lead to modelling errors and larger approximation errors than are inherent in the method. These are avoided by the definition of PNiQ, which shows how to incorporate queuing networks into GSPN and provides interfaces between them. This makes combined modelling easier and less error-prone. Steady-state analysis of the model can be carried out automatically: the queuing network parts are analysed with efficient queuing network algorithms for large nets and replaced by GSPN subnets that model the delay of tokens in the queuing network. The resulting GSPN can then be handled with state-of-the-art tools.
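The flavor of the aggregation step can be illustrated with a standard single-station queueing relation (not taken from the paper): for an M/M/1 station with arrival rate \(\lambda\) and service rate \(\mu\) (with \(\lambda < \mu\)), the mean sojourn time is

\[
W = \frac{1}{\mu - \lambda},
\]

so, to a first approximation, such a queueing subnet could be replaced by a timed transition whose mean firing delay equals \(W\). The actual PNiQ aggregation handles whole queueing networks and is considerably more refined than this single-station example.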
1998
-
Genetic algorithms: a tool for modelling, simulation, and optimization of complex systems
Michael Syrjakow, Helena Szczerbicka and Matthias Becker
Cybernetics & SystemsUntil very recently, genetic algorithms (GAs) were considered to be the proprietary field of general systems theoreticians and important only for esoteric or extremely complex optimization studies. This paper endeavors to show that GAs are of great utility in cases where complex systems have to be designed and, therefore, rational choices have to be made. The GA approach is based loosely on the theory of natural evolution, genetic diversity, and the search for beneficial adaptations to a complicated and changing environment. GAs can be viewed as a modelling tool and as a technique for the simulation of complex systems represented by communities of interacting units. The representation of units can express characteristics, capabilities, or relatively simple strategies. These units compete and are modified by external operators, so that the overall system adapts to its environment. That environment defines the criterion by which the success in adapting can be measured. Genetic algorithms have been successfully applied to many optimization problems, including mathematical function optimization, very large scale integration (VLSI) chip layout, molecular docking, parameter fitting, scheduling, manufacturing, clustering, machine learning, etc., and are still finding increasing acceptance. The modelling and optimization of a Kanban system from the field of flexible manufacturing systems is discussed in the last section. -
Modeling and optimization of Kanban controlled manufacturing systems with GSPN including QN
Matthias Becker and Helena Szczerbicka
1998 IEEE International Conference on Systems, Man and CyberneticsWe investigate the kanban assignment problem for assembly kanban systems. We use REMO, a general-purpose tool for optimization, which includes several algorithms such as hill climbing and genetic algorithms and has an easily adaptable interface to performance analysis tools. As we try to find an optimal kanban assignment with respect to certain performance measures of the system, fast performance analysis is a crucial factor for the sensible and successful application of optimization algorithms. For this purpose we introduce Petri nets including queueing nets (PNiQ) as a modeling formalism especially suited to the optimization of arbitrary kanban systems. At the modeling level, PNiQ allows the use of the concise description of queueing nets where possible and the notation of stochastic Petri nets where needed, e.g. to model the fork/join required for matching kanbans and parts. Approximate performance analysis is carried out by decomposition and aggregation of the queueing net parts. This technique provides a fast numerical solution even for large systems, an important requirement for the application of optimization algorithms. The optimization not only yields optimal kanban assignments for various kanban systems but also reveals a common pattern in the set of solutions. -
Combined Modeling with Generalized Stochastic Petri Nets including Queuing Nets
Matthias Becker and Helena Szczerbicka
14th UK Computer and Telecommunications Performance, Engineering Workshop: 1998 -
PNiQ Generalized Stochastic Petri Nets including Queuing Networks
Matthias Becker and Helena Szczerbicka
Advances in computer and information sciences' 98: ISCIS'98: proceedings of the 13th International Symposium on Computer and Information Sciences, 26-28 October 1998, Belek-Antalya, Turkey -
Modellierung eines Kanban-Systems mit zwei Produktarten, Prioritäten und Umrüstzeiten
Matthias Becker and Alexander K Sch{\"o}mig
Operations Research Proceedings 1997In the course of the rapid development and globalization of international markets, companies are forced to react to strongly fluctuating changes in demand. Out of this necessity the just-in-time (JIT) philosophy has evolved, which aims to keep inventory and cycle times low in order to be able to adapt quickly to a new market situation. The Kanban mechanism is one way to control the flow of parts in a manufacturing facility in the spirit of the JIT philosophy. The literature contains several approaches to the analysis of Kanban-controlled production systems [1],[2], but so far there is no work that takes several product classes, setup times and priorities into account. We therefore extend the analysis approach of Mitra and Mitrani [1] to handle two product types, setup times and priorities. The work was motivated by a problem from semiconductor manufacturing, where 'engineering' wafers have priority over regular wafers. -
Modeling and optimization of ON/OFF sources transmitting data over an unreliable network
Matthias Becker and Helena Szczerbicka
1998 IEEE International Conference on Systems, Man and CyberneticsIn modern broadband systems, several on/off traffic sources transmit over a common telecommunication network. Since the network has a finite capacity, the probability of a breakdown due to congestion increases the more sources simultaneously send data packets. In case of a failure due to congestion, a time-consuming recovery procedure resolves the overload situation; during this procedure no service can be provided to the traffic sources. In this paper we model this scenario with generalized stochastic Petri nets and investigate the influence of several system parameters on the performability of the system. For these parameters (the number of traffic sources in the system, the length of the on/off period in send mode and the length of the send/idle period of the sources) we try to find optimal settings that maximize the number of transmitted data packets. The optimization was performed with the tool REMO. Moreover, we model and optimize a dynamic decision rule which decides, depending on the load situation, whether or not to admit an additional source. These investigations could be used to create guidelines for service providers on how to operate the system optimally.